Lighting


Sicario: Grading for HDR (9 replies and 8 comments)

Almax
5 years ago

Hi Roger ... I went to an interesting session at AFTRS (Australian Film Television and Radio School) recently where HDR and UHD were discussed. It sounds like the added levels in the whites can be physically challenging to the eye.

For the hardcore, here is a link to Lemac's livestream of the Queensland event.

UHD TV - From Lens to lounge room

I was wondering, is this the shot (pic below) in Sicario, where Kate is against the motel window, that you previously mentioned was a little tricky to grade in HDR?

https://www.rogerdeakins.com/wp-content/uploads/2016/03/Kate-against-the-window.png
Tim Lookingbill
5 years ago

Almax, that's a 2-hour video. It covers a lot of ground about color science and human perception. Could you point to a time in the video where they discuss how highlight white levels are going to be a problem for cinematography?

I skipped through a lot of the color gamut and perception parlor-game trickery and landed at around 35 minutes in, where the speaker talks about brightness levels of 2,000 and 4,000 nits (WOW!), but I didn't find, or wasn't patient enough to wait for, the part explaining how UHD TV will add bits and luminance levels beyond the 255 we now work with in the computerized digital world in order to map detail to those extraordinary brightness levels.
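
Just to put rough numbers on what those nit figures mean, here's a back-of-envelope Python sketch of my own, assuming a typical ~100-nit SDR reference white (my assumption, not something from the video):

import math

SDR_WHITE = 100.0  # assumed SDR reference white in nits
for peak_nits in (1000, 2000, 4000, 10000):
    stops = math.log2(peak_nits / SDR_WHITE)
    print(f"{peak_nits} nits is about {stops:.1f} stops above a {SDR_WHITE:.0f}-nit SDR white")

So a 4,000-nit highlight sits a bit over five stops above where SDR diffuse white lives, which is why those numbers sound so extreme on paper.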

What's frustrating about reading technical discussions of HDR claims (and I've read quite a bit online) is that right now I can shoot with my $500 2006 Pentax DSLR with its Sony sensor and capture about the same amount of detail as in that posted sample shot, if I expose to preserve the window highlights and shoot/process in Raw. Of course I get a lot of noise lifting shadows that would otherwise be plunged to black, but if I apply a shaping curve with just the right amount of gradual slope tapering into those shadows, I can blend that noise into the facial detail with a bit of noise reduction. But Roger most likely added some fill bounce or relied on the Alexa's wider dynamic range, so he didn't get much shadow noise. I can't tell from that posted shot because the faces are so dark I can barely see the African American actor's face, which wasn't the case when I viewed that scene on a Sony 4K projector.

Humans are not going to tolerate viewing blindingly bright white displays just so they can see curtain detail in the highlights. They're going to turn the TV down to living room viewing conditions, so we're basically back to square one with HDR claims. It's a sliding scale of human viewing tolerances, and none of these TV manufacturers are being specific about where the 255 levels of RGB detail are going to fit on that scale. What the hell does 4000 nits look like?

I don't think that 2-hour video tells the whole story, if it can even be told no matter how much information is revealed. Or we're all going to have to watch our TVs outside during the daylight, and that's not going to happen. Of course, maybe drive-in movie theaters will adopt such technology so they can show their movies during the day. I will not be attending.

Almax
5 years ago

Hi Tim, the session I went to was at the film school in Sydney and differs from the Queensland event, so I can't quote an exact time in the video. The comment came up a few times that if you have the sun or a bright light in a shot, it could potentially become a discomfort. The main takeaway I got was that 4K is not exactly wowing consumers as a marked improvement over existing HD. However, HDR, with a little extra in the low blacks and peak whites, improves the potential for contrast, and that contrast improvement coupled with 4K really starts to look impressive.

Tim Lookingbill
5 years ago

So did they explain the 8-10 bit / 255-level RGB encoding limitations for tonemapping all that extra dynamic range on these advanced TVs, which currently have a myriad of tone and contrast settings the user needs to fiddle with to get a decent picture?

I'm pretty new to the cinema side of digital imaging and have been bewildered by the ASC capture/DI process flowcharts posted online by the various cinema support service vendors involved in going from capture to viewing-device output in order to render content as the creator intended. I can't see how a cinematographer is going to be able to shoot for a particular viewing device when almost every device manufacturer has its own proprietary method of making that happen.

I understand color management, and I've gotten pretty good at editing Raw for still photography and getting an almost exact match to my intent on a $50 Epson printer, at my one-hour inkjet photo lab, and for online viewing on a calibrated display. It all works, but the movie business seems to need a mind-numbing level of complexity to do basically the same thing for cinema projectors and TVs. And now they want all that complexity to ramp up to higher-dynamic-range TVs, with all the loose ends and variables mentioned?

Standards? We don't need no stinkin' standards! We like proprietary workflows and equipment!

Almax
5 years ago

Yes, Stuart P spoke of new metadata that needs to be passed upstream (camera to edit and post) by DPs, or else things fall over! He was also saying that the key players are NHK & BBC, Philips, Sony, Technicolor and Dolby. Dolby is looking to monetise through licensing, surprise, surprise! I went along looking to gain a bit of a heads-up and came away with a sense of a new Wild West!

Tim Lookingbill
5 years ago

OMG! They're going to rely on metadata to ensure creator-intent consistency? No one can even agree on gamma encoding. Why do DCI-compliant projectors insist on a 2.6 gamma for movies while editing displays for still photography rely on a 2.2 gamma?
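
To put numbers on that gamma difference, here's a simplified sketch of my own using pure power laws (ignoring the sRGB linear toe and the viewing-environment reasoning behind the 2.6 figure):

for code in (0.25, 0.5, 0.75):
    print(f"code {code:.2f}: gamma 2.2 -> {code**2.2:.3f}, gamma 2.6 -> {code**2.6:.3f} relative luminance")

Same code values, noticeably darker mid-tones under 2.6; as I understand it, the steeper curve is there to hold apparent contrast in a pitch-black theatre, while 2.2-ish displays assume a dim living room or office surround.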

Today an online discussion I was reading on gamma versus tone response curves got me to do more searching on the subject, because I just couldn't understand the differences between display standards and video card LUTs and how they relate to ICC profiles.

I stumbled upon a Wikipedia page about the Jones diagram. I've been into digital imaging since 1998 and had never heard of it...

https://en.wikipedia.org/wiki/Jones_diagram

The link at the bottom of that page took me to this...

http://www.theasc.com/magazine/april05/conundrum2/image9_big.html

Click on the next/previous links at the top, which will lead you to Roger Deakins' process diagram for "The Village".

It's like the industry is speaking another language but coming up with the same picture.

Roger Deakins
5 years ago

'It's like the industry is speaking another language but coming up with the same picture.'

I'm not sure if they are coming up with the same picture. HDR is a case in point. If we had not done a separate timing pass for 'Sicario', the HDR version would have been like watching a completely different film. If you create a balance of light and dark on set you expect that balance to be maintained throughout the process. I personally resent being told my work looks 'better' with brighter whites and more saturation.

Tim Lookingbill
5 years ago

Sorry, Roger. Didn't mean the same picture. I meant it's the same results as still photography except with movies it's 24 still frames a second.

Generally speaking, in the motion picture business, judging from the links I posted, there's quite a bit more process control upstream to acquire and preserve creative intent than there is over how the finished work is presented to the public across all media formats, including web-connected displays, smartphones, projectors and TVs.

Case in point: I'd still like to know why the sample stills from your movies posted here as references for lighting tips are so different from what I saw at the theater and online. Then there are online Blu-ray screengrabs of movies that don't match up. Where are the standards for controlling creative intent?

Roger Deakins
5 years ago

The only control you have is to time every version of your work yourself. Obviously, that is never going to happen so you have this chaos of imagery. 

Tim Lookingbill
5 years ago

It's clear then that I don't have a full grasp of who controls and decides the final look of a movie through timing. I was under the impression that you and the director decide this and sign off on it as the final creative intent, and that it is locked in with no further edits to tone and color. That's the way it is with still photography.

Of course with still photography posted online anyone can drag & drop/screengrab and manipulate it the way they want.

With major release movies such as yours, I'm assuming there is a final master that goes out to theaters where there's no access to the data for manipulation. Is that correct?

Or does the major studio that backed the movie and controls distribution for Blu-ray, TV and web get to tweak the color and contrast?

Thank you for your time and thought responding on this subject.

Rodrigo Prata
5 years ago

Roger, what is your view on HDR? I sometimes wonder if "more highlight and shadow detail" is even desirable. I haven't studied the subject yet, but what does that mean? Does it mean that if I've shot a silhouette, there's going to be detail in the faces in the HDR version? For the more technical people in here, does that mean that we have to grade our images in HDR so they look like the SDR version? What's the point then?

Roger Deakins
5 years ago

Yes, I wonder what the point is. It definitely seems like it needs a new grade. The problem is that, with a silhouette that has some detail, the background becomes so bright that it is impossible for the eye to read the detail that was in the 'original' image.

BamaPete
4 years ago

I'm sort of late to this thread, but it is so very interesting and brings up questions I've always had as I try to get my arms around the complex digital video world. I often peruse YouTube video reviews and instructional info on which DSLR or DSLM has the best dynamic range. It was all relatively simple in my old days of color negatives and color transparencies in still photography. But now the YouTube commentators tell me I'll never be able to produce a decent short or documentary unless I get a sensor with, at bare minimum, 12 stops of range. I understand the Alexa has about 14 stops. But as Tim Lookingbill so eloquently states, the end viewer is using a myriad of devices with a wild range of color reproduction and inherent dynamic range. I understand that my nice Samsung big screen and computer monitor come nowhere close to even 12 stops of viewable dynamic range, more like 6 stops.

I've come to the conclusion that, for the small filmmaker, less is more: just shoot and get it right in camera, make sure the most important scenes are properly exposed in the key areas, and if shadow detail goes black, let it go, same for blown-out highlights; concentrate on good composition and tell a good story. I bet I've seen my favorite old classic movies 15 or 20 times each in my lifetime: Lawrence of Arabia, The Bridge on the River Kwai, The Graduate, etc. And I bet each time I watched, the color and dynamic range and contrast and brightness and sound were different. Yet each time, the movie was always as good as ever. The great story never changed, and the cinematography never changed in my mind, because its overall composition and framing trump any variances in color palette.

rlandry1
4 years ago

I would love to hear an actual color theorist's thoughts on this subject.

Where's Josef Albers when you need him!?

rlandry1
4 years ago

I would think that shooting a picture that is inherently dark (physically)...

Like, Blade Runner...

;D

...would be a really interesting place to experiment with HDR.

 

Maybe one day when we all evolve into robots, we'll be able to appreciate HDR a bit more.

dmullenasc
4 years ago

The problem here is that there are multiple ways to make and experience a movie.  If the goal is some sort of hyperreal immersive experience where the mechanics of filmmaking disappear and the movie becomes a large window on reality, then you'd be investigating things like HDR, wide gamut color, high frame rates, 3D, super high resolutions on very large screens, etc.  Maybe even 360 degree virtual reality.  But that sort of filmmaking often works better for things like nature or travel documentaries where you want a "you are there" experience... but I don't want to say that such techniques have no place in narrative cinema.

It's just that art often exists in boundaries, creativity comes from limitations, and composition needs a border or frame to be effective.  So we've had over 100 years of cinema, most of which has been at 24 fps, and most print information is within a 10-stop range or so. When we shot film, we knew that the negative had maybe 14 stops of information but the print stock would only display about 10 stops of that, so we exposed and lit the scene with that limitation in mind, knowing when shadows would go black and highlights would go white.  Often we wanted a limited range of tones on the screen for a graphic artistic effect, the goal wasn't to see a huge range of exposure information.
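
To put those stop counts into contrast-ratio terms (a trivial back-of-envelope sketch, each stop being a doubling of light):

for stops in (6, 10, 14):
    print(f"{stops} stops is roughly {2**stops:,}:1 contrast")

So a 14-stop negative holds about sixteen times the scene contrast that a roughly 10-stop print could ever show, which is exactly the headroom we exposed into knowing it would never reach the screen.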

The other issue here is the problem, one that has always been around in some form, of multiple versions of the same work of art. Most artists aim for ONE optimal version of their work, and the other viewing experiences are compromises that we may or may not live with. The classic example of that compromise was the NTSC or PAL transfer of something shot for the cinemas. So the notion that there will be TWO versions of a movie, an HDR version and a standard version, and that both are valid viewing experiences, is a bit hard to wrap one's head around. If the artist shoots for and prefers the HDR version, then they are going to want everyone to see that version, right?

BamaPete
4 years ago

Your commentary is spot on, insightful and instructional, thank you.

stuart.p
3 years ago

Hi, I came across this by accident, but as I am the Stuart P in question I thought I might just add some input even though it's a bit old. Apologies for the lengthy post. Very happy that Almax enjoyed the presentation. The premise of the presentation was to try and break down some of the jargon and make a basic understanding of the technology accessible to those who use it, the creatives. This is a bit of a driver for me: one of my goals is to break down the tech/sciences to allow the creatives to make the right choices and get the best results from their work. The presentation came after I wrote a 38-page document following a chat with Roger Bunch, Director of Engineering for FreeTV Australia, who was on the working party for BT.2100 (the current HDR spec) at the time; the paper became part of the Australian contribution. One of the primary objectives was to determine whether 4K/50p broadcasting was viable. Broadcasters do not want to broadcast sports at 25fps progressive (hence the current use of interlace).

So, here are just a few of the points that I tried to make; it was a big presentation that covered a lot of ground.

1. The number one point is, and always will be, to try and democratize the sciences/tech and ensure the creatives can take advantage of the wonderful technology available. This is especially important for the majority who do not have access to large studios with all their high-level tech support.

2. The online video was unfortunately not quite what it should have been, but in its defence it was put together at the last minute in a room that normally seats about 45 and held nearly 60 people, so it was very cramped and rushed. The paper is available and I updated it about 6 months or so ago. I am still waiting to get an HDR monitor to do the HLG grade and encode.

3. My points about the perception parlor-game trickery were in response to the bogeyman stuff that was happening at the time about HDR. The point I tried to make was that issues with the human visual system and colour have been with us forever (including in all the SDR stuff we do today) and are something we have been dealing with whether we knew it or not. It was also to try and add a bit of fun.

4. The issue of bit depth was explained, hopefully in terms that were understandable. This was aimed not only at production recording but, most importantly, at the display technology. 8-bit screen technology is NOT suitable for HDR (nor really for SDR production), even if it does use FRC (so be very careful of the marketing hype). The display needs a minimum of 10-bit depth, and this is for the display itself, not just the supported input signals (there is a rough worked example of the difference just after this list). The Lemac MD, who is far from technical, understood the bit depth stuff and the bits-in-the-bucket analogy, so I took that as a big tick for this area!

5. High brightness light levels from the display. All the chat at the time (and it still continues today) was the analogy of a big 65-inch screen at 1,000/2,000 nits burning out your eyeballs in your lounge room. I find this a little silly; the point I tried to make in this area was not the full-white-panel syndrome but rather the reaction of the human visual system to the point-source lights that are commonly used on set. These can be very difficult for our eyes: if you have a basically black background and a point-source light at, say, 1,000 nits, it will definitely be nasty to your audience. The last thing a creative would want is to break the emotional relationship with the story. As I am sure you all know, this happened quite often with 3D, where you could rip someone's eyes apart with poorly thought-out 3D. Just sit in your dark lounge room, then turn on your phone and try not to squint.

6. As per point 5, visual adaptation is an important factor, going from dark to bright or vice versa. The cinematographer or creative needs to think about this in the storytelling.

7. The most significant point, which I think is critical and sort of goes to this thread, is that the look is the creative's choice. Just because you can doesn't mean you should. If Roger and the director like the scene the way it is, whether HDR or SDR, then that is their choice. HDR does not dictate that this window shot should sit at some particular level; it is purely a creative choice. All the standards and tech do is make a framework so that the creative can show their work wherever and whenever in the world, so long as that display/screen is calibrated to the standard. Personally, I think this is a wonderful gift to the arts. No standard dictates whether the imagery/story is good, bad, ugly or indifferent. That is for the creatives.

8. One other point that I am picking up, which is very concerning, is that the cinematographer is being sidelined on the look of the film. This is not just on small to mid-level projects, but also on large Hollywood productions that have been shot in Qld. The DIT is determining the look on set. This I find particularly troublesome, but I feel it comes from a very large number of DoPs not trying to understand the sciences, such as LUTs etc., and the ways of post-production.
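
As promised in point 4, here is a rough worked example of the bit-depth issue. This is my own sketch using the standard SMPTE ST 2084 (PQ) curve constants, not anything from the actual presentation:

def pq_oetf(nits):
    # SMPTE ST 2084 inverse EOTF: absolute luminance in nits -> normalised code value 0..1
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (min(max(nits, 0.0), 10000.0) / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# How many code values land inside a single stop (100 -> 200 nits) at each bit depth
for bits in (8, 10):
    lo = round(pq_oetf(100) * (2 ** bits - 1))
    hi = round(pq_oetf(200) * (2 ** bits - 1))
    print(f"{bits}-bit PQ: about {hi - lo} code values across the 100-200 nit stop")

That works out to roughly 18 code values per stop at 8 bits versus about 70 at 10 bits, which is the difference between visible banding on a smooth gradient and a clean one, and is why an 8-bit panel, FRC or not, struggles with HDR material.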

Anyway, this presentation was in no way put forward as some sort of complete digest on UHD and HDR/WCG; it was purely designed to assist those in our industry at the time and to allow those interested to explore further with some knowledge. One thing I have learned is that learning is endless, and an open mind and respectfulness are necessary.

 
