Camera Resolution - Is 4K too much?
I am writing my dissertation on whether resolution has a perceptual effect on an audience.
Film has been the medium for over 100 years. Do you believe an audience can tell the difference between a high-res camera and something lower, such as footage downscaled to 2.5K?
4K and even 8K are becoming the 'new normal' for shooting in the industry.
Would love to hear your thoughts, or other cinematographers' thoughts, on this.
I think most people would just be watching the film rather than thinking about resolution. Maybe if it is a bad film! When I was shooting film I would sometimes want to see 'texture', grain, in the image. Maybe I am old fashioned but I find some high def. images too clean and crisp. They can look artificial to me.
I totally agree!
The ability to perceive higher resolutions has been tested already, but the problem is that the tests are based on viewing distances and screen sizes, and there is a lot of variation in both in real life. And that doesn't even take into account how good the viewer's eyesight is... You can't really make a movie based on or optimized for someone with 20/20 vision sitting "x" distance from a screen of "y" size with an image of "z" resolution.
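For what it's worth, the arithmetic behind those viewing-distance tests is easy to sketch. A rough back-of-envelope in Python, assuming the common rule of thumb that 20/20 vision resolves about one arcminute, i.e. roughly 60 pixels per degree (the screen width and distance below are made-up example numbers):

```python
import math

def pixels_needed(screen_width_m, distance_m, ppd=60.0):
    """Horizontal pixels needed so a viewer with ~20/20 vision
    (about 60 pixels per degree) can no longer resolve pixel structure."""
    # Angle the screen subtends at the viewer's eye, in degrees
    angle_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return round(angle_deg * ppd)

# Example: a 10 m wide cinema screen viewed from 15 m back
print(pixels_needed(10, 15))  # comes out well under 4K's 4096 horizontal pixels
```

Sit further back and the number drops fast, which is exactly why "can you see 4K?" has no single answer.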
I have a theory, untested, that the more "real" an image origination / viewing system is, the more "fake" it can make the fictional activities onscreen look. If you had some format that ran 1000 fps and had 16K resolution and was 3D, it might be great if used to shoot real gorillas in the jungle, it might even be exciting to view.
But take that same technology and point it at actors in make-up pretending to be in the 19th century, walking around sets, no matter how good the details are done for the screen, the hyperreal nature of the photographic process might create a greater burden for the audience to suspend their disbelief. Of course, if the acting and dialogue and story were great enough, that might not matter just as when watching a theatrical play -- you don't have to believe what is happening on stage is real for you to become involved emotionally. But the experience wouldn't be like watching a good movie where you get transported into the world on screen.
That is an interesting theory and definitely something I will investigate. Have you had thoughts about testing your theory?
That is very interesting! I never saw 'The Hobbit' in 48FPS in cinemas, but I heard that one of the downsides of the format was that the makeup on the actors looked very unrealistic. But for me, when I look at that film in regular 24FPS, the makeup seems very well done. I suppose the more 'real' it looks, the more the fake aspects of it stand out, hence why shooting real gorillas in the jungle would look thrilling and be very immersive!
Also, now I am thinking about the 4K remaster of LOTR which, to my eye, seems to have made the CGI stand out in a very jarring way. The 1080p versions I was used to watching allowed the CGI to blend into the practical imagery in a naturalistic way, but looking at the new remastered images, it seems much easier to identify the seams.
I think that you're right. So much of this digital stuff looks completely phony. Especially stuff shot under the Sun. It's too real.
There is a silly and rather pointless 'Resolution War' with Red and a few others trying to be the first to hit the market with the next even higher pixel-count.
I remember the fuss that we had to go through when HD came to TV. In fact, I am old enough to remember the panic created by colour TV - now that's going back some! No more painted-on doors and fireplaces and painted-on woodland scenes! How on Earth will we ever cope???
But back around 2000, my pet discipline of audio was infested by people who swore blind that they could hear the difference between recordings made at sample rates of 44.1kHz (CDs) or 48kHz (film and video) and much, much higher rates. Because it takes at least two samples per cycle to describe a sound wave (the Nyquist limit), a 48kHz sample rate means the recording captures frequencies up to 24kHz - some 8kHz above adult hearing at its best and (more importantly) 4kHz above the upper limit of all studio microphones and all hi-fi equipment, all of which is capped at 20kHz!
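The sums here are trivial: the highest capturable frequency is just half the sample rate. A quick sketch, in case anyone wants to check my figures:

```python
def nyquist_khz(sample_rate_khz):
    """Highest audio frequency a given sample rate can capture.
    Nyquist: you need at least two samples per cycle."""
    return sample_rate_khz / 2

# The common rates mentioned above
for rate in (44.1, 48, 96, 192):
    print(f"{rate} kHz sampling -> audio content up to {nyquist_khz(rate)} kHz")
```

So even the humble CD rate already reaches past 22kHz, beyond the cap of the playback chain.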
But the 'Resolution War' in audio was on, and most classical labels insisted on masters done at 96kHz and a mad few wanted 192kHz.
But just as there are a few processes in audio that actually do require higher resolutions, there are a few in video and film that require very high resolutions. For example, imagine a live concert video where the whole stage is filmed and only a cut-out of the lead singer is shown. You save on a camera person having to follow that one figure, and you can cut to the lead guitarist or the drummer as a cut-out from the same very high-res image.
Instead of 16 cameras following every movement of Rammstein or Nicki Minaj, we could go back to three or four using 8K or 12K (or even higher) sensors and save significant sums and push the fine-tuning off to post-production (and even get a better final product).
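A rough sketch of why the cut-out idea scales, using the standard UHD frame sizes (this ignores panning, reframing and overlap between crops, which post-production would handle):

```python
def hd_cutouts(sensor_w, sensor_h, crop_w=1920, crop_h=1080):
    """How many non-overlapping full-HD crops fit inside one sensor frame."""
    return (sensor_w // crop_w) * (sensor_h // crop_h)

# An 8K UHD sensor (7680x4320) holds a 4x4 grid of simultaneous
# HD 'virtual cameras' from a single fixed wide shot
print(hd_cutouts(7680, 4320))  # 16
```

Sixteen HD framings from one locked-off 8K camera is exactly the three-or-four-cameras-instead-of-sixteen saving described above.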
But back to film - 6K sensors with the final product on 4K for theatres and HD or UHD for the home seems to hit the sweet spot. Just as in audio, what goes on in front of the camera is 128.7-times more important than the technology! I measured it - twice!
I think what Mr. Mullen is talking about regarding willing suspension of disbelief was expressly the reason The Hobbit films moved away from miniatures and matte paintings and into fully CG effects work. The former techniques stood out as 'fake' when viewed at high frame rates. I also recall this is why they went with a fully digital young Will Smith for Gemini Man as, again, compositing just didn't look right in HFR.