Art Adams (Arri) on Signature Primes (27 replies and 20 comments)
I hope that I am not stepping out of line by posting material that is not 'Roger-related', but as I know that there are many aspiring cinematographers on this forum, I thought this video explaining the difference between the Signature Primes and 'other' lenses might be instructive.
Warning - it's 38 minutes long!
I don't think you're stepping out of line at all! This is a place to discuss the craft, and technology is a part of that. But I will say that I do sometimes regret having spent so much time on technology rather than learning the craft.
If there is one thing about the technical part of cinematography that I deem the most important, it is the understanding of sensor saturation and colour space transformation. These have by far the biggest impact on image quality. Lenses are also important, but not to the same extent as how you "expose and develop the digital negative".
I urge everyone who isn't already aware of ACES to try it out for themselves: play around with exposing the sensor, take measurements and develop the image yourself. The image above was created using a prosumer camera, a very old stills lens (a Takumar 50mm f/1.4) and a DIY muslin bounce. Only one light fixture was used.
And let's keep it real: it took me a long time to pull off these kinds of shots on a consistent basis. And a $10,000 lens won't help you AT ALL.
But having said that, if you're working on a large production that cannot afford a lens failure or whatever, the extra expense of the gear is justified and probably even mandatory. Reliability is what you're really paying for. Sure, the image quality is great, but we're talking about micro-advancements that account for maybe 3% of the final feel of the image you create. So keep that in mind.
If renting an expensive lens means you have to get rid of a crew member, you'd better think twice about your priorities.
I like the simple node tree, but I didn't get the last few nodes (the add/subtract/blur compound). I've seen something like this done in photo retouching, something called high-pass sharpening (adding local edge contrast).
It is what they call an unsharp mask. You create it by subtracting a slightly blurred copy from the original image; summing the result back with the original sharpens the image. Depending on the lens and shooting stop I use a blur radius of 2, 3 or 4 pixels.
You can also sharpen locally in the same way, but then you only blur a certain selection, using a key or a manual mask.
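The subtract-and-add chain described above can be sketched in a few lines of NumPy. This is a minimal illustration, not anyone's actual node tree; the `box_blur` here is a simple stand-in for the Gaussian blur a real grading node would use:

```python
import numpy as np

def box_blur(img, radius):
    """Separable box blur with edge padding (a simple stand-in for a
    Gaussian blur)."""
    size = 2 * radius + 1
    out = np.asarray(img, dtype=float)
    for axis in (0, 1):
        pad = [(radius, radius) if a == axis else (0, 0) for a in (0, 1)]
        padded = np.pad(out, pad, mode="edge")
        n = out.shape[axis]
        out = sum(np.take(padded, np.arange(i, i + n), axis=axis)
                  for i in range(size)) / size
    return out

def unsharp_mask(img, radius=3, amount=1.0):
    """Unsharp mask: subtract a blurred copy from the original to isolate
    edge detail, then add that detail back on top of the original."""
    img = np.asarray(img, dtype=float)
    detail = img - box_blur(img, radius)   # the high-frequency "mask"
    return img + amount * detail           # sharpened result
```

Local sharpening is the same operation with `detail` multiplied by a key or mask before it is added back.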
Resolve has a dedicated sharpening tool, but it is only available in the Studio version, and it essentially performs the exact same procedure displayed here. Nuke's sharpening works like this too. Most sharpening passes are done this way; I'm not even sure there is another, proper way to do it.
Another important thing to mention: here you see me perform the operation back in ACES linear space. But some people prefer to sharpen in a log space, such as Cineon log, or in another display-referred encoding like Rec.2020. This should avoid what is called "ringing". So instead of placing that second colour space transform before the sharpening pass, you place it after.
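One way to sketch that order-of-operations point: encode the scene-linear frame to log, run the sharpening there, then decode. The encoding below is a hypothetical toy curve, not Cineon's actual formula, and `sharpen` stands in for whatever sharpening pass you use:

```python
import numpy as np

def lin_to_log(x):
    """Hypothetical log encoding, a toy stand-in for a real curve
    such as Cineon log (not any vendor's actual formula)."""
    return np.log2(np.asarray(x, dtype=float) + 1.0)

def log_to_lin(y):
    """Inverse of lin_to_log."""
    return np.exp2(np.asarray(y, dtype=float)) - 1.0

def sharpen_in_log(lin_img, sharpen):
    """Place the sharpening pass after the transform out of linear:
    encode to log, sharpen there, then decode. Sharpening then acts on
    (roughly) stops rather than linear light, which tames the overshoot
    ("ringing") that big highlight values cause around bright edges."""
    return log_to_lin(sharpen(lin_to_log(lin_img)))
```

Sharpening in linear is just `sharpen(lin_img)` with no encode/decode wrapped around it; the only difference is where in the chain the colour space transform sits.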
This guy explains/shows it: https://vimeo.com/72702892
Thanks for the detailed explanation, especially the colour space note. I came across this old discussion on the LiftGammaGain forum.
Believe it or not, but I actually do remember that thread! 🙂
And about the artistic use of sharpening: to me it's not about creating a look. It's more about preserving continuity. Some shots use different lenses, different stops, or a different contrast or lighting situation. And these all impact what is called 'perceived sharpness' or 'perceived contrast'.
And it's our goal in colour correction generally to minimise these differences, to smooth out the creases if you will.
You use sharpening as a tool to help guide the eye and to improve continuity!
In stills photography it's more about shaping the look, yes, but in storytelling it's more about making sure the audience is focused on what they need to focus on. The look itself is not that important, because you're watching the work itself, and your eye and brain adapt to the look. After a while the look ceases to be a look. You get used to it. Like you get used to a smell.
It's only when suddenly there is a large contrast in the look that you'll again notice it. So for stills photographers it makes sense to create their own look in order to stand out from all the other photographers.
But in storytelling, you don't necessarily want this. Once people are into the film, you have to do everything within your powers to keep them into it! This is one of the most profound things I've learned from Roger.
Totally. I see post processing as more of a science than an art (technical fine-tuning); the moment image manipulation moves into the art region, images tend to go plastic quickly.
Yes, I agree! It's literally like cosmetic surgery. Some people take it too far, while with other people it's hardly noticeable: it just smooths out things a little bit.
What we do with light and in post is very much like cosmetic surgery.
I did a shoot recently for a cosmetic surgery practice and the owner (who takes great pride in his work) told me he flat out rejects clients who want to take it too far. He says that he wants his work to be invisible. At that point my prejudices about his craft vanished.
True. The question of when to stop before you go too far... it's a mindset, and it requires a great deal of self-knowledge.
Well, that brings us onto equipment costs v. production costs, and in my first discipline (audio) people get very tied up in equipment. Long battles break out over stupid arguments about whether Pro Tools is better than Reaper, or what mic to use for a VO.
But the whole issue of equipment v. production costs and the cost of hiring people was put into a nutshell when I attended a recording of a movie-score at Abbey Road. We were looking down at the orchestra scrubbing away in the main live-room.
I asked the studio manager what the place cost per day and she said "£3,600 plus extras, plus VAT."
I looked suitably impressed and said that it was a good bit more than our place.
The girl from the music agency said "That's nothing. The orchestra is costing us £82,000 per day!"
In other words, equipment may look expensive when viewed from the outside, but it is peanuts when compared to the cost of what goes on in front of the camera - or in that case, in front of the microphone!
Sadly, I do not have the budget for a set of Signature Primes, but I bought the best I could for the little I had. Yes, you can buy lenses for a few pounds or dollars, but then I remembered what legendary sound engineer Bruce Swedien once told me: "Good microphones are the sound engineer's secret weapon!"
In the same vein, I would venture to state (from a position of relative ignorance) that good lenses and a good camera are the cinematographer's secret weapon! (I shall no doubt find out in the fullness of time!)
The job of an engineer is to work with what is available. It's about creative problem solving.
But you must never, ever think that the tools used to capture something are more important than that which is captured. If what you capture sucks, then no amount of money spent on tools will magically make it not suck. It's not about the price tag of the lens you put on the camera. It's about the focal length you choose and where you place the camera.
An even better sound engineer will tell you this: good microphone placement is the sound engineer's secret weapon.
You are making the same point I just made - but as for the lecture, I've been placing microphones, good, bad and indifferent, for the past 50+ years!
That's amazing. I can't begin to fathom how sensitive your ear must be!
Wouldn't you agree that the best sounding microphones aren't necessarily the most transparent ones on paper?
What I'm trying to get at: don't you think it's more important how familiar an engineer is with his tools, how those tools react to different sounds and rooms, and how you place and gain-stage them? If that's the case, you'll probably find that the same applies to lenses.
The quality of a lens is hardly just about sharpness. What about flare factor? What about breathing? What about speed? What about size and weight? All these things come into consideration when choosing a set of lenses.
Of course, I absolutely agree that technology is not as important as imagination. There are plenty of examples of great films made with what we would consider lousy lenses and substandard film stock. How many films made before 1970 would pass muster with the image quality police?
Thank you, Wouter, for this new perspective. Instead of debating which lens to use, what you covered is fundamental and should be the first thing to tackle: the image pipeline (exposure and development). Because if that is not fixed, discussing all the other variables is useless; all the questions ("how soft is soft", "how do I create a LUT", "how close is the light in that shot", etc.) have no grounds to build on.
I think the Post & DI section is lacking regarding this fundamental aspect of image making. Example topics: industry-practice stock test procedures, image manipulation methods from capture to delivery, etc. I could name two or three books that covered this subject for celluloid stock, but there is almost no equivalent material for the digital negative.
I owe all of this to a special someone who took bits of his own free time to explain this to me. I am forever grateful for that. I wish to pass his gift on to others. It makes me happy that it resonates with you as well! Welcome to the family! 🙂
Good for you, Wouter. Someone helped you open the black box.
"but there is almost no equivalent material for the digital negative."
There are some books, but as soon as they are published, they are several months out of date. It is a field that is moving so fast that as soon as you opt for one technology, it is replaced by another. There were no LF sensor cameras a couple of years ago - now there are several. CGI was once expensive, now basic CGI tools come bundled with your pet NLE. And so on!
When I began in television back in 49BC, zoom lenses that could actually be used were a dream (soon to be realised!), programmes were B&W and recorded on tape, and only a few types of cameras could be synchronised, so unsynced tape had to be physically cut into the black-burst to resync the image.
Today, advances hit us almost daily! Even Wikipedia can't keep up with all the new models and software options - and that ain't printed!
Today's state-of-the-art feature is tomorrow's joke! I remember when Quantel introduced the first wire-removal plugin - it was a revolution! We were so impressed! I remember the mad rush to get one's hands on the very first (and expensive!) hard disk that could hold an entire gigabyte of data. Now the little SSD I have just shoved into my Atomos recorder holds one terabyte - and it was cheap!
Yet another version of Reaper came out with even more audio and A-for-V tools and the latest version of Resolve just came out - I have not looked at it yet.
No printed book could keep up with that rate of development.
This is true, and it is why I am so passionate about telling people they shouldn't waste their time on every single change in tech!
I am advocating the cultivation of the underlying techniques and principles. Because these hardly change! These are timeless things. It's just the implementation, availability and marketing that changes constantly!
An LF is still a friggin' camera. You still have to expose it properly for it to work. The only real difference is that it is lighter, smaller and more sensitive to light. That's about it.
Understanding why colour spaces work and why we use them... this bit of knowledge won't change in the next 100 years... probably even longer! If by then the world hasn't been reduced to a pile of rubble, of course! 🙂
But I hear this point of view from the older generation all the time. My uncle works with heating systems. He's a true craftsman, very proud of his work. But he can hardly keep up with all the new electronic stuff... regulators, energy-saving gadgets, control panels and whatnot. And half of the time it doesn't work properly, because these things are released at rates that make it impossible for engineers to properly engineer them.
So I empathise with you, more than you think!
Resolve was an unstable piece of shit for many years. But there were ways of working around it if you endured some hardship. Right now, though, I'm happy to say it has reached a point that just stuns me. My mouth hangs open at how far it has come!
As an audio engineer, you should definitely check out Fairlight! I'm not the biggest fan of the stock plugins, but they do work! It is now reaching a point that professional engineers in film might consider migrating to this platform.
But in the end it's still about knowing what you're doing. The principles and techniques hardly change. It's only the tech that changes. And don't believe for one second that one piece of tech has some magical qualities to it. Marketing wants to make you believe that. But it's not true.
In terms of tech, the most important questions are: was it well engineered, and does it have good support? Improvements in the tech itself arrive in micro-increments. There's no weird magic whatsoever.
But I suppose when AI implementation enters the stage, this might change.
"No printed book could keep up with that rate of development"
Couldn't agree more with Wouter "fundamentals, fundamentals, fundamentals".
Byre, thank you for reinforcing my point further. Your words are exactly the reason why I'm asking for more discussion on this subject matter from time to time, instead of none at all. Reading a book may tell you how things work, but hearing how it's used from people who have been doing it forever is priceless.
The pragmatic choice these days is Resolve. It now hardly crashes and you can do everything with it. And most importantly: their colour management is actually pretty solid.
I've tested the ACES colour space transforms in Nuke and Resolve side by side and there is LITERALLY no difference. I've also tested it in Blender (using the same OpenColorIO code Nuke uses - and presumably Resolve as well) and in Blender it's also identical. So that's an identical result in 3 different applications, which indicates that the math is solid.
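You can run that kind of side-by-side test yourself: export the same frame through the same ACES transform from each application and compare the pixel values numerically. The helpers below are a hypothetical sketch for that comparison, not anyone's production tooling:

```python
import numpy as np

def max_pixel_difference(frame_a, frame_b):
    """Largest absolute per-pixel difference between two renders of the
    same frame (e.g. the same ACES output transform exported from two
    different applications)."""
    a = np.asarray(frame_a, dtype=float)
    b = np.asarray(frame_b, dtype=float)
    return float(np.max(np.abs(a - b)))

def transforms_match(frame_a, frame_b, tolerance=1e-6):
    """True when the two renders agree to within a small tolerance
    (allowing for float rounding in each application's pipeline)."""
    return max_pixel_difference(frame_a, frame_b) <= tolerance
```

Load each exported frame into an array (from an EXR or TIFF), and a maximum difference at or near zero across several test frames is good evidence the two applications share the same math.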
And if you're wondering what Nuke is: it's the industry's standard VFX/compositing software. They are legit.
Resolve is a safe bet. Just stick with it and it will become second nature to you. The company is doing very well; it does seem future-proof.
A few weeks ago I tested Resolve in my Linux environment, and it works flawlessly. And yes, it runs faster on Linux - even the startup loading time is cut by 75%, in my estimation. But due to the instability of Nvidia graphics on Linux, I can't afford to make the switch yet. Nvidia recently said, though, that they were going to make an effort to provide quality Linux support.
Interesting things are happening!
Open source software is the place where you want to keep your attention! It does require you to know your fundamentals, but once you get them down, it will benefit you endlessly in the future. If you're passionate and you can afford to spend the time on it, go for it!
But if you want the pragmatic solution, stick to windows/mac and just run Resolve. It's everything you need and more.