dmullenasc


Forum Replies Created

Viewing 15 replies - 211 through 225 (of 280 total)
in reply to: Artemis software use #207592
dmullenasc
Participant

in reply to: Artemis software use #207511
dmullenasc
Participant

in reply to: Artemis software use #207396
dmullenasc
Participant

I use it as a lens finder. The problem with normal lens finders is that you line up the shot and then hand it to the director or camera operator to see, but you don't know if they are really framing it the same way. We use the iPhone Artemis on scouts or in prep to figure out the lens choice, but on set we use the Artemis Pro, which allows us to mount our actual lens onto an iPad. This way we can line up the shot with the director watching and record the blocking rehearsal. We can store video or still frames; if done earlier with stand-ins, these can be used for a storyboard.

dmullenasc
Participant

She has the more direct eyeline because she is the main character; later in the movie, he will have the more direct eyeline.

in reply to: Is there such a thing as 'correct' exposure? #205617
dmullenasc
Participant

I think Gordon Willis (or maybe it was Conrad Hall) once said that there was nothing wrong with working on the edge… as long as you are consistent enough not to fall over that edge. For example, maybe in your testing you find that you can underexpose everything by 3 stops for a look and technically be fine with the quality… but if you go a 1/2-stop too far, you fall off the cliff, so to speak. So when working at the more extreme ISO settings, you have to understand your reduced margin for correcting errors.
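The stop arithmetic behind "working on the edge" can be sketched like this — each stop halves the light, so the deeper under you are, the less actual light a further half-stop error costs you in absolute terms, but the bigger the fraction of what little light remains:

```python
# Sketch of exposure-stop arithmetic: N stops under nominal exposure
# leaves 1 / 2**N of the light reaching the sensor.

def exposure_fraction(stops_under: float) -> float:
    """Fraction of nominal exposure after underexposing by N stops."""
    return 1.0 / (2.0 ** stops_under)

print(exposure_fraction(3.0))   # 3 stops under -> 0.125 (1/8 of the light)
print(exposure_fraction(3.5))   # the extra 1/2 stop -> ~0.088 (about 1/11.3)
```

At 3 stops under you are already working with an eighth of the light; the half-stop slip that pushes you "off the cliff" drops you to roughly an eleventh.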

in reply to: Is there such a thing as 'correct' exposure? #204780
dmullenasc
Participant

If you pick the "correct" exposure for the mood you want, then in theory you wouldn't be pushing it around in post. That's the issue: do you want to expose for the look you want… or do you want maximum flexibility to push it around in post, i.e. to change your mind?

It's possible to split the difference, i.e. play it safer by working at a base ISO with fairly minimal noise, so that you have some room to adjust without noise becoming too problematic even while exposing for the look you want.

in reply to: Uncompressed HD vs Arriraw #204779
dmullenasc
Participant

I got this info from someone at ARRI:

There are two distinct types of logarithmic data:

Users see LogC3 (pre-ALEXA 35) or LogC4 (ALEXA 35 and later) RGB data.

Writers at the device-driver level see another logarithmic format, extremely early in the imaging pipeline, that is used to store the image while it is still a photomosaic, i.e. has not yet been debayered into RGB.

For pre-ALEXA 35 cameras, there is a 'hidden' form for 12-bit photomosaic data, and then for each exposure index there is a particular variant of the LogC3 curve ('gamma', if you will, though not in the mathematical sense). The LogC3 differences are slight enough that most people don't even know they exist.

With the ALEXA 35 there is the same 'hidden' form, this time extended for 13-bit photomosaic data, and then there is one (and only one) LogC4 curve.

SMPTE RDD 55, which documents ARRIRAW in MXF, lays out the details of both 12- and 13-bit low-level bitstreams.
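For the curious, the user-facing LogC3 curve mentioned above is a published piecewise function; a sketch follows, using what I believe are the commonly cited EI 800 parameters from ARRI's LogC documentation (the per-EI variants differ only in these constants, which is the "slight differences" point above — treat the exact values as an assumption, not gospel):

```python
import math

# Sketch of the LogC3 encoding curve, EI 800 parameter set
# (assumed from ARRI's published LogC documentation; other EIs
# use slightly different constants).
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D = 0.247190, 0.385537
E, F = 5.367655, 0.092809

def logc3_ei800(x: float) -> float:
    """Map scene-linear reflectance (0.18 = 18% gray) to a LogC3 code value."""
    if x > CUT:
        return C * math.log10(A * x + B) + D   # logarithmic segment
    return E * x + F                           # linear toe segment

# 18% gray lands around code value 0.391 at EI 800:
print(round(logc3_ei800(0.18), 3))
```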

in reply to: Uncompressed HD vs Arriraw #204670
dmullenasc
Participant

I could be wrong, of course!

in reply to: Uncompressed HD vs Arriraw #204667
dmullenasc
Participant

Log-C gamma is meant to emulate film negative scans, but I think it's applied when Arriraw is converted/debayered to RGB for color correction; it's not the 12-bit log storage format of Arriraw. ARRI's own website is not completely clear on this, but I believe we're talking about two different things: the Log-C used for debayered images and the mathematical log used for data storage in Arriraw.

                      in reply to: Uncompressed HD vs Arriraw #204495
                      dmullenasc
                      Participant

                        Arriraw is a 12-bit log recording but it’s not Log-C, there’s no color space or gamma applied to the image, it’s just data that is converted from 16-bit linear to 12-bit log for storage.
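The principle behind that 16-bit-linear to 12-bit-log conversion can be illustrated generically — this is not ARRI's actual internal curve (which the posts above note isn't public), just a sketch of why a log mapping lets 12 bits cover what linear data needs 16 for — it spends code values roughly evenly per stop instead of giving half of them to the brightest stop:

```python
import math

# Illustrative (NOT ARRI's actual) linear-to-log requantization:
# map a 16-bit linear code (0..65535) onto a 12-bit log code (0..4095).

def linear16_to_log12(v: int) -> int:
    """Compress a 16-bit linear sensor code to a 12-bit log code."""
    if v <= 0:
        return 0
    # log2 of the value, scaled so the top code 65535 maps to 4095
    return round(4095 * math.log2(v + 1) / math.log2(65536))

# Each doubling of light (one stop) gets about the same share of codes,
# whether it's a bright stop or a dim one:
print(linear16_to_log12(65535) - linear16_to_log12(32768))  # top stop
print(linear16_to_log12(1024) - linear16_to_log12(512))     # a much dimmer stop
```

In linear storage the top stop alone consumes half of all code values; in the log mapping every stop gets the same number, which is what makes 12 bits sufficient.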

in reply to: Uncompressed HD vs Arriraw #204447
dmullenasc
Participant

In the early days of the Alexa, uncompressed HD out to a Codex was used because Arriraw wasn't an option yet. "Game of Thrones" also used that format at first. Uncompressed HD is uncompressed, unlike ProRes, but it's still a debayered RGB signal, so color temp and ISO are baked into a Log-C output, as with ProRes. And it's downsampled to HD resolution.

I also think three HD channels uncompressed is actually more data to handle than Arriraw… leaving the data as a single Bayer-pattern signal is a form of data compression, in that debayering it to RGB triples the amount of data (if keeping the same resolution — in this case, though, the signal is also being downsampled to HD).
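A back-of-envelope check of that claim, under assumed formats (10-bit 4:4:4 RGB for the uncompressed HD output, and the classic Alexa's 2880x1620 sensor at 12 bits per photosite for Arriraw — both figures are my assumptions, not from the posts above):

```python
# Rough per-frame data comparison: uncompressed HD RGB vs. Bayer raw.

def mbits_per_frame(width: int, height: int, channels: int, bits: int) -> float:
    """Megabits needed for one uncompressed frame."""
    return width * height * channels * bits / 1e6

hd_rgb  = mbits_per_frame(1920, 1080, 3, 10)   # ~62.2 Mbit/frame
arriraw = mbits_per_frame(2880, 1620, 1, 12)   # ~56.0 Mbit/frame

print(hd_rgb, arriraw, hd_rgb > arriraw)
```

Even though the raw frame has more photosites at a higher bit depth, carrying one channel per site instead of three keeps it slightly smaller than the debayered HD RGB stream.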

Today, with the internal ProRes 4444 XQ option, or Arriraw, there's no reason to record uncompressed HD out to a Codex.

dmullenasc
Participant

"Lately I've been pondering the reverse of your idea. What if I set my digital cinema camera to 3200K and then use an #85 or #85B as filtration? I was wondering if it may have a pleasing effect on skin tones or other 'warmer' elements of the frame. Something I will test soon."

I think mainly you'll just find that your blue channel got noisier, which is how a digital camera takes a raw conversion to RGB and makes it 3200K. There may be a subtle difference in the color values due to the dyes of the optical filter… but the question is whether that could just as easily be created with minimal color correction.

in reply to: Exposed for film #203076
dmullenasc
Participant

My suggestion is that before you try to get tricky by overexposing film and restoring it to normal in timing (film or digital), if you're new to film, you really should learn what it looks like exposed and developed normally. If you want a digital-camera reference image, you can match the ISO being used by the film. For example, shoot 500T film at ISO 500 in tungsten light (3200K), and set the digital camera to the same settings — same shutter speed too.

From a creative standpoint, lighting and exposure should not be a science project. I think if you shoot some film and see the results, you'll find that it is not as hard as you think, as long as your base exposure for the subject is what you intend in terms of how bright or dark you want it to look.

in reply to: Smoque 1 filter for (daytime) interiors? #202659
dmullenasc
Participant

I got the Smoque 1 and 2 filters years ago because smoke is dimensional, so when I would shoot inserts for a scene, like objects on a desk or wall, there was no smoke visible even though the room was hazed. The filters allowed the inserts to maintain the look of the wider shots.

At some point, I had a few scenes where I couldn't haze — one involved the windows being blown out by an explosion — so I used the Smoque filter. It was convincing about half the time. One problem is that the filter needs a light source to hit it, like a window or a bright highlight, to really show the effect; but when someone passes between the window and the filter, the effect disappears momentarily, which is odd. So you have to think of it as an effect somewhere between using haze and using a Double Fog filter. It's not a substitute for smoke… except to help match smoked shots with unsmoked shots in a pinch.

in reply to: Ansel Adams zone system #201423
dmullenasc
Participant

With digital post, it's sort of the opposite — we not only have control over the gamma (contrast) of every frame, we have control over portions of the frame.
