History of digital acquisition?
Does anyone know of any books/articles that contain a detailed history of the film/tv industry's transition from celluloid acquisition to digital acquisition?
Some questions I have: when did television productions stop using 16mm? How did digital acquisition evolve in the 80s, 90s, and 00s? Is there a single text that summarizes this evolution?
Appreciate any responses!
I don’t think there is a specific book on that topic. I’m sure there have been plenty of magazine articles over the past twenty years discussing the transition. Also keep in mind that there was a separate transition from analog to digital video in the 90s — many people were still shooting analog video like Sony Betacam into the early 2000s, long after Sony Digital Betacam was released. The same thing was happening in consumer video in the 90s.
There were sort of two separate tracks for a while: interlaced-scan video, mainly used for TV news, live events, soap operas, etc., was also being used for some narrative TV production. Then with trends like Dogme 95, you saw some indie movies use prosumer/consumer camcorders, most of which were interlaced-scan (but tending towards 50i PAL because that was easier to convert to 25P or 24P for recording to film, compared to 60i NTSC).
This trend predated the notion of switching to digital because video had gotten more similar in look to film; these were movies that to some degree embraced the video look, or at least felt it wasn’t relevant one way or the other. The goal was to use small, low-cost cameras to be freed from the constraints of 35mm moviemaking.
But then in 2000 you had the first 24P digital cinema camera, the Sony F900, and by 2002 you had the first 24P consumer video camera, the Panasonic DVX100. At this point, there was serious consideration of switching away from film because 24P video had a more film-like quality than the previous interlaced-scan cameras. But the Sony F900 still used the 2/3" 3-sensor design of most pro video cameras.
In 2003 you had the Thomson Viper camera, also a 2/3" 3-sensor design, but it could send out log-gamma video (for wider dynamic range) to digital data recorders. Then in 2005 you had the Panavision Genesis camera, which could also send out log video to an HDCAM-SR deck but had a single 35mm-sized sensor, so it could use traditional cinema lenses.
Anyway, there are a number of articles and Wikipedia entries that could allow you to chart the transition.
There have also been external events that have changed the speed of the transition, like the 2011 Japanese tsunami that damaged Sony’s HD tape manufacturing facility, accelerating the transition to data recording to memory cards and drives. Or the 2008-2009 SAG contract negotiations that were stalled, pushing TV pilots in the spring of 2009 to shoot digitally using AFTRA actors due to an arcane rule. Or the success of “Avatar” in 2009 that accelerated the transition to digital projection so that more theaters could show 3D movies.
This is incredibly helpful, dmullenasc. Thank you for taking the time to respond. Would it be fair to say that the introduction of the Alexa in 2010 was a major turning point for the film industry? I'm looking at the number of 2019 releases from the Hollywood majors that were shot on the Alexa (via IMDb) and the number is staggering.
Yes, the Alexa was an important milestone, though it benefited from what came before, from log recording to memory cards, etc. I think the image quality from the ALEV sensor, with its film-like dynamic range and color science, is what pushed many digital skeptics to finally embrace the switchover.
Another "external" force that helped push out 16mm film for U.K. television in favor of digital was the compression schemes being used for HD broadcast, which did not handle the random grain of 16mm very well. And now with the push towards UHD streaming, there is even more resistance to shooting original series on 16mm.