September 21, 2014

Part 2 of Digital creep, even after shooting on film: The decline of optical intermediate stages

Having examined digital creep in the final viewing stage, and during the initial capture stage here and here, let's take a good long look into the rise of digital at the intermediate stage, and what changes it has brought about in the final image. In discussions about the adoption of digital, it's rare to hear about digital creep into the stages after the initial capture but before the final display. Usually you only hear about shooting on digital sensors vs. film, or viewing the output of digital vs. film projectors. I'll be going into considerable detail here to try to make up for the general lack of attention given to the matter.

I'll stipulate at the outset that digital intermediate stages make it more convenient to make corrections (brightness, color, contrast) and to add special effects. But corrections and special effects were being made long before the digital age, in analog fashion. The question therefore is how the introduction of this digital link in the chain has affected the overall look-and-feel of the medium, which previously was fully analog and which now is at best a mixture of analog and digital stages (and at worst digital all the way down).

Let's assume that you're going to both shoot on film and view the final product on film (at the theaters) or on photographic paper (for still camera hobbyists). There's still an intermediate stage of the process where the transition to digital has been all but completed, and where remaining purely analog is nearly impossible — making a positive image from the developed negative (at which time corrections or special effects may also be made).

When you expose film to light, the light-sensitive silver halide crystals react and capture only a latent image. Then the processing lab gives it a chemical bath that develops that latent image into a negative — something you can actually see, but without color, and with dark and bright areas switched around. The last bath undoes the light-sensitivity of the film and fixes the image on the negative. That's why you can hold your negatives up to light and they won't start forming a new image.

This is the absolute minimum of non-digital technology used when shooting on film. Where does the process go from there — turning the negative into a positive, with full color, and with darks dark and brights bright?

Today, virtually every lab for both still and motion pictures will digitally scan the film negatives, and continue the process from there. Manipulating light levels, color, contrast, etc., will be done in software on the digital scans of the negatives. After that, if prints are made, they will be made from these digital scans. (The scans may have been digitally corrected by the lab, if you paid them to do it, rather than by you in your own software.)
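To make that intermediate step concrete, here is a toy Python sketch of what the lab's correction software is doing in principle — not any actual lab program, just my own illustration. The scan gives the software a grid of 8-bit numbers from the negative; the software flips them into a positive and applies a brightness/contrast tweak, clamping anything that falls outside the 0–255 range:

```python
def invert(neg):
    """Turn scanned 8-bit negative values into positive values."""
    return [255 - v for v in neg]

def correct(pix, brightness=0, contrast=1.0):
    """Apply a simple brightness/contrast adjustment, clipping to 0-255."""
    out = []
    for v in pix:
        v = (v - 128) * contrast + 128 + brightness
        out.append(max(0, min(255, round(v))))
    return out

# Five sample pixel values read off a scanned negative
scanned_negative = [10, 60, 128, 200, 245]
positive = invert(scanned_negative)              # [245, 195, 127, 55, 10]
print(correct(positive, brightness=20, contrast=1.3))
```

Notice the `max(0, min(255, ...))` clamp: once a corrected value runs past the ends of the 8-bit range, every such value collapses to the same 0 or 255, which is the clipping we'll see in the print comparison later.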

Photographic paper is light-sensitive, unlike ordinary printer paper that is written on with ink, so how do they get the digital image onto a paper that reacts to light? The computer is hooked up to a LightJet-style printer, in which lasers take the brightness and color information from the digital image and reproduce it while shining on the light-sensitive paper. Then the photographic paper is given the same bath from the old days to develop the latent image on the paper and fix this into a positive print for final viewing.

For motion pictures, some form of a film recorder is used to transfer digital information onto light-sensitive film. As in still photography, the film negatives are digitally scanned. Then these digital images are digitally corrected, digital effects are added, and the result is displayed on a monitor. A film camera is then aimed at the monitor and captures each of these digital images in sequence, making a film copy of a digital stream-of-images. Now a film print can be sent off to be projected by optical film projectors in theaters — the kind where a lightbulb shines behind the print to render the image visible, and an enlarging lens blows it up to the size of the big screen.

The key point is that both still and movie photography make digital scans of the film negatives, and then take it from there, whether the ultimate display format is digital (CD, hard drive) or analog (prints).

How did it work in the old days before digital scanning? For still photography, the negative was placed in an optical enlarger, which works like a film projector. A lightbulb above sends light beams down through the negative, which then travel through an enlarging lens (to blow up that dinky little negative into, say, a 4" x 6" size), and which finally strike light-sensitive paper at the bottom of the apparatus, where a latent image is formed (and then developed into a positive and fixed in place with a chemical bath).

One key difference from the method of digital scans and LightJet printers is that the very same beams of light both "pick up" the information in the negative and strike the light-sensitive paper. In the digital method there are two separate sources of light: the light beams in the scanner that "pick up" the information in the negative, and those that come from the printer's lasers that strike the light-sensitive paper. Computer software translates the findings from the team of beams in the scanning hardware, into instructions for the team of beams in the printing hardware.

We need at least one team of light beams ultimately striking the light-sensitive paper to render the positive image. The question is, what is the source of their instructions? With optical enlargers, it is a single unbroken path of light directly through the negative. With LightJet printers, it is indirectly from a copy of the negative — from a digitized scan of it.

The motion picture world used to work in a similar way. Instead of digitally scanning a film negative to make a positive image, labs made contact prints, akin to the optical process used for stills. A light source sent beams directly through the negative and into a light-sensitive medium that was pressed tightly against the other side of the negative; the resulting latent image was then developed and fixed chemically to yield a final positive for viewing.

Unlike the set-up in the still photo lab, in the movie world the light beams did not pass through that much air (with distortions caused by whatever was in the air) or through an enlarging lens (enlargement took place in the projection booth). But the basic approach was the same: shine a single beam of light through the negative onto a light-sensitive material that would hold the final positive.

It's not so much a matter of how many layers of copies there are between the original and the final image, though. It's the nature of how the copies are made — purely analog, with light passing through film negatives (and perhaps air and a glass lens), or digitally from scanners and software.

What differences are there in the print when it comes from a digital scan of the negative rather than an analog projection through the negative?

Here comparisons are hard to make because we need to take the same developed film negative and run it down two separate paths to the final print — the analog way with an optical enlarger, and the digital way with scanners and LightJet printers. Optical enlargers are vanishingly rare these days, so it will be hard to carry out a fully analog process on a roll of film that was shot and developed today.

But what if someone had some old negatives and optical prints of those negatives lying around, and decided to have the negatives digitally scanned and make a new set of prints from these digital scans, following current practice? Then they could compare the prints from digital scans to the original prints from optical projection.

In one forum thread, a commenter provides just such a comparison, shown below. Someone took an old set of negatives to have them scanned and printed at a Walgreens photo lab (the way all prints from film are made nowadays), and compared these to the original prints made by analog means 17 years earlier, during the pre-digital age. The picture shows a person's slicked-back graying hair.

We see how difficult it is for digital to deal with the extremes of the bright-to-dark spectrum. The limited range of light levels in digital was covered in the posts about the capture stage, linked at the top of this post.

At the dark end, notice how the left side of the hair shows fine gradation of darkness levels in the optical print, where only a small portion is deep-dark. This region looks more uniformly deep-dark in the print from digital scan. Ditto in the top-left area above the hair, where the optical print reveals a lot more detail on whatever that greenish thing is, while the print from digital scan smooshes all the various shades of dark into a single deep-dark value, and swamps out some of the green thing's details in darkness.

At the bright end, notice how uniformly ultra-bright the white hairs are in the print from digital scan, whereas the optical print shows a finer gradation of brightness levels.

So, not only at the capturing stage, but also when a digital scan is made of a developed film negative, the final print will show clipped highlights and crushed shadows, whereas a fully analog process would have yielded a more richly continuous range at the extremes of dark and bright. As a result, the print from scanning looks more harshly contrasting — one of the signature elements of the digital "look".
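The difference between clipping and film's gentler roll-off can be sketched in a few lines of Python. This is a deliberately crude model — the "film" curve below is just a logistic S-curve standing in for a real characteristic curve, not measured film data — but it shows the essential contrast: three different "too bright" exposures all collapse to the same value under hard 8-bit clipping, while a curve with a soft shoulder still tells them apart:

```python
import math

def digital_8bit(exposure):
    """Hard-clipping scanner model: linear response, clipped to 0-255."""
    return max(0, min(255, round(exposure)))

def filmlike(exposure):
    """Toy S-curve with a soft shoulder and toe (not a real film model):
    nearby extreme exposures still map to slightly different values."""
    return round(255 / (1 + math.exp(-(exposure - 128) / 48)))

for e in (280, 320, 360):        # three different over-bright exposures
    print(digital_8bit(e), filmlike(e))
```

The clipped model prints 255 for all three exposures — the uniformly ultra-bright hair in the digital-derived print — while the S-curve still grades them, which is the "finer gradation of brightness levels" visible in the optical print.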

A separate color distortion is evident in the blown-up crop of the white hair, where the print from digital scan shows bluish blobs in what is supposed to be white or light gray hair. No such color artifacts are seen in the optical print.

Finally, notice how the print from digital scan renders the grain in the negative — the texture looks blockier and pixelated, and larger in scale. The film speed is ISO 100, which is fine-grained. The larger scale of the "grain" in the digital-derived print is a failure to preserve the fine and regular grain of the negative, a problem that the optical print did not have. Pixels on the scanner's sensor and grains in the film negative don't match up one-for-one, so we shouldn't expect a perfectly faithful rendition of fine film grain. But the result here is still pretty cruddy-looking.
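That mismatch between grain and pixels is an aliasing problem, and a one-dimensional toy model is enough to see it. Below, a high-frequency sine ripple stands in for fine, regular grain (this is my own illustration, not grain physics); "scanning" it at a pixel pitch coarser than the grain period produces a repeating pattern with a much larger period than the grain itself — blocky, enlarged false "grain":

```python
import math

# A 1-D stand-in for fine film grain: a high-frequency brightness ripple.
def grain(x, period=3.0):
    return 128 + 100 * math.sin(2 * math.pi * x / period)

# "Scan" it by sampling at a coarser pixel pitch than the grain period.
pitch = 4.0                      # sensor pixels are bigger than the grain
samples = [round(grain(i * pitch)) for i in range(12)]
print(samples)
```

The grain repeats every 3 units, but the sampled values repeat every 3 samples of 4 units each — a 12-unit pattern, four times coarser than the actual grain. That is the sense in which the scanner inflates fine grain into chunkier texture.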

While the print from the digital scan looks more defined, it also looks more unnatural. Both aspects stem from the way digital yields high-contrast images, as opposed to high-contrast film stocks, which still have smoother gradations from one part of the spectrum to another.

You might object that the print from digital scan was probably rushed along by some random Walgreens employee whose main task is not digital scanning and correcting — or indeed anything related to visual media. But that's beside the point: we had high schoolers operating the lab at one-hour photo-mats back in the pre-digital days, yet those optical prints didn't look so crappy. The old analog process was more robust to the lab technician's lack of expertise, whereas the digital intermediate process is more fragile when the technician isn't so skilled.

This explains why digital looks less dull in Hollywood movies than in amateur photography. Hollywood hires teams of pros to work full-time at making digital look as good as it can. Don't expect that when you're operating the digital camera at the capture stage, or when you're doing the digital processing.

And even if you shoot on film and order prints, don't expect the digital intermediate stage to be handled by the local lab tech and the machines in the local lab the way they would be in the labs that serve Hollywood studios. Such elite services weren't needed in the analog / optical days (though that would've helped too), but now that there's a digital link in the chain, impressive results will require a more skilled technician for the digital scanning and correcting stage.

If you've been wondering why even movies shot on film (and displayed on film) don't look quite the way they used to, the digital intermediate stage is why. The final print is not the end result of a purely analog process. And if you've wondered why prints of film-captured images look different from prints of 20 years ago, that's why: the digital scanning of the negative introduces a non-analog step, with the effects seen in the comparison above.

Your pictures will still look better by capturing on film and making prints than going digital all the way. Just make sure to find someplace other than the drug store to have them processed and scanned before printing. There are still developing and printing labs for professionals, and they're happy to do jobs for hobbyists as well.

Is there still a place that does the fully analog optical printing process? Yes, Blue Moon Camera and Machine located where else but in Portland. You can mail them your film, and they'll mail you back the prints. They get good ratings on Yelp, and it's not just mindless hipster enthusiasm for all things vintage.

If it were, they'd be fleecing the customer. But to develop and make prints from a 24-exposure roll of color negative film, you're only out $14.80, compared to about $20 everywhere else using the digital scanning method. There is a minimum $8 return shipping cost, so you'd have to send them several rolls at a time to spread that out into a reasonable per-roll shipping price. You can also send in already developed negatives (new or old) and have optical prints made from them.

I haven't used them yet, but I'm definitely going to give them a try. Who knows how long we'll have left to make purely analog pictures? I'd regret passing up the chance, especially given how simple and affordable it is.

1 comment:

  1. This is a great couple of posts, agnostic. You illustrate very well what is meant by the "harshness" of digital photography. Many will argue the convenience factor and the portability of digital cameras (however according to some directors digital is actually more expensive), but there is no arguing with the superior quality of film. Being analogue it simply does things that digital cannot do. Sad to see it on the decline and for very little reason.

