Six years ago I discovered both the challenges of reverse engineering a digital camera to find out how it was made and the Internet photography forums where you could enlighten the world about what you discovered. Or thought you discovered. The Internet was just taking off. The few photo forums around then were full of spirited discussions and outright flame wars. A wild and sometimes informative time.
I fell into a polite disagreement with someone about dynamic range or noise or Ansel Adams's zone system or all three--I don't remember the details. To prove my point I decided I needed to experiment. With a series of photographs of an accurately printed zone system chart and some Photoshop magic, I would win the next round of discussions and establish myself as a photography guru to reckon with. (Naivete, thy name is Internet Newbie.)
To accomplish this impossible dream I called around to the local camera stores. Only the Camera Company had anything close to what I wanted. For a mere $160-plus I could buy a calibrated 21-step Kodak Photographic Step Tablet No. 2.
My reply was, "You gotta be kidding. There must be something cheaper. I need this to settle an argument in an Internet forum."
Turns out they had the step tablet in stock because a grad student had special-ordered it and then never came back to buy it. Since some money was better than no money for something that had been sitting around for years, the owner decided that if I came up with $25 the tablet was mine.
$25 was more than I wanted to spend, but...hey, who else but a true Internet guru would own a calibrated 21-step Kodak Step Tablet No. 2? If I could slip that fact into my postings it would add a touch of cachet. Didn't work out that way, but over the years I've wasted many hours playing with the step tablet, so I must have gotten my money's worth.
This is my latest setup:
The step tablet consists of 21 neutral-density steps printed on a transparent strip. Their optical density ranges from 0.05, almost transparent, to 3.0, which transmits 1/1000 of the light. To use it, I tape it to the black cardboard holder, which slips into the box in the lower picture. For a light source, the white foamboard is lit from outside to make a diffuse and evenly illuminated background.
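Optical density and transmission are related by T = 10^-D, which is where the 1/1000 figure comes from. A quick sketch (the densities here are round numbers from the tablet's stated range, not its calibrated values):

```python
# Transmission of a neutral-density step from its optical density:
# T = 10 ** (-D). Densities below are illustrative, not calibrated.
def transmission(density):
    return 10 ** -density

for d in (0.05, 1.0, 2.0, 3.0):
    print(f"density {d:4.2f} -> transmission {transmission(d):.4f}")
```

Each whole unit of density costs a factor of ten in light, so the tablet's 0.05-3.0 range spans almost ten photographic stops.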
With the camera on the tripod I drape the black T-shirt over it as a drop cloth. Any stray light overwhelms the light transmitted through the more optically dense steps. This shows up as an offset in the ImageJ graph, where the low-transmitting steps aren't close to zero.
Then I set the camera to manual mode and adjust the exposure so the first few zones are overexposed. From there it's a simple matter of increasing both the ISO and the shutter speed to take a series of noise profiles at a constant exposure.
For the record, you don't need this or any other tablet or chart to do the experiment. You can take photos of a white card or wall at various exposures to make them as dark or light as you want. The tablet is simply convenient. And, along with ImageJ, it makes neat charts for the blog.
If you want to do the experiment you will need one more free program, UFRaw. It's the RAW converter that comes with GIMP, the free alternative to Photoshop from the Linux people. Or you can download a standalone version here: http://ufraw.sourceforge.net/Install.html
It supports far more RAW formats than the commercial RAW converters, including the CHDK-hacked versions. With its latest reincarnation, its graphical interface is easier to use than it used to be. It still doesn't do batch conversions, but I'm not complaining. It's free, and it's also the only RAW converter I've found that does linear RAW conversions.
What's so important about that? In the last post I mentioned that once a sensor's data is turned into bits and bytes, there are many software tricks camera makers can use to hide and mask the true noise. The most common is gamma conversion. It's important and usually necessary, but it completely changes how the image and its noise look.
At a glance, you can see the difference between the two noise profiles. The image in the center is lighter with a greater dynamic range--a clear advantage over the darker image on the far right.
The advantage shifts when you compare the two graphs. The noise is lower in the top graph, the noise profile of the darker image, and it decreases as the steps become darker. In the lower graph, from the middle image, the noise grows as the steps darken.
So which is better? Less noise with less dynamic range, or the other way around?
Neither. Both graphs are from the same RAW file, taken at ISO 800 with my friend's Canon 5D--one of the lowest-noise cameras around. The only difference is how they were processed by the UFRaw converter. The darker image is a linear image with no gamma correction. The lighter one has a gamma correction of 2.2.
The linear noise profile is how the sensor sees the world. Close down the lens a stop and you have half the light and half the number of photoelectrons. That creates half the voltage for the A/D. (Analog-to-digital converter, the hunk of electronics in the camera that turns the sensor signal into bits and bytes.) That's the definition of linear. Double or halve what you put in; double or halve what you get out.
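That linearity can be sketched in a few lines. The full-well count and the 12-bit A/D depth below are illustrative round numbers, not any particular camera's specs:

```python
# Linear sensor response: the digital number out of the A/D is
# proportional to the photoelectrons collected. FULL_WELL and the
# 12-bit depth are made-up illustrative values.
FULL_WELL = 100_000   # photoelectrons at saturation (assumed)
ADC_MAX = 4095        # 12-bit A/D converter (assumed)

def linear_dn(photoelectrons):
    return photoelectrons / FULL_WELL * ADC_MAX

# Closing the lens one stop halves the photoelectrons -- and, in a
# linear conversion, exactly halves the digital number too.
print(linear_dn(FULL_WELL))       # 4095.0
print(linear_dn(FULL_WELL / 2))   # 2047.5
```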
Gamma correction is non-linear. Why is that important? Your eye-brain system is non-linear too. Your night vision and response to low light is much better than your daylight vision. Microsoft thinks a gamma of 2.2 is the correct correction. Apple says 1.8. Your real gamma as you read this depends on your eyesight, lighting conditions and what you had for breakfast this morning.
Since photon shot noise is in the light, the less the light the less the noise. That's what you see in the linear graph. With a gamma correction you are brightening the darker steps. Another way to look at it is you are amplifying your sensor signal with software just as you do with hardware when you set the camera to a higher ISO setting.
This amplifies the noise. It also amplifies the signal an equal amount. So the S/N ratio is the same.
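A toy calculation makes the point. The signal and noise figures here are invented for illustration; the gain stands in for either a higher ISO setting or software brightening:

```python
# A gain applied after the sensor scales signal and noise by the same
# factor, so the signal-to-noise ratio is untouched.
# signal and noise values below are illustrative, not measured.
signal = 1000.0   # mean pixel value (arbitrary units)
noise = 32.0      # standard deviation of that pixel value

for gain in (1, 2, 8):
    s, n = signal * gain, noise * gain
    print(f"gain {gain}: signal {s:6.0f}, noise {n:4.0f}, S/N {s / n:.2f}")
```

The S/N column never moves, no matter how large the gain.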
It's the S/N ratio that has meaning in an image, not the noise alone. The distinction is important. While this may sound like a quibble, if you don't distinguish between the two, the noise alone can lead you astray.
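Here is a small simulation of that trap, using made-up patch values rather than real sensor data. The gamma curve's slope is steepest near black, so it stretches the fluctuations in a dark step while compressing those in a bright one, which is exactly the flip between the two graphs above:

```python
import random
import statistics

GAMMA = 2.2
random.seed(1)

def patch(mean, sigma, n=10_000):
    """A simulated uniform patch: linear values clipped to [0, 1]."""
    return [min(max(random.gauss(mean, sigma), 0.0), 1.0) for _ in range(n)]

dark = patch(0.02, 0.005)   # a deep step on the tablet (assumed values)
mid = patch(0.50, 0.005)    # a middle step (assumed values)

# Gamma-corrected copies: out = in ** (1 / gamma)
dark_g = [v ** (1 / GAMMA) for v in dark]
mid_g = [v ** (1 / GAMMA) for v in mid]

# Near black the curve's slope is well above 1, inflating the measured
# noise; near mid-gray the slope is below 1, shrinking it.
print("dark step std:", statistics.pstdev(dark), "->", statistics.pstdev(dark_g))
print("mid step std: ", statistics.pstdev(mid), "->", statistics.pstdev(mid_g))
```

Same underlying data, very different-looking noise numbers.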
How far astray? As an example, this is what happened when we compared the 7D, the 5D, and my D60 on Friday.
With photon shot noise, the measurement followed theory closely.
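The theory in question: photon arrival is a Poisson process, so a pixel that collects N photoelectrons on average fluctuates by about the square root of N, giving a shot-noise S/N of sqrt(N). A quick sanity check with simulated counts (a Gaussian stand-in for Poisson, not real sensor data):

```python
import math
import random
import statistics

random.seed(42)

def photon_counts(mean, n=20_000):
    # Gaussian approximation to Poisson -- accurate at these mean counts
    return [random.gauss(mean, math.sqrt(mean)) for _ in range(n)]

for mean in (100, 10_000):
    counts = photon_counts(mean)
    snr = statistics.fmean(counts) / statistics.pstdev(counts)
    print(f"{mean:>6} e-: measured S/N {snr:6.1f}, theory {math.sqrt(mean):6.1f}")
```

Collect a hundred times more photoelectrons and the S/N only improves tenfold, which is why big pixels matter.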
At ISO 800 the full-frame 5D had an S/N of 100 when its sensor was just about to saturate. It had collected 100,000 photoelectrons in its 72-square-micron pixel. My D60 had an S/N of 66 with its smaller 1.5-crop sensor. And the 7D, with its 18,000,000 pixels jammed into a slightly smaller 1.6-crop sensor, had an S/N of 57.
No surprises here. With photon shot noise the cameras behaved just as theory predicted.
When it came to true camera noise--the noise at the bottom of the graph where there is almost no light--the results were different. My D60's noise was identical to his 5D's noise, which delighted and surprised me. My friend's brand-new 7D looked to be twice as noisy as the other two cameras, something that didn't make him grin wildly.
After a closer look at the data on Saturday morning, I called my friend with better news. For reasons I haven't worked out yet, the data from the two Canon cameras wasn't completely linear. This amplified their noise enough to skew their numbers.
With the corrections, the 5D is the quietest of the three cameras, the 7D is a close second and my D60 is about twice as noisy as the other two.
A mild disappointment, but not a surprising one. The Canon CMOS sensors have noise-control electronics built into each photosite. That explains their factor-of-two noise advantage.
And that doesn't mean my D60 is a bad camera. According to the astrophotography websites, where they really worry and know about noise, the 5D's real camera noise is equivalent to 3-5 photoelectrons. So with the high estimate of 10 photoelectrons for my D60, I need to collect only 100 photoelectrons in an exposure for the photon shot noise to equal the camera noise.
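That crossover point follows directly from the shot-noise formula: sqrt(N) equals the camera's read noise when N equals the read noise squared. Plugging in the figures above (3-5 e- for the 5D, 10 e- as the high estimate for the D60):

```python
# Shot noise sqrt(N) equals the camera's own read noise when
# N = read_noise ** 2. Read-noise values are the post's estimates.
for read_noise in (3, 5, 10):
    crossover = read_noise ** 2
    print(f"read noise {read_noise:2d} e-: shot noise dominates above "
          f"{crossover:3d} photoelectrons")
```

Above that handful of photoelectrons, the light itself, not the camera, sets the noise floor.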
It would be nice to own a full-frame camera, but then we're talking big bucks for both the camera body and lenses big enough to cover a full-frame sensor. I can live with what I have.
So my next post will feature real pictures where I push my camera, lenses, and noise-reduction programs as far as they can conveniently go. Those are the kinds of questions that prompted these posts on the theory and practice of camera noise.