Where were we? Ah, yes. I was making a big deal about the noise being in the light not the camera. Like why should anyone care?
Because... it's fun to know things like this.
Since noise is caused by the random arrival of photons that slam into a pixel and knock out photoelectrons, it's statistical. Just like polling voters to see if Mike or Marsha is going to be our next dog-catcher. If Ms. Politico Pollster says that Marsha is up by 2%, but her poll has a margin of error of 3.3%, Mike is still in the running. The pollster called only 1,000 voters. To get to a margin of error of 1%, she would need to call 10,000 voters. A bit many for a dog-catcher election.
Your accuracy (or signal to noise) equals N, the number of samples (voters called), divided by the square root of N (the noise). Elementary statistics. If statistics is ever elementary.
So if you collect a signal of 1,000 photoelectrons, your noise is about 32 photoelectrons (the square root of 1,000). That gives you a signal-to-noise ratio (S/N) of about 32, or a margin of error of roughly 3%.
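The arithmetic above is short enough to sketch in a few lines of Python. This is just the square-root-of-N rule from the post; the function name is mine:

```python
import math

def shot_noise_stats(n_samples):
    """For a counting process (photon arrivals, poll responses),
    the noise is the square root of the sample count."""
    noise = math.sqrt(n_samples)
    snr = n_samples / noise      # N / sqrt(N), which equals sqrt(N)
    margin = noise / n_samples   # fractional margin of error
    return noise, snr, margin

# 1,000 photoelectrons collected (or 1,000 voters polled)
noise, snr, margin = shot_noise_stats(1000)
print(f"noise = {noise:.0f} e-, S/N = {snr:.0f}, margin = {margin:.1%}")
# noise = 32 e-, S/N = 32, margin = 3.2%
```

The same call with 10,000 samples returns a margin of exactly 1%, which is why the pollster would need ten times the phone calls.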
With ImageJ you can measure S/N accurately. Which brings up the too-good-to-be-true problem.
Since photon shot noise--the biggest source of noise in most images--comes from the light, not the camera, there is nothing a camera manufacturer can do to reduce it in the sensor. But once the signal has been digitized and turned into bits and bytes, there are a multitude of software tricks they can use to hide the noise. Some tricks are useful and make for better pictures. As for others--let's say some tricks can be overdone.
Big pixels have less noise because they can hold more photoelectrons. They also cost more to make, which is one reason a new big-sensor DSLR body costs from $500 up while a decent small-sensor point-and-shoot, complete with lens, starts around $200. So why don't the camera manufacturers make better small sensors to get around the noise problem?
Like everything else, sensors have limitations. The photoelectrons are nothing more than a pile of static electricity--the same static electricity you collect in your finger when you shuffle your feet on the carpet and get zapped touching a door knob.
To keep the camera's static electricity inside a pixel, there are walls of negative electricity created by the circuitry that defines the pixel. If sensor designers tried to use more voltage to hold in more photoelectrons, they would create holes. I won't go into the solid-state physics of holes except to say they are atom-sized PacMen that wander around a pixel and gobble up photoelectrons as soon as they are created. Not exactly what anyone would want in their camera.
By my calculations--I've yet to find the value on the Internet--a well designed pixel can hold up to 1,200 photoelectrons for every square micron of silicon real estate. If you overexpose and create more photoelectrons than a pixel can hold, it blooms. Blooming, if you don't know the term, is the cause of the big blob of white covering a street light in a night shot. Instead of forming an image with any detail, the photoelectrons have overflowed into nearby pixels.
My D60's pixels are 38 square microns in area. Knock off 10% for the circuitry that forms the pixel wall and they have an active area of about 34 square microns, so they can hold about 40,000 photoelectrons. That gives a maximum S/N of 200 (the square root of 40,000).
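The back-of-envelope estimate looks like this in Python. The 1,200 e- per square micron figure is the post's own estimate, not a published spec, and the result is rounded in the text:

```python
import math

PIXEL_AREA_UM2 = 38      # D60 pixel area in square microns (from the post)
FILL_FACTOR = 0.90       # knock off 10% for the pixel-wall circuitry
E_PER_UM2 = 1200         # assumed full-well density, electrons per square micron

active_area = PIXEL_AREA_UM2 * FILL_FACTOR   # ~34 square microns
full_well = active_area * E_PER_UM2          # ~41,000 electrons
max_snr = math.sqrt(full_well)               # shot-noise limit at full well

print(f"full well = {full_well:,.0f} e-, max S/N = {max_snr:.0f}")
# full well = 41,040 e-, max S/N = 203
```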
When I did the measurements on my D60 yesterday, I'd hoped to calculate that number. An S/N of 150 wouldn't have surprised me. If it had been lower than 100, right now I would be emailing Nikon about warranty repair.
Instead I measured an S/N of 500. That requires 250,000 photoelectrons, and over 6 times more silicon than there is in the camera.
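Running the logic in reverse: a measured S/N implies a signal of S/N² photoelectrons, which can be checked against what the pixel could plausibly hold. Using the post's 40,000-electron capacity estimate:

```python
measured_snr = 500
full_well = 40_000               # the post's capacity estimate for a D60 pixel

implied = measured_snr ** 2      # shot noise alone requires S/N^2 electrons
ratio = implied / full_well
print(f"implied signal = {implied:,} e-, {ratio:.1f}x the pixel's capacity")
# implied signal = 250,000 e-, 6.2x the pixel's capacity
```

Any ratio much above 1 means the number can't come from photons alone--something downstream is smoothing the data.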
A no-questions-asked, much-too-good-to-be-true moment.
Next blog post: how I did the measurements, and why I suspect this excessive S/N is caused by a bug in Nikon's RAW compression routine.
Finally, if you are a glutton for statistics, I recommend my favorite textbook.
To be continued...