I usually prefer just to watch technical discussions of this magnitude, but I think I have a question!
If a larger photosite doesn't increase the signal-to-noise ratio (so to speak?) at higher sensitivities, then what does? I repeatedly see comments on DPR blaming photosite size for digital noise in small sensors. Is this wrong, or just misleading?
If I understand what you're saying, the size of THE SPACE AROUND THE PHOTOSITE matters more for the signal-to-noise ratio at higher sensitivities, because that space can isolate the electronics that create the noise?
Noise isn't just about photon collection. You could have a huge sensor with huge photosites and still have very poor shadows if there is a lot of electronic read noise. Even ignoring read noise, big photosites really only mean lower resolution. The total number of photons captured in a given subject area is far more important than how many are captured in each pixel. Subdividing them into smaller pixels does not increase *image* shot noise.
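That last point is easy to check numerically. Here's a quick Monte-Carlo sketch (my own illustration, with made-up numbers: 40,000 expected photons over a patch of subject area, NumPy assumed): the same patch is subdivided into 1, 4, or 16 pixels, and the shot-noise SNR of the re-aggregated patch comes out the same every time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical numbers: same subject area, same total expected
# photon count, subdivided into 1, 4, or 16 pixels.
mean_photons_total = 40_000
trials = 20_000

for n_pixels in (1, 4, 16):
    # Each pixel collects an independent Poisson share of the light.
    per_pixel = rng.poisson(mean_photons_total / n_pixels,
                            size=(trials, n_pixels))
    # Re-aggregate the pixels back into the whole patch.
    area_signal = per_pixel.sum(axis=1)
    snr = area_signal.mean() / area_signal.std()
    print(f"{n_pixels:2d} pixels: area SNR = {snr:.1f}")

# The area SNR hovers around sqrt(40_000) = 200 in every case:
# subdividing the same light into more pixels doesn't add image
# shot noise, it only changes the per-pixel counts.
```

The sum of independent Poisson pixels is itself Poisson with the original total mean, which is exactly why pixel count drops out of the image-level shot noise.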
The Canon model of having special amplifiers right at the photosite is going to hit a wall soon; the fact is, you can get much better base-ISO IQ by having lots of little pixels, with readout optimized for only one true gain level. The Pentax K10D already has the lowest read noise per pixel in the industry, and it looks like the K20D may go even lower, even with 14MP on an APS sensor. I have not had the proper RAW samples to tell for sure just yet, but it looks like the K20D will have a pixel read noise of about 0.74 ADU, while the lowest of any other brand (the 1Dmk3, for example) is a 12-bit equivalent of 1.25 ADU. 14MP at a 1.5x crop is the pixel density of a 31.5MP full-frame sensor, so scaled to that common output size the read noise would be equivalent to 0.74/((31.5/14)^0.5) ≈ 0.49 ADU (which would need 14 bits to avoid visible quantization). That's about 1.3 stops more DR than the Canon mk3 cameras.
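Redoing that scaling arithmetic explicitly (a sketch using the constants quoted in the post, under the usual assumption that averaging n pixels down to one divides uncorrelated read noise by sqrt(n)):

```python
import math

# Constants from the post, not measurements of mine:
k20d_pixel_noise = 0.74    # ADU per pixel, 12-bit scale (estimated)
canon_pixel_noise = 1.25   # ADU per pixel, 1Dmk3, 12-bit equivalent

mp_k20d = 14.0             # K20D megapixels, APS-C
crop = 1.5                 # crop factor
mp_ff_equiv = mp_k20d * crop ** 2   # 31.5 MP at full-frame density

# Binning/averaging n pixels into one reduces random, uncorrelated
# read noise by sqrt(n).
equiv_noise = k20d_pixel_noise / math.sqrt(mp_ff_equiv / mp_k20d)
dr_gain_stops = math.log2(canon_pixel_noise / equiv_noise)

print(f"{equiv_noise:.2f} ADU, {dr_gain_stops:.2f} stops")
# → 0.49 ADU, 1.34 stops
```

The ratio 31.5/14 is just crop² = 2.25, so the noise divisor is sqrt(2.25) = 1.5; the DR advantage in stops is log2 of the read-noise ratio at the common scale.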
Sorry, I started a reply to this post weeks ago but never typed it in. I'm cleaning up Firefox, where I had about 40 tabs open and it was bogging down, so I killed Firefox from the Task Manager and brought it back up with the old session (it used about 85% less memory after the resurrection, even though everything that was open before was open again, and it included the history as well). Keep that in mind, anyone using Firefox: you can virtually hibernate Firefox (and clean up its memory usage) by killing its process tree in Windows Task Manager. When you launch Firefox again, it will ask you if you want to resume the old session or start fresh, at least in the latest versions. Of course, it fetches the websites again, so if they have changed, you will lose the old page.