Still not sure...
Hi Sean,
Just back from a week up in your neck of the woods. Quite dry there, far drier than here in the "desert" of New Mexico.
Sean DeMerchant said:
Not quite. When light is said to be collimated, that means that parallel rays of light passed through an optical system remain parallel (note this is a mathematician's interpretation of the optics and not a physicist's or optical engineer's summary). This is not correlated with the angle of incidence on the sensor in any way.
Take the case of a small (but not point) source of light on the lens's optical axis. The rays reaching the center of the lens and the center of the sensor or film will be parallel and normal to the sensor (or nearly so), but that's really the only case. Now take that same light source and move it to the edge of the frame. The light rays reaching the front element of the lens can still be nearly parallel, but they can't possibly still be normal to the sensor unless the rear lens element is at least as large as the sensor diagonal (in other words, the rear element would have to be larger than the image circle). Over a small enough region the rays could still be nearly parallel, but since there is some magnification they're not truly parallel.
I'm sure it's possible to build a lens with such a large rear element but I've never seen one. A quick survey of my small lens collection shows rear elements of 18-22mm, nowhere near the 43mm required for 35mm film and not even as large as the ~27mm diagonal on my 20D's APS-C sensor. It doesn't correlate to focal length or maximum aperture, either.
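Just to put rough numbers on where the corner rays end up, here's a quick Python back-of-the-envelope treating the exit pupil as a point some distance in front of the sensor (the pupil distances are pure guesses for illustration, not measurements of any real lens):

    import math

    # Rough half-diagonals of the two formats (mm).
    half_diag = {"35mm film": 43.3 / 2, "20D APS-C": 27.0 / 2}

    # Assumed exit-pupil-to-sensor distances (mm); made-up
    # illustrative values, not measured from any real lens.
    for pupil_dist in (38.0, 60.0, 100.0):
        for fmt, h in half_diag.items():
            angle = math.degrees(math.atan(h / pupil_dist))
            print(f"{fmt:10s}, pupil at {pupil_dist:5.0f} mm -> "
                  f"corner chief ray ~{angle:4.1f} deg off normal")

The closer that pupil sits to the sensor, the steeper the corner rays get, which is the whole issue.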
With a digital sensor this correlates with the ability of the microlenses over each sensor site to focus obliquely directed rays onto the photosite. This is not a major issue with DSLR sensors due to the lens designs used, from my understanding, which could be wrong.
From what I've read this is a major problem, actually. Thinking more about this I wonder if there is an algorithm built in to change the gain from center to edge or if they use different microlenses center to edge. I work in digital integrated circuit fabrication and I can't see how you would do such a thing, but since I've never dealt with microlenses that doesn't mean much.
I've read that some of the "best" lenses like Leica, Zeiss, and such that were designed for 35mm film vignette badly on digital because they're designed with a short distance from the rear element to the film. It's somewhat of a problem for film, but more severe with digital sensors.
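If a big part of that is plain geometric falloff, the usual cos^4 rule of thumb shows how fast a short exit-pupil distance hurts (again, the distances are guesses, and this ignores mechanical vignetting and whatever the microlenses do):

    import math

    # cos^4 illumination falloff at the corner of a 35mm frame for
    # a few assumed exit-pupil distances; purely the geometric rule
    # of thumb, nothing lens-specific.
    half_diag_mm = 43.3 / 2
    for pupil_dist_mm in (38.0, 60.0, 100.0):
        theta = math.atan(half_diag_mm / pupil_dist_mm)
        falloff = math.cos(theta) ** 4
        stops = math.log2(1.0 / falloff)
        print(f"pupil at {pupil_dist_mm:5.0f} mm: corner gets "
              f"{falloff:.2f}x the center light ({stops:.1f} stops down)")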
One should also note that most sensor dust creates shadows only a pixel or three wide, which means that the de-Bayering of the sensor data will blur away a materially significant portion of the shadow's shape.
Now that's an interesting point that I hadn't thought of. Even though I know better I still tend to think of all sensors as more of the Foveon style. Bad photographer-engineer.
Also, the elliptical shadow caused by non-perpendicular light paths is likely to be more prevalent with wide angle (DSLR) or rangefinder lenses (Leica, etc.) from what I understand (I could be wrong, as I just know where to start researching rather than the answers, since it has absolutely zero practical effect on my images).
Now we get into pure hypothesis: lenses with a short rear-element-to-sensor distance will show dust worse than those designed with a longer distance (I know there's a name for that, but I don't recall it at the moment) due to the oblique angle of light on the sensor surface. Not terribly easy to check quantitatively, but I'll have to go back through some times when I had serious muck on the lens and see if I can correlate it to aperture, focal length, and lens design. Rather academic, but I suppose it could tell you which lens to avoid if you know you have dust that you can't get rid of in the field.
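Here's the sort of scribble I'd start from, assuming the dust sits on a filter stack some distance above the photosites (the 1.5mm figure is a pure guess, not a real 20D number): the shadow's edge blur should scale roughly with that distance divided by the f-number, and the oblique corner rays would shift and stretch it.

    import math

    # Crude dust-shadow geometry: a speck on the filter stack is
    # lit by the f/N cone coming from the exit pupil.  The stack
    # thickness below is a placeholder guess, not a measured value.
    stack_mm = 1.5
    for f_number in (2.8, 8.0, 22.0):
        penumbra_um = stack_mm / f_number * 1000.0
        print(f"f/{f_number:<4}: shadow edge smeared over ~{penumbra_um:4.0f} um")

    # An oblique chief ray also shifts (and stretches) the shadow
    # toward the corner, which is where the lens design comes in.
    for angle_deg in (0.0, 10.0, 20.0):
        shift_um = stack_mm * math.tan(math.radians(angle_deg)) * 1000.0
        print(f"chief ray {angle_deg:4.1f} deg off normal -> "
              f"shadow shifted ~{shift_um:3.0f} um")

If that's roughly right, aperture matters at least as much as lens design, which is something to look for when I dig through the old files.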
But the practical result here is that with the Bayer array one needs several pixels of shadow before de-Bayering will allow even the small elliptical character of the shadow to be notable; we would have to be beyond the Nyquist rate (2 samples, which is 4 pixels after de-Bayering). And at that point I have found that such dust/hair is visible to my naked eye (albeit, I am nearsighted).
I'm mildly farsighted, but still have good eyesight and have had limited success seeing "stuff" on the sensor unless it's huge. The 20D has a pixel pitch, as I recall, of 8.6um, so 4 pixels is over 30um. Definitely in the visible range. It's not easy to see down into a black hole and get decent, oblique light at the same time.
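Quick sanity check on that, with the pitch figure I quoted above plus a round number for comparison (neither is a looked-up spec):

    # Smallest dust shadow that should keep its shape through
    # de-Bayering, taking "beyond Nyquist" as roughly 2 samples per
    # color channel, i.e. about 4 photosites across.
    for pitch_um in (6.0, 8.6):
        min_shadow_um = 4 * pitch_um
        print(f"pitch {pitch_um:.1f} um -> shadow needs to span "
              f"~{min_shadow_um:.0f} um ({min_shadow_um / 1000:.3f} mm)")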
Excellent point. This correlates 100% with what I have observed. Albeit, this could be because I rarely have strong defocus at the center of the frame.
But one must also take into account light falloff in the corners due to reflection off the sensor surface (i.e., celluloid sensors have this problem too).
I note it being worse at longer focal lengths, where DoF is shallower and we hence get stronger defocus; i.e., the dust must be more in focus than the image data.
Now I'm really confused. Without going back to check my images carefully I have noted more problems at longer focal lengths as well, but in checking a couple of my zooms the rear element is farther from the sensor at long focal lengths. Shoots a big hole in my earlier hypothesis. I hate when that happens.
I have to think about your idea on the dust being in focus. Something's still not sinking in.
Aaron