Why does aperture affect sensor dust visibility?

A thought struck me in the bath (where else?) today.

Why, when the sensor is "behind" the aperture, does dust and other junk show up more clearly when the aperture is stopped down?

I have never really been an optics-as-science kinda person, so objective but non-complex explanations would be appreciated.
 

Asher Kelman

OPF Owner/Editor-in-Chief
Peter,

This is a depth of field issue. Try this experiment: photograph an animal in the zoo behind a screen, with a progressively wider aperture. As the size of the aperture is increased, the wires and bars defocus and become close to invisible.

Asher
 

Ray West

New member
I also think, when the aperture is small, it is a point source of light, which throws sharp shadows. When it is wide open, it is a larger source, so shadows from the dust will be blurred (think softbox in reverse, I guess) .... I ought really to test that idea.
 

Doug Kerr

Guest
Hi, Ray,

When the aperture is small, the light emanating from the exit pupil of the lens has a small source area.

The dust particle is not on the face of the sensor itself, but rather a bit in front of it, on the face of the filter stack.

Thus a smaller light source will cast a smaller (but "darker") shadow on the actual face of the sensor.

Imagine casting the shadow of a coin (perhaps held with forceps) on a wall, first with a small light source (perhaps a pencil flashlight) and then with a larger source (perhaps a floodlight), in a room with modest general illumination. The shadow in the first case will be smaller in diameter but will have much greater contrast.
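The coin analogy can be put into rough numbers. A minimal sketch, assuming the speck sits about 1 mm in front of the photosites (filter-stack depths vary by camera, so this is an assumed figure) and that the converging light cone has an angular width of roughly 1/N for f-number N:

```python
# Rough similar-triangles numbers for a dust shadow (all lengths in mm).
# Assumed values: the speck sits on the filter stack ~1 mm in front of
# the photosites, and the light cone converging on each image point has
# an angular width of about 1/N for f-number N.

def penumbra_width(dust_diameter, stack_depth=1.0, f_number=8.0):
    """Approximate diameter of the speck's shadow on the photosites."""
    return dust_diameter + stack_depth / f_number

for n in (2.8, 8.0, 22.0):
    w = penumbra_width(0.05, stack_depth=1.0, f_number=n)
    print(f"f/{n}: shadow of a 0.05 mm speck is ~{w:.3f} mm wide")
```

As N grows (stopping down), the second term shrinks, so the shadow narrows toward the speck's own size — the small, sharp, dark spot we see in photos.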
 

Ray West

New member
Hi Doug,

That was what I was thinking. To test the theory, I was thinking (but not fully through) of using a point source of light with the lens wide open (point source close to the lens). But then said point source would be blurred, since it is not in focus. Thinking arc welding and a macro lens - not....

Best wishes,

Ray
 

Asher Kelman

OPF Owner/Editor-in-Chief
Doug and Ray,

I think you are most probably correct.

Are we now saying that the area behind the dust spot is filled in by oblique light when the aperture is wide open?

Asher
 

Doug Kerr

Guest
Hi, Asher

Are we now saying that the area behind the dust spot is filled in by oblique light when the aperture is wide open?
Indeed, the more so the greater the aperture.

Thus the "shadow" of the dust spot is, as they say in connection with eclipses, a penumbra. The larger the aperture, the larger is the region that is partly shadowed, and the more "diluted" the shadow is by the oblique light.

And of course the overall illuminance is increased, but we compensate for that with shutter speed if we use metered exposure.

Thus, the exposure (in the sense of lux-seconds) on the shadowed area becomes greater (more nearly the illuminance of the general field; less contrast), and the size of that area increases.
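Doug's dilution argument can be put into rough numbers too. A sketch under assumed values (a 0.05 mm speck, a 1 mm filter-stack depth); the square-law falloff is an approximation from the area the speck occludes in the light cone, not a measured result:

```python
# Sketch of the shadow-dilution argument. The fraction of the converging
# light cone that the speck blocks falls off roughly as the square of
# (speck diameter / penumbra diameter). All numbers are assumptions.

def shadow_depth(dust_diameter, stack_depth=1.0, f_number=8.0):
    """Approximate fraction of light blocked in the shadow (1.0 = black)."""
    penumbra = dust_diameter + stack_depth / f_number
    return (dust_diameter / penumbra) ** 2

for n in (2.8, 8.0, 22.0):
    print(f"f/{n}: blocked fraction ~{shadow_depth(0.05, f_number=n):.2f}")
```

The blocked fraction rises steeply as the aperture closes, matching the observation that dust spots gain contrast when stopped down even though metered exposure keeps the overall field at the same brightness.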

Best regards,

Doug
 
Asher Kelman said:
Are we now saying that the area behind the dust spot is filled in by oblique light when the aperture is wide open?
Considering that the dust is often invisible (small hairs and fibers can be seen), the assumption that it is oblique light that fills in the majority of the dust spot is reasonable. As the light becomes more (but not fully) collimated, the shadow of the dust spot becomes sharper and hence visible. In other words, the shadow shifts from below the noise threshold to above it as one moves from a wide light source towards a point source (i.e., stops down).

Though in truth, all light recorded on the sensor (silicon- or celluloid-based) is oblique, as getting down to a point source narrow enough to transmit a single-photon-wide stream would entail huge amounts of diffraction. But it is the different paths light takes from the same source point through the lens to the sensor that creates the oblique nature of the incident light on the sensor.

It is this very same phenomenon, mixed with differing indices of refraction in optical media for differing wavelengths of light, that creates chromatic aberrations.

The physics here is relatively simple in comparison to many real-world problems (e.g., the laminar behavior of a mixture of two viscous fluids in a mixing device, where the chaotic behavior is an order of magnitude more complex) while being nonetheless non-trivial.

some thoughts,

Sean
 

Terry Wedd

New member
Peter,

It's caused by depth of focus. As you have depth of field in front of the lens, so you have depth of focus at the film plane. I can't remember how it relates to aperture, but I remember from when I was a lab rat that as the enlargement factor increased, depth of focus decreased. I would imagine that as depth of focus increases, dust etc. becomes sharper.

Cheers,

Terry
 
Terry Wedd said:
I would imagine that as depth of focus increases, dust etc. becomes sharper.
As DoF increases, the relative collimation increases (i.e., the light striking a point tends to come from a single point). I use the term relative collimation as the light striking a sensor is not collimated (i.e., the light rays are not parallel but come through a series of blur circles).

In other words, the invisible speck of dust's shadow becomes less diffuse and eventually becomes visible. Something to note here is that dust tends only to be visible in out-of-focus areas, where its diffuse shadow is sharper than the incident light. I never had an issue with sensor dust until I got a macro lens. Then, suddenly, due to the massive defocus induced in things meters away when focused decimeters away, I found I had a massive dust problem. The sky, out of focus in 99% or more of images, is hence the most common non-macro sensor dust detector (not everyone thinks macro work is nearly the be-all and end-all of photography as I do ;o).

all the best, :)

Sean
 

Aaron Strasburg

New member
I don't think it's collimation...

which I think is another way of saying angle of incidence. If a larger angle of incidence (further from normal to the sensor surface) caused an elongated shadow, it would also cause corner light falloff to be worse at smaller apertures instead of better. This is a particular problem with digital, since the microlenses above the photosites don't work well at larger angles of incidence. You would also see dust spots becoming progressively more noticeable as they land further from the center of the sensor.

I can't quite explain it still, but I think it's related to DOF. If so it should be worse for short focal lengths. Has anyone noticed that?
 
Aaron Strasburg said:
which I think is another way of saying angle of incidence.
Not quite. When light is said to be collimated, that means that parallel rays of light passed through an optical system remain parallel (note this is a mathematician's interpretation of the optics and not a physicist's or optical engineer's summary). This is not correlated with the angle of incidence on the sensor in any way.

Aaron Strasburg said:
If a larger angle of incidence (further from normal to the sensor surface) caused an elongated shadow it would also cause corner light falloff to be worse at smaller apertures instead of better.
With a digital sensor this correlates with the ability of the microlenses over each sensor site to focus obliquely directed rays onto the sensor. This is not a major issue with DSLR sensors due to the lens models used, from my understanding, which could be wrong.

One should also note that most sensor dust creates shadows only a pixel or three wide, which means that the de-Bayering of the sensor data will blur away a materially significant portion of the shadow's shape.

Also, the elliptical shadow caused by non-perpendicular light paths is likely to be more prevalent with wide-angle (DSLR) or rangefinder lenses (Leica et cetera) from what I understand (I could be wrong, as I just know where to start researching rather than the answers, as it has absolutely zero practical effect on my images).

But the practical result here is that with the Bayer array one needs several pixels of shadow before de-Bayering will allow even the small elliptical character of the shadow to be notable; we would have to be beyond the Nyquist rate* (a 2-pixel sample, which is 4 pixels after de-Bayering). And at that point I have found that such dust/hair is visible to my naked eye (albeit, I am nearsighted).
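As a sanity check on that pixel count, here is a minimal sketch using the 8.6 µm pixel pitch quoted later in the thread for the 20D (an assumed, recalled figure, not a measured one):

```python
# Sanity check of the pixel-count argument, assuming the ~8.6 um pixel
# pitch quoted later in this thread for the Canon 20D.

PITCH_UM = 8.6

def pixels_spanned(shadow_width_mm):
    """Number of photosites a shadow of the given width covers."""
    return shadow_width_mm * 1000.0 / PITCH_UM

# A shadow ~0.034 mm wide covers about 4 photosites, the rough
# post-de-Bayering threshold suggested above:
print(f"{pixels_spanned(0.034):.1f} pixels")
```

So a shadow wide enough to survive de-Bayering is already tens of microns across, which is consistent with such specks being findable by eye under good light.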
Aaron Strasburg said:
This is a particular problem with digital, since the microlenses above the photosites don't work well at larger angles of incidence. You would also see dust spots becoming progressively more noticeable as they land further from the center of the sensor.
Excellent point. This correlates 100% with what I have observed. Albeit, this could be because I rarely have strong defocus at the center of the frame.

But one must also take into account light falloff in the corners due to reflection off the sensor surface (i.e., celluloid sensors have this problem too).
Aaron Strasburg said:
I can't quite explain it still, but I think it's related to DOF. If so it should be worse for short focal lengths. Has anyone noticed that?
I note it being worse at longer focal lengths where DoF is shorter and we hence get stronger defocus. i.e., the dust must be more in focus than the image data.


* While Wikipedia is not a good social reference, it tends to be a first-tier starting place on technical topics (not a final reference, but a good place to start looking things up yourself, as social goofing about is less common on technical topics).

all the best, :)

Sean
 

Aaron Strasburg

New member
Still not sure...

Hi Sean,

Just back from a week up in your neck of the woods. Quite dry there, far drier than here in the "desert" of New Mexico.

Sean DeMerchant said:
Not quite. When light is said to be collimated, that means that parallel rays of light passed through an optical system remain parallel (note this is a mathematician's interpretation of the optics and not a physicist's or optical engineer's summary). This is not correlated with the angle of incidence on the sensor in any way.
Take the case of a small (but not point) source of light on the lens optical axis. The rays reaching the center of the lens and the center of the sensor or film will be parallel and normal to the sensor (or nearly so), but that's really the only case. Now take that same light source and move it to the edge of the frame. The light rays reaching the front element of the lens can still be nearly parallel, but they can't possibly still be normal to the sensor unless the rear lens element is at least as large as the sensor diagonal (in other words, the rear element would have to be larger than the image circle). Over a small range the rays could still be parallel, but since there is some magnification they're not really.

I'm sure it's possible to build a lens with such a large rear element but I've never seen one. A quick survey of my small lens collection shows rear elements of 18-22mm, nowhere near the 43mm required for 35mm film and not even as large as the ~27mm diagonal on my 20D's APS-C sensor. It doesn't correlate to focal length or maximum aperture, either.

Sean DeMerchant said:
With a digital sensor this correlates with the ability of the microlenses over each sensor site to focus obliquely directed rays onto the sensor. This is not a major issue with DSLR sensors due to the lens models used, from my understanding, which could be wrong.
From what I've read this is a major problem, actually. Thinking more about this I wonder if there is an algorithm built in to change the gain from center to edge or if they use different microlenses center to edge. I work in digital integrated circuit fabrication and I can't see how you would do such a thing, but since I've never dealt with microlenses that doesn't mean much.

I've read that some of the "best" lenses like Leica, Zeiss, and such that were designed for 35mm film vignette badly on digital because they're designed with a short distance from the rear element to the film. Some problem for film, more severe for digital sensors.

Sean DeMerchant said:
One should also note that most sensor dust creates shadows only a pixel or three wide, which means that the de-Bayering of the sensor data will blur away a materially significant portion of the shadow's shape.
Now that's an interesting point that I hadn't thought of. Even though I know better I still tend to think of all sensors as more of the Foveon style. Bad photographer-engineer.

Sean DeMerchant said:
Also, the elliptical shadow caused by non-perpendicular light paths is likely to be more prevalent with wide-angle (DSLR) or rangefinder lenses (Leica et cetera) from what I understand (I could be wrong, as I just know where to start researching rather than the answers, as it has absolutely zero practical effect on my images).
Now we get into pure hypothesis: lenses with a short rear-element-to-sensor distance will show dust worse than those designed with a longer distance (I know there's a name for that, but I don't recall it at the moment) due to the oblique angle of light on the sensor surface. Not terribly easy to check quantitatively, but I'll have to go back through some times where I had serious muck on the lens and see if I can correlate it to aperture, focal length, and lens design. Rather academic, but I suppose it could tell you which lens to avoid if you know you have dust that you can't get rid of in the field.

Sean DeMerchant said:
But the practical result here is that with the Bayer array one needs several pixels of shadow before de-Bayering will allow even the small elliptical character of the shadow to be notable; we would have to be beyond the Nyquist rate* (a 2-pixel sample, which is 4 pixels after de-Bayering). And at that point I have found that such dust/hair is visible to my naked eye (albeit, I am nearsighted).
I'm mildly farsighted, but still have good eyesight and have had limited success seeing "stuff" on the sensor unless it's huge. The 20D has a pixel pitch, as I recall, of 8.6um, so 4 pixels is over 30um. Definitely in the visible range. It's not easy to see down into a black hole and get decent, oblique light at the same time.

Sean DeMerchant said:
Excellent point. This correlates 100% with what I have observed. Albeit, this could be because I rarely have strong defocus at the center of the frame.

But one must also take into account light falloff in the corners due to reflection off the sensor surface (i.e., celluloid sensors have this problem too).

I note it being worse at longer focal lengths where DoF is shorter and we hence get stronger defocus. i.e., the dust must be more in focus than the image data.
Now I'm really confused. Without going back to check my images carefully I have noted more problems at longer focal lengths as well, but in checking a couple of my zooms the rear element is farther from the sensor at long focal lengths. Shoots a big hole in my earlier hypothesis. I hate when that happens.

I have to think about your idea on the dust being in focus. Something's still not sinking in.

Aaron
 