Couple of questions about measuring the MTF of a camera

Marwan Daar

New member
Hi everyone, first post here.

A bit of background about my project: I'm a CRT enthusiast, and I collect and calibrate high-end CRTs, in particular the Sony GDM-FW900.

My most recent efforts are geared towards measuring the sharpness of these displays and how this varies across different conditions, such as stimulus contrast, horizontal scanning frequency, type of video cable used, type of video card used, and whether or not the anti-glare film is removed.

After some research, it appears that measuring the modulation transfer function (MTF) of the display would probably be a solid way to approach this issue.

The general approach is to use my DSLR (Canon EOS 450D) as an imaging photometer. I've reversed the lens, allowing a very high magnification: each sensel covers about 1.3 microns in the focal plane; after subsampling to extract trichromatic quantities (of which luminance is the most important), this rises to 2.6 microns per pixel.
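
For concreteness, here are the rough sampling numbers in Matlab (the 5.2 micron sensel pitch is an assumed figure for the 450D, so treat the magnification as approximate):

    sensel_pitch_um    = 5.2;                        % assumed EOS 450D sensel pitch, microns
    object_pitch_um    = 1.3;                        % microns of the focal plane seen by each sensel
    magnification      = sensel_pitch_um / object_pitch_um;   % roughly 4x with the reversed lens
    luma_pitch_um      = 2 * object_pitch_um;        % effective pitch after 2x subsampling
    nyquist_cyc_per_mm = 1000 / (2 * luma_pitch_um); % object-space Nyquist, cycles/mm
    fprintf('Magnification ~%.1fx, object-space Nyquist ~%.0f cycles/mm\n', ...
            magnification, nyquist_cyc_per_mm);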

Before I measure lines that my display renders, however, I figured it would be a good idea to get some baseline measurements of the MTF of my camera itself. In particular, I want to determine the optimal aperture - that sweet spot between diffraction and lens aberration - so that I can use that aperture when performing my measurements of the display itself.
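
The diffraction side of that trade-off is easy to sketch in Matlab. The snippet below plots the diffraction-limited MTF for a few marked apertures, using the common bellows-factor approximation N_eff = N*(1+m) for the working f-number (the magnification, the wavelength, and that approximation are all assumptions, and lens aberrations are not modelled at all, so this only shows one half of the sweet spot):

    lambda_mm = 550e-6;                  % green light, in mm
    m         = 4;                       % assumed magnification with the reversed lens
    N_marked  = [2.8 4 5.6 8];           % marked apertures to compare
    N_eff     = N_marked * (1 + m);      % rough working f-number (ignores pupil magnification)
    f         = linspace(0, 150, 500);   % spatial frequency at the sensor, cycles/mm
    figure; hold on;
    for k = 1:numel(N_eff)
        fc  = 1 / (lambda_mm * N_eff(k));               % diffraction cutoff, cycles/mm
        x   = min(f / fc, 1);
        mtf = (2/pi) * (acos(x) - x .* sqrt(1 - x.^2)); % diffraction MTF, circular pupil
        plot(f, mtf, 'DisplayName', sprintf('f/%g', N_marked(k)));
    end
    xlabel('cycles/mm at the sensor'); ylabel('MTF'); legend show;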

In order to achieve this, I bought a Ronchi ruling, which is a high-quality square-wave optical target, and have been taking measurements of this ruling.

See the image below that illustrates the setup. I have taped the ruling to the CRT surface, which allows me to use the CRT as a source of illumination (the ruling comprises alternating black and transparent lines, as can be seen in the live preview on the top left of the display).

[Image: test setup, with the Ronchi ruling taped to the CRT faceplate and the camera's live preview visible at the top left of the screen]


After doing a bit more reading, I came across the slanted edge technique for measuring the MTF. I didn't have a good grasp of the logic behind this technique until I came across Doug Kerr's excellent article, which was absolutely instrumental in my learning process (and is the reason I'm posting on this forum).

Using the technique described here for estimating the angle of edge orientation, and the methodology described here for determining the number of lines to supersample (also see footnote 5 in Kerr's article), I've been able to successfully implement code in Matlab that takes an image of a slanted edge, and outputs the MTF.
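
For anyone curious, the pipeline boils down to the steps below. This is a simplified sketch rather than my actual code, and the quarter-pixel bin width, Hann window, and input file name are just placeholder choices:

    img = double(imread('edge_crop.tif'));         % linear, single-channel crop of one near-vertical edge
    [rows, cols] = size(img);

    % 1. Estimate the edge position in each row from the centroid of the
    %    squared derivative, then fit a line to get the edge slope.
    d        = diff(img, 1, 2);                    % horizontal derivative
    xc       = 1:cols-1;
    edge_pos = (d.^2 * xc') ./ sum(d.^2, 2);       % per-row edge estimate
    p        = polyfit((1:rows)', edge_pos, 1);    % edge_pos ~ p(1)*row + p(2)

    % 2. Project every pixel onto the direction across the edge and build a
    %    4x supersampled edge spread function (ESF).
    [cc, rr] = meshgrid(1:cols, 1:rows);
    dist     = cc - (p(1)*rr + p(2));              % signed distance from the fitted edge, in columns
    bin      = round(dist * 4);                    % quarter-pixel bins
    esf      = accumarray(bin(:) - min(bin(:)) + 1, img(:), [], @mean);

    % 3. Differentiate to the line spread function, window, and take the FFT.
    lsf  = diff(esf);
    n    = numel(lsf);
    lsf  = lsf .* (0.5 - 0.5*cos(2*pi*(0:n-1)'/(n-1)));   % Hann taper against leakage
    mtf  = abs(fft(lsf));
    mtf  = mtf(1:floor(end/2)) / mtf(1);           % one-sided, normalized to DC
    freq = (0:numel(mtf)-1)' / n * 4;              % cycles/pixel (4x oversampled)

    plot(freq, mtf); xlim([0 1]); xlabel('cycles/pixel'); ylabel('MTF');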

However, something is bugging me:

Let's assume that the black stripes of the Ronchi ruling have zero transmittance and reflectance. Then, the actual variation in luminance across the axis of the edge spread function is going to start at 0 cd/m^2, and finish at the luminance of the "white" part of the grating. Given the finite bit depth of the analog-to-digital converter of the camera, much of the information about this ESF is going to be missing. Now I can effectively expand the bit depth by combining images with different exposures, but my gut tells me that this would be overkill for these purposes (I already have 14 bits of dynamic range to work with in a single exposure).
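
One way I've tried to get a feel for this is to quantize a synthetic edge at different bit depths and compare the resulting MTFs. In this kind of toy simulation (a Gaussian-blurred step with an arbitrary blur width), the coarser quantization mostly shows up as raised high-frequency hash rather than as a bias in the part of the curve that matters:

    xx  = linspace(-8, 8, 1024)';                   % supersampled position, pixels
    esf = 0.5 * (1 + erf(xx / (1.2 * sqrt(2))));    % ideal ESF: Gaussian-blurred step, sigma = 1.2 px
    for bits = [8 14]
        q   = round(esf * (2^bits - 1)) / (2^bits - 1);        % quantized ESF
        lsf = diff(q);
        n   = numel(lsf);
        lsf = lsf .* (0.5 - 0.5*cos(2*pi*(0:n-1)'/(n-1)));     % Hann taper
        mtf = abs(fft(lsf));
        semilogy(mtf(1:floor(end/2)) / mtf(1)); hold on;
    end
    legend('8-bit', '14-bit'); xlabel('frequency bin'); ylabel('MTF (log scale)');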

So my first question is this:

For the purposes of measuring the MTF of my camera, what kind of information, in the Fourier domain, would be gained by acquiring more finely quantized images? And, if a single exposure is sufficient, should I prioritize my exposure to capture the high end, or the low end, of this dynamic range?


My second question is this:

Once I move on to measuring the MTF of the display itself (which will be done by generating a simple test pattern on the CRT - a black patch bordering a white patch), can I use the same slanted edge technique? (I can achieve the slant by slightly tilting the camera itself)

I understand that the final measured MTF will be a combination of the camera MTF and the display MTF, but given that I'm more interested in making comparative measurements between different conditions, rather than actually measuring the absolute MTF of the display, will the technique suffice?
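
My reasoning is that since the MTFs of a cascaded (linear, incoherent) imaging chain multiply, the camera factor is common to every condition and cancels in ratios between conditions. A purely synthetic Matlab illustration of that (all three curves here are made-up Gaussians, just for the sake of argument):

    f          = linspace(0, 0.5, 100)';        % cycles/pixel
    mtf_camera = exp(-(f / 0.35).^2);           % hypothetical camera MTF
    mtf_disp_A = exp(-(f / 0.20).^2);           % hypothetical display, condition A
    mtf_disp_B = exp(-(f / 0.15).^2);           % hypothetical display, condition B
    meas_A     = mtf_disp_A .* mtf_camera;      % what the camera would record
    meas_B     = mtf_disp_B .* mtf_camera;
    ratio      = meas_A ./ meas_B;              % camera MTF cancels exactly
    plot(f, ratio, f, mtf_disp_A ./ mtf_disp_B, '--');   % the two curves coincide
    xlabel('cycles/pixel'); ylabel('MTF ratio (A/B)');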

Also, at the magnifications I'm working with, I'm assuming that the MTF of the lens is significant well below the noise level of the display.

I appreciate any guidance, this is a rather new area to me.
 

Doug Kerr

Well-known member
Hi, Marwan,

Welcome aboard.

I haven't thought about these matters for a while, so it will take a while for my old brain to get into gear.

With a little luck, you will also hear from member Bart van der Wolf, who is far more familiar with these matters than I.

And thanks for your very clear exposition of your project and what you have done so far.
After doing a bit more reading, I came across the slanted edge technique for measuring the MTF. I didn't have a good grasp of the logic behind this technique until I came across Doug Kerr's excellent article, which was absolutely instrumental in my learning process (and is the reason I'm posting on this forum).

I'm so glad that helped. I myself was absolutely baffled by this, and the published explanations weren't all that great, so I finally decided to "triangulate" among the available information until I "got it".

However, something is bugging me:
Let's assume that the black stripes of the Ronchi ruling have zero transmittance and reflectance. Then, the actual variation in luminance across the axis of the edge spread function is going to start at 0 cd/m^2, and finish at the luminance of the "white" part of the grating. Given the finite bit depth of the analog-to-digital converter of the camera, much of the information about this ESF is going to be missing.

Well, the resulting quantization means that your information will be "imperfect".

Now I can effectively expand the bit depth by combining images with different exposures, but my gut tells me that this would be overkill for these purposes (I already have 14 bits of dynamic range to work with in a single exposure).

So my first question is this:

For the purposes of measuring the MTF of my camera, what kind of information, in the Fourier domain, would be gained by acquiring more finely quantized images? And, if a single exposure is sufficient, should I prioritize my exposure to capture the high end, or the low end, of this dynamic range?

I really have no sense of that, never having actually done any of this!

My second question is this:

Once I move on to measuring the MTF of the display itself (which will be done by generating a simple test pattern on the CRT - a black patch bordering a white patch), can I use the same slanted edge technique? (I can achieve the slant by slightly tilting the camera itself)

I understand that the final measured MTF will be a combination of the camera MTF and the display MTF, but given that I'm more interested in making comparative measurements between different conditions, rather than actually measuring the absolute MTF of the display, will the technique suffice?

Again, I can't be at all sure, but that sounds reasonable.

Also, at the magnifications I'm working with, I'm assuming that the MTF of the lens is significant well below the noise level of the display.

I'm not sure I know exactly what you mean by that.

Sorry I can't be of better help to you just now. But keep me up to date as to how your measurements evolve.

Best regards,

Doug
 

Marwan Daar

New member
Thanks for the reply, Doug.

That last line wasn't worded very well. I meant something along the following lines:

Given that I'm working at such a high level of magnification, any perceptually meaningful* differences in the sharpness of my display, between various operating conditions, will exist at spatial frequencies that are well below the Nyquist limit of my camera. In other words, the contrast modulation of my lens, at perceptually meaningful spatial frequencies, may well be close to unity.
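
To put rough numbers on that (the luminance sampling pitch is from my earlier post; the aperture-grille pitch is just an assumed example):

    object_pitch_mm = 2.6e-3;                    % mm of CRT surface per luminance sample
    nyquist_cy_mm   = 1 / (2 * object_pitch_mm); % camera Nyquist on the CRT face, ~192 cycles/mm
    grille_pitch_mm = 0.23;                      % assumed aperture-grille pitch
    display_cy_mm   = 1 / grille_pitch_mm;       % ~4 cycles/mm, the scale of the finest display detail
    fprintf('Camera Nyquist on the CRT: %.0f cycles/mm vs display detail around %.0f cycles/mm\n', ...
            nyquist_cy_mm, display_cy_mm);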

I'll update this thread once I make progress in my measurements/techniques.

*perceptually meaningful in the context of normal display viewing conditions.
 