Topic: Effects of Sensor Size on Photogrammetric Measurement Accuracy (Read 11157 times)
Massi
Newbie
Posts: 2
In my project I want to optimize the accuracy of measuring dimensional features on small objects (roughly 10mm x 10mm) using a digital camera and a photogrammetry software package. When deciding which camera to buy, I relied on intuition and reasoned that, since I wanted high accuracy, I should go for the smallest sensor possible. Here's how I arrived at that conclusion: since the lens focuses the incident light toward a point, the smaller the sensor, the closer its position is to the lens's focal point (and thus, I assumed, the sharper the image).
Then I read Bob Atkins' article 'Size Matters,' relating sensor size to image quality, in which he concludes that a smaller sensor in fact results in lower MTF when reproducing the image. Basically, smaller sensors don't reproduce object details as well as larger sensors.
I had made my decision before finding Bob Atkins' article and bought the Canon EOS Rebel T1i with an 18-55mm EF-S lens. The attractive features were the relatively small sensor size (22.3mm x 14.9mm) and its high resolution (15.1 MP), which would give me very small pixels.
Even though I will not need to reproduce the images on paper (I will be working with them strictly on the computer), my concern now is that I made a mistake in looking for the smallest sensor size possible.
Any thoughts? It might help to say that my goal in this project is to achieve an accuracy of 0.05mm on an object of 10mm x 10mm.
Thank you for any feedback.
Massi
Bob Atkins
It depends on the magnification you are using.
If you fill the frame with a 10mm x 10mm object (which is about 1.5x magnification), then 10mm will be represented by approximately 3000 pixels (the whole frame is 4752 x 3168).
Since 10mm is represented by about 3000 pixels, 0.05mm will be represented by about 15 pixels, so in principle the sensor has sufficient resolution to measure distances with a precision of 0.05mm.
Of course this assumes that image distortion across the frame isn't an issue and that you have a good lens operating at an aperture that forms an image with a resolution higher than your required measurement accuracy.
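Bob's arithmetic can be checked in a few lines. This is just a back-of-the-envelope sketch using his round figure of ~3000 pixels across the 10mm object:

```python
# Back-of-the-envelope check of the precision estimate above,
# assuming the 10mm object fills the short side of the frame and
# spans roughly 3000 pixels (Bob's round figure for the Rebel T1i).
object_mm = 10.0
object_px = 3000              # pixels spanning the 10mm object
px_per_mm = object_px / object_mm
target_mm = 0.05              # required measurement precision
target_px = target_mm * px_per_mm
print(px_per_mm, target_px)   # 300.0 pixels/mm, 15.0 pixels
```

So a 0.05mm feature spans about 15 pixels, comfortably above a single pixel, which is why the sensor resolution itself is not the limiting factor here.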
I think your reasoning that smaller sensors are better is flawed.
Massi
Newbie
Posts: 2
Thank you, Bob, for your quick response.
Do you believe that, for the scope of my project, what is important is the pixel count and not the pixel size? If both are important, how would the pixel size, then, affect the precision of my measurements?
In other words, would I have been better off buying a camera with a larger sensor (assuming the same resolution)?
Again, I appreciate your help as I am getting acquainted with these topics.
Massi
KeithB
I think the EF-S 60 macro lens might be a good idea, too. You could have quite a bit of distortion with the 18-55, which may throw off your measurements.
The EF-S 60 is essentially distortion-free and will allow 1:1 reproduction. That is, at its closest focusing distance, about 4 inches, the image on the sensor will be the same size as the object you are photographing.
If you need more working distance, the 100 mm EF macro lens will about double that distance.
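As a rough sanity check on that working-distance claim, the thin-lens relation gives a subject-to-lens distance of f(1 + 1/m) at magnification m. Real macro lenses with internal focusing deviate from this model, so the numbers below are illustrative only, not spec-sheet values:

```python
# Thin-lens sketch: subject-to-lens distance at magnification m.
# Real macro lenses (internal focusing shortens the effective focal
# length near 1:1) deviate from this, so treat results as rough.
def subject_distance_mm(f_mm: float, m: float) -> float:
    return f_mm * (1.0 + 1.0 / m)

d60 = subject_distance_mm(60.0, 1.0)    # 120.0 mm at 1:1
d100 = subject_distance_mm(100.0, 1.0)  # 200.0 mm at 1:1
print(d100 / d60)                       # roughly 1.7x more room
```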
Bob Atkins
What you have is fine. No need for more expense and really no benefit unless you go to something like the EOS 5D MkII. As long as you fill the frame with the same amount of the object you are trying to measure, it's the pixel count that matters, not pixel size. I'd agree that an EF-S 60/2.8 macro would be a good lens to use. Sharp and low distortion.
KeithB
For the ultimate you might want to consider doing your own RAW processing. Because of the Bayer Matrix color filter, the camera takes *at least* 4 pixels and mushes them together into one.
There are custom CR2 RAW converters out there that will let you get at the raw sensor data. If color is not critical for your purposes, and your features show contrast no matter whether a pixel sits behind a red, green, or blue filter (i.e., what you are measuring has contrast in any color of light), you can get the ultimate in sharpness by using the raw sensor data directly.
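For illustration, here is one way the single-plane idea might look in Python, using a tiny synthetic RGGB mosaic in place of real CR2 data (which a raw converter such as dcraw, or the rawpy library, can expose as an array):

```python
import numpy as np

# Synthetic stand-in for raw Bayer sensor data. An RGGB pattern is
# assumed: R at (0,0), G at (0,1) and (1,0), B at (1,1). With a real
# CR2 you would obtain this array from a raw converter instead.
mosaic = np.arange(16, dtype=np.uint16).reshape(4, 4)

# Pull each color plane out by striding - no interpolation involved,
# so every sample is a genuine sensor value.
red   = mosaic[0::2, 0::2]
green = mosaic[0::2, 1::2]   # one of the two green planes
blue  = mosaic[1::2, 1::2]
print(red.shape)  # (2, 2): half the linear resolution, but unblurred
```

Each plane has half the linear resolution of the full mosaic, but none of its values have been mixed with neighboring pixels.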
Bob Atkins
I'm not sure I agree. In fact there is a 1:1 correspondence between the RAW pixels and the JPEG pixels when it comes to position, i.e., there is no change in resolution. What is interpolated is color, which is derived by looking at a number of "nearest neighbor" pixels. Since this application is just looking at position, RAW and JPEG should yield the same results.
KeithB
My reasoning goes like this. Let us say you have a line exactly one pixel wide running through a single row of pixels. After demosaicing, the pixels on either side of that line will have picked up information from the pixels containing the line in order to get a full complement of RGB information. So while there is a one-to-one correspondence in pixel positions, the actual color resolution is coarser than a single pixel - edges will be blurred.
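That edge-blur argument can be demonstrated with a toy one-dimensional demosaic. The assumptions here are mine: green samples exist only at odd columns, and missing samples are filled by averaging horizontal neighbors, a crude stand-in for a real demosaic algorithm:

```python
import numpy as np

w = 7
true_row = np.zeros(w)
true_row[3] = 100.0  # a bright line exactly one pixel wide

# Bayer-like sampling: pretend green is measured only at odd columns
measured = np.where(np.arange(w) % 2 == 1, true_row, np.nan)

# Fill the missing samples from their horizontal neighbors
demosaiced = np.empty(w)
for x in range(w):
    if np.isnan(measured[x]):
        left = measured[x - 1] if x > 0 else 0.0
        right = measured[x + 1] if x < w - 1 else 0.0
        demosaiced[x] = 0.5 * (left + right)
    else:
        demosaiced[x] = measured[x]

print(demosaiced)  # [0, 0, 50, 100, 50, 0, 0]: the line has spread
```

The one-pixel line leaks half its intensity into each neighboring column, so although the pixel grid is unchanged, the effective edge sharpness is worse than one pixel.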