Digital Cameras: What is true "Resolution"?

greenspun.com : LUSENET : Imaging Resource Discussion : One Thread

Here's a question that has tormented me to the brink of insanity for months now: My company just bought a digital camera (Toshiba PDR-M1). This camera is considered a "true megapixel, high resolution" camera, and is capable of 1280 x 1024 "resolution". It can also be switched to 640 x 480. Now, the only difference I can see between shooting at 1280 x 1024 and 640 x 480 is that one image is bigger than the other. When I bring either picture into my image editing software, they are both 72 dpi (ppi). Isn't resolution determined by how many pixels there are in a square inch? How can 1280 x 1024, 640 x 480, etc. be considered anything but a size measurement? My printer doesn't care one whit about that stuff; all it cares about is how many pixels to squeeze in. Am I insane???

-- Mike Henderson (mhenderson@matric.com), March 01, 1999

Answers

Good taste in cameras, I own one too. Resolution is basically a measure of how much information is present in a given volume or standard. In the case of cameras, the standard would be an image. If an image has 1280x1024 pixels it contains roughly four times as much information as an image composed of 640x480 pixels; therefore, we say it has a higher resolution. If you printed both pictures at the same size, the one with more pixels should reveal greater detail, provided it isn't printed at too high a ppi for your printer.
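The arithmetic behind that comparison can be sketched in a few lines (Python; this just multiplies out the pixel counts mentioned above, nothing camera-specific):

```python
# Pixel counts for the two capture modes discussed above.
hi_res = 1280 * 1024   # "true megapixel" mode
lo_res = 640 * 480     # VGA mode

print(hi_res)           # 1310720 pixels
print(lo_res)           # 307200 pixels
print(hi_res / lo_res)  # 4.266..., i.e. roughly four times the information
```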

Both images show up at 72 dpi in your image editor probably because the image editor is telling you that it is displaying the image on your monitor at 72 dpi. It probably also has a setting that controls the ppi or dpi the image would be printed at, or it skips this and simply lets you set the size at which you want the image printed.

To confuse matters further, printer dpi is not equivalent to image ppi. One pixel of an image can generally be any of 16.8 million colors, but a printer dot can usually only be one of five to eight colors: white, black, cyan, magenta, or yellow, plus possibly a second, lighter shade of magenta, yellow, or cyan. So from this you can see that trying to print an image at 720 ppi on a printer capable of 720 dpi won't work very well, because the printer needs several dots to represent the color of each pixel. Through trial and error I've found that printing 128 ppi images on my 720 dpi printer yields decent results in depicting a 1280x1024 image as a 10x8" photo. You have to play around to find the ppi that looks best on your printer, but I'd guess it would be somewhere between 1/3 and 1/6 of the printer's dpi rating. Too many pixels per inch and you don't get enough colors to properly depict the image. Too few pixels per inch and the image looks grainy. It's always something... :-)
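That 1/6-to-1/3 rule of thumb can be put into numbers (a sketch in Python; the function name and the exact band are just illustrations of the guess above, not anything from the printer's documentation):

```python
def ppi_range_for_printer(printer_dpi):
    """Rough band of image ppi likely to print well, per the 1/6-1/3 guess."""
    return printer_dpi / 6, printer_dpi / 3

# For a 720 dpi printer:
low, high = ppi_range_for_printer(720)
print(low, high)   # 120.0 240.0 -- note that 128 ppi falls inside this band
```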

It can be confusing stuff juggling terms like dpi, ppi, and image, printer, and display resolutions. Good luck.

Whoops. I almost forgot, "Yes. You're insane. Aren't we all?" :-)

-- Gerald Payne (gmp@francorp.francomm.com), March 01, 1999.


Thanks Gerald. Describing resolution as "how much information is present in a given volume or standard" is helpful. I can see how this works by changing the dpi (ppi) of a 1280 x 1024 image from, say, 72 dpi to 150 dpi. Printed at 72 dpi, the image is much larger than when printed at 150 dpi, proving that increasing the dpi simply compresses the given information into a smaller space, hence packing more information into a given volume. The most confusing aspect for me remains the difference between how the monitor & software render resolution (dimension, ppi, etc.) vs. how the printer does. (For instance, I can change the ppi in my software without it affecting the actual image size in pixels.)
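The size/ppi trade-off described above is really just division: printed size equals pixel count over pixels per inch. A quick sketch (Python; `print_size_inches` is a hypothetical helper name, not from any imaging package):

```python
def print_size_inches(width_px, height_px, ppi):
    """Printed dimensions in inches: pixel count divided by pixels-per-inch."""
    return width_px / ppi, height_px / ppi

# The same 1280x1024 pixels at two different ppi settings:
print(print_size_inches(1280, 1024, 72))   # about 17.8 x 14.2 inches
print(print_size_inches(1280, 1024, 150))  # about 8.5 x 6.8 inches
```

The pixel data never changes; only the mapping from pixels to paper does.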

Digital cameras are relatively new to me, and the approach is quite different from scanning, where you pick your pixels per inch and go from there, rather than picking your image size as a determining factor of resolution, as cameras seem to do. (Accurate description?) Kinda reminds me of the old aperture priority vs. shutter priority debate...

-- Mike Henderson (mhenderson@matric.com), March 02, 1999.


Yes, resolution is unquestionably one of the most confusing areas in digital imaging for most folks to deal with. To further confuse the issue, there's the matter of different aspect ratios for various cameras: Some cameras shoot a wider, shorter image, while others are narrower and taller. The ultimate measurement of resolution is "spatial frequency", or the maximum number of pairs of white/black bars spread across the image area that you can visually distinguish. This is the basis of the resolution test chart we use in our testing, the ISO 12233 chart. It addresses the issue of aspect ratio by arbitrarily fixing on the picture height, and specifying spatial frequency using the (somewhat arcane) metric of "line pairs per picture height". The standard provides for an absolute way of measuring the spatial frequency response of the camera, but it's a terribly involved process, taking way more time and effort than we can afford to devote. We use the much simpler "visual" resolution, which simply asks "where does your eye think the resolution limit is?" The problem with this, of course, is that it leaves a lot up to interpretation, as the eye is very good at finding lines, even where they don't *really* exist...
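One way to put a hard ceiling on a "line pairs per picture height" figure: a sensor can't record more line pairs than half its vertical pixel count, since each pair needs at least one light and one dark pixel (the Nyquist limit). A sketch, assuming square pixels and the 1024-pixel height of the camera discussed above (real cameras resolve well below this bound):

```python
def nyquist_lp_ph(picture_height_px):
    """Upper bound on line pairs per picture height: one pair needs 2 pixels."""
    return picture_height_px // 2

print(nyquist_lp_ph(1024))  # 512 lp/ph at the theoretical best
```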

-- Dave Etchells (hotnews@imaging-resource.com), March 05, 1999.
