Resolution---What am I missing?

greenspun.com : LUSENET : Imaging Resource Discussion : One Thread

Looking into buying a digital camera and I have been reading all I can to get informed. After reading the FAQ on Imaging-Resource, I downloaded the 'musicians' sample images, shot at different resolutions and then enlarged to 1600 x 1200 (see HOW MANY PIXELS). I loaded each of the images to my monitor and also printed them.

I then asked 3 people to view the various images on my 17" monitor (original shots at 800 x 600 and enlarged to 1600 x 1200, compared with the original shot at and displayed at 1600 x 1200). Guess what: 2 of the 3 could see no difference, and the third thought the lower-res image was better. Similar results when I showed them the printed images from my 1200dpi inkjet.

Sooo, how come "more" is not "better"? Do all 4 of us have bad eyes? Is something wrong with my setup/equipment? Is this one of those cases where the only people able to discern differences are the owners of the high-end equipment? I was able to detect subtle differences when I enlarged (zoomed in on) the images 3x with my image editing software, but I would not consider that practical for 95% of what I would be using a camera for.

If it makes a difference, I was using a MAG 17" 26dpi monitor at a resolution of 1024 x 768. The printer was a Lexmark Z51 1200dpi printed at highest quality and the image editor was PaintShop Pro 6. Thanks in advance for any words of wisdom.

-- Kirk Andrews (kandrews@mtnhome.com), March 20, 2000

Answers

Well, you said yourself you were showing a bunch of 2-million-pixel images on a monitor that displays three-quarters of a million pixels, and you wonder why the 2-million-pixel images didn't look any better than the half-million-pixel ones. If you had viewed all the images on a 4-inch television set, probably all digital camera images would look the same.

The thing everybody is interested in is the paper prints from digicams looking pretty much as good as prints you get from the drugstore from a 35mm camera. And that's where we all see the difference in the digicam results, on the paper prints, not when we look at things on low-resolution (i.e. relatively fuzzy) displays.

-- Russell Bozian (finaldesign@hotmail.com), March 20, 2000.


Computer monitors, whatever their quality, are only about 72 dpi (dots per inch). Standard magazine-quality photo prints are produced at 300 dpi. The higher the resolution of a digicam, the higher the dpi one can achieve in the printed output for a given print size.

Since a digital camera's output resolution is fixed, as one prints at various sizes the printed density (dpi) is varied by the printing software, resulting in varying levels of quality.

The math is as easy as one might expect: to print an 8x10" at 300 dpi, just multiply - you need a computer image of 2400x3000 pixels = 7.2MP.

Keep in mind that many argue only 180 dpi is needed to achieve "photo quality" output (i.e. you don't necessarily need 300 dpi for prints to look good). So depending on your desired print sizes, you may find a 1MP or 2MP camera suits your needs. If you intend only to post images online (72dpi), then your needs are even easier to satisfy.
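A quick sketch of the arithmetic above. The 300 dpi "magazine quality" and 180 dpi "photo quality" figures are the ones quoted in this post, not hard standards:

```python
# Pixels needed to print a given paper size at a given dpi.
# Multiply each paper dimension (inches) by the target dpi.

def pixels_needed(width_in, height_in, dpi):
    """Return (pixel_width, pixel_height, megapixels) for a print."""
    w = int(width_in * dpi)
    h = int(height_in * dpi)
    return w, h, w * h / 1_000_000

print(pixels_needed(8, 10, 300))   # 8x10" at magazine quality -> 7.2MP
print(pixels_needed(8, 10, 180))   # 8x10" at "photo quality"
print(pixels_needed(4, 6, 180))    # typical snapshot size
```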

-- Mark P (digismurf@yahoo.com), March 21, 2000.


Sorry to be pedantic, but may I correct the obvious misconceptions about monitor resolution displayed (if you'll pardon the pun) in the above posts.

Monitors are not all 72 dpi; otherwise, why would anyone ever buy a 17" or 19" monitor with a high-resolution screen? They usually have a dot pitch of between 0.28 and 0.25mm, which works out to about 90 to 102 dots per inch. Those figures don't really indicate, though, how much better the image on a 0.25mm dot-pitch tube looks than that on a bog-standard 0.28mm tube; the difference is like chalk and cheese.
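The dot-pitch figures above convert to dpi straightforwardly (1 inch = 25.4mm), though this is a rough sketch that ignores the horizontal/vertical asymmetry of real shadow masks:

```python
# Convert a CRT dot pitch (in mm) to approximate dots per inch.

def pitch_to_dpi(pitch_mm):
    return 25.4 / pitch_mm

print(round(pitch_to_dpi(0.28)))  # ~91 dpi  (bog-standard tube)
print(round(pitch_to_dpi(0.25)))  # ~102 dpi (high-resolution tube)
```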

The 72dpi is just a web "standard" that keeps images about the right size when they're printed from a web-page.

When comparing pictures downloaded from the web, you've also got to take the degree of JPEG compression into account. Images taken from a consumer digital camera have already gone through one lossy JPEG process. If they are then re-sized and saved in JPEG format again, they aren't going to look as good as the original.

-- Pete Andrews (p.l.andrews@bham.ac.uk), March 21, 2000.


When I started this thread I realized I didn't understand all I knew about resolution. The above posts have helped but I still have a couple questions. My purpose in asking is to keep from spending more for resolution than necessary, given how I think I will use the images.

Monitor---if my monitor resolution is 1024 x 768 (view area 12.5 x 9.5 inches), doesn't it follow that the image created by the camera must be at least that many pixels in order to take advantage of my monitor's capability? If so, then why don't I see a difference in an image that was created at 800 x 600, but "blown up" to 1600 x 1200 vs an image that was created at 1600 x 1200? If my monitor was only capable of 800 x 600 I could understand why I couldn't see a difference at any image size above that. Maybe I just didn't understand the FAQ examples and they are "adding" pixels through interpolation when they resized to 1600 x 1200 and not just making the image larger with the same 480,000 pixels.
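On the interpolation point: a resized image only re-uses the pixels it already has, which a toy 1-D example (hypothetical values, nearest-neighbour resampling; real editors usually interpolate more smoothly, but add no more detail) can illustrate:

```python
# Enlarging an image adds pixels but no information: a 1-D "row"
# doubled by nearest-neighbour resampling just repeats each pixel.

def upsample_2x(row):
    """Nearest-neighbour: each original pixel is simply repeated."""
    out = []
    for p in row:
        out.extend([p, p])
    return out

row = [10, 50, 200, 90]    # four "pixels"
big = upsample_2x(row)
print(big)                 # [10, 10, 50, 50, 200, 200, 90, 90]
# Every other sample can be thrown away to recover the original:
print(big[::2] == row)     # True
```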

Printer---if it only requires 180dpi for 'photo quality', is there any real advantage to buying a 600 or 1200dpi printer as far as resolution goes?

-- Kirk Andrews (kandrews@mtnhome.com), March 21, 2000.


Just to throw another discussion point into this thread :)

Looking back at your original post (the comparison of the separate images), another factor that would influence the very subjective view of which camera/image is best is colour accuracy. In my experience, poor colour reproduction or poor colour saturation can be just as off-putting as low resolution in an image.

This is where a higher resolution printer also helps. A printer can only reproduce 3 (or sometimes 5) colours, normally Cyan, Magenta and Yellow, supplemented by black (CMYK, where K represents black). A greater printer resolution helps in the blending of these basic 3 (or 5) inks to achieve more realistic colour in a final print.

A rule of thumb that is often used to balance an image's size against print quality is to have an image at 1/3 of the dpi of the output (although this does tend to top out between 250-300 dpi, depending on the printer).
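Martin's rule of thumb above can be sketched as follows (the 300 dpi ceiling is the upper end of the 250-300 range he quotes):

```python
# Rule of thumb: send the printer an image at roughly 1/3 of its
# dpi, capped at ~300 dpi beyond which little is gained.

def target_image_dpi(printer_dpi, ceiling=300):
    return min(printer_dpi / 3, ceiling)

print(target_image_dpi(720))    # 240.0
print(target_image_dpi(1200))   # 300 (capped)
```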

good luck

Martin

-- Martin Ellis (inca@globalnet.co.uk), March 21, 2000.



Kirk: To try to answer your printer resolution question.

A pixel on the computer is composed of 3 colours: Red, Green, and Blue. In order to print each one of those pixels, 3 coloured inks must be used: Cyan, Yellow, and Magenta (supplemented by black where necessary). Printer resolution, given as dpi, means the maximum number of ink dots that can be printed per inch. So either a cyan, yellow, magenta or black dot can be printed in each position on the paper. This effectively divides the printer's dpi resolution by 3, or multiplies the pixel resolution by 3, depending on how you look at it.

A printer resolution of 1200 dpi will therefore only need 400 dpi worth of pixels to supply it with all the information it needs.

I'll forestall another question by explaining that the amount of Cyan ink is the complement of the Red value in any given pixel, the Yellow the complement of the Blue, and the Magenta the complement of the Green. Black ink is used whenever the shading in the picture falls below a certain threshold.
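A minimal sketch of the RGB-to-ink relationship Pete describes, with channel values on the usual 0-255 scale; real printer drivers layer black generation and colour correction on top of this:

```python
# Each ink amount is the complement of the matching RGB channel:
# Cyan <- Red, Magenta <- Green, Yellow <- Blue.

def rgb_to_cmy(r, g, b):
    return 255 - r, 255 - g, 255 - b

print(rgb_to_cmy(255, 255, 255))  # white: (0, 0, 0), no ink on paper
print(rgb_to_cmy(255, 0, 0))      # pure red: (0, 255, 255), M + Y ink
```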

BTW good quality magazine reproduction uses only 133 dots per inch, but that's a different system and another story.

-- Pete Andrews (p.l.andrews@bham.ac.uk), March 21, 2000.


To answer your original question of why a 1600x1200 image looks as crappy as an 800x600 image: your video card is probably not set to show "millions of colors"; it is probably set at 256 colors and dithering the rest. To display millions of colors on a monitor set at 1600x1200, I think you would need a video card with at least 8MB of video RAM just for 2D. I don't need to explain why millions of colors squeezed down to 256 looks like crap, do I?
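The video RAM figure is just framebuffer arithmetic: width x height x bytes per pixel. A sketch, assuming "millions of colors" is 24-bit colour stored in 32 bits per pixel, as most cards do:

```python
# VRAM needed for a 2-D framebuffer at a given mode.

def vram_mb(width, height, bits_per_pixel=32):
    return width * height * bits_per_pixel / 8 / (1024 * 1024)

print(round(vram_mb(1024, 768), 1))   # 3.0 MB -> a 4MB card is enough
print(round(vram_mb(1600, 1200), 1))  # 7.3 MB -> needs an 8MB card
```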

Just to clarify the difference between a printed magazine page, your inkjet printer, and your monitor display:

1. A printed magazine page usually uses four colours (CMYK) at different angles and uses "screens" with a frequency called lines per inch (LPI). The LPI and DPI work together to create the image: the dots on an LPI screen change in size from a zero screen to 100%, and a 100% screen cell could be made up of 19 dots square on film on a 2540dpi imagesetter. Use a loupe or a magnifying glass to see for yourself.

2. Your non-PostScript inkjet printer, though it may also use CMYK inks, doesn't use a line screen; instead it uses some sort of dithering, like diffusion or pattern screening. So if you put a Lexmark photo printed at 1200dpi next to a magazine printed at 150LPI or better (which means it was printed on a 2540dpi imagesetter), you'd see a big difference.

3. Since both of the above use 1 bit per channel (it's either dot or no dot), they look pretty coarse compared to a monitor set at "millions of colors", where each pixel can represent a continuous range of 0 to 255 in each of three channels (RGB). So a 1024x768 RGB image viewed on a (0.26mm dot-pitch, not dpi) monitor will look a lot better than the same image printed on your 1200dpi Lexmark. More information wins: 8 bits per channel vs. 1 bit per channel.
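The LPI/DPI interplay Bert describes has a well-known consequence worth sketching: each halftone screen cell is a square of printer dots, so the number of distinct tones a cell can show is (dpi / lpi)^2 + 1. The 2540dpi and 150/133LPI figures are the ones from the post above:

```python
# Halftone tones per screen cell on an imagesetter.

def halftone_levels(printer_dpi, screen_lpi):
    cell = printer_dpi // screen_lpi          # dots per cell side
    return cell, cell * cell + 1

print(halftone_levels(2540, 150))  # (16, 257) tones per cell
print(halftone_levels(2540, 133))  # (19, 362) -- the "19 dots square"
```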

-- Bert C. (bert@longlivethemac.com), March 28, 2000.


Oops! Your card is set to 1024x768, not 1600x1200. Check your card settings to make sure your monitor is showing "millions of colors". This is possible if your card has at least 4MB of VRAM. On a PC, right-click your desktop and select Properties to set it.

-- Bert C. (bert@longlivethemac.com), March 28, 2000.
