Optimum print size from fixed image resolution

greenspun.com : LUSENET : Imaging Resource Discussion : One Thread

Ref: "The Digital Darkroom" I found the article very informative, but the logic of the formula for calculating image size escapes me. From the method suggested, it would seem that the optimum print size from a fixed image resolution decreases as printer resolution and the corresponding output resolution increase. In the example given, the optimum print size for a 1152x872 image on a printer with a recommended output resolution of 150 ppi is approx. 7x5. On, say, a 1400 dpi printer with a recommended output resolution of 250 ppi, the optimum print size reduces to approx. 4.5x3.5 when 1152 and 872 are divided by 250.
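The arithmetic being questioned here is just pixels divided by output resolution. A quick sketch (my own illustration, not from the article) that reproduces the numbers above:

```python
# Print size in inches = pixel dimension / output resolution in PPI.
# The dimensions and PPI values below are the ones from the question.

def print_size(width_px, height_px, output_ppi):
    """Return (width_in, height_in) for an image printed at a given PPI."""
    return width_px / output_ppi, height_px / output_ppi

# 1152x872 image at 150 PPI -> roughly 7.7 x 5.8 inches ("approx. 7x5")
w, h = print_size(1152, 872, 150)
print(round(w, 1), round(h, 1))

# The same image at 250 PPI -> roughly 4.6 x 3.5 inches
w, h = print_size(1152, 872, 250)
print(round(w, 1), round(h, 1))
```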

-- Paul Manuel (paulmanuel@hunterlink.net.au), May 19, 1999

Answers

You are correct, Sir! :-)

What I mean to say is that I think you have the concepts down as I understand them, but perhaps you're not seeing the underlying principles. Odd as it may seem, as printer output resolution increases, it takes a higher-resolution image file to maintain the same size output. The reason is that it takes a certain number of printed Dots to represent each Pixel and yield a full range of color possibilities, since each Dot is only one primary shade: typically white, black, cyan, magenta, or yellow. (OK, the white isn't usually printed; it's just the absence of ink.)
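To make the Dots-per-Pixel idea concrete, here's a small sketch (my own numbers for illustration, not from the post): the linear ratio of printer DPI to output PPI tells you how many Dots span each Pixel, and squaring it gives the size of the dither cell available to mix that Pixel's color.

```python
# How many printer Dots are available to dither each image Pixel.

def dots_per_pixel(printer_dpi, output_ppi):
    linear = printer_dpi / output_ppi  # Dots across one Pixel
    return linear * linear             # Dots in the Pixel's area

# A 1440 DPI printer at 240 PPI output: 6 Dots across each Pixel,
# i.e. a 6x6 = 36-Dot cell to dither each Pixel's color.
print(dots_per_pixel(1440, 240))
```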

Trying to produce too large a print from a given resolution image:

If you use too many printed Dots to represent each Pixel of an image, the Pixels become too large and the image starts to appear blocky, pixelated, or grainy (pick the one you dislike). The way you combat this is to capture a higher-resolution image. With a scanner this is typically a no-brainer; with a digicam the native sensor resolution becomes very important. As printing technology produces finer and finer output resolutions, we'll need cameras that produce higher-resolution output to produce larger images with good detail.

This is only true to a certain extent, though. Once printers get to the point where the individual printer Dots are small enough to combine to print a Pixel that subtly blends with the Dots of the Pixel next to it, then the technology will have advanced far enough -as long as you reasonably limit the maximum output size. We're probably close to that now with printers capable of 1440 DPI. Once the printer can print Dots so fine we can't see them, it can use them to form Pixels so well integrated with the next Pixel that the line between one Pixel and the next becomes ambiguous. Voila, you now have a continuous or apparently continuous tone image that looks like it came out of a dye-sublimation or better technology printer.

Trying to produce too small a print from a given resolution image:

If you use too few printed Dots to represent each Pixel of an image, the Pixels become too small and the printer can't accurately reproduce the color of each Pixel, because it doesn't have room to print enough Dots to dither together to get the shade it needs. You avoid this one by printing the image at a lower PPI so the printer can print enough Dots, but you get a larger image. You can also beat this one by using a printer capable of higher-resolution output or by down-sampling the image to fewer Pixels. Of course, down-sampling probably means you'll lose detail. Yech!
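The "too small" failure case can be sketched the same way (again my own illustrative numbers): squeeze the output PPI too close to the printer's DPI and the dither cell per Pixel collapses.

```python
# Dots available per Pixel shrink as you push output PPI toward printer DPI.

def dots_per_pixel(printer_dpi, output_ppi):
    linear = printer_dpi / output_ppi  # Dots across one Pixel
    return linear * linear             # Dots in the Pixel's area

# 720 DPI printer forced to 360 PPI output: only a 2x2 = 4-Dot cell
# per Pixel, far too few shades to dither a full color range.
print(dots_per_pixel(720, 360))

# Dropping to 120 PPI output restores a 6x6 = 36-Dot cell,
# at the cost of a physically larger print.
print(dots_per_pixel(720, 120))
```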

I hope that makes some sense. It comes down to the idea that once you have a printing technology with a really high resolution you have a lot more flexibility, but you'll probably need higher-resolution images to take full advantage of it and print correspondingly larger images that show a good amount of detail. Does that work for you?

-- Gerald M. Payne (gmp@francorp.francomm.com), May 19, 1999.
