We often hear that we should scan at 72 dpi for the video screen, as if it were some kind of magic number. It's not. I know a few respected sources tell us this, and I can't defend them, but it's very much less than the whole story. I believe an early Apple model was once described as a 72 dpi monitor, back before screen dimensions were adjustable, and we seem to be stuck with the number now, without any other evidence for it. There is nothing useful we can do with that number. For one thing, monitors vary today.
A 15 inch monitor screen might measure 10.6 inches horizontally (the actual bright image area). If it is currently set to 800x600 pixels screen size, then the image is obviously 800 dots / 10.6 inches = 75 dpi apparent resolution in that case.
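That division is the whole calculation. As a quick sketch (Python here just as a calculator; the 10.6 inch width is the measured example above):

```python
def apparent_dpi(pixels_wide, inches_wide):
    """Apparent screen resolution: pixels shown across the measured image width."""
    return pixels_wide / inches_wide

# The 15 inch monitor example: 800 pixels across a 10.6 inch bright image area
print(round(apparent_dpi(800, 10.6)))  # about 75 dpi
```

Change either the pixel setting or the physical width and the apparent dpi changes with it.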
Reset the same monitor to a different screen size, and we get a different number. Monitors vary, and even the horizontal size control changes it. I have made some assumptions about screen widths in inches, and computed this for different monitor sizes:
APPARENT resolution of different size monitor screens
(column headings are the assumed visible image widths in inches)

| Screen size (pixels) | 9.7 in | 10.6 in | 12.5 in | 14.4 in | 16.0 in |
|---|---|---|---|---|---|
| 640 x 480 | 66 dpi | 60 dpi | 51 dpi | 44 dpi | 40 dpi |
| 800 x 600 | 82 dpi | 75 dpi | 64 dpi | 56 dpi | 50 dpi |
| 1024 x 768 | 106 dpi | 97 dpi | 82 dpi | 71 dpi | 64 dpi |
| 1152 x 864 | 119 dpi | 109 dpi | 92 dpi | 80 dpi | 72 dpi |
| 1280 x 1024 | 132 dpi | 121 dpi | 102 dpi | 89 dpi | 80 dpi |
| 1600 x 1200 | 165 dpi | 151 dpi | 128 dpi | 111 dpi | 101 dpi |
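The table values come from the same pixels-per-inch division. A short sketch that reproduces them, where the widths in inches are my assumed values implied by the table (not manufacturer figures), and rounding may differ by a dpi here and there:

```python
# Assumed visible image widths in inches (estimates, not manufacturer specs)
widths = [9.7, 10.6, 12.5, 14.4, 16.0]
screen_widths = [640, 800, 1024, 1152, 1280, 1600]

# Apparent dpi = horizontal pixels / image width in inches
table = {px: [round(px / w) for w in widths] for px in screen_widths}

for px, row in table.items():
    print(px, [f"{dpi} dpi" for dpi in row])
```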
The number in the first column (like 800x600 pixels) is what is called the resolution of a video system. This shows why the larger screen sizes like 1024x768 pixels are called "high resolution": more pixels in the same area means smaller pixels. We need a combination producing about 70 dpi for the screen to look decent, and 80 dpi is better (opinion). When trying 1024x768 pixel or larger screens for better graphics, don't forget the "Large Fonts" selection on that same Windows video settings screen. There is no reason to abandon the better graphics size just because the fonts are too small.
72 dpi is not a very common result. An 800x600 pixel screen would have to be 11.1 inches wide to show 72 dpi, but the largest 15 inch I have seen is about 10.8 inches wide. A 14 inch 640x480 pixel screen that is only 8.9 inches wide would do it, but it would have to have ½ inch of black border on both sides. Monitors were like that ten years ago, the image didn't fill the glass area.
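Those widths are easy to check: since dpi is pixels per inch, the image width that would show exactly 72 dpi is simply pixels divided by 72. A quick sketch:

```python
def width_for_dpi(pixels_wide, dpi):
    # Image width in inches needed to show exactly this apparent dpi
    return pixels_wide / dpi

print(round(width_for_dpi(800, 72), 1))  # 11.1 inches for an 800 pixel row
print(round(width_for_dpi(640, 72), 1))  # 8.9 inches for a 640 pixel row
```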
The 'generic' 72 dpi can be thought of as a ballpark number that will more or less display an image at roughly its actual size in inches on a generic monitor. But that is not specifically true in any actual case. It is often within perhaps plus or minus 20% in common situations, but real size in inches is a foreign concept on a monitor anyway, because monitors vary so much. Even if 72 dpi were an accurate number, the only advantage of scanning at that resolution would be that the image would display at about the same size in inches as the original.
Is that important?
And on whose monitor anyway?
Why, mine, obviously! <grin>
Put that image on a web page or send it via email, and you have no clue what kind of monitor may view it. But you can assume it will vary over a wide range, and it won't be 72 dpi.
Some sources tell us "A computer screen displays images at 72 or 75 dpi."
They don't mean the image should be 72 dpi (unfortunately, some of them do). Video monitor specifications have no associated dpi number, and screens will simply show higher resolution scans as larger images (chapter 5). For example, there is no way you can fill an 800x600 pixel screen from a 6x4 inch photo scanned at 72 dpi. That requires about 150 dpi. You will only get 432x288 pixels if scanning 6x4 inches at 72 dpi, and it will fill only about ¼ of an 800x600 pixel screen. This is trivial to verify, but those who still parrot the false 72 dpi claim must never have bothered to look at the results they get.
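The claim is trivial to check with arithmetic, since a scanner creates (inches × dpi) pixels in each dimension. A small sketch of that check:

```python
def scan_pixels(inches_w, inches_h, dpi):
    # A scanner creates (inches * dpi) pixels in each dimension
    return inches_w * dpi, inches_h * dpi

w, h = scan_pixels(6, 4, 72)
print(w, h)                       # 432 288
coverage = (w * h) / (800 * 600)  # fraction of an 800x600 screen covered
print(round(coverage, 2))         # about 0.26, roughly one quarter
```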
What these sources mean is that the 72 dpi image will appear on screen at approximately actual or original size. But this is only approximate, because actual size is simply not a valid concept on a video screen. Sure, it might be desirable in a few cases, and we want lots of things we can't have, but accurate actual size on video is impossible, because monitors vary. Using 65 to 90 dpi will be close to actual size in general, but it will not be repeatable on other monitors. If you do want the image size that 72 dpi gives, it is a much better choice to scan at 75 dpi instead; see chapter 9 about integer divisors.
I don't know why we would want a 6x4 inch photo to appear exactly 6x4 inches on our screen; the original paper size no longer seems important. But even if we calculated the actual size precisely for our own monitor, the image could not likely repeat that feat on a different monitor. It would be impossible at two screen sizes, like 640x480 vs. 800x600 pixels, even on the same monitor. Accurate size in inches is simply not a consideration for video screens; it cannot be done. Inches don't count in video. Only pixels count on the screen.
We do need to care about the accurate size of printed images in inches. But that's really only because the paper size is measured in inches, and it allows us to know how much of the paper area will be filled. But a video screen is measured in pixels, and the video system only knows how large the image is in pixels. Knowing the image is 600x400 pixels tells us a lot about how it will fit on a 640x480 pixel screen. That's like knowing an image is 8x10 inches on 8.5x11 inch paper. But to complicate things, not all screens are 640x480 pixels, same as all paper is not 8.5x11 inches.
The point here is that monitor size varies, and they are obviously not 72 dpi. Screens work only with pixels, there are no inches and there are no dpi on the screen video system.
It is very common to hear the advice: "Monitors can only show 72 dpi, so scan all your web images at 72 dpi".
For sure, don't believe that. It is the world's worst imaging advice. You only have to test it once to know it's wrong. On any one monitor, we can easily see that there is a tremendous difference between viewing the same photo scanned at 72 dpi and at, say, 200 dpi. The 200 dpi image is much larger on the screen, nearly 3 times larger, and it contains correspondingly greater detail. And our so-called 72 dpi monitors will certainly show all of that detail, because the screen simply presents every image pixel, one by one. We are obviously NOT limited to 72 dpi, and 72 dpi is not a valid concept. The 200 dpi image is simply larger on the screen, and this was probably our goal, our design for the image. Or perhaps it is too large, so that it may not fit our screen size without scrolling, but all the pixels are there. No pixels are discarded to limit the image to some magic 72 dpi resolution. 72 dpi has no meaning in that context. It is not a limit, it is not real; this fictitious 72 dpi property simply does not exist.
If it were said that a certain printer has a 300 dpi resolution limit, or that scanning color prints beyond 300 dpi won't show more detail, then those limits are real. These limits can be seen, because using more resolution will not increase the detail that is visible. But video screens don't work that way. Since a video screen obviously benefits (with increased size and greater detail) from images scanned at more than 72 dpi, there is obviously no reason to arbitrarily limit our scans to 72 dpi.
We should instead scan at whatever resolution it takes to create the image size we desire.
Video monitors have no associated dpi number, instead they simply show pixels, and they will therefore show higher resolution scans as larger images. For example, if scanning 6x4 inch prints to be full screen on a 1024x768 pixel monitor, you probably want 200 dpi to allow cropping it a little. There is no way you can create that size of image scanning 6x4 inches at 72 dpi.
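Working backwards gives the needed scan resolution: target pixels divided by original inches. A quick sketch using the 6x4 inch print and 1024x768 pixel screen from above:

```python
def required_dpi(target_pixels, original_inches):
    # Scan resolution needed for the scan to reach the target pixel count
    return target_pixels / original_inches

# Filling a 1024x768 pixel screen from a 6x4 inch print:
print(round(required_dpi(1024, 6)))  # 171 dpi for the width
print(round(required_dpi(768, 4)))   # 192 dpi for the height
# so roughly 200 dpi, with a little left over for cropping
```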
My advice is to forget 72 dpi. It won't help you get better results, and it is counterproductive to understanding how things really work. I can't tell you what 72 dpi means or how to use it, because it's not real and it has no use. It is never mentioned in a monitor's specifications. Numbers like 640x480 and 800x600 pixels are always mentioned however, and those are what is important.
I really should stop here, but you could, I suppose, specify 72 dpi scan resolution and then declare a scaling factor like 200% to get the image size you really wanted, like 800x600 pixels. And scaling would certainly work; you would definitely get the image size you wanted. One might even be fooled into thinking the 72 dpi was significant, as if it were a printer. But for video, the 72 dpi is simply wasted effort, an extra confusion factor, since resolution dpi is not used for displaying video images. On the screen, you would get exactly the same results using 172 or 272 dpi, or any value at all, assuming you also scale to the same final image size like 800x600 pixels. See Scaling in Chapter 6, but scaling is only for printing. Video screens and printers are very different from each other.
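The scaling arithmetic makes the point: scaling at scan time just multiplies the effective resolution, so the final pixel count is inches × dpi × scale. Any dpi setting that lands on the same final pixel size gives the identical screen image. A sketch (the 6 inch width is just an example value):

```python
def scanned_pixels(inches, dpi, scale_percent):
    # Scaling at scan time just multiplies the effective resolution
    return round(inches * dpi * scale_percent / 100)

# Different dpi settings, scaled to land on the same final pixel width,
# give identical screen results -- the dpi number itself never mattered
print(scanned_pixels(6, 72, 200))   # 864 pixels
print(scanned_pixels(6, 144, 100))  # 864 pixels, the same image
```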
Nevertheless, some do scale their images to 72 dpi for the screen. You can too if you like; it won't hurt anything, but it is pointless, as it will be ignored and will have no effect at all until you print the image. And even if you did want to print it, 72 dpi is pretty coarse on a printer.
So forget about 72 dpi. What works for video is to know we have a 640x480 or 800x600 or 1024x768 pixel screen size, and to know how our 500x400 pixel image fits there, how much room it takes and how much room is left. It especially helps to know how our 500x400 pixel image will be seen on the generic web visitor's 640x480 pixel monitor.
Continued, the proof follows: