We still frequently hear the very bad advice: "Computer video screens show images at 72 dpi, so scan all your images for the screen at 72 dpi". This is incredibly wrong; it simply doesn't work that way.
Regardless of what you may have heard almost everywhere, there is no purpose or reason for 72 dpi images on computer video screens or web pages. As a concept, 72 dpi is simply a false notion. It is a myth. It is NOT how video systems work, and it is detrimental to understanding how things really do work. It is very easy to show proof of this here.
If you want to know why, you're in the right place.
This section is written unusually, as an argument, because there are some people who are, let's say, extremely certain that 72 dpi (or 96 dpi) is surely somehow quite important for images on the computer video screen. They may have no clue how or why; they have just always heard it was important. Sometimes they want to argue this anyway, so this is written for them, and I hope it may give them cause to simply look at what actually happens on their own screen before they email me. Other people have also heard how 72 dpi was supposedly important, but they can't make it work; it does not produce the desired results (which is in fact the correct answer about the validity of 72 dpi). This is also written to be useful for them, but the diatribe may be a bit thick. <grin>
When I say video, I simply mean video systems which operate the computer monitor. I am not referring to movies or animation.
All the digital image basics fully apply to images from scanners or from digital cameras. The only difference is the method of creating the images. Once created, it is all just pixels, and printing and video are still exactly the same ideas for both.
Each of the three images below is 412x324 pixels in size, which is why the video system shows each of them at 412x324 pixels on the screen.
To make the point that the image resolution dpi number (the scaled printing resolution stored in the image file) simply does not matter on the screen, these three images below are now scaled individually to 7 dpi, 72 dpi, and 720 dpi. If you print them (using a photo editor), they are now three very different images in that regard. These image files actually do individually contain these three different dpi numbers, which tells the printer how to print them (how to space the pixels on paper, and therefore how large to print the image on paper).
412 x 324 pixels, 7 dpi, prints 58 x 46 inches
412 x 324 pixels, 72 dpi, prints 5.7 x 4.5 inches
412 x 324 pixels, 720 dpi, prints 0.57 x 0.45 inches
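The arithmetic behind those three printed sizes is just the pixel dimensions divided by the dpi number stored in the file. A minimal sketch in Python (the function name is mine, just for illustration):

```python
# Printed size in inches = pixel dimensions / dpi (pixels per inch).
# The pixel dimensions never change; only the stored dpi number does.
def print_size(width_px, height_px, dpi):
    return (width_px / dpi, height_px / dpi)

for dpi in (7, 72, 720):
    w, h = print_size(412, 324, dpi)
    print(f"412 x 324 pixels at {dpi:>3} dpi prints {w:.2f} x {h:.2f} inches")
```

The same 412x324 pixels come out every time; only the paper size changes.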
Frankly, I don't see much difference on the screen. <grin> (and this is YOUR video system that we are using too). I am not making this up, you can see what in fact actually happens, and it's going to be real hard to dispute that. But for anyone who still wants to argue that there is any significance of 72 dpi (or 96 dpi) on the screen, please begin by explaining why there is no evidence whatsoever of it on the screen. If 72 dpi matters at all, then how can these three very different images appear identical? Please also explain how any 72 dpi notion might have been used to create these 412x324 pixel images from small 35 mm film (film size is 36x24 mm, about 1.4 x 0.92 inches).
I'm teasing. I know it is not possible that anyone can explain how 72 dpi is important in any way to these (or any) screen images. The notion of 72 dpi is flat wrong - video simply doesn't work that way. There is no concept of dpi on the video screen. It should instead be very clear from the above example that images are obviously shown on any screen only according to their size in pixels. In every case, you should just create the image size (pixels) that is the size that you want to see (pixels). So, forget about 72 dpi, it does not work that way. 72 dpi is a false notion, even worse than useless, because it is counter-productive, both to results and to understanding how it really works. There is no concept of dpi in the video system. There are only pixels.
Yes, these three 412x324 pixel images are all the same image, three identical copies of one image, created from a 35 mm negative scanned at 2820 dpi, and then resampled to 1/8 size, effectively 2820/8 = 352 dpi size now (scanning the 0.92 inch film height at 352 dpi would also give 324 image pixel height). However that really doesn't matter now, since they are in fact scaled to 7, 72, and 720 dpi now as stated, and this makes them three very different images when they are printed on paper. The 7, 72, and 720 dpi values here are very true, honest and legitimate in every respect.
To understand this better, you can save these images to your disk by clicking each image with the RIGHT or 2nd mouse button, selecting Save Picture As, and then examining their individual image properties in your photo editor. These images will look exactly the same in your photo editor too, because it will show the same 412x324 pixel image size (the size each image actually is). Your examination will show that these images really do carry exactly the three different dpi values indicated here. The three images are identical size on the video screen (all are 412x324 pixels - dpi is ignored on the screen), but the three image files are indeed very different when printed on paper, at least when you print them individually from a photo editor. Dpi is for printing on paper. The image properties will definitely show printed widths of 58 inches, 5.7 inches, and 0.57 inches on paper. This dpi applies only to paper, or wherever inches might exist. DPI DOES NOT APPLY TO VIDEO SCREENS. The three images will obviously print at those inch sizes on paper (printed from your photo editor's File - Print menu, NOT from the web browser - more about printing from web browsers near the end). I don't know how to print the largest 58x46 inch image (unless you own a billboard printing company), but you can still examine its properties.
Do NOT use the clipboard Copy function to open the images. That never works to show this. The clipboard Copy does not retain the dpi information stored in the JPG file (because video screens have no use for dpi info), and then Windows programs will always show you 96 dpi for any file (or Adobe will always show 72 dpi) - for any image that has not retained the file dpi info (discussed on next page). So, save the actual file as instructed above, and then open that actual file to see the actual dpi information in the file. I am discussing the actual image file.
Our printer drivers definitely do use dpi - dpi is all important to printers because printers print on paper, and paper is dimensioned in inches. But video systems know no concept of dpi at all, nor any concept of inches either. Video systems simply show the 412x324 pixels, pixel for pixel. Any dpi value is always ignored on the screen, regardless what value it is. Dpi is simply not defined in the video system.
I am not making up anything - you can clearly see what actually happens on your own video screen. That is the Real World, and that is all there is to it, and what actually happens is all that matters, so if you understand this much - that any image will always be displayed on the video screen only according to its size in pixels - then you've got it right. But if 72 dpi is still of any interest, then read on. It may be a bit much, more than you expect, but the details are here.
For example, this image is from 35 mm film. If an area the size of 35 mm film (36x24 mm, about 1.4x0.92 inches) is scanned at 72 dpi, we get an image size of (1.4 inches x 72 dpi) x (0.92 inches x 72 dpi), which is about 100x66 pixels, if from the full frame. The width of this one shown is cropped slightly smaller horizontally, to 84x66 pixels, but it is full height. Any 72 dpi scan of 35 mm film is not likely very useful; this is too small for most purposes, only thumbnail size. 72 dpi is simply the wrong concept for the video screen. It really does not matter what scanning resolution was used, just as long as it was the appropriate value to create the desired number of image pixels from the area you are scanning (see details in the previous Video Resolution Basics section).
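To check that arithmetic: image pixels are simply the inches scanned times the scanning dpi. A quick sketch, truncating to whole pixels (the function name is mine, just for illustration):

```python
# Image size from scanning: pixels = inches scanned x scanning dpi
def scan_pixels(width_in, height_in, dpi):
    return (int(width_in * dpi), int(height_in * dpi))   # whole pixels

# Full-frame 35 mm film (about 1.4 x 0.92 inches) scanned at 72 dpi
print(scan_pixels(1.4, 0.92, 72))   # (100, 66) - only thumbnail size
```

The scanning dpi determines how many pixels you get, which is all that matters for the screen.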
As just demonstrated, the video system does not even look at the image dpi number when displaying the image on the screen. Screens only know about pixels. Yet I get some comments like "No way that image is 7 dpi, it looks as good as the others". Yes, on the screen, that was pretty much my point too. <grin> But for printing, you can definitely believe that these three images above are indeed actually scaled to 7, 72, and 720 dpi (just look for yourself). Yet all obviously appear the same 412x324 pixel size and have the same appearance on your screen, on my screen, on any screen, including on this web page in your browser, and in your photo editor program too. And this is the point; it shows that video systems simply do not use image dpi. Never ever, no way, no how. I don't know how to make it more clear. <grin>
Note that this argument is NOT about whether the term should be called dpi or ppi. I tend to say dpi, as historical usage always has. If you want to call it ppi, that's fine; it is still exactly what I am discussing here. More about that name here, but it is unrelated - the video system has no concept of either term. The video system simply shows pixels directly, and has no concept of inches at all. You can and should notice that the terms "dpi" or "ppi" simply do not appear in any user manual for any monitor or for any video board. The really good reason is because the concept of dpi simply does not exist in the video system. The numbers that you will see in the video specifications will be screen size numbers like 1024x768 pixels.
The difference between video screens and printers is that paper is dimensioned in inches, and printers do use dpi, and these three images will all print on paper at very different sizes (printed from a photo editor instead of printing this web page). Dpi is for printing, and your printer definitely will honor printing resolution to print individual images, and will print these individually at the sizes shown above (except your printer probably cannot handle the big one). But screens only show pixels, and these three 412x324 pixel images are the same 412x324 pixel size on any screen.
All of the above is the obvious evidence which you can clearly see actually happens, if you simply look. Any correct theory must actually describe what we can clearly see actually happens. Nothing about 72 dpi can ever do that (the major reason is because video does not work that way - the video system totally ignores any dpi number in image files, and shows the pixels directly).
There is no concept of inches in any video system. There are only pixels. This should be obvious, because regardless of the size of our monitor screen (inches), our video settings can only specify a certain screen size measured in pixels. The inches do not matter at all, because if we specify a 1024x768 pixel screen size, then 1024x768 pixels of video board memory will be used to build the screen image of that size. This 1024x768 pixel image size requires a video board with at least (1024x768x3RGB) = 2.3 MB of video memory to hold that image. This is the purpose of that video board memory, to hold that 1024x768 pixel screen image. That same 1024x768 pixel memory image is directly output to the monitor cable, regardless of what size monitor we attach. We can connect any size of monitor, and a big monitor will show the 1024x768 pixel screen image bigger, and a small monitor will show it smaller, but if we specified 1024x768 pixels, then we are definitely going to see 1024x768 pixels, regardless of monitor size. Our video system is only about those 1024x768 pixels.
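That memory figure is easy to verify. A quick check, assuming 24-bit color (one byte each for R, G, and B):

```python
# Video memory needed to hold one full-screen RGB image
width, height = 1024, 768
bytes_per_pixel = 3                             # one byte each for R, G, B
bytes_needed = width * height * bytes_per_pixel
print(bytes_needed)                             # 2359296 bytes
print(round(bytes_needed / (1024 * 1024), 2))   # 2.25 (about 2.3 MB)
```

Notice that no inches appear anywhere in this calculation; the video board only deals in pixels.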
We can of course specify other screen dimensions, but if we specify 1024x768 pixels, then 1024x768 pixels is what we get and see. 1024x768 pixels is a popular choice, and I will refer to it generically here, to mean "whatever screen dimension setting that you have chosen".
The video system does not even know nor care what size the monitor tube is. We can attach any monitor we wish, 12 inch or 21 inch, and nothing changes in the video system. The video system only knows about the 1024x768 pixels, and we are going to see 1024x768 pixels spread over the full screen of any monitor we attach to show it. Any embedded 412x324 pixel image (as above) is just some part of those total 1024x768 screen pixels. The 412x324 pixel image will be shown at 412x324 pixels size, simply because that is its size. That is how video works. Inches are totally unknown, there are only pixels. And if no inches exist, then clearly any notion of dpi is extremely suspect too. This much is very obvious, just look at what you see.
In our world of digital images, dpi is for printing images on paper, or for scanning images from paper or film. Dpi means "pixels per inch" and it implies inches on paper, or film, or someplace where inches exist. Paper is dimensioned in inches, but video screens are dimensioned in pixels, and there is no concept of dpi in the video system.
Forgive my saying the same thing over and over, but some people are reluctant to imagine this immediately. The three images above are each 412 x 324 pixels in size. Digital images are dimensioned in pixels. Those pixel dimension numbers are extremely important, because video systems pay very careful attention to the image size in pixels. Images are dimensioned in pixels, and screens are dimensioned in pixels, and image pixels are directly displayed on the screen, one for one - one image pixel at one video board pixel location (assuming we view 100% image size). That is simply how video images are shown. Pixels are shown directly, without any concept of dpi at all. So these three images above were each shown to be 412x324 pixels in size, on any screen, simply because the images are in fact 412x324 pixels in size, and because screens show pixels directly.
Complicated today: We can zoom the web screen by using the Windows text size zoom, and/or by zooming in the View menu of browsers. Windows zoom affects only the size of text (Control Panel - Display is one access point), and it does not affect image size in Windows. Today, however, browsers zoom the images as well as the text, to preserve the original web page layout. When browsers zoom the screen size, they compute as if the physical screen size actually got larger (a new imaginary size called CSS pixels), and the browser window sees a portion of that. Browsers differ, which makes reporting screen size messy.
Update: This is a messy distraction, sorry. Things are changing in unpredictable ways. The numbers above appear to still be true of Firefox or IE, including browser zoom and any Windows text size zoom, but they are true of Chrome only if the browser zoom is 100%. If there is any problem, your screen size can be reported wrong now, showing numbers you know are not true. Video is not "dpi aware", but since Vista, Microsoft has been attempting to add optional "dpi aware" features anyway. Any false report would be because, in Windows 7 (or Vista), you have zoomed your browser's text size to be different. The browser then scales the image size with the text size, to keep the web page proportions right. But Windows 7 may also scale your apparent screen size (smaller for larger text) - not on the actual desktop of course, but the "screen size" numbers seen by the browser window change in the same inverse proportion, and you may see the numbers of that false "screen size" when the text size is larger. If so, you can reset this to the default with the Firefox menu: View - Zoom - Reset, and then reload the page. Chrome has its menu at the right end of the toolbar. Or with the IE menu: View - Zoom - 100% (but IE might come out at something like 99% and still be slightly off). Your text will become tiny again though. There are ifs and buts about this; it is complicated and messy, and there is more below, so let's just say you might or might not see this issue.
(There is a setting about this - at Custom DPI, "Use Windows XP style DPI scaling" - but it seems flaky. The issue is trying to apply inches to video anyway, where there is no definition of inches, only pixels.)
Regardless, the idea is still the same - images are shown in video at their size in pixels, but then browsers will scale the images to follow the text size to maintain web page proportions. These new screen size numbers are only related to imaginary logical inches for text (next section), and are certainly not about any real dpi number in any image file.
The same image will appear at a different apparent size (inches) on any other size screen, or on a different size monitor. Meaning, other people with different size screens (pixels or inches either) will see these same images at a different apparent size than you may see. For example, someone with a 640x480 pixel screen size will see these images larger, at 64% x 67% of their full screen size, on whatever size of monitor they have. But regardless, even then, the 412x324 pixel image will always still fill the same 412x324 pixels in every case. This is because digital image size is measured in pixels, and screen size is also measured in pixels. It really couldn't be any simpler.
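Those 64% x 67% figures are just the image's pixel dimensions as a fraction of that screen's pixel dimensions. A quick sketch (integer percent, truncated):

```python
# Portion of a 640x480 pixel screen covered by a 412x324 pixel image
img_w, img_h = 412, 324
scr_w, scr_h = 640, 480
print(img_w * 100 // scr_w, img_h * 100 // scr_h)   # 64 67 (percent)
```

On a 1024x768 screen the same division gives a smaller percentage, which is exactly why the image looks smaller there.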
But the printer will certainly care about dpi, and these images will all print at three VERY different sizes on paper (at the sizes shown above) if saved and printed individually using the File - Print menu of a photo editor program, not a web browser.
The numbers 72 dpi or 96 dpi are never used in any way by the video system to show images on the screen. The 72 or 96 dpi number is NOT about images at all, and that number never affects any screen image in any way. Period. The video system simply shows pixels, directly, one for one, without any concept of dpi.
The numbers 72 and 96 dpi do sort of exist (in their imaginary way) in computer operating systems. This existence does seriously confuse people who imagine these numbers might be about showing images, but these numbers never affect any image in any way. These numbers are only used to compute Text Size on the screen. The reason is that text font size in Points is dimensioned in inches (designed for paper), but screens are only dimensioned in pixels. The definition of a Point is 1/72 inch - there are 72 points per real inch on paper. A 12 Point font is 12/72 inch height printed on paper (inches exist on paper, and fonts are dimensioned in inches). But we are in real trouble on the screen - screens are only dimensioned in pixels. We desperately need some concept of inches to size this font, but unfortunately, there simply is no concept of inches on the screen. Only the 1024x768 pixels exist there.
But not to worry, the operating system simply dreams up and uses some fake 72 and 96 dpi numbers to compute text size on the screen. If we invent some imaginary notion of 96 dpi SIZE (if we simply pretend 96 pixels will cover one inch on the screen, more or less, without asking "what size of screen?"), then we can use this to compute that the 12 point font should be shown to be (12/72 inch x 96 pixels per inch) = 16 pixels tall in Windows. 16 pixels is a perfectly usable screen dimension, and we're as happy now as if it were actually true, so long as we don't ask any questions.
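That 16 pixel figure is the whole computation. A sketch of the point-to-pixel conversion, using the assumed logical dpi values (96 for Windows, 120 for Windows Large Fonts, 72 historically for Macintosh; the function name is mine):

```python
POINTS_PER_INCH = 72     # a point is defined as 1/72 of a real inch (on paper)

def font_pixels(point_size, logical_dpi):
    """Text height in screen pixels, pretending logical_dpi pixels = one inch."""
    return point_size * logical_dpi / POINTS_PER_INCH

print(font_pixels(12, 96))    # 16.0 pixels - Windows default
print(font_pixels(12, 120))   # 20.0 pixels - Windows Large Fonts
print(font_pixels(12, 72))    # 12.0 pixels - classic Macintosh
```

Note that no property of the actual monitor appears anywhere; the logical dpi number is simply assumed.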
Programmers call this concept Logical Inches. In computer science lingo, logical (as opposed to physical) means "This is NOT real, but let's use it anyway, as if it were real, because it is convenient to pretend it is real". Logical is how we are going to think of it, regardless of how it might really be physically. There is no other way to imagine inches on the screen - the video system is only dimensioned in pixels.
The Windows operating system assumes one logical inch on the screen is 96 pixels, because, well, because it would just be very convenient if it were somehow actually true. The video system does not know the size of the monitor, so there simply is no concept of real inches, so it is necessary that we make something up. Real inches have an exact defined size, but a logical inch does not, it is whatever size the 96 pixels happen to be on this one specific screen, and the logical inch will be a different length on a different screen. Don't confuse "logical inches" with "real inches" - very different concepts.
We do have a video adjustment for this 96 dpi number for some cases. The Windows Large Font setting can assume a different number to show the text at a different size (the Large Font default is 120 dpi). But otherwise the same imaginary 96 dpi or 120 dpi number is used for text on any and every screen, regardless of any details about the size of that system. No image is affected in any way by this number - instead we are just playing fast and loose with the inch for text fonts. A logical inch will be whatever size the 96 pixels actually are. This is a very well-known situation, and yes, of course the programmers know this is not true (at least those that work at this level). It is trivial to show that screens only display pixels directly, like those 1024x768 pixels. No video system knows or uses ANY concept of dpi. But as a rough guess about text size, this logical inch idea is actually not too far wrong (which is why these numbers were used).
What should the accurate number be? There is really no answer possible, because the actual real world screen size numbers vary quite a bit. For example, a 15 inch monitor screen might measure 10.6 inches horizontally (its actual bright image area). If it is currently set to 800x600 pixels screen size, then the image is obviously 800 dots / 10.6 inches = 75 dpi (pixels per inch) apparent resolution in that case.
But reset the same monitor to different screen sizes, and we get very different numbers. Monitors vary, even 15 inch monitors vary, but I measured a few screen widths in inches, and computed this for different CRT monitor sizes:
APPARENT resolution of different size CRT monitor screens

| Screen size | 14 inch monitor (width 9.7 in) | 15 inch monitor (width 10.6 in) | 17 inch monitor (width 12.5 in) | 19 inch monitor (width 14.4 in) | 21 inch monitor (width 15.9 in) |
|---|---|---|---|---|---|
| 640 x 480 | 66 dpi | 60 dpi | 51 dpi | 44 dpi | 40 dpi |
| 800 x 600 | 82 dpi | 75 dpi | 64 dpi | 56 dpi | 50 dpi |
| 1024 x 768 | 106 dpi | 97 dpi | 82 dpi | 71 dpi | 64 dpi |
| 1152 x 864 | 119 dpi | 109 dpi | 92 dpi | 80 dpi | 72 dpi |
| 1280 x 1024 | 132 dpi | 121 dpi | 102 dpi | 89 dpi | 80 dpi |
| 1600 x 1200 | 165 dpi | 151 dpi | 128 dpi | 111 dpi | 101 dpi |
These are standard width 4:3 monitors. A 23 inch widescreen monitor (about 20 inches image width) at 1920x1080 pixels is 1920/20 = 96 dpi.
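These apparent resolution numbers are nothing more than horizontal pixels divided by measured screen width in inches. A sketch that recomputes the chart from those measured widths (values should match the table to within rounding):

```python
# Apparent resolution = horizontal pixels / measured screen width in inches
widths = {"14 in": 9.7, "15 in": 10.6, "17 in": 12.5, "19 in": 14.4, "21 in": 15.9}
screens = [(640, 480), (800, 600), (1024, 768),
           (1152, 864), (1280, 1024), (1600, 1200)]

for w_px, h_px in screens:
    cells = "  ".join(f"{name}: {w_px / inches:3.0f} dpi"
                      for name, inches in widths.items())
    print(f"{w_px:>4} x {h_px:<4} {cells}")
```

The dpi is purely a result of which monitor and which screen setting you happen to have; nothing in the video system ever uses it.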
These computed dpi values are merely calculations after the fact - calculating the apparent size of the pixels. We can say this calculation is the apparent resolution of the result on the one specific screen, but the video system does NOT know about those dpi numbers, and did not use any such numbers to show the image on the screen. The numbers that are important for showing images on our screens are the screen size numbers in the first column. If we have a 800x600 pixel screen, that is what we see, and then we know exactly how a 400x300 pixel image will fit on it.
That screen size in pixels (first column) is called the resolution of video systems, and it is named "Screen resolution" in the Windows Control Panel. We need a size combination producing about 70 dpi for the screen to look decent, and 80 dpi is better (opinion). This chart shows why the larger screen sizes like 1600x1200 pixels are called "high resolution". More pixels in the same screen area are smaller pixels, with more potential for detail on the overall screen. This means that we could show a larger image then, more pixels with more detail.
However the existing 412x324 pixel images above will still be the same existing 412x324 pixels, and this means that as the screen dimension is made larger, the apparent size of these images gets smaller, because they are the same 412x324 pixels as before. This existing image size is now a smaller percentage of the larger full screen size, so the same pixels cover fewer inches on the larger screen.
And of course this apparent size reduction on a larger screen also happens to text fonts (which are dimensioned in pixels on the screen by using 96 dpi logical inches), so the same text gets smaller on a larger screen too. If you change a 800x600 pixel screen to be 1600x1200 pixels, everything including the text will become half size. It would be called higher resolution, but you may not be able to read it then (unless you also bought a much larger monitor to compensate). Video screens are all about pixels, and you can see all of this happen on your own screen.
Our monitors and screen sizes vary greatly, but what we have in common is that many of us generally adjust our screen size as large as we can and still be able to read the text as it gets smaller. And since we must all be able to read text, the text size usually comes out about the same size. We usually choose one of the common combinations in the chart above, which narrows the probable range, so that most of us are often perhaps within 30% on text size (except maybe the larger laptop screens). When the text size is too small at the larger screen settings, you can increase the size of your logical inches with the "Large Fonts" option in the same Windows video settings. For good readability, you will probably prefer text logical inches maybe 20% to 30% larger than your number in the chart above.
So the assumed 96 dpi logical inch in Windows, and the 72 dpi logical inch for Macintosh, are somewhat close to the size of our screens, so these defaults are at least a crude approximation for appropriate text size in inches. But there is no one "correct" or better guess, because screens vary. This Logical Inch guess definitely is a valiant try for an impossible problem (there simply is no concept of inches in the video system). It makes it possible to show fonts (dimensioned in points) on the screen. But it is not very accurate, because screens in fact vary in size and settings. This is why our word processors usually give us a Zoom control to see the text page any size we want to see it.
Printing routinely requires images that are much larger than our screen size. When viewed at 100% Actual size, a 300 dpi size image for printing will appear 4x printed size on a 75 dpi size video screen, simply because screens do in fact show pixels directly, and that is how many pixels there are. So photo programs normally also provide a zoom control that will resize an image to show it any size we want to see it on the screen. The original image data is not affected by this temporary zoom, and that original data is still what will print, but what we see on the screen is a temporary copy which is different resampled pixels (unless we specify 100% Actual size). The image window title bar shows the new size (normally as a percentage) as a warning that we are not seeing the actual data pixels. The initial default is usually a smaller size that will fit in the program window. But whatever new size we select to see, the new copy's resampled pixels are still simply shown directly, pixel for pixel. The video system has no other way to show pixels.
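That 4x figure comes from comparing the image's printing dpi to the screen's apparent dpi. A quick check (assuming the 75 dpi screen example from the chart above):

```python
# Apparent magnification when viewing a print-sized image at 100% Actual pixels
image_dpi = 300      # image scaled to print at 300 dpi
screen_dpi = 75      # apparent resolution of this one particular screen
print(image_dpi / screen_dpi)   # 4.0 - shown at 4x its printed size
```

The screen is simply showing all the pixels at their own size; the zoom control exists because that is usually too large to be convenient.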
Again, logical inches are the false (but nevertheless very useful) notion that 96 pixels define one imaginary inch on any Windows screen. We must overlook that there will actually be more pixels in a real inch on some screens, and fewer on others, but it is usually not too far wrong. But be very clear that logical inches are only used to resize inch dimensions into pixel dimensions for the screen (this can include computer-drawn graphics, but it is normally about text font size in points). Logical inches are NOT related to showing bitmap images. This notion is not needed for images, nor is it even usable for images, because images already have a size in pixels, and images are simply shown according to their actual size in pixels, pixel for pixel.