Apple's 72 dpi logical inch was close to accurate text size on the early Macintosh screen (1984), which is why Apple selected the 72 dpi number back then. Apple still uses that number today for text logical inches, but that number was only about the size of THAT screen. That 1984 Macintosh graphical screen was something new: it could show various font faces and sizes, italics and bold, even proportional fonts, on the screen for the first time. We take this for granted today, but graphic text on the video screen was a new idea then. MS-DOS did not show graphic text; its screens were dimensioned as 80x25 fixed-width characters of one bit-mapped font.
And back then, Apple did brag about 72 dpi and about WYSIWYG (What You See Is What You Get), meaning text on that screen looked the same as, and matched the size of, the same text printed on paper. The size accuracy was not because of the number 72 dpi per se; it was because this logical inch size was selected to match this ONE physical screen size. This first logical inch declares that 72 pixels is an inch, without regard for screen size (which is not true of course, but it was a simplification). A 10-point font could be drawn 10 pixels tall, nominally 10/72 inch on the screen, even if that was not its exact physical size.
We don't hear about WYSIWYG today - it is not possible now because modern screens vary in size. But that one early screen was always the same size then, and this advertising was very effective for Apple back then. However, it was unfortunately also detrimental to our understanding of how things work, because it let us assume 72 dpi had some significance to the video system. But it doesn't - it can't - there are NO inches and NO dpi in ANY video system - our screens are dimensioned in pixels. It was only about the size of THAT screen.
That early 9 inch Mac screen in 1984 was 512x342 pixels, and each of the 412x324 pixel images on the previous page would be nearly full screen then.
However back in those early days, we didn't show many images on the screen - video systems offered only 16 colors then: 8 combinations of red, green, and blue, each at two intensity settings (today's 24-bit color offers 256 intensity settings of each, for 16.7 million color combinations). Also, the TIF, GIF, and JPG file formats did not exist in 1984, so it really wasn't important yet that video doesn't work that way for images. The confusion about 72 dpi spread and stuck - we didn't know, we simply misunderstood reality. But the only reality was then, and is now, that screens show pixels directly, and that logical inches are an artificial guess about the size of pixels on the screen, used only to size text dimensions into pixels on the screen.
The use of photo images on the screen has really changed in 30 years, night and day. But we did read text on the screen back in those early days, and the 72 dpi logical inch value had the same size effect then as now. It was easy for us to naively assume that this artificial dpi number was also somehow used to show images on the screen, but that was simply wrong (logical inches only affected text). Now that we actually work with images, it becomes obvious that it doesn't work that way at all. Instead, images are dimensioned in pixels, and screens are dimensioned in pixels, and pixels are all there is on the screen. It is easy to see that this is true, yet it is a mighty persistent myth, and many people still today insist "all screens are 72 dpi", even if they don't know what it might mean, or not mean. Yogi Berra said "You can observe a lot just by watching". If you simply think about what you can clearly see actually happening, the truth can change your life, at least digitally. <grin>
Microsoft later used 96 dpi logical inches in Windows to intentionally show larger screen text for better readability - and we often select 120 dpi Large Font settings on today's larger screens (a screen 1024 pixels tall shows text pixels smaller than a screen 800 pixels tall, and logical 120 dpi simply restores the larger text size). Macintosh and Windows systems could use the same monitors - the monitor is not different, and Windows screens are NOT higher resolution. If the video in both systems is set to show 1024x768 pixels, then both are going to show 1024x768 pixels, regardless of what monitor is present. Microsoft just assumed the logical inches were larger, so the text would be larger (logical inches are imaginary and therefore arbitrary).
But unfortunately, this fact about text logical inches causes people to advise us that screens are indeed 96 or 72 dpi "because Microsoft and Apple say so!" That statement is not totally incorrect - they do say so. But the way these people understand it is totally incorrect, and it is very bad advice if we want to show images on the screen. We need to know about pixels instead, because the video system only knows about pixels. It is true that Apple and Microsoft use logical inches to approximate the size of text fonts, so yes, we can certainly find those numbers, but we are seriously mistaken if we assume these numbers are related in any way to showing images on the screen. That is simply, incredibly wrong.
If you need visible proof, notice that the 96 dpi setting in the Windows video settings (Control Panel - Display - Advanced button - General tab) was always called FONT SIZE - until XP, which changed the wording to say DPI Setting - Normal Size. Windows 7 restores the name as Custom Text Size (DPI). This name is extremely significant, because the setting only affects text size. Notice that regardless of how you might change that setting, it does not affect any image in any way on any screen... images are already dimensioned in pixels, and this setting is the logical inch, which only affects text size. Yes, it does affect the size of dialog boxes, which you will notice are designed to show text, so yes, dialog boxes are necessarily dimensioned in text units, and yes, dialog boxes will change size. That's the point, to change text size. But notice that ALL desktop icon images remain the same size, and ALL of your photo and web images remain the same size, and even the screen itself remains the same size, in pixels. All this setting can change is the size of the imaginary logical inches used to size text fonts. Images are already dimensioned in pixels, that dimension is still how you will see the image, and its size will not change with different text size settings. It is a very trivial exercise to prove to yourself that this text setting has no effect whatsoever except on text size. Just look at what you see actually happen.
The notion of logical inches is indeed about approximate actual size on the screen - this is generally the ballpark size of our screens, used as an attempt to approximate actual size of text fonts (inches) on the screen (pixels). So there is also an alternate meaning - that if you scan any object (say a postage stamp or a postcard) at about 72 to 96 dpi, then you will create an appropriate number of pixels (72 to 96 pixels per inch of object) so that the image will also appear at near actual size (inches) on many common screen sizes. Scan one inch at say 75 dpi, and you will create 75 pixels, which will cover about one inch on many screens, more or less (the screen only shows pixels). It won't be very accurate, but it also won't be too far wrong. In the ballpark.
For example, the smallest image on previous page, scanned at 72 dpi (the small fourth image) appears there at roughly original 35 mm film size (depending on your screen size - it may appear at a somewhat different size on a different computer screen). But if this is not the size image you wanted (and most likely it is not), then 72 dpi is simply the wrong concept. It simply doesn't work that way.
But its size on the screen won't be far wrong from actual size, and for this reason, the printing industry does use 72 dpi images to show document pages at approximate actual size on the screen. Their business is printing pages of paper, so their programs are called Page Layout programs, which are specifically designed to create and print document pages. Their documents are dimensioned in inches of paper. These programs can also show replica images of the page of paper on the screen. On the screen, that paper page image is scaled to match the font size.
So the printing industry does tend to think of computer video screens as being 72 dpi SIZE (this is Apple's logical inch, and due to Apple's early lead with graphical text, the printing industry has historically used Apple systems). Note this is only a SIZE, not a video resolution. No video system knows any concept of dpi, but screens have historically been about that size, and so they say 72 dpi images are about the right size too, and this works well enough for this limited purpose. A document containing 300 dpi images for printing must be resampled much smaller in the screen view; not only is 300 dpi excessive size for that purpose, but it can look bad too (you have seen 300 dpi PDF files that print great, but are crude on the screen).
So we hear them say "Use 72 dpi for the screen", and it is good advice for this purpose, but unfortunately, not everyone understands what it means. When it is repeated without understanding, it becomes "All screens are 72 dpi", which is wrong thinking, and unfortunately continues to propagate the false myth. The video system knows nothing about this number, period. This 72 dpi is just the notion of logical inches in Page Layout programs - text is shown at that size in Mac systems. It is useful to the printing industry in that way (certainly 72 dpi is a more useful image size on the screen than 300 dpi - it better matches the text size), but it is NOT about how video systems show images. They are specifically trying to show a replica of a paper page (dimensioned in inches) on the screen (where there are no inches), and it is only about the size of that image.
However, if the screen combination you actually use instead computes as 97 dpi (chart on previous page), then the 72 pixels will only cover 72/97 = 74% of the 97 pixels actually representing one inch, so the image will appear small. In this case, you could zoom it in your program to 97/72 = 135% size to see it at actual size. Or you might use your screen's actual computed number, scanning at 97 dpi to hit accurate size on your one screen. However, the image cannot repeat that size on a different screen, so in the public case, it's pointless to try for more than rough ballpark size. But actual size on the screen is not normally the effect we want anyway, which is fortunate, because without inches, it is impossible to have any concept of exact actual size in inches on the screen (at least not on multiple screens).
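The arithmetic here is simple enough to sketch. This is just the ratio math from the paragraph, in Python; the 1280-pixel, 13.2-inch screen is a made-up example that happens to compute near 97 dpi, not a measurement of any real monitor.

```python
# Sketch of the ratio math above. The screen dimensions are hypothetical
# examples, not properties of any video system (which knows nothing of
# inches at all).

def screen_ppi(width_pixels, width_inches):
    """Actual pixel density of one particular physical screen."""
    return width_pixels / width_inches

def zoom_for_actual_size(scan_dpi, actual_ppi):
    """Zoom percentage needed to show a scan at true size on that screen."""
    return 100.0 * actual_ppi / scan_dpi

ppi = screen_ppi(1280, 13.2)                  # a screen computing near 97 ppi
print(round(ppi))                             # 97
print(round(zoom_for_actual_size(72, 97)))    # 135 -> view at 135% for actual size
```

The point of the second function is the 97/72 = 135% zoom from the text - and that it only holds for this one screen; a different screen yields a different ratio.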
The statement "Screens are 72 dpi" is extremely misleading if not outright false, and it is responsible for causing us most of this grief. The ONLY meaning that it possibly can have is about creating sufficient pixels to view the desired image SIZE, in this way:
It absolutely NEVER means that images must be 72 dpi for the screen; that's ridiculous. It's simply wrong - a myth - someone's false conclusion back when no one knew better. Video systems simply do not work that way. Screens show pixels directly, with no concept of dpi. Scanning at 72 dpi is simply one of very many possible image size choices you can create. However, it does happen to be known as a rough ballpark of actual size on many screens - its ONLY significance, if that is the size you want. But approximate actual size is rarely a concern to us. We often prefer to enlarge images, for example when scanning a postage stamp, or 35 mm film, or even a 6x4 inch photo. No way we can fill a 1024x768 pixel screen using 72 dpi. Even an 8.5x11 inch page would be only 612x792 pixels at 72 dpi, and an A4 page would be only 595x842 pixels. We can't tolerate being limited to only 72 dpi size.
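Those page sizes are plain multiplication - pixels = inches x dpi - and are easy to verify:

```python
# Verifying the page sizes above: pixels = inches x dpi.
def page_pixels(width_inches, height_inches, dpi):
    return round(width_inches * dpi), round(height_inches * dpi)

print(page_pixels(8.5, 11, 72))      # (612, 792)  US Letter at 72 dpi
print(page_pixels(8.27, 11.69, 72))  # (595, 842)  A4 (8.27x11.69 inches) at 72 dpi
```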
Of course, there are exceptions according to the size we need. For example, an Amazon.com web page thumbnail book cover image is roughly 100x140 pixels, which is generally about 15 dpi size from books about 7x9 inches (I'd suggest scanning larger, like 50 or 75 dpi, and then resampling smaller). So we do need various size images. As the bottom line, just use whatever scan resolution (pixels per inch) will give the image size you want (pixels) from the original size you have (inches). But it won't likely be 72 dpi.
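That bottom line is one division. A minimal sketch, using the rough book-cover figures from the text (not Amazon's actual specification):

```python
# The bottom line above as a formula: the scan resolution follows from
# the pixel size you want and the original's size in inches.
def scan_dpi(target_pixels, original_inches):
    return target_pixels / original_inches

# Rough thumbnail example from the text: ~100x140 pixels from a
# ~7x9 inch book cover works out near 15 dpi.
print(round(scan_dpi(100, 7)))   # 14
print(round(scan_dpi(140, 9)))   # 16
```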
The statement would have been better worded as "Many screens are about 72 dpi size". The 72 dpi number is vaguely about the SIZE we see, but we do see different sizes. The screen settings most of us choose are generally in the 72 to 100 dpi ballpark size, more or less. The 1600x1200 pixel laptop screens usually exceed 100 dpi size (displaying correspondingly smaller text). The video system knows absolutely nothing about this number, and does not use any such number to show images. The video system can only create the screen image pixels directly, as pixels, like 1024x768 pixels.
Some image files contain no dpi information. Examples are:
JPG from Adobe Save for Web menus
JPG from many digital cameras, at least the early ones.
Video screen-captured images do not retain dpi info, it is of no use to video.
GIF files (see CompuServe GIF)
Kodak PhotoCD PCD files (not used today).
Some PNG files (see PNG and older Adobe versions).
Yes, the Adobe Elements or Adobe Photoshop Save for Web menu does appear to scale the web image to 72 dpi. Save it for web, then check it, and the image properties do appear to say 72 dpi then. However don't be fooled, it doesn't actually do that. Instead the Save for Web menu simply removes the JPG EXIF data (containing the dpi field) from the JPG image file, to save a few bytes of file size for the web. The dpi information is omitted, discarded (but the menu File - Save As - JPG does save dpi). The web purpose has no use for dpi, because the video system will always ignore dpi. Retaining it would just be wasted bytes in the file (only a few bytes, but the web purpose doesn't need it).
Adobe shows us 72 dpi when the dpi value is missing from the image properties. When we ask to see properties, the photo program has a problem with an empty property field, so it makes something up to tell us. The dummy 72 dpi value we see is just something to fill the missing field; it only means "the dpi value is still undefined". The way Adobe says that is 72 dpi.
Note that other programs will likely tell us different values for the SAME image file (we are still speaking of image files which contain no dpi information, for example images from the Save For Web menu). One example is the Windows Paint program, at the Windows menu Start - Programs - Accessories - Paint. For this SAME Save For Web JPG file, its menu at Image - Attributes will tell us 96 dpi or 120 dpi (for the SAME image Adobe says is 72 dpi - this value is NOT in the file). Most Windows programs (except Adobe) will show either 96 dpi or 120 dpi (96 dpi unless you have set Large Fonts in your video settings, then they will show 120 dpi). Macintosh and Adobe programs usually show 72 dpi. PaintShopPro has a Preferences menu, and it will simply show you any number you want to see - whatever value you set in the menu (used only when the file has no dpi value to show).
If you see a dpi value that is NOT 72 dpi (or NOT these cases of 96 or 120 dpi as applicable), then you can trust it, that value is present in the file. Any real value will override the 72 dpi default for unknown values. But when you see 72 dpi (or 96 or 120 dpi), then it probably just means there is no value present.
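The fallback behavior described above can be summarized as a small decision rule. This is not any vendor's actual code - just the reported behavior, written as logic:

```python
# Not any program's actual code - just the fallback behavior described
# above: a real dpi value in the file is trusted; a missing one is filled
# with the program's own logical-inch default.
def reported_dpi(file_dpi, program="adobe", large_fonts=False):
    if file_dpi is not None:
        return file_dpi                      # real value present - trust it
    if program in ("adobe", "mac"):
        return 72                            # Adobe/Mac default for missing dpi
    return 120 if large_fonts else 96        # typical Windows defaults

print(reported_dpi(None))                    # 72  - same file, Adobe's answer
print(reported_dpi(None, "windows"))         # 96  - same file, Paint's answer
print(reported_dpi(None, "windows", True))   # 120 - with Large Fonts set
print(reported_dpi(300, "windows"))          # 300 - a real value, trust it
```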
Digital camera JPG images do the same thing. Dpi is only used for printing on paper (pixels per inch), and the camera has absolutely no clue what size we may print the image on paper. So the early thinking was that there was no point in their JPG files guessing some arbitrary dpi value. So they didn't; it would be pointless. But then Adobe will tell us 72 dpi again (until we change it). And today, images are so large that this Adobe 72 dpi value causes the printed size to be shown as maybe 5x3 feet (at 72 dpi), which is totally unreal. So to prevent that, newer cameras started saving a nominal dpi guess, maybe 180 to 300 dpi, merely to show a more reasonable size, maybe one foot, instead of the 4 or 5 feet at 72 dpi, which only caused bewilderment. But the camera still has no clue what size we may print it, and digital images are dimensioned in pixels, not inches. The file can carry any dpi number at will, the meaning of which is "pixels per inch", which will compute some printed size in inches, but it simply does not matter what it is until we go to print it.
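That bewilderment is easy to reproduce with the division involved. The 4000x3000 pixel image here is a hypothetical 12-megapixel example, not any particular camera:

```python
# Printed size is just pixels / dpi. A large camera image labeled 72 dpi
# reports an absurd print size; a nominal 300 dpi reports a plausible one.
def print_size_inches(width_px, height_px, dpi):
    return width_px / dpi, height_px / dpi

w, h = print_size_inches(4000, 3000, 72)    # hypothetical 12 MP image
print(round(w, 1), round(h, 1))             # 55.6 41.7  (over 4 feet wide)

w, h = print_size_inches(4000, 3000, 300)
print(round(w, 1), round(h, 1))             # 13.3 10.0  (a sane print size)
```

The pixels never change; only the label on them does, and it matters only at print time.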
Printing resolution is not retained for screen-captured images either - dpi is of no use to the video. GIF and PhotoCD file formats don't even have a place to store dpi. GIF was designed by CompuServe for the early video screen, and Kodak also knew dpi was unknown at that point. When asked to see properties for any of these, if there is no dpi number, the photo program will make something up, because, well, because we asked to see a value. Video doesn't use any dpi number. Any time we see dpi associated with any image, it refers only to printing on paper (or scanning paper). But no one prints at 72 dpi. When we see a 72 or 96 dpi value that we did not specify, it just means that no printing dpi number is specified yet. It's just a red flag needing our attention before we print it.
What would be a better number? No one has a clue until we specify the size it should print. So when this value is missing, programs usually show their logical inch value for unknown image resolutions (they need a number badly, and they have nothing better to show). Mac programs generally assume 72 dpi for unknown values. Windows programs usually show 96 dpi (or 120 dpi if Large Fonts were selected). However, Adobe programs (with Mac roots) still show 72 dpi in Windows, for the same image on the same screen that others will show as 96 dpi. Is it wrong to indicate 72 dpi in Windows? It's hard to fault a meaningless number on accuracy; no other number is better. This is printing resolution, and we see the number only because we asked to see it. If not specified yet, then this is just some dummy number to fill a blank field for an unknown printing value. It is no big deal.
However, the image will definitely print at that shown 72 or 96 dpi value if we don't change it. The best way to understand it is that this value will print on paper at approximately the size shown on the screen, per the discussion about logical inches/actual size a few pages back. 72 or 96 dpi is poor quality for printing, but on paper, it is near screen size, and this is the total meaning its guessed value has. So don't believe everything you see. Just scale it to the size it should print if you will print it. Ignore it otherwise.
What sense do we make of this? None really, dpi simply has no meaning on the screen. Dpi is for printing on paper, and you will scale it when ready to print, hopefully something better than 72 dpi.
Every time I said "print these three images" on the previous page, I also qualified it by saying "print from a photo editor program File - Print menu, not a web browser". The reason is that web browsers are a very special case. Other programs (like page layout and word processors) print pages designed for paper and dimensioned in inches. However, web pages are designed for the video screen, and their screen page view is therefore dimensioned in pixels, because that is all the screen has. Any web images are of course necessarily shown on the screen according to their size in pixels, because this is how video works. There are no inches on the screen, but web browsers do show the text pixels on the video screen based on 96 dpi logical inch size (even Mac versions of Microsoft Internet Explorer are 96 dpi too). So far, this is all exactly as discussed here.
But when we print a web page, there is a problem - paper is dimensioned in inches, but the web page is video-oriented, and has no concept of inches. The web page can dimension text fonts in a few different ways, but any display of text font size on the screen view necessarily uses the logical inch concept, and we can say that the text characters will print at a corresponding size. The point here is that therefore the web page images are also scaled to 96 dpi size when printed on paper, to match the text size. This is done in order to better retain the proportions of the original screen page layout (obvious only after we think about it <grin>). This is why all three images on the previous page will print at the same 4.3 x 3.4 inch size if printed from a web browser (412 pixels / 96 dpi = 4.3 inches). Speaking of the Microsoft and Netscape browsers, this 96 dpi value on paper seems true even if our own logical inches were specified as 120 dpi Large Fonts - the browsers simply use 96 dpi on paper - to size the image to match the text - to maintain the same page format.
EDIT: Times change, and today's browsers do it better now, and do not necessarily print at 96 dpi. Still the same concept, but instead, when we change the browser text size, today the images we see on the page are resampled to match the text size - to keep the same page proportions, the same size relationship to text size. And actually today, for example, if your monitor resolution is set to be 1680x1050 pixels and your Windows text enlargement factor is set to 115%, then your screen resolution is scaled and reported to be 1461x913 pixels (divided by 1.15). So those three images may print at slightly different size now (depending on your text size), but all three will still print the SAME size, and they will NOT print at 7, 72, and 720 dpi (unless printed from a photo editor, when they will).
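That reported-resolution arithmetic is just division by the text scale factor. The 1680x1050 screen and 115% setting are the example values from the paragraph, not constants of any browser:

```python
# The scaling described above: a 115% text enlargement divides the
# reported (logical) screen resolution by 1.15. Example values only.
def reported_resolution(physical_w, physical_h, text_scale):
    return round(physical_w / text_scale), round(physical_h / text_scale)

print(reported_resolution(1680, 1050, 1.15))   # (1461, 913)
```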
No Great Truths here, it only means the browser still ignores the dpi in the image file (because the video system does too), and does whatever it needs to do for its own purpose. But it is really hard to make any case at all for 72 dpi. <grin>
NO, this doesn't imply web images should be 96 dpi either. It is very clear that it could not matter less what dpi value the web image thinks it is, since its dpi will always be ignored on a web page. Its size in pixels is the only important factor.
Note that this browser action is exactly the opposite of what page layout programs like Word or Publisher, or Acrobat or InDesign, do - those are designed to print paper pages dimensioned in inches. Page layout programs do that well, but they must invent something different to show on the screen. Browsers show the screen well, and must invent something different for paper. Photo editor programs are a third category, best of both worlds. Photo editors are concerned with one individual image, instead of a page of paper with an image on it. They manipulate images dimensioned in pixels, and show this on the screen, but they also know how to print the image at actual size using the actual dpi on paper. The point is that various software does have varied purposes, so we should try to see the big picture, which is easy if we understand a few basics about how things actually work. Be aware that false emphasis on what one specific case appears to do can confuse our thinking to create some false conclusions, like "video screens require 72 dpi images". This has happened in the past. <grin>
So, my prior discussion about printing the 7, 72, and 720 dpi images is about how photo editors will print them, i.e., the meaning of dpi. It is about printing from page layout programs too, if we can assume no additional size manipulation is done. But you are using a web browser to view a web page now, so it seems necessary to point out that it is not about printing from web browsers, because it is trivial to show that web browsers totally ignore image dpi in every case. Just look at what you see.
The 72 or 96 dpi numbers are NEVER used to show images on the monitor screen. There is no concept of inches or dpi in the video system. Those logical inch dpi numbers are only used as a crude approximation to size text fonts on the screen. The size results are not very accurate, but we have nothing better for text fonts. These numbers have absolutely NOTHING to do with showing images on any screen, no way, no how, not then, not now, no matter how many times you hear others that don't understand it tell you otherwise (they are also victims, and are just parroting what they heard too). Images already have a size in pixels (the three 412x324 pixel images on previous page), and the video system simply shows those pixels directly, one for one. Video only shows pixels. This is the easiest possible result. Don't make it be hard, the correct way is much easier. If you show a 412x324 pixel image on the screen, you will see 412x324 pixels. That's all there is to it, how things actually work, the only theory that will allow us to predict accurately what will actually happen. You can clearly see this is true if you will just look at what actually happens on the screen. Dpi is not a factor on the screen, and this is one of the most basic and necessary fundamentals of digital images.
Reality goes much better if we forget about 72 dpi, and instead simply create the number of pixels that we want to see on the screen.