A few scanning tips

www.scantips.com

Say No to 72 dpi

We still frequently hear the very bad advice: "Computer video screens show images at 72 dpi, so scan all your images for the screen at 72 dpi". This is incredibly wrong; it simply doesn't work that way. It has a very different meaning.

Regardless of what you may have heard almost everywhere, there is no purpose or reason for 72 dpi images on computer video screens or web pages. As a concept, 72 dpi is simply a false notion. It is a myth. It is NOT how video systems work, and it is detrimental to understanding how things really do work. It is very easy to show proof of this here.

If you want to know why, you're in the right place.

This section is written unusually, as an argument, because there are some people who are, let's say, extremely certain that 72 dpi (or 96 dpi) is surely somehow quite important for images on the computer video screen. They may have no clue how or why; they have just always heard it was important. Sometimes they want to argue this anyway, so this is written for them, and I hope it may give them cause to simply look at what actually happens on their own screen before they email me. Other people have also heard how 72 dpi was supposedly important, but they can't make it work; it does not produce the desired results (which is in fact the correct answer about the validity of 72 dpi). This is also written to be useful for them, but the diatribe may be a bit thick. 😊
When I say video, I simply mean video systems which operate the computer monitor. I am not referring to movies or animation.

All the digital image basics fully apply to images from either scanners or digital cameras. Same thing, digital images. The only difference is the method of creating the images. Once created, it is all just pixels, and printing and video are still exactly the same ideas for all. I would invite you to first review those basics.


Proof that Video Systems do NOT use dpi

Each of the three images below is 412x324 pixels in size, which is why the video system shows each of them at 412x324 pixels on the screen.

To make the point that the image resolution dpi number (the scaled printing resolution stored in the image file) simply does not matter on the screen, these three image files below are now scaled individually to 7 dpi, 72 dpi, and 720 dpi. If you print them (using a photo editor), they are now three very different images in that regard. These image files actually do individually contain these three different dpi numbers, which is an instruction to the printer about how to print them (how to space the pixels on paper, and therefore how large to print the image on paper).

7 dpi
412x324 pixels, 7 dpi, prints 58x46 inches
72 dpi
412x324 pixels, 72 dpi, prints 5.7x4.5 inches
720 dpi
412x324 pixels, 720 dpi, prints 0.57x0.45 inch

Frankly, I don't see much difference on the screen. 😊   (and this is YOUR video system that we are using too). I am not making this up, you can see what in fact actually happens, and it's going to be real hard to dispute that. But for anyone who still wants to argue that there is any significance of 72 dpi (or 96 dpi) on the screen, please begin by explaining why there is no evidence whatsoever of it on the screen. If 72 dpi matters at all, then how can these three very different images appear identical? Please also explain how any 72 dpi notion might have been used to create these 412x324 pixel images from small 35 mm film (film size is 36x24 mm, about 1.4x0.92 inches).

I'm teasing. I know it is not possible that anyone can explain how 72 dpi is important in any way to these (or any) screen images. The video system shows images at their size dimensions in pixels (totally ignoring any dpi declaration in the file). The notion of 72 dpi video resolution is flat wrong — video simply doesn't work that way. There is no concept of dpi being used on the video screen. The actual video dpi resolution is the size of the screen in pixels divided by its size in inches. And monitors certainly vary, whether a desktop computer screen, a wall TV screen, or a cell phone screen. But today, they all can show the same 1920x1080 pixel HD movie image. The idea is that one image pixel goes into every screen pixel. However, an image too large for the screen's pixels likely gets resampled to a smaller copy, so that one resulting image pixel still fits into every screen pixel.

It should instead be very clear from the above example that images are obviously shown on any screen only according to their size in pixels. In every case, just create the image at the size (in pixels) that you want to see. So forget about 72 dpi, it does not work that way. 72 dpi is a false notion, even worse than useless, because it is counter-productive, both to results and to understanding how it really works. There is no concept of dpi in the video system. There are only pixels.

Yes, these three 412x324 pixel images are all the same image, three identical copies of one image, created from a 35 mm negative scanned at 2820 dpi and then resampled to 1/8 size, effectively 2820/8 = 352 dpi now (scanning the 0.92 inch film dimension at 352 dpi would also give the 324 pixel image height). However, that history really doesn't matter now, not on the video screen (which simply shows pixels). But the three copies are in fact individually scaled to 7, 72, and 720 dpi as stated, and that makes them three very different images when they are printed on paper (paper is dimensioned in inches, or mm, same idea, and is printed to fill inches, but video just fills the pixels). The 7, 72, and 720 dpi values here are very true, honest and legitimate in every respect.

To understand this better, you can save these images on your disk by clicking each image with the RIGHT or 2nd mouse button, and selecting Save Image As, and then examine the individual image properties of those disk files in your photo editor. These three images will look exactly the same in your photo editor too, because it will show the same 412x324 pixel image size (the size the image actually is). But your examination of the image files will show that these images really do all carry exactly the three different dpi values indicated here. The three images are identical size on the video screen (all are 412x324 pixels — dpi is ignored on the screen). But the three image files are indeed all very different when printed on paper, at least when you print them individually from a photo editor. Dpi is for printing on paper (where inches exist). The three images are obviously the same size on the screen (that size is 412x324 pixels), but the three images are indeed scaled to 7 dpi, 72 dpi, and 720 dpi, and the image properties definitely will show printed widths of 58 inches, 5.7 inches, and 0.57 inches on paper. This dpi applies only to paper, or wherever inches might exist. IMAGE DPI DOES NOT APPLY TO VIDEO SCREENS. The three images will obviously print at those inch sizes on paper (printed from your photo editor's File - Print menu, but NOT from the web browser - more about printing from web browsers near the end). I don't know how to print the largest 58x46 inch image (unless you own a billboard printing company), but you can just examine its properties.
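If you saved the three files, a short Python sketch can read back what each one actually carries. This example assumes the Pillow imaging library (any image tool would do), and the file names are hypothetical placeholders for whatever names you saved them under:

    # Inspect pixel size and the stored dpi number of the saved copies (hypothetical names)
    from PIL import Image

    for name in ("copy_7dpi.jpg", "copy_72dpi.jpg", "copy_720dpi.jpg"):
        img = Image.open(name)
        w, h = img.size                              # pixels, the same 412x324 for all three
        xdpi, ydpi = img.info.get("dpi", (96, 96))   # falls back to 96 if no dpi is stored
        print(f"{name}: {w}x{h} pixels at {xdpi} dpi, prints {w/xdpi:.2f} x {h/ydpi:.2f} inches")

The pixel size comes back identical for all three; only the dpi number, and therefore the computed print size, differs.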

Do NOT use the clipboard Copy function to inspect the images. That never works to show this, because the screen copy of the image does not retain any dpi information (video screens have no use for dpi info), so a clipboard Copy cannot pass along the dpi number that is stored in the image file. If the dpi information is blank, Windows programs will simply show you 96 dpi for any such image (except Adobe will always show 72 dpi), which certainly does NOT make 96 or 72 dpi true; it is just a guess replacing an absent value (see Logical Inch, discussed on the next page). The video screen did NOT keep the dpi number because the screen has no use for dpi. So save the actual file as instructed above, and then open that actual file in your photo editor to see the actual dpi information in the file. I am discussing the actual image file. On second thought, maybe do try the clipboard copy of the image, knowing what to expect in regard to dpi, and knowing that it will only show a false dpi number from the operating system (NOT from this image), just to see it and to understand and believe it.

Our printer drivers definitely do use dpi — dpi is all important to printers because printers print on paper, and paper is dimensioned in inches. Unless you change dpi as you print it, your home printer definitely will use the dpi number in the image file. And if you ask the one hour print shop for an 8x10 inch print, they will change dpi to do that. But video systems have no concept of dpi at all, nor any concept of inches. Video systems simply show the 412x324 pixels, pixel for pixel, on their perhaps 1920x1080 pixel screen. Any dpi value is always ignored on the screen, regardless of what value it is. Dpi is simply not defined in the video system.

I am not making up anything — you can clearly see what actually happens on your own video screen. That is the Real World, and that is all there is to it, and what actually happens is all that matters, so if you understand this much — that any image will always be displayed on the video screen only according to its size in pixels — then you've got it right. But if 72 dpi is still of any interest, then read on. It may be a bit much, more than you expect, but the details are here.

For the video screen, and therefore obviously for web pages too, simply scan images at whatever resolution necessary to get the image size desired (pixels) from the area size of the original being scanned. This is very easily computed. The size of the image in pixels is all that matters for using an image. We need it to be as large as our use requires. Or possibly archive it even larger, to also be suitable for some unknown future use. If a future use turns out to need a smaller image, we can always resample a copy to provide a smaller image (but save the archived larger image for the next future use).

For example, this image above is scanned from 35 mm film. If an area the size of 35 mm film (36x24 mm, about 1.4x0.92 inches as scanned) were scanned at 72 dpi, we get an image size of (1.4 inches x 72 dpi) x (0.92 inches x 72 dpi) = 100x66 pixels. The width of the one shown here was then cropped slightly, to 84x66 pixels, but it is the full height of the film. This one will always be shown on monitors as 84x66 pixels, but those pixels will appear at different physical sizes on different monitors. If it is seen on a 72 dpi CRT, it will show at about the scanned film size. But it is seen a bit smaller on a modern wide-screen 1920x1080 monitor at about 100 dpi, as about 0.84x0.66 inches (because the 72 pixels scanned from each inch of film cover only 72% of the roughly 100 screen pixels in an inch of today's larger LCD monitor).

If you might want to show it larger, then scanning the tiny film at more than 72 dpi will be very necessary. For starters, see the Scanning/Printing dpi calculator.

The point is, a 72 dpi scan of 35 mm film is not likely very useful (maybe as an icon?); it is too small for most purposes, only thumbnail size. 72 dpi is simply the wrong concept for the video screen (unlikely to be what you want). The origin of the thought was that back in the day, when seen on a CRT monitor that had a screen resolution of about 72 dpi, a 72 dpi scan would show at about actual size (and perhaps that was more useful if scanning a photo print, but still, we may have other interests needing something different). What is important is that the scan resolution be the appropriate value to create the desired number of image pixels from the area you are scanning (see details in the previous Video Resolution Basics section).

Creating the necessary pixels: If we scanned the 35 mm film at 3000 dpi, it would be (1.4 inches x 3000 dpi) x (0.92 inches x 3000 dpi) = 4200x2760 pixels (11.6 megapixels), and it would fill 4200x2760 pixels on any screen large enough for it (not many screens are that large). On a smaller screen, a photo editor would resample a copy smaller for viewing, perhaps to 1/4 or 1/3 size that would fit in the window. But if we printed that image at 300 dpi, then it would fill (4200 / 300 dpi) x (2760 / 300 dpi) = 14x9.2 inches printed, which perhaps is the intended goal.
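As a quick way to play with that arithmetic, here is a minimal Python sketch of the same two calculations (the numbers are the 35 mm film example above):

    # Pixels created by a scan, and inches covered when those pixels are printed
    width_in, height_in = 1.4, 0.92                  # scanned film area, inches
    scan_dpi = 3000
    px_w, px_h = round(width_in * scan_dpi), round(height_in * scan_dpi)
    print(f"Scan creates {px_w} x {px_h} pixels")    # 4200 x 2760 pixels

    print_dpi = 300
    print(f"Prints {px_w / print_dpi:.1f} x {px_h / print_dpi:.1f} inches at {print_dpi} dpi")
    # 14.0 x 9.2 inches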

So the purpose of the scan resolution choice is to create the digital image which will be the correct needed size in pixels. Regardless if using the image as seen on a monitor, or as printed on paper, the entire purpose is to create the image size in pixels that best serves our goal. Or sometimes we just scan it large and archive it, to be large enough for some yet unknown future purpose.

As just demonstrated before, the video system does not even look at, or even keep the image dpi number when displaying the image on the screen. The video screen has no use for a dpi number. Screens only know about pixels. Yet I get some comments like "No way that image is 7 dpi, it looks as good as the others". Yes, on the screen, that was pretty much my point too. 😊

The dpi is for scanning resolution, to determine the image size (pixels) created from the area scanned. The target goal is the image size dimensions in pixels. Thereafter, dpi is just a number added to the image file to instruct printers what size to print it, and that dpi number can easily be changed for other printing goals (called size scaling). For printing, you can definitely believe that these three images above are indeed all copies of the same image file, then actually scaled to 7, 72, and 720 dpi (just look for yourself). Yet all obviously appear at the same 412x324 pixel size and have the same appearance on your screen, on my screen, on any screen, including on this web page in your browser, and in your photo editor program too. And the point is, it shows that video systems simply do not use image dpi. Never ever, no way, no how. I don't know how to make it more clear.
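That dpi-only rescaling (same pixels, different print size) is the same idea used to make the three copies above. Here is a minimal Python sketch of it, assuming the Pillow library and a hypothetical file name:

    # Write identical-pixel copies that differ only in the stored print resolution
    from PIL import Image

    img = Image.open("photo.jpg")
    print(img.size)                                       # pixel dimensions, never changed below
    for dpi in (7, 72, 720):
        img.save(f"photo_{dpi}dpi.jpg", dpi=(dpi, dpi))   # only the dpi metadata differs

Every saved copy contains the same pixels and looks identical on any screen; only a printer will treat them differently.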

Note that this argument is NOT about whether the term should be called dpi or ppi. Dpi is jargon maybe, but each pixel certainly is a "dot" of color. I learned to say dpi, as historical usage always has. If you want to call it ppi, that's fine, same thing, exactly what ppi is, and it is still exactly what I am discussing here (we all do definitely need to understand it either way we hear it). More about that name here, but the video system has no concept of either term about inches. The video system is dimensioned in pixels, and it simply shows pixels directly, and has no concept of inches or cm at all. You can and should notice that the terms "dpi" or "ppi" simply do not appear in any user manual for any monitor or for any video board. The really good reason is because the concept of dpi simply does not exist in the video system. The numbers that you will see in the video specifications will be screen size numbers like 1920x1080 pixels.

The difference between video screens and printers is that paper is dimensioned in inches, and printers do use dpi to print inches, and these three images with their current individual dpi numbers will all print on paper at very different sizes (when printed from a photo editor instead of printing this web page; a browser will print them to maintain the size seen on the video monitor, still all three the same size, because the video has no concept of dpi). Dpi is for printing, and your printer definitely will honor printing resolution to print individual images, and will print these individually at the sizes shown above (except your printer probably cannot handle the big one). But screens only show pixels, and these three 412x324 pixel images are the same 412x324 pixel size on any screen. The computer video board will create some size of screen, probably 1920x1080 pixels today, and that will be shown on whatever size monitor is connected.

Whatever video monitor you use will create and show some number of pixels (like a 1920x1080 pixel wide screen) across some number of inches of screen (maybe around 17 to 20 inches width), which computes to a screen resolution of maybe about 100 pixels per inch. But video screens do not otherwise use inches, and all of the 412x324 pixel images will be shown in 412x324 pixels of the video screen, however large that might be. We can compute some resulting pixels per inch on the screen, but the video system has no concept or need of it.

All of the above is the obvious evidence which you can clearly see actually happens, if you simply look. Any correct theory must actually describe what we can clearly see actually happens. Nothing about 72 dpi can ever do that. The major reason is because video does not work that way — the video system totally ignores any dpi number in image files, and shows the pixels directly. So for the video screen, you use whatever scan resolution that will create the image size you want to see. Or make it larger, and then you can always crop or resample it smaller to the desired size.

Image Size Goal for desired Scan or Print Size

[Interactive calculator on the original page: enter the size to scan or print, in inches or mm, and the dpi resolution, and it computes the image size goal in pixels.]

This simple calculator is added here to show how dpi is used, and it is all you need to serve the two general purposes of scanning and printing. There is another fancier scanning and printing calculator, but actually, the procedure is so easy and fully shown that you don't even need the calculator:

Scanning to print a copy at the same size is a very common goal. It's important to realize that an area scanned at 300 dpi will create the pixels necessary to also print the same size at 300 dpi. The concept either way is pixels per inch. And 300 dpi is likely what you want for a photo job. The one-hour print shops accept larger images, but many of their machines are set to use 250 dpi, which won't matter much. This dpi number does NOT need to be exact; 10% or 15% variation won't have great effect. Just scale it to print size. But planning the size to have sufficient pixels to print somewhere in the ballpark of 240 to 360 pixels per inch is a very good thing for printing. I suggest 300 dpi if you can.
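The same procedure can be written as one line of arithmetic. Here is a small Python sketch; the 300 dpi printing figure is from the paragraph above, while the 6 inch and 10 inch dimensions are made-up example numbers for an enlargement:

    # Scanning dpi needed to print one dimension at a chosen size and printing dpi
    def scan_dpi_needed(original_inches, printed_inches, print_dpi=300):
        # Scanning at print_dpi reproduces the original size; scale up for enlargements
        return print_dpi * (printed_inches / original_inches)

    print(scan_dpi_needed(6, 6))     # same-size copy: scan at 300 dpi
    print(scan_dpi_needed(6, 10))    # enlarging a 6 inch dimension to 10 inches: scan at 500 dpi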

The One True Way Video Actually Works

There is no concept of inches or cm in any video system. There are only pixels. We buy a monitor with its diagonal size dimensioned in inches. It doesn't matter to the computer what size, because it only shows pixels. This should be obvious, because regardless of the size of our monitor screen (inches), our video settings can only specify a certain screen size measured in pixels. The inches do not matter at all, because if we specify a 1920x1080 pixel screen size, then 1920x1080 pixels of video board memory will be used to build the screen image of that size. This 1920x1080 pixel image size requires a video board with at least (1920x1080 x 3 bytes RGB) = about 6 MB of video memory to hold that image. This is the purpose of that video board memory, to hold that 1920x1080 pixel screen image. That same 1920x1080 pixel memory image is directly output to the monitor cable, regardless of what size monitor we attach. We can connect any size of monitor (inches), and a big monitor will show that 1920x1080 pixel screen image bigger, and a small monitor will show it smaller, but if we specified 1920x1080 pixels, then we are definitely going to see the same 1920x1080 pixels, regardless of monitor size (be it cell phone screen size or huge HDTV wall screen size). Our video system is only about those 1920x1080 pixels.
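For the record, that memory figure is simple multiplication; a couple of lines of Python show it (3 bytes per pixel for 24-bit RGB, as in the text):

    # Memory needed to hold one 1920x1080 screen image at 3 bytes (24-bit RGB) per pixel
    width, height, bytes_per_pixel = 1920, 1080, 3
    size_bytes = width * height * bytes_per_pixel
    print(size_bytes, "bytes, about", round(size_bytes / 1e6, 1), "MB")   # 6220800 bytes, about 6.2 MB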

We can specify other screen dimensions in our computer software (but shouldn't, for best sharpness on an LCD monitor), but if we specify 1920x1080 pixels, then 1920x1080 pixels is what we get and see. 1920x1080 pixels is a popular choice, and I will refer to it generically here, to mean "whatever screen dimension setting you have chosen".

The video system does not even know nor care what size the monitor tube is. We can attach any monitor we wish, 12 inch or 27 inch, and nothing changes in the video system. The video system only knows about the 1920x1080 pixels, and we are going to see 1920x1080 pixels spread over the full screen of any monitor we attach to show it. Any embedded 412x324 pixel image (as above) is just some part of those total 1920x1080 screen pixels. The 412x324 pixel image will be shown at 412x324 pixels size, simply because that is its size. That is how video works. Inches are totally unknown, there are only pixels. And if no inches exist, then clearly any notion of dpi is extremely suspect too. This much is very obvious, just look at what you see.

In our world of digital images, dpi is for printing images on paper, or for scanning images from paper or film. Dpi means "pixels per inch" and it implies inches on paper, or film, or someplace where inches exist. Paper is dimensioned in inches, but video screens are dimensioned in pixels, and there is no concept of dpi in the video system.

Forgive my saying the same thing over and over, but some people are reluctant to imagine this immediately. The three images above are each 412x324 pixels in size. Digital images are dimensioned in pixels. Those pixel dimension numbers are extremely important, because video systems pay very careful attention to the image size in pixels. Images are dimensioned in pixels, and screens are dimensioned in pixels, and image pixels are directly displayed on the screen, one for one — one image pixel at one video board pixel location (assuming we view 100% image size). That is simply how video images are shown. Pixels are shown directly, without any concept of dpi at all. So these three images above were each shown to be 412x324 pixels in size, on any screen, simply because the images are in fact 412x324 pixels in size, and because screens show pixels directly.

Complicated today: We can zoom the web screen by using Windows text size zoom, and/or by zooming in the View menu of browsers. Windows zoom affects only the size of text (Control Panel - Display is one access point), and it does not affect image size in Windows. Except today, browsers zoom the images as well as the text, to preserve original web page layout. When browsers zoom the screen size, they compute as if the physical screen size actually got larger (a new imaginary size called CSS pixels), and the browser window sees a portion of that. Browsers differ, which makes reporting screen size messy.

Update: A messy distraction, sorry, I should delete some parts now. Things are changing in unpredictable ways. Video has never been "dpi aware", but since Windows Vista, Microsoft has been attempting to add optional "dpi aware" features anyway. A false screen-size report can happen if, in Windows 7 (or Vista), you have zoomed your browser's text size to be different. The browser then scales the image size with text size, to keep the web page proportions right. But now, we see that Windows 7 may scale your apparent screen size (smaller for larger text) - not on the actual desktop, but the "screen size" numbers seen by the browser window change in the same inverse proportion, and you may see those numbers of that false "screen size" when the text size is larger. If so, you can reset this size to default with the Firefox menu: View - Zoom - Reset, and then reload the page. Chrome has its menu at the right end of the toolbar. Or with the IE menu: View - Zoom - 100% (but IE might come out at something like 99% and still be slightly off). Your text will become tiny again though. There are ifs and buts about this; it is complicated and messy, and there is more below, so let's just say you might or might not see this issue.

(There is a setting about this, at Custom DPI, "Use Windows XP style DPI scaling", but it seems flakey. The issue is trying to apply inches to video at all, where there is no definition of inches, only pixels.)

Regardless, the idea is still the same — images are shown in video at their size in pixels, but then browsers will scale the images to follow the text size to maintain web page proportions. These new screen size numbers are only related to imaginary logical inches for text (next section), and are certainly not about any real dpi number in any image file.

The same image will appear at a different apparent size (inches) on any other size of screen, or on a different size monitor. Meaning, other people with different size screens (pixels or inches either) will see these same images at a different apparent size than you may see. For example, someone with a 640x480 pixel screen size (a cell phone maybe) will see these images larger, filling about 64% of their full 640 pixel screen width (and 67% of its 480 pixel height), on whatever size of monitor they have. But regardless, even then, the 412x324 pixel image will always still fill the same 412x324 pixels. This is because digital image size is measured in pixels, and screen size is also measured in pixels. It really couldn't be any simpler.

But the printer will certainly care about dpi, and these images will all print at three VERY different sizes on paper (at the sizes shown above) if saved and printed individually from a photo editor program's File - Print menu (not from a web browser).

The imaginary numbers 72 dpi or 96 dpi are never used in any way by the video system to show images on the screen. The 72 or 96 dpi number is about TEXT, and is NOT about images at all, and that number never affects any screen image (not directly). The video system simply shows pixels, directly, one for one, without any concept of dpi.

Again, a modern disclaimer about my text: Our Windows or Apple dpi setting, and also Windows or Apple browser zoom setting, does affect TEXT size on the screen. And today, modern web browsers now generally also modify image sizes (in pixels) to maintain a constant page size relationship with zoomed text size. Text size is discussed next.

The Origin of the Myth - One Shred of Truth

The imaginary numbers 72 and 96 dpi do sort of exist (in their imaginary way) in computer operating systems. This existence does seriously confuse people who imagine these numbers might be about showing images, but these numbers don't affect the images. These numbers are only used to compute Text Size on the screen. The reason is because text font size in Points is dimensioned in inches (designed for paper), but screens are only dimensioned in pixels. The definition of a Point is 1/72 inch — there are 72 points per real inch on paper. A 12 Point font is 12/72 inch height printed on paper (inches exist on paper, and paper is dimensioned in inches, and fonts are dimensioned in inches). But we are in real trouble on the screen — screens are only dimensioned in pixels. We desperately need some concept of inches to size this font, but unfortunately, there simply is no concept of inches on the screen. Only the 1920x1080 pixels exist there.

But not to worry, the operating system simply dreams up and uses some fake 72 and 96 dpi numbers to compute text size on the screen. If we invent some imaginary notion of 96 dpi SIZE (if we simply pretend 96 pixels will cover one inch on the screen, more or less, without asking "what size of screen?"), then we can use this to compute that the 12 point font should be shown to be (12/72 inch x 96 pixels per inch) = 16 pixels tall in Windows. 16 pixels is a perfectly usable screen dimension, and we're as happy now as if it were actually true, so long as we don't ask any questions.
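That point-to-pixel arithmetic is easy to state as code. A tiny Python sketch (96 dpi is the Windows logical-inch assumption, 72 dpi the Macintosh one, and 120 dpi the Windows Large Fonts setting mentioned below):

    # Font size in points converted to screen pixels via an assumed "logical inch" dpi
    def font_pixels(points, logical_dpi=96):
        return points / 72 * logical_dpi       # 72 points per (logical) inch

    print(font_pixels(12))         # 16.0 pixels in Windows (96 dpi)
    print(font_pixels(12, 72))     # 12.0 pixels with the Macintosh 72 dpi assumption
    print(font_pixels(12, 120))    # 20.0 pixels with Windows Large Fonts (120 dpi)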

Programmers call this concept Logical Inches. In computer science lingo, logical (as opposed to physical) means "This is NOT real, but let's use it anyway, as if it were real, because it is convenient to pretend it is real". Logical is how we are going to think of it, regardless of how it might really be physically. The easiest example is a disk file, which is perhaps several thousands of sequential bytes, but which some applications may treat as records of perhaps 80 characters each, or other formats (just because we say it is, and use it accordingly). That would be a logical record length, LRECL. That software simply reads record-length groups of bytes as if they were formatted records.

Video screens are composed of pixels and dimensioned in pixels, with no concept of inches, but some software still prefers to imagine the screen as if it were instead inches of paper (showing a text page on the screen, for example).

So the Macintosh and Windows operating systems assume one logical inch on the screen is some number of pixels (Macintosh 72 and Windows 96), because, well, because it would just be very convenient if it were somehow actually true. These logical inch numbers were chosen because it was halfway close on computer monitors, sometimes. The video system does not know the inch size of the monitor (some are cell phone size and some are wall TV size), so there simply is no concept of real inches, so it is necessary that we make something up. Real inches have an exact defined size, but a logical inch does not, it is whatever size the 96 pixels happen to be on this one specific screen, and the logical inch will be a different length on a different screen. Don't confuse "logical inches" with "real inches" — very different concepts.

Windows does have a video adjustment for this 96 dpi number for some cases. The Windows Large Font setting can assume a different number to show the text at a different size (the Large Font default is 120 dpi). But otherwise the same imaginary 96 dpi or 120 dpi number is used for text on any and every screen, regardless of any details about the size of that system. No image is affected in any way by this number — instead we are just playing fast and loose with the inch for text fonts. A logical inch will be whatever size the 96 pixels actually are. This is a very well-known situation, and yes, the programmers know this is not true (at least those that work at this level). It is trivial to show that screens only display pixels directly, like those 1920x1080 pixels. No video system knows or uses ANY concept of dpi. But as a rough guess about text size, this logical inch idea is actually not too far wrong (which is why these numbers were used).

What should the accurate number be? There is really no answer possible, because the actual real world screen size numbers vary quite a bit. For example, a 15 inch CRT monitor screen might measure 10.6 inches horizontally (its actual bright image area). If it is currently set to 800x600 pixels screen size, then the image is obviously 800 dots / 10.6 inches = 75 dpi (pixels per inch) apparent resolution in that case.

But reset the same monitor to different screen sizes, and we get very different numbers. Monitors vary, even 15 inch monitors vary, but I measured a few screen widths in inches, and computed this for different CRT monitor sizes:

APPARENT resolution of different size CRT monitor screens

Screen Size    14 inch          15 inch          17 inch          19 inch          21 inch
(pixels)       (9.7 in wide)    (10.6 in wide)   (12.5 in wide)   (14.4 in wide)   (15.9 in wide)
640x480        66 dpi           60 dpi           51 dpi           44 dpi           40 dpi
800x600        82 dpi           75 dpi           64 dpi           56 dpi           50 dpi
1024x768       106 dpi          97 dpi           82 dpi           71 dpi           64 dpi
1152x864       119 dpi          109 dpi          92 dpi           80 dpi           72 dpi
1280x1024      132 dpi          121 dpi          102 dpi          89 dpi           80 dpi
1600x1200      165 dpi          151 dpi          128 dpi          111 dpi          101 dpi
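The chart is nothing more than the horizontal pixel count divided by the measured screen width in inches. A few lines of Python reproduce it to within a dpi of rounding:

    # Apparent CRT resolution = horizontal pixels / measured screen width in inches
    widths = {"14 in": 9.7, "15 in": 10.6, "17 in": 12.5, "19 in": 14.4, "21 in": 15.9}
    for pixels in (640, 800, 1024, 1152, 1280, 1600):
        row = {name: round(pixels / inches) for name, inches in widths.items()}
        print(pixels, row)     # e.g. 800 gives about 75 dpi on the 15 inch (10.6 in wide) screen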

The CRT screen is analog, which has inches, but has no pixels. Your computer video graphics card creates the pixels to be any screen size you choose to specify, like 1024x768. Your image is created in those video graphics card pixels. Then these pixels are output to the CRT as a stream of colored dots, which are shown as the analog sweep beam moves across the screen width.

The chart above is the measured width of 4:3 CRT monitors, from back in the day when we were concerned with which screen size to show on our CRT, a choice which was entirely about readable text size. Text is shown at a specified size in pixels (in the computer's graphics board), so showing more pixels on a smaller screen makes the pixels even smaller, and then text rapidly becomes too small to read easily, even on a large screen. CRT monitors were NOT pixel based, but our popular screen size choices tended to be the graphics board resolutions marked in yellow, as the best compromise between a larger screen size (in pixels) and a readable text size.

But for best sharpness on an LCD screen (which is pixel based), your video card should specify the same size screen as the native resolution of your LCD digital screen. The chart below is computed for LCD monitors (from the advertised diagonal dimension), and today, desktop monitors commonly are 1920x1080 pixel HDTV size (some smaller or less expensive HD sets instead may show 1280x720 or 1366x768 pixels).

Cell phone screen resolution varies with model, but exceeds 300 dpi today. That screen resolution dpi number on your phone or monitor is computed this way: (same as in next table)

Apparent LCD screen dpi = (pixels of the width or height dimension) / (screen inches of that same dimension)
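Since LCD monitors and phones are usually advertised by their diagonal size, the same calculation is often done from the diagonal instead. A small Python sketch of that; the 24 inch 1920x1080 monitor and the 6.1 inch 2532x1170 phone screen are made-up example figures, not from this page:

    # Apparent ppi from pixel dimensions and the advertised diagonal in inches
    from math import hypot

    def screen_ppi(px_w, px_h, diagonal_inches):
        return hypot(px_w, px_h) / diagonal_inches    # diagonal pixels / diagonal inches

    print(round(screen_ppi(1920, 1080, 24)))     # about 92 ppi for a 24 inch desktop monitor
    print(round(screen_ppi(2532, 1170, 6.1)))    # about 457 ppi for a phone-sized screen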

DCI and SMPTE are cinema specifications. ATSC is North American digital television specifications.
I made up the term "CS" to denote the Cinemascope crop

Digital TV video screen sizes

ATSC HDTV      1280x720 pixels     16:9      720p HD
ATSC HDTV      1920x1080 pixels    16:9      1080i Full size HD
wide           1366x768 pixels     16:9      Early wide screens
DCI 2K         2048x1080 pixels    1.90:1    ~17:9
DCI 2K (CS)    2048x858 pixels     2.39:1    Cinemascope crop
SMPTE UHD      3840x2160 pixels    16:9      Ultra HD *
DCI 4K         4096x2160 pixels    1.90:1    2160p  ~17:9
DCI 4K (CS)    4096x1716 pixels    2.39:1    Cinemascope crop
SMPTE 8K       7680x4320 pixels    16:9
DCI 8K         8192x4320 pixels    1.90:1    ~17:9

* UHD (Ultra HD) is commonly called 4K, but the two are technically different formats, although they are in fact fairly close in size (about 6.7% difference between the 4096 and 3840 pixel dimensions). The actual binary 4K number is 4096, but UHD is 3840 pixels. UHD is what is in the TV sets and broadcasting available in consumer markets. The DCI 2K and 4K aspect ratios are 1.896:1 (256:135, approximately 17.06:9; if 4096x2304, it would be 16:9). But in the 4K market, UHD is the consumer standard, at 16:9 aspect ratio (1.777:1). UHD dimensions are 2x those of 1920x1080 HD (4x more total pixels), which maintains the same 16:9 aspect shape. Some streaming and satellite channels and some Blu-ray discs do offer UHD media (which is often called 4K), but no broadcast TV channels can offer 4K or UHD.

There are two sequences here.
HDTV 1920x1080, UHD 3840x2160, and SMPTE 8K 7680x4320 each have twice the pixel dimensions of the prior one.
DCI 2K, 4K, and 8K each double the same way, and are just slightly wider than the first sequence.
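A quick arithmetic check of those relationships, in Python (the numbers are from the table above):

    # Aspect ratios and size relationships of the formats above
    print(3840 / 2160)           # 1.777... = 16:9 (UHD)
    print(4096 / 2160)           # 1.896... = 256:135, roughly 17:9 (DCI 4K)
    print(4096 / 3840 - 1)       # 0.0666... = about 6.7% wider than UHD
    print(3840 == 2 * 1920, 2160 == 2 * 1080)   # True True: UHD is 2x HD in each dimension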

1920x1080 HD screens (1920 is nearly 2K = 2048) are called 1080i, the i meaning Interlaced, meaning every other pixel line is shown in one pass, and the next pass 1/60 second later shows the remaining lines (it requires two passes to show one complete frame, which could be thought of as 1/30 second intervals). Any fast action on the screen (like football games) could change the scene in the 1/60 second between these two passes, which could blur fast action a little.
1080p broadcast would require much greater bandwidth, which won't fit in the channel frequency assignments, so it is not done.

The old NTSC analog TV was broadcast as 480i. Today, TV stations often also transmit a few subchannels, often syndicated channels of old movies or old TV series, which are usually 480i SD or some are 720p HD. Here is a summary. Many of these channels are on cable too. Depending on your age, this may be some of the best TV.

1280x720 HD screens are called 720p, the p meaning Progressive, meaning all lines of pixels are shown in order from top to bottom each 1/60 second. That is a little less resolution, and still a 1/60 second refresh, but each pass shows the complete frame, so fast action can be more clear, which is why sports networks such as ESPN broadcast in the 720p HD format.

Wikipedia has a list of resolutions used by USA broadcast and cable channels. Major broadcast stations are 1280x720 or 1920x1080. USA broadcast networks ABC and Fox are 720p, and CBS, NBC and PBS are 1080i. Cable channels vary between these choices.

4K TV sets are actually UHD, which might be called 4K UHD, and are spec'd as 2160p (but UHD cannot be broadcast over the air; it comes from cable, satellite, streaming, or disc, as mentioned above).

These computed screen dpi values above are merely calculations after the fact — calculating the apparent size of the existing image pixels. The same image will be shown on any connected monitor. We can say this calculation is the apparent resolution of the result on the one specific screen, but the video system does NOT know about those dpi numbers, and did not use any such dpi or inch numbers to show the image on the screen. The numbers that are important for showing images on our screens are the screen size numbers in the first column. If we have a 800x600 pixel screen, that is what we see, and then we know exactly how a 400x300 pixel image will fit on it.

That screen size in pixels is called the resolution of video systems, and it is selected and named "Screen resolution" in the Windows Control Panel ("Display"). Today, LED monitors are digital and can specify their actual native default, which should be retained and used. This specified size of image is created by the OS in your video card memory. Then this specified video card image is shown on your monitor, regardless of any monitor details (and if LED, it's good if the two pixel grids match). We needed a size combination producing about 70 dpi for the CRT screen to look decent, and 80 dpi was better. Today many LED screens are likely about 100 dpi. This chart shows why the larger screen sizes like 1600x1200 pixels were called "high resolution". More pixels (if actually existing) in the same screen area are smaller pixels, with more potential for detail on the overall screen. This means that we could then show a larger image, more pixels with more detail.

When we show a too-big image (larger than our viewing screen or window, everything is dimensioned in pixels), our viewing software normally instead shows us a temporary quickly resampled copy, small enough to fit on the screen so we can see it, for example perhaps 1/4 actual size (this fraction is normally indicated to us in photo editors, so we know it is not the full size real data). We still see the pixels of that smaller copy presented directly, one for one, on the screen, which is the only way video can show images. When we edit it, we change the corresponding pixels in the original large image data, but we still see a new smaller resampled copy of those changes. So the screen is still showing us that resampled copy at this apparent computed resolution. Video simply shows pixels "one for one" on the screen.
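That fit-to-window step is just a scale factor, the largest zoom that still fits. A minimal Python sketch; the 4200x2760 pixel image is the scan example from earlier on this page, while the 1600x900 pixel window is a made-up figure:

    # Zoom needed to fit a large image into a smaller viewing window
    def fit_zoom(img_w, img_h, win_w, win_h):
        return min(win_w / img_w, win_h / img_h, 1.0)    # never enlarge past 100%

    zoom = fit_zoom(4200, 2760, 1600, 900)
    print(f"{zoom:.0%}")                            # about 33%, roughly 1/3 size
    print(round(4200 * zoom), round(2760 * zoom))   # the resampled copy, about 1370 x 900 pixels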

However the three existing 412x324 pixel images (above) will still be the same existing 412x324 pixels, and this means that as the screen dimension is made larger, the apparent size of these images gets smaller, because they are the same 412x324 pixels as before. This existing image size is now a smaller percentage of the larger full screen size, so the same pixels cover fewer inches on the larger screen. Printers do use dpi to cover inches of paper, but video screens do not use dpi numbers. Video screens are dimensioned in pixels, so they simply show the image pixels directly as pixels on their own pixels.

And this apparent size reduction on a larger screen also happens to text fonts (which are dimensioned in pixels on the screen by using 96 dpi logical inches), so the same text gets smaller on a larger screen too. If you change a 800x600 pixel screen to be 1600x1200 pixels, everything including the text will become half size. It would be called higher resolution, but you may not be able to read it then (unless you also bought a much larger monitor to compensate). Video screens are all about pixels, and you can see all of this happen on your own screen.

Our monitors and screen sizes vary greatly, but what we had in common is that when using smaller CRT monitors, many of us generally adjusted our screen size as large as we could to see more page area, but still be able to read the text as it got smaller. And since we must all be able to read text, the text size usually comes out about the same size. We usually chose one of the common combinations marked in yellow in the chart above, which narrows the probable range, so that most of us are often perhaps within 30% on text size (except maybe the larger laptop screens). When the text size is too small at the larger screen settings, you can increase the size of your logical inches with the "Large Fonts" option in the same Windows video settings. For good readability, you will probably prefer text logical inches maybe 20% to 30% larger than your number in the chart above.

So the assumed 96 dpi logical inch in Windows, and the 72 dpi logical inch for Macintosh, are somewhat close to the size of our screens, so these defaults are at least a crude approximation for appropriate text size in inches. But there is no one "correct" or better guess, because screens vary. This Logical Inch guess definitely is a valiant try for an impossible problem (there simply is no concept of inches in the video system). It makes it possible to show fonts (dimensioned in points) on the screen. But it is not very accurate, because screens in fact vary in size and settings. This is why our word processors usually give us a Zoom control to see the text page any size we want to see it.

Printing routinely requires images that are much larger than our screen size. When viewed at 100% Actual size, a 300 dpi size image for printing will appear 4x printed size on a 75 dpi size video screen, simply because screens do in fact show pixels directly, and that is how many pixels there are. So photo programs normally also provide a zoom control that will resize an image to show it any size we want to see it on the screen. The original image data is not affected by this temporary zoom, and that original data is still what will print, but what we see on the screen is a temporary copy which is different resampled pixels (unless we specify showing it at 100% Actual size). The image window title bar shows the new size (normally as a percentage) as a warning that we are not seeing the actual data pixels. The initial default is usually a smaller size that will fit in the program window. But whatever new size we select to see, the new copy's resampled pixels are still simply shown directly, pixel for pixel. The video system has no other way to show pixels.

Images shown same size on the video screen at 75 dpi to 96 dpi can look better than a print with 3x or 4x the pixels printed at 300 dpi (better color depth). The devices are simply different concepts. A paper print is viewed in the light reflected from the ink on the paper, where the screen is directly transmitted light. Color changes are the only detail in images. The video screen sees colors as three RGB pixels, each can be any value 0 to 255. The ink jet printer only has four colors of CMYK ink, and must print several or more dots of ink with a mix to approximate one of the 16.78 million possible 24-bit pixel colors. These ink drops cannot all fit into the area space of one 300 dpi image pixel, and to average out closer, some pixels are left a bit brighter and some are left a bit darker, called dithering. Commercial lab chemical color prints (real photo paper) are better, continuous tone.

Again, logical inches are the false (but nevertheless very useful for text) notion that 96 pixels define one imaginary inch on any Windows screen. We must overlook that there will actually be more pixels in a real inch on some screens, and fewer on others, but it is usually not greatly wrong. But be very clear that logical inches are only used to resize inch dimensions into pixel dimensions for the screen (this can include computer-drawn graphics, but it is normally about text font size in points). Logical inches are NOT related to showing bitmap images. This notion is not needed for images, nor is it even usable for images, because images already have a size in pixels, and images are simply shown according to their actual size in pixels, pixel for pixel. It could appear otherwise in some degree however, because today as we change the web browser font size for text, the images are usually resized to retain a similar page layout.

Continued - Apple and Microsoft

Copyright © 1997-2024 by Wayne Fulton - All rights are reserved.
