There are two very different ways to use images: printing on paper, or viewing on video screens.
We scan to suit the capability of our output device.
We choose the scan resolution based strictly on the needs of the output device that will process that image. At home, that output device is normally a printer or a video monitor.
Video monitors and printers work very differently from each other, and must be discussed one at a time. All of the rules are different for images intended for these two devices. The following material details the significance of these differences. (When I say "video", I mean the video monitor screen.)
All of the points below will be covered.
|Printing|Video screen|
|---|---|
|Image size is measured on paper in inches or centimeters (paper size is also measured in inches)|Image size is measured on the screen in pixels (screen size is also measured in pixels)|
|Image size does NOT vary with scanned resolution|Image size varies with scanned resolution|
|Image size is modified on paper by scaling|Image size is modified on screen by resampling|
|Image pixels are spaced on paper using specified scaled resolution (dpi)|Image pixels are located at each screen pixel location, one for one|
|Several printer ink dots are used to represent the color of one image pixel|One screen pixel location contains one image pixel, which can be of any RGB value|
Because of these great and fundamental differences, when this text says "it's this way" or "it's that way", notice that it also says "for printing" or "for video". Don't take them out of context, because the two modes are very different, with different properties and concerns.
There are generally two different goals for scanning an image: to show it on the video screen, or to print it on paper. These uses have different rules.
The purpose of high resolution is to provide for enlargement, so an important way to think of it is this way:
We normally want to print photos at about 300 dpi on paper (line art maybe 600 dpi). So to print a copy at original size, also scan photos at 300 dpi.
To create sufficient pixels for print enlargement, the ratio of (scanning resolution / printing resolution) is the enlargement factor. For example,
Scan at 300 dpi, print at 300 dpi, for 300/300 = 1X size (prints original size or 100% size)
Scan at 150 dpi, print at 300 dpi, for 150/300 = 1/2X size (prints half size or 50% size)
Film is small, and so needs more enlargement, but still the same rule. One example is to scan 35 mm film at 2700 dpi, and print at 300 dpi, for 2700/300 = 9X size enlargement. 9X is about 8x12 inches (about A4 size) from full frame 35 mm (about 0.9 x 1.4 inches). The ratio of (scanning resolution / printing resolution) is the print enlargement factor.
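This enlargement rule is just a ratio, and can be sketched as a tiny calculation (the function name here is mine, not from any scanner software):

```python
def enlargement_factor(scan_dpi, print_dpi):
    """Print enlargement = scanning resolution / printing resolution."""
    return scan_dpi / print_dpi

# Scan at 300 dpi, print at 300 dpi: 1X, original size
assert enlargement_factor(300, 300) == 1.0

# Scan 35 mm film at 2700 dpi, print at 300 dpi: 9X enlargement.
# Full frame 35 mm is about 0.9 x 1.4 inches, so 9X is about 8 x 12 inches.
factor = enlargement_factor(2700, 300)
print(factor, 0.9 * factor, 1.4 * factor)
```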
FWIW, film is designed with fine grain, so it enlarges well. Frankly, prints don't enlarge so well; 2X might be acceptable, but you may not like 3X. Prints are already enlarged, and photo paper has coarser grain, not designed to be enlarged again. The film is the original master source. Or digital cameras provide the original source file. Either way, it is better to go back to the original source than to scan a print copy.
But to scan an 8.5x11 inch page at high resolution like 2700 dpi is surely nonsense, a misunderstanding. You're not planning a 9X enlargement, and you will have no use for such a large image. The concept is about desired enlargement, a goal for a purpose. When you just want to print a copy of a photo print or a page at original size, just automatically scan and print at 300 dpi. One exception: if and only if the scan is line art mode, then scanning and printing at 600 dpi (600/600 is still original size) can give better line art quality on a good printer.
Dpi is pixels per inch (naysayers, see below). Therefore, the numeric requirement is that if you want to print the image as 8x10 inch size at 300 dpi, then this obviously will require an image size of:
(8 inches x 300 dpi) x (10 inches x 300 dpi) = 2400x3000 pixels.
Sufficient pixels to print at 240 to 300 dpi is optimum for printing photo images. This is true for images sent to online printing services, and true of your own inkjet printer too. More pixels really cannot help the printer, but very much less is detrimental to quality. This is very simple, but it is essential to know and keep track of. This simple little calculation will show the image size needed for optimum photo printing.
This print resolution does NOT need to be exact at all, but somewhere in this ballpark (250 to 300 pixels per inch) is a very good thing for printing. There are more details of essential digital basics HERE.
If you scan 8x10 inches at 300 dpi, then 2400x3000 pixels is what you will have (assuming 100% scaling).
If you scan 4x5 inches at 600 dpi, then 2400x3000 pixels is what you will have (the rule above).
This is called "scaling", and this enlargement concept is true for scanning anything, photo prints, documents, film, etc.
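The "inches x dpi" arithmetic above can be written as a minimal sketch (again, a made-up helper, only to show the numbers work out):

```python
def pixels_for_print(width_inches, height_inches, print_dpi=300):
    """Pixels needed to print the given paper size at the given dpi."""
    return (round(width_inches * print_dpi), round(height_inches * print_dpi))

# 8x10 inches at 300 dpi requires 2400x3000 pixels
assert pixels_for_print(8, 10, 300) == (2400, 3000)

# Scanning 4x5 inches at 600 dpi creates the same 2400x3000 pixels,
# which prints 8x10 inches at 300 dpi (600/300 = 2X enlargement)
assert pixels_for_print(4, 5, 600) == (2400, 3000)
```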
The role of scan resolution for the video screen is to create the appropriate image size.
Video screens are dimensioned in pixels, and images are dimensioned in pixels, so image pixels are pretty much displayed one for one on the screen. Your target goal is simply the size of image that you want, in pixels. Inches are a factor on paper, but inches are not used on the screen (the same image shows at different sizes on different screens). A 600x400 pixel image will fill 600x400 pixels of a larger video screen.
Again, scanning 6x4 inches at 100 dpi will produce (6 inches x 100 dpi) x (4 inches x 100 dpi) = 600x400 pixel of image size. Plug in the appropriate numbers to get the size image you want (in pixels) from what you are scanning. dpi = pixels per inch.
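Plugging in the numbers can also run in the other direction: given the pixel size you want and the inches you are scanning, the formula gives the scan resolution to use (a small sketch with an invented function name):

```python
def scan_dpi_needed(target_pixels, scan_inches):
    """Scan resolution needed to get target_pixels from scan_inches of original."""
    return target_pixels / scan_inches

# To get a 600 pixel wide image from a 6 inch wide print, scan at 100 dpi
print(scan_dpi_needed(600, 6))
```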
The scanner menu should compute this math for you. The typical scanning menu shows, in advance, the size of the input area to be scanned, the resolution and any enlargement factor, and the size of the output image in pixels and/or inches. This is true of scanning to print too; there are a couple of ways to do it, but you can simply make the numbers show your goal.
Your video screen is set to display one of various sizes, say 1366x768 pixels. The concept is that this 600x400 pixel image will fill 600x400 pixels of that 1366x768 pixel screen (in this example, 600/1366, or about 44% of the screen width). One common exception: if the image is larger than the screen (both in pixels), the viewing software will typically resample a smaller copy which fits on the screen, and show that instead. And of course, not every viewer's screen is the same size as yours, so many people will see a different size than you see (on their screen, if it differs in pixels or inches).
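That screen-fill percentage is the same one-for-one pixel idea, as a quick check (numbers are the example's, nothing more):

```python
# A 600x400 pixel image shown one-for-one on a 1366x768 pixel screen
image_width, screen_width = 600, 1366
fill = image_width / screen_width
print(f"Image fills {fill:.0%} of the screen width")
```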
That dpi concept is true for printing goals too, meaning that if you plan to print 8x10 inches at 300 dpi, then of course in preparation, you need to create in the ballpark of (8 inches x 300 dpi) x (10 inches x 300 dpi) = 2400x3000 pixels. It doesn't need to be precisely that, ± 10% of 300 dpi won't matter much, but it should be ballpark.
This will all be covered in more detail, but that is pretty much the essential basics.
The next page will discuss "What is a pixel?", but before we get started, a note relating to context usage of "dpi":
Printer ink dots and image pixels are very different concepts, but both use the term dpi in their own way (dots per inch).
Inkjet printer dpi ratings refer to printer ink dots (the four colors of ink), which is NOT AT ALL the same thing as image pixels. These are such different concepts that some people imagine we ought to reserve the term dpi for those inkjet ink dots, and reserve use of ppi only for image pixels. Not really a bad plan, except that this view fails to recognize real world usage.
We may hear scanning resolution called spi (samples per inch), and that is indeed what it is. We often hear image resolution called ppi (pixels per inch), and that is indeed what it is. The spi and ppi terms are correct. But historical and common usage has always said dpi for image resolution, meaning pixels per inch, fully interchangeable with ppi. Pixels are conceptually a kind of colored dot too, and resolution was called dpi for years before we had inkjet printers. Dpi is just jargon perhaps, but it is a fact of life, and always has been. Scanners and scanner ratings say dpi too, meaning pixels per inch. I habitually say dpi myself, an old habit.
We may use the term of our own preference, but we need to understand it both ways. Some photo editor programs have switched to saying ppi now, which has much to be said for it. But others have not switched, so insisting on conformity for others to only say ppi will necessarily encounter much frustration, because the real world simply isn't that way, and obviously is not ready to switch yet.
My point here is that we must understand it both ways, because we will see it both ways, often, in the real world.
It's easy, not a problem - the idea of printing digital images is always about pixels per inch, so when the context pertains to images instead of printers, all of these terms, spi, ppi, and dpi, are exactly the same equivalent concept - they all mean pixels per inch.
There is no problem understanding any use of dpi if you know the context. It always means the only thing it can possibly mean. If the context pertains to images or printing pixels, dpi means "pixels per inch". If the context pertains to inkjet printer quality ratings, dpi means "ink dots per inch". There is no other meaning possible. This should be clear and no big deal - the English language is full of multiple context definitions.
So yes, inkjet printer rating dpi is something entirely different, referring to inkjet printer ink drops instead of image pixels. Ink drops are about print quality (dithering quality), NOT about printing resolution. The inkjet printer has only three or four colors of ink (or maybe double that many in some cases), nowhere near the 16.7 million color possibilities of 24 bit color, one of which the pixel might be.

The concept of a pixel is in fact a colored dot, and the printer's goal is to reproduce the color of that dot. But the limited inkjet printer cannot directly reproduce the color of a pixel. It can only approximate the pixel's color with multiple ink drops chosen from maybe six colors of ink. This method of simulating colors by combining multiple individual ink drops is called dithering. If it is, say, a 250 dpi pixel (size 1/250 inch), the printer must make several ink dots of its few CMYK ink colors, located on perhaps 1200 or 1440 dpi spacing within that pixel's 1/250 inch space (not very exactly, see the Printer Basics section). Any color error is carried over to adjacent neighbor pixels, which are then intentionally tinted the opposite way to compensate. The printer is trying the best it possibly can to reproduce the pixel's color, but inkjets cannot reproduce colored pixels directly.

The point is, image pixels and inkjet printer ink drops are NOT the same thing at all. Yet the spacing of both is called dpi, with very different meanings, understood in context of use. If it is about ink drops, it is about ink drops. If it is about pixels, it is about pixels. Image files do not contain any ink drops, so any discussion about images is about pixels. Seems simple.
However, continuous tone printers (dye-subs, and Fuji Frontier types) don't print discrete ink dots of three colors like inkjet printers must - instead they mix the color of the pixel directly, and they print pixels (called continuous tone). There are no dithered ink dots then, just pixels. But these printer ratings still refer to the spacing of those image pixels with the term dpi, simply because dpi has always been the name for "pixels per inch".
Scanner ratings also always say dpi, referring to pixels of course (scanners create pixels; there are no ink drops in scanners). You have never heard of a 4800 ppi scanner.
The formal technical specifications at the very heart of our digital imaging definitions use dpi.
Dwelling on the obvious, but that is good enough for me. There are no ink drops in image files. Those fundamental and elite specification documents written by the experts do not use ppi one time; dpi has simply always been the name of it. I always say dpi too, for the same reason: that has always been the name for pixel resolution.
Yes, sure, the term dpi does have multiple meanings, same as most English words do. This confuses some people who dream up their own imagined restrictions, demanding the term dpi ought to be reserved only for printer ink drops (simply their notion, about how they wish things were instead). They apparently don't know the real world term for image resolution has always been dpi. (Sorry, yes, I am going too far here, just for them)
Also, search Google for the phrases "72 dpi" and "72 ppi" (using the quotes to search a phrase). We all know "72 dpi" is about video, and has absolutely nothing to do with ink drops, but look at the count of the hits... in 2016, I see 9.56 million for "72 dpi", and only 32.5K for "72 ppi". That's about 300 to 1, more than two orders of magnitude, an overwhelming difference in popular usage. I am just teasing the troops, but some of them should "get over it". Dpi is the name of it, always has been, and we might as well get used to it.
Ppi is a relatively new term, we never saw it until recent years, but we are seeing ppi used some now, and it seems a perfectly fine name too, since dpi with respect to images does mean "pixels per inch". It might even have been a good plan, but it was not the plan. It may sound a bit silly to pronounce ppi, but recent photo editor software often does say ppi, while scanner software generally says dpi (but we see some exceptions to both). But usage makes either term correct now, even if the long established name for image resolution has always been dpi, for many years before inkjet printers could print photo images. Nothing makes ppi mandatory.
I am not arguing for the use of dpi. I don't care which you use, I understand it either way. I am just saying you should understand both too. My own preference has always been dpi (not that my choice is important, but that's what it's always been called). Mainly, my rant is simply explaining why you need to expect to see it used both ways. It's OK with me if you want to use your own preference, since both terms mean pixels per inch. We will all understand it either way, and you should too. BUT - again - regardless of your own preference, you definitely will often see both dpi or ppi used, so for your benefit, in the real world, you MUST understand what you read both ways. It should not even be confusing. If about images, it can only be about pixels. If about printers, it may be about ink drops. Both can be dots. That's simply how things are (and you already know that). Think of this as training to understand what you will see elsewhere.
I am not arguing which is better, or how things could have been, or ought to have been. It simply wasn't, and I am arguing what actually is. Facts are about what is. The important thing to me (and my big peeve) is that it certainly does the beginner no favor for the "holier than thou" to stand up and incorrectly shout that dpi is wrong, and that everything we read everywhere is therefore wrong. That is confusion indeed, harmful, not helpful, since it does not promote understanding of the real world that actually is. That is my fight. The only reasonable action is to instead simply state that both terms are used, they mean the same thing.
(to those so-called "advisors" - remember, when everyone except you is wrong, you probably don't understand the situation.)
The only proper instruction is that "both terms are used, interchangeably. Expect to see either. To understand what we read, we must understand it either way."
There is really no problem understanding the two uses of the word dpi if you know the basics, and realize the context. It always means the ONLY thing it can possibly mean in context. This should be no big deal, the English language thrives on multiple context definitions. If it is about ink drops, it is about ink drops. If it is about pixels, it is about pixels. If it is about the images themselves, it is not about ink drops.
If the usage context pertains to images or printing pixels (and it almost always does), then dpi always means "pixels per inch". So does ppi, same thing exactly. It cannot mean anything else, printing is about spacing pixels on paper. The two terms are fully interchangeable, use either according to your whim (but you gotta understand it both ways). If we have a 300 dpi image, both terms mean it will print at 300 pixels per inch (pixel spacing on paper), so that 300 pixels will cover one inch.
If the usage context pertains to inkjet printer ink dot ratings, dpi means "ink drops per inch" (but since the ink drops are actually larger than their spacing, the rating is more specifically about carriage and paper motor stepping intervals). If the printer rating is 1440 dpi, it means its motors can space 1440 ink dots per inch while trying to simulate the color of the pixels in that 300 dpi image. The pixels still remain 300 dpi size (as best as they can be reproduced). Most printer drivers have renamed this now anyway, as Quality, offering Good, Better, Best Quality, or maybe Fast, Standard, High Quality. This ink drop spacing is a quality parameter, about reproducing those pixels to the degree possible - it is not an image resolution parameter.