A few scanning tips
There are two very different ways to use images: printing and video screens.
We scan for the capability of our output device.
We choose the scan resolution based strictly on the needs of the output device that will process that image. At home, that output device is normally a printer or a video monitor.
Video monitors and printers work very differently from each other, and must be discussed one at a time. All of the rules are different for images intended for these two devices. The following material details the significance of these differences. (When I say "video", I mean the video monitor screen.)
All of the points below will be covered.
| Printing (paper) | Video (screen) |
|---|---|
| Image size is measured on paper in inches or centimeters (paper size is also measured in inches) | Image size is measured on the screen in pixels (screen size is also measured in pixels) |
| Image size does NOT vary with scanned resolution | Image size varies with scanned resolution |
| Image size is modified on paper by scaling | Image size is modified on screen by resampling |
| Image pixels are spaced on paper using the specified scaled resolution (dpi) | Image pixels are located at each screen pixel location, one for one |
| Several printer ink dots are used to represent the color of one image pixel | One screen pixel location contains one image pixel, which can be of any RGB value |
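The two sizing rules in the table can be sketched as simple arithmetic. A minimal sketch (all numbers here are hypothetical examples chosen for illustration, not values from the text):

```python
# Printing: size on paper (inches) = pixels / scaled resolution (dpi).
image_width_px = 3000          # hypothetical scanned image width in pixels
print_dpi = 300                # hypothetical scaled resolution for printing
print_width_inches = image_width_px / print_dpi
print(print_width_inches)      # 10.0 inches of paper covered

# Video: the screen shows image pixels one for one, so "size" is just pixels.
screen_width_px = 1920         # hypothetical monitor width in pixels
fraction_of_screen = image_width_px / screen_width_px
print(round(fraction_of_screen, 2))  # 1.56 - wider than the screen
```

Note how the same image file comes out as "10 inches" for printing but "wider than a 1920 pixel screen" for video; the two answers use entirely different units.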
So because of these great and fundamental differences, when this text says "it's this way" or "it's that way", then notice that it also says "for printing" or "for video". Don't get them out of context, because the two modes are very different, with different properties and concerns.
The next page will discuss "What is a pixel?", but before we get started, a note relating to context usage of "dpi":
Printer ink dots and image pixels are very different concepts, but both use the term dpi in their own way (dots per inch).
Inkjet printer dpi ratings refer to printer ink dots (the four colors of ink), which is NOT AT ALL the same thing as image pixels. These are such different concepts that some people imagine we ought to reserve the term dpi for those inkjet ink dots, and reserve use of ppi only for image pixels. Not really a bad plan, except that this view fails to recognize real world usage.
We may hear scanning resolution called spi (Samples Per Inch), and that is indeed what it is. We often hear image resolution called ppi (Pixels Per Inch), and that is indeed what it is. The spi and ppi terms are correct. But historical and common usage has always said dpi for image resolution, meaning pixels per inch, and fully interchangeable with ppi. Pixels are conceptually a kind of colored dot too, and resolution was called dpi for years before we had inkjet printers. Dpi is just jargon perhaps, but it is a fact of life, and always has been. Scanners and scanner ratings say dpi too, meaning pixels per inch. I habitually say dpi myself.
We may use the term of our own preference, but we need to understand it both ways. Some photo editor programs have switched to saying ppi now, which has much to be said for it. But others have not switched, so insisting that everyone say only ppi will meet much frustration, because the real world simply isn't that way, and is obviously not ready to switch yet.
My point here is that we must understand it both ways, because we will see it both ways, often, in the real world.
It's easy, not a problem: printing digital images is always about pixels per inch, so when the context pertains to images instead of printers, all three terms - spi, ppi, and dpi - mean exactly the same thing: pixels per inch.
There is no problem understanding any use of dpi if you know the context. It always means the only thing it can possibly mean. If the context pertains to images or printing pixels, dpi means "pixels per inch". If the context pertains to inkjet printer quality ratings, dpi means "ink dots per inch". There is no other meaning possible. This should be clear and no big deal - the English language is full of multiple context definitions.
So yes, inkjet printer rating dpi is something entirely different, referring to inkjet ink drops instead of image pixels. The inkjet printer has only three or four colors of ink - or maybe double that many in some cases - nowhere near the 16.7 million possible colors of 24-bit color, any one of which a pixel might be. A pixel is in fact a colored dot, and the printer's goal is to reproduce the color of that dot. But the limited inkjet printer cannot reproduce the color of a pixel directly. It can only approximate the pixel's color with multiple ink drops chosen from maybe six colors of ink. This method of simulating colors by combining multiple individual ink drops is called Dithering.

If it is, say, a 250 dpi pixel (size 1/250 inch), the printer must place several ink dots of its few CMYK ink colors, spaced at perhaps 1200 or 1440 dpi, within that pixel's 1/250 inch space (not very exactly, see the Printer Basics section). Any color error is carried over to adjacent neighbor pixels, which are then intentionally tinted the opposite way to compensate. The printer tries as best it can to reproduce the pixel's color, but inkjets cannot reproduce colored pixels directly.

The point is, image pixels and inkjet printer ink drops are NOT the same thing at all. Yet the spacing of both is called dpi, with very different meanings, understood in context of use. If it is about ink drops, it is about ink drops. If it is about pixels, it is about pixels. Image files do not contain any ink drops, so any discussion about images is about pixels. Seems simple.
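The ink-dot arithmetic above can be sketched directly. Using the 250 dpi pixel and 1440 dpi ink-dot spacing mentioned in the text (the squaring step is my own illustration of dot positions per pixel area):

```python
# How many ink-dot positions fit within one image pixel's space.
image_dpi = 250        # pixel spacing on paper: each pixel covers 1/250 inch
printer_dpi = 1440     # inkjet ink-dot spacing (motor stepping interval)

dots_per_pixel_per_axis = printer_dpi / image_dpi
print(dots_per_pixel_per_axis)             # 5.76 dot positions across one pixel
print(round(dots_per_pixel_per_axis ** 2)) # ~33 dot positions per pixel area
```

Those few dozen dot positions, filled from only a handful of ink colors, are all the printer has to dither an approximation of one pixel's color.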
However, continuous tone printers (dye-subs, and Fuji Frontier types) don't print discrete dots of a few ink colors as inkjet printers must - instead they mix the color of each pixel directly, and they print pixels (called continuous tone). There are no dithered ink dots then, just pixels. But these printer ratings still refer to the spacing of those image pixels with the term dpi, simply because dpi has always been the name for "pixels per inch".
Scanner ratings also always say dpi, also referring to pixels of course - scanners use no ink drops at all, they create pixels. You have never heard of a 4800 ppi scanner.
The formal technical specifications at the very heart of our digital imaging definitions also use dpi. That is good enough for me. Those fundamental and elite specification documents do not use ppi one time - dpi has simply always been the name of it. I always say dpi too, for the same reason: it has always been the name for pixel resolution.
The term dpi does have multiple meanings, same as most English words do. This confuses some people who dream up their own imagined restrictions, demanding the term dpi ought to be reserved only for printer ink drops (simply their notion, about how they wish things were instead). They apparently don't know the real world term for image resolution has always been dpi. (Sorry, yes, I am going too far here, just for them)
Also, search Google for the phrases "72 dpi" and "72 ppi" (using the quotes to search each as a phrase). We surely all know "72 dpi" is about video, and has absolutely nothing to do with ink drops, but look at the count of the hits (in 2012, I see 267 million for "72 dpi", and only 2 million for "72 ppi"). That is over 100 to 1, two orders of magnitude, an overwhelming difference in popular usage. I am just teasing the troops, but they should "get over it". Dpi is the name of it, always has been, and we might as well get used to it.
Ppi is a relatively new term, we never saw it until recent years, but we are seeing ppi used some now, and it seems a perfectly fine name too, since dpi with respect to images does mean "pixels per inch". It might even have been a good plan, but it was not the plan. It may sound a bit silly to pronounce ppi, but recent photo editor software often does say ppi, while scanner software generally says dpi (but we see some exceptions to both). But usage makes either term correct now, even if the long established name for image resolution has always been dpi, for many years before inkjet printers could print photo images. Nothing makes ppi mandatory.
I am not arguing for the use of dpi. I don't care which you use; I understand it either way, and I am just saying you should too. My own preference has always been dpi (not that my choice is important), but mainly, my rant is simply explaining why you need to expect to see it used both ways. It's OK to use your own preference, since both terms mean pixels per inch. BUT - again - regardless of your preference, you definitely will often see both dpi and ppi used, so for your own benefit, you MUST understand what you read both ways. It may be a bit confusing at first, but that's simply how things are. Think of this as training to understand what you will see elsewhere.
I am not arguing which is better, or how things could have been, or ought to have been. It simply wasn't, and I am arguing what actually is. Facts are about what is. The important thing to me (and my big peeve) is that it certainly does the beginner no favor to be incorrectly told dpi is wrong, and that everything he reads everywhere is therefore wrong. That is confusion indeed, harmful, not helpful, since it does not promote understanding of the real world that actually is. That is my fight.
(to those so-called "advisors" - remember, when everyone except you is wrong, probably you don't understand the situation.)
The only proper instruction is that "both terms are used, interchangeably. Expect to see either, we must understand it either way."
There is really no problem understanding the two uses of the word dpi if you know the basics, and realize the context. It always means the ONLY thing it can possibly mean in context. This should be no big deal, the English language thrives on multiple context definitions. If it is about ink drops, it is about ink drops. If it is about pixels, it is about pixels. If it is about the images themselves, it is not about ink drops.
If the usage context pertains to images or printing pixels (and it almost always does), then dpi always means "pixels per inch". So does ppi, same thing exactly. It cannot mean anything else, printing is about spacing pixels on paper. The two terms are fully interchangeable, use either according to your whim (but you gotta understand it both ways). If we have a 300 dpi image, both terms mean it will print at 300 pixels per inch (pixel spacing on paper), so that 300 pixels will cover one inch.
If the usage context pertains to inkjet printer ink dot ratings, dpi means "ink drops per inch" (but since the ink drops are actually larger than their spacing, the rating is more specifically about carriage and paper motor stepping intervals). If the printer rating is 1440 dpi, it means its motors can space 1440 ink dots per inch while trying to simulate the color of the pixels in that 300 dpi image. The pixels still remain 300 dpi size (as best as they can be reproduced). Most printer drivers have renamed this now anyway, as Quality, offering Good, Better, Best Quality, or maybe Fast, Standard, High Quality. This ink drop spacing is a quality parameter, about reproducing those pixels to the degree possible - it is not an image resolution parameter.
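The two dpi contexts just described can be put side by side in one sketch. The 300 dpi image and 1440 dpi printer rating are the numbers from the text; the 1500 pixel width is a hypothetical example of mine:

```python
# Context 1 - image dpi: pixel spacing on paper (an image resolution parameter).
image_width_px = 1500          # hypothetical image width in pixels
image_dpi = 300                # 300 pixels cover each inch of paper
print(image_width_px / image_dpi)   # 5.0 inches of paper covered

# Context 2 - printer-rating dpi: ink-dot spacing used to simulate each pixel
# (a print quality parameter, not an image resolution parameter).
printer_dpi = 1440             # motor stepping interval for ink drops
print(printer_dpi / image_dpi)      # 4.8 ink-dot positions per pixel width
```

Same word, two unrelated numbers: the first sets how large the image prints, the second only affects how well each pixel's color is approximated.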