Digital images are new to some of us, and there are things we need to know about how they work. If you have a question, the hope is that it's answered here.
There are two very different ways to use images: printing on paper, or showing on video screens.
We scan for the capability of our output device. We choose the scan resolution based strictly on the needs of the output device that will process that image. At home, that output device is normally a printer or a video monitor.
Video monitors and printers work very differently from each other, and must be discussed one at a time. All of the rules are different for images intended for these two devices. The following material details the significance of these differences. (when I say "video", I mean the video monitor screen, not movies).
| Properties of Printed Images | Properties of Video Images |
|---|---|
| Image size is measured on paper in inches or centimeters (paper size is also measured in inches) | Image size is measured on the screen in pixels (screen size is also measured in pixels) |
| Image size seen does NOT vary with scanned resolution | Image size seen varies with scanned resolution |
| Image size is modified on paper by scaling | Image size is modified on screen by resampling |
| Image pixels are spaced on paper using the specified scaled resolution (dpi) | Image pixels are located at each screen pixel location, one for one |
| Several printer ink dots are dithered to represent the color of one image pixel | One screen pixel location contains one image pixel, which can be of any RGB value |
So because of these great and fundamental differences, when this text says "it's this way" or "it's that way", then notice that it also says "for printing" or "for video". Don't get them out of context, because the two modes are very different, with different properties and concerns.
There are generally two different goals for creating an image (in scanner or camera), either to show it on the video screen, or to print it on paper. These uses have different rules. These digital basics are summarized here. Some Printing Guidelines are here. The bottom line is that we do have to know about pixels, at least that they do exist. Our digital images are dimensioned in pixels. Pixels are ALL that there is in a digital image.
The following specifically speaks of scanning and printing to make a copy of an image or a document.
The purpose of high scan resolution is to create more pixels to provide for enlargement, so an important way to think of it is this way:
Paper is dimensioned in inches (or mm). Images are dimensioned in pixels. Paper has a fixed spatial dimension. Pixels do not; we can print an image at any size (by choosing dpi), or show it on any size of video screen.
We normally always want to print photos at about 300 dpi on paper. 300 dpi is simply the magic number representing the maximum detail that a good eye can normally see at close distance. Paper is dimensioned in inches, and images are dimensioned in pixels, so pixels per inch is a concern.
Honest, 300 dpi is an optimum and desirable goal for printing color or grayscale photos. This is an assumed given here, not for debate. Except your one-hour photo print shop likely uses 250 dpi, which is adequate too. Dpi is pixels per inch (jargon perhaps, but naysayers, see below).
Line art mode: Line art is black ink on white paper (no gray tones), for example line drawings or printed text. Otherwise, we are mostly speaking of color or grayscale photo images here. However an exception is that B&W documents in line art mode (without color or tonal data) can print a little better at 600 dpi, or 1200 dpi for best commercial printing, with smaller "jaggies". But 300 dpi line art is still fair (fax is 200 dpi line art). But for more complex tonal work (color or grayscale), our printer cannot print 600 dpi tonal detail. It is not designed to try, because our eye at normal viewing distances cannot resolve more than 300 dpi detail.
To help distinguish your scanner menus about this: Line art mode is one bit data, only two colors, black ink and white paper, ideal for inked line drawings or printed text documents. Only two colors allows higher resolution, and eliminates unwanted intermediate colors, like pink or blue color casts. Some scanners call it line art mode (and I do), but other scanners may call it B&W mode (meaning ink, NOT meaning photos).
| Scanners have three scan modes | and three modes about source |
|---|---|
| Color, or maybe called RGB mode | Reflective mode (prints, paper) |
| Grayscale, or B&W Photo mode | Film slide (positive) |
| Line art, or B&W mode | Film negative (inversion, and orange mask if color) |
What we really need to know first is how to use our images. How to scan what we need for our purpose, and how to print them right, how to properly make the best use of them. We have to learn that pixels exist, and that images are dimensioned in pixels. We have to think in terms of pixels. Then it all becomes easy.
Scan resolution of 300 dpi means that the scanner will create 300 pixels for every inch scanned. Scanning five inches creates 1500 pixels across it (to reproduce five inches).
Print resolution of 300 dpi means that the printer will space the available pixels at 300 pixels per inch of paper. Printing 1500 pixels at 300 dpi will cover five inches.
So, to create a copy at original size, simply scan photos at 300 dpi, and print them at 300 dpi. Then the same dpi number means the source will be copied at same original size, at optimum 300 pixels per inch.
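The two 300 dpi meanings above can be sketched in a few lines of Python (a hypothetical illustration of the arithmetic, not any scanner's actual software):

```python
# Scanning: the scanner creates (inches x dpi) pixels across the scanned area.
def scan_pixels(inches, scan_dpi):
    return round(inches * scan_dpi)

# Printing: the printer spaces the available pixels at print_dpi per inch.
def print_inches(pixels, print_dpi):
    return pixels / print_dpi

pixels = scan_pixels(5, 300)       # scanning 5 inches at 300 dpi -> 1500 pixels
size = print_inches(pixels, 300)   # printing 1500 pixels at 300 dpi -> 5 inches
print(pixels, size)                # same dpi in and out copies at original size
```

Same number in, same number out, so the copy reproduces the original size.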
We imagine high resolution is about showing greater detail, and of course it is; however, the only way we can see that detail is to enlarge the image so we can see it better. Enlargement is the important tool. We use high resolution for enlargement. We need more pixels to show the image larger, to see the detail. This is NOT speaking of scaling or resampling larger, which cannot increase image detail. This enlargement comes from greater scan resolution, sampling more pixels to reproduce finer detail from the original image.
To create sufficient pixels for print enlargement, the ratio of (scanning resolution / printing resolution) is the enlargement factor. For example,
Film is small, and so needs more enlargement, but still the same rule. One example is to scan 35 mm film at 2700 dpi, and print at 300 dpi, for 2700/300 = 9X size enlargement. 9X is about 8x12 inches (about A4 size) from full frame 35 mm (about 1.4 x 0.9 inches). The ratio of (scanning resolution / printing resolution) is the print enlargement factor.
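That film example can be checked with a quick calculation (the 1.4 x 0.9 inch frame size is the approximate figure given above):

```python
# Enlargement factor = scanning resolution / printing resolution
scan_dpi = 2700      # 35 mm film scan resolution
print_dpi = 300      # optimum photo printing resolution
enlargement = scan_dpi / print_dpi   # 9x

# Full frame 35 mm film is about 1.4 x 0.9 inches
film_w, film_h = 1.4, 0.9
print_w = film_w * enlargement       # about 12.6 inches
print_h = film_h * enlargement       # about 8.1 inches
print(f"{enlargement:.0f}x enlargement -> about {print_w:.1f} x {print_h:.1f} inches")
```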
Film is designed to be enlarged, with fine grain, so it enlarges well. Frankly, prints are only of adequate quality at original size, and they don't enlarge well (attempts will be somewhat degraded). 2x might still be acceptable for some uses, but it seems likely you won't like 3x (detail resolution will be low, spread out, not sharp). Prints are already enlarged, and are assumed to be the final purpose. The paper emulsion has more coarse grain, not designed to be enlarged. In contrast, the film is the original master source, with fine detail designed to be enlarged. Or digital cameras provide the original source file, even better (depending on adequate image size). Either way, it can be better quality to go back to the original source than to plan to enlarge a print copy. Copying prints at original size should work out well, though.
Note that to scan a full page size at high resolution like 2700 dpi is nonsense, surely a misunderstanding. You're not planning a 9x enlargement and will have no use for such a large image. The concept is about desired enlargement, a goal for a purpose. A full page size is already large. When you just want to print a copy of a photo print or a document page at original size, just automatically scan and print at 300 dpi. One exception, if and only if the scan is line art mode, then maybe scanning and printing line art at 600 dpi can be a little better (no tonal colors to dither, and 600/600 is still original size).
Again, if scanning something small, like a 35 mm film frame, and wanting to print it large, like full page size, then the idea is to scan it at maybe 2700 dpi at 100%, or 300 dpi at 900%, which is the same result, either of which will create enough pixels so that then printing it at 300 dpi will enlarge it to 2700/300 = 9x, like to A4 or 8x12 inch size.
But if wanting to scan something and print it as same original size, then scan at 300 dpi, and print at 300 dpi, and you will reproduce original size at optimum quality.
This is very easy, and very important to know, and this little bit to know may answer most questions.
For example, an 8x10 inch print at 300 dpi needs (8 inches x 300 dpi) x (10 inches x 300 dpi) = 2400x3000 pixels.
Fill in your own numbers, but for example, if our film source to be scanned is say 1 x 1.25 inches, and if we want 8x enlargement (8 inches / 1 inch = 8x enlargement), then scanning at 300 dpi print resolution x 8x enlargement = 2400 dpi scan resolution will do it.
(1 inch x 2400 dpi) x (1.25 inches x 2400 dpi) = 2400x3000 pixels.
which is calculated to be the exact same requirement for pixels. Simply scan at (300 x 8) = 2400 dpi to enlarge for printing 8x original size at 300 dpi.
Or if we want to print a copy at the same original size, then scan and print at 300 dpi (which is simply 1x enlargement).
Or if we want to print as half size of original, then scan at (300 dpi x 1/2) = 150 dpi (and print at 300 dpi).
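Those three cases all follow the same rule: scan resolution = print resolution x enlargement. A small sketch (the function name is mine, for illustration):

```python
def scan_resolution(print_dpi, enlargement):
    """Scan resolution needed to print at print_dpi with the given enlargement."""
    return print_dpi * enlargement

print(scan_resolution(300, 8))     # 2400 dpi: 8x enlargement from film
print(scan_resolution(300, 1))     # 300 dpi: copy at original size
print(scan_resolution(300, 0.5))   # 150 dpi: half-size copy
```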
That little formula is the basic concept of digital scanning and printing. How hard is that?
Again, scanning 6x4 inches of paper at 100 dpi will produce
(6 inches x 100 dpi) x (4 inches x 100 dpi) = 600x400 pixels of image size.
Plug in the appropriate numbers to get the size image you want (in pixels) from what you are scanning (inches). dpi = pixels per inch.
That dpi concept is true for printing goals too, meaning that if you plan to print 8x10 inches at 300 dpi, then in preparation, you need to create in the ballpark of
(8 inches x 300 dpi) x (10 inches x 300 dpi) = 2400x3000 pixels.
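That preparation step is just multiplication; a sketch of the pixel budget (illustrative names, assuming the 300 dpi goal discussed above):

```python
def pixels_needed(width_inches, height_inches, print_dpi=300):
    # Pixels needed to print width x height inches at print_dpi pixels per inch
    return round(width_inches * print_dpi), round(height_inches * print_dpi)

print(pixels_needed(8, 10))   # (2400, 3000) for an 8x10 inch print
print(pixels_needed(4, 6))    # (1200, 1800) for a 4x6 inch print
```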
The scanner software tool will compute this math for you, and will help with this, to make the work easy for you. The concept is INPUT from scanner, and OUTPUT to printer. The typical scanning menu will show in advance the size of the INPUT area to be scanned, the resolution and any enlargement factor, and the size of the OUTPUT image, in pixels and/or inches. Output for video screens is only concerned with output size in pixels; ignore output inches then. You can just make the numbers show your output goal.
The scanner tool converts an INPUT scan area to an OUTPUT print area, using dpi and enlargement, same as described here. Unless at 100% scale (1x enlargement), the dpi value entered is assumed to be the desired OUTPUT printing resolution, and the actual scanning resolution is computed according to enlargement, which not many scanners will show. The two dpi values are equal only at 100% scale, when the dpi seen is also the printing resolution goal (see this page). Otherwise, the scanning resolution will be Enlargement x Printing dpi, which you should know even if it is not shown. The scanner may also have a beginner's automated menu that hides much more of this detail.
Repeating concept, if you scan a slide at 300 dpi 9x enlargement, the output file will be marked with 300 dpi, to print 9x size, as is.
If you scan it at 2700 dpi 100%, you get exactly the same pixels, but the output file is marked 2700 dpi, to print original size (at excessive resolution specified). We scale this file to 300 dpi later, before we print it (which simply only changes the dpi number in the file, so then these are equivalent in that way).
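The point can be shown numerically: the pixels are identical either way, and only the lone dpi number stored in the file differs. Scaling just rewrites that number (a sketch of the arithmetic, not any file format's actual code):

```python
# A 1.4 inch wide film frame scanned at 2700 dpi gives 3780 pixels either way.
pixels_wide = 3780

for dpi_tag in (2700, 300):   # the lone dpi number stored in the image file
    inches = pixels_wide / dpi_tag
    print(f"marked {dpi_tag} dpi -> prints {inches:.1f} inches wide")
# Scaling the 2700 dpi file to 300 dpi changes only this number (not the
# pixels), converting an original-size print into the intended 9x enlargement.
```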
In practice, the computed scan resolution number might come out like maybe 1037 dpi. That will work, but good practice for critical work might increase it to the next even number the scanner menu offers; when 150, 300, 600, 1200, 2400, 4800 dpi is offered, choose the next larger one, like 1200 dpi in this case. Then the final scan can be cropped and resampled as necessary (see this page). Not much actual difference today, but the even menu value can make a more critically precise scan (the scanner carriage motor moves in certain steps, and the scan sensor also has certain intervals, which these defaults will match; then the photo editor can better resample the larger full image).
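Picking the next larger menu value can be sketched like this (the menu list is a typical example; your scanner's offered steps may differ):

```python
import bisect

MENU_DPI = [150, 300, 600, 1200, 2400, 4800]   # typical scanner menu steps

def next_menu_dpi(computed_dpi, menu=MENU_DPI):
    """Choose the first menu resolution at or above the computed value."""
    i = bisect.bisect_left(menu, computed_dpi)
    return menu[min(i, len(menu) - 1)]

print(next_menu_dpi(1037))   # 1200: scan a bit large, then crop and resample
```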
There is a qualification about printing. There are two methods that should be mentioned:
Any time when "fitting the paper", your job is to provide sufficient pixels to match your desired actual print size. Optimum is about 300 pixels per inch for best quality photos (or 250 dpi works well too). If you want 4x6 inches, that's 1200x1800 pixels. If you want 8x10 inches, that's 2400x3000 pixels. (Large wall posters probably have too few pixels for their size, and must be less dpi, but they are viewed at greater distances.)
There is another procedure often necessary, because camera images and print paper are often Not the same SHAPE, and so cannot be fitted to the paper as is. For example, a 4x6 image will not fit 4x5 paper. The print is not going to work out well unless they are the same shape (see this about cropping to fit the paper). But the most essential basic fact is that images are dimensioned in pixels, and that is how we must think of it. That makes it easier, not harder. And it is necessary, and it's very important to plan your pixels.
Sufficient pixels to print at 240 to 300 dpi is an optimum goal to print photo images. This is true for images sent to online printing services, or true of your own inkjet printer too. More pixels really cannot help the printer (in color mode), but very much less is detrimental to quality. This is very simple, but it is essential to know and keep track of. This simple little calculation will show the image size needed for optimum photo printing.
3000 pixels spaced at 300 pixels per inch (dpi) is 10 inches. This easy fact is about the least that we all should know about using our images. There is a larger dpi calculator that knows more details about scanning, printing, and enlargement.
So repeating to make a special point. We know that to scan a 35 mm film slide to print at A4 or 8x12 inches, we need to scan at about 2700 dpi, to achieve about 9x enlargement. However, we do not specify BOTH 2700 dpi and 9x numbers at once. One of them at a time is plenty.
When scanning paper prints, it's important to realize that an area scanned at 300 dpi will create the pixels necessary to also print the same size at 300 dpi. The concept either way is pixels per inch. 300 dpi is likely what you want for a photo copy job (but a line art mode scan of black text or line drawings can use 600 dpi well).
But film has to be enlarged, so it has to be scanned at higher resolution to create sufficient pixels for enlargement.
This printing dpi number does NOT need to be exact at all, ± 10% of 300 dpi won't matter much, but it should be ballpark. But planning size to have sufficient pixels to be somewhere near this size ballpark (of 250 to 300 pixels per inch) is a very good thing for printing.
The role of scan resolution for the video screen is to create the appropriate image size, the dimensions of image width x height in pixels. Scanning dpi and enlargement do affect the final video image size (pixels), but on the screen, only the output image size in pixels has any importance (not inches). Video screens and printing on paper are just simply very different concepts. Video is easier actually, but we have to understand the difference. It should all be obvious, and second nature, once we know.
(6 inches x 100 dpi) x (4 inches x 100 dpi) = 600x400 pixels
It is a fundamental basic. Printing spaces pixels the same way (pixels per inch).
Inches (or mm) do exist on paper or film. Paper is dimensioned in inches (or mm).
Scanners and printers use paper. Video systems do not.
So the image pixels are pretty much directly displayed one for one on the screen, one image pixel on one screen pixel, so your target goal is the size of image that you want to see. Inches are a factor on paper, but inches are not used on the screen. The physical size you see will depend on how large a screen you bought, so size will vary on different screens. The screen is dimensioned in pixels. Images are dimensioned in pixels. A 600x400 pixel image will fill 600x400 pixels of the larger video screen. It simply does not matter to the video system if that image file is claiming 72 dpi or 300 dpi or any other value, video simply doesn't look at dpi. The 600x400 pixel image will be shown in 600x400 pixels on the video screen. This is the easiest way it could be.
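A small sketch of that one-for-one display idea (hypothetical numbers; the point is that video simply ignores any dpi value in the file):

```python
screen_w, screen_h = 1920, 1080   # the screen is dimensioned in pixels
image_w, image_h = 600, 400       # the image is dimensioned in pixels
file_dpi_tag = 72                 # present in the file, but ignored on screen

# One image pixel shows on one screen pixel, regardless of the dpi tag.
shown_w, shown_h = image_w, image_h
fraction = (shown_w * shown_h) / (screen_w * screen_h)
print(f"shown at {shown_w}x{shown_h} pixels, about {fraction:.0%} of this screen")
```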
Your digital camera does not use paper either. Film was dimensioned in mm like paper, but a digital sensor directly creates an image dimensioned in pixels. Yes, the sensor is dimensioned in mm (like film), but the "scanning" job is complete now, and what we get out of the camera is pixels. However, your camera will also stick in some arbitrary dpi number, maybe 180 or 240 or 300 dpi. At best it is just a bad guess, because the camera has no clue what size you might print it, if at all. This might indicate a huge print size of a couple of feet? But that has no meaning there, don't worry about it yet. You will determine a useful print size and dpi number when you are ready to print. This dpi number does not affect the pixels that the camera creates. Dpi is just a separate lone number stored someplace in the image file to later tell the printer how to space the pixels on paper, and this number can simply be changed at will (called scaling, to match the paper size).
Digital home movies were 640x480 pixels in the past, and today are 1920x1080 or 1280x720 pixels, to match HDTV screens which are also one of the same two sizes. Digital is about pixels.
Ifs and buts ... (confusion factors)
But otherwise, video screens normally show pixels as pixels, one image pixel on one screen pixel. That does make converting paper documents dimensioned in inches be a problem on the screen, which has no concept of inches or dpi. See method about video "logical inches" (on its 2nd page).
The digital camera can work very well to copy photos, but (like any other photo), copy work depends on your procedures.
The scanner is carefully designed to control the lighting, the position of the light, and the color and white balance of the light. It holds the media square and flat. These are big jobs. The scanner is designed to do the copy job well, and automatically.
However, all of this is left up to the camera operator, who needs to have some understanding of how to do that part well too. This camera copy setup is Not automatic, and the details of what you do are important. To use a camera for scanning documents, see Google for the standard copy procedures. It will be a closeup, so you surely will need to use a macro lens.
Cameras don't use dpi; their sensor creates a fixed image size in pixels. However, the sensor is composed of pixels (pixel size dimensioned in micrometers), and the pixel spacing does determine a dpi resolution, which is typically a very large number, because even a full frame sensor must be enlarged about 8x to print an 8x10 inch print. A cell phone or compact camera image is enlarged 30x or 40x to print the same 8x10. The computed scan dpi is pixels shown / inches shown. So photographing a 6x4 inch printed photo with a 24 megapixel camera gives 6000x4000 pixels, which is 6000 pixels / 6 inches = 1000 pixels per inch, which is the scan resolution. This 6000x4000 pixel image printed at 300 dpi is 6000 pixels / 300 dpi = 20 inches wide on paper. You'll likely want to resample it smaller to 1800x1200 pixels to print a copy at original 6x4 inch size at 300 dpi.
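The camera-copy numbers work out like this (a sketch of the arithmetic; the function name is mine):

```python
def effective_scan_dpi(pixels_across, inches_across):
    # A camera's "scan resolution" is pixels across the subject / subject inches
    return pixels_across / inches_across

dpi = effective_scan_dpi(6000, 6)   # 24 MP camera filling a 6 inch wide print
print(dpi)                          # 1000.0 dpi effective scan resolution

print(6000 / 300)                   # printed as-is at 300 dpi: 20.0 inches wide

# Resample smaller to reprint a copy at the original 6x4 inches at 300 dpi:
target = (round(6 * 300), round(4 * 300))
print(target)                       # (1800, 1200) pixels
```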
See copying slide film with the digital camera.
The next page will start with "What is a pixel?", but before we get started, a note relating to context usage of "dpi":
The controversy is that inkjet printer ink drops and image pixels are very different concepts, but both use the term dpi (dots per inch) in their own way.
Inkjet printer dpi ratings can refer to printer ink drops per inch (the four colors of CMYK ink), which is NOT AT ALL the same thing as image pixels. Inkjet "ink drops per inch" will be a number several times larger than pixels per inch, and is just their way of mixing the primary color inks to be one of the millions of image colors that one pixel represents (an image pixel is only some ONE of millions of possible colors). The inkjet "ink drops per inch" is a color quality specification, NOT an image resolution detail. These are such different concepts that some people imagine we ought to reserve the term dpi for those inkjet ink dots, and reserve ppi only for image pixels. Not really a bad plan, except that this view fails to recognize existing real world usage, and that the dpi number was already in use as image resolution. And FWIW, today's inkjet printers are minimizing their usage of the term "4800 ink drops per inch" (which was not understandable anyway), and are instead adopting a "Good, Better, Best" system of specifying color quality settings.
Image files simply have NO capability to specify printer ink drops per inch. Only the printer driver does that (Print menu). Image files can only specify the pixel's color, and the pixel spacing when printed. Image parameters are only about pixels, with no concept of ink drops. If the image file specifies dpi or ppi (interchangeable), it is the image resolution for printing, but it can only be about pixels. Ink drops is a different issue, about how the specific printer tries to reproduce the ink color of those pixels.
We may hear scanning resolution called spi (Samples Per Inch), and that is indeed what it is. We often hear image resolution called ppi (Pixels Per Inch), and that is indeed what it is. The spi and ppi terms are correct. However historical and common usage has always said dpi for image resolution, meaning pixels per inch, and that is indeed what it is. Dpi is fully interchangeable with ppi. Pixels are conceptually a kind of colored dot too, and resolution has always been called dpi, for years before we had inkjet printers. Dpi is just jargon perhaps, but it is a fact of life, and make no mistake, it always has been. Scanners and scanner ratings say dpi too, meaning pixels per inch (see dialog pictures here, here, here, and here). I habitually always say dpi myself, an old habit. Image resolution has always been called dpi (referring to scanning or the printed image, pixels per inch of printed paper).
We may use either term of our own preference, but we certainly need to understand it both ways. Some scanners and photo editor programs have even switched to saying ppi now, and there is nothing wrong with that. But others have not switched, so insisting on conformity, requiring all others to only say ppi, will necessarily encounter much opposition, because the real world simply isn't that way, and obviously not all are willing to switch yet. Their view is that dpi has always been the term for image resolution, regardless of any whippersnapper's opinion. 😊
I'm OK with either term you choose to use, I understand it either way. My point here is that we must understand it both ways, because we will see it both ways, often, in the real world.
It's easy, not a problem — the idea of printing digital images is always about pixels per inch, so when the conversation context pertains to images instead of printers, all of these terms, spi, ppi, and dpi, are exactly the same equivalent concept — they all mean pixels per inch.
There is no problem understanding any use of dpi if you know the context. Context always means the only thing it can possibly mean. If the context pertains to images or resolution or printing pixels, dpi means "pixels per inch". If the context pertains to inkjet printer color quality ratings, dpi means "ink drops per inch". There is no other meaning possible. This should be clear and no big deal — the English language is full of multiple context definitions.
So yes, inkjet print quality rating dpi is something entirely different than printing resolution in dpi, referring to inkjet printer ink drops per inch instead of image pixels. Ink drops per inch is only about printing quality (dithering quality), and is NOT about printing resolution. Ink drops per inch is NOT data stored in the image file. It is NOT asked in any photo editor EXCEPT only as a question that the print driver asks as you go to print it (important to inkjet dithering, but actual print shops won't ever ask, they know what to do).

The inkjet printer only has four colors of ink — or maybe a few more in some cases — but nowhere near the 16.7 million color possibilities of 24 bit color, one of which the pixel might be. The concept of a pixel is in fact a colored dot, and the printer's goal is to reproduce the color of that dot. But the limited inkjet printer cannot directly reproduce the color of a pixel. The inkjet printer just tries to approximate the color of a pixel, with multiple ink drops chosen from 4 (to maybe 8) colors of ink. This method of simulating colors by combining multiple individual ink drops is called Dithering. If it were, say, a 250 dpi pixel (size, 1/250 inch), it must make several ink dots of its few CMYK ink colors, which are located on the perhaps 1200 or 1440 dpi spacing within that pixel's 1/250 inch space (not very exactly, see the Printer Basics section). Any color error is carried over to adjacent neighbor pixels, which are then intentionally tinted the opposite way to compensate. The printer is trying the best it possibly can to reproduce the pixel's color, but inkjets cannot reproduce colored pixels directly.

The point is, image pixels and inkjet printer ink drops are NOT the same thing at all. Yet the spacing of both is called dpi, with very different meanings, understood in context of use. If it is about ink drops, it is about ink drops. If it is about pixels, it is about pixels.
Image files do not contain any ink drops, so any discussion about images is about pixels. Seems simple.
However, continuous tone printers (dye-subs, and the chemical photo paper printers) don't print discrete dithered ink dots like inkjet printers must. Instead they mix the color of the pixel directly, and they print pixels (called continuous tone). There are no dithered ink dots then, just pixels. But these printer ratings still refer to the spacing of those image pixels with the term dpi, simply because dpi has always been the name for "pixels per inch".
Scanner ratings also always call it dpi, also referring to pixels of course (scanners don't use ink dots). You have never heard of a 4800 ppi scanner (and there are no ink drops used in scanners — scanners create pixels).
The formal technical specifications at the very heart of our digital imaging definitions say dpi.
Dwelling on the obvious, but the reason for all of the above is because "dpi" has simply always been the name of printed image resolution. That's plenty good enough for me. There are no ink drops in image files. No ink drops in scanners or cameras either. Those fundamental and elite specification documents written by the experts do not use ppi one time — dpi has simply always been the name of it. I always say dpi too, for same reason, simply because that's always been the name for pixel resolution. Some users today think that sounds wrong, and their notion is to call it ppi. But dpi has always been the real world name of it, however what's important to you is that you do need to understand it either way. Basically, if usage context refers to images, dpi means pixels. If usage context refers to printer printheads, dpi means ink drops.
Yes, sure, the term dpi does have multiple meanings, same as most English words do. Now we have many newbies getting into digital, and this term confuses some of them who dream up their own imagined restrictions, demanding the term dpi ought to be reserved only for printer ink drops (simply their notion, about how they wish things were instead). They apparently don't know the real world term for image resolution has always been dpi. (Sorry, yes, I am going too far here, just for them)
Also, search Google for the phrases "72 dpi" and "72 ppi" (using the quotes to make each a phrase). We all know "72 dpi" is about video pixels, and has absolutely nothing to do with ink drops, but look at the count of the hits... in 2017, I see 137 million for "72 dpi", and only 163K for "72 ppi". That's 840 to 1, nearly three orders of magnitude, an overwhelming difference in popular usage. I am just teasing the troops, but some of them should "get over it". Dpi is the name of it, always has been, and we might as well get used to it.
Fortunately, inkjet printers are phasing out their use of the pirated term dpi, and now the printer's quality selection menu choice is usually stated as Good, Better, Best (HP), or Fast, Standard, High (Canon), or maybe Draft, Standard, High (Epson), regarding printing quality. Not confusing us with X dpi ink drop density seems good to me, and print quality is what it is. Many beginners have never even seen the idea of ink drops per inch today.
Ppi is a relatively new term, we never saw it until recent years, but we are seeing ppi used some now, and it seems a perfectly fine name too, since dpi with respect to images does mean "pixels per inch". It might even have been a good plan, but it was not the plan. It may sound a bit silly to pronounce ppi, but recent photo editor software often does say ppi, while scanner software generally says dpi (but we see some exceptions to both). But usage makes either term correct now, even if the long established name for image resolution has always been dpi, for many years. Nothing makes ppi mandatory.
I am NOT arguing we must use the term dpi. I don't care which you use, I understand it either way. I am just saying you absolutely should understand both too, because both are what you will see today; the terms are used interchangeably. I learned it as dpi (not that my choice is important, but dpi is what it's always been called). Mainly, my rant is simply explaining why you need to expect to see it used both ways. It's OK with me if you want to use your own preference, since both terms mean pixels per inch. We all understand it either way, and you should too. BUT again, regardless of your own preference, you definitely will often see both dpi and ppi used, so for your benefit, in the real world, you MUST understand what you read both ways. It should be second nature for you, not confusing. If about images, it can only be about pixels. If about printers, it may be about ink drops in the print menu, but that won't be in any Edit menu. Both can be dots. That's simply how things are (and you already know that). Think of this as training to understand what you will see elsewhere.
I am not arguing which is better, or how things could have been, or ought to have been. It simply wasn't, and I am arguing what actually is. Facts are about what is, which is both ways today. The important thing to me (and my big peeve) is that it certainly does the beginner no favor for the "holier than thou" to stand up and incorrectly shout that dpi is wrong, and that everything we read everywhere is therefore wrong. That is confusion indeed, harmful, not helpful, since it does not promote understanding of the real world that actually is. That's my fight. The only reasonable action is to instead simply state that both terms are used, they mean the same thing if about image resolution. (To those so-called "advisers" — remember, when everyone except you is wrong, you probably don't understand the situation.)
The only proper instruction is that both terms are used, interchangeably. Expect to see either. To understand what we read, we must understand it either way.
There is really no problem understanding the two uses of the word dpi if you know the basics, and realize the context. It always means the ONLY thing it can possibly mean in context. This should be no big deal, the English language thrives on multiple context definitions. If it is about ink drops, it is about ink drops. If it is about pixels, it is about pixels. If it is about the images themselves, it is Not about ink drops. If it is about the printer, it might be about spacing the image pixels, or it could be about the spacing of the ink drops to color each pixel (but inkjet printers today are instead adopting a "Good, Better, Best" notation about specifying ink drop printing quality, instead of saying dpi).
If the usage context pertains to images or printing pixels (and it almost always does), then dpi always means "pixels per inch". So does ppi, same thing exactly. It cannot mean anything else, printing is about spacing pixels on paper. The two terms are fully interchangeable, use either according to your whim (but we gotta understand it both ways). If we have a 300 dpi image, both terms mean it will print at 300 pixels per inch (pixel spacing on paper), so that 300 pixels will cover one inch.
If the usage context pertains to inkjet printer ink dot ratings, dpi means "ink drops per inch" (but since the ink drops are actually larger than their spacing, the rating is more specifically about carriage and paper motor stepping intervals). If the printer rating is 1440 dpi, it means its motors can space 1440 ink dots per inch while trying to simulate the color of the pixels in that 300 dpi image. The pixels still remain 300 dpi size (as best as they can be reproduced). Most printer drivers have renamed this now anyway, as Quality, offering Good, Better, Best Quality, or maybe Fast, Standard, High Quality. This ink drop spacing is a quality parameter, about reproducing those pixels to the degree possible — it is not an image resolution parameter.