A few scanning tips

www.scantips.com

Video Boards and Modes

The scanner creates 24 bit RGB images, and our video system should match it to view those images.

Thirty years ago (Windows 3.1 and early 486 vintage computers), video boards were 8 bit boards showing only 256 colors (indexed color, page 131). That is inferior today, but there were only a few scanners then. CompuServe developed the GIF file format in that era, designed for those 8 bit images. However, Windows can still select 256 color mode today, even on the best video cards. Games often required it back then, and the Windows 9x installation default was only 256 colors until we changed it; even Windows 7 still offers 256 color mode.

Our 24 bit scans will look pretty bad if the video board is set to 256 color mode (the same limit applies to images saved in a GIF file, which can hold only 256 colors). If your image results are poor (grainy, speckled, mottled, splotchy), the first thing to check is your video mode (Windows Control Panel - Display - Settings). It used to be a common problem that people didn't realize they were stuck in 256 color mode, which made any scanned image look poor.
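To preview roughly what 256 color mode does to a photo, you can reduce a copy of the image to 256 indexed colors in any image program. Here is a small Python sketch of the same idea using the Pillow library (the file names are only placeholders):

    # Reduce a 24 bit scan to a 256-color adaptive palette, then save an
    # RGB copy so any viewer shows the degraded version for comparison.
    from PIL import Image

    scan = Image.open("scan.jpg")    # a 24 bit RGB scan (placeholder name)
    indexed = scan.convert("P", palette=Image.ADAPTIVE, colors=256)
    indexed.convert("RGB").save("scan_256color_preview.jpg")

Dithering hides some of the damage, but smooth skin tones and skies usually give it away.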

While you are at the video settings, be sure your screen is not stuck in 640x480 pixel video size (Windows XP offers 640x480 only in Compatibility mode now). 640x480 pixels is surely rather limiting for scanning or even web browsing. 15 inch monitor screens are usually set to 800x600 pixels, and 17 inch screens are usually set to 1024x768 pixel size.

In the mid 90's, we had 16 bit video boards (16 bit mode is called High Color in Windows), and the modern era began; at least we started seeing more photos on the screen then. The 16 bit High Color mode is almost "good enough" quality to show photo images, at least for most purposes. 16 bit color packs 5 bits of each of Red, Green and Blue into one 16 bit word (2 bytes per pixel). 5 bits can show 32 shades of each primary RGB channel, and 32x32x32 is 32K colors. Green gets the one extra bit (6 bits, or 64 shades) for 64K colors overall, but the extra codes only distinguish finer shades of green. The human eye is most sensitive to green-yellow, so more shades are a bigger advantage there. Green has twice the luminance of Red, and about six times more than Blue, so this is very reasonable. Video boards do vary, but 24 bits is normally not visibly much better, except in wide smooth gradients.
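For readers who like to see the bits, here is a small Python sketch of that 5-6-5 packing (my own illustration, not any driver's actual code):

    # Pack 8-bit R,G,B into one 16 bit High Color word:
    # 5 bits red, 6 bits green, 5 bits blue.
    def pack_565(r, g, b):
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

    # Expand a 5-6-5 word back to approximate 8-bit values (replicating
    # the high bits is the usual way to fill in the missing low bits).
    def unpack_565(word):
        r = (word >> 11) & 0x1F
        g = (word >> 5) & 0x3F
        b = word & 0x1F
        return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

    print(pack_565(90, 200, 90))               # one 2-byte pixel value
    print(unpack_565(pack_565(90, 200, 90)))   # (90, 203, 90): nearby 8-bit values share one 16 bit code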

Video boards for the last few years are 24 bit color (called True Color). Note that there is no 32 bit color. The confusion is that 24 bit color normally runs in 32 bit video mode today, a name that refers to the 32 bit word size of the efficient accelerator chips. The 24 bit color mode and the so-called 32 bit video mode show the same 24 bit colors, the same 3 bytes of RGB per pixel. 32 bit mode simply leaves one of the four bytes unused (wasting 25% of video memory), because handling 3 bytes per pixel severely limits the video acceleration functions.

Again the difference is this:
24 bit color in 24 bit video mode stores the three 8-bit bytes of RGB color in three bytes per pixel.
24 bit color in 32 bit video mode stores the same three 8-bit bytes of RGB color in one 32-bit memory word per pixel (one byte unused).

Processor and accelerator chips move data fastest in their native word sizes (8, 16, 32, or 64 bits). A 24 bit copy done with a hardware video accelerator would require three 8-bit transfers per pixel instead of one 32-bit transfer. 32 bit video mode is for speed, and it shows 24 bit color. There is no 32 bit "color".
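Here is a minimal Python sketch of that storage difference (just an illustration of the layout, using placeholder values):

    # The same 24 bit RGB color stored two ways: 3 packed bytes per pixel
    # (24 bit mode) versus one 4-byte word with a padding byte (32 bit mode).
    import struct

    r, g, b = 90, 200, 90
    pixel_24 = struct.pack("BBB", r, g, b)        # 3 bytes per pixel
    pixel_32 = struct.pack("BBBB", r, g, b, 0)    # 4 bytes: the last byte is unused

    print(len(pixel_24), len(pixel_32))           # 3 4
    # A 1024-pixel screen line costs 3072 bytes in 24 bit mode, but 4096 bytes
    # (a power of two, convenient for the hardware to copy) in 32 bit mode.

The color information is identical either way; only the memory layout changes.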

24 bit color is 8 bits each of RGB, allowing 256 shades of each primary color, and 256x256x256 = 16.7 million color combinations. Studies show that the human eye can detect about 100 intensity steps (at any one current brightness adaptation of the iris), so 256 tones of each primary is more than enough. We won't see any difference between RGB (90,200,90) and (90,201,90) but we can detect 1% steps (90,202,90) (on a CRT tube, but 6-bit LCD panels show 1.5% steps). So our video systems and printers simply don't need more than 24 bits.
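The arithmetic is easy to check (plain Python, just the numbers from the paragraph above):

    print(256 ** 3)                   # 16777216 combinations ("16.7 million colors")
    print((202 - 200) / 200 * 100)    # 1.0  : a step from 200 to 202 is a 1% change
    print(100 / 64)                   # 1.56 : one step of a 6-bit LCD panel, in percent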

If our screen is set to show say 1024x768 pixels, the video board has memory chips to hold that current 1024x768 pixel screen image. This memory image is the entire basis of our video system; it is what we see. The video board requires 3 bytes of video memory per pixel for 24 bit color, or 4 bytes for 32 bit video. So a 1024x768 pixel screen size (about 2.3 MB) requires a 4 MB video board. A 1280x1024 pixel screen size requires 4 MB in 24 bit mode, or 8 MB in 32 bit mode. Additional video board memory does not affect 2D video speed, it only allows larger combinations of screen size vs. color depth.
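A quick Python sketch of that frame buffer arithmetic (board sizes come in powers of two, so the requirement rounds up to 1, 2, 4, 8 ... MB):

    # Frame buffer size in MB for one screen at the given color depth.
    def video_memory_mb(width, height, bytes_per_pixel):
        return width * height * bytes_per_pixel / (1024 * 1024)

    print(video_memory_mb(1024, 768, 3))    # 2.25 MB -> a 4 MB board
    print(video_memory_mb(1280, 1024, 3))   # 3.75 MB -> 4 MB in 24 bit mode
    print(video_memory_mb(1280, 1024, 4))   # 5.0 MB  -> 8 MB in 32 bit mode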

Memory is cheap today, and current video boards have 8 or 16 MB of video memory (some have 128 MB for 3D games). We all probably have sufficient video memory for most 2D screen sizes. But this is recent; it certainly wasn't always that way, and older computers (when memory was expensive) had real shortcomings in that department not so long ago.

Some seeking faster video continue to use 16 bit color mode. They should try 32 bit mode now. Even if 16 bit mode measures a bit faster, today's computers are so fast that it is imperceptible to humans. There's no reason to limit what we see now.

16 bit color allows far fewer tones than 24 bits. However, the actual visual difference is usually not dramatic; 16 bits is often about "good enough". Smooth gradients are most affected by 16 bit color mode. 24 bit color has enough tones to show a smooth gradient like the top image below. 16 bit color mode may show banding or streaks in continuous colors, because each color is limited to 32 shades instead of 256 shades. But 32 shades are a lot, and you won't see many images like this anyway... perhaps in the skies of landscapes.

To try to show that effect at worst case, here is a large gradient fill made with the PhotoImpact Paint tool. 24 bit video boards should show a smooth gradient. 16 bit mode will show numerous vertical streaks where the display limits the colors to 32 shades instead of 256 shades. Since a great width was necessary to show it, the fill was made 768 pixels wide. The 32 shades in 16 bit mode should be about 768/32 = 24 pixels wide each. On a 24 bit board, the 256 shades are about 768/256 = 3 pixels wide. Still, 32 shades are quite a few, and frankly, it looks a tad better in an image program than in a web browser. JPG compression didn't seem to hurt it (this one is at 90% Quality), but here is a TIF file (138K) of the same gradient image.
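If you want to experiment with the same effect yourself, here is a short Python sketch using the Pillow library that builds a 768 pixel gradient and then throws away all but 32 shades, roughly simulating the 16 bit banding (file names are placeholders):

    from PIL import Image

    # Build a smooth left-to-right grayscale ramp, 768 x 100 pixels.
    width, height = 768, 100
    smooth = Image.new("L", (width, height))
    smooth.putdata([int(x * 255 / (width - 1)) for x in range(width)] * height)

    # Keep only 32 levels: each band comes out about 768/32 = 24 pixels wide.
    banded = smooth.point(lambda v: (v >> 3) << 3)

    smooth.save("gradient_smooth.png")
    banded.save("gradient_32shades.png")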

But if the above image looks all speckled and dotted like this partial image below, then your video board is set to display only 256 colors, and MUST BE CORRECTED if you have hopes of ever seeing images look their best. This image below is the above file captured from a 256 color 640x480 video display. The speckled dots are smaller and harder to see at higher screen resolutions, but I think you'll get the idea.

It is a bit naive to implicitly believe our own monitor is perfect without testing it. Adjusting the monitor correctly is extremely important to see images well. Many users cannot see the dark tones if the monitor is not bright enough, and everything dark appears full black. 24 bit video boards should just be able to detect the end tones in the chart below. 16 bit video mode will only see every other tone as unique, but you're doing OK if you see them. Browsers do not have the best image quality reputations (I lose the first black step in Netscape), so this test is better done in your favorite image program.

Correct monitor adjustment is very important to how we see our images. Many monitors (especially old ones) are not bright enough to distinguish dark tones, and everything dark appears full black. Assume the Brightness and Contrast controls are reversed (they really are). Show a large black window (like a DOS or Cmd prompt window), and adjust the Brightness control up just enough until the background Black Level blooms to a detectable faint gray, and then back down just enough to be black again. Too low hides dark detail, too high reduces contrast, so this Black Level is important. Then on regular brighter screen content, set the Contrast control for any pleasant viewing brightness, and for good whites. It is probably most of the way up.

If the monitor setting is available, set the monitor's color temperature to 6500K, which is the standard for video, and you will quickly get used to it. Then, some image editor programs offer a monitor gamma adjustment in their Preferences menu (unfortunately, this is sometimes best left disabled, for example in Paint Shop Pro and Picture Publisher). This monitor adjustment is not to be confused with image Gamma, which is about the brightness of the image data at the midpoint, see page 168.

The purpose of a monitor gamma adjustment is to give a correct view of images. The idea is to see the image "right" so we can adjust it "right", so that our image is seen the same way on other monitors. It calibrates video brightness at the midpoint. This does not affect the image data at all; it only affects how the image is displayed in that one program, which may affect how you adjust the image.

The monitor gamma tool shows two areas of middle tone color, sometimes gray, but often three RGB patches. One area is made of equal numbers of pixels of 0 and 255 values, mixed 50/50 so that the averaged result is, by definition, assumed to be the correct middle tone. The Brightness and Contrast settings determine those two end points. The other area is RGB pixels of actual gray at mid-range value 128. This method compares three points on the monitor's response curve: 0, 128, and 255. The midpoint gain adjustment makes the two areas look alike in tone, bringing midpoint 128 to the actual visual middle. When the solid gray appears the same brightness as the mixed endpoint pixels, then the midpoint brightness is considered correctly placed at 50%. Again, this only affects images viewed in that one program.
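As a rough illustration of the arithmetic involved (my own sketch, not the code of any particular program): a 50/50 dither of 0 and 255 averages to 50% luminance, but a solid patch only matches it when the monitor's gamma curve puts the chosen gray value at 50% as well.

    # 8-bit gray value whose displayed luminance is 50% at a given monitor gamma.
    def matching_gray(gamma):
        return round(255 * 0.5 ** (1 / gamma))

    print(matching_gray(1.0))   # 128 on a perfectly linear display
    print(matching_gray(2.2))   # 186 on a typical uncorrected CRT-like display

Without any correction, the solid 128 patch looks too dark next to the dither; the adjustment raises the midpoint until the two match.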


Copyright © 1997-2010 by Wayne Fulton - All rights are reserved.
