What we will see is about to change.
I am posting this piece because our world of viewing is about to change dramatically, with 1024 x 768 becoming a problem for many, and I have recently auditioned monitors that have moved beyond my 1920 x 1200 pixels to 2048 x 1536 and 3800 x 2400. I had enough trouble working on the new template, but can you imagine what things will be like when these monitors hit in 2009 and 2010?
I suggest that everyone check their monitor output and see just how far you can go before the transition and obsolescence of an entire generation of monitors and video cards is upon us. There will be some firmware/software upgrades that will work for some, but for some really old machines, the clock is ticking.
Image Resolution for the Video Monitor Screen
For images viewed on computer screens, scan resolution merely determines image size. The bottom line is that dpi or ppi means pixels per inch, which means that if you scan 6 inches at 100 dpi (or 1 inch at 600 dpi), you will create 600 pixels, which will display on any screen as 600 pixels in size.
We think of greater resolution as showing more detail, and while that's generally true (within reasonable limits), it's because it makes the image larger. But we are always greatly limited by our output device, and often cannot take advantage of maximum resolution. The images are huge, and our screens are simply not large enough.
If you don't know your screen size, then the Windows Start - Settings - Control Panel - Display icon - Settings tab will show or change it. Or on the Macintosh, in the Apple Monitor control panel.
Popular video screen size settings are:
640x480 or 800x600 pixels for 14 inch monitors
800x600 or 1024x768 pixels for 15 inch monitors
1024x768 or 1152x864 pixels for 17 inch monitors
1152x864 or 1280x1024 pixels for 19 inch monitors
Which screen resolutions do most people use?
Your web browser reports that your screen size is currently set to 1920x1200 pixels (the overall full area of your screen, not just the browser window). No matter what you look at, your screen always shows 1920x1200 pixels (at the current setting). For a size comparison, the red image below is exactly 500 pixels wide, meaning that it will always fill 500 pixels of your 1920 pixel screen width (or 500 pixels on any other size screen too). It will appear at 500/1920 = 26% of your full screen width and 200/1200 = 17% of its height. If your screen were instead set to, say, 1024x768 pixels, the same 500 pixel image would still fill exactly 500 pixels of that screen width, just a larger fraction of it. (1)
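As a quick check of that arithmetic, here is a minimal sketch (Python, purely illustrative; the image and screen dimensions are just the numbers from the text) that computes what fraction of a given screen a fixed-pixel image occupies:

```python
# Fraction of the screen a fixed-size image fills (illustrative numbers from the text)
def fill_fraction(image_px, screen_px):
    return image_px / screen_px

image_w, image_h = 500, 200      # the red example image, in pixels
screen_w, screen_h = 1920, 1200  # one particular screen setting

print(f"width:  {fill_fraction(image_w, screen_w):.0%}")   # about 26%
print(f"height: {fill_fraction(image_h, screen_h):.0%}")   # about 17%
```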
Our monitor screens show pixels directly. Images are dimensioned in pixels, and screens are dimensioned in pixels, and video systems show pixels directly. This may seem pretty obvious, but it is very important: that is simply how video works, and it really is all there is to it.
You do need to realize that each of us will see this 500 pixel image width at a different size on our different monitor screens. We don't necessarily see the same thing. On the video screen, we may see the 500 pixel image anywhere between 4 and 8 inches wide, depending on our screen size and settings. My 19 inch 1280x960 pixel CRT screen shows this image as 5.6 inches wide (my 19 inch 1280x1024 pixel LCD screen shows it as 5.8 inches wide). My 17 inch 1024x768 pixel CRT monitor shows it 6.0 inches wide. Most of us will be in that general ballpark, but a 15 inch monitor set to 640x480 pixels may see it at 8 inches wide, or more. A laptop with 1600x1200 pixel display may see it at 4 inches wide, or less. That's a very wide size range for the same image on different screens, and it shows that there is no concept of exact size in inches on the screen. We don't all see the same size on the screen.
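The inch sizes quoted above follow directly from each screen's pixel width and its physical viewable width. Here is a minimal sketch of that calculation (Python; the 14.3 inch viewable width for a 19 inch CRT is my assumption, while the 12.5 inch width for a 17 inch monitor is the figure quoted later in the text):

```python
# Physical on-screen width (inches) of a fixed-pixel image on different monitors
def width_on_screen(image_px, screen_px_wide, viewable_inches_wide):
    pixels_per_inch = screen_px_wide / viewable_inches_wide
    return image_px / pixels_per_inch

# 19 inch CRT at 1280x960, ~14.3 in viewable width (assumed figure)
print(round(width_on_screen(500, 1280, 14.3), 1))   # ~5.6 inches
# 17 inch CRT at 1024x768, ~12.5 in viewable width (figure quoted in the text)
print(round(width_on_screen(500, 1024, 12.5), 1))   # ~6.1 inches, close to the 6.0 quoted
```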
So it will be quite necessary to forget about inches on the screen, because video only shows pixels, and screens differ in size. The only correct answer possible about video image size is that every screen will always show 500 pixels as exactly 500 pixels (assuming we view images at 100% Actual size; otherwise we are looking at different, resampled pixels). Images are dimensioned in pixels, and screens are dimensioned in pixels, and these 500 pixels will fill 500 pixels on any screen, but those same 500 pixels will fill a different area in inches on different size screens.
Inches simply have no meaning on the computer screen; we all see something different. Inches are not defined in the video system. There is no concept of dpi in the video system either. The way video works is that when you set your video settings to, say, 1024x768 pixels, then that 1024x768 pixels of memory on your video board defines your video system. The programs you use copy your pixels directly into that 1024x768 pixel video memory. One image pixel goes into one video board pixel memory location, one for one. A 500x200 pixel image fills 500x200 of those 1024x768 pixels. Those 1024x768 pixels are output to your screen, regardless of the size of the glass tube attached. Video is only about those 1024x768 pixels (or whatever the current setting is).
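To make the "one image pixel into one video memory location" idea concrete, here is a minimal conceptual sketch (Python with NumPy; real video drivers do this in hardware, and the array names are mine) that places a 500x200 pixel image into a 1024x768 frame buffer one pixel per location:

```python
import numpy as np

# Conceptual frame buffer for a 1024x768 screen setting (rows x columns x RGB)
framebuffer = np.zeros((768, 1024, 3), dtype=np.uint8)

# A 500x200 pixel image (200 rows tall, 500 columns wide), here just solid red
image = np.zeros((200, 500, 3), dtype=np.uint8)
image[:, :, 0] = 255

# Copy the image into the buffer one pixel per memory location, no scaling involved
x, y = 0, 0                      # top-left placement on the screen
framebuffer[y:y + 200, x:x + 500] = image
```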
Unfortunately, we frequently hear how 72 dpi or 96 dpi images are somehow important for the video screen, but that's totally wrong. Video simply doesn't work that way. Video systems have no concept of inches or dpi. No matter what dpi value may be stored in your image file (like 300 dpi for printing), your video system totally ignores it, and always just shows the pixels directly. The truth of this should be clearly apparent if you simply watch what it does.
Watch what it does, but be aware of the special cases that can confuse what we think we see.
About what we see:
Your photo editor program normally resamples a too-large image automatically to be smaller so it will fit into the program's window size. Then we only see the smaller copy on the screen (new, different pixels), not the original pixels. It also provides a View or Zoom menu, so we can show the copy at any size we wish, without affecting the original data. The window title bar will show the size reduction ratio as a warning that this screen copy is not the real image data. For example, the title bar might indicate we are viewing a copy at 33% size, or it might say a 1:3 ratio of real size (1:3 is also 33% size). We only see the actual original pixels when it says 100% or 1:1 size.
A second situation is page layout programs (like MS Word, Publisher, Acrobat, InDesign, PageMaker, Quark). These handle images differently than a photo editor does. The very least a photo editor can create is a one-pixel image; its purpose is to create images. But a page layout program has the single purpose of designing and printing paper documents. The very least it can create is a blank page, which at minimum specifies a paper page dimension in inches. This is very different; it is totally about that page of paper (but people get confused about this). We add text and images to that document to fill inches of printed paper. Page layout programs necessarily do show our document on the video screen, but what we see is an image replica of that page of paper. It may have other embedded images filling areas on that paper page, which are resampled very small to fit their allotted space in the image of the full page we see on the screen. Again, we have a Zoom menu to show that image of the page at any size we wish.
In both of these cases, we only see the smaller image copy on the screen (different, new pixels), but we print the larger image data using the original pixels. Both cases provide a View or Zoom menu, so we can show the images at any size we wish on the screen without affecting the original data. The point is that these are not exceptions, because the video system shows these new resampled pixels in the only way it can, directly, one for one. However, the new image size in pixels is not normally reported to us. Every dimension number we see still pertains to the original size data (or the size on printed paper), but that is not necessarily the image pixels we see on the screen (unless we view 100% Actual size).
It also means that when you want to evaluate your image critically, be sure to view the image at 100% Actual size (even if you must scroll around on it), so you are seeing the genuine image pixels that will print, and not a rough resampled temporary copy.
The screen is typically larger than our photographs, so enlargement is often used to show a snapshot photo. We often scan at higher resolutions to fill more of the screen. When we increase scan resolution, we get more pixels, so it increases the image size. But a little goes a long way, and there's no advantage in wrestling with overly huge images just to discard most of the pixels when we display them. So don't scan at 300 dpi or 600 dpi when there's no purpose for it.
How to do that? Very simple. The term dpi means Dots Per Inch, referring to image "pixels per inch" (dpi and ppi are the exact same thing as related to images, but printer rating dpi is different from image dpi; see the Printer Basics section).
(Here is the part you really need to know)
Assuming 100% scaling (see Scaling), the meaning of scanned dpi is that if you scan a 6x4 inch photo at 110 dpi, then you will necessarily get an image size of
(6 inches x 110 dpi) x (4 inches x 110 dpi) = 660 x 440 pixels
which more or less totally fills a 640x480 monitor screen.
Or scanning the 6x4 inch photo at 140 dpi gives
(6 inches x 140 dpi) x (4 inches x 140 dpi) = 840 x 560 pixels
which more or less totally fills a 800x600 monitor screen.
Or scanning the 6x4 inch photo at 180 dpi gives
(6 inches x 180 dpi) x (4 inches x 180 dpi) = 1080 x 720 pixels
which more or less totally fills a 1024x768 monitor screen.
We are not being very fussy about the exact screen dimensions, but you know the size of the area you are scanning (inches), and you know the size of the video image you wish to achieve (overall pixels), so you adjust resolution to get it (dots per inch, meaning pixels per inch). It depends on how many inches you have, and how many pixels you want. That's how it works, that's all there is to it. Really!
The idea is like this: Those three scanning resolutions just mentioned would create three different sized images from that one photo that could be shown on three screen resolutions, something like this: (2)
In real world practice, we are much more likely to scan at round numbers like 100, 150 or 200 dpi (giving 600x400 pixels, 900x600 pixels and 1200x800 pixels, from a 6x4 inch photo) instead of 110, 140, or 180 dpi (see about integer divisors), and these may be the only choices the scanner offers anyway. Regardless, the image size you create (pixels) is computed from the inches scanned and the resolution used, as shown above. The image size you want depends on your purpose, how the image is to be used. You scan at whatever resolution is required to create this desired image size from the inches you have to scan. If the image is for the screen, then the computer video system will show those pixels one for one, at that size on the screen. It is as simple as that.
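As a quick way to play with those numbers, here is a minimal sketch (Python, simply restating the arithmetic above; the function name is mine) that computes the pixel size produced by scanning a given area at a given resolution:

```python
# Pixels produced by scanning a given area (inches) at a given resolution (dpi)
def scan_size(width_in, height_in, dpi):
    return int(width_in * dpi), int(height_in * dpi)

for dpi in (100, 150, 200):
    print(dpi, "dpi ->", scan_size(6, 4, dpi))
# 100 dpi -> (600, 400)
# 150 dpi -> (900, 600)
# 200 dpi -> (1200, 800)
```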
So, to determine the scanning resolution that creates a certain image size, what matters is:
- How large is the area to be scanned?
- How large do we want the final image to be?
And the obvious answer is to select a resolution that will scale that input size in inches to that desired output size in pixels.
An example:
You have a 4x6 inch photo portrait, and maybe you want the six inch dimension to fill an 800x600 pixel screen vertically. Then obviously 600 pixels / 6 inches = 100 dpi scan resolution is required. Exactly 100 dpi, no more, no less. If the overwhelming requirement is that it must fit the screen, a different resolution is simply not a consideration in this case. Scanning 6x4 inches at 100 dpi will produce an image of (6x100) x (4x100) = 600x400 pixels, which aligned vertically just fills the 800x600 pixel screen height. Horizontally, this 400 pixel image would fill half the 800 pixel screen width, so we could put two of these side by side. However, if it were an 8x10 inch photo, then 600 / 10 = 60 dpi resolution would produce a 480x600 pixel image, which fits vertically, but we would have to crop to fit two of them horizontally.
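Going the other direction, from a desired pixel size back to the scan resolution, here is a minimal sketch of the same arithmetic (Python, using just the numbers from this example):

```python
# Scan resolution needed to turn a given length in inches into a target pixel count
def required_dpi(target_pixels, inches):
    return target_pixels / inches

print(required_dpi(600, 6))    # 100.0 dpi for the 6 inch side of a 4x6 photo
print(required_dpi(600, 10))   # 60.0 dpi for the 10 inch side of an 8x10 photo
```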
The point is, we can easily predict the exact results by just looking at the numbers.
The scanner's TWAIN driver will measure and calculate this output image size for us before the scan. Just change the settings to show pixels instead of inches, and it will show the final image size corresponding to the cropped Input area in the Preview and to the resolution you have specified. Both the Input and Output fields change to inch or pixel units. (3)
This example scans 6x4 inches of photo at 100 dpi, and creates 600x400 pixels (100% scale is assumed - see Scaling). Or going the other way, if you want a specific image size, you can type that specific size into the ScanWizard Input window, and the Preview area will change to that size, which you can then center on your image to crop as desired. This is Microtek ScanWizard 5, but regardless of which scanner, the idea is always the same. You specify a resolution, and the size of input image (typically by selecting an area with the mouse in the Preview), and these numbers create the specific size of the output image.
So for an image which is to be shown on a video monitor, you choose scanning resolution to get the desired image size. If your purpose was to show it on a 800x600 pixel screen, then you would NOT scan at 300 dpi to get a (6x300) x (4x300) = 1800x1200 pixel image. Images can be resized, but over ¾ of your total pixels would have to be discarded to fit the screen (what a waste!).
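That "over ¾" figure is easy to verify. Here is a minimal sketch (Python; the aspect-preserving fit is my assumption of how the image would be reduced to fit the screen):

```python
# Fraction of pixels discarded when fitting an 1800x1200 scan onto an 800x600 screen
orig_w, orig_h = 1800, 1200
screen_w, screen_h = 800, 600

scale = min(screen_w / orig_w, screen_h / orig_h)      # keep the aspect ratio
fit_w, fit_h = int(orig_w * scale), int(orig_h * scale)

discarded = 1 - (fit_w * fit_h) / (orig_w * orig_h)
print(fit_w, fit_h, f"{discarded:.0%} of the pixels discarded")   # ~80%
```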
Video Resolution
So what about 72 dpi?
We often hear that we should scan at 72 dpi for the video screen, like it's some kind of magic number. It's not. I'd suggest you forget about 72 dpi. We hear a lot of things that just don't stand up well to examination.
There is no way to use notions of 72 dpi to produce a useful result (I bet you already knew that), because video simply doesn't work that way. There is no concept of dpi in the video system. Video systems show pixels, one for one. If you scan at 72 dpi, what you get is 72 pixels per inch of original photo dimension. That may or may not be the size of image you want. Often it is not (even if it were, 75 dpi would be better, see integer divisors).
The only possible virtue of the notion of scanning at 72 to 96 dpi is that it will create an image size in pixels that is usually a rough approximation of the original size in inches on many common monitors. However, it is simply not accurate; the same image will still appear at different sizes on different screens (as stated before).
Even if accurate original size were possible (on paper yes, but it is not possible on the screen), would you even want it? Scan something small at 72 dpi, a postage stamp, or a 35 mm film frame. All you get is a small thumbnail image. Then try 600 or 1200 dpi. See the difference? There is a HUGE difference, and there are many choices, and the previous page showed how to accurately predict precisely what will happen, so you can get any result that you want.
For the screen, scanning at 72 ppi is simply one of many choices, and it produces one specific image size (72 pixels per inch of original). On most commonly used screen sizes, scanning at 70 to 100 ppi creates images usually seen as about original size in inches (very roughly). However, exact original size on the screen is not a valid concept, because screen sizes vary, so any image varies in size on different screens.
There are of course many other size choices than 72 ppi. If 72 ppi is the image size you want on the screen, that's fine, but try scanning at 75 ppi, the scanner can do slightly better (Chapter 9). But if this is not the size you want, then forget about 72 ppi, it has no significance at all.
The loss of the false 72 dpi myth can be pretty earth-shaking for some, so if it's a problem, here is more elaboration.
Video monitors are relatively low resolution devices. A 17 inch monitor screen might measure 12.5 inches horizontally. If it is set to 1024x768 pixel screen size, then the image is obviously 1024 pixels / 12.5 inches = 82 dpi apparent resolution in that case (if we had an image 82 pixels wide, it would appear as one inch on that screen). A 15 inch monitor at 800x600 pixels might be 75 dpi. A 14 inch monitor at 640x480 pixels might be about 65 dpi. Other sizes and settings compute other numbers, but most combinations normally used are vaguely in the rough range of 72 to 96 dpi.
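Those apparent-dpi figures are just the screen's pixel width divided by its physical width. A minimal sketch of the calculation (Python; the 12.5 inch width comes from the text, while the other viewable widths are rough assumptions of mine):

```python
# Apparent screen resolution: pixels across divided by viewable inches across
def apparent_dpi(screen_px_wide, viewable_inches_wide):
    return screen_px_wide / viewable_inches_wide

print(round(apparent_dpi(1024, 12.5)))   # ~82 dpi, 17 inch monitor at 1024x768
print(round(apparent_dpi(800, 10.7)))    # ~75 dpi, 15 inch monitor at 800x600 (assumed width)
print(round(apparent_dpi(640, 9.9)))     # ~65 dpi, 14 inch monitor at 640x480 (assumed width)
```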
This computation is the origin of those numbers, and their only significance. Screens are NOT 72 dpi in any way except this one way, which is not a factor in scanning for the screen (an effect, not a cause). We may compute that apparent dpi number, but the video system has no concept of it. The only important factor is the size of the screen, like 1024x768 pixels, and how your image size fits in those 1024x768 pixels.
The concept of those calculated dpi numbers has little significance for scanning. Neither the video driver nor the video board has any concept of screen size in inches, and therefore dpi can have no meaning for the images either. The video system does not use those numbers; video simply shows those 1024x768 pixels directly. You could scan at that computed dpi number to show actual original size on that one computed screen, but that is generally pointless, because the image size (in inches) would not be the same on a second monitor; screens vary in size.
This section is about video screens. Screens only show pixels, directly, and screens vary in size. This is an extremely important and fundamental concept; you won't make much progress without it. And the best part is that the right answer is extremely simple, and the whole story is here. See how to use it in the next section.
Video Resolution - How much to scan?
The useful way to think of video resolution is that our screens show a specific area of pixels, which is adjustable to be 800x600 pixels or 1024x768 pixels, and other sizes too. Therefore, to fit an image onto this screen area, the only number that is important to describe video images is the X by Y image size in pixels, like say 400x300 pixels. For video screens, it is unimportant whether that 400x300 pixel image was scanned at 72 or 972 ppi, if the original areas were such that the image comes out 400x300 pixels either way. This is very simple.
On the screen, resolution determines image size, not quality (printing is the opposite).
The scan resolution is your choice, and it determines the image size created (in pixels), from whatever content you want to show in that image. But once the image is created, all that is important to the video system is the "X by Y" image size in pixels. Knowing this image size, we can judge how much of our screen it will fill.
A 400x300 pixel image will always fill 400x300 pixels on any screen, but this same image will look larger on a 640x480 pixel screen (it will fill more of the 640x480 pixels) than on an 800x600 pixel screen. It will be smaller yet on a 1024x768 pixel screen (400x300 will fill less of the 1024x768 pixels). Larger or smaller here means that it fills more or less of the total screen area, but it is always 400x300 pixels on any screen. The percentage of "fullness" varies with screen size. Don't let this be hard.
More examples:
If we intend to scan a 0.5 inch width and want to create an image with 1000 pixel width, then we need to scan at (1000 pixels / 0.5 inch) = 2000 dpi (good luck, this is quite extreme, unless we are scanning film). Or if we will scan an 8 inch width and need an image with 400 pixels of width, then we scan at (400 pixels / 8 inches) = 50 dpi. Remember that "dpi" is Dots Per Inch, meaning pixels per inch.
It's still a hard question however. What size do we want? Are we scanning to fill a quarter of a 640x480 pixel screen, or to totally fill a 1280x1024 pixel screen? Only you can answer questions about your purpose.
But if scanning for the web, keep in mind that a few of us still use 800x600 pixel screens, whether you do or not. So, it is a very good idea to switch to 800x600 pixels (or at least to 1024x768 pixels) and check your own web pages (design your web pages to float to adapt to any video resolution).
I hope everyone will always test for themselves the various ideas about getting better results. For example, speaking of tiny images, some suggest images are made better by scanning larger and resampling smaller to get the reduced size. As if the high resolution could somehow be retained when we discard the excess pixels? No, it doesn't work that way. But yes, in moderation, this can in fact sometimes help, in some cases (not because resampling smaller helps, since that only discards pixels, but because the photo editor can probably do that resample better than the scanner can). If we want 16 dpi size for a small thumbnail image of a large book cover, then scanning at 50 dpi and resampling to 1/3 size is reasonable, for reasons concerning resampling: the scanner may not be as good at such drastic reductions (from 1200 dpi to 16 dpi). But scanning at 300 dpi for the only purpose of intentionally doing an extreme resample to 1/10 size is hardly reasonable; it becomes pointless.
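If you do take the scan-larger-then-resample route, the reduction is normally done in the photo editor. As a minimal sketch, here is how it might look in Python with the Pillow library (the filenames and the 1/3 target size are hypothetical):

```python
from PIL import Image

# Open a scan that is larger than needed (hypothetical file), then resample it
# down to the final size in the photo editor rather than in the scanner driver.
img = Image.open("book_cover_scan.png")        # e.g. scanned at 50 dpi
small = img.resize((img.width // 3, img.height // 3), Image.LANCZOS)
small.save("book_cover_thumbnail.png")
```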
Opinions vary, and this is a subject that we will keep coming back to. It is a popular claim that we should scan at only even divisors like 75, 100, 150 or 300 dpi, and then resample the image to be smaller to get the final size. This idea claims that elaborate image programs (like Photoshop) can do a better resampling job than the scanner can. More on this later, but I have to agree, sometimes it can.
Wrestling with the huge images probably builds our fortitude and character, but it's probably best to just scan to get the image size you need in the first place. And it's so practical.