
The Differences Between 256 Colors, 16-Bit Color, and 24-Bit Color

Two hundred fifty-six colors, 16-bit color, and 24-bit color are terms that describe "color depth," the number of colors displayed on your computer screen.

Early monochrome screens displayed white, green, or amber text on a black background. Computers with those screens use a single bit per pixel to represent color. Since a bit has two possible states (1 or 0), each pixel can be in one of two states, on or off. If the pixel is "on," that means it is glowing, which shows up as white (or green or amber, depending on the screen).

256 colors

It wasn't long before people wanted more color on their screens. The next step up was screens that could display 16 different colors. This requires four bits per pixel. Four bits can represent 16 possible states because 2 to the 4th power is 16. But with only 16 colors, you still don't get a very realistic color effect.

The next step up was 8 bits per pixel, which allows 256 colors. That's about the level of color you see in business graphics. When you get to 256 colors (8-bit color), you can start making cartoons and graphics that look like drawings. Icons, for the most part, use either 16 or 256 colors.
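The progression described above follows directly from the math of bits: each extra bit per pixel doubles the number of displayable colors. A small sketch in Python makes the pattern explicit:

```python
# Number of displayable colors for common color depths.
# Each added bit per pixel doubles the count: colors = 2 ** bits.
for bits in (1, 4, 8, 16, 24):
    print(f"{bits:>2} bits per pixel -> {2 ** bits:,} colors")
```

Running this prints 2 colors for monochrome, 16 for 4-bit, 256 for 8-bit, and so on up the scale the article walks through.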

16-bit color

The 256-color scheme is pretty good for simple graphics but not for photo reproduction. As graphic displays on the computer got more sophisticated, people wanted to see photos on their computers and on the Web, so they added even more bits per pixel. With 16 bits per pixel (16-bit color) you get 2 to the 16th power color combinations -- 65,536 of them. That's sometimes called a high-color display, and it's good enough for most graphics. Most games use 16-bit color.
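The article doesn't say how the 16 bits are divided among the color channels, but a common layout (an assumption here, not stated in the text) is "RGB565": 5 bits for red, 6 for green, and 5 for blue, with green getting the extra bit because the eye is most sensitive to it. A minimal sketch of packing a color into that format:

```python
def pack_rgb565(r, g, b):
    """Pack 8-bit red, green, blue values into one 16-bit RGB565 word.

    Only the high bits of each channel are kept; the discarded low
    bits are why 16-bit color can show banding in smooth gradients.
    """
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_rgb565(255, 255, 255)))  # white fills all 16 bits
```

The function name and the 5-6-5 split are illustrative; real hardware has used several 16-bit layouts.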

24-bit color

It is estimated that the eye can resolve millions of different colors and shades. To cover that range, you need 24 bits of color information per pixel, or 24-bit color, which gives 2 to the 24th power -- about 16.7 million -- possible colors. This is called "true color."

For almost everybody -- except high-end graphic artists -- 24-bit color is sufficient. However, some displays go to an even higher color resolution. A 32-bit mode can represent 2 to the 32nd power -- over 4 billion -- values per pixel, although in practice 32-bit color usually means 24 bits of color plus an 8-bit alpha (transparency) channel.
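In 24-bit true color, each of the red, green, and blue channels gets its own 8 bits, i.e. 256 levels apiece, which is where the 16.7 million figure comes from (256 × 256 × 256). A minimal sketch of that packing:

```python
def pack_rgb888(r, g, b):
    """Pack 8-bit red, green, blue values into one 24-bit true-color value.

    256 levels per channel gives 256 ** 3 = 16,777,216 possible colors.
    """
    return (r << 16) | (g << 8) | b

print(hex(pack_rgb888(255, 255, 255)))  # white uses all 24 bits
print(256 ** 3)                         # total true-color combinations
```

This is the same layout behind the familiar HTML hex notation, where #FFFFFF is white.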

If you have a VGA monitor, it's your video card, not the monitor, that determines how many colors you can display. Most video cards can display at least 8-bit color, and almost all can display 16-bit color (high color). If you have enough memory in your video card, you can display 24-bit color (true color).

Macs use different naming conventions. On the Mac, 16-bit color is called "thousands of colors" and 24-bit color is "millions of colors."
