Unfortunately, not all hardware is created equal. The most primitive X servers support only two colors; each pixel is either on or off. This is referred to as a "one bit per pixel (bpp)" display. A display with one bit per pixel is said to have a depth of one. More advanced X servers support 24 or 32 bits per pixel, and allow you to specify a different depth on a window-by-window basis. 24 bits per pixel allows 2^24 distinct pixel values, which is more colors than the human eye can differentiate.
Conceptually, a bitmap display consists of a rectangular grid of pixels. Each pixel consists of some fixed number of bits; pixels are mapped to visible colors in a hardware-dependent way. One way to think about this is to imagine a two-dimensional array of integers, where the integer size is chosen to hold the required number of bits. Alternatively, you can think of a display like this as a stack of bit planes, or two-dimensional arrays of bits. If all the planes are parallel to one another, a pixel is a perpendicular line passing through the same coordinates on each plane, taking a single bit from each one. This is the origin of the term depth, since the number of bits per pixel is equal to the depth of the stack of bit planes.
In the X Window System, pixels represent entries in a color lookup table. A color is a red, green, blue (RGB) value---monitors mix red, green, and blue light in some ratio to display each pixel. Take an eight-bit display, for example: eight bits are not enough to encode a color in-place; only a few arbitrary RGB values would be possible. Instead, the bits are interpreted as an integer and used to index an array of RGB color values. This table of colors is called the colormap; it can sometimes be modified to contain the colors you plan to use, though this is hardware-dependent---some colormaps are read-only.
A visual is required to determine how a pixel's bit pattern is converted into a visible color. Thus, a visual also defines how colormaps work. On an 8-bit display, the X server might interpret each pixel as an index into a single colormap containing the 256 possible colors. 24-bit visuals typically have three colormaps: one for shades of red, one for shades of green, and one for shades of blue. Each colormap is indexed with an eight-bit value; the three eight-bit values are packed into a 24-bit pixel. The visual defines the meaning of the pixel contents. Visuals also define whether the colormap is read-only or modifiable.
In short, a visual is a description of the color capabilities of a particular X server. In Xlib, you have to do a lot of fooling around with visuals; GDK and GTK+ shield you from most of the mess.
Xlib can report a list of all available visuals and information about each; GDK keeps a client-side copy of this information in a struct called GdkVisual. GDK can report the available visuals, and rank them in different ways. Most of the time you will only use gdk_visual_get_system(), which returns a pointer to the default visual (Figure 2). (If you're writing a GtkWidget, gtk_widget_get_visual() returns the visual you should use; more on this in the chapter called Writing a GtkWidget.) The returned visual is not a copy, so there is no need to free it; GDK keeps visuals around permanently.
For reference, here are the contents of GdkVisual; most of the members are used to calculate pixel values from colors. Since this is fairly involved and rarely used, this book glosses over the topic. The depth member is sometimes convenient; the section called Types of Visual has more to say about the type member.
typedef struct _GdkVisual GdkVisual;

struct _GdkVisual
{
  GdkVisualType type;
  gint depth;
  GdkByteOrder byte_order;
  gint colormap_size;
  gint bits_per_rgb;

  guint32 red_mask;
  gint red_shift;
  gint red_prec;

  guint32 green_mask;
  gint green_shift;
  gint green_prec;

  guint32 blue_mask;
  gint blue_shift;
  gint blue_prec;
};
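As a quick illustration, a minimal program along these lines prints a few of the members for the default visual; this is only a sketch, assuming nothing beyond GDK itself:

#include <gdk/gdk.h>

int
main (int argc, char *argv[])
{
  GdkVisual *visual;

  gdk_init (&argc, &argv);

  /* The returned visual is not a copy; there is no need to free it. */
  visual = gdk_visual_get_system ();

  g_print ("depth: %d bits per pixel\n", visual->depth);
  g_print ("colormap size: %d entries\n", visual->colormap_size);
  g_print ("bits per RGB: %d\n", visual->bits_per_rgb);

  return 0;
}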
Visuals differ along several dimensions. They can be grayscale or RGB, colormaps can be modifiable or fixed, and the pixel value can either index a single colormap or contain packed red, green, and blue indexes. Here are the possible values for GdkVisualType:
GDK_VISUAL_STATIC_GRAY means the display is either monochrome or gray scale, and the colormap cannot be modified. A pixel value is simply a level of gray; each pixel is "hard coded" to represent a certain on-screen color.
GDK_VISUAL_GRAYSCALE means the display has a modifiable colormap, but only levels of gray are possible. The pixel represents an entry in the colormap, so a given pixel can represent a different level of gray at different times.
GDK_VISUAL_STATIC_COLOR represents a color display which uses a single read-only colormap rather than a separate colormap for each of red, green, and blue. The display is almost certainly 12-bit or less (a 24-bit display using a single colormap would need a colormap with 2^24 entries, occupying close to half a gigabyte---not very practical!). This is an annoying visual, because relatively few colors are available and you can't change which colors they are.
GDK_VISUAL_PSEUDO_COLOR is the most common visual on low-end PC hardware from several years ago. If you have a one-megabyte 256-color video card, this is most likely your X server's visual. It represents a color display with a read/write colormap. Pixels index a single colormap.
GDK_VISUAL_TRUE_COLOR is a color display with three read-only colormaps, one for each of red, green, and blue. A pixel contains three indexes, one per colormap. There is a fixed mathematical relationship between pixels and RGB triplets; you can get a pixel from red, green, and blue values in [0, 255] using the formula: gulong pixel = (gulong)(red*65536 + green*256 + blue). (A short sketch of this computation appears after this list.)
GDK_VISUAL_DIRECT_COLOR is a color display with three read-write colormaps. If you use the GDK color handling routines, they simply fill up all three colormaps to emulate a true color display, then pretend the direct color display is true color.
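To make the true color case concrete, here is the pixel computation from above as a small sketch; the helper name rgb_to_pixel is invented for illustration, and the layout assumed is the common one with red in the most significant byte:

/* Pixel packing for the common 24-bit true color layout: red in the
   most significant byte, then green, then blue. red, green, and blue
   must be in [0, 255]. Fully portable code would instead consult the
   visual's red_shift, green_shift, and blue_shift members. */
static gulong
rgb_to_pixel (guint red, guint green, guint blue)
{
  return (gulong) (red * 65536 + green * 256 + blue);
}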
A GdkColor stores an RGB value and a pixel. Red, green, and blue are given as 16-bit unsigned integers, so they are in the range [0, 65535]. The contents of the pixel depend on the visual. Here is GdkColor:
typedef struct _GdkColor GdkColor;

struct _GdkColor
{
  gulong pixel;
  gushort red;
  gushort green;
  gushort blue;
};
Before you can use a color to draw, you must:
Ensure that the pixel value contains an appropriate value.
Ensure that the color exists in the colormap of the drawable you intend to draw to. (A drawable is a window or pixmap you can draw to; see the section called Drawables and Pixmaps.)
In Xlib, this is an enormously complicated process, because it has to be done differently for every kind of visual. GDK conceals things fairly well. You simply call gdk_colormap_alloc_color() to fill in the pixel value and add the color to the colormap (Figure 3). Here is an example; it assumes a preexisting GdkColormap* colormap, which should be the colormap of the drawable you are targeting:
GdkColor color;

/* Describe a pure red */
color.red = 65535;
color.green = 0;
color.blue = 0;

if (gdk_colormap_alloc_color(colormap, &color, FALSE, TRUE))
  {
    /* Success! */
  }
If gdk_colormap_alloc_color() returns TRUE, then the color was allocated and color.pixel contains a valid value. The color can then be used to draw. The two boolean arguments to gdk_colormap_alloc_color() specify whether the color should be writeable, and whether to try to find a "best match" if the color can't be allocated. If a best match is used instead of allocating a new color, the color's RGB values will be changed to the best match. If you request a best match for a non-writeable entry, allocation really should not fail, since even on a black and white display either black or white will be the best match; only an empty colormap could cause failure. The only way to get an empty colormap is to create a custom colormap yourself. If you don't ask for the best match, failure is quite possible on displays with a limited number of colors. Failure is always possible with writeable colormap entries (where best match makes no sense, because the entry can be modified).
A writeable colormap entry is one that you can change at any time; some visuals support this, and some don't. The purpose of a writeable colormap entry is to change an on-screen color without redrawing the graphics. Some hardware stores pixels as indices into a color lookup table, so changing the lookup table changes how the pixels are displayed. The disadvantages of writeable colormap entries are numerous. Most notably: not all visuals support them, and writeable colormap entries can't be used by other applications (read-only entries can be shared, since other applications know the color will remain constant). Thus, it is a good idea to avoid allocating writeable colors. On modern hardware, they are more trouble than they're worth; the speed gain compared to simply redrawing your graphics will not be noticeable.
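If you do decide to use one anyway, the general shape is sketched below; it assumes a preexisting GdkColormap* colormap on a visual with a modifiable colormap (such as pseudo color), and uses gdk_color_change() to store new RGB values in the already-allocated entry:

#include <gdk/gdk.h>

/* A sketch: allocate a writeable entry, then change its on-screen
   color without redrawing. Assumes colormap belongs to a visual with
   a modifiable colormap, such as pseudo color. */
static void
blue_to_red (GdkColormap *colormap)
{
  GdkColor color;

  color.red = 0;
  color.green = 0;
  color.blue = 65535;

  /* TRUE requests a writeable entry; best match is off, since a
     writeable entry can't be matched against existing colors. */
  if (!gdk_colormap_alloc_color (colormap, &color, TRUE, FALSE))
    return;

  /* Everything drawn with color.pixel turns from blue to red
     on-screen, with no redrawing. */
  color.red = 65535;
  color.blue = 0;
  gdk_color_change (colormap, &color);
}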
When you're finished with a color, you can remove it from the colormap with gdk_colormap_free_colors(). This is only really important for pseudo color and grayscale visuals, where colors are in short supply and the colormap can be modified by clients. GDK will automatically do the right thing for each visual type, so always call this function.
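Continuing the earlier pure-red example, and assuming the color was successfully allocated from colormap, freeing it looks like this:

/* Done with the color; release its entry in the colormap. */
gdk_colormap_free_colors (colormap, &color, 1);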
A convenient way to obtain RGB values is the gdk_color_parse() function. This takes an X color specification, and fills in the red, green, and blue fields of a GdkColor. An X color specification can have many forms; one possibility is an RGB string:
RGB:FF/FF/FF
This specifies white (red, green, and blue are all at full intensity). The RGB: specifies a "color space," and determines the meaning of the numbers after it. X also understands several more obscure color spaces. If the color specification string doesn't begin with a recognized color space, X assumes it's a color name and looks it up in a database of names. So you can write code like this:
GdkColor color;

if (gdk_color_parse("orange", &color))
  {
    if (gdk_colormap_alloc_color(colormap, &color, FALSE, TRUE))
      {
        /* We have orange! */
      }
  }
As you can see, gdk_color_parse() returns TRUE if it figures out the string you pass it. There is no way to know exactly what will be in the color database, so always check this return value.
#include <gdk/gdk.h>

gboolean gdk_colormap_alloc_color(GdkColormap* colormap,
                                  GdkColor* color,
                                  gboolean writeable,
                                  gboolean best_match);

void gdk_colormap_free_colors(GdkColormap* colormap,
                              GdkColor* colors,
                              gint ncolors);

gint gdk_color_parse(gchar* spec,
                     GdkColor* color);

Figure 3. Color Allocation
If you're writing a GtkWidget subclass, the correct way to obtain a colormap is with gtk_widget_get_colormap() (see the chapter called Writing a GtkWidget). Otherwise, the system (default) colormap is usually what you want; call gdk_colormap_get_system(), which takes no arguments and returns the default colormap.
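Putting the pieces together, a sketch like the following obtains the default colormap and allocates a named color in it; the helper name alloc_sky_blue is invented for illustration, and "sky blue" should exist in the X color database:

#include <gdk/gdk.h>

/* A sketch: obtain the default colormap, then allocate a named color
   in it. Returns TRUE on success; color->pixel is then usable for
   drawables that use the default colormap. */
static gboolean
alloc_sky_blue (GdkColor *color)
{
  GdkColormap *colormap = gdk_colormap_get_system ();

  return gdk_color_parse ("sky blue", color) &&
         gdk_colormap_alloc_color (colormap, color, FALSE, TRUE);
}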
The GdkRGB module (see the section called RGB Buffers) is another way to deal with colors; among other capabilities, it can set the foreground and background colors of a graphics context from an RGB value. The relevant functions are gdk_rgb_gc_set_foreground() and gdk_rgb_gc_set_background(). GdkRGB has a pre-allocated colormap that it uses to pick a best-match color; using it means that your application can share limited colormap resources with other applications using GdkRGB (such as the Gimp). You can also obtain GdkRGB's colormap and use it directly (see the section called RGB Buffers).
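A sketch of the graphics context case might look like this; it assumes gdk_rgb_init() has been called once at startup, that window is a realized window using GdkRGB's visual and colormap, and the helper name orange_gc is invented:

#include <gdk/gdk.h>
#include <gdk/gdkrgb.h>

/* A sketch: build a GC whose foreground is set from a packed
   0xRRGGBB value, letting GdkRGB find a best-match pixel. */
static GdkGC *
orange_gc (GdkWindow *window)
{
  GdkGC *gc = gdk_gc_new (window);

  gdk_rgb_gc_set_foreground (gc, 0xff8800);   /* orange-ish */

  return gc;
}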