Texture mapping
Texture mapping is a method for adding detail, surface texture, or color to a computer-generated graphic or 3D model. Its application to 3D graphics was pioneered by Edwin Catmull in his 1974 Ph.D. thesis.
A texture map is applied (mapped) to the surface of a shape, or polygon. This process is akin to applying patterned paper to a plain white box. In the pictured example, a texture map of the Earth's coloration is applied to a sphere to create the illusion of color detail that would otherwise require many additional polygons to realize.
Multitexturing is the use of more than one texture at a time on a polygon.[1] For instance, a light map texture may be used to light a surface as an alternative to recalculating that lighting every time the surface is rendered. Another multitexture technique is bump mapping, which allows a texture to directly control the facing direction of a surface for the purposes of its lighting calculations; it can give a very good appearance of a complex surface, such as tree bark or rough concrete, that takes on lighting detail in addition to the usual detailed coloring. Bump mapping has become popular in recent video games as graphics hardware has become powerful enough to accommodate it.
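The light map technique above can be sketched as a simple per-texel modulation: the final color is the product of a base texel and a precomputed light map texel, so lighting need not be recomputed each frame. This is an illustrative sketch, not code from any real engine; the names and the tiny 2x2 textures are assumptions.

```python
# Illustrative sketch of lightmap-style multitexturing: the final surface
# color is the product of a base (diffuse) texel and a light-map texel.
# All color components are in the range 0..1.

def modulate(base_texel, light_texel):
    """Combine one base texel with one light-map texel componentwise."""
    return tuple(b * l for b, l in zip(base_texel, light_texel))

# Hypothetical 2x2 RGB textures for demonstration.
base_map  = [[(1.0, 0.5, 0.2), (0.8, 0.8, 0.8)],
             [(0.2, 0.4, 1.0), (1.0, 1.0, 1.0)]]
light_map = [[(1.0, 1.0, 1.0), (0.5, 0.5, 0.5)],
             [(0.25, 0.25, 0.25), (0.0, 0.0, 0.0)]]

lit = [[modulate(base_map[y][x], light_map[y][x]) for x in range(2)]
       for y in range(2)]
print(lit[0][1])  # (0.4, 0.4, 0.4): a grey texel at half brightness
```

Real hardware exposes the same idea as a per-fragment texture combine stage; multiplying (modulating) is only one of several possible combine modes.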
The way the resulting pixels on the screen are calculated from the texels (texture pixels) is governed by texture filtering. The fastest method is nearest-neighbour interpolation, but bilinear interpolation is commonly chosen as a good trade-off between speed and accuracy. If a texture coordinate falls outside the texture, it is either clamped or wrapped.
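A minimal sketch of these two filtering modes and the clamp/wrap addressing behaviour, on an assumed 2x2 greyscale texture (function names and layout are illustrative, not any particular API):

```python
# Sketch of texture sampling: nearest-neighbour vs. bilinear filtering,
# with out-of-range coordinates either clamped to the edge or wrapped
# (tiled). Texture is a 2D list of greyscale values in 0..1; coordinates
# (u, v) are in texel units.

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def sample_nearest(tex, u, v, mode="clamp"):
    """Pick the single closest texel; fastest, but blocky up close."""
    h, w = len(tex), len(tex[0])
    x, y = int(round(u)), int(round(v))
    if mode == "wrap":
        x, y = x % w, y % h          # tile the texture
    else:
        x, y = clamp(x, 0, w - 1), clamp(y, 0, h - 1)  # repeat the edge
    return tex[y][x]

def sample_bilinear(tex, u, v):
    """Weighted average of the four surrounding texels; smoother result."""
    h, w = len(tex), len(tex[0])
    x0, y0 = int(u), int(v)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = u - x0, v - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(sample_nearest(tex, 3.0, 0.0, mode="wrap"))  # wraps to texel (1, 0) -> 1.0
print(sample_bilinear(tex, 0.5, 0.5))              # average of all four -> 0.5
```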
Perspective correctness
Texture coordinates are specified at each vertex of a given triangle (for graphics hardware, polygons are generally broken down into triangles for rendering), and these coordinates are interpolated using an extended Bresenham's line algorithm. If these texture coordinates are linearly interpolated across the screen, the result is affine texture mapping. This is a fast calculation, but there can be a noticeable discontinuity between adjacent triangles when these triangles are at an angle to the plane of the screen (see figure at right).
Perspective correct texturing accounts for the vertices' positions in 3D space, rather than simply interpolating a 2D triangle. This achieves the correct visual effect, but it is slower to calculate. Instead of interpolating the texture coordinates directly, the coordinates are divided by their depth (relative to the viewer), and the reciprocal of the depth value is also interpolated and used to recover the perspective-correct coordinate. This correction makes it so that in parts of the polygon that are closer to the viewer the difference from pixel to pixel between texture coordinates is smaller (stretching the texture wider), and in parts that are farther away this difference is larger (compressing the texture).
- Affine texture mapping directly interpolates a texture coordinate $u_\alpha$ between two endpoints $u_0$ and $u_1$:
  $u_\alpha = (1 - \alpha)\, u_0 + \alpha\, u_1$, where $0 \le \alpha \le 1$
- Perspective correct mapping interpolates after dividing by depth $z$, then uses its interpolated reciprocal to recover the correct coordinate:
  $u_\alpha = \dfrac{(1 - \alpha)\,\frac{u_0}{z_0} + \alpha\,\frac{u_1}{z_1}}{(1 - \alpha)\,\frac{1}{z_0} + \alpha\,\frac{1}{z_1}}$
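The two interpolation schemes can be compared directly in code. This is a sketch of the formulas as described, with assumed endpoint coordinates u0, u1 at depths z0, z1 and interpolation factor a in [0, 1]:

```python
# Affine vs. perspective-correct interpolation of a texture coordinate u
# between two endpoints at depths z0 and z1, for a factor a in [0, 1].

def affine_u(u0, u1, a):
    """Plain linear interpolation in screen space (affine mapping)."""
    return (1 - a) * u0 + a * u1

def perspective_u(u0, z0, u1, z1, a):
    """Interpolate u/z and 1/z linearly, then divide to recover u."""
    num = (1 - a) * (u0 / z0) + a * (u1 / z1)
    den = (1 - a) * (1 / z0) + a * (1 / z1)
    return num / den

# A span from u=0 at depth 1 to u=1 at depth 4, sampled at its midpoint.
print(affine_u(0.0, 1.0, 0.5))                  # 0.5
print(perspective_u(0.0, 1.0, 1.0, 4.0, 0.5))   # 0.2
```

At the screen-space midpoint, the perspective-correct coordinate is 0.2 rather than 0.5: the nearer half of the span covers only a fifth of the texture, which is exactly the near-stretching and far-compression described above. When z0 equals z1, the two functions agree.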
Most modern graphics hardware implements perspective correct texturing, but when games still relied on software rendering, perspective correct texturing had to be used sparingly because of its computational expense. Several techniques were developed to hide the defects of affine texture mapping. For instance, Doom applied perspective correction either vertically or horizontally (depending on whether a floor or a wall was being drawn), but never both at once; this allowed the perspective correction calculation to be done only once per line (with affine interpolation between the endpoints), but restricted the level geometry to horizontal and vertical planes. Quake took a different approach: it calculated perspective correct coordinates only once every 16 pixels of a scanline and interpolated linearly between them, striking a compromise between the speed of affine texturing and perspective correctness.[2]
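The every-16-pixels compromise can be sketched as follows. This is a simplified illustration of the idea, not Quake's actual inner loop; the function names and scanline layout are assumptions.

```python
# Sketch of piecewise perspective correction on one scanline: compute the
# perspective-correct texture coordinate only every `step` pixels, and
# interpolate affinely (linearly) between those sample points.

def perspective_u(u0, z0, u1, z1, a):
    """Perspective-correct u at factor a, via linear u/z and 1/z."""
    num = (1 - a) * (u0 / z0) + a * (u1 / z1)
    den = (1 - a) * (1 / z0) + a * (1 / z1)
    return num / den

def scanline_u(u0, z0, u1, z1, width, step=16):
    """Texture u for each of `width` pixels: exact every `step` pixels."""
    us = []
    for start in range(0, width, step):
        end = min(start + step, width)
        # Exact (divided) coordinates at the segment's two ends only.
        ua = perspective_u(u0, z0, u1, z1, start / (width - 1))
        ub = perspective_u(u0, z0, u1, z1, (end - 1) / (width - 1))
        n = end - start
        for i in range(n):
            t = i / (n - 1) if n > 1 else 0.0
            us.append((1 - t) * ua + t * ub)  # cheap affine step in between
    return us

us = scanline_u(0.0, 1.0, 1.0, 4.0, 33, step=16)
print(us[0], us[-1])  # endpoints are exact: 0.0 and 1.0
```

Only two divisions are needed per 16-pixel segment instead of one per pixel, while the error of pure affine mapping is confined to short spans where it is much less visible.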
See also
- Bump mapping
- Clamping
- Normal mapping
- Displacement mapping
- Edwin Catmull
- Texture filtering
- Texture splatting - a technique for combining textures.
- Wrapping (graphics)
- Texture atlas
- Dynamic textures
References
- ^ Blythe, David. Advanced Graphics Programming Techniques Using OpenGL. Siggraph 1999. (see: Multitexture)
- ^ Abrash, Michael. Michael Abrash's Graphics Programming Black Book, Special Edition. The Coriolis Group, Scottsdale, Arizona, 1997. ISBN 1-57610-174-6. (Chapter 70, p. 1282)
External links
- Perspective Corrected Texture Mapping at GameDev.net
- Introduction into texture mapping using C and SDL
- Programming a textured terrain using XNA/DirectX, from www.riemers.net
- Perspective correct texturing