CHAPTER 8: Color and Texture Mapping
© 2008 Cengage Learning EMEA


Page 1:

CHAPTER 8

Color and Texture Mapping

© 2008 Cengage Learning EMEA

Page 2:

LEARNING OBJECTIVES

In this chapter you will learn about:
– Color
– Texturing
– Texture filtering
– Mipmapping
– Nearest point interpolation
– Bilinear filtering
– Trilinear filtering
– Anisotropic filtering
– Basic texture mapping
– Bump mapping
– Cube mapping (environmental mapping)

Page 3:

COLOR

The representation of color in display devices (using red, green, and blue components) can be linked directly to the human perception of color.

The human brain only picks up three color values at any given moment (as opposed to a complete color distribution).

These three values, called tri-stimulus values, are the result of the human visual system’s color cones.

Page 4:

COLOR

Color cones reduce any perceived color range to three distinct values.

The significance of this reduction to computer graphics is that any color can be reproduced using three color elements, namely, a red, green, and blue component.

Page 5:

COLOR

Computers make use of the red–green–blue (RGB) color model.

This is an additive color model where perceived colors are formed by overlapping the primary colors red, green, and blue.

Page 6:

COLOR

Another commonly used color model is the cyan–magenta–yellow (CMY) model, also called the subtractive color model.

The perceived colors are formed by overlapping the complementary colors cyan, magenta, and yellow.

Page 7:

COLOR

Working in true-color (a representation of red–green–blue color values using 24 bits per pixel), we can specify a cube by viewing color as a specific point in three-dimensional space.

This cube is defined by a coordinate system whose axes correspond to the three primary colors, with the intensity of a color represented by the distance from the origin to any other location within the cube – the color vector.
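
To make the true-color cube concrete, the sketch below (an illustrative example, not the textbook's code; the RGB struct and function names are assumptions) packs the three 8-bit components into a single 24-bit value and treats a color as a vector from the cube's origin:

```cpp
#include <cmath>
#include <cstdint>

// A true-color value: one 8-bit component each for red, green and blue.
struct RGB { std::uint8_t r, g, b; };

// Pack the three components into one 24-bit value laid out as 0xRRGGBB.
std::uint32_t packRGB(const RGB& c)
{
    return (static_cast<std::uint32_t>(c.r) << 16) |
           (static_cast<std::uint32_t>(c.g) << 8)  |
            static_cast<std::uint32_t>(c.b);
}

// Treat the color as a point in the RGB cube and return the length of its
// color vector, i.e. its distance from the origin (black).
double colorVectorLength(const RGB& c)
{
    return std::sqrt(double(c.r) * c.r + double(c.g) * c.g + double(c.b) * c.b);
}
```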

Page 8:

COLOR

The hexcone model presents a hexagonal cone for the representation of the color space.

Using a hexagonal cone, also referred to as a hexcone, leads to a greater level of perceptual linearity.
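
The hexcone model is commonly realized as the HSV (hue–saturation–value) color model. A minimal conversion sketch from RGB, assuming components in [0, 1] (illustrative code, not from the textbook):

```cpp
#include <algorithm>
#include <cmath>

// Hue in degrees [0, 360); saturation and value in [0, 1].
struct HSV { double h, s, v; };

// Convert an RGB color (each component in [0, 1]) into the hexcone (HSV) model.
HSV rgbToHsv(double r, double g, double b)
{
    const double maxC  = std::max({ r, g, b });
    const double minC  = std::min({ r, g, b });
    const double delta = maxC - minC;

    HSV out;
    out.v = maxC;                               // value: the brightest component
    out.s = (maxC > 0.0) ? delta / maxC : 0.0;  // saturation: relative spread

    if (delta == 0.0)    out.h = 0.0;           // a grey: hue is undefined, use 0
    else if (maxC == r)  out.h = 60.0 * std::fmod((g - b) / delta, 6.0);
    else if (maxC == g)  out.h = 60.0 * ((b - r) / delta + 2.0);
    else                 out.h = 60.0 * ((r - g) / delta + 4.0);

    if (out.h < 0.0) out.h += 360.0;            // keep the hue angle positive
    return out;
}
```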

Page 9:

TEXTURING

Texture mapping is an easy way of adding realism to a computer-generated object.

The texture (be it a tileable photograph or a complex pattern) is mapped (fitted) to the computer-generated object, either stretched or tiled to encompass the entire object area.

Page 10:

TEXTURING

Textures consist of a number of fundamental subunits called texels or texture elements (which can be considered the pixels of a texture).

Arrays of texels make up a texture just as images are created using arrays of pixels.

Textures can be one-, two-, or three-dimensional in nature (sometimes even four-dimensional), with two-dimensional textures being the most common of all these.

A one-dimensional texture is simply an array of texture elements.

Page 11:

TEXTURING

Two-dimensional textures are represented using a two-dimensional array with each texture element addressable via x and y coordinates.

These textures are the type that we will be working with; even our depth and normal maps used during bump mapping are represented using these two-dimensional bitmap arrays.

Page 12:

TEXTURING

Volumetric textures, also called three-dimensional textures, are another interesting texture resource type.

These textures, represented as three-dimensional volumes, are useful for describing solid material blocks from which arbitrary objects can be shaped.

Page 13:

TEXTURING

Texture mapping is based on the manipulation of individual fragments during the graphics pipeline’s fragment processing stage.

The method used to perform the actual texture mapping at application level depends mainly on the level of quality required.

Page 14:

TEXTURING

The most common method maps a two-dimensional texture resource onto the surface of an object.

This texture mapping process starts out in two-dimensional texture space and moves to three-dimensional object space where the texture is mapped onto the object – a process known as surface parameterization.

A projection transformation is then used to move from object space to screen space.

Page 15:

TEXTURING

Textures are loaded into system memory as arrays with coordinates, called texture coordinates.

These coordinates allow us to address and access the individual texel elements making up the array.

Texture coordinates are generally scaled to range over the interval [0, 1].

Page 16:

TEXTURING

Two-dimensional textures can be described using the notation T(u, v) with u and v the texture coordinates uniquely defined for each vertex on a given surface.

The process of texture mapping is thus concerned with aligning each texel’s texture space coordinates with a vertex on the surface of an object.
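
As a small illustration of this per-vertex parameterization (illustrative types, not the book's code), each vertex simply stores its texture coordinates alongside its object-space position:

```cpp
// A textured vertex: an object-space position plus the (u, v) texture
// coordinates tying this vertex to a point in texture space.
struct TexturedVertex
{
    float x, y, z;   // object-space position
    float u, v;      // texture coordinates in [0, 1]
};

// A unit quad parameterized so that the texture T(u, v) is stretched exactly
// once across its surface.
const TexturedVertex quad[4] =
{
    { 0.0f, 0.0f, 0.0f,   0.0f, 0.0f },
    { 1.0f, 0.0f, 0.0f,   1.0f, 0.0f },
    { 1.0f, 1.0f, 0.0f,   1.0f, 1.0f },
    { 0.0f, 1.0f, 0.0f,   0.0f, 1.0f },
};
```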

Page 17:

Texture Filtering

Every pixel of an onscreen image contains an independently controlled color value obtained from the texture.

Texture filtering, also called ‘texture smoothing’, controls the way in which pixels are colored by blending the color values of adjacent texture elements.

Page 18:

Texture Filtering

Mipmapping
– A mipmap is a series of pre-filtered texture images of varying resolution.
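
Each level in the chain is typically half the width and height of the previous one, down to a 1 × 1 image, so the length of the chain follows directly from the base texture's size. A small sketch (illustrative, not from the textbook):

```cpp
#include <algorithm>
#include <cstdio>

// Number of images in a full mipmap chain for a width x height base texture:
// the base level plus one level per halving until a 1 x 1 image is reached.
int mipLevelCount(int width, int height)
{
    int levels = 1;
    while (width > 1 || height > 1)
    {
        width  = std::max(1, width  / 2);
        height = std::max(1, height / 2);
        ++levels;
    }
    return levels;
}

int main()
{
    // A 256 x 256 base texture yields 9 levels: 256, 128, 64, ..., 2, 1.
    std::printf("%d levels\n", mipLevelCount(256, 256));
    return 0;
}
```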

Page 19:

Texture Filtering

Nearest Point Interpolation
– The point matching the center of a texture element is rarely obtained when texture coordinates are mapped to a two-dimensional array of texels.

– Nearest point interpolation is used to approximate this point by using the color value of the texel closest to the sampled point.
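
A CPU-side sketch of nearest point interpolation (the Texture and Color types here are illustrative, not the book's classes; texture coordinates are assumed to lie in [0, 1]):

```cpp
#include <algorithm>
#include <vector>

struct Color { float r, g, b; };

struct Texture
{
    int width, height;
    std::vector<Color> texels;          // row-major array of texels
    Color at(int x, int y) const { return texels[y * width + x]; }
};

// Nearest point interpolation: map (u, v) onto the texel grid and return the
// color of the single texel closest to the sampled point.
Color sampleNearest(const Texture& tex, float u, float v)
{
    int x = static_cast<int>(u * tex.width);
    int y = static_cast<int>(v * tex.height);
    x = std::min(std::max(x, 0), tex.width  - 1);   // clamp to the texture edge
    y = std::min(std::max(y, 0), tex.height - 1);
    return tex.at(x, y);
}
```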

Page 20:

Texture Filtering

Bilinear Filtering
– Bilinear filtering builds on the concept of nearest point interpolation by sampling not just one but four texture elements when texture coordinates are mapped to a two-dimensional array of texels.
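
Continuing the CPU-side sketch from the previous slide (again with illustrative types, not the book's code), bilinear filtering blends the four surrounding texels according to how close the sampled point lies to each of them:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Color { float r, g, b; };

struct Texture
{
    int width, height;
    std::vector<Color> texels;          // row-major array of texels
    Color at(int x, int y) const
    {
        x = std::min(std::max(x, 0), width  - 1);   // clamp at the edges
        y = std::min(std::max(y, 0), height - 1);
        return texels[y * width + x];
    }
};

// Linear blend of two colors.
Color lerp(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Bilinear filtering: sample the four texels surrounding the point and weight
// each by the fractional distance of the point from their centres.
Color sampleBilinear(const Texture& tex, float u, float v)
{
    float x  = u * tex.width  - 0.5f;   // shift so texel centres sit on integers
    float y  = v * tex.height - 0.5f;
    int   x0 = static_cast<int>(std::floor(x));
    int   y0 = static_cast<int>(std::floor(y));
    float fx = x - x0;                  // fractional offsets towards the next texel
    float fy = y - y0;

    Color top    = lerp(tex.at(x0, y0),     tex.at(x0 + 1, y0),     fx);
    Color bottom = lerp(tex.at(x0, y0 + 1), tex.at(x0 + 1, y0 + 1), fx);
    return lerp(top, bottom, fy);
}
```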

Page 21:

Texture Filtering

Trilinear Filtering
– Bilinear filtering does not perform any interpolation between mipmaps, resulting in noticeable quality changes where the graphics system switches between mipmap levels.

– Trilinear filtering solves this quality issue by extending the previous technique: a bilinear filtering operation is performed on each of the two bordering mipmap images (one for the higher-resolution texture, the other for the lower-resolution one) and the two results are then interpolated.
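
The extra step can be sketched by reusing the Color, Texture, lerp, and sampleBilinear definitions from the bilinear sketch above (illustrative code; the fractional mipmap level would normally be chosen by the renderer from the screen-space size of the texels):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Assumes the Color, Texture, lerp and sampleBilinear definitions from the
// bilinear filtering sketch above. mipChain[0] is the full-resolution image;
// each further entry is the next, lower-resolution mipmap level.
Color sampleTrilinear(const std::vector<Texture>& mipChain,
                      float u, float v, float level)
{
    const int last = static_cast<int>(mipChain.size()) - 1;
    const int lo   = std::min(std::max(static_cast<int>(std::floor(level)), 0), last);
    const int hi   = std::min(lo + 1, last);
    const float t  = std::min(std::max(level - static_cast<float>(lo), 0.0f), 1.0f);

    Color fine   = sampleBilinear(mipChain[lo], u, v);  // higher-resolution level
    Color coarse = sampleBilinear(mipChain[hi], u, v);  // lower-resolution level
    return lerp(fine, coarse, t);                       // blend away the visible switch
}
```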

Page 22:

Texture Filtering

Anisotropic Filtering
– Anisotropy is a distortion visible in the texels of a textured object when the object is oriented at an oblique angle to the point of view.

Page 23:

Texture Filtering

Anisotropic Filtering

– Anisotropic texture filtering deals with this blurriness by sampling texture elements using a quadrilateral modified according to the viewing angle.

– A single pixel could encompass more texel elements in one direction, such as along the x-axis, than in another, for instance along the z-axis.

– By using a modifiable quadrilateral for the sampling of texels, we are able to maintain proper perspective and precision when mapping a texture to an object.
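
In practice, anisotropic filtering is usually requested from the graphics API rather than implemented by hand. A minimal OpenGL sketch, assuming the widely supported EXT_texture_filter_anisotropic extension is present and a mipmapped 2D texture is currently bound (the constants come from glext.h):

```cpp
#include <GL/gl.h>
#include <GL/glext.h>   // GL_TEXTURE_MAX_ANISOTROPY_EXT and friends

// Ask the driver for the strongest anisotropic filtering it supports and
// apply it to the currently bound 2D texture (on top of trilinear filtering).
void enableAnisotropicFiltering()
{
    GLfloat maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
}
```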

Page 24:

Basic Texture Mapping Implementation

[see the textbook and source code examples, “TextureMapping(Direct3D)” and “TextureMapping(OpenGL)”, on the book’s website for detailed examples].
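
Those samples remain the authoritative reference; the fragment below is only a minimal fixed-function OpenGL sketch of the same idea (pixel data, sizing, and matrix setup are assumed to happen elsewhere): create a texture object, upload the image, and draw a quad whose corners carry texture coordinates.

```cpp
#include <GL/gl.h>

// Upload a raw RGB image as a 2D texture and draw one textured quad with it.
void drawTexturedQuad(const unsigned char* pixels, int width, int height)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);

    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);   // each corner pairs a texture coordinate with a vertex
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
    glEnd();
}
```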

Page 25:

Bump Mapping

There are a number of techniques combining lighting calculations with texture surface normal perturbations to create more realistic-looking object surfaces.

Bump mapping is one such technique. It can be described as a form of texture mapping incorporating light reflection to simulate real-world surfaces, where the unevenness of a surface influences the reflection of light.

Bump mapping combines per-pixel lighting calculations with the normals calculated at each pixel of a surface.

Page 26:

Bump Mapping

Page 27:

Bump Mapping

Page 28:

Bump Mapping

Page 29:

Implementing Bump Mapping

We can summarize the process of bump mapping as follows (a code sketch of these steps is given after the list):

1. Determine the inverse TBN matrix. This is required because the TBN matrix translates coordinates from texture space to object space and we need to convert the light vector from object space to texture space.
2. Calculate the light vector.
3. Transform the light vector from object space to texture space by multiplying it by the inverse TBN matrix from step 1.
4. Read the normal vector at the specific pixel.
5. Calculate the dot product between the light vector and the normal vector.
6. Multiply the result from step 5 with the color of the light and that of the surface material (this is the final diffuse light color).
7. Repeat the previous six steps for each and every pixel of the textured surface.
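
A CPU-side sketch of steps 1 to 6 for a single pixel (illustrative names throughout, not the book's shader code; the TBN basis is assumed to be orthonormal, so the inverse transform reduces to dot products with its axes, and all vectors are assumed to be normalized):

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Tangent, bitangent and normal axes of the surface at this pixel.
struct TBN { Vec3 t, b, n; };

// Steps 1 and 3: for an orthonormal basis, multiplying by the inverse TBN
// matrix amounts to projecting the vector onto the three axes.
Vec3 toTextureSpace(const TBN& tbn, const Vec3& v)
{
    return { dot(tbn.t, v), dot(tbn.b, v), dot(tbn.n, v) };
}

// Steps 2-6: diffuse color for one pixel. 'lightDirObject' is the (normalized)
// light vector in object space; 'normalMapSample' is the normal read from the
// normal map at this pixel, already remapped to the [-1, 1] range.
Vec3 bumpDiffuse(const TBN& tbn, const Vec3& lightDirObject,
                 const Vec3& normalMapSample,
                 const Vec3& lightColor, const Vec3& materialColor)
{
    Vec3  lightTex = toTextureSpace(tbn, lightDirObject);            // object -> texture space
    float nDotL    = std::max(0.0f, dot(normalMapSample, lightTex)); // step 5
    return { lightColor.x * materialColor.x * nDotL,                 // step 6
             lightColor.y * materialColor.y * nDotL,
             lightColor.z * materialColor.z * nDotL };
}
```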

Page 30:

Implementing Bump Mapping

[see the textbook for a detailed example and discussion].

Page 31:

Cube Mapping

Cube mapping, also called environmental mapping or sometimes reflection mapping, allows us to simulate complex reflections by mapping real-time computed texture images to the surface of an object.

Each texture image used for environmental mapping stores a ‘snapshot’ image of the environment surrounding the mapped object.

These snapshot images are then mapped to a geometric object to simulate the object reflecting its surrounding environment.

An environment map can be considered an omnidirectional image.

Page 32:

Cube Mapping

Page 33:

Cube Mapping

Cube mapping is a type of texturing where six environmental maps are arranged as if they were faces of a cube.

Images are combined in this manner so that an environment can be reflected in an omnidirectional fashion.
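
A minimal OpenGL sketch of building such a cube map (an illustrative helper, not the book's example; it assumes a context where cube-map texturing is available, e.g. OpenGL 1.3 or later with the constants provided by glext.h, and six square face images already loaded):

```cpp
#include <GL/gl.h>
#include <GL/glext.h>   // cube-map constants on headers that only ship OpenGL 1.1

// Build a cube map from six RGB face images of identical size, ordered
// +X, -X, +Y, -Y, +Z, -Z, and return the texture object.
GLuint createCubeMap(const unsigned char* faces[6], int size)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_CUBE_MAP, tex);

    for (int i = 0; i < 6; ++i)
        glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, GL_RGB,
                     size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, faces[i]);

    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}
```

In the fixed-function pipeline, the reflection itself can then be obtained by enabling texture-coordinate generation for the S, T, and R coordinates with the GL_REFLECTION_MAP mode, which derives the cube-map lookup direction from the reflected view vector.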

Page 34:

Implementing Cube Mapping

[see the textbook for a detailed example and discussion].