The texture coordinates you specify for each vertex that receives environment mapping should address the texture as a function of the reflective distortion created by the curvature of the surface. Applications must compute these texture coordinates for each vertex to achieve the desired effect. One simple and effective way to generate texture coordinates uses the vertex normal as input. Although several methods exist, the following formulas are common among applications that perform environment mapping with sphere maps.

u = Nx / 2 + 0.5
v = Ny / 2 + 0.5
In the preceding formulas, u and v are the texture coordinates being computed, and Nx and Ny are the x and y components of the camera-space vertex normal. The formulas are simple but effective. If the normal has a positive x component, it points to the right, and the u coordinate is adjusted to address the texture appropriately. Likewise for the v coordinate: a positive y component indicates that the normal points up. The opposite is true for negative values in each component.
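By way of illustration, the formulas might be applied with a small helper like the one below. The Vector3 structure and the function name are illustrative rather than part of the DirectX API, and the normal is assumed to already be transformed into camera space.

struct Vector3 { float x, y, z; };

// Computes sphere-map texture coordinates from a camera-space
// vertex normal. Scaling by 1/2 and biasing by +0.5 maps the
// normal's [-1, 1] component range into the [0, 1] texture range,
// so a normal of (0, 0, z) addresses the center of the sphere map.
void ComputeSphereMapUV(const Vector3& n, float& u, float& v)
{
    u = n.x / 2.0f + 0.5f;
    v = n.y / 2.0f + 0.5f;
}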
If the normal points directly at the camera, the resulting coordinates should receive no distortion. The +0.5 bias to both coordinates places the point of zero distortion at the center of the sphere map, and a vertex normal of (0, 0, z) addresses this point. Note that the formulas don't take the z component of the normal into account, but applications that use them can optimize their computations by skipping vertices whose normal has a positive z component. This works because, in camera space, a normal that points away from the camera (positive z) belongs to a vertex that is culled when the object is rendered.
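The following sketch shows that optimization in a per-vertex loop. The Vertex layout here is a hypothetical one with a camera-space normal and one set of texture coordinates; it is not a required DirectX vertex format.

#include <windows.h>  // for DWORD

struct Vertex
{
    float x, y, z;     // position
    float nx, ny, nz;  // camera-space normal
    float tu, tv;      // sphere-map texture coordinates
};

void GenerateSphereMapCoords(Vertex* verts, DWORD count)
{
    for (DWORD i = 0; i < count; ++i)
    {
        // Skip vertices whose normal points away from the camera;
        // in camera space, a positive z component means the vertex
        // is culled when the object is rendered.
        if (verts[i].nz > 0.0f)
            continue;

        verts[i].tu = verts[i].nx / 2.0f + 0.5f;
        verts[i].tv = verts[i].ny / 2.0f + 0.5f;
    }
}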