Thursday, June 23, 2011

GLSL - Bump mapping (Normal Mapping)

http://www.opengl.org/sdk/docs/tutorials/TyphoonLabs/Chapter_3.pdf
http://www.opengl.org/sdk/docs/tutorials/TyphoonLabs/Chapter_4.pdf (pretty good explanation)


http://www.ozone3d.net/tutorials/bump_mapping.php

http://www.swiftless.com/tutorials/glsl/8_bump_mapping.html

- We can apply standard lighting techniques that require a normal value on a per-pixel basis instead of a per-vertex or per-surface basis. This gives our applications a much greater sense of realism.

- Regular bump mapping only cares about the normal map's relation to the lights. Tangent-space bump mapping takes it a step further and also takes the object's surfaces into account in relation to the light, not just the normal map.

__________________________________________________________________________
- Bump mapping is one of the per-pixel lighting techniques, which means that all the lighting calculations (the application of the lighting equations) are performed for each pixel. The power of current graphics processors (GPUs) makes it possible to reach this precision while preserving acceptable frame rates.

- The performance difference between plain texturing and bump mapping comes down to two main reasons. First, bump mapping is basically a multitexturing technique: in our case, two textures are used for the bump effect, and accessing several textures is more expensive than dealing with a single one. Second, all the lighting calculations are done for each pixel.
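A minimal fragment-shader sketch of that double texture access (the sampler names colorMap and normalMap are assumptions, and the final combination is only a placeholder):

uniform sampler2D colorMap;   // base texture (texture unit 0)
uniform sampler2D normalMap;  // normal map (texture unit 1)

void main()
{
    // two texture fetches per fragment - the first cost factor mentioned above
    vec4 base = texture2D(colorMap,  gl_TexCoord[0].st);
    vec3 bump = texture2D(normalMap, gl_TexCoord[0].st).rgb;
    gl_FragColor = base * vec4(bump, 1.0); // placeholder; the real lighting is shown further down
}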

- The main goal of bump mapping is to simulate relief on flat geometry. That makes it possible to render, at a lower CPU cost, objects that have a very detailed appearance. The technique itself is very simple: the normal vector is perturbed at the level of the pixel being processed.

- More precisely, bump mapping gives more life and sparkle to this poor N (the per-vertex normal) by fetching, for each pixel, a normal vector from a normal map.
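The fetch itself is just a texture lookup plus an unpack from the [0,1] storage range back to [-1,1]; a sketch, assuming a sampler named normalMap:

uniform sampler2D normalMap;

void main()
{
    // the RGB channels store a unit vector remapped into [0,1]; unpack it to [-1,1]
    vec3 N = normalize(texture2D(normalMap, gl_TexCoord[0].st).rgb * 2.0 - 1.0);
    gl_FragColor = vec4(N * 0.5 + 0.5, 1.0); // visualize the decoded normal for now
}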

- Now let us look at another point, one that matters more for implementing bump mapping than for understanding it: the vertex space, better known as the tangent space. This space is a frame of reference attached to each vertex, in which the position of the vertex is {0.0, 0.0, 0.0} and the coordinates of the vertex normal are {0.0, 0.0, 1.0}.

- The three vectors forming this orthonormal frame of reference are named tangent, binormal and normal with:
tangent vector = {1.0, 0.0, 0.0} or X axis
binormal vector = {0.0, 1.0, 0.0} or Y axis
normal vector = {0.0, 0.0, 1.0} or Z axis.

- Most bump-map creation tools generate normal vectors in tangent space (the NVIDIA plugins for Photoshop or the tool provided by ATI are good examples).

- The problem is that the normal vector is expressed in tangent space, whereas the other vectors used in the calculations (the light and view vectors) are expressed in another reference space (camera space). It is thus necessary to express all these vectors in a single reference space so that the calculations (mainly dot products) are meaningful. That reference space is the tangent space. The following matrix product shows how to bring the light vector L, expressed in camera space, into tangent space:

|x|   |Tx Ty Tz|   |Lx|
|y| = |Bx By Bz| x |Ly|
|z|   |Nx Ny Nz|   |Lz|
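In a GLSL vertex shader this product is usually written as three dot products of L against the T, B and N axes. A minimal sketch, assuming per-vertex tangent and binormal attributes and a lightPosition uniform given in camera (eye) space:

attribute vec3 tangent;      // assumed per-vertex attribute
attribute vec3 binormal;     // assumed per-vertex attribute
uniform vec3 lightPosition;  // assumed: light position in camera space
varying vec3 lightVec;       // light vector expressed in tangent space

void main()
{
    // bring T, B, N into camera space
    vec3 t = normalize(gl_NormalMatrix * tangent);
    vec3 b = normalize(gl_NormalMatrix * binormal);
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);

    // light vector in camera space
    vec3 vertexPos = vec3(gl_ModelViewMatrix * gl_Vertex);
    vec3 L = lightPosition - vertexPos;

    // the matrix product above, row by row
    lightVec = vec3(dot(L, t), dot(L, b), dot(L, n));

    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = ftransform();
}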
 
gl_NormalMatrix: a 3x3 matrix representing the inverse transpose of the model-view matrix. This matrix is used for normal transformation.

- It is called “Tangent Space,” and it is defined for each face of the mesh. We need this space because it allows us to keep the normals unchanged. For example, if we stored the normals in object space, then when we rotate the model we would have to rotate the normals too, to maintain coherency. However, with the normals relative to each face, this is not needed: those normals are independent of the model's orientation and position.

- In order to build this Tangent Space, we need to define an orthonormal per-vertex basis, which will define our tangent space. To build this basis we need three vectors. We already have one of them: the vertex normal. The second vector has to be tangent to the surface at the vertex (that's why this space is called Tangent Space); this is the Tangent vector. The last one can be obtained with the cross product of the normal and the tangent vector, and is called the Binormal vector. The computation of the tangent vector is quite involved and won't be explained here, but Shader Designer provides both Tangent and Binormal attributes automatically, as well as normals and texture coordinates.
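A minimal sketch of that cross-product step in a vertex shader, assuming a per-vertex attribute named tangent (Shader Designer can also supply the binormal directly):

attribute vec3 tangent;  // assumed per-vertex attribute
varying vec3 binormal;   // third basis vector of the tangent space

void main()
{
    // the binormal is simply the cross product of the normal and the tangent
    binormal = normalize(cross(gl_Normal, tangent));
    gl_Position = ftransform();
}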

- The only difference between this shader and the per-pixel lighting shader is the way we compute the light vector and the normal. In the per-pixel shader, the light vector was in object space and the normal was interpolated; here the light vector is in tangent space and the normal is extracted from a texture. Everything else in the computations remains the same.
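Putting the pieces together, a minimal fragment-shader sketch of the diffuse term, using the tangent-space lightVec from the vertex shader above and the normal fetched from the map (the sampler and uniform names are assumptions):

uniform sampler2D normalMap;  // assumed sampler name
uniform vec4 diffuseColor;    // assumed combined material/light diffuse term
varying vec3 lightVec;        // tangent-space light vector from the vertex shader

void main()
{
    // both vectors are now in tangent space, so the dot product is meaningful
    vec3 N = normalize(texture2D(normalMap, gl_TexCoord[0].st).rgb * 2.0 - 1.0);
    vec3 L = normalize(lightVec);
    float NdotL = max(dot(N, L), 0.0);
    gl_FragColor = diffuseColor * NdotL;
}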
