Triplanar Texturing

The material, and with it the look of the so-far white and plain triangles, is created with triplanar texturing. First, this article shows the core blending of three textures; then it extends this with normal mapping for a relief-like look with more structure. The code shown here consists of Cg shaders, which are platform independent.

Triplanar Texturing

The main problem is mapping the vertex coordinates to UV coordinates on a texture. On classic terrain, the problem is often solved with planar mapping. This means the texture is projected onto the terrain from above like a blanket and stretched until it fits. It's a one-liner in the fragment shader:

float2 coord = position.xz * scale;
float4 col1 = tex2D(texPlanarSampler, coord);

scale is a factor that determines how often the texture is repeated across the terrain.

But for volume terrain, the stretching is no longer acceptable, as figure 1 shows.

Figure 1: Planar texture mapping

This is where triplanar texture mapping comes into play. Basically, it maps three different textures: from above, from the front, and from the side. How much each texture contributes to the final color is controlled by the normal. If the normal at a position points mostly upwards, the texture from above contributes the most. In the areas in between, the three textures are smoothly blended together.

The basics are presented by Geiss [Gei07] and are extended with a few more control parameters by Lengyel [Len10, p. 48].

First, the blend factors are calculated from the normal:

b_x = \left( \max\left\lbrace \frac{\left|\vec{N}_x\right|}{\left\| \vec{N} \right\|} - \delta, 0 \right\rbrace \right)^m
b_y = \left( \max\left\lbrace \frac{\left|\vec{N}_y\right|}{\left\| \vec{N} \right\|} - \delta, 0 \right\rbrace \right)^m
b_z = \left( \max\left\lbrace \frac{\left|\vec{N}_z\right|}{\left\| \vec{N} \right\|} - \delta, 0 \right\rbrace \right)^m

\vec{N} is the current normal, m controls the transition speed between the three textures, and \delta is a plateau below which small components of the normal have no influence. Next, these factors are normalized:

b'_x = \frac{b_x}{b_x + b_y + b_z}
b'_y = \frac{b_y}{b_x + b_y + b_z}
b'_z = \frac{b_z}{b_x + b_y + b_z}
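These formulas can be checked on the CPU. The following Python sketch is only a reference for the math above; the values m = 8 and delta = 0.2 are illustrative, since the article treats both as tuning parameters:

```python
# CPU-side reference for the blend factors b and their normalization to b'.
# m (transition speed) and delta (plateau size) are illustrative values.
def blend_weights(n, m=8.0, delta=0.2):
    """Normalized triplanar blend factors b' for a normal n = (nx, ny, nz)."""
    length = sum(c * c for c in n) ** 0.5
    b = [max(abs(c) / length - delta, 0.0) ** m for c in n]
    s = b[0] + b[1] + b[2]
    return [w / s for w in b]

# A normal pointing mostly upwards gives all weight to the top projection:
w = blend_weights((0.1, 0.99, 0.05))  # -> [0.0, 1.0, 0.0]
```

The plateau \delta is what makes the first result so clean: the small x and z components fall below it and are clamped to zero before normalization.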

With this, the final color is blended together from the three texture colors:

c = b'_x \cdot tex_{yz} + b'_y \cdot tex_{xz} + b'_z \cdot tex_{xy}

tex_{yz}, tex_{xz} and tex_{xy} are fetched from the three planar texture projections:

texCoord_{yz} = \begin{pmatrix} p_y \\ p_z \end{pmatrix} \cdot scale
texCoord_{xz} = \begin{pmatrix} p_x \\ p_z \end{pmatrix} \cdot scale
texCoord_{xy} = \begin{pmatrix} p_x \\ p_y \end{pmatrix} \cdot scale
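As a quick CPU-side sketch (Python, with an assumed scale of 0.5 and ignoring the offset used in the shader code below), each projection simply drops one coordinate of the position:

```python
def triplanar_coords(p, scale=0.5):
    """The three projected UV pairs (yz, xz, xy) for a position p = (x, y, z)."""
    x, y, z = p
    return ((y * scale, z * scale),
            (x * scale, z * scale),
            (x * scale, y * scale))

uv_yz, uv_xz, uv_xy = triplanar_coords((2.0, 4.0, 6.0))
# -> (2.0, 3.0), (1.0, 3.0), (1.0, 2.0)
```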

Or expressed in Cg:

// Blend weights from the absolute components of the unit normal
float3 blendWeights = abs(unitNormal);
blendWeights = blendWeights - plateauSize;
blendWeights = pow(max(blendWeights, 0), transitionSpeed);
// Normalize so the three weights sum to one
blendWeights /= (blendWeights.x + blendWeights.y + blendWeights.z);
// The three planar projections
float2 coord1 = (position.yz + nLength) * texScale;
float2 coord2 = (position.zx + nLength) * texScale;
float2 coord3 = (position.xy + nLength) * texScale;
float4 col1 = tex2D(texFromX, coord1);
float4 col2 = tex2D(texFromY, coord2);
float4 col3 = tex2D(texFromZ, coord3);
// Blend the three fetches; the alpha channel carries the specular factor
float4 textColour = col1 * blendWeights.x + col2 * blendWeights.y + col3 * blendWeights.z;

Specular Mapping

Specular mapping extends regular specular lighting with the fact that specular highlights do not appear everywhere on an object. The metal button on a vest has specular highlights, the cloth does not.

The information about where highlights should appear can be stored in a single color channel of the texture, as a factor in the regular specular lighting calculation. Here, this factor is stored in the alpha channel and is blended along with the color.
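A minimal sketch of the idea, assuming a Phong-style specular term (the exponent 32 and the sample values are illustrative):

```python
def specular_intensity(n_dot_h, spec_factor, shininess=32.0):
    """Regular specular term, scaled by the factor from the blended alpha channel."""
    return spec_factor * max(n_dot_h, 0.0) ** shininess

metal = specular_intensity(0.95, 1.0)   # alpha near 1: full highlight
cloth = specular_intensity(0.95, 0.05)  # alpha near 0: highlight suppressed
```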

Figure 2 shows the result so far.

Figure 2: Triplanar texturing

Normal Mapping

This already looks nice, but some simulated relief via normal mapping will improve the look further. With normal mapping, the normals of the fragments are modified, which influences the lighting; this creates the appearance of a relief [NFHS11, pp. 339–341].

First, the color fetched from the normal map texture needs to be converted from the range 0-1 to a proper normal:

\vec{n}_{NM} = (\vec{n}_T - 0.5) \cdot 2

And again, the three normal map vectors are blended together by the triplanar factors:

\vec{n}'_T = b'_x \cdot \vec{n}_{T_{yz}} + b'_y \cdot \vec{n}_{T_{xz}} + b'_z \cdot \vec{n}_{T_{xy}}

And now some theory: the normals fetched from the normal map live in a space different from the one of the light sources and objects, so one space must be transformed into the other. Here, it is the view direction, defined as the normalized difference between the camera and the fragment, that is transformed. The fragment has a tangent \vec{t}, a bitangent \vec{b}, and its normal \vec{n}. Together, they define its orientation matrix. With the transposed matrix, the view direction is brought into the texture space of the normal map normals [NFHS11, pp. 345–347].

\vec{v}' = \begin{pmatrix} t_x & b_x & n_x \\ t_y & b_y & n_y \\ t_z & b_z & n_z \end{pmatrix}^T \cdot \vec{v}

Bitangent and tangent can be calculated. First the bitangent, using the fixed guess \vec{t}' = (1, 0, 0)^T:

\vec{b} = \frac{\vec{t'} \times \vec{n}}{|\vec{t'} \times \vec{n}|}

And the final tangent:

\vec{t} = \frac{\vec{b} \times \vec{n}}{|\vec{b} \times \vec{n}|}
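The construction can be checked numerically. The Python sketch below mirrors the two cross products from the equations above and verifies that the resulting frame is orthonormal; small vector helpers stand in for the shader intrinsics:

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = sum(c * c for c in v) ** 0.5
    return tuple(c / l for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def tangent_frame(n):
    """Bitangent and tangent from the normal and the fixed guess t' = (1, 0, 0)."""
    t_guess = (1.0, 0.0, 0.0)
    b = normalize(cross(t_guess, n))  # b = (t' x n) / |t' x n|
    t = normalize(cross(b, n))        # t = (b x n) / |b x n|
    return t, b

n = normalize((0.2, 0.9, 0.3))
t, b = tangent_frame(n)
# t, b and n are now pairwise orthogonal unit vectors
```

Note that the guess \vec{t}' = (1, 0, 0)^T degenerates when the normal itself points along the x-axis, since the first cross product then vanishes.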

Putting it into Cg again, where normal2 is used for all further lighting calculations:

// Map a color from the range [0, 1] back to a normal in [-1, 1]
float3 expand(float3 v)
{
	return (v - 0.5) * 2;
}

// Build the TBN matrix from the fixed tangent guess (1, 0, 0)
float3 tangent = float3(1, 0, 0);
float3 binormal = normalize(cross(tangent, unitNormal));
tangent = normalize(cross(unitNormal, binormal));
float3x3 TBN = float3x3(tangent, binormal, unitNormal);
// Transform the view direction into texture space
float3 eyeDir2 = normalize(mul(TBN, eyeDir));
// Fetch and expand the three normal map samples
float3 bumpFetch1 = expand(tex2D(texFromXNormal, coord1).rgb);
float3 bumpFetch2 = expand(tex2D(texFromYNormal, coord2).rgb);
float3 bumpFetch3 = expand(tex2D(texFromZNormal, coord3).rgb);
// Blend them with the triplanar weights
float3 normal2 = bumpFetch1 * blendWeights.x + bumpFetch2 * blendWeights.y + bumpFetch3 * blendWeights.z;

And this is the final result:

Figure 3: Triplanar texturing with normal mapping
