DirectX Documentation

Computer Graphics - Texture Mapping

Texture Mapping in DirectX

Texture mapping is a fundamental technique in computer graphics used to add detail, color, and surface properties to 3D models. It involves applying a 2D image (the texture) onto a 3D surface (the geometry). DirectX provides robust support for efficient and versatile texture mapping.

What is Texture Mapping?

At its core, texture mapping works by assigning UV coordinates to each vertex of a 3D model. These UV coordinates act as a mapping from the 3D surface to the 2D texture image. When the model is rendered, the graphics pipeline interpolates these UV coordinates across the surface, allowing it to sample (read) pixel data from the texture and apply it to the corresponding fragment of the 3D object.

Texture Coordinates (UVs)

UV coordinates are typically represented as a pair of floating-point values, ranging from 0.0 to 1.0, where:

  • U corresponds to the horizontal axis of the texture.
  • V corresponds to the vertical axis of the texture.

A vertex with UV coordinates (0.0, 0.0) maps to the top-left corner of the texture, while (1.0, 1.0) maps to the bottom-right corner (the Direct3D texture-space convention). Other values allow for mapping specific parts of the texture or repeating it across the surface.

Texture Filtering

When a texture is sampled, the UV coordinates often do not fall precisely on a texel (texture pixel). Texture filtering algorithms are used to determine which texel(s) to read and how to combine them to produce the final color for the fragment. Common filtering methods include:

  • Point Sampling (Nearest Neighbor): The color of the nearest texel is used. Fastest but can result in aliasing and blocky appearances.
  • Bilinear Filtering: A weighted average of the four nearest texels is calculated. Smoother than point sampling, reducing aliasing.
  • Trilinear Filtering: Extends bilinear filtering by also interpolating between the bilinearly filtered results of the two nearest mipmap levels. Provides smooth transitions across distances but is more computationally expensive.
  • Anisotropic Filtering: The most advanced technique, it considers the viewing angle and aspect ratio of the surface to sample textures more accurately, especially at glancing angles.

Mipmapping

Mipmaps are pre-calculated, downscaled versions of a texture. They are used to improve rendering performance and visual quality by selecting the appropriate mipmap level based on the distance of the object from the camera. Smaller, lower-resolution mipmaps are used for distant objects, reducing aliasing and memory bandwidth requirements.

Texture Address Modes

Texture address modes define how the UV coordinates are clamped or repeated when they fall outside the [0.0, 1.0] range. Common modes include:

  • Clamp: Coordinates are clamped to the nearest edge.
  • Wrap (Tile): Coordinates wrap around, tiling the texture.
  • Mirror: Coordinates wrap around, mirroring the texture on each repeat.
  • Border: A specific border color is used for coordinates outside the range.

Implementation in DirectX

In DirectX, a texture resource is represented by ID3D11Texture2D (DirectX 11) or IDirect3DTexture9 (DirectX 9). In Direct3D 11 you expose the texture to a shader stage (such as the pixel shader) through an ID3D11ShaderResourceView, and use sampler states to control filtering and addressing modes.

Example: Creating and Binding a Texture (DirectX 11)

This is a simplified C++ example showing the conceptual steps. It uses the CreateDDSTextureFromFile helper from the DirectXTK DDSTextureLoader module.


HRESULT hr;

// Load texture from file (e.g., DDS, WIC-compatible image).
// CreateDDSTextureFromFile takes an ID3D11Resource** for the texture output.
Microsoft::WRL::ComPtr<ID3D11Resource> pTextureBuffer;
hr = DirectX::CreateDDSTextureFromFile(
    pDevice.Get(),                 // The Direct3D device
    L"path/to/your/texture.dds",   // Path to the texture file
    pTextureBuffer.GetAddressOf(), // Receives the texture resource
    nullptr                        // Optional: receives a default shader resource view
);

if (FAILED(hr)) {
    // Handle error
    return;
}

// Create Shader Resource View
Microsoft::WRL::ComPtr<ID3D11ShaderResourceView> pTextureRV;
D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format = DXGI_FORMAT_BC3_UNORM; // Or the format of your texture
srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
srvDesc.Texture2D.MostDetailedMip = 0;
srvDesc.Texture2D.MipLevels = 1; // Or (UINT)-1 to use all available mip levels

hr = pDevice->CreateShaderResourceView(pTextureBuffer.Get(), &srvDesc, pTextureRV.GetAddressOf());

if (FAILED(hr)) {
    // Handle error
    return;
}

// Create Sampler State
Microsoft::WRL::ComPtr<ID3D11SamplerState> pSamplerState;
D3D11_SAMPLER_DESC samplerDesc = {};
samplerDesc.Filter = D3D11_FILTER_ANISOTROPIC; // Or D3D11_FILTER_MIN_MAG_MIP_LINEAR
samplerDesc.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
samplerDesc.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
samplerDesc.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;
samplerDesc.MipLODBias = 0.0f;
samplerDesc.MaxAnisotropy = 16; // For anisotropic filtering
samplerDesc.ComparisonFunc = D3D11_COMPARISON_NEVER;
samplerDesc.BorderColor[0] = samplerDesc.BorderColor[1] = samplerDesc.BorderColor[2] = samplerDesc.BorderColor[3] = 0.0f;
samplerDesc.MinLOD = 0.0f;
samplerDesc.MaxLOD = D3D11_FLOAT32_MAX;

hr = pDevice->CreateSamplerState(&samplerDesc, pSamplerState.GetAddressOf());

if (FAILED(hr)) {
    // Handle error
    return;
}

// In your rendering loop:
// Bind the shader resource view and sampler state to the pixel shader stage
pImmediateContext->PSSetShaderResources(0, 1, pTextureRV.GetAddressOf());
pImmediateContext->PSSetSamplers(0, 1, pSamplerState.GetAddressOf());

Shader Code (Pixel Shader Example)

Inside your HLSL pixel shader, you would sample the texture like this:


// In HLSL Pixel Shader
Texture2D g_Texture : register(t0);    // texture bound to slot 0
SamplerState g_Sampler : register(s0); // sampler bound to slot 0

struct PixelInput {
    float4 pos : SV_POSITION;
    float2 tex : TEXCOORD; // Texture coordinate from vertex shader
};

float4 PSMain(PixelInput input) : SV_TARGET {
    float4 texColor = g_Texture.Sample(g_Sampler, input.tex);
    return texColor;
}

Conclusion

Mastering texture mapping is crucial for creating visually rich and believable 3D environments in DirectX. By understanding UV coordinates, filtering, mipmapping, and address modes, developers can significantly enhance the aesthetic quality of their applications.

For more in-depth information, refer to the official DirectX 11 Texture Documentation.