Pixel Shading
Introduction to Pixel Shaders
Pixel shaders, also known as fragment shaders in OpenGL, are programs that run on the GPU to determine the final color of each pixel (or fragment) on the screen. They are a crucial component of modern graphics rendering, enabling complex visual effects, realistic lighting, and intricate material properties.
Unlike vertex shaders, which operate on individual vertices and transform them in 3D space, pixel shaders process each pixel fragment generated by the rasterization stage. They receive interpolated data from the vertex shader (such as texture coordinates, normals, and colors) and use this information, along with textures and uniform variables, to calculate the final RGBA (Red, Green, Blue, Alpha) color of the pixel.
Core Concepts
Interpolation
The values computed by the vertex shader for each vertex (e.g., texture coordinates, vertex colors) are interpolated across the surface of the primitive (triangle, line) during rasterization. The pixel shader receives these interpolated values for each pixel it processes. By default this interpolation is perspective-correct; plain screen-space linear interpolation of attributes such as texture coordinates would cause visible warping on surfaces viewed at an angle.
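In HLSL, the interpolation mode can be selected per attribute with modifiers on the vertex-shader output structure. A minimal sketch (struct and semantic names are illustrative):

struct VS_OUTPUT
{
    float4 Position : SV_POSITION;
    linear          float2 Tex    : TEXCOORD0; // perspective-correct interpolation (the default)
    noperspective   float4 Color  : COLOR0;    // screen-space linear; warps under perspective
    nointerpolation uint   FaceId : TEXCOORD1; // not interpolated; taken from the provoking vertex
};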
Textures
Textures are images that can be sampled by pixel shaders to provide surface detail, color variations, or other surface properties. Common texture lookups include diffuse maps, normal maps, specular maps, and emissive maps. The shader uses the interpolated texture coordinates to determine which texel (texture pixel) to sample.
[Figure: diagram illustrating texture sampling within a pixel shader.]
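A minimal sketch of such a lookup in HLSL (resource names and register slots are illustrative):

Texture2D    txDiffuse : register(t0);
SamplerState ssLinear  : register(s0);

float4 SampleDiffuse(float2 uv)
{
    // uv are the interpolated texture coordinates from the vertex shader;
    // the sampler state controls filtering and addressing (wrap, clamp, ...)
    return txDiffuse.Sample(ssLinear, uv);
}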
Lighting Models
Pixel shaders are essential for implementing realistic lighting. They often incorporate lighting models like Phong, Blinn-Phong, or more advanced physically based rendering (PBR) techniques. These models use factors such as the surface normal, light direction, viewer direction, material properties (diffuse, specular, ambient), and light color to calculate the illumination of the pixel.
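As an illustration, a Blinn-Phong evaluation in HLSL might look like the following sketch (all vectors are assumed to be normalized and in the same space; parameter names are illustrative):

// N: surface normal, L: direction toward the light, V: direction toward the viewer
float3 BlinnPhong(float3 N, float3 L, float3 V,
                  float3 diffuseColor, float3 specularColor, float shininess)
{
    float3 H    = normalize(L + V);                      // half vector between light and view
    float  diff = max(0.0f, dot(N, L));                  // Lambertian diffuse term
    float  spec = pow(max(0.0f, dot(N, H)), shininess);  // specular highlight
    return diffuseColor * diff + specularColor * spec;
}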
Uniforms
Uniform variables are constant values passed from the CPU to the GPU that are the same for all pixels processed by a shader program within a single draw call. These can include light positions, camera matrices, material parameters, and time-varying values.
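In HLSL, uniforms are grouped into constant buffers. A minimal sketch (buffer name, members, and register slot are illustrative):

cbuffer cbScene : register(b0)
{
    float4x4 ViewProjection; // camera matrix, identical for every pixel in the draw call
    float3   LightPosition;  // world-space light position
    float    Time;           // time-varying value, e.g. for animated effects
};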
Shader Languages and APIs
Each modern graphics API has an associated shading language for writing pixel shaders:
- HLSL (High-Level Shading Language): Used with DirectX.
- GLSL (OpenGL Shading Language): Used with OpenGL.
- Metal Shading Language (MSL): Used with Apple's Metal API.
These languages share many similarities, offering constructs for variables, functions, control flow, and built-in functions for common graphics operations (e.g., vector math, texture sampling).
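For example, the vector intrinsics below exist under the same or closely related names across HLSL, GLSL, and MSL (shown here in HLSL; the function is purely illustrative):

float3 CommonIntrinsics(float3 a, float3 b)
{
    float  d = dot(a, b);        // dot product
    float3 c = cross(a, b);      // cross product
    float3 h = normalize(a + b); // unit-length vector
    float3 m = lerp(a, b, 0.5f); // linear interpolation (mix() in GLSL)
    float  s = saturate(d);      // clamp to [0, 1] (clamp() in GLSL)
    return c * s + m;
}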
Example HLSL Pixel Shader (Simplified)
This example demonstrates a basic pixel shader that applies diffuse lighting and samples a diffuse texture.
// Constant buffers for data passed from the CPU to the GPU
cbuffer cbPerFrame : register(b0)
{
    float4x4 WorldViewProjection;
    float3   LightDirection;   // world-space direction from the surface toward the light
    float3   CameraPosition;
};

cbuffer cbPerObject : register(b1)
{
    float4x4 World;
};

// Texture and sampler
Texture2D    txDiffuse : register(t0);
SamplerState ssLinear  : register(s0);

// Input structure from the vertex shader
struct VS_OUTPUT
{
    float4 Position    : SV_POSITION;
    float3 WorldNormal : NORMAL;
    float2 Tex         : TEXCOORD0;
    float3 WorldPos    : POSITION;
};

// Output structure for the pixel shader
struct PS_OUTPUT
{
    float4 Color : SV_TARGET;
};

PS_OUTPUT main(VS_OUTPUT input)
{
    PS_OUTPUT output;

    // Re-normalize the interpolated normal (interpolation shortens it)
    float3 normal = normalize(input.WorldNormal);

    // Normalize the light direction (already in world space, pointing toward the light)
    float3 lightDir = normalize(LightDirection);

    // Calculate the diffuse lighting factor (Lambertian model)
    float diffuseFactor = max(0.0f, dot(normal, lightDir));

    // Sample the diffuse texture
    float4 diffuseColor = txDiffuse.Sample(ssLinear, input.Tex);

    // Combine the texture color with the lighting
    output.Color = diffuseColor * diffuseFactor;

    // Add a small ambient term so unlit areas are not fully black
    output.Color.rgb += diffuseColor.rgb * 0.1f;

    // Preserve the texture's alpha channel
    output.Color.a = diffuseColor.a;

    return output;
}
Explanation:
- cbuffer: Defines constant buffers for per-frame and per-object data.
- Texture2D and SamplerState: Declare a texture resource and a sampler that controls texture filtering.
- VS_OUTPUT: The structure defining the data received from the vertex shader.
- PS_OUTPUT: The structure defining the output color of the pixel shader.
- main function: The entry point of the pixel shader.
- normalize(): Ensures vectors have a length of 1.
- dot(): Calculates the dot product, used here for diffuse lighting.
- max(0.0f, ...): Clamps the diffuse factor to be non-negative.
- txDiffuse.Sample(...): Samples the texture at the given coordinates using the specified sampler.
- SV_TARGET: A semantic indicating that this value is written to the render target.
Advanced Techniques
Pixel shaders are used to implement a wide range of sophisticated graphical effects:
- Normal Mapping: Uses a texture to store surface normals, allowing for detailed surface shading without increasing the polygon count (a minimal sketch follows this list).
- Specular Mapping: Controls the shininess and intensity of specular highlights across a surface.
- Parallax Mapping/Height Mapping: Creates an illusion of depth by offsetting texture coordinates based on viewer angle and a height map.
- Screen-Space Ambient Occlusion (SSAO): Approximates how much ambient light is blocked by nearby geometry.
- Post-Processing Effects: Bloom, depth of field, motion blur, color correction, and more are often implemented using full-screen pixel shaders.
- Physically Based Rendering (PBR): Shaders that adhere to real-world light physics for more accurate and consistent material representations.
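The core of a normal-mapping shader is a per-pixel perturbation of the surface normal. A minimal HLSL sketch, assuming a tangent-space normal map bound as txNormal and a tangent frame interpolated from the vertex shader (all names and register slots are illustrative):

Texture2D    txNormal : register(t1);
SamplerState ssLinear : register(s0);

float3 PerturbNormal(float3 N, float3 T, float3 B, float2 uv)
{
    // Fetch the tangent-space normal and remap it from [0, 1] to [-1, 1]
    float3 tn = txNormal.Sample(ssLinear, uv).xyz * 2.0f - 1.0f;

    // Transform it into world space using the interpolated TBN frame
    float3x3 TBN = float3x3(normalize(T), normalize(B), normalize(N));
    return normalize(mul(tn, TBN));
}

The returned normal is then fed into the lighting model in place of the interpolated vertex normal, which is what makes flat geometry appear finely detailed.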