DirectML Operators

Explore the comprehensive list of operators available in DirectML for accelerating machine learning workloads on Windows.

Introduction

DirectML (Direct Machine Learning) is a high-performance, hardware-accelerated machine learning inference and training API for Windows. It provides a standardized way for applications to leverage the power of DirectX 12-compatible GPUs for machine learning tasks.

The DirectML operator model defines a rich set of building blocks that can be composed to create complex neural network architectures. This page provides an overview of available operators, their categories, and how to use them.

Operator Categories

DirectML operators are broadly categorized to help you find the functionality you need. The operators covered on this page fall into categories such as Convolutional, Activations, Pooling, Element-wise Operations, Matrix Operations, and Normalization.

Common Operators

Here's a selection of frequently used DirectML operators:

Convolution

Performs a convolution operation, a fundamental building block for convolutional neural networks (CNNs). A worked example appears under Operator Details and Syntax below.

Category: Convolutional

Activation (ReLU)

Applies the Rectified Linear Unit activation function element-wise.

Category: Activations
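
As a rough sketch of how this maps onto the API, a ReLU operator is described with DML_ACTIVATION_RELU_OPERATOR_DESC. The inputTensorDesc and outputTensorDesc names below are placeholders for DML_TENSOR_DESC structures you have already filled in (see Operator Details and Syntax below):

DML_ACTIVATION_RELU_OPERATOR_DESC reluDesc = {};
reluDesc.InputTensor = &inputTensorDesc;   // Tensor to activate
reluDesc.OutputTensor = &outputTensorDesc; // Same sizes and data type as the input

DML_OPERATOR_DESC reluOpDesc = { DML_OPERATOR_ACTIVATION_RELU, &reluDesc };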

Pooling (Max)

Performs max pooling over an input tensor.

Category: Pooling
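
A sketch of the corresponding description structure, again assuming the tensor descriptions already exist; the window size, strides, and padding values are illustrative:

UINT windowSize[]   = { 2, 2 }; // Example 2x2 pooling window
UINT strides[]      = { 2, 2 }; // Example strides
UINT startPadding[] = { 0, 0 };
UINT endPadding[]   = { 0, 0 };

DML_MAX_POOLING_OPERATOR_DESC maxPoolDesc = {};
maxPoolDesc.InputTensor = &inputTensorDesc;
maxPoolDesc.OutputTensor = &outputTensorDesc;
maxPoolDesc.DimensionCount = 2;          // Two spatial dimensions
maxPoolDesc.Strides = strides;
maxPoolDesc.WindowSize = windowSize;
maxPoolDesc.StartPadding = startPadding;
maxPoolDesc.EndPadding = endPadding;

DML_OPERATOR_DESC maxPoolOpDesc = { DML_OPERATOR_MAX_POOLING, &maxPoolDesc };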

Add

Performs element-wise addition of two tensors.

Category: Element-wise Operations
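
A sketch using DML_ELEMENT_WISE_ADD_OPERATOR_DESC, assuming aTensorDesc and bTensorDesc are placeholder descriptions for two tensors of compatible shape:

DML_ELEMENT_WISE_ADD_OPERATOR_DESC addDesc = {};
addDesc.ATensor = &aTensorDesc;           // First operand
addDesc.BTensor = &bTensorDesc;           // Second operand
addDesc.OutputTensor = &outputTensorDesc; // Element-wise sum

DML_OPERATOR_DESC addOpDesc = { DML_OPERATOR_ELEMENT_WISE_ADD, &addDesc };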

MatMul (Matrix Multiplication)

Performs matrix multiplication between two tensors.

Category: Matrix Operations
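
In DirectML, a plain matrix multiplication is typically expressed with the GEMM operator (output = Alpha * A x B + Beta * C). A sketch with the optional C matrix omitted, assuming aTensorDesc and bTensorDesc are already defined:

DML_GEMM_OPERATOR_DESC gemmDesc = {};
gemmDesc.ATensor = &aTensorDesc;
gemmDesc.BTensor = &bTensorDesc;
gemmDesc.CTensor = nullptr;                  // Optional C matrix, unused here
gemmDesc.OutputTensor = &outputTensorDesc;
gemmDesc.TransA = DML_MATRIX_TRANSFORM_NONE; // No transpose of A
gemmDesc.TransB = DML_MATRIX_TRANSFORM_NONE; // No transpose of B
gemmDesc.Alpha = 1.0f;
gemmDesc.Beta = 0.0f;
gemmDesc.FusedActivation = nullptr;

DML_OPERATOR_DESC gemmOpDesc = { DML_OPERATOR_GEMM, &gemmDesc };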

Batch Normalization

Normalizes activations using per-channel mean and variance statistics computed across the batch dimension.

Category: Normalization
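
A sketch of DML_BATCH_NORMALIZATION_OPERATOR_DESC; the mean, variance, scale, and bias tensor descriptions are assumed to exist and describe per-channel values:

DML_BATCH_NORMALIZATION_OPERATOR_DESC bnDesc = {};
bnDesc.InputTensor = &inputTensorDesc;
bnDesc.MeanTensor = &meanTensorDesc;         // Per-channel mean
bnDesc.VarianceTensor = &varianceTensorDesc; // Per-channel variance
bnDesc.ScaleTensor = &scaleTensorDesc;       // Learned scale (gamma)
bnDesc.BiasTensor = &biasTensorDesc;         // Learned bias (beta)
bnDesc.OutputTensor = &outputTensorDesc;
bnDesc.Spatial = TRUE;                       // Share statistics across spatial dimensions
bnDesc.Epsilon = 1e-5f;                      // Small constant for numerical stability
bnDesc.FusedActivation = nullptr;

DML_OPERATOR_DESC bnOpDesc = { DML_OPERATOR_BATCH_NORMALIZATION, &bnDesc };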

Softmax

Applies the softmax function, often used in output layers for classification.

Category: Activations
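
A sketch of the softmax description, again with placeholder tensor descriptions:

DML_ACTIVATION_SOFTMAX_OPERATOR_DESC softmaxDesc = {};
softmaxDesc.InputTensor = &inputTensorDesc;   // e.g. classification logits
softmaxDesc.OutputTensor = &outputTensorDesc; // Probabilities that sum to 1

DML_OPERATOR_DESC softmaxOpDesc = { DML_OPERATOR_ACTIVATION_SOFTMAX, &softmaxDesc };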

Operator Details and Syntax

Each DirectML operator has a specific signature and set of properties. When defining an operator in your DirectML graph, you typically specify input tensors, output tensors, and operator-specific attributes.
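
Before looking at a full operator, it helps to see how a tensor itself is described. The sketch below describes a packed 4D float32 buffer tensor; the sizes are arbitrary example values:

// Describe a packed NCHW float32 tensor (example sizes: 1 x 3 x 224 x 224)
UINT sizes[4] = { 1, 3, 224, 224 };

DML_BUFFER_TENSOR_DESC bufferDesc = {};
bufferDesc.DataType = DML_TENSOR_DATA_TYPE_FLOAT32;
bufferDesc.Flags = DML_TENSOR_FLAG_NONE;
bufferDesc.DimensionCount = 4;
bufferDesc.Sizes = sizes;
bufferDesc.Strides = nullptr; // nullptr means a packed, contiguous layout
bufferDesc.TotalTensorSizeInBytes = 1ull * 3 * 224 * 224 * sizeof(float);

DML_TENSOR_DESC inputTensorDesc = { DML_TENSOR_TYPE_BUFFER, &bufferDesc };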

Example: Defining a Convolution Operator

Consider defining a 2D convolution. The pattern is to fill in the operator-specific description structure (here, DML_CONVOLUTION_OPERATOR_DESC), wrap it in a generic DML_OPERATOR_DESC, and create the operator from that:


// Assumes 'device' is your IDMLDevice, and that inputTensorDesc, filterTensorDesc,
// and outputTensorDesc are DML_TENSOR_DESC structures you have already populated.
UINT strides[]       = { 1, 1 }; // Example strides
UINT dilations[]     = { 1, 1 }; // Example dilations
UINT startPadding[]  = { 0, 0 }; // Padding added before each spatial dimension
UINT endPadding[]    = { 0, 0 }; // Padding added after each spatial dimension
UINT outputPadding[] = { 0, 0 };

DML_CONVOLUTION_OPERATOR_DESC convDesc = {};
convDesc.InputTensor = &inputTensorDesc;   // Input tensor description
convDesc.FilterTensor = &filterTensorDesc; // Filter (weights) tensor description
convDesc.BiasTensor = nullptr;             // Optional bias tensor
convDesc.OutputTensor = &outputTensorDesc; // Output tensor description
convDesc.Mode = DML_CONVOLUTION_MODE_CROSS_CORRELATION; // What most frameworks call "convolution"
convDesc.Direction = DML_CONVOLUTION_DIRECTION_FORWARD;
convDesc.DimensionCount = 2;               // For 2D convolution
convDesc.Strides = strides;
convDesc.Dilations = dilations;
convDesc.StartPadding = startPadding;
convDesc.EndPadding = endPadding;
convDesc.OutputPadding = outputPadding;
convDesc.GroupCount = 1;
convDesc.FusedActivation = nullptr;        // No fused activation

// Wrap the type-specific description in a generic operator description
DML_OPERATOR_DESC opDesc = { DML_OPERATOR_CONVOLUTION, &convDesc };

// Create the operator
IDMLOperator* convolutionOperator = nullptr;
HRESULT hr = device->CreateOperator(&opDesc, IID_PPV_ARGS(&convolutionOperator));

Key Parameters:

Mode selects between true convolution and cross-correlation (what most frameworks implement as "convolution"). Direction chooses forward or backward (gradient) computation. DimensionCount is the number of spatial dimensions, and Strides, Dilations, StartPadding, EndPadding, and OutputPadding are per-dimension arrays of that length. GroupCount enables grouped (including depthwise) convolution.

For a complete list of operators and their detailed parameters, please refer to the official DirectML Operator Reference.

Getting Started with DirectML Operators

To integrate DirectML operators into your application:

  1. Initialize DirectML: Obtain an IDMLDevice and an IDMLCommandRecorder.
  2. Define Tensors: Describe the input, output, and persistent tensors using DML_TENSOR_DESC.
  3. Create Operators: Instantiate operators using their respective DML_*_OPERATOR_DESC structures and IDMLDevice::CreateOperator.
  4. Build a Graph: Link operators together to form a computational graph.
  5. Compile and Execute: Compile the operators or graph, bind input and output resources, then use the command recorder to record dispatches into a Direct3D 12 command list and execute that list on a command queue (see the sketch after this list).
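
The following is a minimal sketch of steps 1, 3, and 5, assuming you already have a ComPtr<ID3D12Device> named d3d12Device and an operator description opDesc like the one built in the convolution example above; binding-table and descriptor-heap setup is omitted:

#include <DirectML.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Step 1: create the DirectML device and a command recorder
ComPtr<IDMLDevice> dmlDevice;
HRESULT hr = DMLCreateDevice(d3d12Device.Get(), DML_CREATE_DEVICE_FLAG_NONE,
                             IID_PPV_ARGS(&dmlDevice));

ComPtr<IDMLCommandRecorder> commandRecorder;
hr = dmlDevice->CreateCommandRecorder(IID_PPV_ARGS(&commandRecorder));

// Step 3: create and compile an operator from its description
ComPtr<IDMLOperator> op;
hr = dmlDevice->CreateOperator(&opDesc, IID_PPV_ARGS(&op));

ComPtr<IDMLCompiledOperator> compiledOp;
hr = dmlDevice->CompileOperator(op.Get(), DML_EXECUTION_FLAG_NONE,
                                IID_PPV_ARGS(&compiledOp));

// Step 5: after creating a binding table and binding resources (omitted here),
// record the dispatch into a Direct3D 12 command list and execute it:
// commandRecorder->RecordDispatch(commandList.Get(), compiledOp.Get(), bindingTable.Get());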

Explore the DirectML Samples repository on GitHub for practical examples and best practices.