DirectML Operators
This section details the operators supported by DirectML (Direct Machine Learning), Microsoft's low-level, hardware-accelerated machine learning API built on DirectX 12. These operators are the building blocks for constructing and executing machine learning graphs.
Core Operators
DML_ACTIVATION_ELU_OPERATOR
Exponential Linear Unit (ELU) activation function.
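The ELU math can be sketched in a few lines. This is an illustrative Python sketch of the formula only, not the DirectML C++ API; the `elu` helper and its `alpha` default are assumptions mirroring the common definition (DirectML's ELU description takes an alpha parameter).

```python
import math

def elu(x, alpha=1.0):
    # f(x) = x for x > 0, alpha * (exp(x) - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```

For positive inputs the function is the identity; for negative inputs it saturates smoothly toward -alpha.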
DML_ACTIVATION_HARD_SIGMOID_OPERATOR
Hard Sigmoid activation function.
DML_ACTIVATION_LINEAR_OPERATOR
Linear activation function, f(x) = alpha * x + beta (the identity when alpha = 1 and beta = 0).
DML_ACTIVATION_MISH_OPERATOR
Mish activation function.
DML_ACTIVATION_RELU_OPERATOR
Rectified Linear Unit (ReLU) activation function.
DML_ACTIVATION_SCALED_ELU_OPERATOR
Scaled Exponential Linear Unit (SELU) activation function.
DML_ACTIVATION_SIGMOID_OPERATOR
Sigmoid activation function.
DML_ACTIVATION_SOFTMAX_OPERATOR
Softmax activation function.
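The softmax computation can be illustrated with a minimal Python sketch of the math (not the DirectML API); real implementations apply it along a chosen axis of a tensor, which is simplified to a flat list here. Subtracting the maximum before exponentiating is the standard numerical-stability trick.

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max so exp() cannot overflow
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]
```

The outputs are positive and sum to 1, so they can be read as a probability distribution.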
DML_ACTIVATION_TANH_OPERATOR
Hyperbolic Tangent (Tanh) activation function.
DML_BATCH_NORMALIZATION_OPERATOR
Batch Normalization operator: normalizes the input using supplied mean and variance tensors, then applies a scale and bias.
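The per-element math of inference-time batch normalization can be sketched as follows; this is an illustrative scalar version, not the DirectML API, and the `batch_norm` helper name and `epsilon` default are assumptions.

```python
import math

def batch_norm(x, mean, variance, scale, bias, epsilon=1e-5):
    # y = scale * (x - mean) / sqrt(variance + epsilon) + bias
    return scale * (x - mean) / math.sqrt(variance + epsilon) + bias
```

The epsilon term guards against division by zero when the variance is very small.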
DML_CONVOLUTION_OPERATOR
Convolution operator; supports standard and transposed convolution over the input's spatial dimensions.
DML_GEMM_OPERATOR
General Matrix Multiply (GEMM) operator: computes alpha * op(A) * op(B) + beta * C, where op is an optional transpose.
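A minimal Python sketch of the GEMM computation (illustrative math only, not the DirectML API; transposes are omitted and matrices are lists of lists):

```python
def gemm(A, B, C, alpha=1.0, beta=1.0):
    # A: m x k, B: k x n, C: m x n
    m, k, n = len(A), len(B), len(B[0])
    return [[alpha * sum(A[i][p] * B[p][j] for p in range(k)) + beta * C[i][j]
             for j in range(n)]
            for i in range(m)]
```

With beta = 0 (or C zeroed) this reduces to a plain matrix multiply.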
DML_MAX_POOLING_OPERATOR
Max pooling operator: outputs the maximum value within each sliding window over the input's spatial dimensions.
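The sliding-window behavior can be sketched for the 2D case; this is an illustrative Python version (no padding or dilation, which the real operator also supports), and the `max_pool_2d` helper is an assumption, not a DirectML call.

```python
def max_pool_2d(x, window, stride):
    # x: h x w grid (list of lists); valid windows only, no padding
    h, w = len(x), len(x[0])
    wh, ww = window
    sh, sw = stride
    out = []
    for i in range(0, h - wh + 1, sh):
        row = []
        for j in range(0, w - ww + 1, sw):
            row.append(max(x[i + di][j + dj]
                           for di in range(wh) for dj in range(ww)))
        out.append(row)
    return out
```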
DML_SUM_OPERATOR
Element-wise sum of multiple input tensors.
DML_ELEMENT_WISE_ADD_OPERATOR
Element-wise addition of two tensors.
DML_ELEMENT_WISE_MULTIPLY_OPERATOR
Element-wise multiplication of two tensors.
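The element-wise binary operators can be sketched over flat lists; this illustrative Python version assumes equal-shaped inputs (the real operators also support strided and broadcast tensor descriptions).

```python
def ew_add(a, b):
    # element-wise addition of two equal-length sequences
    return [x + y for x, y in zip(a, b)]

def ew_mul(a, b):
    # element-wise multiplication of two equal-length sequences
    return [x * y for x, y in zip(a, b)]
```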
Tensor Manipulation
DML_REDUCE_OPERATOR
Reduces a tensor along specified dimensions.
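Reduction along an axis can be illustrated with a sum reduction over a 2D matrix; this Python sketch is an assumption for illustration (DirectML's reduce operator supports several reduction functions, e.g. sum, max, average, selected by a parameter).

```python
def reduce_sum(x, axis):
    # x: 2D matrix as a list of rows
    if axis == 0:
        return [sum(col) for col in zip(*x)]  # collapse rows
    return [sum(row) for row in x]            # collapse columns
```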
DML_SLICE_OPERATOR
Extracts a slice from a tensor.
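Slicing is specified by per-dimension offsets and sizes; a 2D Python sketch (illustrative only, and without the stride parameter the real operator also accepts):

```python
def slice_2d(x, offsets, sizes):
    # take a sizes[0] x sizes[1] window starting at (offsets[0], offsets[1])
    i0, j0 = offsets
    h, w = sizes
    return [row[j0:j0 + w] for row in x[i0:i0 + h]]
```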
DML_TILE_OPERATOR
Tiles a tensor along specified dimensions.
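Tiling repeats the input a given number of times along each dimension; a 2D Python sketch (the `tile_2d` helper and its repeat-count tuple are illustrative assumptions):

```python
def tile_2d(x, reps):
    # repeat the grid reps[0] times vertically and reps[1] times horizontally
    rh, rw = reps
    return [row * rw for _ in range(rh) for row in x]
```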
Other Operators
DML_QUANTIZE_OPERATOR
Quantizes a tensor to a lower-precision format.
DML_DEQUANTIZE_OPERATOR
Dequantizes a tensor from a lower-precision format.
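The quantize/dequantize pair can be illustrated with scalar linear quantization to uint8; this Python sketch assumes a scale/zero-point scheme (a common convention, used here for illustration rather than as DirectML's exact parameterization).

```python
def quantize(x, scale, zero_point):
    # uint8 linear quantization: scale down, round, shift, clamp to [0, 255]
    q = round(x / scale) + zero_point
    return max(0, min(255, q))

def dequantize(q, scale, zero_point):
    # approximate inverse: shift back and scale up
    return (q - zero_point) * scale
```

Round-tripping a value recovers it only up to the rounding step, which is the precision lost by quantization.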