Azure AI & Machine Learning Deployment Samples

Real-time Inference with Managed Endpoints

Deploy trained machine learning models for real-time inference using Azure Machine Learning managed online endpoints. This sample covers containerization, environment setup, and scoring-script creation.

Tags: MLOps, Python, Managed Endpoints, REST API
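The scoring script is the heart of a managed online endpoint: the inference server calls `init()` once at container start and `run()` once per request. Below is a minimal, runnable sketch; the input schema (`{"data": [...]}`) and the trivial stand-in model are illustrative assumptions, not part of the sample itself.

```python
# score.py — minimal scoring-script sketch for a managed online endpoint.
import json
import os

model = None

def init():
    """Called once when the container starts; load the model here."""
    global model
    # Azure ML mounts registered model files under AZUREML_MODEL_DIR.
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    # Placeholder: a real script would load a serialized model from
    # model_dir (e.g. with joblib or an ML framework's own loader).
    # A trivial callable stands in so the sketch runs end to end.
    model = lambda rows: [sum(r) for r in rows]

def run(raw_data):
    """Called per request; raw_data is the request body as a JSON string."""
    rows = json.loads(raw_data)["data"]
    predictions = model(rows)
    return json.dumps({"predictions": predictions})
```

Invoking the deployed endpoint with a body like `{"data": [[1, 2], [3, 4]]}` would return the JSON the `run()` function produces.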

Batch Inference with Batch Endpoints

Learn how to run large-scale batch inference over datasets using Azure Machine Learning batch endpoints. This sample demonstrates data preparation, job submission, and output retrieval.

Tags: MLOps, Batch Processing, Batch Endpoints, Large Datasets

Deploying to Azure Kubernetes Service (AKS)

This sample guides you through deploying your models to an Azure Kubernetes Service (AKS) cluster when you need highly scalable, customizable deployments. Includes sample Kubernetes manifests and configuration.

Tags: Kubernetes, AKS, Scalability, Containerization
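The manifests mentioned above typically pair a `Deployment` (running the scoring container with several replicas) with a `Service` that exposes it. An illustrative sketch — the image name, port, replica count, and resource figures are placeholder assumptions:

```yaml
# deployment.yml — illustrative self-managed model serving on AKS.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-scorer
spec:
  replicas: 3                 # scale horizontally by raising this
  selector:
    matchLabels:
      app: model-scorer
  template:
    metadata:
      labels:
        app: model-scorer
    spec:
      containers:
        - name: scorer
          image: myregistry.azurecr.io/model-scorer:1.0   # placeholder image
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: "500m"
              memory: 512Mi
---
apiVersion: v1
kind: Service
metadata:
  name: model-scorer
spec:
  type: LoadBalancer          # provisions an Azure load balancer
  selector:
    app: model-scorer
  ports:
    - port: 80
      targetPort: 8080
```

Applying this with `kubectl apply -f deployment.yml` is the customizable counterpart to a managed online endpoint: you gain full control over scaling, networking, and scheduling in exchange for operating the cluster yourself.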

Model Deployment for Cognitive Services

Explore how to deploy custom models within Azure Cognitive Services, leveraging pre-built capabilities with your own trained models for specialized tasks.

Tags: Cognitive Services, Custom Models, Integration
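However the custom model is hosted, Cognitive Services resources are called over REST with a subscription key in the `Ocp-Apim-Subscription-Key` header. A small sketch of building such a request with the standard library; the endpoint URL and payload shape are placeholder assumptions for your specific service:

```python
# Sketch of an authenticated request to a Cognitive Services resource.
import json
import urllib.request

def build_request(endpoint, key, payload):
    """Build a POST request carrying the subscription key header."""
    data = json.dumps(payload).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        # Cognitive Services resources authenticate with this header.
        "Ocp-Apim-Subscription-Key": key,
    }
    return urllib.request.Request(endpoint, data=data, headers=headers,
                                  method="POST")

# Usage (placeholder URL and key):
# req = build_request("https://<your-resource>.cognitiveservices.azure.com/"
#                     "<deployment-path>", "<your-key>", {"data": [1, 2]})
# with urllib.request.urlopen(req) as resp:
#     result = json.load(resp)
```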

Serverless Deployment with Azure Functions

Use Azure Functions for serverless model inference. This sample shows how to trigger model predictions via HTTP requests, ideal for event-driven scenarios.

Tags: Serverless, Azure Functions, Event-Driven, API Gateway
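In a function app, the HTTP trigger hands your code the request body; keeping the inference logic as a pure function makes it easy to test outside Azure. The sketch below shows only that handler (the `azure.functions` decorator wiring is omitted so it stays self-contained); the input schema and stand-in model are assumptions:

```python
# Inference logic behind an HTTP-triggered Azure Function (sketch).
# In a real function app this would live in function_app.py, wrapped by an
# azure.functions HTTP trigger that passes the request body in and wraps the
# returned (status, body) pair in an HttpResponse.
import json

def handle_request(body):
    """Parse the HTTP body, score it, and return (status_code, response_body)."""
    try:
        rows = json.loads(body)["data"]
    except (ValueError, KeyError, TypeError):
        return 400, json.dumps({"error": "expected JSON with a 'data' field"})
    predictions = [sum(r) for r in rows]  # stand-in for a real model
    return 200, json.dumps({"predictions": predictions})
```

Because the function only runs (and bills) per invocation, this pattern suits spiky, event-driven traffic; for sustained high throughput, a managed online endpoint or AKS deployment is usually the better fit.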

CI/CD Pipelines for Model Deployment

Automate your model deployment process using Azure DevOps or GitHub Actions. This sample focuses on setting up continuous integration and continuous deployment pipelines.

Tags: CI/CD, Azure DevOps, GitHub Actions, Automation
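For the GitHub Actions route, a workflow typically logs in to Azure, installs the `ml` CLI extension, registers the model, and updates the deployment. An illustrative sketch — the resource group, workspace, secret name, and YAML file paths are placeholder assumptions:

```yaml
# .github/workflows/deploy-model.yml — illustrative deployment workflow.
name: deploy-model
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}   # service-principal secret
      - name: Install Azure ML CLI extension
        run: az extension add -n ml
      - name: Register model
        run: >
          az ml model create --file model.yml
          -g my-resource-group -w my-workspace
      - name: Update online deployment
        run: >
          az ml online-deployment update --file deployment.yml
          -g my-resource-group -w my-workspace
```

An Azure DevOps pipeline follows the same shape with an `azure-pipelines.yml` and an `AzureCLI@2` task in place of the login and run steps.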