Azure AI & Machine Learning Deployment Samples
Real-time Inference with Managed Endpoints
Deploy your trained machine learning models as real-time inference endpoints using Azure Machine Learning managed endpoints. This sample covers containerization, environment setup, and scoring script creation.
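The scoring script a managed endpoint runs follows a fixed init()/run() contract. The sketch below illustrates that contract with a placeholder model; the model format, file name, and JSON input schema are assumptions for illustration, not the sample's actual code.

```python
# score.py — a minimal sketch of the init()/run() contract an Azure ML managed
# endpoint expects from a scoring script. Placeholder model and input schema.
import json
import os

model = None

def init():
    """Called once when the serving container starts: load the model here."""
    global model
    # Azure ML mounts registered model files under AZUREML_MODEL_DIR.
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    # A real script might load, e.g., a pickled scikit-learn model:
    #   import joblib
    #   model = joblib.load(os.path.join(model_dir, "model.pkl"))
    # Placeholder model so the script can be exercised locally:
    model = lambda rows: [sum(row) for row in rows]

def run(raw_data: str) -> str:
    """Called once per request: parse JSON input, predict, return JSON."""
    rows = json.loads(raw_data)["data"]
    predictions = model(rows)
    return json.dumps({"predictions": predictions})
```

Because the contract is just two functions, the script can be unit-tested locally by calling init() and then run('{"data": [[1, 2], [3, 4]]}') before any deployment.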
Batch Inference with Batch Endpoints
Learn how to perform large-scale batch inference on datasets using Azure Machine Learning batch endpoints. This sample demonstrates data preparation, job submission, and output retrieval.
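Batch endpoints use a related but distinct scoring contract: run() receives a mini-batch of input file paths rather than a request body. The placeholder model and CSV-style output rows below are assumptions for illustration.

```python
# batch_driver.py — sketch of the batch endpoint scoring contract: init() runs
# once per worker, run() receives a list of input file paths for one mini-batch.
import os

model = None

def init():
    global model
    # A real script would load a model from AZUREML_MODEL_DIR here; this
    # placeholder "scores" a file by its character count.
    model = lambda text: len(text)

def run(mini_batch):
    """mini_batch: list of file paths. Return one entry per input file so
    Azure ML can track which files were successfully processed."""
    results = []
    for path in mini_batch:
        with open(path, encoding="utf-8") as f:
            score = model(f.read())
        results.append(f"{os.path.basename(path)},{score}")
    return results
```

Returning exactly one row per input file matters: the batch runtime compares the number of results against the mini-batch size to detect failed items.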
Deploying to Azure Kubernetes Service (AKS)
This sample guides you through deploying your models to a managed Azure Kubernetes Service (AKS) cluster for highly scalable and customizable deployments. Includes Kubernetes manifests and configuration.
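A deployment to AKS ultimately reduces to standard Kubernetes objects. The manifest below is an illustrative sketch of a Deployment for a containerized model server; the image name, labels, port, and resource limits are placeholders to replace with your own values.

```yaml
# deployment.yaml — illustrative only; image, labels, and resources are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-server
spec:
  replicas: 3
  selector:
    matchLabels:
      app: model-server
  template:
    metadata:
      labels:
        app: model-server
    spec:
      containers:
        - name: scoring
          image: <your-acr>.azurecr.io/model-server:latest
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: "500m"
              memory: 512Mi
            limits:
              cpu: "1"
              memory: 1Gi
```

Setting both resource requests and limits lets the cluster autoscaler and Horizontal Pod Autoscaler make sensible scheduling and scaling decisions for the scoring pods.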
Model Deployment for Cognitive Services
Explore how to deploy custom models within Azure Cognitive Services, combining the service's pre-built capabilities with your own trained models for specialized tasks.
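However the model is hosted, clients call Cognitive Services resources over HTTPS with a subscription-key header. The sketch below builds such a request with only the standard library; the endpoint URL, route, and payload schema are placeholders, while the Ocp-Apim-Subscription-Key header is the standard Cognitive Services authentication mechanism.

```python
# Build (but do not send) an authenticated request to a Cognitive Services-style
# endpoint. Endpoint URL and payload schema are placeholders for illustration.
import json
import urllib.request

def build_scoring_request(endpoint: str, key: str, payload: dict) -> urllib.request.Request:
    """Return a POST request carrying a JSON payload and the subscription key."""
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": key,
        },
        method="POST",
    )

# Sending it is one more step once the request is built:
#   with urllib.request.urlopen(req) as resp:
#       result = json.load(resp)
```

Keeping request construction separate from sending makes the authentication and serialization logic easy to unit-test without network access.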
Serverless Deployment with Azure Functions
Use Azure Functions for serverless model inference. This sample shows how to serve model predictions from an HTTP-triggered function, ideal for event-driven scenarios.
CI/CD Pipelines for Model Deployment
Automate your model deployment process using Azure DevOps or GitHub Actions. This sample focuses on setting up continuous integration and continuous deployment pipelines.
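For the GitHub Actions route, a deployment pipeline can be sketched as a workflow that authenticates to Azure and drives the ML CLI extension. The workflow below is illustrative only: the resource group, workspace, deployment file, and secret names are placeholders for your own values.

```yaml
# .github/workflows/deploy-model.yml — illustrative sketch; names are placeholders.
name: deploy-model
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Install the Azure ML CLI extension
        run: az extension add -n ml
      - name: Update the online deployment
        run: |
          az ml online-deployment update \
            --file deployment.yml \
            --resource-group <your-rg> \
            --workspace-name <your-workspace>
```

Gating the job on pushes to main keeps deployments tied to reviewed merges; the same steps translate directly to an Azure DevOps pipeline stage.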