This tutorial guides you through the process of building, training, and deploying an image classification model using Azure Machine Learning. You'll provision GPU compute, train a convolutional neural network (CNN) on labeled images, register the trained model, and publish it as a REST endpoint.
By the end of this tutorial, you will have a deployed Azure Machine Learning model capable of classifying images into predefined categories.
Ensure your Azure Machine Learning workspace is provisioned and you have the necessary SDKs installed. This typically involves installing the azureml-sdk package.
pip install azureml-sdk[notebooks] azureml-mlflow
pip install pandas scikit-learn matplotlib seaborn Pillow
You'll also need to configure your local environment to connect to your workspace using a config.json file or by authenticating programmatically.
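For example, with a config.json downloaded from the Azure portal into your project directory, connecting looks like this (a minimal sketch):

from azureml.core import Workspace

# Reads config.json from the current directory (or a parent) and prompts for
# interactive sign-in the first time.
ws = Workspace.from_config()
print(f"Connected to workspace: {ws.name} (resource group: {ws.resource_group})")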
For this tutorial, we'll use a publicly available dataset. In a real-world scenario, you would upload your own labeled images. We'll be using the CIFAR-10 dataset for demonstration.
You can prepare this dataset programmatically: upload the labeled images to your workspace's datastore and register them as an Azure ML Dataset (a data asset in SDK v2 terminology), as shown in the sketch below.
Ensure your images are organized into folders, where each folder name represents a class label.
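A minimal sketch of that upload-and-register step, assuming your class-labeled image folders live under a local data/cifar10 directory (the local path and dataset name are illustrative):

from azureml.core import Dataset, Workspace

ws = Workspace.from_config()
datastore = ws.get_default_datastore()

# Upload the local folder-per-class image tree to the workspace's default blob datastore.
datastore.upload(src_dir="data/cifar10", target_path="cifar10-images", overwrite=True)

# Register the uploaded files as a FileDataset so training runs can reference it by name.
dataset = Dataset.File.from_files(path=(datastore, "cifar10-images"))
dataset = dataset.register(workspace=ws, name="cifar10-images", create_new_version=True)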
We will use Azure Machine Learning's capabilities to train a deep learning model, specifically a Convolutional Neural Network (CNN). You can choose between built-in algorithms or bring your own custom training script.
Here's a simplified Python snippet that submits a training job to a GPU compute cluster:
from azureml.core import Workspace, Experiment, Dataset, Environment
from azureml.core.compute import ComputeTarget, AmlCompute
from azureml.core.runconfig import PyTorchConfiguration
from azureml.core.script_run_config import ScriptRunConfig
# Load workspace
ws = Workspace.from_config()
# Define compute target
cluster_name = "gpu-cluster"
try:
    compute_target = ComputeTarget(workspace=ws, name=cluster_name)
    print(f"Found existing compute target: {cluster_name}")
except Exception:
    print(f"Creating new compute target: {cluster_name}")
    compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_NC6', max_nodes=4)
    compute_target = ComputeTarget.create(ws, cluster_name, compute_config)
    compute_target.wait_for_completion(show_output=True)
# Load the dataset registered earlier (assuming a FileDataset named 'cifar10-images')
dataset = Dataset.get_by_name(ws, name='cifar10-images')
# Define training script and arguments
script_dir = "./scripts"
entry_script = "train.py"
arguments = [
    '--data-path', dataset.as_mount(),
    '--num-classes', 10,
    '--epochs', 10
]
# Training environment. The conda specification file name (train-env.yml) is a placeholder;
# it should list torch, torchvision, azureml-core, and anything else train.py imports.
env = Environment.from_conda_specification(name="pytorch-train-env", file_path="train-env.yml")
# Create a PyTorch job configuration for the run (single node, single process;
# GPU count is determined by the VM size: STANDARD_NC6 provides one GPU)
pytorch_config = PyTorchConfiguration(node_count=1, process_count=1)
# Create a ScriptRunConfig tying together the script, compute target, environment, and job configuration
src = ScriptRunConfig(source_directory=script_dir,
                      script=entry_script,
                      arguments=arguments,
                      compute_target=compute_target,
                      environment=env,
                      distributed_job_config=pytorch_config)
# Start experiment
experiment = Experiment(workspace=ws, name='image-classification-tutorial')
run = experiment.submit(src)
run.wait_for_completion(show_output=True)
print(f"Run completed with status: {run.get_status()}")
# Metrics and plots (for example, an accuracy curve) are best logged from inside train.py
# via Run.get_context(); here we register the trained model that the run wrote to ./outputs.
run.register_model(model_path='outputs/model.pth', model_name='image-classifier', tags={'area': 'vision'})
The train.py script would contain your actual model definition (e.g., using PyTorch or TensorFlow), data loading, training loop, and model saving logic.
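As a reference, here is a minimal train.py sketch under the assumptions made above: a folder-per-class image layout, the --data-path/--num-classes/--epochs arguments, and a ResNet-18 trained from scratch with PyTorch (the architecture and hyperparameters are illustrative, not prescribed by Azure ML):

# train.py (minimal sketch)
import argparse
import os

import torch
import torch.nn as nn
from azureml.core import Run  # requires azureml-core in the training environment
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

parser = argparse.ArgumentParser()
parser.add_argument("--data-path", type=str, required=True)
parser.add_argument("--num-classes", type=int, default=10)
parser.add_argument("--epochs", type=int, default=10)
args = parser.parse_args()

run = Run.get_context()  # lets the script log metrics back to the Azure ML run
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# The folder-per-class layout is exactly what ImageFolder expects.
transform = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_data = datasets.ImageFolder(args.data_path, transform=transform)
loader = DataLoader(train_data, batch_size=64, shuffle=True, num_workers=4)

model = models.resnet18(num_classes=args.num_classes).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(args.epochs):
    model.train()
    running_loss = 0.0
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
    epoch_loss = running_loss / len(loader)
    run.log("epoch_loss", epoch_loss)
    print(f"epoch {epoch + 1}/{args.epochs}: loss={epoch_loss:.4f}")

# Anything written to ./outputs is uploaded with the run, so register_model can find it.
os.makedirs("outputs", exist_ok=True)
torch.save(model.state_dict(), "outputs/model.pth")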
Once trained, you can deploy your model to Azure Container Instances (ACI) for testing or to Azure Kubernetes Service (AKS) for production workloads. This makes your model accessible via a REST API.
Deployment involves creating an inference script (score.py) and an environment definition (.yml file).
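The .yml file is a standard conda environment specification. If you prefer to keep everything in Python, you can also build the environment object in code; here is a minimal sketch (the package list is an assumption and should match whatever score.py imports, plus azureml-defaults, which provides the scoring web server):

from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.environment import Environment

# Build the inference environment programmatically instead of from env.yml.
env = Environment(name="image-classifier-env")
env.python.conda_dependencies = CondaDependencies.create(
    python_version="3.8",
    pip_packages=["azureml-defaults", "torch", "torchvision", "pillow"],
)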
from azureml.core import Workspace
from azureml.core.environment import Environment
from azureml.core.model import Model, InferenceConfig
from azureml.core.webservice import AciWebservice
# Load workspace
ws = Workspace.from_config()
# Load the registered model
model = Model(ws, 'image-classifier') # Assumes model was registered in previous step
# Define the environment for inference
env = Environment.from_conda_specification(name="image-classifier-env", file_path="env.yml")
# Alternatively, use existing environments or build custom ones
# Define the inference configuration (entry script plus environment)
script_file_name = "score.py"
inference_config = InferenceConfig(entry_script=script_file_name, environment=env)
# Define the deployment configuration for Azure Container Instances
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1,
                                                       description="Image classification service")
# Create the service
service = Model.deploy(workspace=ws,
                       name='image-classifier-service',
                       models=[model],
                       inference_config=inference_config,
                       deployment_config=deployment_config)
service.wait_for_deployment(show_output=True)
print(f"Service name: {service.name}")
print(f"Service state: {service.state}")
print(f"Service scoring URI: {service.scoring_uri}")
The score.py would load the model and define the run() function to process incoming requests and return predictions.
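A minimal score.py sketch, assuming the single-file model registration shown earlier (outputs/model.pth from a ResNet-18) and the request format used by the client example in the next step (a JSON body with an image_base64 field):

# score.py (minimal sketch)
import base64
import io
import json
import os

import torch
from PIL import Image
from torchvision import models, transforms


def init():
    global model, transform
    # AZUREML_MODEL_DIR points at the folder containing the registered model file(s).
    model_path = os.path.join(os.getenv("AZUREML_MODEL_DIR"), "model.pth")
    model = models.resnet18(num_classes=10)  # must match the architecture used in train.py
    model.load_state_dict(torch.load(model_path, map_location="cpu"))
    model.eval()
    transform = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])


def run(raw_data):
    data = json.loads(raw_data)
    image_bytes = base64.b64decode(data["image_base64"])
    image = Image.open(io.BytesIO(image_bytes)).convert("RGB")
    tensor = transform(image).unsqueeze(0)
    with torch.no_grad():
        predicted = int(model(tensor).argmax(dim=1).item())
    return {"predicted_class": predicted}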
Send sample image data to your deployed web service endpoint and observe the predictions. You can use tools like Postman or write a Python script to interact with the REST API.
import requests
import json
# Replace with your service's scoring URI and key
scoring_uri = "YOUR_SERVICE_SCORING_URI"
api_key = "YOUR_SERVICE_KEY"  # Needed only if key auth is enabled on the service; retrieve it via service.get_keys() or from Azure ML studio
# Prepare sample data (e.g., base64 encoded image)
# For simplicity, this is a placeholder. You'd load and encode your actual image.
sample_image_data = {"image_base64": "YOUR_BASE64_ENCODED_IMAGE_STRING"}
headers = {
    'Content-Type': 'application/json',
    'Authorization': f'Bearer {api_key}'
}
response = requests.post(scoring_uri, data=json.dumps(sample_image_data), headers=headers)
if response.status_code == 200:
    predictions = response.json()
    print("Predictions:", predictions)
else:
    print(f"Error: {response.status_code} - {response.text}")
For detailed code examples and advanced configurations, please refer to the official Azure Machine Learning documentation.