Docker images for inference

Azure Machine Learning provides prebuilt Docker images for inference (scoring). These images include popular machine learning frameworks and commonly used Python packages. Extend an image to add more packages if needed.
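As a sketch of what extending looks like with the Azure Machine Learning Python SDK v2 (azure-ai-ml), the snippet below registers an environment that layers extra Python packages, declared in a conda file, on top of the minimal CPU inference image listed later in this article. The subscription, resource group, workspace, environment name, and the environment/conda.yaml path are illustrative placeholders, not values from this article.

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Environment
from azure.identity import DefaultAzureCredential

# Connect to the workspace (placeholder identifiers).
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Extend the prebuilt minimal CPU inference image with the packages
# declared in a conda file (hypothetical path); Azure Machine Learning
# builds a derived image from this definition.
env = Environment(
    name="prebuilt-inference-extended",  # illustrative name
    image="mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest",
    conda_file="environment/conda.yaml",
    description="Minimal CPU inference image plus project-specific packages",
)
ml_client.environments.create_or_update(env)
```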

Why use prebuilt images

Using prebuilt images helps in several ways:

  • Reduces model deployment latency
  • Increases deployment success rate
  • Avoids building container images during deployment
  • Keeps the image small, with only the required dependencies and minimal access rights

List of prebuilt Docker images for inference

Important

The list in the following table includes only the inference Docker images that Azure Machine Learning currently supports.

  • All images run as non-root users.
  • Use the latest tag. Prebuilt images are published to the Microsoft Container Registry (MCR). To see available tags, go to the MCR GitHub repository.
  • If you need a specific tag, Azure Machine Learning supports tags that are up to six months older than latest.

Inference minimal base images

| Framework version | CPU/GPU | Pre-installed packages | MCR path |
| --- | --- | --- | --- |
| NA | CPU | NA | mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest |
| NA | GPU | NA | mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cuda11.8-gpu-inference:latest |
| NA | CPU | NA | mcr.microsoft.com/azureml/minimal-py312-inference:latest |
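As a rough sketch of how one of these MCR paths is typically consumed, the following azure-ai-ml snippet creates a managed online deployment whose environment is the one registered in the earlier snippet (built on the minimal CPU inference image). The endpoint name, model path, scoring script, and instance size are illustrative placeholders, and ml_client is assumed to be the client defined earlier.

```python
from azure.ai.ml.entities import (
    CodeConfiguration,
    ManagedOnlineDeployment,
    ManagedOnlineEndpoint,
    Model,
)

# Create an endpoint, then a deployment that reuses the environment
# registered in the earlier snippet.
endpoint = ManagedOnlineEndpoint(name="my-scoring-endpoint", auth_mode="key")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="blue",                          # illustrative deployment name
    endpoint_name="my-scoring-endpoint",
    model=Model(path="./model"),          # hypothetical local model folder
    environment="prebuilt-inference-extended@latest",
    code_configuration=CodeConfiguration(code="./src", scoring_script="score.py"),
    instance_type="Standard_DS3_v2",
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```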

Note

Azure Machine Learning supports curated environments. To browse them in the studio, go to Manage environments and apply the filter Tags: Inferencing.