Azure ML Containers
In this post we explain how Azure ML builds the containers used to run your code.
## Dockerfile

Each job in Azure ML runs with an associated Environment. In practice, each environment corresponds to a Docker image.
There are numerous ways to define an environment - from specifying a set of required Python packages through to directly providing a custom Docker image. In each case the contents of the associated dockerfile are available directly from the environment object.
For more background, see the Environment class documentation.
## Example

Suppose you create an environment - in this example we will work with Conda:
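A minimal, purely illustrative specification, saved here as env.yml:

```yaml
name: pytorch
channels:
  - defaults
  - pytorch
dependencies:
  - python=3.7
  - pytorch
  - torchvision
```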
We can create and register this as an Environment in our workspace ws as follows:
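A sketch of how this might look with the v1 azureml-core SDK, assuming the specification above is saved as env.yml and that ws is a Workspace handle obtained from a local config.json:

```python
from azureml.core import Environment, Workspace

# Connect to the workspace (assumes a config.json is present locally).
ws = Workspace.from_config()

# Build an Environment from the conda specification and register it
# under the name 'pytorch' so it can be reused across runs.
env = Environment.from_conda_specification(name='pytorch', file_path='env.yml')
env = env.register(workspace=ws)
```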
To consume this environment in a remote run, Azure ML builds a Docker image that creates the corresponding Python environment.
The dockerfile used to build this image is available directly from the environment object.
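One way to retrieve it is via Environment.get_image_details. The exact shape of the returned details differs between azureml-core releases, so treat the dictionary-style access below as an assumption to verify against your SDK version:

```python
# Fetch details of the image Azure ML builds for this environment.
details = env.get_image_details(workspace=ws)

# Assumption: the details behave like a dict whose 'ingredients'
# entry carries the generated dockerfile.
print(details['ingredients']['dockerfile'])
```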
Let's take a look:
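An abridged, illustrative sketch of the generated dockerfile (the base image tag, environment hash, and intermediate layers will differ in practice):

```dockerfile
FROM mcr.microsoft.com/azureml/intelmpi2018.3-ubuntu16.04:20200821.v1
# ... Azure ML setup layers elided ...
COPY azureml-environment-setup/mutated_conda_dependencies.yml azureml-environment-setup/mutated_conda_dependencies.yml
RUN conda env create -p /azureml-envs/azureml_<hash> -f azureml-environment-setup/mutated_conda_dependencies.yml
# ... remaining layers elided ...
```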
Notice:
- The base image here is a standard image maintained by Azure ML. Dockerfiles for all base images are available on GitHub: https://github.com/Azure/AzureML-Containers
- The dockerfile references mutated_conda_dependencies.yml to build the Python environment via Conda.
Get the contents of mutated_conda_dependencies.yml from the environment:
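A sketch using the conda dependencies attached to the environment's python section; serialize_to_string renders them back to YAML:

```python
# Print the conda dependencies Azure ML writes into
# mutated_conda_dependencies.yml for this environment.
print(env.python.conda_dependencies.serialize_to_string())
```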
Which looks much like the original specification, but with a name generated by Azure ML (the output below is illustrative):
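```yaml
# Illustrative output - Azure ML generates its own name for the conda environment
channels:
  - defaults
  - pytorch
dependencies:
  - python=3.7
  - pytorch
  - torchvision
name: azureml_<generated-hash>
```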