Five tricks you should know about Azure Machine Learning Service

Azure Machine Learning Service brings together the essential tools to build a real end-to-end Machine Learning project: from data sources to predictive web services, including versioning (code and models), scalability (compute resources) and monitoring.

In this post, we will discover five “secret” features of Azure Machine Learning. Let’s dive in!

1-Generate a perfect score.py script

Wait, you were expecting a fully no-code experience, right? Have a look at the “Outputs + logs” tab of the model’s child run.

You will find the expected scoring script and the related environment definition in a Conda YAML file. Help yourself: download these two files locally and upload them when the deployment wizard asks for them!
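The generated script follows the standard Azure ML scoring contract: an init() function that loads the model once at startup, and a run() function invoked for each request. Here is a minimal sketch of that structure (the model name and input format below are hypothetical placeholders, not the AutoML-generated content):

import json
import joblib
import numpy as np
from azureml.core.model import Model

def init():
    # Runs once when the web service starts: load the registered model
    global model
    model_path = Model.get_model_path("my-automl-model")  # hypothetical model name
    model = joblib.load(model_path)

def run(raw_data):
    # Runs for each scoring request: parse the JSON payload and predict
    data = np.array(json.loads(raw_data)["data"])
    predictions = model.predict(data)
    return json.dumps({"result": predictions.tolist()})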

2-Rotate your Container Registry keys

Once you rotate them, Azure ML won’t be able to pull any Docker image anymore! You have to resync the keys with the az ml CLI. First of all, connect with the az login command, or use the Cloud Shell in the Azure portal, which is already authenticated.

Check the az ml version: you should use version 2.0.1a6 (or above). Then run the following command:

az ml workspace sync-keys -w <your_workspace> -g <your_resource_group>

3-Code local, execute remote

pip install -U azureml-sdk==1.37.0

Download the config.json file from the Azure portal and put it in your local folder.

To interact with your Azure ML Service, you will have to authenticate, either interactively (login and password in a web browser) or with a service principal.

Interactive authentication
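A minimal sketch with the v1 SDK (the tenant ID is a placeholder to replace with your own):

from azureml.core import Workspace
from azureml.core.authentication import InteractiveLoginAuthentication

# Hypothetical tenant ID: replace with your own Azure AD tenant
auth = InteractiveLoginAuthentication(tenant_id="<your-tenant-id>")

# Reads the config.json file downloaded from the Azure portal
ws = Workspace.from_config(auth=auth)
print("You have logged into Microsoft Azure ML Service!")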

“You have logged into Microsoft Azure ML Service!”

You will be able to register datasets or models and deploy web services on Azure, without running a compute instance. Moreover, it is easy to use git commands through the Visual Studio Code interface.

Register a local file as an Azure ML dataset
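As a sketch of the idea, assuming a local CSV file and a hypothetical dataset name:

from azureml.core import Workspace, Dataset

ws = Workspace.from_config()
datastore = ws.get_default_datastore()

# Upload the local file to the workspace default datastore
datastore.upload_files(files=["./data/train.csv"], target_path="data/", overwrite=True)

# Create a tabular dataset from the uploaded file and register it
dataset = Dataset.Tabular.from_delimited_files(path=(datastore, "data/train.csv"))
dataset.register(workspace=ws, name="train-data", create_new_version=True)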

Download this code sample from GitHub.

4-Link Databricks to Azure ML

First of all, install the azureml-sdk[databricks] and azureml-mlflow packages on your cluster.

We can quickly check that all dependencies are now installed.
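For instance, importing the packages in a notebook cell and printing their versions is enough (a sketch; your version numbers will differ):

import azureml.core
import mlflow

print("azureml-sdk:", azureml.core.VERSION)
print("mlflow:", mlflow.__version__)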

Interactive authentication is needed, and you will stay connected until the cluster restarts.

Let’s do some configuration in the Azure portal: click on the “Link Azure ML workspace” button on the overview page of the Databricks resource.

Some information is needed to identify the Azure Machine Learning workspace.

Identify the Azure ML workspace

According to this discussion: “Once you have linked with this experience, you don’t need to run the following ws.write_config(), Workspace.from_config(), and mlflow.set_tracking_uri().” Still, we launch a notebook in Databricks with a few lines of code to set the tracking URI and the experiment name.
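A minimal sketch of such a notebook, with placeholder workspace identifiers and a hypothetical experiment name:

import mlflow
from azureml.core import Workspace

# Placeholders: replace with your own workspace identifiers
ws = Workspace(subscription_id="<subscription-id>",
               resource_group="<resource-group>",
               workspace_name="<workspace-name>")

# Point MLflow at the Azure ML workspace and name the experiment
mlflow.set_tracking_uri(ws.get_mlflow_tracking_uri())
mlflow.set_experiment("databricks-demo")

with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.78)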

Download the notebook from GitHub.

When we go back to Azure ML Studio, we find an experiment with a run, and all the logs are now in the workspace.

In the “Outputs + logs” menu, we find all the files needed to deploy the model as a predictive web service.

5-Clean your Container Registry periodically

We are going to schedule a cleaning script to delete old images.

Download the script from GitHub.

This script lists or deletes Docker images older than 90 days, but always keeps a minimum of 10 images, even old ones.
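As an illustration of the idea (not the author’s exact script), here is a sketch using the azure-containerregistry Python package, keeping the 10 most recent images per repository; the registry endpoint is a placeholder:

from datetime import datetime, timedelta, timezone
from azure.containerregistry import ContainerRegistryClient
from azure.identity import DefaultAzureCredential

# Placeholder endpoint: replace with your own registry
ENDPOINT = "https://<your-registry>.azurecr.io"
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

client = ContainerRegistryClient(ENDPOINT, DefaultAzureCredential(),
                                 audience="https://management.azure.com")

for repo in client.list_repository_names():
    # Sort the images of this repository from newest to oldest
    manifests = sorted(client.list_manifest_properties(repo),
                       key=lambda m: m.last_updated_on, reverse=True)
    # Keep the 10 most recent images, delete the older ones past 90 days
    for manifest in manifests[10:]:
        if manifest.last_updated_on < cutoff:
            client.delete_manifest(repo, manifest.digest)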

You can now deploy this script in an Azure Function and schedule it, for example, in an Azure Data Factory pipeline. Finally, add the #finops keyword to your résumé!
