


Azure Machine Learning Engineering

Deploy, fine-tune, and optimize ML models using Microsoft Azure

Sina Fakhraee, Ph.D.

Balamurugan Balakreshnan

Megan Masanz

BIRMINGHAM—MUMBAI

Azure Machine Learning Engineering

Copyright © 2022 Packt Publishing

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

Group Product Manager: Gebin George

Publishing Product Manager: Ali Abidi

Content Development Editor: Priyanka Soam

Technical Editor: Devanshi Ayare

Copy Editor: Safis Editing

Project Coordinator: Farheen Fathima

Proofreader: Safis Editing

Indexer: Tejal Daruwale Soni

Production Designer: Prashant Ghare

Marketing Coordinator: Shifa Ansari, Vinishka Kalra

First published: Dec 2022

Production reference: 1291222

Published by Packt Publishing Ltd.

Livery Place

35 Livery Street

Birmingham

B3 2PB, UK.

ISBN 978-1-80323-930-9

www.packt.com

To my wife, Dr. Shabnam Behdin, for being my loving and supportive partner in every step of our lives. To my son Nickan and my daughter Nila, who are my entire world and motivation to conquer any challenges encountered in life. To my wonderful parents, Dr. Hossein Fakhraee and Soraya Golrokh, for the countless sacrifices they made for me growing up and that they still make now in my adulthood.

– Sina Fakhraee

To my wife, Nandini, and my daughter, Aarushi, who have been very supportive, and for the sacrifices they have made. To my wonderful mom, Sellamma, and aunt Vasantha, who I can’t thank enough, and my family for supporting me throughout my career. To my friends who have helped me, for the support they have provided.

– Balamurugan Balakreshnan

To my daughter, LD Bobs – to becoming a woman of vision. To my husband, Joe, who picks up the slack. To my Dad, who consistently teaches me what it means to be a fighter, and to my Mom, who demonstrates leadership with love. Mom, thanks for all the edits and endless hours – we are appreciative of your efforts.

– Megan Masanz

Contributors

About the authors

Sina Fakhraee, Ph.D., is currently working at Microsoft as an enterprise data scientist and senior cloud solution architect. He has helped customers to successfully migrate to Azure by providing best practices around data and AI architectural design and by helping them implement AI/ML solutions on Azure. Prior to working at Microsoft, Sina worked at Ford Motor Company as a product owner for Ford’s AI/ML platform. Sina holds a Ph.D. degree in computer science and engineering from Wayne State University and prior to joining the industry, he taught various undergrad and grad computer science courses part time. If you would like to know more about Sina, please visit his LinkedIn: https://www.linkedin.com/in/sina-fakhraee-ph-d-2798ba70/.

I would like to thank my manager, Rod Means, for his outstanding guidance, support, and leadership over the past few years. I would also like to thank Ali Abidi, Priyanka Soam, Kirti Pisat, and the rest of the team at Packt for their help and support throughout the process. I would like to thank my amazing team members, Bala and Megan, for their amazing collaboration and teamwork.

Balamurugan Balakreshnan is a principal cloud solution architect at Microsoft, focused on data and AI architecture and data science. He has provided leadership on digital transformations with AI and cloud-based digital solutions, as well as on ML, the IoT, big data, and advanced analytics solutions.

A big thank you to my manager Shruti Harish for her guidance and support throughout the book. I also thank the publishers, Packt, and their team – Ali Abidi, Priyanka Soam, Kirti Pisat, and the rest of the team. Thank you to all my friends and colleagues for providing me with this wonderful opportunity to collaborate on the book (Sina Fakhraee and Megan Masanz).

Megan Masanz is a principal cloud solution architect at Microsoft focused on data, AI, and data science, passionately enabling organizations to address business challenges through the establishment of strategies and road maps for the planning, design, and deployment of Azure Cloud-based solutions. Megan is adept at paving the path to data science via computer science given her master’s in computer science with a focus on data science (https://meganmasanz.azurewebsites.net/).

I would like to thank my manager, Marc Grove, a wonderful source of support and guidance. I would like to thank the Packt team for their partnership in bringing this book forward, and for the opportunity they have provided. I would like to thank my team members for their amazing collaboration and teamwork.

About the reviewers

Vijender Singh is a certified multi-cloud expert with over 5 years of experience. He is currently working with the Amazon Alexa AI team on the effective use of AI in Alexa. He completed an MSc with distinction at Liverpool John Moores University, with research work on keyphrase extraction. He has completed the GCP Professional Machine Learning Engineer (MLPE) certification, five Azure certifications, two AWS certifications, and the TensorFlow certification. Vijender is instrumental in co-mentoring and teaching his colleagues about ML and TensorFlow, a fundamental tool for the ML journey. He believes in working toward a better tomorrow.

Remon van Harmelen started his career as a software developer. Later, he became intrigued by Azure and made a career change to become an Azure consultant. He worked for several consulting firms on multiple multinational projects, ranging from lift-and-shift migrations to creating data analytics platforms. Now working for Microsoft as a cloud solution architect, he spends his days supporting customers on their Azure journeys.

Olivier Mertens is a cloud solution architect for data and AI at the Microsoft EMEA HQ in Ireland. He is responsible for designing data and AI solutions at scale on Azure. After specializing in AML, Olivier was selected as an advanced cloud expert for ML and MLOps. He works on the most complex ML cases in the EMEA region and leads knowledge-sharing initiatives. Before Microsoft, Olivier worked as a data scientist in Belgium. Olivier is a guest lecturer at PXL Digital Business School and holds an MSc in information management, a postgraduate degree in AI business architecture, and a BSc in business management.

Nirbhay Anand has worked as a technical program manager, holds a master’s in computer science and an MBA, and has 16 years of industry experience in software product development. He has developed software in different domains such as investment banking, manufacturing, supply chains, power forecasting, railroad infrastructure, and contract management. Currently, he is associated with CloudMoyo, a leading cloud and analytics partner for Microsoft. CloudMoyo brings together powerful BI capabilities using the Azure Data Platform to transform complex data into business insights. He is a passionate blogger and book reviewer.

I would like to thank my wife, Vijeta, and my kids, Navya and Nitrika, for their support. I would also like to thank my friends, family, and well-wishers for their never-ending support.

Deepak Mukunthu is a customer-focused and results-oriented AI/ML leader with over 20 years of experience leading teams and driving product vision and strategy. He currently leads the AI platform and initiatives for DocuSign. Prior to joining DocuSign, Deepak led multiple critical initiatives at Microsoft, including AutoML and labeling in AML, the data platform for Bing/Cortana, and ML-powered personalized news feeds. Deepak is passionate about democratizing AI for everyone and drives customer adoption of AI/ML technologies for their business-critical needs. Deepak is the published author of Practical Automated Machine Learning on Azure and loves mentoring and sharing his knowledge via platforms such as Product School and Sharebird.

Table of Contents

Preface

Part 1: Training and Tuning Models with the Azure Machine Learning Service

1

Introducing the Azure Machine Learning Service

Technical requirements

Building your first AMLS workspace

Creating an AMLS workspace through the Azure portal

Creating an AMLS workspace through the Azure CLI

Creating an AMLS workspace with ARM templates

Navigating AMLS

Creating a compute for writing code

Creating a compute instance through the AMLS GUI

Adding a schedule to a compute instance

Creating a compute instance through the Azure CLI

Creating a compute instance with ARM templates

Developing within AMLS

Developing Python code with Jupyter Notebook

Developing using an AML notebook

Connecting AMLS to VS Code

Summary

2

Working with Data in AMLS

Technical requirements

Azure Machine Learning datastore overview

Default datastore review

Creating a blob storage account datastore

Creating a blob storage account datastore through Azure Machine Learning Studio

Creating a blob storage account datastore through the Python SDK

Creating a blob storage account datastore through the Azure Machine Learning CLI

Creating Azure Machine Learning data assets

Creating a data asset using the UI

Creating a data asset using the Python SDK

Using Azure Machine Learning datasets

Read data in a job

Summary

3

Training Machine Learning Models in AMLS

Technical requirements

Training code-free models with the designer

Creating a dataset using the user interface

Training on a compute instance

Training on a compute cluster

Summary

4

Tuning Your Models with AMLS

Technical requirements

Understanding model parameters

Sampling hyperparameters

Understanding sweep jobs

Truncation policies

Median policies

Bandit policies

Setting up a sweep job with grid sampling

Setting up a sweep job for random sampling

Setting up a sweep job for Bayesian sampling

Reviewing results of a sweep job

Summary

5

Azure Automated Machine Learning

Technical requirements

Introduction to Azure AutoML

Featurization concepts in AML

AutoML using AMLS

AutoML using the AML Python SDK

Parsing your AutoML results via AMLS and the AML SDK

Summary

Part 2: Deploying and Explaining Models in AMLS

6

Deploying ML Models for Real-Time Inferencing

Technical requirements

Understanding real-time inferencing and batch scoring

Deploying an MLflow model with managed online endpoints through AML Studio

Deploying an MLflow model with managed online endpoints through the Python SDK V2

Deploying a model with managed online endpoints through the Python SDK v2

Deploying a model for real-time inferencing with managed online endpoints through the Azure CLI v2

Summary

7

Deploying ML Models for Batch Scoring

Technical requirements

Deploying a model for batch inferencing using the Studio

Deploying a model for batch inferencing through the Python SDK

Summary

8

Responsible AI

Responsible AI principles

Responsible AI Toolbox overview

Responsible AI dashboard

Error analysis dashboard

Interpretability dashboard

Data explorer

What-if counterfactuals

Fairness

Summary

9

Productionizing Your Workload with MLOps

Technical requirements

Understanding the MLOps implementation

Preparing your MLOps environment

Creating a second AML workspace

Creating an Azure DevOps organization and project

Connecting to your AML workspace

Moving code to the Azure DevOps repo

Setting up variables in Azure Key Vault

Setting up environment variable groups

Creating an Azure DevOps environment

Setting your Azure DevOps service connections

Creating an Azure DevOps pipeline

Running an Azure DevOps pipeline

Summary

Further reading

Part 3: Productionizing Your Workload with MLOps

10

Using Deep Learning in Azure Machine Learning

Technical requirements

Labeling image data using the Data Labeling feature of Azure Machine Learning

Training an object detection model using Azure AutoML

Deploying the object detection model to an online endpoint using the Azure ML Python SDK

Summary

11

Using Distributed Training in AMLS

Technical requirements

Data parallelism

Model parallelism

Distributed training with PyTorch

Distributed training code

Creating a training job Python file to process

Distributed training with TensorFlow

Creating a training job Python file to process

Summary

Index

Other Books You May Enjoy

Preface

Data scientists working on productionizing machine learning workloads face a breadth of challenges at every step owing to the countless factors involved in getting ML models deployed and running. This book offers solutions to common issues, detailed explanations of essential concepts, and step-by-step instructions to productionize ML workloads using the Azure Machine Learning service. You’ll see how data scientists and machine learning engineers working with Microsoft Azure can train and deploy ML models at scale by putting their knowledge to work with this practical guide.

Throughout the book, you’ll learn how to train, register, and productionize ML models by leveraging the power of the Azure Machine Learning service. You’ll get to grips with scoring models in real time and batch, explaining models to earn business trust, mitigating model bias, and developing solutions using an MLOps framework.

By the end of this Azure Machine Learning book, you’ll be ready to build and deploy end-to-end ML solutions into a production system using AML for real-time scenarios.

Who this book is for

Machine learning engineers and data scientists who want to move to ML engineering roles will find this AMLS book useful. Familiarity with the Azure ecosystem will help you understand the concepts covered.

What this book covers

Chapter 1, Introducing the Azure Machine Learning Service, introduces the basic concepts of the Azure Machine Learning (AML) service. You will create an AML workspace, create a compute instance, and connect AML to VS Code for further development in later chapters.

Chapter 2, Working with Data in AMLS, covers how to work with data in AMLS. In particular, you will learn how to load data, save data as datasets, and use datasets in later development projects.

Chapter 3, Training Machine Learning Models in AMLS, shows you how to train machine learning models using AMLS experiments as well as the code-free designer. You will see how to train jobs remotely and save models to the AMLS model registry for later use.

Chapter 4, Tuning Your Models with AMLS, demonstrates how to tune hyperparameters for your machine learning models using AMLS HyperDrive.

Chapter 5, Azure Automated Machine Learning, covers how to script an AutoML job to automatically train a machine learning model.

Chapter 6, Deploying ML Models for Real-Time Inferencing, teaches you how to deploy models in AML to support real-time inferencing.

Chapter 7, Deploying ML Models for Batch Scoring, shows you how to apply batch scoring to models using AML batch endpoints.

Chapter 8, Responsible AI, teaches you how to explain your machine learning models using AMLS and Azure Interpret.

Chapter 9, Productionizing Your Workload with MLOps, has you setting up an Azure DevOps pipeline to orchestrate model training and deployment to multiple environments.

Chapter 10, Using Deep Learning in Azure Machine Learning, demonstrates how to label image data using Azure Machine Learning’s Data Labeling feature, which we will use to train an object detection model. You will learn how to train an object detection model using AMLS AutoML and how to deploy the trained model for inferencing using AMLS.

Chapter 11, Using Distributed Training in AMLS, teaches how to perform distributed training in AMLS. In particular, you will learn how to train models in a distributed fashion using two popular deep learning frameworks, PyTorch and TensorFlow.

To get the most out of this book

To get the most out of this book, you will need to have an Azure subscription and the latest versions of Windows PowerShell and Command Prompt.

Software/hardware covered in the book: Windows PowerShell or Command Prompt

Operating system requirements: Windows, macOS, or Linux

If you are using the digital version of this book, we advise you to type the code yourself or access the code from the book’s GitHub repository (a link is available in the next section). Doing so will help you avoid any potential errors related to the copying and pasting of code.

Download the example code files

You can download the example code files for this book from GitHub at https://github.com/PacktPublishing/Azure-Machine-Learning-Engineering. If there’s an update to the code, it will be updated in the GitHub repository.

We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!

Download the color images

We also provide a PDF file that has color images of the screenshots and diagrams used in this book. You can download it here: https://packt.link/8s9Lt.

Conventions used

There are a number of text conventions used throughout this book.

Code in text: Indicates code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles. Here is an example: “The actual data file, which in this case is titanic.csv.”

Any command-line input or output is written as follows:

az extension remove -n azure-cli-ml
az extension remove -n ml

Bold: Indicates a new term, an important word, or words that you see onscreen. For instance, words in menus or dialog boxes appear in bold. Here is an example: “The rest of the options are the default, and you can click on the Review + create button.”

Tips or important notes

Appear like this.

Get in touch

Feedback from our readers is always welcome.

General feedback: If you have questions about any aspect of this book, email us at [email protected] and mention the book title in the subject of your message.

Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report this to us. Please visit www.packtpub.com/support/errata and fill in the form.

Piracy: If you come across any illegal copies of our works in any form on the internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.

If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit authors.packtpub.com.

Share Your Thoughts

Once you’ve read Azure Machine Learning Engineering, we’d love to hear your thoughts! Please click here to go straight to the Amazon review page for this book and share your feedback.

Your review is important to us and the tech community and will help us make sure we’re delivering excellent quality content.

Download a free PDF copy of this book

Thanks for purchasing this book!

Do you like to read on the go but are unable to carry your print books everywhere? Is your eBook purchase not compatible with the device of your choice?

Don’t worry, now with every Packt book you get a DRM-free PDF version of that book at no cost.

Read anywhere, any place, on any device. Search, copy, and paste code from your favorite technical books directly into your application.

The perks don’t stop there – you can get exclusive access to discounts, newsletters, and great free content in your inbox daily.

Follow these simple steps to get the benefits:

Scan the QR code or visit the link below

https://packt.link/free-ebook/9781803239309

Submit your proof of purchase

That’s it! We’ll send your free PDF and other benefits to your email directly.

Part 1: Training and Tuning Models with the Azure Machine Learning Service

In Part 1, readers will learn how to use the Azure Machine Learning service to train and tune different types of models, taking advantage of its unique job tracking capabilities.

This section has the following chapters:

Chapter 1, Introducing the Azure Machine Learning Service

Chapter 2, Working with Data in AMLS

Chapter 3, Training Machine Learning Models in AMLS

Chapter 4, Tuning Your Models with AMLS

Chapter 5, Azure Automated Machine Learning

1

Introducing the Azure Machine Learning Service

Machine Learning (ML), leveraging data to build and train a model to make predictions, is rapidly maturing. Azure Machine Learning (AML) is Microsoft’s cloud service, which not only enables model development but also supports your entire data science life cycle. AML is a tool designed to empower data scientists, ML engineers, and citizen data scientists. It provides a framework to train and deploy models, empowered through MLOps to monitor, retrain, evaluate, and redeploy models in a collaborative environment backed by years of feedback from Microsoft’s Fortune 500 customers.

In this chapter, we will focus on deploying an AML workspace, the resource that brings together the Azure resources and assets you will leverage when you use AML. We will showcase how to deploy these resources using a graphical user interface (GUI), followed by setting up your AML service via the Azure Command-Line Interface (CLI) ml extension (v2), which allows model training and deployment through the command line. We will then set up the workspace by leveraging Azure Resource Manager (ARM) templates, which are referred to as ARM deployments.

During deployment, key resources will be deployed, including AML Studio, a portal for data scientists to manage their workload, often referred to as your workspace; Azure Key Vault for storing sensitive information; Application Insights for logging information; Azure Container Registry to store Docker images; and an Azure storage account to hold data. These resources will be leveraged behind the scenes as you navigate through the Azure Machine Learning service workspace, creating compute resources for writing code in the integrated development environment (IDE) of your choice, including Jupyter Notebook, JupyterLab, and VS Code.

In this chapter, we will cover the following topics:

Building your first AMLS workspace

Navigating AMLS

Creating a compute for writing code

Developing within AMLS

Connecting AMLS to VS Code

Technical requirements

In this section, you will sign up for an Azure account and use the web-based Azure portal to create various resources. As such, you will require internet access and a working web browser.

The following are the prerequisites for the chapter:

Access to the internet

A web browser, preferably Google Chrome or Microsoft Edge Chromium

If you do not already have an Azure subscription, you can leverage the $200 Azure credit available for 30 days by following this link: https://azure.microsoft.com/en-us/free/.

The Azure CLI >= 2.15.0

The code leveraged throughout this book has been made available in the following repository: https://github.com/PacktPublishing/Azure-Machine-Learning-Engineering.git.

You will be leveraging code from a GitHub repository:

https://github.com/Azure/azure-quickstart-templates/blob/master/quickstarts/microsoft.machinelearningservices/machine-learning-workspace/azuredeploy.json

https://github.com/Azure/azure-quickstart-templates/blob/master/quickstarts/microsoft.machinelearningservices/machine-learning-compute-create-computeinstance/azuredeploy.json

Building your first AMLS workspace

Within Azure, there are numerous ways to create Azure resources. The most common method is through the Azure portal, a web interface that allows you to create resources through a GUI. To automate the creation of resources, users can leverage the Azure CLI with the ml extension (V2), which provides you with a familiar terminal to automate deployment. You can also create resources using ARM templates. Both the CLI and the ARM templates provide an automatable, repeatable process to create resources in Azure.

In the upcoming subsections, we will first create an AMLS workspace through the web portal. After you have mastered this task, you will also create another workspace via the Azure CLI. Once you understand how the CLI works, you will create an ARM template and use it to deploy a third workspace. After learning about all three deployment methods, you will delete all excess resources before moving on to the next section; leaving excess resources up and running will cost you money, so be careful.
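When the time comes to clean up, deleting a resource group removes the AMLS workspace and every resource deployed alongside it. The following is a minimal sketch, assuming a resource group named aml-dev-rg (the example name used later in this chapter); substitute whatever name you chose:

# Delete the resource group and everything it contains; --no-wait returns immediately
az group delete --name aml-dev-rg --yes --no-wait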

Creating an AMLS workspace through the Azure portal

Using the portal to create an AMLS workspace is the easiest, most straightforward approach. Through the GUI, you create a resource group, a container to hold multiple resources, along with your AMLS workspace and all its components. To create a workspace, navigate to https://portal.azure.com and follow these steps:

Navigate to https://portal.azure.com and type Azure Machine Learning into the search box as shown in Figure 1.1 and press Enter:

Figure 1.1 – Selecting resource groups

On the top left of the Azure portal, select the + Create option shown in Figure 1.2:

Figure 1.2 – Creating an AML workspace

Selecting the + Create option will bring up the Basics tab as shown here:

Figure 1.3 – Filling in the corresponding fields to create the ML workspace

In the Basics tab shown in Figure 1.3 for creating your AML workspace, populate the following values:

Subscription: The Azure subscription you would like to deploy your resource to.

Resource group: Click on Create new and enter a name for a resource group. In Azure, a resource group can be thought of as a folder, or container, that holds the resources for a particular solution. As we deploy the AMLS workspace, the resources will be deployed into this resource group so that we can easily delete them after performing this exercise.

Workspace name: The name of the AMLS workspace resource.

The rest of the options are the default, and you can click on the Review + create button.

This will cause validation to occur – once the information has been validated, click on the Create button to deploy your resources.

It usually takes a few minutes for the workspace to be created. Once the deployment is completed, click on Go to resource in the Azure portal and then click on Launch studio to go to the AMLS workspace.

You are now on the landing page for AMLS as shown in Figure 1.4:

Figure 1.4 – AMLS

Congratulations! You have now successfully built your first AMLS workspace. While you can start by loading in data right now, take the time to walk through the next section to learn how to create it via code.

Creating an AMLS workspace through the Azure CLI

For people who prefer a code-first approach to creating resources, the Azure CLI is the perfect fit. At the time of writing, the AML CLI v2 is the most up-to-date extension for the Azure CLI available. While leveraging the Azure CLI v2, assets are defined by leveraging a YAML file, as we will see in later chapters.

Note

The Azure CLI v2 uses commands that follow a format of az ml <noun> <verb> <options>.
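For instance, the following commands all follow this pattern (shown purely as an illustration; they assume the ml extension has been installed and that the example workspace and resource group created later in this section already exist):

# List the AML workspaces in a resource group
az ml workspace list --resource-group aml-dev-rg

# List the compute targets defined in a workspace
az ml compute list --resource-group aml-dev-rg --workspace-name aml-ws

# List the jobs submitted to a workspace
az ml job list --resource-group aml-dev-rg --workspace-name aml-ws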

To create an AMLS workspace via the Azure CLI ml extension (v2), follow these steps:

You need to install the Azure CLI from https://docs.microsoft.com/en-us/cli/azure/install-azure-cli.

Find your subscription ID. In the Azure portal, type Subscriptions in the search box to bring up a list of Azure subscriptions and their IDs. For the subscription that you would like to use, copy the Subscription ID information to use it with the CLI.

Here’s a view of Subscriptions within the Azure portal:

Figure 1.5 – Azure subscription list

Launch your command-line interpreter based on your OS – for example, Command Prompt (CMD) or Windows PowerShell – and check your version of the Azure CLI by running the following command:

az version

Note

You will need version 2.15.0 or later of the Azure CLI to leverage the ml extension.

You will need to remove old extensions if they are installed for your CLI to work properly. You can remove the old ml extensions by running the following commands:

az extension remove -n azure-cli-ml
az extension remove -n ml

To install the ml extension, run the following command:

az extension add -n ml -y

Now, let’s connect to your subscription in Azure through the Azure CLI by running the following commands, replacing xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx with the Subscription ID information you found in Figure 1.5:

az login
az account set --subscription xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx

Create a resource group by running the following command. Please note that aml-dev-rg is an example name for the resource group, just as aml-ws is an example name for an AML workspace:

az group create --name aml-dev-rg --location eastus2

Create an AML workspace by running the following command, noting that eastus2 is the Azure region in which we will deploy this AML workspace:

az ml workspace create -n aml-ws -g aml-dev-rg -l eastus2
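As an optional check that the workspace was created, you can display its details (a quick sketch using the same example names – substitute whatever names you chose):

# Show the details of the newly created AML workspace
az ml workspace show --name aml-ws --resource-group aml-dev-rg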

You have now created an AMLS workspace with the Azure CLI ml extension and through the portal. There’s one additional way to create an AMLS workspace that’s commonly used, ARM templates, which we will take a look at next.

Creating an AMLS workspace with ARM templates

ARM templates can be challenging to write, but they provide you with a way to easily automate and parameterize the creation of Azure resources. In this section, you will first download a simple ARM template to build an AMLS workspace and then deploy the template using the Azure CLI. To do so, take the following steps:

An ARM template can be downloaded from GitHub and is found here: https://github.com/Azure/azure-quickstart-templates/blob/master/quickstarts/microsoft.machinelearningservices/machine-learning-workspace/azuredeploy.json.

This template creates the following Azure services:

Azure Storage Account

Azure Key Vault

Azure Application Insights

Azure Container Registry

An AML workspace

The example template has three required parameters:

environment, where the resources will be created

name, which is the name that we are giving to the AMLS workspace

location, the Azure region the resource will be deployed to

To deploy your template, you have to create a resource group first as follows:

az group create --name rg_name --location eastus2

Make sure your command prompt is opened to the location to which you downloaded the azuredeploy.json file, and run the following command:

az deployment group create --name "exampledeployment" --resource-group "rg_name" --template-file "azuredeploy.json" --parameters name="uniquename" environment="dev" location="eastus2"

It will take a few minutes for the workspace to be created.
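To confirm that the template deployment completed successfully, you can optionally check its provisioning state and list the workspaces in the resource group (a sketch reusing the example deployment and resource group names from the previous step):

# Check the state of the ARM deployment – it should report "Succeeded"
az deployment group show --name "exampledeployment" --resource-group "rg_name" --query properties.provisioningState

# List the AML workspaces now present in the resource group
az ml workspace list --resource-group "rg_name" --output table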

We have covered a lot of information so far, whether creating an AMLS workspace using the portal, the CLI, or now using ARM templates. In the next section, we will show you how to navigate the workspace, often referred to as the studio.

Navigating AMLS

AMLS provides access to key resources for a data science team to leverage. In this section, you will learn how to navigate AMLS, exploring the key components found within the studio. You will get a brief overview of its capabilities, which we will cover in detail in the remaining chapters.

Open a browser and go to https://portal.azure.com. Log in with your Azure AD credentials. Once logged into the portal, you will see several icons. Select the Resource group icon and click on the Azure Machine Learning resource.

In the Overview page, click on the Launch Studio button as seen in the following screenshot:

Figure 1.6 – Launch studio

Clicking on the icon shown in Figure 1.6 will open AMLS in a new window.

The studio launch will bring you to the main home page of AMLS. The UI includes functionality to match several personas, including no-code, low-code, and code-based ML. The main page has two sections – the left-hand menu pane and the right-hand workspace pane.

The AMLS workspace home screen is shown in Figure 1.7:

Figure 1.7 – AMLS workspace home screen

Now, let us understand the preceding screenshot in brief:

In section 1 of Figure 1.7, the left-hand menu pane is displayed. Clicking on any of the words in this pane will bring up a new right workspace pane, which includes sections 2 and 3 of the screen. We can select any of these keywords to quickly access key resources within our AMLS workspace. We will drill into these key resources as we begin exploring the AMLS workspace.

In section 2 of Figure 1.7, quick links are provided to key resources that we will leverage throughout this book, enabling AMLS users to create new items covering the varying personas supported.

As we continue to explore our environment and dig into creating assets within the AMLS workspace, both with code-based and low-code options, recent resources will begin to appear in section 3 of Figure 1.7, providing users with the ability to see recently leveraged resources, whether the compute, the code execution, the models created, or the datasets that are leveraged.

The home page provides quick access to the key resources found within your AMLS workspace. In addition to the quick links, scroll down and you can view the Documentation section, which provides resources to get you started in understanding how to best leverage your AML environment.

The Documentation section, a hub for documentation resources, is displayed on the right pane of the AMLS home screen:

Figure 1.8 – Documentation

As shown in Figure 1.8, the AMLS home page provides you with a wealth of documentation resources to get you started. The links include training modules, tutorials, and even blogs regarding how to leverage AMLS.

On the top-right side of the page, there are several options available:

Notifications: The bell icon represents notifications, which display the messages that are generated as you leverage your AMLS workspace. These messages will contain information regarding the creation and deletion of resources, as well as information regarding the resources running within your workspace.

Figure 1.9 – Top-right options

Settings: The gear icon next to the bell opens the settings for your Azure portal. Clicking on the icon provides the ability to configure basic settings, as shown in Figure 1.10:

Figure 1.10 – Settings for workspace customization

Within the Settings blade, options are available to change the background of the workspace UI with themes. There are light and dark shades available. Then, there is a section for changing the preferred language and formats. Check the Language dropdown for a list of languages – the list of languages will change as new languages are added to the workspace.

Help: The question mark icon provides helpful resources, from tutorials to placing support requests. This is where all the Help content is organized:

Figure 1.11 – Help for AMLS workspace support

Links are provided for tutorials on how to use the workspace and how to develop and deploy data science projects. Click on Launch guided tour to use the step-by-step guided tour.

To troubleshoot any issue with a workspace, click on Run workspace diagnostics and follow the instructions:

Support: This is the section where you can create a ticket for technical issues, subscription and quota limits, and other Azure-related issues.

Resources: This is the section that provides links to the AML documentation, as well as a useful cheat sheet that is hosted on GitHub. A link to Microsoft’s Privacy and Terms is also available in this section.

Clicking on the smiley icon will bring up the Send us feedback section:

Figure 1.12 – Feedback page

Leveraging this section, an AMLS workspace user can provide feedback to the AMLS product team.

In the following screenshot, we can see the workspace selection menu:

Figure 1.13 – Workspace selection menu

When