Microsoft Official Course

AZ-100T02
Implementing and Managing Storage
Contents

■■ Module 0 Welcome
Start Here
■■ Module 1 Overview of Azure Storage
Azure Storage Accounts
Data Replication
Azure Storage Explorer
Module 1 Review Questions
■■ Module 2 Storage Services
Virtual Machine Storage
Blob Storage
Azure Files
Structured Storage
Module 2 Review Questions
■■ Module 3 Securing and Managing Storage
Shared Access Keys
Azure Backup
Azure File Sync
Module 3 Review Questions
■■ Module 4 Storing and Accessing Data
Azure Content Delivery Network
Import and Export Service
Data Box
Module 4 Review Questions
■■ Module 5 Monitoring Storage
Metrics and Alerts
Activity Log
Module 5 Review Questions
■■ Module 6 Lab - Implement and Manage Storage
Lab
Module 0 Welcome

Start Here
Azure Administrator Curriculum
This course is part of a series of courses to help you prepare for Microsoft’s Azure Administrator certification exams. There are two exams:
●● AZ-100, Microsoft Azure Infrastructure and Deployment1, and
●● AZ-101, Microsoft Azure Integration and Security2.
Each exam measures your ability to accomplish certain technical tasks. For example, AZ-100 includes five
study areas, as shown in the table. The percentages indicate the relative weight of each area on the exam.
The higher the percentage, the more questions you are likely to see in that area.

AZ-100 Study Areas | Weights
Manage Azure subscriptions and resources | 15-20%
Implement and manage storage | 20-25%
Deploy and manage virtual machines | 20-25%
Configure and manage virtual networks | 20-25%
Manage identities | 15-20%

✔️ This course will focus on preparing you for the Implement and manage storage area of the AZ-100 certification exam.

About This Course


Course Description
This course teaches IT Professionals how to implement Azure storage solutions for a variety of scenarios. Students learn about the different storage accounts and services as well as basic data replication concepts and available replication schemes. Students are also introduced to Storage Explorer as a convenient way to work with Azure storage data. Students also learn the types of storage and how to work with managed and custom disks.

1 https://www.microsoft.com/en-us/learning/exam-az-100.aspx
2 https://www.microsoft.com/en-us/learning/exam-az-101.aspx
Azure blob storage is how Azure stores unstructured data in the cloud, and students learn how to work
with blobs and blob containers. They also learn how to use Azure Files to work with file shares that are
accessed via the Server Message Block (SMB) protocol. In addition to blob storage, the course covers
Table and Queue storage as storage options for structured data.
Students then learn how to secure and manage storage using Shared Access Signatures (SAS) and Azure
Backup, using Recovery Services Vault. Next, students learn how to use Azure File Sync to centralize an
organization’s file shares in Azure Files. Content Delivery Network (CDN) is used to store cached content
on a distributed network of servers that are close to end users. Students learn how to optimize content
delivery with Azure CDN, as well as how to transfer large amounts of data using the Azure Import/Export
service and Azure Data Box.
Lastly, students learn how to monitor Azure storage by configuring metrics and alerts and using the
Activity Log. Students learn how to analyze usage trends, trace requests, and diagnose issues with a
storage account.
Level: Intermediate
Audience
This course is for Azure Administrators. Azure Administrators manage the cloud services that span
storage, networking, and compute cloud capabilities, with a deep understanding of each service across
the full IT lifecycle. They take end-user requests for new cloud applications and make recommendations
on services to use for optimal performance and scale, as well as provision, size, monitor and adjust as
appropriate. This role requires communicating and coordinating with vendors. Azure Administrators use
the Azure Portal, and as they become more proficient, they use PowerShell and the Command Line Interface.
Prerequisites
Successful Azure Administrators start this role with experience in operating systems, virtualization, cloud infrastructure, storage structures, and networking.
Expected learning
●● Create Azure storage accounts for different data replication, pricing, and content scenarios.
●● Implement virtual machine storage, blob storage, Azure files, and structured storage.
●● Secure and manage storage with shared access keys, Azure backup, and Azure File Sync.
●● Store and access data using Azure Content Delivery Network, the Import and Export service, and Data
Box.
●● Monitor Azure storage with metrics and alerts, and the Activity Log.

Syllabus
This course includes content that will help you prepare for the certification exam. Other content is
included to ensure you have a complete picture of Azure storage. The course content includes a mix of
videos, graphics, reference links, module review questions, and practice labs.
Module 1 – Overview of Azure Storage
In this module, you’ll learn about storage accounts – Standard and Premium – as well as storage endpoints and how to configure custom domain names. You’ll have an opportunity to practice creating and managing storage accounts. The module also covers data replication and provides a comparison of the different available replication schemes. You’ll be introduced to Azure Storage Explorer, a utility that lets you easily work with and manipulate Azure Storage data. Lessons include:
●● Azure storage accounts
●● Data replication
●● Azure Storage Explorer
Module 2 – Storage Services
In this module, you’ll learn about the disks component of Azure Storage as it relates to virtual machines.
Virtual machine disks are stored as VHD files. You will learn about the types of disks and storage and how Azure simplifies IaaS disk management by creating and managing the storage accounts associated with the virtual machine disks.
You will also learn about how Azure blob storage stores unstructured data in the cloud as objects, or
blobs (BLOB = binary large object). And you’ll explore Azure Files, which offers fully managed file shares
in the cloud that are accessible via the Server Message Block (SMB) protocol. The other storage options covered in the module are Tables and Queues for structured storage. Lessons include:
●● Virtual machine storage
●● Blob storage
●● Azure files
●● Structured storage
Module 3 – Securing and Managing Storage
In this module, discover how a shared access signature (SAS) can be used to provide delegated access to
resources in storage accounts, allowing clients access to those resources without sharing the storage account
keys. You’ll also learn how to use Azure backup as a cloud-based solution for an existing on-premises or
off-site backup and data protection solution.
This module also covers Azure File Sync as a way to centralize an organization’s file shares in Azure Files,
and using Windows Server to cache the Azure file share locally, thus enabling scenarios such as “lift and
shift,” backup and disaster recovery, and file archiving. Lessons include:
●● Shared access keys
●● Azure backup
●● Azure File Sync
Module 4 – Storing and Accessing Data
In this module, you’ll learn about using a content delivery network (CDN) to deliver cached content that is
stored on a distributed network of edge servers closer to end-users. You’ll also learn how to transfer large
amounts of data to and from the cloud using the Azure Import/Export service and Azure Data Box.
Lessons include:
●● Azure Content Delivery Network
●● Import and Export service
●● Data Box
Module 5 – Monitoring Storage
In this module, you will learn techniques for monitoring the health of Azure storage. With metrics and alerts you can check a variety of performance metrics and send notifications to your system administrator team. With the Activity Log you can search and query for specific events, even across subscriptions. Lessons include:
●● Metrics and Alerts
●● Activity Log
✔️ More complete coverage of Azure Monitor is found in the Managing Azure Subscriptions and Resources course.

Study Guide
The Implement and manage storage objective of the AZ-100 exam consists of four main areas
of study: Create and configure storage accounts, Import and Export data to Azure, Configure Azure files,
and Implement Azure backup. These tables show you what may be included in each test area and where
it is covered in this course.
✔️ We recommend you use these tables as a checklist to ensure you are prepared in each area.
✔️ We also recommend supplementing your study with a practice test.3 Also, hands-on practice is
critical to understanding these concepts and passing the certification exams. There are several ways to
get an Azure subscription4.
Create and configure storage accounts

Testing May Include | Course Content
Configure network access to the storage account | Module 3 - Securing and Managing Storage
Create and configure storage account | Module 1 - Overview of Azure Storage
Install and use Azure Storage Explorer | Module 1 - Overview of Azure Storage
Manage access keys | Module 3 - Securing and Managing Storage
Monitor Activity log by using Log Analytics | Module 5 - Monitoring Storage
Implement Azure storage replication | Module 1 - Overview of Azure Storage

Import and Export data to Azure

Testing May Include | Course Content
Create Export from Azure job | Module 4 - Storing and Accessing Data
Create Import into Azure job | Module 4 - Storing and Accessing Data
Use Azure Data Box | Module 4 - Storing and Accessing Data
Configure and use Azure blob storage | Module 2 - Storage Services
Configure Azure Content Delivery Network (CDN) Endpoints | Module 4 - Storing and Accessing Data

Configure Azure files

Testing May Include | Course Content
Create Azure File Share (which includes quota) | Module 2 - Storage Services
Create Azure File Sync service | Module 3 - Securing and Managing Storage
Create Azure Sync group | Module 3 - Securing and Managing Storage
Troubleshoot Azure File Sync | Module 3 - Securing and Managing Storage

Implement Azure backup

Testing May Include | Course Content
Configure and review backup reports | Module 3 - Securing and Managing Storage
Perform backup operation | Module 3 - Securing and Managing Storage
Create Recovery Services Vault | Module 3 - Securing and Managing Storage
Create/Configure backup policy | Module 3 - Securing and Managing Storage
Perform a restore operation | Module 3 - Securing and Managing Storage

3 https://us.mindhub.com/az-100-microsoft-azure-infrastructure-deployment-microsoft-official-practice-test/p/MU-AZ-100
4 https://azure.microsoft.com/en-us/offers/ms-azr-0044p/
Module 1 Overview of Azure Storage

Azure Storage Accounts


Video: Introduction to Azure Storage

Azure Storage
Azure Storage is a service that you can use to store files, messages, tables, and other types of information. You can use Azure storage on its own—for example as a file share—but it is often used by developers as a store for working data. Such stores can be used by websites, mobile apps, desktop applications, and many other types of custom solutions. Azure storage is also used by IaaS virtual machines and PaaS cloud services. You can generally think of Azure storage in three categories.
Storage for Virtual Machines
This includes disks and files. Disks are persistent block storage for Azure IaaS virtual machines. Files are
fully managed file shares in the cloud.
Unstructured Data
This includes Blobs and Data Lake Store. Blobs are a highly scalable, REST-based cloud object store. Data Lake Store is Hadoop Distributed File System (HDFS) as a service.
Structured Data
This includes Tables, Cosmos DB, and Azure SQL DB. Tables are a key/value, auto-scaling NoSQL store. Cosmos DB is a globally distributed database service. Azure SQL DB is a fully managed database-as-a-service built on SQL.

For more information, you can see:


Azure Storage - https://azure.microsoft.com/en-us/services/storage/ 1

Azure Storage Accounts


An Azure storage account provides a unique namespace in the cloud to store and access your data
objects in Azure Storage. A storage account contains any blobs, files, queues, tables, and disks that you
create under that account.
Storage Account Types (Kinds)
When you create a storage account you can choose from: Storage (general purpose v1), Storage V2
(general purpose v2), and Blob storage.

A general-purpose storage account gives you access to Azure Storage services such as tables, queues,
files, blobs and Azure virtual machine disks under a single account. This type of storage account has two
performance tiers:
●● A standard storage performance tier which allows you to store tables, queues, files, blobs, and
Azure virtual machine disks.
●● A premium storage performance tier which currently only supports Azure virtual machine disks.
A Blob storage account is a specialized storage account for storing your unstructured data as blobs
(objects) in Azure Storage. Blob storage has different tiers based on frequency of use:
●● A Hot access tier which indicates that the objects in the storage account will be more frequently
accessed.
●● A Cool access tier which indicates that the objects in the storage account will be less frequently
accessed.
●● An Archive access tier which only applies to blob level storage in the general purpose v2 accounts.
✔️ To take advantage of the new archive access tier and for the lowest price per gigabyte, it's recommended that you create new storage accounts as general-purpose v2 accounts. You can upgrade your GPv1 account to a GPv2 account using PowerShell or Azure CLI.
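As a rough sketch of the PowerShell route (the resource group and account names below are placeholders, and an authenticated AzureRM session is assumed), the upgrade is a single call:

# Upgrade an existing general-purpose v1 account in place to general-purpose v2
Set-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" -Name "mystorageaccount" -UpgradeToStorageV2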

For more information, you can see:

1 https://azure.microsoft.com/en-us/services/storage/

Storage Account Overview - https://docs.microsoft.com/en-us/azure/storage/common/storage-account-options#overview
Upgrade a storage account to GPv2 - https://docs.microsoft.com/en-us/azure/storage/common/storage-account-options#upgrade-a-storage-account-to-gpv2

Standard and Premium Storage Accounts


As discussed previously, general purpose storage accounts have two tiers: Standard and Premium.

Standard storage accounts are backed by magnetic drives (HDD) and provide the lowest cost per GB.
They are best for applications that require bulk storage or where data is accessed infrequently.
Premium storage accounts are backed by solid-state drives (SSD) and offer consistent low-latency performance. They can only be used with Azure virtual machine disks and are best for I/O-intensive applications, like databases. Additionally, virtual machines that use Premium storage for all disks qualify for a 99.99% SLA, even when running outside an availability set.
✔️ It is not possible to convert a Standard storage account to a Premium storage account or vice versa. You must create a new storage account with the desired type and, if applicable, copy the data to the new account.
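Since there is no in-place conversion, any existing data has to be copied from the old account to the new one. A minimal sketch using Azure PowerShell (the account names, keys, and container name are placeholders, and the destination container is assumed to already exist):

# Build contexts for the source and destination storage accounts
$srcContext = New-AzureStorageContext -StorageAccountName "oldaccount" -StorageAccountKey "<source-key>"
$dstContext = New-AzureStorageContext -StorageAccountName "newaccount" -StorageAccountKey "<destination-key>"

# Copy every blob in the "data" container to the same container in the new account
Get-AzureStorageBlob -Container "data" -Context $srcContext |
    Start-AzureStorageBlobCopy -DestContainer "data" -DestContext $dstContext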

Storage Account Endpoints


Every object that you store in Azure Storage has a unique URL address. The storage account name forms
the subdomain of that address. The combination of subdomain and domain name, which is specific to
each service, forms an endpoint for your storage account.
For example, if your storage account is named mystorageaccount, then the default endpoints for your
storage account are:
●● Blob service: http://mystorageaccount.blob.core.windows.net
●● Table service: http://mystorageaccount.table.core.windows.net
●● Queue service: http://mystorageaccount.queue.core.windows.net
●● File service: http://mystorageaccount.file.core.windows.net
The URL for accessing an object in a storage account is built by appending the object's location in the
storage account to the endpoint. For example, to access myblob in the mycontainer, use this format:
http://mystorageaccount.blob.core.windows.net/mycontainer/myblob.
✔️ A Blob storage account only exposes the Blob service endpoint. You can also configure a custom domain name to use with your storage account.
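If you want to confirm the endpoints of an existing account, one option (sketched here with placeholder names) is to read the PrimaryEndpoints property returned by Get-AzureRmStorageAccount:

# Show the blob, queue, table, and file endpoints of a storage account
$account = Get-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" -Name "mystorageaccount"
$account.PrimaryEndpoints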
For more information, you can see:
Storage Account Endpoints - https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#storage-account-endpoints

2 https://docs.microsoft.com/en-us/azure/storage/common/storage-account-options
3 https://docs.microsoft.com/en-us/azure/storage/common/storage-account-options
4 https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account?toc=%2fazure%2fstorage%2fblobs%2ftoc.json

Configuring Custom Domain Names


You can specify a custom domain for accessing blob content instead of using the Azure URLs. There are
two ways to configure this service: Direct CNAME mapping and an intermediary domain.
Direct CNAME mapping: For example, to enable a custom domain for the blobs.contoso.com subdomain to an Azure storage account, create a CNAME record that points from blobs.contoso.com to the Azure storage account [storage account].blob.core.windows.net. The following example maps a domain to an Azure storage account in DNS:

CNAME record | Target
blobs.contoso.com | contosoblobs.blob.core.windows.net

Intermediary mapping with asverify: Mapping a domain that is already in use within Azure may result in minor downtime as the domain is updated. If you have an application with an SLA that uses the domain, you can avoid the downtime by using a second option, the asverify subdomain, to validate the domain. By prepending asverify to your own subdomain, you permit Azure to recognize your custom domain without modifying the DNS record for the domain. After you modify the DNS record for the domain, it will be mapped to the blob endpoint with no downtime.
The following example maps a domain to the Azure storage account in DNS with the asverify intermediary domain:

CNAME record | Target
asverify.blobs.contoso.com | asverify.contosoblobs.blob.core.windows.net
blobs.contoso.com | contosoblobs.blob.core.windows.net
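After the CNAME record is in place, the custom domain is registered on the storage account itself. A minimal PowerShell sketch (the resource group, account, and domain names are placeholders); setting -UseSubDomain to $true tells Azure to validate through the asverify intermediary record:

# Register the custom domain on the storage account
Set-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" -Name "contosoblobs" `
    -CustomDomainName "blobs.contoso.com" -UseSubDomain $false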
For more information, you can see:
Configure a custom domain name for your Blob storage endpoint - https://docs.microsoft.com/en-us/azure/storage/blobs/storage-custom-domain-name

Pricing and Billing


All storage accounts use a pricing model for blob storage based on the tier of each blob. When using a
storage account, the following billing considerations apply:
●● Storage costs: In addition to the amount of data stored, the cost of storing data varies depending on
the storage tier. The per-gigabyte cost decreases as the tier gets cooler.
●● Data access costs: Data access charges increase as the tier gets cooler. For data in the cool and
archive storage tier, you are charged a per-gigabyte data access charge for reads.
●● Transaction costs: There is a per-transaction charge for all tiers that increases as the tier gets cooler.
●● Geo-Replication data transfer costs: This charge only applies to accounts with geo-replication
configured, including GRS and RA-GRS. Geo-replication data transfer incurs a per-gigabyte charge.
●● Outbound data transfer costs: Outbound data transfers (data that is transferred out of an Azure
region) incur billing for bandwidth usage on a per-gigabyte basis, consistent with general-purpose
storage accounts.
●● Changing the storage tier: Changing the account storage tier from cool to hot incurs a charge equal
to reading all the data existing in the storage account. However, changing the account storage tier
from hot to cool incurs a charge equal to writing all the data into the cool tier (GPv2 accounts only).
For more information, you can see:

Pricing model for Blob storage accounts - https://azure.microsoft.com/pricing/details/storage/


Outbound data transfer charges - https://azure.microsoft.com/pricing/details/data-transfers/

Demonstration: Creating Storage Accounts


To get started with Azure Storage, you first need to create a new storage account. You can create an
Azure storage account using the Azure portal, Azure PowerShell, or Azure CLI.

Data Replication
Video: Planning Storage

Replication Options
The data in your Azure storage account is always replicated to ensure durability and high availability.
Azure Storage replication copies your data so that it is protected from planned and unplanned events
ranging from transient hardware failures, network or power outages, massive natural disasters, and so on.
You can choose to replicate your data within the same data center, across zonal data centers within the
same region, and even across regions.

When you create a Standard storage account there are three replication schemes: Locally-redundant
storage (LRS), Geo-redundant storage (GRS), and Read-access geo-redundant storage (RA-GRS).

✔️ If you select Premium performance, only LRS replication will be available.


✔️ Zone-redundant storage (ZRS) is also available when you select Standard performance with a GPv2 account. Read more in the next topics.
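The replication scheme is chosen when the account is created. A minimal sketch with placeholder names; the -SkuName value selects LRS, GRS, RA-GRS, or ZRS:

# Create a Standard storage account that uses read-access geo-redundant replication
New-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" -Name "mystorageaccount" `
    -Location "westus" -SkuName "Standard_RAGRS"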
For more information, you can see:
Azure storage replication - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy

Locally Redundant Storage


Replication | Copies | Strategy
Locally redundant storage (LRS) | Maintains three copies of your data. | Data is replicated three times within a single facility in a single region.
Locally redundant storage (LRS) maintains three copies of your data in a storage scale unit in a datacenter. All copies of the data exist within the same region.
LRS is a low-cost option for protecting your data from local hardware failures. If a datacenter-level
disaster (for example, fire or flooding) occurs, all replicas may be lost or unrecoverable. To mitigate this
risk, Microsoft recommends using either zone-redundant storage (ZRS) or geo-redundant storage (GRS).
However, LRS may be appropriate in these scenarios:
●● If your application stores data that can be easily reconstructed if data loss occurs, you may opt for
LRS.
●● Some applications are restricted to replicating data only within a country due to data governance
requirements.


✔️ Do you think LRS is a good choice for your organization?
For more information, you can see:
Locally-redundant storage: Low-cost redundancy - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-lrs

Geo-redundant storage
Replication | Copies | Strategy
Geo-redundant storage (GRS) | Maintains six copies of your data. | Data is replicated three times within the primary region and is also replicated three times in a secondary region hundreds of miles away from the primary region.
Read access geo-redundant storage (RA-GRS) | Maintains six copies of your data. | Data is replicated to a secondary geographic location and provides read access to your data in the secondary location.
Geo-redundant storage (GRS) is the default and recommended replication option and is sometimes
called cross-regional replication. GRS replicates your data to a secondary region (hundreds of miles away
from the primary location of the source data). GRS costs more than LRS, but GRS provides a higher level
of durability for your data, even if there is a regional outage.
If you opt for GRS, you have two related options to choose from:
●● GRS replicates your data to another data center in a secondary region, but that data is available to be
read only if Microsoft initiates a failover from the primary to secondary region.
●● Read-access geo-redundant storage (RA-GRS) is based on GRS. RA-GRS replicates your data to another data center in a secondary region, and also provides you with the option to read from the secondary region. With RA-GRS, you can read from the secondary regardless of whether Microsoft initiates a failover from the primary to the secondary.
✔️ If you enable RA-GRS and your primary endpoint for the Blob service is myaccount.blob.core.windows.
net, then your secondary endpoint is myaccount-secondary.blob.core.windows.net. The access keys for
your storage account are the same for both the primary and secondary endpoints.
For more information, you can see:
Geo-redundant storage (Cross-regional replication for Azure Storage) - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-grs?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
Read-access geo-redundant storage - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-grs?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#read-access-geo-redundant-storage

Zone Redundant Storage


Replication | Copies | Strategy
Zone-redundant storage (ZRS) | Maintains three copies of your data. | Data is replicated three times across two to three facilities, either within a single region or across two regions.
Zone Redundant Storage (ZRS) synchronously replicates your data across three (3) storage clusters in a single region. Each storage cluster is physically separated from the others and resides in its own availability zone. Each availability zone, and the ZRS cluster within it, is autonomous, with separate utilities and networking capabilities.
Storing your data in a ZRS account ensures that you will be able to access and manage your data if a zone becomes unavailable. ZRS provides excellent performance and extremely low latency.
Here are a few more things to know about ZRS:
●● ZRS is not yet available in all regions.
●● Changing to ZRS from another data replication option requires the physical data movement from a
single storage stamp to multiple stamps within a region. Read more about manual and live migration
at the link below.
●● ZRS may not protect your data against a regional disaster where multiple zones are permanently affected. Instead, ZRS offers resiliency for your data in the case of temporary unavailability.
✔️ Consider ZRS for scenarios that require strong consistency, strong durability, and high availability even if an outage or natural disaster renders a zonal data center unavailable.
For more information, you can see:
Zone-redundant storage (ZRS): Highly available Azure Storage applications - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-zrs?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
Converting to ZRS - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-zrs?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#converting-to-zrs-replication

5 https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-grs?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
6 https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-zrs?toc=%2fazure%2fstorage%2fblobs%2ftoc.json

What are availability zones? - https://docs.microsoft.com/en-us/azure/availability-zones/az-overview

Replication Option Comparison


The following table provides a quick overview of the scope of durability and availability that each replication strategy will provide you for a given type of event (or event of similar impact).

Replication Option | LRS | ZRS | GRS | RA-GRS
Node unavailability within a data center | Yes | Yes | Yes | Yes
An entire data center (zonal or non-zonal) becomes unavailable | No | Yes | Yes | Yes
A region-wide outage | No | No | Yes | Yes
Read access to your data (in a remote, geo-replicated region) in the event of region-wide unavailability | No | No | No | Yes
Available in storage account types | GPv1, GPv2, Blob | Standard GPv2 | GPv1, GPv2, Blob | GPv1, GPv2, Blob
For more information, you can see:
Read-access geo-redundant storage - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-grs#read-access-geo-redundant-storage

Storage Accounts PowerShell Tasks


Here are a few common storage account tasks using PowerShell.

Task | Example
Check to see if a storage account name is available. | Get-AzureRmStorageAccountNameAvailability -Name 'mystorageaccount'
Create a storage account. | New-AzureRmStorageAccount -ResourceGroupName MyResourceGroup -AccountName mystorageaccount -Location westus -SkuName Standard_GRS
Retrieve a specific storage account or all the storage accounts in a resource group or subscription. | Get-AzureRmStorageAccount -ResourceGroupName "RG01" -AccountName "mystorageaccount"
Modify storage account properties, such as type. | Set-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" -AccountName "mystorageaccount" -Type "Standard_RAGRS"

7 https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-grs
✔️ Be sure to try a few commands using the reference link below. You’ll need to create unique names for
your own storage accounts and resource groups.
For more information, you can see:
Create a storage account - https://docs.microsoft.com/en-us/azure/storage/common/storage-powershell-guide-full#create-a-storage-account

8 https://docs.microsoft.com/en-us/azure/storage/common/storage-powershell-guide-full

Azure Storage Explorer


Azure Storage Explorer
‎Microsoft Azure Storage Explorer is a standalone app from Microsoft that allows you to easily work with
Azure Storage data.

Some of the benefits of Azure Storage Explorer are:


●● Access multiple accounts and subscriptions.
●● Create, delete, view, and edit storage resources.
●● View and edit Blob, Queue, Table, File, Cosmos DB storage and Data Lake Storage.
●● Obtain shared access signature (SAS) keys.
●● Available for Windows, Mac, and Linux.
For more information, you can see:
Download Azure Storage Explorer - https://azure.microsoft.com/en-us/features/storage-explorer/

Storage Explorer Functionality


Azure Storage Explorer has many uses when it comes to managing your storage. See the following
articles to learn more. Also, check out the videos that follow this topic.
●● Connect to an Azure subscription9: Manage storage resources that belong to your Azure subscription.
●● Work with local development storage10: Manage local storage by using the Azure Storage Emulator.
●● Attach to external storage11: Manage storage resources that belong to another Azure subscription
or that are under national Azure clouds by using the storage account's name, key, and endpoints.
●● Attach a storage account by using an SAS12: Manage storage resources that belong to another
Azure subscription by using a shared access signature (SAS).
●● Attach a service by using an SAS13: Manage a specific storage service (blob container, queue, or
table) that belongs to another Azure subscription by using an SAS.

9 https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer
10 https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer
11 https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer
12 https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer
13 https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer

●● Connect to an Azure Cosmos DB account by using a connection string14: Manage a Cosmos DB account by using a connection string.

Video: Overview of Azure Storage Explorer


After watching the video try the Quickstart: Upload, download, and list blobs using Azure Storage
Explorer15.

Video: Keyword Search in Azure Storage Explorer

Video: Storage Explorer table query

Demonstration: Storage Access Tools


In the video, Corey shows how to work with the contents of an Azure storage account using a combination of various tools: the portal, Microsoft’s Azure Storage Explorer, and Azure PowerShell.

14 https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer
15 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-storage-explorer

Additional Practice - Use Storage Explorer


✔️ Once you download the Azure Storage Explorer16, you can use the following Quickstart17 to learn
about the different ways you can use Storage Explorer to connect to and manage your Azure storage
accounts.
In this practice, you will learn how to:
●● Create a storage account and log into Storage Explorer18.
●● Create a container and upload, download, and view blobs in a container19.
●● Manage snapshots20.
✔️ You may want to keep the storage account for the next lesson and the Shared Access Signature
practice.
For more information, you can see:
Getting Started with Storage Explorer - https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer?tabs=windows

16 https://azure.microsoft.com/en-us/features/storage-explorer/
17 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-storage-explorer
18 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-storage-explorer
19 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-storage-explorer
20 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-storage-explorer

Module 1 Review Questions


Module 1 Review Questions
Azure Storage Categories
Azure storage can be used for many different categories of storage. The storage categories are:
●● Storage for Virtual Machines
●● Unstructured Data
●● Structured Data
What are examples of technologies or methodologies for each of the categories?

Click for suggested answer ↓ 


●● Storage for Virtual Machines. This includes disks and files. Disks are persistent block storage for
Azure IaaS virtual machines. Files are fully managed file shares in the cloud.
●● Unstructured Data. This includes Blobs and Data Lake Store. Blobs are a highly scalable, REST-based cloud object store. Data Lake Store is Hadoop Distributed File System (HDFS) as a service.
●● Structured Data. This includes Tables, Cosmos DB, and Azure SQL DB. Tables are a key/value, auto-scaling NoSQL store. Cosmos DB is a globally distributed database service. Azure SQL DB is a fully managed database-as-a-service built on SQL.
Azure Storage Account Types
You are the administrator for your organization’s file servers, which include a large amount of data that is considered outdated but needs to be kept for compliance and regulation purposes. You need to minimize costs, while still taking end-user performance into consideration. What type of storage accounts should you use for the active data, and what type should you use for archived data?

Click for suggested answer ↓ 


For actively used data, you should use a General Purpose v2 account with a standard storage performance tier. For archived data, use a Blob storage account with the Archive tier.
A general-purpose storage account gives you access to Azure Storage services such as tables, queues,
files, blobs and Azure virtual machine disks under a single account. This type of storage account has two
performance tiers:
●● A standard storage performance tier which allows you to store tables, queues, files, blobs, and
Azure virtual machine disks.
●● A premium storage performance tier which currently only supports Azure virtual machine disks.
A Blob storage account is a specialized storage account for storing your unstructured data as blobs
(objects) in Azure Storage. Blob storage has different tiers based on frequency of use:
●● A Hot access tier which indicates that the objects in the storage account will be more frequently
accessed.
●● A Cool access tier which indicates that the objects in the storage account will be less frequently
accessed.

●● An Archive access tier which only applies to blob level storage in the general purpose v2 accounts.
To take advantage of the new archive access tier and for the lowest price per gigabyte, it's recommended
that you create new storage accounts as general-purpose v2 accounts.
Data Replication
You are responsible for Disaster Recovery and High Availability for your organization. Company policy
dictates that business-critical systems must be available as well as writeable in the event of a natural
disaster at the primary location. All business-critical data is stored in Azure and contains PII content
subject to GDPR.
Which data replication option should you choose and why?

Click for suggested answer ↓ 


The data in your Azure storage account is always replicated to ensure durability and high availability.
Azure Storage replication copies your data so that it is protected from planned and unplanned events
ranging from transient hardware failures, network or power outages, massive natural disasters, and so on.
You can choose to replicate your data within the same data center, across zonal data centers within the
same region, and even across regions.

Requirements:
●● Must be replicated between two zones
●● Subject to export restrictions
●● Data must be read and writeable
Based on the requirements, the only replication scheme that meets all the requirements is zone-redundant storage (ZRS). Locally-redundant storage (LRS) will not work as it is single-zone only. With geo-redundant storage (GRS), you cannot export the data. Additionally, in the case of read-access geo-redundant storage (RA-GRS), the data is not writeable.
Module 2 Storage Services

Virtual Machine Storage


Video: Virtual Machine Storage
Note: The default numbers in this video are accurate as of the time of the recording. Azure is constantly being updated, so be sure to check the documentation if something seems out of date.

Demonstration: Virtual Machine Storage

Virtual Machine Disks


Just like any other computer, virtual machines in Azure use disks as a place to store an operating system,
applications, and data. All Azure virtual machines have at least two disks – a Windows operating system
disk (in the case of a Windows VM) and a temporary disk. Virtual machines also can have one or more
data disks. All disks are stored as VHDs.

Operating System Disks


Every virtual machine has one attached operating system disk. It’s registered as a SATA drive and labeled
as the C: drive by default. This disk has a maximum capacity of 2048 gigabytes (GB).
Temporary Disk
Each VM contains a temporary disk. The temporary disk provides short-term storage for applications and
processes and is intended to only store data such as page or swap files. Data on the temporary disk may
be lost during a maintenance event or when you redeploy a VM. During a successful standard reboot of
the VM, the data on the temporary drive will persist.
●● On Windows virtual machines, this disk is labeled as the D: drive by default and it is used for storing pagefile.sys.
●● On Linux virtual machines, the disk is typically /dev/sdb and is formatted and mounted to /mnt by the
Azure Linux Agent.
Data Disks
A data disk is a VHD that's attached to a virtual machine to store application data, or other data you need
to keep. Data disks are registered as SCSI drives and are labeled with a letter that you choose. Each data
disk has a maximum capacity of 4095 GB. The size of the virtual machine determines how many data
disks you can attach to it and the type of storage you can use to host the disks.
✔️ Don’t store data on the temporary disk. It provides temporary storage for applications and processes
and is intended to only store data such as page or swap files.
For more information, you can see:
About disk storage for Azure Windows virtual machines - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/about-disks-and-vhds?toc=%2Fazure%2Fvirtual-machines%2Fwindows%2Ftoc.json

Types of Storage
Azure Premium Storage delivers high-performance, low-latency disk support for virtual machines (VMs)
with input/output (I/O)-intensive workloads. VM disks that use Premium Storage store data on solid-state
drives (SSDs). To take advantage of the speed and performance of premium storage disks, you can
migrate existing VM disks to Premium Storage.
In Azure, you can attach several premium storage disks to a VM. Using multiple disks gives your applications up to 256 TB of storage per VM. With Premium Storage, your applications can achieve 80,000 I/O operations per second (IOPS) per VM, and a disk throughput of up to 2,000 megabytes per second (MB/s) per VM. Read operations give you very low latencies.
Azure offers two ways to create premium storage disks for VMs:
Unmanaged disks
The original method is to use unmanaged disks. In an unmanaged disk, you manage the storage accounts that you use to store the virtual hard disk (VHD) files that correspond to your VM disks. VHD files are stored as page blobs in Azure storage accounts.
Managed disks
When you choose Azure Managed Disks, Azure manages the storage accounts that you use for your VM
disks. You specify the disk type (Premium or Standard) and the size of the disk that you need. Azure
creates and manages the disk for you. You don't have to worry about placing the disks in multiple storage
accounts to ensure that you stay within scalability limits for your storage accounts. Azure handles that for
you. We recommend that you choose managed disks, to take advantage of their many features.
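As an illustration of how little you specify with managed disks, the following sketch (the resource group, disk name, location, and size are placeholder values) creates an empty Premium managed disk with Azure PowerShell:

# Define the disk: Premium (SSD) performance, 128 GiB, created empty
$diskConfig = New-AzureRmDiskConfig -SkuName Premium_LRS -Location "westus" -CreateOption Empty -DiskSizeGB 128

# Create the managed disk; Azure chooses and manages the underlying storage for you
New-AzureRmDisk -ResourceGroupName "MyResourceGroup" -DiskName "myDataDisk" -Disk $diskConfig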
✔️ For the best performance for your application, we recommend that you migrate any VM disk that
requires high IOPS to Premium Storage. If your disk does not require high IOPS, you can help limit costs
by keeping it in standard Azure Storage. In standard storage, VM disk data is stored on hard disk drives
(HDDs) instead of on SSDs.
For more information, you can see:
Managed disks overview - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/managed-disks-overview
Premium Storage - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/premium-storage
Standard Storage - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/standard-storage

Video: Resiliency with Managed Disks


Azure Managed Disks simplifies disk management for Azure IaaS VMs by managing the storage accounts
associated with the VM disks. You only specify the type (Standard HDD, Standard SSD, or Premium SSD)
and the size of disk you need, and Azure creates and manages the disk for you. This video covers managed disks, snapshots, and premium storage.

Demonstration: Upload Custom Disks



Demonstration: Migrating from Managed Disks

Additional Practice - Virtual Machine Storage


Take some time to practice what you have learned in this section.
●● Attach a data disk to a Windows VM using PowerShell1. You can attach both new and existing
disks to a Windows virtual machine using PowerShell. Remember the size of the virtual machine
controls how many data disks you can attach.
●● Detach a data disk from a Windows virtual machine2. When you no longer need a data disk that's
attached to a virtual machine, you can easily detach it. This removes the disk from the virtual machine
but doesn't remove it from storage.
●● Convert Azure managed disks storage from standard to premium, and vice versa3. Managed
disks offer two storage options: Premium (SSD-based) and Standard (HDD-based). This allows you to
easily switch between the two options with minimal downtime based on your performance needs. This
capability is not available for unmanaged disks.
●● Convert a Windows virtual machine from unmanaged disks to managed disks4. If you have
existing Windows virtual machines (VMs) that use unmanaged disks, you can convert the VMs to use
managed disks through the Azure Managed Disks service. This process converts both the OS disk and
any attached data disks.
✔️ Can you see the advantage of completing tasks with PowerShell?
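For the first practice item above, a minimal sketch of attaching an existing managed data disk to a VM (the resource group, disk, and VM names are placeholders) looks roughly like this:

# Look up the managed disk and the virtual machine
$disk = Get-AzureRmDisk -ResourceGroupName "MyResourceGroup" -DiskName "myDataDisk"
$vm = Get-AzureRmVM -ResourceGroupName "MyResourceGroup" -Name "myVM"

# Attach the disk at LUN 0 and push the updated configuration to Azure
$vm = Add-AzureRmVMDataDisk -VM $vm -Name "myDataDisk" -CreateOption Attach -ManagedDiskId $disk.Id -Lun 0
Update-AzureRmVM -ResourceGroupName "MyResourceGroup" -VM $vm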
For more information, you can see:
Azure Virtual Machine PowerShell samples - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/powershell-samples

1 https://docs.microsoft.com/en-us/azure/virtual-machines/windows/attach-disk-ps
2 https://docs.microsoft.com/en-us/azure/virtual-machines/windows/detach-disk
3 https://docs.microsoft.com/en-us/azure/virtual-machines/windows/convert-disk-storage
4 https://docs.microsoft.com/en-us/azure/virtual-machines/windows/convert-unmanaged-to-managed-disks

Blob Storage
Blob Overview

Azure Blob storage is a service that stores unstructured data in the cloud as objects/blobs. Blob storage
can store any type of text or binary data, such as a document, media file, or application installer. Blob
storage is also referred to as object storage.
Common uses of Blob storage include:
●● Serving images or documents directly to a browser.
●● Storing files for distributed access, such as installation.
●● Streaming video and audio.
●● Storing data for backup and restore, disaster recovery, and archiving.
●● Storing data for analysis by an on-premises or Azure-hosted service.
For more information, you can see:
Azure Blob Storage - https://azure.microsoft.com/en-us/services/storage/blobs/

Blob Containers
A container provides a grouping of a set of blobs. All blobs must be in a container. An account can
contain an unlimited number of containers. A container can store an unlimited number of blobs. You can
create the container in the Azure Portal.

Name: The name may only contain lowercase letters, numbers, and hyphens, and must begin with a letter
or a number. The name must also be between 3 and 63 characters long.

Public access level: Specifies whether data in the container may be accessed publicly. By default, container data is private to the account owner.
●● Use Private to ensure there is no anonymous access to the container and blobs.
●● Use Blob to allow anonymous public read access for blobs only.
●● Use Container to allow anonymous public read and list access to the entire container, including the
blobs.
✔️ You can also create the Blob container with PowerShell using the New-AzureStorageContainer
command.
✔️ Have you thought about how you will organize your containers?
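For reference, a short sketch of the PowerShell approach mentioned above (the account name, key, and container name are placeholders):

# Build a storage context from the account name and key, then create a private container
$context = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<storage-account-key>"
New-AzureStorageContainer -Name "images" -Permission Off -Context $context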
For more information, you can see:
Create a Container - https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-powershell#create-a-container

Uploading Blobs
A blob can be any type and size file. Azure Storage offers three types of blobs: block blobs, page blobs,
and append blobs. You specify the blob type when you create the blob. The default is a block blob.

●● Block blobs are ideal for storing text or binary files, such as documents and media files.
●● Append blobs are like block blobs in that they are made up of blocks, but they are optimized for
append operations, so they are useful for logging scenarios.
●● Page blobs can be up to 8 TB in size and are more efficient for frequent read/write operations. Azure
virtual machines use page blobs as OS and data disks.
✔️ Once the blob has been created, its type cannot be changed.
✔️ You can also upload a local file to blob storage using the PowerShell Set-AzureStorageBlobContent
command.
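A minimal sketch of such an upload (the local file path and container name are placeholders, and $context is the storage context from the previous example):

# Upload a local file as a block blob (the default blob type)
Set-AzureStorageBlobContent -File "C:\temp\image001.jpg" -Container "images" -Blob "image001.jpg" -Context $context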

5 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-powershell

Demonstration: Blob Storage

Additional Practice - Blob Storage


There are many ways to work with storage. In this practice, you will experiment with some of the options
that are available to manage blob storage.
Take a minute to review the Upload, download, and list blobs using the Azure portal6 how-to (using
the Azure portal). If possible give it a try on your subscription. You will learn how to:
●● Create a container to hold block blobs
●● Upload and download block blobs
Then try the Upload, download, and list blobs using Azure PowerShell7 to transfer files between local
disk and Azure Blob storage. You’ll learn how to use PowerShell to:
●● Create a storage account and a container
●● Upload blobs to the container
●● List the blobs in a container
●● Download blobs to your local disk
You can also use the Azure Storage Explorer8 (a tool that you can download for free) to manage the
contents of your storage accounts. If you have time, try the following Quickstart9 where you will use
Azure Storage Explorer to transfer files between a local disk and Azure Blob storage.
For more information, you can see:
Understanding Block Blobs, Append Blobs, and Page Blobs - https://docs.microsoft.com/en-us/rest/api/storageservices/understanding-block-blobs--append-blobs--and-page-blobs

6 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-portal
7 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-powershell
8 https://azure.microsoft.com/en-us/features/storage-explorer/
9 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-storage-explorer
10 https://docs.microsoft.com/en-us/rest/api/storageservices/understanding-block-blobs--append-blobs--and-page-blobs

Azure Files
File storage

File storage11 offers shared storage for applications using the industry standard SMB protocol12.
Microsoft Azure virtual machines and cloud services can share file data across application components via
mounted shares, and on-premises applications can also access file data in the share.
Applications running in Azure virtual machines or cloud services can mount a file storage share to access
file data, just as a desktop application would mount a typical SMB share. Any number of Azure virtual
machines or roles can mount and access the File storage share simultaneously.
Common uses of file storage include:
●● Replace and supplement. Azure Files can be used to completely replace or supplement traditional
on-premises file servers or NAS devices.
●● Access anywhere. Popular operating systems such as Windows, macOS, and Linux can directly mount
Azure File shares wherever they are in the world.
●● Lift and shift. Azure Files makes it easy to “lift and shift” applications to the cloud that expect a file
share to store file application or user data.
●● Azure File Sync. Azure File shares can also be replicated with Azure File Sync to Windows Servers,
either on-premises or in the cloud, for performance and distributed caching of the data where it's
being used.
●● Shared applications. Storing shared application settings, for example in configuration files.
●● Diagnostic data. Storing diagnostic data such as logs, metrics, and crash dumps in a shared location.
●● Tools and utilities. Storing tools and utilities needed for developing or administering Azure virtual
machines or cloud services.
✔️ Which of the usage cases for file shares are you most interested in?
For more information, you can see:
Why Azure files are useful - https://docs.microsoft.com/en-us/azure/storage/files/storage-files-introduction#why-azure-files-is-useful

11 https://docs.microsoft.com/en-us/azure/storage/files/storage-files-introduction
12 https://msdn.microsoft.com/library/windows/desktop/aa365233.aspx
13 https://docs.microsoft.com/en-us/azure/storage/files/storage-files-introduction

Azure Files
Azure Files offers fully managed file shares in the cloud that are accessible via the industry standard
Server Message Block (SMB) protocol. Azure File shares can be mounted concurrently by cloud or
on-premises deployments of Windows, Linux, and macOS. Additionally, Azure File shares can be cached
on Windows Servers with Azure File Sync (next lesson) for fast access near where the data is being used.
Sometimes it is difficult to decide when to use file shares instead of blobs or disks. Take a minute to
review this table that compares the different features.

Feature Description When to use


Azure Files Provides an SMB interface, client You want to “lift and shift” an
libraries, and a REST interface application to the cloud which
(https://docs.microsoft.com/ already uses the native file
en-us/rest/api/storageservices/ system APIs to share data
file-service-rest-api) that allows between it and other applica-
access from anywhere to stored tions running in Azure. You want
files. to store development and
debugging tools that need to be
accessed from many virtual
machines.
Azure Blobs Provides client libraries and a You want your application to
REST interface (https://docs. support streaming and ran-
microsoft.com/en-us/rest/api/ dom-access scenarios.You want
storageservices/blob-service-rest- to be able to access application
api) that allows unstructured data from anywhere.
data to be stored and accessed
at a massive scale in block blobs.
Azure Disks Provides client libraries and a You want to lift and shift applica-
REST interface (https://docs. tions that use native file system
microsoft.com/en-us/rest/api/ APIs to read and write data to
compute/manageddisks/disks/ persistent disks. You want to
disks-rest-api) that allows data to store data that is not required to
be persistently stored and be accessed from outside the
accessed from an attached virtual machine to which the disk
virtual hard disk. is attached.
Other distinguishing features to consider when selecting Azure Files:
●● Azure files are true directory objects. Azure blobs are a flat namespace.
●● Azure files are accessed through file shares. Azure blobs are accessed through a container.
●● Azure files provide shared access across multiple virtual machines. Azure disks are exclusive to a single
virtual machine.
✔️ When selecting which storage feature to use, you should also consider pricing.
Take a minute to view the Azure Storage Overview pricing14 page.
For more information, you can see:

14 https://azure.microsoft.com/en-us/pricing/details/storage/

Comparison: Files and Blobs - https://docs.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks?toc=%2fazure%2fstorage%2ffiles%2ftoc.json#comparison-files-and-blobs15
Comparison: Files and Disks - https://docs.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks?toc=%2fazure%2fstorage%2ffiles%2ftoc.json#comparison-files-and-disks16

Creating File Shares


To access your files, you will need a file share. There are several ways to create a file share.
Creating a file share (Portal)
Before you can create a file share you will need a storage account. Once that is in place, the steps to
create a file share in the Portal are very straightforward.

1. In the Portal access the Storage Account blade.


2. Select Files and then click File Share.
3. Provide the file share Name and the Quota. Quota refers to the total size of files on the share.
4. Ensure the file share was created. Test by adding a directory or uploading a file.

✔️ You can also use PowerShell to create a file share.


# Create a storage context using the storage account name and key
$storageContext = New-AzureStorageContext <storage-account-name> <storage-account-key>
# Create the file share, in this case “logs”
$share = New-AzureStorageShare logs -Context $storageContext

For more information, you can see:


Create a file share in Azure Files - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-create-file-share

15 https://docs.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks?toc=%2fazure%2fstorage%2ffiles%2ftoc.json
16 https://docs.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks?toc=%2fazure%2fstorage%2ffiles%2ftoc.json

Create a file share through PowerShell - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-create-file-share#create-file-share-through-powershell

Naming and Referencing File Shares


A storage account can contain zero or more Azure File shares. A share contains properties, metadata, and
zero or more files or directories. A directory contains properties and zero or more files or directories. A
file is any single entity comprised of binary data, properties, and metadata.
Share Names
The rules for File service share names are more restrictive than what is prescribed by the SMB protocol for
SMB share names, so that the Blob and File services can share similar naming conventions for containers
and shares. The naming restrictions for shares are as follows:
●● A share name must be a valid DNS name.
●● Share names must start with a letter or number, and can contain only letters, numbers, and the dash
(-) character.
●● Every dash (-) character must be immediately preceded and followed by a letter or number; consecu-
tive dashes are not permitted in share names.
●● All letters in a share name must be lowercase.
●● Share names must be from 3 through 63 characters long.
For more information, you can see:
Naming and Referencing Shares, Directories, Files, and Metadata – https://docs.microsoft.com/en-us/
rest/api/storageservices/naming-and-referencing-shares–directories–files–and-metadata17

Mapping File Shares (Windows)


Prerequisites for mapping the file share in Windows
You can connect to your Azure file share with Windows or Windows Server. Here are the prerequisites:
●● Storage Account Name: To mount an Azure File share, you will need the name of the storage
account.
●● Storage Account Key: To mount an Azure File share, you will need the primary (or secondary) storage
key. SAS keys are not currently supported for mounting.
●● Ensure port 445 is open: Azure Files uses the SMB protocol. SMB communicates over TCP port 445 - check that your firewall is not blocking TCP port 445 from the client machine.
Map the file share with File Explorer

17 https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-shares--directories--files--and-metadata

1. In the Portal, access the Connect tab for your file share. Copy the UNC path information, which will take the form \\<storage-account-name>.file.core.windows.net\<share-name>.
2. Open This PC and Map a Network Drive.

3. When prompted select a drive letter and provide the UNC path.
4. When prompted provide your storage credentials.
✔️ You can also connect to the file share from the portal. It provides the necessary command. Be sure to
try it.
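As a rough sketch of what that command looks like (all angle-bracket values below are placeholders), mounting the share from a Windows command prompt typically follows this pattern:
rem Map drive Z: to the Azure file share using the storage account name and key (placeholders)
net use Z: \\<storage-account-name>.file.core.windows.net\<share-name> /user:Azure\<storage-account-name> <storage-account-key>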

For more information, you can see:
Mount the Azure file share with File Explorer - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows#mount-the-azure-file-share-with-file-explorer
Mount the Azure File share with PowerShell - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows#mount-the-azure-file-share-with-powershell18

18 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows

Mount the Azure File share with Command Prompt - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows#mount-the-azure-file-share-with-command-prompt19

Mounting File Shares (Linux)


Azure file shares can be mounted in Linux distributions using the CIFS kernel client. There are two ways to
mount an Azure file share:
●● On-demand with the mount command.
●● On-boot (persistent) by creating an entry in /etc/fstab.
Prerequisites for mounting the file share in Linux
You can connect to your Azure file share with Linux. In addition to the Windows prerequisites, you also
need:
●● Install the cifs-utils package. Consult the documentation to ensure you are running a Linux distribution that supports this package.
●● Understand the SMB client requirements. Azure Files can be mounted either via SMB 2.1 or SMB
3.0. For connections coming from clients on-premises or in other Azure regions, Azure Files will reject
SMB 2.1 (or SMB 3.0 without encryption). If secure transfer required is enabled for a storage account,
Azure Files will only allow connections using SMB 3.0 with encryption.
●● Decide on the directory/file chmod permissions20.
Mount the file share
sudo mount -t cifs //<storage-account-name>.file.core.windows.net/<share-name> <mount-point> -o vers=<smb-version>,username=<storage-account-name>,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino
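For the persistent (on-boot) option mentioned earlier, the share can be listed in /etc/fstab. Here is a minimal sketch, assuming the storage account key has been saved to a credentials file readable only by root (all angle-bracket values are placeholders):
# /etc/fstab entry: mount the Azure file share at boot using SMB 3.0 and a credentials file
//<storage-account-name>.file.core.windows.net/<share-name> <mount-point> cifs vers=3.0,credentials=/etc/smbcredentials/<storage-account-name>.cred,dir_mode=0777,file_mode=0777,serverino 0 0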

For more information, you can see:


Use Azure Files with Linux - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-linux
Create a persistent mount point for the Azure file share with /etc/fstab - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-linux#create-a-persistent-mount-point-for-the-azure-file-share-with-etcfstab21

Secure Transfer Required


The secure transfer option enhances the security of your storage account by only allowing requests to the storage account over a secure connection. For example, when calling REST APIs to access your storage accounts, you must connect using HTTPS. Any requests using HTTP will be rejected when Secure transfer required is enabled.

19 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows
20 https://en.wikipedia.org/wiki/Chmod
21 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-linux

When you are using the Azure Files service, connections without encryption will fail, including scenarios using SMB 2.1, SMB 3.0 without encryption, and some versions of the Linux SMB client.
You can also use tooling to enable this feature. Here is how to use PowerShell and the EnableHttpsTrafficOnly parameter.
Set-AzureRmStorageAccount -Name "{StorageAccountName}" -ResourceGroupName "{ResourceGroupName}" -EnableHttpsTrafficOnly $True

✔️ Because Azure Storage doesn’t support HTTPS for custom domain names, this option is not applied when using a custom domain name.
For more information, you can see:
Require secure transfer in Azure Storage - https://docs.microsoft.com/en-us/azure/storage/common/
storage-require-secure-transfer
FAQ for Azure Files - https://docs.microsoft.com/en-us/azure/storage/files/storage-files-faq

Demonstration: File Shares

Additional Practice - Azure Files


Take a minute to review the create a file share in Azure Files22 how-to (using the Azure portal). If
possible give it a try on your subscription. You will learn how to:
●● Create a file share
●● View a file share
●● Upload a file
●● Manage directories and files
Then try creating another file share using PowerShell23. You’ll learn how to:
●● Create a context for your storage account and key
●● Create the new file share

22 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-create-file-share
23 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-create-file-share

# Create a storage context using the storage account name and key
$storageContext = New-AzureStorageContext <storage-account-name> <storage-account-key>
# Create the file share, in this case “logs”
$share = New-AzureStorageShare logs -Context $storageContext

Next, try mounting the Azure file share you created with File Explorer. Remember to obtain the storage
account key by going to the Connect pane.
You can also try mounting your Azure file share through PowerShell.
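A minimal PowerShell sketch for that mounting step (the drive letter, account name, key, and share name below are all placeholders):
# Build a credential from the storage account name and key, then map drive Z: to the share
$password = ConvertTo-SecureString -String "<storage-account-key>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential -ArgumentList "Azure\<storage-account-name>", $password
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\<storage-account-name>.file.core.windows.net\<share-name>" -Credential $credential -Persist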
Finally, enable Secure Transfer Required for your storage account using PowerShell.
For more information, you can see:
Create a file share in Azure Files - https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-create-file-share
Create a file share through PowerShell - https://docs.microsoft.com/en-us/azure/storage/files/
storage-how-to-create-file-share#create-file-share-through-powershell24
Mount the Azure file share with File Explorer - https://docs.microsoft.com/en-us/azure/storage/files/
storage-how-to-use-files-windows#mount-the-azure-file-share-with-file-explorer25
Mount the Azure File share with PowerShell - https://docs.microsoft.com/en-us/azure/storage/files/
storage-how-to-use-files-windows#mount-the-azure-file-share-with-powershell26
Enable Secure Transfer with PowerShell - https://docs.microsoft.com/en-us/azure/storage/common/
storage-require-secure-transfer#enable-secure-transfer-required-setting-with-powershell

24 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-create-file-share
25 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows
26 https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows

Structured Storage
Video: Structured Storage Overview

Table Storage
The Azure Table storage service stores large amounts of structured data. The service is a NoSQL key-value
data store which accepts authenticated calls from inside and outside the Azure cloud.
Azure tables are ideal for storing structured, non-relational data. Think of tables more as a spreadsheet of
information where there is no linkage or relationship (joins) between the information. So, table storage
supports transactions for entities in the same table and table partition, but not across tables or partitions.

Common uses of the table service include:


●● Storing terabytes of structured data capable of serving web scale applications.
●● Storing datasets that don't require complex joins, foreign keys, or stored procedures and can be
denormalized for fast access.
●● Quickly querying data using a clustered index.
✔️ You can use the Table service to store and query huge sets of structured, non-relational data, and your
tables will scale as demand increases.
✔️ A NoSQL (originally referring to “non SQL” or “non-relational”) database provides a mechanism for
storage and retrieval of data which is modeled in means other than the tabular relations used in relational
databases.
For more information, you can see:
Table storage - https://azure.microsoft.com/en-us/services/storage/tables/

Implementing Table Storage


When implementing Azure tables there are a few things to remember. An entity can have up to 255 properties, including three system properties: PartitionKey, RowKey, and Timestamp.

●● You are responsible for inserting and updating the values of PartitionKey and RowKey. Together the
PartitionKey and RowKey must uniquely identify every entity within a table.
●● The server manages the value of Timestamp, which cannot be modified.
Since Azure Tables are very different from relational databases, design principles revolve around whether
you want to read or write data to the table. Read more at the link below.
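As a quick sketch (using the same classic Azure PowerShell storage cmdlets shown elsewhere in this module; the table name and angle-bracket values are placeholders), you can create a table in a storage account like this:
# Create a storage context using the account name and key (placeholders)
$storageContext = New-AzureStorageContext <storage-account-name> <storage-account-key>
# Create a table named "customers"; entities added later must each supply a PartitionKey and RowKey
$table = New-AzureStorageTable -Name "customers" -Context $storageContext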
✔️ The Table service is not part of the exam objective domain but is included to complete your study of
the Storage service.
For more information, you can see:
Guidelines for table design - https://docs.microsoft.com/en-us/azure/cosmos-db/table-storage-de-
sign-guide#guidelines-for-table-design27

Demonstration - Table Storage

Queue Storage
Azure Queue storage is a service for storing large numbers of messages that can be accessed from
anywhere in the world via authenticated calls using HTTP or HTTPS. A single queue message can be up to
64 KB in size, and a queue can contain millions of messages, up to the total capacity limit of a storage
account.

Common uses of Queue storage include:


●● Creating a backlog of work to process asynchronously.
●● Passing messages from an Azure web role to an Azure worker role.
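As a rough sketch (again using the classic Azure PowerShell storage cmdlets, with placeholder names), you could create a queue and add a message to it:
# Create a storage context using the account name and key (placeholders)
$storageContext = New-AzureStorageContext <storage-account-name> <storage-account-key>
# Create a queue named "orders"
$queue = New-AzureStorageQueue -Name "orders" -Context $storageContext
# Add a message (up to 64 KB) through the underlying .NET CloudQueue object
$message = New-Object -TypeName "Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage" -ArgumentList "Hello, queue"
$queue.CloudQueue.AddMessage($message)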
✔️ The Queue service is not part of the exam objective domain but is included to complete your study of
the Storage service.

27 https://docs.microsoft.com/en-us/azure/cosmos-db/table-storage-design-guide

For more information, you can see:


Queue Storage - https://azure.microsoft.com/en-us/services/storage/queues/

Demonstration - Queue Storage



Module 2 Review Questions


Module 2 Review Questions
Azure Storage Types
You are an administrator for your organization and are looking for alternative solutions for your on-premises infrastructure.
You currently have the following types of data:
●● 15 TB of video and audio files
●● 8 TB of Virtual Machine data
●● 900 GB of SQL Data
●● 1 TB of Word documents
●● 900 GB of log files
What type of storage should you use for each data type, and why?

Click for suggested answer ↓ 


Block Blob storage is best for video, audio, and documents. For virtual machine files, you should use Page
Blob storage. Table storage works for SQL data, and you should use Append Blob storage for log files.

Common uses of Blob storage include:


●● Serving images or documents directly to a browser.
●● Storing files for distributed access, such as installation.
●● Streaming video and audio.
●● Storing data for backup and restore, disaster recovery, and archiving.
●● Storing data for analysis by an on-premises or Azure-hosted service.
A blob can be any type and size file. Azure Storage offers three types of blobs: block blobs, page blobs,
and append blobs. You specify the blob type when you create the blob.
●● Block blobs are ideal for storing text or binary files, such as documents and media files.
●● Append blobs are like block blobs in that they are made up of blocks, but they are optimized for
append operations, so they are useful for logging scenarios.
●● Page blobs can be up to 8 TB in size and are more efficient for frequent read/write operations. Azure
virtual machines use page blobs as OS and data disks.
Blob Relationship
You are the administrator of your company’s Azure infrastructure. All unmanaged data resides in Blob
storage. You need to understand how containers are used, how they relate to blobs, and what the
limitations are in the Azure Portal.

How are the types related?



Click for suggested answer ↓ 


A container provides a grouping of a set of blobs. All blobs must be in a container. An account can
contain an unlimited number of containers. A container can store an unlimited number of blobs. You can
create the container in the Azure Portal.

Azure Files
Your organization has migrated most of the business-critical data and processes to Azure. One of the
applications is using a Linux distribution with attached file shares mounted via SMB 2.1. The organization
is performing a security audit, and the Linux system was listed as needing to be secured. What should
you do?

Click for suggested answer ↓ 


Azure Files can be mounted either via SMB 2.1 or SMB 3.0. For connections coming from clients on-premises or in other Azure regions, Azure Files will reject SMB 2.1 (or SMB 3.0 without encryption). If secure
transfer required is enabled for a storage account, Azure Files will only allow connections using SMB 3.0
with encryption.
Module 3 Securing and Managing Storage

Shared Access Keys


Video - Storage Security Overview

Shared Access Signature


A shared access signature (SAS) provides delegated access to resources in your storage account. With a
SAS, you can grant clients access to resources in your storage account, without sharing your account
keys. SAS is a secure way to share your storage resources without compromising your account keys.

A SAS gives you granular control over the type of access you grant to clients who have the SAS, includ-
ing:
●● An account-level SAS can delegate access to multiple storage services. For example, blob, file, queue,
and table.
●● An interval over which the SAS is valid, including the start time and the expiry time.

●● The permissions granted by the SAS. For example, a SAS for a blob might grant read and write
permissions to that blob, but not delete permissions.
Optionally, you can also:
●● Specify an IP address or range of IP addresses from which Azure Storage will accept the SAS. For example, you might specify a range of IP addresses belonging to your organization.
●● Specify the protocol over which Azure Storage will accept the SAS. You can use this optional parameter to restrict access to clients using HTTPS.
✔️ There are two types of SAS: account and service. The account SAS delegates access to resources in
one or more of the storage services. The service SAS delegates access to a resource in just one of the
storage services: Blob, Queue, Table, or File service.
For more information, you can see:
What is a shared access signature? - https://docs.microsoft.com/en-us/azure/storage/common/
storage-dotnet-shared-access-signature-part-1?toc=%2fazure%2fstorage%2fblobs%2ftoc.
json#what-is-a-shared-access-signature1

Configuring SAS Parameters (Portal)


Configuring a SAS includes allowed services, allowed resource types, allowed permissions, start and
expiry date/times, and allowed IP addresses.

✔️ Take a few minutes to access the Azure Portal. Select a Storage Account and click Shared Access
Signature. Use the Information icon to step through each of the settings to learn more about what that
parameter does.

1 https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1?toc=%2fazure%2fstorage%2f-
blobs%2ftoc.json

URI and SAS Parameters


As you create your SAS in the Azure Portal a URI is created using parameters and tokens. The URI consists
of your Storage Resource URI and the SAS token.

Here is an example URI. Each part is described in the list below.

https://myaccount.blob.core.windows.net/?restype=service&comp=properties&sv=2015-04-05&ss=bf&srt=s&st=2015-04-29T22%3A18%3A26Z&se=2015-04-30T02%3A23%3A26Z&sr=b&sp=rw&sip=168.1.5.60-168.1.5.70&spr=https&sig=F%6GRVAZ5Cdj2Pw4tgU7IlSTkWgn7bUkkAg8P6HESXwmf%4B

●● Resource URI (https://myaccount.blob.core.windows.net/?restype=service&comp=properties). The Blob service endpoint, with parameters for getting service properties (when called with GET) or setting service properties (when called with SET).
●● Storage services version (sv=2015-04-05). For storage services version 2012-02-12 and later, this parameter indicates the version to use.
●● Services (ss=bf). The SAS applies to the Blob and File services.
●● Resource types (srt=s). The SAS applies to service-level operations.
●● Start time (st=2015-04-29T22%3A18%3A26Z). Specified in UTC time. If you want the SAS to be valid immediately, omit the start time.
●● Expiry time (se=2015-04-30T02%3A23%3A26Z). Specified in UTC time.
●● Resource (sr=b). The resource is a blob.
●● Permissions (sp=rw). The permissions grant access to read and write operations.
●● IP range (sip=168.1.5.60-168.1.5.70). The range of IP addresses from which a request will be accepted.
●● Protocol (spr=https). Only requests using HTTPS are permitted.
●● Signature (sig=F%6GRVAZ5Cdj2Pw4tgU7IlSTkWgn7bUkkAg8P6HESXwmf%4B). Used to authenticate access to the blob. The signature is an HMAC computed over a string-to-sign and key using the SHA256 algorithm, and then encoded using Base64 encoding.
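You don't have to build this URI by hand. As a hedged sketch (cmdlet names from the classic Azure.Storage PowerShell module; the account name, key, and times are placeholders), an account-level SAS with similar parameters could be generated like this:
# Create a storage context using the account name and key (placeholders)
$storageContext = New-AzureStorageContext <storage-account-name> <storage-account-key>
# Generate an account SAS for the Blob and File services, read/write, HTTPS only,
# valid from 15 minutes ago (allowing for clock skew) until four hours from now
$sasToken = New-AzureStorageAccountSASToken -Service Blob,File -ResourceType Service,Container,Object `
    -Permission "rw" -Protocol HttpsOnly -IPAddressOrRange "168.1.5.60-168.1.5.70" `
    -StartTime (Get-Date).AddMinutes(-15) -ExpiryTime (Get-Date).AddHours(4) -Context $storageContext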
For more information, you can see:
Shared access signature parameters - https://docs.microsoft.com/en-us/azure/storage/common/
storage-dotnet-shared-access-signature-part-1?toc=%2fazure%2fstorage%2fblobs%2ftoc.
json#shared-access-signature-parameters2
Service SAS URI example - https://docs.microsoft.com/en-us/azure/storage/common/storage-dot-
net-shared-access-signature-part-1#service-sas-uri-example3
Account SAS URI example - https://docs.microsoft.com/en-us/azure/storage/common/storage-dot-
net-shared-access-signature-part-1#account-sas-uri-example4

Demonstration - Storage SAS and Keys

Demonstration - Storage Encryption

Best Practices
When you use shared access signatures in your applications, you need to be aware that if a SAS is leaked,
it can be used by anyone who obtains it. If a SAS provided to a client application expires and the applica-
tion is unable to retrieve a new SAS from your service, then the application's functionality may be
hindered.

2 https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1?toc=%2fazure%2fstorage%2f-
blobs%2ftoc.json
3 https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
4 https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1

To mitigate these risks, here are some best practices. Read more at the reference link:
●● Always use HTTPS to create or distribute a SAS. This will prevent man-in-the-middle attacks from
reading the SAS and potentially compromising your data.
●● Reference stored access policies where possible. Stored access policies give you the option to
revoke permissions without having to regenerate the storage account keys. Set the expiration on
these very far in the future (or infinite) and make sure it's regularly updated to move it farther into the
future.
●● Be careful with SAS start time. If you set the start time for a SAS to now, then due to clock skew
(differences in current time according to different machines), failures may be observed intermittently
for the first few minutes. In general, set the start time to be at least 15 minutes in the past. Or, don't
set it at all, which will make it valid immediately in all cases.
●● Be specific with the resource to be accessed. A security best practice is to provide a user with the
minimum required privileges. If a user only needs read access to a single entity, then grant them read
access to that single entity, and not read/write/delete access to all entities. This also helps lessen the
damage if a SAS is compromised because the SAS has less power in the hands of an attacker.
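As a brief sketch of the stored access policy recommendation above (classic Azure.Storage cmdlets; the container and policy names are placeholders), a SAS can reference a policy so that the policy, not the token itself, controls permissions and expiry:
# Create a storage context using the account name and key (placeholders)
$storageContext = New-AzureStorageContext <storage-account-name> <storage-account-key>
# Define a stored access policy on a container, granting read access for one year
New-AzureStorageContainerStoredAccessPolicy -Container "contracts" -Policy "read-only-policy" -Permission r -ExpiryTime (Get-Date).AddMonths(12) -Context $storageContext
# Issue a SAS token that references the policy; revoking the policy later revokes the SAS
New-AzureStorageContainerSASToken -Name "contracts" -Policy "read-only-policy" -Context $storageContext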
For more information, you can see:
Best practices when using SAS - https://docs.microsoft.com/en-us/azure/storage/common/storage-
dotnet-shared-access-signature-part-1#best-practices-when-using-sas5
Configure Azure Storage Firewalls and Virtual Networks - https://docs.microsoft.com/en-us/azure/
storage/common/storage-network-security

Additional Practice - Shared Access Signatures


In this practice, you can try using a Shared Access Signature (SAS) to connect to and manage storage
accounts. First, if you have a subscription and have downloaded the Azure Storage Explorer, try using
storage keys from an existing storage account in the Azure portal to connect to an Azure storage account
through Storage Explorer. Notice that you have full access to the data in the storage account. For exam-
ple, you can create containers and file shares, create and manipulate queues or tables, and so on. In this
scenario, you have full, unrestricted access to the data until such time as the storage key changes. Now
you will do a similar exercise, this time using Shared Access Signatures.
●● Generate a SAS in the Azure portal. From the storage account, click Shared access signature under Settings and configure the settings for the SAS.
●● Click Generate SAS and connection string.
●● Use Shared Access Signatures6 to connect to your storage account through Storage Explorer.
●● Configure additional Share Access Signatures to work with different resources, for example blobs in a
container. Access those blobs directly using a SAS.
✔️ Don’t forget to review the videos in this lesson as you try some of these activities for yourself.
For more information, see:
Using shared access signatures (SAS) – https://docs.microsoft.com/en-us/azure/storage/common/
storage-dotnet-shared-access-signature-part-1

5 https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
6 https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-storage-explorer

Azure Backup
Azure Backup
Azure Backup is the Azure-based service you can use to back up (or protect) and restore your data in the
Microsoft cloud. Azure Backup replaces your existing on-premises or off-site backup solution with a
cloud-based solution that is reliable, secure, and cost-competitive.
Azure Backup offers multiple components that you download and deploy on the appropriate computer,
server, or in the cloud. The component, or agent, that you deploy depends on what you want to protect.
All Azure Backup components (no matter whether you're protecting data on-premises or in the cloud)
can be used to back up data to a Recovery Services vault in Azure.
Azure Backup delivers these key benefits:
●● Automatic Storage Management. Azure Backup automatically allocates and manages backup
storage, and it uses a pay-as-you-use model.
●● Unlimited scaling. There is no need to worry about high-availability for your data in the cloud.
●● Multiple storage options. Azure Backup offers two types of replication: locally redundant storage
and geo-redundant storage.
●● Unlimited data transfer. Azure Backup does not limit the amount of inbound or outbound data you
transfer.
●● Data encryption. Data encryption allows for secure transmission and storage of your data in the
public cloud.
●● Application consistent backup. An application-consistent backup means a recovery point has all
required data to restore the backup copy.
●● Long-term retention. You can use Recovery Services vaults for short-term and long-term data
retention.
✔️ What are some of the reasons your organization might choose Azure Backup?
For more information, you can see:
Why use Azure Backup? - https://docs.microsoft.com/en-us/azure/backup/backup-introduc-
tion-to-azure-backup#why-use-azure-backup7
Azure Backup Protection Support Matrix - https://docs.microsoft.com/en-us/azure/backup/back-
up-mabs-protection-matrix#protection-support-matrix8

Recovery Services Vault


Recovery Services vault is a storage entity in Azure that houses data. The data is typically copies of data,
or configuration information for virtual machines (VMs), workloads, servers, or workstations. You can use
Recovery Services vaults to hold backup data for various Azure services such as IaaS VMs (Linux or
Windows) and Azure SQL databases. Recovery Services vaults support System Center DPM, Windows
Server, Azure Backup Server, and more. Recovery Services vaults make it easy to organize your backup
data, while minimizing management overhead.
Within an Azure subscription, you can create up to 25 Recovery Services vaults per region.

7 https://docs.microsoft.com/en-us/azure/backup/backup-introduction-to-azure-backup
8 https://docs.microsoft.com/en-us/azure/backup/backup-mabs-protection-matrix

Creation and management of Recovery Services vaults in the Azure portal is easy because the Backup
service is integrated into the Azure Settings menu. This integration means you can create or manage a
Recovery Services vault in the context of the target service.
For example, to view the recovery points for a VM, select it, and click Backup in the Settings menu. The backup information specific to that VM appears. In this example, ContosoVM is the name of the virtual machine and ContosoVM-demovault is the name of the Recovery Services vault. You don't need to remember the name of the Recovery Services vault that stores the recovery points; you can access this information from the virtual machine.
✔️ If multiple servers are protected using the same Recovery Services vault, it may be more logical to
look at the Recovery Services vault. You can search for all Recovery Services vaults in the subscription and
choose one from the list.
✔️ Prior to November 2017 there were Backup vaults. Now all Backup vaults have been upgraded to
Recovery Services vaults.
For more information, you can see:
Managing your Recovery Services vaults in the portal - https://docs.microsoft.com/en-us/azure/
backup/backup-azure-recovery-services-vault-overview

Practice - Configure Azure Backup Reports


You can configure reports for Azure Backup using Recovery Services vault. Supported scenarios include:
1. Azure Backup reports are supported for Azure virtual machine backup and file/folder backup to cloud
using Azure Recovery Services Agent.
2. Reports for Azure SQL, DPM and Azure Backup Server are not supported at this time.
3. You can view reports across vaults and across subscriptions, if the same storage account is configured
for each of the vaults. The storage account selected should be in the same region as the recovery
services vault.
4. The frequency of scheduled refresh for the reports is 24 hours in Power BI. You can also perform an
ad-hoc refresh of the reports in Power BI, in which case latest data in customer storage account is
used for rendering reports.
5. Azure Backup Reports are currently not supported in National clouds.

Use the process in the following article9 to configure the storage account for recovery services vault
using Azure portal.
✔️ After configuring storage account for reports using recovery services vault, it takes around 24 hours
for reporting data to start flowing in. After 24 hours of setting up a storage account, you can use the
process in this article10 to view, customize and create reports in Power BI.
For more information, you can see:
Configure storage account for reports - https://docs.microsoft.com/en-us/azure/backup/backup-az-
ure-configure-reports#configure-storage-account-for-reports11

Security Features for Hybrid Backups


Concerns about security issues, like malware, ransomware, and intrusion, are increasing. These security
issues can be costly, in terms of both money and data. To guard against such attacks, Azure Backup now
provides security features to help protect hybrid backups. These features include:

●● Prevention. An additional layer of authentication is added whenever a critical operation like changing
a passphrase is performed. This validation is to ensure that such operations can be performed only by
users who have valid Azure credentials.
●● Alerting. An email notification is sent to the subscription admin whenever a critical operation like
deleting backup data is performed. This email ensures that the user is notified quickly about such
actions.
●● Recovery. Deleted backup data is retained for an additional 14 days from the date of the deletion.
This ensures recoverability of the data within a given period, so there is no data loss even if an attack
happens. Also, a greater number of minimum recovery points are maintained to guard against corrupt
data.
When these features are enabled, the Security Settings page for the Recovery Services vault will summa-
rize the new capabilities.

9 https://docs.microsoft.com/en-us/azure/backup/backup-azure-configure-reports
10 https://docs.microsoft.com/en-us/azure/backup/backup-azure-configure-reports
11 https://docs.microsoft.com/en-us/azure/backup/backup-azure-configure-reports

✔️ Security features should not be enabled if you are using infrastructure as a service (IaaS) VM backup.
These features are not yet available (as of April 2018) for IaaS VM backup, so enabling them will not have
any impact.
For more information, you can see:
Security features to help protect hybrid backups that use Azure Backup - https://docs.microsoft.com/
en-us/azure/backup/backup-azure-security-feature

Backup Files and Folders


Backing up files and folders to Azure Backup12 is easy and follows a simple process.

1. Create a recovery services vault. To back up your files and folders, you need to create a Recovery
Services vault in the region where you want to store the data. You also need to determine how you
want your storage replicated, either geo-redundant (default) or locally redundant. By default, your
vault has geo-redundant storage. If you are using Azure as a primary backup storage endpoint, use
the default geo-redundant storage. If you are using Azure as a non-primary backup storage endpoint,
then choose locally redundant storage, which will reduce the cost of storing data in Azure.
2. Download files. Download the Backup Agent for Windows Server or Windows Client and the vault
credentials. The vault credentials will be used in the next step to register the backup agent.
3. Install and register the backup agent. Enabling backup through the Azure portal is coming soon.
Currently, you use the Microsoft Azure Recovery Services Agent on-premises to back up your files and
folders.
4. Backup your files and folders. Your initial backup includes two key tasks: schedule the backup and
back up the files and folders for the first time.
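As a rough sketch of the first step above (assuming the AzureRM.RecoveryServices PowerShell module; the vault, resource group, and location names are placeholders), a vault and its storage redundancy can also be configured with PowerShell:
# Create a Recovery Services vault to hold the backup data
New-AzureRmRecoveryServicesVault -Name "ContosoVault" -ResourceGroupName "ContosoRG" -Location "West US"
# Set the vault's storage redundancy: GeoRedundant (default) if Azure is your primary backup endpoint,
# or LocallyRedundant to reduce cost when Azure is a non-primary backup endpoint
$vault = Get-AzureRmRecoveryServicesVault -Name "ContosoVault" -ResourceGroupName "ContosoRG"
Set-AzureRmRecoveryServicesBackupProperties -Vault $vault -BackupStorageRedundancy GeoRedundant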
For more information, you can see:
Questions about the Backup Service - https://docs.microsoft.com/en-us/azure/backup/backup-az-
ure-backup-faq

Restore Files and Folders


Once you have created your backup you can use the Backup Agent to Recover Data13.

12 https://azure.microsoft.com/en-us/documentation/articles/backup-try-azure-backup-in-10-mins/
13 https://azure.microsoft.com/en-us/documentation/articles/backup-azure-restore-windows-server/

When you are restoring you can:


●● Select Recovery Mode: Identify the server where the backup was originally created.
●● Select Volume and Date: You can restore from any point in time. First, select the date, and then select
the time.
●● Select Items to Recover: Select the files and folders you wish to restore.
●● Specify Recovery Options: You can restore to the original location or to another location in the same
machine. If the file/folder you wish to restore exists in the target location, you can create copies (two
versions of the same file), overwrite the files in the target location, or skip the recovery of the files
which exist in the target. It is highly recommended that you leave the default option of restoring the
ACLs on the files which are being recovered.

Video: Backup and Recovery

Demonstration: Backup Files and Folders



Additional Practice - Backup Azure File Shares


Take a minute to review the how to back up Azure file shares14 how-to. If possible give it a try on your
subscription. You will learn how to:
●● Configure a Recovery Services vault to back up Azure Files
●● Run an on-demand backup job to create a restore point
●● Restore a file or files from a restore point
●● Manage Backup jobs
●● Stop protection on Azure Files
●● Delete your backup data
For more information, you can see:
Backup Azure Files - https://docs.microsoft.com/en-us/azure/backup/backup-azure-files
Questions about backing up Azure Files - https://docs.microsoft.com/en-us/azure/backup/backup-
azure-files-faq
Troubleshoot problems backing up Azure Files - https://docs.microsoft.com/en-us/azure/backup/
troubleshoot-azure-files

14 https://docs.microsoft.com/en-us/azure/backup/backup-azure-files

Azure File Sync


Azure File Sync
Use Azure File Sync to centralize your organization's file shares in Azure Files, while keeping the flexibility,
performance, and compatibility of an on-premises file server. Azure File Sync transforms Windows Server
into a quick cache of your Azure file share. You can use any protocol that's available on Windows Server
to access your data locally, including SMB, NFS, and FTPS. You can have as many caches as you need
across the world.

There are many uses and advantages to file sync.


1. Lift and shift. The ability to move applications that require access between Azure and on-premises systems. Provide write access to the same data across Windows Servers and Azure Files. This lets companies with multiple offices share files across all of their locations.
2. Branch Offices. Branch offices need to back up files, or you need to set up a new server that will connect to Azure storage.
3. Backup and Disaster Recovery. Once File Sync is implemented, Azure Backup will back up your
on-premises data. Also, you can restore file metadata immediately and recall data as needed for rapid
disaster recovery.
4. File Archiving. Only recently accessed data is located on local servers. Infrequently used data moves to Azure in what is called Cloud Tiering15. Tiered files have greyed icons with an offline (O) file attribute to let the user know the file is only in Azure.
✔️ At the time of this writing Azure File Sync is in preview. Be sure to consult the documentation for any
new updates.
For more information, you can see:
Planning for an Azure File Sync deployment - https://docs.microsoft.com/en-us/azure/storage/files/
storage-sync-files-planning
Release notes for the Azure File Sync agent - https://docs.microsoft.com/en-us/azure/storage/files/
storage-files-release-notes

15 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-planning

Video: File Sync Overview

File Sync Service Deployment (Initial Steps)


There are a few things that need to be configured before you synchronize your files.

1. Deploy the Storage Sync Service16. The Storage Sync Service is the top-level Azure resource for
Azure File Sync. A distinct top-level resource from the storage account resource is required because
the Storage Sync Service can create sync relationships with multiple storage accounts via multiple sync
groups. A subscription can have multiple Storage Sync Service resources deployed.
2. Prepare Windows Server to use with Azure File Sync17. For each server that you intend to use with
Azure File Sync, including server nodes in a Failover Cluster, you will need to configure the server.
Preparation steps include temporarily disabling Internet Explorer Enhanced Security and ensuring you
have latest PowerShell version.
3. Install the Azure File Sync Agent18. The Azure File Sync agent is a downloadable package that
enables Windows Server to be synced with an Azure file share. The Azure File Sync agent installation
package should install relatively quickly. We recommend that you keep the default installation path
and that you enable Microsoft Update to keep Azure File Sync up to date.
4. Register Windows Server with Storage Sync Service19. When the Azure File Sync agent installation
is finished, the Server Registration UI automatically opens. Registering Windows Server with a Storage
Sync Service establishes a trust relationship between your server (or cluster) and the Storage Sync
Service. Registration requires your Subscription ID, Resource Group, and Storage Sync Service (created
in step one). A server (or cluster) can be registered with only one Storage Sync Service at a time.

16 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide
17 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide
18 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide?tabs=portal#install-the-azure-file-sync-agent
19 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide

✔️ Do you see in the last step why you may need multiple storage sync services? Continue to the next
topic for an explanation of how files are synchronized.
For more information, you can see:
Deploy Azure File Sync - https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-
deployment-guide#deploy-the-storage-sync-service20
FAQ for Azure File Sync - https://docs.microsoft.com/en-us/azure/storage/files/storage-files-
faq#azure-file-sync21

File Sync Service Deployment (Synchronization)

A sync group defines the sync topology for a set of files. Endpoints within a sync group are kept in sync
with each other. A sync group must contain at least one cloud endpoint, which represents an Azure file
share created in your storage account, and at least one server endpoint, which represents a path on a
Windows Server.
Server endpoints

20 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide
21 https://docs.microsoft.com/en-us/azure/storage/files/storage-files-faq

You should be able to create a file share in your Azure storage account, but the Server endpoint is
probably new to you. So, let’s take a look at the configuration information.

●● Registered server. The name of the server or cluster where you want to create the server endpoint.
●● Path. The Windows Server path to be synced as part of the sync group. The path should not be the
root volume.
●● Cloud Tiering. A switch to enable or disable cloud tiering.
●● Volume Free Space. The amount of free space to reserve on the volume on which the server endpoint
is located. For example, if volume free space is set to 50% on a volume that has a single server
endpoint, roughly half the amount of data is tiered to Azure Files.
✔️ Regardless of whether cloud tiering is enabled, your Azure file share always has a complete copy of
the data in the sync group.
✔️ Azure File Sync moves file data and metadata exclusively over HTTPS and requires port 443 to be
open outbound. Based on policies in your datacenter, branch or region, further restricting traffic over
port 443 to specific domains may be desired or required.
✔️ There is a lot to consider when synchronizing large amounts of files. For example, you may want to
copy the server files to the Azure file share before you configure file sync. Take a few minutes to read the
reference link and decide upon a strategy.
For more information, you can see:
Onboarding with Azure File Sync - https://docs.microsoft.com/en-us/azure/storage/files/storage-
sync-files-deployment-guide#onboarding-with-azure-file-sync22

Demonstration: Azure File Sync


Demonstration Azure File Sync
In this demonstration, Corey walks through the previous deployment topics. Notice that he proceeds through the steps in a slightly different order than what was presented in the course materials. That is perfectly fine.

22 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide

Troubleshooting Azure File Sync


Azure File Sync can be used in file share scenarios ranging from the very simple to the most complex. It’s
important to be able to troubleshoot and resolve issues that you might encounter with your Azure File
Sync deployment.
The first step to troubleshooting Azure File Sync is to try to determine what area is having problems. Each
link below provides troubleshooting steps and information for typical scenarios that can occur with an
Azure File Sync deployment.
●● Agent installation and server registration23 During server registration, some PowerShell commands
may fail or are not recognized because they are not supported in the current version of the Sync
agent. Steps are given to remediate the issue. The next version of the Sync agent will be fixed to
support the latest version of PowerShell. Additionally, installation agent failures can occur for various
reasons, including versioning issues with both the agent and the OS. The article details workarounds
for these issues.
●● Sync group management24 Within sync group management, failures with cloud and server endpoint
creation can occur. Remediation options are provided.
●● File synchronization25 Issues with file synchronization range from being able to automatically detect
exactly when a sync has occurred (Azure Files does not currently support notifications or journaling)
to sync errors on the server and lack of free space.
●● Cloud tiering26 There are two paths for failures in cloud tiering:
●● Files can fail to tier, which means that Azure File Sync unsuccessfully attempts to tier a file to Azure
Files.
●● Files can fail to recall, which means that the Azure File Sync file system filter (StorageSync.sys) fails
to download data when a user attempts to access a file which has been tiered.
●● Troubleshooting information is provided including how to determine if an issue is a cloud storage
issue or a server issue.
For more information, you can see:
Troubleshoot Azure File Sync - https://docs.microsoft.com/en-us/azure/storage/files/stor-
age-sync-files-troubleshoot
Azure Support Forum - https://social.msdn.microsoft.com/Forums/en-US/home?forum=windowsa-
zuredata
How can we improve Azure storage? - https://feedback.azure.com/forums/217298-storage/catego-
ry/180670-files

23 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-troubleshoot
24 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-troubleshoot
25 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-troubleshoot
26 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-troubleshoot

Video: Azure Friday Hybrid Storage with Azure File Sync

Azure Friday Hybrid Storage with Azure File Sync
This video reviews the File Sync process and provides another demonstration. Review the video to ensure
you have learned the main points of this lesson.

Additional Practice - Configure and Deploy Azure File Sync
If possible, use your subscription to set up an Azure File Sync deployment. Both videos in this lesson
provide simple examples of how to configure this, and it’s a good idea to watch them a couple of times.
Additionally, you can refer to the documentation.
1. Create a storage account
2. Deploy the Storage Sync Service27 Create an Azure File Sync service and put it in the same resource
group as your storage account.
3. Then create a Sync Group28 where you specify an Azure file share to sync with. This is the sync
group’s first cloud endpoint.
4. Create a file share in the storage account. (Refer to the previous lesson if you need to refresh your
memory on how to do this.)
5. Register your on-premises or local server29 with the storage sync service. Make sure that the Azure
PowerShell cmdlets are installed on the machine.
6. Configure the synchronization which involves deploying the File Sync agent30.
7. Add a server endpoint31. A Server Endpoint integrates a subfolder of a volume from a Registered
Server as a location to sync. You can now test the service by copying, adding and deleting some files.
Your files are now kept in sync across your Azure file share and Windows Server.
✔️ Remember that your storage account must be located in one of the regions in which Azure File Sync is
supported.

27 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide
28 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide?tabs=portal
29 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide?tabs=portal
30 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide?tabs=portal
31 https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide?tabs=portal

Module 3 Review Questions


Module 3 Review Questions
Shared Access Signature
Your organization provides access to legal documentation for law firms to do research. All files must be handled securely, and it is critical that information is only accessible to approved individuals. All case contents have been stored on-premises, but will now be transferred to Azure Storage for improved availability. You implement a Shared Access Signature solution.

What benefits do you gain from SAS? How can you increase security even further?

Click for suggested answer ↓ 


A shared access signature (SAS) provides delegated access to resources in your storage account. With a
SAS, you can grant clients access to resources in your storage account, without sharing your account
keys. SAS is a secure way to share your storage resources without compromising your account keys.

Optionally, you can also:
●● Specify an IP address or range of IP addresses from which Azure Storage will accept the SAS. For example, you might specify a range of IP addresses belonging to your organization.
●● Specify the protocol over which Azure Storage will accept the SAS. You can use this optional parameter to restrict access to clients using HTTPS.
SAS URI
You are granting access to an Azure Storage container using a URI. Each part of the URI specifies the
context of which a user can access content. What are the different parameters of the URI?

Click for suggested answer ↓ 


The different parameters in the SAS URI include: Resource URI, Storage services version (sv), Services (ss),
Resource types (srt), Start time (st), Expiry time (se), resource (sr), permissions (sp), IP range (sip), protocol
(spr), and signature (sig).

Azure Backup
Your organization is using an old tape backup solution to back up critical data.

What are some of the benefits to implementing a cloud solution for backup? What are some of the
applications of the backup technology?

Click for suggested answer ↓ 


●● Automatic Storage Management. Azure Backup automatically allocates and manages backup
storage, and it uses a pay-as-you-use model.
●● Unlimited scaling. There is no need to worry about high-availability for your data in the cloud.

●● Multiple storage options. Azure Backup offers two types of replication: locally redundant storage
and geo-redundant storage.
●● Unlimited data transfer. Azure Backup does not limit the amount of inbound or outbound data you
transfer.
●● Data encryption. Data encryption allows for secure transmission and storage of your data in the
public cloud.
●● Application consistent backup. An application-consistent backup means a recovery point has all
required data to restore the backup copy.
●● Long-term retention. You can use Recovery Services vaults for short-term and long-term data
retention.
Other benefits could be quick restore of on-premises data in the case of a malware, ransomware or
intrusion attack.

Azure File Sync could allow sharing of production data between multiple offices, and access to business data for roaming users without needing to access the company’s internal network.
Module 4 Storing and Accessing Data

Azure Content Delivery Network


CDN Benefits
A content delivery network (CDN) is a distributed network of servers that can efficiently deliver content to
users. CDNs store cached content on edge servers that are close to end-users.
CDNs are typically used to deliver static content such as images, style sheets, documents, client-side
scripts, and HTML pages. The main benefits of using a CDN are:
●● Lower latency and faster delivery of content to users, regardless of their geographical location in
relation to the datacenter where the application is hosted.
●● Help to reduce load on a server or application, because it does not have to service requests for the
content that is hosted in the CDN.

Typical uses for a CDN include:


●● Delivering static resources for client applications, often from a website.
●● Delivering public static and shared content to devices such as cell phones and tablet computers.
●● Serving entire websites that consist of only public static content to clients, without requiring any
dedicated compute resources.

●● Streaming video files to the client on demand.


●● Generally improving the experience for users, especially those located far from the datacenter hosting
the application.
●● Supporting IoT (Internet of Things) solutions, such as distributing firmware updates.
●● Coping with peaks and surges in demand without requiring the application to scale, avoiding the
consequent increased running costs.
✔️ CDN provides a faster, more responsive user experience. Do you think your organization would be
interested in this feature?
✔️ Use the following link to review some of the challenges with deploying CDN including security,
deployment, versioning, and testing.
For more information, you can see:
Best practices for using content delivery networks (CDNs) - https://docs.microsoft.com/en-us/azure/architecture/best-practices/cdn?toc=%2Fazure%2Fcdn%2Ftoc.json

How CDN Works


You can enable Azure Content Delivery Network (CDN) to cache content for the user. The Azure CDN is
designed to send audio, video, images, and other files faster and more reliably to customers using servers
that are closest to the users. This dramatically increases speed and availability, resulting in significant user
experience improvements.

1. A user (Alice) requests a file (also called an asset) using a URL with a special domain name, such as
endpointname.azureedge.net. DNS routes the request to the best performing Point-of-Presence (POP)
location, which is usually the POP that is geographically closest to the user.
2. If the edge servers in the POP do not have the file in their cache, the edge server requests the file
from the origin. The origin can be an Azure Web App, Azure Cloud Service, Azure Storage account, or
any publicly accessible web server.
3. The origin returns the file to the edge server, including optional HTTP headers describing the file's
Time-to-Live (TTL).
4. The edge server caches the file and returns the file to the original requestor (Alice). The file remains
cached on the edge server until the TTL expires. Azure CDN automatically applies a default TTL of
seven days unless you've set up caching rules in the Azure portal.

5. Additional users may then request the same file using that same URL and may also be directed to that
same POP.
6. If the TTL for the file hasn't expired, the edge server returns the file from the cache.
✔️ After you enable CDN access to a storage account, all publicly available objects are eligible for CDN
edge caching. If you modify an object that's currently cached in the CDN, the updated content will not be
available via CDN until CDN refreshes its content after the time-to-live period for the cached content
expires.
For more information, you can see:
Overview of the Azure Content Delivery Network - https://docs.microsoft.com/en-us/azure/cdn/cdn-overview
Azure CDN POP locations by region - https://docs.microsoft.com/en-us/azure/cdn/cdn-pop-locations

CDN Profiles
A CDN profile is a collection of CDN endpoints with the same pricing tier and provider (origin). You may
create multiple profiles to organize endpoints. For example, you could have profiles with endpoints to
different internet domains, web applications, or storage accounts. You can create up to 8 CDN profiles
per subscription.

You can create a CDN profile from the Azure portal.

The CDN service is global and not bound to a location, however you must specify a resource group
location where the metadata associated with the CDN profile will reside. This location will not have any
impact on the runtime availability of your profile.

Several pricing tiers are available. At the time of this writing, there are three tiers: Premium Verizon,
Standard Verizon, and Standard Akamai. Pricing is based on TBs of outbound data transfers. Be sure to
read about the pricing models in the link at the end of this topic.
Notice you can create your first profile endpoint directly from this blade (last checkbox).
✔️ Can you think of different scenarios that would require different CDN profiles?
For more information, you can see:
CDN Pricing - https://azure.microsoft.com/en-us/pricing/details/cdn/

CDN Endpoints
When you create a new CDN endpoint directly from the CDN profile blade you are prompted for CDN
endpoint name, Origin type, and Origin hostname. To access cached content on the CDN, use the CDN
URL provided in the portal. In this case,
ASHStorage.azureedge.net/<myPublicContainer>/<BlobName>

There are four choices for Origin type: Storage, Cloud Service, Web App, and Custom origin. In this course
we are focusing on storage CDNs.
When you select Storage as the Origin type, the new CDN endpoint uses the host name of your storage
account as the origin server.
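In addition to the portal, a profile and a storage-origin endpoint can be created from the command line. The following Azure CLI sketch is only an assumption-based example; the resource group, profile, endpoint, and storage account names are placeholders:
    # Create a profile in the Standard Verizon tier, then an endpoint whose origin is the storage account
    az cdn profile create --resource-group myRG --name myCdnProfile --sku Standard_Verizon
    az cdn endpoint create --resource-group myRG --profile-name myCdnProfile \
        --name ashstorage --origin mystorageacct123.blob.core.windows.net
Cached blobs would then be reachable at https://ashstorage.azureedge.net/<myPublicContainer>/<BlobName>, matching the URL format shown above.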
There are additional CDN features for your delivery, such as compression, query string, and geo filtering.
You can also add custom domain mapping to your CDN endpoint and enable custom domain HTTPS.
These options are configured in the Settings blade for the endpoint.
✔️ Because it takes time for the registration to propagate, the endpoint isn't immediately available for
use. For Azure CDN from Akamai profiles, propagation usually completes within one minute. For Azure
CDN from Verizon profiles, propagation usually completes within 90 minutes, but in some cases can take
longer. Be sure to review the Troubleshooting pages reference link.
For more information, you can see:

Create a new CDN endpoint - https://docs.microsoft.com/en-us/azure/cdn/cdn-create-new-endpoint#create-a-new-cdn-endpoint1
Troubleshooting CDN endpoints returning 404 statuses - https://docs.microsoft.com/en-us/azure/cdn/cdn-troubleshoot-endpoint

CDN Time-to-Live
Any publicly accessible blob content can be cached in Azure CDN until its time-to-live (TTL) elapses. The
TTL is determined by Cache-directive headers in the HTTP response from the origin server. If the
Cache-Control header does not provide the TTL information or if you prefer, you can configure caching
rules to set the Cache Expiration Duration.
●● Global caching rules. You can set the Cache Expiration Duration for each endpoint in your profile,
which affects all requests to the endpoint. TTL is configured as days, hours, minutes, and seconds.

●● Custom caching rules. You can also create custom caching rules for each endpoint in your profile.
Custom caching rules match specific paths and file extensions, are processed in order, and override
the global caching rule.
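When the origin is a storage account, one way to influence the TTL is to set the Cache-Control property on the blobs themselves. The following Azure CLI line is a hedged sketch; the account, container, and blob names are placeholders, and the exact parameter name (--content-cache-control) may differ between CLI versions:
    # Ask downstream caches, including the CDN, to keep this blob for 24 hours (86400 seconds)
    az storage blob update --account-name mystorageacct123 \
        --container-name myPublicContainer --name image01.png \
        --content-cache-control "public, max-age=86400"
If no Cache-Control value is present, the global or custom caching rules described above (or the seven-day default) apply instead.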

For more information, you can see:


Cache-directive Headers - https://docs.microsoft.com/en-us/azure/cdn/cdn-how-caching-works#cache-directive-headers2
Control Azure CDN caching behavior with caching rules - https://docs.microsoft.com/en-us/azure/cdn/cdn-caching-rules

CDN Compression
File compression is a simple and effective method to improve file transfer speed and increase page-load
performance by reducing a file's size before it is sent from the server. File compression can reduce
bandwidth costs and provide a more responsive experience for your users.

1 https://docs.microsoft.com/en-us/azure/cdn/cdn-create-new-endpoint
2 https://docs.microsoft.com/en-us/azure/cdn/cdn-how-caching-works

There are two ways to enable file compression:


●● Enable compression on your origin server. In this case, the CDN passes along the compressed files
and delivers them to clients that request them.
●● Enable compression directly on the CDN edge servers. In this case, the CDN compresses the files and
serves them to end users.
Enabling compression in the standard tiers
In the Azure portal, you can enable Compression and modify the MIME types list to tune which content
formats to compress.

✔️ Although it is possible, it is not recommended to apply compression to compressed formats. For example, ZIP, MP3, MP4, or JPG.
For more information, you can see:
Premium tier compression - https://docs.microsoft.com/en-us/azure/cdn/cdn-improve-performance#premium-tier3
Compression behavior tables - https://docs.microsoft.com/en-us/azure/cdn/cdn-improve-performance#compression-behavior-tables4
Troubleshooting CDN file compression - https://docs.microsoft.com/en-us/azure/cdn/cdn-troubleshoot-compression

Video: Using Azure CDN Features in the Azure Portal
This is an older video that mentions the Classic portal, which is no longer in use. The portal GUI may also have changed, but it is still a good video for configuring the basics of CDN.

3 https://docs.microsoft.com/en-us/azure/cdn/cdn-improve-performance
4 https://docs.microsoft.com/en-us/azure/cdn/cdn-improve-performance

Video: Optimize Your Content Delivery with Azure CDN

Additional Practice - Optimize Your Content Delivery with Azure CDN
After you watch the videos, take a few minutes to try out the Azure Content Delivery Network (CDN). In
this Quickstart5, you enable Azure Content Delivery Network (CDN) by creating a new CDN profile and
CDN endpoint. You’ll need to first create a storage account named mystorageacct123, which you will use
for the origin hostname. You’ll perform the following tasks:
●● Create a new CDN profile6.
●● Create a new CDN endpoint7.
Set Azure CDN caching rules
In this tutorial8, try creating some caching rules to set or modify the default cache expiration behavior of
your CDN profile, such as a URL path and file extension. You’ll perform the following tasks:
●● Access the Azure CDN caching rules page9.
●● Set global caching rules10.
●● Set custom caching rules11.
For more information, you can see:
Troubleshooting CDN endpoints returning 404 statuses - https://docs.microsoft.com/en-us/azure/cdn/cdn-troubleshoot-endpoint
How caching works – https://docs.microsoft.com/en-us/azure/cdn/cdn-how-caching-works
Control Azure CDN caching behavior – https://docs.microsoft.com/en-us/azure/cdn/cdn-caching-rules

5 https://docs.microsoft.com/en-us/azure/cdn/cdn-create-new-endpoint
6 https://docs.microsoft.com/en-us/azure/cdn/cdn-create-new-endpoint
7 https://docs.microsoft.com/en-us/azure/cdn/cdn-create-new-endpoint
8 https://docs.microsoft.com/en-us/azure/cdn/cdn-caching-rules-tutorial
9 https://docs.microsoft.com/en-us/azure/cdn/cdn-caching-rules-tutorial
10 https://docs.microsoft.com/en-us/azure/cdn/cdn-caching-rules-tutorial
11 https://docs.microsoft.com/en-us/azure/cdn/cdn-caching-rules-tutorial

Import and Export Service


Import and Export Service
When it comes to transferring very large amounts of data to or from the cloud you will want to consider
using the Azure Import/Export service. The Azure Import/Export Service allows you to:
●● Import. Securely transfer large amounts of data to Azure Blob storage (block and page blobs) and
Azure Files by shipping disk drives to an Azure data center. In this case, you will be shipping hard
drives containing your data.
●● Export. Transfer data from Azure storage to hard disk drives and ship to your on-premises sites.
Currently, you can only export Block blobs, Page blobs or Append blobs from Azure storage using
this service. Exporting Azure Files is not currently supported. In this case, you will be shipping empty
hard drives.
Consider using Azure Import/Export service when uploading or downloading data over the network is too
slow or getting additional network bandwidth is cost-prohibitive. Scenarios where this would be useful
include:
●● Migrating data to the cloud. Move large amounts of data to Azure quickly and cost effectively.
●● Content distribution. Quickly send data to your customer sites.
●● Backup. Take backups of your on-premises data to store in Azure blob storage.
●● Data recovery. Recover large amount of data stored in blob storage and have it delivered to your
on-premises location.
✔️ Only 2.5” SSD or 2.5" or 3.5" SATA II or III internal HDD are supported for use with the Import/Export
service.
For more information, you can see:
Import/Export Prerequisites - https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service#prerequisites12
Azure Import and Export Service - https://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/
Import/Export Pricing - https://azure.microsoft.com/en-us/pricing/details/storage-import-export/

12 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service

Import Jobs

At a high level, an import job involves the following steps:


●● Determine the data to be imported, and the number of drives you need.
●● Identify the destination blob or file location for your data in Azure storage.
●● Use the WAImportExport Tool to copy your data to one or more hard disk drives and encrypt them
with BitLocker.
●● Create an import job in your target storage account using the Azure portal or the Import/Export REST
API. If using the Azure portal, upload the drive journal files.
●● Provide the return address and carrier account number to be used for shipping the drives back to you.
●● Ship the hard disk drives to the shipping address provided during job creation.
●● Update the delivery tracking number in the import job details and submit the import job.
●● Drives are received and processed at the Azure data center.
●● Drives are shipped using your carrier account to the return address provided in the import job.
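As a rough illustration of the drive-preparation step, a single WAImportExport copy session might be started as shown below. The journal, CSV, and key values are placeholder assumptions, and the switches reflect one version of the tool, so verify them against the current documentation before use:
    WAImportExport.exe PrepImport /j:FirstDrive.jrn /id:session#1 /sk:<storage-account-key> /InitialDriveSet:driveset-1.csv /DataSet:dataset-1.csv /logdir:C:\WAImportExport\logs
The journal (.jrn) file produced by this step is what you upload when you create the import job in the Azure portal.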
For more information, you can see:
Inside an import job - https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service#inside-an-import-job13

13 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service

Export Jobs

At a high level, an export job involves the following steps:


●● Determine the data to be exported and the number of drives you need.
●● Identify the source blobs or container paths of your data in Blob storage.
●● Create an export job in your source storage account using the Azure portal or the Import/Export REST
API.
●● Specify the source blobs or container paths of your data in the export job.
●● Provide the return address and carrier account number to be used for shipping the drives back to you.
●● Ship the hard disk drives to the shipping address provided during job creation.
●● Update the delivery tracking number in the export job details and submit the export job.
●● The drives are received and processed at the Azure data center.
●● The drives are encrypted with BitLocker; the keys are available via the Azure portal.
●● The drives are shipped using your carrier account to the return address provided in the export job.
For more information, you can see:
Inside an export job - https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service#inside-an-export-job14

14 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service

Additional Practice - Azure Import Export Tool


The Microsoft Azure Import/Export Tool (WAImportExport.exe) is the drive preparation and repair tool
that you can use with the Microsoft Azure Import/Export service. You can use the tool for the following
functions:
●● Before creating an import job, you can use this tool to copy data to the hard drives you are going to
ship to an Azure data center.
●● After an import job has completed, you can use this tool to repair any blobs that were corrupted, were
missing, or conflicted with other blobs.
●● After you receive the drives from a completed export job, you can use this tool to repair any files that
were corrupted or missing on the drives.
✔️ This practice involves a fairly lengthy process, and in a real-world use case you would package an import job onto the appropriate media in order to ship it to an Azure data center. Even if you don’t attempt this practice, it’s worth reading through the documentation.
●● Install and set up the Azure Import/Export Tool15.
●● Prepare your hard drives for a job where you import data from your drives to Azure Blob
Storage16.
●● Review the status of a job with Copy Log Files17.
●● Repair an import job18.
●● Repair an export job19.
●● Troubleshoot the Azure Import/Export Tool20.
✔️ If you decide to try this practice, you’ll need one or more empty 2.5-inch or 3.5-inch SATA II or III or SSD hard drives connected to the copy machine.
For more information, you can see:
Using the Azure Import/Export Tool - https://docs.microsoft.com/en-us/azure/storage/common/
storage-import-export-tool-how-to?toc=%2fazure%2fstorage%2fblobs%2ftoc.json

AzCopy
Now that you know a little bit about the Import/Export service, now is a good time to review an alternative method for transferring data.
AzCopy is a command-line utility designed for copying data to/from Microsoft Azure Blob, File, and Table
storage, using simple commands designed for optimal performance. You can copy data between a file
system and a storage account, or between storage accounts. The AzCopy Help pages show the basic syntax.

15 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-tool-setup?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
16 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-tool-preparing-hard-drives-import?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
17 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-tool-reviewing-job-status-v1?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
18 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-tool-repairing-an-import-job-v1?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
19 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-tool-repairing-an-export-job-v1?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
20 https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-tool-troubleshooting-v1?toc=%2fazure%2fstorage%2fblobs%2ftoc.json

The examples at the end of the Help pages are also helpful.

✔️ Take a few minutes to download the tool and try a few simple transfer examples.
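For instance, a minimal sketch of an upload with the Windows version of AzCopy referenced below might look like the following; the folder, storage account, container, and key are placeholder assumptions, and newer AzCopy releases (v10) use a different azcopy copy syntax:
    AzCopy /Source:C:\LocalData /Dest:https://mystorageacct123.blob.core.windows.net/backups /DestKey:<storage-account-key> /S
The /S switch copies the folder recursively; /DestSAS:<sas-token> can be used instead of /DestKey if you prefer not to expose the account key.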
For more information, you can see:
Download and install AzCopy on Windows - https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy#download-and-install-azcopy-on-windows21

21 https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy

Data Box
Video – Introducing the Data Box Family

Offline - Data Box Products


Data is being generated at record levels, and moving stored or in-flight data to the cloud can be challenging. Azure Data Box products provide both offline and online solutions for moving your data to the cloud. In this topic we will concentrate on the offline data products.

Offline solutions transfer large amounts of data to Azure where there is limited or no network bandwidth.
●● Data Box. This ruggedized device, with 100 TB of capacity, uses standard NAS protocols and common
copy tools. It features AES 256-bit encryption for safer transit.
●● Data Box Disk. Our 8 TB SSD, with a USB/SATA interface, has 128-bit encryption. Customize it to your
needs—it comes in packs of up to five for a total of 40 TB.
●● Data Box Heavy. As its name implies, this ruggedized, self-contained device is designed to lift 1 PB of
data to the cloud.
For more information, you can see:
Azure Data Box Products - https://azure.microsoft.com/en-us/services/storage/databox/
Azure Data Box - https://docs.microsoft.com/en-us/azure/databox-family/

Ignite 2018 Azure Data Box Family - https://azure.microsoft.com/en-us/resources/videos/ignite-2018-the-new-additions-to-the-azure-data-box-family/

Offline – Use Cases


Data Box Edge uses a physical device supplied by Microsoft to accelerate secure data transfer. The physical device resides on your premises, and you write data to it using the NFS and SMB protocols. Data Box Edge has all the gateway capabilities of Data Box Gateway, and is additionally equipped with AI-enabled edge computing capabilities that help analyze, process, or filter data as it moves to Azure block blob, page blob, or Azure Files.
Use cases
Here are some scenarios where Data Box Edge can be used for data transfer.
●● Preprocess data. Preprocessing can be used to: aggregate data, modify data (for example to remove
Personally Identifiable Information), subset and transfer data for deeper analytics, and analyze and
react to events.
●● Inference Azure Machine Learning. With Data Box Edge, you can run Machine Learning (ML) models
to get quick results that can be acted on before the data is sent to the cloud. The full data set is
transferred to continue to retrain and improve your ML models.
●● Transfer data over network to Azure. Use Data Box Edge to quickly transfer data to Azure to enable
further compute and analytics or for archival purposes.
✔️ Data Box Edge has many of the same benefits as Data Box Gateway.
For more information, you can see:
Data Box Edge - https://docs.microsoft.com/en-us/azure/databox-online/data-box-edge-overview

Offline - Product Selection


Data Box is designed to move large amounts of data to Azure with no impact to the network. When
selecting an offline product consider speed and security.
Speed. Use the estimated speed to determine which box will transfer the data in the time frame you
need. For data sizes < 40 TB, use Data Box Disk and for data sizes > 500 TB, sign up for Data Box Heavy.

Product | Network Interfaces
Data Box Disk | USB 3.0 connection
Data Box | 1 Gbps or 10 Gbps network interfaces
Data Box Heavy | High-performance 40 Gbps network interfaces
Security. All products can only be unlocked with a password provided in the Azure portal. All services are protected by Azure security features. Ensure your selection meets your organization’s security requirements.

Product | Physical security | Encryption
Data Box Disk | The disks are tamper-resistant and support secure update capability. | The data is secured with AES 128-bit encryption.
Data Box | Rugged device casing secured by tamper-resistant screws and tamper-evident stickers. | The data is secured with AES 256-bit encryption.
Data Box Heavy | Rugged device casing secured by tamper-resistant screws and tamper-evident stickers. | The data is secured with AES 256-bit encryption.
✔️ Once your data is uploaded to Azure, the disks on the device are wiped clean, in accordance with
NIST 800-88r1 standards.

Offline - Implementation Workflow


The implementation workflow is the same for Data Box, Data Box Disk, and Data Box Heavy.

1. Order22. Create an order in the Azure portal, provide shipping information, and the destination Azure
storage account for your data. If the device is available, Azure prepares and ships the device with a
shipment tracking ID.
2. Receive, unpack, connect, and unlock23. Once the device is delivered, cable the device for network
and power using the specified cables. Turn on and connect to the device. Configure the device
network and mount shares on the host computer from where you want to copy the data.
3. Copy and validate the data24. Copy data to Data Box shares.
4. Return, upload, verify25. Prepare, turn off, and ship the device back to the Azure datacenter. Data is
automatically copied from the device to Azure. The device disks are securely erased as per the
National Institute of Standards and Technology (NIST) guidelines.
✔️ Take a few minutes to review each link. The links are for Data Box; there are similar pages for Data Box Disk and Data Box Heavy.
✔️ Throughout this process, you are notified via email on all status changes.
For more information, you can see:
Quickstart: Deploy Azure Data Box using the Azure portal - https://docs.microsoft.com/en-us/azure/
databox/data-box-quickstart-portal

Online - Data Box Products


Online solutions products act as network storage gateways to manage data between your site and Azure.

22 https://docs.microsoft.com/en-us/azure/databox/data-box-deploy-ordered
23 https://docs.microsoft.com/en-us/azure/databox/data-box-deploy-set-up
24 https://docs.microsoft.com/en-us/azure/databox/data-box-deploy-copy-data
25 https://docs.microsoft.com/en-us/azure/databox/data-box-deploy-picked-up

●● Data Box Gateway. Data Box Gateway transfers data to and from Azure. It’s a virtual appliance.
●● Data Box Edge. This on-premises physical network appliance transfers data to and from Azure.
Analyze, process, and transform your on-premises data before uploading it to the cloud using
AI-enabled edge compute capabilities.
For more information, you can see:
What is Azure Data Box Gateway? - https://docs.microsoft.com/en-us/azure/databox-online/data-box-gateway-overview
What is Azure Data Box Edge? - https://docs.microsoft.com/en-us/azure/databox-online/data-box-edge-overview

Online - Data Box Gateway


Data Box Gateway is a virtual device based on a virtual machine provisioned in your virtualized environment or hypervisor. The virtual device resides on your premises, and you write data to it using the NFS and SMB protocols. The device then transfers your data to Azure block blob, page blob, or Azure Files.
Use cases
Here are some scenarios where Data Box Gateway can be used for data transfer.
●● Cloud archival. Copy hundreds of TBs of data to Azure storage using Data Box Gateway in a secure and efficient manner. The data can be ingested one time or on an ongoing basis for archival scenarios.
●● Data aggregation. Aggregate data from multiple sources into a single location in Azure Storage for
data processing and analytics.
●● Integration with on-premises workloads. Integrate with on-premises workloads such as backup and
restore that use cloud storage and need local access for commonly used files.
Benefits
Data Box Gateway has the following benefits:
●● Easy data transfer. Makes moving data in and out of Azure storage as easy as working with a local network share.
●● High performance. Takes the hassle out of network data transport with high-performance transfers
to and from Azure.
●● Fast access. Caches most recent files for fast access of on-premises files.
●● Limited bandwidth usage. Data can be written to Azure even when the network is throttled to limit
usage during peak business hours.

For more information, you can see:


Data Box Gateway - https://docs.microsoft.com/en-us/azure/databox-online/data-box-gateway-overview

Online – Implementation Workflow

1. Prepare. Create and configure your Data Box Gateway resource prior to provisioning a Data Box
Gateway virtual device26. This includes: checking prerequisites, creating a new Data Box Gateway in
the portal, downloading the virtual device image for Hyper-V or VMware, and obtaining the activation
key. This key is used to activate and connect your Data Box Gateway device with the resource.
2. Provision. For Hyper-V27, provision and connect to a Data Box Gateway virtual device on a host
system running Hyper-V on Windows Server 2016 or Windows Server 2012 R2. For VMware28,
provision and connect to a Data Box Gateway virtual device on a host system running VMware ESXi
6.0 or 6.5. For both hypervisors you will: verify requirements, provision the device, start the device, and
get the IP address.
3. Connect, setup, and activate29. Connect to the local web UI setup page. Provide the device name
and activation key. The Network settings, Web proxy settings, and Time settings are optional.
4. Add, connect to the share30. Your share can be SMB or NFS. There are settings for both in the portal.
Once the share is created you can connect and begin transferring data.
✔️ Be sure to view the documentation for each step.
✔️ The steps for Data Box Edge are different. Use the reference link to walk through those tutorials.
For more information, you can see:
Tutorial: Prepare to deploy Azure Data Box Edge - https://docs.microsoft.com/en-us/azure/databox-online/data-box-edge-deploy-prep

Data Box FAQ


How is Data Box priced?
Each product has a different pricing method31. Data Box, Data Box Disk, and Data Box Heavy provide a
service/processing fee for a fixed amount of time. Data Box Gateway and Data Box Edge are subscription
services.

26 https://docs.microsoft.com/en-us/azure/databox-online/data-box-gateway-deploy-prep
27 https://docs.microsoft.com/en-us/azure/databox-online/data-box-gateway-deploy-provision-hyperv
28 https://docs.microsoft.com/en-us/azure/databox-online/data-box-gateway-deploy-provision-vmware
29 https://docs.microsoft.com/en-us/azure/databox-online/data-box-gateway-deploy-connect-setup-activate
30 https://docs.microsoft.com/en-us/azure/databox-online/data-box-gateway-deploy-add-shares
31 https://azure.microsoft.com/en-us/pricing/details/storage/databox/

Can I use multiple storage accounts with Data Box?


Yes. Data Box supports up to 10 storage accounts. For performance considerations, three accounts are
recommended. All the storage accounts should be in the same Azure region. Azure blobs (block and
page) and Azure Files are supported.
How does the Azure Data Box service help support customers' chain of custody procedures?
The Azure Data Box service natively provides reports that you can use for your chain of custody documentation. The audit and copy logs are available in your storage account in Azure, and you can download the order history32 in the Azure portal after the order is complete.
Are there any tips to speed up the data copy?
●● Use multiple streams of data copy. For instance, with Robocopy, use the multithreaded option.
●● Use multiple sessions.
●● Instead of copying over a network share (where you could be limited by the network speeds) ensure
that you have the data residing locally on the computer to which the Data Box is connected.
●● Benchmark the performance of the computer used to copy the data. Download and use the Bluestop
FIO tool33 to benchmark the performance of the server hardware.
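As a hedged illustration of the first two tips, a multithreaded Robocopy run against a Data Box SMB share might look like the line below; the source path, device address, and share name are placeholder assumptions, and the thread count should be tuned to your hardware:
    robocopy C:\LocalData \\<databox-ip>\<share-name>\<folder> /E /MT:32 /R:2 /W:5
Here /E copies subdirectories (including empty ones), /MT:32 runs 32 copy threads, and /R and /W keep retries and waits short so a problem file does not stall the session.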
For more information, you can see:
Azure Data Box: Frequently Asked Questions - https://docs.microsoft.com/en-us/azure/databox/
data-box-faq

32 https://docs.microsoft.com/en-us/azure/databox/data-box-portal-admin
33 https://bluestop.org/fio/

Module 4 Review Questions


Module 4 Review Questions
Azure Content Delivery Network
You are a trainer for a large multi-national corporation. The training division has decided to record all
training sessions so users can consume audio and video at their own leisure if they are unable to attend.
The internal server solution currently in use is causing long buffering times and distorted audio and video quality. You are looking to deploy a solution in Azure to make content available more readily. You decide to use Azure Content Delivery Network (CDN) as a part of the solution.

What benefits will you realize compared to the internal solution? What additional functionality can the
CDN be used for?

Click for suggested answer ↓ 


A content delivery network (CDN) is a distributed network of servers that can efficiently deliver content to
users. CDNs store cached content on edge servers that are close to end-users.

CDNs are typically used to deliver static content such as images, style sheets, documents, client-side
scripts, and HTML pages. The main benefits of using a CDN are:
●● Lower latency and faster delivery of content to users, regardless of their geographical location in
relation to the datacenter where the application is hosted.
●● Help to reduce load on a server or application, because it does not have to service requests for the
content that is hosted in the CDN.
Typical uses for a CDN include:
●● Delivering static resources for client applications, often from a website.
●● Delivering public static and shared content to devices such as cell phones and tablet computers.
●● Serving entire websites that consist of only public static content to clients, without requiring any
dedicated compute resources.
●● Streaming video files to the client on demand.
●● Generally improving the experience for users, especially those located far from the datacenter hosting
the application.
●● Supporting IoT (Internet of Things) solutions, such as distributing firmware updates.
●● Coping with peaks and surges in demand without requiring the application to scale, avoiding the
consequent increased running costs.
CDN Profiles
Your organization has decided to invest in Azure Content Delivery Network (CDN). You need to decide
which configuration is needed for your internal requirements.

Which configuration options must be considered for Azure CDN?



Click for suggested answer ↓ 


Number of CDN Profiles: A CDN profile is a collection of CDN endpoints with the same pricing tier and
provider (origin). You may create multiple profiles to organize endpoints. You can create up to 8 CDN
profiles per subscription.

Pricing Tier: At the time of this writing, there are three tiers: Premium Verizon, Standard Verizon, and
Standard Akamai. Pricing is based on TBs of outbound data transfers.
Origin Type: There are four choices for Origin type: Storage, Cloud Service, Web App, and Custom origin.
Additional Features: There are additional CDN features for your delivery, such as compression, query
string, and geo filtering.
Azure Import/Export Tools
You are considering the Azure Import/Export feature. What scenarios are appropriate for this feature?
Name two tools that can be used.

Click for suggested answer ↓ 


Scenarios for Import/Export include: Migrating data to the cloud, Content distribution, Backup, and Data
recovery. Two tools are the Microsoft Azure Import/Export Tool (WAImportExport.exe) and AzCopy.

Data Box
You are considering the Azure Data Box family of products. What are the main categories of products? What are the specific products, and how would you decide which product to use?

Click for suggested answer ↓ 


The Azure Data Box family is divided into offline and online products. Offline products include Data Box
Disks, Data Box, and Data Box Heavy. Online products include Data Box Gateway and Data Box Edge. To
decide what product to use you might consider: security, storage capacity, use case, and time frames.
Module 5 Monitoring Storage

Metrics and Alerts


Monitor Metrics
Azure Monitor provides unified user interfaces for monitoring across different Azure services. Azure
Storage integrates with Azure Monitor by sending metric data to the Azure Monitor platform. With metrics on Azure Storage, you can analyze usage trends, trace requests, and diagnose issues with your storage account.
Azure Monitor provides multiple ways to access metrics. You can access them from the Azure portal, the Monitor APIs (REST and .NET), and analysis solutions such as the Operations Management Suite and Event
Hubs. Metrics are enabled by default, and you can access the past 30 days of data. If you need to retain
data for a longer period, you can archive metrics data to an Azure Storage account.
Metrics is a Shared Service where you can specify the resource, sub-service, metric, and aggregation criteria. Additionally, you can chart more than one metric, filter by a metric, and export the results to Excel.

✔️ Take a minute to locate the Monitor service and then Metrics on the Shared Services blade.
For more information, you can see:
Azure Storage metrics in Azure Monitor - https://docs.microsoft.com/en-us/azure/storage/common/
storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
Monitoring Azure applications and resources - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview

Capacity and Transaction Metrics


Let’s look at an example of how the Monitor service can help you review your storage information. When you select a Storage Account resource, your sub-service choices are: Account, Blob, File, Queue, and Table.

Capacity metrics values are sent to Azure Monitor every hour. The values are refreshed daily. Your
sub-service selection determines what Capacity metrics are available. For example, if you choose the Blob
sub-service, then the Capacity metrics are: Blob Capacity, Blob Container Count, and Blob Count.

Transaction metrics are sent from Azure Storage to Azure Monitor every minute. All transaction metrics
are available at both account and service level (Blob storage, Table storage, Azure Files, and Queue
storage).
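As a minimal command-line sketch of the same data, the following Azure CLI call pulls the hourly Blob capacity values that the portal charts; the subscription, resource group, and storage account in the resource ID are placeholder assumptions:
    az monitor metrics list \
        --resource "/subscriptions/<sub-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/mystorageacct123/blobServices/default" \
        --metric BlobCapacity --interval PT1H --output table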

✔️ Take a few minutes to read about the Capacity and Transaction metrics and create a few metric graphs in the Azure portal.
For more information, you can see:
Capacity Metrics - https://docs.microsoft.com/en-us/azure/storage/common/storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#capacity-metrics1
Transaction Metrics - https://docs.microsoft.com/en-us/azure/storage/common/storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#transaction-metrics2

Azure Monitor Alerts


Alerts has a new experience. The older alerts experience is now under the Alerts (Classic) tab.

1 https://docs.microsoft.com/en-us/azure/storage/common/storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
2 https://docs.microsoft.com/en-us/azure/storage/common/storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json

The new Alerts experience has many benefits.


●● Better notification system. All newer alerts use action groups, which are named groups of notifications and actions that can be reused in multiple alerts.
●● A unified authoring experience. All alert creation for metrics, logs and activity log across Azure
Monitor, Log Analytics, and Application Insights is in one place.
●● View Log Analytics alerts in Azure portal. You can now also see Log Analytics alerts in your subscription. Previously these were in a separate portal.
●● Separation of Fired Alerts and Alert Rules. Alert Rules (the definition of condition that triggers an
alert), and Fired Alerts (an instance of the alert rule firing) are differentiated, so the operational and
configuration views are separated.
●● Better workflow. The new alerts authoring experience guides the user along the process of configuring an alert rule, which makes it simpler to discover the right things to get alerted on.
For more information, you can see:
The new alerts experience in Azure Monitor - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-unified-alerts

Alert Rules
In the new Alerts experience, alerts can be authored in a consistent manner regardless of the monitoring service or signal type. All fired alerts and related details are available in a single page.
Authoring an alert is a three-step task where the user first picks a target for the alert, followed by selecting the right signal, and then specifying the logic to be applied to the signal as part of the alert rule.

1. Define alert condition includes:
a. Target selection. For example, storage account.
b. Alert criteria. For example, Used Capacity.
c. Alert logic. For example, over a six-hour period whenever the Used Capacity is over 1000000 bytes.
2. Define alert details includes: Alert rule name, description, and severity. There are five severity levels, Severity 0 to Severity 4.

3. Define action group. Create an action group to notify your team via email and text messages or automate actions using webhooks and runbooks.
✔️ Take a few minutes to create an alert rule and look at the options.
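Pulling the three steps together, a hedged Azure CLI sketch of the Used Capacity example might look like this; the resource ID, action group name, and threshold are placeholder assumptions:
    az monitor metrics alert create --name used-capacity-alert --resource-group myRG \
        --scopes "/subscriptions/<sub-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/mystorageacct123" \
        --condition "avg UsedCapacity > 1000000" \
        --window-size 6h --evaluation-frequency 1h \
        --severity 3 --action myActionGroup \
        --description "Used capacity above 1000000 bytes over a six-hour window"
The --action value refers to an action group such as the one created in the next topic.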

Action Groups
Action groups enable you to configure a list of actions to take when the alert is triggered. Action groups ensure that the same actions are taken each time an alert is triggered. There are several action types you can select when defining the group: Email/SMS3/Push/Voice, Logic App4, Webhook5, IT Service Management6, or Automation Runbook.

Each action type is different in the details that must be provided. Here is a screenshot for the Email and
SMS configuration.
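The equivalent from the command line might look like the following sketch; the group name, email address, and phone number are placeholder assumptions:
    az monitor action-group create --resource-group myRG --name myActionGroup --short-name myag \
        --action email ops-email ops@contoso.com \
        --action sms ops-sms 1 5551234567
Each --action entry adds one receiver of the given type to the group.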

✔️ Take a few minutes to create an action group using the link below.
For more information, you can see:
Create an action group by using the Azure portal - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-action-groups#create-an-action-group-by-using-the-azure-portal7
Action specific information - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-action-groups#action-specific-information8
Rate limiting for Voice, SMS, emails, Azure App push notifications and webhook posts - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-alerts-rate-limiting

3 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-activity-log-alerts-webhook
4 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-action-groups-logic-app
5 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-activity-log-alerts-webhook
6 https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-itsmc-overview
7 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-action-groups
8 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-action-groups

Signal Types and Metrics


Signals are emitted by the Target resource and can be of several types. Metric, Activity log, Application
Insights, and Log are supported Signal types.

Newer metric alerts specifically have the following improvements:


●● Improved latency: Newer metric alerts can run as frequently as every minute. Log alerts still have a longer than 1-minute delay due to the time it takes to ingest the logs.
●● Support for multi-dimensional metrics: You can alert on dimensional metrics allowing you to
monitor an interesting segment of the metric.
●● More control over metric conditions: You can define richer alert rules. The newer alerts support
monitoring the maximum, minimum, average, and total values of metrics.
●● Combined monitoring of multiple metrics: You can monitor multiple metrics (currently, up to two
metrics) with a single rule. An alert is triggered if both metrics breach their respective thresholds for
the specified time-period.
●● Metrics from Logs (limited public preview): Some log data going into Log Analytics can now be
extracted and converted into Azure Monitor metrics and then alerted on just like other metrics.
For more information, you can see:
Alert rule terminology - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/
monitoring-overview-unified-alerts#alert-rules-terminology9

Video: Monitoring Storage

9 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-unified-alerts

Additional Practice - Alerts


The Azure Activity Log provides a history of subscription-level events in Azure. It offers information about who created, updated, or deleted what resources and when they did it. You can create an Activity Log alert to receive email, SMS, or webhook notifications when an activity occurs that matches your alert conditions.
Take a minute to review and try the Audit and receive notifications about important actions in your Azure subscription10 Quickstart. This Quickstart steps through creating a simple network security group, browsing the Activity Log to understand the event that occurred, and then authoring an Activity Log alert to be notified whenever a network security group is created going forward. (A command-line sketch of a similar alert follows the task list below.)
You will learn how-to:
●● Create a network security group
●● Browse the Activity Log in the portal
●● Browse an event in the Activity log
●● Create an Activity log alert
●● Test the Activity log alert
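For reference, a similar Activity Log alert could also be defined from the command line. In this hedged sketch the resource group, alert name, and action group are placeholder assumptions:
    az monitor activity-log alert create --resource-group myRG --name nsg-created-alert \
        --condition category=Administrative and operationName=Microsoft.Network/networkSecurityGroups/write \
        --action-group myActionGroup \
        --description "Notify when a network security group is created or updated"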
✔️ Are you seeing the difference between the older way of measuring metrics at the resource level and
the newer way of monitoring at the subscription level?

10 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitor-quick-audit-notify-action-in-subscription

Activity Log
Activity Log
The Azure Activity Log is a subscription log that provides insight into subscription-level events that have
occurred in Azure. This includes a range of data, from Azure Resource Manager operational data to
updates on Service Health events. The Activity Log was previously known as “Audit Logs” or “Operational
Logs”.
Using the Activity Log, you can determine the ‘what, who, and when’ for any write operation taken on the
resources in your subscription. For example, who stopped a service. It provides an audit trail of the
activities or operations performed on your resources by someone working on the Azure platform. You can
also understand the status of the operation and other relevant properties.

This diagram shows many of the things you can do with the activity log including:
●● Send data to Log Analytics for advanced search and alerts.
●● Query or manage events in the Portal, PowerShell, CLI, and REST API.
●● Stream information to Event Hub.
●● Archive data to a storage account.
●● Analyze data with Power BI.
✔️ The Activity Log differs from Diagnostic Logs11. Activity Logs provide data about the operations on a
resource from the outside (the “control plane”). Diagnostics Logs are emitted by a resource and provide
information about the operation of that resource (the "data plane").
For more information, you can see:
Monitor Subscription Activity with the Azure Activity Log - https://docs.microsoft.com/en-us/azure/
monitoring-and-diagnostics/monitoring-overview-activity-logs

11 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-of-diagnostic-logs

Query the Activity Log

In the Azure portal, you can filter your Activity Log by these fields:
●● Subscription. One or more Azure subscription names.
●● Resource group. One or more resource groups within those subscriptions.
●● Resource (name). The name of a specific resource.
●● Resource type. The type of resource, for example, Microsoft.Compute/virtualmachines.
●● Operation name. The name of an Azure Resource Manager operation, for example, Microsoft.SQL/
servers/Write.
●● Timespan. The start and end time for events.
●● Category. The event category is described in the next topic.
●● Severity. The severity level of the event (Informational, Warning, Error, Critical).
●● Event initiated by. The ‘caller,’ or user who performed the operation.
●● Search. This is an open text search box that searches for that string across all fields in all events.
Once you have defined a set of filters, you can save it as a query that is persisted across sessions if you
ever need to perform the same query with those filters applied again in the future. You can also pin a
query to your Azure dashboard to always keep an eye on specific events.
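The same kind of filtered query can be run from the command line. As a minimal sketch (the resource group and time window are placeholder assumptions), the following lists the last seven days of events for a resource group as a table:
    az monitor activity-log list --resource-group myRG --offset 7d \
        --query "[].{Time:eventTimestamp, Operation:operationName.value, Caller:caller, Status:status.value}" \
        --output table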
For more information, you can see:
Query the Activity Log in the Azure portal - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-activity-logs#query-the-activity-log-in-the-azure-portal12

Event Categories
The Activity Log provides several event categories. You may select one or more.

12 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-activity-logs

●● Administrative - This category contains the record of all create, update, delete, and action operations performed through Resource Manager. Examples of the types of events you would see in this category include "create virtual machine" and "delete network security group". The Administrative category also includes any changes to role-based access control in a subscription.
●● Service Health - This category contains the record of any service health incidents that have occurred
in Azure. An example of the type of event you would see in this category is “SQL Azure in East US is
experiencing downtime.” Service health events come in five varieties: Action Required, Assisted
Recovery, Incident, Maintenance, Information, or Security.
●● Alert - This category contains the record of all activations of Azure alerts. An example of the type of
event you would see in this category is “CPU % on myVM has been over 80 for the past 5 minutes.”
●● Autoscale - This category contains the record of any events related to the operation of the autoscale
engine based on any autoscale settings you have defined in your subscription. An example of the type
of event you would see in this category is “Autoscale scale up action failed.”
●● Recommendation - This category contains recommendation events from certain resource types, such
as web sites and SQL servers. These events offer recommendations for how to better utilize your
resources.
●● Security - This category contains the record of any alerts generated by Azure Security Center. An
example of the type of event you would see in this category is “Suspicious double extension file
executed.”
●● Policy and Resource Health - These categories do not contain any events; they are reserved for
future use.
For more information, you can see:
Categories in the Activity Log - https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-activity-logs#categories-in-the-activity-log

Activity Log and Log Analytics


It is easy to access the Activity Log Analytics solution.

With the Azure Activity Logs tile, you can do many things:
●● Analyze the activity logs with pre-defined views.
●● Analyze and search activity logs from multiple Azure subscriptions.
●● Keep activity logs for longer than the default of 90 days.
●● Correlate activity logs with other Azure platform and application data.
●● See operational activities aggregated by status.
●● View trends of activities happening on each of your Azure services.
●● Report on authorization changes on all your Azure resources.
●● Identify outage or service health issues impacting your resources.
●● Use Log Search to correlate user activities, auto-scale operations, authorization changes, and service
health to other logs or metrics from your environment.
✔️ Log Analytics collects activity logs free of charge and stores the logs for 90 days free of charge. If you
store logs for longer than 90 days, you will incur data retention charges for the data stored longer than
90 days. When you're on the Free pricing tier, activity logs do not apply to your daily data consumption.
For more information, you can see:
Collect and analyze Azure activity logs in Log Analytics - https://docs.microsoft.com/en-us/azure/
log-analytics/log-analytics-activity

Collect Across Subscriptions


This topic covers the strategy to collect Azure Activity Logs into a Log Analytics workspace using the
Azure Log Analytics Data Collector connector for Logic Apps. Use this strategy when you need to send
logs to a workspace in a different Azure Active Directory. For example, if you are a managed service
provider, you may want to collect activity logs from a customer's subscription and store them in a Log
Analytics workspace in your own subscription.

The basic strategy is to have Azure Activity Log send events to an Event Hub13 where a Logic App14
sends them to your Log Analytics workspace.

Advantages of this approach include:


●● Low latency since the Azure Activity Log is streamed into the Event Hub. The Logic App is then
triggered and posts the data to Log Analytics.
●● Minimal code is required, and there is no server infrastructure to deploy.
✔️ Do you think your organization would benefit from this strategy?
For more information, you can see:
Collect Azure Activity Logs into Log Analytics across subscriptions - https://docs.microsoft.com/en-us/
azure/log-analytics/log-analytics-activity-logs-subscriptions

Video: Activity Log

Additional Practice - Activity Log


Azure Activity Logs are a platform service for working with logs and metrics across your subscription. In
this practice, try configuring the tasks in the Azure portal first. In most cases, you can also perform the
tasks using PowerShell or the CLI.
●● Query the Activity Log in the Azure portal15
●● Create an activity log alert16
●● Configure log profiles using the Azure portal17
●● Enable streaming of the Activity Log18
●● Archive the Activity Log using the portal19
●● Configure the Activity Log Analytics solution for your workspaces20

13 https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-what-is-event-hubs
14 https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-overview
15 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-activity-logs#query-the-activity-log-in-the-azure-portal
16 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-activity-log-alerts
17 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-activity-logs
18 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-stream-activity-logs-event-hubs
19 https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-archive-activity-log
20 https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-activity

✔️ The tasks listed are only a representative sampling of what you can do with Activity Logs. Explore
some of the other tasks as you have time. Don’t forget to view the associated Activity Log dashboards.
(Click the Azure Activity Logs tile to open the Azure Activity Logs dashboard.)
For more information, see:
Create activity log alerts – https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-activity-log-alerts
Stream the Azure Activity Log to Event Hubs – https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-stream-activity-logs-event-hubs
Archive the Azure Activity Log – https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-archive-activity-log
Collect and analyze Azure activity logs in Log Analytics - https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-activity

Module 5 Review Questions


Module 5 Review Questions
Azure Monitor
You need to report on the health and status of your Azure environment on a weekly basis. You decide to
use Azure Monitor as your monitoring solution. How can you access Azure Monitor, and how is it used?
How much historical data is available?

Click for suggested answer ↓ 


Azure Monitor provides multiple ways to access metrics. You can access them from the Azure portal, the Monitor APIs (REST and .NET), and analysis solutions such as the Operations Management Suite and Event Hubs. Metrics are enabled by default, and you can access the past 30 days of data.

The Monitor service can help you review your storage information. When you select a Storage Account resource, your sub-service choices are: Account, Blob, File, Queue, and Table.
Alert Logic and Rules
You are monitoring the health of your Azure environment. You need to ensure that you receive an alert if a service stops or performance is degraded.
Describe how the alert authoring task is created. What alert conditions can be used?

Click for suggested answer ↓ 


Authoring an alert is a three-step task where the user first picks a target for the alert, followed by selecting the right signal, and then specifying the logic to be applied to the signal as part of the alert rule.
1. Define alert condition includes: Target selection, Alert criteria, and Alert logic.
2. Define alert details includes: Alert rule name, description, and severity. There are five severity levels,
Severity 0 to Severity 4.
3. Define action group. Create an action group to notify your team via email and text messages or
automate actions using webhooks and runbooks.
Activity Log Event Categories
You are monitoring your organization's Azure environment and are setting up alerts to inform you of the health of your infrastructure. What event categories are available in the Activity Log?

Click for suggested answer ↓ 


●● Administrative - This category contains the record of all create, update, delete, and action operations performed through Resource Manager. Examples of the types of events you would see in this category include create virtual machine and delete network security group. The Administrative category also includes any changes to role-based access control in a subscription.
●● Service Health - This category contains the record of any service health incidents that have occurred
in Azure. An example of the type of event you would see in this category is SQL Azure in East US is
experiencing downtime. Service health events come in five varieties: Action Required, Assisted
Recovery, Incident, Maintenance, Information, or Security.
●● Alert - This category contains the record of all activations of Azure alerts. An example of the type of
event you would see in this category is CPU percent on myVM has been over 80 for the past 5
minutes.
●● Autoscale - This category contains the record of any events related to the operation of the autoscale
engine based on any autoscale settings you have defined in your subscription. An example of the type
of event you would see in this category is Autoscale scale up action failed.
●● Recommendation - This category contains recommendation events from certain resource types, such
as web sites and SQL servers. These events offer recommendations for how to better utilize your
resources.
●● Security - This category contains the record of any alerts generated by Azure Security Center. An
example of the type of event you would see in this category is Suspicious double extension file
executed.
●● Policy and Resource Health - These categories do not contain any events; they are reserved for
future use.
Module 6 Lab - Implement and Manage Storage

Lab
Lab
Scenario
Adatum Corporation wants to leverage Azure Storage for hosting its data.
Exercise 0
Prepare the lab environment.
Exercise 1
Implement and use Azure Blob Storage
Exercise 2
Implement and use Azure File Storage
Estimated Time: 70 minutes
✔️ If you are in a classroom, ask your instructor for the lab guide. If you are in a self-paced online course,
check the Course Handouts page.