DP-203: Data Engineering on Microsoft Azure

DP-203: Data Engineering on Microsoft Azure Certification Video Training Course

The DP-203: Data Engineering on Microsoft Azure Certification Video Training Course includes 262 lectures that provide in-depth coverage of all key concepts of the exam. Pass your exam easily and learn everything you need with our DP-203: Data Engineering on Microsoft Azure Certification Training Video Course.

115 Students Enrolled
262 Lectures
10:17:00 hr
$27.49
$24.99

Curriculum for Microsoft Azure DP-203 Certification Video Training Course

DP-203: Data Engineering on Microsoft Azure Certification Video Training Course Info:

The complete course from ExamCollection's industry-leading experts helps you prepare and provides a full 360-degree solution for self-study, including the DP-203: Data Engineering on Microsoft Azure Certification Video Training Course, practice test questions and answers, a study guide, and exam dumps.

Design and implement data storage – Basics

9. Azure Data Lake Gen-2 storage accounts

So now we come all the way to Azure Data Lake Gen 2 storage accounts. This is a service that provides the option of hosting a data lake on Azure. In big data scenarios, when you're working with large data sets and data is arriving in large volumes and at a rapid rate, companies consider having a data lake in place for data hosting. In Azure, you can make use of Azure Data Lake Gen 2 storage accounts. Azure Data Lake Gen 2 storage accounts are a service that is built on top of Azure Blob Storage. In an earlier chapter, we looked at Azure Storage accounts, and Azure Data Lake is based on Azure Storage accounts themselves. With Azure Data Lake Gen 2 storage accounts, you have the ability to host an enterprise data lake on Azure. Here you get a feature known as the hierarchical namespace on top of Azure Blob Storage itself. This hierarchy helps to organise the objects and files into a hierarchy of directories for efficient data access. As I said, when it comes to storing data, initially a company wants to take and store data coming from multiple sources, and this data could be in different formats. You could have image files, documents, text-based files, JSON-based files, files in all sorts of formats. And at first, the company simply wants a location to store all of that data in whatever format it is. So they would go ahead and have something known as a data lake. And when it comes to Azure, you can make use of Azure Data Lake Gen 2 storage accounts for this. In the background, when it comes to storage, you don't have to worry about it. You don't have to think about adding more and more storage to the storage account; you can just keep on uploading your data, and the service itself will manage the storage for you. And as your data lake is built for big data and hosting large amounts of data, you can upload data in its native raw format, and it is optimised for storing terabytes and even petabytes of data. The data can come from a variety of data sources, and it can be in a variety of formats, whether structured, semi-structured, or unstructured. So now, in the next chapter, let's go ahead and create an Azure Data Lake Gen 2 storage account.
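
To make the hierarchical namespace idea a little more concrete, here is a minimal sketch using the azure-storage-file-datalake Python SDK. This is my own illustration rather than part of the course, and the account name, key, container, and file names are placeholder values.

    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder account name and key; replace with your own values.
    service = DataLakeServiceClient(
        account_url="https://datalake2000.dfs.core.windows.net",
        credential="<storage-account-key>",
    )

    # With the hierarchical namespace enabled, a container behaves like a file system
    # that can hold real directories, not just flat blobs with "/" in their names.
    file_system = service.create_file_system(file_system="data")
    raw_dir = file_system.create_directory("raw")

    # Upload a local file into the "raw" directory of the data lake.
    file_client = raw_dir.create_file("sample.json")
    with open("sample.json", "rb") as local_file:
        file_client.upload_data(local_file.read(), overwrite=True)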

10. Lab - Creating an Azure Data Lake Gen-2 storage account

So here we are in Azure. Now we'll create a new resource. In All Resources, I'll hit on Create. To create an Azure Data Lake Gen 2 storage account, we have to create nothing but a normal storage account. Search for the storage account service and click Create. Here, I'll choose our resource group, that's our Data GRP resource group, and choose the location as North Europe. Again, I need to give a unique data lake storage account name, so that's fine, and for redundancy, I'll choose locally redundant storage. I'll go on Next to the Advanced screen. This is what is important: there is an option for Data Lake Storage Gen 2, and we have to enable the hierarchical namespace. This ensures that our storage account now behaves as a Data Lake Storage Gen 2 account. I'll enable the setting, and all the other settings in the subsequent screens I'll just leave as they are; I won't make any changes. I'll go on to Review and Create, and I'll hit on Create. This will just take a couple of minutes, so let's wait till we have the storage account in place. Once our deployment is complete, I can move on to the resource. The entire layout, the entire overview of the data lake storage account, is similar to a normal storage account, which you had seen earlier on. On the left-hand side, again, we have containers, file shares, queues, and tables. Here the data lake service is the containers, which are based on our blob service. If I go on to containers, again, I can create a container, and then within the container, I can start uploading my objects. So here, if I create a simple container known as "data", we have the public access level of either private, blob, or container anonymous access. I'll leave it private, with no anonymous access, and hit on Create. If you go on to the container now, you can also add directories to the container. So let's say you're storing raw files in this particular directory: you can create a directory known as "raw" and hit on Save. You can go on to the directory, and you can start uploading your files and objects over there. So, when it comes to the blob service and the data lake, when you upload something to it, say a file, this file is referred to as a blob or an object, because it is ultimately stored in binary format on the underlying storage service. Also, another quick note before I forget, something I forgot to mention in the earlier chapters when looking at storage accounts and the blob service. If I go back onto All Resources and back onto my view, I want to go on to the storage account we had created earlier on, the data store. Here, we've seen that if I go to containers in our data container and click on any object, and then go on to the Edit section, we can see the contents of that particular file or blob. At the same time, if you go on to the Overview, every object or blob in the storage account gets a unique URL. This URL can be used to access the blob, since we've given access to the container. So if I go back onto the container here, in terms of the access level, we had given it blob anonymous read access. That means we can read the blobs as an anonymous user. And what does this mean? If I click on an object and copy the URL to the clipboard, then go to a new tab and Ctrl-V paste that complete URL, we get the name of our storage account followed by the blob service endpoint, blob.core.windows.net.
This is the name of our container, and this is the name of the image. If I hit Enter, I can see the blob itself. In this tab, we are now anonymous users; we have not logged into Azure in this tab. In the other tab, we are actually logged into our Azure account, but here we are accessing the blob as an anonymous user, so any user who has the URL has access. So these are all the different security measures that you should actually consider when it comes to accessing your objects, and as I said, in subsequent chapters we'll look at the different security measures in place. So I thought, before I forget, let me give you that note. The URL feature that is available for blobs in your storage account, that same concept is also available for Data Lake Gen 2 storage accounts as well. So, returning to our Data Lake Gen 2 storage account, I'll navigate to containers, my data container, and my raw folder. I'll upload a file that I have on my local system, so I'll click on Upload. Again, in my temp folder, I have a JSON-based file. I'll just open up that file, and don't worry, I'll include this file as a resource in this chapter. I'll hit Upload, so I have the JSON file in place. If I proceed to the file and then to Edit, I've got some information here. This information is actually based on the diagnostic settings that are available for an Azure SQL database. That diagnostic setting is sending diagnostic information about the database. For example, it is sending the metrics about the database itself: you have different metrics, such as the CPU percentage, the memory percentage, et cetera, and at different points in time it's actually sending that information. So I just have this sample JSON file in place, and I've uploaded this file onto my data lake. Please note that we have a lot of chapters in which I'll actually show how we can continuously stream data onto your Data Lake Gen 2 storage accounts. Because we still have a lot to cover in this particular course, at this point in time, I just want to show you how we can upload a simple file onto a Data Lake Gen 2 storage account. So at this point in time, you should understand what the purpose of a data lake is. It's based on the Blob service when it comes to Azure, and here you have the ability to store different kinds of files in different formats and of different sizes. So for now, I just want you all to know about the service that is available in Azure for hosting a data lake, which is an Azure Data Lake Gen 2 storage account, where you can upload different types of files of varying sizes. Right, so this marks the end of this chapter.
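
As a rough illustration of the URL pattern just described, here is a small sketch, again my own and not from the course, that fetches a blob anonymously. The account, container, and blob names are placeholders, and this only works because the container's access level allows anonymous read of blobs.

    from urllib.request import urlopen

    # Pattern: https://<account>.blob.core.windows.net/<container>/<blob>
    # Placeholder names; a private container would typically return an HTTP 404 instead.
    url = "https://datastore2000.blob.core.windows.net/data/image1.png"

    with urlopen(url) as response:
        content = response.read()
        print(len(content), "bytes downloaded as an anonymous user")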

11. Using PowerBI to view your data

In this chapter, I just want to give a quick example when it comes to the visualisation of data that is available in a Data Lake Gen 2 storage account. So I'll go on to my Data Lake Gen 2 storage account, go on to my containers, go on to my data container, and go on to the raw folder, where I have my JSON-based file. I'll go ahead and upload a new file into this folder. In my temp directory, I have a log CSV file. I'll hit Open, and I'll hit Upload. Let me just quickly go ahead and open up this log CSV file; we'll be using the same log CSV file in subsequent chapters as well. This actually contains the information from my Azure activity logs. Here I have an ID column, which I have generated myself. Then I have something known as the correlation ID, the operation name, the status, the event category, the level, the timestamp, the subscription, who the event was initiated by, the resource type, and the resource group. I'll tell you the way that I actually generated this particular file. If I just quickly open up All Resources in a new tab, I want to go on to the Azure Monitor service. The Azure Monitor service is a central monitoring service for all of the resources that you have in Azure. I'll search for Monitor and go to the activity log; let me just hide this. So all of the activities that I perform as part of my account, the administrative-level activities, show up here. For example, if I've gone ahead and created a storage account, deleted a storage account, or created a SQL database, everything will be listed over here. What I've actually done is change the time span over here: I've chosen a custom duration of three months, since we can only look at the last three months of data, and then I downloaded all the content as a CSV file. When you download the CSV file, you don't get this ID column, so I've gone ahead and manually created this ID column in Excel, and you'll also be having one more column, the resource column. For now, I've just gone ahead and deleted the data in this resource column. Right, so I have all of this information in my log CSV file. If I go on to my data lake storage account, I'll choose my container and change the access level just for now: I'll choose blob anonymous read access for blobs only and hit OK. Next, we want to start working with Power BI. Power BI is a powerful visualisation tool. You can use this tool to create reports based on different data sources. So there is integration of Power BI with data sources that are available not only in Azure, but on other third-party platforms as well. You can go ahead and download the Power BI Desktop tool. This is a freely available tool that you can download onto your local system. I'm on a Windows 10 system, and I've already gone ahead and downloaded and installed the Power BI Desktop tool. So I'm just starting the Power BI Desktop tool now. The first thing I'll do, after I just close all of these screens, is click on Get Data and hit on More. Here, I can choose Azure, and you have a lot of options in place. I can choose Azure Data Lake Storage Gen2 and hit Connect. Now I need to provide a URL, so I go back to my Data Lake Gen 2 storage account. I'll go on to Endpoints, scroll down, and take the endpoint that is available for Data Lake Storage. I'll copy this and place it over here. Now, my file is in the data container, in the raw folder, and it's my log CSV file.
I'll hit OK. Here I can actually choose my account key. I can get my account key over here: I'll scroll to the top, go on to Access Keys, click on Show keys, and take either key one or key two. So I'll take key one, place it over here, and hit Connect. So I have my log.csv file in place. I'll hit on Transform Data, and I get the Power Query Editor. Now I'll click on this Binary link for the content, and then I should see all of the information in my log CSV file. So I can see all of the information being displayed over here. I can just right-click and rename this particular query as "log data". Then I can click Close and Apply, and my data will be loaded into Power BI. I can go ahead and close this. So it's loaded all of the rows here, and we should see all of our columns. So, for example, if you want to have a clustered column chart, you can just click this and expand it over here. I can close the filters. Let's say I want to display the count of the IDs based on the operation name: I'll get an entire graph over here. So now, based on the data that we have in your Data Lake Gen 2 storage account, you can see that you can already start building some basic analysis. But normally, when it comes to your raw data, you'll actually first perform cleansing of your data and transformation of your data. Those are concepts that we learn a little bit later on in this particular section. For now, the idea is just that you can begin storing your data in your Data Lake Gen 2 storage account and already start looking at it.
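
If you want to sanity-check the same file outside of Power BI, the sketch below, which is my own illustration and not part of the course, downloads the log CSV with the storage account key using the azure-storage-blob SDK and reproduces the count-by-operation-name view in pandas. The account name, key, and column headers are assumptions; adjust them to match your own export.

    import io
    import pandas as pd
    from azure.storage.blob import BlobServiceClient

    # Placeholder account name and key; take the key from the Access Keys blade.
    service = BlobServiceClient(
        account_url="https://datalake2000.blob.core.windows.net",
        credential="<storage-account-key>",
    )

    # The same file that Power BI reads through the Data Lake Storage endpoint.
    blob = service.get_blob_client(container="data", blob="raw/log.csv")
    data = blob.download_blob().readall()

    # Column names are assumed; use the headers from your own activity log export.
    df = pd.read_csv(io.BytesIO(data))
    print(df.groupby("Operation name")["Id"].count())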

12. Lab - Authorizing to Azure Data Lake Gen 2 - Access Keys - Storage Explorer

Hi and welcome back. Now, in this chapter, I just want to show you how you can use a tool known as Azure Storage Explorer to explore your storage accounts. So if you have employees in an organisation who only need to access storage accounts within their Azure account, and they only want to look at the data, then instead of actually logging into the Azure Portal, they can make use of Azure Storage Explorer. This is a free tool that is available for download, so they can go ahead and download the tool; it's available for a variety of operating systems. I've already gone ahead and downloaded and installed the tool; it's a very simple installation. Now, as soon as you open up Microsoft Azure Storage Explorer, you might be prompted to connect to an Azure resource. So here, you can actually log in using the subscription option. In case you don't get this screen, this is what Azure Storage Explorer looks like: you can go on to the Manage Accounts section over here, click on Add an account, and you'll get the same screen. I'll choose Subscription, I'll choose Azure, and I'll go on to Next. You will need to sign in to your account, so I'll use my Azure admin account information. Now, once we are authenticated, I'll just choose my test environment subscription and hit Apply; I have many subscriptions in place. Now, under my test environment subscription, I can see all of my storage accounts. If I go on to datastore2000 here, I can see my blob containers, I can go on to my data container, and I can see all of my image files. If I go on to datalake2000, onto that storage account, onto Blob Containers, onto my data container, onto my raw folder, I can see my JSON file. Here, I can download the file, and I can upload new objects onto the container. So Azure Storage Explorer is an interface that allows you to work with not only your Azure storage accounts but also your Data Lake storage accounts as well. Now, here we have logged in as the Azure administrator. There are other ways you can authorise yourself to work with storage accounts. One way is to use access keys. See, here we are seeing all of the storage accounts. But let's say you want a user to only see a particular storage account. One way is to make use of the access keys that are linked to a storage account. If I go back onto my Data Lake Gen 2 storage account and scroll down to the Security and Networking section, there is something known as Access Keys. I'll go on to the access keys; let me go ahead and just hide this. I click on Show keys, and here I have key one. So we have two keys in place for a storage account: you have key one, and you have key two. A person can authorise themselves to use the storage account using this access key. So here I'll copy the key to the clipboard and open Azure Storage Explorer. I'll go back on to Manage Accounts, and here I'll add an account. I'll choose Storage account, with the option that says account name and key, and go on to Next. I'll paste in the account key. I also need to give the account name, so I'll go back onto Azure, copy the account name from there, and place it over here; I'll keep the same as the display name. I'll go on to Next and hit Connect. Now, here in the local and attached storage accounts, I can see my Data Lake Gen 2 storage account, and I still have the view of all of my storage accounts that are part of my Azure admin account over here.
But at the same time, I can see only my Data Lake Gen 2 storage account. If I go onto my blob containers, onto my data container, onto my raw folder here, I can see my JSON file. As I said, if you want, you can go ahead and even download the JSON file locally: you can select the location, click on Select Folder, and it will transfer the file from your Data Lake Gen 2 storage account. So this is one way of allowing users to authorise themselves to use the Data Lake Gen 2 storage account.
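
For completeness, here is a small sketch, my own illustration rather than something from the course, of authorising with the same account name and key from code using the azure-storage-blob SDK. It lists roughly what Storage Explorer shows after you attach the account: each container and the blobs inside it. The account name and key are placeholders.

    from azure.storage.blob import BlobServiceClient

    # Placeholder values; take the key from the storage account's Access Keys blade.
    account_name = "datalake2000"
    account_key = "<key1-or-key2>"

    service = BlobServiceClient(
        account_url=f"https://{account_name}.blob.core.windows.net",
        credential=account_key,
    )

    # List every container and the blobs (including directory paths) inside each one.
    for container in service.list_containers():
        print("Container:", container.name)
        container_client = service.get_container_client(container.name)
        for blob in container_client.list_blobs():
            print("  ", blob.name)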

13. Lab - Authorizing to Azure Data Lake Gen 2 - Shared Access Signatures

Now, in the previous chapter, I've shown how we could connect to a storage account, that is, basically our data lake storage account, using access keys. As I mentioned before, there are different ways in which you can authorise access to a data lake storage account. Now, when it comes to security, if you look at the objectives for the exam, the security for the services actually falls in the section of "design and implement data security." But at this point in time, I want to show the concept of using something known as shared access signatures to authorise ourselves to use an Azure Data Lake storage account. The reason I want to show this now is because when we look at Azure Synapse, we are going to see how to use access keys and shared access signatures to connect and pull out data from an Azure Data Lake Gen 2 storage account. And that's why, at this point in time, I want to show how we can make use of shared access signatures for authorising ourselves to use a Data Lake Gen 2 storage account. So, going back to our resources, I'll go on to our data lake storage account. Now here, if I scroll down, in addition to access keys under Security and Networking, we also have something known as a shared access signature. I'll go on to it; let me go ahead and hide this. Now, with the help of a shared access signature, you can give selective access to the services that are present in your storage account. Remember how in the previous chapter we connected to a storage account using an access key? With the access key, the user can work with not only the Blob service but also with file shares, queues, and the Table service as well. These are all the services that are available as part of the storage account. But what if you want to limit the access to just a particular service? Let's say that you are going to give the shared access signature to a user, and you want that user to only have the ability to access the Blob service in the storage account. For that, you can make use of shared access signatures. Here, what you'll do is that in the allowed services, you will just unselect the File, Queue, and Table services, so that the shared access signature can only be used for the Blob service. In the allowed resource types, I need to give access to the service itself, and I need to give the user the ability to see the containers in the Blob service and also have access to the objects themselves, so I'll select all of them. In terms of the allowed permissions, I can give selective permissions. So here, I just want the user to have the ability to list the blobs and read the blobs in my Azure Data Lake Gen 2 storage account. I won't give any permissions when it comes to enabling deletion of versions, so I'll leave that as it is. With the shared access signature, you can also give a start and expiry date and time. That means that after the end date and time, this shared access signature will not be valid anymore. You can also specify which IP addresses will be valid for this shared access signature. At the moment, I'll leave everything as it is and scroll down. It will use one of the access keys of the storage account to generate the shared access signature. So here I'll go ahead and click on this button, Generate SAS and connection string. And here we have something known as a connection string, the SAS token, and the Blob service SAS URL.
The SAS token is something that we are going to use when we look at connecting to the Data Lake Gen 2 storage account from Azure Synapse. At this point, let's see how to connect to this Azure Data Lake Gen 2 storage account using a shared access signature. If I go back onto Storage Explorer, what I'll do first is just right-click on the attached storage account, the one we attached earlier using the access key. I'll right-click on this and click on Detach, and I'll say yes. Now I want to again connect to the storage account, but this time using the shared access signature. So I'll go on to Manage Accounts and add an account. Here I'll choose Storage account, and this time I'll choose the shared access signature (SAS) option. I'll go on to Next. Here, you must provide the SAS connection string. So I can either copy this entire connection string, or I can also go ahead and copy the Blob service SAS URL. Let me go ahead and copy the Blob service SAS URL. I'll place it over here and just paste it; you can see the display name. I'll go on to Next, and I'll go ahead and hit Connect. So, in terms of the data lake, you can now see that I am connected via the shared access signature. And here you can see that I only have access to the blob containers. I don't have access to the Table service, the Queue service, or the file share service. As a result, we are now restricting access to the Blob service only. At the same time, remember, I mentioned that this particular shared access signature would not be valid after that date and time. So if you want to give some sort of validity period to this particular shared access signature, that is something you can specify over here. So, as I said, the main point of this particular chapter was to explain to students the concept of a shared access signature. There are different ways in which you can authorise yourself to use a storage account. When it comes to Azure services, there are a lot of security features available for how you can access a service. It should not be the case that the service is open to everyone; there has to be some security in place, and there are different ways in which you can authorise yourself to use a particular service in Azure. Right, so this marks the end of this chapter. As I mentioned before, we'll be looking at using shared access signatures again in later chapters when we look at Azure Synapse.
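
To show the same idea in code, here is a minimal sketch, my own illustration and not part of the course, that generates an account-level SAS with the azure-storage-blob Python SDK using roughly the selections made in the portal above: Blob service only, all three resource types, list and read permissions, and a one-day expiry. The account name and key are placeholders.

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import (
        AccountSasPermissions,
        BlobServiceClient,
        ResourceTypes,
        generate_account_sas,
    )

    # Placeholder values; the SAS is always signed with one of the account's access keys.
    account_name = "datalake2000"
    account_key = "<key1-or-key2>"

    # Service, container, and object resource types with list and read permissions only.
    sas_token = generate_account_sas(
        account_name=account_name,
        account_key=account_key,
        resource_types=ResourceTypes(service=True, container=True, object=True),
        permission=AccountSasPermissions(read=True, list=True),
        expiry=datetime.now(timezone.utc) + timedelta(days=1),
    )

    # The SAS token can now be handed out instead of the account key and used as the credential.
    service = BlobServiceClient(
        account_url=f"https://{account_name}.blob.core.windows.net",
        credential=sas_token,
    )
    for container in service.list_containers():
        print(container.name)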


