AZ-104: Microsoft Azure Administrator Certification Video Training Course
The AZ-104: Microsoft Azure Administrator Certification Video Training Course includes 132 lectures that provide in-depth coverage of all key concepts of the exam. Pass your exam easily and learn everything you need with our AZ-104: Microsoft Azure Administrator Certification Training Video Course.
Curriculum for Microsoft Azure AZ-104 Certification Video Training Course
AZ-104: Microsoft Azure Administrator Certification Video Training Course Info:
The complete course from ExamCollection's industry-leading experts helps you prepare and provides a full 360-degree solution for self-study, including the AZ-104: Microsoft Azure Administrator Certification Video Training Course, practice test questions and answers, a study guide, and exam dumps.
So we're back at the overview screen of our storage account, and we can see right on the overview screen that there are four types of data that the general-purpose v2 storage account we created can store. The most common type is a blob, which is stored in what are called containers. There is a file service, which is an SMB file share that we can mount from our Windows servers, Windows workstations, or Linux machines, so it's shared file storage. There is a table storage format; it's not really a SQL Server database, but it does store data in columns and rows. And finally, there is the concept of queues, which is effectively a messaging system with a first-in-first-out metaphor. Now, we're going to focus on containers right now. I have previously created a container called "test." It's actually very easy to delete that container, and it's telling me that if I had locks or other RBAC settings in place, the delete could fail. I could attempt to override those, but I'm just going to delete the container, and you'll see it's successfully deleted. To create a new container, use this plus-container button. I'm going to call it "newtest," and we have the choice of private, blob, or container access levels. Private access requires the security keys we were talking about, or a shared access signature; the blob and container access levels allow anonymous access, so be very careful when selecting those two. Private is fine. Now, to store a file in a container, go into the container. But look at this message: it says you do not have access. This is very interesting, because when we created the storage account, we set up a virtual network for it. So I'm going to go back up to the storage account level, and we're going to go into the firewalls and virtual networks settings. As you can see, only certain virtual networks have access to the storage account.
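The queue's first-in-first-out metaphor mentioned above is easy to picture with a plain Python deque. This is only an illustration of FIFO ordering, not the Azure Queue Storage API:

```python
from collections import deque

# A queue is a first-in-first-out (FIFO) structure: messages are
# consumed in the order they were enqueued.
queue = deque()
queue.append("message 1")   # enqueue
queue.append("message 2")
queue.append("message 3")

first = queue.popleft()     # dequeue: the oldest message comes out first
second = queue.popleft()
print(first, second)        # message 1 message 2
```

Azure Queue Storage behaves the same way at the metaphor level: producers append to the back, consumers take from the front.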
We could change this to apply to all networks, which would remove the error message, or we could add our own IP address to the firewall, which is what I'm going to do. So I check this checkbox that says "Add your client IP address." Now I've whitelisted myself: both the virtual network and my own IP are allowed on the storage account. If I go back to the overview, back into containers, and back into the test container, we no longer get the authorization error. So now we do have access, because my IP has been whitelisted. From this point we can upload files, but this container view is not a very good way of interacting with files. For instance, you aren't able to preview the contents of a file, and it becomes kind of hard to move stuff around. So there are better ways to do this. Let's go back to the container view. In the overview, one of the first items in the top left corner says Open in Explorer, and the same view is available under Storage Explorer in the menu; it gives us a different, and maybe more efficient, view of our storage account's contents. We can see the newtest container we just created, and we can do filtering and searching; there is more information available within Storage Explorer. Now, this browser view isn't even the most efficient option. Microsoft provides Windows, Linux, and macOS versions of the Storage Explorer software that you can download to your computer and use to manage your Azure storage accounts. And all Azure services expose what's called a REST API, so you can programmatically work with storage from your own code as well. The REST API may be the most efficient route if you need to view, move, and copy items in your storage account from code, while the Windows, Linux, or macOS download will be more efficient when working with files interactively. But we have this browser view, and it's quite convenient sometimes. Having created this newtest container, let's upload a file to it.
So I'm going to say upload, and we're given a dialog to select a file; I'm going to pick one of the videos recorded for this course. Now, if we open up the advanced section, we can see that there are a couple of ways to authenticate. We can authenticate using the access keys, or we can authenticate using an Azure Active Directory user; we'll talk about RBAC authentication in a couple of videos and show you how we can do that. We also have the choice of a blob type. There are block, page, and append blobs. Block blob is the default, and it's likely to be adequate in most situations. Page blobs are optimised for when you need to update parts of the file, but not the entire file, so they are good for a virtual hard disk or some other type of data that is updated only partially. And finally, append blobs are optimised for adding to a file, like a log file where you're always appending to the end; that would be ideal for that. So block blobs are going to be fine for 99% of the cases. It says here to upload VHD files as page blobs; that really is the optimised case for page blobs. If we're interested, we can choose the block size, which really is not an important decision for most cases. Maybe you know your file system so well that 64-kilobyte or 128-kilobyte blocks are optimal because you've got hundreds of thousands of really tiny files; then you may want a smaller block size for efficiency. But if your files average around four megabytes or larger, the default is a pretty good size. Remember, we talked previously about the hot, cool, and archive tiers: hot is the default, and archive is the cheapest of them all. Roughly speaking, hot costs around two cents per gigabyte, cool around one cent per gigabyte, and archive around a tenth of a cent per gigabyte; that's the type of relative cost per gigabyte. And finally, there is a virtual folder metaphor.
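The relative tier pricing above is easy to compare with a little arithmetic. The per-gigabyte figures below are the rough illustrative numbers from the lecture, not current Azure prices:

```python
# Rough relative per-GB monthly prices from the lecture (illustrative only,
# not current Azure pricing)
price_per_gb = {
    "hot": 0.02,       # ~2 cents per GB
    "cool": 0.01,      # ~1 cent per GB
    "archive": 0.001,  # ~a tenth of a cent per GB
}

def monthly_cost(tier: str, size_gb: float) -> float:
    """Estimated monthly storage cost for size_gb gigabytes in a tier."""
    return size_gb * price_per_gb[tier]

for tier in price_per_gb:
    print(f"{tier}: 1000 GB costs ${monthly_cost(tier, 1000):.2f}/month")
```

At these illustrative rates, a terabyte in archive costs about a twentieth of what it costs in hot, which is why archive is attractive for data you rarely touch.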
Let's just leave the defaults and say "upload." So the 24-megabyte file is now being transferred from my computer to this newtest container via my browser and the open internet. In this video, we saw that there are many ways to access the contents of our storage account, including Storage Explorer, which is available in the browser as well as in separate software.
Now, in this video, we'll go over how to tell if your storage account is performing well, right from the overview screen. If I minimise this top-level menu a little bit, we can see that there are some overview graphs right on the overview screen. You can look at the last seven days, the last 24 hours, the last 12 hours, all the way down to the last 4 hours. Now, this storage account, which I only just created at the beginning of this section, has existed for all that time, and we're not really using it in a production setting. So the data leaving Azure, the egress, is quite limited. We did upload a 20-minute file, so in the data arriving at Azure, the ingress, we can actually see that file-upload moment right here. We can also look at the amount of time the requests are taking, which is 2 seconds on average, and at the requests coming in: successes, authentication errors, and other errors. This isn't super detailed, but as I mentioned, it's a good overview of what's going on. We can look at the entire account as a whole, or break down the blob containers, the files, the tables, and the queues as separate reports. If any of these charts piques your interest and you want to save it, there's a little pin icon that will pin that chart to the portal's dashboard. So if I click Pin, then when I go to the dashboard, this egress chart for containers at the 1-hour view will always be there. So that's one way of monitoring the storage account. If we scroll down the settings blade, there is a monitoring section, and in fact there are two monitoring sections: one is the old style of monitoring, and one is a newer style. We're starting off with Monitoring (classic), under diagnostic settings. At the time I created the storage account, diagnostics were enabled by default, so I did not have to enable them; they were on by default, capturing 1-hour metrics for the blob storage.
And if I go into the files, tables, and queues, it's also on an hourly basis. So it's a fairly coarse level of diagnostics, but it's not particularly burdensome. It also retains the data for up to seven days, according to the slider, and obviously, if I need to keep it for 70 days, I can do that as well. I can also get more fine-grained diagnostics by enabling the minute level, so that every 60 seconds it will collect the diagnostics: our egress, ingress, and data usage. If we wanted to store the individual requests in a file, we could enable logging. Let's say I want to log all the reads and writes that are coming in; that would store those reads and writes for later. So the metrics are just the performance numbers, and the logs are the actual calls to the service. Now, for the purposes of this video, I'm going to skip over the metrics and alerts in the classic view and move to the metrics and alerts that are the more modern view. If I go into Metrics here, you're going to notice that almost every service within Azure has this pretty standard way of building a graph. I have my storage account, and I can select the account level or the blob level; this is very familiar from the overview screen. And now, what do I want to see? Let's say I want to look at the number of blobs in my account, and it's going to average that over time. Now I've created a graph that averages the blob count in my account. Remember, this is an hourly view, so it's going to show zero; we just uploaded one blob. If we go up to the right here, we can change the graph from a 24-hour view to a 1-hour view with a 1-hour aggregation, and I can change that to a 1-minute or a 30-minute period. Now, in this particular case, it's not very helpful. As I previously stated, this isn't a production account with blobs coming in and out, and it hasn't yet registered the file that we uploaded in this view.
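The difference between the hourly and minute-level diagnostics above is just aggregation granularity. A sketch of how sixty per-minute samples roll up into one hourly data point (plain Python, not the Azure metrics API):

```python
def hourly_average(minute_samples):
    """Average 60 per-minute metric samples into one hourly data point."""
    if len(minute_samples) != 60:
        raise ValueError("expected one sample per minute of the hour")
    return sum(minute_samples) / 60

# e.g. egress in MB per minute: a quiet hour with one 24 MB upload spike
samples = [0.0] * 59 + [24.0]
print(hourly_average(samples))  # the spike averages out to 0.4 MB/min
```

This is why a short burst of activity can be nearly invisible on an hourly chart: the hourly aggregation smears the spike across the whole hour, whereas the 1-minute view would show it clearly.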
But once again, if this is something you want, you can watch your blobs growing over time, or you can even add the capacity metric. So, on a scale of zero to 100, how close are you to exceeding your storage account's capacity? That kind of thing can be interesting, and you can pin it to the dashboard. You can also create an alert rule: say I want to receive an SMS message when the capacity of the storage account exceeds 75%. So I create an alert rule; the resource is the storage account, and for the condition, what metric do I want? I want the used capacity, and I want to alert when it's greater than a value, which is specified in bytes. The capacity limit of a storage account is five petabytes. So a million bytes is a megabyte, a billion is a gigabyte, a trillion is a terabyte, and a quadrillion is a petabyte. So if this capacity is greater than, say, one petabyte, checked every minute and averaged over the hour, then I want to receive an SMS message. Now, this is just a very basic type of alert, and note that there is a cost to alerts. Down here, if I have an existing action group, I can add it, but I can also create an action group. An action group is basically the set of actions to take: send SMS messages and emails, run a function, trigger some automation in an app, call a URL, or any other kind of thing we want to happen. We can set that up as an action group; I'm not going to create one right now, so I'll close this out without creating it. But, in essence, that is how you would alert. Finally, we should talk about Insights. When I open this up, it gets into Azure Monitor; if I were to go to the Azure Monitor service, we would see something similar to this. This Insights preview is basically a predefined template of interesting reports. Again, we can pin these to the dashboard if we need them, we can always go back to the gallery and look at other templates, or we can customize a template to become our own storage dashboard.
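Because the alert condition is expressed in bytes, it helps to work the threshold out explicitly. The sketch below uses decimal units (a quadrillion bytes per petabyte), matching the counting above, and the 75% example from the SMS alert:

```python
MB = 10**6   # megabyte (decimal units)
GB = 10**9   # gigabyte
TB = 10**12  # terabyte
PB = 10**15  # petabyte

# Alert when used capacity exceeds 75% of the 5 PB storage account limit
account_limit = 5 * PB
threshold_bytes = int(0.75 * account_limit)
print(threshold_bytes)  # 3750000000000000
```

Typing that 16-digit number into the alert condition by hand is error-prone, which is exactly why it's worth doing the unit conversion deliberately.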
Assume we want to go into Capacity here and see some existing graphs on how our storage accounts are doing in terms of capacity or performance. We can see how much of the traffic is succeeding, and the latency in milliseconds for all the various calls. So the REST API operations, Put, Set, Delete, and Create, are giving us performance metrics. This gallery is a predefined list of interesting reports, and if any one of these becomes interesting to you, again, you can pin it, or you can even create an alert based on it.
Now remember, when we created this storage account, we chose locally redundant storage. I said at the time that the definition of locally redundant storage is that Azure keeps three copies of your data in the same data center. The risk is that when that data center or region goes down, which does happen, let's say once a year or at some other infrequent interval, you're not going to have access to your data. So let's go into the configuration setting, and we can see here that we can actually change the redundancy of our data. We upgrade from locally redundant storage to geo-redundant storage and click save. What we're doing is getting Azure to replicate our data to another data center. What it's doing, of course, is copying all the contents of this storage account into three other places, setting all that up behind the scenes in another geographical region. Now, as the pop-up said, it takes up to 30 seconds for the setting to take effect; it's pretty quick, actually. Back on the overview screen, we're now on geo-redundant storage. So all of our existing files are being replicated to an additional region, and if I were to upload another one, it would automatically be replicated as well. Under the geo-replication settings, we can see that this is geo-redundant storage, that it is primarily in the Canada Central region, where we created it, and that it's now being replicated to the Canada East region. These regions basically come in pairs: Canada Central and Canada East are a pair. You'll also notice it stays in Canada, so our data is not actually leaving the geographic area. If you put your data in the United States, your data will stay in the US; if you put your data in Europe, geo-redundant storage will choose another data center in Europe.
These are called region pairs, and they have the fastest high-speed fibre-optic cable connections between them, and generally they're in the same geography. There are some exceptions: for example, there's only one region in South America, in Brazil, and so its pair is in the United States, but that pairing only works one way. Now I'm going to pause this video while it's doing the synchronization. It tells me a failover can't be initiated while replication is still in progress, so I'll pause the video, and when we come back, we're going to be able to perform a failover between these two regions. So that took about ten or fifteen minutes, and we've now got our primary and secondary data centers set up, and replication has occurred: three copies of our files in Canada Central and three copies in Canada East. Now, because this is geo-redundant storage and not the read-access kind, we don't have access to the secondary copy; there's no direct URL to it. If I go into the properties of this storage account, there's only one URL, for the primary endpoint. If we had the read-access version, we would have two of them. But let's just say Canada Central is down and we're like, "No, we need to get to these files." We can basically initiate this failover, forcing the primary to become Canada East. So, in this case, we would use geo-replication to control how we access these files, especially in an emergency. Now, I'm not going to go through the process of starting the failover and moving from Canada Central to Canada East, but this is how we would do it if we needed to get in there and force a failover.
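The region-pair idea can be sketched as a simple lookup. The Canada Central / Canada East pair comes straight from this demo; the other entries are a small illustrative subset of documented Azure pairs, including the one-way Brazil exception mentioned above:

```python
# Each region's geo-replication target (a small illustrative subset)
region_pair = {
    "Canada Central": "Canada East",
    "Canada East": "Canada Central",
    "North Europe": "West Europe",
    "West Europe": "North Europe",
    # Brazil South is the one-way exception: it replicates to the US,
    # but no region replicates into Brazil South.
    "Brazil South": "South Central US",
}

print(region_pair["Canada Central"])  # Canada East
```

Note how most pairs are symmetric (each region is the other's target), which is what keeps data inside its geography; Brazil South appears only as a key, never as a value.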
Now, Azure gives us many different ways to give others access to the contents of storage accounts. We've already seen that access keys are sort of the central element of the storage account; you don't want to give these access keys to too many people, or to anyone at all if possible. The shared access signature was one way to generate a token or URL that you can grant to others, giving them limited permissions that expire, without having to touch your access keys. Microsoft also provides an efficient method using role-based access control (RBAC). If we go to the Access Control tab of the storage account, we can see a familiar access control interface; this interface is common to almost every resource within Azure. We can click on Role Assignments and see a list of applications and other services that have been given access to this account, and even individuals who have been granted guest access. If we click Add, then Add Role Assignment, we'll be taken to a page where we can select an individual. I'm going to go with this John Doe, who currently has no access to the storage account through RBAC. When we open the drop-down for the role, we can see the typical Owner, Contributor, and Reader roles, which can effectively be granted across your entire subscription. But if we scroll down a little bit, we can see a lot of storage-specific roles: the Storage Account Contributor, the Storage Account Key Operator, blob-related roles, file-related roles, and queue-related roles. So let's say you want this John Doe to be able to read basic blob data. You choose the Storage Blob Data Reader role, grant it, and click save. Now, it does take about 5 minutes for this to propagate, but within about five, any application or user that authenticates with Azure Active Directory as John Doe will have reader access to this storage account. So in this way, we're using role-based access control to grant access to a storage account.
Now, we can see that this role has been granted for this resource only, and not across the subscription. If we left this storage account and went to another storage account's Access Control blade, we could scroll down, but we would not see that the John Doe account has been granted access there. So using this technique actually restricts the scope of the grant to this individual resource.
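Role assignments like the one above are scoped by an ARM resource ID, and granting at the storage-account scope rather than the subscription scope is what keeps the grant to one resource. A small helper sketch showing how that scope string is built; the subscription ID, resource group, and account name below are placeholders:

```python
def storage_account_scope(subscription_id: str, resource_group: str,
                          account_name: str) -> str:
    """Build the ARM scope string for an RBAC assignment on one storage account.

    Assigning a role at this scope (rather than at the subscription or
    resource-group level) is what limits access to a single resource.
    """
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account_name}"
    )

# Placeholder values for illustration
scope = storage_account_scope("00000000-0000-0000-0000-000000000000",
                              "my-rg", "mystorageacct")
print(scope)
```

Dropping the trailing `/providers/...` segment would give the resource-group scope, and dropping everything after the subscription ID would give the subscription scope, which is exactly the broader grant the lecture warns about with Owner, Contributor, and Reader.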
So, as we can see here within the portal, we have the ability to upload files. With the three dots here, we can access the View/Edit option, view the file, download a copy locally, or even delete it. But there are not a lot of options at the portal level when it comes to manipulating this file, such as moving it to another storage account, moving it to another container, and so on. So in this particular case, we're going to want to download a program called AzCopy. If we go into a search engine and search for AzCopy, you'll find a link that downloads it directly to your computer, and if you download that software and run the setup, you'll be able to install it right there on your C drive.
Now, the reason we're installing AzCopy is that we're going to use this command-line tool to manipulate files and move them. We want to move this 1-1.jpg file into the images folder, and we'll show you how to do that. By default, it was installed into my C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy folder. I do need to add this to my Path variable. So I go into the environment variables in the Windows system properties, click Path, and then click Edit; this is in my user account on my Windows machine, though I could obviously make it system-wide. Adding the AzCopy folder to the path is what's going to allow me to access it. I prefer to do things with PowerShell, so if I go in here and type "azcopy," it finds the command. Sweet. Now, AzCopy is going to be fairly easy to use, and if I pass it a question mark, it gives some examples. What we want to do is copy a blob within a storage account. So we're going to use something like this, saying: from this container and this folder, go to this other container and this other folder. So let's type out the syntax. Actually, before we continue, let's log in to our Azure account with az login. This pops up a dialog in which we have to log into our Microsoft Azure account; I'm going to use my account. It says you have logged into Azure and you can close this window, or it will close itself. So now we're logged in. We want to use the AzCopy command, and the first parameter of that is the source. In this case, we want to duplicate the single JPEG, so the source is basically going to be the container.
So we go up to the blob level, where we see we only have the one container. I go into the container's properties, where I can get the URL for the container, and then go back to Windows PowerShell; this is the source container. Now we need the destination container. The destination is going to be the same container, but with a different folder structure: in this case, we're going to make it the images virtual directory. If I go back into the container, we know we want to copy this file from here into the images directory, so we simply make the destination the images directory. The third parameter is called the pattern. We know our file is 1-1.jpg, so that is the pattern we're going to look for in the source container to copy into the images virtual directory. Now, we do need to provide keys for this, so we go back up to the storage account level, go into the access keys, and copy one of the keys. That's not the only option; AzCopy does support SAS, shared access signatures, so we could have generated a shared access signature on this blob and copied that SAS token instead of using the key.
For the sake of this example, I'm going to use the key, so I specify the source key parameter with that value, putting it in quotes just to be safe. The other parameter is the destination key, and since this is the same storage account, the key is the same as well; obviously, copying across multiple storage accounts would involve different keys. If I run this, I expect the 1-1.jpg image to be copied. This is a server-side copy, meaning the file is not downloaded to my local machine and then re-uploaded, which is especially important for megabyte, gigabyte, and larger files, because you don't want to incur the cost and time of downloading the file and then having to upload it again; Microsoft charges for bandwidth, and you'd spend the time, so the server-side copy that AzCopy performs here is what you want. If you do want to copy through your local machine, there is a parameter called sync copy. If I added that parameter, it would copy the image into local memory, not save it to my file system, and upload it to the new location; but I want this all to happen on the server side, so I won't do that. So let's hit enter, and we can see that the number of files transferred is one, and it took a fraction of a second. If I go back to the storage account and into the images folder, I can see that the file has been copied from the root into the images folder. Okay, so that's how you do server-side copying. You could then use AzCopy to delete the image from the root if you wanted; that's a separate command, of course. And that's how you would move files between storage accounts, even across subscriptions, as long as you have the access keys.
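The classic AzCopy invocation from this demo can also be assembled programmatically. The sketch below only builds the argument list, using the /Source, /Dest, /Pattern, /SourceKey, and /DestKey flags shown above; the storage account URL, container names, and key are placeholders, and the actual subprocess call is left commented out:

```python
import subprocess  # needed only if you uncomment the run() call below

def build_azcopy_copy(source_url: str, dest_url: str,
                      pattern: str, key: str) -> list:
    """Argument list for a server-side copy with classic AzCopy on Windows."""
    return [
        "AzCopy",
        f"/Source:{source_url}",
        f"/Dest:{dest_url}",
        f"/Pattern:{pattern}",
        f"/SourceKey:{key}",  # same account here, so both keys are identical
        f"/DestKey:{key}",
    ]

# Placeholder storage account, container, and key
cmd = build_azcopy_copy(
    "https://mystorageacct.blob.core.windows.net/newtest",
    "https://mystorageacct.blob.core.windows.net/newtest/images",
    "1-1.jpg",
    "<storage-account-key>",
)
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the copy
```

Omitting /SyncCopy keeps this a server-side copy; adding it would route the bytes through the local machine, as described above.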