AZ-204: Developing Solutions for Microsoft Azure
Question#11

HOTSPOT -
You are developing a back-end Azure App Service that scales based on the number of messages contained in a Service Bus queue.
A rule already exists to scale up the App Service when the average queue length of unprocessed and valid queue messages is greater than 1000.
You need to add a new rule that will continuously scale down the App Service as long as the scale up condition is not met.
How should you configure the Scale rule? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Answer:
Box 1: Service bus queue -
You are developing a back-end Azure App Service that scales based on the number of messages contained in a Service Bus queue.

Box 2: ActiveMessageCount -
ActiveMessageCount: Messages in the queue or subscription that are in the active state and ready for delivery.

Box 3: Count -

Box 4: Less than or equal to -
You need to add a new rule that will continuously scale down the App Service as long as the scale up condition is not met.
Box 5: Decrease count by
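For orientation, a minimal sketch of how such a scale-in rule might appear in an autoscale setting (ARM rule fragment); the ActiveMessages metric name and the resource path are assumptions, not given in the question:
{
  "metricTrigger": {
    "metricName": "ActiveMessages",
    "metricResourceUri": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.ServiceBus/namespaces/<namespace>",
    "timeGrain": "PT1M",
    "statistic": "Average",
    "timeWindow": "PT5M",
    "timeAggregation": "Average",
    "operator": "LessThanOrEqual",
    "threshold": 1000
  },
  "scaleAction": {
    "direction": "Decrease",
    "type": "ChangeCount",
    "value": "1",
    "cooldown": "PT5M"
  }
}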

Question#12

DRAG DROP -
You have an application that uses Azure Blob storage.
You need to update the metadata of the blobs.
Which three methods should you use to develop the solution? To answer, move the appropriate methods from the list of methods to the answer area and arrange them in the correct order.
Select and Place:

Answer:
Metadata.Add example:
// Add metadata to the dictionary by calling the Add method
metadata.Add("docType", "textDocuments");
SetMetadataAsync example:
// Set the blob's metadata.
await blob.SetMetadataAsync(metadata);
// Set the blob's properties.
await blob.SetPropertiesAsync();
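A self-contained sketch of the full sequence, assuming the Azure.Storage.Blobs v12 SDK; the connection string, container, and blob names are placeholders:
using System.Collections.Generic;
using Azure.Storage.Blobs;

var container = new BlobContainerClient("<connection-string>", "documents");
BlobClient blob = container.GetBlobClient("report.txt");

// Build the metadata dictionary, add the entry, then persist it to the blob
var metadata = new Dictionary<string, string>();
metadata.Add("docType", "textDocuments");
await blob.SetMetadataAsync(metadata);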
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-properties-metadata

Question#13

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an Azure solution to collect point-of-sale (POS) device data from 2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of data every 24 hours. Each store location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a device identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution: Provision an Azure Event Grid. Configure the machine identifier as the partition key and enable capture.
Does the solution meet the goal?

  • A. Yes
  • B. No

Answer: B
Event Grid does not support partition keys or the Capture feature; those capabilities belong to Azure Event Hubs. To receive high-volume device telemetry, correlate it by device identifier, and persist it to Blob storage, provision an event hub instead, configure the device identifier as the partition key, and enable Capture.
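For contrast, a minimal sketch of the approach that does meet the goal, using the Azure.Messaging.EventHubs SDK; the connection string, hub name, and device identifier are placeholders:
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

await using var producer = new EventHubProducerClient("<connection-string>", "pos-telemetry");

// Partition by device identifier so every event from one device lands in the
// same partition and can be correlated downstream.
var options = new SendEventOptions { PartitionKey = "device-0042" };
await producer.SendAsync(
    new[] { new EventData(BinaryData.FromString("{\"reading\": 42}")) },
    options);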
Reference:
https://docs.microsoft.com/en-us/azure/event-grid/compare-messaging-services

Question#14

You develop Azure solutions.
A .NET application needs to receive a message each time an Azure virtual machine finishes processing data. The messages must NOT persist after being processed by the receiving application.
You need to implement the .NET object that will receive the messages.
Which object should you use?

  • A. QueueClient
  • B. SubscriptionClient
  • C. TopicClient
  • D. CloudQueueClient

Answer: D
A queue allows processing of a message by a single consumer. Use a CloudQueueClient to access the Azure Storage queue; once the receiving application processes and deletes a message, it no longer persists.
Incorrect Answers:
B, C: In contrast to queues, topics and subscriptions provide a one-to-many form of communication in a publish and subscribe pattern. It's useful for scaling to large numbers of recipients.
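A minimal receive-and-delete sketch using the legacy Microsoft.Azure.Storage.Queue SDK that CloudQueueClient belongs to; the connection string and queue name are placeholders:
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;

CloudStorageAccount account = CloudStorageAccount.Parse("<connection-string>");
CloudQueueClient queueClient = account.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference("vm-notifications");

// Retrieve the next message, process it, then delete it so it does not persist.
CloudQueueMessage message = await queue.GetMessageAsync();
if (message != null)
{
    // ... process message.AsString ...
    await queue.DeleteMessageAsync(message);
}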
Reference:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions

Question#15

DRAG DROP -
You are maintaining an existing application that uses an Azure Blob GPv1 Premium storage account. Data older than three months is rarely used.
Data newer than three months must be available immediately. Data older than a year must be saved but does not need to be available immediately.
You need to configure the account to support a lifecycle management rule that moves blob data to archive storage for data not modified in the last year.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:

Answer:
Step 1: Upgrade the storage account to GPv2
Object storage data tiering between hot, cool, and archive is supported in Blob Storage and General Purpose v2 (GPv2) accounts. General Purpose v1 (GPv1) accounts don't support tiering.
You can easily convert your existing GPv1 or Blob Storage accounts to GPv2 accounts through the Azure portal.
Step 2: Copy the data to be archived to a Standard GPv2 storage account and then delete the data from the original storage account
Step 3: Change the storage account access tier from hot to cool
Note: Hot - Optimized for storing data that is accessed frequently.
Cool - Optimized for storing data that is infrequently accessed and stored for at least 30 days.
Archive - Optimized for storing data that is rarely accessed and stored for at least 180 days with flexible latency requirements, on the order of hours.
Only the hot and cool access tiers can be set at the account level. The archive access tier can only be set at the blob level.
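A sketch of the resulting lifecycle management rule; the rule name and blob-type filter are illustrative:
{
  "rules": [
    {
      "name": "archive-after-one-year",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ]
        },
        "actions": {
          "baseBlob": {
            "tierToArchive": {
              "daysAfterModificationGreaterThan": 365
            }
          }
        }
      }
    }
  ]
}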
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers

Question#16

You develop Azure solutions.
You must connect to a NoSQL, globally distributed database by using the .NET API.
You need to create an object to configure and execute requests in the database.
Which code segment should you use?

  • A. new Container(EndpointUri, PrimaryKey);
  • B. new Database(EndpointUri, PrimaryKey);
  • C. new CosmosClient(EndpointUri, PrimaryKey);

Answer: C
Example:
// Create a new instance of the Cosmos client
this.cosmosClient = new CosmosClient(EndpointUri, PrimaryKey);
// Use the client to create the database
await this.CreateDatabaseAsync();
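A slightly fuller sketch with the Microsoft.Azure.Cosmos v3 SDK; the endpoint, key, and database/container names are placeholders:
using Microsoft.Azure.Cosmos;

var cosmosClient = new CosmosClient("<endpoint-uri>", "<primary-key>");

// The client is the single entry point for configuring and executing requests.
Database database = await cosmosClient.CreateDatabaseIfNotExistsAsync("ToDoList");
Container container = await database.CreateContainerIfNotExistsAsync("Items", "/category");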
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/sql-api-get-started

Question#17

You have an existing Azure storage account that stores large volumes of data across multiple containers.
You need to copy all data from the existing storage account to a new storage account. The copy process must meet the following requirements:
✑ Automate data movement.
✑ Minimize user input required to perform the operation.
✑ Ensure that the data movement process is recoverable.
What should you use?

  • A. AzCopy
  • B. Azure Storage Explorer
  • C. Azure portal
  • D. .NET Storage Client Library

Answer: A
You can copy blobs, directories, and containers between storage accounts by using the AzCopy v10 command-line utility.
The copy operation is synchronous, so when the command returns, all files have been copied.
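For example, an account-to-account copy might look like the following; the account names and SAS tokens are placeholders:
azcopy copy 'https://<source-account>.blob.core.windows.net?<SAS>' 'https://<destination-account>.blob.core.windows.net?<SAS>' --recursive
If the transfer is interrupted, it can be restarted with azcopy jobs resume, which is what makes the operation recoverable.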
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs-copy

Question#18

DRAG DROP -
You are developing a web service that will run on Azure virtual machines that use Azure Storage. You configure all virtual machines to use managed identities.
You have the following requirements:
✑ Secret-based authentication mechanisms are not permitted for accessing an Azure Storage account.
✑ Must use only Azure Instance Metadata Service endpoints.
You need to write code to retrieve an access token to access Azure Storage. To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Answer:
Box 1: http://169.254.169.254/metadata/identity/oauth2/token
This is the Azure Instance Metadata Service (IMDS) "/oauth2/token" endpoint.
Sample request using the Azure Instance Metadata Service (IMDS) endpoint (recommended):
GET 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://management.azure.com/' HTTP/1.1
Metadata: true
Box 2: JsonConvert.DeserializeObject<Dictionary<string,string>>(payload);
Deserialize the token response and return the access token.
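Put together, a sketch of the token retrieval; the Azure Storage resource URI is an assumption, and JsonConvert comes from Newtonsoft.Json:
using System.Collections.Generic;
using System.Net.Http;
using Newtonsoft.Json;

var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Get,
    "http://169.254.169.254/metadata/identity/oauth2/token" +
    "?api-version=2018-02-01&resource=https://storage.azure.com/");
request.Headers.Add("Metadata", "true"); // IMDS rejects requests without this header
HttpResponseMessage response = await client.SendAsync(request);
string payload = await response.Content.ReadAsStringAsync();
var token = JsonConvert.DeserializeObject<Dictionary<string, string>>(payload);
string accessToken = token["access_token"];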
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/how-to-use-vm-token
https://docs.microsoft.com/en-us/azure/service-fabric/how-to-managed-identity-service-fabric-app-code

Question#19

DRAG DROP -
You are developing a new page for a website that uses Azure Cosmos DB for data storage. The feature uses documents that have the following format:

You must display data for the new page in a specific order. You create the following query for the page:

You need to configure a Cosmos DB policy to support the query.
How should you configure the policy? To answer, drag the appropriate JSON segments to the correct locations. Each JSON segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Answer:
Box 1: compositeIndexes -
You can order by multiple properties. A query that orders by multiple properties requires a composite index.

Box 2: descending -
It is optional to specify the order; if it is not specified, the order is ascending.
Example: composite index defined for (name ASC, age ASC):
{
  "automatic": true,
  "indexingMode": "consistent",
  "includedPaths": [
    {
      "path": "/*"
    }
  ],
  "excludedPaths": [],
  "compositeIndexes": [
    [
      {
        "path": "/name"
      },
      {
        "path": "/age"
      }
    ]
  ]
}
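For the descending sort the question calls for, each path in a composite index can carry an explicit order; reusing the property names from the example above:
"compositeIndexes": [
  [
    {
      "path": "/name",
      "order": "ascending"
    },
    {
      "path": "/age",
      "order": "descending"
    }
  ]
]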

Question#20

HOTSPOT -
You are building a traffic monitoring system that monitors traffic along six highways. The system produces time series analysis-based reports for each highway.
Data from the traffic sensors is stored in Azure Event Hubs.
Traffic data is consumed by four departments. Each department has an Azure Web App that displays the time series-based reports and contains a WebJob that processes the incoming data from Event Hub. All Web Apps run on App Service Plans with three instances.
Data throughput must be maximized. Latency must be minimized.
You need to implement the Azure Event Hub.
Which settings should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Answer:
Box 1: 6 -
The number of partitions is specified at creation and must be between 2 and 32.
There are six highways, so create one partition per highway.

Box 2: Highway -
Reference:
https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-features
