Professional Cloud Architect on Google Cloud Platform
Questions 71-80 of 270 (page 8 of 27)
Question#71

You are managing an application deployed on Cloud Run for Anthos, and you need to define a strategy for deploying new versions of the application. You want to evaluate the new code with a subset of production traffic to decide whether to proceed with the rollout. What should you do?

  • A. Deploy a new revision to Cloud Run with the new version. Configure traffic percentage between revisions.
  • B. Deploy a new service to Cloud Run with the new version. Add a Cloud Load Balancing instance in front of both services.
  • C. In the Google Cloud Console page for Cloud Run, set up continuous deployment using Cloud Build for the development branch. As part of the Cloud Build trigger, configure the substitution variable TRAFFIC_PERCENTAGE with the percentage of traffic you want directed to a new version.
  • D. In the Google Cloud Console, configure Traffic Director with a new Service that points to the new version of the application on Cloud Run. Configure Traffic Director to send a small percentage of traffic to the new version of the application.
Answer: A
Cloud Run manages each deployment as a revision and lets you split traffic between revisions, which is the built-in way to evaluate new code with a subset of production traffic before a full rollout.
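A minimal sketch of this rollout with the gcloud CLI, assuming a hypothetical service named my-service and a new container image tagged v2 (on Cloud Run for Anthos you would also pass --platform=gke with the cluster flags, and flag availability can differ slightly between platforms):

    # Deploy the new version as a revision that receives no traffic yet.
    gcloud run deploy my-service \
        --image=gcr.io/my-project/my-app:v2 \
        --no-traffic \
        --tag=canary

    # Shift 10% of production traffic to the tagged revision to evaluate it.
    gcloud run services update-traffic my-service --to-tags=canary=10

    # If the evaluation succeeds, promote the latest revision to 100% of traffic.
    gcloud run services update-traffic my-service --to-latest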

Question#72

You are monitoring Google Kubernetes Engine (GKE) clusters in a Cloud Monitoring workspace. As a Site Reliability Engineer (SRE), you need to triage incidents quickly. What should you do?

  • A. Navigate the predefined dashboards in the Cloud Monitoring workspace, and then add metrics and create alert policies.
  • B. Navigate the predefined dashboards in the Cloud Monitoring workspace, create custom metrics, and install alerting software on a Compute Engine instance.
  • C. Write a shell script that gathers metrics from GKE nodes, publish these metrics to a Pub/Sub topic, export the data to BigQuery, and make a Data Studio dashboard.
  • D. Create a custom dashboard in the Cloud Monitoring workspace for each incident, and then add metrics and create alert policies.
Answer: A
The predefined GKE dashboards in Cloud Monitoring already surface the key cluster metrics, so extending them and attaching alert policies is the fastest way to triage incidents; building a new dashboard per incident adds work after the fact.
Reference:
https://cloud.google.com/monitoring/charts/dashboards
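Once you have identified the metrics of interest in the predefined dashboards, the alert policies themselves can also be created from the command line; a hedged sketch, assuming a policy definition already saved as policy.json in the documented alerting-policy format:

    # Create an alerting policy from a JSON/YAML definition file.
    gcloud alpha monitoring policies create --policy-from-file=policy.json

    # List existing policies to confirm it was created.
    gcloud alpha monitoring policies list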

Question#73

You are implementing a single Cloud SQL MySQL second-generation database that contains business-critical transaction data. You want to ensure that the minimum amount of data is lost in case of catastrophic failure. Which two features should you implement? (Choose two.)

  • A. Sharding
  • B. Read replicas
  • C. Binary logging
  • D. Automated backups
  • E. Semisynchronous replication
Answer: C and D
Backups protect your data from loss or damage and let you restore a Cloud SQL instance to a previous state, so automated backups should be enabled for any instance that holds important data. Binary logging additionally enables point-in-time recovery, which lets you restore the instance to a moment just before the failure and therefore minimizes the amount of data lost. Enabling automated backups together with binary logging is also required for some operations, such as clone and replica creation.
Reference:
https://cloud.google.com/sql/docs/mysql/backup-recovery/backups
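Both features can be enabled with a single patch command; a minimal sketch, assuming a hypothetical instance named prod-sql and a backup window starting at 23:00 UTC:

    # Enable daily automated backups and binary logging (point-in-time recovery).
    gcloud sql instances patch prod-sql \
        --backup-start-time=23:00 \
        --enable-bin-log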

Question#74

You are working at a sports association whose members range in age from 8 to 30. The association collects a large amount of health data, such as sustained injuries. You are storing this data in BigQuery. Current legislation requires you to delete such information upon request of the subject. You want to design a solution that can accommodate such a request. What should you do?

  • A. Use a unique identifier for each individual. Upon a deletion request, delete all rows from BigQuery with this identifier.
  • B. When ingesting new data in BigQuery, run the data through the Data Loss Prevention (DLP) API to identify any personal information. As part of the DLP scan, save the result to Data Catalog. Upon a deletion request, query Data Catalog to find the column with personal information.
  • C. Create a BigQuery view over the table that contains all data. Upon a deletion request, exclude the rows that affect the subject's data from this view. Use this view instead of the source table for all analysis tasks.
  • D. Use a unique identifier for each individual. Upon a deletion request, overwrite the column with the unique identifier with a salted SHA256 of its value.
Answer: A
Keying each subject's rows to a unique identifier lets you honor an erasure request with a single DELETE statement against the table, whereas a DLP scan only identifies which columns contain personal information.
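A minimal sketch of the deletion request with the bq CLI, assuming a hypothetical table health.injuries keyed by a member_id column:

    # Delete every row belonging to the subject who made the request.
    bq query --use_legacy_sql=false \
        'DELETE FROM `my-project.health.injuries` WHERE member_id = "12345"'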

Question#75

Your company has announced that they will be outsourcing operations functions. You want to allow developers to easily stage new versions of a cloud-based application in the production environment and allow the outsourced operations team to autonomously promote staged versions to production. You want to minimize the operational overhead of the solution. Which Google Cloud product should you migrate to?

  • A. App Engine
  • B. GKE On-Prem
  • C. Compute Engine
  • D. Google Kubernetes Engine
Answer: A
App Engine lets developers stage versions without sending them traffic and lets the operations team promote a staged version on their own, with no infrastructure for either team to manage.
Reference:
https://cloud.google.com/security/compliance/eba-outsourcing-mapping-gcp
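A hedged sketch of that workflow, assuming a hypothetical version ID v2 on the default service:

    # Developers: deploy the new version without routing traffic to it.
    gcloud app deploy app.yaml --version=v2 --no-promote

    # Operations: promote the staged version by migrating all traffic to it.
    gcloud app services set-traffic default --splits=v2=1 --migrate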

Question#76

Your company is running its application workloads on Compute Engine. The applications have been deployed in production, acceptance, and development environments. The production environment is business-critical and is used 24/7, while the acceptance and development environments are only critical during office hours. Your CFO has asked you to optimize these environments to achieve cost savings during idle times. What should you do?

  • A. Create a shell script that uses the gcloud command to change the machine type of the development and acceptance instances to a smaller machine type outside of office hours. Schedule the shell script on one of the production instances to automate the task.
  • B. Use Cloud Scheduler to trigger a Cloud Function that will stop the development and acceptance environments after office hours and start them just before office hours.
  • C. Deploy the development and acceptance applications on a managed instance group and enable autoscaling.
  • D. Use regular Compute Engine instances for the production environment, and use preemptible VMs for the acceptance and development environments.
Answer: B
Reference:
https://cloud.google.com/blog/products/it-ops/best-practices-for-optimizing-your-cloud-costs
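A hedged sketch of the scheduling side, assuming hypothetical Pub/Sub-triggered Cloud Functions subscribed to the stop-environments and start-environments topics that call the Compute Engine API for the development and acceptance instances:

    # Stop development and acceptance instances every weekday evening.
    gcloud scheduler jobs create pubsub stop-dev-acc \
        --schedule="0 19 * * 1-5" \
        --topic=stop-environments \
        --message-body="stop" \
        --time-zone="Europe/London"

    # Start them again before office hours begin.
    gcloud scheduler jobs create pubsub start-dev-acc \
        --schedule="0 7 * * 1-5" \
        --topic=start-environments \
        --message-body="start" \
        --time-zone="Europe/London"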

Question#77

You are moving an application that uses MySQL from on-premises to Google Cloud. The application will run on Compute Engine and will use Cloud SQL. You want to cut over to the Compute Engine deployment of the application with minimal downtime and no data loss to your customers. You want to migrate the application with minimal modification. You also need to determine the cutover strategy. What should you do?

  • A. 1. Set up Cloud VPN to provide private network connectivity between the Compute Engine application and the on-premises MySQL server. 2. Stop the on-premises application. 3. Create a mysqldump of the on-premises MySQL server. 4. Upload the dump to a Cloud Storage bucket. 5. Import the dump into Cloud SQL. 6. Modify the source code of the application to write queries to both databases and read from its local database. 7. Start the Compute Engine application. 8. Stop the on-premises application.
  • B. 1. Set up Cloud SQL proxy and MySQL proxy. 2. Create a mysqldump of the on-premises MySQL server. 3. Upload the dump to a Cloud Storage bucket. 4. Import the dump into Cloud SQL. 5. Stop the on-premises application. 6. Start the Compute Engine application.
  • C. 1. Set up Cloud VPN to provide private network connectivity between the Compute Engine application and the on-premises MySQL server. 2. Stop the on-premises application. 3. Start the Compute Engine application, configured to read and write to the on-premises MySQL server. 4. Create the replication configuration in Cloud SQL. 5. Configure the source database server to accept connections from the Cloud SQL replica. 6. Finalize the Cloud SQL replica configuration. 7. When replication has been completed, stop the Compute Engine application. 8. Promote the Cloud SQL replica to a standalone instance. 9. Restart the Compute Engine application, configured to read and write to the Cloud SQL standalone instance.
  • D. 1. Stop the on-premises application. 2. Create a mysqldump of the on-premises MySQL server. 3. Upload the dump to a Cloud Storage bucket. 4. Import the dump into Cloud SQL. 5. Start the application on Compute Engine.
Answer: C
Replicating from the on-premises primary into a Cloud SQL replica and then promoting it keeps the databases in sync until cutover, so downtime is limited to the promotion step, no committed data is lost, and the application code does not need to be modified.
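A hedged sketch of the cutover once the Cloud SQL replica has caught up with the on-premises primary, assuming a hypothetical replica named cloudsql-replica (the replication setup itself follows the external server replication steps in the Cloud SQL documentation):

    # Confirm the replica has caught up before cutting over.
    gcloud sql instances describe cloudsql-replica

    # Stop the Compute Engine application, then promote the replica to a
    # standalone read/write instance.
    gcloud sql instances promote-replica cloudsql-replica

    # Restart the application configured to read and write to the promoted
    # Cloud SQL instance.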

Question#78

Your organization has decided to restrict the use of external IP addresses on instances to only approved instances. You want to enforce this requirement across all of your Virtual Private Clouds (VPCs). What should you do?

  • A. Remove the default route on all VPCs. Move all approved instances into a new subnet that has a default route to an internet gateway.
  • B. Create a new VPC in custom mode. Create a new subnet for the approved instances, and set a default route to the internet gateway on this new subnet.
  • C. Implement a Cloud NAT solution to remove the need for external IP addresses entirely.
  • D. Set an Organization Policy with a constraint on constraints/compute.vmExternalIpAccess. List the approved instances in the allowedValues list.
Answer: D
Reference:
https://cloud.google.com/compute/docs/ip-addresses/reserve-static-external-ip-address
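A hedged sketch of the organization policy, assuming hypothetical organization, project, zone, and instance names (the allowedValues entries must be full instance URIs):

    # Contents of vm-external-ip-policy.yaml:
    constraint: constraints/compute.vmExternalIpAccess
    listPolicy:
      allowedValues:
      - projects/my-project/zones/us-central1-a/instances/approved-vm

    # Apply the policy at the organization level.
    gcloud resource-manager org-policies set-policy vm-external-ip-policy.yaml \
        --organization=123456789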

Question#79

Your company uses the Firewall Insights feature in the Google Network Intelligence Center. You have several firewall rules applied to Compute Engine instances.
You need to evaluate the efficiency of the applied firewall ruleset. When you bring up the Firewall Insights page in the Google Cloud Console, you notice that there are no log rows to display. What should you do to troubleshoot the issue?

  • A. Enable Virtual Private Cloud (VPC) flow logging.
  • B. Enable Firewall Rules Logging for the firewall rules you want to monitor.
  • C. Verify that your user account is assigned the compute.networkAdmin Identity and Access Management (IAM) role.
  • D. Install the Google Cloud SDK, and verify that there are no Firewall logs in the command line output.
Answer: B
Reference:
https://cloud.google.com/network-intelligence-center/docs/firewall-insights/how-to/using-firewall-insights
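Firewall Insights only shows data for rules that have logging enabled; a minimal sketch, assuming a hypothetical rule named allow-web:

    # Turn on Firewall Rules Logging for the rule you want to analyze.
    gcloud compute firewall-rules update allow-web \
        --enable-logging \
        --logging-metadata=include-all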

Question#80

Your company has sensitive data in Cloud Storage buckets. Data analysts have Identity and Access Management (IAM) permissions to read the buckets. You want to prevent data analysts from retrieving the data in the buckets from outside the office network. What should you do?

  • A. 1. Create a VPC Service Controls perimeter that includes the projects with the buckets. 2. Create an access level with the CIDR of the office network.
  • B. 1. Create a firewall rule for all instances in the Virtual Private Cloud (VPC) network for source range. 2. Use the Classless Inter-domain Routing (CIDR) of the office network.
  • C. 1. Create a Cloud Function to remove IAM permissions from the buckets, and another Cloud Function to add IAM permissions to the buckets. 2. Schedule the Cloud Functions with Cloud Scheduler to add permissions at the start of business and remove permissions at the end of business.
  • D. 1. Create a Cloud VPN to the office network. 2. Configure Private Google Access for on-premises hosts.
Answer: A
A VPC Service Controls perimeter combined with an access level restricted to the office network's CIDR blocks API access to the buckets from outside the office, even for users who hold valid IAM permissions.
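A hedged sketch of the access level and perimeter, assuming a hypothetical Access Context Manager policy ID, project number, and office CIDR of 203.0.113.0/24:

    # office-network.yaml defines the IP condition for the access level:
    - ipSubnetworks:
      - 203.0.113.0/24

    # Create an access level for the office network.
    gcloud access-context-manager levels create office_network \
        --title="Office network" \
        --basic-level-spec=office-network.yaml \
        --policy=123456789

    # Put Cloud Storage inside a perimeter that only the access level can cross.
    gcloud access-context-manager perimeters create storage_perimeter \
        --title="Storage perimeter" \
        --resources=projects/111111111111 \
        --restricted-services=storage.googleapis.com \
        --access-levels=office_network \
        --policy=123456789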
