Google Associate Cloud Engineer Practice Test

Associate Cloud Engineer


Question 1

You have a Google Cloud Platform account with access to both production and development projects. You need to create an
automated process to list all compute instances in development and production projects on a daily basis. What should you
do?

  • A. Create two configurations using gcloud config. Write a script that sets configurations as active, individually. For each configuration, use gcloud compute instances list to get a list of compute resources.
  • B. Create two configurations using gsutil config. Write a script that sets configurations as active, individually. For each configuration, use gsutil compute instances list to get a list of compute resources.
  • C. Go to Cloud Shell and export this information to Cloud Storage on a daily basis.
  • D. Go to GCP Console and export this information to Cloud SQL on a daily basis.
Answer:

A
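
Explanation:
A minimal sketch of the setup and daily job, assuming placeholder configuration names and project IDs:

  # One-time setup: one gcloud configuration per environment
  gcloud config configurations create dev --activate
  gcloud config set project dev-project-id
  gcloud config configurations create prod --activate
  gcloud config set project prod-project-id

  # Daily script: activate each configuration and list its instances
  for cfg in dev prod; do
    gcloud config configurations activate "$cfg"
    gcloud compute instances list
  done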


Question 2

You need to create a Compute Engine instance in a new project that doesn’t exist yet. What should you do?

  • A. Using the Cloud SDK, create a new project, enable the Compute Engine API in that project, and then create the instance specifying your new project.
  • B. Enable the Compute Engine API in the Cloud Console, use the Cloud SDK to create the instance, and then use the --project flag to specify a new project.
  • C. Using the Cloud SDK, create the new instance, and use the --project flag to specify the new project. Answer yes when prompted by Cloud SDK to enable the Compute Engine API.
  • D. Enable the Compute Engine API in the Cloud Console. Go to the Compute Engine section of the Console to create a new instance, and look for the Create In A New Project option in the creation form.
Answer:

A
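
Explanation:
An instance cannot be created in a project that does not yet exist, so the project must be created (and linked to billing) and the Compute Engine API enabled before the instance is created. A sketch with placeholder IDs:

  gcloud projects create my-new-project-id
  gcloud config set project my-new-project-id
  gcloud services enable compute.googleapis.com
  gcloud compute instances create my-instance --zone=us-central1-a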


Question 3

You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains
sensitive data. You want access to the content to be removed after four hours. The external company does not have a
Google account to which you can grant specific user-based access privileges. You want to use the most secure method that
requires the fewest steps. What should you do?

  • A. Create a signed URL with a four-hour expiration and share the URL with the company.
  • B. Set object access to ‘public’ and use object lifecycle management to remove the object after four hours.
  • C. Configure the storage bucket as a static website and furnish the object's URL to the company. Delete the object from the storage bucket after four hours.
  • D. Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.
Answer:

A
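
Explanation:
A signed URL grants time-limited access to a single object without requiring a Google account. An illustrative command, assuming a placeholder service account key, bucket, and object name:

  gsutil signurl -d 4h /path/to/service-account-key.json gs://example-bucket/sensitive-object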


Question 4

Your auditor wants to view your organization's use of data in Google Cloud. The auditor is most interested in auditing who
accessed data in Cloud Storage buckets. You need to help the auditor access the data they need. What should you do?

  • A. Turn on Data Access Logs for the buckets they want to audit, and then build a query in the log viewer that filters on Cloud Storage.
  • B. Assign the appropriate permissions, and then create a Data Studio report on Admin Activity Audit Logs.
  • C. Assign the appropriate permissions, and then use Cloud Monitoring to review metrics.
  • D. Use the export logs API to provide the Admin Activity Audit Logs in the format they want.
Answer:

A

Explanation:
Reference: https://cloud.google.com/storage/docs/audit-logging
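
Data Access audit logs (not Admin Activity logs) record who read or wrote Cloud Storage objects, so they must be enabled for the buckets and then filtered in the log viewer. An illustrative equivalent filter, with a placeholder project ID:

  gcloud logging read \
    'logName="projects/my-project-id/logs/cloudaudit.googleapis.com%2Fdata_access" AND resource.type="gcs_bucket"' \
    --project=my-project-id --limit=10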


Question 5

You need to set up permissions for a set of Compute Engine instances to enable them to write data into a particular Cloud
Storage bucket. You want to follow Google-recommended practices. What should you do?

  • A. Create a service account with an access scope. Use the access scope https://www.googleapis.com/auth/devstorage.write_only.
  • B. Create a service account with an access scope. Use the access scope ‘https://www.googleapis.com/auth/cloud-platform’.
  • C. Create a service account and add it to the IAM role ‘storage.objectCreator’ for that bucket.
  • D. Create a service account and add it to the IAM role ‘storage.objectAdmin’ for that bucket.
Answer:

C
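
Explanation:
Following the principle of least privilege, roles/storage.objectCreator is enough to write new objects into the bucket, whereas roles/storage.objectAdmin grants broader permissions than required. A sketch with placeholder names:

  # Create the service account the instances will run as
  gcloud iam service-accounts create instance-writer

  # Grant it object-creation rights on the specific bucket only
  gsutil iam ch \
    serviceAccount:instance-writer@my-project-id.iam.gserviceaccount.com:roles/storage.objectCreator \
    gs://example-bucket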


Question 6

Your management has asked an external auditor to review all the resources in a specific project. The security team has
enabled the Organization Policy called Domain Restricted Sharing on the organization node by specifying only your Cloud
Identity domain. You want the auditor to only be able to view, but not modify, the resources in that project. What should you
do?

  • A. Ask the auditor for their Google account, and give them the Viewer role on the project.
  • B. Ask the auditor for their Google account, and give them the Security Reviewer role on the project.
  • C. Create a temporary account for the auditor in Cloud Identity, and give that account the Viewer role on the project.
  • D. Create a temporary account for the auditor in Cloud Identity, and give that account the Security Reviewer role on the project.
Answer:

C
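
Explanation:
Because Domain Restricted Sharing only allows members of your Cloud Identity domain, the auditor needs a temporary account in that domain; the Viewer role then gives read-only access. An illustrative binding with placeholder values:

  gcloud projects add-iam-policy-binding audited-project-id \
    --member="user:temp-auditor@your-domain.com" \
    --role="roles/viewer"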


Question 7

You are working for a hospital that stores its medical images in an on-premises data room. The hospital wants to use Cloud
Storage for archival storage of these images. The hospital wants an automated process to upload any new medical images
to Cloud Storage. You need to design and implement a solution. What should you do?

  • A. Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub topic. Create an application that sends all medical images to the Pub/Sub topic.
  • B. Deploy a Dataflow job from the batch template, Datastore to Cloud Storage. Schedule the batch job on the desired interval.
  • C. Create a script that uses the gsutil command line interface to synchronize the on-premises storage with Cloud Storage. Schedule the script as a cron job.
  • D. In the Cloud Console, go to Cloud Storage. Upload the relevant images to the appropriate bucket.
Answer:

C
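
Explanation:
A possible crontab entry for the nightly sync, assuming placeholder local and bucket paths:

  # Run every day at 02:00; -m parallelizes, -r recurses into subdirectories
  0 2 * * * gsutil -m rsync -r /data/medical-images gs://hospital-image-archive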


Question 8

You significantly changed a complex Deployment Manager template and want to confirm that the dependencies of all defined
resources are properly met before committing it to the project. You want the most rapid feedback on your changes. What
should you do?

  • A. Use granular logging statements within a Deployment Manager template authored in Python.
  • B. Monitor activity of the Deployment Manager execution on the Stackdriver Logging page of the GCP Console.
  • C. Execute the Deployment Manager template against a separate project with the same configuration, and monitor for failures.
  • D. Execute the Deployment Manager template using the --preview option in the same project, and observe the state of interdependent resources.
Answer:

D

Explanation:
Reference: https://cloud.google.com/deployment-manager/docs/deployments/updating-deployments
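
An illustrative preview run, using placeholder deployment and config names:

  # --preview stages the update without creating or modifying resources,
  # so dependency problems surface immediately
  gcloud deployment-manager deployments update my-deployment \
    --config=config.yaml --preview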


Question 9

You have sensitive data stored in three Cloud Storage buckets and have enabled data access logging. You want to verify the
activities of a particular user on these buckets, using the fewest possible steps.
You need to verify the addition of metadata labels and which files have been viewed from those buckets. What should you
do?

  • A. Using the GCP Console, filter the Activity log to view the information.
  • B. Using the GCP Console, filter the Stackdriver log to view the information.
  • C. View the bucket in the Storage section of the GCP Console.
  • D. Create a trace in Stackdriver to view the information.
Answer:

A
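
Explanation:
With data access logging enabled, the Console's Activity log can be filtered for these buckets and that user in one place. An illustrative equivalent command-line filter (email and project ID are placeholders):

  gcloud logging read \
    'logName:"cloudaudit.googleapis.com%2Fdata_access" AND resource.type="gcs_bucket" AND protoPayload.authenticationInfo.principalEmail="user@example.com"' \
    --project=my-project-id --limit=20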


Question 10

You have an application that uses Cloud Spanner as a backend database. The application has a very predictable traffic
pattern. You want to automatically scale up or down the number of Spanner nodes depending on traffic. What should you
do?

  • A. Create a cron job that runs on a scheduled basis to review Cloud Monitoring metrics, and then resize the Spanner instance accordingly.
  • B. Create a Cloud Monitoring alerting policy to send an alert to oncall SRE emails when Cloud Spanner CPU exceeds the threshold. SREs would scale resources up or down accordingly.
  • C. Create a Cloud Monitoring alerting policy to send an alert to Google Cloud Support email when Cloud Spanner CPU exceeds your threshold. Google support would scale resources up or down accordingly.
  • D. Create a Cloud Monitoring alerting policy to send an alert to webhook when Cloud Spanner CPU is over or under your threshold. Create a Cloud Function that listens to HTTP and resizes Spanner resources accordingly.
Answer:

D
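
Explanation:
The webhook-triggered Cloud Function would call the Spanner Admin API to change the node count; the equivalent manual resize, with placeholder values, is:

  gcloud spanner instances update my-spanner-instance --nodes=5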
