
Master the Google Professional Cloud Architect Exam with Exam Questions & Study Materials | SPOTO

Master the Professional Cloud Architect exam with our comprehensive collection of exam questions and study materials. Our resources include practice tests, mock exams, and sample questions, with detailed explanations of the key concepts behind each answer. Use our online exam questions and exam simulator to recreate real exam conditions and gauge your readiness. With expertly crafted study materials, you'll build the knowledge and confidence needed to excel on exam day. Trust SPOTO for effective preparation resources and guidance on your way to becoming a certified Professional Cloud Architect.

Question #1
Your architecture calls for the centralized collection of all admin activity and VM system logs within your project. How should you collect these logs from both VMs and services?
A. All admin and VM system logs are automatically collected by Stackdriver
B. Stackdriver automatically collects admin activity logs for most services
C. Launch a custom syslogd compute instance and configure your GCP project and VMs to forward all logs to it
D. Install the Stackdriver Logging agent on a single compute instance and let it collect all audit and access logs for your environment
Correct Answer: B
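
To see what answer B means in practice, here is a minimal sketch using the google-cloud-logging Python client to read the automatically collected Admin Activity audit logs; the project ID is a placeholder, and VM system logs still require the Logging agent on each instance.

    from google.cloud import logging

    client = logging.Client(project="my-project")  # placeholder project ID
    admin_activity_filter = (
        'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"'
    )
    for entry in client.list_entries(filter_=admin_activity_filter, max_results=10):
        print(entry.timestamp, entry.log_name)
    # Admin Activity audit logs arrive with no setup; VM system logs (syslog,
    # etc.) only appear after the Logging/Ops agent is installed on each VM.
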
Question #2
Mountkirk Games has deployed their new backend on Google Cloud Platform (GCP). You want to create a thorough testing process for new versions of the backend before they are released to the public. You want the testing environment to scale in an economical way. How should you design the process?
A. Create a scalable environment in GCP for simulating production load
B. Use the existing infrastructure to test the GCP-based backend at scale
C. Build stress tests into each component of your application using resources internal to GCP to simulate load
D. Create a set of static environments in GCP to test different levels of load – for example, high, medium, and low
Correct Answer: A
Question #3
You have been engaged by your client to lead the migration of their application infrastructure to GCP. One of their current problems is that the on-premises high-performance SAN requires frequent and expensive upgrades to keep up with the variety of workloads, which are identified as follows: 20 TB of log archives retained for legal reasons; 500 GB of VM boot/data volumes and templates; 500 GB of image thumbnails; 200 GB of customer session state data that allows customers to restart sessions even if offline. Which of the following should you recommend for the customer session state data?
A. Local SSD for customer session state data
B. Memcache backed by Cloud Datastore for the customer session state data
C. Memcache backed by Cloud SQL for customer session state data
D. Memcache backed by Persistent Disk SSD storage for customer session state data
Correct Answer: D
Question #4
A development team at your company has created a dockerized HTTPS web application. You need to deploy the application on Google Kubernetes Engine (GKE) and make sure that the application scales automatically. How should you deploy to GKE?
A. Use the Horizontal Pod Autoscaler and enable cluster autoscaling
B. Use the Horizontal Pod Autoscaler and enable cluster autoscaling on the Kubernetes cluster
C. Enable autoscaling on the Compute Engine instance group
D. Enable autoscaling on the Compute Engine instance group
Correct Answer: B
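
As a rough illustration of answer B, the sketch below uses the official kubernetes Python client to attach a Horizontal Pod Autoscaler to a hypothetical Deployment named web; cluster autoscaling itself is turned on at the GKE node-pool level (for example via the console or gcloud), not in this code.

    from kubernetes import client, config

    config.load_kube_config()  # assumes kubectl is already authenticated to the GKE cluster
    hpa = client.V1HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name="web"),
        spec=client.V1HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V1CrossVersionObjectReference(
                api_version="apps/v1", kind="Deployment", name="web"
            ),
            min_replicas=2,
            max_replicas=10,
            target_cpu_utilization_percentage=60,  # add pods when average CPU exceeds ~60%
        ),
    )
    client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
        namespace="default", body=hpa
    )
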
Question #5
You are using Cloud SQL as the database backend for a large CRM deployment. You want to scale as usage increases and ensure that you don't run out of storage, keep CPU usage below 75% of available cores, and keep replication lag below 60 seconds. What are the correct steps to meet your requirements?
A. 1
B. 1
C. 1
D. 1
Correct Answer: A
Question #6
Your BigQuery project has several users. For audit purposes, you need to see how many queries each user ran in the last month. What should you do?
A. Connect Google Data Studio to BigQuery
B. In the BigQuery interface, execute a query on the JOBS table to get the required information
C. Use ‘bq show’ to list all jobs
D. Use Cloud Audit Logging to view Cloud Audit Logs, and create a filter on the query operation to get the required information
Correct Answer: C
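
For reference, one programmatic way to tally query jobs per user over the last 30 days is the BigQuery Jobs API via the google-cloud-bigquery client; this is only a sketch with a placeholder project ID, not the only route to the same numbers.

    from collections import Counter
    from datetime import datetime, timedelta, timezone

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project ID
    since = datetime.now(timezone.utc) - timedelta(days=30)
    per_user = Counter(
        job.user_email
        for job in client.list_jobs(all_users=True, min_creation_time=since)
        if job.job_type == "query"
    )
    for user, count in per_user.most_common():
        print(user, count)
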
Question #7
Your company is building a new architecture to support its data-centric business focus. You are responsible for setting up the network. Your company’s mobile and web-facing applications will be deployed on-premises, and all data analysis will be conducted in GCP. The plan is to process and load 7 years of archived .csv files totaling 900 TB of data and then continue loading 10 TB of data daily. You currently have an existing 100-MB internet connection. What actions will meet your company’s needs?
A. Compress and upload both archived files and files uploaded daily using the gsutil –m option
B. Lease a Transfer Appliance, upload archived files to it, and send it to Google to transfer archived data to Cloud Storage
C. Lease a Transfer Appliance, upload archived files to it, and send it to Google to transfer archived data to Cloud Storage
D. Lease a Transfer Appliance, upload archived files to it, and send it to Google to transfer archived data to Cloud Storage
Correct Answer: B
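
The arithmetic behind Question #7 is worth spelling out; assuming the "100-MB" connection means roughly 100 Mbps of usable bandwidth, a quick estimate shows why shipping the archive on a Transfer Appliance is the practical choice.

    # Back-of-the-envelope transfer times, assuming ~100 Mbps usable bandwidth.
    archive_bytes = 900e12          # 900 TB of archived .csv files
    daily_bytes = 10e12             # 10 TB of new data per day
    link_bps = 100e6                # ~100 Mbps

    backlog_days = archive_bytes * 8 / link_bps / 86400
    daily_hours = daily_bytes * 8 / link_bps / 3600
    print(f"Archive over the wire: ~{backlog_days:.0f} days")   # roughly 833 days
    print(f"Each day's 10 TB:      ~{daily_hours:.0f} hours")   # roughly 222 hours
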
Question #8
You are running a cluster on Kubernetes Engine (GKE) to serve a web application. Users are reporting that a specific part of the application is not responding anymore. You notice that all pods of your deployment keep restarting after 2 seconds. The application writes logs to standard output. You want to inspect the logs to find the cause of the issue. Which approach can you take?
A. Review the Stackdriver logs for each Compute Engine instance that is serving as a node in the cluster
B. Review the Stackdriver logs for the specific GKE container that is serving the unresponsive part of the application
C. Connect to the cluster using gcloud credentials and connect to a container in one of the pods to read the logs
D. Review the Serial Port logs for each Compute Engine instance that is serving as a node in the cluster
Correct Answer: B
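
To make answer B concrete, here is a minimal google-cloud-logging sketch that pulls recent log entries for one GKE container; the project, cluster, and container names are placeholders, and it assumes the k8s_container resource type used by Stackdriver/Cloud Logging for GKE workloads.

    from google.cloud import logging

    client = logging.Client(project="my-project")  # placeholder project ID
    container_filter = (
        'resource.type="k8s_container" '
        'AND resource.labels.cluster_name="web-cluster" '   # placeholder names
        'AND resource.labels.container_name="frontend" '
        'AND severity>=WARNING'
    )
    for entry in client.list_entries(filter_=container_filter, max_results=20):
        print(entry.timestamp, entry.payload)
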
Question #9
TerramEarth has equipped all connected trucks with servers and sensors to collect telemetry data. Next year they want to use the data to train machine learning models. They want to store this data in the cloud while reducing costs. What should they do?
A. Have the vehicle’s computer compress the data in hourly snapshots, and store it in a Google Cloud Storage (GCS) Nearline bucket
B. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Google BigQuery
C. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Cloud Bigtable
D. Have the vehicle’s computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket
Correct Answer: D
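
As an illustration of answer D, a sketch of the hourly path: gzip the snapshot locally, then upload it with the google-cloud-storage client to a bucket that was created with the COLDLINE storage class; every file, bucket, and project name here is a placeholder.

    import gzip
    import shutil

    from google.cloud import storage

    snapshot = "telemetry-2024-01-01-00.csv"              # placeholder snapshot file
    with open(snapshot, "rb") as src, gzip.open(snapshot + ".gz", "wb") as dst:
        shutil.copyfileobj(src, dst)                       # compress before upload

    client = storage.Client(project="my-project")          # placeholder project ID
    bucket = client.bucket("terramearth-telemetry")         # bucket created with COLDLINE class
    blob = bucket.blob("trucks/truck-42/" + snapshot + ".gz")
    blob.upload_from_filename(snapshot + ".gz")
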
Question #10
Your company acquired a healthcare startup and must retain its customers’ medical information for up to 4 more years, depending on when it was created. Your corporate policy is to securely retain this data, and then delete it as soon as regulations allow. Which approach should you take?
A. Store the data in Google Drive and manually delete records as they expire
B. Anonymize the data using the Cloud Data Loss Prevention API and store it indefinitely
C. Store the data in Cloud Storage and use lifecycle management to delete files when they expire
D. Store the data in Cloud Storage and run a nightly batch script that deletes all expired data
Correct Answer: C
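
Answer C maps directly onto Object Lifecycle Management; a minimal sketch with the google-cloud-storage Python client, where the bucket name and retention length are placeholders and the age rule counts days since object creation.

    from google.cloud import storage

    client = storage.Client(project="my-project")            # placeholder project ID
    bucket = client.get_bucket("acquired-medical-records")    # placeholder bucket name
    bucket.add_lifecycle_delete_rule(age=4 * 365)             # delete objects ~4 years after creation
    bucket.patch()                                            # push the lifecycle rule to the bucket
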
Question #11
You have a Python web application with many dependencies that requires 0.1 CPU cores and 128 MB of memory to operate in production. You want to monitor and maximize machine utilization. You also want to reliably deploy new versions of the application. Which set of steps should you take?
A. Perform the following: 1
B. Perform the following: 1
C. Perform the following: 1
D. Perform the following: 1
Correct Answer: B
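
Although the answer options are truncated above, the stated figures translate naturally into Kubernetes resource requests, which is what lets a scheduler pack pods tightly and keep machine utilization high while rolling updates handle reliable deployments. Below is a hedged sketch with the kubernetes Python client, using placeholder image and cluster details.

    from kubernetes import client, config

    config.load_kube_config()  # assumes kubectl already points at the cluster
    container = client.V1Container(
        name="web",
        image="gcr.io/my-project/web-app:v2",  # placeholder image
        resources=client.V1ResourceRequirements(
            requests={"cpu": "100m", "memory": "128Mi"}  # 0.1 core, 128 MB per pod
        ),
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="web"),
        spec=client.V1DeploymentSpec(
            replicas=3,
            selector=client.V1LabelSelector(match_labels={"app": "web"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "web"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
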
Question #12
Your company pushes batches of sensitive transaction data from its application server VMs to Cloud Pub/Sub for processing and storage. What is the Google-recommended way for your application to authenticate to the required Google Cloud services?
A. Ensure that VM service accounts are granted the appropriate Cloud Pub/Sub IAM roles
B. Ensure that VM service accounts do not have access to Cloud Pub/Sub, and use VM access scopes to grant the appropriate Cloud Pub/Sub IAM roles
C. Generate an OAuth2 access token for accessing Cloud Pub/Sub, encrypt it, and store it in Cloud Storage for access from each VM
D. Create a gateway to Cloud Pub/Sub using a Cloud Function, and grant the Cloud Function service account the appropriate Cloud Pub/Sub IAM roles
Correct Answer: A
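
With answer A, the application code needs no keys or tokens at all: on a Compute Engine VM the client library picks up the attached service account through Application Default Credentials, provided that account has been granted a role such as roles/pubsub.publisher. A minimal sketch, with placeholder project and topic names:

    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()   # uses the VM's attached service account
    topic_path = publisher.topic_path("my-project", "transactions")  # placeholders
    future = publisher.publish(topic_path, data=b'{"txn_id": 123}')
    print(future.result())                    # message ID once the publish succeeds
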
Question #13
Your customer wants to capture multiple GBs of aggregate real-time key performance indicators (KPIs) from their game servers running on Google Cloud Platform and monitor the KPIs with low latency. How should they capture the KPIs?
A. Store time-series data from the game servers in Google Bigtable, and view it using Google Data Studio
B. Output custom metrics to Stackdriver from the game servers, and create a Dashboard in Stackdriver Monitoring Console to view them
C. Schedule BigQuery load jobs to ingest analytics files uploaded to Cloud Storage every ten minutes, and visualize the results in Google Data Studio
D. Insert the KPIs into Cloud Datastore entities, and run ad hoc analysis and visualizations of them in Cloud Datalab
Correct Answer: A
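
To illustrate the listed answer, here is a sketch of writing one KPI sample as a time-series row to Cloud Bigtable with the google-cloud-bigtable client; the instance, table, and column family are placeholders and must already exist.

    from datetime import datetime, timezone

    from google.cloud import bigtable

    client = bigtable.Client(project="my-project")               # placeholder project ID
    table = client.instance("game-kpis").table("kpi_events")      # placeholder instance/table
    now = datetime.now(timezone.utc)
    # Row key groups samples by server and time so recent KPIs scan quickly.
    row = table.direct_row(f"server-42#{now.isoformat()}".encode())
    row.set_cell("kpi", "concurrent_players", 1843, timestamp=now)  # int stored as a 64-bit value
    row.commit()
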
Question #14
You need to develop procedures to test a disaster plan for a mission-critical application. You want to use Google-recommended practices and native capabilities within GCP. What should you do?
A. Use Deployment Manager to automate service provisioning
B. Use Deployment Manager to automate service provisioning
C. Use gcloud scripts to automate service provisioning
D. Use gcloud scripts to automate service provisioning
Correct Answer: B
Question #15
You are tasked with building an online analytical processing (OLAP) marketing analytics and reporting tool. This requires a relational database that can operate on hundreds of terabytes of data. What is the Google-recommended tool for such applications?
A. Cloud Spanner, because it is globally distributed
B. Cloud SQL, because it is a fully managed relational database
C. Cloud Firestore, because it offers real-time synchronization across devices
D. BigQuery, because it is designed for large-scale processing of tabular data
Correct Answer: D
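
Answer D is also the simplest to consume: once the marketing data sits in BigQuery, analysts can query terabytes with plain SQL through the google-cloud-bigquery client. A sketch, with placeholder dataset and table names:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")   # placeholder project ID
    sql = """
        SELECT channel, SUM(spend) AS total_spend
        FROM `my-project.marketing.campaign_events`   -- placeholder table
        GROUP BY channel
        ORDER BY total_spend DESC
    """
    for row in client.query(sql).result():
        print(row.channel, row.total_spend)
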
