
AWS SAP-C02 Exam Questions and Practice Test for Solutions Architect Professional

Exam Name: AWS Solutions Architect Professional (AWS-SAP)
Exam Code: SAP-C02
Exam Price: $300 USD
Duration: 180 minutes
Number of Questions: 75
Passing Score: 750 / 1000
Recommended Training / Books: Advanced Architecting on AWS
Recommended Practice: AWS Certified Solutions Architect - Professional Practice Test

Get ready for the AWS Certified Solutions Architect Professional exam with this compilation of SAP-C02 practice questions and answers, designed to help you evaluate your exam readiness and test your Solutions Architect skills.


Question #1
A company wants to change its internal cloud billing strategy for each of its business units. Currently, the cloud governance team shares reports for overall cloud spending with the head of each business unit. The company uses AWS Organizations to manage the separate AWS accounts for each business unit. The existing tagging standard in Organizations includes the application, environment, and owner. The cloud governance team wants a centralized solution so each business unit receives monthly reports on its cloud spending. The solution should also send notifications for any cloud spending that exceeds a set threshold. Which solution is the MOST cost-effective way to meet these requirements?
A. Configure AWS Budgets in each account and configure budget alerts that are grouped by application, environment, and owner. Add each business unit to an Amazon SNS topic for each alert.
B. Configure AWS Budgets in the organization's master account and configure budget alerts that are grouped by application, environment, and owner. Add each business unit to an Amazon SNS topic for each alert. Use Cost Explorer in the organization's master account to create monthly reports for each business unit.
C. Configure AWS Budgets in each account and configure budget alerts that are grouped by application, environment, and owner. Add each business unit to an Amazon SNS topic for each alert.
D. Enable AWS Cost and Usage Reports in the organization's master account and configure reports grouped by application, environment, and owner. Create an AWS Lambda function that processes AWS Cost and Usage Reports, sends budget alerts, and sends monthly reports to each business unit's email list.
Correct Answer: B
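The per-unit threshold check that AWS Budgets automates in answer B can be sketched locally. The business-unit names, spend figures, and the $10,000 threshold below are invented for illustration, not taken from the question.

```python
# Illustrative threshold-alert logic; in practice AWS Budgets evaluates
# this per budget and publishes to the SNS topic for each alert.
THRESHOLD_USD = 10_000  # assumed threshold, not from the question

def units_over_threshold(spend_by_unit, threshold=THRESHOLD_USD):
    """Return the business units whose monthly spend exceeds the threshold."""
    return sorted(unit for unit, spend in spend_by_unit.items()
                  if spend > threshold)

monthly_spend = {"retail": 12_500.0, "logistics": 8_200.0, "payments": 10_001.0}
print(units_over_threshold(monthly_spend))  # ['payments', 'retail']
```

Doing this once in the master account (answer B) rather than per account avoids duplicating budgets and reports across every member account.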


Question #2
A company has an on-premises monitoring solution that uses a PostgreSQL database for persistence of events. The database is unable to scale due to heavy ingestion, and it frequently runs out of storage. The company wants to create a hybrid solution and has already set up a VPN connection between its network and AWS. The solution should include the following attributes:
1. Managed AWS services to minimize operational complexity.
2. A buffer that automatically scales to match the throughput of data and requires no ongoing administration.
3. A visualization tool to create dashboards to observe events in near-real time.
4. Support for semi-structured JSON data and dynamic schemas.
Which combination of components will enable the company to create a monitoring solution that will satisfy these requirements? (Select TWO.)
A. Use Amazon Kinesis Data Firehose to buffer events.
B. Create an Amazon Kinesis data stream to buffer events.
C. Configure an Amazon Aurora PostgreSQL DB cluster to receive events.
D. Configure Amazon Elasticsearch Service (Amazon ES) to receive events.
E. Configure an Amazon Neptune DB instance to receive events.
Correct Answer: BD
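A Kinesis data stream (answer B) distributes records across shards by hashing each record's partition key. The sketch below mimics that conceptually; the event shape, the `host` partition key, and the shard count are invented examples, and Kinesis itself maps the MD5 hash into shard hash-key ranges rather than using a modulo.

```python
import hashlib

NUM_SHARDS = 4  # assumed shard count for illustration

def shard_for(event, num_shards=NUM_SHARDS):
    """Pick a shard the way Kinesis does conceptually: hash the partition key."""
    key = event["host"]  # assumed partition-key field
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % num_shards

# Semi-structured JSON event with a dynamic schema (requirement 4)
event = {"host": "web-01", "level": "error", "detail": {"code": 500}}
print(shard_for(event))
```

Because records with the same partition key always hash to the same shard, per-host event ordering is preserved while throughput scales with the shard count.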
Question #3
A solutions architect needs to advise a company on how to migrate its on-premises data processing application to the AWS Cloud. Currently, users upload input files through a portal. The web server then stores the uploaded files on NAS and messages the processing server over a message queue. Each media file can take up to 1 hour to process. The company has determined that the number of media files awaiting processing is significantly higher during business hours, with the number of files rapidly declining after business hours. What is the MOST cost-effective migration recommendation?
A. Create a queue using Amazon SQS. Configure the existing web server to publish to the new queue. When there are messages in the queue, invoke an AWS Lambda function to pull requests from the queue and process the files. Store the processed files in an Amazon S3 bucket.
B. Create a queue using Amazon MQ. Configure the existing web server to publish to the new queue. When there are messages in the queue, create a new Amazon EC2 instance to pull requests from the queue and process the files. Store the processed files in Amazon EBS. Shut down the EC2 instance after the task is complete.
C. Create a queue using Amazon MQ. Configure the existing web server to publish to the new queue. When there are messages in the queue, invoke an AWS Lambda function to pull requests from the queue and process the files. Store the processed files in Amazon EFS.
D. Create a queue using Amazon SQS. Configure the existing web server to publish to the new queue. Use Amazon EC2 instances in an EC2 Auto Scaling group to pull requests from the queue and process the files. Scale the EC2 instances based on the SQS queue length. Store the processed files in an Amazon S3 bucket.
Correct Answer: D
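The scaling rule in answer D can be sketched as a target-capacity calculation from the SQS queue depth. The per-instance throughput (one file per hour, since each file takes up to an hour) follows from the question, but the group size limits below are assumed for illustration.

```python
import math

def desired_instances(queue_length, files_per_instance_per_hour=1,
                      min_size=0, max_size=20):
    """Target Auto Scaling group size from SQS queue depth.

    One instance drains roughly one message per hour; min_size/max_size
    are illustrative limits, not from the question.
    """
    wanted = math.ceil(queue_length / files_per_instance_per_hour)
    return max(min_size, min(max_size, wanted))

print(desired_instances(45))  # 20 (capped at max_size during business hours)
print(desired_instances(0))   # 0  (scale to zero after hours)
```

Scaling to zero overnight is what makes this more cost-effective than Lambda (which cannot run a 1-hour job under its 15-minute limit) or an always-on fleet.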
Question #4
A weather service provides high-resolution weather maps from a web application hosted on AWS in the eu-west-1 Region. The weather maps are updated frequently and stored in Amazon S3 along with static HTML content. The web application is fronted by Amazon CloudFront. The company recently expanded to serve users in the us-east-1 Region, and these new users report that viewing their respective weather maps is slow from time to time. Which combination of steps will resolve the us-east-1 performance issues? (Select TWO.)
A. Configure the AWS Global Accelerator endpoint for the S3 bucket in eu-west-1. Configure endpoint groups for TCP ports 80 and 443 in us-east-1.
B. Create a new S3 bucket in us-east-1. Configure S3 Cross-Region Replication to synchronize from the S3 bucket in eu-west-1.
C. Use Lambda@Edge to modify requests from North America to use the S3 Transfer Acceleration endpoint in us-east-1.
D. Use Lambda@Edge to modify requests from North America to use the S3 bucket in us-east-1.
E. Configure the AWS Global Accelerator endpoint for us-east-1 as an origin on the CloudFront distribution. Use Lambda@Edge to modify requests from North America to use the new origin.
Correct Answer: BD
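Answers B and D combine replication with request routing: a Lambda@Edge origin-request function rewrites North American viewers to the replicated us-east-1 bucket. The routing decision can be sketched as a pure function; the bucket names and the country-to-region mapping are invented placeholders.

```python
# Hypothetical Lambda@Edge origin-selection logic. Real Lambda@Edge
# handlers receive a CloudFront event and mutate request['origin'];
# this sketch isolates just the decision.
NA_COUNTRIES = {"US", "CA", "MX"}  # assumed mapping for "North America"

def pick_origin_domain(viewer_country):
    """Route North American viewers to the replicated us-east-1 bucket."""
    if viewer_country in NA_COUNTRIES:
        return "weather-maps-us-east-1.s3.amazonaws.com"  # placeholder bucket
    return "weather-maps-eu-west-1.s3.amazonaws.com"      # placeholder bucket

print(pick_origin_domain("US"))
print(pick_origin_domain("DE"))
```

Serving from a same-region replica cuts the origin fetch latency on CloudFront cache misses, which is what makes the frequently updated maps slow for us-east-1 users.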
Question #5
A company is using AWS CloudFormation to deploy its infrastructure. The company is concerned that, if a production CloudFormation stack is deleted, important data stored in Amazon RDS databases or Amazon EBS volumes might also be deleted. How can the company prevent users from accidentally deleting data in this way?
A. Modify the CloudFormation templates to add a DeletionPolicy attribute to RDS and EBS resources.
B. Configure a stack policy that disallows the deletion of RDS and EBS resources.
C. Modify IAM policies to deny deleting RDS and EBS resources that are tagged with an "aws:cloudformation:stack-name" tag.
D. Use AWS Config rules to prevent deleting RDS and EBS resources.
Correct Answer: A
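The DeletionPolicy attribute from answer A sits directly on each resource in the template. The fragment below builds a minimal template as a Python dict; the resource names and properties are placeholders, but `DeletionPolicy: Snapshot` and `DeletionPolicy: Retain` are the real CloudFormation attribute values that preserve data when the stack is deleted.

```python
# Minimal CloudFormation template fragment (as a dict) showing
# DeletionPolicy on an RDS instance and an EBS volume.
template = {
    "Resources": {
        "AppDatabase": {  # placeholder logical ID
            "Type": "AWS::RDS::DBInstance",
            "DeletionPolicy": "Snapshot",  # snapshot the DB before deletion
            "Properties": {"DBInstanceClass": "db.t3.medium",
                           "Engine": "mysql",
                           "AllocatedStorage": "100"},
        },
        "DataVolume": {  # placeholder logical ID
            "Type": "AWS::EC2::Volume",
            "DeletionPolicy": "Retain",  # keep the volume when the stack goes
            "Properties": {"Size": 100, "AvailabilityZone": "us-east-1a"},
        },
    }
}

for name, resource in template["Resources"].items():
    print(name, "->", resource["DeletionPolicy"])
```

Note that DeletionPolicy lives at the resource level, alongside `Type` and `Properties`, not inside `Properties`.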
Question #6
An AWS partner company is building a service in AWS Organizations using its organization named org1. This service requires the partner company to have access to AWS resources in a customer account, which is in a separate organization named org2. The company must establish least privilege security access using an API or command line tool to the customer account. What is the MOST secure way to allow org1 to access resources in org2?
A. The customer should provide the partner company with their AWS account access keys to log in and perform the required tasks.
B. The customer should create an IAM user and assign the required permissions to the IAM user.
C. The customer should create an IAM role and assign the required permissions to the IAM role.
D. The customer should create an IAM role and assign the required permissions to the IAM role.
Correct Answer: D
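The cross-account role pattern behind answer D rests on a trust policy in the customer account (org2) that lets the partner account (org1) call `sts:AssumeRole`. The fragment below builds such a trust policy; the account ID and external ID are placeholders, and the external-ID condition is a common hardening step for third-party access (it is not stated in the question).

```python
import json

PARTNER_ACCOUNT_ID = "111122223333"  # placeholder org1 account ID

# Trust policy the customer attaches to the cross-account IAM role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{PARTNER_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
        # External ID mitigates the confused-deputy problem for third parties
        "Condition": {"StringEquals": {"sts:ExternalId": "org1-service-7f3a"}},
    }],
}
print(json.dumps(trust_policy, indent=2))
```

Roles beat IAM users or shared access keys here because credentials are temporary and the granted permissions can be scoped to least privilege on the role's permission policy.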
Question #7
A company is hosting a three-tier web application in an on-premises environment. Due to a recent surge in traffic that resulted in downtime and a significant financial impact, company management has ordered that the application be moved to AWS. The application is written in .NET and has a dependency on a MySQL database. A solutions architect must design a scalable and highly available solution to meet the demand of 200,000 daily users. Which steps should the solutions architect take to design an appropriate solution?
A. Use AWS Elastic Beanstalk to create a new application with a web server environment and an Amazon RDS MySQL Multi-AZ DB instance. The environment should launch a Network Load Balancer (NLB) in front of an Amazon EC2 Auto Scaling group in multiple Availability Zones.
B. Use AWS CloudFormation to launch a stack containing an Application Load Balancer (ALB) in front of an Amazon EC2 Auto Scaling group spanning three Availability Zones.
C. Use AWS Elastic Beanstalk to create an automatically scaling web server environment that spans two separate Regions with an Application Load Balancer (ALB) in each Region.
D. Use AWS CloudFormation to launch a stack containing an Application Load Balancer (ALB) in front of an Amazon ECS cluster of Spot Instances spanning three Availability Zones.
Correct Answer: B
Question #8
A life sciences company is using a combination of open-source tools to manage data analysis workflows and Docker containers running on servers in its on-premises data center to process genomics data. Sequencing data is generated and stored on a local storage area network (SAN), and then the data is processed. The research and development teams are running into capacity issues and have decided to re-architect their genomics analysis platform on AWS to scale based on workload demands and reduce the turnaround time from weeks to days. The company has a high-speed AWS Direct Connect connection. Sequencers will generate around 200 GB of data for each genome, and individual jobs can take several hours to process the data with ideal compute capacity. The end result will be stored in Amazon S3. The company is expecting 10-15 job requests each day. Which solution meets these requirements?
A. Use regularly scheduled AWS Snowball Edge devices to transfer the sequencing data into AWS.
B. Use AWS Data Pipeline to transfer the sequencing data to Amazon S3.
C. Use AWS DataSync to transfer the sequencing data to Amazon S3.
D. Use an AWS Storage Gateway file gateway to transfer the sequencing data to Amazon S3.
Correct Answer: C
Question #9
A financial services company receives a regular data feed from its credit card servicing partner. Approximately 5,000 records are sent every 15 minutes in plaintext, delivered over HTTPS directly into an Amazon S3 bucket with server-side encryption. This feed contains sensitive credit card primary account number (PAN) data. The company needs to automatically mask the PAN before sending the data to another S3 bucket for additional internal processing. The company also needs to remove and merge specific fields, and then transform the record into JSON format. Additionally, extra feeds are likely to be added in the future, so any design needs to be easily expandable. Which solution will meet these requirements?
A. Trigger an AWS Lambda function on file delivery that extracts each record and writes it to an Amazon SQS queue. Trigger another Lambda function when new messages arrive in the SQS queue to process the records, writing the results to a temporary location in Amazon S3. Trigger a final Lambda function once the SQS queue is empty to transform the records into JSON format and send the results to another S3 bucket for internal processing.
B. Trigger an AWS Lambda function on file delivery that extracts each record and writes it to an Amazon SQS queue. Configure an AWS Fargate container application to automatically scale to a single instance when the SQS queue contains messages. Have the application process each record and transform the record into JSON format. When the queue is empty, send the results to another S3 bucket for internal processing and scale down the AWS Fargate instance.
C. Create an AWS Glue crawler and custom classifier based on the data feed formats, and build a table definition to match. Trigger an AWS Lambda function on file delivery to start an AWS Glue ETL job to transform the entire record according to the processing and transformation requirements. Define the output format as JSON. Once complete, have the ETL job send the results to another S3 bucket for internal processing.
D. Create an AWS Glue crawler and custom classifier based upon the data feed formats, and build a table definition to match. Perform an Amazon Athena query on file delivery to start an Amazon EMR ETL job to transform the entire record according to the processing and transformation requirements. Define the output format as JSON. Once complete, send the results to another S3 bucket for internal processing and scale down the EMR cluster.
Correct Answer: C
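The record transformation that the Glue ETL job in answer C performs can be sketched as a pure function: mask the PAN, merge fields, and emit JSON. The field names (`pan`, `first_name`, `last_name`, `amount`) and the keep-last-four masking rule are illustrative assumptions, not part of the question.

```python
import json

def transform_record(record):
    """Mask the PAN, merge the name fields, and emit the record as JSON.

    A minimal sketch of the per-record transform; field names and the
    masking rule (star out all but the last four digits) are assumed.
    """
    pan = record["pan"]
    masked = "*" * (len(pan) - 4) + pan[-4:]
    out = {
        "pan": masked,
        # merge two source fields into one, per the requirements
        "name": f"{record['first_name']} {record['last_name']}",
        "amount": record["amount"],
    }
    return json.dumps(out, sort_keys=True)

rec = {"pan": "4111111111111111", "first_name": "Ana",
       "last_name": "Diaz", "amount": 19.99}
print(transform_record(rec))
# {"amount": 19.99, "name": "Ana Diaz", "pan": "************1111"}
```

Keeping the transform per-record like this is also why the Glue approach extends cleanly to new feeds: each new format only needs its own classifier and table definition.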
Question #10
A finance company is running its business-critical application on current-generation Linux EC2 instances. The application includes a self-managed MySQL database performing heavy I/O operations. The application handles a moderate amount of traffic well during most of the month. However, it slows down during the final three days of each month due to month-end reporting, even though the company is using Elastic Load Balancers and Auto Scaling within its infrastructure to meet the increased demand. Which of the following actions would allow the database to handle the month-end load with the LEAST impact on performance?
A. Pre-warming Elastic Load Balancers, using a bigger instance type, and changing all Amazon EBS volumes to GP2 volumes.
B. Performing a one-time migration of the database cluster to Amazon RDS, and creating several additional read replicas to handle the load during the end of the month.
C. Using Amazon CloudWatch with AWS Lambda to change the type, size, or IOPS of Amazon EBS volumes in the cluster based on a specific CloudWatch metric.
D. Replacing all existing Amazon EBS volumes with new Provisioned IOPS volumes that have the maximum available storage size and I/O per second, by taking snapshots before the end of the month and reverting back afterwards.
Correct Answer: B
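Answer B works because month-end reporting is read-heavy, and reads can be fanned out across RDS read replicas while writes stay on the primary. The toy router below shows the idea; the endpoint names are placeholders, and real applications usually get this from a driver, proxy, or RDS reader endpoint rather than hand-rolled routing.

```python
import itertools

# Placeholder endpoints for a primary and its read replicas.
PRIMARY = "mysql-primary.example.rds.amazonaws.com"
REPLICAS = ["mysql-replica-1.example.rds.amazonaws.com",
            "mysql-replica-2.example.rds.amazonaws.com"]
_replica_cycle = itertools.cycle(REPLICAS)

def endpoint_for(sql):
    """Send SELECTs round-robin to read replicas; everything else to the primary."""
    if sql.lstrip().upper().startswith("SELECT"):
        return next(_replica_cycle)
    return PRIMARY

print(endpoint_for("SELECT * FROM monthly_report"))   # a replica
print(endpoint_for("INSERT INTO events VALUES (1)"))  # the primary
```

Adding replicas only for the final three days of the month would scale the read path without resizing storage or taking the primary offline, unlike options A, C, and D.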
