
We provide free access to the most recent AWS MLS-C01 practice exam questions and answers, which you can practice online!

The free exam questions are only a small part of what we offer. If you want to receive the complete set of AWS MLS-C01 test questions and answers, leave your email address and we'll send you more.

Related Read:

Download 2021 Update Free & Valid DVA-C01 Practice Test

QUESTION 1
A company plans to use machine learning (ML) to build a customer classification system based on profiles. The source transactional data is stored in an Amazon RDS for MySQL DB instance. The relevant table needs to be ingested to Amazon S3 daily for model retraining. The table has an ID column as the primary key. The column contains integer-type data and has been assigned the AUTO_INCREMENT attribute with an AUTO_INCREMENT value of 1.

The daily ingestion job should be able to automatically find rows that need to be ingested that day so that the job only ingests new data without duplicating older entries. The job should not omit any new data that needs to be ingested.

Which combination of actions should a data engineer perform to meet these requirements? (Select THREE.)

A. Set up an AWS Glue connection to the source database.

B. Configure an AWS Glue crawler to use the ID column as the bookmark key for the job bookmark. Use the crawler to create the source table.

C. Use an AWS Glue crawler to create the table in the AWS Glue Data Catalog from the source database.

D. Schedule an Apache Spark job in AWS Glue with job bookmark enabled.

E. Schedule an AWS Glue crawler to find new rows in the source database and ingest the data into Amazon S3.

F. Create an AWS Glue workflow. Schedule a Python shell job in AWS Glue to get the timestamp of the latest object upload in the relevant S3 bucket. Use the same job to perform an SQL query to fetch the data in the source table that is newer than the timestamp.

Correct Answer: ADF
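For reference, the scheduled Glue Spark job with job bookmarks that option D describes might look roughly like the following PySpark sketch. It assumes a Glue connection to the source database and a crawler-created catalog table already exist; the customer_db and customers names and the S3 path are hypothetical placeholders.

import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)  # job bookmarks are enabled on the job definition

# Hypothetical catalog names created beforehand by a Glue crawler over the RDS source
new_rows = glue_context.create_dynamic_frame.from_catalog(
    database="customer_db",
    table_name="customers",
    transformation_ctx="new_rows",  # a transformation_ctx is required for bookmarks to track state
    additional_options={"jobBookmarkKeys": ["id"], "jobBookmarkKeysSortOrder": "asc"},
)

glue_context.write_dynamic_frame.from_options(
    frame=new_rows,
    connection_type="s3",
    connection_options={"path": "s3://example-ml-bucket/customers/"},  # hypothetical bucket
    format="parquet",
)

job.commit()  # persists the bookmark so the next daily run starts after the last ingested id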

QUESTION 2

A car company is planning to run a marketing campaign to market a new vehicle to prior customers. The company has offered several similar promotions in the past and has the results of those promotions for building a model. The company has decided to experiment with sending a more expensive, glossy package of materials to a smaller number of customers. The company needs to contact at least 90% of the prior purchasers while minimizing the cost of mailing the marketing packages.

The company has trained a model using the linear learner algorithm in Amazon SageMaker. The model has a recall score of 80% and a precision of 75%. What should the company do to improve the model to meet its requirements?

A. Retrain the model by setting the target_recall hyperparameter to 90% and the binary_classifier_model_selection_criteria hyperparameter to recall_at_target_precision.

B. Retrain the model by setting the target_precision hyperparameter to 90% and the binary_classifier_model_selection_criteria hyperparameter to precision_at_target_recall.

C. Reshuffle the input data to increase the share of historical data used for input. Retrain the model with the number of epochs set to 20.

D. Retrain the model with the normalize_label hyperparameter set to true.

Correct Answer: A
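For reference, the linear learner documentation pairs a recall target with the precision_at_target_recall selection criterion (pick the model with the best precision among those that meet the recall target), which is how a 90% recall floor can be enforced while still controlling mailing cost. A minimal SageMaker Python SDK sketch, with a hypothetical role and bucket:

import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical role

estimator = Estimator(
    image_uri=image_uris.retrieve("linear-learner", session.boto_region_name),
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://example-ml-bucket/linear-learner/output/",  # hypothetical bucket
    sagemaker_session=session,
)

estimator.set_hyperparameters(
    predictor_type="binary_classifier",
    binary_classifier_model_selection_criteria="precision_at_target_recall",
    target_recall=0.9,  # contact at least 90% of prior purchasers
)

# estimator.fit({"train": "s3://example-ml-bucket/linear-learner/train/"})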

QUESTION 3

A machine learning specialist needs to analyze comments on a news website with users across the globe. The specialist must find the most discussed topics in the comments that are in either English or Spanish.

What steps could be used to accomplish this task? (Select TWO.)

A. Use an Amazon SageMaker BlazingText algorithm to find the topics independently from the language. Proceed with the analysis.

B. Use an Amazon SageMaker seq2seq algorithm to translate from Spanish to English, if necessary. Use a SageMaker Latent Dirichlet Allocation (LDA) algorithm to find the topics.

C. Use Amazon Translate to translate from Spanish to English, if necessary. Use Amazon Comprehend topic modeling to find the topics.

D. Use Amazon Translate to translate from Spanish to English, if necessary. Use Amazon Lex to extract the topics from the content.

E. Use Amazon Transcribe to translate from Spanish to English, if necessary. Use Amazon SageMaker Neural Topic Model (NTM) to find the topics.

Correct Answer: AE
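As an illustration of the translate-then-topic-model pattern in option C (not an endorsement of any particular answer choice), the following boto3 sketch translates Spanish comments to English and then starts a managed Comprehend topic modeling job; the bucket and role names are hypothetical.

import boto3

translate = boto3.client("translate")
comprehend = boto3.client("comprehend")

def to_english(text, source_language):
    # Translate a Spanish comment to English; pass English comments through unchanged
    if source_language == "en":
        return text
    response = translate.translate_text(
        Text=text,
        SourceLanguageCode=source_language,  # e.g. "es"
        TargetLanguageCode="en",
    )
    return response["TranslatedText"]

# Comprehend topic modeling runs asynchronously over documents staged in S3
comprehend.start_topics_detection_job(
    InputDataConfig={
        "S3Uri": "s3://example-comments-bucket/english/",  # hypothetical bucket
        "InputFormat": "ONE_DOC_PER_LINE",
    },
    OutputDataConfig={"S3Uri": "s3://example-comments-bucket/topics/"},
    DataAccessRoleArn="arn:aws:iam::123456789012:role/ComprehendS3AccessRole",  # hypothetical role
    NumberOfTopics=10,
)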

QUESTION 4

A company is launching a new product and needs to build a mechanism to monitor comments about the company and its new product on social media. The company needs to be able to evaluate the sentiment expressed in social media posts, visualize trends, and configure alarms based on various thresholds.

The company needs to implement this solution quickly, and wants to minimize the infrastructure and data science resources needed to evaluate the messages. The company already has a solution in place to collect posts and store them within an Amazon S3 bucket.
What services should the data science team use to deliver this solution?

A. Train a model in Amazon SageMaker by using the BlazingText algorithm to detect sentiment in the corpus of social media posts. Expose an endpoint that can be called by AWS Lambda. Trigger a Lambda function when posts are added to the S3 bucket to invoke the endpoint and record the sentiment in an Amazon DynamoDB table and in a custom Amazon CloudWatch metric. Use CloudWatch alarms to notify analysts of trends.

B. Train a model in Amazon SageMaker by using the semantic segmentation algorithm to model the semantic content in the corpus of social media posts. Expose an endpoint that can be called by AWS Lambda. Trigger a Lambda function when objects are added to the S3 bucket to invoke the endpoint and record the sentiment in an Amazon DynamoDB table. Schedule a second Lambda function to query recently added records and send an Amazon Simple Notification Service (Amazon SNS) notification to notify analysts of trends.

C. Trigger an AWS Lambda function when social media posts are added to the S3 bucket. Call Amazon Comprehend for each post to capture the sentiment in the message and record the sentiment in an Amazon DynamoDB table. Schedule a second Lambda function to query recently added records and send an Amazon Simple Notification Service (Amazon SNS) notification to notify analysts of trends.

D. Trigger an AWS Lambda function when social media posts are added to the S3 bucket. Call Amazon Comprehend for each post to capture the sentiment in the message and record the sentiment in a custom Amazon CloudWatch metric and in S3. Use CloudWatch alarms to notify analysts of trends.

Correct Answer: A
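A rough sketch of the Lambda piece that option A describes, assuming a sentiment model is already deployed behind a hypothetical endpoint named social-sentiment, that the container accepts a JSON payload, and that the DynamoDB table and metric namespace shown here are placeholders:

import json
import boto3

s3 = boto3.client("s3")
runtime = boto3.client("sagemaker-runtime")
dynamodb = boto3.resource("dynamodb")
cloudwatch = boto3.client("cloudwatch")

def handler(event, context):
    # Triggered by an S3 ObjectCreated notification for each new social media post
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        post = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # Request/response shapes depend on how the model container was built; JSON is assumed here
        response = runtime.invoke_endpoint(
            EndpointName="social-sentiment",  # hypothetical endpoint name
            ContentType="application/json",
            Body=json.dumps({"instances": [post]}),
        )
        sentiment = json.loads(response["Body"].read())[0]["label"][0]

        dynamodb.Table("SocialSentiment").put_item(Item={"post_key": key, "sentiment": sentiment})
        cloudwatch.put_metric_data(
            Namespace="SocialMedia/Sentiment",  # custom metric that CloudWatch alarms can watch
            MetricData=[{"MetricName": sentiment, "Value": 1.0, "Unit": "Count"}],
        )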

QUESTION 5

A company will use Amazon SageMaker to train and host a machine learning (ML) model for a marketing campaign. The majority of the data is sensitive customer data. The data must be encrypted at rest. The company wants AWS to maintain the root of trust for the master keys and wants encryption key usage to be logged. Which implementation will meet these requirements?

A. Use encryption keys that are stored in AWS CloudHSM to encrypt the ML data volumes, and to encrypt the model artifacts and data in Amazon S3.

B. Use SageMaker built-in transient keys to encrypt the ML data volumes. Enable default encryption for new Amazon Elastic Block Store (Amazon EBS) volumes.

C. Use customer managed keys in AWS Key Management Service (AWS KMS) to encrypt the ML data volumes, and to encrypt the model artifacts and data in Amazon S3.

D. Use AWS Security Token Service (AWS STS) to create temporary tokens to encrypt the ML storage volumes, and to encrypt the model artifacts and data in Amazon S3.

Correct Answer: A
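For context, passing customer managed KMS keys to SageMaker (the mechanism option C describes) can be sketched in the Python SDK as below; the key ARN, role, bucket, and the linear-learner image used as a placeholder algorithm are all hypothetical, and AWS CloudTrail records the resulting key usage.

import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"            # hypothetical role
cmk_arn = "arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd"     # hypothetical customer managed key

estimator = Estimator(
    image_uri=image_uris.retrieve("linear-learner", session.boto_region_name),  # placeholder algorithm
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    volume_kms_key=cmk_arn,   # encrypts the ML storage volumes attached to the training instances
    output_kms_key=cmk_arn,   # encrypts model artifacts written to the S3 output location
    output_path="s3://example-secure-bucket/output/",  # bucket configured for SSE-KMS with the same CMK
    sagemaker_session=session,
)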

QUESTION 6

A healthcare company is using an Amazon SageMaker notebook instance to develop machine learning (ML) models. The company’s data scientists need to access datasets stored in Amazon S3 to train the models. Due to regulatory requirements, the data must not be transmitted over the internet when it is accessed from the instances and services used for training.

Which combination of steps should an ML specialist take to provide this access? (Select TWO.)

A. Configure the SageMaker notebook instance to be launched with a VPC attached and internet access disabled.

B. Create and configure a VPN tunnel between SageMaker and Amazon S3.

C. Create and configure an S3 VPC endpoint. Attach it to the VPC.

D. Create an S3 bucket policy that allows traffic from the VPC and denies traffic from the internet.

E. Deploy AWS Transit Gateway. Attach the S3 bucket and the SageMaker instance to the gateway.

Correct Answer: AD
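As a sketch of the networking pieces referenced in options C and D, the following boto3 calls create a gateway VPC endpoint for S3 and attach a bucket policy that only admits requests arriving through that endpoint; every ID, region, and bucket name here is a placeholder.

import json
import boto3

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

# Gateway VPC endpoint so S3 traffic stays on the AWS network instead of traversing the internet
endpoint = ec2.create_vpc_endpoint(
    VpcId="vpc-0abc1234",                         # hypothetical VPC attached to the notebook instance
    ServiceName="com.amazonaws.us-east-1.s3",
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0abc1234"],               # hypothetical route table
)
endpoint_id = endpoint["VpcEndpoint"]["VpcEndpointId"]

# Bucket policy that denies any request not made through the VPC endpoint
# Caution: a blanket Deny like this also blocks administrators working outside the VPC
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyAccessOutsideVpce",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": ["arn:aws:s3:::example-training-data", "arn:aws:s3:::example-training-data/*"],
        "Condition": {"StringNotEquals": {"aws:SourceVpce": endpoint_id}},
    }],
}
s3.put_bucket_policy(Bucket="example-training-data", Policy=json.dumps(policy))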

QUESTION 7

A machine learning specialist stores IoT soil sensor data in an Amazon DynamoDB table and stores weather event data as JSON files in Amazon S3. The dataset in DynamoDB is 10 GB in size and the dataset in Amazon S3 is 5 GB in size. The specialist wants to train a model on this data to help predict soil moisture levels as a function of weather events using Amazon SageMaker.

Which solution will accomplish the necessary transformation to train the Amazon SageMaker model with the LEAST amount of administrative overhead?

A. Launch an Amazon EMR cluster. Create an Apache Hive external table for the DynamoDB table and S3 data. Join the Hive tables and write the results out to Amazon S3.

B. Crawl the data using AWS Glue crawlers. Write an AWS Glue ETL job that merges the two tables and writes the output to an Amazon Redshift cluster.

C. Enable Amazon DynamoDB Streams on the sensor table. Write an AWS Lambda function that consumes the stream and appends the results to the existing weather files in Amazon S3.

D. Crawl the data using AWS Glue crawlers. Write an AWS Glue ETL job that merges the two tables and writes the output in CSV format to Amazon S3.

Correct Answer: B
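The Glue ETL join at the core of options B and D could be sketched as below, after crawlers have cataloged both the DynamoDB table and the S3 JSON files; the database, table, join key, and bucket names are placeholders, and this variant writes CSV to S3 as option D describes.

import sys
from awsglue.transforms import Join
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical catalog entries produced by the crawlers
sensors = glue_context.create_dynamic_frame.from_catalog(database="soil_db", table_name="sensor_readings")
weather = glue_context.create_dynamic_frame.from_catalog(database="soil_db", table_name="weather_events")

# Join on a shared key (placeholder for the real schema, e.g. timestamp plus location)
joined = Join.apply(sensors, weather, "event_time", "event_time")

glue_context.write_dynamic_frame.from_options(
    frame=joined,
    connection_type="s3",
    connection_options={"path": "s3://example-ml-bucket/training-data/"},  # hypothetical bucket
    format="csv",
)

job.commit()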

QUESTION 8

A company offers an online shopping service to its customers. The company wants to enhance the site’s security by requesting additional information when customers access the site from locations that are different from their normal location. The company wants to update the process to call a machine learning (ML) model to determine when additional information should be requested.

The company has several terabytes of data from its existing ecommerce web servers containing the source IP addresses for each request made to the web server. For authenticated requests, the records also contain the login name of the requesting user.

Which approach should an ML specialist take to implement the new security feature in the web application?

A. Use Amazon SageMaker Ground Truth to label each record as either a successful or failed access attempt. Use Amazon SageMaker to train a binary classification model using the factorization machines (FM) algorithm.

B. Use Amazon SageMaker to train a model using the IP Insights algorithm. Schedule updates and retraining of the model using new log data nightly.

C. Use Amazon SageMaker Ground Truth to label each record as either a successful or failed access attempt. Use Amazon SageMaker to train a binary classification model using the IP Insights algorithm.

D. Use Amazon SageMaker to train a model using the Object2Vec algorithm. Schedule updates and retraining of the model using new log data nightly.

Correct Answer: D
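For comparison, the IP Insights setup mentioned in options B and C trains on unlabeled (login name, IP address) pairs. A minimal sketch with hypothetical names, assuming the web server logs have already been reduced to a headerless two-column CSV in S3:

import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical role

estimator = Estimator(
    image_uri=image_uris.retrieve("ipinsights", session.boto_region_name),
    role=role,
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    output_path="s3://example-ml-bucket/ipinsights/output/",  # hypothetical bucket
    sagemaker_session=session,
)

estimator.set_hyperparameters(
    num_entity_vectors=20000,  # sized relative to the number of distinct login names
    vector_dim=128,
    epochs=5,
)

# Training data: "login_name,ip_address" rows, no header and no labels required
train = TrainingInput("s3://example-ml-bucket/ipinsights/train/", content_type="text/csv")
estimator.fit({"train": train})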

QUESTION 9

A machine learning specialist previously trained a logistic regression model using scikit-learn on a local machine, and the specialist now wants to deploy it to production for inference only.

What steps should be taken to ensure Amazon SageMaker can host a model that was trained locally?

A. Build the Docker image with the inference code. Tag the Docker image with the registry hostname and upload it to Amazon ECR.

B. Serialize the trained model so that the format is compressed for deployment. Tag the Docker image with the registry hostname and upload it to Amazon S3.

C. Serialize the trained model so that the format is compressed for deployment. Build the image and upload it to Docker Hub.

D. Build the Docker image with the inference code. Configure Docker Hub and upload the image to Amazon ECR.

Correct Answer: A
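Once the inference image has been built and pushed to Amazon ECR (the Docker steps in option A) and the serialized scikit-learn model has been uploaded to S3, hosting it can be sketched with boto3 as follows; the image URI, bucket, role, and resource names are all placeholders.

import boto3

sm = boto3.client("sagemaker")

role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"                        # hypothetical role
image_uri = "123456789012.dkr.ecr.us-east-1.amazonaws.com/sklearn-inference:latest"   # image pushed to ECR
model_data = "s3://example-ml-bucket/models/logreg/model.tar.gz"                      # serialized model artifact

sm.create_model(
    ModelName="sklearn-logreg",
    ExecutionRoleArn=role,
    PrimaryContainer={"Image": image_uri, "ModelDataUrl": model_data},
)

sm.create_endpoint_config(
    EndpointConfigName="sklearn-logreg-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "sklearn-logreg",
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)

sm.create_endpoint(EndpointName="sklearn-logreg-endpoint", EndpointConfigName="sklearn-logreg-config")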

QUESTION 10

A company is building a line-counting application for use in a quick-service restaurant. The company wants to use video cameras pointed at the line of customers at a given register to measure how many people are in line and deliver notifications to managers if the line grows too long. The restaurant locations have limited bandwidth for connections to external services and cannot accommodate multiple video streams without impacting other operations.

Which solution should a machine learning specialist implement to meet these requirements?

A. Install cameras compatible with Amazon Kinesis Video Streams to stream the data to AWS over the restaurant’s existing internet connection. Write an AWS Lambda function to take an image and send it to Amazon Rekognition to count the number of faces in the image. Send an Amazon Simple Notification Service (Amazon SNS) notification if the line is too long.

B. Deploy AWS DeepLens cameras in the restaurant to capture video. Enable Amazon Rekognition on the AWS DeepLens device, and use it to trigger a local AWS Lambda function when a person is recognized. Use the Lambda function to send an Amazon Simple Notification Service (Amazon SNS) notification if the line is too long.

C. Build a custom model in Amazon SageMaker to recognize the number of people in an image. Install cameras compatible with Amazon Kinesis Video Streams in the restaurant. Write an AWS Lambda function to take an image. Use the SageMaker endpoint to call the model to count people. Send an Amazon Simple Notification Service (Amazon SNS) notification if the line is too long.

D. Build a custom model in Amazon SageMaker to recognize the number of people in an image. Deploy AWS DeepLens cameras in the restaurant. Deploy the model to the cameras. Deploy an AWS Lambda function to the cameras to use the model to count people and send an Amazon Simple Notification Service (Amazon SNS) notification if the line is too long.

Correct Answer: C
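As an illustration of the counting-and-alerting step that options A and C delegate to Rekognition or to a custom model, here is a Lambda-style sketch that counts faces in a captured frame with Amazon Rekognition and notifies managers over SNS; the bucket, topic ARN, and threshold are hypothetical.

import boto3

rekognition = boto3.client("rekognition")
sns = boto3.client("sns")

LINE_LENGTH_THRESHOLD = 5  # hypothetical definition of "too long"
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:line-alerts"  # hypothetical SNS topic

def handler(event, context):
    # Triggered for each frame image captured from the register camera and stored in S3
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        faces = rekognition.detect_faces(Image={"S3Object": {"Bucket": bucket, "Name": key}})
        people_in_line = len(faces["FaceDetails"])

        if people_in_line > LINE_LENGTH_THRESHOLD:
            sns.publish(
                TopicArn=TOPIC_ARN,
                Message=f"{people_in_line} customers in line at the register (frame {key}).",
            )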

QUESTION 11

A data scientist is training a multilayer perceptron (MLP) on a dataset with multiple classes. The target class of interest is unique compared to the other classes within the dataset, but it does not achieve an acceptable recall metric. The data scientist has already tried varying the number and size of the MLP’s hidden layers, which has not significantly improved the results. A solution to improve recall must be implemented as quickly as possible.
Which techniques should be used to meet these requirements?

A. Gather more data using Amazon Mechanical Turk and then retrain.

B. Train an anomaly detection model instead of an MLP.

C. Train an XGBoost model instead of an MLP.

D. Add class weights to the MLP’s loss function and then retrain.

Correct Answer: D
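A minimal Keras-style sketch of option D, weighting each class inversely to its frequency so that errors on the rare target class cost more during training; the synthetic data, layer sizes, and weighting scheme are illustrative only.

import numpy as np
from tensorflow import keras

def build_mlp(input_dim, num_classes):
    return keras.Sequential([
        keras.Input(shape=(input_dim,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(num_classes, activation="softmax"),
    ])

def class_weights(y):
    # Weight each class inversely to its frequency so the minority class is not ignored by the loss
    counts = np.bincount(y)
    total = counts.sum()
    return {cls: total / (len(counts) * count) for cls, count in enumerate(counts)}

# Tiny synthetic, imbalanced dataset standing in for the real one
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 20)).astype("float32")
y_train = (rng.random(1000) < 0.05).astype("int64")  # ~5% rare target class

model = build_mlp(input_dim=X_train.shape[1], num_classes=2)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=20, class_weight=class_weights(y_train))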

Thank you for taking the time to read! I have shown you how to pass the AWS MLS-C01 test. You can go directly to the AWS MLS-C01 exam dumps channel by visiting https://cciedump.spoto.net/aws-certificated-mls.php to obtain the key to passing the exam!

 

Get the Love to Learn Sale offer to save even more money on authentic MLS-C01 exam dumps.

Latest passing report: 100% pass guarantee

Last modified: November 7, 2023
