
Question #1
A company wants to automate the creation of secure test databases with random credentials to be stored safely for later use. The credentials should have sufficient information about each test database to initiate a connection and perform automated credential rotations. The credentials should not be logged or stored anywhere in an unencrypted form. Which steps should a Database Specialist take to meet these requirements using an AWS CloudFormation template?
A. Create the database with the MasterUserName and MasterUserPassword properties set to the default values. Then, create the secret with the user name and password set to the same default values. Add a Secret Target Attachment resource with the SecretId and TargetId properties set to the Amazon Resource Names (ARNs) of the secret and the database. Finally, update the secret's password value with a randomly generated string set by the GenerateSecretString property.
B. Add a Mapping property from the database Amazon Resource Name (ARN) to the secret ARN. Then, create the secret with a chosen user name and a randomly generated password set by the GenerateSecretString property.
Correct answer: D
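For context, this question is testing the Secrets Manager pattern where CloudFormation generates the password at stack-creation time and a SecretTargetAttachment links the secret to the database so rotation knows the connection details. Below is a minimal sketch that builds the two resources as plain Python dicts (logical IDs and property values are illustrative, not from any real stack):

```python
import json

def secret_resources(db_logical_id: str, username: str) -> dict:
    """Build CloudFormation resources (as plain dicts) for a secret whose
    password is generated at deploy time and never appears unencrypted
    in the template or logs. Names here are illustrative."""
    return {
        "DBSecret": {
            "Type": "AWS::SecretsManager::Secret",
            "Properties": {
                # GenerateSecretString creates a random password server-side.
                "GenerateSecretString": {
                    "SecretStringTemplate": json.dumps({"username": username}),
                    "GenerateStringKey": "password",
                    "PasswordLength": 30,
                    "ExcludeCharacters": '"@/\\',
                },
            },
        },
        "SecretAttachment": {
            "Type": "AWS::SecretsManager::SecretTargetAttachment",
            "Properties": {
                # Links the secret to the DB instance so the secret also
                # carries the host/port/engine info needed for rotation.
                "SecretId": {"Ref": "DBSecret"},
                "TargetId": {"Ref": db_logical_id},
                "TargetType": "AWS::RDS::DBInstance",
            },
        },
    }
```

The database resource would then consume the generated credentials through a dynamic reference such as `{{resolve:secretsmanager:...}}` in its MasterUserPassword property, so the plaintext value never enters the template.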
Question #2
A company is concerned about the cost of a large-scale, transactional application using Amazon DynamoDB that only needs to store data for 2 days before it is deleted. In looking at the tables, a Database Specialist notices that much of the data is months old, and goes back to when the application was first deployed. What can the Database Specialist do to reduce the overall cost?
A. Create a new attribute in each table to track the expiration time and create an AWS Glue transformation to delete entries more than 2 days old
B. Create a new attribute in each table to track the expiration time and enable DynamoDB Streams on each table
C. Create a new attribute in each table to track the expiration time and enable time to live (TTL) on each table
D. Create an Amazon CloudWatch Events event to export the data to Amazon S3 daily using AWS Data Pipeline and then truncate the Amazon DynamoDB table
Correct answer: A
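The TTL approach in option C works by storing a Unix epoch timestamp in a designated attribute; DynamoDB then deletes expired items in the background at no extra cost. A small sketch, assuming an illustrative table name `Orders` and attribute name `expires_at`:

```python
import time

def item_with_ttl(item: dict, ttl_attr: str = "expires_at", days: int = 2) -> dict:
    """Return a copy of a DynamoDB item with an expiration timestamp.
    TTL attributes must hold a Unix epoch time in seconds; DynamoDB
    removes expired items automatically. Attribute name is our choice."""
    out = dict(item)
    out[ttl_attr] = int(time.time()) + days * 24 * 3600
    return out

# Parameters for boto3's dynamodb.update_time_to_live(), run once per table
# to enable TTL on the chosen attribute (table name is illustrative).
ttl_params = {
    "TableName": "Orders",
    "TimeToLiveSpecification": {"Enabled": True, "AttributeName": "expires_at"},
}
```

Existing months-old items would still need a one-time backfill of the attribute (or a table cleanup) before TTL can expire them.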
Question #3
A company’s Security department established new requirements that state internal users must connect to an existing Amazon RDS for SQL Server DB instance using their corporate Active Directory (AD) credentials. A Database Specialist must make the modifications needed to fulfill this requirement. Which combination of actions should the Database Specialist take? (Choose three.)
A. Disable Transparent Data Encryption (TDE) on the RDS SQL Server DB instance
B. Modify the RDS SQL Server DB instance to use the directory for Windows authentication
C. Use the AWS Management Console to create an AWS Managed Microsoft AD
D. Create a trust relationship with the corporate AD
E. Stop the RDS SQL Server DB instance, modify it to use the directory for Windows authentication, and start it again
F. Create appropriate new logins
Correct answer: D
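The "use the directory for Windows authentication" step maps to the `Domain` and `DomainIAMRoleName` parameters of the RDS modify-instance API. A minimal sketch of the parameters you would pass to boto3's `rds.modify_db_instance()` (the directory ID and role name are illustrative):

```python
def windows_auth_params(db_instance_id: str, directory_id: str, role_name: str) -> dict:
    """Parameters for boto3's rds.modify_db_instance() to switch an RDS
    SQL Server instance to Windows authentication against an AWS Managed
    Microsoft AD. directory_id (e.g. 'd-1234567890') and role_name are
    illustrative placeholders."""
    return {
        "DBInstanceIdentifier": db_instance_id,
        "Domain": directory_id,          # the AWS Managed Microsoft AD
        "DomainIAMRoleName": role_name,  # role that lets RDS join the domain
        "ApplyImmediately": True,
    }
```

After the instance joins the domain, SQL logins for the AD users/groups still have to be created inside SQL Server, which is why "Create appropriate new logins" is part of the combination.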
Question #4
A Database Specialist is performing a proof of concept with Amazon Aurora using a small instance to confirm a simple database behavior. When loading a large dataset and creating the index, the Database Specialist encounters the following error message from Aurora: "ERROR: could not write block 7507718 of temporary file: No space left on device". What is the cause of this error and what should the Database Specialist do to resolve this issue?
A. The scaling of Aurora storage cannot catch up with the data loading. The Database Specialist needs to modify the workload to load the data more slowly.
B. The scaling of Aurora storage cannot catch up with the data loading. The Database Specialist needs to enable Aurora storage scaling.
C. The local storage used to store temporary tables is full. The Database Specialist needs to scale up the instance.
Correct answer: D
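The key distinction here is that Aurora's cluster volume auto-scales, but temporary files go to instance-local storage, whose size is fixed per instance class. The relevant CloudWatch metric is `FreeLocalStorage`; a sketch of the parameters for boto3's `cloudwatch.get_metric_statistics()` to watch it (instance identifier is illustrative):

```python
from datetime import datetime, timedelta, timezone

def free_local_storage_query(db_instance_id: str, hours: int = 1) -> dict:
    """Parameters for boto3's cloudwatch.get_metric_statistics() to watch
    Aurora's FreeLocalStorage metric -- the instance-local space used for
    temporary tables and files, which is what runs out during a large
    index build, not the auto-scaling cluster volume."""
    now = datetime.now(timezone.utc)
    return {
        "Namespace": "AWS/RDS",
        "MetricName": "FreeLocalStorage",
        "Dimensions": [{"Name": "DBInstanceIdentifier", "Value": db_instance_id}],
        "StartTime": now - timedelta(hours=hours),
        "EndTime": now,
        "Period": 300,              # 5-minute granularity
        "Statistics": ["Minimum"],  # worst case within each period
    }
```

If the minimum approaches zero during the load, moving to a larger instance class (more local storage) is the fix this question points at.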
Question #5
An ecommerce company is using Amazon DynamoDB as the backend for its order-processing application. The steady increase in the number of orders is resulting in increased DynamoDB costs. Order verification and reporting perform many repeated GetItem functions that pull similar datasets, and this read activity is contributing to the increased costs. The company wants to control these costs without significant development efforts. How should a Database Specialist address these requirements?
A. Use AWS DMS to migrate data from DynamoDB to Amazon DocumentDB
B. Use Amazon DynamoDB Streams and Amazon Kinesis Data Firehose to push the data into Amazon Redshift
C. Use an Amazon ElastiCache for Redis in front of DynamoDB to boost read performance
D. Use DynamoDB Accelerator to offload the reads
Correct answer: A
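Options C and D both describe caching repeated reads; DynamoDB Accelerator (DAX) in particular is a managed, API-compatible cache that needs only a client endpoint change, which is why it fits the "without significant development efforts" requirement. The underlying read-through idea can be sketched in plain Python (the `loader` callable stands in for a GetItem call; names and the TTL value are illustrative):

```python
import time

class ReadThroughCache:
    """Minimal read-through (cache-aside) sketch of the pattern DAX
    provides as a managed layer in front of DynamoDB. `loader` stands
    in for a GetItem call against the table."""

    def __init__(self, loader, ttl_seconds: float = 300.0):
        self.loader = loader
        self.ttl = ttl_seconds
        self._store = {}
        self.misses = 0

    def get(self, key):
        hit = self._store.get(key)
        if hit is not None and time.monotonic() - hit[1] < self.ttl:
            return hit[0]                # served from cache: no table read cost
        self.misses += 1
        value = self.loader(key)         # falls through to DynamoDB
        self._store[key] = (value, time.monotonic())
        return value
```

Repeated GetItem calls for the same order are then absorbed by the cache instead of consuming read capacity, which is exactly the cost lever this question is about.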
Question #6
A financial company has allocated an Amazon RDS MariaDB DB instance with large storage capacity to accommodate migration efforts. Post-migration, the company purged unwanted data from the instance. The company now wants to downsize storage to save money. The solution must have the least impact on production and near-zero downtime. Which solution would meet these requirements?
A. Create a snapshot of the old databases and restore the snapshot with the required storage
B. Create a new RDS DB instance with the required storage and move the databases from the old instances to the new instance using AWS DMS
C. Create a new database using native backup and restore
D. Create a new read replica and make it the primary by terminating the existing primary
Correct answer: C
Question #7
An online gaming company is planning to launch a new game with Amazon DynamoDB as its data store. The database should be designed to support the following use cases: Update scores in real time whenever a player is playing the game. Retrieve a player's score details for a specific game session. A Database Specialist decides to implement a DynamoDB table. Each player has a unique user_id and each game has a unique game_id. Which choice of keys is recommended for the DynamoDB table?
A. Create a global secondary index with game_id as the partition key
B. Create a global secondary index with user_id as the partition key
C. Create a composite primary key with game_id as the partition key and user_id as the sort key
D. Create a composite primary key with user_id as the partition key and game_id as the sort key
Correct answer: BD
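The composite-key design from option D can be sketched as the parameters you would pass to boto3's `dynamodb.create_table()` (table name is illustrative): `user_id` as the partition key spreads writes across players, and `game_id` as the sort key lets a single GetItem fetch one player's score for a specific session.

```python
def game_scores_table(table_name: str = "GameScores") -> dict:
    """Parameters for boto3's dynamodb.create_table(): user_id as the
    partition key distributes load across players; game_id as the sort
    key addresses one game session within a player's item collection."""
    return {
        "TableName": table_name,
        "KeySchema": [
            {"AttributeName": "user_id", "KeyType": "HASH"},   # partition key
            {"AttributeName": "game_id", "KeyType": "RANGE"},  # sort key
        ],
        "AttributeDefinitions": [
            {"AttributeName": "user_id", "AttributeType": "S"},
            {"AttributeName": "game_id", "AttributeType": "S"},
        ],
        "BillingMode": "PAY_PER_REQUEST",
    }
```

With this schema, `GetItem(Key={"user_id": ..., "game_id": ...})` serves both use cases directly, with no secondary index required.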
Question #8
A manufacturing company’s website uses an Amazon Aurora PostgreSQL DB cluster. Which configurations will result in the LEAST application downtime during a failover? (Choose three.)
A. Use the provided read and write Aurora endpoints to establish a connection to the Aurora DB cluster
B. Create an Amazon CloudWatch alert triggering a restore in another Availability Zone when the primary Aurora DB cluster is unreachable
C. Edit and enable Aurora DB cluster cache management in parameter groups
D. Set TCP keepalive parameters to a high value
E. Set JDBC connection string timeout variables to a low value
F. Set Java DNS caching timeouts to a high value
Correct answer: A
Question #9
A company runs online transaction processing (OLTP) workloads on an Amazon RDS for PostgreSQL Multi-AZ DB instance. Tests were run on the database after work hours, which generated additional database logs. The free storage of the RDS DB instance is low due to these additional logs. What should the company do to address this space constraint issue?
A. Log in to the host and run the rm $PGDATA/pg_logs/* command
B. Modify the rds
C. Create a ticket with AWS Support to have the logs deleted
D. Run the SELECT rds_rotate_error_log() stored procedure to rotate the logs
Correct answer: ABC
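Option B is truncated in this dump; the parameter it presumably refers to is `rds.log_retention_period`, the RDS for PostgreSQL setting (in minutes) that controls how long log files are kept on instance storage. A sketch of the parameters for boto3's `rds.modify_db_parameter_group()` to lower it (the parameter group name is illustrative, and this assumes the instance uses a custom parameter group):

```python
def shorten_pg_log_retention(parameter_group: str, minutes: int = 1440) -> dict:
    """Parameters for boto3's rds.modify_db_parameter_group() lowering
    rds.log_retention_period (value is in minutes; 1440 = 1 day) so RDS
    for PostgreSQL deletes old log files sooner and frees storage.
    Direct host access (e.g. rm on $PGDATA) is not possible on RDS."""
    return {
        "DBParameterGroupName": parameter_group,
        "Parameters": [{
            "ParameterName": "rds.log_retention_period",
            "ParameterValue": str(minutes),
            "ApplyMethod": "immediate",  # dynamic parameter: no reboot needed
        }],
    }
```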
Question #10
A company is looking to move an on-premises IBM Db2 database running AIX on an IBM POWER7 server. Due to escalating support and maintenance costs, the company is exploring the option of moving the workload to an Amazon Aurora PostgreSQL DB cluster. What is the quickest way for the company to gather data on the migration compatibility?
A. Perform a logical dump from the Db2 database and restore it to an Aurora DB cluster. Identify the gaps and compatibility of the objects migrated by comparing row counts from source and target tables.
B. Run AWS DMS from the Db2 database to an Aurora DB cluster. Identify the gaps and compatibility of the objects migrated by comparing the row counts from source and target tables.
C. Run native PostgreSQL logical replication from the Db2 database to an Aurora DB cluster to evaluate the migration compatibility.
D. Run the AWS Schema Conversion Tool (AWS SCT) from the Db2 database to an Aurora DB cluster.
Correct answer: D
Question #11
A global digital advertising company captures browsing metadata to contextually display relevant images, pages, and links to targeted users. A single page load can generate multiple events that need to be stored individually. The maximum size of an event is 200 KB and the average size is 10 KB. Each page load must query the user's browsing history to provide targeting recommendations. The advertising company expects over 1 billion page visits per day from users in the United States, Europe, Hong Kong, and India. Which database should a Database Specialist choose to meet these requirements?
A. Amazon DocumentDB
B. Amazon RDS Multi-AZ deployment
C. Amazon DynamoDB global table
D. Amazon Aurora Global Database
Correct answer: B
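For a workload like this, option C's global table fits the scale and geography: small items, massive write volume, and users in four regions who need low-latency local reads and writes. A sketch of the parameters for boto3's `dynamodb.create_global_table()` (2017.11.29 global tables version; table name and region list are illustrative, and each region must already hold an identically keyed table with streams enabled):

```python
def global_table_params(table_name: str, regions: list) -> dict:
    """Parameters for boto3's dynamodb.create_global_table(): one replica
    per user geography so page loads read and write browsing history in
    the nearest region. Assumes per-region tables already exist with the
    same key schema and DynamoDB Streams enabled."""
    return {
        "GlobalTableName": table_name,
        "ReplicationGroup": [{"RegionName": r} for r in regions],
    }
```

A sensible replica set for the question's user base might be us-east-1, eu-west-1, ap-east-1 (Hong Kong), and ap-south-1 (Mumbai).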
