
AWS Certified Big Data - Specialty Certification - Part 9

Mary Smith
03 Mar 2023
4 min

1. You are building a high-score table for games in DynamoDB. You need to retain the top score per user per game, across a large number of games that all have roughly similar levels of usage and numbers of players. You need to be able to look up the highest score for any game. What is the best key structure for the DynamoDB table?

A) GameID as the hash key and the only key
B) HighestScore as the hash key and the only key
C) GameID as the range key and the only key
D) None of the above
E) GameID as the hash key, HighestScore as the range key


2. You currently maintain an application that reads from a Kinesis stream using the Kinesis Client Library (KCL). You have noticed ProvisionedThroughputExceededException errors in CloudWatch for the stream. What are the possible solutions to fix this error? (Select 2 answers)

A) Add more shards to the Kinesis stream
B) Add retry logic to the application using the KCL library
C) Use more Kinesis streams
D) None of the above
E) Add more applications using the KCL library


3. Which of the following can be used to monitor an EMR cluster and provide reports on the performance of the cluster as a whole?

A) CloudTrail
B) CloudWatch Logs
C) Ganglia
D) AWS Config
E) None of the above


4. You currently work for a company that handles baggage. GPS devices attached to the baggage-handling equipment report each device's coordinates every 10 seconds. You need to ingest these coordinates in real time from multiple sources. Which service should be used to ingest the data?

A) AWS Data Pipeline
B) Amazon Kinesis
C) None of the above
D) Amazon SQS
E) Amazon EMR


5. There is a requirement to convert and migrate an on-premises Oracle database to AWS Aurora. Which of the following steps are involved in this process? (Select 3 answers)

A) Perform post-migration activities
B) Convert the database schema and code using the AWS Schema Conversion Tool
C) Use AWS Data Pipeline to transfer the data from Oracle to AWS Aurora
D) Transfer the data from the source database to the target database using the AWS Database Migration Service



1. Right Answer: E
Explanation: With GameID as the hash key and HighestScore as the range key, the scores for each game live in one partition, sorted by score. The highest score for any game can then be retrieved with a single query that reads the sort key in descending order.
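The query implied by answer E can be sketched as follows. This is pure Python that only builds the request parameters; the table name `GameScores` and the attribute names `GameId` and `TopScore` are illustrative assumptions, not part of the question. With boto3 you would pass the result to `client.query(**params)`.

```python
# Sketch of the DynamoDB Query parameters implied by answer E:
# hash key = game ID, range key = score, read in descending order.
# Table and attribute names ("GameScores", "GameId") are hypothetical.

def top_score_query(game_id: str) -> dict:
    """Build Query parameters that return the single highest score for a game."""
    return {
        "TableName": "GameScores",
        "KeyConditionExpression": "GameId = :g",
        "ExpressionAttributeValues": {":g": {"S": game_id}},
        "ScanIndexForward": False,  # read the sort key (the score) descending
        "Limit": 1,                 # only the top item
    }

params = top_score_query("alien-invaders")
print(params["ScanIndexForward"], params["Limit"])
```

Because the score is the range key, DynamoDB returns items already sorted, so no scan or client-side sort is needed.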

2. Right Answer: B
Explanation: The most likely cause is that there are not enough shards, or that the application is reading from a shard faster than its provisioned throughput allows, either temporarily or permanently.
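The retry logic from answer B is typically exponential backoff around the read call. A minimal sketch in plain Python, with no AWS calls: the `read_fn` callable and the `RuntimeError` stand in for the real Kinesis read and its ProvisionedThroughputExceededException.

```python
import time

def read_with_backoff(read_fn, max_attempts=5, base_delay=0.1):
    """Retry a stream read with exponential backoff, as suggested by answer B.

    read_fn stands in for the actual Kinesis read; in a real consumer the
    retried error would be a ProvisionedThroughputExceededException.
    """
    for attempt in range(max_attempts):
        try:
            return read_fn()
        except RuntimeError:  # placeholder for the throughput-exceeded error
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...

# Example: a read that fails twice, then succeeds.
calls = {"n": 0}
def flaky_read():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("throughput exceeded")
    return ["record-1", "record-2"]

print(read_with_backoff(flaky_read))
```

Backoff smooths over temporary throughput spikes; if the exception persists, adding shards (resharding) is the permanent fix.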

3. Right Answer: C
Explanation: Ganglia is an open-source project designed to monitor clusters with minimal impact on their performance. When you enable Ganglia on a cluster, you can generate reports and view the performance of the cluster as a whole, as well as inspect the performance of individual node instances. Ganglia can also be configured to ingest and visualize Hadoop and Spark metrics.

4. Right Answer: B
Explanation: The AWS documentation mentions the following: Amazon Kinesis makes it easy to collect, process, and analyze real-time streaming data, so that you can get timely insights and react quickly to new information. Amazon Kinesis offers cost-effective processing of streaming data at any scale, along with the flexibility to choose the tools that best suit the requirements of your application. With Amazon Kinesis you can ingest real-time data such as video, audio, application logs, website clickstreams, and telemetry data for machine learning, analytics, and other applications. Amazon Kinesis enables you to process and analyze data as it arrives and respond immediately, instead of waiting until all the data has been collected before processing can begin.
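A Kinesis stream spreads records such as the GPS coordinates above across shards by taking the MD5 hash of each record's partition key and mapping it into a shard's hash-key range. The routing can be sketched in pure Python; the two-shard split and the device IDs below are made up for illustration.

```python
import hashlib

# Each shard owns a contiguous range of the 128-bit MD5 hash space.
# Hypothetical: two shards splitting the space evenly.
SHARDS = [
    ("shard-0", 0, 2**127 - 1),
    ("shard-1", 2**127, 2**128 - 1),
]

def shard_for(partition_key: str) -> str:
    """Map a partition key to a shard the way Kinesis does: MD5 -> hash range."""
    h = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    for shard_id, lo, hi in SHARDS:
        if lo <= h <= hi:
            return shard_id
    raise AssertionError("hash ranges must cover the full key space")

# Hypothetical GPS device IDs used as partition keys:
for device in ("bag-gps-001", "bag-gps-002", "bag-gps-003"):
    print(device, "->", shard_for(device))
```

Using the device ID as the partition key keeps each device's readings ordered within a shard, while many devices spread the load across shards.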

5. Right Answer: A, B, D
Explanation: The AWS documentation mentions the following steps in the migration: 1. Convert the database schema and code. The AWS Schema Conversion Tool can help you convert the source database schema and the SQL code embedded in C++, C#, Java, or other application code into a format compatible with the target database; any code that cannot be converted automatically is clearly marked so that it can be converted manually. 2. Transfer the data from the source database to the target database. You can start the data migration with a few clicks in the AWS Management Console, and the source database remains fully operational during the migration. 3. Perform post-migration activities, such as running SQL queries to validate the object types, object counts, and row count of each table between the source and target databases.
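Step 2 above is driven by a DMS replication task. A hedged sketch of the parameters such a task takes, built as a plain dict: the ARNs are placeholders, the task name is made up, and with boto3 the dict would be passed to `dms_client.create_replication_task(**task)`.

```python
import json

# Hedged sketch: parameters for an AWS DMS replication task (Oracle -> Aurora).
# All ARNs below are placeholders, not real resources.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-all",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "include",
    }]
}

task = {
    "ReplicationTaskIdentifier": "oracle-to-aurora",          # made-up name
    "SourceEndpointArn": "arn:aws:dms:...:endpoint:source",   # placeholder
    "TargetEndpointArn": "arn:aws:dms:...:endpoint:target",   # placeholder
    "ReplicationInstanceArn": "arn:aws:dms:...:rep:instance", # placeholder
    "MigrationType": "full-load-and-cdc",  # full load + ongoing change capture
    "TableMappings": json.dumps(table_mappings),
}
print(task["MigrationType"])
```

The `full-load-and-cdc` migration type is what lets the source Oracle database stay fully operational during the migration: DMS copies existing rows, then replicates ongoing changes until cutover.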
