
AWS Certified Data Engineer Associate (DEA-C01) - 340 Questions

By Webmaster Certland

Practice exam for the AWS Certified Data Engineer Associate (DEA-C01). Covers data ingestion and transformation, data store management, data operations and support, and data security and governance.


👁️ Free Preview (5 of 340 questions)

1. A data engineer needs to ingest clickstream events from a website into AWS for real-time processing. The application generates approximately 5,000 records per second, and each record is 2 KB in size. The engineer needs to determine the minimum number of shards required for an Amazon Kinesis Data Stream. How many shards are needed?

A 5 shards
B 10 shards
C 20 shards
D 2 shards
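The arithmetic behind this question can be sketched as follows. This assumes the standard per-shard write limits of Kinesis Data Streams: 1 MB/s or 1,000 records/s of ingest, whichever constraint binds first.

```python
import math

# Assumed per-shard ingest limits from the Kinesis Data Streams quotas:
SHARD_MB_PER_SEC = 1.0        # 1 MB/s of write throughput per shard
SHARD_RECORDS_PER_SEC = 1000  # 1,000 records/s per shard

def min_shards(records_per_sec: int, record_size_kb: float) -> int:
    """Minimum shard count needed for a given write workload."""
    by_throughput = math.ceil(records_per_sec * record_size_kb / 1024 / SHARD_MB_PER_SEC)
    by_records = math.ceil(records_per_sec / SHARD_RECORDS_PER_SEC)
    return max(by_throughput, by_records)

# 5,000 records/s at 2 KB each is ~10 MB/s, so throughput is the
# binding constraint (5,000 records/s alone would need only 5 shards):
print(min_shards(5000, 2))  # → 10
```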

2. A company is using Amazon Kinesis Data Streams to ingest sensor data. The default data retention period is not long enough for their recovery requirements. A data engineer needs to retain records in the stream for 10 days to allow replay in case of downstream failures. What is the correct action?

A Rely on the default 7-day retention period, which already covers 10 days
B Enable server-side encryption on the stream to extend the retention period
C Modify the stream's retention period to 10 days using the IncreaseStreamRetentionPeriod API or console
D Configure the stream to automatically archive data to Amazon S3 after 24 hours
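As a quick sketch of the conversion involved: Kinesis retention is set in hours, and the accepted range (assumed from the service quotas) runs from the 24-hour default up to 8,760 hours (365 days). With boto3, the resulting value would be passed as `RetentionPeriodHours` to `increase_stream_retention_period`.

```python
# Illustrative helper, not the AWS API itself: converts a retention
# requirement in days to the hours value Kinesis expects, validating
# the assumed 24-8760 hour range.

def retention_hours(days: int) -> int:
    hours = days * 24
    if not 24 <= hours <= 8760:
        raise ValueError("Kinesis retention must be between 24 and 8760 hours")
    return hours

print(retention_hours(10))  # → 240
```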

3. A data engineer is designing a Kinesis Data Streams producer application. The stream has 4 shards. Records from the same IoT device must always be processed in order. Which mechanism ensures that all records from the same device are routed to the same shard?

A Use the device ID as the partition key when calling PutRecord
B Specify a sequence number in each PutRecord call to order records across shards
C Use a shard iterator to direct writes to a specific shard
D Include the stream name as a routing header in each record
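The routing guarantee in option A can be illustrated with a small sketch. Kinesis MD5-hashes the partition key into a 128-bit hash key space that is divided into contiguous ranges, one per shard; the model below assumes four equal ranges. Because hashing is deterministic, the same device ID always maps to the same shard.

```python
import hashlib

def shard_for_key(partition_key: str, shard_count: int) -> int:
    """Mimic Kinesis routing: MD5-hash the key into the 128-bit hash
    key space, assumed here to be split evenly across shards."""
    hash_key = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    range_size = 2 ** 128 // shard_count
    return min(hash_key // range_size, shard_count - 1)

# Two records with the same device ID land on the same shard:
print(shard_for_key("device-42", 4) == shard_for_key("device-42", 4))  # → True
```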

4. A company is streaming financial transaction data through Amazon Kinesis Data Streams. A downstream Lambda function using the standard GetRecords API is experiencing high latency because it shares read throughput with three other consumer applications. What is the most effective solution to eliminate this read throughput contention?

A Increase the number of shards in the stream to double the available read throughput
B Register the Lambda function as an enhanced fan-out consumer using SubscribeToShard
C Extend the stream's retention period to reduce the number of GetRecords calls needed
D Use a different partition key strategy to redistribute records across shards
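The contention in this scenario comes down to simple arithmetic, sketched below under the assumed Kinesis limits: each shard supports 2 MB/s of shared GetRecords read throughput split across all standard consumers, while an enhanced fan-out consumer gets a dedicated 2 MB/s per shard via SubscribeToShard.

```python
SHARED_READ_MB_PER_SHARD = 2.0  # assumed per-shard read limit

def per_consumer_throughput(consumers: int, enhanced: bool) -> float:
    """MB/s of per-shard read throughput each consumer sees."""
    if enhanced:
        return SHARED_READ_MB_PER_SHARD          # dedicated pipe per consumer
    return SHARED_READ_MB_PER_SHARD / consumers  # shared among all consumers

# Four standard consumers split the pipe; an enhanced fan-out
# consumer keeps the full per-shard throughput to itself:
print(per_consumer_throughput(4, enhanced=False))  # → 0.5
print(per_consumer_throughput(4, enhanced=True))   # → 2.0
```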

5. A company uses Amazon Kinesis Data Firehose to deliver log data to Amazon S3. The data team wants to convert the incoming JSON records to Apache Parquet format before storage to reduce query costs in Amazon Athena. Which feature of Kinesis Data Firehose should they enable?

A Configure buffer hints to compress JSON records into a columnar format before delivery
B Attach a Lambda transformation function to the Firehose stream to convert each record
C Enable record format conversion in Firehose using an AWS Glue Data Catalog table schema
D Enable dynamic partitioning on the Firehose stream to organize records by schema type
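A hypothetical configuration fragment for option C is sketched below. The field names follow the Firehose API's `DataFormatConversionConfiguration` structure; the Glue database, table, and IAM role names are made up for illustration.

```python
# Assumed-shape sketch of Firehose record format conversion settings,
# as they would appear in an ExtendedS3DestinationConfiguration.
data_format_conversion = {
    "Enabled": True,
    "SchemaConfiguration": {            # schema comes from a Glue Data Catalog table
        "DatabaseName": "logs_db",      # hypothetical Glue database
        "TableName": "clickstream",     # hypothetical Glue table holding the schema
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-glue-role",  # hypothetical role
    },
    "InputFormatConfiguration": {
        "Deserializer": {"OpenXJsonSerDe": {}}  # parse incoming JSON records
    },
    "OutputFormatConfiguration": {
        "Serializer": {"ParquetSerDe": {}}      # write Apache Parquet to S3
    },
}

print(data_format_conversion["OutputFormatConfiguration"])  # Parquet serializer
```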


Information

Questions 340
Time 2h 10min
Difficulty Medium
Minimum Score 72.00%
