
Confluent CCDAK: Confluent Certified Developer for Apache Kafka Certification Exam Practice Test

Demo: 18 questions
Total 61 questions

Confluent Certified Developer for Apache Kafka Certification Examination Questions and Answers

Question 1

You need to configure a sink connector to write records that fail into a dead letter queue topic. Requirements:

    Topic name: DLQ-Topic

    Headers containing error context must be added to the messages.

Which three configuration parameters are necessary? (Select three.)

Options:

A.

errors.tolerance=all

B.

errors.deadletterqueue.topic.name=DLQ-Topic

C.

errors.deadletterqueue.context.headers.enable=true

D.

errors.tolerance=none

E.

errors.log.enable=true

F.

errors.log.include.messages=true
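For background, a dead letter queue on a sink connector is controlled by the connector's error-handling properties. A minimal sketch in connector property-file form (the rest of the connector configuration is omitted; only the error-handling lines are shown):

    # error handling for a sink connector (other settings omitted)
    errors.tolerance=all
    errors.deadletterqueue.topic.name=DLQ-Topic
    errors.deadletterqueue.context.headers.enable=true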

Question 2

You are developing a Java application using a Kafka consumer.

You need to integrate Kafka’s client logs with your own application’s logs using log4j2.

Which Java library dependency must you include in your project?

Options:

A.

SLF4J implementation for Log4j 1.2 (org.slf4j:slf4j-log4j12)

B.

SLF4J implementation for Log4j2 (org.apache.logging.log4j:log4j-slf4j-impl)

C.

None; the right dependency will be added transitively by the Kafka client dependency.

D.

Just the log4j2 dependency of the application

Question 3

You create a topic named IoT-Data with 10 partitions and a replication factor of three.

A producer sends 1 MB messages compressed with Gzip.

Which two statements are true in this scenario?

(Select two.)

Options:

A.

Compression type will be stored in batch attributes.

B.

By default, compression is the producer’s responsibility.

C.

The message is already compressed so it will not be serialized.

D.

All compressed messages will be stored in the same topic partition.
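For context on producer-side compression, a minimal Java producer sketch (the broker address and serializers are assumptions for illustration):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
    props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    // Batches are compressed with Gzip by the producer before being sent.
    props.put("compression.type", "gzip");
    KafkaProducer<String, String> producer = new KafkaProducer<>(props);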

Question 4

A producer is configured with the default partitioner. It is sending records to a topic that is configured with five partitions. The records do not contain a key.

What is the result of this?

Options:

A.

Records will be dispatched among the available partitions.

B.

Records will be sent to partition 0.

C.

An error will be raised and no record will be sent.

D.

Records will be sent to the least used partition.
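For reference, a keyless record in Java looks like the following (topic name and producer instance are hypothetical); the partition choice is left to the configured default partitioner:

    import org.apache.kafka.clients.producer.ProducerRecord;

    // No key is supplied, so the default partitioner decides which partition receives the record.
    ProducerRecord<String, String> record =
        new ProducerRecord<>("orders", "value-only payload");   // hypothetical topic name
    producer.send(record);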

Question 5

You have a topic t1 with six partitions. You use Kafka Connect to send data from topic t1 in your Kafka cluster to Amazon S3. Kafka Connect is configured for two tasks.

How many partitions will each task process?

Options:

A.

2

B.

3

C.

6

D.

12
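For orientation: a sink connector's tasks share the topic's partitions through a consumer group, so with tasks.max=2 the six partitions of t1 are divided evenly, 6 / 2 = 3 partitions per task.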

Question 6

What are two examples of performance metrics?

(Select two.)

Options:

A.

fetch-rate

B.

Number of active users

C.

total-login-attempts

D.

incoming-byte-rate

E.

Number of active user sessions

F.

Time of last failed login
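For context, fetch-rate and incoming-byte-rate are examples of Kafka client metrics that can be read programmatically; a minimal sketch (the consumer instance is assumed to exist):

    import java.util.Map;
    import org.apache.kafka.common.Metric;
    import org.apache.kafka.common.MetricName;

    // Print selected client performance metrics exposed by the consumer.
    for (Map.Entry<MetricName, ? extends Metric> entry : consumer.metrics().entrySet()) {
        String name = entry.getKey().name();
        if (name.equals("fetch-rate") || name.equals("incoming-byte-rate")) {
            System.out.println(name + " = " + entry.getValue().metricValue());
        }
    }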

Question 7

You want to connect with a username and password to a secured Kafka cluster that has SSL encryption.

Which properties must your client include?

Options:

A.

security.protocol=SASL_SSL

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='myUser' password='myPassword';

B.

security.protocol=SSL

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='myUser' password='myPassword';

C.

security.protocol=SASL_PLAINTEXT

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='myUser' password='myPassword';

D.

security.protocol=PLAINTEXT

sasl.jaas.config=org.apache.kafka.common.security.ssl.TlsLoginModule required username='myUser' password='myPassword';
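For context, a client that authenticates with a username and password (SASL/PLAIN) over TLS typically combines SASL and SSL settings along these lines; the truststore entries are illustrative and depend on how the cluster's certificates are distributed:

    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="myUser" password="myPassword";
    # illustrative truststore settings for the TLS handshake
    ssl.truststore.location=/path/to/client.truststore.jks
    ssl.truststore.password=changeit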

Question 8

You are building a system for a retail store selling products to customers.

Which three datasets should you model as a GlobalKTable?

(Select three.)

Options:

A.

Inventory of products at a warehouse

B.

All purchases at a retail store occurring in real time

C.

Customer profile information

D.

Log of payment transactions

E.

Catalog of products
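For context, small reference datasets that every instance needs in full are the typical fit for a GlobalKTable; a minimal Kafka Streams sketch with hypothetical topic names and default serdes assumed:

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.GlobalKTable;
    import org.apache.kafka.streams.kstream.KStream;

    StreamsBuilder builder = new StreamsBuilder();
    // Reference data replicated in full to every application instance.
    GlobalKTable<String, String> products = builder.globalTable("product-catalog");
    // Event stream of purchases, joined against the product catalog
    // (assuming purchases are keyed by product id).
    KStream<String, String> purchases = builder.stream("purchases");
    purchases.join(products,
            (purchaseKey, purchaseValue) -> purchaseKey,
            (purchaseValue, productValue) -> purchaseValue + " | " + productValue)
        .to("enriched-purchases");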

Question 9

You need to consume messages from Kafka using the command-line interface (CLI).

Which command should you use?

Options:

A.

kafka-console-consumer

B.

kafka-consumer

C.

kafka-get-messages

D.

kafka-consume
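For reference, a typical console-consumer invocation looks like this (broker address and topic name are illustrative):

    kafka-console-consumer --bootstrap-server localhost:9092 --topic my-topic --from-beginning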

Question 10

Which two statements are correct about transactions in Kafka?

(Select two.)

Options:

A.

All messages from a failed transaction will be deleted from a Kafka topic.

B.

Transactions are only possible when writing messages to a topic with single partition.

C.

Consumers can consume both committed and uncommitted transactions.

D.

Information about producers and their transactions is stored in the __transaction_state topic.

E.

Transactions guarantee at least once delivery of messages.
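For context, a minimal transactional-producer sketch in Java (topic, key, value, and transactional.id are hypothetical); whether consumers see uncommitted records depends on their isolation.level setting:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.KafkaException;

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
    props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    props.put("transactional.id", "payments-producer-1");   // hypothetical id
    KafkaProducer<String, String> producer = new KafkaProducer<>(props);

    producer.initTransactions();
    try {
        producer.beginTransaction();
        producer.send(new ProducerRecord<>("payments", "order-42", "PAID"));
        producer.commitTransaction();
    } catch (KafkaException e) {
        // Aborted records remain in the log but are skipped by read_committed consumers.
        producer.abortTransaction();
    }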

Question 11

You are experiencing low throughput from a Java producer.

Metrics show low I/O thread ratio and low I/O thread wait ratio.

What is the most likely cause of the slow producer performance?

Options:

A.

Compression is enabled.

B.

The producer is sending large batches of messages.

C.

There is a bad data link layer (layer 2) connection from the producer to the cluster.

D.

The producer code has an expensive callback function.
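As background for this scenario, producer send callbacks run on the producer's I/O (sender) thread, so heavy work inside them can slow sending; a sketch of keeping the callback cheap by handing work to an application-owned executor (the producer, record, and executor are assumptions for illustration):

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    ExecutorService workers = Executors.newFixedThreadPool(4);   // application-owned pool

    producer.send(record, (metadata, exception) -> {
        // Keep this callback cheap: it executes on the producer I/O thread.
        if (exception != null) {
            workers.submit(() -> System.err.println("send failed: " + exception));
        }
    });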

Question 12

This schema excerpt is an example of which schema format?

package com.mycorp.mynamespace;

message SampleRecord {
    int32 Stock = 1;
    double Price = 2;
    string Product_Name = 3;
}

Options:

A.

Avro

B.

Protobuf

C.

JSON Schema

D.

YAML

Question 13

Where are source connector offsets stored?

Options:

A.

offset.storage.topic

B.

storage.offset.topic

C.

topic.offset.config

D.

offset.storage.partitions
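For context, a distributed Connect worker keeps connector offsets, configurations, and status in internal Kafka topics named by the worker configuration; a sketch with conventional, illustrative topic names:

    offset.storage.topic=connect-offsets
    config.storage.topic=connect-configs
    status.storage.topic=connect-status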

Question 14

Your application is consuming from a topic, with a deserializer configured on the consumer.

It needs to be resilient to badly formatted records ("poison pills"). You surround the poll() call with a try/catch for RecordDeserializationException.

You need to log the bad record, skip it, and continue processing.

Which action should you take in the catch block?

Options:

A.

Log the bad record, no other action needed.

B.

Log the bad record and seek the consumer to the offset of the next record.

C.

Log the bad record and call the consumer.skip() method.

D.

Throw a runtime exception to trigger a restart of the application.
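For context, a minimal sketch of the pattern the question describes (the consumer instance and its generic types are assumed; logging is simplified to standard error):

    import java.time.Duration;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.common.errors.RecordDeserializationException;

    while (true) {
        try {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            // ... process records ...
        } catch (RecordDeserializationException e) {
            // Log the poison pill, then seek past it so the next poll() resumes after it.
            System.err.printf("Skipping bad record at %s offset %d%n", e.topicPartition(), e.offset());
            consumer.seek(e.topicPartition(), e.offset() + 1);
        }
    }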

Question 15

An application is consuming messages from Kafka.

The application logs show that partitions are frequently being reassigned within the consumer group.

Which two factors may be contributing to this?

(Select two.)

Options:

A.

The consumer application is slow at processing records.

B.

The number of partitions does not match the number of application instances.

C.

There is a storage issue on the broker.

D.

An instance of the application is crashing and being restarted.

Question 16

You are composing a REST request to create a new connector in a running Connect cluster. You invoke POST /connectors with a configuration and receive a 409 (Conflict) response.

What are two reasons for this response? (Select two.)

Options:

A.

The connector configuration was invalid, and the response body will expand on the configuration error.

B.

The Connect cluster has reached capacity, and new connectors cannot be created without expanding the cluster.

C.

The connector already exists in the cluster.

D.

The Connect cluster is in the process of rebalancing.
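For reference, the request in question is typically made against the worker's REST port (host, port, and file name are illustrative):

    curl -X POST -H "Content-Type: application/json" --data @sink-connector.json http://localhost:8083/connectors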

Question 17

Match each configuration parameter with the correct deployment step in installing a Kafka connector.

Options:

Question 18

Which configuration allows more time for the consumer poll to process records?

Options:

A.

session.timeout.ms

B.

heartbeat.interval.ms

C.

max.poll.interval.ms

D.

fetch.max.wait.ms
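For context, the consumer settings most often tuned when processing between polls takes a long time are sketched below (values illustrative):

    # maximum time allowed between poll() calls before the consumer is considered failed
    max.poll.interval.ms=600000
    # fewer records per poll, so each batch finishes sooner
    max.poll.records=100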
