Kafka topic

Overview

Apache Kafka is a distributed event streaming platform that:

  • Publishes (writes) and subscribes to (reads) streams of events from sources like databases, cloud services, and software applications.
  • Stores these events durably and reliably for as long as you want.
  • Processes and reacts to the event streams in real-time and retrospectively.

Those events are organized and stored in topics. Topics are partitioned, meaning a topic is spread over a number of buckets located on different Kafka brokers.
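The mapping from a message key to a partition can be sketched as follows. This is an illustrative, dependency-free example: Kafka's actual default partitioner uses murmur2 hashing, not MD5.

```python
import hashlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a message key to one of the topic's partitions.

    Illustrative only: Kafka's default partitioner uses murmur2
    hashing; MD5 is used here just to keep the sketch deterministic
    and dependency-free.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Messages with the same key always land on the same partition,
# which is what preserves per-key ordering within a topic.
p1 = assign_partition(b"order-123", 6)
p2 = assign_partition(b"order-123", 6)
assert p1 == p2
```

Because same-key messages always hash to the same bucket, consumers reading a single partition see those events in the order they were produced.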

Event streaming thus ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time for your key use cases.

Info: Before you set up your data sync destination, make sure to configure your Source.

Tip: The Kafka Topic destination supports batch and real-time syncs.

Destination tab

The following table outlines the mandatory and optional parameters found on the Destination tab (Image 1). These parameters define your data sync destination and how it functions.

| Parameter | Description | Example |
| --- | --- | --- |
| Destination | Mandatory. Select your destination from the drop-down menu. | Kafka Topic |
| Bootstrap Servers | Mandatory. Bootstrap Servers are a list of host/port pairs used to establish the initial connection to the Kafka cluster. This parameter should be a comma-separated list of "broker host" or "host:port" entries. | localhost:9092,another.host:9092 |
| Topic Name | Mandatory. The name of the Kafka Topic that messages will be produced to. | |
| Use SSL | Check this if you want to connect to Kafka over SSL. | |
| SASL Mechanism | Mandatory. Select the SASL (Simple Authentication and Security Layer) mechanism to use for authentication: None, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, OAUTHBEARER (default), or OAUTHBEARER (OIDC). | |
| Test Connection | Use the "Test Connection" button to ensure that your credentials are properly configured to access your destination. If configured correctly, a "Connection Successful" pop-up appears. If configured incorrectly, a "Connection Failed" pop-up appears along with a link to the applicable error logs to help you troubleshoot. | |

Image 2: Define your Destination

Appendix A

Configuring a Dynamic Topic

Cinchy v5.10 added the ability to use a '@COLUMN' custom formula to enable a dynamic parameterized Kafka Topic when syncing into a Kafka destination.

To use this functionality, follow the instructions below.

  1. Define which table and column you want to use for your dynamic topic. In this example, the table is [Product].[Tasks] and the column is "Quarter".
  2. Create a data sync using a Cinchy Table source and a Kafka Topic destination.
    1. Ensure that the column defined in step 1 is loaded into your Schema.
  3. In the 'Topic' field of the Kafka destination, insert "@COLUMN('<column-name>')". In this example, the formula would be @COLUMN('Quarter').

Image: Dynamic Kafka Topic
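The steps above can be sketched as a per-record topic resolution: the `@COLUMN('...')` formula is replaced with that row's column value, so each record is routed to a topic named after its data. The row contents and the resolver function are illustrative; Cinchy performs this substitution internally during the sync.

```python
import re

# Matches @COLUMN('<column-name>') and captures the column name.
COLUMN_PATTERN = re.compile(r"@COLUMN\('([^']+)'\)")

def resolve_topic(formula: str, row: dict) -> str:
    """Replace each @COLUMN('name') in the formula with the row's value.

    Illustrative sketch only; not Cinchy's actual implementation.
    """
    def substitute(match: re.Match) -> str:
        return str(row[match.group(1)])
    return COLUMN_PATTERN.sub(substitute, formula)

# A record from [Product].[Tasks] is routed by its Quarter value.
row = {"Task": "Close books", "Quarter": "Q3-2024"}
topic = resolve_topic("@COLUMN('Quarter')", row)  # "Q3-2024"
```

With this formula, rows whose Quarter column holds different values are produced to different Kafka topics within a single sync.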