# Kafka Topic

## Overview
This page outlines the parameters to include in your listener configuration when setting up a real-time sync with a Kafka stream source. This process currently supports **JSON string or AVRO serialized formats.**
## Configuring the Sync
All of the configuration parameters and variables for a Kafka real-time sync are the same as those outlined in the Kafka batch sync documentation. The only difference is that you must also set up a listener configuration, as described in the next section.
## The Listener Configuration
In Cinchy v5.7+, you can configure the listener directly in the Connections UI; however, if you need multiple listeners, you must still add the additional configurations in the Listener Config table.
To listen in on multiple topics, you will need to configure multiple listener configs.
To set up a Stream Source, you must set up a Listener Configuration. The table below describes the parameters and their relevant descriptions.
The following column parameters can be found in the Listener Config table:
Parameter | Description | Example |
---|---|---|
Name | Mandatory. Provide a name for your listener config. | Kafka topic real-time sync |
Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | Kafka topic |
Topic | Mandatory. This field expects a JSON value specific to the connector type you are configuring. | See the Topic JSON table below. |
Connection Attributes | Mandatory. This field expects a JSON value specific to the connector type you are configuring. | See the Connection Attributes JSON table below. |
Status | Mandatory. Set to "Enabled" to activate the listener. Leave on "Disabled" until you are ready to start syncing. | Enabled / Disabled |
Running Status | Read-only. Shows the current state of the listener as Starting, Running, or Failed. This is automatically managed by the system. For more information, see the Listener status section. | Running |
Active | Managed by user/system. Indicates whether the listener will retry after a failure ("Yes") or has stopped attempting to sync and requires user intervention ("No"). | Yes / No |
Data Sync Config | Mandatory. This drop-down lists all the data syncs on your platform. Select the one you want to use for your real-time sync. | CDC Data Sync |
Subscription Expires On | Salesforce Stream Sources only. This timestamp is auto-populated when the listener has successfully subscribed to a topic. | |
Message | Auto-populated. This field reports errors that occur during the sync. | |
Auto Offset Reset | Determines where to start reading events when there is no last message ID or it's invalid. Can be adjusted after configuration. | Earliest / Latest / None |
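To illustrate the Auto Offset Reset semantics (Earliest / Latest / None), here is a minimal Python sketch. The function and its arguments are illustrative only, not part of Cinchy or Kafka; they model the decision the listener makes when it has no valid last message ID:

```python
def resolve_start_offset(last_committed, available_offsets, auto_offset_reset):
    """Decide where a listener starts reading when resuming a topic.

    last_committed    -- the stored last message ID, or None
    available_offsets -- offsets currently retained on the broker
    auto_offset_reset -- "Earliest", "Latest", or "None" (as in the config)
    """
    if last_committed is not None and last_committed in available_offsets:
        return last_committed  # a valid position exists: resume from it
    if auto_offset_reset == "Earliest":
        return min(available_offsets)  # replay from the oldest retained message
    if auto_offset_reset == "Latest":
        return max(available_offsets) + 1  # skip history; read only new messages
    # "None": refuse to guess -- surface the problem instead of silently skipping
    raise ValueError("No valid last message ID and Auto Offset Reset is None")

offsets = range(100, 200)  # offsets the broker still retains
print(resolve_start_offset(None, offsets, "Earliest"))  # 100
print(resolve_start_offset(None, offsets, "Latest"))    # 200
```

In short: Earliest replays retained history, Latest only picks up new messages, and None fails fast so you can investigate.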
The table below can help you create the Topic JSON needed to set up a real-time sync.
Parameter | Description | Example |
---|---|---|
topicName | Mandatory. This is the Kafka topic name to listen messages on. | |
messageFormat | Optional. Enter "AVRO" if your messages are serialized in AVRO; otherwise leave blank. | AVRO |
### Example Topic JSON

```json
{
  "topicName": "<(mandatory) Kafka topic name to listen for messages on>",
  "messageFormat": "<(optional) Enter 'AVRO' if the messages are serialized in AVRO>"
}
```
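As a quick sketch of how the two Topic fields combine, the following Python snippet assembles the JSON, omitting `messageFormat` for plain JSON-string messages. The function name and topic name are illustrative only:

```python
import json
from typing import Optional

def build_topic_json(topic_name: str, message_format: Optional[str] = None) -> str:
    """Assemble the Topic JSON for a Kafka listener config.

    topic_name     -- (mandatory) Kafka topic to listen for messages on
    message_format -- (optional) "AVRO" when messages are AVRO-serialized;
                      omit entirely for plain JSON-string messages
    """
    topic = {"topicName": topic_name}
    if message_format:
        topic["messageFormat"] = message_format
    return json.dumps(topic)

print(build_topic_json("orders", "AVRO"))
# {"topicName": "orders", "messageFormat": "AVRO"}
```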
The table below can help you create the Connection Attributes JSON needed to set up a real-time sync.
Parameter | Description |
---|---|
bootstrapServers | A comma-separated list of Kafka bootstrap servers, each in the form host:port. |
saslMechanism | One of PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512. |
saslPassword | The password for your chosen SASL mechanism. |
saslUsername | The username for your chosen SASL mechanism. |
url | Required if your data follows a schema when serialized in AVRO. A comma-separated list of URLs for schema registry instances used to register or look up schemas. |
basicAuthCredentialsSource | Specifies the Kafka configuration property "schema.registry.basic.auth.credentials.source" that provides the basic authentication credentials. This can be "UserInfo" or "SaslInherit". |
basicAuthUserInfo | Basic Auth credentials specified in the form username:password. |
sslKeystorePassword | The client keystore (PKCS#12) password. |
securityProtocol | Kafka supports cluster encryption and authentication, which can encrypt data in transit between your applications and Kafka. Use this field to specify the protocol used for communication between client and server. Cinchy currently supports Plaintext, SaslPlaintext, and SaslSsl. |
### Example Connection Attributes JSON

```json
{
  "bootstrapServers": "<(mandatory) Kafka bootstrap servers in a comma-separated list, in the form host:port>",
  "saslMechanism": "<PLAIN|SCRAM-SHA-256|SCRAM-SHA-512>",
  "saslPassword": "",
  "saslUsername": "",
  "schemaRegistrySettings": {
    "url": "<(optional) Required if your data follows a schema when serialized in AVRO. A comma-separated list of URLs for schema registry instances used to register or look up schemas>",
    "basicAuthCredentialsSource": "<(optional) Specifies the Kafka configuration property 'schema.registry.basic.auth.credentials.source' that provides the basic authentication credentials; this can be 'UserInfo' or 'SaslInherit'>",
    "basicAuthUserInfo": "<(optional) Basic Auth credentials specified in the form username:password>",
    "sslKeystorePassword": "<(optional) The client keystore (PKCS#12) password>"
  },
  "securityProtocol": "Plaintext | SaslPlaintext | SaslSsl"
}
```
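Since a malformed Connection Attributes value is a common source of listener failures, a small sanity check before pasting the JSON into the config can save a debugging cycle. This Python sketch checks only the constraints stated above (host:port format, allowed SASL mechanisms and security protocols); the function name and broker hostnames are illustrative, not part of Cinchy:

```python
import json

ALLOWED_SASL = {"PLAIN", "SCRAM-SHA-256", "SCRAM-SHA-512"}
ALLOWED_PROTOCOLS = {"Plaintext", "SaslPlaintext", "SaslSsl"}

def connection_attribute_problems(raw: str) -> list:
    """Return a list of problems found in a Connection Attributes JSON string."""
    attrs = json.loads(raw)
    problems = []
    # Every bootstrap server entry must be host:port with a numeric port
    for server in attrs.get("bootstrapServers", "").split(","):
        host, sep, port = server.strip().partition(":")
        if not (host and sep and port.isdigit()):
            problems.append(f"bootstrapServers entry is not host:port: {server.strip()!r}")
    if attrs.get("saslMechanism") not in ALLOWED_SASL:
        problems.append("saslMechanism must be PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512")
    if attrs.get("securityProtocol") not in ALLOWED_PROTOCOLS:
        problems.append("securityProtocol must be Plaintext, SaslPlaintext, or SaslSsl")
    return problems

attributes = json.dumps({
    "bootstrapServers": "broker1.internal:9092,broker2.internal:9092",
    "saslMechanism": "SCRAM-SHA-512",
    "saslUsername": "cinchy-listener",
    "saslPassword": "example-password",
    "securityProtocol": "SaslSsl",
})
print(connection_attribute_problems(attributes))  # []
```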