# Apache Kafka

Apache Kafka is an open-source distributed event streaming platform. It is used by many industries and organizations to process payments and perform other financial transactions in real time, to collect and immediately react to customer interactions and orders, and to serve as the foundation for data platforms, microservices, and much more.
Using the Workato Apache Kafka connector, you can connect to your local or cloud-hosted Kafka cluster instance.

The Apache Kafka connector is an on-premise connector.

# How to connect to Apache Kafka in Workato

You connect to Apache Kafka like any other client. Only basic information is required and you are ready to go.

When setting up the connection directly in Workato using a cloud profile, you don't have to edit the on-prem config file. Set up all properties directly in Workato as shown below.
If you are still using a `config.yml` file to set up your connection details, refer to the configuration page.

Configure your Kafka connection directly in Workato by setting the following properties:

| Property name | Description |
| --- | --- |
| url | Comma-separated list of server URLs, where the protocol is either `kafka` or `kafka+ssl`. |
| timeout | General operation timeout, in milliseconds. |
| Server certificate | X509 server certificate in `.pem` format. |
| SSL certificate | X509 client certificate in `.pem` format. |
| SSL certificate key | RSA client key in `.pem` format. |

The preceding certificate options can be used when connecting to Kafka using SSL/TLS.

Password-protected private keys cannot be inlined.
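As a hedged illustration of the `url` property format described above, the following sketch splits a comma-separated server list and checks that each entry uses one of the two supported protocols. The helper function is hypothetical, not part of the Workato connector:

```python
# Illustrative sketch only: validate a comma-separated `url` property
# where each entry must use the kafka or kafka+ssl scheme.
from urllib.parse import urlparse

ALLOWED_SCHEMES = {"kafka", "kafka+ssl"}

def parse_broker_urls(url_property: str) -> list[str]:
    """Split the comma-separated list and check each entry's scheme."""
    urls = [u.strip() for u in url_property.split(",") if u.strip()]
    for u in urls:
        scheme = urlparse(u).scheme
        if scheme not in ALLOWED_SCHEMES:
            raise ValueError(f"unsupported scheme {scheme!r} in {u!r}")
    return urls
```

For example, `parse_broker_urls("kafka://a:9092, kafka+ssl://b:9093")` returns both URLs, while an `http://` entry raises an error.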

You can provide any Kafka producer or consumer configuration property, for example, `bootstrap.servers` or `batch.size`.

However, some properties are overridden by the on-prem agent and cannot be configured. You will get a warning when trying to redefine a protected property. Protected properties include:

| Property name | Comment |
| --- | --- |
| key.serializer | Only `StringSerializer` is supported by the agent |
| value.serializer | Only `StringSerializer` is supported by the agent |
| key.deserializer | Only `StringDeserializer` is supported by the agent |
| value.deserializer | Only `StringDeserializer` is supported by the agent |
| auto.offset.reset | Defined by recipes |
| enable.auto.commit | Defined internally |
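To make the override behavior concrete, here is a minimal sketch, under the assumption that the agent simply drops protected keys and warns about each attempt to redefine one. The function name and implementation are illustrative, not the agent's actual code:

```python
# Hypothetical sketch of how an on-prem agent might reject protected
# properties; the protected names come from the table above.
import warnings

PROTECTED = {
    "key.serializer", "value.serializer",
    "key.deserializer", "value.deserializer",
    "auto.offset.reset", "enable.auto.commit",
}

def merge_user_properties(user_props: dict) -> dict:
    """Keep only non-protected keys, warning on each protected one."""
    accepted = {}
    for key, value in user_props.items():
        if key in PROTECTED:
            warnings.warn(f"property {key!r} is protected and will be ignored")
        else:
            accepted[key] = value
    return accepted
```

Passing `{"bootstrap.servers": "b:9092", "enable.auto.commit": "true"}` would keep only `bootstrap.servers` and emit one warning.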

The Kafka connector supports Apache Avro, a binary serialization format. Avro relies on schemas, defined in JSON, that specify which fields are present and their types. The connector relies on a schema registry to store these schemas, and only works with schema registry version 6.1.4 and above.
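To show what such a schema looks like, here is a minimal Avro record schema as JSON. The `Payment` record and its fields are invented for illustration; parsing it with the standard `json` module only inspects the document, while real serialization requires an Avro library and, per the text above, a schema registry:

```python
# A minimal, hypothetical Avro record schema: field names and types
# are declared as JSON, which is what a schema registry stores.
import json

SCHEMA = json.loads("""
{
  "type": "record",
  "name": "Payment",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"}
  ]
}
""")

# Map each declared field name to its declared type.
field_types = {f["name"]: f["type"] for f in SCHEMA["fields"]}
```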

Example: an on-prem Kafka Avro connection, with the connection configuration including the URL of the schema registry.

Last updated: 2/13/2024, 4:59:53 PM