# Apache Kafka

Apache Kafka is an open-source distributed event streaming platform. It is used by many industries and organizations to process payments and perform other financial transactions in real time, to collect and immediately react to customer interactions and orders, and to serve as the foundation for data platforms, microservices, and much more.
Using the Workato Apache Kafka connector, you can link to your local or web-hosted Kafka cluster instance.

# How to connect to Apache Kafka in Workato

Complete the following steps to establish a connection to Apache Kafka in Workato using a cloud profile on the on-prem agent (OPA):

LOCAL PROFILE SETUP

The steps in this guide configure an Apache Kafka connection using a Cloud Profile. Refer to the Apache Kafka Profile guide to configure a connection using a config.yml file.

1. Enter a Connection name that identifies which Apache Kafka instance Workato is connected to.

2. Use the Location drop-down menu to select the project where you plan to store the connection.

3. Use the On-prem group drop-down menu to select an on-prem group that uses a cloud profile.

4. Enter a server URL that uses the `kafka` or `kafka+ssl` protocol.
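As an illustration, a server URL in each protocol might look like the following (the hostname and ports are assumptions, not values from this guide):

```text
kafka://broker1.example.com:9092
kafka+ssl://broker1.example.com:9093
```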

5. Specify a Timeout in milliseconds for general operations.

6. Enter the following values in .pem format to connect to Kafka using SSL/TLS:

   1. Enter an X.509 Server certificate and SSL client certificate.
   2. Enter an RSA SSL client key.
   3. Optional. Enter an RSA SSL client key password. Don't inline password-protected private keys.
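As a sketch of how such .pem files can be produced with OpenSSL (the filenames and subject name are illustrative; in practice the client certificate would be signed by the CA your Kafka brokers trust, not self-signed):

```shell
# Generate a 2048-bit RSA client key in PEM format (unencrypted).
openssl genrsa -out client.key 2048

# Create a self-signed client certificate valid for one year.
# For illustration only; use your CA-signed certificate in production.
openssl req -new -x509 -key client.key -out client.pem -days 365 \
  -subj "/CN=workato-kafka-client"
```

The contents of `client.pem` and `client.key` are what you would paste into the SSL client certificate and RSA SSL client key fields.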

7. Optional. Expand the Additional properties for Kafka connection section and click Add parameter to define additional properties.


You can define any Kafka producer or consumer configuration property, for example `bootstrap.servers` or `batch.size`.

Some properties are unavailable because the on-prem agent overrides them. Protected properties include the following:

| Property name | Details |
| --- | --- |
| `key.serializer` | Only `StringSerializer` is supported by the agent. |
| `value.serializer` | Only `StringSerializer` is supported by the agent. |
| `key.deserializer` | Only `StringDeserializer` is supported by the agent. |
| `value.deserializer` | Only `StringDeserializer` is supported by the agent. |
| `auto.offset.reset` | Defined by recipes. |
| `enable.auto.commit` | Defined internally. |
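For illustration, a few producer tuning properties entered as name/value pairs might look like the following (the values are examples, not recommendations):

```properties
# Illustrative Kafka producer tuning properties.
batch.size=32768
linger.ms=5
compression.type=gzip
```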

The Kafka connector uses Apache Avro, a binary serialization format. Avro describes which fields are present and their types using schemas defined in JSON. The connector stores these schemas in a schema registry and requires schema registry version 6.1.4 or later. For example:

*Kafka connection configuration with a schema registry URL*
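A minimal Avro schema is a JSON document; in this sketch the record and field names are illustrative, not from this guide:

```json
{
  "type": "record",
  "name": "PaymentEvent",
  "namespace": "com.example.kafka",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string"}
  ]
}
```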

8. Click Connect.
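The connection fields above correspond to standard Kafka client settings. As a rough sketch, assuming librdkafka-style configuration keys (the hostname, port, and file paths are illustrative, not values from this guide):

```python
# Sketch: mapping the connection fields in this guide onto
# librdkafka-style client configuration keys. All values here are
# assumptions for illustration only.
connection_config = {
    "bootstrap.servers": "broker1.example.com:9093",           # server URL host:port
    "security.protocol": "SSL",                                # kafka+ssl protocol
    "ssl.ca.location": "/etc/kafka/server-cert.pem",           # X.509 server certificate
    "ssl.certificate.location": "/etc/kafka/client-cert.pem",  # SSL client certificate
    "ssl.key.location": "/etc/kafka/client-key.pem",           # RSA SSL client key
    "ssl.key.password": None,                                  # optional key password
    "request.timeout.ms": 30000,                               # Timeout field
}

# Drop unset optional values before handing the config to a client.
connection_config = {k: v for k, v in connection_config.items() if v is not None}
```

This is only a conceptual mapping; Workato manages the actual client configuration through the on-prem agent.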

TROUBLESHOOT NO PROFILE FOUND ERRORS

Refer to the On-prem connections issues troubleshooting guide if you encounter a No profile found error.


Last updated: 3/9/2026, 9:08:59 PM