Kafka Component
Publish and consume messages from Apache Kafka event streams.
Component key: kafka
Description
Apache Kafka is an event streaming platform used to implement high-performance data pipelines, streaming analytics, data integration, and other applications. This component lets you publish messages to and consume messages from Apache Kafka event streams.
Connections
Basic Username/Password
key: basic
Apache Kafka supports authentication using a username and password with various SASL mechanisms. This connection requires credentials from the Kafka cluster administrator or from the managed Kafka service provider.
Prerequisites
- A Kafka cluster with SASL authentication enabled
- Username and password credentials with appropriate permissions
- Knowledge of the SASL mechanism configured on the cluster (PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512)
- SSL/TLS certificates (if required by the Kafka cluster)
Setup Steps
For managed Kafka services (e.g., Confluent Cloud, Amazon MSK, Azure Event Hubs):
- Navigate to the service provider's console
- Locate the Kafka cluster and access the security settings
- Generate or obtain API credentials (username and API key/password)
- Note the SASL mechanism configured for the cluster
- Download SSL certificates if required
For self-hosted Kafka clusters:
- Contact the Kafka cluster administrator
- Request credentials with the appropriate permissions for the use case
- Obtain the SASL mechanism type configured on the cluster
- Obtain SSL/TLS certificates if the cluster requires secure connections
Refer to the Kafka SASL authentication documentation for more information on authentication mechanisms.
Configure the Connection
Create a connection of type Basic Username/Password and enter:
- Username: Enter the username provided by the Kafka administrator or service provider
- Password: Enter the password or API key
- Authentication Mechanism: Select the SASL mechanism configured on the cluster:
  - `plain` - PLAIN SASL mechanism (transmits credentials in plaintext; use with SSL/TLS)
  - `scram-sha-256` - SCRAM-SHA-256 mechanism (recommended for production)
  - `scram-sha-512` - SCRAM-SHA-512 mechanism (higher security)
- Enable SSL/TLS: Check this option if the Kafka cluster requires secure connections
- CA Certificate (if SSL is enabled): Paste the Certificate Authority (CA) certificate in PEM format. This is required for SSL connections.
- Client Certificate (if required): Paste the client certificate in PEM format (only if the cluster requires mutual TLS authentication)
- Client Key (if required): Paste the client private key in PEM format (only if the cluster requires mutual TLS authentication)
For production environments, use SCRAM-SHA-256 or SCRAM-SHA-512 with SSL/TLS enabled. The PLAIN mechanism transmits credentials without hashing and should only be used over encrypted connections.
Certificates must be in PEM format. If the certificates are in other formats (JKS, PKCS12), convert them to PEM before pasting into the connection configuration.
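These connection fields map directly onto the configuration that standard Kafka clients expect. The sketch below is illustrative only (the component manages this internally) and uses the kafkajs Node.js client; the broker address, credentials, and PEM placeholders are example values:

```typescript
import { Kafka } from "kafkajs";

// Illustrative mapping of the connection fields onto a kafkajs client config.
// The PEM strings are placeholders for the values pasted into the connection.
const kafka = new Kafka({
  clientId: "myExampleClient",
  brokers: ["kafka-broker.example.com:9092"],
  // Only set `ssl` when Enable SSL/TLS is checked on the connection.
  ssl: {
    ca: ["-----BEGIN CERTIFICATE-----\n..."], // CA Certificate
    cert: "-----BEGIN CERTIFICATE-----\n...", // Client Certificate (mutual TLS only)
    key: "-----BEGIN PRIVATE KEY-----\n...", // Client Key (mutual TLS only)
  },
  sasl: {
    mechanism: "scram-sha-256", // or "plain" / "scram-sha-512"
    username: "my-username", // Username
    password: "my-password", // Password or API key
  },
});
```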
| Input | Notes | Example |
|---|---|---|
| Authentication Mechanism | SASL mechanism used to authenticate the username and password (plain, scram-sha-256, or scram-sha-512). | scram-sha-256 |
| CA Certificate | Certificate Authority (CA) certificate in PEM format. Required for SSL connections. | |
| Client Certificate | Client certificate in PEM format (if required by the Kafka cluster). | |
| Client Key | Client private key in PEM format (if required by the Kafka cluster). | |
| Password | Password or API key for SASL authentication. | |
| Enable SSL/TLS | Enable SSL/TLS for secure connections. | false |
| Username | Username for SASL authentication. | |
Triggers
Kafka Consumer
Consume messages from Kafka topics on a schedule. | key: kafkaConsumer
| Input | Notes | Example |
|---|---|---|
| Auto Commit | Whether to automatically commit offsets after processing messages. | true |
| Brokers | A Kafka broker allows consumers to fetch messages by topic, partition and offset. | kafka-broker.example.com:9092 |
| Client ID | A Client Id is an optional identifier of a Kafka consumer that is passed to a Kafka broker with every request. | myExampleClient |
| Connection | | |
| Consumer Group ID | The consumer group ID to use for this consumer. | my-consumer-group |
| From Beginning | Whether to start consuming from the beginning of the topic. | false |
| Heartbeat Interval (ms) | The interval for sending heartbeats to the broker in milliseconds. | 3000 |
| Max Messages | Maximum number of messages to consume per trigger execution. | 100 |
| Session Timeout (ms) | The timeout for the consumer session in milliseconds. | 30000 |
| Topics | List of topics to subscribe to. | my-topic |
This trigger consumes messages from Apache Kafka topics on a configured schedule.
Unlike traditional polling triggers that track state between executions, this trigger consumes a configurable number of messages from one or more Kafka topics each time it runs, according to the schedule configured in the integration.
How It Works
- The trigger executes on the configured schedule (e.g., every 5 minutes)
- It connects to Kafka using the provided connection credentials and broker addresses
- It creates a consumer with the specified consumer group ID
- It subscribes to one or more topics
- It consumes up to the maximum number of messages (or times out after 10 seconds if fewer are available)
- It returns all consumed messages in a structured format
- It disconnects from Kafka after consumption
The trigger uses Kafka's consumer group mechanism to track offsets. When Auto Commit is enabled (default), offsets are automatically committed after processing. This ensures that the same messages are not consumed twice by the same consumer group.
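For reference, the flow above roughly corresponds to the following kafkajs sketch. The `consumeBatch` helper is hypothetical, written to mirror the trigger's documented behavior (max messages, 10-second timeout, auto commit); the component's internal implementation may differ:

```typescript
import { Kafka, logLevel } from "kafkajs";

interface ConsumedMessage {
  topic: string;
  partition: number;
  offset: string;
  key: string | null;
  value: string | null;
  timestamp: string;
}

// Consume up to `maxMessages` from the given topics, or return early
// after `timeoutMs` if fewer messages are available.
async function consumeBatch(
  brokers: string[],
  groupId: string,
  topics: string[],
  maxMessages = 100,
  timeoutMs = 10_000
): Promise<ConsumedMessage[]> {
  const kafka = new Kafka({ clientId: "myExampleClient", brokers, logLevel: logLevel.NOTHING });
  const consumer = kafka.consumer({
    groupId,
    sessionTimeout: 30_000, // Session Timeout (ms)
    heartbeatInterval: 3_000, // Heartbeat Interval (ms)
  });
  const messages: ConsumedMessage[] = [];

  await consumer.connect();
  await consumer.subscribe({ topics, fromBeginning: false });

  await new Promise<void>((resolve) => {
    const timer = setTimeout(resolve, timeoutMs);
    consumer.run({
      autoCommit: true, // commit offsets so the group does not re-consume these messages
      eachMessage: async ({ topic, partition, message }) => {
        messages.push({
          topic,
          partition,
          offset: message.offset,
          key: message.key?.toString() ?? null,
          value: message.value?.toString() ?? null,
          timestamp: message.timestamp,
        });
        if (messages.length >= maxMessages) {
          clearTimeout(timer);
          resolve();
        }
      },
    });
  });

  await consumer.disconnect();
  return messages;
}
```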
Configuration
Configure the following inputs:
- Connection: The Kafka connection configuration
- Client ID: An identifier for the Kafka consumer passed to the broker with every request (e.g., `myExampleClient`)
- Brokers: One or more Kafka broker addresses in `host:port` format (e.g., `kafka-broker.example.com:9092`)
- Consumer Group ID: The consumer group ID to use for this consumer (e.g., `my-consumer-group`)
- Topics: One or more topics to subscribe to (e.g., `my-topic`)
- Max Messages: Maximum number of messages to consume per trigger execution (default: `100`)
- Session Timeout (ms): The timeout for the consumer session in milliseconds (default: `30000`)
- Heartbeat Interval (ms): The interval for sending heartbeats to the broker in milliseconds (default: `3000`)
- From Beginning: Whether to start consuming from the beginning of the topic (default: `false`)
- Auto Commit: Whether to automatically commit offsets after processing messages (default: `true`)
Returned Data
The trigger returns consumed messages along with metadata about the consumption.
Example Payload
```json
{
  "messages": [
    {
      "topic": "my-topic",
      "partition": 0,
      "offset": "42",
      "key": "message-key",
      "value": "{\"orderId\": \"12345\", \"status\": \"shipped\"}",
      "timestamp": "1707500000000",
      "headers": {}
    },
    {
      "topic": "my-topic",
      "partition": 0,
      "offset": "43",
      "key": "message-key-2",
      "value": "{\"orderId\": \"12346\", \"status\": \"delivered\"}",
      "timestamp": "1707500001000",
      "headers": {}
    }
  ],
  "messageCount": 2,
  "consumerGroupId": "my-consumer-group",
  "topics": ["my-topic"]
}
```
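Note that `value` arrives as a raw string, so JSON-encoded messages need to be parsed in a downstream step. A minimal sketch (the `parseOrders` helper and `TriggerPayload` type are hypothetical, shaped after the payload above):

```typescript
// Hypothetical downstream step: parse JSON-encoded message values
// from the trigger payload shown above.
interface TriggerPayload {
  messages: { topic: string; value: string | null }[];
  messageCount: number;
}

function parseOrders(payload: TriggerPayload): unknown[] {
  return payload.messages
    .filter((m) => m.value !== null)
    .map((m) => JSON.parse(m.value as string)); // e.g. { orderId: "12345", status: "shipped" }
}
```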
Data Sources
Select Topic
Select a Kafka topic from the list. | key: selectTopic | type: picklist
| Input | Notes | Example |
|---|---|---|
| Broker | A Kafka broker allows consumers to fetch messages by topic, partition and offset. | 192.168.0.1:9092 |
| Client ID | A Client Id is an optional identifier of a Kafka consumer that is passed to a Kafka broker with every request. | myExampleClient |
| Connection | | |
Actions
Get Consumer Group Status
Get the status and lag information for a consumer group. Specify topics for better performance, or leave empty to check all topics. | key: getConsumerGroupStatus
| Input | Notes | Example |
|---|---|---|
| Brokers | A Kafka broker allows consumers to fetch messages by topic, partition and offset. | kafka-broker.example.com:9092 |
| Client ID | A Client Id is an optional identifier of a Kafka consumer that is passed to a Kafka broker with every request. | myExampleClient |
| Connection | | |
| Consumer Group ID | The consumer group ID to check status for. | my-consumer-group |
| Topics to Check | Specific topics to check for this consumer group. Leave empty to check all topics (slower). | my-topic |
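Conceptually, lag for each partition is the latest offset on the partition minus the consumer group's committed offset. A rough sketch of that computation with the kafkajs admin client (the `getGroupLag` helper is hypothetical; the action's actual implementation may differ):

```typescript
import { Kafka } from "kafkajs";

// Compute per-partition lag for a consumer group on one topic:
// lag = latest offset on the partition - the group's committed offset.
async function getGroupLag(brokers: string[], groupId: string, topic: string) {
  const admin = new Kafka({ clientId: "myExampleClient", brokers }).admin();
  await admin.connect();

  const committed = await admin.fetchOffsets({ groupId, topics: [topic] });
  const latest = await admin.fetchTopicOffsets(topic);

  const lag = latest.map(({ partition, offset }) => {
    const group = committed[0].partitions.find((p) => p.partition === partition);
    // An offset of "-1" means the group has not committed on this partition yet.
    const committedOffset = group && group.offset !== "-1" ? Number(group.offset) : 0;
    return { partition, lag: Number(offset) - committedOffset };
  });

  await admin.disconnect();
  return lag;
}
```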
List Topics
List all topics in the Kafka cluster. | key: listTopics
| Input | Notes | Example |
|---|---|---|
| Brokers | A Kafka broker allows consumers to fetch messages by topic, partition and offset. | kafka-broker.example.com:9092 |
| Client ID | A Client Id is an optional identifier of a Kafka consumer that is passed to a Kafka broker with every request. | myExampleClient |
| Connection | | |
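Listing topics corresponds to a single admin call in most Kafka clients; a minimal kafkajs sketch (the `listTopics` helper is hypothetical):

```typescript
import { Kafka } from "kafkajs";

// List all topic names visible to the client.
async function listTopics(brokers: string[]): Promise<string[]> {
  const admin = new Kafka({ clientId: "myExampleClient", brokers }).admin();
  await admin.connect();
  const topics = await admin.listTopics(); // e.g. ["my-topic", "orders"]
  await admin.disconnect();
  return topics;
}
```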
Publish Messages
Publish a message to an Apache Kafka topic. | key: publishMessages
| Input | Notes | Example |
|---|---|---|
| Brokers | A Kafka broker allows consumers to fetch messages by topic, partition and offset. | kafka-broker.example.com:9092 |
| Client ID | A Client Id is an optional identifier of a Kafka consumer that is passed to a Kafka broker with every request. | myExampleClient |
| Connection | | |
| Messages | Provide a string for a message to be sent to the Kafka topic. | Hello Kafka |
| Topic | A Topic is a category/feed name to which records are stored and published. | myTopic |
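A minimal publish sketch with the kafkajs producer, using the example values from the table above (the `publishMessages` helper is hypothetical):

```typescript
import { Kafka } from "kafkajs";

// Publish one or more string messages to a topic.
async function publishMessages(brokers: string[], topic: string, values: string[]) {
  const producer = new Kafka({ clientId: "myExampleClient", brokers }).producer();
  await producer.connect();
  await producer.send({
    topic,
    messages: values.map((value) => ({ value })), // message keys are optional and omitted here
  });
  await producer.disconnect();
}

// Usage: await publishMessages(["kafka-broker.example.com:9092"], "myTopic", ["Hello Kafka"]);
```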
Changelog
2026-02-11
Added comprehensive Kafka consumer/subscriber functionality and enhanced connection security.