Kafka Producer
The Kafka Producer component acts as a data source within the pipeline. It generates and publishes messages to Kafka topics, enabling reliable and scalable data ingestion into a Kafka cluster.
The Kafka Producer provides:
High-throughput streaming of messages.
Compatibility with multiple formats (CSV, JSON, XML, Avro).
Integration with both internal and external Kafka clusters.
This makes it suitable for real-time data processing, transformation, and analytics across diverse use cases.
Configuration Sections
All component configurations are classified into:
Basic Information
Meta Information
Resource Configuration
Basic Information Tab
The Basic Information tab defines execution and deployment parameters.
Invocation Type (Required): Select the execution mode. Supported: Real-Time.
Deployment Type (Required): Displays the deployment type for the component (pre-selected).
Container Image Version (Required): Displays the Docker image version used (pre-selected).
Failover Event (Optional): Select a failover event to handle retries or errors.
Batch Size (Required): Maximum number of records processed in one cycle (minimum: 10).
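The effect of the Batch Size field can be sketched with a short, purely illustrative snippet (the function name and record values are hypothetical, not part of the component):

```python
# Hypothetical sketch: Batch Size caps how many records one publish cycle handles.
def batches(records, batch_size=10):
    """Yield successive chunks of at most batch_size records."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

# With Batch Size = 10, 25 incoming records are published in 3 cycles.
cycles = list(batches(list(range(25)), batch_size=10))
```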
Meta Information Tab
The Meta Information tab configures Kafka topic details, record type, and environment options.
Topic Name (Required): Kafka topic where messages will be published.
Is External (Optional): Enable this to publish data to an external Kafka cluster. When enabled, the Bootstrap Server and Config fields are displayed.
Bootstrap Server (Conditional: shown when Is External is enabled): External Kafka bootstrap server connection string.
Config (Conditional: shown when Is External is enabled): Additional Kafka configuration details.
Input Record Type (Required): Message format: CSV, JSON, XML, or Avro.
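As an illustration of the external-cluster fields, the values below show what a Bootstrap Server string and extra Config settings might look like. The structure and key names are a sketch only; the exact settings the Config field accepts depend on your Kafka cluster's setup:

```python
# Hypothetical values for the "Is External" fields. The security settings
# shown are standard Kafka client options, but whether they are needed
# depends on how the external cluster is secured.
external_kafka = {
    "bootstrap_server": "broker1.external:9093",   # Bootstrap Server field
    "config": {                                    # Config field (extra Kafka settings)
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
    },
}
```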
Format-Specific Fields
CSV
Header: Column names for CSV data.
Separator: Delimiter used in the CSV file (e.g., a comma).
JSON
No additional fields required.
XML
No additional fields required.
Avro
Registry: Avro schema registry details.
Subject: Subject name in the schema registry.
Schema: Avro schema definition.
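The format-specific fields can be illustrated with a short sketch. The header, separator, record values, and Avro schema below are hypothetical examples, not values the component ships with:

```python
import csv
import io
import json

# CSV: the "Header" lists column names; the "Separator" is the delimiter.
header = ["id", "amount"]   # hypothetical Header value
separator = ","             # hypothetical Separator value

buf = io.StringIO()
writer = csv.writer(buf, delimiter=separator)
writer.writerow(header)
writer.writerow(["C-1001", "42.50"])  # hypothetical record
csv_record = buf.getvalue()

# Avro: the "Schema" field holds an Avro schema, which is itself JSON.
# A minimal record schema might look like this:
avro_schema = json.dumps({
    "type": "record",
    "name": "Transaction",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
})
```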
Host Aliases
For both internal and external clusters, you can configure host aliases:
IP (Optional): IP address of the Kafka broker.
Host Names (Optional): Hostnames mapped to the Kafka broker.
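Conceptually, each host alias pairs a broker IP with the hostnames that should resolve to it. The entries and the lookup helper below are a hypothetical sketch of that mapping, not the component's internal representation:

```python
# Hypothetical host alias entries: each broker IP maps to the hostnames
# that should resolve to it from inside the pipeline.
host_aliases = [
    {"ip": "10.20.0.15", "hostnames": ["broker1.external"]},
    {"ip": "10.20.0.16", "hostnames": ["broker2.external"]},
]

def resolve(hostname):
    """Return the aliased IP for a hostname, or None if unmapped."""
    for entry in host_aliases:
        if hostname in entry["hostnames"]:
            return entry["ip"]
    return None
```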
Saving the Configuration
1. Configure the Basic Information and Meta Information tabs.
2. Click Save Component (storage icon).
3. A confirmation message appears after the configuration is saved.
4. Activate the pipeline to start publishing messages to the configured Kafka topic.
Example Workflow
Configure the Kafka Producer with:
Topic Name: customer_transactions
Input Record Type: Avro
Registry: http://registry-server:8081
Subject: transactions-value
Schema: Avro schema definition for customer transaction data.
Is External: Enabled, with bootstrap server broker1.external:9093.
Connect the producer to an input event (data coming from a Reader or Consumer).
Activate the pipeline.
Messages are published to Kafka and can be consumed by downstream Kafka Consumers for processing.
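The workflow above can be sketched end to end. The stub class below stands in for a real Kafka client (such as kafka-python's KafkaProducer) so the sketch stays self-contained, and the payload serializes as JSON for brevity; the actual component would serialize with the configured Avro schema:

```python
import json

# Stand-in for a real Kafka client; it only records sends so the sketch
# runs without a broker. The record contents are hypothetical.
class StubProducer:
    def __init__(self, bootstrap_servers):
        self.bootstrap_servers = bootstrap_servers
        self.sent = []

    def send(self, topic, value):
        self.sent.append((topic, value))

# Values echo the example workflow above.
producer = StubProducer(bootstrap_servers="broker1.external:9093")
record = {"customer_id": "C-1001", "amount": 42.50}
producer.send("customer_transactions", json.dumps(record).encode("utf-8"))
```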