Kafka Producer
Last updated
The Kafka Producer component produces messages to internal or external Kafka topics.
All component configurations are classified broadly into the following sections:
Metadata
Please follow the demonstration to configure the component:
The Kafka Producer component consumes data from the previous event and produces it to a given Kafka topic. It can produce data to topics in the same environment or in an external environment, in CSV, JSON, XML, or Avro format. This data can then be consumed by a Kafka Consumer further along the data pipeline.
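Kafka message values are byte arrays, so whatever record type is configured, the component must serialize each incoming record before producing it. As a minimal illustration (the function name and record fields below are hypothetical, not part of the platform), a JSON record could be serialized like this:

```python
import json

def to_kafka_value(record: dict) -> bytes:
    # Kafka message values are byte arrays; a JSON record is
    # serialized to compact UTF-8 bytes before being produced.
    return json.dumps(record, separators=(",", ":")).encode("utf-8")

# Hypothetical record from the previous event in the pipeline.
event = {"id": 101, "status": "processed"}
print(to_kafka_value(event))  # b'{"id":101,"status":"processed"}'
```

The same pattern applies to the other record types: the record is rendered in the configured format and the resulting bytes are handed to the producer.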
It falls under the Producer component group.
Drag and drop the Kafka Producer component onto the Workflow Editor.
Click the dragged Kafka Producer component to open the component properties tabs.
Configure the Basic Information tab.
Invocation Type: Select an invocation type from the drop-down menu to set the running mode of the component. Select ‘Real-Time’ from the drop-down menu.
Deployment Type: It displays the deployment type for the component. This field comes pre-selected.
Container Image Version: It displays the image version for the Docker container. This field comes pre-selected.
Failover Event: Select a failover Event from the drop-down menu.
Batch Size (min 10): Provide the maximum number of records to be processed in one execution cycle (the minimum limit for this field is 10).
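To make the Batch Size setting concrete, the sketch below shows how records might be grouped into execution cycles of at most the configured size, with the minimum of 10 enforced. This is an illustration of the batching idea, not the component's actual implementation:

```python
MIN_BATCH_SIZE = 10  # the component enforces a minimum Batch Size of 10

def make_batches(records, batch_size):
    # Split the incoming records into chunks of at most batch_size,
    # one chunk per execution cycle.
    if batch_size < MIN_BATCH_SIZE:
        raise ValueError(f"Batch Size must be at least {MIN_BATCH_SIZE}")
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

# 25 records with Batch Size 10 yield cycles of 10, 10, and 5 records.
print([len(b) for b in make_batches(list(range(25)), 10)])  # [10, 10, 5]
```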
Click the Meta Information tab to open its properties fields, and configure the tab by providing the required values.
Topic Name: Specify the name of the topic to which the data should be produced.
Is External: Enable this option to produce the data to an external Kafka topic. The ‘Bootstrap Server’ and ‘Config’ fields appear once ‘Is External’ is enabled.
Bootstrap Server: Enter the external bootstrap server details.
Config: Enter the configuration details.
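As an example of what the external connection details might look like, the fragment below uses standard Kafka client property names (`bootstrap.servers`, `security.protocol`, `sasl.mechanism`); the hostnames are placeholders, and the exact keys accepted by the Config field depend on the platform:

```python
# Hypothetical external-cluster settings using standard Kafka
# client property names; replace the values with your own brokers.
external_kafka_config = {
    "bootstrap.servers": "broker1.example.com:9092,broker2.example.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
}
```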
Input Record Type: It contains the following input record types:
CSV: Produce CSV data using this option. The ‘Header’ and ‘Separator’ fields appear when CSV is selected as the input record type.
Header: Enter the column names of the CSV data to be produced to the Kafka topic.
Separator: Enter the separator, such as a comma (,), used in the CSV data.
JSON: Produce JSON data using this option.
XML: Produce XML data using this option.
AVRO: Produce Avro data using this option. The ‘Registry’, ‘Subject’, and ‘Schema’ fields appear when AVRO is selected as the input record type.
Registry: Enter the registry details.
Subject: Enter the subject details.
Schema: Enter the schema.
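To show how the Header and Separator fields shape the produced CSV records, here is a minimal sketch (the helper name and sample record are hypothetical, not part of the platform):

```python
import csv
import io

def to_csv_line(record: dict, headers: list, separator: str = ",") -> str:
    # Order the values by the configured Header list and join them
    # with the configured Separator, mirroring the CSV record type.
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=separator, lineterminator="")
    writer.writerow([record.get(h, "") for h in headers])
    return buf.getvalue()

# With headers ["id", "name"] and a pipe separator:
print(to_csv_line({"id": 1, "name": "alpha"}, ["id", "name"], "|"))  # 1|alpha
```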
After completing the configuration, click the ‘Save Component in Storage’ icon provided in the configuration panel to save the component.
A notification message appears confirming that the component configuration has been saved.