Kafka Consumer
The Kafka Consumer component consumes messages from the internal pipeline or external brokers.
All component configurations are classified broadly into three sections.
Check out the steps provided in the demonstration to configure the Kafka Consumer component.
Please Note: It currently supports SSL and Plain Text as security types.
This component can also read data from external brokers using SSL as the security type along with Host Aliases:
Drag and drop the Kafka Consumer Component to the Workflow Editor from the Consumer component group.
Click on the dragged Kafka Consumer component to open the component properties tabs.
The Basic Information tab opens by default when the dragged component is clicked. Configure the Basic Information tab by following the given steps:
Select an Invocation Type from the drop-down menu to confirm the running mode of the component (select ‘Real-Time’).
Deployment Type: It displays the deployment type for the component. This field comes pre-selected.
Container Image Version: It displays the image version for the Docker container. This field comes pre-selected.
Failover Event: Select a failover Event from the drop-down menu.
Batch Size (min 10): Provide the maximum number of records to be processed in one execution cycle (the minimum limit for this field is 10). See the sketch after this list for how this maps to a Kafka client setting.
Enable Auto-Scaling: The component pod scales up automatically, up to the given maximum number of instances, if the component lag exceeds 60%.
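To illustrate what the Batch Size field corresponds to at the client level, below is a minimal sketch using the standard Apache Kafka Java client, where the analogous setting is max.poll.records. The broker address, group id, and topic name are placeholders; this shows the underlying mechanism, not the component's internal implementation.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class BatchSizeExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("group.id", "demo-consumer-group");     // placeholder group id
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // Cap the number of records fetched per poll, analogous to the
        // component's Batch Size field (minimum 10).
        props.put("max.poll.records", "10");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("demo-topic")); // placeholder topic name
            ConsumerRecords<String, String> batch = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : batch) {
                System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
            }
        }
    }
}
```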
Click on the Meta Information tab to open the properties fields and configure them by providing the required details.
Topic Name: Specify a topic name from where you want to consume data.
Start From: It contains the following options (see the sketch after this list):
Processed: Using this option, the component consumes messages from the last processed offset. If the component is being deployed for the first time, it will read from the beginning.
Beginning: Using this option, the component consumes data from the beginning of the topic, including already processed data.
Latest: Using this option, the component consumes only the latest data.
Timestamp: Using this option, the component consumes data produced within a given time interval.
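The following sketch shows how each Start From option could map onto the standard Apache Kafka Java consumer API. It is an illustration under that assumption, not the component's internal code; the class, method, and enum names are hypothetical.

```java
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndTimestamp;
import org.apache.kafka.common.TopicPartition;

public class StartFromExample {
    enum StartFrom { PROCESSED, BEGINNING, LATEST, TIMESTAMP }

    // Call after partitions have been assigned, e.g. from
    // ConsumerRebalanceListener.onPartitionsAssigned.
    static void applyStartFrom(KafkaConsumer<String, String> consumer,
                               Collection<TopicPartition> partitions,
                               StartFrom mode, long timestampMs) {
        switch (mode) {
            case BEGINNING -> consumer.seekToBeginning(partitions); // full history
            case LATEST -> consumer.seekToEnd(partitions);          // only new data
            case TIMESTAMP -> {
                // Find the first offset at or after the given timestamp
                // for every partition, then seek to it.
                Map<TopicPartition, Long> query = new HashMap<>();
                partitions.forEach(tp -> query.put(tp, timestampMs));
                consumer.offsetsForTimes(query).forEach((tp, offset) -> {
                    if (offset != null) consumer.seek(tp, offset.offset());
                });
            }
            // PROCESSED: no seek needed; the consumer resumes from the
            // last offset committed by its consumer group.
            case PROCESSED -> { }
        }
    }
}
```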
Is External: The user can consume data from an external bootstrap server by enabling the 'Is External' option. The ‘Bootstrap Server’ and ‘Config’ fields are displayed after enabling this option.
Bootstrap Server: Enter the external bootstrap server details.
Config: Enter the configuration details of the external bootstrap server. A short configuration sketch follows.
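As an illustration, the Bootstrap Server and Config values correspond to ordinary Kafka consumer properties when the standard Apache Kafka Java client is assumed; the host, port, and property values below are placeholders, not prescribed settings.

```java
import java.util.Properties;

public class ExternalBrokerConfig {
    static Properties externalProps() {
        Properties props = new Properties();
        // Value of the component's Bootstrap Server field; the host and
        // port below are placeholders.
        props.put("bootstrap.servers", "external-broker.example.com:9092");
        // Entries from the Config field are passed through as ordinary
        // consumer properties; these two are illustrative examples.
        props.put("session.timeout.ms", "30000");
        props.put("request.timeout.ms", "40000");
        return props;
    }
}
```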
Input Record Type: It contains the following input record types:
CSV: The user can consume CSV data using this option. The ‘Header’ and ‘Separator’ fields are displayed if the user selects the CSV input record type (a parsing sketch follows this list).
Header: In this field, the user can enter the column names of the CSV data consumed from the Kafka topic.
Separator: In this field, the user can enter the separator, such as a comma (,), used in the CSV data.
JSON: The user can consume JSON data using this option.
XML: The user can consume XML data using this option.
AVRO: The user can consume Avro data using this option.
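For the CSV record type, the Header and Separator fields describe how each raw message is split into columns. Below is a minimal, self-contained sketch of such parsing in Java; the class name, header names, and sample record are hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvRecordParser {
    // These play the same role as the component's Header and Separator
    // fields; both values are examples.
    private static final String[] HEADER = {"id", "name", "amount"};
    private static final String SEPARATOR = ",";

    // Turn one raw CSV message consumed from the topic into a
    // column-name -> value map.
    static Map<String, String> parse(String rawValue) {
        String[] cells = rawValue.split(SEPARATOR, -1);
        Map<String, String> row = new LinkedHashMap<>();
        for (int i = 0; i < HEADER.length && i < cells.length; i++) {
            row.put(HEADER[i], cells[i]);
        }
        return row;
    }

    public static void main(String[] args) {
        System.out.println(parse("42,widget,19.99"));
        // prints: {id=42, name=widget, amount=19.99}
    }
}
```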
Security Type: It contains the following security types:
Plain Text: Choose the Plain Text option if the environment is without SSL.
SSL: Choose the SSL option if the environment has SSL. It will display the following fields (a configuration sketch follows this list):
Trust Store Location: Provide the trust store path.
Trust Store Password: Provide the trust store password.
Key Store Location: Provide the key store path.
Key Store Password: Provide the key store password.
SSL Key Password: Provide the SSL key password.
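For reference, these SSL fields correspond one-to-one to the standard Apache Kafka Java client's SSL properties. The sketch below assumes that client; all paths and passwords are placeholders.

```java
import java.util.Properties;

public class SslConsumerConfig {
    static Properties sslProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9093"); // placeholder
        props.put("security.protocol", "SSL");
        // Each property below maps to one of the component's SSL fields:
        props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks"); // Trust Store Location
        props.put("ssl.truststore.password", "truststore-password");              // Trust Store Password
        props.put("ssl.keystore.location", "/etc/kafka/client.keystore.jks");     // Key Store Location
        props.put("ssl.keystore.password", "keystore-password");                  // Key Store Password
        props.put("ssl.key.password", "key-password");                            // SSL Key Password
        return props;
    }
}
```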
Host Aliases: It contains the following fields:
IP: Provide the IP address.
Host Names: Provide the host names.
After completing all the configurations, click the ‘Save Component in Storage’ icon in the configuration panel to save the component.
A notification message appears to inform the user that the component configuration has been saved.