Events [Kafka and Data Sync]
Navigate to the Pipeline Editor page.
Click the Toggle Event Panel icon in the header. The Events Panel appears, and the icon changes to indicate that the panel is displayed.
Click the Add New Event icon from the Event Panel.
The New Event window opens.
Provide a name for the new Event.
Select Event Duration: Choose one of the following options.
* Short (4 hours)
* Medium (8 hours)
* Full Day (24 hours)
* Long (48 hours)
* Week (168 hours)
Please Note: If no duration option is selected, the event data is erased after 7 days and the offsets expire as well.
Provide the No. of Partitions (1-50). By default, the number of partitions is 3.
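The partition count controls how an Event's records are spread out for parallel consumption, while records that share a key keep their relative order. A minimal illustration of key-to-partition routing (Kafka's default partitioner actually uses murmur2 hashing; CRC32 is used here only to keep the sketch dependency-free, and the function name is hypothetical):

```python
import zlib

def pick_partition(key: str, num_partitions: int = 3) -> int:
    """Illustrative key-to-partition mapping (not the product's partitioner).

    Records sharing a key always land in the same partition, which
    preserves per-key ordering.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# The same key maps to the same partition on every call.
print(pick_partition("order-42") == pick_partition("order-42"))  # True
```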
No. of Outputs: Define the number of outputs for the Event.
Check the 'Is Failover?' option to enable the Failover Event.
Select a Pipeline using the drop-down menu if you have selected the ‘Is Shared?’ option.
Click the Add Event option.
A notification message appears.
The newly created Event gets added to the Private tab in the Events panel.
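The duration options above map naturally onto topic retention periods. The following sketch captures that mapping in milliseconds; the option names and the 7-day fallback come from the steps above, while the `retention.ms`-style framing and all identifiers are assumptions about the Kafka backing, not the product's actual implementation:

```python
from typing import Optional

HOUR_MS = 60 * 60 * 1000  # one hour in milliseconds

# Hypothetical mapping of the Event Duration options to retention periods.
EVENT_DURATIONS_MS = {
    "Short": 4 * HOUR_MS,
    "Medium": 8 * HOUR_MS,
    "Full Day": 24 * HOUR_MS,
    "Long": 48 * HOUR_MS,
    "Week": 168 * HOUR_MS,
}

# Per the note above, event data is erased after 7 days if no option is chosen.
DEFAULT_RETENTION_MS = 7 * 24 * HOUR_MS

def retention_ms(option: Optional[str]) -> int:
    """Return the retention period for a duration option, or the 7-day default."""
    return EVENT_DURATIONS_MS.get(option, DEFAULT_RETENTION_MS)
```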
Use the following steps to update a Kafka Event.
Drag the created Event component to the Pipeline Editor canvas.
Click the dragged Event component to open the Basic Info configuration fields.
The user can edit the following fields (except No. of Partitions and Event Duration):
Event Name: Modify the event name.
No. of Outputs: Set the number of outputs (the maximum allowed is 3).
‘Is Failover?’: Enable this option to create a failover event.
Select Pipeline: Select one or more pipelines using the drop-down list.
Click the Save Event icon to save the changes.
A notification message confirms that the pipeline has been updated successfully.
The targeted Event component gets updated.
Kafka events can be flushed to delete all of their records. Flushing retains the topic's offsets by setting the start-offset value to the end-offset. To flush a single event, use the "Flush Event" button beside the respective event in the Events Panel; to flush all events at once, use the "Flush All" button at the top of the panel.
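The offset behavior described above can be modeled in a few lines. This is a toy in-memory model, not the product's implementation; every name in it is hypothetical. It shows why flushed records become unreadable while the offsets themselves survive:

```python
from dataclasses import dataclass, field

@dataclass
class EventTopic:
    """Toy model of an Event's offset bookkeeping (illustrative only)."""
    records: list = field(default_factory=list)
    start_offset: int = 0

    @property
    def end_offset(self) -> int:
        return len(self.records)

    def flush(self) -> None:
        # Flushing keeps the topic and its offsets; it only advances the
        # start offset to the end offset, so existing records become unreadable.
        self.start_offset = self.end_offset

    def readable(self) -> list:
        return self.records[self.start_offset:self.end_offset]

topic = EventTopic(records=["a", "b", "c"])
topic.flush()
print(topic.readable())    # []
print(topic.end_offset)    # 3 -- offsets are retained, not reset
```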
Drag a (reader) component to the canvas.
Configure the parameters of the dragged reader.
Drag the Event from the Events Panel.
Connect the dragged reader component as the input connection to the dragged Event.
Click the Update Pipeline icon to save the pipeline workflow.
Support is available for the following drivers:
ClickHouse
MongoDB
MSSQL
MySQL
Oracle
PostgreSQL
Snowflake
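For orientation, these are typical connection-URL shapes for the listed drivers. The placeholders in angle brackets are illustrative only; the actual connection details are pre-defined by the user on the Settings page, and this dictionary is not part of the product:

```python
# Typical connection-URL shapes for the supported drivers (illustrative
# placeholders; actual values are configured on the Settings page).
SAMPLE_CONNECTION_URLS = {
    "ClickHouse": "jdbc:clickhouse://<host>:8123/<database>",
    "MongoDB":    "mongodb://<user>:<password>@<host>:27017/<database>",
    "MSSQL":      "jdbc:sqlserver://<host>:1433;databaseName=<database>",
    "MySQL":      "jdbc:mysql://<host>:3306/<database>",
    "Oracle":     "jdbc:oracle:thin:@<host>:1521/<service>",
    "PostgreSQL": "jdbc:postgresql://<host>:5432/<database>",
    "Snowflake":  "jdbc:snowflake://<account>.snowflakecomputing.com/?db=<database>",
}
```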
Navigate to the Pipeline Editor page.
Click on the DB Sync tab.
Click the Add New Data Sync (+) icon in the Events Panel.
The Create Data Sync window opens.
Provide a display name for the new DB Sync.
Select the Driver (pre-defined by the user on the Settings page).
Click the Save option.
Drag and drop DB Sync Event to the workflow editor.
Click on the dragged Data Sync component.
The Basic Information tab appears with the following fields:
Display Name: Display name of the Data Sync
Event Name: Event name of the Data Sync
Table Name: Specify the target table name.
Driver: This field is pre-selected.
Primary Key: This field is optional.
Save Mode: Select a save mode from the drop-down.
Click on the Save Data Sync icon to save the DB Sync information.
Connect the dragged DB Sync Event to the reader component as displayed below:
Update and activate the pipeline.
Open the Logs tab to verify that the data has been written to the specified table.
Please Note:
In the Save Mode field, there are two available options:
Append
Upsert: An extra field, Composite Key, is displayed for this save mode.
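The difference between the two save modes can be sketched with an in-memory example. This is a toy illustration of the semantics, not the product's write engine, and the function name and row shapes are hypothetical. In Append mode every incoming row is added; in Upsert mode, rows matching an existing row on the composite key replace it, and the rest are appended:

```python
def apply_save_mode(table, rows, mode, composite_key=None):
    """Toy illustration of the Append and Upsert save modes."""
    if mode == "append":
        # Append: every incoming row is added to the table.
        return table + rows
    if mode == "upsert":
        # Upsert: match on the composite key; replace on match, else append.
        key = lambda r: tuple(r[k] for k in composite_key)
        merged = {key(r): r for r in table}
        for r in rows:
            merged[key(r)] = r
        return list(merged.values())
    raise ValueError(f"unknown save mode: {mode}")

existing = [{"id": 1, "v": "old"}]
incoming = [{"id": 1, "v": "new"}, {"id": 2, "v": "x"}]
print(apply_save_mode(existing, incoming, "upsert", ["id"]))
# [{'id': 1, 'v': 'new'}, {'id': 2, 'v': 'x'}]
```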
The user can manage the Data Sync configuration from the Settings page of the Data Pipeline.