Right-side Panel

Toggle Log Panel

The Toggle Log Panel displays the Logs and Advanced Logs tabs for the Pipeline Workflows.

  • Navigate to the Pipeline Editor page.

  • Click the Toggle Log Panel icon on the Pipeline.

  • The Log panel toggles open, displaying the collective component logs of the pipeline under the Logs tab.

Logs tab displaying the collective logs for a Pipeline

  • Select the Advanced Logs tab to display the status of the component containers.

The Advanced Logs tab displayed inside the Toggle Log panel.
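
The Advanced Logs tab reports the status of the component containers. As a minimal sketch of what such a status check involves, assuming the pipeline components run as Kubernetes pods (an assumption, not something stated by this guide), a listing outside the UI could look like the following; the namespace and label selector are placeholders.

```python
# Sketch: list container/pod statuses, assuming pipeline components run as
# Kubernetes pods. The namespace and label selector are hypothetical.
from kubernetes import client, config

config.load_kube_config()                      # use the local kubeconfig
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(
    namespace="pipelines",                     # placeholder namespace
    label_selector="app=pipeline-component",   # placeholder selector
)
for pod in pods.items:
    print(pod.metadata.name, pod.status.phase)
```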

Object Browser: Search Component in Pipelines

This feature helps the user to search a specific component across all the existing pipelines. The user can drag the required components to the pipeline editor to create a new pipeline workflow.

  • Click the Search Component in Pipelines icon from the header panel of the Pipeline Editor.

Search Component in Pipelines icon

  • The Object Browser window opens displaying all the existing pipeline workflows.

Object Browser window

  • The user can search for a component using the Search Component field.

  • The user gets suggestions while typing the component name.

  • Once the component name is entered, the pipeline workflows containing the searched component get listed below.

  • The user can click the expand/collapse icon to expand the component panel for the selected pipeline.

  • The user can drag a searched component from the Object Browser and drop it onto the Pipeline Editor canvas.
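
The search behaves like a name lookup across the components of every saved pipeline, with suggestions offered while the user types. The sketch below only illustrates that matching behaviour; the pipeline names, component names, and data structure are made up and are not the product's internals.

```python
# Illustrative sketch of searching a component name across pipelines.
# The pipelines and their component lists below are hypothetical.
pipelines = {
    "sales-ingest": ["Kafka Reader", "Python Script", "DB Writer"],
    "iot-stream":   ["MQTT Reader", "Kafka Event", "Elastic Writer"],
}

def suggest(prefix: str) -> list[str]:
    """Return component-name suggestions matching the typed prefix."""
    names = {c for comps in pipelines.values() for c in comps}
    return sorted(n for n in names if n.lower().startswith(prefix.lower()))

def find_pipelines(component: str) -> list[str]:
    """List the pipelines that contain the searched component."""
    return [p for p, comps in pipelines.items() if component in comps]

print(suggest("ka"))                   # ['Kafka Event', 'Kafka Reader']
print(find_pipelines("Kafka Reader"))  # ['sales-ingest']
```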

Toggle Event Panel

The user can access the Toggle Event Panel to create a new Event. The Toggle Event Panel provides two options:

  1. Private (Event)

  2. Data Sync

The Toggle Event Panel

The user can create an Event (Kafka Topic) that can be used to connect two pipeline components.

  • Navigate to the Toggle Event panel.

  • Click the Add New Event icon.

  • The New Event dialog box opens.

  • Provide the required information.

    • Slide the given button to enable the event mapping.

    • Provide a display name for the event (A default name based on the pipeline name appears for the Event).

    • Select the Event Duration from the drop-down menu (It can be set from 4 to 168 hours as per the given options).

    • Number of partitions (you can choose between 1 and 50).

    • Number of outputs (you can choose between 1 and 3; the number of outputs must not exceed the number of partitions). See the Kafka-level sketch after this section for how these settings map to a topic.

    • Enable the Is Failover? option if you wish to create a failover Event.

    • Click the Add Event option to save the new Event.

  • A confirmation message appears.

  • The new Event gets created in the Event Panel.

  • Drag and drop the event from the event panel to the workflow editor.

Dragging an Event to the Workflow Editor

  • You can drag a pipeline component from the Component Panel.

  • Connect the dragged component to the dragged Event to create a pipeline flow of data.

A Pipeline workflow in process
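
Because an Event corresponds to a Kafka topic, the New Event dialog's settings map onto ordinary Kafka topic properties: the partition count and a retention period (the 4 to 168 hour Event Duration). The sketch below shows an equivalent topic creation with the kafka-python admin client; the broker address, topic name, and chosen values are placeholders, and the platform performs this step for you when you click Add Event.

```python
# Sketch: create a Kafka topic with settings comparable to the New Event
# dialog. The broker address, topic name, and values are placeholders.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

event_duration_hours = 24                       # Event Duration (4-168 hours)
topic = NewTopic(
    name="my-pipeline-event",                   # display name of the Event
    num_partitions=3,                           # Number of partitions (1-50)
    replication_factor=1,
    topic_configs={
        # retention.ms is how Kafka expresses the Event Duration.
        "retention.ms": str(event_duration_hours * 60 * 60 * 1000),
    },
)
admin.create_topics([topic])
```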

Data Sync

The user can read data directly with a reader component and write it to a Data Sync (an illustrative sketch of this flow is given after the steps below).

  • The user can add a new DB Sync from the Toggle Event panel to the Workflow Editor by clicking the ‘+’ icon.

  • Specify the display name and connection ID, and click Save.

  • Drag and drop the DB Sync from the Event panel to the Workflow Editor.

Please Note: Click the Kafka Event and Data Sync options to get redirected to more details on these topics.
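
Conceptually, a Data Sync lets a reader's output be written straight to a target store identified by a connection. The sketch below only illustrates that read-then-write flow, consuming records from a Kafka event and inserting them into a local SQLite table; the topic name, connection target, and table are placeholders rather than the platform's internals.

```python
# Illustrative sketch of a reader-to-Data-Sync flow: consume records from a
# Kafka topic and write them to a database. All names are placeholders.
import json
import sqlite3
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "my-pipeline-event",                       # the Event the reader consumes
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

db = sqlite3.connect("sync_target.db")         # stands in for the connection ID
db.execute("CREATE TABLE IF NOT EXISTS records (payload TEXT)")

for message in consumer:
    db.execute("INSERT INTO records (payload) VALUES (?)",
               (json.dumps(message.value),))
    db.commit()
```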

Global Variables

The user can create a Global Variable by following the steps given below:

  • Click the Global Variable icon from the User Interface.

  • The Global Variables panel opens.

  • Click the Add New Variable option from the Global Variables panel.

  • The Add Variable window opens.

  • Insert Variable name.

  • Provide Value for the Variable.

  • Click the Save option.

  • A notification message appears.

  • The Global Variable gets created and added to the panel.
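
A Global Variable is simply a named value that pipelines can reference wherever it is needed. The sketch below is a hypothetical illustration of that substitution using a plain key-value map and Python string templates; the variable names and the $name reference syntax are assumptions, not the platform's own syntax.

```python
# Hypothetical illustration of global variables as reusable name/value pairs.
from string import Template

global_variables = {
    "api_base_url": "https://example.com/api",   # Variable name -> Value
    "batch_size": "500",
}

# A component setting that refers to the variables. The $name placeholders
# are illustrative templating syntax, not the platform's reference syntax.
component_config = Template("GET $api_base_url/orders?limit=$batch_size")
print(component_config.substitute(global_variables))
# -> GET https://example.com/api/orders?limit=500
```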
