Pipeline Toolbar

Header menu panel on the Pipeline Workflow Editor page.

The Pipeline Editor contains a Toolbar at the top right side of the page with various options that can be applied to a Pipeline workflow.

Pipeline Toolbar

This page explains the various functions provided on the Pipeline toolbar.

Toggle Log Panel

The Toggle Log Panel displays the Logs and Advanced Logs tabs for the Pipeline Workflows.

  • Navigate to the Pipeline Editor page.

  • Click the Toggle Log Panel icon on the Pipeline toolbar.

  • The Log panel opens, displaying the collective component logs of the pipeline under the Logs tab.

Logs tab displaying the collective logs for a Pipeline
  • Select the Advanced Logs tab to display the status of the component containers.

The Advanced Logs tab displayed inside the Toggle Log Panel.

Object Browser: Search Component in Pipelines

This feature helps the user search for a specific component across all the existing pipelines. The user can then drag the required components to the Pipeline Editor to create a new pipeline workflow.

  • Click the Search Component in Pipelines icon from the header panel of the Pipeline Editor.

Search Component in Pipelines icon
  • The Object Browser window opens displaying all the existing pipeline workflows.

Object Browser window
  • The user can search for a component using the Search Component field.

  • The user gets suggestions while typing a component name.

  • Once the component name is entered, the pipeline workflows containing the searched component get listed below.

  • The user can click the expand/collapse icon to expand the component panel for the selected pipeline.

  • The user can drag a searched component from the Object Browser and drop it onto the Pipeline Editor canvas.

Toggle Event Panel

The user can access the Toggle Event Panel to create a new Event.

The Toggle Event Panel provides two options:

  1. Private (Event/Kafka Topic)

  2. DB Sync

The Toggle Event Panel

Private (Event)

The user can create an Event (Kafka Topic) that can be used to connect two pipeline components.

  • Navigate to the Toggle Event Panel.

  • Click the Add New Event icon.

Accessing the Add New Event icon from the Event panel
  • The New Event dialog box opens.

  • Provide the required information.

    • Slide the given button to enable the event mapping.

    • Provide a display name for the event (A default name based on the pipeline name appears for the Event).

    • Select the Event Duration from the drop-down menu (It can be set from 4 to 168 hours as per the given options).

    • Select the Number of partitions (from 1 to 50).

    • Select the Number of outputs (from 1 to 3); the number of outputs must not exceed the number of partitions (see the topic sketch at the end of this section).

    • Enable the Is Failover? option if you wish to create a failover Event.

    • Click the Add Event option to save the new Event.

  • A confirmation message appears.

  • The new Event gets created and added to the Event Panel.

  • Drag and drop the Event from the Event Panel to the workflow editor.

Dragging an Event to the Workflow Editor
  • You can drag a pipeline component from the Component Panel.

  • Connect the dragged component to the dragged Event to create a pipeline flow of data.

A Pipeline workflow in process
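
Since a Private Event corresponds to a Kafka topic, the settings in the New Event dialog map to standard Kafka topic properties. The following is only an illustrative sketch, assuming direct access to the underlying Kafka cluster via the kafka-python library; the broker address, topic name, and configuration values are placeholders, and the Pipeline Editor normally performs this step for you when you click Add Event.

```python
# Illustrative sketch only: the Pipeline Editor creates the Event for you.
# Broker address, topic name, and values below are placeholder assumptions.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="kafka-broker:9092")

event_topic = NewTopic(
    name="sample_pipeline_event",     # display name from the New Event dialog
    num_partitions=3,                 # Number of partitions (1 to 50)
    replication_factor=1,
    topic_configs={
        # Event Duration (4 to 168 hours) corresponds to the topic retention period
        "retention.ms": str(24 * 60 * 60 * 1000),
    },
)

admin.create_topics([event_topic])
```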

DB Sync

The user can read data directly with a reader component and write it to a DB Sync.

  • The user can add a new DB Sync from the Toggle Event Panel by clicking the ‘+’ icon.

  • Specify the display name and connection ID, and click Save.

  • Drag and drop the DB Sync from the Event Panel to the Workflow Editor.

Global Variables

The user can create a Global Variable by following the below-given steps:

  • Click the Global Variable icon from the User Interface.

  • The Global Variables panel opens.

  • Click the Add New Variable icon from the Global Variables panel.

  • The Add Variable window opens.

  • Insert a Variable name.

  • Provide a Value for the Variable.

  • Click the Save option.

  • A notification message appears.

  • The Global Variable gets created and added to the panel.

Activate/Deactivate Pipeline

The user can activate or deactivate the pipeline by clicking the Activate/Deactivate icon. Activation deploys all the components based on their respective invocation types. When the pipeline is deactivated, all the components go down, and the process halts.

Update Pipeline

Clicking the Update icon saves the pipeline. It is recommended to update the pipeline every time you make changes in the workflow editor.

On a successful update of the pipeline, you get a notification message.

Please Note: On any failure, the user gets a notification through an error message.

Full Screen

The Full Screen icon presents the Pipeline Editor page in the full screen.

  • Navigate to the Pipeline Workflow Editor page.

  • Click the Full Screen icon from the toolbar.

  • The Pipeline Workflow Editor opens in full screen, and the icon changes to the exit full screen icon.

Failure Analysis

Failure Analysis is a central failure mechanism that helps the user identify the reason for a failure. Failures of any pipeline are stored at a particular location (collection). From there, you can query your failed data in the Failure Analysis UI. It displays the failed records along with the cause, event time, and pipelineId.

  • Navigate to the Pipeline Editor page.

  • Click the Failure Analysis icon.

  • The Failure Analysis page opens.

  • Search Component: A Search Bar is provided to search all the components associated with the pipeline. It helps to find a specific component by entering its name in the Search Bar.

  • Component Panel: It displays all the components associated with that pipeline.

  • Filter: By default, the instance Id of the selected component is displayed in the filter field, and records are displayed based on that instanceid. The failure data gets filtered based on the applied filter.

Please Note the filter format for some of the field value types:

  Field Value Type      Filter Format
  String                data.data_desc:"ignition"
  Integer               data.data_id:35
  Float                 data.lng:95.83467601
  Boolean               data.isActive:true

  • Project: By default, the pipeline_Id and _id fields are selected from the records. To include or exclude any other field, set that field to 0 or 1 (0 to exclude, 1 to include); the selected columns are then displayed.

Please Note: For example, data.data_id:0 excludes the data_id field, while data.data_desc:1 includes the data_desc field.

  • Sort: By default, records are displayed in descending order based on the _id field. Users can switch to ascending order by choosing the Ascending option.

  • Limit: By default, 10 records are displayed. Users can modify the record limit as required. The maximum limit is 1000.

  • Find: Clicking the Find button filters, sorts, and limits the records and projects the selected fields (a query sketch illustrating these options follows this list).

  • Reset: Clicking the Reset button resets all the fields to their default values.

  • Cause: Clicking on any failed record displays the cause of the failure.
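
The Filter, Project, Sort, and Limit fields follow MongoDB-style query semantics. The following is only an assumption-based sketch of an equivalent query using pymongo; the connection URI, database, collection name, and field values are placeholders, since the actual failure collection is managed by the platform and queried through the Failure Analysis UI.

```python
# Illustrative sketch only: the Failure Analysis UI runs this kind of query for you.
# Connection details, collection name, and field values are placeholder assumptions.
from pymongo import MongoClient, DESCENDING

client = MongoClient("mongodb://localhost:27017")
failures = client["pipeline_db"]["failure_records"]   # hypothetical failure collection

cursor = (
    failures.find(
        {"instanceid": "component-instance-id", "data.data_id": 35},  # Filter
        {"pipeline_Id": 1, "data.data_desc": 1},                      # Project (1 = include)
    )
    .sort("_id", DESCENDING)   # Sort: descending on _id by default
    .limit(10)                 # Limit: 10 by default, up to 1000
)

for failed_record in cursor:   # Find: fetch and display the matching failed records
    print(failed_record)
```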

Delete Pipeline

This function helps to remove a pipeline.

  • Navigate to the Pipeline Workflow Editor page.

  • Click the Delete icon.

  • A dialog box opens to confirm the deletion.

  • Select the YES option.

  • A notification message appears.

  • The selected pipeline gets removed from the Pipeline List.
