
Event Panel


Last updated 2 years ago

The user can access the Event Panel to create a new Event. The Toggle Event Panel offers two options:

  1. Private (Event/Kafka Topic)

  2. Data Sync

Private (Event)

The user can create an Event (Kafka Topic) that can be used to connect two pipeline components.

  • Navigate to the Pipeline Editor page.

  • Click the Event Panel icon.

  • The Event panel opens.

  • Click the Add New Event icon.

  • The New Event dialog box opens.

  • Enable the Event Mapping option if you want to map the Event (slide the given button to enable it).

  • Provide the required information:

    • Provide a display name for the Event (a default name based on the pipeline name appears for the Event).

    • Select the Event Duration from the drop-down menu (it can be set from 4 to 168 hours as per the given options).

    • Select the number of partitions (from 1 to 50).

    • Select the number of outputs (from 1 to 3; the number of outputs must not exceed the number of partitions).

    • Enable the Is Failover? option if you wish to create a failover Event.

  • Click the Add Event option to save the new Event.

  • A confirmation message appears.

  • The new Event gets created and added to the Event Panel.

  • Drag and drop the Event from the Event Panel to the workflow editor.

  • Connect the dragged component to the Event to create a flow of data through the pipeline.
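Since an Event is a Kafka topic under the hood, the dialog's settings correspond to standard topic parameters. The sketch below is illustrative only (the function name `build_event_config` and the exact field mapping are assumptions, not the product's API); it shows how the dialog's constraints — a duration of 4 to 168 hours, 1 to 50 partitions, and outputs not exceeding partitions — might be validated and translated into Kafka topic settings:

```python
# Hypothetical sketch: translate Event Panel settings into a Kafka-style
# topic configuration. Names and mapping are illustrative, not the
# product's actual API.

def build_event_config(display_name: str,
                       duration_hours: int,
                       partitions: int,
                       outputs: int,
                       is_failover: bool = False) -> dict:
    """Validate Event Panel inputs and return a topic configuration."""
    if not 4 <= duration_hours <= 168:
        raise ValueError("Event Duration must be between 4 and 168 hours")
    if not 1 <= partitions <= 50:
        raise ValueError("Number of partitions must be between 1 and 50")
    if not 1 <= outputs <= 3:
        raise ValueError("Number of outputs must be between 1 and 3")
    if outputs > partitions:
        raise ValueError("Outputs must not exceed the number of partitions")

    return {
        "topic": display_name,
        "num_partitions": partitions,
        "outputs": outputs,
        "is_failover": is_failover,
        # Kafka expresses topic retention in milliseconds.
        "config": {"retention.ms": str(duration_hours * 60 * 60 * 1000)},
    }


if __name__ == "__main__":
    cfg = build_event_config("my-pipeline-event", duration_hours=24,
                             partitions=3, outputs=2)
    print(cfg["config"]["retention.ms"])  # 24 h -> "86400000"
```

Note how the outputs-versus-partitions check mirrors the rule stated in the dialog: each output consumes from at least one partition, so there can never be more outputs than partitions.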

Data Sync

The user can read data with a Reader component and write it directly to a Data Sync.

  • The user can add a new Data Sync from the Toggle Event Panel by clicking the '+' icon.

  • Specify the display name and Connection ID, then click Save.

  • Drag and drop the Data Sync from the Event Panel to the workflow editor.
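A Data Sync entry only needs two inputs, a display name and the Connection ID of the target store. As a rough illustration (the class and field names below are assumptions, not the product's schema), it can be modeled as a small record with the same required-field validation the Save button enforces:

```python
# Hypothetical sketch of a Data Sync entry: a display name plus the
# Connection ID of the target store. Field names are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class DataSync:
    display_name: str
    connection_id: str

    def __post_init__(self):
        # Both fields are mandatory before the entry can be saved.
        if not self.display_name.strip():
            raise ValueError("Display name is required")
        if not self.connection_id.strip():
            raise ValueError("Connection ID is required")


sync = DataSync(display_name="orders-sync", connection_id="conn-001")
print(sync.display_name)  # orders-sync
```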

You can drag a pipeline component from the Component Panel.

Please Note: Refer to the Events [Kafka and Data Sync] page for more details on the DB Sync topic provided under the Connection Components section of this document.
(Image: The Toggle Event Panel)
(Image: Accessing the Add New Event icon from the Event Panel)