Connecting Components

An event-driven architecture uses events to trigger and communicate between decoupled services and is common in modern applications built with microservices.

The connecting components assemble the various pipeline components into a Pipeline Workflow. Click and drag the component you want to use onto the editor canvas, then connect the component's output to an event/topic.

Once a Pipeline is created, the Data Pipeline user interface provides a canvas on which the user builds the data flow (Pipeline Workflow).

The Pipeline assembly process can be divided into two parts:

  1. Adding Components to the Canvas

  2. Adding Connecting Components (Events) to create the Data flow/ Pipeline workflow

All components inside a pipeline are fully decoupled. Each component acts as both a producer and a consumer of data. The design is based on event-driven process orchestration.

To pass the output of one component to another, we need an intermediary event.

An event-driven architecture contains three items:

  • Event Producer [Components]

  • Event Stream [Event (Kafka topic/ DB Sync)]

  • Event Consumer [Components]
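The three roles above can be sketched in a few lines of Python. This is a minimal illustration only: an in-memory queue stands in for the event stream (a Kafka topic or DB Sync in the real pipeline), and the component names are hypothetical, not part of the product's API.

```python
from queue import Queue

# In-memory stand-in for an event stream (e.g. a Kafka topic).
event_stream = Queue()

def producer_component(records):
    """Event producer: publishes each output record to the event stream."""
    for record in records:
        event_stream.put(record)

def consumer_component():
    """Event consumer: reads records from the event stream."""
    results = []
    while not event_stream.empty():
        results.append(event_stream.get())
    return results

# The two components never reference each other; they only share the stream,
# which is what keeps them fully decoupled.
producer_component(["record-1", "record-2"])
print(consumer_component())
```

Because each component talks only to the event stream, either side can be replaced or scaled independently, which is the property the pipeline workflow relies on.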
