# Connecting Components

The connecting components assemble the individual pipeline components into a Pipeline Workflow. Click and drag the component you want to use onto the editor canvas, then connect the component's output to an event/topic.

![An Outlook of the Pipeline Editor](https://972575688-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRYq1HgffNfbnIMWPu1D5%2Fuploads%2FsrKScfIhGrbj5VgrTQAD%2Fimage.png?alt=media\&token=5303b2eb-c560-4ebb-a963-ce81238f724f)

Once a Pipeline is created, the Data Pipeline user interface provides a canvas on which the user builds the data flow (Pipeline Workflow).

The Pipeline assembling process can be divided into two parts:

1. Adding Components to the Canvas
2. Adding Connecting Components (Events) to create the data flow / Pipeline Workflow

Components inside a pipeline are fully decoupled: each component acts as both a producer and a consumer of data. The design is based on event-driven process orchestration.

To pass the output of one component to another, an intermediary event is required.

An event-driven architecture contains three items:

* Event Producer \[Components]
* Event Stream \[Event (Kafka topic / DB Sync)]
* Event Consumer \[Components]
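The three roles above can be sketched in a few lines of Python. This is a minimal, illustrative model only: an in-memory queue stands in for the Event Stream (a Kafka topic or DB Sync in the actual product), and the component names are hypothetical, not part of the Data Pipeline API.

```python
import queue

# Stand-in for the Event Stream (e.g. a Kafka topic); illustrative only.
event_stream = queue.Queue()

def producer_component(records):
    """Event Producer: publishes each record to the event stream."""
    for record in records:
        event_stream.put(record)

def consumer_component():
    """Event Consumer: drains the event stream and processes each record."""
    results = []
    while not event_stream.empty():
        results.append(event_stream.get().upper())  # trivial "processing" step
    return results

producer_component(["order_created", "order_paid"])
print(consumer_component())  # → ['ORDER_CREATED', 'ORDER_PAID']
```

Because the producer and consumer only touch the shared stream, neither needs to know the other exists; this is the decoupling that lets pipeline components be rearranged freely on the canvas.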

![A Sample Pipeline Workflow ](https://972575688-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRYq1HgffNfbnIMWPu1D5%2Fuploads%2F2ixdqPUmRuIHHZsrBNF6%2Fimage.png?alt=media\&token=5df10ad2-44f9-42d6-b245-36f59bdca77f)
