DB Writer

The component's configuration fields are broadly classified into a few sections. Please check out the demonstration below to see how the component is configured.

Configuring the DB Writer Component as a part of Pipeline Workflow

Drivers Available

  • MySQL

  • Oracle

  • PostgreSQL

  • MS-SQL

  • ClickHouse

  • Snowflake
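
Each of these drivers is reached through a JDBC connection URL. The typical URL shapes are sketched below; the host names, ports, account names, and database names are illustrative placeholders, not values from the product.

```python
# Typical JDBC URL patterns for the supported drivers.
# Hosts, ports, and database names below are illustrative placeholders.
JDBC_URL_PATTERNS = {
    "MySQL":      "jdbc:mysql://db-host:3306/mydb",
    "Oracle":     "jdbc:oracle:thin:@db-host:1521/ORCLPDB1",
    "PostgreSQL": "jdbc:postgresql://db-host:5432/mydb",
    "MS-SQL":     "jdbc:sqlserver://db-host:1433;databaseName=mydb",
    "ClickHouse": "jdbc:clickhouse://db-host:8123/mydb",
    "Snowflake":  "jdbc:snowflake://myaccount.snowflakecomputing.com/?db=mydb",
}

for driver, url in JDBC_URL_PATTERNS.items():
    print(f"{driver}: {url}")
```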

Save Modes

The RDBMS writer supports 3 save modes:

Append

As the name suggests, this mode appends all incoming records to the target table without any validation.

Overwrite

This mode truncates the table and inserts fresh records; after every run, the table contains only the records from that batch.
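
The difference between the two modes can be sketched with a small SQLite example (SQLite stands in for the target RDBMS here, and the table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, name TEXT)")

def write_batch(rows, mode):
    """Write a batch in either 'append' or 'overwrite' mode."""
    if mode == "overwrite":
        conn.execute("DELETE FROM events")  # truncate before loading
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)

write_batch([(1, "a"), (2, "b")], mode="append")
write_batch([(3, "c")], mode="append")     # table now holds 3 rows
write_batch([(4, "d")], mode="overwrite")  # table now holds only this batch

count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # 1
```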

Upsert

This mode inserts new records into a table and updates records that already exist. To configure it, you must provide the Composite Key.

The BDB Data Pipeline supports composite-key-based upsert; to specify a composite key, list the key columns separated by commas, e.g., key1, key2. The component also offers an option to upload a Spark schema. Providing a schema can greatly improve the speed of the write operation, since the component skips schema inference and uses the supplied schema instead.
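
Upsert semantics with a composite key (key1, key2) can be sketched the same way; here a SQLite UNIQUE constraint plays the role of the composite key, and the table is purely illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE metrics (
        key1  TEXT,
        key2  TEXT,
        value INTEGER,
        UNIQUE (key1, key2)  -- composite key: key1, key2
    )
""")

def upsert(rows):
    # Insert new records; on a composite-key collision, update the existing row.
    conn.executemany(
        """INSERT INTO metrics (key1, key2, value) VALUES (?, ?, ?)
           ON CONFLICT (key1, key2) DO UPDATE SET value = excluded.value""",
        rows,
    )

upsert([("a", "x", 1), ("a", "y", 2)])
upsert([("a", "x", 10)])  # updates the ("a", "x") row instead of duplicating it

rows = conn.execute(
    "SELECT key1, key2, value FROM metrics ORDER BY key2").fetchall()
print(rows)  # [('a', 'x', 10), ('a', 'y', 2)]
```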

Spark Schema upload
  • Query: In this field, provide the DDL that creates the target table in the database where the in-event data is to be written. For an example, refer to the image below:
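
A sketch of the kind of DDL that might go in the Query field is shown below. The table and column names are hypothetical, and the exact SQL dialect depends on the driver selected; the snippet smoke-tests the statement against an in-memory SQLite database only for illustration.

```python
import sqlite3

# Hypothetical DDL for the Query field; the column names and types are
# made up, and the real dialect depends on the target database.
ddl = """
CREATE TABLE IF NOT EXISTS customer_orders (
    order_id   INTEGER,
    customer   VARCHAR(100),
    amount     DECIMAL(10, 2),
    created_at TIMESTAMP
)
"""

conn = sqlite3.connect(":memory:")
conn.execute(ddl)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)  # ['customer_orders']
```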
