Data Ingestion
Requirement: The Data Platform should have out-of-the-box capability (connectors) for integrating/interfacing with cloud-hosted and on-premises manufacturing enterprise systems, manufacturing tools/applications, and external third-party systems.
BDB Response: The Data Center module is a core module of the BDB Platform that addresses data ingestion activities; it provides features to extract data and to create data services and data stores for further data analysis.
High-Level Feature Summary of BDB Data Center
Connect to a variety of data sources through pre-built data connectors, including standard data warehouses, social media platforms, CRM applications, third-party APIs, etc.
Share, edit, and remove data connectors at any time.
Build data sets on top of a data connector using basic SQL-style queries.
Publish data sets as web services for easy ingestion into other modules, for example as an input to the data visualization module (BDB Dashboard Designer), Predictive, or data cleansing. Generate XML code for debugging purposes.
Share, edit, and remove data sets/services at any time.
Create data stores and data store metadata on Elastic, define drill hierarchies and user restrictions, and create instant visualizations.
Requirement: Data ingestion should support connecting to data sources using methods including, but not limited to, DB connections (JDBC/ODBC), APIs, data replication, event brokers, managed file transfer, and ETL/ELT processes.
BDB Response: The BDB Platform supports a wide range of protocols such as HTTP, HTTPS, FTP, SMTP, ODBC, JDBC, AMQP, and JMS, as well as APIs, event brokers, managed file transfer, and ETL/ELT processes.
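As a generic illustration of the DB-connection ingestion path mentioned above, the sketch below uses Python's DB-API, with the standard-library sqlite3 driver standing in for a JDBC/ODBC source; the table and column names are hypothetical and are not part of the BDB Platform's API.

```python
import sqlite3

def ingest_rows(conn, query):
    """Pull rows from a source DB connection and yield them as dicts."""
    cur = conn.execute(query)
    cols = [d[0] for d in cur.description]
    for row in cur:
        yield dict(zip(cols, row))

# sqlite3 stands in for a JDBC/ODBC source; the schema is hypothetical.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sensor_readings (id INTEGER, value REAL)")
src.executemany("INSERT INTO sensor_readings VALUES (?, ?)",
                [(1, 20.5), (2, 21.0)])

rows = list(ingest_rows(src, "SELECT id, value FROM sensor_readings"))
```

The same pattern applies to any DB-API-compatible driver (e.g. an ODBC bridge); only the connection call changes.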
Requirement: The Data Platform should support ingestion of data (structured, unstructured, semi-structured) of any size and format without prior schema definition, data transformation, or modeling.
BDB Response: BDB Data Pipeline is an event-based, serverless architecture deployed on a Kubernetes cluster, with Kafka-based communication to handle real-time and batch data (structured, unstructured, semi-structured), enabling seamless insights from massive amounts of data. Pipelines are created through a drag-and-drop interface, and models generated from ‘DS Lab’ and ‘Data Preparation’ can be embedded inside the workflow.
With BDB Data Pipeline, you can define data-driven workflows to gain insights and visualizations for better decision-making.
Requirement: The ingestion process shall support data arriving at different speeds, from real-time event streaming to scheduled batch import.
BDB Response: BDB's Data Pipeline is an event-based, serverless architecture that can handle any type of data: continuous or asynchronous, real-time, batched, or both. Data may range from UI activities, logs, performance events, sensor data, emails, and social media to organizational documents. BDB's Lambda architecture saves users from the nitty-gritty of data interaction and facilitates smooth data ingestion; users only need to specify the invocation type, i.e., whether the data is real-time or batch. BDB Data Pipeline supports basic and advanced data transformations through built-in components and integrated Data Preparation scripts to enhance data insight discovery.
Requirement: Support incremental (Change Data Capture) and ad-hoc data ingestion capabilities.
BDB Response: Yes, BDB Data Pipeline supports incremental (Change Data Capture) and ad-hoc data ingestion. Any system that provides change data capture is supported by the BDB Platform.
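The incremental pattern can be illustrated with a high-water-mark query: each pull fetches only rows newer than the last-seen watermark. This is a common CDC-style technique shown for illustration, not a description of BDB's internal implementation; the `events` table and column names are hypothetical.

```python
import sqlite3

def incremental_pull(conn, last_seen_id):
    """CDC-style pull: fetch only rows newer than the watermark."""
    cur = conn.execute(
        "SELECT id, payload FROM events WHERE id > ? ORDER BY id",
        (last_seen_id,))
    rows = cur.fetchall()
    # Advance the watermark to the newest row seen, if any.
    new_watermark = rows[-1][0] if rows else last_seen_id
    return rows, new_watermark

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
src.executemany("INSERT INTO events VALUES (?, ?)",
                [(1, "a"), (2, "b"), (3, "c")])

first, wm = incremental_pull(src, 0)   # initial full load
src.execute("INSERT INTO events VALUES (4, 'd')")
delta, wm = incremental_pull(src, wm)  # picks up only the new row
```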
Requirement: Provide supported capabilities for the creation and execution of batch ingestion.
BDB Response: Batch ingestion is created and executed in BDB Data Pipeline by setting the invocation type to batch. The same event-based, serverless architecture handles batched data alongside continuous and real-time streams, and basic and advanced data transformations are available through built-in components and integrated Data Preparation scripts to enhance data insight discovery.
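Batch creation and execution can be sketched generically as splitting a source iterator into fixed-size batches that a scheduler then loads one at a time. The function and parameter names below are illustrative assumptions, not the BDB Platform's API.

```python
from itertools import islice
from typing import Callable, Iterable, Iterator, List

def batches(source: Iterable, size: int) -> Iterator[List]:
    """Split a (possibly unbounded) source into fixed-size batches."""
    it = iter(source)
    while chunk := list(islice(it, size)):
        yield chunk

def execute_batch_ingestion(source: Iterable, size: int,
                            load: Callable[[List], None]) -> int:
    """Create batches from the source, run the load step per batch,
    and return the number of batches executed."""
    count = 0
    for chunk in batches(source, size):
        load(chunk)  # e.g. a bulk insert into the target data store
        count += 1
    return count

sink: list = []
n = execute_batch_ingestion(range(10), 4, sink.extend)
```

A scheduler (cron, or a pipeline's batch invocation) would call `execute_batch_ingestion` at each scheduled import window.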