Requirement: Support out-of-the-box capabilities to manage, monitor, and run the data platform.
BDB Response: The BDB Data Pipeline module has an integrated pipeline monitoring section that presents key performance statistics such as memory and CPU utilisation, number of records processed, allocated CPU and memory, last processed count, total number of records processed, last processed record size, and number of instances, each tied to the component names associated with the selected pipeline. Alongside these key parameters, a component log is also available.
In addition to the pipeline monitoring facility, the data pipeline lets you configure failover events for every available component.
Users can create dedicated Kafka topics to handle failures, drag and drop them onto the canvas, and map them to components. Any failures or errors that occur within a component during data processing are registered in the corresponding failover events.
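The failover-event pattern described above can be sketched as follows. This is a minimal illustration, not the platform's actual implementation: an in-memory dict stands in for real Kafka failover topics, and the component and topic names are hypothetical.

```python
# Sketch of the failover-event pattern: records that fail in a component
# are routed to a dedicated failover topic instead of being dropped.
# The in-memory dict below stands in for real Kafka topics; all names
# are illustrative, not actual BDB identifiers.

failover_topics = {"orders_enricher_failover": []}

def process_record(record, failover_topic):
    """Process one record; on error, publish it to the failover topic."""
    try:
        if "id" not in record:
            raise ValueError("missing id")
        return {"id": record["id"], "status": "ok"}
    except ValueError as err:
        failover_topics[failover_topic].append(
            {"record": record, "error": str(err)}
        )
        return None

results = [process_record(r, "orders_enricher_failover")
           for r in [{"id": 1}, {"bad": True}, {"id": 3}]]
```

In a real deployment the append would be a publish to the mapped Kafka failover topic, so failed records remain available for inspection and replay.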
Data auditing can be implemented using the monitoring facility together with the failover events: users can verify the count of records processed successfully against the number of records available for processing.
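The count-reconciliation check described above can be sketched as a simple comparison. The counts and field names here are illustrative only:

```python
# Sketch of the audit check: compare records processed successfully
# against records available for processing, flagging any shortfall.
# The numbers used are illustrative.

def audit_counts(available, processed):
    """Return an audit summary; a nonzero 'missing' warrants review."""
    missing = available - processed
    return {
        "available": available,
        "processed": processed,
        "missing": missing,
        "passed": missing == 0,
    }

summary = audit_counts(available=10_000, processed=9_997)
```

In practice the "processed" figure would come from the pipeline monitoring statistics and the shortfall would be cross-checked against the records registered in the failover topics.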
The BDB Data Platform can also be integrated with third-party monitoring tools such as Prometheus and Datadog to continuously monitor pipeline components, Kafka topics, published APIs, and mapped data sources; alerts can be configured as well.
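To illustrate what a Prometheus integration of this kind consumes, the sketch below renders pipeline statistics in the Prometheus text exposition format. The metric and label names are hypothetical, not the platform's actual metric names:

```python
# Sketch: render pipeline statistics in the Prometheus text exposition
# format, as a scrape endpoint would expose them. Metric and label
# names are hypothetical.

def to_prometheus(pipeline, stats):
    """Format each stat as: name{pipeline="..."} value"""
    lines = []
    for name, value in stats.items():
        lines.append(f'bdb_pipeline_{name}{{pipeline="{pipeline}"}} {value}')
    return "\n".join(lines)

text = to_prometheus("sales_ingest", {
    "records_processed_total": 120000,
    "cpu_utilisation_percent": 42.5,
})
```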
BDB Platform logs can be forwarded to third-party monitoring tools and the desired alerts configured; for example, Zabbix can be used for server health monitoring.
Requirement: Provide performance monitoring metrics, not limited to CPU usage and network utilization, via a dashboard.
BDB Response: The BDB Data Pipeline module has an integrated pipeline monitoring section that presents key performance statistics such as memory and CPU utilization, number of records processed, allocated CPU and memory, last processed count, total number of records processed, last processed record size, and number of instances, each tied to the component names associated with the selected pipeline. Alongside these key parameters, a component log is also available.
In addition to the pipeline monitoring facility, the data pipeline lets you configure failover events for every available component.
Users can create dedicated Kafka topics to handle failures, drag and drop them onto the canvas, and map them to components. Any failures or errors that occur within a component during data processing are registered in the corresponding failover events.
Data auditing can be implemented using the monitoring facility together with the failover events: users can verify the count of records processed successfully against the number of records available for processing.
The BDB Data Platform can also be integrated with third-party monitoring tools such as Prometheus and Datadog to continuously monitor pipeline components, Kafka topics, published APIs, and mapped data sources; alerts can be configured as well.
Requirement: Mechanism to detect and trigger events and alerts as part of monitoring capabilities.
BDB Response: Yes, the BDB Data Pipeline provides a mechanism to detect and trigger events and alerts as part of its monitoring capabilities. The BDB Platform also provides capabilities to integrate with third-party monitoring tools such as Datadog and Zabbix.
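A threshold-based trigger of the kind such monitoring typically uses can be sketched as below. The metric name and threshold are illustrative, not the platform's actual defaults:

```python
# Sketch of a threshold-based alert trigger: evaluate a metric
# against a limit and emit an alert event when it is breached.
# Metric names and threshold values are illustrative only.

def evaluate_alert(metric, value, threshold):
    """Return an alert event if the metric breaches its threshold, else None."""
    if value <= threshold:
        return None
    return {"metric": metric, "value": value,
            "threshold": threshold, "severity": "warning"}

alert = evaluate_alert("cpu_utilisation_percent", 91.0, threshold=80.0)
```

A real integration would forward the emitted event to the configured channel (for example a Datadog monitor or a Zabbix trigger) rather than returning it to the caller.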
Requirement: Ability to collect and aggregate system logs for analysis and insights.
BDB Response: Yes, the BDB Platform provides the ability to collect and aggregate system logs for analysis and insights.
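Log aggregation of this kind can be sketched as collecting lines from several components and summarising them by severity. The log line format shown is illustrative, not the platform's actual format:

```python
# Sketch of log collection and aggregation: parse log lines from
# several components and summarise counts per severity level.
# The "<component> <LEVEL> <message>" format is illustrative.

from collections import Counter

def aggregate_logs(lines):
    """Count log entries by severity across all components."""
    levels = Counter()
    for line in lines:
        parts = line.split(maxsplit=2)
        if len(parts) >= 2:
            levels[parts[1]] += 1
    return levels

logs = [
    "reader INFO started",
    "writer ERROR connection refused",
    "writer INFO retrying",
]
summary = aggregate_logs(logs)
```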
Requirement: Monitor inbound and outbound API calls.
BDB Response: Yes, the BDB Platform's Data Pipeline offers a drag-and-drop console for creating pipeline workflows and for monitoring events, logs, and alerts through a single panel. It consists of a variety of reader, writer, transformation, ingestion, ML, and WebSocket components.
The BDB Data Platform can also be integrated with third-party monitoring tools such as Prometheus and Datadog to continuously monitor pipeline components, Kafka topics, published APIs, and mapped data sources; alerts can be configured as well.
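Monitoring of inbound and outbound API calls can be sketched as a small tally keyed by direction, endpoint, and status. The endpoint paths are hypothetical:

```python
# Sketch of inbound/outbound API call monitoring: a counter keyed by
# direction, endpoint, and HTTP status for each call made or received.
# Endpoint paths are hypothetical.

from collections import defaultdict

api_calls = defaultdict(int)

def record_api_call(direction, endpoint, status):
    """Tally one API call by direction, endpoint, and HTTP status."""
    api_calls[(direction, endpoint, status)] += 1

record_api_call("inbound", "/api/v1/ingest", 200)
record_api_call("inbound", "/api/v1/ingest", 200)
record_api_call("outbound", "/warehouse/load", 500)
```

These tallies are the kind of per-endpoint counters a tool such as Prometheus would scrape to drive dashboards and alerts on error rates.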
Requirement: Create audit trails for resource usage.
BDB Response: Yes, data auditing can be implemented using the monitoring facility together with the failover events. Users can verify the count of records processed successfully against the number of records available for processing.
The BDB Data Pipeline module also provides a log panel, including an advanced log panel, which shows whether each component is up.
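An audit trail for resource usage, as the requirement asks for, can be sketched as an append-only record of usage samples. The field names and values are illustrative, not the platform's actual schema:

```python
# Sketch of a resource-usage audit trail: append-only entries recording
# which pipeline used how much CPU/memory and when. Field names and
# values are illustrative.

from datetime import datetime, timezone

audit_trail = []

def log_resource_usage(pipeline, cpu_percent, memory_mb):
    """Append one timestamped usage sample to the audit trail."""
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "pipeline": pipeline,
        "cpu_percent": cpu_percent,
        "memory_mb": memory_mb,
    })

log_resource_usage("sales_ingest", cpu_percent=37.5, memory_mb=512)
```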
Requirement: Manage and record configuration changes to the data platform.
BDB Response: Yes, the BDB Platform supports managing and recording configuration changes to the data platform.
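Recording configuration changes typically means capturing who changed what, when, and the old and new values. The sketch below illustrates that pattern; the config keys and user names are hypothetical:

```python
# Sketch of configuration-change recording: apply a change and log the
# old/new values with user and timestamp so changes can be reviewed.
# Config keys and user names are illustrative.

from datetime import datetime, timezone

config = {"retention_days": 30}
change_log = []

def set_config(key, new_value, user):
    """Apply a config change and record old/new values for audit."""
    change_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "key": key,
        "old": config.get(key),
        "new": new_value,
    })
    config[key] = new_value

set_config("retention_days", 90, user="admin")
```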