# Data Platform Evaluation Criteria

## Data Preparation

| Requirement | Evaluation | Remarks |
| --- | --- | --- |
| DP Selection Labeling | High | The platform allows users to label data. |
| DP Record Tagging | High | The platform supports record tagging. |
| DP Version Control | High | Supports check-in and check-out to Git. |
| DP Distributed Query Engine | Medium | Available via the Pipeline Spark SQL component. |
| DP Processing Pipelines | Very high | Yes, Data Preparation can be integrated directly with the Pipeline. |
| DP Visual Interface | Very high | A Visual Interface Designer is available. |
| DP API Interface | Very high | API interfaces are available. |
| DP Data Catalogue Integration | High | The platform generates a Data Catalogue automatically from the underlying metadata. |
| DP Parquet Files Support | Medium | Yes, supported in the Data Pipeline. |
| DP Time Series Support | Medium | Yes, supported. |
| DP Time Series Operations | Medium | Yes, supported, including forecasting and anomaly detection. |
| DP Secret Management Integration | High | Supported via Kubernetes Secrets. |
| DP Access Management | High | Supports role-based access control (RBAC). |
| DP Reports & Metrics | High | The Pipeline generates reporting metrics for every process, such as memory used, CPU used, and number of records processed. |
| DP Access Audit Logs | High | The platform captures all user operations and activities. |
| DP Operation Audit Logs | High | Logs can be pushed to third-party log monitoring systems such as Datadog and Prometheus. |
| DP Export as Pipeline | High | Yes, Data Preparation steps can be exported to the Pipeline. |
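The secret-management row relies on the standard Kubernetes Secret mechanism. As a minimal sketch (all names and the image are illustrative, not BDB-specific), a Secret can be created and then injected into a pipeline pod as environment variables:

```yaml
# Illustrative Secret holding database credentials for a prep job
apiVersion: v1
kind: Secret
metadata:
  name: dp-db-credentials   # hypothetical name
type: Opaque
stringData:
  username: etl_user
  password: s3cr3t
---
# A pipeline worker pod consuming the secret via envFrom
apiVersion: v1
kind: Pod
metadata:
  name: dp-pipeline-worker
spec:
  containers:
    - name: worker
      image: example/dp-worker:latest   # placeholder image
      envFrom:
        - secretRef:
            name: dp-db-credentials
```

With `envFrom`, each key in the Secret surfaces inside the container as an environment variable of the same name, so credentials never need to be baked into the image or pipeline definition.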
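The time-series operations row mentions anomaly detection. As a generic illustration of the idea (not the platform's actual implementation), a rolling z-score flags points that deviate sharply from a trailing window:

```python
from statistics import mean, stdev

def rolling_zscore_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose value deviates from the trailing window
    by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A flat sensor signal with one spike: only the spike index is flagged.
readings = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 25.0, 10.0]
print(rolling_zscore_anomalies(readings))  # → [6]
```

The trailing-window approach keeps the check streamable, which is why variants of it are common in pipeline-based anomaly detection.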
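The access-management row states RBAC support. The general pattern behind RBAC (a hypothetical sketch with invented role and permission names, not the platform's actual model) maps roles to permission sets and checks a user's roles against a required permission:

```python
# Hypothetical role → permission mapping illustrating RBAC.
ROLE_PERMISSIONS = {
    "dp_viewer": {"dataset.read"},
    "dp_editor": {"dataset.read", "dataset.write", "prep.run"},
    "dp_admin":  {"dataset.read", "dataset.write", "prep.run", "prep.export"},
}

def is_allowed(user_roles, permission):
    """A user is allowed if any of their roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)

print(is_allowed(["dp_editor"], "prep.run"))       # → True
print(is_allowed(["dp_viewer"], "dataset.write"))  # → False
```

Centralizing the role-to-permission mapping means access changes are made once per role rather than per user, which is the main operational benefit RBAC offers.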
