
BDB POC Approach

This document highlights the high-level scope of a POC that BDB performs for an Enterprise Client on the data provided by the client. We aim to complete a POC in 4-5 weeks.

| Technical Area | Technical Subarea | Comments |
| --- | --- | --- |
| Installation or Deployment | On-Prem, Cloud, or BDB Cloud | Any of the options chosen by the enterprise. |
| Data Pipeline | Core Data Pipeline design with multiple databases (at least 2)<br>Data Preparation Module - Data Quality<br>Data Catalog feature<br>Business Metrics in BDB Pipeline<br>Testing Framework<br>Version Control in Data Pipeline | Create a dummy file if the customer does not provide data.<br>Explain why this matters in larger deployments.<br>Test what is being built. |
| Data Science Lab | Core concept of DS Lab to be explained<br>Creation and training of models<br>MLOps flow using Data Pipeline<br>Model as an API<br>Version Control | Base flow; add the customer's model if they want to import it. |
| Core Platform | Data Virtualization - Data Fabric Layer<br>Data Centre<br>User-based Data Access Monitoring (default: Prometheus)<br>Auditing<br>SSO/SSL Integration<br>Data as an API | Base security flow and creation of multiple users in the Platform.<br>Cover if the customer asks for this point. |
| Data Visualization | Governed Dashboards with drills<br>Intelligent Sheet with write-back feature<br>Self-Service Reports<br>NLP/AI Search<br>Mobility | Bring in Export features in the Dashboard where possible.<br>Show Dashboards on a mobile browser. |
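The "Model as an API" item under Data Science Lab refers to exposing a trained model behind an HTTP endpoint so pipeline stages or client applications can score data over the network. As a minimal, product-agnostic sketch of that pattern (all names here are illustrative, and the `predict` function is a stand-in for a real trained model; BDB DS Lab's actual deployment mechanism is product-specific):

```python
# Hypothetical sketch of the "Model as an API" pattern: a trained model
# wrapped as a WSGI application that accepts a JSON payload and returns
# a JSON score. Uses only the Python standard library.
import json


def predict(features):
    # Stand-in for a trained model: a trivial linear scoring rule.
    return sum(features) * 0.5


def model_api(environ, start_response):
    # Minimal WSGI app: read the JSON request body, score it, return JSON.
    try:
        size = int(environ.get("CONTENT_LENGTH") or 0)
        payload = json.loads(environ["wsgi.input"].read(size))
        score = predict(payload["features"])
        body = json.dumps({"score": score}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
    except (KeyError, ValueError, TypeError):
        # Missing or malformed input: reject rather than score garbage.
        body = json.dumps({"error": "bad request"}).encode()
        start_response("400 Bad Request", [("Content-Type", "application/json")])
    return [body]
```

Such an app could be served during a POC with any WSGI server (e.g. `wsgiref.simple_server.make_server("0.0.0.0", 8000, model_api)`); in a real deployment, the same endpoint shape lets the Data Pipeline's MLOps flow call the model as just another service.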
