Data Platform Benchmarking

Industry benchmarking based on the Magic Quadrant and similar evaluations from leading analyst firms such as Gartner and Forrester.

Requirement: Provide details of the evaluation under the following categories.

Today's enterprise needs a state-of-the-art data & analytics platform. This is no longer a good-to-have option but an absolute necessity. A data platform acts as the central nervous system of your organization, coordinating and controlling all data movement and data operations. It also acts as the central repository for all data, transforming and enriching it into a single source of truth. The platform automates data orchestration through a low-code composing tool, thereby accelerating your digital transformation journey.

BDB covers everything from data ingestion to data dissemination in one integrated data and analytics platform, catering to all your DataOps, AI/MLOps, and data visualization requirements.

Platform Benefits

  • Future-Proof - Our solution continuously adds new features and functionality, thereby reducing technical debt

  • End-to-End Low Code - The platform's products deliver sustained value for all stakeholders and offer the flexibility to fuse connected and non-connected data together

  • Secure & Compliant - The platform complies with privacy guidelines, WPP and UN 155 standards, ISO 20022 (API), etc., to meet European and global requirements

  • Integrated within the Customer - The platform will be white-labelled and customer stakeholders trained for organization-led innovation and product development

  • Leapfrog Innovation - The existing product portfolio can be used to quickly deliver market-leading connected products across different vertical segments

  • Fast-Track Data Monetization - Out-of-the-box products help meet fleet, government, and third-party requirements in a short period

  • Resource Optimization - Delivers increased value from existing resources and reorients them toward delivering business and customer value

  • Scalable & Reliable - The platform is designed for high availability, regularly tested, and deployed on Kubernetes for auto-scaling

  • Functional Fit - Meets current functional requirements and scales as a future-proof data platform

At the outset, the Manufacturing requirements map directly onto BDB's product capabilities:

  • A single platform that provides end-to-end analytics features and can be deployed in the cloud or on-premises

  • Cost optimization from on-premises and cloud deployments working together

  • Compelling licensing for the end-to-end product suite

  • Strong support for professional services and training

BDB Platform Comprises

  • Big Data Pipeline module to handle ingestion of both batch and real-time data for seamless DataOps

    • It handles PDFs, structured and unstructured data, computer vision inputs, APIs, and web & social data

  • Data Quality tool to manage the data

  • Data Catalog, Data-as-API, and Model-as-API features

  • Data Science Lab with workbench-style features providing seamless MLOps

  • Scalability, multitenancy, security, a test framework, business metrics, etc., which help customers use it for their own data platforms and for deployments in different kinds of environments

  • Dashboard Designer to deliver governed dashboards

  • Self-service reports with built-in search capabilities

  • Visualizations that work on mobile and other devices

  • Forms and Survey modules to bring other dynamic data into the solution

  • Write-back features, the ability to modify dashboards, the capability to build hyperautomation use cases, white-labelling flexibility, and cost predictability, which make BDB the most enterprise-friendly data analytics platform

Scalability

  • Cloud deployments can be auto-scaled on the fly based on data load.

  • On-premises deployments can be scaled up with little effort given proper planning and monitoring (using the built-in business metrics features in BDB Pipeline).

  • A new connector can be added by simply writing Python or Scala code when more data sources are needed (see the sketch after this list).

  • A new data source can be added to the existing data lake in production without disrupting it.

  • The whole deployment method is iterative and incremental, so that Manufacturing can realize value quickly and plan multiple production use cases in a phased manner.
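
As a minimal sketch of what such a Python connector might look like, here is an illustrative paginated REST connector. The class and its `read()` interface are assumptions for illustration, not the actual BDB extension API:

```python
# Hypothetical example: a minimal custom connector exposing a read()
# method that streams records. The class name and interface are
# illustrative assumptions, not the actual BDB connector contract.
from typing import Any, Dict, Iterator

import requests


class WeatherApiConnector:
    """Pulls JSON records from a REST endpoint, page by page."""

    def __init__(self, base_url: str, api_key: str, page_size: int = 500):
        self.base_url = base_url
        self.api_key = api_key
        self.page_size = page_size

    def read(self) -> Iterator[Dict[str, Any]]:
        """Yield one record at a time so downstream stages can stream."""
        page = 0
        while True:
            resp = requests.get(
                self.base_url,
                params={"page": page, "size": self.page_size},
                headers={"Authorization": f"Bearer {self.api_key}"},
                timeout=30,
            )
            resp.raise_for_status()
            records = resp.json().get("data", [])  # assumed payload shape
            if not records:
                break
            yield from records
            page += 1
```

Because the connector only needs to implement a small read interface, adding a new source in this style does not require touching existing pipelines.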

Technology

Please see below the details of the product technology readiness level of the BDB Platform in the capability reference diagram. The platform is in an advanced state and takes care of most of an enterprise's analytics needs. We would argue that the BDB Platform is the only one with all of these features integrated; no other vendor in the analytics space offers so many features without acquisitions. When a customer chooses Azure or AWS, they are largely buying service features and pay for whatever they choose; as the solution goes deeper over 3-4 years, customers end up choosing more and paying very large costs, whereas BDB is 100% predictable and has no hidden costs.

The BDB feature document contains a list of all the GA (generally available) features; these products are production-ready across circumstances and verticals.

  • Performance & Health - Keeping track of BDB Platform server processes and the capacity of the underlying system is key to maintaining server reliability, and a vital part of future capacity planning. The BDB Platform server status page displays server processes across all nodes, including the state of the active and passive repositories. BDB Platform notifies admins and content owners when extract updates fail due to data-connection issues, helping you fix the data problem before users end up analyzing stale data.
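
    As a hedged illustration of how such process monitoring could be scripted externally, the sketch below polls a status endpoint and emails admins about failed processes. The endpoint path and response shape are assumptions, not the documented BDB API:

```python
# Illustrative only: poll a server status endpoint and alert on any
# process not reporting 'running'. STATUS_URL and the JSON shape are
# hypothetical, not the documented BDB API.
import smtplib
from email.message import EmailMessage

import requests

STATUS_URL = "https://bdb.example.com/api/status"  # hypothetical endpoint


def check_server_health() -> list[str]:
    """Return the names of any processes not reporting 'running'."""
    resp = requests.get(STATUS_URL, timeout=10)
    resp.raise_for_status()
    nodes = resp.json().get("nodes", [])  # assumed response shape
    return [
        f"{node['host']}/{proc['name']}"
        for node in nodes
        for proc in node.get("processes", [])
        if proc.get("state") != "running"
    ]


def alert(failed: list[str]) -> None:
    """Email the admin team the list of failed processes."""
    msg = EmailMessage()
    msg["Subject"] = f"BDB health check: {len(failed)} process(es) down"
    msg["From"] = "monitor@example.com"
    msg["To"] = "admins@example.com"
    msg.set_content("\n".join(failed))
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    failed = check_server_health()
    if failed:
        alert(failed)
```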

  • User Management - Whether you're supporting ten users or ten thousand, BDB Platform makes administration easy. Our visual user management module shows exactly which roles and responsibilities each user and group inherits for each site, project, and folder. Our enterprise authentication method makes it easy to batch-add new users, and if you're using Active Directory, we'll sync with your existing AD groups to maintain consistency with your security protocols.

  • Auditing - BDB Platform maintains extensive log files covering data connections, all server content, and user interactions. These robust logs can be used for troubleshooting and performance assessment as well as for audit compliance. Each user's actions on the platform are captured and stored for future reference, and all logs can be integrated with third-party monitoring tools such as Datadog. With BDB Platform, you have the visibility and tools you need to optimize your platform and ensure it is running efficiently.
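
    As one possible shape for the Datadog integration mentioned above, the sketch below tails an audit log and forwards each line as a Datadog event using the official `datadog` Python client. The log path and line format are assumptions for illustration:

```python
# Sketch: forward audit-log lines to Datadog as events via the
# official `datadog` Python client. The log path is a hypothetical
# example; real deployments would parse their actual log format.
import time

from datadog import initialize, api

initialize(api_key="YOUR_API_KEY", app_key="YOUR_APP_KEY")

AUDIT_LOG = "/var/log/bdb/audit.log"  # hypothetical path


def tail_and_forward(path: str) -> None:
    """Follow the audit log and push each new line as a Datadog event."""
    with open(path) as f:
        f.seek(0, 2)  # start at the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(1)  # wait for new lines to be appended
                continue
            api.Event.create(
                title="BDB audit entry",
                text=line.strip(),
                tags=["platform:bdb", "source:audit-log"],
            )


if __name__ == "__main__":
    tail_and_forward(AUDIT_LOG)
```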

  • Automation - Managing data, content, and users across your enterprise analytics platform shouldn't have to be manual. BDB Platform provides all the features required to automate these regular tasks.
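
    For a sense of what such automation could look like, here is a purely illustrative script that batch-creates users from a CSV file via a REST API. The endpoint and payload are hypothetical, not the documented BDB interface:

```python
# Purely illustrative automation sketch: batch-add users from a CSV.
# BASE_URL, the /users endpoint, and the payload fields are
# hypothetical placeholders, not the actual BDB API.
import csv

import requests

BASE_URL = "https://bdb.example.com/api"  # hypothetical
session = requests.Session()
session.headers.update({"Authorization": "Bearer YOUR_TOKEN"})


def add_users_from_csv(path: str) -> None:
    """Create one user per CSV row (columns: email, role, group)."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            resp = session.post(
                f"{BASE_URL}/users",
                json={
                    "email": row["email"],
                    "role": row["role"],
                    "group": row["group"],
                },
                timeout=30,
            )
            resp.raise_for_status()
            print(f"created {row['email']}")


if __name__ == "__main__":
    add_users_from_csv("new_users.csv")
```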

BDB offers three Technical Support program levels to help meet the service needs of all customers.

  • Standard Support - Standard Support is available during regular BDB business hours.

  • Extended Support - Extended Support helps your organization avoid or reduce downtime and accelerates the value of your investment through faster response times and additional 24 x 7 weekend support for critical P1 issues.

  • Premium Support - Premium Support provides complete, proactive account care you can rely on: a comprehensive set of resources, extended availability, and the fastest response times, with 24 x 7 support for P1 and P2 issues.

The table below outlines the BDB Platform familiarization and training plan:

SL NO | TITLE | DESCRIPTION | ESTIMATED TIME
1 | Platform familiarization | | 1-2 Days
1.1 | Go through the platform guide documents to understand the modules and their use | Get an overview of the platform, the modules available, and their areas of application | 1 Day
1.2 | Installation of client tools | (1) SQLyog/Workbench/DBeaver, (2) MongoDB Compass, (3) WinSCP (for FTP connections), (4) FortiClient (for VPN) | 0.5 Day
2 | Data Center Module | | 1 Day
2.1 | Data connectors | Understand the concept of data connectors and their available properties | 1 Hour
2.2 | Configuring data connectors | Configure data connectors and their properties; go through the do's and don'ts | 1 Hour
2.3 | Data sets | Understand the concept of data sets and their properties; go through the do's and don'ts | 1.5 Hours
2.4 | Creating datasets | With and without dynamic filters; explore the properties | 1 Hour
2.5 | Data store | Understand the concept of data stores and their properties | 30 Mins
2.6 | Data store creation | | 1 Hour
2.7 | Datastore metadata | Understand the concept and its creation. By the end, candidates must thoroughly understand the difference between a datastore and its metadata: the use of datastores, their hierarchy, flattened data structures, etc. | 1 Hour
2.8 | Datasheet | Understand and create a datasheet | 1 Hour
3 | Business Story | | 1-2 Days
3.1 | Create a BI story based on the created datastore | | 2 Hours
3.2 | Filters & hierarchical drilling | | 1 Hour
3.3 | Functions - formulas | | 2 Hours
3.4 | Change theme, order, limit & sort | | 1 Hour
3.5 | Interactions | | 2 Hours
3.6 | NLQ and adding synonyms | | 3 Hours
3.7 | Create a full BI story covering all these functionalities, based on a given table name | | 6 Hours
4 | Dashboard Designer Module | | 8 Days
4.1.1 | Beginner's guide | | 1.5 Days
4.1.2 | Familiarization session from mentor | | 30 Mins
4.1.3 | Data connector creation | Understand how to connect to different data sources and their configuration; candidates should try almost all data connectors in the dashboard | 2 Hours
4.1.4 | Component library + familiarization session from mentor for selected components | Explore the component library with the help of the All Components Excel file and explore chart properties; a thorough understanding of each component and its properties is expected | 4-6 Hours
4.1.5 | Intermediate level | | 7.5 Days
4.1.6 | Script + familiarization session from mentor for selected scripts | Understand scripting with the help of the Hiring Data Excel sheet | 1 Day
- | Data service as data connector + familiarization session | Scripts, dynamic filtering, and drilling with the "Zomato" table data | 1 Day
4.2.3 | Dashboard creation + familiarization session from mentor | Each filter concept in a different tab with a single chart: (1) standalone filter as exercise, (2) relative filter (Y/Q/M/W), (3) cascaded filtering, (4) YTD/QTD/MTD, (5) last n days and custom range filter, (6) checkbox filter, (7) comparative filter |
- | Developer tool | | 1.5 Days
4.2.4 | Understanding how to debug | Get familiarized with dev tools for debugging; familiarization from mentor | 1 Day
4.2.5 | Mock dashboard creation on a Saturday | (1) Biryani Zone, (2) sales distribution by channel, (3) two trend tiles, (4) revenue distribution; second tab with two charts; should follow the Dashboard Guidelines | 2 Days
4.2.6 | Hyperlinking and a dashboard within a textbox | | 1 Hour
4.2.7 | User security with a custom field | With script and environment variable | 1.5 Hours
4.2.8 | Parent-child | | 2 Hours
4.2.9 | Create a dashboard with your own KPIs and UI, plus its mobile view (min. 4 KPIs with filters) | | 1 Day
5 | Pipeline Module | | 6 Days
5.1 | Introduction to BDB Data Pipeline | Basic introduction videos, architecture, data ingestion, transformations, loading, home screen, terminology, integration with other modules, component deployment, deployment types (Spark/Docker components), workflow editor, logs & advanced logs | 2 Hours
5.2 | Pipeline components and events | Understand batch components, real-time components, batch size and Kafka topics, failover events and shared events; usage of batch & real-time components, use cases, batch size, in-events, out-events, and event-offset logic | 2 Hours
5.3 | Understanding and configuring Reader & Writer components | Configure and create simple workflows with various Reader & Writer components such as MongoDB, RDBMS, and Elastic readers & writers | 5 Hours
5.4 | Configuring ingestion components | Understand and configure the commonly used ingestion components such as API Ingestion, Event-Hub Subscriber, and SFTP Monitor, and create workflows with SFTP Monitor & SFTP Reader | 4 Hours
5.5 | Configuring transformation components and Data Prep | Configure commonly used transformation components such as the rule splitter, file splitter, SQL component, Data-Prep script runner, and Join component | 8 Hours
5.6 | Configuring the custom Python script component | Configure and write custom Python scripts for both batch and real-time components: connecting to various databases; handling dictionaries, data frames, and lists; the available return options and their use cases; changing the data format within the code and handling such changes at the writer level outside the code; use of environment variables and integration; lookups and data enrichment using Python components; handling failures and exceptions | 8 Hours
5.7 | Predictive Workbench module / DS Lab | Go through the Data Science Workbench / DS Lab module using the training materials and create a model | 4 Days
5.8 | Configuring Scheduler & ML Model Runner components | Usage of the scheduler component and its options, integration of the scheduler with Python code, and integration of ML models from the Data Science Workbench into the data pipeline | 6 Hours
5.9 | Configuring API Ingestion & WebSocket components | Understand and configure the API Ingestion component; test it through Postman and understand the data format; integrate it with the Dashboard so the dashboard can interact with the pipeline dynamically and ingest data into the Data Pipeline; work with the WebSocket producer, understand the pipeline-to-dashboard interaction, and visualize live data in dashboards | 8 Hours
5.10 | Workflow exercises | Develop pipeline workflows for various sample use cases | 8 Hours
6 | Advanced Dashboard Designer | | 2 Days
6.1 | WebSocket | | 4 Hours
6.2 | API Ingestion | | 4 Hours
6.3 | Custom charts | | 5 Days
7 | MongoDB | |
7.1 | Concepts to cover | (1) Project, (2) Group, (3) Match, (4) Sort, (5) date formatting, (6) arithmetic operations, (7) conditional (if, switch) / logical statements, (8) arrays and unwind, (9) lookup and union with, (10) facets, (11) map function, (12) merge, (13) slice, (14) bucket, (15) expressions, (16) add fields, add to set, push | 2 Days + 2 Days
7.2 | Aggregate a collection with a Python script | Using a Python script in the Pipeline Module | 4 Hours
7.3 | Dashboard with MQL | | 4 Hours
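
To ground rows 7.1-7.2, here is a minimal sketch of a MongoDB aggregation run from a Python script (in the style of the Pipeline Module's custom Python component), assuming the `pymongo` client and an illustrative `sales` / `stores` data set:

```python
# Sketch: a MongoDB aggregation from Python (pymongo), touching several
# concepts from row 7.1: $match, $lookup, $unwind, $group, $project
# (with arithmetic), and $sort. The connection string and the sales /
# stores collections are illustrative assumptions.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed connection
db = client["training"]

pipeline = [
    # Match: keep only completed orders
    {"$match": {"status": "completed"}},
    # Lookup + unwind: enrich each order with its store document
    {"$lookup": {
        "from": "stores",
        "localField": "store_id",
        "foreignField": "_id",
        "as": "store",
    }},
    {"$unwind": "$store"},
    # Group: total revenue and order count per store
    {"$group": {
        "_id": "$store.name",
        "revenue": {"$sum": "$amount"},
        "orders": {"$sum": 1},
    }},
    # Project: reshape the output, with a computed average (arithmetic)
    {"$project": {
        "_id": 0,
        "store": "$_id",
        "revenue": 1,
        "orders": 1,
        "avg_order": {"$divide": ["$revenue", "$orders"]},
    }},
    # Sort: highest revenue first
    {"$sort": {"revenue": -1}},
]

for row in db["sales"].aggregate(pipeline):
    print(row)
```

The same aggregation stages carry over directly to the MQL used in dashboards (row 7.3), which is why the training plan covers them once in the MongoDB section.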
