This section describes how to configure the custom field settings for the user groups and the users assigned to those groups.
Click the Custom Field Settings option from the Configurations list.
The Custom Field Settings page opens.
Provide the following information for each custom field:
Key: Provide the key value of the custom field.
Input Type: Select an input option from the drop-down menu.
Manual: Users need to fill in the field manually.
User Lookup: Users need to choose from a drop-down menu.
Description: Describe the inserted key.
Mandatory: Select the 'Yes' option to make the inserted custom field mandatory for the user. Selecting the 'No' option keeps the custom field optional.
Click the Save option to save the inserted custom fields.
Please Note:
Click the Add option to add a new custom field.
Click the Clear option to erase the inserted custom field information.
The configured Custom Fields can be accessed inside the Security module to restrict data access of various users.
The Core Ownership Transfer Settings allow the admin to transfer all the components of the Data Center, Designer, Security, Home, and Pipeline modules from one user to another user.
Check out the given walk-through to understand the Core Ownership Transfer configuration option.
This Admin configuration option helps users in the same space take ownership of the variety of components created by another user in case of a project handover or role change. Thus, the entire account access can be transferred to the desired user on the same page.
Please Note: This feature is provided to handle situations such as when a user leaves the organization and their work needs to be transferred to another user.
Click on the Core Ownership Transfer option from the Configurations list in the Admin Module.
Then the Core Ownership Transfer page opens.
Provide the following information to transfer the core ownership to another user.
Select Current User: Select a user from the given drop-down list.
Select New Owner: Select a new owner from the given drop-down list.
Select Module/Modules: Select a module or multiple modules to be transferred to the selected new owner (the available Platform modules are listed as choices).
Click the Transfer option.
A success message appears, and the selected modules get transferred to the new owner.
This option allows the users to configure API data connectors.
Click the API Connectors Configurations option from the list of admin options.
The Connector Configurations page opens displaying the list of the API connectors.
Select an API Connector option.
The required configuration details are displayed for the selected API Data Connector.
Click the Save option.
A message appears to confirm the connector configuration.
Please Note: The configuration fields may vary based on the selected API Connector.
This option, provided under the Configuration and Settings tab, helps the administrator create multiple users using a standard template.
Click the Bulk Users Creation option from the Configurations list.
The Bulk User Creation page opens.
Please Note: The Bulk User Creation is completed in two steps. Refer to the following content to understand the process of Bulk User Creation.
Navigate to the Bulk User Creation page.
Click the Download icon.
A model template for the bulk user creation gets downloaded.
The admin can insert multiple users in the downloaded user template.
Use the following format (as shown in the image) to enter the user details.
Navigate to the Upload Template option of the Bulk User Creation page.
Use the Choose File option to select the file with multiple usernames.
Select the sheet.
Click the Upload option.
A success message appears to confirm that the new users have been created.
The newly created users get added to the user list provided inside the User Security module.
Please Note:
The users created through the Bulk User Creation functionality get added to the View Role user group by default since no group is allotted to them at the time of creation.
The admin can manually assign a different user group to the users created via the Bulk User Creation option.
The CP_spacekey column provided in the user template refers to Custom Field values that the administrator can insert to restrict the display of data for specific users. This information is optional. The administrator can insert the user group-related custom property values to restrict data display for the newly created users (a hedged example of preparing such a file follows this note).
Click the Delete icon provided next to the added custom field to remove that custom field.
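For illustration only, a bulk-user file can also be prepared programmatically before upload. In the sketch below, every column name except CP_spacekey is an assumption; always start from the model template downloaded via the Download icon.

```python
# A minimal sketch of preparing a bulk-user upload file with pandas.
# Column names other than CP_spacekey are assumptions for illustration;
# base the real file on the downloaded model template.
# Requires: pandas and openpyxl installed.
import pandas as pd

users = pd.DataFrame(
    [
        {"Username": "jdoe", "Email": "jdoe@example.com", "CP_spacekey": "APAC"},
        {"Username": "asmith", "Email": "asmith@example.com", "CP_spacekey": "EMEA"},
    ]
)

# Write the sheet that will be selected during the Upload Template step.
users.to_excel("bulk_users.xlsx", sheet_name="Users", index=False)
```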
The API Base URL Settings option provides the base URLs for the various APIs, which can be shared across teams to call the API services.
The API Base URL Settings helps the end users to get the full URL of any published Data As API without much intervention from the DevOps team.
Navigate to the Admin Panel.
Click the API Base URL configuration option from the Admin Panel.
The API Base URL page opens.
The API Base URL page displays the URL configured by the DevOps team.
E.g., the given image displays app.bdb.ai/services as the configured base URL.
Navigate to the API Client Registration option.
Click the Edit icon for a registered Client.
The API Client Registration details open.
Click on the document link provided on the API Client Registration page.
Clicking the document link on the API Client Registration page directs the user to a page providing the URL and other service details required to call the API service.
Please Note: This URL is a combination of the API Base URL and the Service name and can be shared as needed.
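For illustration, the sketch below composes such a URL and calls it; the service name, token header, and response handling are assumptions and will differ for each published Data As API.

```python
# Sketch: composing the full service URL from the configured API Base URL
# and the service name shown on the API Client Registration details page.
# "sales-data" and the Authorization header are placeholders for illustration.
import requests

API_BASE_URL = "https://app.bdb.ai/services"   # base URL configured by the DevOps team
SERVICE_NAME = "sales-data"                     # hypothetical published Data As API

url = f"{API_BASE_URL}/{SERVICE_NAME}"
response = requests.get(url, headers={"Authorization": "Bearer <access-token>"})
response.raise_for_status()
print(response.json())
```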
Click the Data Lake Settings from the Configurations admin option.
The Data Lake Settings form opens.
The user needs to enter the following details to save the Data Lake Settings based on the selected server option:
Click the Data Lake Settings from the Configurations admin option.
The Data Lake Settings form opens.
Select the Same Server option.
Provide the username, password, Host, and Port address. If the selected server is Hadoop, then the HDFS Port number is also required.
Click the Save option to save the entered Data Lake Settings.
Click the Data Lake Settings from the Configurations admin option.
The Data Lake Settings form opens.
Select the Different Server option.
The user needs to provide the Username, Password, Host, and Port address to save the data into the Hadoop and Hive Data Lakes. (For Hadoop, the HDFS Port number is also required.)
The user needs to provide details for the Parent Folder and Child Folders together with the database name.
Click the Save option to save the entered Data Lake Settings.
A notification message appears, and the provided Data Lake Settings get saved/registered.
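For illustration only, the sketch below shows how such fields typically combine into an HDFS location and a Hive JDBC URL; all values are placeholders, and the exact composition used by the platform may differ.

```python
# Sketch: how the Data Lake Settings fields combine into storage locations.
# All values below are placeholders for illustration only.
HOST = "datalake.example.com"
HDFS_PORT = 8020              # HDFS port (required for Hadoop)
HIVE_PORT = 10000             # Hive service port
PARENT_FOLDER = "bdb"
CHILD_FOLDER = "landing"
DATABASE_NAME = "analytics"

hdfs_location = f"hdfs://{HOST}:{HDFS_PORT}/{PARENT_FOLDER}/{CHILD_FOLDER}"
hive_jdbc_url = f"jdbc:hive2://{HOST}:{HIVE_PORT}/{DATABASE_NAME}"

print(hdfs_location)   # hdfs://datalake.example.com:8020/bdb/landing
print(hive_jdbc_url)   # jdbc:hive2://datalake.example.com:10000/analytics
```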
This option allows the users to configure the required server settings for Data Connectors.
Click the Data Connectors option from the Configurations list.
The Data Management Settings page opens.
The users can set the Max Fetch Size for the listed data sources.
Click the Update option to save the entered Max Fetch Size for data.
A notification message appears, and the data gets updated.
Click the Clear option.
The entered Data Connector Information will be erased.
The Data Sheet Settings option helps the administrator to configure the Data Sheets templates.
Click the Data Sheet Settings from the list of Configurations under the Admin module.
The Data Sheet Settings form opens.
Select MySQL as Database type.
Enter the database details to save the Data Sheet data as mentioned below:
Username: Use read-only credentials
Password: Use a valid password
Host: Provide the Host address
Port: Provide the port number
Database Name: Provide the database name
Click the Validate option to validate the entered settings information.
A notification message appears, and the provided Data Sheet Settings get registered.
Click the Save option.
A notification message confirms that the validated settings are saved for the Data Sheet Settings.
The user can follow the same set of steps to configure the Data Sheet Settings with the ClickHouse option.
Navigate to the Data Sheet Settings form.
Select ClickHouse as the Database type.
Enter the database details to save the form data as mentioned below:
Username: Use read-only credentials
Password: Use a valid password
Host: Provide the Host address
Port: Provide the port number
Database Name: Provide a database name.
Click the Validate option to validate the entered settings information.
A notification message appears, and the provided Data Sheet Settings get registered.
Click the Save option.
A notification message confirms that the validated settings are saved for the Data Sheet creation.
Please Note: The Data Sheet supports MySQL versions up to 8.0 at present.
The admin can configure the Email Server to send confirmation mails.
The administrator can use this admin option to configure the email server information and to enable or disable email alerts for actions performed on the various Platform documents or while using functionality such as Forgot Password.
Click the Email Server option from the Configurations list.
The Email Server Settings page opens.
Provide the following Email Server Information.
Email Host: SMTP host address
Email Port: Port number of SMTP
Encryption Type: Select an encryption type from the drop-down menu.
Email From: Enter the authenticated credentials of the sender.
Email Password: Provide the password
Email Username: Name that gets displayed to the receivers
Configure the following alert options for email by putting checkmarks:
Disable email sending: Selecting this option disables the email alerts.
Send test email: Selecting this option immediately sends a test mail.
Send email for publish/share/copy to documents: Enabling this option sends an email alert when platform documents (files/folders) are published, shared, or copied.
Domain Selection: Select a Domain from the given options (All/Selected Domains).
Click the Update option to update the email server settings information.
A notification message appears to confirm that the saved email server details are updated.
Please Note: Click the Clear option to erase the entered configuration details for the Email Server Settings.
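For context, the fields above map onto a standard SMTP connection. The following is a generic Python sketch, not the platform's internal implementation; host, port, and credentials are placeholders, and STARTTLS is assumed as the Encryption Type.

```python
# Generic sketch of an SMTP connection using the Email Server fields.
# All values are placeholders; the platform performs this internally.
import smtplib
from email.message import EmailMessage

EMAIL_HOST = "smtp.example.com"    # Email Host
EMAIL_PORT = 587                   # Email Port (STARTTLS assumed)
EMAIL_FROM = "alerts@example.com"  # Email From (authenticated sender)
EMAIL_PASSWORD = "<password>"      # Email Password
EMAIL_USERNAME = "BDB Platform"    # Email Username shown to receivers

msg = EmailMessage()
msg["Subject"] = "Test email"
msg["From"] = f"{EMAIL_USERNAME} <{EMAIL_FROM}>"
msg["To"] = "user@example.com"
msg.set_content("This is a test alert.")

with smtplib.SMTP(EMAIL_HOST, EMAIL_PORT) as server:
    server.starttls()              # Encryption Type: STARTTLS
    server.login(EMAIL_FROM, EMAIL_PASSWORD)
    server.send_message(msg)
```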
Configure the database settings where your Data Stores get saved.
This Admin option helps to configure the Data Store settings. The Data Store settings currently provide the following database types to choose from: Mongo, Elastic, ClickHouse, PostgreSQL, and Pinot.
Click the Data Store Settings option from the Configuration and Settings options.
The Data Store Settings page appears.
Select the Mongo as the database option.
Provide the following information for the Data Store Settings:
Server Type
Username
Password
IP/Host
Port
Database Name
SSL Type - Select one option NON-SSL or SSL
Click the Validate option.
A notification message appears.
The Save option gets enabled. Click the Save option to save the inserted information.
A notification message confirms that the Data Store configuration for Mongo DB has been saved.
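For context, the Mongo fields above typically combine into a MongoDB connection URI. The following is a hedged sketch with placeholder values; the platform builds the connection internally.

```python
# Sketch: how the Mongo Data Store fields map to a MongoDB connection URI.
# All values are placeholders for illustration.
from pymongo import MongoClient

USERNAME = "datastore_user"
PASSWORD = "<password>"
HOST = "mongo.example.com"    # IP/Host
PORT = 27017                  # Port
DATABASE = "bdb_datastore"    # Database Name
USE_SSL = True                # SSL Type: SSL or NON-SSL

uri = (
    f"mongodb://{USERNAME}:{PASSWORD}@{HOST}:{PORT}/{DATABASE}"
    f"?tls={'true' if USE_SSL else 'false'}"
)
client = MongoClient(uri)
print(client[DATABASE].list_collection_names())
```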
Click the Data Store Settings option from the Configuration and Settings options.
The Data Store Settings page appears.
Select a Database type. E.g., in this case, select the Elastic option.
Provide the following information for the Data Store Settings:
Elastic Search Cluster
Shards
Replicas
Date Formats
Elastic Search Hosts
Elastic Search Ports
Elastic Search HTTP Ports
Click the Validate option.
A notification message appears.
The Save option gets enabled. Click the Save option to save the inserted information.
A notification message confirms that the Data Store configuration for Elastic has been saved.
Click the Data Store Settings option from the Configuration and Settings options.
The Data Store Settings page appears.
Select a Database option. E.g., select the ClickHouse option.
Provide the following information for the Data Store Settings:
Username
Password
IP/Host
Port
Database Name
SSL Type - Select one option NON-SSL or SSL
TCP Port
Click the Validate option.
A notification message appears.
The Save option gets enabled. Click the Save option to save the inserted information.
A notification message confirms that the Data Store configuration for ClickHouse DB has been saved.
Click the Data Store Settings option from the Configuration and Settings options.
The Data Store Settings page appears.
Select a Database option. E.g., in this case, select the PostgreSQL option.
Provide the following information for the Data Store Settings:
Username
Password
IP/Host
Port
Database Name
SSL Type - Select one option NON-SSL or SSL
TCP Port
Click the Validate option.
A notification message appears.
The Save option gets enabled. Click the Save option to save the inserted information.
A notification message confirms that the Data Store configuration for PostgreSQL has been saved.
Click the Data Store Settings option from the Configuration and Settings options.
The Data Store Settings page appears.
Select the Database option as Pinot.
Provide the following information for the Data Store Settings:
Username
Password
IP/Host
Port
Database Name
SSL Type - Select one option NON-SSL or SSL
TCP Port
Click the Validate option.
A notification message appears.
The Save option gets enabled. Click the Save option to save the inserted information.
A notification message confirms that the Data Store configuration for Pinot has been saved.
Please Note:
The users must provide the Upload Key while configuring the Data Store Settings with SSL for the Mongo DB, ClickHouse, and PostgreSQL databases.
The PostgreSQL Datastore Settings support multiple IPs.
The Notebook Settings for the DS Lab can be configured through this admin option.
Click the Data Science Lab Settings option from the Configurations list in the Admin Module.
The Data Science Lab Settings page appears.
Provide the following Data Science Lab Settings Information:
Environment Settings
Algorithms
Image Version
Image Name
API Image Version
API Image Name
GPU Support
Idle Shutdown: Click the Add option to get the field for configuring the Idle Shutdown.
Configuration Details
Limit
Memory
Request
Memory
Click the Save option to save the inserted information.
A notification message appears to confirm that the provided configuration for the Notebook Settings has been saved.
This section explains the steps to configure the Geospatial plugin.
The option provides two types of Map settings.
Click the Geo-Spatial option from the Configuration and Settings admin options.
The following option appears:
Click the Geo Settings option from the Configurations list.
The Geo Settings page opens displaying the Google Settings and Leaflet Settings options.
Fill in the following information for the Google Map:
Map Key: Enter the map key that has been provided by Google (To be purchased from Google).
Click the Save option.
Fill in the following information for the Leaflet Settings:
Map URL: URL of the selected map (provided by the open-source vendors)
Attribution: Configuration parameters for the map (provided by the open-source vendor)
Click the Save option.
Please Note: Click the Clear option to erase the information.
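For reference, a commonly used open-source combination for the Leaflet Settings is the OpenStreetMap tile service; the values below are an example only and should be confirmed with your map vendor.

```python
# Example values commonly used for open-source Leaflet tiles.
# Confirm the exact Map URL and Attribution text with your map vendor.
leaflet_settings = {
    "Map URL": "https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png",
    "Attribution": "&copy; OpenStreetMap contributors",
}
print(leaflet_settings)
```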
Click the Geo Shapes option from the Geo Settings configuration option.
The Geo Shapes page opens.
Click the Add option from the Geo Shapes page.
The Create New Geo Shapes page opens.
Enter the following information:
Shape Name: Title of the geo shape (map)
Geometry Type: Select a Geometry type from the drop-down menu (Polygon or Line)
Area Type: Select an area type using the drop-down menu
Choose File: Browse a shapefile from the system and upload (Only JSON and JS formats are supported)
Click the Save option to save the inserted details.
A message pops up to confirm that the file has been uploaded.
The uploaded Geo Shapefile is displayed in the list format.
Please Note: Use the Search space to search a Geo Shapefile from the displayed list.
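Since only JSON and JS formats are supported, the sketch below writes a minimal GeoJSON-style polygon file that could be uploaded as a Geo Shape; the coordinates and region name are placeholders.

```python
# Sketch: writing a minimal GeoJSON-style polygon file for the Geo Shape upload.
# Coordinates and the region name are illustrative placeholders.
import json

geo_shape = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": {"name": "Sample Region"},
            "geometry": {
                "type": "Polygon",
                "coordinates": [
                    [[77.5, 12.9], [77.7, 12.9], [77.7, 13.1], [77.5, 13.1], [77.5, 12.9]]
                ],
            },
        }
    ],
}

with open("sample_region.json", "w") as f:
    json.dump(geo_shape, f, indent=2)
```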
Select an uploaded Geo Shapefile from the list.
Click the Delete icon provided next to a Geo Shape File.
A new window opens to confirm the deletion.
Select the DELETE option.
The selected Geo Shape file gets removed from the list.
Keycloak settings refer to the configurations that affect the entire Keycloak deployment or multiple realms within the deployment. These settings are typically managed at the server level.
The admin user can configure the Keycloak Settings by using this option.
Click on the Keycloak Settings option from the admin menu panel.
The Keycloak Settings page opens.
Provide the following details:
Keycloak Public Key
Realm URL
Click the Save option.
A notification message appears to ensure the success of the action.
Please Note: The user can access the modules of the BDB Platform and begin with the creation only after configuring this settings option.
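For reference, the Realm URL typically points at a Keycloak realm that exposes a standard OpenID Connect discovery endpoint. The following is a minimal sketch, assuming a placeholder realm URL (older Keycloak versions include an /auth prefix), of how that endpoint can be checked.

```python
# Sketch: verifying a Keycloak Realm URL via its OpenID Connect discovery endpoint.
# The realm URL is a placeholder; older Keycloak versions use /auth/realms/<realm>.
import requests

REALM_URL = "https://keycloak.example.com/realms/bdb"

config = requests.get(f"{REALM_URL}/.well-known/openid-configuration").json()
print(config["issuer"])           # should match the Realm URL
print(config["token_endpoint"])   # endpoint clients use to obtain tokens
```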
The Form Settings admin option provides database details for saving the data of forms. This configuration option is a prerequisite to use the Forms module.
Click the Form Settings option from the admin options panel.
The Form Settings page opens.
Enter the database details to save the form data as mentioned below:
Username: Use read-only credentials
Password: Use a valid password
Host: Provide the Host address
Port: Provide the port number
Database Name: Provide a database name
Click the Validate option to validate the Form Settings.
A notification message appears.
Click the Save option to save the entered settings information.
A confirmation message appears, and the entered Form Settings get saved.
Please Note:
Click the Clear option to erase the entered database settings details.
All the fields for the Form Settings are mandatory.
Click the Pipeline Settings option from the Configurations list.
The Pipeline Settings page opens.
Provide the Name Space for the Pipeline Settings.
Click the Add new field icon to get the required fields.
Provide Key
Provide Value
Click the Save option.
A notification message appears stating that the saved pipeline settings are updated.
The admin can enforce strong password policies to protect user accounts from unauthorized access by using the Password Settings option.
The administrator can configure the account password information using this option.
Click the Password option from the list of configuration and Settings options.
The Password Settings page appears.
Provide the following information:
Password Expiry (Days): Set password validity (in days)
Password Strength: Set the password length (6 to 16)
Password Reuse: Set a limit to restrict the user from using an old password (the last 3 passwords cannot be reused)
Login Failures (No. of User Login Failure): Set the number of chances provided to the user for logging in with wrong passwords (Maximum 3 login chances are provided to a user. The user account gets blocked if a user enters the wrong password more than 3 times.)
Click the Save option to save the entered password information.
Click the Clear All option to erase the entered password information.
Please Note:
The administrator can block any user who fails to enter the correct password 3 times.
A user can sign in with the same password only when the administrator enables the user again. (The password must be a combination of alphabetical letters, numerical figures, and a special character, e.g., Admin1@.)
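For illustration only, a password satisfying these settings could be checked with a sketch like the following; the platform enforces its own validation.

```python
# Illustrative check of the password rules described above:
# 6-16 characters, with letters, digits, and at least one special character.
import re

def is_valid_password(password: str) -> bool:
    if not 6 <= len(password) <= 16:
        return False
    has_letter = re.search(r"[A-Za-z]", password) is not None
    has_digit = re.search(r"\d", password) is not None
    has_special = re.search(r"[^A-Za-z0-9]", password) is not None
    return has_letter and has_digit and has_special

print(is_valid_password("Admin1@"))   # True
print(is_valid_password("admin"))     # False: too short, no digit or special character
```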
This page explains how to configure the Migration option for a module through New Version Control.
The Migration option transfers files from the source location to the target location within the software platform. The user can configure the migration-related settings using the New Version Control admin option.
The supported modules for the migration are listed below:
API Service
Dashboard
DS Lab (Repo Sync Projects, Notebook & Model)
Pipeline
Story (Report)
Select the New Version Control option from Configurations under the Admin module.
The Version Control Information form opens.
Select the Migration option from the first dropdown.
All modules that support the Version Control functionality are listed under the Select a Module drop-down. (E.g., Dashboard is selected in the given image.)
Select the Git type as either GitLab or GitHub.
Provide the Host information.
Provide the Token Key.
Click on the Test button.
Select the Project.
Select a Branch where files need to be stored on the VCS.
Configure the More Information fields.
Provide the following information:
Entity App
Work Space Name
Entity Extension
Entity Type
Click the Test option.
A notification message appears to inform about the successful authentication.
The Save option gets enabled. Click the Save option.
A confirmation message appears, and the configuration for the New Version Control gets saved.
BDB provides the Versioning option as a simplified process to track changes across the various modules, ensuring efficient development and enhanced collaboration for effective code management.
Version control systems like Git provide a robust set of features to ensure efficient and reliable software development. They enhance collaboration, enable effective code management, and simplify the process of tracking changes, thereby contributing to the overall quality and maintainability of the software.
The Version Control feature helps users maintain a copy of component versions in the Version Control System (GitLab | GitHub repository). The user can also pull a specific version from the Version Control System after versions have been pushed there. The supported modules for versioning are as follows:
Dataset
Data as API
Data Store
Dashboard
DS Lab (Project & Notebook)
Pipeline script
Select the New Version Control option from the Configurations under the Admin module.
The Version Control information form opens.
Select the Version option from the first dropdown.
All modules that support the Version Control functionality are listed under the Select a Module drop-down. (E.g., Dashboard is selected in the given image.)
Select Git type as either GitLab or GitHub.
Provide the Host information.
Provide the Token Key.
Click on the Test button.
Select the Project.
Select a Branch where files need to be stored on the VCS.
Configure the More Information fields.
Provide the following information:
Entity App
Work Space Name
Entity Extension
Entity Type
Click the Test option.
A notification message appears to inform about the successful authentication.
The Save option gets enabled. Click the Save option.
A confirmation message appears, and the configuration for the Version Control gets saved.
Please Note:
The user needs to set up the associated module configuration for each component.
The configuration fields may vary based on the selection made in the Select field and the Token Type information.
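For context, the Test step verifies that the provided Host and Token Key can reach the Git service. The following is a hedged sketch of an equivalent check using the public GitLab and GitHub APIs; the hosts and token are placeholders, and this is not the platform's internal call.

```python
# Sketch: verifying a personal access token against a Git host, similar in
# spirit to the Test step. Hosts and token are placeholders.
import requests

TOKEN = "<personal-access-token>"

# GitLab: list projects visible to the token.
gitlab = requests.get(
    "https://gitlab.example.com/api/v4/projects",
    headers={"PRIVATE-TOKEN": TOKEN},
)
print("GitLab reachable:", gitlab.status_code == 200)

# GitHub: fetch the authenticated user.
github = requests.get(
    "https://api.github.com/user",
    headers={"Authorization": f"token {TOKEN}"},
)
print("GitHub reachable:", github.status_code == 200)
```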
OpenID Settings allows users to authenticate and authorize themselves on different websites or applications using a single set of credentials.
The admin user can enter the required details to save the Open ID configuration.
Click the Open ID Settings option from the Configurations options of the admin menu.
The Open ID Settings page opens.
The user needs to enter the required details to save the Open ID configuration.
Use the Select Saved Open ID Settings drop-down to access the saved Open ID Settings.
Please Note: Select the saved Open ID settings from the Select Saved Open ID Settings drop-down menu. If no Open ID settings have been saved before, the drop-down will not list any.
There are two options available to configure the settings under the Open ID Settings page:
Individual
Complete URI
Please Note: The user needs to provide the following required details based on the selected configuration option.
Navigate to the Open ID Settings page.
Select the Individual option.
Provide the relevant information for the following fields:
Name
Open ID URL
Grant Type
Client Id
Client Secret
Scope
Click the Save option.
Navigate to the Open ID Settings page.
Select the Complete URI option.
Provide the relevant information for the following fields:
Name
URL
Click the Save option.
A notification message appears (while configuring with either of the given options) to inform that the Open ID Settings have been updated.
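For context, the Individual fields map onto a standard OpenID Connect/OAuth2 token request. The following is a generic sketch with placeholder endpoint, client, and scope values; it is not the platform's internal implementation.

```python
# Generic sketch of an OAuth2/OpenID Connect token request using the
# Individual configuration fields. All values are placeholders.
import requests

OPEN_ID_URL = "https://idp.example.com/oauth2/token"   # Open ID URL (token endpoint)
GRANT_TYPE = "client_credentials"                       # Grant Type
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
SCOPE = "openid"

response = requests.post(
    OPEN_ID_URL,
    data={
        "grant_type": GRANT_TYPE,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": SCOPE,
    },
)
print(response.json().get("access_token"))
```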
This section provides details of CPU and Memory Utilization to the Admin user.
Navigate to the Configurations admin options.
Click the Execution Settings option from the list.
The Execution Settings page opens.
It displays the following Execution Settings details:
Name Space
Configuration List (default list)
Node Pool
Click the Add new field icon.
The Add New Configuration window opens.
Select an option from the context menu.
Provide the required information for the selected new configuration option.
E.g., AI Service is selected in the following image, so the related information it requires is:
Service Type: Select an option from the context menu.
Name: A pre-selected name appears here based on the configuration type selection.
Image Name
Some of the pre-defined fields will be displayed as shown below:
Min Instance
Max Instance
Limit
Memory
Provide Environment Info by inserting a new field.
Configure the Node Pool section.
Click the Add option.
The new configuration field gets added to the Configuration List.
Click the Save option for the Execution Settings page.
A notification message appears, and the concerned Execution Settings get saved with the newly added configuration.
Please Note: Click the Delete icon to remove the added Node Pool details.
The user can configure the Sandbox settings using this option.
Click the Sandbox Settings option from the list of Configurations option.
The Sandbox Settings page appears.
Provide the following information for the Sandbox Settings:
Select the Storage Type for File Upload: Only Network is implemented as the Storage Type, and it is selected by default.
Select Path: Choose the base path where all the Sandbox files will be saved. This path acts as the root directory for file storage.
Path: This is the base path where all the DS Lab files will be saved.
Max File Size (MB): This is the maximum file limit that will be uploaded in the sandbox location from the Data Center.
Temp Location: All the files uploaded from the Data Center will be saved in this location.
Claim Name: This specifies the name of the claim used for persistent storage in the sandbox environment.
Sub Path: Define a sub-directory within the selected path. This helps in organizing files within the main storage path.
Node Pool: The user can add the Key and Value using the Add new field icon.
Click the Save option to save the inserted information.
A notification message ensures that the provided configuration for the Sandbox Settings has been saved.
Please Note:
All DS Lab Notebooks, Models, Transforms, and Artifacts will be saved inside the Sandbox location only.
Only after configuring the Sandbox Settings will the Sandbox files be available inside the Data Sandbox module of the Data Center and Data Science Lab.
By default, Platform connectors will not perform any verification of the server certificate. This means that someone could impersonate the server without the client's awareness. To prevent this kind of impersonation, the client needs to be able to verify the server's identity through a trusted chain of verification.
Pre-requisites: For SSL support to work, the user must have the following:
A MySQL/ClickHouse/PostgreSQL server that supports SSL (currently, the SSL implementation is provided for these data connectors).
A signed client certificate.
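For context, the following is a generic client-side sketch (not the platform's connector code) showing what the uploaded CA Certificate enables: verifying a MySQL server's identity during an SSL connection. Host, credentials, and file paths are placeholders, and the pymysql library is used purely for illustration.

```python
# Generic sketch: verifying a MySQL server's identity using a CA certificate,
# which is what the uploaded CA Certificate makes possible for SSL connectors.
# Host, credentials, and paths are placeholders.
import pymysql

connection = pymysql.connect(
    host="mysql.example.com",
    user="readonly_user",
    password="<password>",
    database="analytics",
    ssl={"ca": "/path/to/ca-cert.pem"},   # trusted CA used to verify the server
)
with connection.cursor() as cursor:
    cursor.execute("SELECT 1")
    print(cursor.fetchone())
connection.close()
```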
Click the Upload Certificate Settings option from the list of Configurations.
The Upload Certificate Settings page opens (It will be an empty list initially).
Click the New option to upload settings for a new certificate.
Provide the required details of the SSL Connector.
Name: This name is a key that can be configured for the respective SSL Connector from the platform.
CA Certificate: Upload a certificate issued by a trusted Certificate Authority that validates the identity of the SSL Connector.
Client Certificate: A Client Certificate is a digital certificate that authenticates the client's identity to the server. It ensures secure and sensitive communications by encrypting data transmitted between the client and the server.
Key File Upload: A small data file that digitally binds a key.
Jks File Upload: It contains the Application Server's trusted certificates.
Jks Trust Store Upload: It contains the Application Server's trusted certificates.
Jks File Password: Password used to protect a JKS file.
Jks Trust Store Password: Password used to protect a JKS file that contains trusted certificates.
After uploading the required certificates and details, click the Save option to save the details.
A notification message appears.
The uploaded certificate gets listed on the Certificate List page.
Please Note:
Spaces are not allowed in the Name.
Use the Back option to exit the page.
Navigate to the Upload Certificate Settings page.
Click the View icon for an uploaded certificate entry.
The details of the selected certificate entry are displayed.
Navigate to the Upload Certificate Settings page.
Click the Delete icon for an uploaded entry.
The Delete Certificate confirmation dialog box appears.
Confirm the deletion by clicking the Yes option.
A notification message appears to confirm the deletion of the SSL configuration, and the selected SSL configuration gets deleted.
Please Note: The Admin can choose the Cancel option to cancel the deletion.
The Secret Management setting helps prevent database-related or other sensitive/confidential information from getting exposed. In the Secret Management Settings, the Admin configures the Key of the sensitive data, while the actual value is saved in Kubernetes as an environment variable by the DevOps team.
Click on the Secret Management icon from the Configurations section provided in the Admin panel.
The Secret Management page opens.
Click on the New option to add a new secret.
Click the Add New Field icon for the Secret Management option.
An option to add the Secret Key appears as shown in the below image:
Provide the Secret Key in the given space.
Click the Save option.
A confirmation message appears.
The secret key will be saved and displayed in the list.
Add a new field to the Secret Key by clicking the Add new field (+) icon for the Secret Key.
Add multiple fields like Port, Host, Username, Password, etc. depending upon the selected DB.
Once all the required fields are added, click the Save option.
A confirmation message appears to ensure that all the inserted fields are saved.
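For context, because the actual values live in Kubernetes as environment variables, a script or pipeline component typically reads them at runtime. The sketch below assumes placeholder key names; they must match the Secret Keys configured here and exposed by the DevOps team.

```python
# Sketch: reading secret values that DevOps has exposed as environment
# variables in Kubernetes. The key names below are placeholders; they must
# match the Secret Keys configured by the Admin.
import os

db_host = os.environ["MYSQL_HOST"]
db_port = os.environ.get("MYSQL_PORT", "3306")
db_user = os.environ["MYSQL_USERNAME"]
db_password = os.environ["MYSQL_PASSWORD"]

print(f"Connecting to {db_host}:{db_port} as {db_user}")
# The password itself is never printed or hard-coded in scripts.
```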
Please Note:
Click the Share icon for a saved Secret Key.
The Share Secret Key window opens.
Select the USER or USER GROUP option to list the users or user groups (use the EXCLUDE USER option to exclude a user from accessing a shared Secret Key).
Select User(s)/User Group(s) from the list.
Use an arrow to move the selected user(s) and user group(s) to the right side box.
Once the selected user(s) and user group(s) appear in the right-side box, click the Save option.
A confirmation message appears and the Secret Key gets shared with the selected user(s) or user group(s).
Please Note:
Admins must request DevOps to add keys and values in Kubernetes, adhering to Kubernetes naming conventions (use hyphens or underscores; special characters are restricted).
Admins must share these secret keys with relevant users/user groups for access.
Once configured and shared with the respective users and user groups, these keys can be utilized across platform modules.
Click the View icon to display all the added fields.
Click the Delete icon to remove the added Secret key.
Click the Close icon to remove an added field.
Once the Admin has configured the settings, it is possible to share them with a user/user group to use the encrypted secret keys. Click the Share icon to share a Secret Key.