
Data Lake Settings


Last updated 1 year ago

  • Click Data Lake Settings under the Configurations admin option.

  • The Data Lake Settings form opens.

  • The user needs to enter the following details to save the Data Lake Settings, based on the selected server option:

  • Fields to be configured with the Same Server option:

    • The user needs to provide the Username, Password, Host, and Port address. If the selected server is Hadoop, the HDFS Port number is also required.

    • Click the Save option to save the entered Data Lake Settings.
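The Same Server checks above can be sketched as a small validation helper. This is a minimal illustration, not the product's actual API; the function name and field keys (`server_type`, `hdfs_port`, etc.) are assumptions chosen to mirror the form labels.

```python
def validate_same_server_settings(settings: dict) -> list:
    """Return the required fields that are missing for the Same Server option."""
    # Username, Password, Host, and Port are always required.
    required = ["username", "password", "host", "port"]
    # If the selected server is Hadoop, the HDFS Port is also required.
    if settings.get("server_type") == "Hadoop":
        required.append("hdfs_port")
    return [field for field in required if not settings.get(field)]
```

For example, a Hadoop configuration missing only the HDFS Port would fail with `["hdfs_port"]`, while a complete Hive configuration would pass with an empty list.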

  • Fields to be configured with the Different Server option:

    • The user needs to provide the Username, Password, Host, and Port address to save their data into the Hadoop and Hive Data Lakes. (For Hadoop, the HDFS Port number is also required.)

    • The user needs to provide details for the parent and child folders, together with the database name.

    • Click the Save option to save the entered Data Lake Settings.
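The Different Server option can be sketched the same way: each data lake carries its own connection details, and the folder hierarchy and database name are required on top. As above, this is an illustrative sketch; the nested keys (`hadoop`, `hive`, `parent_folder`, `child_folder`, `database_name`) are assumed names, not the product's documented schema.

```python
def validate_different_server_settings(settings: dict) -> list:
    """Return missing required fields when Hadoop and Hive run on different servers."""
    missing = []
    # Each data lake needs its own Username, Password, Host, and Port.
    for lake in ("hadoop", "hive"):
        conn = settings.get(lake, {})
        for field in ("username", "password", "host", "port"):
            if not conn.get(field):
                missing.append(f"{lake}.{field}")
    # Hadoop additionally requires the HDFS Port.
    if not settings.get("hadoop", {}).get("hdfs_port"):
        missing.append("hadoop.hdfs_port")
    # Parent folder, child folder, and database name are also required.
    for field in ("parent_folder", "child_folder", "database_name"):
        if not settings.get(field):
            missing.append(field)
    return missing
```

A fully filled-in form yields an empty list; any omitted field appears in the result prefixed with the lake it belongs to (e.g. `hive.password`).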

  • A notification message appears, and the provided Data Lake Settings are saved.