
Environments

The Environments option allows users to select the environment they want to work in. The currently supported Python frameworks are Sklearn (default), TensorFlow, PyTorch, and PySpark.

Please Note: A user can select only a single environment for a given project. The project will carry all the dependencies of the environment selected at the project level.

Supported environments

  • TensorFlow: Sklearn commands can be executed in the notebook by default. If users select the TensorFlow environment, they do not need to install packages such as TensorFlow and Keras explicitly in the notebook; these packages can simply be imported inside the notebook.

  • PyTorch: If users select the PyTorch environment, they do not need to install packages such as Torch and Torchvision explicitly in the notebook; these packages can simply be imported inside the notebook.

  • PySpark: If users select the PySpark environment, they do not need to install the PySpark package explicitly in the notebook; it can simply be imported inside the notebook (see the sketch after this list).
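
The snippet below is a minimal, hypothetical sketch (not taken from the product documentation) of what this means in practice: pasted into a notebook cell, it tries to import each environment's framework packages and prints the version of whichever ones the selected environment preinstalls, without any explicit installation step.

```python
# Hypothetical sanity check for a Data Science Lab notebook cell.
# Only the stanza matching the environment selected at the project level
# is expected to succeed; the other imports are skipped gracefully.

try:
    import tensorflow as tf          # preinstalled in the TensorFlow environment
    from tensorflow import keras     # Keras ships with the TensorFlow environment
    print("TensorFlow", tf.__version__)
except ImportError:
    print("TensorFlow/Keras are not preinstalled in this environment")

try:
    import torch                     # preinstalled in the PyTorch environment
    import torchvision
    print("Torch", torch.__version__, "| Torchvision", torchvision.__version__)
except ImportError:
    print("Torch/Torchvision are not preinstalled in this environment")

try:
    import pyspark                   # preinstalled in the PySpark environment
    print("PySpark", pyspark.__version__)
except ImportError:
    print("PySpark is not preinstalled in this environment")
```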

Please Note: Refer to the following page to get an overview of the Data Science Lab module in a nutshell.

Data Science Lab Quick Start Flow