
Import Model

External models can be imported into the Data Science Lab and experimented with inside Notebooks.


Last updated 10 months ago

Please Note:

  • External models can be registered to the Data Pipeline module, and they can be inferred using the Data Science Lab script runner.

  • Only the native prediction functionality will work for External models.

Importing a Model

  • Navigate to the Model tab for a Repo Sync Project.

  • Click the Import Model option.

  • The user is redirected to upload the model file. Select and upload the file.

  • A notification message appears.

  • The imported model gets added to the model list.

Exporting the Model to Data Pipeline

The user needs to start a new .ipynb file with a wrapper function that takes in the data, loads the imported model, runs the predict function, and outputs a dataset with predictions.

  • Create a new .ipynb file or navigate to an existing .ipynb file.

  • Use the code cell to write the needed wrapper function.

  • Access the Imported Model inside this .ipynb file.

  • Load the model and use it inside your script.

  • Run the script.

  • Run the code cell with the inference script to get the preview of the data.

  • Register the model using the Models tab inside the .ipynb file.

  • The Register Model dialog box appears to confirm the model registration.

  • Click the Yes option.

  • A notification message appears, and the model gets registered.

  • Export the script using the Export functionality provided for the .ipynb file.

  • The Export to Pipeline window appears.

  • Select the Export to Pipeline option.

  • Select a specific script from the Notebook, or choose the Select All option to select the full script.

  • Select the Next option.

  • Click the Validate icon to validate the script.

  • A notification message appears confirming that the script is valid.

  • Click the Export to Pipeline option.

  • A notification message appears confirming that the selected Notebook has been exported.

Please Note: The imported model gets registered to the Data Pipeline module as a script.
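The wrapper described above can be sketched as a minimal, dependency-free Python example. The `ThresholdModel` class, the file name `external_model.pkl`, and the `predict` signature are illustrative assumptions standing in for a real external model and its native prediction method:

```python
import os
import pickle

MODEL_PATH = "external_model.pkl"  # hypothetical name for the imported model file

class ThresholdModel:
    """Toy stand-in for an external model exposing a native predict() method."""
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, rows):
        # Native prediction: label each value against the stored threshold.
        return [1 if value >= self.threshold else 0 for value in rows]

def save_demo_model():
    # Simulate the "Import Model" step by writing a .pkl file to disk.
    with open(MODEL_PATH, "wb") as f:
        pickle.dump(ThresholdModel(threshold=0.5), f)

def predict(rows):
    """Wrapper (start function): load the imported model, run its native
    prediction, and return the input data with predictions attached."""
    with open(MODEL_PATH, "rb") as f:
        model = pickle.load(f)
    predictions = model.predict(rows)
    return list(zip(rows, predictions))

save_demo_model()
print(predict([0.2, 0.7, 0.5]))  # each input value paired with its predicted label
```

In a real project the wrapper would load one of the supported formats (.pkl, .h5, .pth, .pt) with the matching framework, and the wrapper function's name is what you would later select as the Start Function when configuring the pipeline.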

Accessing the Exported Model within the Pipeline User Interface

  • Navigate to the Data Pipeline Workflow editor.

  • Drag the DS Lab Runner component and configure the Basic Information.

  • Open the Meta Information tab of the DS Lab Runner component.

  • Configure the following information for the Meta Information tab.

    • Select Script Runner as the Execution Type.

    • Select the Function Input Type.

    • Select the project name.

    • Select the Script Name from the drop-down menu. The name given to the imported model appears as the script name.

    • Provide details for the External Library (if applicable).

    • Select the Start Function from the drop-down menu.

  • The exported model along with the script can be accessed inside the Script section.

  • The user can connect the DS Lab Script Runner component to an Input Event.

  • Run the Pipeline.

  • The model predictions can be generated in the Preview tab of the connected Input Event.
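As a rough summary, the Meta Information fields configured above can be captured in a structure like the following. The keys and sample values are assumptions mirroring the UI labels, not an actual Data Pipeline configuration schema:

```python
# Illustrative mapping of the DS Lab Runner Meta Information fields.
# All keys and sample values below are assumptions based on the UI steps above.
ds_lab_runner_meta = {
    "execution_type": "Script Runner",       # execution type for imported models
    "function_input_type": "DataFrame",      # assumed input-type option
    "project_name": "MyRepoSyncProject",     # hypothetical project name
    "script_name": "SklearnModel",           # same name as the imported model
    "external_libraries": ["scikit-learn"],  # only if applicable
    "start_function": "predict",             # the wrapper's entry point
}

print(ds_lab_runner_meta["script_name"])
```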

Please Note:

  • The Imported Models can be accessed through the Script Runner component inside the Data Pipeline module.

  • The other models can be accessed through the Model Runner component inside the Data Pipeline.

Try out the Import Model Functionality yourself

Some sample models and related scripts are provided below for users to try out this functionality. Download them with a click and use them in your Notebook by following the steps above.

Sample files for Sklearn

Sample files for Keras

Sample files for PyTorch

Please Note:

  • The supported extensions for External models are .pkl, .h5, .pth, and .pt.
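As a hedged illustration, these extensions correspond to the usual save formats of common frameworks. The Keras and PyTorch calls are shown only as comments so this sketch stays dependency-free; the runnable part produces a .pkl file of the kind the Import Model option expects, using a plain dictionary as a stand-in for a trained model:

```python
import os
import pickle

# Which framework typically produces each supported extension:
#   .h5        -> Keras:   model.save("model.h5")
#   .pth / .pt -> PyTorch: torch.save(model.state_dict(), "model.pth")
#   .pkl       -> Sklearn / generic Python objects, e.g. via pickle:
trained_model = {"weights": [0.1, 0.2]}  # stand-in for a trained model object

with open("external_model.pkl", "wb") as f:
    pickle.dump(trained_model, f)

# Round-trip check: the saved file restores to the same object.
with open("external_model.pkl", "rb") as f:
    restored = pickle.load(f)
print(restored == trained_model)
```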

Please Note: Imported models are referred to as External models in the model list, and they are marked with a prefix to their names (as displayed in the image above).

Refer to the page linked below for an overview of the Data Science Lab module in a nutshell.

Data Science Lab Quick Start Flow
  • SklearnModel.pkl (923B): Sample Sklearn model for import.
  • Importmodels_Sklearn_Inference.ipynb (6KB): Sample Python script based on the imported Sklearn model.
  • KersModel.h5 (18KB): Sample Keras model for import.
  • Importmodels_Keras_Inference.ipynb (6KB): Sample Python script based on the imported Keras model.
  • Pytorch_Model.pth (8KB): Sample PyTorch model for import.
  • ImportModel_Pytorch_Inference.ipynb (10KB): Sample Python script based on the imported PyTorch model.