
Import Model

External models can be imported into the Data Science Lab and experimented with inside Notebooks using the Import Model functionality.


Last updated 2 months ago

Please Note:

  • External models can be registered to the Data Pipeline module, and inference can be run on them using the Data Science Lab Script Runner.

  • Only the native prediction functionality works for External models.

Importing a Model

Check out the illustration on importing a model.

  • Navigate to the Model tab for a Data Science Project.

  • Click the Import Model option.

  • The user is redirected to upload the model file. Select and upload the file.

  • A notification message appears.

  • The imported model gets added to the model list.

Please Note: Imported models are referred to as External models in the model list and are marked with a prefix to their names (as displayed in the image above).

Exporting a Model to the Data Pipeline

You can integrate and export Data Science models into your data pipeline, enabling optimized performance, real-time insights, and data-driven decision-making. To do so, start a new .ipynb file with a wrapper function that takes the data and the imported model, calls the predict function, and outputs a dataset with predictions.
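As a hedged sketch, such a wrapper might look like the following. The function name, the numeric-feature selection, and the `prediction` column name are illustrative assumptions, not a fixed DSLab contract:

```python
import pandas as pd

# Illustrative wrapper for the .ipynb file: take the data and the imported
# model, call the model's native predict, and output a dataset with
# predictions. All names here are assumptions for the sketch.
def predict_wrapper(data: pd.DataFrame, model) -> pd.DataFrame:
    """Run the imported model's native predict and return the dataset with predictions."""
    features = data.select_dtypes("number")  # assumption: the model expects numeric columns
    out = data.copy()
    out["prediction"] = model.predict(features)
    return out
```

Adapt the feature selection and output columns to whatever your imported model actually expects.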

  • Navigate to a Data Science Notebook (.ipynb file) from an activated project. In this case, a notebook has been imported with the wrapper function.

  • Access the Imported Model inside this .ipynb file.

  • Load the imported model to the Notebook cell.

  • Mention the loaded model in the inference script.

  • Run the code cell with the inference script.

  • The Data preview is displayed below.

  • Click the Register option for the imported model from the ellipsis context menu.

  • The Register Model dialog box appears to confirm the model registration.

  • Click the Yes option.

  • A notification message appears, and the model gets registered.

  • Export the script using the Export functionality provided for the Data Science Notebook (.ipynb file).

  • Another notification appears to ensure that the Notebook is saved.

  • The Export to Pipeline window appears.

  • Select a specific script from the Notebook, or choose the Select All option to select the full script.

  • Select the Next option.

  • Click the Validate icon to validate the script.

  • A notification message appears to ensure the validity of the script.

  • Click the Export to Pipeline option.

  • A notification message appears to ensure that the selected Notebook has been exported.

Please Note: The imported model gets registered to the Data Pipeline module as a script.

Accessing the Exported Model within the Pipeline User interface

  • Navigate to the Data Pipeline Workflow editor.

  • Drag the DS Lab Runner component and configure the Basic Information.

  • Open the Meta Information tab of the DS Lab Runner component.

  • Configure the following information for the Meta Information tab.

    • Select Script Runner as the Execution Type.

    • Select function input type.

    • Select the project name.

    • Select the Script Name from the drop-down option. The same name given to the imported model appears as the script name.

    • Provide details for the External Library (if applicable).

    • Select the Start Function from the drop-down menu.

  • The exported model can be accessed inside the Script section.

  • The user can connect the DS Lab Script Runner component to an Input Event.

  • Run the Pipeline.

  • The model predictions can be generated in the Preview tab of the connected Input Event.
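The Start Function selected in the Meta Information tab is the entry point the Script Runner invokes. As a rough sketch, it might receive the connected Input Event's records and return them with predictions attached; the name `start_predict` and the list-of-records interface are assumptions here, since the exact contract is defined by the Data Pipeline module:

```python
# Hypothetical start function for a DS Lab Script Runner component.
# `start_predict` and its signature are illustrative assumptions.
def start_predict(records, model):
    """Score each incoming event record and attach the model's prediction."""
    scored = []
    for rec in records:
        # Assumption for the sketch: numeric fields are the model features.
        features = [v for v in rec.values() if isinstance(v, (int, float))]
        enriched = dict(rec)
        enriched["prediction"] = model.predict([features])[0]
        scored.append(enriched)
    return scored
```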

Please Note:

  • The Imported Models can be accessed through the Script Runner component inside the Data Pipeline module.

  • The execution type should be Model Runner inside the Data Pipeline while accessing the other exported Data Science models.

  • The supported extensions for External models are .pkl, .h5, .pth, and .pt.
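As a minimal sketch of how those extensions map to frameworks, a notebook cell might dispatch on the file suffix. `load_external_model` and `loader_for` are illustrative helpers, not DSLab APIs; `pickle.load`, `keras.models.load_model`, and `torch.load` are the standard deserialization calls for each format:

```python
from pathlib import Path

# Supported External-model extensions and the framework that reads each one.
SUPPORTED = {".pkl": "sklearn/pickle", ".h5": "keras", ".pth": "pytorch", ".pt": "pytorch"}

def loader_for(model_path: str) -> str:
    """Return which framework loader applies to an external model file."""
    suffix = Path(model_path).suffix.lower()
    if suffix not in SUPPORTED:
        raise ValueError(f"Unsupported model extension: {suffix}")
    return SUPPORTED[suffix]

def load_external_model(model_path: str):
    """Deserialize an external model with the framework matching its extension."""
    kind = loader_for(model_path)
    if kind == "sklearn/pickle":
        import pickle
        with open(model_path, "rb") as f:
            return pickle.load(f)
    if kind == "keras":
        from tensorflow import keras  # requires TensorFlow
        return keras.models.load_model(model_path)
    import torch  # .pth / .pt require PyTorch
    return torch.load(model_path)
```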

Try out the Import Model Functionality yourself

Sample models and related scripts are provided below so you can try this functionality. Download them and use them inside your Data Science Notebook by following the steps described above.

Sample files for Sklearn

  • SklearnModel .pkl (923B): Sample Sklearn model for import.

  • Importmodels_Sklearn_Inference (1).ipynb (6KB): Sample Python script based on the imported Sklearn model.

Sample files for Keras

  • KersModel.h5 (18KB): Sample Keras model for import.

  • Importmodels_Keras_Inference.ipynb (6KB): Sample Python script based on the imported Keras model.

Sample files for PyTorch

  • Pytorch_Model.pth (8KB): Sample PyTorch model for import.

  • ImportModel_Pytorch_Inference.ipynb (10KB): Sample Python script based on the imported PyTorch model.