Register as Job

The following walkthrough illustrates how to register a DSL Notebook as a Job.

Registering a PySpark Notebook

This functionality allows the user to register a Notebook script as a Job.

  • Select a Notebook from the left side panel.

  • Navigate to the Notebook List.

  • Click the Register option for the selected Notebook.

  • The Register as Job page opens.

  • Provide the following information:

    • Scheduler name

    • Scheduler description

    • Start function

    • Job baseinfo

    • Spark Config

      • Choose an option from Low, Medium, and High.

      • Driver: based on the selected Spark configuration option (Low/Medium/High), the CPU and Memory limits are displayed.

  • Click the Save option to register the Notebook as a Job.

  • A notification message appears.

  • Navigate to the List Jobs page within the Data Pipeline module.

  • The recently registered DS Notebook gets listed with the same name that was chosen as its Scheduler name.
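
The form fields collected on the Register as Job page can be pictured as a simple payload. The sketch below is a hypothetical illustration, not the product's actual API: the field names mirror the UI labels above, while the function name, payload keys, and the preset CPU/Memory values for Low/Medium/High are assumptions for demonstration only.

```python
# Hypothetical sketch of the Register as Job form as a payload.
# The Low/Medium/High driver limits shown here are illustrative
# assumptions, not the platform's real values.

SPARK_CONFIG_PRESETS = {
    "Low":    {"driver_cpu": 1, "driver_memory_gb": 2},
    "Medium": {"driver_cpu": 2, "driver_memory_gb": 4},
    "High":   {"driver_cpu": 4, "driver_memory_gb": 8},
}

def build_register_job_payload(scheduler_name, scheduler_description,
                               start_function, job_baseinfo, spark_config):
    """Assemble a registration payload from the Register as Job fields."""
    if spark_config not in SPARK_CONFIG_PRESETS:
        raise ValueError("Spark Config must be one of Low, Medium, or High")
    return {
        "scheduler_name": scheduler_name,
        "scheduler_description": scheduler_description,
        "start_function": start_function,
        "job_baseinfo": job_baseinfo,
        "spark_config": spark_config,
        # Driver CPU and Memory limits follow the selected preset,
        # matching what the UI displays.
        "driver": SPARK_CONFIG_PRESETS[spark_config],
    }

payload = build_register_job_payload(
    "daily_feature_job",          # appears as the Job name in List Jobs
    "Builds features daily",
    "main",
    {"project": "demo"},
    "Medium",
)
```

After saving, the Job would appear on the List Jobs page under the Scheduler name supplied here (`daily_feature_job` in this sketch).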

Please Note: Refer to the Data Science Lab Quick Start Flow page for an overview of the Data Science Lab module in a nutshell.
