Register

The following steps describe how to register a DSL Notebook as a Job.


Registering a .ipynb File

The user can register a Notebook script as a Job by using this functionality.

  • Navigate to the Notebook tab of a Repo Sync Project.

  • Click the ellipsis icon for a .ipynb file.

  • Click the Register option for the selected .ipynb file.

  • The Register as Job page opens.

  • Select the script by using the checkbox.

  • Click the Next option.

  • The next page opens to Validate the Script. Click the Validate icon.

  • A notification message appears confirming that the script is valid.

  • Once the script is validated, the Next option is enabled. Click the Next option.

  • Provide the following information:

    • Enter scheduler name: Provide a name for the registered Job.

    • Scheduler description: Provide a description for the scheduler.

    • Start function: Select a function from the drop-down menu (a minimal sketch of such a function appears after this list).

    • Job basinfo: Select an option from the drop-down menu.

  • Docker Config

    • Limit: Choose an option out of Low, Medium, and High.

    • Request: The CPU and Memory limits are displayed.

  • Click the Save option to register the selected .ipynb file as a Job.

  • A notification message appears.

  • Navigate to the List Jobs page within the Data Pipeline module.

  • The recently registered DS Notebook is listed under the name that was provided as its Scheduler name.
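
The Start function selection implies that the notebook defines at least one top-level function that the registered Job can call as its entry point. The following is a minimal, hypothetical sketch of such a notebook cell; the function name start_job and its contents are illustrative assumptions, not the platform's required signature.

```python
# Hypothetical notebook cell: a top-level function that could be picked
# as the Start function while registering the .ipynb file as a Job.
import pandas as pd


def start_job():
    """Assumed entry point; the registered Job would invoke the selected function."""
    # Illustrative work only: build a tiny DataFrame and report its size.
    df = pd.DataFrame({"value": [1, 2, 3]})
    print(f"Processed {len(df)} rows")
    return df
```

When such a notebook is registered, a function like start_job would presumably appear in the Start function drop-down menu.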

Re-Registering a .ipynb File

This option appears for a .ipynb file that has been registered before.

  • Click the Register option for the previously registered .ipynb file.

  • The Register as Job page opens displaying the Re-Register and Register as New options.

  • Select the Re-Register option by using the checkbox.

  • Select a version by using a checkbox.

  • Click the Next option.

  • Select the script by using the checkbox.

  • Click the Next option.

  • The next page opens to Validate the Script. Click the Validate icon.

  • A notification message appears confirming that the script is valid.

  • Once the script is validated, the Next option is enabled. Click the Next option.

  • The following information appears pre-filled, except for the Start function and the Spark Configuration:

    • Enter scheduler name

    • Scheduler description

  • Start function: Select a function from the drop-down menu.

  • Job basinfo: Select an option from the drop-down menu.

  • Spark Config (an illustrative configuration sketch follows this list)

    • Driver: Provide Core and Memory details. The minimum memory is 512 MB.

    • Executor: Provide Core, Memory, and Instance details. The minimum memory is 512 MB.

    • Concurrency Policy: Select an option out of Allow, Forbid, and Replace from the drop-down menu.

    • Scheduler time: Set the scheduler time by using the cron generator. The default time is 1 minute.

    • Alert: Choose Success or Failure as the scenario for sending an alert notification.

  • Click the Save option to register the selected .ipynb file as a Job.

  • A notification message appears.

  • Navigate to the List Jobs page within the Data Pipeline module.

  • The recently registered DS Notebook is listed under the name that was provided as its Scheduler name.
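
For context, the Driver and Executor fields above correspond closely to standard Spark resource properties. The sketch below shows how an equivalent configuration might be expressed in PySpark; the property mapping and the values are assumptions for illustration, not the platform's documented internal behaviour.

```python
# Illustrative mapping of the Spark Config fields to standard Spark properties;
# the values shown are examples only.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("registered-notebook-job")       # assumed job name
    .config("spark.driver.cores", "1")         # Driver: Core
    .config("spark.driver.memory", "512m")     # Driver: Memory (512 MB minimum)
    .config("spark.executor.cores", "1")       # Executor: Core
    .config("spark.executor.memory", "512m")   # Executor: Memory (512 MB minimum)
    .config("spark.executor.instances", "2")   # Executor: Instances
    .getOrCreate()
)
```

Similarly, the default Scheduler time of 1 minute corresponds to the standard cron expression * * * * * (run every minute), and the Allow, Forbid, and Replace options appear to mirror the Kubernetes CronJob concurrencyPolicy values, though that mapping is an assumption.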

Please Note:

  • Select the Register as New option for a .ipynb file if you wish to register it once again in the Data Pipeline module. In this case, the same file will be listed twice on the List Jobs page.

    • The user can provide a Scheduler Name and Description for the file while registering it as a Job; these fields are not pre-filled.
