Manage Jobs
You can access jobs that you created or that were shared with you from the Jobs page. From this page, you can view, update, and manage job-related actions.
The List Jobs page displays all saved jobs for the logged-in user. From this page, you can:
View and manage job metadata.
Monitor job history and system logs.
Pin/unpin jobs for quick access.
Perform job actions such as push, share, edit, or delete.
Search and filter jobs.
Review recent run statuses at a glance.
Access the Job List
Navigation path: Data Engineering > Jobs > Jobs List
Click the Jobs option in the Data Engineering module.
The Jobs page opens, displaying all jobs accessible to the logged-in user.
Job Details and History
When you select a job from the list, a panel opens on the right with three tabs:
Job Details
Displays key job metadata, including:
Tasks: Number of tasks in the job.
Created: User who created the job, with date and timestamp.
Updated: User who last updated the job, with date and timestamp.
Last Activated / Deactivated: User and timestamp of last activation/deactivation.
Cron Expression: Scheduling pattern string.
Trigger Interval: Interval at which the job runs (e.g., every 5 minutes).
Next Trigger: Date and time of the next scheduled run.
Description: User-provided description of the job.
Resource Allocation: Total resources allocated to the job, by job type:
Python Jobs:
Total Allocated Min/Max CPU (cores)
Total Allocated Min/Max Memory (MB)
Spark/PySpark Jobs:
Total Allocated CPU (cores)
Total Allocated Memory (MB)
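The Cron Expression, Trigger Interval, and Next Trigger fields follow standard cron semantics: for example, the pattern `*/5 * * * *` runs a job every 5 minutes. As an illustrative sketch only (the helper below is not part of the platform; it is an assumption for demonstration), the next trigger time for such a fixed-interval schedule can be computed like this:

```python
from datetime import datetime, timedelta

def next_trigger(now: datetime, interval_minutes: int) -> datetime:
    """Round `now` up to the next multiple of `interval_minutes`,
    mimicking a cron pattern like '*/5 * * * *' (every 5 minutes).
    Hypothetical helper for illustration, not a platform API."""
    # Work on whole minutes by dropping seconds and microseconds.
    base = now.replace(second=0, microsecond=0)
    # Minutes remaining until the next multiple of the interval.
    remainder = base.minute % interval_minutes
    step = interval_minutes - remainder
    return base + timedelta(minutes=step)

# Example: at 09:02, a 5-minute schedule next fires at 09:05.
print(next_trigger(datetime(2024, 1, 1, 9, 2), 5))  # 2024-01-01 09:05:00
```

Real cron engines evaluate all five fields (minute, hour, day of month, month, day of week); this sketch covers only the simple fixed-interval case shown in the Trigger Interval example.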
Job History
Displays past job runs with succeeded, failed, or interrupted status.
Options available:
Clear: Removes all job run history and logs (confirmation required).
Refresh: Updates the displayed history.
View System Logs: Opens a drawer panel to view or download pod logs.
Select the target instance from the Hostname drop-down.
Pin & Unpin Jobs
Pin jobs for quick access. Pinned jobs appear at the top of the job list.
Multiple jobs can be pinned.
Use the Unpin icon to remove a job from the pinned list.
Job Actions
The Actions menu on each job card allows you to:
Push/Pull Job: Push the job configuration to the VCS, or pull a pushed version of the selected job from the VCS.
View: Redirects to the Job Editor page of the selected job.
Share: Share the job with other users or groups, defining their entitlements.
Job Monitoring: Open the monitoring page for the job.
Edit: Update job configuration (disabled if the job is active).
Delete: Move the job to Trash.
Search Jobs
Use the Search Bar on the List Jobs page to find specific jobs.
The list updates dynamically as you type.
Example: Typing "san" lists all jobs containing the text "san".
Customize Job List
The Jobs header contains a Filter option to refine the displayed list.
Selecting a filter modifies the displayed jobs based on the chosen category.
The job list can be customized based on the following criteria:
Job Type: PySpark, Scala Spark, Script Executor, Python.
Job State: Interrupted, Running, Failed, and Succeeded.
Scheduled: Scheduled and Not Scheduled.
Recent: Recently Visited, Owned by Me (the logged-in user).
Recent Runs
The Recent Runs section provides a quick view of the five most recent job executions for each listed job.
Status Types
Succeeded: Job completed successfully (Green).
Failed: Job failed to complete (Red).
Interrupted: Job stopped before completion (Yellow).
Running: Job currently in progress (Blue).
No Run: No runs recorded (Grey).
Features
Hover over a run status icon to see a tooltip with details:
Status: Succeeded, Interrupted, Failed, Running.
Started At: Job start time.
Stopped At: Time when the job stopped.
Completed At: Job completion time.