Jobs
The Job Overview tab provides a high-level summary of all jobs in the Data Pipeline module. It opens by default when you navigate to the Pipeline and Job Overview page.
The tab displays information in graphical and tile-based formats, helping users monitor the health, status, and type of jobs at a glance.
Job Status
The Job Status section shows the total number of jobs, along with their distribution by execution state:
Running: Count and percentage of currently active jobs.
Success: Count and percentage of jobs that completed successfully.
Interrupted: Count and percentage of jobs stopped before completion.
Failed: Count and percentage of jobs that failed to execute.
Interactive option:
Click any status tile to view a detailed list of jobs in that category.
For each listed job:
View: Opens the selected job workspace.
Monitor: Opens the monitoring page for the selected job.
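The count-and-percentage pairing shown on each status tile can be sketched as follows. This is an illustrative example only, not product code: the function name summarize_statuses and the input dictionary are hypothetical, and the product computes these figures internally.

```python
# Hypothetical sketch: derive the (count, percentage) pair each status
# tile displays from raw per-status job counts.

def summarize_statuses(counts):
    """Return a (count, percentage) tuple per status, as shown on the tiles."""
    total = sum(counts.values())
    return {
        status: (n, round(100 * n / total, 1)) if total else (n, 0.0)
        for status, n in counts.items()
    }

tiles = summarize_statuses(
    {"Running": 3, "Success": 12, "Interrupted": 1, "Failed": 4}
)
# e.g. tiles["Success"] -> (12, 60.0): 12 of 20 jobs completed successfully
```

Percentages are always taken against the total job count, so the four status tiles together account for 100% of jobs.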
Job Type
The Job Type section groups jobs by type, displaying the count and percentage for each category in graphical format. Supported job types include:
Script Executor
Spark Job
Python Job
PySpark Job
Interactive option:
Click a job type tile to view all jobs created under that category.
For each listed job:
View: Opens the selected job workspace.
Monitor: Opens the monitoring page for the selected job.