List Jobs

This option lists all the Jobs saved by the logged-in user.

The List Jobs option opens the available Jobs List for the logged-in user. All the Jobs saved by a user are listed on this page. By clicking on a Job name, the Details tab gets displayed on the right side of the page with the basic details of the selected job.

Job List

  • Navigate to the Data Pipeline Homepage.

  • Click on the List Jobs option.

  • The List Jobs page opens, displaying the created jobs.

Job Details & History Tabs

  • Click on a Job from the displayed list.

  • This will open a panel containing three tabs:

Job Details

The Job Details tab displays key metadata: who created and last updated the job, and when it was last activated and deactivated. This information helps users track changes and manage their workflows effectively.

  • Tasks: The number of tasks used in the job.

  • Created: The name of the user who created the job (with the date and time stamp).

  • Updated: The name of the user who last updated the job (with the date and time stamp).

  • Last Activated: The name of the user who last activated the job (with the date and time stamp).

  • Last Deactivated: The name of the user who last deactivated the job (with the date and time stamp).

  • Cron Expression: A string representing the schedule on which the job runs (see the sketch after this list).

  • Trigger Interval: The interval at which the job is triggered (e.g., every 5 minutes).

  • Next Trigger: The date and time of the next scheduled trigger for the job.

  • Description: The description of the job provided by the user.
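As an illustration, here is a minimal sketch of how a Cron Expression relates to the Next Trigger field, assuming the standard five-field cron syntax and the third-party croniter package (both are assumptions made for illustration; the platform's own scheduler remains authoritative):

    from datetime import datetime
    from croniter import croniter  # assumed third-party helper: pip install croniter

    # Hypothetical schedule matching the "every 5 minutes" example above.
    cron_expression = "*/5 * * * *"

    # Compute the next two trigger times starting from "now".
    schedule = croniter(cron_expression, datetime.now())
    print("Next Trigger:", schedule.get_next(datetime))
    print("Following run:", schedule.get_next(datetime))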

Total Job Config

Python Jobs

  • Total Allocated Min CPU: Total minimum allocated CPU (in cores).

  • Total Allocated Min Memory: Total minimum allocated memory (in MB).

  • Total Allocated Max CPU: Total maximum allocated CPU (in cores).

  • Total Allocated Max Memory: Total maximum allocated memory (in MB).

Spark & PySpark Jobs

  • Total Allocated CPU: Total allocated CPU cores.

  • Total Allocated Memory: Total allocated memory in megabytes (MB).
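The totals above aggregate the resources configured for a job. A minimal sketch, assuming (purely for illustration) that the totals are sums of hypothetical per-task allocations; the field names below are illustrative, not the platform's API:

    # Hypothetical per-task allocations for a Python job.
    tasks = [
        {"min_cpu": 0.5, "max_cpu": 1.0, "min_memory_mb": 512, "max_memory_mb": 1024},
        {"min_cpu": 0.5, "max_cpu": 2.0, "min_memory_mb": 256, "max_memory_mb": 2048},
    ]

    # Total Job Config values as simple sums across tasks.
    total_min_cpu = sum(t["min_cpu"] for t in tasks)              # 1.0 cores
    total_max_cpu = sum(t["max_cpu"] for t in tasks)              # 3.0 cores
    total_min_memory_mb = sum(t["min_memory_mb"] for t in tasks)  # 768 MB
    total_max_memory_mb = sum(t["max_memory_mb"] for t in tasks)  # 3072 MB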

History

  • The History tab provides relevant information about the selected job's past runs, including success or failure status.

  • A Refresh icon is provided for refreshing the displayed job history.

  • A Clear option is available to clear the job run history and logs from the History tab and the Sandbox location:

    • Navigate to the History tab for a Job.

    • Click the Clear option.

    • A confirmation dialog box appears.

    • Click the Yes option to confirm.

    • All the job run history and logs get deleted from the History tab and the Sandbox location. Reopening the History tab for the same job confirms that the history has been cleared.

  • On the List Jobs page, the user can view and download the pod logs for all instances by clicking the View System Logs option in the History tab.

  • Once the user clicks the View System Logs option, a drawer panel opens from the right side of the window. The user can select the instance for which the System logs have to be downloaded from the Select Hostname drop-down option.

Pin & Unpin Jobs

The Pin and Unpin feature allows users to prioritize and easily access specific jobs within a list. This functionality is beneficial for managing tasks, projects, or workflows efficiently. This feature is available on each job card on the List Jobs page.

  • Navigate to the List Jobs page.

  • Select the job that you wish to pin.

  • Click the Pin icon for the selected Job.

  • The job gets pinned to the list for easy access and appears at the top of the job list if it is the first pinned job.

  • You can use the pin icon to pin multiple jobs on the list.

  • Click the Unpin icon for a pinned job.

  • The selected job gets unpinned from the list.

Job Actions

The following actions can be applied to a Job from the Actions section on the List Jobs page. Each action is represented by an icon:

  • Push Job: Enables the user to push a Job to the VCS.

  • View: Redirects the user to the Job Editor page.

  • Share: Allows the user to share the selected job with other user(s) or user group(s).

  • Job Monitoring: Redirects the user to the Job Monitoring page.

  • Edit: Enables the user to edit any information about the job. This option is disabled while the job is active.

  • Delete: Allows the user to delete the job. The deleted job is moved to Trash.

Searching Job

The user can search for a specific Job by using the Search Bar on the Job List.

By typing a name in the Search Bar, all the existing Jobs containing that text get listed. E.g., by typing san, all the existing Jobs with the word san in their names get listed.
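Conceptually, the search behaves like a substring match over job names. A minimal sketch of that matching rule (the case-insensitive behavior is an assumption, not confirmed by the platform):

    # Hypothetical job names; the query matches anywhere in the name.
    job_names = ["sandbox-reader-test", "Sanity Check Job", "daily-report"]

    query = "san"
    matches = [name for name in job_names if query.lower() in name.lower()]
    print(matches)  # ['sandbox-reader-test', 'Sanity Check Job']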

Customizing the Job List

The Job List provides a header bar containing the categories of the available jobs. The user can customize the Job List by choosing from the available filter options in the header; the list gets modified based on the selected category option.

Recent Run

The Recent Runs status indication option provides an at-a-glance view of the five most recent job executions, allowing users to track performance in real time. The status of each listed job, whether successful, failed, or in progress, is displayed, enabling users to assess system health quickly. By highlighting any issues immediately, this feature allows for proactive troubleshooting and faster response times, helping ensure seamless workflows and minimize downtime.

Status Types

The Recent Run categorizes the recently run Jobs into the following statuses:

  1. Succeeded: The job was completed successfully without any errors.

  2. Interrupted: The job was stopped before completion, either by a user action or an external factor.

  3. Failed: The job encountered an error that prevented it from completing successfully.

  4. Running: The job is currently in progress.

Status Color Coding

The Job Run statuses are displayed with different color codes (see the sketch after this list).

  • Succeeded: Green

  • Failed: Red

  • Interrupted: Yellow

  • Running: Blue

  • No Run: Grey
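To make the status-to-color mapping concrete, here is a minimal sketch of how it might be represented in code (illustrative only, not the platform's implementation):

    from enum import Enum

    class RunStatus(Enum):
        """Recent Run statuses and their display colors."""
        SUCCEEDED = "Green"
        FAILED = "Red"
        INTERRUPTED = "Yellow"
        RUNNING = "Blue"
        NO_RUN = "Grey"

    # Hypothetical five most recent runs for a job, newest first.
    recent_runs = [RunStatus.RUNNING, RunStatus.SUCCEEDED, RunStatus.FAILED,
                   RunStatus.SUCCEEDED, RunStatus.INTERRUPTED]
    print([f"{s.name}: {s.value}" for s in recent_runs])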

Feature Details

  • Navigate to the List Jobs page.

  • The Recent Runs section will be visible for all the listed jobs, indicating the statuses of the five most recent runs.

  • You may hover on a status icon to get more details under the tooltip.

Actions and Notifications

Users can get a tooltip with additional details on the Job run status by hovering over the status icon in the Recent Run section.

  • Status: The status of the job run (Succeeded, Interrupted, Failed, Running).

  • Started At: The time when the job run was started.

  • Stopped At: The time when the job run was stopped.

  • Completed At: The time when the job run was completed.

Please Note:

  • The user can open the Job Editor for the selected Job from the list by clicking the View icon.

  • The user can view and download logs only for successfully run or failed jobs. Logs for the interrupted jobs cannot be viewed or downloaded.
