Job Monitoring
This page explains how to monitor a Job.
The Job Monitoring feature lets the user track a Job and its associated tasks. The Job Monitoring page displays details such as Job Status, Last Activated (date and time), Last Deactivated (date and time), Total Allocated and Consumed CPU, and Total Allocated and Consumed Memory.
The following walk-through describes the Job Monitoring function.
The user can access the Job Monitoring icon from either the List Jobs page or the Job Workflow Editor page.
Navigate to the List Jobs page.
The Job Monitoring icon appears for each listed Job.
OR
Navigate to the Job Workflow Editor page.
The Job Monitoring icon is provided on the Header panel.
The Job Monitoring page opens, displaying the resource-usage details of the selected Job.
The images below display the Monitoring page for a Spark Job, with details of the Spark driver and executor.
Displaying the monitoring details of the Spark Job Driver
Displaying the monitoring details of the Spark Job Executor
The images below display the Monitoring page for a PySpark Job, with details of the PySpark driver and executor.
Displaying the monitoring details of the PySpark Job Driver
Displaying the monitoring details of the PySpark Job Executor
If the Memory or Cores allocated to a component are less than required, the value is displayed in red, as shown in the image below.
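For readers who want the same driver/executor figures outside the UI, Spark itself exposes per-executor resource metrics through its monitoring REST API (`/api/v1/applications/<app-id>/executors`). The sketch below is a minimal, hedged example: it parses a sample payload shaped like that endpoint's response (the values and the 90% threshold are illustrative, not taken from this product) and flags executors whose memory use is close to their allocation, analogous to the red highlight on the Job Monitoring page.

```python
import json

# Sample payload shaped like Spark's /api/v1/applications/<app-id>/executors
# response. The ids and byte values are illustrative, not from a live cluster.
sample = json.loads("""
[
  {"id": "driver", "totalCores": 1, "maxMemory": 434031820, "memoryUsed": 120000000},
  {"id": "1",      "totalCores": 2, "maxMemory": 434031820, "memoryUsed": 430000000}
]
""")

def flag_tight_memory(executors, threshold=0.9):
    """Return the ids of executors whose used memory exceeds `threshold`
    of their allocated maximum -- a rough analogue of the red highlight
    shown when an allocation is smaller than required."""
    return [e["id"] for e in executors
            if e["memoryUsed"] / e["maxMemory"] > threshold]

print(flag_tight_memory(sample))  # executor "1" is near its memory limit
```

In a live deployment the same check could run against `http://<driver-host>:4040/api/v1/applications/<app-id>/executors`; the host and port depend on the cluster configuration.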
Clear: Clears all the monitoring details of the selected Job.