Jobs
This section provides detailed information on Jobs, which make your data processing faster.
Jobs are used for ingesting and transferring data from separate sources. The user can transform, unify, and cleanse data to make it suitable for analytics and business reporting without using a Kafka topic, which makes the entire flow much faster.
Check out the given demonstration to understand how to create and activate a job.
The List Jobs option opens the list of Jobs available to the logged-in user. All Jobs saved by the user are listed on this page. Clicking a Job name displays the Details tab on the right side of the page with the basic details of the selected Job.
Navigate to the Data Pipeline homepage.
Click on the List Jobs icon.
The List Jobs page opens displaying the created jobs.
Please Note:
• The user can open the Job Editor for the selected job from the list by clicking the View icon.
• The user can search for a specific Job by using the Search bar on the Jobs List. Typing a string lists all existing Jobs whose names contain it. E.g., typing 'dem' lists all Jobs containing the word 'demo', as displayed in the following image:
Navigate to the Data Pipeline homepage.
Click on the Create Job icon.
The New Job dialog box appears, prompting the user to create a new Job.
Enter a name for the new Job.
Describe the Job (optional).
Job Baseinfo: Select Spark Job from the drop-down menu.
Trigger By: There are two options for triggering a job on the success or failure of another job:
Success Job: The current job is triggered on successful execution of the selected job.
Failure Job: The current job is triggered on failure of the selected job.
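The Trigger By options above can be sketched as a simple condition: the current job starts only when the selected upstream job ends with the chosen status. This is an illustrative sketch of the semantics; the function and status names are assumptions, not part of the product's API.

```python
# Hypothetical sketch of the "Trigger By" semantics. The names
# should_trigger, "succeeded", and "failed" are illustrative only.

def should_trigger(upstream_status: str, trigger_by: str) -> bool:
    """Return True when the current job should start.

    trigger_by is "success" (Success Job) or "failure" (Failure Job).
    """
    if trigger_by == "success":
        return upstream_status == "succeeded"
    if trigger_by == "failure":
        return upstream_status == "failed"
    return False

# A job configured with Success Job starts only after the upstream succeeds:
print(should_trigger("succeeded", "success"))  # True
# A job configured with Failure Job acts as an error handler:
print(should_trigger("failed", "failure"))     # True
```

A Failure Job is typically used for cleanup or alerting when the upstream job breaks, while a Success Job chains the next stage of a dataflow.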
Is Scheduled?
A Job can be scheduled for a particular timestamp, and it will be triggered at that same timestamp every time. Jobs must be scheduled according to UTC.
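Because the scheduler expects UTC, the intended local run time has to be converted before it is entered. A minimal sketch using only Python's standard library; the timezone and the run time are example values.

```python
# Convert a local run time to UTC before scheduling the job.
# The timezone "Asia/Kolkata" and the 09:30 run time are example values.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

local_run = datetime(2024, 6, 1, 9, 30, tzinfo=ZoneInfo("Asia/Kolkata"))
utc_run = local_run.astimezone(timezone.utc)

# 09:30 IST (UTC+5:30) corresponds to 04:00 UTC.
print(utc_run.strftime("%H:%M"))  # 04:00
```

Entering the local time directly would make the job run several hours off target, so this conversion step matters for every non-UTC user.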
Spark Configuration
Select a resource allocation option using the radio button. The given choices are:
Low
Medium
High
This feature deploys the Job with a high-, medium-, or low-end configuration according to the velocity and volume of data the Job must handle.
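Conceptually, each tier corresponds to a fixed bundle of Spark resource settings. The values below are illustrative assumptions, not the product's actual allocations; they only show the kind of spark-submit parameters such a tier controls.

```python
# Hypothetical mapping of the Low/Medium/High choices to Spark resources.
# All memory and core values are example assumptions.
SPARK_TIERS = {
    "low":    {"driver_memory": "1g", "executor_memory": "1g", "executor_cores": 1},
    "medium": {"driver_memory": "2g", "executor_memory": "4g", "executor_cores": 2},
    "high":   {"driver_memory": "4g", "executor_memory": "8g", "executor_cores": 4},
}

def spark_submit_args(tier: str) -> list:
    """Render a tier as spark-submit style arguments."""
    cfg = SPARK_TIERS[tier]
    return [
        "--driver-memory", cfg["driver_memory"],
        "--executor-memory", cfg["executor_memory"],
        "--executor-cores", str(cfg["executor_cores"]),
    ]

print(spark_submit_args("medium"))
```

Picking a higher tier than the data volume needs wastes cluster capacity, while a lower tier can make the Job slow or unstable, which is why the choice is tied to data velocity and volume.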
Click the Save option to create the job.
Please Note: By clicking the Save option, the user gets redirected to the job workflow editor.
A success message appears to confirm the creation of a new job.
The job editor page opens for the newly created job.
Please Note: The Current Job will not be triggered if the selected job is run in the Preview mode.
Once the Job gets saved in the Jobs List, the user can add Tasks to the canvas. The user can drag the required Tasks to the canvas and configure them to create a Job workflow or dataflow.
The Job Editor appears, displaying the Task Palette containing various components, referred to as Tasks.
Steps to create a Job Workflow:
Navigate to the Job List page.
Select a Job from the displayed list.
Click the View icon for the Job.
Please Note: Generally, the user performs this step in continuation of the Job creation, but if the user has exited the Job Editor, the above steps help to access it again.
The Job Editor opens for the selected Job.
Drag and drop a newly required Task, modify an existing Task's meta information, or change the Task configuration as required. (E.g., the RDBMS Reader is dragged to the workspace in the image below:)
Click on the dragged task icon.
The task-specific fields open, asking for the meta information of the dragged component.
Open the Meta Information tab and configure the required information for the dragged component.
Host IP Address
Port number
Username
Password
Database Name
Driver: Select from the drop-down menu
Table Name
Query
Fetch size
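The Meta Information fields above make up a standard database connection configuration. The sketch below is illustrative: the field values, the dictionary layout, and the JDBC URL format (shown for MySQL) are assumptions used to explain the fields, not the product's internal schema.

```python
# Hypothetical RDBMS Reader configuration; every value is an example.
rdbms_reader = {
    "host": "10.0.0.12",        # Host IP Address
    "port": 3306,               # Port number
    "username": "report_user",  # Username
    "password": "********",     # Password
    "database": "sales",        # Database Name
    "driver": "mysql",          # Driver selected from the drop-down menu
    "table": "orders",          # Table Name
    "query": "SELECT * FROM orders WHERE status = 'open'",  # Query
    "fetch_size": 1000,         # Fetch size: rows pulled per round trip
}

def jdbc_url(cfg: dict) -> str:
    """Assemble a JDBC-style connection URL from the reader fields."""
    return f"jdbc:{cfg['driver']}://{cfg['host']}:{cfg['port']}/{cfg['database']}"

print(jdbc_url(rdbms_reader))  # jdbc:mysql://10.0.0.12:3306/sales
```

A larger Fetch size reduces round trips to the database at the cost of memory per batch, which is why it is exposed separately from the query itself.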
Click the given icon to validate the connection.
Click the Save Task in Storage icon.
A notification message appears.
A dialog window opens to confirm the action of job activation.
Click the YES option to activate the job.
A success message appears confirming the activation of the job.
The Development mode opens for the selected Job.
Please Note: Jobs can also be run in the Development mode. The user can preview only 10 records when a Job runs in the Development mode.
The Status of the Job changes on the Jobs List page while it is running in the Development mode.
The Toggle Log Panel displays the Logs and Advanced Logs tabs for the Job Workflows.
Navigate to the Job Editor page.
Click the Toggle Log Panel icon on the header.
A panel opens, displaying the collective logs of the Job under the Logs tab.
Select the Advanced Logs tab to display the pod status of the complete Job.
| Icon | Name | Action |
|---|---|---|
|  | Job Version Details | Displays the latest versions for the Job upgrades. |
|  | Log Panel | Displays the Job logs. |
|  | Preview Mode | Previews the Job; 10 records will be displayed. |
|  | Activate Job | Activates the current Job. |
|  | Update Job | Updates the current Job. |
|  | Edit Job | Edits the Job name/configurations. |
|  | Delete Job | Deletes the current Job. |
|  | List Jobs | Redirects to the List Jobs page. |
|  | Settings | Redirects to the Settings page. |
Click the Activate Job icon to activate the Job (it appears only after the newly created Job gets successfully updated).
Please Note: Click the Delete icon from the Job Editor page to delete the selected job. The deleted job gets removed from the Job list.