Python Jobs

We need to deploy scripts as Python jobs.

DS Lab components: Transformation and Writer (boilerplate).
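
To make this concrete, here is a minimal sketch of what such a boilerplate script could look like when deployed as a Python job. The `transform` and `write` function names, the pandas usage, and the file paths are illustrative assumptions, not the actual DS Lab template:

```python
import pandas as pd

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transformation component: placeholder for user logic."""
    # Illustrative step: drop incomplete rows and normalize column names.
    df = df.dropna()
    df.columns = [c.strip().lower() for c in df.columns]
    return df

def write(df: pd.DataFrame, output_path: str) -> None:
    """Writer component: persist the transformed data."""
    df.to_csv(output_path, index=False)

if __name__ == "__main__":
    # Hypothetical entry point the job runner would invoke;
    # real input/output locations would come from job configuration.
    result = transform(pd.read_csv("input.csv"))
    write(result, "output.csv")
```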

We need a tag for Python and PySpark so that separate jobs can be created for each.

The importance of tags: tags are a powerful tool for organizing and managing jobs in a deployment environment. By using tags to differentiate Python scripts from PySpark scripts, developers can create separate jobs optimized for each runtime. This ensures code executes efficiently and avoids creating an unnecessary Spark context for jobs that never use Spark.
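
As a sketch of this design choice, a job runner might branch on the tag and instantiate a SparkSession only for PySpark-tagged scripts, so plain Python jobs never pay the Spark startup cost. The `run_job` function and the tag values below are hypothetical, not the platform's actual API:

```python
import runpy

def run_job(script_path: str, tag: str) -> None:
    """Run a deployed script, creating a Spark context only when needed."""
    if tag == "pyspark":
        # Lazily import and start Spark only for PySpark-tagged jobs.
        from pyspark.sql import SparkSession
        spark = SparkSession.builder.appName(script_path).getOrCreate()
        try:
            # Execute the script with a SparkSession injected as `spark`.
            runpy.run_path(script_path, init_globals={"spark": spark})
        finally:
            spark.stop()
    elif tag == "python":
        # Plain Python job: no Spark context is ever created.
        runpy.run_path(script_path)
    else:
        raise ValueError(f"unknown job tag: {tag!r}")

# Usage (hypothetical):
# run_job("clean_data.py", tag="python")   # runs without Spark
# run_job("aggregate.py", tag="pyspark")   # receives a SparkSession as `spark`
```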

