Video Writer


The Video Writer component writes video in the .mp4 format to an SFTP location by combining frames consumed through the Video Stream Consumer component.
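Conceptually, the write path resembles the Python sketch below, which assembles frames into an .mp4 file with OpenCV and uploads it over SFTP with paramiko. It is a minimal illustration under assumed inputs (frames as NumPy arrays, password authentication, placeholder host and paths), not the component's actual implementation.

```python
# Minimal sketch of the Video Writer's behavior; not the actual implementation.
# Assumes `frames` is a list of NumPy image arrays and that all connection
# details passed in are placeholders.
import cv2
import paramiko

def write_frames_to_sftp(frames, host, port, username, password,
                         writer_path, file_name, frame_rate):
    # Combine the frames into a local .mp4 file, frame by frame.
    height, width = frames[0].shape[:2]
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    local_file = f"/tmp/{file_name}"
    writer = cv2.VideoWriter(local_file, fourcc, frame_rate, (width, height))
    for frame in frames:
        writer.write(frame)
    writer.release()

    # Upload the finished file to the SFTP writer path.
    transport = paramiko.Transport((host, port))
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put(local_file, f"{writer_path}/{file_name}")
    sftp.close()
    transport.close()
```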

All component configurations are classified broadly into the following sections:

  • Basic Information

  • Meta Information

  • Resource Configuration

Please follow the given demonstration to configure the Video Writer component.

Please Note:

  • The Pipeline Testing suite and the Data Metrics option on the Pipeline Monitoring page are not available for this component.

  • The Video Writer component supports only the .mp4 file format. It writes video frame by frame to the SFTP location.

  • Drag & drop the Video Stream Consumer component to the Workflow Editor.

  • Click the dragged Video Stream Consumer component to open the component properties tabs.

Basic Information

This is the default tab that opens for the component.

  • Invocation Type: Select an invocation type from the drop-down menu to set the running mode of the component: Real-Time or Batch.

  • Deployment Type: It displays the deployment type for the component. This field comes pre-selected.

  • Batch Size (min 10): Provide the maximum number of records to be processed in one execution cycle (Min limit for this field is 10).

  • Failover Event: Select a failover Event from the drop-down menu.

  • Container Image Version: It displays the image version for the docker container. This field comes pre-selected.

  • Description: An optional description of the component.

Please Note: If the selected Invocation Type is Batch, the Grace Period (in sec)* field appears; it specifies how long the component is given to shut down gracefully.

Selecting Real-Time as the Invocation Type displays the Intelligent Scaling option.
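As a rough illustration, the Basic Information settings map onto values like the Python dict below. Every value is a hypothetical placeholder; the real options come from the component's drop-down menus and pre-selected fields.

```python
# Hypothetical Basic Information values; all entries are placeholders,
# not defaults taken from the product.
basic_information = {
    "invocation_type": "Batch",                  # "Real-Time" or "Batch"
    "deployment_type": "Docker",                 # pre-selected by the platform
    "batch_size": 10,                            # minimum allowed value is 10
    "failover_event": "video_writer_failover",   # chosen from the drop-down
    "container_image_version": "1.0.0",          # pre-selected by the platform
    "grace_period_sec": 60,                      # shown only for Batch invocation
    "description": "Writes consumed frames to SFTP as .mp4",  # optional
}
```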

Meta Information Tab

Select the Meta Information tab and provide the mandatory fields to configure the dragged Video Writer component.

  • Host IP Address (*): Provide the IP address or URL of the SFTP server.

    • The expected input for Host IP Address changes based on the Stream selection. Two options are available:

      • Live: Writes the data to the desired location while live data arrives continuously.

      • Media File: Reads a stored video file and writes it to the desired SFTP location.

  • Username (*): Provide the username.

  • Port (*): Provide the port number.

  • Authentication: Select one authentication option: Password or PEM/PPK File.

  • Stream (*): The supported streaming methods are Live and Media File.

  • Partition Time (*): The length of video, in seconds, that the component consumes at once. This field appears only when Live is selected in the Stream field.

  • Writer Path (*): Provide the desired path in the SFTP location where the video is to be written.

  • File Name (*): Provide a file name with the .mp4 extension (e.g., sample_filename.mp4).

  • Frame Rate: Provide the rate at which frames are to be consumed.
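Put together, a Meta Information configuration might resemble the sketch below, shown here with Password authentication. All values are illustrative placeholders.

```python
# Hypothetical Meta Information values for Password authentication;
# every value is an illustrative placeholder.
meta_information = {
    "host": "203.0.113.10",            # IP or URL of the SFTP server
    "username": "pipeline_user",
    "port": 22,
    "authentication": "Password",      # or "PEM/PPK File"
    "password": "********",            # appears only for Password authentication
    "stream": "Live",                  # "Live" or "Media File"
    "partition_time": 30,              # seconds of video per write; Live only
    "writer_path": "/upload/videos",   # SFTP directory to write into
    "file_name": "sample_filename.mp4",
    "frame_rate": 24,
}
```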

Please Note: The fields for the Meta Information tab change based on the selection of the Authentication option.

When the authentication option is Password

Selecting the Password authentication option adds a Password field to the Meta Information tab.

When the authentication option is PEM/PPK file

When choosing the PEM/PPK File authentication option, the user needs to select the key file using the Choose File option.
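For comparison, key-based SFTP authentication typically looks like the paramiko sketch below. This is a generic illustration, not the component's internal code; the key path, host, and username are placeholders, and a PPK key would first need conversion to PEM (e.g., with puttygen).

```python
# Generic PEM-key SFTP authentication, analogous to the PEM/PPK option.
# Paths and host details are illustrative placeholders.
import paramiko

key = paramiko.RSAKey.from_private_key_file("/path/to/key.pem")
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="pipeline_user", pkey=key)  # key replaces the password
sftp = paramiko.SFTPClient.from_transport(transport)
```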

Saving the Component

  • Click the Save Component in Storage icon for the Video Writer component.

  • A message appears notifying that the component properties have been saved.

  • The Video Writer component is now configured to pass data in the Pipeline Workflow.

Video Writer as a Part of a Pipeline Workflow