Machine Sensors – Pipeline

The Pipeline Interface

The following pipeline provides the complete workflow for monitoring equipment and detecting anomalies.

Anomaly Detection Workflow (pipeline diagram)

  • In the first part of the workflow, the data fetched from the shared event is processed in parallel in three ways:

    • For prediction, using the DS Lab Runner component

    • For detection, using the Rule Engine Python component

    • For storing the raw data in the Mongo collection [Collection Name: adnoc_raw_data]

Components Used in the first part of the Pipeline

DS Formatter Python Component

From the shared event, we fetch the data and send it to the Python component, which formats the data structure and forwards only the value field to the DS Lab Runner.
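
A minimal sketch of what this formatting step might look like; the incoming event shape and any field name other than value are assumptions, not the platform's contract.

```python
# Hypothetical DS Formatter logic: keep only the value field for the
# DS Lab Runner. The input event shape here is an assumption.
import json

def format_for_ds_lab(event) -> dict:
    record = json.loads(event) if isinstance(event, str) else event
    # Drop everything except the numeric reading.
    return {"value": record["value"]}
```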

Data Science Lab Runner Component

This component is used to run predictions on the sensor data sent through the Python component; it sends the predictions to the output event.
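
The DS Lab Runner is a managed component, so model loading and scoring happen inside the platform. Purely as an illustration, a rough plain-Python equivalent (the model file, its format, and the single-feature input are assumptions) might be:

```python
# Illustrative only — the DS Lab Runner manages the trained model itself.
# The model file, its format, and the single-feature input are assumptions.
import pickle

with open("model.pkl", "rb") as f:  # model trained in DS Lab (placeholder)
    model = pickle.load(f)

def predict(record: dict) -> dict:
    record["prediction"] = float(model.predict([[record["value"]]])[0])
    return record  # forwarded to the output event
```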

Anomaly MGW PyMongo Writer

This component writes the predicted data to the Mongo collection [Collection Name: og_anamoly_data].
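
All three PyMongo writers in this pipeline follow the same pattern; a minimal sketch with pymongo, where the connection URI and database name are placeholders for deployment details:

```python
# Minimal PyMongo writer sketch; the URI and database name are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["bi_testing"]  # database name is an assumption

def write_record(record: dict, collection: str) -> None:
    # The same pattern serves the predicted (og_anamoly_data),
    # raw (adnoc_raw_data), and alert (adnoc_alert_tag_values) writers.
    db[collection].insert_one(record)
```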

Rule Engine Python Component

From the shared event, we fetch the data and send it to the Python component. Using the lookup collection already stored in the bi_testing database, the component checks whether the generated data is an anomaly or normal data and sends it to the output event with a flag field named is_alert whose value is “yes” or “no”.
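
A hedged sketch of the rule check; the lookup collection's name and document shape (per-tag min/max thresholds) are assumptions about the stored rules:

```python
# Sketch of the Rule Engine check. The lookup collection name and its
# min/max threshold fields are assumptions about the stored rules.
from pymongo import MongoClient

lookup = MongoClient("mongodb://localhost:27017")["bi_testing"]["sensor_rules"]

def apply_rule(record: dict) -> dict:
    rule = lookup.find_one({"tag_id": record.get("tag_id")}) or {}
    low = rule.get("min", float("-inf"))
    high = rule.get("max", float("inf"))
    # Downstream, the Rule Splitter keys on this flag.
    record["is_alert"] = "no" if low <= record["value"] <= high else "yes"
    return record
```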

Raw Data PyMongo Writer

This component writes the raw data generated in the Data Generator pipeline to the Mongo collection [Collection Name: adnoc_raw_data].

  • In the second part of the workflow, the data that has passed through the Rule Engine component, which detects anomalies and adds the flag field is_alert, goes through the Rule Splitter, which segregates normal and anomaly data into separate workflows:

    • The normal data goes through the Python component, which changes the data structure and sends it to the Web Socket producer.

    • The anomaly data is split three ways and processed in parallel:

      • The anomaly data goes through the Python component that changes the data structure and sends it to the Web Socket Producer, which sends the data to the dashboard to raise an alert.

      • The anomaly data kicks off the Jira Python component to create a Jira ticket.

      • The alert data is stored in the Mongo collection [Collection Name: adnoc_alert_tag_values].

Components Used in the second part of the Pipeline

Rule Splitter Component

This component segregates the anomaly and normal data using the flag field “is_alert”, dividing the workflow into two branches.
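
The Rule Splitter is a built-in pipeline component; an equivalent split on the is_alert flag, sketched in plain Python:

```python
# Equivalent of the Rule Splitter: route records by their is_alert flag.
def split_records(records):
    normal = [r for r in records if r.get("is_alert") == "no"]
    anomaly = [r for r in records if r.get("is_alert") == "yes"]
    return normal, anomaly
```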

Web Socket Normal Python component

This component formats the normal data into the web socket's message structure and sends it to the Web Socket producer.

Web Socket Producer Normal

This component sends the normal data to the Web Socket client.

Web Socket Anomaly Python component

This component formats the anomaly data into the web socket's message structure and sends it to the Web Socket producer.
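
Both formatter components follow the same pattern; a minimal sketch, assuming the dashboard expects a flat JSON message with these field names:

```python
# Sketch of the Web Socket formatting step (normal and anomaly alike).
# The message shape and field names are assumptions about the dashboard.
import json

def to_ws_message(record: dict) -> str:
    return json.dumps({
        "tag_id": record.get("tag_id"),
        "value": record.get("value"),
        "is_alert": record.get("is_alert"),
    })
```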

Web Socket Producer Anomaly

This component sends the anomaly data to the Web Socket client; from there, it is used in a dashboard for live alerts.
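
A minimal producer sketch using the websockets package; the endpoint URL is a placeholder:

```python
# Minimal Web Socket producer sketch; the endpoint URL is a placeholder.
import asyncio
import websockets

async def send_message(message: str, url: str = "ws://localhost:8765/alerts"):
    async with websockets.connect(url) as ws:
        await ws.send(message)  # picked up live by the dashboard client

# Example: asyncio.run(send_message(to_ws_message(record)))
```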

Jira Python component

This component is used to create a ticket for a specific sensor alert in Jira.
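
A sketch of creating an issue through Jira's REST API with requests; the base URL, project key, issue type, and credentials are placeholders:

```python
# Sketch of Jira ticket creation via the REST API. The base URL,
# project key, issue type, and credentials are all placeholders.
import requests

def create_jira_ticket(record: dict) -> str:
    payload = {
        "fields": {
            "project": {"key": "SENS"},  # placeholder project key
            "summary": f"Sensor alert: anomaly on {record.get('tag_id')}",
            "description": f"Reading {record.get('value')} was flagged as an anomaly.",
            "issuetype": {"name": "Task"},
        }
    }
    resp = requests.post(
        "https://your-domain.atlassian.net/rest/api/2/issue",
        json=payload,
        auth=("user@example.com", "api-token"),  # placeholder credentials
    )
    resp.raise_for_status()
    return resp.json()["key"]  # e.g. "SENS-123"
```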

Alert Data PyMongo Writer

This component writes the alert data to the Mongo collection [Collection Name: adnoc_alert_tag_values].
