In Rayven.io, merging data from different payloads and systems is essential for gaining actionable insights, especially when dealing with real-time or multi-system data. The workflow builder is where you merge these data streams. Data merging typically involves:
- Ingesting data from various sources.
- Normalizing the data to ensure uniformity.
- Aligning timestamps to maintain chronological integrity.
- Combining data from multiple systems into a cohesive dataset.
Steps in the Data Merging Process:
- Data Ingestion:
- Each payload is ingested into Rayven.io using input nodes (such as API, MQTT, Modbus, or other connectors). These input nodes capture data in its raw, unstructured format.
- Normalization:
- As the data flows through the workflow, normalization nodes apply transformations to ensure that data from different payloads is standardized. This includes:
- Unit conversion (e.g., converting Celsius to Fahrenheit).
- Timestamp alignment to a common format (e.g., Unix timestamp).
- Data formatting adjustments (e.g., standardizing decimal places or rounding values).
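As an illustration, the kind of transformation a normalization node applies can be sketched in plain Python. The payload field names (`temp_c`, `ts`) are hypothetical, not a Rayven.io schema:

```python
from datetime import datetime, timezone


def normalize_reading(payload):
    """Convert a raw sensor payload into a standardized record.

    Hypothetical payload shape: {"temp_c": float, "ts": ISO-8601 string}.
    """
    # Unit conversion: Celsius -> Fahrenheit.
    temp_f = payload["temp_c"] * 9 / 5 + 32

    # Timestamp alignment: parse the ISO-8601 string (assumed UTC here)
    # and emit a Unix timestamp as the common format.
    ts = datetime.fromisoformat(payload["ts"]).replace(tzinfo=timezone.utc)

    # Formatting adjustment: standardize to two decimal places.
    return {"temp_f": round(temp_f, 2), "ts": int(ts.timestamp())}


print(normalize_reading({"temp_c": 21.456, "ts": "2024-01-01T00:00:00"}))
# → {'temp_f': 70.62, 'ts': 1704067200}
```

Each incoming stream gets its own normalization step, so every record downstream shares the same units and timestamp format.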
- Timestamp Alignment:
- Data from different systems often arrives with different timestamp configurations. Rayven.io normalizes the timestamps during the merging process by:
- Aligning data points to the nearest relevant timestamp based on the desired frequency (e.g., second, minute, hour).
- Using time zones to convert all timestamps to a unified reference time (e.g., UTC or a specific local time).
Example:
- If System A logs entries every minute and System B logs entries every five seconds, Rayven.io can synchronize the timestamps so that the data is merged into one cohesive dataset without conflicts in time alignment.
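The alignment step can be sketched as snapping each Unix timestamp to the nearest bucket boundary for the desired frequency. The bucket sizes here are illustrative:

```python
def align_timestamp(unix_ts, bucket_seconds):
    """Snap a Unix timestamp to the nearest bucket boundary.

    E.g. bucket_seconds=60 aligns readings to the nearest minute,
    bucket_seconds=1 to the nearest second.
    """
    return round(unix_ts / bucket_seconds) * bucket_seconds


# A reading logged 33 seconds past the minute snaps to the next
# minute boundary, so it can line up with a once-per-minute stream.
print(align_timestamp(1704067233, 60))  # → 1704067260
```

Once both streams are snapped to the same bucket size, records with equal timestamps can be merged directly.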
- Data Combination:
- After normalization and timestamp alignment, data from different payloads is combined. This allows you to aggregate metrics, perform analytics, and visualize the unified data on dashboards.
Example:
- Imagine merging data from a temperature sensor and a humidity sensor. The temperature data arrives every second, while humidity readings are sent every five seconds. After normalization and alignment, the data can be combined to show how temperature and humidity change together over time in a single view.
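A minimal sketch of this combination step: group already-normalized readings by their aligned timestamp so each row holds both measurements. The field names and the last-write-wins rule are simplifications (a production workflow might average readings within a bucket):

```python
def merge_streams(temperature, humidity, bucket_seconds=5):
    """Merge two sensor streams on aligned Unix timestamps.

    Each reading is a (unix_ts, value) pair; timestamps are snapped
    to bucket_seconds so the 1 Hz and 0.2 Hz streams line up.
    """
    merged = {}
    for ts, value in temperature:
        key = round(ts / bucket_seconds) * bucket_seconds
        # Last reading in a bucket wins; a real node might average instead.
        merged.setdefault(key, {})["temp"] = value
    for ts, value in humidity:
        key = round(ts / bucket_seconds) * bucket_seconds
        merged.setdefault(key, {})["humidity"] = value
    # Return rows ordered by time for analytics and dashboards.
    return sorted(merged.items())


temps = [(0, 21.0), (1, 21.1), (2, 21.2), (3, 21.3), (4, 21.4), (5, 21.5)]
hums = [(0, 40.0), (5, 41.0)]
print(merge_streams(temps, hums))
```

The result is one row per five-second bucket, each carrying the latest temperature alongside the humidity reading for that interval.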
Example Use Case: Smart Factory Monitoring
Scenario: In a smart factory, multiple systems track different parameters. For instance:
- Temperature sensors send data in Celsius every 10 seconds.
- Vibration sensors send data in millimeters per second (mm/s) every 5 seconds.
- Humidity sensors report data every minute.
To analyze the factory’s environmental conditions and detect anomalies, you need to merge these datasets.
Steps:
- Ingest Data:
- Use input nodes in Rayven.io to capture data from all three systems.
- Normalize Data:
- Standardize units (e.g., convert vibration data to the desired unit of measure).
- Align timestamps to the nearest second, ensuring all data streams are chronologically consistent.
- Merge and Analyze:
- Combine the normalized and aligned data into a unified dataset, allowing you to correlate temperature, vibration, and humidity over time.
- Use the merged data to identify patterns, such as spikes in vibration when the temperature exceeds a certain threshold.
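A sketch of the kind of correlation this enables: scanning the merged rows for vibration spikes that coincide with high temperature. The thresholds and field names are hypothetical:

```python
def flag_anomalies(rows, temp_limit=80.0, vib_limit=4.5):
    """Return timestamps where vibration spikes while temperature is high.

    rows: (unix_ts, record) pairs from the merge step, where each record
    may hold "temp_c", "vib_mm_s", and "humidity". Thresholds are
    illustrative, not recommended values.
    """
    return [
        ts
        for ts, r in rows
        if r.get("temp_c", 0) > temp_limit and r.get("vib_mm_s", 0) > vib_limit
    ]


rows = [
    (0, {"temp_c": 75.0, "vib_mm_s": 3.0, "humidity": 40.0}),
    (10, {"temp_c": 82.0, "vib_mm_s": 5.1, "humidity": 41.0}),
    (20, {"temp_c": 83.0, "vib_mm_s": 2.9, "humidity": 41.5}),
]
print(flag_anomalies(rows))  # → [10]
```

Only the row where both conditions hold is flagged; without the merge, the temperature and vibration streams could not be compared row by row like this.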
- Visualize Data:
- Present the merged data in a single dashboard that shows real-time interactions between temperature, vibration, and humidity, helping operators monitor conditions and respond to potential issues.
Conclusion
In Rayven.io, merging data from different payloads and systems is essential for gaining actionable insights, especially when dealing with real-time or multi-system data. By normalizing data and aligning timestamps, you can create a unified dataset that allows for accurate analysis, aggregation, and visualization. This process ensures that data from various sources can be merged seamlessly, providing a comprehensive view of operations, system health, or other business metrics.