Axoflow normalizes incoming data and delivers it in a destination-friendly format and schema (covering keys, data types, streams, or indexes) rather than passing raw blobs through. This includes consistent timestamps and attributes that downstream search queries, alerts, and detection engineers can use immediately. When a destination supports a specific normalization framework, Axoflow applies it by default. For vendor-specific schemas, such as Dynatrace Grail’s semantic dictionary, Axoflow adds native support to ensure seamless integration.
Why Normalization Matters
Logs and security data arrive in different formats from different devices, even when they describe the same event, which makes the data difficult to find, understand, and correlate. Security events and detection rules should be agnostic to how the data was collected and where it came from. Your detection engineers shouldn’t have to care which exact field each of your firewalls uses to store the client IP: all such events should use a standard field name, as the sketch below illustrates. Doing this properly, however, takes expertise and a lot of legwork. There is no one-to-one mapping between sources and schemas: you need to read the schema descriptions, understand the original content, and know how the data will be used for analysis. The normalized version of an event serves your security teams best in detection and investigation, because it makes it easy to correlate events received from different vendors (or different devices).
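To make the idea concrete, here is a minimal sketch of field-name normalization in Python. The vendor field names (`src`, `srcip`, `client_ip`) and the mapping table are illustrative assumptions, a toy stand-in for a real normalization database rather than any product's exact schema; the target name, `source.ip`, is the standard ECS field.

```python
# Illustrative sketch: the vendor field names below are assumptions, not
# any product's exact log schema; "source.ip" is the standard ECS field.
ECS_ALIASES = {
    "src": "source.ip",        # e.g. one firewall vendor
    "srcip": "source.ip",      # e.g. another firewall vendor
    "client_ip": "source.ip",  # e.g. a web proxy
}

def normalize(event: dict) -> dict:
    """Rename known vendor-specific keys to their ECS equivalents."""
    return {ECS_ALIASES.get(key, key): value for key, value in event.items()}

# A single detection rule can now match both events on "source.ip":
print(normalize({"srcip": "10.0.0.5", "action": "deny"}))
print(normalize({"client_ip": "10.0.0.5", "action": "deny"}))
```

In practice this remapping is only the simplest case; as noted above, real normalization also has to reconcile data types, nested structures, and semantics, not just key names.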
Axoflow builds this intelligence in. By combining AI with our deep data and security expertise, Axoflow automatically normalizes your data in real time within the pipeline. It supports multiple normalization frameworks, including Elastic Common Schema (ECS) and Open Cybersecurity Schema Framework (OCSF), so your security teams can work with normalized data from the start, enabling them to write stronger detection rules and uncover insights faster. You can even send the same data to multiple destinations, each receiving the normalized format it supports.
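As a rough illustration of multi-destination delivery, the sketch below maps the same parsed event into an ECS-shaped and an OCSF-shaped record. The helper functions, destination labels, and field choices are assumptions for illustration, not Axoflow's actual API or configuration; `source.ip` and `event.action` are ECS fields, and `src_endpoint.ip` follows OCSF's naming.

```python
# Hypothetical routing sketch: one parsed event delivered to two
# destinations, each in the schema it supports. Helper and destination
# names are illustrative, not Axoflow's actual API.
parsed = {"client_ip": "10.0.0.5", "action": "deny"}

def to_ecs(event: dict) -> dict:
    # Elastic Common Schema shape
    return {"source": {"ip": event["client_ip"]},
            "event": {"action": event["action"]}}

def to_ocsf(event: dict) -> dict:
    # Open Cybersecurity Schema Framework shape
    return {"src_endpoint": {"ip": event["client_ip"]},
            "activity_name": event["action"]}

DESTINATIONS = {"elastic": to_ecs, "security_lake": to_ocsf}
for destination, mapper in DESTINATIONS.items():
    print(destination, mapper(parsed))
```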
We create and maintain our normalization database using an AI-backed process supervised by our engineers, so development is rapid while quality and consistency stay high.
To sum up, normalizing data:
- simplifies data ingestion for your SIEM
- improves correlation and detection
- speeds up investigations
- decreases vendor lock-in and improves flexibility