
Splunk .conf25 - The Role of the Pipeline
As Q4 gears up and companies begin their drive to source new technology, I wanted to reflect on this past year in security operations and data pipelines. We are just coming off Splunk .conf25, which is absolutely my favorite SecOps conference. The reason is that literally everyone who stopped by the Axoflow booth understood immediately what Axoflow does as a platform, so the discussions could focus on what makes the Axoflow platform unique. That brings me to my second point: the underlying theme of the conversations at .conf25 this year.

What makes Axoflow unique is our automated processing engine. The classification of data sources, the curation of data flows, and reduction and routing are (and should be) automated. The maintenance of those workflows is on us, not on our customers or the community. We already understand that the data source and destination vendors are not incentivized to maintain these integrations, or to standardize around a protocol that could solve the problem. Axoflow has productized these operations at scale.
Which leads me to what we were hearing at .conf last week. It has always been our thesis that data quality is the root cause of ineffective and expensive SIEM operations. If we get the data right, the cost problem is also solved; firms may actually start asking for more data in the SIEM, not less. Data quality was the recurring discussion. Was that because the SecOps world is just coming around to this idea? I doubt it. Perhaps it's because not just SIEMs but also the downstream AI tools are going to be totally dependent on access to clean data. Or it's simply that there is too much data and too much noise, and it's time to fix it.

And my final point is that the pipeline is the ideal place to do this work. But don't take my word for it: George Kurtz, founder and CEO of CrowdStrike, was recently on CNBC with Jim Cramer and said:
"If you own the data pipeline, you're going to own the SIEM market."
The destinations are shifting left, and the data sources are shifting right. The pipeline allows for vendor-agnostic normalization across data sources and destinations. We can argue over the end state, but the role of the pipeline has never been clearer.

Follow Our Progress!
We are excited to be realizing our vision above with a full Axoflow product suite.
Sign Me Up

Fighting Data Loss?

Book a free 30-min consultation with syslog-ng creator Balázs Scheidler