
The Future of Streaming Data

Sion Smith 2 January 2025

Happy New Year, everyone!

As we kick off January, it’s time to share OSO’s annual trends and predictions for data streaming technology. Based on our roots in the Kafka open-source community and our experience leading London’s Kafka summits, we’ve had a front-row seat to the evolution of data streaming technology. Naturally, we’ve been thinking about the future and what’s next. Here are our 2025 predictions for the future of streaming data. 

 

  • Trend 1: Simplifying legacy infrastructure
  • Trend 2: Optimising infrastructure costs
  • Trend 3: Business use cases for artificial intelligence

Trend 1: Simplifying legacy infrastructure

First, data streaming is undergoing a major simplification. Traditional data pipelines, once accepted despite their complexity, are now recognised as brittle and inadequate for today’s data volumes and increasingly rapid rates of change. What worked just a few years ago is now obsolete, and legacy systems are struggling to keep pace with the exponential growth in data generation and processing requirements.

At the same time, modern enterprise companies are confronting a fragmented technology stack, driving a ‘shift left’ movement that aims to unify operational and analytical data sets through a stream-first approach. This isn’t just a technical shift; companies are fundamentally rethinking how data flows between components and re-imagining their systems. 

We’re excited about the emergence of more integrated workflows. Using a Kafka to Iceberg workflow, you can now manage stream and table evolution together as a single entity. This approach lets companies be more flexible: changes to streams no longer break tables, and teams can now test Event-Driven Architecture (EDA) applications and Change Data Capture (CDC) connectors directly within Iceberg topics.

You can now see innovations like Redpanda’s Iceberg connector, written in C++, pushing boundaries by offering native, high-performance integration that was previously challenging to achieve. 
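As a rough illustration, this kind of stream-to-table workflow can be configured at the topic level. The sketch below uses Redpanda’s `rpk` CLI; the `redpanda.iceberg.mode` property name and its value are our assumptions based on recent Redpanda releases, so check the documentation for your version before relying on them.

```shell
# Sketch only: create a topic and ask the broker to materialise it as an
# Iceberg table (property name and value assumed -- verify in your docs).
rpk topic create orders --partitions 3

# With Iceberg materialisation enabled, the stream and the table evolve
# together, so schema changes to the stream don't break the table.
rpk topic alter-config orders --set redpanda.iceberg.mode=value_schema_id_prefix
```

Once a topic is materialised this way, EDA applications and CDC connectors can be tested directly against the resulting Iceberg data.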

Trend 2: Optimising infrastructure costs

As companies look to do more with less, they’re focused on optimising costs. The trend is clear and compelling: fewer components in the streaming stack mean less management overhead and a smaller infrastructure footprint. Solutions that can be deployed as a single binary are the epitome of this approach. By streamlining complex infrastructure, you naturally save engineering time and budget.

Tiered storage represents another significant shift in how companies approach data management. Leveraging cheap object storage is dramatically driving workloads away from expensive SSD-backed brokers. You can see this trend in Confluent’s acquisition of WarpStream, which signals the industry’s move towards more cost-effective storage strategies.
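To make the economics concrete, here’s a back-of-the-envelope model comparing an all-SSD retention window with a tiered layout. All prices and sizes are illustrative placeholders, not vendor quotes:

```python
# Back-of-the-envelope model: monthly storage cost for a retention window
# split between broker-attached SSD (hot) and object storage (cold).
# Prices are illustrative placeholders, not vendor quotes.

def monthly_storage_cost(hot_gb: float, cold_gb: float,
                         ssd_price_per_gb: float = 0.10,
                         object_price_per_gb: float = 0.02) -> float:
    """Total monthly cost with hot data on SSD and cold data in object storage."""
    return hot_gb * ssd_price_per_gb + cold_gb * object_price_per_gb

retention_gb = 50_000  # 50 TB of retained stream data

all_ssd = monthly_storage_cost(retention_gb, 0)             # everything on brokers
tiered = monthly_storage_cost(2_000, retention_gb - 2_000)  # keep 2 TB hot

print(f"all-SSD: ${all_ssd:,.0f}/month  tiered: ${tiered:,.0f}/month")
```

Even with generous assumptions for the hot set, moving cold segments to object storage cuts the storage bill several-fold, which is why the pattern is spreading.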

Redpanda’s ‘mixed mode’ offering is one of these emerging cost-effective options. By giving companies topic-level control over how data segments are stored, it lets organisations dynamically optimise their storage strategies and balance performance and cost with precision. It’s a win-win: it directly addresses the CFO’s desire for cost efficiency while meeting the CTO’s need for technical flexibility.
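In practice, this topic-level control looks something like the sketch below, using Redpanda’s tiered storage properties as we understand them from the documentation (the topic name and retention value are placeholders; verify the property names against your version):

```shell
# Sketch: per-topic tiered storage in Redpanda (illustrative values).
# Upload closed segments to object storage, and allow reads from it:
rpk topic alter-config clickstream --set redpanda.remote.write=true
rpk topic alter-config clickstream --set redpanda.remote.read=true

# Keep only a small hot set on local SSD; older data lives in object storage.
rpk topic alter-config clickstream --set retention.local.target.bytes=2000000000
```

A latency-sensitive topic can keep a larger local window while archival topics tier aggressively, which is exactly the per-topic balancing act described above.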

Trend 3: Business use cases for artificial intelligence

Perhaps the most exciting trend is the increasingly prominent role of AI in re-platforming and migrating between technologies. We’ve seen its potential for streamlining operations through a proof of concept (POC) OSO conducted for a UK Tier 1 bank, in which we had to migrate 5,000 IBM DataStage batch jobs to a streaming data pipeline.

Data sovereignty and intellectual property protection remain critical considerations in these types of migrations, but solutions like Redpanda ONE, which supports running private local models on an Nvidia-compatible binary, mitigate some of the risks. Solutions like these give companies the flexibility to incorporate AI-driven migrations while maintaining control over sensitive data.

Open-source frameworks like Benthos are also emerging as powerful tools. By providing templated, structured, and scalable streaming pipelines, they simplify complex migrations and act as bridges that help organisations modernise their data.
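An illustrative Benthos pipeline shows the templated style: a declarative config consumes a legacy topic, reshapes each record with a Bloblang mapping, and publishes the result. The topic names, addresses, and mapping below are placeholders, not from any real migration:

```yaml
# Illustrative Benthos pipeline: the kind of reusable building block a
# batch-to-streaming migration can be assembled from.
input:
  kafka:
    addresses: ["localhost:9092"]
    topics: ["legacy_batch_output"]
    consumer_group: "migration_poc"

pipeline:
  processors:
    - mapping: |
        root = this              # pass the record through unchanged
        root.migrated_at = now() # stamp when it entered the new pipeline

output:
  kafka:
    addresses: ["localhost:9092"]
    topic: "streaming_events"
```

Because each stage is declarative, hundreds of similar jobs can be generated from a handful of templates rather than hand-written one by one.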

What’s next for the future of streaming data

As we evaluate these trends, it looks like the future of data streaming isn’t about adding complexity, but about creating more intelligent, efficient, and adaptable systems: breaking down silos, optimising costs, and changing the way organisations understand and use their data.

This year, OSO is committed to staying at the forefront of these innovations, helping both our clients and the London open-source community navigate new developments and explore new possibilities. Our expertise in Kafka, combined with our strong track record of project delivery, means that we’ll stay on top of the latest trends and share them with our community.

We’re also looking forward to building new tutorials and resources for Kafka developers. If you’d like to learn more, come to our Kafka Meetups or explore our educational content! We’d love to hear your thoughts on data streaming’s future. 

 

Cheers!

Sion Smith, Co-Founder and CTO, OSO

Accelerate your Kafka adoption with the right support

Get in touch with us today to discover how you can leverage your data to build a faster, more responsive business

Book a call