
The future of Data Streaming using shared data products

Sion Smith 17 August 2023

What is the future of data streaming, and how can it revolutionise the way companies share and collaborate with their partners? 80% of the Fortune 500 have adopted Kafka internally, and it has become common practice for any organisation that wants to leverage data. The next step in data streaming is to extend this data movement framework to external partners and close collaborators. Building data products from external data sources raises important questions about governance, data security, and data policy. However, it is also a frontier that holds great potential for companies as they continue to adopt Kafka internally.

The future of Data Streaming: Challenges with shared data architecture

When companies transition to microservices-based systems, they often treat other departments as external entities. They validate their input, provide uptime guarantees, and ensure smooth collaboration. But why not treat other companies as external departments in the same way? There is a middle ground between internal and external teams, and this boundary needs to be addressed.

Treating internal teams as semi-external raises concerns about accidental DDoS attacks and the need for internal API throttling. It also prompts the question of why external teams are not treated as semi-internal. While there are challenges to be solved in terms of security, governance, and scalability, it is important for technologists to address these issues and find solutions.

What is a data product?

Kafka and data streaming are gaining popularity because of the huge range of use cases they offer. An interesting example is an online grocery store that relies heavily on Kafka for its operations, with millions of events streaming through its system every day. It is now able to offer customers deliveries in under 15 minutes as a premium subscription feature.

This capability was born out of the data gathered from robots in their factory, giving a physical representation of the power of data streaming: groceries flowing along conveyor belts while events stream through topics. It is a testament to the potential of Kafka and how it can revolutionise various industries and create value from your data. By showcasing the possibilities and potential of Kafka, more companies will be encouraged to explore and leverage this technology to build data products.
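As a minimal sketch of what such an event might look like in practice (the topic and field names here are hypothetical, not the retailer's actual schema), a grocery order event could be built and serialized as JSON before being produced to a Kafka topic:

```python
import json
from datetime import datetime, timezone


def build_order_event(order_id: str, items: list, warehouse: str) -> bytes:
    """Serialize a grocery order event as JSON bytes, ready to produce to a Kafka topic."""
    event = {
        "order_id": order_id,
        "items": items,
        "warehouse": warehouse,
        "event_time": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(event).encode("utf-8")


# In a real deployment this payload would be handed to a Kafka producer, e.g.:
#   producer.produce("orders", key=order_id, value=payload)
payload = build_order_event("ord-42", ["milk", "bread"], "london-1")
```

Every conveyor-belt movement, pick, and dispatch can be modelled as an event like this, which is what allows downstream consumers to react in near real time.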

Stream sharing with Confluent

Confluent have recently released Stream Sharing. With it, companies can unlock the full potential of data streaming and revolutionise the way they share and collaborate with their partners. Using Stream Sharing, teams can:

  • Easily exchange real-time data without delays directly from Confluent to any Kafka client.
  • Safely share and protect data with robust authenticated sharing, access management, and layered encryption controls.
  • Trust the quality and compatibility of shared data by enforcing consistent schemas across users, teams, and organisations.
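To illustrate the last point, here is a deliberately simplified sketch of the kind of backward-compatibility check a schema registry performs before accepting a new schema version (this is an illustration of the idea only, not Confluent Schema Registry's actual compatibility algorithm, which also checks field types, defaults, and more):

```python
def is_backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """A consumer using new_schema must still be able to read data written with
    old_schema: every field the new schema requires must already exist in the
    old one. (Simplified rule for illustration.)"""
    old_fields = set(old_schema["fields"])
    return set(new_schema.get("required", [])).issubset(old_fields)


old = {"fields": ["order_id", "items"]}
# Adding an optional field is safe; old data still satisfies the new schema.
new_ok = {"fields": ["order_id", "items", "note"], "required": ["order_id"]}
# Requiring a field that old data never had would break existing consumers.
new_bad = {"fields": ["order_id", "priority"], "required": ["priority"]}
```

Enforcing checks like this across users, teams, and organisations is what lets a shared stream evolve without silently breaking the partners consuming it.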

And don’t forget to mark your calendars for the upcoming Big Data LDN event. Our Kafka Meetup London will cover how to move from batch to real-time. You can register here:

These events are great opportunities to connect with the Kafka community, learn from industry experts, and stay updated on the latest developments in the world of data streaming. I hope to see you there!

For more content:

How to take your Kafka projects to the next level with a Confluent preferred partner

Event driven Architecture: A Simple Guide

Watch Our Kafka Summit Talk: Offering Kafka as a Service in Your Organisation

Successfully Reduce AWS Costs: 4 Powerful Ways

Protecting Kafka Cluster

Apache Kafka Common Mistakes

Kafka Cruise Control 101

Kafka performance best practices for monitoring and alerting

How to build a custom Kafka Streams Statestores

Get started with OSO professional services for Apache Kafka

Have a conversation with a Kafka expert to discover how we can help you adopt Apache Kafka in your business.

Contact Us