
Real-time Push APIs Using Kafka 

Sion Smith 14 July 2023

We all love real-world examples of how to leverage the power of Kafka, so let's explore the concept of building real-time push APIs using Kafka as the customer-facing interface. We will discuss the advantages and challenges of a push-based approach, the authentication and authorisation mechanisms involved, and the documentation standards used.

Real-time Push APIs using Kafka: The rise of Push-Based approach

The push-based approach to building real-time push APIs using Kafka typically pairs a smart broker with a dumb client. This approach can be challenging to implement because of its operational complexity and the need for retry strategies, and consumers must be sized appropriately to avoid overloading them. Additionally, in a push-based approach, once a message has been acknowledged by the consumer it cannot be received again. This can be problematic for B2B customers who may need to replay messages if something goes wrong. In a pull-based approach, by contrast, messages stay in the system and can be replayed when needed, enabling real 'exactly-once' semantics.
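To make this concrete, here is a minimal sketch of such a "push bridge": a service that consumes from Kafka and pushes each record to a customer webhook with a simple retry strategy, committing offsets only after delivery succeeds. The topic name, consumer group, and webhook URL are illustrative assumptions, not a reference implementation.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// Sketch of a push bridge: consume from Kafka, push each record to a
// customer webhook, and commit offsets only once delivery has succeeded.
public class PushBridge {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9092");  // placeholder
        props.put("group.id", "customer-a-push-bridge");            // hypothetical group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");  // commit manually, after the push

        HttpClient http = HttpClient.newHttpClient();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("customer-a.orders"));       // hypothetical topic
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    push(http, record.value());
                }
                consumer.commitSync();  // once acknowledged, the message is not re-delivered
            }
        }
    }

    // Naive bounded retry; a production bridge would add backoff and a dead-letter topic.
    static void push(HttpClient http, String payload) {
        HttpRequest request = HttpRequest.newBuilder(
                        URI.create("https://customer-a.example.com/events"))  // placeholder webhook
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        for (int attempt = 1; attempt <= 3; attempt++) {
            try {
                HttpResponse<Void> response =
                        http.send(request, HttpResponse.BodyHandlers.discarding());
                if (response.statusCode() < 300) return;
            } catch (Exception e) {
                // fall through and retry
            }
        }
        throw new IllegalStateException("push failed after retries");
    }
}
```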

Real-time Push APIs using Kafka: Event-Driven API Ecosystem

To create an event-driven API ecosystem with an experience similar to REST APIs, several considerations need to be addressed: authentication, authorisation, and documentation.

Authentication and Authorisation in Apache Kafka

When implementing authentication for real-time push APIs using Kafka, it is important to ensure that the authentication mechanism is seamless and consistent across different types of APIs. Kafka supports various authentication mechanisms, including OAuth 2.0 token authentication via SASL/OAUTHBEARER. This allows clients to use the same authentication mechanism both for REST requests and for connecting to the Kafka broker.
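As a minimal sketch, the Java client properties for OAuth 2.0 over SASL/OAUTHBEARER might look like the following; the broker address, token endpoint, and client credentials are placeholders, and the exact login callback handler class path varies between Kafka versions.

```java
import java.util.Properties;

public class OAuthClientConfig {
    // Sketch: Kafka client properties for OAuth 2.0 via SASL/OAUTHBEARER.
    // Broker address, token endpoint, and credentials are placeholders.
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "OAUTHBEARER");
        // Kafka's built-in OIDC login handler (package path varies by Kafka version)
        props.put("sasl.login.callback.handler.class",
                "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginCallbackHandler");
        props.put("sasl.oauthbearer.token.endpoint.url", "https://idp.example.com/oauth2/token");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required "
                + "clientId=\"my-client\" clientSecret=\"my-secret\";");
        return props;
    }
}
```

The same OAuth client credentials can then back both the customer's REST calls and their Kafka connection, which is what makes the experience consistent across API types.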

Authorisation in Kafka is based on Kafka resources such as topics and consumer groups. ACLs (Access Control Lists) can be used to authorise access to specific topics or groups. However, it is important to note that authorisation is at the topic level, meaning that consumers authorised for a topic can consume all messages within that topic. This may require careful consideration when structuring topics for different customers. If you are unsure how to design your topics, speak to one of our experts.
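As an illustration of topic-level authorisation, the sketch below grants a hypothetical principal User:customer-a read access to a single per-customer topic via Kafka's Admin client; the topic and principal names are assumptions, and in practice a matching consumer-group ACL would also be needed.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class GrantReadAcl {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9093"); // placeholder
        try (Admin admin = Admin.create(props)) {
            // Allow the hypothetical principal User:customer-a to read the
            // customer-a.orders topic from any host. Authorisation is per topic,
            // so once granted, customer-a can consume every message on it.
            // (A READ ACL on the consumer group is also required to consume.)
            AclBinding binding = new AclBinding(
                    new ResourcePattern(ResourceType.TOPIC, "customer-a.orders", PatternType.LITERAL),
                    new AccessControlEntry("User:customer-a", "*",
                            AclOperation.READ, AclPermissionType.ALLOW));
            admin.createAcls(List.of(binding)).all().get();
        }
    }
}
```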

Real-time Push APIs using Kafka: Create documentation standards

Providing clear and comprehensive documentation is crucial for any API ecosystem. For REST APIs, OpenAPI is the widely used documentation standard; push-based APIs need a similar standard. AsyncAPI, which is compatible with OpenAPI, provides a similar look and feel and can be used to document push APIs. Integrating these documentation standards into a developer portal gives customers a consistent and user-friendly experience. If you are interested in a full end-to-end example of Kafka and AsyncAPI, check out SpecMesh.
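As a sketch, a minimal AsyncAPI document for one of these push topics might look like the following; the server, channel, and payload definitions are illustrative assumptions rather than a complete specification.

```yaml
# Minimal AsyncAPI 2.6 sketch for a Kafka-backed push API (names are illustrative)
asyncapi: '2.6.0'
info:
  title: Orders Push API
  version: '1.0.0'
servers:
  production:
    url: broker.example.com:9093
    protocol: kafka
    security:
      - oauth2: []
channels:
  customer-a.orders:
    subscribe:
      summary: Order events pushed to customer A
      message:
        contentType: application/json
        payload:
          type: object
          properties:
            orderId:
              type: string
            status:
              type: string
components:
  securitySchemes:
    oauth2:
      type: oauth2
      flows:
        clientCredentials:
          tokenUrl: https://idp.example.com/oauth2/token
          scopes: {}
```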

Event-driven APIs are a perfect match for IoT

Building event-driven real-time push APIs using Kafka as the customer-facing interface offers several advantages, especially in IoT scenarios. By combining Kafka with authentication mechanisms like OAuth 2.0 and documentation standards like AsyncAPI, a similar experience to REST APIs can be achieved. However, it is important to consider the limitations of authorisation at the topic level and the potential impact on topic structure. Additionally, a pull-based approach may be more suitable for B2C scenarios where device availability and bandwidth limitations are a concern.

Kafka provides a powerful foundation for implementing real-time push APIs. By leveraging its features and combining them with industry-standard authentication and documentation practices, a robust and scalable event-driven API ecosystem can be created. While there are challenges and limitations, such as authorisation at the topic level, Kafka’s flexibility and compatibility with existing infrastructure make it a viable choice for B2B scenarios.

See how German startup Kugu followed this methodology to build their digital platform to lower carbon emissions of buildings.

For more content:

How to take your Kafka projects to the next level with a Confluent preferred partner

Event driven Architecture: A Simple Guide

Watch Our Kafka Summit Talk: Offering Kafka as a Service in Your Organisation

Successfully Reduce AWS Costs: 4 Powerful Ways

Protecting Kafka Cluster

Apache Kafka Common Mistakes

Kafka Cruise Control 101

Kafka performance best practices for monitoring and alerting

Get started with OSO professional services for Apache Kafka

Have a conversation with a Kafka expert to discover how we can help you adopt Apache Kafka in your business.

CONTACT US