We all love real-world examples of how to leverage the power of Kafka, so let's explore the concept of building real-time push APIs using Kafka as the customer-facing interface. We will discuss the advantages and challenges of a push-based approach, the authentication and authorisation mechanisms involved, and the documentation standards used.
Real-time Push APIs using Kafka: The Rise of the Push-Based Approach
The push-based approach to building real-time push APIs typically involves a smart broker and a dumb client. This approach can be challenging to implement due to operational complexities and the need for retry strategies, and consumers must be appropriately sized to avoid overloading them. Additionally, in a push-based approach, once a message is acknowledged by the consumer, it cannot be received again. This is problematic for B2B customers who may need to replay messages after an issue on their side. In Kafka's pull-based approach, by contrast, messages remain in the topic for the retention period and can be re-read by rewinding the consumer's offset, enabling replay and, combined with idempotent processing, effectively 'exactly-once' semantics.
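The replay property described above can be illustrated with a deliberately simplified, in-memory model of a Kafka partition. This is a sketch of the pull-based semantics only, not a real Kafka client API; the class and method names are invented for illustration.

```python
# Simplified in-memory model of Kafka's pull-based log: messages stay in the
# partition, and the consumer tracks its own offset. Acknowledging (committing)
# never deletes anything, so replay is just a matter of rewinding the offset.
# All names here are illustrative, not a real Kafka client API.

class PartitionLog:
    """A partition is an append-only log addressed by offset."""
    def __init__(self):
        self._messages = []

    def append(self, message):
        self._messages.append(message)
        return len(self._messages) - 1  # offset of the new record

    def read_from(self, offset, max_records=10):
        """Pull semantics: the consumer asks for records starting at an offset."""
        return list(enumerate(self._messages[offset:offset + max_records],
                              start=offset))

class Consumer:
    """Tracks its own position; committing does not remove records."""
    def __init__(self, log):
        self.log = log
        self.offset = 0

    def poll(self, max_records=10):
        records = self.log.read_from(self.offset, max_records)
        if records:
            self.offset = records[-1][0] + 1  # advance past the last record
        return [msg for _, msg in records]

    def seek(self, offset):
        """Replay: rewind to any retained offset, e.g. after a B2B incident."""
        self.offset = offset

log = PartitionLog()
for event in ["order-created", "order-paid", "order-shipped"]:
    log.append(event)

consumer = Consumer(log)
first_pass = consumer.poll()
consumer.seek(0)        # the customer asks to replay from the beginning
replayed = consumer.poll()
print(first_pass == replayed)  # True: the records are still in the log
```

In a push-based system the broker would have forgotten the acknowledged messages; here they survive for as long as the log retains them.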
Real-time Push APIs using Kafka: Event-Driven API Ecosystem
To create an event-driven API ecosystem with a developer experience similar to REST APIs, several considerations need to be addressed: authentication, authorisation, and documentation.
Authentication and Authorisation methods of Apache Kafka:
When implementing authentication for real-time push APIs using Kafka, it is important that the mechanism is seamless and consistent across different types of APIs. Kafka supports various authentication mechanisms, including OAuth 2.0 token authentication via SASL/OAUTHBEARER. This allows clients to use the same credentials for both REST requests and connections to the Kafka brokers.
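As a sketch, the client-side configuration for SASL/OAUTHBEARER looks like the following (librdkafka / confluent-kafka property names; the broker address, identity-provider endpoint, client id, and secret are placeholders for your own environment):

```python
# Sketch of consumer properties for Kafka's SASL/OAUTHBEARER mechanism using
# OIDC client credentials. These are the same OAuth 2.0 credentials a REST
# gateway would accept, so one identity serves both API styles. The endpoint
# and credential values below are placeholders, not real services.

def oauth_consumer_config(token_endpoint, client_id, client_secret, group_id):
    """Assemble consumer properties for OAuth 2.0 (OIDC client credentials)."""
    return {
        "bootstrap.servers": "kafka.example.com:9093",  # placeholder broker
        "security.protocol": "SASL_SSL",                # TLS + SASL auth
        "sasl.mechanism": "OAUTHBEARER",                # OAuth 2.0 bearer tokens
        "sasl.oauthbearer.method": "oidc",              # fetch tokens via OIDC
        "sasl.oauthbearer.token.endpoint.url": token_endpoint,
        "sasl.oauthbearer.client.id": client_id,
        "sasl.oauthbearer.client.secret": client_secret,
        "group.id": group_id,
    }

config = oauth_consumer_config(
    "https://idp.example.com/oauth2/token",  # placeholder IdP token endpoint
    "customer-a", "s3cret", "customer-a-consumers",
)
# This dict would be passed to confluent_kafka.Consumer(config).
print(config["sasl.mechanism"])  # OAUTHBEARER
```

The client library fetches and refreshes the bearer token from the identity provider automatically, so the application code never handles tokens directly.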
Authorisation in Kafka is based on Kafka resources such as topics and consumer groups, and ACLs (Access Control Lists) are used to grant access to specific topics or groups. It is important to note that authorisation is at the topic level: a consumer authorised for a topic can consume every message within that topic. This requires careful consideration when structuring topics for different customers, for example by isolating each customer on their own topic or topic prefix. If you are unsure how to design your topics, speak to one of our experts.
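A per-customer topic layout can then be enforced with Kafka's standard ACL tooling. The sketch below grants one customer's principal read access to its own topic and consumer group only; the principal, topic, and group names are illustrative, and the command must be run against a cluster with an authorizer enabled.

```shell
# Grant customer A's OAuth principal read access to its own topic and
# consumer group only. Because authorisation is per topic, isolation
# between customers comes from the topic layout itself.
bin/kafka-acls.sh --bootstrap-server kafka.example.com:9093 \
  --command-config admin.properties \
  --add \
  --allow-principal User:customer-a \
  --operation Read --operation Describe \
  --topic customer-a.orders \
  --group customer-a-consumers
```

Using `--resource-pattern-type prefixed` with a topic prefix such as `customer-a.` is a common variation that covers all of a customer's topics with a single ACL.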
Create documentation standards
Providing clear and comprehensive documentation is crucial for any API ecosystem. For REST APIs, OpenAPI is a widely used standard for documenting APIs; for push-based APIs, a similar standard is needed. AsyncAPI, which is closely modelled on OpenAPI, provides a similar look and feel and can be used to document push APIs. Integrating these documentation standards into a developer portal can provide a consistent and user-friendly experience for customers. If you are interested in a full end-to-end example of Kafka and AsyncAPI, check out SpecMesh.
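To give a feel for the format, here is a minimal AsyncAPI sketch describing one customer-facing topic. The topic, server, and schema names are illustrative, not taken from a real service:

```yaml
# Minimal AsyncAPI document for a single customer-facing Kafka topic.
asyncapi: '2.6.0'
info:
  title: Orders Push API
  version: '1.0.0'
servers:
  production:
    url: kafka.example.com:9093
    protocol: kafka
    security:
      - oauth2: []
channels:
  customer-a.orders:
    subscribe:
      summary: Order lifecycle events pushed to customer A.
      message:
        name: OrderEvent
        contentType: application/json
        payload:
          type: object
          properties:
            orderId:
              type: string
            status:
              type: string
              enum: [created, paid, shipped]
components:
  securitySchemes:
    oauth2:
      type: oauth2
      flows:
        clientCredentials:
          tokenUrl: https://idp.example.com/oauth2/token
          scopes: {}
```

Anyone familiar with OpenAPI will recognise the `info`, `servers`, and schema sections; channels and messages replace paths and request/response bodies.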
Event-driven APIs are a perfect match for IoT
Building event-driven real-time push APIs using Kafka as the customer-facing interface offers several advantages, especially in IoT scenarios. By combining Kafka with authentication mechanisms like OAuth 2.0 and documentation standards like AsyncAPI, an experience similar to REST APIs can be achieved. However, it is important to consider the limitation of authorisation at the topic level and its impact on topic structure. Additionally, a pull-based approach may be more suitable for B2C scenarios where device availability and bandwidth limitations are a concern.
Kafka provides a powerful foundation for implementing real-time push APIs. By leveraging its features and combining them with industry-standard authentication and documentation practices, a robust and scalable event-driven API ecosystem can be created. While there are challenges and limitations, such as authorisation at the topic level, Kafka's flexibility and compatibility with existing infrastructure make it a viable choice for B2B scenarios.
See how the German startup Kugu followed this methodology to build a digital platform that lowers the carbon emissions of buildings.