Messages in the Ether: Architecting Microservice Communication Protocols

  • Diljeet Singh
  • June 30, 2024

This is part 1 of a multi-part series on how microservice-to-microservice communication can be improved through streaming platforms.

Microservice architecture is one of the leading approaches to software development. Communication between these services is an essential component of modern software architecture: it enables disparate services to interact and function cohesively within a distributed system. Communication can be achieved through various methods, with REST and streaming platforms being two prominent approaches. While both are valid and functional, they scale differently. Let’s discuss some of the pros and cons of each, including failure scenarios and the technical complexity of the data flow.


REST
REST, or Representational State Transfer, is a widely adopted protocol for microservice communication. It operates over HTTP, using standard methods such as GET, POST, PUT, and DELETE to perform operations on resources identified by URLs. Each microservice exposes its functionality via RESTful APIs, allowing other services to interact with it through HTTP requests. REST’s simplicity and ubiquity make it a popular choice among developers, as it integrates easily with existing web technologies and frameworks. Tools such as Swagger UI make testing and adoption of these APIs easier through automated API documentation generation.
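As a minimal illustration of this request/response model, here is a self-contained sketch using only Python's standard library. The "inventory" service, its route, and the payload are hypothetical; a real service would use a proper web framework.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical "inventory" microservice exposing a resource via GET.
class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/items/42":
            body = json.dumps({"id": 42, "name": "widget", "stock": 7}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging for the demo
        pass

server = HTTPServer(("127.0.0.1", 0), InventoryHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A second service interacts with it through a plain HTTP request,
# blocking until the response arrives (synchronous by nature).
url = f"http://127.0.0.1:{server.server_port}/items/42"
with urlopen(url) as resp:
    item = json.loads(resp.read())

print(item["name"])
server.shutdown()
```

The caller blocks on `urlopen` until the inventory service answers, which is exactly the synchronous coupling discussed next.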

In a RESTful architecture, communication between microservices is typically synchronous. When one microservice needs to interact with another, it sends an HTTP request and waits for a response. This synchronous nature makes REST straightforward to understand and debug, as the flow of data is direct and predictable. However, this approach also has inherent limitations: as the number of microservices grows, the number of interconnections grows with it, and managing the communication between services becomes increasingly complex.

Microservices can typically request entity information from other services via REST, either in batches or as individual selections. If multiple services need similar information from another service, they must each make REST calls to that service, which increases database queries, transaction counts, and latency. These performance costs are multiplied across every service making the requests. To improve efficiency you can cache the responses, but this comes at the cost of increased complexity and potentially maintaining multiple caching implementations across multiple services. This can be particularly problematic in scenarios requiring high scalability and low latency, such as real-time data processing. REST can be a valid selection for batch processing, but it does not scale well when used for real-time data processing.
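To make the caching trade-off concrete, here is a small sketch where a counter stands in for the upstream HTTP request plus database query. The `fetch_customer` function and its payload are hypothetical; a real cache would also need an eviction and invalidation strategy, which is where much of the added complexity lives.

```python
from functools import lru_cache

calls = {"count": 0}

# Hypothetical stand-in for a REST call to another service's
# database-backed endpoint.
def fetch_customer(customer_id: int) -> dict:
    calls["count"] += 1  # each call would be an HTTP request + DB query
    return {"id": customer_id, "name": f"customer-{customer_id}"}

# A per-service cache: repeated lookups for the same entity
# no longer hit the upstream service.
@lru_cache(maxsize=1024)
def fetch_customer_cached(customer_id: int) -> dict:
    return fetch_customer(customer_id)

for _ in range(3):           # three consumers asking for the same entity
    fetch_customer_cached(7)

print(calls["count"])        # one upstream request instead of three
```

Note that every service needing this data would maintain its own copy of such a cache, each with its own staleness window.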

Streaming Platforms

Streaming platforms, such as Apache Kafka, provide a robust and scalable solution for handling high-throughput, low-latency communication. In streaming-based architectures, microservices publish messages to a central message broker or stream processor, which then distributes these messages to the appropriate consumers. Unlike REST, which is inherently synchronous, streaming platforms enable asynchronous communication. This means that services can produce and consume messages independently and asynchronously. This decoupling enhances the scalability and resilience of the system, as services can continue to operate even if some components are temporarily unavailable.

For example, in Kafka, services produce messages to data streams (topics), and other services consume messages from these data streams. This publish-subscribe model allows multiple services to listen to the same stream of messages, enabling simpler data processing pipelines and real-time processing of events.
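The publish-subscribe model can be sketched with a tiny in-memory broker. This is a stand-in for a platform like Kafka, not its API; the topic name and payload are illustrative.

```python
from collections import defaultdict

# Minimal in-memory sketch of a publish-subscribe broker.
class Broker:
    def __init__(self):
        self.topics = defaultdict(list)       # topic -> ordered message log
        self.subscribers = defaultdict(list)  # topic -> consumer callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def produce(self, topic, message):
        self.topics[topic].append(message)    # messages are retained in the log
        for callback in self.subscribers[topic]:
            callback(message)                 # every subscriber sees the message

broker = Broker()
billing_seen, shipping_seen = [], []
broker.subscribe("orders", billing_seen.append)  # two independent consumers
broker.subscribe("orders", shipping_seen.append)

broker.produce("orders", {"order_id": 1, "total": 9.99})

print(billing_seen == shipping_seen)  # one publish reached both consumers
```

The producer knows nothing about who consumes the message: adding a third consumer requires no change to the producing service.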

One of the significant advantages of streaming platforms is their ability to tolerate service failures during processing. In a RESTful architecture, if a microservice fails while handling a request, the client service may experience errors or timeouts, requiring additional logic to handle retries and error recovery. This complicates the implementation and increases overall system complexity. In contrast, streaming platforms are inherently designed to handle such failures more gracefully. When a microservice produces a message to a data stream (topic), the message is stored in the message broker until it is consumed. If a consuming microservice fails, the message remains in the data stream and can be reprocessed once the service is back online. This ensures that no data is lost and that processing can resume seamlessly. Additionally, streaming platforms often provide built-in mechanisms for message ordering, delivery guarantees, and fault tolerance. For instance, Kafka supports various delivery semantics, such as at-most-once, at-least-once, and exactly-once, allowing developers to choose the appropriate level of reliability for their use case.
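The recovery behavior can be sketched with a retained log and a committed offset, loosely modeled on how Kafka consumers resume (at-least-once semantics). All names and the crash simulation are illustrative, not real broker code.

```python
# Messages retained by the broker, and the last position the consumer committed.
log = ["event-1", "event-2", "event-3", "event-4"]
committed_offset = 0

def consume(fail_at=None):
    """Process messages from the committed offset; optionally crash mid-way."""
    global committed_offset
    processed = []
    for offset in range(committed_offset, len(log)):
        if offset == fail_at:
            return processed           # simulate the consumer dying here
        processed.append(log[offset])
        committed_offset = offset + 1  # commit only after successful processing
    return processed

first_run = consume(fail_at=2)  # crashes before processing event-3
second_run = consume()          # restart: resumes from the committed offset
print(first_run + second_run)   # all four events processed, none lost
```

Because the offset is committed only after a message is successfully processed, a crash can cause a message to be delivered again but never skipped, which is the at-least-once guarantee mentioned above.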

Another significant benefit of streaming platforms is their efficiency in handling scenarios where multiple services need to retrieve the same data. In a RESTful system, each service would need to send separate requests to obtain the same data, resulting in duplicated effort and increased load on the data source. Streaming platforms address this issue by allowing services to subscribe to a data stream and receive messages as they are published. This means that once a piece of data is produced to a data stream, it can be consumed by multiple services without redundant requests to the data source. This approach reduces the load on data sources and improves overall system efficiency, especially in data-intensive applications where the same information needs to be processed by various components.

In conclusion, REST offers a simple and straightforward approach for synchronous communication, making it usable for applications with low-to-moderate scalability, a small number of microservices, and loose latency requirements. For scenarios requiring high throughput, low latency, robust fault tolerance, and efficient data retrieval, streaming platforms present a superior solution. By leveraging the capabilities of streaming platforms, organizations can build more resilient and scalable microservice architectures that meet the demands of modern, data-intensive applications.

In part 2, we will explore how streaming platforms can provide better security for accessing data from your microservices.