Messages in the Ether: Architecting Microservice Communication Protocols

This is part 1 of a multi-part series on how microservice-to-microservice communication can be improved through streaming platforms.

Microservice architecture is one of the leading approaches to software development, and communication between services is an essential component of it: it is what lets disparate services interact and function cohesively within a distributed system. Communication can be achieved through various methods, with REST and streaming platforms being two prominent approaches. While both are valid and functional, they scale differently. Let’s discuss some of the pros and cons of each, including failure scenarios and the technical complexity of the data flow.

REST

REST, or Representational State Transfer, is a widely adopted protocol for microservice communication. It operates over HTTP, using standard methods such as GET, POST, PUT, and DELETE to perform operations on resources identified by URLs. Each microservice exposes its functionality via RESTful APIs, allowing other services to interact with it through HTTP requests. REST’s simplicity and ubiquity make it a popular choice among developers, as it integrates easily with existing web technologies and frameworks. Tools such as Swagger UI make testing and adoption of these APIs easier through automated API documentation generation.
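To make the request-response pattern concrete, here is a minimal sketch, using only the Python standard library, of one service exposing a resource over HTTP and a second service calling it synchronously. The `/users/42` path and the returned fields are invented for illustration; a real service would use a proper framework and routing.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A toy "user service" exposing a single resource via GET.
class UserHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/users/42":
            body = json.dumps({"id": 42, "name": "Ada"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), UserHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A second "service" makes a synchronous REST call and blocks until it responds.
url = f"http://127.0.0.1:{server.server_port}/users/42"
with urlopen(url) as resp:
    user = json.loads(resp.read())

server.shutdown()
```

The caller cannot make progress until the response arrives, which is exactly the synchronous coupling discussed below.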

In a RESTful architecture, communication between microservices is typically synchronous. When one microservice needs to interact with another, it sends an HTTP request and waits for a response. This synchronous nature makes REST straightforward to understand and debug, as the flow of data is direct and predictable. However, this approach also has inherent limitations: as the number of microservices grows, managing the communication between them becomes increasingly complex, because the number of interconnections grows even faster.
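The growth in interconnections is easy to quantify. With direct point-to-point calls, a full mesh of n services has n·(n−1) possible directed call paths, while routing everything through a central broker needs only one connection per service. A quick sketch (the full-mesh assumption is a worst case, not a claim about any particular system):

```python
def mesh_links(n: int) -> int:
    # Worst case: every service may call every other service directly.
    return n * (n - 1)

def broker_links(n: int) -> int:
    # Every service holds a single connection to the central broker.
    return n

for n in (5, 20, 100):
    print(f"{n} services: {mesh_links(n)} mesh links vs {broker_links(n)} broker links")
```

At 100 services the direct-call topology has 9,900 potential call paths to manage, versus 100 broker connections.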

Microservices can typically request entity information from other services via REST, either in batches or as individual lookups. If multiple services need similar information from another service, they must each make REST calls to it, which can result in more database queries, more transactions, higher latency, and degraded performance. These costs are multiplied across every service making the requests. To improve efficiency you can cache the responses, but that comes at the cost of increased complexity and potentially maintaining multiple caching implementations across multiple services. This is particularly problematic in scenarios requiring high scalability and low latency, such as real-time data processing. As you can see, REST can be a valid selection for batch processing, but it does not scale well for real-time data processing.

Streaming Platforms

Streaming platforms, such as Apache Kafka, provide a robust and scalable solution for handling high-throughput, low-latency communication. In streaming-based architectures, microservices publish messages to a central message broker or stream processor, which then distributes those messages to the appropriate consumers. Unlike REST’s synchronous request-response exchanges, streaming platforms enable asynchronous communication: services can produce and consume messages independently, on their own schedules. This decoupling enhances the scalability and resilience of the system, as services can continue to operate even if some components are temporarily unavailable.

For example, in Kafka, services produce messages to data streams (topics), and other services consume messages from those streams. This publish-subscribe model allows multiple services to listen to the same stream of messages, enabling simpler data processing pipelines and real-time processing of events.
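The fan-out behavior of publish-subscribe can be illustrated without a running Kafka cluster. The sketch below is an in-memory stand-in for a broker, not Kafka’s actual client API: each subscriber gets its own queue, so one published message is delivered to every consumer of the topic.

```python
from collections import defaultdict
from queue import Queue

class Broker:
    """Toy message broker: one queue per subscriber, per topic."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> [Queue, ...]

    def subscribe(self, topic: str) -> Queue:
        q = Queue()
        self._subscribers[topic].append(q)
        return q

    def publish(self, topic: str, message):
        # Fan the message out to every subscriber of the topic.
        for q in self._subscribers[topic]:
            q.put(message)

broker = Broker()
billing = broker.subscribe("orders")    # e.g. a billing service
analytics = broker.subscribe("orders")  # e.g. an analytics service

broker.publish("orders", {"order_id": 1, "total": 9.99})

# Both consumers receive the same event independently.
msg_billing = billing.get()
msg_analytics = analytics.get()
```

The producer never knows, or cares, how many consumers exist; adding a third subscriber requires no change to the publishing service.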

One of the significant advantages of streaming platforms is their ability to tolerate a microservice dying mid-processing. In a RESTful architecture, if a microservice fails while handling a request, the client service may experience errors or timeouts, requiring additional logic to handle retries and error recovery. This can complicate the implementation and increase overall system complexity. In contrast, streaming platforms are designed to handle such failures gracefully. When a microservice produces a message to a data stream (topic), the message broker retains it until it is consumed. If a consuming microservice fails, the message remains in the stream and can be reprocessed once the service is back online. This ensures that no data is lost and that processing can resume seamlessly. Additionally, streaming platforms often provide built-in mechanisms for message ordering, delivery guarantees, and fault tolerance. For instance, Kafka supports various delivery semantics, such as at-most-once, at-least-once, and exactly-once, allowing developers to choose the appropriate level of reliability for their use case.
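The retention-and-replay behavior can be sketched the same way. The broker keeps an append-only log per topic, and each consumer commits the offset of the last message it finished processing, so a restarted consumer simply resumes from its committed offset. This is a simplified model of the mechanism (roughly at-least-once semantics), not Kafka’s client API.

```python
class Topic:
    """Append-only log; each consumer tracks its own read offset."""
    def __init__(self):
        self.log = []

    def produce(self, message):
        self.log.append(message)

    def consume(self, offset: int):
        # Return all messages at or after the given offset.
        return self.log[offset:]

topic = Topic()
topic.produce("event-1")
topic.produce("event-2")

# Consumer processes one message, commits offset 1, then crashes.
committed_offset = 0
batch = topic.consume(committed_offset)
processed = [batch[0]]
committed_offset = 1  # committed only after successful processing
# ... crash here: "event-2" was delivered but never committed ...

topic.produce("event-3")  # producers keep publishing while the consumer is down

# After restart, the consumer resumes from its last committed offset:
# nothing is lost, and the uncommitted message is redelivered.
for message in topic.consume(committed_offset):
    processed.append(message)
    committed_offset += 1
```

Because the commit happens only after processing succeeds, a crash between delivery and commit causes redelivery rather than data loss, which is why deduplication or idempotent handlers matter under at-least-once semantics.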

Another significant benefit of streaming platforms is their efficiency when multiple services need the same data. In a RESTful system, each service would need to send separate requests to obtain that data, resulting in duplicated effort and increased load on the data source. Streaming platforms address this by allowing services to subscribe to a data stream and receive messages as they are published. Once a piece of data is produced to a stream, it can be consumed by multiple services without redundant requests to the data source. This reduces the load on data sources and improves overall system efficiency, especially in data-intensive applications where the same information needs to be processed by various components.

In conclusion, REST offers a simple and straightforward approach for synchronous communication, making it a good fit for applications with low-to-moderate scalability needs, a small number of microservices, and relaxed latency requirements. For scenarios requiring high throughput, low latency, robust fault tolerance, and efficient data retrieval, streaming platforms present a superior solution. By leveraging the capabilities of streaming platforms, organizations can build more resilient and scalable microservice architectures that meet the demands of modern, data-intensive applications.

In part 2, we will explore how streaming platforms provide better security for accessing data from your microservices.

Data-Driven Decisions or Decision-Driven Data? The Rise of Decision Intelligence

In the age of information overload, organizations are drowning in data. The question isn’t whether we have enough data, but rather, how can we use it to make better decisions? In the era of big data, the concept of being “data-driven” seemed intuitive. The idea was to approach data with fresh eyes, letting it speak for itself and guide decisions. However, this approach often lacked context, leading organizations on misguided quests for “all the data,” plagued by biases and limited perspectives. While a well-modeled dataset can reveal patterns and inform responses, the reality is that data alone cannot always steer us towards optimal outcomes.

The Limitations of “Data-Driven”

Traditionally, organizations have strived to be data-driven, believing that data holds all the answers. A purely data-driven approach overlooks the crucial element of intent.

Organizations risk chasing after every data point without a clear understanding of the underlying business or mission problem or desired outcome. This can result in:

  • Analysis Paralysis: The inability to distill meaningful insights from a sea of information.
  • Misaligned Priorities: Focusing on data collection over solving the actual business challenges.
  • Lack of Contextual Awareness: Disregarding the nuanced human factors that influence decision-making.

Decision-Driven Data: A Paradigm Shift

To overcome these limitations, a paradigm shift is necessary. Decision-driven data flips the script: instead of starting with data and looking for answers, it begins with the decisions that need to be made.

Decision-driven data places the focus back on the decisions themselves. By prioritizing the outcomes we seek, we can determine the RIGHT data to collect and analyze. This shift helps organizations avoid aimless data expeditions and ensures that all efforts are aligned with strategic goals.

Enter Decision Intelligence (DI)

DI is the answer to this new paradigm. It is the bridge between the digital and data world and the human world of decisions: a discipline that combines data, analytics, and technology with behavioral science and managerial expertise to improve decision-making processes. Put another way, it augments human intuition and subjectivity with the machine’s algorithms, data, and objectivity.

It starts by defining desired outcomes and documenting them as measurable metrics. By identifying the levers that decision-makers control, as well as the internal and external factors influencing those outcomes, DI effectively connects the digital and data ecosystem with an organization’s strategic objectives. DI tools and platforms create a comprehensive map of the decision landscape, enabling organizations to:

  • Trace Cause-and-Effect: Understand the relationships between actions and outcomes.
  • Unify Data: Integrate data from disparate sources to create a single source of truth.
  • Surface Insights: Use advanced analytics and AI to uncover hidden patterns and correlations.
  • Simulate Scenarios: Explore the potential impact of various decisions.
  • Incorporate External Factors: Account for market trends, competitor actions, and other variables.
  • Collaborate Effectively: Break down silos and foster communication among decision-makers.

Why Decision Intelligence Matters

DI is not merely a buzzword – it’s a game-changer for organizations seeking to maximize their data and AI investments. By adopting a decision-driven approach and leveraging DI, businesses can:

  • Focus Efforts: Align data, analytics, and AI teams towards common objectives.
  • Improve Decision Quality: Gain deeper insights and context for more informed choices.
  • Drive Proactive Action: Shift from reacting to events to anticipating and shaping them.
  • Increase ROI: Realize the full potential of data and technology investments.
  • Make Better Decisions: DI provides the insights and context needed to make informed, confident decisions.
  • Improve Efficiency: DI streamlines decision-making processes and reduces the time spent on analysis.
  • Drive Innovation: DI enables organizations to experiment with and explore new possibilities.
  • Enhance Agility: DI empowers businesses to respond quickly and effectively to changing market conditions.

The Future of Decision-Making

The advent of decision intelligence is ushering in an exciting era for organizations. By embracing DI, businesses can move beyond reactive decision-making and embrace a proactive, strategic approach. As technology continues to evolve, so too will the field of decision intelligence. We can expect to see even more sophisticated DI tools that harness the power of AI and machine learning to augment human decision-making capabilities. The future of decision-making is not about replacing humans with machines, but rather about empowering humans with the tools and insights they need to make the best possible choices. DI not only enhances the quality of decisions but also provides decision-makers with a clearer understanding of the second, third, and even fourth-order effects of their choices.

HighlightCares Program Selected for 2024 ACG National Capital Community Service Award

Fairfax, VA– The Highlight corporate giving program, HighlightCares, has been selected for the 2024 ACG National Capital Community Service Award. ACG National Capital recognizes one company in the Greater Washington area for demonstrated success in supporting a charitable activity of its choosing.

Highlight’s corporate community service program, HighlightCares, aims to make a powerful impact on the local DC-Metro area and communities nationwide. The program selects a cause and organization to support each quarter through virtual or in-person service and fundraising opportunities. Since its inception, HighlightCares has supported over a dozen non-profit organizations, raised over $100,000 in lifetime donations, and donated more than 500 hours of employee time to various initiatives.

“This recognition is a true testament to the incredible dedication of our employee-owners and the power of investing in our communities. I’m incredibly proud of how our team has embraced this responsibility and propelled our HighlightCares initiatives forward. Thank you as well to our non-profit partners and community leaders who we have the honor of helping support and amplify the missions of – our combined efforts have created such positive change and further deepens our impact,” said Highlight Director of People and Culture Fiona Sityar.

###

About Highlight
Highlight Technologies (“Highlight”) is an award-winning, 100% employee-owned, ISO® 9001, ISO 20000, ISO 27001, ISO 44001 certified, ISO 56000 certified, CMMI-DEV Level 3, and CMMI-SVC Level 3 appraised federal contractor that provides Application Development, Business & Mission Operations, Data and Analytics, Hybrid Cloud and Automation, Cybersecurity, and IT Services to more than 20 U.S. federal government customers. Our customers include National Security (DHS, Army, Air Force, Intel), Health (USAID, NIH, HHS) and Citizen Services (FCC, GSA, SBA). For more information, please visit www.highlighttech.com.

Highlight Chief Financial Officer Tracy Nguyen Recognized as 2024 Transformative Financing Transaction CFO of the Year by Northern Virginia Technology Council (NVTC)

Fairfax, VA– Highlight Chief Financial Officer Tracy Nguyen has been awarded the Transformative Financing Transaction CFO of the Year award. This award recognizes a transaction that allows a company to scale to its potential and continue its journey. It is not an exit, but could include financings such as a venture capital fundraise, an investment from a strategic market participant, a private equity transaction (minority or majority investment), or a private equity to private equity sale where the company continues its journey. The transformative financing transaction must have occurred between January 1, 2023, and March 1, 2024.

On receiving this award, CFO Tracy Nguyen said, “Winning this award means a lot to me and our team. I didn’t expect it. Our team put a lot of hard work, hours, and time away from family into this work. The award has inspired us to continue on the road we have been heading down and try to make the next coming years the best.” Tracy has been essential in leading our transition to a 100% Employee-Owned organization. This transformative initiative has not only increased the value of our organization for our customers but also provided our employee-owners with equity and wealth-building opportunities, paving the way for a secure financial future.

“This is a well-deserved recognition of the significant time commitment that Tracy invested in converting Highlight to a 100% employee-owned firm,” said CEO Aarish Gokaldas. “The employees reap the benefit from her hard work, and all marvel at her financial savvy and intelligence.”

###


About NVTC
NVTC is the trade association representing the Northern Virginia technology community. One of the nation’s largest and oldest tech councils, NVTC convenes regional tech companies from start-ups to Fortune 100 companies, government contractors, service providers, academic institutions, and nonprofits who are committed to innovating to improve how we live, work, and learn.