We just open-sourced the Airy Core Platform, a fully-featured, production-ready messaging platform that allows its users to process messaging data from a variety of sources (like Facebook Messenger, Google Business Messages, or Website Live Chat).

In this blog post, we go over the components we open-sourced and then provide an overview of the exciting features we intend to add to the Airy Core Platform.

Before we move on to a description of each component, let us clarify what we mean by “fully-featured messaging platform”: the Airy Core Platform lets you process data from conversational platforms, which we call sources. Airy is your central hub for conversations: conversations happening via these platforms can be queried for further processing, integrated into your existing infrastructure, and connected to conversational service providers like conversational AI solutions.

You can use Airy to show a cross-platform aggregate view of the current conversations, tag these conversations programmatically, and link them to your CRM.

Of course, it’s a messaging platform so you must be able to get near real-time notifications about new messages, read receipts, and similar events. To us, those features make a platform fully-featured.

So let’s dive into the components one by one.

Ingestion platform

One of the crucial features the Airy Core Platform provides is the ability to process conversational data from a variety of sources (like Facebook Messenger, Google Business Messages, and so on). We open-sourced an ingestion platform that processes incoming webhook data from different sources. We make sense of the data and reshape it into source-independent contacts, conversations, and messages (see our glossary for formal definitions).

Of course, due to the very nature of the problem, the code is very specific to the third-party source it deals with. We consider this a strength, as it frees Airy Core Platform users from dealing with these integrations themselves.
While sources are all different, their architecture follows a few key principles: 

  • The webhook integration ingests payload data, as raw as we receive it, into a source-specific topic.
  • We only extract metadata from the source data; we do not parse content at ingestion time.

These principles allow us to reprocess data from a conversation platform at any given point in time. If the data pipeline has a bug (say, we count messages incorrectly), we can reprocess the data and fix the bug for past data as well.
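The two principles above can be sketched in a few lines. The payload shape and field names below are purely illustrative (they are not Airy's actual schema): the point is that the raw payload is kept untouched while only routing metadata is extracted, so content parsing can be redone later.

```python
import json

def ingest(raw_payload: bytes, source: str) -> dict:
    """Illustrative ingestion step: keep the raw payload as-is and
    extract only metadata, never parsing the message content."""
    event = json.loads(raw_payload)
    # Only metadata is read here; the content stays opaque, so the
    # pipeline can be re-run later with improved parsing logic.
    return {
        "source": source,                        # e.g. "facebook"
        "conversation_id": event["sender"]["id"],
        "timestamp": event["timestamp"],
        "raw": raw_payload,  # stored untouched in a source-specific topic
    }

record = ingest(
    b'{"sender": {"id": "42"}, "timestamp": 1610000000, '
    b'"message": {"text": "hi"}}',
    "facebook",
)
```

Because `record["raw"]` still holds the original bytes, a bug in any downstream parser can be fixed and the whole history replayed.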

You can check out the source code (pun intended) here.


HTTP endpoints

Once conversational data is flowing through the Airy Core Platform, the question is how to access it. For developers, the first access point is clear: HTTP endpoints.

The endpoints exposing conversational data make use of Kafka Streams interactive queries, which gives them very interesting properties:

  • The endpoints are really fast! The average response time barely increases with a growing number of conversations.
  • Changes in the datasets (like new messages, conversations, or unread counts) are reflected in our API endpoints in near real-time, as the endpoints depend only on Apache Kafka and a local RocksDB instance (transparently managed by the wonderful Kafka Streams library).
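A toy analogue of this pattern, in Python rather than Kafka Streams, shows why such endpoints stay fast: queries are answered from a local key-value view materialized from the event log, so a read never fans out to other services. The event shapes below are invented for illustration.

```python
# Events as they might arrive on a Kafka topic: (key, latest value).
events = [
    ("conv-1", {"unread": 1}),
    ("conv-2", {"unread": 3}),
    ("conv-1", {"unread": 0}),  # a later event overwrites the local view
]

# The "state store": continuously materialized from the stream,
# the way Kafka Streams maintains its RocksDB-backed stores.
store: dict = {}
for key, value in events:
    store[key] = value  # near real-time: each event updates the view

def get_conversation(conversation_id: str) -> dict:
    # O(1) local lookup; this is what keeps response times flat
    # as the number of conversations grows.
    return store[conversation_id]
```

In the real platform the store lives in RocksDB and survives restarts via the Kafka changelog; the lookup path, however, is just as local as in this sketch.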

You can check out the full list of endpoints here.

WebSocket server

Messaging is all about real-time communication. HTTP endpoints are not suitable for all use-cases: Enter the WebSocket server. 

We use the WebSocket server to power the real-time inbox applications of our commercial offering. If all we offered was an HTTP API, you would have to resort to polling the API to build such applications.
For this reason, we provide a WebSocket server you can subscribe to. The server notifies you, on different STOMP queues, every time your conversational data changes. You can check out the documentation here.
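To make the STOMP part concrete, here is a minimal sketch of the SUBSCRIBE frame a client sends over the WebSocket to start receiving updates. The destination name is illustrative, not Airy's actual queue naming; consult the documentation for the real destinations.

```python
def stomp_subscribe_frame(destination: str, sub_id: str) -> bytes:
    """Builds a minimal STOMP 1.2 SUBSCRIBE frame.

    A STOMP frame is the command, one header per line, a blank line,
    an (optionally empty) body, and a terminating NUL byte.
    """
    headers = [
        "SUBSCRIBE",
        f"id:{sub_id}",
        f"destination:{destination}",
        "ack:auto",
    ]
    return ("\n".join(headers) + "\n\n").encode() + b"\x00"

# Hypothetical destination for message events:
frame = stomp_subscribe_frame("/queue/message", "sub-0")
```

In practice you would hand frames like this to a STOMP client library over the WebSocket connection rather than build them by hand.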

Webhook integration

A conversation is only as great as its participants. The webhook integration enables you to participate in conversations programmatically: it exposes conversational events so users can "listen" to those events and react, for example by sending messages. You can check out the documentation here.
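A hypothetical webhook consumer might look like the following. The event type and field names are made up for illustration (Airy's actual payloads are described in the documentation); the pattern is what matters: inspect the event, ignore what you don't care about, and react programmatically to the rest.

```python
def react_to_event(event: dict):
    """Illustrative webhook consumer: decides on a programmatic reply
    for an incoming conversational event. Field names are invented."""
    if event.get("type") != "message.created":
        return None  # ignore event types we don't care about
    text = event["payload"]["content"].lower()
    if "pricing" in text:
        # React programmatically, e.g. by answering or tagging
        # the conversation through the API.
        return {
            "conversation_id": event["payload"]["conversation_id"],
            "reply": "You can find our pricing page here.",
        }
    return None

reply = react_to_event({
    "type": "message.created",
    "payload": {
        "conversation_id": "conv-1",
        "content": "Do you have pricing info?",
    },
})
```

The same shape works for bots, CRM sync, or auto-tagging: the webhook delivers events, and your code decides what to do with them.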

React UI component library

Machines are not the only ones that need to access and respond to conversations; humans need interfaces too. For those cases you’re going to need a UI, which is why we open-sourced a React UI component library (which we showcase here). The long-term plan is to provide React components that help you build the typical user interfaces of a messaging platform.

What’s next?

We’re constantly working to improve the Airy Core Platform and we’re happy to share a sneak preview of what’s coming:

  • More sources: Google Business Messages, SMS, WhatsApp.
  • We’re building a demo application that leverages the whole core platform so users can have a concrete starting point for user interfaces.
  • We will optimize the bootstrap process for folks who wish to run the Airy Core Platform on their local machines.

In the coming weeks, we will publish milestones on GitHub, so watch out for updates and let us know what you think!