From Microfoam to Microflow

Part three: Brewing Data into Dashboards

Debugging an Espresso Machine

Finally, the moment had arrived to test the entire setup. Two weeks ago I hooked up all the components to the Gaggia Classic, and it was showtime for the MQTT configuration. At first, everything looked promising—the Gaggimate connected nicely to my home network and started spewing out messages about temperature, brew state, and other delightful statistics only a caffeine addict could appreciate. I was convinced I was just days away from pulling my first espresso shot into a Mendix companion app.

But then demo syndrome hit.

After a few days, I noticed the Gaggimate would lose its Wi-Fi connection whenever the machine was switched off completely. Never thought I’d write this sentence, but I had to debug my espresso machine.

Unscrewing the top and checking the wiring revealed nothing wrong. Plugging the board into my laptop produced no log messages either. I reflashed the PCB and touchscreen, but still no luck. Thankfully, the Gaggimate community on Discord came to the rescue. The problem? The PCB only connects to dedicated 2.4 GHz networks. My home setup uses a single SSID for both 2.4 and 5 GHz, which confused it.

The easy fix would have been to split the networks in my router config. Instead, I went for the harder but more fun option: turning my Raspberry Pi, which was already acting as my MQTT middleman, into a dedicated IoT router just for the espresso machine.

And so, the Espressix network was born. Once configured, the Gaggimate connected instantly, and data started flowing all the way from boiler to cup to AWS Cloud. Now it was finally time to build the Mendix companion app.

Building the domain model

Like any greenfield Mendix project, I started with the domain model. My main source: the MQTT messages generated by the Gaggimate. Using a tool called MQTT Explorer, I captured the payloads and stored them as JSON structures in my Mendix app, which I fittingly named Espressix (Espresso + Mendix).
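For illustration, here is what inspecting one such captured payload might look like in Python; the field names and values are assumptions for the sketch, not the exact Gaggimate schema:

```python
import json

# A payload as it might be captured with MQTT Explorer (illustrative fields,
# not the actual Gaggimate message format).
raw = '{"temperature": 93.2, "targetTemperature": 93.0, "brewState": "brewing"}'

payload = json.loads(raw)
for key, value in payload.items():
    # Each key maps onto an attribute in the Espressix domain model.
    print(f"{key}: {value}")
```

Captured payloads like this one became the JSON structures I stored in the app as the basis for the domain model.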


The data points I’m most interested in are:

  • Boiler temperature
  • Boiler target temperature
  • Control mode
  • Brew state

As explained in my previous blog posts, these data points will help me get from being a newbie barista to a mediocre barista. The brew state is the most important: it tells me when the machine is actively brewing, which defines the time window for collecting and attaching the other data points to a shot record.

To link it all together, I added a Shot entity with the properties I want to log using my own taste buds, plus the beans I use. Next to the Shot entity, a Bean entity stores master data about the beans I can use and their properties (is it a dark or light roast, which roastery does it come from). And since a picture says more than a thousand words, a photo of the bean bag can be attached as well.

After associating all the statistics from the Gaggimate MQTT broker to the Shot entity I ended up with the following domain model.

Without going full espresso nerd, there are a couple of things to highlight on the Shot entity that will help me better understand the difference between two shots of espresso. If you’re into espresso or read the first blog post, you might remember that going down the espresso rabbit hole is an insane hobby. It’s not just the espresso machine you’re investing in, it’s also your own sanity. A good grinder, a precision scale, and other tools are of great value when you’re trying to dial in the perfect shot of espresso. So in order to log all these very important statistics, the following attributes can be logged by the user (me) in the application:

  • Bean weight — The number of grams you use for your shot can make the difference between a strong and a weak espresso, or even between sour and bitter shots. With my setup I aim for 16.0 grams of beans.
  • Yield — If you dive into espresso subreddits you’ll read a lot about ratios. A good double espresso has a 1:2 ratio, meaning 16 grams of beans should result in 32 grams of the amazing dark liquid.
  • Grind size — Grind too coarse and your shot tastes sour; grind too fine and it tastes bitter. The right grind size depends on the specific beans: one type might grind at setting 12 of my grinder, another at 10. This makes all the difference and is therefore very important.
  • HasUsedWDTTool — You might want to look this up on Google, because if I explain what it is you will (if you haven’t already) call me crazy. It’s a tool that breaks up small clumps of ground coffee. Too many of those and the water will carve channels through your espresso puck, leaving less coffee in contact with the water and less taste in the cup.
  • HasUsedDistributor — A tool whose name speaks for itself: it helps distribute the ground coffee evenly through the portafilter, again preventing water from flowing unevenly into your cup.
  • HasUsedWaterSpray — This one is a tough sell for normal people. Roasted beans are very dry, which makes them static and prone to sticking in the grinder. Spraying a mist of water on the beans prevents this, but I admit, it looks insane.
  • TasteNotes — The first taste-wise attribute. Over a series of espressos, this helps me understand what’s good and what’s bad, and which of the attributes above I can change to improve my shots.
  • Bitterness — On a scale from 0 to 10, the bitterness of a shot can be tracked. If a shot is too bitter, I probably ground too fine and need to grind coarser. Usually this goes hand in hand with a shot that pulled too slowly.
  • Sourness — Same as above, although this might indicate a too-coarse grind setting and a faster shot time.
  • Comment — Anything else that might influence the shot, like the weather, my general mood, or my sleep score, can be logged here for future reference.
  • OpenedDateTime — A little helper attribute that indicates whether the shot has been extended with the necessary information, and helps identify new shots in the UI.
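To make that attribute list concrete, here is a rough sketch of the Shot and Bean records as Python dataclasses. This only illustrates the shape of the domain model; the real entities live in Mendix, and the field names and defaults here are my assumptions:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Bean:
    """Master data about a bag of beans (with an attached photo in the app)."""
    name: str
    roastery: str
    roast_level: str  # "dark" or "light"

@dataclass
class Shot:
    """One pulled espresso shot, enriched by hand after brewing."""
    bean: Optional[Bean] = None
    bean_weight_g: float = 16.0   # grams of beans going in
    yield_g: float = 32.0         # grams of espresso coming out
    grind_size: int = 10
    has_used_wdt_tool: bool = False
    has_used_distributor: bool = False
    has_used_water_spray: bool = False
    taste_notes: str = ""
    bitterness: int = 0           # 0-10 scale
    sourness: int = 0             # 0-10 scale
    comment: str = ""
    opened_date_time: Optional[datetime] = None

    @property
    def ratio(self) -> float:
        return self.yield_g / self.bean_weight_g

shot = Shot(bean=Bean("House blend", "Local roastery", "dark"))
print(f"1:{shot.ratio:g} ratio")  # prints "1:2 ratio" with the 16 g in / 32 g out defaults
```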

Logging all this means I can start spotting patterns like how grind size affects sourness, how roast level impacts bitterness, and whether my insane habit of spraying beans with water actually makes a difference.

Logging shots via MQTT

There’s a basis now to build upon, and it’s almost time to log the shots. If we take a quick look again at the architecture diagram of this project, it becomes clear that the Mendix application needs to get the espresso data from the AWS IoT Core environment. The great thing about AWS IoT Core is that it communicates with MQTT in both directions. The Gaggimate posts data points to Mosquitto on my Raspberry Pi, which in turn bridges and forwards these messages over MQTT to the AWS IoT Core broker.
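On the Pi side, that forwarding uses Mosquitto's built-in bridge feature, configured in mosquitto.conf. A minimal sketch, assuming the device certificates are already provisioned in AWS IoT Core and that the Gaggimate publishes under a gaggimate/ topic prefix (the endpoint, file paths, and prefix are all placeholders):

```conf
# /etc/mosquitto/conf.d/aws-bridge.conf (sketch; endpoint and paths are placeholders)
connection aws-iot-bridge
address your-endpoint-ats.iot.eu-west-1.amazonaws.com:8883
bridge_protocol_version mqttv311
cleansession true

# Forward everything the Gaggimate publishes to AWS IoT Core with QoS 1
topic gaggimate/# out 1

# TLS mutual authentication against AWS IoT Core
bridge_cafile /etc/mosquitto/certs/AmazonRootCA1.pem
bridge_certfile /etc/mosquitto/certs/espressix.cert.pem
bridge_keyfile /etc/mosquitto/certs/espressix.private.key
```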

With Mendix you can read those messages from AWS IoT Core via MQTT using the MQTT connector.


The MQTT Connector in Mendix is a marketplace module that uses Java actions to publish and subscribe to MQTT topics. In this project, the topics are the data points the Gaggimate sends out, such as temperature or brewstate. Subscribing to a topic means that whenever a new message is published to the broker (in this case AWS IoT Core), the connector automatically triggers a microflow in your app. The payload of the message, along with the topic name, is passed into that microflow.

For example, to detect when the espresso machine is brewing, I subscribed to the brewstate topic. The Gaggimate posts a message here whenever brewing starts and stops. When the state is “brewing,” the microflow creates a new Shot object. When the state changes to “not brewing,” the microflow finds the active shot and updates its EndDateTime to finalize the record.
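The create/finalize behaviour of that microflow can be sketched in a few lines of Python, with an in-memory list standing in for the Mendix database (a sketch of the logic only, not the actual microflow or Java action):

```python
from datetime import datetime, timezone

shots = []  # stand-in for the Shot table in the Mendix database

def handle_brewstate(state: str) -> None:
    """Mimics the microflow triggered by messages on the brewstate topic."""
    now = datetime.now(timezone.utc)
    if state == "brewing":
        # Brewing started: open a new shot record.
        shots.append({"start": now, "end": None})
    else:
        # Brewing stopped: finalize the shot that is still open.
        active = next((s for s in shots if s["end"] is None), None)
        if active is not None:
            active["end"] = now  # the Shot's EndDateTime

handle_brewstate("brewing")
handle_brewstate("not brewing")
print(len(shots), shots[0]["end"] is not None)  # prints "1 True"
```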

To configure this, you use the “Subscribe to MQTT” action from the module. It requires a ConnectionDetail object, which holds the connection settings (endpoint, credentials, topic name, the microflow that should handle messages, and the Quality of Service level). QoS defines the reliability of delivery:

  • 0: “fire and forget”—the broker does not check if the message is received.
  • 1: the subscriber (Mendix) acknowledges receipt, so the broker keeps the message until it’s confirmed.
  • 2: a full handshake to guarantee delivery, but with more overhead.

For my scenario, I chose QoS 1. The brewstate topic doesn’t produce a high volume of messages, but I want to make sure I don’t lose any. With this setup, messages remain on the broker until the Mendix app confirms receipt, giving me both reliability and performance.

In practice it looks like this:

With the corresponding microflow like this:

Alongside the brew state, I also subscribed to the temperature and targetTemperature topics. These messages provide real-time boiler readings and the configured target temperature, which are crucial for understanding how stable each shot was during extraction.

The logic here is slightly different than with brew state. Each temperature message includes a timestamp, so my microflow checks whether there’s an active Shot entity whose time window matches that timestamp. If it finds one, the message is linked to the Shot record and stored in the database. This way, every shot record can show not only when it was brewed but also how the machine performed during that extraction.
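That matching step can be sketched the same way: a temperature reading is linked to a shot only if its timestamp falls inside that shot's brew window (again a Python sketch of the microflow logic, with made-up timestamps):

```python
from datetime import datetime, timezone

UTC = timezone.utc

# One finished shot with a 30-second brew window (illustrative timestamps).
shots = [{
    "start": datetime(2024, 6, 1, 8, 0, 0, tzinfo=UTC),
    "end": datetime(2024, 6, 1, 8, 0, 30, tzinfo=UTC),
    "temperatures": [],
}]

def attach_temperature(timestamp: datetime, value: float) -> bool:
    """Link a boiler reading to the shot whose brew window contains it."""
    for shot in shots:
        if shot["start"] <= timestamp <= shot["end"]:
            shot["temperatures"].append((timestamp, value))
            return True
    return False  # readings outside any brew window are discarded

attach_temperature(datetime(2024, 6, 1, 8, 0, 10, tzinfo=UTC), 93.4)  # inside window
attach_temperature(datetime(2024, 6, 1, 9, 0, 0, tzinfo=UTC), 91.0)   # outside: ignored
print(len(shots[0]["temperatures"]))  # prints "1"
```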

By subscribing to both topics, I essentially enrich each shot with a mini telemetry log. Over time, this makes it possible to see patterns like whether a certain bean consistently leads to higher brew temps, or whether my preinfusion routine causes dips that affect flavor. It’s a small detail, but logging these datapoints gives me the foundation for a more data-driven approach to dialing in espresso.

In the end, the application will at startup try to create a connection to the MQTT broker and subscribe to the four topics mentioned earlier in this blog post.

Now that we can technically receive messages from AWS IoT Core, it’s time to configure the MQTT module. This is done by adding the ConnectionAdministration page to your navigation profile and setting the right user roles. Marco Spoel wrote a great blog post a while ago about configuring the AWS IoT Core broker in your Mendix application.

Something’s brewing!

And now the party piece: building the UI to log all the shots brewed with the Gaggia.

The main entry point of the application is the Shot_Overview, a page showing a gallery of all the shots brewed by the machine. A new shot shows “Finish entry”; otherwise you can already see the characteristics filled in by the espresso junkie using the app (me).

Clicking a gallery item opens the Shot_NewEdit page, which gives me all the controls to enrich the data: selecting the tools I used and the beans that have been ground, and filling in the characteristics of the brewed espresso.

All this data is fun, but I need insights. I want to better understand my espresso behaviour, apart from confirming I’m close to addiction. Once all the data is logged, it can be used to create dashboards with great insights: how many grams of beans do I use on a weekly basis, and of which type? What were the last five shots, and what were their characteristics? So a dashboard is added as well, because this journey started with the simple thought that every espresso machine deserves a dashboard!

And that’s how the Espressix app was born. I’m now able to track my “so-called addictive” espresso behaviour in my own app and make sure that I can get the best out of my espresso machine. So let’s put it to the test and see what’s brewing!

So what’s next?

This concludes the three-part blog series on how I said goodbye to my beloved Isomac and welcomed the Gaggia Classic E24 into my life, how I electrocuted myself (only once) while installing the Gaggimate mod, hooked this analog espresso machine up to the World Wide Web, and built the dashboard it deserves. As with any app built with Mendix, you never stop making. So although this is the end of this blog series, I’m already thinking about extending the application with AI capabilities that analyze the data gathered from all the espresso shots and give me suggestions for future ones. Who knows, you might see some of this in the future!

If you’re also interested in this setup or want to know more about Mendix, IoT, and an overcomplicated relationship with caffeine, let me know; I’m always open to a cup of… coffee to discuss how we can help you with this!

About the Author

Freek Brinkhuis is a Principal Consultant and Architect at The Orange Force and Mendix Product MVP. He previously worked in different roles, ranging from native iOS developer to Product Manager AWS at Mendix. He’s a technical all-rounder who loves to learn about new technologies and how he can implement them for his customers. In his free time, he loves to play around with both software and hardware. His growing collection of vintage Apple desktops can be seen in his online meeting background when he’s working from home!
