Events, events, events!


This was both a challenging and a bit frustrating week, as I was trying to wrap my head around the right way to introduce event-carried state transfer via Kafka into the Game Store application I'm preparing for the upcoming .NET Developer Bootcamp.

I have already done this via RabbitMQ and Azure Service Bus in my microservices program, but there are limitations that I was hoping Kafka could help address to take things to the next level in this new bootcamp.

So today I'll tell you about the scenario that I think is a perfect fit for Kafka, how it relates to event-driven microservices, and how I'm integrating all that into the bootcamp.

But first, let me tell you a bit about .NET Aspire 8.1 and the new Keycloak support.

Keycloak support in .NET Aspire 8.1

With .NET Aspire 8.1, which just launched, we got the new Keycloak support I've been working on for the past couple of months.

Here's why I thought we needed this support in .NET Aspire:

  1. Keycloak is an OIDC-compliant identity provider that can easily run on your box via Docker.
  2. Identity, authentication, and authorization are hard. Anything we can do to simplify things for devs is a huge help.
  3. I don't want to have to manually start and configure my Keycloak container for local development, which involves lots of trial and error.

I won't go deep into how to add Keycloak support to your .NET Aspire apps since there's a good article that covers it over here.

But, in essence, you install two NuGet packages (currently in preview): the first one in your AppHost project, and the second one in your ASP.NET Core API or frontend project:
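For reference, and with the caveat that the package names may change while they're in preview, installing them from the CLI looks like this:

```shell
# In the AppHost project:
dotnet add package Aspire.Hosting.Keycloak --prerelease

# In the API or frontend project:
dotnet add package Aspire.Keycloak.Authentication --prerelease
```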

Then you can add the Keycloak Docker container to your AppHost Program.cs like this:
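Here's a sketch of what that looks like (the project name is mine, not necessarily the one from the Game Store app):

```csharp
// AppHost/Program.cs
var builder = DistributedApplication.CreateBuilder(args);

// Runs the Keycloak container and exposes it on port 8080
var keycloak = builder.AddKeycloak("keycloak", 8080)
                      .WithDataVolume(); // keeps realms and users across restarts

// Hypothetical project name; reference Keycloak from any service that needs it
builder.AddProject<Projects.GameStore_Catalog>("catalog")
       .WithReference(keycloak);

builder.Build().Run();
```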

After starting your .NET Aspire app, you'll get a new Keycloak endpoint in your dashboard:

And, you can then browse to your endpoint (http://localhost:8080 in this case) to create and configure your Keycloak realm.

Then you can connect your API to Keycloak like this:
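A minimal sketch, assuming a realm and audience I made up for illustration:

```csharp
// API Program.cs
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddAuthentication()
       .AddKeycloakJwtBearer(
           serviceName: "keycloak",  // must match the AppHost resource name
           realm: "gamestore",       // the realm you created in the admin console
           options =>
           {
               options.Audience = "gamestore-api";       // assumption
               options.RequireHttpsMetadata = false;     // local dev only
           });

builder.Services.AddAuthorization();
```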

And you do something like this for your frontend:
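Again a sketch, with an assumed realm and client id; the frontend uses the OpenID Connect handler (plus cookies) instead of JWT bearer:

```csharp
// Frontend Program.cs
builder.Services.AddAuthentication(options =>
       {
           options.DefaultScheme = CookieAuthenticationDefaults.AuthenticationScheme;
           options.DefaultChallengeScheme = OpenIdConnectDefaults.AuthenticationScheme;
       })
       .AddCookie()
       .AddKeycloakOpenIdConnect(
           serviceName: "keycloak",
           realm: "gamestore",                           // assumption
           options =>
           {
               options.ClientId = "gamestore-frontend";  // assumption
               options.ResponseType = OpenIdConnectResponseType.Code;
               options.SaveTokens = true;
               options.RequireHttpsMetadata = false;     // local dev only
           });
```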

The exact options you configure in both cases will vary, but the key thing is that you don't have to waste time figuring out the Keycloak Docker image details nor how to set your Authority URL in your apps. .NET Aspire will take care of all that for you.

The curious thing is that, as I integrate these packages into the Game Store application, I'm already thinking of ways to improve the Keycloak support. So stay tuned for future updates!

Event-driven microservices via Kafka

Here's the scenario that's been bothering me for a while:

Here, any time the user adds a game to the shopping cart, we send the list of items, with ID, name, quantity, and price, to the Basket microservice.

And, when the user initiates the check-out, we create an order with the Basket items and send them, with ID, name, quantity, and price, to the Ordering microservice.

What's the problem there? Well, the prices!

Why would any of the microservices trust that the frontend will send the correct prices when the source of truth is in the Catalog microservice, where all games live?

Never trust the frontend. Stick to the information safely stored in the backend.

So, to address this, what most folks would do is this:

So now, both Basket and Ordering will make an HTTP call to the Catalog microservice to fetch the product details.

Bad idea! Why?

Well because:

  1. If Catalog has any sort of trouble or goes down, Basket and Ordering can't do their job.
  2. The more services you stand up that need Catalog data, the more load you put on the Catalog microservice, which can only handle so much.

Essentially, the SLA of Basket and Ordering is bound to that of Catalog.

This is the #1 mistake people make when transitioning to microservices.

Then, what should you do?

Decouple! And here's the key idea and where message brokers come into play:

Instead of having each microservice request data from Catalog when needed, you let Catalog publish product events any time interesting things happen to those products.

Something along these lines:

So here, when a new game is created in the Catalog, a GameCreated event is published with all the product data to a message broker (Kafka here) so that Basket and Ordering can eventually consume it and store the new product info in their own databases.

Catalog can go down anytime, but as long as all events make their way to the message broker, Basket and Ordering will always have their updated copy of all products so they can move on just fine.

What's the deal with that Outbox table?

It goes back to the dual-write problem: you must make sure an event is published for every update you make to the Catalog Games table. The Outbox table and the corresponding Worker service implement the Transactional Outbox pattern, which ensures no events are left unpublished.
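To make the idea concrete, here's a minimal sketch (EF Core, with entity and table names I made up): the game row and the outbox row are written in the same database transaction, so an event can never be silently lost.

```csharp
public async Task CreateGameAsync(Game game, CatalogDbContext db)
{
    db.Games.Add(game);

    // Written via the same DbContext: both rows commit atomically, or neither does
    db.OutboxMessages.Add(new OutboxMessage
    {
        Id = Guid.NewGuid(),
        Type = nameof(GameCreated),
        Payload = JsonSerializer.Serialize(
            new GameCreated(game.Id, game.Name, game.Price)),
        OccurredOn = DateTimeOffset.UtcNow
    });

    await db.SaveChangesAsync(); // one transaction covers both inserts
}
```

A background worker then picks up rows that haven't been published yet and relays them to the broker.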

Why Kafka and not RabbitMQ or something else?

Because you want those events to persist potentially forever in the message broker, not just disappear from the queue once consumed.

Think about it. A year from now, we decide we need to stand up a new customer rewards microservice, which will also need a few details from the product catalog.

How is customer rewards going to get an updated list of all products if it can't make HTTP calls to Catalog and all events are already gone from the message broker?

With Kafka you don't have a queue, but instead a long log of all events that happened to all products, and you can keep it there for as long as needed. A new microservice comes in, reads the entire log, and has a full copy of all product details.

Here's a code snippet on how events are produced to Kafka in the Catalog microservice Outbox processor:
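I won't reproduce the exact code here, but with MassTransit's Kafka support the producing side can be sketched like this (the event shape, table, and key choice are my assumptions, not the actual Game Store code):

```csharp
// Outbox processor (illustrative): relays pending outbox rows to Kafka
public class OutboxProcessor(
    CatalogDbContext db,
    ITopicProducer<Guid, GameCreated> producer) // MassTransit Kafka producer
{
    public async Task ProcessPendingAsync(CancellationToken ct)
    {
        var pending = await db.OutboxMessages
            .Where(m => m.PublishedOn == null)
            .OrderBy(m => m.OccurredOn)
            .ToListAsync(ct);

        foreach (var message in pending)
        {
            var @event = JsonSerializer.Deserialize<GameCreated>(message.Payload)!;

            // Key by game id so all events for a game land on the same partition
            await producer.Produce(@event.GameId, @event, ct);

            message.PublishedOn = DateTimeOffset.UtcNow;
        }

        await db.SaveChangesAsync(ct);
    }
}
```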

And here's the Basket microservice consuming the GameCreated event from Kafka via MassTransit Riders:
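The rider configuration and consumer can be sketched roughly like this (topic, consumer group, and type names are illustrative):

```csharp
// Basket Program.cs
builder.Services.AddMassTransit(x =>
{
    // Riders still need a bus; the Kafka traffic flows through the rider
    x.UsingInMemory((context, cfg) => cfg.ConfigureEndpoints(context));

    x.AddRider(rider =>
    {
        rider.AddConsumer<GameCreatedConsumer>();

        rider.UsingKafka((context, k) =>
        {
            k.Host("localhost:9092");

            k.TopicEndpoint<Guid, GameCreated>(
                "catalog-games",        // topic (assumption)
                "basket-consumers",     // consumer group (assumption)
                e => e.ConfigureConsumer<GameCreatedConsumer>(context));
        });
    });
});

// The consumer stores its own copy of the product in Basket's database
public class GameCreatedConsumer(BasketDbContext db) : IConsumer<GameCreated>
{
    public async Task Consume(ConsumeContext<GameCreated> context)
    {
        var game = context.Message;
        db.Products.Add(new Product(game.GameId, game.Name, game.Price));
        await db.SaveChangesAsync();
    }
}
```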

It did take me a while to understand how Kafka works as compared to RabbitMQ and Azure Service Bus, and how to configure it properly. But I think I got it and it works really nicely.

Can't wait to go over all the details in the bootcamp!

The Inventory microservice is gone

As I was working on the updated event-driven story across the Game Store system, I noticed how small the responsibility of the Inventory microservice was (keeping track of game codes) and how it would also need its own cache of games.

This is where you need to decide if you really need that other microservice. It will probably be needed in a much more complex system, but here I believe the scope is too small to deserve its own microservice. So I decided to tear it down and merge its endpoints into the Catalog microservice.

After this simplification and the addition of Kafka, the system looks like this now:

Which I think is complex enough for you to get a good sense of the challenges of building a distributed system with .NET, which is one of the main goals of the bootcamp.

Book recommendation

If you get a chance, check out this book:

Practical Event-Driven Microservices Architecture by Hugo Filipe Oliveira Rocha.

I've been reading it for the past month to go deep into event-driven microservices, and I must admit it's a good one.

You won't find any code there, but that's compensated by tons of diagrams that clearly help you get the full picture of this fascinating area.

Closing

As a side note, I also had to switch all entities across all microservices to use GUIDs for their IDs as opposed to integers. Not a fun change, but it made many things easier across the board. I'll explain why in the bootcamp.

Can't wait to start diving into deploying this entire system to Azure very soon!

Until next time.


Whenever you’re ready, there are 3 ways I can help you:

  1. Building Microservices With .NET:​ The only .NET backend development training program that you need to become a Senior .NET Backend Engineer.
  2. ASP.NET Core Full Stack Bundle: A carefully crafted package to kickstart your career as an ASP.NET Core Full Stack Developer, step by step.
  3. Promote yourself to 15,000+ subscribers: by sponsoring this newsletter.

