Module 5 - ACA Async Communication with Dapr Pub/Sub API¶
Module Duration
90 minutes
Objective¶
In this module, we will accomplish five objectives:
- Learn how Azure Container Apps uses the Publisher-Subscriber (Pub/Sub) pattern with Dapr.
- Introduce a new background service, `ACA Processor - Backend`, configured for Dapr.
- Use Azure Service Bus as a Service Broker for Dapr Pub/Sub API.
- Deploy the Backend Background Processor and the updated Backend API projects to Azure Container Apps.
- Configure Managed Identities for the Backend Background Processor and the Backend API Azure Container Apps.
Module Sections¶
- From the VS Code Terminal tab, open a developer command prompt or PowerShell terminal in the project folder `TasksTracker.ContainerApps` (root).
- Restore the previously-stored variables by executing the local script. The output informs you how many variables have been set.
1. Pub/Sub Pattern with Dapr¶
As a best practice, it is recommended that we decouple services from each other. A conventional way to do so is by employing the Publisher-Subscriber (Pub/Sub) pattern. The primary advantage of this pattern is that it offers loose coupling between services: the sender/publisher of the message doesn't know anything about the receivers/consumers. This can be done one-to-one or one-to-many, where multiple consumers each receive a copy of the message and process it in their own way. For example, imagine adding another consumer which is responsible for sending push notifications to the task owner (e.g. if we had a mobile app channel).
In module 3 we introduced you to decoupling `ACA Web - Frontend` from `ACA API - Backend` through asynchronous HTTP calls via Dapr. And in module 4 we added integrations with Redis Cache locally and Azure Cosmos DB in the cloud. In this module we will configure such a Pub/Sub pattern to facilitate asynchronous messaging between the microservices. Specifically, the publisher/subscriber pattern relies on a message broker which is responsible for receiving the message from the publisher, storing the message to ensure durability, and delivering this message to the interested consumer(s) to process it. There is no need for the consumers to be available when the message is stored in the message broker. Consumers can process the messages at a later time in an async fashion.
The below diagram gives a high-level overview of how the Pub/Sub pattern works:
If you have implemented the Pub/Sub pattern before, you already know that there is a lot of plumbing needed on the publisher and subscriber components in order to publish and consume messages. In addition, each message broker has its own SDK and implementation, so you need to write your code in an abstracted way to hide the specific implementation details of each message broker SDK and make it easier for the publisher and consumers to re-use this functionality. What Dapr offers here is a building block that significantly simplifies implementing Pub/Sub functionality by abstracting the provider's implementation away from its usage in the container. Put differently, the container does not know who it is interacting with - and this is entirely intentional and favorable for container portability and immutability.
Put simply, the Dapr Pub/Sub building block provides a platform-agnostic API framework to send and receive messages. Your producer/publisher services publish messages to a named topic. Your consumer services subscribe to a topic to consume messages.
1.1 Testing Pub/Sub Locally¶
To try this out, we can directly invoke the Pub/Sub API and publish a message to Redis locally. If you remember from module 3, when we initialized Dapr in a local development environment, it installed a Redis container instance locally. Therefore, we can use Redis locally to publish and subscribe to messages.
If you navigate to the path `%USERPROFILE%\.dapr\components` (assuming you are using Windows), you will find a file named `pubsub.yaml`. Inside this file, you will see the properties needed to access the local Redis instance.
The publisher/subscriber brokers template component file structure can be found here.
However, we want to have more control and provide our own component file, so let's create a Pub/Sub component file in our components folder as shown below:
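A minimal sketch of what such a component file could look like (the file name and values are assumptions based on the default local Redis setup; the component name `taskspubsub` matches the name used in the publish URL later in this section):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: taskspubsub
spec:
  type: pubsub.redis
  version: v1
  metadata:
  # Connection details for the local Redis instance installed by 'dapr init'
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
```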
To try out the Pub/Sub API, run the Backend API from VS Code by running the below command or using the Run and Debug tasks we have created in the appendix.
Let's try to publish a message by sending a POST request to http://localhost:3500/v1.0/publish/taskspubsub/tasksavedtopic with the below request body. Don't forget to set the `Content-Type` header to `application/json`.
Curious about the details of the endpoint?
We can break the endpoint into the following:
- The value `3500` is the Dapr app listening port; it is the port number upon which the Dapr sidecar is listening.
- The value `taskspubsub` is the name of the selected Dapr Pub/Sub component.
- The value `tasksavedtopic` is the name of the topic to which the message is published.
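For illustration, the full publish request could look like the following (the JSON body fields are assumptions; use the fields of your own task model):

```http
POST /v1.0/publish/taskspubsub/tasksavedtopic HTTP/1.1
Host: localhost:3500
Content-Type: application/json

{
  "taskId": "00000000-0000-0000-0000-000000000000",
  "taskName": "Upgrade the Dapr CLI",
  "taskAssignedTo": "assignee@mail.com"
}
```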
If all is configured correctly, you should see an HTTP 204 No Content response from this endpoint, which indicates that the message was published successfully by the service broker (Redis) into the topic named `tasksavedtopic`.
You can also check that the topic was created successfully by using the Redis Xplorer extension in VS Code, which should look like this:
Right now those published messages are just sitting in the message broker topic. We don't yet have any subscribers bound to the service broker on the topic `tasksavedtopic` that are interested in consuming and processing those messages. We will set up such a consumer in the next section.
Note
Some service brokers allow the creation of topics automatically when a message is sent to a topic which has not been created before. That's the reason why the topic `tasksavedtopic` was created automatically here for us.
2. Setting up the Backend Background Processor Project¶
2.1 Create the Backend Service Project¶
In this module, we will also introduce a new background service named `ACA Processor - Backend` according to our architecture diagram. This new service will be responsible for sending notification emails (simulated) to task owners to notify them that a new task has been assigned to them. We could do this in the Backend API and send the email right after saving the task, but we want to offload this process to another service and keep the Backend API responsible for managing tasks data only.
Now we will add a new ASP.NET Core Web API project named `TasksTracker.Processor.Backend.Svc`. Open a command-line terminal and navigate to the workshop's root.
Controller-Based vs. Minimal APIs
APIs can be created via the traditional, expanded controller-based structure with Controllers and Models folders, etc., or via the newer minimal APIs approach where controller actions are written inside Program.cs. The latter approach is preferable in a microservices project where the endpoints are manageable in number and easily represented in a compact view.
As our workshop is built on microservices, minimal APIs would be a natural fit. However, to keep the workshop easier to follow, we will, for now, stick with controller-based APIs.
- Delete the boilerplate `WeatherForecast.cs` and `Controllers\WeatherForecastController.cs` files from the new `TasksTracker.Processor.Backend.Svc` project folder.
- We need to containerize this application, so we can push it to the Azure Container Registry before we deploy it to Azure Container Apps:
    - Open the VS Code Command Palette (Ctrl+Shift+P) and select Docker: Add Docker Files to Workspace...
    - Use `.NET: ASP.NET Core` when prompted for the application platform.
    - Choose the newly-created project, if prompted.
    - Choose `Linux` when prompted to choose the operating system.
    - Set the application port to `8080`, which is the default non-privileged port since .NET 8.
    - You will be asked if you want to add Docker Compose files. Select `No`.
    - `Dockerfile` and `.dockerignore` files are added to the project workspace.
- Open `Dockerfile` and remove `--platform=$BUILDPLATFORM` from the `FROM` instruction.

Dockerfile Build Platform

Azure Container Registry does not presently set `$BUILDPLATFORM` when building containers, which causes the build to fail. See this issue for details. Therefore, we remove it from the file for the time being. We expect this to be corrected in the future.
2.2 Add Models¶
Now we will add the model which will be used to deserialize the published message. Create a Models folder and add this file:
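A minimal sketch of what this model could look like (all property names here are assumptions; align them with the `TaskModel.cs` already used by the Backend API):

```csharp
namespace TasksTracker.Processor.Backend.Svc.Models
{
    // Mirror of the task model published by the Backend API;
    // used to deserialize the message payload received from the topic.
    public class TaskModel
    {
        public Guid TaskId { get; set; }
        public string TaskName { get; set; } = string.Empty;
        public string TaskAssignedTo { get; set; } = string.Empty;
        public string TaskCreatedBy { get; set; } = string.Empty;
        public DateTime TaskCreatedOn { get; set; }
        public DateTime TaskDueDate { get; set; }
        public bool IsCompleted { get; set; }
    }
}
```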
Tip
For the sake of simplicity we are recreating the same model `TaskModel.cs` under each project. For production purposes it is recommended to place `TaskModel.cs` in a common project that can be referenced by all projects, thus avoiding the code repetition which increases maintenance cost.
2.3 Install Dapr SDK Client NuGet package¶
Now we will install the Dapr SDK to be able to subscribe to the service broker topic programmatically. Add the highlighted NuGet package to the file shown below:
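If you prefer the command line over editing the `.csproj` by hand, the package can also be added with the .NET CLI, run from the new project's folder (pin the version as appropriate for your setup):

```shell
dotnet add package Dapr.AspNetCore
```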
2.4 Create an API Endpoint for the Consumer to Subscribe to the Topic¶
Now we will add an endpoint that will be responsible for subscribing to the topic we are interested in on the message broker. This endpoint will start receiving the messages published by the Backend API producer. Add a new controller under the Controllers folder.
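A sketch of what this controller could look like (the class name, the simulated email helper, and the Pub/Sub component name `dapr-pubsub-servicebus` are assumptions; the component name must match the Dapr Pub/Sub component you configure):

```csharp
using Dapr;
using Microsoft.AspNetCore.Mvc;
using TasksTracker.Processor.Backend.Svc.Models;

namespace TasksTracker.Processor.Backend.Svc.Controllers
{
    [Route("api/tasksnotifier")]
    [ApiController]
    public class TasksNotifierController : ControllerBase
    {
        private readonly ILogger<TasksNotifierController> _logger;

        public TasksNotifierController(ILogger<TasksNotifierController> logger)
        {
            _logger = logger;
        }

        // Subscribe to 'tasksavedtopic' on the Pub/Sub component 'dapr-pubsub-servicebus'.
        [Dapr.Topic("dapr-pubsub-servicebus", "tasksavedtopic")]
        [HttpPost("tasksaved")]
        public async Task<IActionResult> TaskSaved([FromBody] TaskModel taskModel)
        {
            _logger.LogInformation("Started processing message with task name '{TaskName}'", taskModel.TaskName);

            // Simulate sending an email to the task assignee.
            var sent = await SendEmail(taskModel);

            // 200 OK: the broker can delete the message.
            // 400 Bad Request: the broker should retry the message later.
            // 404 Not Found: the broker should dead-letter the message.
            return sent ? Ok() : BadRequest();
        }

        private static Task<bool> SendEmail(TaskModel taskModel) => Task.FromResult(true);
    }
}
```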
Curious about what we have done so far?
- We have added an action method named `TaskSaved` which can be accessed on the route `api/tasksnotifier/tasksaved`.
- We have attributed this action method with the attribute `Dapr.Topic`, which accepts the Dapr Pub/Sub component to target as the first argument and the topic to subscribe to as the second argument, which in our case is `tasksavedtopic`.
- The action method expects to receive a `TaskModel` object.
- Once the message is received by this endpoint, we can start the business logic to trigger sending an email (more about this next) and then return a `200 OK` response to indicate that the consumer processed the message successfully and the broker can delete this message.
- If anything goes wrong while sending the email (e.g. the email service is not responding) and we want to retry processing this message at a later time, we return a `400 Bad Request` response, which informs the message broker that the message needs to be retried based on the configuration in the message broker.
- If we need to drop the message because we know it will not be processed even after retries (e.g. the recipient email address is malformed), we return a `404 Not Found` response. This tells the message broker to drop the message and move it to the dead-letter or poison queue.
You may be wondering how the consumer was able to identify which subscriptions are available and on which routes they can be found.
The answer is that at startup of the consumer service (more on that below after we add `app.MapSubscribeHandler()`), the Dapr runtime will call the application on a well-known endpoint to identify and create the required subscriptions.
The well-known endpoint can be reached at `http://localhost:<appPort>/dapr/subscribe`. When this endpoint is invoked, the response will contain an array of all topics to which the application subscribes. Each includes a route to call when the topic receives a message. This was generated because we used the attribute `Dapr.Topic` on the action method `api/tasksnotifier/tasksaved`.
That means when a message is published via the Pub/Sub component `taskspubsub` on the topic `tasksavedtopic`, it will be routed to the action method `/api/tasksnotifier/tasksaved` and consumed there.
In our case, a sample response will be as follows:
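Given the subscription declared via the `Dapr.Topic` attribute on the `TaskSaved` action method, the response could take roughly the following shape (the `pubsubname` value follows the component name used in the attribute and is otherwise an assumption):

```json
[
  {
    "pubsubname": "dapr-pubsub-servicebus",
    "topic": "tasksavedtopic",
    "route": "api/tasksnotifier/tasksaved"
  }
]
```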
Tip
Follow this link to find a detailed diagram of how the consumers will discover and subscribe to those endpoints.
2.5 Register Dapr and Subscribe Handler at the Consumer Startup¶
Update the startup file (`Program.cs`) in the `TasksTracker.Processor.Backend.Svc` project.
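A minimal sketch of what the consumer's startup code could look like (assuming the default ASP.NET Core template scaffolding):

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register Dapr integration with the MVC pipeline and a DaprClient for DI.
builder.Services.AddControllers().AddDapr();

var app = builder.Build();

// Unwrap CloudEvents-formatted requests so action methods see the raw payload.
app.UseCloudEvents();

app.MapControllers();

// Expose /dapr/subscribe so the Dapr sidecar can discover topic subscriptions.
app.MapSubscribeHandler();

app.Run();
```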
- Let's verify that the Dapr dependency is restored properly and that the project compiles. From the VS Code Terminal tab, open a developer command prompt or PowerShell terminal, navigate to the directory containing the `.csproj` project file, and build the project.
Curious about the code above?
- On the line `builder.Services.AddControllers().AddDapr();`, the extension method `AddDapr` registers the necessary services to integrate Dapr into the MVC pipeline. It also registers a `DaprClient` instance into the dependency injection container, which can then be injected anywhere into your service. We will see how we inject `DaprClient` in the controller constructor later on.
- On the line `app.UseCloudEvents();`, the extension method `UseCloudEvents` adds CloudEvents middleware into the ASP.NET Core middleware pipeline. This middleware will unwrap requests that use the CloudEvents structured format, so the receiving method can read the event payload directly. You can read more about CloudEvents here, which includes specs for describing event data in a common and standard way.
- On the line `app.MapSubscribeHandler();`, we make the endpoint `http://localhost:<appPort>/dapr/subscribe` available for the consumer so it responds and returns the available subscriptions. When this endpoint is called, it will automatically find all WebAPI action methods decorated with the `Dapr.Topic` attribute and instruct Dapr to create subscriptions for them.
With all those bits in place, we are ready to run the publisher service `Backend API` and the consumer service `Backend Background Service` and test the Pub/Sub pattern end to end.
- Execute the `Set-Variables.ps1` in the root to update the `variables.ps1` file with all current variables. The output of the script will inform you how many variables are written out.
To do so, run the below commands in two separate PowerShell consoles, ensuring you are in the right root folder of each respective project.
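The two `dapr run` invocations could look like the following sketch (the application ports and the components path are assumptions; adjust them to your project's launch settings):

```shell
# Terminal 1: from the TasksTracker.TasksManager.Backend.Api folder
dapr run --app-id tasksmanager-backend-api --app-port <api-port> --dapr-http-port 3500 --resources-path ../components -- dotnet run

# Terminal 2: from the TasksTracker.Processor.Backend.Svc folder
dapr run --app-id tasksmanager-backend-processor --app-port <processor-port> --dapr-http-port 3502 --resources-path ../components -- dotnet run
```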
- Restore the previously-stored variables by executing the local script. The output informs you how many variables have been set.
Note
Notice that we gave the new Backend background service a Dapr App Id of `tasksmanager-backend-processor` and a Dapr HTTP port of `3502`.
Now let's try to publish a message by sending a POST request to http://localhost:3500/v1.0/publish/taskspubsub/tasksavedtopic with the below request body. Don't forget to set the `Content-Type` header to `application/json`.
Keep an eye on the terminal logs of the Backend background processor: you will see that the message is received and consumed by the action method `api/tasksnotifier/tasksaved`, and an information message is logged in the terminal to indicate that the message was processed.
VS Code Dapr Extension
You can use the VS Code Dapr Extension to publish the message directly. It will be similar to the below image:
Shut down the sessions.
2.6 Optional: Update VS Code Tasks and Launch Configuration Files¶
If you have followed the steps in the appendix so far in order to be able to run the three services together (frontend, backend API, and backend processor) and debug them in VS Code, we need to update the files `tasks.json` and `launch.json` to include the new service we have added.
Update the existing `tasks.json` and `launch.json` files with entries for the new `TasksTracker.Processor.Backend.Svc` service, mirroring the entries of the two existing services.
2.7 Update Backend API to Publish a Message When a Task Is Saved¶
Now we need to update our Backend API to publish a message to the message broker when a task is saved (either due to a new task being added or an existing task assignee being updated).
To do this, update the file in the `Services` folder of the project `TasksTracker.TasksManager.Backend.Api`.
Tip
Notice the new method `PublishTaskSavedEvent` added to the class. All we have to do is call `PublishTaskSavedEvent` and pass the Pub/Sub name. In our case we named it `dapr-pubsub-servicebus`, as we are going to use Azure Service Bus as a message broker in the next step.
The second parameter, `tasksavedtopic`, is the name of the topic to which the publisher sends the task model. That's all the changes required to start publishing async messages from the Backend API.
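The publish call itself boils down to a single `DaprClient` invocation. A sketch of the method body, assuming an injected `DaprClient` field named `_daprClient` (the field and logger names are assumptions):

```csharp
private async Task PublishTaskSavedEvent(TaskModel taskModel)
{
    _logger.LogInformation("Publishing task saved event for task '{TaskName}'", taskModel.TaskName);

    // Publish the task model to topic 'tasksavedtopic' on the
    // Pub/Sub component named 'dapr-pubsub-servicebus'.
    await _daprClient.PublishEventAsync("dapr-pubsub-servicebus", "tasksavedtopic", taskModel);
}
```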
This is a good opportunity to save intermediately:
- From the root, persist a list of all current variables.
- Navigate to the root and persist the module to Git.
3. Use Azure Service Bus as a Service Broker for Dapr Pub/Sub API¶
Now we will switch our implementation to use Azure Service Bus as a message broker. Redis worked perfectly for local development and testing, but we need to prepare ourselves for the cloud deployment. To do so we need to create a Service Bus namespace followed by a topic. A namespace provides a scoping container for Service Bus resources within your application.
3.1 Create Azure Service Bus Namespace and a Topic¶
You can do this from the Azure portal or use the below PowerShell command to create the services. We will assume you are using the same PowerShell session from the previous module, so the variables still hold the right values. You need to change the namespace variable, as this one has to be globally unique across all Azure subscriptions. Also, you will notice that we are opting for the Standard SKU (the default if not passed), as topics are only available on the standard tier and not on the basic tier. More details can be found here.
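A sketch of the CLI calls involved (the variable names are assumptions; `$NamespaceName` must be globally unique):

```powershell
# Create the Service Bus namespace (Standard sku is required for topics)
az servicebus namespace create `
  --resource-group $RESOURCE_GROUP `
  --name $NamespaceName `
  --location $LOCATION `
  --sku Standard

# Create the topic that task events will be published to
az servicebus topic create `
  --resource-group $RESOURCE_GROUP `
  --namespace-name $NamespaceName `
  --name tasksavedtopic

# Create a subscription on the topic for the backend processor
az servicebus topic subscription create `
  --resource-group $RESOURCE_GROUP `
  --namespace-name $NamespaceName `
  --topic-name tasksavedtopic `
  --name sbts-tasks-processor
```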
Note
The primary connection string is only needed for local dev testing. We will be using Managed Identities when publishing container apps to ACA.
3.2 Create a local Dapr Component file for Pub/Sub API Using Azure Service Bus¶
We need to add a new Dapr Azure Service Bus Topic component. Add a new file in the components folder as shown below:
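A sketch of what this component file could look like (the file name and connection-string placeholder are assumptions; `pubsub.azure.servicebus.topics` is the topic-based Service Bus Pub/Sub component type):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: dapr-pubsub-servicebus
spec:
  type: pubsub.azure.servicebus.topics
  version: v1
  metadata:
  # Connection string is used for local dev testing only
  - name: connectionString
    value: "<your-service-bus-primary-connection-string>"
  # Must match the topic subscription name
  - name: consumerID
    value: "sbts-tasks-processor"
scopes:
- tasksmanager-backend-api
- tasksmanager-backend-processor
```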
Note
We used the name `dapr-pubsub-servicebus`, which should match the name of the Pub/Sub component we used earlier in the `TasksNotifierController.cs` controller on the action method with the attribute `Topic`.
We set the metadata (key/value) to allow us to connect to the Azure Service Bus topic. The metadata `consumerID` value should match the topic subscription name `sbts-tasks-processor`.
We have set the scopes section to include the `tasksmanager-backend-api` and `tasksmanager-backend-processor` app ids, as those are the Dapr apps that need access to Azure Service Bus for publishing and consuming the messages.
3.3 Create an ACA Dapr Component file for Pub/Sub API Using Azure Service Bus¶
Add a new file in the `aca-components` folder as shown below:
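A sketch of the ACA-format component file (note the ACA-specific schema, which differs from the local Dapr component schema; the namespace value is a placeholder):

```yaml
componentType: pubsub.azure.servicebus.topics
version: v1
metadata:
# Fully qualified namespace; mandatory when authenticating with Managed Identities
- name: namespaceName
  value: "<namespace-name>.servicebus.windows.net"
# Must match the topic subscription name
- name: consumerID
  value: "sbts-tasks-processor"
scopes:
- tasksmanager-backend-api
- tasksmanager-backend-processor
```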
Note
Remember to replace the namespace placeholder with the unique global name you chose earlier
Things to note here
- We didn't specify the component name `dapr-pubsub-servicebus` when we created this component file. We are going to specify it when we add this Dapr component to the Azure Container Apps Environment via the CLI.
- We are not referencing any Service Bus connection strings, as the authentication between Dapr and Azure Service Bus will be configured using Managed Identities.
- The metadata `namespaceName` value is set to the address of the Service Bus namespace as a fully qualified domain name. The `namespaceName` key is mandatory when using Managed Identities for authentication.
- We are setting the metadata `consumerID` value to match the topic subscription name `sbts-tasks-processor`. If you don't set this metadata, the Dapr runtime will try to create a subscription using the Dapr application ID.
With all those bits in place, we are ready to run the publisher service `Backend API` and the consumer service `Backend Background Service` and test the Pub/Sub pattern end to end.
Note
Ensure you are on the right root folder of each respective project.
Note
We gave the new Backend background service a Dapr App Id of `tasksmanager-backend-processor` and a Dapr HTTP port of `3502`.
Now let's try to publish a message by sending a POST request to http://localhost:3500/v1.0/publish/dapr-pubsub-servicebus/tasksavedtopic with the below request body. Don't forget to set the `Content-Type` header to `application/json`.
You should see console messages from APP in the backend service console as you send requests.
4. Deploy the Backend Background Processor and the Backend API Projects to Azure Container Apps¶
4.1 Build the Backend Background Processor and the Backend API App Images and Push Them to ACR¶
As we have done previously we need to build and deploy both app images to ACR, so they are ready to be deployed to Azure Container Apps.
Note
Make sure you are in the root directory of the project, i.e. `TasksTracker.ContainerApps`.
4.2 Create a new Azure Container App to host the new Backend Background Processor¶
Now we need to create a new Azure Container App with the following capabilities in place:

- Ingress for this container app should be disabled (no access via HTTP at all, as this is a background processor responsible for processing published messages).
- Dapr needs to be enabled.
To achieve the above, run the PowerShell script below.
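A sketch of what this could look like (the image name, variable names, and sizing are assumptions; note that no ingress options are passed, so ingress stays disabled):

```powershell
az containerapp create `
  --name tasksmanager-backend-processor `
  --resource-group $RESOURCE_GROUP `
  --environment $ENVIRONMENT `
  --image "$ACR_NAME.azurecr.io/tasksmanager/tasksmanager-backend-processor" `
  --registry-server "$ACR_NAME.azurecr.io" `
  --min-replicas 1 `
  --max-replicas 1 `
  --cpu 0.25 --memory 0.5Gi `
  --enable-dapr `
  --dapr-app-id tasksmanager-backend-processor `
  --dapr-app-port 8080
```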
Note
Notice how we omitted the ingress property entirely, which disables ingress for this Container App.
4.3 Deploy New Revisions of the Backend API to Azure Container Apps¶
We need to update the Azure Container App hosting the Backend API with a new revision so our code changes for publishing messages after a task is saved are available to users.
4.4 Add Azure Service Bus Dapr Pub/Sub Component to Azure Container Apps Environment¶
Deploy the Dapr Pub/Sub Component to the Azure Container Apps Environment using the following command:
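A sketch of the command (the yaml path is a placeholder for the file created in section 3.3):

```powershell
az containerapp env dapr-component set `
  --name $ENVIRONMENT `
  --resource-group $RESOURCE_GROUP `
  --dapr-component-name dapr-pubsub-servicebus `
  --yaml "<path-to-aca-components-pubsub-file>.yaml"
```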
Note
Notice that we set the component name to `dapr-pubsub-servicebus` when we added it to the Container Apps Environment.
5. Configure Managed Identities for Both Container Apps¶
In the previous module we have already configured and used system-assigned identity for the Backend API container app. We follow the same steps here to create an association between the backend processor container app and Azure Service Bus.
5.1 Create system-assigned identity for Backend Processor App¶
Run the command below to create a `system-assigned` identity for our Backend Processor App:
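A sketch of the command (resource names follow the ones used earlier in this module):

```powershell
az containerapp identity assign `
  --resource-group $RESOURCE_GROUP `
  --name tasksmanager-backend-processor `
  --system-assigned
```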
This command will create an Enterprise Application (essentially a Service Principal) within Azure AD, linked to our container app. The output of this command will be as below; keep a note of the property `principalId`, as we are going to use it in the next step.
5.2 Grant Backend Processor App the Azure Service Bus Data Receiver Role¶
We will be using a `system-assigned` managed identity with a role assignment to grant our Backend Processor App the `Azure Service Bus Data Receiver` role, which will allow it to receive messages from Service Bus queues and subscriptions.
You can read more about Azure built-in roles for Azure Service Bus here.
Run the command below to associate the `system-assigned` identity with the access-control role `Azure Service Bus Data Receiver`:
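A sketch of the role assignment (the subscription id and namespace name are placeholders; the scope is narrowed to the topic rather than the whole namespace):

```powershell
# Look up the principal id of the processor's system-assigned identity
$principalId = az containerapp identity show `
  --resource-group $RESOURCE_GROUP `
  --name tasksmanager-backend-processor `
  --query principalId --output tsv

# Grant the receiver role, scoped narrowly to the topic
az role assignment create `
  --assignee $principalId `
  --role "Azure Service Bus Data Receiver" `
  --scope "/subscriptions/<subscription-id>/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.ServiceBus/namespaces/<namespace-name>/topics/tasksavedtopic"
```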
5.3 Grant Backend API App the Azure Service Bus Data Sender Role¶
We'll do the same with the Backend API container app, but we will use a different Azure built-in role for Azure Service Bus, namely `Azure Service Bus Data Sender`, as the Backend API is a publisher of the messages. Run the command below to associate the `system-assigned` identity with the access-control role `Azure Service Bus Data Sender`:
Limiting Managed Identity Scope in Azure Service Bus
Take note of the AZ CLI commands in 5.2 and 5.3. We are setting the scope of access for the system-assigned managed identity very narrowly to just the topic(s) that the container app should be able to access, not the entire Azure Service Bus namespace.
5.4 Restart Container Apps¶
Lastly, we need to restart both container apps revisions to pick up the role assignment.
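A sketch of restarting the active revisions (the revision names are placeholders; you can look them up with `az containerapp revision list`):

```powershell
az containerapp revision restart `
  --resource-group $RESOURCE_GROUP `
  --name tasksmanager-backend-api `
  --revision <active-api-revision-name>

az containerapp revision restart `
  --resource-group $RESOURCE_GROUP `
  --name tasksmanager-backend-processor `
  --revision <active-processor-revision-name>
```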
Success
With this in place, you should be able to test the 3 services end to end.
Start by running the command below and then launch the application and start creating new tasks. You should start seeing logs similar to the ones shown in the image below. The command will stop executing after 60 seconds of inactivity.
What to do if you do not see messages?
Sometimes, creating the revision right after creating the managed identity results in the identity not yet being picked up properly. This becomes evident when we look at the Backend Service Container App's `Log stream` blade in the Azure portal. Specifically, the `daprd` sidecar container will show HTTP 401 errors.
Should this be the case, you can navigate to the `Revisions` blade, click on the active revision, then press `Restart`. Going back to the `daprd` sidecar in the `Log stream` should now reveal processing of messages.
- Execute the `Set-Variables.ps1` in the root to update the `variables.ps1` file with all current variables. The output of the script will inform you how many variables are written out.
- From the root, persist a list of all current variables.
- Navigate to the root and persist the module to Git.
Review¶
In this module, we have accomplished five objectives:
- Learned how Azure Container Apps uses the Publisher-Subscriber (Pub/Sub) pattern with Dapr.
- Introduced a new background service, `ACA Processor - Backend`, configured for Dapr.
- Used Azure Service Bus as a Service Broker for Dapr Pub/Sub API.
- Deployed the Backend Background Processor and the updated Backend API projects to Azure Container Apps.
- Configured Managed Identities for the Backend Background Processor and the Backend API Azure Container Apps.
The next module will delve into the implementation of Dapr bindings with ACA.