Microsoft Fabric Updates Blog

From RabbitMQ to PowerBI reports with Microsoft Fabric Real-Time Analytics

In this blog post I’ll show you an end-to-end scenario where you send humidity and temperature telemetry data from RabbitMQ to a KQL Database in Microsoft Fabric, using an Eventstream to orchestrate the data streaming. Once your data is in the KQL Database, you’ll be able to visualize it in PowerBI to gain insights from it.

Here’s the full demo at work:

Sending data to Microsoft Fabric from RabbitMQ

Creating a KQL Database

In Microsoft Fabric, select the Synapse Real-Time Analytics experience. Then create a new KQL Database that will hold the streamed data. Call it telemetry.

Once the database has been created, go back to the Synapse Real-Time Analytics experience, and create a new Eventstream. Call it from-rmq.

Then you will add a new Custom App source. This will let you receive data from RabbitMQ. Call the Custom App rmq-ca.

Configuring a RabbitMQ Shovel

Once the Custom App is created, it’s time to get the AMQP URI. In the Custom App details pane, select the AMQP tab, and copy the Connection string-primary key.

Next, go to the RabbitMQ management UI and create a new shovel. A RabbitMQ shovel takes care of grabbing messages from your RabbitMQ telemetry queue and shipping them to Fabric. You need the RabbitMQ Shovel plugin enabled for this to work.

Call the shovel to-fabric. In this example, the telemetry queue in RabbitMQ is called rmq-fabric. The shovel destination will be the AMQP 1.0 Custom App you just created. In the URI field, paste the AMQP URI you copied from Fabric, and make sure to append the query parameter ?verify=verify_none. (This is needed since Erlang/OTP 26.0, which enables TLS peer verification by default.)

Go back to your Custom App and copy the Entity Name; this will be used as the Address in the shovel configuration. Select Add Shovel to finish the shovel setup.
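If you prefer the command line over the management UI, the same shovel can be declared with rabbitmqctl. This is a sketch under the assumptions above (the to-fabric name and the rmq-fabric source queue); replace the placeholders with the AMQP URI and Entity Name you copied from the Fabric Custom App:

```
# Enable the Shovel plugin (and its management UI) if it isn't enabled yet
rabbitmq-plugins enable rabbitmq_shovel rabbitmq_shovel_management

# Declare the dynamic shovel: AMQP 0-9-1 source queue, AMQP 1.0 destination in Fabric
rabbitmqctl set_parameter shovel to-fabric \
  '{"src-protocol": "amqp091",
    "src-uri": "amqp://localhost",
    "src-queue": "rmq-fabric",
    "dest-protocol": "amqp10",
    "dest-uri": "<fabric-amqp-uri>?verify=verify_none",
    "dest-address": "<entity-name>"}'
```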

Sending telemetry data to RabbitMQ

In your .NET project, add the RabbitMQ.Client NuGet package:

dotnet add package RabbitMQ.Client

Then you can use the following code to send telemetry data to RabbitMQ (it uses the synchronous API of RabbitMQ.Client 6.x):

using System.Text;
using System.Threading;
using RabbitMQ.Client;

// Reuse a single Random instance instead of creating one per message
var random = new Random();

var factory = new ConnectionFactory() { HostName = "localhost" };
using (var connection = factory.CreateConnection())
using (var channel = connection.CreateModel())
{
    // Declare the queue the shovel reads from; durable so it survives broker restarts
    channel.QueueDeclare(queue: "rmq-fabric",
                        durable: true,
                        exclusive: false,
                        autoDelete: false,
                        arguments: null);

    int i = 0;
    while (true)
    {
        var message = GenerateRandomMessage();
        var body = Encoding.UTF8.GetBytes(message);

        // Publish to the default exchange; the routing key is the queue name
        channel.BasicPublish(exchange: "",
                            routingKey: "rmq-fabric",
                            basicProperties: null,
                            body: body);

        // Log and pause briefly every 100 messages to throttle the stream
        if (i++ % 100 == 0)
        {
            Console.WriteLine(" [x] Sent {0}", message);
            Thread.Sleep(100);
        }
    }
}

string GenerateRandomMessage()
{
    var temperature = random.Next(0, 50);
    var humidity = random.Next(0, 100);
    var timestamp = DateTime.UtcNow.ToString("o");
    return $"{{\"Timestamp\": \"{timestamp}\", \"Temperature\": {temperature}, \"Humidity\": {humidity}, \"Source\": \"rabbitmq\"}}";
}

Save the code in a file called `Program.cs`. Then you can run it with:

dotnet run

This will start sending telemetry data to RabbitMQ.

Creating a KQL Table

Back in your Eventstream, now you will configure the destination. Select New destination -> KQL Database. Call it rmq-dest, then choose My Workspace, and specify the telemetry database from the KQL Database dropdown. Select Create and configure.

Go through the steps in the wizard. Call the table sensor-telemetry, and make sure to select JSON as the Data format. Fabric will then parse the data sent from RabbitMQ and create a table with the appropriate columns: Timestamp, Temperature, Humidity, and Source.
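For reference, the wizard-generated table corresponds roughly to the following KQL management command (a sketch; the exact column types the wizard infers from your JSON may differ):

```
.create table ['sensor-telemetry'] (
    Timestamp: datetime,
    Temperature: long,
    Humidity: long,
    Source: string
)
```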

Querying the data

Once the Eventstream has been fully set up, you will be able to go to the KQL Database and query the data. Select the `telemetry` database, and then select the `sensor-telemetry` table.

In the query editor, run the following query:

['sensor-telemetry']
| summarize ptemperature = avg(Temperature), phumidity = avg(Humidity) by bin(Timestamp, 1s)

This will give you the average temperature and humidity per second. Select Run to execute the query. Then select Build PowerBI report to create a new PowerBI report.
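Before building the report, you can also preview the trend directly in the query editor by appending a render operator to the same query:

```
['sensor-telemetry']
| summarize ptemperature = avg(Temperature), phumidity = avg(Humidity) by bin(Timestamp, 1s)
| render timechart
```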

Creating a PowerBI report

In the PowerBI editor, you will add two line charts. The first one will display the phumidity values, while the second one will display the ptemperature values.

Then you can select File -> Save to publish the report to PowerBI. Once the report is published, you can open it in PowerBI and share it with your colleagues. Congratulations, you just consumed data from RabbitMQ into PowerBI using Microsoft Fabric Real-Time Analytics.

Learn More

Microsoft Fabric Real-Time Analytics

Kusto Query Language

RabbitMQ Shovel
