Real-Time Dashboards
With the Real-Time Intelligence suite in Microsoft Fabric, we have everything we need to build an analytics solution for our streaming and telemetry data.
One of the missing pieces in that puzzle has been the capability to get real-time insights on the data arriving in the data estate.
A few months back, we got the Real-Time Dashboard service to help us do exactly that: use the power of Eventhouse and the KQL engine to visualize and analyze the data as it arrives in the database.
Enable the Real-Time Dashboard
By default, the Real-Time Dashboard items are disabled on the tenant. You must enable the capability in the Admin portal.
Right now, the feature is in public preview.
Create your first tile
Each Real-Time Dashboard consists of one or more tiles. A tile is a combination of a KQL script and a configured visualization for the result set from executing the KQL statement.
It can be thought of like this:
Notice that the connection between the KQL statement and the tile is one-to-one. So, one KQL statement for one visual tile. I’ll come back to this concept later on in this post.
The visualization is actually optional. You don't have to convert the result set into a visual; if you don't, you simply get the result set in a table like this:
The options for visualizing the data are fewer than what Power BI offers. You choose from a fixed list:
From KQL to visual tile
First of all, we need data in an Eventhouse — this can be done in many different ways. For the rest of this post, I'll use the MachineLogs data set from Microsoft's best-ever gamification of learning a new scripting language: Kusto Detective Agency.
.execute database script <|
.create-merge table MachineLogs (Timestamp:datetime, Machine:string, EventType:string, Message:string)
.ingest async into table MachineLogs (@'https://kustodetectiveagency.blob.core.windows.net/kda2c9kda/log_00000.csv.gz')
.ingest async into table MachineLogs (@'https://kustodetectiveagency.blob.core.windows.net/kda2c9kda/log_00001.csv.gz')
.ingest into table MachineLogs (@'https://kustodetectiveagency.blob.core.windows.net/kda2c9kda/log_00002.csv.gz')
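The two first ingest commands run asynchronously, so before building tiles it is worth confirming that the data actually landed. A quick sanity check I'd run (a sketch, not part of the original script) looks like this:

```kql
// Sanity check after ingestion: confirm rows arrived
// and see the time span the data covers.
MachineLogs
| summarize Rows = count(), From = min(Timestamp), To = max(Timestamp)
```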
The simplest tile we can create that contains a visualization is the KPI card called “STAT”:
Notice that for each visual selection, you have different options for customizing the visual. The stat card is the simplest one with only text size and value column.
For most of the visuals, you also have options to create conditional formatting based on rules:
Here you can go nuts with coloring rules, though with a limited selection of colors. For the Stat card, we have Red, Yellow, Green, and Blue. These colors might not be ideal for visual accessibility or for colorblind users, but it is what we have for now. To compensate a bit, we can also choose from a long list of icons for each color.
We can also add a tag, which appears in the bottom left corner of the tile as a small text box. So even though the color options are limited, we have two extra aids, icons and tags, to help everyone understand the status of a visual.
After clicking “Apply changes,” we now have our first tile in the dashboard.
Filtering the dashboard
No visual representation of data is complete without some filtering options. And in the Real-Time Dashboard, we have a very special one for time — which is built in and given to us for all dashboards we create.
Let’s start with the KQL statement of the tile we just created.
MachineLogs
| count
At the top of the visual work area, we have the [@]Parameters area. Here we find the Time Range with two parameters: startTime and endTime. We can use these in the KQL query as follows:
MachineLogs
| where Timestamp between (_startTime .. _endTime)
| count
Now we can use the built-in filter option for time. Notice that the time range is highly customizable, with standard ranges from Last 15 minutes to Last 1095 days, plus a custom range where we can define exactly the window we need.
Without trying to tell you which one is best between Power BI and Real-Time Dashboard, the RT-Dashboard has a highly granular approach to time filtering, with options to fit the user’s needs — without the need for an optimized and separated date and time table as in Power BI. This also makes perfect sense, as the Real-Time Intelligence part of Fabric is built for time-series data with extremely high volumes of data and the need for detailed insights, without aggregating the details away.
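As an illustration of that granularity, a time-chart tile can combine the built-in time parameters with bin() to aggregate at whatever grain fits the selected range (the one-hour bin here is just an illustrative choice):

```kql
// Events per hour within the selected time range,
// suitable for a time chart tile.
MachineLogs
| where Timestamp between (_startTime .. _endTime)
| summarize Events = count() by bin(Timestamp, 1h)
```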
Your own custom parameter
Of course, you can also build your own custom parameter in the dashboard. This is done by clicking the [@]Parameters button and creating the needed list of values, either from a hard-coded list or based on a KQL query.
To use the created parameter, you simply add the correct where clause and reference the name of the parameter. Below I've created a parameter called MachineName and use it as a filter on the Machine column in the KQL statement.
MachineLogs
| where Timestamp between (_startTime .. _endTime)
| where isempty(['MachineName']) or Machine in (['MachineName'])
| count
Notice that you can have as many where statements in your KQL script as you like. Try to place them as early as possible in the script; that way you help the engine discard irrelevant rows before the more expensive operations run.
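As a sketch of that ordering advice: put the cheap, highly selective predicates at the top so later operators work on fewer rows (the EventType filter and the final aggregation here are hypothetical, just to show expensive work coming last):

```kql
// Filters first: shrink the row set before expensive work.
MachineLogs
| where Timestamp between (_startTime .. _endTime)               // prune by time first
| where isempty(['MachineName']) or Machine in (['MachineName']) // then by parameter
| where EventType == "Error"                                     // narrow further
| summarize Errors = count() by Machine                          // aggregate last
```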
Build the dashboard with more tiles
After the first tile has been built, we can add more tiles, move them around, scale them and make it look as we need. It could look something like this:
Auto refresh
The Real-Time Dashboard has a very nice feature to help the end user get insights as they happen in the Eventhouse. It is called Auto refresh and is found in the Manage section of the top menu when editing the dashboard.
Here you can enable the feature and select the options you want to provide to your end users, from Allow all refresh intervals to 1 day. This enables the dashboard to auto-update all tiles at the defined refresh rate.
You can also define the default rate, which is the refresh rate that is enabled whenever a user opens the dashboard.
Now we get to the part I promised at the beginning of this post to return to — the fact that each tile has its own KQL statement.
For each refresh of the dashboard, the KQL statement behind every tile is executed against the KQL database in the Eventhouse. This also applies to continuous refresh, which simply re-executes the KQL statements as soon as the previous result set has been returned to the dashboard. For the dashboard above with three tiles, every refresh therefore executes three KQL statements. Imagine having more tiles with even more complex KQL statements; this could become expensive in CU cost on your capacity.
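One way to soften that cost, sketched below, is to make each tile's statement scan and return as little as possible, for example aggregating to the grain the visual actually needs instead of shipping raw log rows on every refresh (this is a general KQL pattern, not a dashboard-specific feature):

```kql
// Cheaper tile query: aggregate to the visual's grain
// rather than returning raw rows on every refresh.
MachineLogs
| where Timestamp between (_startTime .. _endTime)
| summarize Events = count() by EventType, bin(Timestamp, 15m)
```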
Below is the above dashboard running with continuous refresh — it is pretty nice…
To sum it up
I hope you liked this installment on the Real-Time Dashboard. If you have any comments, please write them in the thread below.
Don’t forget to sign up for the newsletter, and if you need more, you can always follow me on LinkedIn — the link to my profile is found at the top of this page.
Enjoy and happy coding.