
Get started with exploring your data with KQL – a purpose-built tool for petabyte scale data analytics

In this blog, we’ll focus on the different ways of querying data in Synapse Real-Time Analytics. Before we begin, make sure to follow the steps in ‘Synapse Real-Time Analytics: Discovering the best ways to get data into a KQL database’ to ingest data into your KQL database.
 
We recently launched Microsoft Fabric – an all-in-one analytics solution for enterprises that covers everything from data movement to data science, real-time analytics, and business intelligence. Microsoft Fabric brings together new and existing components from Power BI, Azure Synapse, and Azure Data Explorer into a single integrated environment. The platform is built on a foundation of Software as a Service (SaaS), which takes simplicity and integration to a whole new level.
 
One of Fabric’s key components is Synapse Real-Time Analytics, a fully managed big data analytics platform optimized for streaming and time-series data. It provides all the amazing query capabilities, performance, and scale that customers are used to with Azure Synapse Data Explorer in a SaaSified experience.
 
The main items available in Real-Time Analytics include:

    • A KQL database for data storage and management.
    • Event streams for data loading.
    • A KQL queryset to run queries and to view and manipulate query results.

 
Querying data in Real-Time Analytics is powered by Kusto Query Language (KQL). KQL is a powerful language for exploring your data, discovering patterns, identifying anomalies and outliers, building statistical models, and more. Queries use schema entities that are organized in a hierarchy similar to SQL’s: databases, tables, and columns.
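For example, every query runs in the context of a database and references tables and columns by name. Here’s a minimal sketch, assuming the StocksDaily table used in the examples later in this post:

StocksDaily                  // a table in the current database
| where Ticker == "ATVI"     // filter on a column
| take 10                    // return any 10 matching records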
  

Query table

If you’re just starting out with KQL, you can use sample queries to get an initial look at your data. To access the sample queries, browse to your KQL database, select a table, and then select ‘Query table’ from the Table tab. When you select a query, it populates in the ‘Explore your data’ window, and automatically runs in the context of the selected table.
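The generated samples are simple starting points. For instance, a ‘show any 100 records’ style sample, sketched here against a table named StocksDaily, looks something like this:

StocksDaily
| take 100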


The ‘Query table’ context menu also provides some sample SQL queries, so you can bring your existing SQL skills to query the data in your KQL database.
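If you already know the query you want in SQL, the KQL explain statement can also translate a T-SQL SELECT statement into its KQL equivalent, which is a handy way to pick up the language. A small sketch, assuming the StocksDaily table used in the examples below:

explain
SELECT TOP 10 Ticker, Close FROM StocksDaily

Running this returns the equivalent KQL query, which you can then copy and refine.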

  

Explore your data

The ‘Explore your data’ button lets you quickly start exploring your data and writing KQL queries. As you’ll notice, your previously written queries are available in this window.
 
The KQL editor is powered by IntelliSense, so you don’t need to remember table names, column names, or KQL commands and operators. IntelliSense helps you complete the query and highlights syntax errors, if there are any.
 
You can save the queries in the ‘Explore your data’ window as a KQL queryset. Your KQL queryset becomes an artifact in your workspace that you can share and collaborate on with other users.


KQL Queryset

KQL Queryset is the query editor for a KQL database in Real-Time Analytics. It enables you to write simple to complex KQL queries to explore and analyze your data. As you begin writing KQL queries, the KQL documentation on Microsoft Learn is a helpful reference.

 
Here are some examples of KQL queries to help you get started. You can copy and run these queries in your KQL queryset.
 
1. Count the number of records by ticker:

StocksDaily
| summarize count() by Ticker

 

In this query, we use the summarize operator and the count() aggregation function. Similar to SQL, KQL provides many standard aggregation and scalar functions.
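Other aggregations follow the same pattern. For example, a sketch that also reports the average and maximum closing price per ticker:

StocksDaily
| summarize count(), avg(Close), max(Close) by Ticker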

2. Find the years with the largest change for the ATVI stock ticker:

StocksDaily
| where Ticker == "ATVI"
| summarize max(Close), min(Close) by startofyear(Date)
| extend pct = round((max_Close - min_Close) / min_Close * 100, 2)
| sort by pct

 

In this query, we use the extend operator to create a calculated column and then the sort operator to sort the result set.
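extend can add several calculated columns in one step, and sort accepts an explicit direction (descending is the default). A small variation on the query above:

StocksDaily
| where Ticker == "ATVI"
| summarize max(Close), min(Close) by Year = startofyear(Date)
| extend range = max_Close - min_Close, pct = round((max_Close - min_Close) / min_Close * 100, 2)
| sort by pct desc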

3. Render a timechart of the closing price of the ^GSPC stock ticker:

StocksDaily
| where Ticker == "^GSPC"
| project Close, Date
| sort by Date asc
| render timechart

 

In this query, we use the project operator to select the columns of interest, and then the render operator to show the result set as a timechart.
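render can also plot several series on one chart. For example, a sketch that compares two tickers from the same table, using Ticker as the series column:

StocksDaily
| where Ticker in ("^GSPC", "ATVI")
| project Date, Ticker, Close
| sort by Date asc
| render timechart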

4. Find the days when the stock market crashed:

StocksDaily
| where Ticker == "^GSPC"
| serialize
| extend prevClose = prev(Close, 1)
| extend pct = round((prevClose - Close) / Close * 100, 2)
| top 10 by pct
| project Date, pct
| render scatterchart

 

In this query, we use the serialize operator to serialize the output so that we can use prev() to calculate the difference between the current and previous values. KQL also provides the top operator, which combines sorting with returning only the specified number of records.
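Because sort also produces a serialized result, you can compute the day-over-day change without an explicit serialize and chart it over time. A sketch along those lines:

StocksDaily
| where Ticker == "^GSPC"
| sort by Date asc
| extend dailyChangePct = round((Close - prev(Close)) / prev(Close) * 100, 2)
| project Date, dailyChangePct
| render timechart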

5. Find the stocks that had the largest change in any given year:

StocksDaily
| summarize max(Close), min(Close) by Ticker, startofyear(Date)
| extend pct = round((max_Close - min_Close) / min_Close * 100, 2)
| sort by pct

 

In this query, we combine the summarize, extend, and sort operators with arithmetic calculations.
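To keep only the biggest movers instead of the full sorted list, you can swap sort for top. A sketch that also names the year column explicitly:

StocksDaily
| summarize max(Close), min(Close) by Ticker, Year = startofyear(Date)
| extend pct = round((max_Close - min_Close) / min_Close * 100, 2)
| top 5 by pct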

 
In summary, Synapse Real-Time Analytics brings the power of Kusto Query Language (KQL) to the Microsoft Fabric platform, along with intuitive, UI-driven options and the ability to save and share queries for seamless collaboration.
 
Over the next few weeks, we’ll dive deeper into many of these features through blogs, videos, tutorials, and Learn modules.
 
In the meantime, check out a demo that showcases getting started with KQL in Real-Time Analytics.
