Announcing Public Preview of T-SQL Notebook in Fabric

Coauthored by: Priyanka Langade

T-SQL notebooks are now in public preview! This update expands our language offerings, allowing T-SQL developers to use notebooks for Data Warehouse development. You can manage complex queries and document your process with Markdown cells. With the ability to add a Data Warehouse or SQL analytics endpoint to a notebook, you can run T-SQL directly against the connected warehouse or SQL analytics endpoint. BI analysts benefit as well: they can perform cross-database queries to gather insights from multiple warehouses and SQL analytics endpoints.

Most existing notebook functionality is readily available in T-SQL notebooks. For example, T-SQL developers can use the extensive charting features to visualize query results and co-author a notebook with peers through collaboration. You can also schedule regular execution of a T-SQL notebook with the scheduler or trigger its execution from a Data Integration pipeline.

To get started with this experience, you can create a T-SQL notebook from either of the following two entry points:

  • Create a T-SQL Notebook from the Data Warehouse homepage.
  • Create a T-SQL Notebook from an existing Fabric warehouse editor.

Once the notebook is created, T-SQL is set as the default language. You can then add data warehouses or SQL analytics endpoints from the current workspace to the notebook.

Multiple warehouses and SQL analytics endpoints can be added to the same notebook, and one of them must be set as the primary warehouse. The SQL connection string of the primary warehouse is used to accept requests to run T-SQL code; you can use the context menu to switch the primary warehouse.

As in the Warehouse editor experience, the primary warehouse is always listed as the first item in the Data Warehouse list.

Like the SQL editor, you can insert familiar T-SQL code templates from the context menu. The first release includes the following options, with more to come later (see the sketch after this list):

  • Select Top 100
  • Create Table
  • Create As Select
  • Drop
  • Drop and create
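
For reference, here is a minimal sketch of the kind of statements these templates scaffold. The exact text the notebook generates may differ, and the table and column names below are placeholders:

    -- Select Top 100: preview the first rows of a table (placeholder names)
    SELECT TOP (100) * FROM [dbo].[DimCustomer];

    -- Create As Select (CTAS): materialize a query result as a new table
    CREATE TABLE [dbo].[PreferredCustomers]
    AS
    SELECT CustomerKey, CustomerName
    FROM [dbo].[DimCustomer]
    WHERE IsPreferred = 1;

    -- Drop: remove a table that is no longer needed
    DROP TABLE [dbo].[PreferredCustomers];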

Clicking the “Run” button on a code cell executes all the T-SQL code in that cell under a single SQL session.

If you click “Run all”, every code cell in the notebook runs in order, and each code cell is executed under a new SQL session.

A single code cell can contain multiple lines of code. You can select part of that code and run only the selection; each such execution starts a new SQL session.
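
The per-cell session scoping matters when you rely on session-level state. Here is a minimal sketch, assuming temporary tables follow standard T-SQL session scoping (object names are placeholders):

    -- Cell 1: both statements run in the same SQL session,
    -- so the temporary table is visible to the SELECT
    CREATE TABLE #RecentOrders (OrderID INT, OrderDate DATE);
    SELECT COUNT(*) AS RowsInTemp FROM #RecentOrders;

    -- Cell 2: with "Run all", each cell gets a new SQL session,
    -- so #RecentOrders from Cell 1 no longer exists and this statement fails
    SELECT COUNT(*) AS RowsInTemp FROM #RecentOrders;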

By expanding the message panel, you can check the summary of the execution.

The “Table” tab lists the records from the returned result set. If the execution contains multiple result sets, you can switch from one to another via the dropdown menu.

You can use the “Save as table” menu to save the query results into a table using a CTAS (CREATE TABLE AS SELECT) command. To use it, select the query text in the code cell and choose “Save as table”.
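
Conceptually, “Save as table” wraps the selected query in a CTAS statement along the lines of this sketch (the target table name and the query itself are placeholders):

    -- Hypothetical result of "Save as table" on a selected query
    CREATE TABLE [dbo].[CustomerOrderCounts]
    AS
    SELECT c.CustomerKey, COUNT(*) AS OrderCount
    FROM [dbo].[DimCustomer] AS c
    JOIN [dbo].[FactOrders] AS o
        ON o.CustomerKey = c.CustomerKey
    GROUP BY c.CustomerKey;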

Similarly, you can create a view from your selected query text using the “Save as view” menu in the cell command bar.
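
Conceptually, “Save as view” produces a CREATE VIEW statement over the selected query, roughly like the following (view name and query are placeholders):

    -- Hypothetical result of "Save as view" on the same selected query
    CREATE VIEW [dbo].[vw_CustomerOrderCounts]
    AS
    SELECT c.CustomerKey, COUNT(*) AS OrderCount
    FROM [dbo].[DimCustomer] AS c
    JOIN [dbo].[FactOrders] AS o
        ON o.CustomerKey = c.CustomerKey
    GROUP BY c.CustomerKey;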

When you click “Inspect”, you can see charts that represent the data quality and distribution of each column.

You can perform cross-warehouse queries by using three-part naming in the T-SQL code.
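
For example, if the workspace contains another warehouse or SQL analytics endpoint added to the notebook, a query can reference it by database, schema, and table name. In this minimal sketch, the warehouse, schema, and table names are placeholders:

    -- Hypothetical cross-warehouse join using three-part naming
    SELECT s.SaleID, s.Amount, c.CustomerName
    FROM [SalesWarehouse].[dbo].[FactSales] AS s
    JOIN [MarketingWarehouse].[dbo].[DimCustomer] AS c
        ON c.CustomerKey = s.CustomerKey;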

Within the same notebook, you can create code cells that use different languages. For instance, a PySpark code cell can precede a T-SQL code cell. In that case, you can choose to skip running the PySpark code when executing the T-SQL notebook.
