Microsoft Fabric Updates Blog


Announcing Azure Private Link Support for Microsoft Fabric in Public Preview

Organizations today rely on cloud platforms to store and analyze data at scale, and must keep pace with accelerating data volumes while protecting sensitive information. While enterprises in banking, healthcare, and similar domains are held to strict data security standards by default, securing business-critical data is the highest priority for every enterprise. At …

Introducing Managed Private Endpoints for Microsoft Fabric in Public Preview

In the era of AI, data has become the cornerstone of analytics platforms. With the ever-increasing volume of data being collected across various applications, data lakes, databases, and data warehouses within an enterprise data estate, the need for secure access to enterprise data sources has become critical. This is particularly important given the growth of …

Eventhouse Overview: Handling Real-Time Data with Microsoft Fabric

Today, Fabric introduces the Eventhouse (preview), a dynamic workspace hosting multiple KQL databases as part of Fabric Real-Time Analytics (see Overview of Real-Time Analytics – Microsoft Fabric | Microsoft Learn). Eventhouses assume a pivotal role in the Microsoft Fabric ecosystem by offering a robust solution for managing and analyzing substantial …

Reduce egress costs with S3 shortcuts in OneLake

Fabric allows workloads to easily access data across clouds through OneLake shortcuts. Define a shortcut once and use it with Power BI reports, SQL, Spark, and Kusto. This ease of consumption lets users start analyzing their data in minutes rather than hours or even days, but it can also lead to increased egress charges. …

Dataflows Gen2 data destinations and managed settings

We are excited to announce several new improvements to data destinations in Dataflows Gen2. Here is an overview of the improvements and how to get started. After you have cleaned and prepared your data with Dataflows Gen2, you will want to land it in a destination. This is possible with …