Microsoft Fabric Updates Blog

Service principal support to connect to data in Dataflow, Datamart, Dataset and Dataflow Gen 2

Today I am very excited to announce that Azure service principal has been added as an authentication type for a set of data sources that can be used in Dataset, Dataflow, Dataflow Gen2 and Datamart. 

Azure service principal is a security identity that is application based and can be assigned permissions to access your data sources. Service principals are used to safely connect to data, without a user identity. Learn more about service principals.
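In practice, a service principal authenticates with three values: the directory (tenant) ID, the application (client) ID, and a client secret or certificate. These are the same values Fabric asks for in the steps below. As a minimal sketch, assuming the azure-identity Python package and placeholder IDs that are not from this article, acquiring a token with a service principal looks like this:

```python
# Minimal sketch: token acquisition with a service principal (azure-identity).
# The tenant ID, client ID, and secret below are placeholders, not real values.
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="00000000-0000-0000-0000-000000000000",   # Directory (tenant) ID
    client_id="11111111-1111-1111-1111-111111111111",   # Application (client) ID
    client_secret="<service-principal-key>",            # Client secret ("service principal key")
)

# Request an access token for Azure Storage; no user sign-in is involved.
token = credential.get_token("https://storage.azure.com/.default")
print(token.expires_on)
```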

Supported data sources

  • Azure Synapse Analytics
  • Azure SQL Database
  • Azure Data Lake Storage Gen2
  • Azure Data Lake
  • Azure Blob Storage
  • Web
  • Dataverse
  • SharePoint Online

Note: Service principal authentication is not supported for SQL data sources with DirectQuery in datasets.
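For the SQL-based sources in the list above (used with Import, per the note), the same three values authenticate outside of Fabric as well. A minimal sketch, assuming pyodbc, ODBC Driver 18 for SQL Server, and placeholder server and database names that are not from this article:

```python
# Sketch: connecting to Azure SQL Database with a service principal access token.
# Server, database, and credential values are placeholders (assumptions).
import struct
import pyodbc
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(tenant_id="<tenant-id>",
                                    client_id="<client-id>",
                                    client_secret="<client-secret>")

# Token for the Azure SQL resource, encoded the way the ODBC driver expects.
token = credential.get_token("https://database.windows.net/.default").token
token_bytes = token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

SQL_COPT_SS_ACCESS_TOKEN = 1256  # pyodbc pre-connect attribute for access tokens
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-server>.database.windows.net;Database=<your-database>",
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)
print(conn.cursor().execute("SELECT 1").fetchval())
```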

How to use service principals to connect to your data in Dataflow Gen2 

In this example, we will show how you can use a service principal to connect to Azure Data Lake Storage Gen2 through Dataflow Gen2. Later in the article, we show how you can use service principals in datasets.  

Prerequisites

  1. Create a service principal using the Azure portal. 
  2. Grant the application read access on the data source. For a data lake, make sure the application has the Storage Blob Data Reader role (see the verification sketch after this list).  
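To sanity-check the second prerequisite before going into Fabric, you can list a few paths in the lake with the service principal itself. A minimal sketch, assuming the azure-storage-file-datalake package and placeholder account, container, and credential values that are not from this article:

```python
# Sketch: verifying the service principal can read from ADLS Gen2
# (i.e., that the Storage Blob Data Reader role assignment works).
# Account, container, and credential values are placeholders.
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential(tenant_id="<tenant-id>",
                                    client_id="<client-id>",
                                    client_secret="<client-secret>")

service = DataLakeServiceClient(
    account_url="https://<your-storage-account>.dfs.core.windows.net",
    credential=credential,
)

# Listing paths only needs read access; an authorization error here usually
# means the role assignment is missing or has not propagated yet.
filesystem = service.get_file_system_client("<your-container>")
for path in filesystem.get_paths(max_results=5):
    print(path.name)
```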

Connect to your data using a service principal within Dataflow Gen2 

  1. Navigate to https://app.fabric.microsoft.com/ 
  2. Create a Dataflow Gen2. 
  3. Select Azure Data Lake Storage Gen2 as the source. 
  4. Fill in the URL and select Create new connection. 
  5. Change the Authentication kind to Service principal. 
  6. Fill in the Tenant ID. You can find the tenant ID in the Azure portal. 
  7. Fill in the Service principal ID (the application's client ID, also shown in the Azure portal). 
  8. Fill in the Service principal key. 
  9. Click Next. 

How to use service principals to connect to your data in datasets 

Prerequisites

  1. Create a service principal using the Azure portal. See Create an Azure AD app and service principal in the portal – Microsoft Entra | Microsoft Learn. 
  2. Grant the application read access on the data source. 
  3. Have a dataset published to the service. 

Connect to your data using a service principal within datasets 

  1. Navigate to https://app.fabric.microsoft.com/ 
  2. Navigate to the dataset settings page. 
  3. Navigate to Data source credentials and click Edit credentials. 
  4. Fill in the Tenant ID. You can find the tenant ID in the Azure portal. 
  5. Fill in the Service principal ID (the application's client ID, also shown in the Azure portal). 
  6. Fill in the Service principal key. 
  7. Click Sign in. To confirm that the new credentials work, you can trigger a refresh, as in the sketch after this list. 
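Once the credentials are saved, a quick way to confirm them is to kick off a dataset refresh with the Power BI REST API. A hedged sketch using the requests package; the workspace and dataset IDs are placeholders, and the identity calling the API (here the same service principal, though any identity with write access to the dataset works) must be allowed to use the Power BI REST APIs in your tenant:

```python
# Sketch: trigger a dataset refresh to confirm the new data source credentials.
# Workspace and dataset IDs are placeholders; the caller must be permitted
# to use the Power BI REST APIs on this workspace.
import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(tenant_id="<tenant-id>",
                                    client_id="<client-id>",
                                    client_secret="<client-secret>")
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default").token

workspace_id = "<workspace-id>"
dataset_id = "<dataset-id>"
url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
       f"/datasets/{dataset_id}/refreshes")

# 202 Accepted means the refresh was queued; a credential problem would
# show up as a failed refresh in the dataset's refresh history.
response = requests.post(url, headers={"Authorization": f"Bearer {token}"})
print(response.status_code)
```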

Other resources

  • Join the Fabric community to post your questions, share your feedback, and learn from others.
  • Visit Microsoft Fabric Ideas to submit feedback and suggestions for improvements and vote on your peers’ ideas!
  • Check our Known Issues page to stay up to date on product fixes!

Have any questions or feedback? Leave a comment below!
