Microsoft Fabric Updates Blog

Integrating On-Premises Data into Microsoft Fabric Using Data Pipelines in Data Factory

We are thrilled to announce the public preview of on-premises connectivity for Data pipelines in Microsoft Fabric.

Using the on-premises data gateway, customers can connect to on-premises data sources from dataflows and data pipelines with Data Factory in Microsoft Fabric. This enhancement significantly broadens the scope of Fabric's data integration capabilities: organizations can keep databases and other data sources on their on-premises networks while securely integrating them with Microsoft Fabric in the cloud.

Thank you to all the customers and members of the Microsoft Fabric community who shared product feedback and worked closely with us to deliver this new capability!

Let’s help you get started!

Create an on-premises data gateway

  1. The on-premises data gateway is software that you install on a machine within your local network; it acts as a secure bridge between your on-premises data sources and Microsoft Fabric. For detailed instructions on how to download and install it, refer to Install an on-premises data gateway.
  2. Sign in with your user account to register the gateway; once signed in, the gateway is ready to use.
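The gateway only makes outbound connections, over TCP port 443 and the Azure Relay ports 5671–5672 and 9350–9354, so no inbound firewall rules are needed. Before installing, you can run a quick reachability probe from the gateway machine. This is a minimal, unofficial Python sketch, not a supported diagnostic tool:

```python
import socket

# Outbound TCP ports the on-premises data gateway uses to reach Azure Relay.
GATEWAY_PORTS = [443, 5671, 5672] + list(range(9350, 9355))

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if an outbound TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example probe (hostname shown for illustration; check the hosts your
# tenant actually uses):
# port_reachable("login.microsoftonline.com", 443)
```

If a probe fails, check your corporate firewall or proxy rules before troubleshooting the gateway itself.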

Create a connection for your on-premises data source

  1. Select the settings button (the gear icon) at the top right of the page, then choose Manage connections and gateways from the dropdown menu that appears.

  2. In the New connection dialog, select On-premises, then provide your gateway cluster along with the resource type and its connection details.
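For a SQL Server source, for example, the dialog asks for the gateway cluster, the resource type, and the server and database details. As a checklist of what to have on hand, here is a minimal sketch of those values; every name below is an illustrative placeholder, not a real server or gateway, and the exact fields vary by resource type:

```python
# Illustrative values for the "New connection" dialog; all names are placeholders.
connection = {
    "connectionType": "On-premises",
    "gatewayCluster": "contoso-gateway-cluster",  # the gateway installed earlier
    "resourceType": "SQL Server",                 # type of on-premises source
    "server": "sqlserver01.corp.contoso.com",
    "database": "SalesDB",
    "authentication": "Windows",                  # or Basic, depending on the source
}

# Sanity check: the fields the dialog always needs are present.
required = {"gatewayCluster", "resourceType", "server", "database"}
missing = required - connection.keys()
```

The dialog drives the actual field names, so treat this dict purely as a pre-flight checklist.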

Using on-premises data in a pipeline

  1. Go to your workspace and create a Data Pipeline.

  2. Add a new source to the pipeline copy activity and select the connection established in the previous step.

  3. Select a destination for your data from the on-premises source.

  4. Run the pipeline.

You have now created a pipeline to load data from an on-premises data source into a cloud destination.
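Under the hood, a pipeline serializes to a JSON definition in which the copy step is a Copy activity pairing a source with a sink. The sketch below shows only the general shape: the property and type names follow the familiar Data Factory pipeline structure but are illustrative here, and the activity name is a placeholder.

```python
import json

# Simplified sketch of a pipeline with one Copy activity. Property names
# follow the general Data Factory pipeline shape; treat them as illustrative,
# not the exact JSON Fabric emits.
pipeline = {
    "name": "CopyOnPremToCloud",
    "properties": {
        "activities": [
            {
                "name": "CopyFromOnPremSource",
                "type": "Copy",
                "typeProperties": {
                    "source": {"type": "SqlServerSource"},  # on-premises side, via the gateway connection
                    "sink": {"type": "LakehouseTableSink"},  # cloud destination
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

Seeing the shape can help when reviewing or source-controlling pipeline definitions, even though the Fabric UI builds this JSON for you.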

Resources to help you get started

Have any questions or feedback? Leave a comment below!
