
Copy data to Amazon S3 & Amazon S3 Compatible via Fabric Data Factory Data Pipeline

We are happy to announce that copying data to Amazon S3 and Amazon S3 Compatible destinations is now available in Data pipelines in Fabric Data Factory! You can use the Copy assistant or the Copy activity in your Data pipeline to perform this data movement.

How to create an Amazon S3 & Amazon S3 Compatible connection

You can create an Amazon S3/Amazon S3 Compatible connection directly through the Copy assistant or from the Copy activity's Destination tab.

In Copy assistant

Amazon S3 and Amazon S3 Compatible are now available when creating new connections with the Copy assistant.

In Data pipeline

Create a new connection to Amazon S3 or Amazon S3 Compatible in your Copy data activity. A pop-up window lets you select Amazon S3 or Amazon S3 Compatible as the connection type when creating the new connection.
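
For orientation, the fields the connection dialog asks for map roughly to the following Python-dict sketch. The property names here are illustrative assumptions rather than Fabric's exact connection schema; the point is that an Amazon S3 or S3 Compatible connection needs a service endpoint and an access key pair.

```python
# Illustrative sketch of an Amazon S3 / S3 Compatible connection.
# Property names are assumptions, not Fabric's exact connection schema.
s3_connection = {
    "name": "my-s3-connection",                    # hypothetical connection name
    "type": "AmazonS3",                            # or "AmazonS3Compatible"
    "url": "https://s3.amazonaws.com",             # endpoint; for S3 Compatible, the store's URL
    "authenticationKind": "Key",
    "accessKeyId": "<AWS_ACCESS_KEY_ID>",          # placeholder credential
    "secretAccessKey": "<AWS_SECRET_ACCESS_KEY>",  # placeholder; keep secrets out of source
}
```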

How to configure the copy destination for Amazon S3 & Amazon S3 Compatible 

To copy data directly to Amazon S3/Amazon S3 Compatible, specify the destination folder path (required) and, optionally, the file name.

If you use the Copy activity in your Data pipeline, specify the File format (required) for the destination data. You can also use advanced settings to improve copy flexibility and performance: Copy behavior defines how files are copied when the source is a file-based data store, and Max concurrent connections sets the upper limit of concurrent connections established to the data store during the activity run.
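
Taken together, the destination settings above amount to something like the following. This is a minimal Python-dict sketch for orientation only; the property names (folderPath, fileName, and so on) are illustrative assumptions, not Fabric's exact pipeline schema, so refer to the Microsoft Learn pages linked below for the authoritative configuration.

```python
# Hedged sketch of the Copy activity destination settings discussed above.
# Property names are illustrative assumptions, not Fabric's exact schema.
copy_destination = {
    "folderPath": "output/sales/2024",    # required: destination folder path
    "fileName": "orders.parquet",         # optional: destination file name
    "fileFormat": "Parquet",              # required: e.g. DelimitedText, JSON, Parquet
    "copyBehavior": "PreserveHierarchy",  # advanced: behavior for file-based sources
    "maxConcurrentConnections": 4,        # advanced: cap on concurrent connections
}
```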

For more details about the copy destination configuration, see Configure Amazon S3 in a copy activity – Microsoft Fabric | Microsoft Learn and Configure Amazon S3 Compatible in a copy activity – Microsoft Fabric | Microsoft Learn.
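
After a pipeline run completes, one quick way to confirm the files landed is to list the destination prefix with the AWS SDK. Here is a sketch using boto3, with placeholder bucket and prefix values:

```python
import boto3

# List the destination prefix to verify the copied files (placeholders below).
s3 = boto3.client(
    "s3",
    # For an S3 Compatible store, point the client at its endpoint instead:
    # endpoint_url="https://my-s3-compatible.example.com",
)
response = s3.list_objects_v2(Bucket="my-bucket", Prefix="output/sales/2024/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```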

Have any questions or feedback? Leave a comment below!
