Microsoft Fabric Updates Blog

Integrate your SAP data into Microsoft Fabric

SAP systems hold some of the most valuable data of many enterprises, large or small. Whether it is operational data in ERP systems like SAP ECC or SAP S/4HANA, data from SAP’s data warehouse suite of products like SAP BW or SAP Datasphere, or data from SAP’s SaaS solutions – a comprehensive view of the state of the business typically requires SAP data to be combined with data from various other sources.

Working with SAP data often requires a variety of skills: setting up pipelines for incremental extraction of vast amounts of financial or sales order data along with maintaining and monitoring those pipelines is typically a task for data integration professionals. On the other hand, the use case and department-specific consumption of data and the corresponding enrichment with non-enterprise sources is best handled by citizen data integrators and developers.

Consequently, an analytics platform like Microsoft Fabric, which offers a single, comprehensive environment for all developer personas, is an ideal foundation for working with SAP data.

Fundamentally, there are three different approaches to integrating your SAP data into Microsoft Fabric. First, you can use the built-in SAP connectors of Microsoft Fabric. Second, you might want to leverage Azure Data Factory or Azure Synapse Analytics to land data in your Fabric Lakehouse. Or, if some of your SAP data already resides in Azure Data Lake Gen2, AWS or GCP, you can simply use shortcuts to make that data accessible for your Microsoft Fabric users.

Of course, the current SAP connectivity in Microsoft Fabric is just a starting point. Over the coming months we will continue to add more built-in connectors and extend their capabilities, so stay tuned for more updates!

Built-in connectivity for SAP sources

Using the built-in connectivity of Microsoft Fabric is, of course, the easiest and least-effort way of adding SAP data to your Fabric data estate. The current options provide connectivity to SAP S/4HANA, SAP BW or BW/4HANA, SAP HANA or HANA Cloud, and SAP Datasphere.

All these connectors are supported via the on-premises data gateway (OPDG), which is typically installed in network proximity to the SAP source system. This architecture supports SAP systems running virtually anywhere – whether in a customer’s data center, Microsoft Azure, or any other hyperscaler infrastructure – in a secure way since direct network access to the SAP source system is only required for OPDG.

Let’s have a closer look at these connectors:

SAP BW (Application Server or Message Server) connector

This connector is primarily used to access the multidimensional analytic query layer in SAP BW or BW/4HANA. It’s important to note that the analytic query layer of SAP S/4HANA – based on the ABAP CDS virtual data model – is supported with this connector as well.

While this access method is not optimized for mass data extraction of tens or hundreds of millions of rows, it typically provides access to highly curated, easy-to-consume data containing complex, business-relevant KPIs. In addition, data security is handled in this layer as well, which makes it an extremely valuable access point for self-service analytics.

In addition to import via Fabric dataflows, this connector also supports DirectQuery.

Configuration of the SAP BW connectors is described here:
Set up your SAP BW Application Server connection – Microsoft Fabric | Microsoft Learn
Set up your SAP BW Message Server connection – Microsoft Fabric | Microsoft Learn

SAP HANA connector

Similar to the SAP BW connector, the SAP HANA connector’s standard access path is the multidimensional cube layer of an SAP HANA or SAP HANA Cloud database, consisting of so-called HANA Calculation Views. Again, this layer is frequently used to prepare data for rendering in BI clients and thus contains business-relevant KPI definitions and data security.

However, many customers also use HANA Calculation Views for more SQL-style relational data modeling and have even built data extraction layers in their SAP source systems based on this technology. Also, SAP BW-on-HANA or SAP BW/4HANA provide HANA Calculation Views as an external interface for third-party tools. Therefore, this option can also be considered for integration of larger data volumes via Fabric dataflows.

In addition, the SAP HANA connector can also be used to extract data from SAP HANA tables and SQL views with the ability to fetch data via a SQL statement.

Finally, it is worth noting that access to SAP Datasphere’s underlying HANA Cloud layer is also possible. Such connectivity can be established by creating a SQL user with read access in an appropriate Space in SAP Datasphere and then using the above-mentioned SQL statement option of the SAP HANA connector.

For HANA Calculation Views, this connector also supports DirectQuery.

Configuration of the SAP HANA connector is described here: Set up your SAP HANA database connection – Microsoft Fabric | Microsoft Learn.

ODBC connector

For access to SAP HANA SQL artifacts like tables and (SQL) views, Microsoft Fabric’s ODBC connector provides an alternative to the SAP HANA connector. Since it is less geared towards proprietary SAP HANA artifacts like Calculation Views, some customers prefer it when working with SQL objects.

Especially in the case of SAP Datasphere, which works mostly with standard SQL objects in its data layer, this might be the more natural choice than the SAP HANA connector.
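
As a small illustration of this SQL-style access, the sketch below assembles an ODBC connection string for the SAP HANA ODBC driver (HDBODBC). The host, port and user names are hypothetical placeholders, and a real setup requires the HANA client/ODBC driver to be installed on the gateway machine.

```python
# Sketch: assembling an ODBC connection string for the SAP HANA ODBC driver
# (HDBODBC), e.g. to reach the HANA Cloud layer underneath SAP Datasphere.
# Host, port and user below are hypothetical placeholders.

def hana_odbc_connection_string(host: str, port: int, user: str,
                                password: str, encrypt: bool = True) -> str:
    """Build a key=value ODBC connection string; HANA Cloud requires TLS."""
    parts = [
        "DRIVER={HDBODBC}",
        f"SERVERNODE={host}:{port}",
        f"UID={user}",
        f"PWD={password}",
    ]
    if encrypt:
        parts.append("ENCRYPT=TRUE")
    return ";".join(parts)

conn_str = hana_odbc_connection_string(
    "abc123.hana.prod-eu10.hanacloud.ondemand.com", 443, "FABRIC_READER", "...")
```

With an ODBC driver manager and, for instance, pyodbc installed, such a string could be passed to `pyodbc.connect(conn_str)`.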

Configuration of the ODBC connector is described here: Set up your ODBC connection – Microsoft Fabric | Microsoft Learn

To summarize, here’s an overview of the connectors, supported sources and typical scenarios:

SAP BW (Application Server or Message Server)
– Sources: SAP BW, SAP BW/4HANA, SAP S/4HANA (ABAP CDS-based analytic queries)
– Use case: Access to the multidimensional analytic layer

SAP HANA
– Sources: SAP HANA, SAP HANA Cloud, SAP Datasphere
– Use cases: Access to the multidimensional analytic layer (HANA Calculation Views, including those exposed by SAP BW-on-HANA or BW/4HANA); access to SQL artifacts (tables, views) via SQL statements

ODBC
– Sources: SAP HANA, SAP HANA Cloud, SAP Datasphere
– Use case: Access to SQL artifacts (tables, views)

Leveraging Azure Data Factory or Azure Synapse Analytics

For use cases that cannot yet be covered by the built-in SAP connectivity in Microsoft Fabric, or for customers who want to leverage their existing investments into the Azure Data portfolio and combine it with Fabric, it’s worth looking at the SAP connectivity provided by Azure Data Factory or Azure Synapse Analytics.

Both tools provide comprehensive SAP connectivity for large-scale use cases. Supported SAP sources include SAP S/4HANA, SAP ECC, SAP BW and BW/4HANA, SAP HANA, and SAP cloud applications like SAP SuccessFactors, SAP Concur and others.

With the recently added connectivity to Fabric Lakehouse, customers can leverage these well-established products to ingest SAP data into Microsoft Fabric as well.

For example, to ingest data from SAP into Fabric using the SAP CDC connector in ADF, simply set up a Lakehouse linked service as described in the documents linked below, and use it as the sink in your SAP CDC mapping data flow, as shown in this video.
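
As a rough sketch, a Lakehouse linked service defined in ADF JSON might take the following shape. All IDs and the secret are placeholders, and the exact property names should be verified against the Lakehouse connector documentation, since the feature is in preview and may change:

```json
{
    "name": "FabricLakehouseLinkedService",
    "properties": {
        "type": "Lakehouse",
        "typeProperties": {
            "workspaceId": "<fabric-workspace-guid>",
            "artifactId": "<lakehouse-guid>",
            "tenant": "<entra-tenant-id>",
            "servicePrincipalId": "<app-registration-id>",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalCredential": {
                "type": "SecureString",
                "value": "<secret>"
            }
        }
    }
}
```

The service principal used here needs access to the target Fabric workspace so that the mapping data flow can write to the Lakehouse sink.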

An overview of SAP connectors available in Azure Data Factory and Azure Synapse Analytics with links to the configuration details can be found here: SAP Connectors – Azure Data Factory | Microsoft Learn.

For details on setup and configuration of the Lakehouse connectivity in Azure Data Factory and Azure Synapse Analytics, please read the following documents:

Copy and transform data in Microsoft Fabric Lakehouse Files (Preview) – Azure Data Factory & Azure Synapse | Microsoft Learn

Copy and Transform data in Microsoft Fabric Lakehouse Table (Preview) – Azure Data Factory & Azure Synapse | Microsoft Learn

OneLake Shortcuts

For SAP data that already resides in some hyperscaler data lake storage, OneLake Shortcuts provide a quick and easy-to-setup option. Via shortcuts, existing data lake artifacts like files or Delta tables from Azure Data Lake Gen2, Amazon S3 and others can be accessed from OneLake without the need to physically copy the data: OneLake shortcuts – Microsoft Fabric | Microsoft Learn.

This opens up a variety of scenarios. For example, if you have already extracted large amounts of data from your SAP systems into Azure Data Lake Gen2 using Azure Data Factory or Azure Synapse Analytics (or any other ELT tool that provides connectivity to both SAP and ADLS Gen2), that data can be made available to Microsoft Fabric users and workloads within seconds – and without having to create an additional data copy.
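
Such a shortcut can be created interactively in the Fabric portal, or programmatically. The sketch below builds a request payload for the Fabric REST shortcuts endpoint; the endpoint path, payload shape, and all IDs/URLs are illustrative assumptions that should be checked against the current Fabric REST API documentation.

```python
# Sketch: creating an ADLS Gen2 shortcut in a Fabric lakehouse via the
# OneLake shortcuts REST API. Endpoint shape, workspace/item IDs and the
# connection ID are illustrative assumptions; check the Fabric REST docs.

def build_shortcut_payload(name: str, path: str, adls_url: str,
                           subpath: str, connection_id: str) -> dict:
    """Payload for POST /v1/workspaces/{wsId}/items/{lakehouseId}/shortcuts."""
    return {
        "name": name,              # shortcut name shown in the lakehouse
        "path": path,              # where the shortcut lives, e.g. "Files"
        "target": {
            "adlsGen2": {
                "url": adls_url,          # e.g. https://<account>.dfs.core.windows.net
                "subpath": subpath,       # container/folder with the SAP extracts
                "connectionId": connection_id,  # existing cloud connection (GUID)
            }
        },
    }

payload = build_shortcut_payload(
    "SapExtracts", "Files",
    "https://myaccount.dfs.core.windows.net", "/sap-extracts/finance",
    "00000000-0000-0000-0000-000000000000",
)
# import requests
# requests.post(
#     f"https://api.fabric.microsoft.com/v1/workspaces/{ws_id}/items/{lakehouse_id}/shortcuts",
#     json=payload, headers={"Authorization": f"Bearer {token}"})
```

Once the shortcut exists, the SAP extracts appear under the lakehouse’s Files (or Tables) node like any other OneLake content, without a physical copy.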

In cases where the SAP team within the customer’s IT organization is responsible for providing SAP data to non-SAP consumers, SAP tools might be of particular interest for such a scenario. Below, we list some of these options along with links to additional resources providing more details.

SAP Data Services (aka BODS) is an established ETL tool with a long history and significant market presence, which can also ingest data into ADLS Gen2: How to ingest data from SAP to Azure Data Lake using SAP DS | SAP Blogs.

SAP Data Intelligence can be considered a cloud-based successor of SAP Data Services. One way to leverage it for data ingestion into ADLS Gen2 can be found here: SAP Data Intelligence : SLT Replication to Azure Data Lake with file size limit | SAP Blogs.

SAP Datasphere, an evolution of SAP Data Warehouse Cloud, comes with significantly enhanced capabilities for data integration – including replication into non-SAP targets like ADLS Gen2. Here’s the corresponding link to the SAP Datasphere documentation describing this feature: Using a Cloud Storage Provider As the Target | SAP Help Portal.


The comprehensive SAP connectivity provided by Microsoft – whether already built into Fabric or available in the wider data integration portfolio like Azure Data Factory – can be used with Microsoft Fabric immediately. Have fun using it – and feel free to reach out to us with your feedback!

And finally, watch out for more blog posts as we expand the built-in SAP connectivity in Fabric over the coming months!
