Microsoft Fabric Updates Blog

How to debug user data functions locally in VS Code

Debugging your code is important for identifying and fixing issues when you’re working with user data functions in Microsoft Fabric. You want to make sure everything works as it should, and local debugging lets you catch problems in your code without touching the live environment. In this blog post, I will walk you through the steps to make local debugging easier and faster.

What You’ll Need

Before you dive in, make sure you have:

  • A user data functions item in your Microsoft Fabric workspace.
  • Visual Studio Code with the Fabric extension installed.
  • Python 3.11 available locally (the project uses the 3.11.9 runtime).

Step 1: Open the user data function in VS Code

To open a user data function item in VS Code, start by navigating to the Microsoft Fabric portal and locating the specific user data functions item you want to work with. Select Open in VS Code to open the item locally.

Step 2: Set up your local environment

You will be prompted to create a Python virtual environment for the user data functions project that was opened in VS Code. This environment is used to set up the libraries required by the project.

Select the Python 3.11.9 runtime version when creating the virtual environment.

Once the virtual environment is created, you can start writing your own functions.
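
For reference, the prompt roughly automates the steps below. This is a sketch only, not part of the official tooling: run it with a Python 3.11 interpreter, and note that the Scripts path assumes Windows.

import subprocess
import venv

# Create the virtual environment the same way the VS Code prompt does
venv.create(".venv", with_pip=True)

# Install the project's libraries from requirements.txt into that environment
python_exe = r".venv\Scripts\python.exe"  # use .venv/bin/python on macOS/Linux
subprocess.run(
    [python_exe, "-m", "pip", "install", "-r", "requirements.txt"],
    check=True,
)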

Step 3: Add a new function

In the Fabric explorer, you will find all the connections and libraries added to this user data functions item.

If your item has a SQL database connection, add the following function to function_app.py to create an Employee table and insert a row of data.

# function_app.py already includes the required import and UDF object near the top:
#   import fabric.functions as fn
#   udf = fn.UserDataFunctions()

@udf.connection(argName="sqlDB", alias="<alias for sql database>")
@udf.function()
def write_one_to_sql_db(sqlDB: fn.FabricSqlConnection, employeeId: int, employeeName: str, deptId: int) -> str:
    # Replace with the data you want to insert
    data = (employeeId, employeeName, deptId)

    # Establish a connection to the SQL database
    connection = sqlDB.connect()
    cursor = connection.cursor()
  
    # Create the table if it doesn't exist
    create_table_query = """
        IF OBJECT_ID(N'dbo.Employee', N'U') IS NULL
        CREATE TABLE dbo.Employee (
            EmpID INT PRIMARY KEY,
            EmpName nvarchar(50),
            DepID INT
            );
    """
    cursor.execute(create_table_query)
 
    # Insert data into the table
    insert_query = "INSERT INTO Employee (EmpID, EmpName, DepID) VALUES (?, ?, ?);"
    cursor.execute(insert_query, data)

    # Commit the transaction
    connection.commit()

    # Close the connection
    cursor.close()
    connection.close()               
    return "Employee table was created (if necessary) and data was added to this table"

Step 4: Use Debugging Tools

Press F5 to debug your Fabric functions. You can add a breakpoint anywhere in your code. In debug mode, your breakpoints are hit as expected, and you can test your code as you would test a deployed function.

Run the function step by step, observing variable values to understand its behavior. You can use the following to help debug issues:

  • Breakpoints: pause the code at certain spots to check what’s happening.
  • Console logs: print messages to the console to help you trace issues and debug.
  • Logger: use the logger available with the fabric-user-data-functions library to push logging for the functions already published in Fabric (see the sketch after this list).
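
Here is a minimal sketch of how these might look inside a function. It assumes the standard Python logging module is used for the logger, and the function name and messages are illustrative.

import logging

import fabric.functions as fn

udf = fn.UserDataFunctions()

@udf.function()
def greet(name: str) -> str:
    # Breakpoint: set one on the next line and press F5 to pause here
    message = f"Hello, {name}!"

    # Console log: handy while stepping through the code locally
    print(f"greet() was called with name={name!r}")

    # Logger: assumes the standard logging module, whose output is what
    # surfaces for functions after they are published to Fabric
    logging.info("Returning greeting for %s", name)

    return message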

FAQ

  • Why do squiggly underlines appear on the import statements?

You may see these because the libraries are not yet installed in your virtual environment. The first time you start debugging the function, the libraries are installed as pip install is executed against requirements.txt.

  • After updating the requirements.txt file with new libraries, can the changes to both the functions and the libraries be published?

Today, you can publish the function code changes, but new libraries added to the requirements.txt file will not be added to the user data functions item during publish. After publishing the changes, you need to add the new libraries in the portal using library management and publish again. We understand this is not ideal and are working on improving the library experience.

  • How can local.settings.json be used to add connection strings?

You can use local.settings.json to store connection strings and read them in your code while debugging locally.
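
For example, assuming the usual Azure Functions convention where entries under the Values section of local.settings.json are exposed as environment variables during local runs, a sketch could look like this. The setting name MY_SQL_CONNECTION_STRING is just an illustration.

import os

import fabric.functions as fn

udf = fn.UserDataFunctions()

# Assumed local.settings.json next to function_app.py:
# {
#   "IsEncrypted": false,
#   "Values": {
#     "MY_SQL_CONNECTION_STRING": "<your connection string>"
#   }
# }

@udf.function()
def check_connection_setting() -> str:
    # Entries under "Values" are read here as environment variables while
    # debugging locally; avoid committing real secrets to source control
    conn_str = os.getenv("MY_SQL_CONNECTION_STRING")
    return "Connection string is configured" if conn_str else "Connection string is missing"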

  • Is it possible to add a data connection from the Fabric extension to a user data function item?

Not at this time. All data connections need to be added to the user data functions item in the portal. Sync the changes locally to run these functions in VS Code.

Final Thoughts

By running your function locally in a controlled environment in VS Code, you can find and fix issues faster and make sure your code works as expected. Explore the Fabric User data functions documentation, which guides you through connecting data sources, building pipelines, and integrating with Notebooks and Power BI. Feel free to request features or report issues on the VS Code Fabric extension issues page. Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.
