Microsoft Fabric Updates Blog

Environment is now generally available

Exciting news! The environment is now officially generally available in Microsoft Fabric.

What is the environment in Microsoft Fabric?

The environment serves as a comprehensive container for both the hardware and software settings of your Spark workloads. Within this unified interface, you can select the desired Spark runtime, install libraries, and configure Spark compute settings and properties. It simplifies the process of managing, maintaining, and sharing these configurations.

Environment authoring

Libraries and Spark compute

Library management and Spark compute configuration have been integral features of the environment since its private preview. With these core functionalities, you can tailor distinct configuration sets. Currently, two Spark runtime versions are available in the Spark compute configuration, and administrators can select and fine-tune compute values for the designated pool. As for libraries, public libraries from PyPI and conda are currently supported.

Resources folder (new feature)

The Resources folder facilitates the management of small resources during the development phase. Files uploaded to the environment become accessible from any notebook that is attached to the same environment.

The beauty of this feature lies in its real-time manipulation capabilities. Regardless of the environment’s current state, you can seamlessly add, edit, or remove files and folders. Any changes made in one notebook are instantly reflected across other notebooks and the environment item.
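For example, a small lookup file uploaded to the Resources folder can be read from any attached notebook with ordinary file APIs. This is a minimal sketch only; the path and file name are assumptions and depend on how the Resources folder is exposed in your notebook session.

```python
import pandas as pd

# A minimal sketch. The mount path and file name below are assumptions;
# adjust them to wherever the attached environment's Resources folder is
# exposed in your notebook session.
RESOURCE_FILE = "/env/lookups/country_codes.csv"

# Every notebook attached to the same environment sees the same file,
# so a small lookup table uploaded once can be reused everywhere.
country_codes = pd.read_csv(RESOURCE_FILE)
print(country_codes.head())
```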

Sharing (new feature)

Environment sharing is now available, allowing you to collaborate seamlessly. When you share an environment item, recipients automatically receive read permission. With this permission, they can explore the environment’s configurations and attach it to notebooks or Spark jobs. To ensure smooth code execution, remember to grant ‘Read’ permission on the attached environment when sharing your notebooks and Spark job definitions.

Furthermore, you have the option to share the environment with ‘Share’ and ‘Edit’ permissions. Users with ‘Share’ permission can reshare the environment with others, passing on their existing permissions. Meanwhile, recipients with ‘Edit’ permission can update the environment’s content.

Environment CICD

Git support (new feature)

Fabric environments now offer Git integration with Azure DevOps for seamless source control. Currently, libraries and Spark compute are supported. Within the item root folder, an environment is structured as a ‘Libraries’ folder, containing ‘PublicLibraries’ and ‘CustomLibraries’ subfolders, alongside a ‘Setting’ folder, as sketched below.
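As a rough sketch, the committed folder structure looks something like the following; the file names shown are illustrative and may differ from what Git integration generates in your repository.

```
<environment item root>
├── Libraries
│   ├── PublicLibraries
│   │   └── environment.yml                      # public libraries as YAML
│   └── CustomLibraries
│       └── my_package-0.1.0-py3-none-any.whl    # custom library source files
└── Setting
    └── Sparkcompute.yml                         # Spark compute settings as YAML
```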

Libraries

When you commit an environment to a Git repository, the public library section is transformed into its YAML representation, and each custom library is committed along with its source file. You can manage the environment’s public libraries by editing the existing YAML file or uploading your own, and you can control the custom libraries that affect the environment by uploading or deleting the corresponding files in the designated folder.
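As an illustration, the committed public-library YAML might look like the sketch below, which assumes a conda-style environment specification covering both conda and pip (PyPI) packages; check the file generated in your own repository for the exact schema.

```yaml
# Illustrative public library specification; the exact schema is defined
# by the file Fabric generates in your repository.
dependencies:
  - numpy==1.26.4                  # a conda package pinned to a specific version
  - pip:
      - imbalanced-learn==0.12.3   # a PyPI package installed via pip
```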

Setting

The Spark compute section is likewise transformed into its YAML representation. Within this YAML file, you can switch the attached pool, fine-tune compute configurations, manage Spark properties, and select the desired Spark runtime.
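For instance, the committed Spark compute YAML might resemble the following sketch. The field names and values are assumptions based on the settings described above, not the definitive schema; treat the file generated in your repository as the source of truth.

```yaml
# Illustrative Spark compute specification (field names are assumptions).
instance_pool_id: <your-pool-id>   # the attached pool
driver_cores: 4
driver_memory: 28g
executor_cores: 4
executor_memory: 28g
dynamic_executor_allocation:
  enabled: true
  min_executors: 1
  max_executors: 9
spark_conf:
  spark.sql.shuffle.partitions: "200"   # an example Spark property
runtime_version: "1.2"                  # the selected Spark runtime
```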

Deployment pipeline (new feature)

Fabric’s deployment pipelines simplify the process of delivering modified content across different phases, such as moving from development to test. Excitingly, deployment pipelines now support environment items, allowing you to efficiently manage environment deployments by configuring a workspace for each phase.

Public APIs (new feature)

Public APIs have consistently ranked among the most requested features for our environment, and now they’re finally here. Libraries and Spark compute can be managed through public APIs. If you’re interested in learning how to utilize APIs for environment management, I recommend reading this article [Using public APIs for environment].
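As a minimal sketch of what API-based management can look like, the snippet below reads an environment’s published Spark compute settings through the Fabric REST API. The endpoint path and authentication details shown here are assumptions for illustration; the linked article and the official API reference are the authoritative sources.

```python
import requests

# Assumptions for illustration: replace the placeholders with your own values
# and verify the endpoint path against the official Fabric REST API reference.
WORKSPACE_ID = "<workspace-id>"
ENVIRONMENT_ID = "<environment-id>"
TOKEN = "<azure-ad-access-token>"

url = (
    "https://api.fabric.microsoft.com/v1/workspaces/"
    f"{WORKSPACE_ID}/environments/{ENVIRONMENT_ID}/sparkcompute"
)

# Retrieve the environment's published Spark compute configuration.
response = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
response.raise_for_status()
print(response.json())
```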
