Introducing Enhanced Capabilities in Eventstream: Derived Streams, Edit modes, and Smart Routing
We are pleased to announce a set of new enhancements in Fabric Eventstream that improve your experience of building data streaming applications. Our latest update introduces three new features: Edit and Live modes, Default and Derived Streams, and Smart Routing. Together, these additions give developers greater flexibility and efficiency in managing, processing, and analyzing real-time data streams.
A Glimpse into the New Features
Edit and Live Modes
Eventstream offers two distinct modes, Edit and Live, to give you flexibility and control over how you develop your eventstream. Edit mode lets you modify your data streaming flow without disrupting the ongoing data streams. It also prevents test data from being streamed to your destinations, providing a safe environment for experimentation. Live mode, on the other hand, provides real-time visibility into the data flow, enabling you to oversee the ingestion, processing, and distribution of data streams within Fabric. You can easily switch between the two modes using the button in the top-right corner.
Default and Derived Streams
A data stream is a dynamic, continuous flow of data that lets you set up real-time alerts, feed various types of data stores, and enable a publish/subscribe model. With the introduction of Default and Derived Streams, you can now create and manage multiple data streams within Eventstream and surface them in the Real-Time Hub for further analysis. Here are the differences between the two types of streams:
- Default stream: Automatically generated when a streaming source is added to Eventstream. A default stream captures raw event data directly from the source, ready for transformation or analysis.
- Derived stream: A specialized stream that users can create as a destination within Eventstream. A derived stream is created after a series of operations such as filtering and aggregating, and is then ready for further consumption or analysis by other users in the organization through the Real-Time Hub.
The following example shows that when a new eventstream is created, a default stream, alex-demo-stream, is automatically generated. A derived stream, dstream, is then added after an aggregation operation. Both the default and derived streams are accessible in the Real-Time Hub.
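Conceptually, a default stream carries raw events as they arrive, while a derived stream is the output of operations applied to them. The following Python sketch illustrates that relationship; the event fields and function are hypothetical, not the actual Bicycles schema or an Eventstream API (Eventstream itself is configured in the Fabric editor):

```python
# Default stream: raw events straight from the source (illustrative schema).
raw_events = [
    {"street": "Main St", "no_bikes": 0},
    {"street": "Oak Ave", "no_bikes": 5},
    {"street": "Main St", "no_bikes": 3},
]

def aggregate_by_street(events):
    """An aggregation operation; its output would feed a derived stream."""
    totals = {}
    for e in events:
        totals[e["street"]] = totals.get(e["street"], 0) + e["no_bikes"]
    return totals

# Derived stream: the result of the aggregation, ready for consumers.
derived_stream = aggregate_by_street(raw_events)
print(derived_stream)  # {'Main St': 3, 'Oak Ave': 5}
```

Other users in the organization would then consume the derived stream through the Real-Time Hub, without needing access to the raw default stream.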
Smart Routing: Content-based Routing
Customers can now perform stream operations directly in Eventstream's Edit mode, transforming the way real-time data streams are processed and routed. This enhancement lets you design stream processing logic and route data streams based on their content, all within the Eventstream editor.
The following screenshot shows a scenario where a single data stream source is processed and routed to three distinct destinations based on different stream content. One destination, a KQL database, stores raw data for archival purposes. A second KQL database retains filtered data streams, ensuring that only relevant information is routed for further analysis. Finally, the Lakehouse serves as a repository for aggregated values.
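The routing logic in that scenario can be sketched in Python as a set of predicates, one per destination, where each event is delivered to every destination whose condition it matches. The destination names and event schema below are illustrative assumptions, not Eventstream configuration:

```python
# Minimal sketch of content-based routing: an event is sent to every
# destination whose predicate matches it.

def route(event, routes):
    """Return the names of all destinations that should receive the event."""
    return [name for name, predicate in routes.items() if predicate(event)]

routes = {
    "kql_raw": lambda e: True,                     # archive everything
    "kql_filtered": lambda e: e["no_bikes"] == 0,  # only relevant (empty) streets
    "lakehouse_agg": lambda e: "total" in e,       # pre-aggregated records only
}

print(route({"street": "Main St", "no_bikes": 0}, routes))
# ['kql_raw', 'kql_filtered']
```

Note that routing is not exclusive: a single event can fan out to multiple destinations, which is how the raw-archive KQL database receives everything while the filtered database receives only a subset.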
Add and process a data stream in Fabric Eventstream
In the following guide, we’ll use bike-sharing sample data as a practical example to illustrate how to design a data streaming flow in Eventstream. You’ll learn to capture, transform, and route real-time bike data streams by taking full advantage of Eventstream’s enhanced capabilities.
1. Begin by switching your Power BI experience to Real-Time Intelligence, and select Eventstream to create a new one. Make sure the Enhanced Capabilities box is checked.
2. Click Use sample data, name the source, and select Bicycles from the available options. This adds real-time bike-sharing data to your eventstream.
3. Select Add destination from the menu bar and choose Lakehouse as the destination for storing the raw bike data. Make sure you enter the correct information for your lakehouse.
4. Now, let’s add a transformation to your data stream. Select Transform events and choose the Aggregation operation. Configure it to calculate the total number of bikes on the streets every 5 minutes, then route the aggregated data stream to a KQL database for further analysis.
5. To filter for streets with no bikes, add a Filter operation with the condition “No_Bikes equals 0”. Then add a derived stream for this filtered data so it’s ready for consumption in the Real-Time Hub.
6. Finally, publish the eventstream to initiate the processing and routing of your data streams. You can then monitor and interact with the streams in the Real-Time Hub.
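The two operations in the walkthrough above — the 5-minute aggregation and the No_Bikes filter — can be sketched in Python to show what they compute. The timestamps and field names here are illustrative assumptions; in Eventstream you configure these operations visually rather than in code:

```python
from collections import defaultdict

def tumbling_window_totals(events, window_seconds=300):
    """Sum No_Bikes per 5-minute (300 s) tumbling window, keyed by window start."""
    totals = defaultdict(int)
    for e in events:
        window_start = e["timestamp"] - (e["timestamp"] % window_seconds)
        totals[window_start] += e["No_Bikes"]
    return dict(totals)

def empty_streets(events):
    """The filter condition from the walkthrough: No_Bikes equals 0."""
    return [e for e in events if e["No_Bikes"] == 0]

events = [
    {"timestamp": 0,   "street": "Main St", "No_Bikes": 2},
    {"timestamp": 120, "street": "Oak Ave", "No_Bikes": 0},
    {"timestamp": 360, "street": "Main St", "No_Bikes": 4},
]
print(tumbling_window_totals(events))  # {0: 2, 300: 4}
print(empty_streets(events))
# [{'timestamp': 120, 'street': 'Oak Ave', 'No_Bikes': 0}]
```

Each window's total would land in the KQL database, while the filtered empty-street events feed the derived stream published to the Real-Time Hub.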
Learn more, and help us with your feedback
To find out more about Real-Time Intelligence, read Yitzhak Kesselman’s announcement. As we launch our preview, we’d love to hear what you think and how you’re using the product. The best way to get in touch is through our community forum or by submitting an idea. For detailed how-tos, tutorials, and other resources, check out the documentation.
This is part of a series of blog posts that dive into all the capabilities of Real-Time Intelligence. Stay tuned for more!