An exciting announcement has come out of Microsoft Build in the form of Microsoft Fabric, a unified platform that brings together data engineering, data warehousing, data science, real-time analytics, and Power BI. It is an evolution of Azure Synapse Analytics and Power BI from their current forms. You can read the announcement here.
- Trial: Microsoft Fabric
- Community page: Home - Microsoft Fabric Community
- Ideas: Home (microsoft.com)
- Product website: Introducing Microsoft Fabric: The Data Platform for the AI Age | Azure Blog | Microsoft Azure
- Documentation: Microsoft Fabric Documentation - Microsoft Fabric | Microsoft Learn
- Blog: Microsoft Fabric Blog
- Learn modules: Getting started with Microsoft Fabric - Training | Microsoft Learn
- End-to-End Scenario Tutorials: End-to-End Tutorials in Microsoft Fabric - Microsoft Fabric | Microsoft Learn
I've been lucky enough to be in private preview for the past few months and have been keeping up with Fabric and its various features, so I wanted to put together a starter blog to explain how to enable Fabric and create a (very basic) solution. There is so much to Fabric and I can't do it justice in a single blog post, but I hope this is helpful to get you started.
Keep in mind that Microsoft Fabric, like any data platform, still needs to be implemented with due diligence around architecture, development practices, security, and governance.
There will be a lot of information about Fabric in the coming days and weeks from Microsoft and the wider data community, so stay tuned as there is a lot to cover. But if you're like me and love to dive in, play, and look at the details later, good news: you can activate Fabric and start a 60-day trial in a Power BI tenant that has only a Pro license. With just Pro, you can start for free. Note that Fabric is disabled by default... probably a good thing, as I'm sure there are many Power BI admins out there who don't want all of these features enabled by default.
After the 60-day trial, a Fabric capacity must be purchased (if you don't already have a Power BI Premium capacity). This can be done via the Azure portal, and capacity starts at around £260 per month (in my currency, UK pounds).
In this blog, we'll take a look at getting started with Microsoft Fabric, building a Lakehouse, and then using the SQL endpoint to create a Power BI report. There's a lot to cover and there will be plenty more on Fabric concepts in the coming days and weeks, so stay tuned!
The first thing you need to do is sign in to your existing Power BI tenant at https://app.powerbi.com and click Settings in the top right, then choose the Admin portal under the Governance and insights section.
From here, choose Tenant settings and you should see Microsoft Fabric at the top of the settings area. Expand the option and enable it. I would suggest choosing Specific security groups and adding an Active Directory group containing only the users you want to have access.
You have now enabled the Fabric features, but you need actual capacity to run them. You can activate a 60-day trial by enabling Help and support settings > Users can try Microsoft Fabric paid features. Once again, I would suggest choosing Specific security groups and adding appropriate AD groups containing the users who should have access.
Once enabled, if you return to the Power BI home page you should see a new icon in the left menu called OneLake data hub; you can click here to display a page of relevant objects. In the bottom left corner you will now see an icon (probably Power BI); if you click on this icon you can switch to another "persona". This is core to working in Fabric: being able to switch between different personas for different workloads. In this example we'll choose Data Warehouse.
You can also access Fabric using the URL https://app.fabric.microsoft.com/, where you will be presented with a Microsoft Fabric introduction page. In the bottom left corner you can switch between the different personas. It's actually just a redirect to app.powerbi.com.
Creating a Fabric-enabled workspace
Now we are going to create a new workspace in the tenant by clicking Workspaces and then + New workspace. In the Workspace settings we can check that the workspace is assigned to the Trial license, which gives us access to a trial capacity for 60 days.
Create a new Lakehouse and load data
In the new workspace, click New in the upper left corner and select Lakehouse (Preview).
Give the new Lakehouse a name and click Create.
Once the new Lakehouse has been created, it opens in Lakehouse mode (the top right will show Lakehouse; we'll switch to another mode shortly...). From this area you can see all the files and folders contained in the Lakehouse. We don't have anything here yet, as it's a new Lakehouse.
What we need to do is upload a CSV file to the Lakehouse (remember, we can load files into the Lakehouse via pipelines, dataflows, Spark, etc., but for this simple tutorial we'll just upload a file directly).
Click Get data > Upload files and select a file; in my case it's a CSV file that contains web telemetry data.
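If you don't have a telemetry file to hand, a small Python script can generate a stand-in to upload. The column names below are my own invention for illustration, not the columns of the file used in this walkthrough:

```python
import csv

# Generate a small stand-in web-telemetry CSV to upload to the Lakehouse.
# Column names are illustrative only -- use whatever your real data has.
rows = [
    {"timestamp": "2023-05-23T10:00:00Z", "page": "/home", "user_id": "u1", "duration_ms": 340},
    {"timestamp": "2023-05-23T10:00:05Z", "page": "/pricing", "user_id": "u2", "duration_ms": 1200},
    {"timestamp": "2023-05-23T10:00:09Z", "page": "/home", "user_id": "u3", "duration_ms": 85},
]

with open("web_telemetry.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
```

Any well-formed CSV with a header row will do; the upload step itself doesn't care about the schema.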
Create Delta Lake Table in Lakehouse
Now that the CSV file is uploaded, I can convert it to a Delta table by right-clicking on the file and selecting Load to Tables. Delta (Parquet-compatible) is the storage format for tables in the Lakehouse.
Choose a name for the table and click Confirm.
If everything is successful, you will see the new table name on the left side. If you click on the table name, a data sample should appear on the right hand side.
Query Delta table using Spark
What we can do now is query the data using a Spark notebook. If you click the ellipsis next to the table and choose Open in notebook > New notebook, a new empty Spark notebook is created in the workspace.
You can now click the ellipsis next to the table and select Load data > Spark, and auto-generated Spark code is created; run it and it starts a Spark session and returns the table results.
When you run the Spark notebook, you will be able to see the results of the newly loaded table.
Using SQL Endpoint to query and model data
Now let's get the SQL endpoint up and running! This is the feature that allows us to create reports using Power BI, and it is also accessible from external client tools such as SSMS and Azure Data Studio. You can access the SQL endpoint by opening the workspace and clicking the name of the SQL endpoint; it will have the same name as the Lakehouse you created earlier.
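The endpoint speaks SQL, so the queries you'd run from SSMS or Azure Data Studio are ordinary SELECT statements. As a rough local illustration (using Python's built-in sqlite3 in place of a real Fabric connection, with made-up table and column names), the kind of aggregate you might run looks like this:

```python
import sqlite3

# Local stand-in only: a real Fabric SQL endpoint would be reached from
# SSMS / Azure Data Studio (or a client library) via its connection string.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE web_telemetry (page TEXT, duration_ms INTEGER)")
conn.executemany(
    "INSERT INTO web_telemetry VALUES (?, ?)",
    [("/home", 340), ("/pricing", 1200), ("/home", 85)],
)

# The sort of query you might run against the endpoint:
rows = conn.execute(
    "SELECT page, COUNT(*) AS hits, AVG(duration_ms) AS avg_ms "
    "FROM web_telemetry GROUP BY page ORDER BY hits DESC"
).fetchall()
print(rows)
```

The point is simply that the Lakehouse tables are queryable with standard SQL from any tool that can reach the endpoint.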
Within the SQL endpoint you will see three tabs at the bottom: Data, Query, and Model (yes, it looks a lot like Datamarts...). In the Data tab we can see the following objects:
- Views
- Stored Procedures
By the way, on the top right, you'll be able to switch between SQL endpoint and Lakehouse mode by clicking the dropdown menu and selecting one of the modes.
Click on Model to show a very familiar Power BI modeling experience; we can perform basic modeling activities here. When the Lakehouse was created, it automatically created the SQL endpoint and also a Power BI dataset.
I created a measure using the COUNTROWS function and used the formatting options at the bottom right to format the measure (thousands separator).
Create a Power BI report
Then I clicked Report in the top left and the report canvas appeared, allowing me to create a (fairly basic) visualization of the data. The interesting thing here is that there's no import mode for this Power BI dataset; it queries the SQL endpoint directly using the new Direct Lake mode (import-mode performance with DirectQuery-style latency). You can also use the SQL endpoint in import mode and DirectQuery if you create a data model in Power BI Desktop (Direct Lake is not supported there).
We can save the report, and if we go back to the workspace there will be more artifacts now: the Lakehouse, SQL endpoint, dataset, notebook (to query the table), and the new report.
By switching to lineage view at the top right, there is now a view of how everything is connected: from the Lakehouse to the SQL endpoint, dataset, and report.
There is much more to come, so dive in! I hope this very basic guide has been helpful. Feel free to reach out.