Microsoft Fabric interactive exercises
Microsoft Fabric is a unified analytics platform that brings together data engineering, data warehousing, real-time intelligence, data science, and business intelligence in one integrated software as a service (SaaS) experience.
These interactive exercises give you practical experience with Fabric’s core capabilities so you can build confidence and prepare for real-world projects and certification exams.
Get started with Fabric (8 exercises)
Create a Microsoft Fabric Lakehouse (30 minutes)
In this lab, you'll create a Microsoft Fabric lakehouse and import data into it. You'll explore how lakehouses combine file and table storage in OneLake, and learn to query managed delta tables using SQL.
Analyze data in a data warehouse (30 minutes)
You'll create tables in a Microsoft Fabric data warehouse, load data using SQL, and query dimensional models with T-SQL joins and aggregations. You'll also use the visual query designer and optionally define relationships in a data model for downstream analytics.
Get started with Real-Time Intelligence in Microsoft Fabric (30 minutes)
In this lab, you'll learn to work with real-time data streams in Microsoft Fabric by ingesting stock market data using eventstreams, querying it with KQL, and visualizing it on real-time dashboards. You'll also configure alerts using Activator to respond to real-time events.
Get started with data science in Microsoft Fabric (20 minutes)
In this lab, you'll learn how to work with notebooks and Data Wrangler to explore and transform data, then train and compare both regression and classification machine learning models. You'll use MLflow to track your experiments and save the best performing model, gaining practical skills in the core data science capabilities of Microsoft Fabric.
Design a semantic model for scale (30 minutes)
In this lab, you'll design a semantic model for scale in the Microsoft Fabric service. You'll connect to lakehouse data using Direct Lake, build star schema relationships, create a calculation group for time intelligence, and configure settings that support large datasets and concurrent consumption.
Work with SQL Database in Microsoft Fabric (30 minutes)
In this lab, you'll create and query a SQL database in Microsoft Fabric, integrate external data sources, and secure data using views and role-based access control.
Create an ontology with Fabric IQ (40 minutes)
In this lab, you'll manually build a complete Fabric IQ ontology by creating entity types with properties and keys, defining relationships, and binding data from lakehouse tables and eventhouse streams. You'll work hands-on with both static and time-series data to model a healthcare scenario including hospitals, departments, patients, and vital sign monitoring equipment.
Discover and connect to data in OneLake (30 minutes)
In this lab, you'll discover data assets through the OneLake catalog, create shortcuts to access data across workspaces without copying it, and query lakehouse data using SQL analytics endpoints. You'll also create and explore a semantic model, gaining hands-on experience with Microsoft Fabric's data discovery and connectivity capabilities.
Fabric IQ (4 exercises)
Create an ontology with Fabric IQ (40 minutes)
In this lab, you'll manually build a complete Fabric IQ ontology by creating entity types with properties and keys, defining relationships, and binding data from lakehouse tables and eventhouse streams. You'll work hands-on with both static and time-series data to model a healthcare scenario including hospitals, departments, patients, and vital sign monitoring equipment.
Build an ontology from a semantic model in Fabric IQ (45 minutes)
In this lab, you generate a Fabric IQ ontology from a Power BI semantic model using healthcare data, configure entity types and relationships with data bindings, and combine static data from a lakehouse with time-series data from an eventhouse.
Visualize ontology data with Microsoft Fabric IQ (30 minutes)
In this lab, you visualize entity instances and relationships using the ontology preview experience. You work with the Lamna Healthcare ontology to see how your data comes to life through interactive graphs, charts, and relationship visualizations.
Build a Fabric data agent with an ontology (30 minutes)
In this lab, you create a Fabric data agent that uses a Lamna Healthcare ontology as its data source. You configure agent instructions, test natural language queries, and publish the agent for colleagues to use.
Data engineering (10 exercises)
Analyze data with Apache Spark (45 minutes)
In this lab, you'll ingest data into a Fabric lakehouse and use Apache Spark to read and analyze it. You'll work with PySpark to load files, explore data, and perform analysis using Spark notebooks.
Use delta tables in Apache Spark (45 minutes)
In this lab, you'll create Delta tables in a Microsoft Fabric lakehouse and explore data using SQL queries. You'll work with the Delta Lake format to support relational semantics for both batch and streaming data operations.
Create a medallion architecture in a Microsoft Fabric lakehouse (45 minutes)
You'll implement a medallion architecture by transforming raw data through bronze, silver, and gold layers using PySpark and Delta Lake. You'll create a star schema with dimensions and fact tables, then explore and model the data for analytics and reporting.
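The bronze/silver/gold progression can be illustrated with a tiny pandas sketch, standing in for the PySpark and Delta Lake steps in the lab. The records and quality rules below are hypothetical.

```python
import pandas as pd

# Hypothetical raw ("bronze") sales records, as they might land from ingestion.
bronze = pd.DataFrame({
    "order_id": [1, 1, 2, 3],          # order 1 is duplicated on purpose
    "customer": ["Ana", "Ana", "Ben", None],
    "amount":   ["100", "100", "250", "75"],
})

# Silver: deduplicate, drop rows failing basic quality checks, fix types.
silver = (bronze.drop_duplicates(subset="order_id")
                .dropna(subset=["customer"])
                .assign(amount=lambda d: d["amount"].astype(float)))

# Gold: business-level aggregate ready for analytics and reporting.
gold = silver.groupby("customer", as_index=False)["amount"].sum()
print(gold)
```

Each layer would be persisted as its own Delta table in the lakehouse; only the cleansing-then-aggregating shape is shown here.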
Create and use Dataflows (Gen2) in Microsoft Fabric (30 minutes)
In this lab, you will create a Dataflow (Gen2) to extract data from a CSV file, perform transformations in Power Query Online, and load the data into a lakehouse. You will also learn how to orchestrate dataflows by integrating them into a data pipeline.
Load data into a warehouse using T-SQL (30 minutes)
In this lab, you'll learn how to load data into a Microsoft Fabric data warehouse using T-SQL. You'll create fact and dimension tables, build a stored procedure that uses cross-database queries to load data from a lakehouse into your warehouse, and run analytical queries to validate the data.
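The stage-then-load pattern the lab uses can be sketched with SQLite as a stand-in for the warehouse's T-SQL (all table and column names here are invented): distinct values populate the dimension, and the fact table is loaded through a join back to it.

```python
import sqlite3

# SQLite stand-in for the T-SQL pattern: stage raw rows, then load a
# dimension and a fact table from them.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE staging_sales(order_id INT, product TEXT, amount REAL);
INSERT INTO staging_sales VALUES (1,'Bike',120.0),(2,'Helmet',35.0),(3,'Bike',60.0);

CREATE TABLE dim_product(product_key INTEGER PRIMARY KEY, product TEXT UNIQUE);
CREATE TABLE fact_sales(order_id INT, product_key INT, amount REAL);

-- Load the dimension with distinct products, then the fact via a join.
INSERT INTO dim_product(product) SELECT DISTINCT product FROM staging_sales;
INSERT INTO fact_sales
SELECT s.order_id, d.product_key, s.amount
FROM staging_sales s JOIN dim_product d ON s.product = d.product;
""")

# Analytical query to validate the load.
rows = list(con.execute("""
    SELECT d.product, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d ON f.product_key = d.product_key
    GROUP BY d.product ORDER BY d.product"""))
print(rows)
```

In the lab this logic lives in a stored procedure that uses cross-database queries against the lakehouse instead of a staging table.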
Work smarter with Copilot in Microsoft Fabric Dataflow Gen2 (30 minutes)
In this lab, you'll learn to use Copilot in Microsoft Fabric Dataflow Gen2 to accelerate data transformation tasks using natural language prompts. You'll clean and transform retail store data by removing unwanted characters, parsing XML content, creating categorical groupings, and setting appropriate data types—all with AI assistance. By the end, you'll understand how to leverage Copilot to streamline common data engineering workflows and improve productivity.
Analyze data with Apache Spark and Copilot in Microsoft Fabric notebooks (30 minutes)
In this lab, you'll use Copilot in Fabric notebooks to generate Apache Spark code from natural language prompts, transforming raw population data through cleaning, filtering, and reshaping operations, then creating visualizations to explore the results. You'll learn how AI-assisted coding accelerates data engineering by letting you focus on what you want to accomplish rather than how to code it.
Discover and connect to data in OneLake (30 minutes)
In this lab, you'll discover data assets through the OneLake catalog, create shortcuts to access data across workspaces without copying it, and query lakehouse data using SQL analytics endpoints. You'll also create and explore a semantic model, gaining hands-on experience with Microsoft Fabric's data discovery and connectivity capabilities.
Transform data using dataflows in Microsoft Fabric (30 minutes)
In this lab, you create a Dataflow Gen2 to connect to sample sales data, apply Power Query transformations to clean and shape the data, and load the results to a lakehouse table. You practice common data preparation tasks including filtering rows, removing columns, changing data types, renaming columns, and creating calculated columns.
Transform data with notebooks in Microsoft Fabric (30 minutes)
In this lab, you clean raw sales data in a Fabric notebook, join multiple tables, apply aggregations and window functions, and write results to Delta tables in a lakehouse.
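The join, aggregate, and window steps can be sketched in pandas (as a stand-in for the Spark DataFrame API the notebook uses); the tables below are made up.

```python
import pandas as pd

# Hypothetical sales and product tables.
sales = pd.DataFrame({"product_id": [1, 1, 2], "amount": [100.0, 50.0, 80.0]})
products = pd.DataFrame({"product_id": [1, 2], "name": ["Bike", "Helmet"]})

# Join the two tables.
joined = sales.merge(products, on="product_id")

# Aggregate: total revenue per product.
totals = joined.groupby("name", as_index=False)["amount"].sum()

# Window-style calculation: each sale's share of its product's total.
joined["share"] = joined["amount"] / joined.groupby("name")["amount"].transform("sum")
print(totals)
print(joined[["name", "amount", "share"]])
```

In Spark the last step would use `Window.partitionBy(...)`; `groupby(...).transform(...)` is the pandas analogue.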
Data warehouse (7 exercises)
Analyze data in a data warehouse (30 minutes)
You'll create tables in a Microsoft Fabric data warehouse, load data using SQL, and query dimensional models with T-SQL joins and aggregations. You'll also use the visual query designer and optionally define relationships in a data model for downstream analytics.
Load data into a warehouse using T-SQL (30 minutes)
In this lab, you'll learn how to load data into a Microsoft Fabric data warehouse using T-SQL. You'll create fact and dimension tables, build a stored procedure that uses cross-database queries to load data from a lakehouse into your warehouse, and run analytical queries to validate the data.
Query a data warehouse in Microsoft Fabric (30 minutes)
In this lab, you will query a Microsoft Fabric data warehouse using SQL to analyze data, join tables, and aggregate results. You will also verify data consistency by identifying and handling anomalies, and create views to filter data for specific reporting needs.
Monitor a data warehouse in Microsoft Fabric (30 minutes)
In this lab, you will learn how to monitor activity and queries in a Microsoft Fabric data warehouse. You'll use dynamic management views (DMVs) to track connections, sessions, and running requests, and explore query insights to analyze query history and performance.
Secure data in a data warehouse (45 minutes)
In this lab, you'll learn to secure sensitive data in a Fabric data warehouse using multiple security techniques. You'll apply dynamic data masking to obscure confidential information, implement row-level and column-level security to restrict data access, and use SQL DCL commands (GRANT, DENY, REVOKE) to control granular permissions on database objects.
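To make the masking idea concrete, here is the effect dynamic data masking produces, sketched in plain Python. This is illustrative logic only: in the lab, masking is declared in T-SQL on the column itself (for example with the built-in `email()` masking function), not implemented in application code.

```python
# Illustrative masking logic only; the warehouse applies this declaratively
# per column, so unprivileged queries see masked values automatically.
def mask_email(email: str) -> str:
    """Keep the first character and the domain, hide the rest."""
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain

print(mask_email("contoso.user@example.com"))
```

The point of the declarative approach is that no query or report needs changing: the mask is enforced by the engine based on the caller's permissions.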
Use Copilot in Microsoft Fabric data warehouse (30 minutes)
You'll learn to use Copilot in Microsoft Fabric to generate, refine, and troubleshoot SQL queries using natural language prompts. Through hands-on practice with a retail sales data warehouse, you'll experience how AI can accelerate writing complex SQL queries, automatically fix syntax errors, and create views for data analysis.
Transform data in a Fabric data warehouse using T-SQL (45 minutes)
Write T-SQL queries to filter, join, and aggregate staging data in a Fabric data warehouse. Create views and stored procedures for reusable logic, and build dimensional tables for analytics.
Real-Time Intelligence (5 exercises)
Get started with Real-Time Intelligence in Microsoft Fabric (30 minutes)
In this lab, you'll learn to work with real-time data streams in Microsoft Fabric by ingesting stock market data using eventstreams, querying it with KQL, and visualizing it on real-time dashboards. You'll also configure alerts using Activator to respond to real-time events.
Ingest real-time data with Eventstream in Microsoft Fabric (30 minutes)
In this lab, you'll learn how to ingest and analyze real-time data in Microsoft Fabric by creating an eventstream that captures sample bicycle data, loads it into an eventhouse database, and transforms it using aggregations. You'll configure data sources and destinations, then query both raw and transformed data using KQL to gain insights from streaming events.
Use Activator in Microsoft Fabric (30 minutes)
In this lab, you'll learn how to use Activator in Microsoft Fabric to monitor real-time data streams and create alert rules that trigger actions based on data conditions. You'll work with eventstreams, define objects and properties, set up conditional triggers with filters, and configure automated email notifications when specific thresholds are exceeded.
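The rule logic behind such a trigger can be sketched in a few lines. Note that Activator defines this declaratively (object, property, condition, action) rather than in code; the events and threshold below are hypothetical.

```python
# Illustrative threshold rule: flag readings that exceed a limit, the way an
# Activator condition would before firing its configured action (e.g. email).
def check_events(events, threshold=100):
    """Return alert messages for readings above the threshold."""
    return [f"ALERT: {e['sensor']} reported {e['value']}"
            for e in events if e["value"] > threshold]

events = [{"sensor": "bike-dock-7", "value": 120},
          {"sensor": "bike-dock-2", "value": 80}]
print(check_events(events))
```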
Work with data in a Microsoft Fabric eventhouse (25 minutes)
In this lab, you'll learn to query data in a KQL database using both Kusto Query Language (KQL) and Transact-SQL. You'll practice essential data querying operations including retrieving, filtering, summarizing, and sorting data to gain insights from real-time event data stored in a Microsoft Fabric eventhouse.
Get started with Real-time Dashboards in Microsoft Fabric (25 minutes)
In this lab, you'll learn to create interactive real-time dashboards in Microsoft Fabric by building visualizations from streaming data using KQL queries. You'll configure data ingestion from an eventstream, create multiple dashboard tiles with charts and maps, implement reusable base queries and parameters for interactivity, and set up auto-refresh capabilities.
Data science (5 exercises)
Get started with data science in Microsoft Fabric (20 minutes)
In this lab, you'll learn how to work with notebooks and Data Wrangler to explore and transform data, then train and compare both regression and classification machine learning models. You'll use MLflow to track your experiments and save the best performing model, gaining practical skills in the core data science capabilities of Microsoft Fabric.
Explore data for data science with notebooks in Microsoft Fabric (30 minutes)
In this lab, you'll use notebooks in Microsoft Fabric to perform exploratory data analysis on a dataset. You'll load data into dataframes, check data structure and quality, generate descriptive statistics, create visualizations to understand distributions and relationships, and perform correlation analysis. By the end of this lab, you'll have practical experience using notebooks for interactive data exploration.
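The core exploration calls look like this in pandas; the dataset below is invented, while in the lab the same calls run against data loaded from the lakehouse.

```python
import pandas as pd

# Small made-up dataset standing in for the lab's data.
df = pd.DataFrame({
    "age":    [23, 35, 46, 52, 29],
    "income": [30_000, 52_000, 61_000, 75_000, 40_000],
})

df.info()                            # structure and null counts
print(df.describe())                 # descriptive statistics
print(df.corr(numeric_only=True))    # correlation analysis
```

Visualizations (histograms, scatter plots) would follow the same pattern with `df.plot(...)` or a plotting library in the notebook.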
Preprocess data with Data Wrangler in Microsoft Fabric (30 minutes)
You'll use Data Wrangler to perform common data preprocessing tasks like formatting text, filtering, sorting, and aggregating data, while automatically generating reusable Python code. This hands-on experience will help you prepare data for machine learning models efficiently without writing code from scratch.
Train and track machine learning models with MLflow in Microsoft Fabric (25 minutes)
In this lab, you'll train regression models using scikit-learn to predict diabetes measures and use MLflow to track, compare, and evaluate model performance. You'll learn how to search experiments, visualize metrics, and save the best performing model for future use.
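The train-and-compare step can be sketched with scikit-learn's bundled diabetes dataset. The MLflow calls (`mlflow.start_run`, `mlflow.log_metric`, logging the model) would wrap this loop in the lab; only the modeling and comparison are shown here.

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Train two regressors on the diabetes data and compare test-set R² scores.
X, y = load_diabetes(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

scores = {}
for name, model in [("linear", LinearRegression()),
                    ("boosted", GradientBoostingRegressor(random_state=42))]:
    model.fit(X_tr, y_tr)
    scores[name] = r2_score(y_te, model.predict(X_te))
print(scores)
```

With MLflow tracking enabled, each iteration of the loop would become a run whose parameters and metrics you can search and compare in the experiment UI.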
Generate batch predictions using a deployed model in Microsoft Fabric (20 minutes)
You train and register a machine learning model, then use it to generate batch predictions on test data stored in a lakehouse. This lab demonstrates applying a deployed model to score new patient data and save the results.
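The register-then-score flow can be sketched as follows; `pickle` stands in for Fabric's MLflow-based model registry, and the "new" batch is simply held-out rows.

```python
import pickle

from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge

# Train on part of the data, persist the model, reload it, score the rest.
X, y = load_diabetes(return_X_y=True)
model = Ridge().fit(X[:400], y[:400])

blob = pickle.dumps(model)                 # "register" the model
restored = pickle.loads(blob)              # load it for scoring
batch_scores = restored.predict(X[400:])   # batch prediction on unseen rows
print(len(batch_scores))
```

In the lab, the saved results would then be written back to a lakehouse table rather than kept in memory.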
Semantic models (6 exercises)
Create DAX calculations in semantic models (45 minutes)
You'll learn to enhance semantic models by creating calculated tables, calculated columns, and measures using DAX. Through hands-on practice, you'll build a date table, control sorting behavior, organize measures into display folders, and create calculations that aggregate and compare data effectively.
Design a semantic model for scale (30 minutes)
In this lab, you'll design a semantic model for scale in the Microsoft Fabric service. You'll connect to lakehouse data using Direct Lake, build star schema relationships, create a calculation group for time intelligence, and configure settings that support large datasets and concurrent consumption.
Optimize semantic model performance (30 minutes)
Use Performance analyzer in Power BI Desktop to diagnose report visual performance, identify expensive DAX patterns, apply optimizations, and verify improvements.
Enforce semantic model security (30 minutes)
Import a security mapping table, configure relationship filter propagation, and implement static and dynamic row-level security in a Power BI semantic model.
Design and implement a dimensional model (30 minutes)
In this lab, you design and implement a star schema dimensional model in a Fabric Warehouse, creating fact and dimension tables with foreign key relationships. You run analytical queries to aggregate sales data across multiple dimensions, and implement slowly changing dimension (SCD) Type 1 and Type 2 patterns to handle data that changes over time.
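The SCD Type 2 pattern can be sketched with SQLite as a stand-in for the warehouse T-SQL: when an attribute changes, the current row is closed out and a new current version is inserted. The table and dates below are invented.

```python
import sqlite3

# Minimal SCD Type 2 sketch: validity dates plus a current-row flag.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer(
    customer_id INT, city TEXT, valid_from TEXT, valid_to TEXT, is_current INT)""")
con.execute("INSERT INTO dim_customer VALUES (1,'Seattle','2023-01-01',NULL,1)")

def apply_scd2(con, customer_id, new_city, change_date):
    # Close the current version, then insert the new one as current.
    con.execute("""UPDATE dim_customer SET valid_to=?, is_current=0
                   WHERE customer_id=? AND is_current=1""",
                (change_date, customer_id))
    con.execute("INSERT INTO dim_customer VALUES (?,?,?,NULL,1)",
                (customer_id, new_city, change_date))

apply_scd2(con, 1, "Portland", "2024-06-01")
rows = list(con.execute(
    "SELECT city, valid_from, valid_to, is_current FROM dim_customer ORDER BY valid_from"))
print(rows)
```

SCD Type 1, by contrast, would simply `UPDATE` the city in place, losing the history that Type 2 preserves.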
Prepare a semantic model for AI in Microsoft Fabric (30 minutes)
Configure Prep for AI features in Power BI Desktop, add synonyms via linguistic modeling, publish to a Fabric workspace, test with Copilot and HCAAT diagnostics, and mark the model as Approved for Copilot.
Fabric SQL database (2 exercises)
Work with SQL Database in Microsoft Fabric (30 minutes)
In this lab, you'll create and query a SQL database in Microsoft Fabric, integrate external data sources, and secure data using views and role-based access control.
Work with API for GraphQL in Microsoft Fabric (30 minutes)
In this lab, you create an API for GraphQL in Microsoft Fabric to expose data from a SQL database. You learn how to configure a GraphQL endpoint, connect it to a data source, and query the data using GraphQL syntax with filters.
Security and governance (4 exercises)
Secure data in a data warehouse (45 minutes)
In this lab, you'll learn to secure sensitive data in a Fabric data warehouse using multiple security techniques. You'll apply dynamic data masking to obscure confidential information, implement row-level and column-level security to restrict data access, and use SQL DCL commands (GRANT, DENY, REVOKE) to control granular permissions on database objects.
Enforce semantic model security (30 minutes)
Import a security mapping table, configure relationship filter propagation, and implement static and dynamic row-level security in a Power BI semantic model.
Secure data access in Microsoft Fabric (45 minutes)
In this lab, you'll learn how to secure data in Microsoft Fabric using a multi-layer security approach. You'll apply workspace roles to control access to entire workspaces, configure item-level permissions to restrict access to specific Fabric items, and create OneLake data access roles to grant granular permissions on specific folders and tables within a lakehouse.
Govern analytics data in Microsoft Fabric (30 minutes)
Create analytics assets in Microsoft Fabric and apply governance practices including endorsement, documentation, and lineage analysis to make data assets trustworthy and discoverable.
Operations and lifecycle management (5 exercises)
Ingest data with a pipeline in Microsoft Fabric (45 minutes)
In this lab, you'll create data pipelines to ingest data from external sources into a lakehouse, and integrate Spark notebooks to transform and load the data into tables. You'll learn how to combine Copy Data activities with custom Spark transformations to build reusable ETL processes in Microsoft Fabric.
Monitor a data warehouse in Microsoft Fabric (30 minutes)
In this lab, you will learn how to monitor activity and queries in a Microsoft Fabric data warehouse. You'll use dynamic management views (DMVs) to track connections, sessions, and running requests, and explore query insights to analyze query history and performance.
Monitor Fabric activity in the monitoring hub (30 minutes)
You'll learn how to use the monitoring hub in Microsoft Fabric to track and view activity for various items including dataflows and Spark notebooks. You'll explore run histories, apply filters, and customize views to effectively monitor your Fabric workspace activities.
Implement deployment pipelines in Microsoft Fabric (20 minutes)
In this lab, you'll learn to implement CI/CD practices in Microsoft Fabric using deployment pipelines. You'll configure a multi-stage pipeline and deploy content across development, test, and production environments, gaining hands-on experience automating content promotion in Fabric.
Manage the semantic model lifecycle (45 minutes)
Validate a semantic model with SemPy in a Fabric notebook, create a deployment pipeline to promote content across stages, and verify deployed content in a production workspace.
Copilot in Fabric (4 exercises)
Work smarter with Copilot in Microsoft Fabric Dataflow Gen2 (30 minutes)
In this lab, you'll learn to use Copilot in Microsoft Fabric Dataflow Gen2 to accelerate data transformation tasks using natural language prompts. You'll clean and transform retail store data by removing unwanted characters, parsing XML content, creating categorical groupings, and setting appropriate data types—all with AI assistance. By the end, you'll understand how to leverage Copilot to streamline common data engineering workflows and improve productivity.
Analyze data with Apache Spark and Copilot in Microsoft Fabric notebooks (30 minutes)
In this lab, you'll use Copilot in Fabric notebooks to generate Apache Spark code from natural language prompts, transforming raw population data through cleaning, filtering, and reshaping operations, then creating visualizations to explore the results. You'll learn how AI-assisted coding accelerates data engineering by letting you focus on what you want to accomplish rather than how to code it.
Use Copilot in Microsoft Fabric data warehouse (30 minutes)
You'll learn to use Copilot in Microsoft Fabric to generate, refine, and troubleshoot SQL queries using natural language prompts. Through hands-on practice with a retail sales data warehouse, you'll experience how AI can accelerate writing complex SQL queries, automatically fix syntax errors, and create views for data analysis.
Chat with your data using Microsoft Fabric data agents (30 minutes)
In this lab, you'll create and configure a Microsoft Fabric data agent, then interact with it using plain English questions to retrieve insights from a sales warehouse. You'll learn how data agents translate natural language into SQL queries, making data analysis accessible without requiring SQL expertise. This hands-on experience demonstrates the power of AI-assisted analytics to democratize data access.
DP-700: Implement data engineering solutions using Microsoft Fabric (15 exercises)
Create a Microsoft Fabric Lakehouse (30 minutes)
In this lab, you'll create a Microsoft Fabric lakehouse and import data into it. You'll explore how lakehouses combine file and table storage in OneLake, and learn to query managed delta tables using SQL.
Analyze data with Apache Spark (45 minutes)
In this lab, you'll ingest data into a Fabric lakehouse and use Apache Spark to read and analyze it. You'll work with PySpark to load files, explore data, and perform analysis using Spark notebooks.
Use delta tables in Apache Spark (45 minutes)
In this lab, you'll create Delta tables in a Microsoft Fabric lakehouse and explore data using SQL queries. You'll work with the Delta Lake format to support relational semantics for both batch and streaming data operations.
Create a medallion architecture in a Microsoft Fabric lakehouse (45 minutes)
You'll implement a medallion architecture by transforming raw data through bronze, silver, and gold layers using PySpark and Delta Lake. You'll create a star schema with dimensions and fact tables, then explore and model the data for analytics and reporting.
Ingest data with a pipeline in Microsoft Fabric (45 minutes)
In this lab, you'll create data pipelines to ingest data from external sources into a lakehouse, and integrate Spark notebooks to transform and load the data into tables. You'll learn how to combine Copy Data activities with custom Spark transformations to build reusable ETL processes in Microsoft Fabric.
Create and use Dataflows (Gen2) in Microsoft Fabric (30 minutes)
In this lab, you will create a Dataflow (Gen2) to extract data from a CSV file, perform transformations in Power Query Online, and load the data into a lakehouse. You will also learn how to orchestrate dataflows by integrating them into a data pipeline.
Analyze data in a data warehouse (30 minutes)
You'll create tables in a Microsoft Fabric data warehouse, load data using SQL, and query dimensional models with T-SQL joins and aggregations. You'll also use the visual query designer and optionally define relationships in a data model for downstream analytics.
Load data into a warehouse using T-SQL (30 minutes)
In this lab, you'll learn how to load data into a Microsoft Fabric data warehouse using T-SQL. You'll create fact and dimension tables, build a stored procedure that uses cross-database queries to load data from a lakehouse into your warehouse, and run analytical queries to validate the data.
Query a data warehouse in Microsoft Fabric (30 minutes)
In this lab, you will query a Microsoft Fabric data warehouse using SQL to analyze data, join tables, and aggregate results. You will also verify data consistency by identifying and handling anomalies, and create views to filter data for specific reporting needs.
Monitor a data warehouse in Microsoft Fabric (30 minutes)
In this lab, you will learn how to monitor activity and queries in a Microsoft Fabric data warehouse. You'll use dynamic management views (DMVs) to track connections, sessions, and running requests, and explore query insights to analyze query history and performance.
Secure data in a data warehouse (45 minutes)
In this lab, you'll learn to secure sensitive data in a Fabric data warehouse using multiple security techniques. You'll apply dynamic data masking to obscure confidential information, implement row-level and column-level security to restrict data access, and use SQL DCL commands (GRANT, DENY, REVOKE) to control granular permissions on database objects.
Get started with Real-Time Intelligence in Microsoft Fabric (30 minutes)
In this lab, you'll learn to work with real-time data streams in Microsoft Fabric by ingesting stock market data using eventstreams, querying it with KQL, and visualizing it on real-time dashboards. You'll also configure alerts using Activator to respond to real-time events.
Ingest real-time data with Eventstream in Microsoft Fabric (30 minutes)
In this lab, you'll learn how to ingest and analyze real-time data in Microsoft Fabric by creating an eventstream that captures sample bicycle data, loads it into an eventhouse database, and transforms it using aggregations. You'll configure data sources and destinations, then query both raw and transformed data using KQL to gain insights from streaming events.
Secure data access in Microsoft Fabric (45 minutes)
In this lab, you'll learn how to secure data in Microsoft Fabric using a multi-layer security approach. You'll apply workspace roles to control access to entire workspaces, configure item-level permissions to restrict access to specific Fabric items, and create OneLake data access roles to grant granular permissions on specific folders and tables within a lakehouse.
Implement deployment pipelines in Microsoft Fabric (20 minutes)
In this lab, you'll learn to implement CI/CD practices in Microsoft Fabric using deployment pipelines. You'll configure a multi-stage pipeline and deploy content across development, test, and production environments, gaining hands-on experience automating content promotion in Fabric.
DP-600: Implement analytics solutions using Microsoft Fabric (17 exercises)
Discover and connect to data in OneLake (30 minutes)
In this lab, you'll discover data assets through the OneLake catalog, create shortcuts to access data across workspaces without copying it, and query lakehouse data using SQL analytics endpoints. You'll also create and explore a semantic model, gaining hands-on experience with Microsoft Fabric's data discovery and connectivity capabilities.
Create a Microsoft Fabric Lakehouse (30 minutes)
In this lab, you'll create a Microsoft Fabric lakehouse and import data into it. You'll explore how lakehouses combine file and table storage in OneLake, and learn to query managed delta tables using SQL.
Analyze data in a data warehouse (30 minutes)
You'll create tables in a Microsoft Fabric data warehouse, load data using SQL, and query dimensional models with T-SQL joins and aggregations. You'll also use the visual query designer and optionally define relationships in a data model for downstream analytics.
Get started with Real-Time Intelligence in Microsoft Fabric (30 minutes)
In this lab, you'll learn to work with real-time data streams in Microsoft Fabric by ingesting stock market data using eventstreams, querying it with KQL, and visualizing it on real-time dashboards. You'll also configure alerts using Activator to respond to real-time events.
Design and implement a dimensional model (30 minutes)
In this lab, you design and implement a star schema dimensional model in a Fabric Warehouse, creating fact and dimension tables with foreign key relationships. You run analytical queries to aggregate sales data across multiple dimensions, and implement slowly changing dimension (SCD) Type 1 and Type 2 patterns to handle data that changes over time.
Transform data using dataflows in Microsoft Fabric (30 minutes)
In this lab, you create a Dataflow Gen2 to connect to sample sales data, apply Power Query transformations to clean and shape the data, and load the results to a lakehouse table. You practice common data preparation tasks including filtering rows, removing columns, changing data types, renaming columns, and creating calculated columns.
Transform data with notebooks in Microsoft Fabric (30 minutes)
In this lab, you clean raw sales data in a Fabric notebook, join multiple tables, apply aggregations and window functions, and write results to Delta tables in a lakehouse.
Transform data in a Fabric data warehouse using T-SQL (45 minutes)
Write T-SQL queries to filter, join, and aggregate staging data in a Fabric data warehouse. Create views and stored procedures for reusable logic, and build dimensional tables for analytics.
Create DAX calculations in semantic models (45 minutes)
You'll learn to enhance semantic models by creating calculated tables, calculated columns, and measures using DAX. Through hands-on practice, you'll build a date table, control sorting behavior, organize measures into display folders, and create calculations that aggregate and compare data effectively.
Design a semantic model for scale (30 minutes)
In this lab, you'll design a semantic model for scale in the Microsoft Fabric service. You'll connect to lakehouse data using Direct Lake, build star schema relationships, create a calculation group for time intelligence, and configure settings that support large datasets and concurrent consumption.
Optimize semantic model performance (30 minutes)
Use Performance analyzer in Power BI Desktop to diagnose report visual performance, identify expensive DAX patterns, apply optimizations, and verify improvements.
Enforce semantic model security (30 minutes)
Import a security mapping table, configure relationship filter propagation, and implement static and dynamic row-level security in a Power BI semantic model.
Manage the semantic model lifecycle (45 minutes)
Validate a semantic model with SemPy in a Fabric notebook, create a deployment pipeline to promote content across stages, and verify deployed content in a production workspace.
Prepare a semantic model for AI in Microsoft Fabric (30 minutes)
Configure Prep for AI features in Power BI Desktop, add synonyms via linguistic modeling, publish to a Fabric workspace, test with Copilot and HCAAT diagnostics, and mark the model as Approved for Copilot.
Build an ontology from a semantic model in Fabric IQ (45 minutes)
In this lab, you generate a Fabric IQ ontology from a Power BI semantic model using healthcare data, configure entity types and relationships with data bindings, and combine static data from a lakehouse with time-series data from an eventhouse.
Secure data access in Microsoft Fabric (45 minutes)
In this lab, you'll learn how to secure data in Microsoft Fabric using a multi-layer security approach. You'll apply workspace roles to control access to entire workspaces, configure item-level permissions to restrict access to specific Fabric items, and create OneLake data access roles to grant granular permissions on specific folders and tables within a lakehouse.
Govern analytics data in Microsoft Fabric (30 minutes)
Create analytics assets in Microsoft Fabric and apply governance practices including endorsement, documentation, and lineage analysis to make data assets trustworthy and discoverable.
DP-601: Implement a Lakehouse with Microsoft Fabric (6 exercises)
Create a Microsoft Fabric Lakehouse (30 minutes)
In this lab, you'll create a Microsoft Fabric lakehouse and import data into it. You'll explore how lakehouses combine file and table storage in OneLake, and learn to query managed delta tables using SQL.
Analyze data with Apache Spark (45 minutes)
In this lab, you'll ingest data into a Fabric lakehouse and use Apache Spark to read and analyze it. You'll work with PySpark to load files, explore data, and perform analysis using Spark notebooks.
Use delta tables in Apache Spark (45 minutes)
In this lab, you'll create Delta tables in a Microsoft Fabric lakehouse and explore data using SQL queries. You'll work with the Delta Lake format to support relational semantics for both batch and streaming data operations.
Create and use Dataflows (Gen2) in Microsoft Fabric (30 minutes)
In this lab, you'll create a Dataflow (Gen2) to extract data from a CSV file, perform transformations in Power Query Online, and load the data into a lakehouse. You'll also learn how to orchestrate dataflows by integrating them into a data pipeline.
Ingest data with a pipeline in Microsoft Fabric (45 minutes)
In this lab, you'll create data pipelines to ingest data from external sources into a lakehouse, and integrate Spark notebooks to transform and load the data into tables. You'll learn how to combine Copy Data activities with custom Spark transformations to build reusable ETL processes in Microsoft Fabric.
DP-602: Implement a data warehouse with Microsoft Fabric (5 exercises)
Analyze data in a data warehouse (30 minutes)
You'll create tables in a Microsoft Fabric data warehouse, load data using SQL, and query dimensional models with T-SQL joins and aggregations. You'll also use the visual query designer and optionally define relationships in a data model for downstream analytics.
Load data into a warehouse using T-SQL (30 minutes)
In this lab, you'll learn how to load data into a Microsoft Fabric data warehouse using T-SQL. You'll create fact and dimension tables, build a stored procedure that uses cross-database queries to load data from a lakehouse into your warehouse, and run analytical queries to validate the data.
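The load-and-validate pattern in this lab can be sketched with SQLite standing in for a Fabric warehouse (table names, columns, and values below are hypothetical, and the lab uses T-SQL stored procedures rather than Python):

```python
import sqlite3

# Sketch of a dimensional load: create dimension and fact tables,
# load rows, then run an analytical query to validate the data.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE DimProduct (ProductKey INTEGER PRIMARY KEY, Name TEXT)")
con.execute("CREATE TABLE FactSales (ProductKey INTEGER, Quantity INTEGER)")

# Load dimension rows first, then fact rows that reference them
con.executemany("INSERT INTO DimProduct VALUES (?, ?)",
                [(1, "Bike"), (2, "Helmet")])
con.executemany("INSERT INTO FactSales VALUES (?, ?)",
                [(1, 3), (1, 4), (2, 5)])

# Validation query: total quantity per product, highest first
rows = con.execute("""
    SELECT p.Name, SUM(f.Quantity) AS Total
    FROM FactSales f
    JOIN DimProduct p ON p.ProductKey = f.ProductKey
    GROUP BY p.Name
    ORDER BY Total DESC
""").fetchall()
print(rows)  # [('Bike', 7), ('Helmet', 5)]
```

The lab's cross-database step works the same way conceptually, except the `INSERT ... SELECT` source is a lakehouse table referenced by its three-part name.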
Query a data warehouse in Microsoft Fabric (30 minutes)
In this lab, you'll query a Microsoft Fabric data warehouse using SQL to analyze data, join tables, and aggregate results. You'll also verify data consistency by identifying and handling anomalies, and create views to filter data for specific reporting needs.
Monitor a data warehouse in Microsoft Fabric (30 minutes)
In this lab, you'll learn how to monitor activity and queries in a Microsoft Fabric data warehouse. You'll use dynamic management views (DMVs) to track connections, sessions, and running requests, and explore query insights to analyze query history and performance.
Secure data in a data warehouse (45 minutes)
In this lab, you'll learn to secure sensitive data in a Fabric data warehouse using multiple security techniques. You'll apply dynamic data masking to obscure confidential information, implement row-level and column-level security to restrict data access, and use SQL DCL commands (GRANT, DENY, REVOKE) to control granular permissions on database objects.
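The effect of dynamic data masking can be sketched in plain Python: values are obscured for unprivileged readers rather than removed. The masking formats below are illustrative, not Fabric's exact built-in functions:

```python
# Sketch of dynamic-data-masking behavior for users without
# unmask permission (formats are hypothetical illustrations).

def mask_email(email: str) -> str:
    """Expose the first character and the domain, mask the rest."""
    local, _, domain = email.partition("@")
    return local[:1] + "****@" + domain

def mask_card(number: str) -> str:
    """Expose only the last four digits of a card number."""
    return "****-****-****-" + number[-4:]

print(mask_email("contoso.user@example.com"))  # c****@example.com
print(mask_card("4111111111111111"))           # ****-****-****-1111
```

In the warehouse itself, the equivalent is declared once on the column (e.g. with the `email()` or `partial()` masking functions) and applied automatically at query time.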
DP-603: Implement Real-Time Intelligence with Microsoft Fabric (5 exercises)
Get started with Real-Time Intelligence in Microsoft Fabric (30 minutes)
In this lab, you'll learn to work with real-time data streams in Microsoft Fabric by ingesting stock market data using eventstreams, querying it with KQL, and visualizing it on real-time dashboards. You'll also configure alerts using Activator to respond to real-time events.
Ingest real-time data with Eventstream in Microsoft Fabric (30 minutes)
In this lab, you'll learn how to ingest and analyze real-time data in Microsoft Fabric by creating an eventstream that captures sample bicycle data, loads it into an eventhouse database, and transforms it using aggregations. You'll configure data sources and destinations, then query both raw and transformed data using KQL to gain insights from streaming events.
Work with data in a Microsoft Fabric eventhouse (25 minutes)
In this lab, you'll learn to query data in a KQL database using both Kusto Query Language (KQL) and Transact-SQL. You'll practice essential data querying operations including retrieving, filtering, summarizing, and sorting data to gain insights from real-time event data stored in a Microsoft Fabric eventhouse.
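The core KQL shape practiced here, roughly `Events | where ... | summarize count() by ... | order by ...`, can be sketched in plain Python over a hypothetical event list:

```python
# Sketch of a KQL filter/summarize/sort pipeline in plain Python
# (the device readings below are hypothetical, not lab data).
from collections import Counter

events = [
    {"device": "A", "temp": 21.5},
    {"device": "B", "temp": 30.2},
    {"device": "A", "temp": 31.0},
    {"device": "B", "temp": 19.9},
    {"device": "A", "temp": 33.3},
]

# where temp > 25
hot = [e for e in events if e["temp"] > 25]

# summarize count() by device
counts = Counter(e["device"] for e in hot)

# order by count desc
ranked = counts.most_common()
print(ranked)  # [('A', 2), ('B', 1)]
```

The lab expresses the same pipeline declaratively in KQL (and again in T-SQL) against the eventhouse, where it runs at streaming scale.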
Get started with Real-Time Dashboards in Microsoft Fabric (25 minutes)
In this lab, you'll learn to create interactive real-time dashboards in Microsoft Fabric by building visualizations from streaming data using KQL queries. You'll configure data ingestion from an eventstream, create multiple dashboard tiles with charts and maps, implement reusable base queries and parameters for interactivity, and set up auto-refresh capabilities.
Use Activator in Microsoft Fabric (30 minutes)
In this lab, you'll learn how to use Activator in Microsoft Fabric to monitor real-time data streams and create alert rules that trigger actions based on data conditions. You'll work with eventstreams, define objects and properties, set up conditional triggers with filters, and configure automated email notifications when specific thresholds are exceeded.
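The rule pattern Activator configures, watching a property on a stream of events and firing an action when a condition holds, can be sketched in a few lines of Python (device names and the threshold are hypothetical):

```python
# Sketch of an Activator-style rule: monitor a property on each
# incoming event and emit an action when a threshold is exceeded.

def check_alerts(readings: list[dict], threshold: float = 30.0) -> list[str]:
    """Return alert messages for readings that exceed the threshold."""
    alerts = []
    for r in readings:
        if r["temp"] > threshold:
            alerts.append(f"ALERT: {r['device']} reported {r['temp']}")
    return alerts

stream = [{"device": "bike-7", "temp": 28.0},
          {"device": "bike-9", "temp": 31.5}]
print(check_alerts(stream))  # one alert, for bike-9
```

In Activator itself, the object, property, condition, and action (such as the email notification) are configured in the UI rather than coded, but the trigger logic is the same.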
DP-604: Implement a data science and machine learning solution for AI with Microsoft Fabric (5 exercises)
Get started with data science in Microsoft Fabric (20 minutes)
In this lab, you'll learn how to work with notebooks and Data Wrangler to explore and transform data, then train and compare both regression and classification machine learning models. You'll use MLflow to track your experiments and save the best performing model, gaining practical skills in the core data science capabilities of Microsoft Fabric.
Explore data for data science with notebooks in Microsoft Fabric (30 minutes)
In this lab, you'll use notebooks in Microsoft Fabric to perform exploratory data analysis on a dataset. You'll load data into dataframes, check data structure and quality, generate descriptive statistics, create visualizations to understand distributions and relationships, and perform correlation analysis. By the end of this lab, you'll have practical experience using notebooks for interactive data exploration.
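The descriptive-statistics and correlation steps can be sketched with the standard library alone (the sample values are hypothetical, not the lab's dataset):

```python
# Sketch of exploratory steps: summary statistics plus a Pearson
# correlation between two hypothetical numeric columns.
import math
import statistics

heights = [150.0, 160.0, 170.0, 180.0]
weights = [50.0, 60.0, 70.0, 80.0]

print("mean:", statistics.mean(heights))
print("stdev:", statistics.stdev(heights))

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print("correlation:", pearson(heights, weights))  # ~1.0 for linear data
```

In the notebook, `DataFrame.describe()` and `DataFrame.corr()` do this in one call each; the sketch just shows what those summaries compute.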
Preprocess data with Data Wrangler in Microsoft Fabric (30 minutes)
You'll use Data Wrangler to perform common data preprocessing tasks like formatting text, filtering, sorting, and aggregating data, while automatically generating reusable Python code. This hands-on experience will help you prepare data for machine learning models efficiently without writing code from scratch.
Train and track machine learning models with MLflow in Microsoft Fabric (25 minutes)
In this lab, you'll train regression models using scikit-learn to predict diabetes measures and use MLflow to track, compare, and evaluate model performance. You'll learn how to search experiments, visualize metrics, and save the best performing model for future use.
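The train-track-compare loop at the heart of this lab can be sketched in pure Python, with a dict standing in for MLflow's run tracking and two toy "models" standing in for the scikit-learn regressors (all data and run names are hypothetical):

```python
# Sketch of the experiment-tracking pattern: train several models,
# log a metric per run, then select the best-performing run.
import statistics

x_train, y_train = [1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.2, 7.8]
x_test, y_test = [5.0, 6.0], [10.1, 12.2]

def rmse(pred: list[float], actual: list[float]) -> float:
    return (sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual)) ** 0.5

# Run 1: constant baseline that always predicts the training mean
mean_pred = statistics.mean(y_train)
runs = {"mean_model": rmse([mean_pred] * len(y_test), y_test)}

# Run 2: least-squares line fit on the training data
mx, my = statistics.mean(x_train), statistics.mean(y_train)
slope = (sum((x - mx) * (y - my) for x, y in zip(x_train, y_train))
         / sum((x - mx) ** 2 for x in x_train))
intercept = my - slope * mx
runs["linear_model"] = rmse([slope * x + intercept for x in x_test], y_test)

# Compare tracked runs and keep the best one, as MLflow's UI would
best = min(runs, key=runs.get)
print(best)  # the linear fit beats the constant baseline
```

MLflow replaces the `runs` dict with `mlflow.log_metric` calls and a searchable experiment store, but the compare-and-promote decision is the same.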
Generate batch predictions using a deployed model in Microsoft Fabric (20 minutes)
In this lab, you'll train and register a machine learning model, then use it to generate batch predictions on test data stored in a lakehouse. This lab demonstrates applying a deployed model to score new patient data and save the results.
DP-3029: Work smarter with Copilot in Microsoft Fabric (4 exercises)
Work smarter with Copilot in Microsoft Fabric Dataflow Gen2 (30 minutes)
In this lab, you'll learn to use Copilot in Microsoft Fabric Dataflow Gen2 to accelerate data transformation tasks using natural language prompts. You'll clean and transform retail store data by removing unwanted characters, parsing XML content, creating categorical groupings, and setting appropriate data types, all with AI assistance. By the end, you'll understand how to leverage Copilot to streamline common data engineering workflows and improve productivity.
Analyze data with Apache Spark and Copilot in Microsoft Fabric notebooks (30 minutes)
In this lab, you'll use Copilot in Fabric notebooks to generate Apache Spark code from natural language prompts, transforming raw population data through cleaning, filtering, and reshaping operations, then creating visualizations to explore the results. You'll learn how AI-assisted coding accelerates data engineering by letting you focus on what you want to accomplish rather than how to code it.
Use Copilot in Microsoft Fabric data warehouse (30 minutes)
You'll learn to use Copilot in Microsoft Fabric to generate, refine, and troubleshoot SQL queries using natural language prompts. Through hands-on practice with a retail sales data warehouse, you'll experience how AI can accelerate writing complex SQL queries, automatically fix syntax errors, and create views for data analysis.
Chat with your data using Microsoft Fabric data agents (30 minutes)
In this lab, you'll create and configure a Microsoft Fabric data agent, then interact with it using plain English questions to retrieve insights from a sales warehouse. You'll learn how data agents translate natural language into SQL queries, making data analysis accessible without requiring SQL expertise. This hands-on experience demonstrates the power of AI-assisted analytics to democratize data access.