Get started with Real-Time Analytics in Microsoft Fabric

Microsoft Fabric provides a runtime that you can use to store and query data by using Kusto Query Language (KQL). Kusto is optimized for data that includes a time series component, such as real-time data from log files or IoT devices.

This lab takes approximately 30 minutes to complete.

Note: You need a Microsoft school or work account to complete this exercise. If you don’t have one, you can sign up for a trial of Microsoft Office 365 E3 or higher.

Activate a Microsoft Fabric trial

  1. After signing up for a Microsoft Fabric account, navigate to the Microsoft Fabric portal.
  2. Select the Account Manager icon (the user image at the top right).
  3. In the account manager menu, select Start trial to start a Microsoft Fabric trial.
  4. After successfully upgrading to Microsoft Fabric, navigate to the home page by selecting Fabric Home Page.

Create a workspace

Before working with data in Fabric, create a workspace with the Fabric trial enabled.

  1. On the Microsoft Fabric home page, select Synapse Real-Time Analytics.
  2. In the menu bar on the left, select Workspaces (the icon looks similar to 🗇).
  3. Create a new workspace with a name of your choice, selecting a licensing mode that includes Fabric capacity (Trial, Premium, or Fabric).
  4. When your new workspace opens, it should be empty.

    Screenshot of an empty workspace in Fabric.

Download file for KQL database

Now that you have a workspace, it’s time to download the data file you’re going to analyze.

  1. Download the data file for this exercise, saving it as sales.csv on your local computer (or lab VM if applicable).
  2. Return to the browser window with the Microsoft Fabric Experience.

Create a KQL database

Kusto Query Language (KQL) is used to query static or streaming data in a table defined in a KQL database. To analyze the sales data, you must create a table in a KQL database and ingest the data from the file.
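As a general sketch of the pattern (the table and column names below are illustrative, not part of this lab), a KQL query names a table and then pipes its rows through a chain of operators:

```kusto
// Illustrative only: 'events' and its columns are hypothetical
events
| where Timestamp > ago(1h)               // keep only recent rows
| summarize EventCount = count() by Level // aggregate per category
| sort by EventCount desc                 // order the results
```

Each operator receives the output of the previous one, which is the same pipeline style you'll use against the sales table later in this exercise.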

  1. In the Microsoft Fabric experience portal, select the Synapse Real-Time Analytics experience image as shown here:

    Screenshot of selected Fabric Experience home with RTA selected

  2. On the Home page for the Real-Time Analytics experience, select KQL database and create a new database with a name of your choice.
  3. When the new database has been created, select the option to get data from Local File. Then use the wizard to import the data into a new table by selecting the following options:
    • Destination:
      • Database: The database you created is already selected
      • Table: Create a new table named sales.
    • Source:
      • Source type: File
      • Upload files: Drag or Browse for the file you downloaded earlier
    • Schema:
      • Compression type: Uncompressed
      • Data format: CSV
      • Ignore the first record: Selected
      • Mapping name: sales_mapping
    • Summary:
      • Review the preview of the table and close the wizard.

Note: In this example, you imported a very small amount of static data from a file, which is fine for the purposes of this exercise. In reality, you can use Kusto to analyze much larger volumes of data, including real-time data from a streaming source such as Azure Event Hubs.
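Before moving on, you can optionally sanity-check the ingestion with a couple of simple queries (assuming you named the table sales as directed above; run each statement separately):

```kusto
// Return the number of ingested rows
sales
| count

// Peek at a few rows to confirm the columns mapped correctly
sales
| take 5
```

If the row count is zero or the columns look wrong, rerun the get-data wizard and check the mapping settings.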

Use KQL to query the sales table

Now that you have a table of data in your database, you can use KQL code to query it.

  1. Make sure the sales table is highlighted. From the menu bar, select the Query table drop-down, and from there select Show any 100 records.

  2. A new pane will open with the query and its result.

  3. Modify the query as follows:

    sales
    | where Item == 'Road-250 Black, 48'
  4. Run the query. Then review the results, which should contain only the rows for sales orders for the Road-250 Black, 48 product.

  5. Modify the query as follows:

    sales
    | where Item == 'Road-250 Black, 48'
    | where datetime_part('year', OrderDate) > 2020
  6. Run the query and review the results, which should contain only sales orders for Road-250 Black, 48 made after 2020.

  7. Modify the query as follows:

    sales
    | where OrderDate between (datetime(2020-01-01 00:00:00) .. datetime(2020-12-31 23:59:59))
    | summarize TotalNetRevenue = sum(UnitPrice) by Item
    | sort by Item asc
  8. Run the query and review the results, which should contain the total net revenue for each product between January 1st and December 31st 2020 in ascending order of product name.
  9. Select Save as KQL queryset and save the query as Revenue by Product.
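As an optional extension (not part of the lab steps), the same operators can aggregate by time period. For example, this variant, assuming the same columns as above, totals the product's revenue per year:

```kusto
sales
| where Item == 'Road-250 Black, 48'
| summarize TotalNetRevenue = sum(UnitPrice) by OrderYear = datetime_part('year', OrderDate)
| sort by OrderYear asc
```

Naming the expression in the by clause (OrderYear) gives the grouping column a readable name in the results.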

Create a Power BI report from a KQL Queryset

You can use your KQL Queryset as the basis for a Power BI report.

  1. In the query editor for your queryset, run the query and wait for the results.
  2. Select Build Power BI report and wait for the report editor to open.
  3. In the report editor, in the Data pane, expand Kusto Query Result and select the Item and TotalNetRevenue fields.
  4. On the report design canvas, select the table visualization that has been added and then in the Visualizations pane, select Clustered bar chart.

    Screenshot of a report from a KQL query.

  5. In the Power BI window, in the File menu, select Save. Then save the report as Revenue by Item.pbix in the workspace where your KQL database is defined, using a Non-Business sensitivity label.
  6. Close the Power BI window, and in the bar on the left, select the icon for your workspace.

    Refresh the Workspace page if necessary to view all of the items it contains.

  7. In the list of items in your workspace, note that the Revenue by Item report is listed.

Clean up resources

In this exercise, you created a KQL database and ingested sales data into a table. You used KQL to query the data and create a queryset, which you then used to create a Power BI report.

If you’ve finished exploring your KQL database, you can delete the workspace you created for this exercise.

  1. In the bar on the left, select the icon for your workspace.
  2. In the menu on the toolbar, select Workspace settings.
  3. In the Other section, select Remove this workspace.