Device Message Routing

Lab Scenario

Contoso Management is impressed with your implementation of automatic device enrollment using DPS. They are now interested in having you develop an IoT-based solution related to product packaging and shipping.

The cost associated with packaging and shipping cheese is significant. To maximize cost efficiency, Contoso operates an on-premises packaging facility. The workflow is straightforward: cheese is cut and packaged, packages are assembled into shipping containers, and containers are delivered to bins associated with their destinations. A conveyor belt system is used to move the product through this process. The metric for success is the number of packages leaving the conveyor belt system during a given time period (typically a work shift).

The conveyor belt system is a critical link in this process and is visually monitored to ensure that the workflow is progressing at maximum efficiency. The system has three operator-controlled speeds: stopped, slow, and fast. Naturally, fewer packages are delivered at the slow speed than at the fast speed. However, there are a number of other factors to consider:

  • the vibration level of the conveyor belt system is much lower at the slow speed
  • high vibration levels can cause packages to fall from the conveyor
  • high vibration levels are known to accelerate wear-and-tear of the system
  • when vibration levels exceed a threshold limit, the conveyor belt must be stopped to allow for inspection (to avoid more serious failures)

In addition to maximizing throughput, your automated IoT solution will implement a form of preventive maintenance based on vibration levels, which will be used to detect early warning signs before serious system damage occurs.

Note: Preventive maintenance (sometimes called preventative maintenance or predictive maintenance) is an equipment maintenance program that schedules maintenance activities to be performed while the equipment is operating normally. The intent of this approach is to avoid unexpected breakdowns that often incur costly disruptions.

It’s not always easy for an operator to visually detect abnormal vibration levels. For this reason, you are looking into an Azure IoT solution that will help you measure vibration levels and detect data anomalies. Vibration sensors will be attached to the conveyor belt at various locations, and you will use IoT devices to send telemetry to an IoT hub. Azure Stream Analytics, together with a built-in Machine Learning (ML) model, will alert you to vibration anomalies in real time. You also plan to archive all of the telemetry data so that in-house machine learning models can be developed in the future.

You decide to prototype the solution using simulated telemetry from a single IoT device.

To simulate the vibration data in a realistic manner, you work with an engineer from Operations to understand a little bit about what causes the vibrations. It turns out there are a number of different types of vibration that contribute to the overall vibration level. For example, a “force vibration” could be introduced by a broken guide wheel or an especially heavy load placed improperly on the conveyor belt. There’s also an “increasing vibration” that can be introduced when a system design limit (such as speed or weight) is exceeded. The Engineering team agrees to help you develop the code for a simulated IoT device that will produce an acceptable representation of vibration data (including anomalies).

The following resources will be created:

(Lab 7 Architecture diagram)

In This Lab

In this lab, you will begin by reviewing the lab prerequisites and you will run a script if needed to ensure that your Azure subscription includes the required resources. You will then create a simulated device that sends vibration telemetry to your IoT hub. With your simulated data arriving at IoT hub, you will implement an IoT Hub Message Route and Azure Stream Analytics job that can be used to archive data. The lab includes the following exercises:

  • Verify Lab Prerequisites

    • A script will be used to create any missing resources and a new device identity (sensor-v-3000) for this lab
  • Write Code to generate Vibration Telemetry
  • Create a Message Route to Azure Blob Storage
  • Logging Route Azure Stream Analytics Job

Lab Instructions

Exercise 1: Verify Lab Prerequisites

This lab assumes that the following Azure resources are available:

Resource Type     Resource Name
Resource Group    rg-az220
IoT Hub           iot-az220-training-{your-id}
Device ID         sensor-v-3000

Important: Run the setup script to create the required device.

To create any missing resources and the new device you will need to run the lab07-setup.azcli script as instructed below before moving on to Exercise 2. The script file is included in the GitHub repository that you cloned locally as part of the dev environment configuration (lab 3).

The lab07-setup.azcli script is written to run in a bash shell environment - the easiest way to execute this is in the Azure Cloud Shell.
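
For reference, a setup script of this kind typically boils down to a handful of Azure CLI commands. The following is an illustrative sketch only - the actual lab07-setup.azcli includes additional checks, and the az iot commands require the azure-iot CLI extension (the Cloud Shell will prompt you to install it if it is missing):

    #!/bin/bash
    # Illustrative sketch only - not the actual lab07-setup.azcli
    YourID="{your-id}"
    Location="{your-location}"

    # Create the resource group and the IoT hub
    az group create --name rg-az220 --location $Location
    az iot hub create --resource-group rg-az220 --name iot-az220-training-$YourID --sku S1

    # Add the device identity used in this lab and display its connection string
    az iot hub device-identity create --hub-name iot-az220-training-$YourID --device-id sensor-v-3000
    az iot hub device-identity connection-string show --hub-name iot-az220-training-$YourID --device-id sensor-v-3000 --output table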

  1. Using a browser, open the Azure Cloud Shell and login with the Azure subscription you are using for this course.

    If you are prompted about setting up storage for Cloud Shell, accept the defaults.

  2. Verify that the Cloud Shell is using Bash.

    The dropdown in the top-left corner of the Azure Cloud Shell page is used to select the environment. Verify that the selected dropdown value is Bash.

  3. On the Cloud Shell toolbar, click Upload/Download files (fourth button from the right).

  4. In the dropdown, click Upload.

  5. In the file selection dialog, navigate to the folder location of the GitHub lab files that you downloaded when you configured your development environment.

    In Lab 3: Setup the Development Environment, you cloned the GitHub repository containing lab resources by downloading a ZIP file and extracting the contents locally. The extracted folder structure includes the following folder path:

    • Allfiles
      • Labs
        • 07-Device Message Routing
          • Setup

    The lab07-setup.azcli script file is located in the Setup folder for lab 7.

  6. Select the lab07-setup.azcli file, and then click Open.

    A notification will appear when the file upload has completed.

  7. To verify that the correct file has uploaded in Azure Cloud Shell, enter the following command:

     ls
    

    The ls command lists the content of the current directory. You should see the lab07-setup.azcli file listed.

  8. To create a directory for this lab that contains the setup script and then move into that directory, enter the following Bash commands:

     mkdir lab7
     mv lab07-setup.azcli lab7
     cd lab7
    
  9. To ensure that lab07-setup.azcli has the execute permission, enter the following command:

     chmod +x lab07-setup.azcli
    
  10. On the Cloud Shell toolbar, to enable access to the lab07-setup.azcli file, click Open Editor (second button from the right - { }).

  11. In the FILES list, to expand the lab7 folder and open the script file, click lab7, and then click lab07-setup.azcli.

    The editor will now show the contents of the lab07-setup.azcli file.

  12. In the editor, update the {your-id} and {your-location} assigned values.

    Using the sample below as a reference, you need to set {your-id} to the unique ID you created at the start of this course - e.g. cah191211 - and set {your-location} to the location that makes sense for your resources.

     #!/bin/bash
    
     # Change these values!
     YourID="{your-id}"
     Location="{your-location}"
    

    Note: The {your-location} variable should be set to the short name for the region. You can see a list of the available regions and their short-names (the Name column) by entering this command:

    az account list-locations -o table
    
    DisplayName           Latitude    Longitude    Name
    --------------------  ----------  -----------  ------------------
    East Asia             22.267      114.188      eastasia
    Southeast Asia        1.283       103.833      southeastasia
    Central US            41.5908     -93.6208     centralus
    East US               37.3719     -79.8164     eastus
    East US 2             36.6681     -78.3889     eastus2
    
  13. In the top-right of the editor window, to save the changes made to the file and close the editor, click the ellipsis (...), and then click Close Editor.

    If prompted to save, click Save and the editor will close.

    Note: You can use CTRL+S to save at any time and CTRL+Q to close the editor.

  14. To create the resources required for this lab, enter the following command:

     ./lab07-setup.azcli
    

    This script can take a few minutes to run. You will see output as each step completes.

    The script will first create a resource group named rg-az220 and an IoT Hub named iot-az220-training-{your-id}. If they already exist, a corresponding message will be displayed. The script will then add a device with an ID of sensor-v-3000 to the IoT hub and display the device connection string.

  15. Notice that, once the script has completed, the connection string for the device is displayed.

    The connection string starts with “HostName=”

  16. Copy the connection string into a text document, and note that it is for the sensor-v-3000 device.

    Once you have saved the connection string to a location where you can find it easily, you will be ready to continue with the lab.
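
If you misplace the device connection string later, you can retrieve it again without re-running the whole script. For example (a hedged reminder - this assumes the azure-iot CLI extension is installed in your Cloud Shell):

    az iot hub device-identity connection-string show \
        --hub-name iot-az220-training-{your-id} \
        --device-id sensor-v-3000 \
        --output table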

Exercise 2: Write Code to generate Vibration Telemetry

Both long-term and real-time data analysis are required to automate the monitoring of Contoso’s conveyor belt system and enable predictive maintenance. Since no historical data exists, your first step will be to generate simulated data that mimics vibration data and data anomalies in a realistic manner. Contoso engineers have developed an algorithm to simulate vibration over time and embedded the algorithm within a code class that you will implement. The engineers have agreed to support any future updates required to adjust the algorithms.

During your initial prototype phase, you will implement a single IoT device that generates telemetry data. In addition to the vibration data, your device will create some additional values (packages delivered, ambient temperature, and similar metrics) that will be sent to Blob storage. This additional data simulates the data that will be used to develop machine learning models for predictive maintenance.

In this exercise, you will:

  • load the simulated device project
  • update the connection string for your simulated device and review the project code
  • test your simulated device connection and telemetry communications
  • ensure that telemetry is arriving at your IoT hub

Task 1: Open your simulated device project

  1. Open Visual Studio Code.

  2. On the File menu, click Open Folder.

  3. In the Open Folder dialog, navigate to the 07-Device Message Routing folder.

    In Lab 3: Setup the Development Environment, you cloned the GitHub repository containing lab resources by downloading a ZIP file and extracting the contents locally. The extracted folder structure includes the following folder path:

    • Allfiles
      • Labs
        • 07-Device Message Routing
          • Starter
            • VibrationDevice
  4. Navigate to the Starter folder for Lab 7.

  5. Click VibrationDevice, and then click Select Folder.

    You should see the following files listed in the EXPLORER pane of Visual Studio Code:

    • Program.cs
    • VibrationDevice.csproj

    Note: If you are prompted to load required assets, you can do that now.

  6. In the EXPLORER pane, click Program.cs.

    A cursory glance will reveal that the VibrationDevice application is very similar to those used in the preceding labs. This version of the application uses symmetric key authentication, sends both telemetry and logging messages to the IoT Hub, and has a more complex sensor implementation.

  7. On the Terminal menu, click New Terminal.

    Examine the directory path indicated as part of the command prompt to ensure that you are in the correct location. You do not want to start building this project within the folder structure of a previous lab project.

  8. At the terminal command prompt, to verify that the application builds without errors, enter the following command:

     dotnet build
    

    The output will be similar to:

     ❯ dotnet build
     Microsoft (R) Build Engine version 16.5.0+d4cbfca49 for .NET Core
     Copyright (C) Microsoft Corporation. All rights reserved.
    
     Restore completed in 39.27 ms for D:\Az220-Code\AllFiles\Labs\07-Device Message Routing\Starter\VibrationDevice\VibrationDevice.csproj.
     VibrationDevice -> D:\Az220-Code\AllFiles\Labs\07-Device Message Routing\Starter\VibrationDevice\bin\Debug\netcoreapp3.1\VibrationDevice.dll
    
     Build succeeded.
         0 Warning(s)
         0 Error(s)
    
     Time Elapsed 00:00:01.16
    

In the next task, you will configure the connection string and review the application.

Task 2: Configure connection and review code

The simulated device app that you will configure and run in this task simulates an IoT device that is monitoring the conveyor belt. The app will simulate sensor readings and report vibration sensor data every two seconds.

  1. Ensure that you have the Program.cs file opened in Visual Studio Code.

  2. Near the top of the Program class, locate the declaration of the deviceConnectionString variable:

     private readonly static string deviceConnectionString = "<your device connection string>";
    
  3. Replace <your device connection string> with the device connection string that you saved earlier.

    Note: This is the only change that you are required to make to this code.

  4. On the File menu, click Save.

  5. Take a minute to review the structure of the project.

    Notice that the application structure is similar to that of your previous simulated device projects.

    • Using statements
    • Namespace definition
      • Program class - responsible for connecting to Azure IoT and sending telemetry
      • ConveyorBeltSimulator class - replaces the EnvironmentSensor class used in earlier labs; rather than just generating telemetry values, this class also simulates a running conveyor belt
      • ConsoleHelper - a new class that encapsulates writing different colored text to the console
  6. Take a minute to review the Main method.

     private static void Main(string[] args)
     {
         ConsoleHelper.WriteColorMessage("Vibration sensor device app.\n", ConsoleColor.Yellow);
    
         // Connect to the IoT hub using the MQTT protocol.
         deviceClient = DeviceClient.CreateFromConnectionString(deviceConnectionString, TransportType.Mqtt);
    
         SendDeviceToCloudMessagesAsync();
         Console.ReadLine();
     }
    

    Notice how straightforward it is to create an instance of DeviceClient using the deviceConnectionString variable. Since the deviceClient object is declared outside of Main (at the Program level in the code above), it is global and therefore available inside the methods that communicate with the IoT hub.

  7. Take a minute to review the SendDeviceToCloudMessagesAsync method.

     private static async void SendDeviceToCloudMessagesAsync()
     {
         var conveyor = new ConveyorBeltSimulator(intervalInMilliseconds);
    
         // Simulate the vibration telemetry of a conveyor belt.
         while (true)
         {
             var vibration = conveyor.ReadVibration();
    
             await CreateTelemetryMessage(conveyor, vibration);
    
             await CreateLoggingMessage(conveyor, vibration);
    
             await Task.Delay(intervalInMilliseconds);
         }
     }
    

    First off, notice that this method is being used to establish the infinite program loop, first taking a vibration reading and then sending messages at a defined time interval.

    A closer look reveals that the ConveyorBeltSimulator class is used to create an instance named conveyor. During each pass through the loop, the conveyor object captures a vibration reading, which is stored in the local vibration variable; the conveyor object and the vibration value are then passed to the two message creation methods.

  8. Take a minute to review the CreateTelemetryMessage method.

     private static async Task CreateTelemetryMessage(ConveyorBeltSimulator conveyor, double vibration)
     {
         var telemetryDataPoint = new
         {
             vibration = vibration,
         };
         var telemetryMessageString = JsonConvert.SerializeObject(telemetryDataPoint);
         var telemetryMessage = new Message(Encoding.ASCII.GetBytes(telemetryMessageString));
    
         // Add a custom application property to the message. This is used to route the message.
         telemetryMessage.Properties.Add("sensorID", "VSTel");
    
         // Send an alert if the belt has been stopped for more than five seconds.
         telemetryMessage.Properties.Add("beltAlert", (conveyor.BeltStoppedSeconds > 5) ? "true" : "false");
    
         Console.WriteLine($"Telemetry data: {telemetryMessageString}");
    
         // Send the telemetry message.
         await deviceClient.SendEventAsync(telemetryMessage);
         ConsoleHelper.WriteGreenMessage($"Telemetry sent {DateTime.Now.ToShortTimeString()}");
     }
    

    As in earlier labs, this method creates a JSON message string and uses the Message class to send the message, along with additional properties. Notice the sensorID property - this will be used to route the VSTel values appropriately at the IoT Hub. Also notice the beltAlert property - this is set to true if the conveyor belt has been stopped for more than 5 seconds.

    As usual, the message is sent via the SendEventAsync method of the device client.

  9. Take a minute to review the CreateLoggingMessage method.

     private static async Task CreateLoggingMessage(ConveyorBeltSimulator conveyor, double vibration)
     {
         // Create the logging JSON message.
         var loggingDataPoint = new
         {
             vibration = Math.Round(vibration, 2),
             packages = conveyor.PackageCount,
             speed = conveyor.BeltSpeed.ToString(),
             temp = Math.Round(conveyor.Temperature, 2),
         };
         var loggingMessageString = JsonConvert.SerializeObject(loggingDataPoint);
         var loggingMessage = new Message(Encoding.ASCII.GetBytes(loggingMessageString));
    
         // Add a custom application property to the message. This is used to route the message.
         loggingMessage.Properties.Add("sensorID", "VSLog");
    
         // Send an alert if the belt has been stopped for more than five seconds.
         loggingMessage.Properties.Add("beltAlert", (conveyor.BeltStoppedSeconds > 5) ? "true" : "false");
    
         Console.WriteLine($"Log data: {loggingMessageString}");
    
         // Send the logging message.
         await deviceClient.SendEventAsync(loggingMessage);
         ConsoleHelper.WriteGreenMessage("Log data sent\n");
     }
    

    Notice that this method is very similar to the CreateTelemetryMessage method. Here are the key items to note:

    • The loggingDataPoint contains more information than the telemetry object. It is common to include as much information as possible for logging purposes to assist in any fault diagnosis activities or more detailed analytics in the future.
    • The logging message includes the sensorID property, this time set to VSLog. As noted above, this will be used to route the VSLog values appropriately at the IoT Hub.
  10. Optionally, take a moment to review the ConveyorBeltSimulator class and the ConsoleHelper class.

    You don’t actually need to understand how either of these classes works to achieve the full value of this lab, but they both support the outcome in their own way. The ConveyorBeltSimulator class simulates the operation of a conveyor belt, modeling a number of speeds and related states to generate vibration data. The ConsoleHelper class is used to write different colored text to the console to highlight different data and values.

Task 3: Test your code to send telemetry

  1. At the Terminal command prompt, to run the app, enter the following command:

     dotnet run
    

    This command will build and run the project located in the current folder.

  2. Console output should be displayed that is similar to the following:

     Vibration sensor device app.
    
     Telemetry data: {"vibration":0.0}
     Telemetry sent 10:29 AM
     Log data: {"vibration":0.0,"packages":0,"speed":"stopped","temp":60.22}
     Log data sent
    
     Telemetry data: {"vibration":0.0}
     Telemetry sent 10:29 AM
     Log data: {"vibration":0.0,"packages":0,"speed":"stopped","temp":59.78}
     Log data sent
    

    Note: In the Terminal window, green text is used to show that things are working as they should, and red text is used to indicate errors. If you receive error messages, start by checking your device connection string.

  3. Leave this app running for the next task.

    If you won’t be continuing to the next task, you can enter Ctrl-C in the Terminal window to stop the app. You can start it again later by using the dotnet run command.

Task 4: Verify the IoT Hub is Receiving Telemetry

In this task, you will use the Azure portal to verify that your IoT Hub is receiving telemetry.

  1. Open the Azure Portal.

  2. On your Resources tile, click iot-az220-training-{your-id}.

  3. On the Overview pane, scroll down to view the metrics tiles.

  4. Adjacent to Show data for last, change the time range to one hour.

    The Device to cloud messages tile should be plotting some current activity. If no activity is shown, wait a short while, as there’s some latency.

    With your device pumping out telemetry, and your hub receiving it, the next step is to route the messages to their correct endpoints.
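
If you prefer the command line to the portal metrics, you can also watch the messages as they arrive at the hub. The following is a hedged example that uses this lab's resource names and assumes the azure-iot CLI extension is available in the Cloud Shell:

    # Stream device-to-cloud messages arriving at the hub (press CTRL+C to stop)
    az iot hub monitor-events \
        --hub-name iot-az220-training-{your-id} \
        --device-id sensor-v-3000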

Exercise 3: Create a Message Route to Azure Blob Storage

IoT solutions often require that incoming message data be sent to multiple endpoint locations, either dependent upon the type of data or for business reasons. Azure IoT hub provides the message routing feature to enable you to direct incoming data to locations required by your solution.

The architecture of our system requires that data be sent to two destinations: a storage location for archiving data, and a location for more immediate analysis.

Contoso’s vibration monitoring scenario requires you to create two message routes:

  • the first route will be to an Azure Blob storage location for data archiving
  • the second route will be to an Azure Stream Analytics job for real-time analysis

Message routes should be built and tested one at a time, so this exercise will focus on the storage route. This route will be referred to as the “logging” route, and it involves digging a few levels deep into the creation of Azure resources.

One important feature of message routing is the ability to filter incoming data before routing to an endpoint. The filter, written as a SQL-like query, directs output through a route only when certain conditions are met.

One of the easiest ways to filter data is to evaluate a message property. You may recall adding message properties to your device messages in the previous exercise. The code that you added looked like the following:

...
telemetryMessage.Properties.Add("sensorID", "VSTel");
...
loggingMessage.Properties.Add("sensorID", "VSLog");

You can now embed a SQL query within your message route that uses sensorID as the criterion for the route. In this case, when the value assigned to sensorID is VSLog (vibration sensor log), the message is intended for the storage archive.
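
Routing query conditions use the same SQL-like syntax regardless of which application property is being tested, and conditions can be combined. For illustration only (this lab uses just the first condition):

    sensorID = 'VSLog'
    sensorID = 'VSLog' AND beltAlert = 'true'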

In this exercise, you will create and test the logging route.

Task 1: Define the message routing endpoint

  1. In the Azure Portal, ensure that your IoT hub blade is open.

  2. On the left-hand menu, under Messaging, click Message routing.

  3. On the Message routing pane, ensure that the Routes tab is selected.

  4. To add a new route, click + Add.

    The Add a route blade should now be displayed.

  5. On the Add a route blade, under Name, enter vibrationLoggingRoute

  6. To the right of Endpoint, click + Add endpoint, and then, in the drop-down list, click Storage.

    The Add a storage endpoint blade should now be displayed.

  7. On the Add a storage endpoint blade, under Endpoint name, enter vibrationLogEndpoint

  8. To display a list of Storage accounts associated with your subscription, click Pick a container.

    The storage accounts already present in your Azure subscription are listed. At this point you could select an existing storage account and container; however, for this lab you will create a new one.

  9. To begin creating a storage account, click + Storage account.

    The Create storage account blade should now be displayed.

  10. On the Create storage account blade, under Name, enter vibrationstore{your-id}

    For example: vibrationstorecah191211

    Note: This field can only contain lower-case letters and numbers, must be between 3 and 24 characters, and must be unique.

  11. In the Account kind dropdown, click StorageV2 (general purpose v2).

  12. Under Performance, ensure that Standard is selected.

    This keeps costs down at the expense of overall performance.

  13. Under Replication, ensure that Locally-redundant storage (LRS) is selected.

    This keeps costs down at the expense of risk mitigation for disaster recovery. In production your solution may require a more robust replication strategy.

  14. Under Location, select the region that you are using for the labs in this course.

  15. To create the storage account endpoint, click OK.

  16. Wait until the request is validated and the storage account deployment has completed.

    Validation and creation can take a minute or two.

    Once completed, the Create storage account blade will close and the Storage accounts blade will be displayed. The Storage accounts blade should have auto-updated to show the storage account that was just created.

Task 2: Define the storage account container

  1. On the Storage accounts blade, click vibrationstore{your-id}.

    The Containers blade should appear. Since this is a new storage account, there are no containers listed.

  2. To create a container, click + Container.

    The New container dialog should now be displayed.

  3. On the New container dialog, under Name, enter vibrationcontainer

    Again, only lower-case letters and numbers are accepted.

  4. Under Public access level, ensure that Private (no anonymous access) is selected.

  5. To create the container, click Create.

    After a moment the Lease state for your container will update to display Available.

  6. To choose this container for your solution, click vibrationcontainer, and then click Select.

    You should be returned to the Add a storage endpoint blade. Notice that the Azure Storage container has been set to the URL for the storage account and container you just created.

  7. Leave the Batch frequency and Chunk size window fields set to their default values of 100.

  8. Under Encoding, notice that there are two options and that AVRO is selected.

    Note: By default, IoT Hub writes the content in Avro format, which preserves both the message body and the message properties. The Avro format is not used for any other endpoints. Although the Avro format is great for data and message preservation, it’s a challenge to use it to query data. In comparison, JSON or CSV format is much easier for querying data. IoT Hub now supports writing data to Blob storage in JSON as well as AVRO.

  9. Take a moment to examine the value specified in File name format field.

    The File name format field specifies the pattern used to write the data to files in storage. The various tokens are replaced with values as each file is created.

  10. At the bottom of the blade, to create your storage endpoint, click Create.

    Validation and subsequent creation will take a few moments. Once complete, you should be located back on the Add a route blade.
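
For reference, the storage account and container that you just created through the portal could also have been created from the Azure CLI. The following is a hedged sketch using this lab's names (substitute {your-id} and your region):

    # Create the storage account (lower-case letters and numbers only, 3-24 characters)
    az storage account create \
        --name vibrationstore{your-id} \
        --resource-group rg-az220 \
        --location {your-location} \
        --sku Standard_LRS \
        --kind StorageV2

    # Create the container used by the routing endpoint
    az storage container create \
        --name vibrationcontainer \
        --account-name vibrationstore{your-id}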

Task 3: Define the routing query

  1. On the Add a route blade, under Data source, ensure that Device Telemetry Messages is selected.

  2. Under Enable route, ensure that Enable is selected.

  3. Under Routing query, replace true with the query below:

     sensorID = 'VSLog'
    

    This query ensures that only messages with the sensorID application property set to VSLog will be routed to the storage endpoint.

  4. To save this route, click Save.

    Wait for the success message. Once completed, the route should be listed on the Message routing pane.

  5. Navigate back to your Azure portal Dashboard.
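
For reference, the endpoint and route that you just defined through the portal can also be created with the Azure CLI. The sketch below is illustrative only; the parameter names are taken from the az iot hub routing-endpoint and az iot hub route command groups, so verify them with --help before relying on them:

    # Look up the values needed for the endpoint definition
    subId=$(az account show --query id --output tsv)
    storageConnection=$(az storage account show-connection-string \
        --name vibrationstore{your-id} --resource-group rg-az220 \
        --query connectionString --output tsv)

    # Create the storage endpoint on the IoT hub
    az iot hub routing-endpoint create \
        --hub-name iot-az220-training-{your-id} --resource-group rg-az220 \
        --endpoint-name vibrationLogEndpoint --endpoint-type azurestoragecontainer \
        --endpoint-resource-group rg-az220 --endpoint-subscription-id $subId \
        --connection-string "$storageConnection" --container-name vibrationcontainer \
        --encoding avro

    # Create the route that filters on the sensorID application property
    az iot hub route create \
        --hub-name iot-az220-training-{your-id} --resource-group rg-az220 \
        --route-name vibrationLoggingRoute --source devicemessages \
        --endpoint-name vibrationLogEndpoint \
        --condition "sensorID = 'VSLog'" --enabled true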

Task 4: Verify Data Archival

  1. Ensure that the device app you created in Visual Studio Code is still running.

    If not, run it in the Visual Studio Code terminal using dotnet run.

  2. On your Resources tile, to open your Storage account blade, click vibrationstore{your-id}.

    If your Resources tile does not list your Storage account, click the Refresh button at the top of the resource group tile, and then follow the instruction above to open your storage account.

  3. On the left-side menu of your vibrationstore{your-id} blade, click Storage Explorer (preview).

    You can use the Storage Explorer to verify that your data is being added to the storage account.

    Note: The Storage Explorer is currently in preview mode, so its exact mode of operation may change.

  4. In Storage Explorer (preview) pane, expand BLOB CONTAINERS, and then click vibrationcontainer.

    To view the data, you will need to navigate down a hierarchy of folders. The first folder will be named for the IoT Hub.

  5. In the right-hand pane, under NAME, double-click iot-az220-training-{your-id}, and then use double-clicks to navigate down into the hierarchy.

    Under your IoT hub folder, you will see folders for the Partition, then numeric values for the Year, Month, and Day. The final folder represents the Hour, listed in UTC time. The Hour folder will contain a number of Block Blobs that contain your logging message data.

  6. Double-click the Block Blob for the data with the earliest time stamp.

    The URL link will open in a new browser tab. Although the data is not formatted in a way that is easy to read, you should be able to recognize it as your vibration messages.

  7. Close the browser tab containing your data, and then navigate back to your Azure portal Dashboard.
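
You can perform a similar spot check from the Azure CLI by listing the blobs that the route has written to the container. A hedged example (this assumes your account is permitted to read the storage account keys):

    az storage blob list \
        --account-name vibrationstore{your-id} \
        --container-name vibrationcontainer \
        --output table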

Exercise 4: Logging Route Azure Stream Analytics Job

In this exercise, you will create a Stream Analytics job that outputs logging messages to Blob storage. You will then use Storage Explorer in the Azure Portal to view the stored data.

This will also let you confirm that the logging route you created in the previous exercise is delivering data as expected. As a reminder, that route has the following settings:

  • Name - vibrationLoggingRoute
  • Data Source - DeviceMessages
  • Routing query - sensorID = ‘VSLog’
  • Endpoint - vibrationLogEndpoint
  • Enabled - true

Note: It may seem odd that in this lab you are routing data to storage, and then also sending your data to storage through Azure Stream Analytics. In a production scenario you wouldn’t keep both paths long-term; the second path that you are creating here would likely not exist. You will use it in this lab environment as a way to validate that your routing is working as expected and to show a simple implementation of Azure Stream Analytics.

Task 1: Create the Stream Analytics Job

  1. On the Azure portal menu, click + Create a resource.

  2. On the New blade, in the Search the Marketplace textbox, type stream analytics and then click Stream Analytics job.

  3. On the Stream Analytics job blade, click Create.

    The New Stream Analytics job pane is displayed.

  4. On the New Stream Analytics job pane, under Name, enter vibrationJob.

  5. Under Subscription, choose the subscription you are using for the lab.

  6. Under Resource group, select rg-az220.

  7. Under Location, select the region that you are using for the labs in this course.

  8. Under Hosting environment, ensure that Cloud is selected.

    Edge hosting will be discussed later in the course.

  9. Under Streaming units, reduce the number from 3 to 1.

    This lab does not require 3 units and this will reduce costs.

  10. To create the Stream Analytics job, click Create.

  11. Wait for the Deployment succeeded message, then open the new resource.

    Tip: If you miss the message to go to the new resource, or need to find a resource at any time, select Home/All resources. Enter enough of the resource name for it to appear in the list of resources.

  12. Take a moment to examine your new Stream Analytics job.

    Notice that you have an empty job, showing no inputs or outputs, and a skeleton query. The next step is to populate these entries.

  13. On the left-side menu under Job topology, click Inputs.

    The Inputs pane will be displayed.

  14. On the Inputs pane, click + Add stream input, and then click IoT Hub.

    The IoT Hub - New input pane will be displayed.

  15. On the IoT Hub - New input pane, under Input alias, enter vibrationInput.

  16. Ensure that Select IoT Hub from your subscriptions is selected.

  17. Under Subscription, ensure that the subscription you used to create the IoT Hub earlier is selected.

  18. Under IoT Hub, ensure that your iot-az220-training-{your-id} IoT hub is selected.

  19. Under Endpoint, ensure that Messaging is selected.

  20. Under Shared access policy name, ensure that iothubowner is selected.

    Note: The Shared access policy key is populated and read-only.

  21. Under Consumer group, ensure that $Default is selected.

  22. Under Event serialization format, ensure that JSON is selected.

  23. Under Encoding, ensure that UTF-8 is selected.

    You may need to scroll down to see some of the fields.

  24. Under Event compression type, ensure None is selected.

  25. To save the new input, click Save, and then wait for the input to be created.

    The Inputs list should be updated to show the new input.

  26. To create an output, on the left-side menu under Job topology, click Outputs.

    The Outputs pane is displayed.

  27. On the Outputs pane, click + Add, and then click Blob storage/Data Lake Storage Gen2.

    The Blob storage/Data Lake Storage Gen2 - New output pane is displayed.

  28. On the Blob storage/Data Lake Storage Gen2 - New output pane, under Output alias, enter vibrationOutput.

  29. Ensure that Select storage from your subscriptions is selected.

  30. Under Subscription, select the subscription you are using for this lab.

  31. Under Storage account, click vibrationstore{your-id}.

    Note: The Storage account key is automatically populated and read-only.

  32. Under Container, ensure that Use existing is selected and that vibrationcontainer is selected from the dropdown list.

  33. Leave the Path pattern blank.

  34. Leave the Date format and Time format at their defaults.

  35. Under Event serialization format, ensure that JSON is selected.

  36. Under Encoding, ensure that UTF-8 is selected.

  37. Under Format, ensure that Line separated is selected.

    Note: This setting stores each record as a JSON object on its own line, which means the file as a whole is not a valid JSON document. The other option, Array, formats the entire document as a JSON array in which each record is an item in the array. This allows the entire file to be parsed as valid JSON.

  38. Leave Minimum rows blank.

  39. Under Maximum time, leave Hours and Minutes blank.

  40. Under Authentication mode, ensure that Connection string is selected.

  41. To create the output, click Save, and then wait for the output to be created.

    The Outputs list will be updated with the new output.

  42. To edit the query, on the left-side menu under Job topology, click Query.

  43. In the query editor pane, replace the existing query with the query below:

     SELECT
         *
     INTO
         vibrationOutput
     FROM
         vibrationInput
    
  44. Directly above the query editor pane, click Save Query.

  45. On the left-side menu, click Overview.
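
You are not asked to change the query in this lab, but it is worth knowing that the Stream Analytics query language also supports filtering and aggregation over time windows. The following is an illustrative sketch only - it assumes the vibration field from the logging messages and a 30-second tumbling window:

    SELECT
        System.Timestamp() AS windowEnd,
        AVG(vibration) AS averageVibration,
        MAX(vibration) AS maxVibration
    INTO
        vibrationOutput
    FROM
        vibrationInput
    GROUP BY
        TumblingWindow(second, 30)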

Task 2: Test the Logging Route

Now for the fun part. Does the telemetry your device app is pumping out work its way along the route, and into the storage container?

  1. Ensure that the device app you created in Visual Studio Code is still running.

    If not, run it in the Visual Studio Code terminal using dotnet run.

  2. On the Overview pane of your Stream Analytics job, click Start.

  3. In the Start job pane, leave the Job output start time set to Now, and then click Start.

    It can take a few moments for the job to start.

  4. On the Azure portal menu, click Dashboard.

  5. On your Resources tile, click vibrationstore{your-id}.

    If your Storage account is not visible, use the Refresh button at the top of the resource group tile.

  6. On the Overview pane of your Storage account, scroll down until you can see the Monitoring section.

  7. Under Monitoring, adjacent to Show data for last, change the time range to 1 hour.

    You should see activity in the charts.

  8. On the left-side menu, click Storage Explorer (preview).

    You can use Storage Explorer for additional reassurance that all of your data is getting to the storage account.

    Note: The Storage Explorer is currently in preview mode, so its exact mode of operation may change.

  9. In Storage Explorer (preview), under BLOB CONTAINERS, click vibrationcontainer.

    To view the data, you will need to navigate down a hierarchy of folders. The first folder will be named for the IoT Hub, the next will be a partition, then year, month, day and finally hour.

  10. In the right-hand pane, under Name, double-click the folder for your IoT hub, and then use double-clicks to navigate down into the hierarchy until you open the most recent hour folder.

    Within the hour folder, you will see files named for the minute they were generated. This verifies that your data is reaching the storage location as intended.

  11. Navigate back to your Dashboard.

  12. On your Resources tile, click vibrationJob.

  13. On the vibrationJob blade, click Stop, and then click Yes.

    You’ve traced the activity from the device app, to the hub, down the route, and to the storage container. Great progress! You will continue working with this Stream Analytics scenario in the next module, when you take a quick look at data visualization.

  14. Switch to the Visual Studio Code window.

  15. At the Terminal command prompt, to exit the device simulator app, press CTRL-C.
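
Note: For future runs, the Stream Analytics job can also be started and stopped from the Azure CLI. This is a hedged sketch that assumes the stream-analytics CLI extension is installed:

    # Start the job (output begins at the job start time by default)
    az stream-analytics job start --job-name vibrationJob --resource-group rg-az220

    # Stop the job when you have finished testing
    az stream-analytics job stop --job-name vibrationJob --resource-group rg-az220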

IMPORTANT: Do not remove these resources until you have completed the Data Visualization module of this course.