Posted 13 Dec 2018
Licensed CPOL

Azure Stream Analytics Job and Tools for Visual Studio

What if there were a tool that helped you create a Stream Analytics solution in Visual Studio, so that you didn't have to go to the browser and click through the portal manually? That's where the Azure Data Lake and Stream Analytics Tools extension comes into play.

Introduction

Creating and working with an Azure Stream Analytics job is always fun. And what if there were a tool that helped you create a Stream Analytics solution in Visual Studio, so that you didn't have to go to the browser and click through the portal manually? That's where the Azure Data Lake and Stream Analytics Tools extension comes into play. In my previous post, we already discussed what an Azure Stream Analytics job is and how to work with one; if you haven't read it yet, please do. Now let's go ahead and use the extension.

Background

Recently, I started working with the Azure IoT Device Kit, a wonderful microcontroller board with a lot of built-in sensors that can send information such as temperature and humidity to the Azure cloud. In this article, we will create a Visual Studio solution for our Stream Analytics job so that it can be moved to source control and managed easily.

Please note that the extension we are going to use is not yet supported by Visual Studio 2019; if you try to create the same project in VS2019, you will get an error saying that the version of Visual Studio is not supported.

The extension is not supported in VS2019.

Setting Up the Environment

I am assuming that you have a valid Azure Subscription and you have access to create the resources in it.

Modify Visual Studio with Required Workloads

Let’s go to our Visual Studio installer and modify the workloads now.

Install the Required Workloads

Once the workloads are modified, make sure that the extension is available in your installed extensions; you can check that by going to the Tools -> Extensions and Updates menu.

Azure Data Lake Extension

Creating a New Stream Analytics Project

Once the Extension is enabled, you should be able to create a new Azure Stream Analytics Application.

Azure Stream Analytics Application

The new project will contain the following files:

  1. Input.json
  2. Output.json
  3. JobConfig.json
  4. Script.asaql

To configure your subscription, please make sure that you have added your Subscription in the Server Explorer.

Azure Server Explorer

The Input.json file is a replica of your input job topology; double-clicking the file opens a configuration page where you can fill in the details.
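As a rough illustration, a configured input for an IoT Hub source might look something like the sketch below. The exact property names are generated by the tool, and the values shown here (hub name, policy, alias) are placeholders, not taken from the article:

```json
{
    "InputAlias": "streaminputs",
    "Type": "Data Stream",
    "DataSourceType": "IoT Hub",
    "IoTHubProperties": {
        "IotHubNamespace": "my-iot-hub",
        "SharedAccessPolicyName": "iothubowner",
        "Endpoint": "messages/events"
    },
    "SerializationType": "Json",
    "Encoding": "UTF8"
}
```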

Input Configuration

The Output.json file represents your output job topology; you can have as many outputs as you need. In my case, there is just one: a SQL Server database.
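For a SQL database output, the generated file would contain roughly the fields below. Again, this is only a sketch; the property names and values (server, database, table) are illustrative placeholders:

```json
{
    "OutputAlias": "streamoutputs",
    "DataSourceType": "SQL Database",
    "SqlDatabaseProperties": {
        "Server": "my-sql-server",
        "Database": "telemetry-db",
        "Table": "DeviceReadings",
        "User": "sqladmin"
    }
}
```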

Output Configuration

You can configure the job itself using the JobConfig.json file. When you do, be sure you understand the values you are providing and what each setting is for.

Data Locale is the locale of your input. The Output Error Handling option controls what happens when events fail to be written to the output: you can select either Drop or Retry. The Late Arrival Tolerance Window is the time frame you are willing to wait for an event to reach the IoT hub, measured as the difference between the event timestamp and the system time.
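Putting those settings together, a minimal job configuration could look like the following sketch (property names as assumed here; the late arrival tolerance is expressed in seconds):

```json
{
    "DataLocale": "en-US",
    "OutputErrorHandlingPolicy": "Drop",
    "EventsLateArrivalMaxDelayInSeconds": 5,
    "EventsOutOfOrderMaxDelayInSeconds": 0,
    "EventsOutOfOrderPolicy": "Adjust"
}
```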

Job Configuration

And Script.asaql is the file where you add your custom query, which gets data from the input and sends it to the output.

SELECT
    messageId,
    deviceId,
    temperature,
    humidity,
    pressure,
    pointInfo,
    IoTHub,
    EventEnqueuedUtcTime,
    EventProcessedUtcTime,
    PartitionId
INTO
    streamoutputs
FROM
    streaminputs
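The query above is a simple pass-through, but the Stream Analytics query language also supports windowed aggregations. As a sketch, assuming the same input and output aliases, a 30-second tumbling-window average temperature per device could be written as:

```sql
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp AS windowEnd
INTO
    streamoutputs
FROM
    streaminputs TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY
    deviceId,
    TumblingWindow(second, 30)
```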

Once everything is done, you are ready to submit the job to Azure. You can either create a new job or use an existing one.

Submit Job

When you submit, a new Stream Analytics view opens and the job starts automatically. You can always see the blobs created under your container by going to Cloud Explorer.

Cloud Explorer

Now just right-click on your solution, select "Add Solution to Source Control", and push it to your Git repository. Once you have added the solution to source control, your team members can easily update the input and output configuration, and you have a history of the changes.

Conclusion

In this article, we have learned how to:

  1. set up Visual Studio with the Data Lake and Stream Analytics tools
  2. use the Data Lake and Stream Analytics tools
  3. configure the input for Stream Analytics
  4. configure the Stream Analytics job
  5. add the solution to source control and share it with the team

Your Turn. What Do You Think?

Thanks a lot for reading. I will come back with another post on the same topic very soon. Did I miss anything you think is needed? Did you find this post useful? Please don't forget to share your feedback.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Sibeesh Passion
Software Developer
Germany Germany
I am Sibeesh Venu, an engineer by profession and writer by passion. I’m neither an expert nor a guru. I have been awarded Microsoft MVP 3 times, C# Corner MVP 5 times, DZone MVB. I always love to learn new technologies, and I strongly believe that the one who stops learning is old.

My Blog: Sibeesh Passion
My Website: Sibeesh Venu

Article Copyright 2018 by Sibeesh Passion