Machine to machine communication with IoT Edge and HiveMQ

Machine-to-machine (M2M) communication refers to two or more machines communicating with each other.
This communication can involve exchanging data, sending commands to regulate sensors, raising alarms, starting or stopping processes, etc., without human interaction.
Some of the protocols widely used in M2M communication are MQTT, OPC-UA, CoAP, and LwM2M.


In scenarios where devices/machines need to exchange data over an MQTT broker, Azure IoT Edge can be useful for managing the broker and other container deployments (subscribers, data processors, etc.) from the cloud. Another role of IoT Edge in these scenarios is to establish a secure connection to the cloud and to send telemetry to it.
HiveMQ is used as the broker; it is fully compliant with the MQTT standard (including MQTT 5.0), which makes it a good option for M2M communication, especially in cases where machines require specific MQTT features.
More about HiveMQ can be found here.
The following example is composed of these components:
– Publisher machine, which sends the telemetry (temperature and humidity)
– Subscriber machine (can also act as a publisher if required), which subscribes to the telemetry topic and reacts to the sensor data from the publisher machine
– IoT Edge as a module (container) deployment orchestrator and cloud communication gateway
– HiveMQ module (container) as the MQTT broker
– Subscriber module, which processes telemetry data before sending it to the cloud


Architecture of the sample

The architecture of the sample

MessageSender was used as the publisher machine. This is a UWP application that can send messages to various targets, including an MQTT broker. Any other MQTT client can be used for this purpose.
The subscriber, in this case, is a simple Java application that uses the HiveMQ MQTT client library.
The following code demonstrates how a topic listener can be implemented with the HiveMQ client library in Java:
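A minimal sketch using the HiveMQ MQTT Client library (com.hivemq:hivemq-mqtt-client); the hostname, client identifier, and topic below are assumptions, not values from the original sample:

```java
import com.hivemq.client.mqtt.MqttClient;
import com.hivemq.client.mqtt.mqtt5.Mqtt5AsyncClient;
import java.nio.charset.StandardCharsets;

public class TelemetrySubscriber {
    public static void main(String[] args) {
        Mqtt5AsyncClient client = MqttClient.builder()
                .useMqttVersion5()
                .identifier("subscriber-machine")   // assumed client id
                .serverHost("edge-gateway")         // assumed broker hostname
                .serverPort(1883)
                .buildAsync();

        client.connect()
                .thenCompose(connAck -> client.subscribeWith()
                        .topicFilter("machines/telemetry") // assumed topic
                        .callback(publish -> {
                            String payload = new String(
                                    publish.getPayloadAsBytes(), StandardCharsets.UTF_8);
                            // React to the sensor data from the publisher machine here
                            System.out.println("Received on " + publish.getTopic() + ": " + payload);
                        })
                        .send());
    }
}
```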


The core part of an IoT Edge deployment is the manifest file that describes the deployment. HiveMQ can be found on Docker Hub, and the following configuration is an example of how to pull the HiveMQ Docker image and start it via an IoT Edge deployment.
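A sketch of the relevant 'modules' entry; the module name is an assumption, while hivemq/hivemq4 is the image published on Docker Hub:

```json
"hivemq": {
  "version": "1.0",
  "type": "docker",
  "status": "running",
  "restartPolicy": "always",
  "settings": {
    "image": "hivemq/hivemq4:latest",
    "createOptions": "{\"HostConfig\":{\"PortBindings\":{\"1883/tcp\":[{\"HostPort\":\"1883\"}],\"8080/tcp\":[{\"HostPort\":\"8080\"}]}}}"
  }
}
```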


HiveMQ needs to bind port 1883 for MQTT and port 8080 for the broker dashboard.
The full code sample with ‘how to run’ instructions can be found on GitHub.
To make the sample work, the publisher and subscriber machines need an additional 'hosts' entry:
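For example, in C:\Windows\System32\drivers\etc\hosts on Windows (or /etc/hosts on Linux), with placeholders for the actual values:

```
<IP-of-IoT-Edge-host>    <edge-gateway-hostname>
```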

After running the sample and sending the message from the publisher, the subscriber machine console should show that the message came through the HiveMQ broker deployed on Azure IoT Edge.

Simulation of the Publisher machine


The subscriber machine console output after sending the message from the publisher

Finally, the Subscriber module makes sure that the message ends up on the module output, which is passed to Azure IoT Hub via routes in the Azure IoT Edge deployment manifest.
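A hedged sketch of such a route (the module and endpoint names are assumptions):

```json
"routes": {
  "subscriberToIoTHub": "FROM /messages/modules/SubscriberModule/outputs/output1 INTO $upstream"
}
```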



– HiveMQ documentation:
– IoT Edge documentation:

.NET Core Service Fabric IoT Sample project

Since Microsoft recommends building new applications on .NET Core, the topic of this blog is a simple Service Fabric IoT example based on .NET Core.
The idea is similar to, and based on, the official Service Fabric IoT example, which is based on .NET Framework.
This blog will point out some of the differences and offer a full solution on GitHub.
The Service Fabric application contains stateful services for consuming messages from the IoT Hub partitions, one stateful service partition per IoT Hub partition; that is how scaling is achieved. The service state keeps the event hub offset and the epoch.
The epoch ensures that there is only one receiver per consumer group, with the following rules:
a) If there is no existing receiver on a consumer group, then a receiver can be created with any epoch value.
b) If there is a receiver with epoch value e1 and a new receiver is created with an epoch value e2 where e1 <= e2, then the receiver with e1 is disconnected automatically and the receiver with e2 is created successfully.
c) If there is a receiver with epoch value e1 and a new receiver is created with an epoch value e2 where e1 > e2, then the creation of e2 fails with the error "A receiver with epoch e1 already exists".
The offset represents the point in time used to read all messages that arrived at the IoT Hub after that moment.
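The epoch rules can be sketched as a tiny simulation (illustrative only, not the actual Event Hubs service code):

```java
// Illustrative only: models the epoch rules for one partition/consumer group.
public class EpochRules {
    private Long currentEpoch; // null: no receiver exists yet (rule a)

    /** Tries to create a receiver with the given epoch; returns true on success. */
    public boolean tryCreateReceiver(long epoch) {
        if (currentEpoch == null || epoch >= currentEpoch) {
            // Rules a and b: any previous receiver is disconnected automatically.
            currentEpoch = epoch;
            return true;
        }
        // Rule c: "A receiver with epoch e1 already exists"
        return false;
    }

    public static void main(String[] args) {
        EpochRules partition = new EpochRules();
        System.out.println(partition.tryCreateReceiver(1)); // true  (rule a)
        System.out.println(partition.tryCreateReceiver(2)); // true  (rule b)
        System.out.println(partition.tryCreateReceiver(1)); // false (rule c)
    }
}
```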


The official Service Fabric IoT example (Iot.Ingestion.RouterService) uses NuGet packages with the WindowsAzure prefix, which causes some incompatibilities in a .NET Core based project.
That is why .NET Core based IoT applications should prefer packages with the Microsoft.Azure prefix.
The Service Fabric IoT sample based on .NET Core uses the following packages:
1. Microsoft.Azure.EventHubs
2. Microsoft.Azure.ServiceBus
These packages require a slightly different implementation when it comes to reading messages from IoT Hub partitions.
The following code shows how the method for creating the event hub receiver can be implemented with the .NET Core related packages:
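A sketch of such a method; only the Microsoft.Azure.EventHubs calls are from the SDK, while the method and variable names are assumptions, with the inputs expected to come from the service state described above:

```csharp
// Assumption: connection string, consumer group, partition, stored offset
// date and epoch come from the stateful service described above.
private PartitionReceiver CreateEventHubReceiver(
    EventHubClient eventHubClient,
    string consumerGroup,
    string partitionId,
    DateTime offsetDate,
    long epoch)
{
    // CreateEpochReceiver enforces a single active receiver per
    // consumer group/partition, following the epoch rules above.
    return eventHubClient.CreateEpochReceiver(
        consumerGroup,
        partitionId,
        EventPosition.FromEnqueuedTime(offsetDate),
        epoch);
}
```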

The code and other changes are available on GitHub.


1. Explanation of event hub epochs:
2. Service Fabric Sample IoT solution based on .NET Framework:

Azure Stream Analytics Anomaly detection on IoT Edge


A few months ago Microsoft announced a preview of the anomaly detection feature for Azure Stream Analytics.
In IoT cases where anomaly detection is required to reduce failures and damage, it is usually done by first collecting and labeling data, and then training a machine learning model to classify new data points against what was previously learned.
Anomaly detection for Azure Stream Analytics works in a similar way, but the difference is that there is no pre-trained model.
Azure Stream Analytics tries to learn from the incoming data and then creates a model that can determine whether incoming data is an anomaly.
The two approaches differ in pricing and in the reliability of the models.
This blog covers only the technical aspects of how to take advantage of this feature with IoT Edge and OPC Publisher, and does not focus on the reliability of such models in detail.


Picture 1: Architecture

To produce relevant data, the scheme above shows the setup with a Raspberry Pi and a temperature/humidity sensor.
The Raspberry Pi runs the Raspbian operating system and has a dummy OPC server installed that publishes temperature and humidity values.
Could this be done without an OPC-UA server and OPC-UA publisher? Yes, but this way is more interesting 🙂
Raspberry Pi with Raspbian OS has an application written in Python that reads humidity and temperature values from the sensor. Also, it runs a lightweight OPC server that publishes the values from the sensor.
IoT Edge contains three modules.
– OPC Publisher module,
– Azure Stream Analytics Module
– AnomalyDetectionHandling module.
The OPC-UA Publisher module gets the data from the OPC-UA server and passes it on.
After the OPC-UA publisher passes the data on, the Edge runtime makes sure that these messages reach the Stream Analytics module.
The Stream Analytics module takes the data and produces output only when an anomaly has been detected.
Finally, the AnomalyHandling module handles the anomalies by writing them to its output; the final destination of the anomalies is IoT Hub.
The module for handling the detected anomalies is the simple default one and could be extended with custom logic.
One example could be sending commands to stop the machine or to start a process that would normalize measured parameters in the production.


1. Python application
As previously mentioned, this application reads the data from a DHT11 temperature and humidity sensor and runs a local OPC server that publishes the data.
The FreeOPCUA Python library was used for the OPC server implementation. For reading the sensor values, this library was used.
Full code is available on GitHub.
2. IoT Edge
Since the custom OPC server for this project has some limitations and the focus of the project is not on that part, the OPC-UA publisher had to be slightly modified to make it work. In a real-world case this modification would not take place.
The modified version of the OPC-UA publisher is here (and is only valid for the purposes of this project).
The modification removes the following line of code:

Assuming that Docker is installed on the local machine and an IoT Hub is in place, the following steps are required to run the solution in the IoT Edge simulator for testing purposes:
a) Create an IoT Edge device in the Azure portal
b) Create a Container Registry and enable the Admin user
c) Run the following command:
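With placeholders for the registry values:

```shell
docker login <login-server> -u <username> -p <password>
```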

The login server is shown on the Overview page in the Azure portal. Credentials are available under Access keys.
d) Build the OPC publisher Docker image after navigating in the console to the folder where the project is saved:
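For example (the image name is a choice, not a requirement):

```shell
docker build -t <login-server>/opc-publisher:latest .
```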

e) Publish the OPC-UA publisher Docker image to the previously created container registry
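For example, with the same placeholder image name:

```shell
docker push <login-server>/opc-publisher:latest
```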

f) Create a new IoT Edge project with a C# module (AnomalyHandlingModule) in Visual Studio Code using the Command Palette
g) Build and push the IoT Edge solution, so that AnomalyHandlingModule ends up in the container registry

Configuring IoT Edge Deployment template file

To have the OPC-UA publisher running on IoT Edge, it is necessary to modify the deployment.template.json file, which is part of the IoT Edge solution.
This file references the Docker image from the container registry and specifies the options that are passed as parameters when running the OPC-UA publisher as a container on IoT Edge.
In the 'modules' section, the following lines need to be added:
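A hedged sketch of such an entry; the module name, image name, and published-nodes file name are assumptions, while the flags match the description that follows:

```json
"OPCPublisher": {
  "version": "1.0",
  "type": "docker",
  "status": "running",
  "restartPolicy": "always",
  "settings": {
    "image": "<login-server>/opc-publisher:latest",
    "createOptions": "{\"Cmd\":[\"--pf=/appdata/publishednodes.json\",\"--ns=true\",\"--fd=true\",\"--aa\"],\"HostConfig\":{\"Binds\":[\"C:/Test:/appdata\"]}}"
  }
}
```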


This configuration indicates that the OPC-UA publisher image can be found in the specified (previously created) container registry under the tag 'latest'.
It also specifies that the configuration file is in the /appdata folder, which binds to C:/Test. This means that all files placed in C:/Test (Windows running on the host machine) are available in the container's internal file system under /appdata.
The following flags set how the OPC-UA publisher works in this environment:
– "--ns=true" – no shutdown; OPC Publisher keeps running
– "--fd=true" – fetch display name; the OPC-UA publisher fetches the names of the variables that hold the values
– "--aa" – trust all certificates from the OPC-UA server; this is required when running the OPC-UA publisher on IoT Edge

Stream analytics

Azure Stream Analytics for IoT Edge needs to be created in the Azure portal.

Picture 2: Create Azure Stream Analytics Job for IoT Edge

Picture 3: Create Azure Stream Analytics Job for IoT Edge

Before deploying to IoT Edge, the Stream Analytics job for IoT Edge needs to have an Azure storage account assigned.

Picture 4: Assigning Azure Storage account to ASA job

The input of the ASA job is the Edge hub, and the output is the Edge hub as well.
There are two outputs, for temperature and humidity anomalies.

Picture 5: ASA job outputs

The following code shows the query for anomaly detection:
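A hedged reconstruction of such a query for the humidity output, based on the parameter description that follows; the input/output names and column aliases are assumptions, and the temperature branch is analogous:

```sql
WITH AnomalyDetectionStep AS
(
    SELECT
        EventEnqueuedUtcTime AS time,
        CAST(humidity AS float) AS humidity,
        AnomalyDetection_SpikeAndDip(CAST(humidity AS float), 80, 120, 'spikes')
            OVER (LIMIT DURATION(second, 120)) AS scores
    FROM input
)
SELECT
    time,
    humidity,
    CAST(GetRecordPropertyValue(scores, 'Score') AS float) AS score,
    CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS bigint) AS isAnomaly
INTO humidityAnomalies
FROM AnomalyDetectionStep
WHERE CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS bigint) = 1
```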


The key part here is in the following lines:
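Roughly, the anomaly detection call looks like this (column and alias names are assumptions):

```sql
AnomalyDetection_SpikeAndDip(CAST(humidity AS float), 80, 120, 'spikes')
    OVER (LIMIT DURATION(second, 120)) AS scores
```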


This line of code indicates that 'spike and dip' is the function used for anomaly detection (there is also a 'change point' function).
The first parameter of this function is the value the function will track. The second parameter is the confidence level, 80% in this case, and the third parameter is how many events the ASA job should consider for model training.
It is recommended to include only the necessary number of events, for better performance.
Finally, the fourth parameter represents the mode. In this case, only spikes are tracked. The other options are 'dips', to detect dips, and 'spikesanddips', to detect both spikes and dips.
The following part of the code prepares the properties for the output in case an anomaly has been detected.
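Roughly (the alias names are assumptions):

```sql
CAST(GetRecordPropertyValue(scores, 'Score') AS float) AS score,
CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS bigint) AS isAnomaly
```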


'Score' represents a value between 0 and 1: the smaller the score, the higher the chance of an anomaly. The 'IsAnomaly' flag returns 1 if the event is an anomaly; otherwise it returns 0.
Finally, this query sends to the output all events that are recognized as anomalies.


Deploying to the Edge and incorporating with other modules


Deploying the Azure Stream Analytics query to IoT Edge can be done through the Azure portal.
1. Set the IoT Edge device modules through the Azure portal

Picture 6: Set ASA job as a module on IoT Edge

2. Clicking the 'Add' button and selecting 'Azure Stream Analytics Module' opens the following form

Picture 7: Selecting previously created ASA job for IoT Edge

By confirming with 'Save', the Azure Stream Analytics job for IoT Edge is packed as a zip file in the previously assigned storage account.
The remaining part is to 'tell' IoT Edge how to fetch this package and start the job.
Clicking on the Azure Stream Analytics anomaly detection module provides the information needed to fetch this package and start the job on IoT Edge.

Picture 8: Getting required information for deployment.template.json file

This information needs to be copied into the deployment.template.json file, below the information about the edge hub:
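The copied section roughly takes this shape; the module name is an assumption and the actual property values must be taken from the portal blade shown above:

```json
"AnomalyDetectionASA": {
  "properties.desired": {
    "ASAJobInfo": "<SAS URL of the published job package in the storage account>"
  }
}
```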


The modules section of the same file should contain the following (below the OPC-UA publisher module):
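A hedged sketch of the module entry; the module name and image tag are assumptions, while the image repository is the Azure Stream Analytics edge image on the Microsoft Container Registry:

```json
"AnomalyDetectionASA": {
  "version": "1.0",
  "type": "docker",
  "status": "running",
  "restartPolicy": "always",
  "settings": {
    "image": "mcr.microsoft.com/azure-stream-analytics/azureiotedge:1.0.1",
    "createOptions": "{}"
  }
}
```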


With all this in place, the IoT Edge solution knows how to route messages and which containers to start, with which parameters.
The full IoT Edge solution, with the full deployment.template.json file, can be found on GitHub.

Observations and conclusions

The solution can be started in IoT Edge simulator.

Picture 9: Starting a solution in IoT Edge simulator

Two minutes after starting the simulator, the sensor was manually stimulated and the first results showed up in the console.

Picture 10: First results in IoT Edge console

Picture 11: Anomaly detected and passed to IoT Hub

In this case, the humidity value 19 is marked as an anomaly (normal was 18). Shortly after the sensor stimulation, humidity jumps to 25, which was expected to be detected as an anomaly.
When the simulation is stopped and started again after a period longer than 120 s (as set in the query), the first couple of events are detected as anomalies.

Picture 12: Events that are not anomalies, detected as anomalies

This leads to the conclusion that the model becomes more and more reliable over time, which makes sense, as Azure Stream Analytics learns from the incoming data. This is one way to get a cheaper version of anomaly detection on the edge. Increasing the number of events included in scoring might impact query execution performance, so for more reliable models, and for use cases that require better precision and models trained on bigger data sets, it is recommended to use machine learning techniques.
On the other hand, an advantage of having Azure Stream Analytics anomaly detection on the edge is that each machine can have its own model, trained only on the data produced by that machine. In industrial cases this is sometimes desired, since the machines may be installed in different environments and have different external influences.


1. Azure Stream Analytics on IoT Edge:
2. Azure Stream Analytics Anomaly Detection:
3. OPC-UA publisher:

Authenticating downstream devices with x.509 certificates

One of the most common scenarios in the industry is to provide one common interface/gateway for devices to connect and send data to the cloud.
Some of the reasons for this might be:
– Devices do not have access to a public network at all times
– Protocol translation needs to be done before data reaches the cloud
– Identity translation needs to be done before data reaches the cloud
– Data filtering and processing need to be done before data reaches the cloud
– Security, updates, management, etc.
Microsoft offers Azure IoT Edge as a powerful tool with huge potential as a solution for these scenarios.
This blog post focuses on a case where IoT Edge acts as a gateway that simply passes communication between the devices and IoT Hub.
The official documentation calls this case a 'transparent gateway'.

Picture 1.

For the testing purpose, end-to-end components are:
– Downstream device – generates messages
– IoT Edge as a gateway – passes communication between downstream device and IoT Hub
– Azure IoT Hub as a message broker and device management service
– Azure Function that consumes messages from IoT Hub for testing purposes

Setting up IoT Edge

Below are the steps from the documentation required before writing any code:
1. Set up Azure IoT Edge on a Linux Ubuntu 18.04 VM
2. In the Azure portal, create an IoT Hub and add a new Edge device with symmetric key as the authentication type
   a) Copy the connection string
3. In the Azure portal, create two new devices, one with the "Sas" authentication type and the other with the CertificateAuthority authentication type

Picture 2

    a) Copy the “Sas” device connection string
4. Create certificates, using the deviceIds from steps 2 and 3 when creating the device and edge certificates
   a) Add the root certificate to IoT Hub and follow the steps for certificate verification
5. Make sure to store the following certificates on the Linux VM on which IoT Edge is running:
– Root CA certificate
– new-edge-device.key.pem – Edge device private certificate
– new-edge-device-full-chain.cert.pem – Edge full chain certificate
6. Edit the config.yaml file located in /etc/iotedge. Since this file is write protected, it is necessary to change its permissions.
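For example (the permission bits are a choice, not a requirement):

```shell
sudo chmod 666 /etc/iotedge/config.yaml
```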

To edit the file, the following command can be used:
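For example, with nano as the editor of choice:

```shell
sudo nano /etc/iotedge/config.yaml
```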

7. The provisioning section in config.yaml should be:
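A sketch of the manual provisioning section; the placeholder must be replaced with the connection string copied in step 2a:

```yaml
provisioning:
  source: "manual"
  device_connection_string: "<edge device connection string from step 2a>"
```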

8. The certificates section in config.yaml should look like this:
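A sketch with the certificate files from step 5; the folder path and the root CA file name are placeholders:

```yaml
certificates:
  device_ca_cert: "<path>/new-edge-device-full-chain.cert.pem"
  device_ca_pk: "<path>/new-edge-device.key.pem"
  trusted_ca_certs: "<path>/<root CA certificate>.pem"
```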

9. In the Azure portal, routes should be set as:
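For a transparent gateway, a single upstream route that forwards all device messages to IoT Hub is typically enough (the route name is a choice):

```json
"routes": {
  "upstream": "FROM /messages/* INTO $upstream"
}
```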


Picture 3

Follow the steps in the Azure portal and finish the module deployment to the Edge device. After this step, the routes should be updated.
10. To double-check that everything works, the IoT Edge runtime can be restarted
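```shell
sudo systemctl restart iotedge
```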

and the following command should return the status 'Active':
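```shell
sudo systemctl status iotedge
```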

Picture 4

Side note: Make sure that the downstream device (the host machine in this case) can ping the IoT Edge host. If the ping is rejected, the DNS settings may need to be checked.
At this point, IoT Edge is in place and works as a transparent gateway.

Setting up the downstream device

As presented in Picture 1, to establish secure communication between the downstream device and the Edge device, it is necessary to install the root certificate on the downstream device. This can be done in code or simply by importing the certificate into the certificate store. Downstream devices with the "Sas" authentication type should contain code similar to the following:
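A sketch using the Microsoft.Azure.Devices.Client SDK; everything except the SDK calls, including the payload and the placeholder values, is an assumption:

```csharp
using System;
using System.Security.Cryptography.X509Certificates;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class Program
{
    // See the explanation of these two values right below the snippet.
    private const string CA_CERTIFICATE_PATH = "<path to the root CA certificate>";
    private const string ConnectionString =
        "HostName=...;DeviceId=...;SharedAccessKey=...;GatewayHostName=<hostname from config.yaml>";

    static void InstallCACert()
    {
        // Trust the Edge device certificate chain by installing the root CA.
        using (var store = new X509Store(StoreName.Root, StoreLocation.CurrentUser))
        {
            store.Open(OpenFlags.ReadWrite);
            store.Add(new X509Certificate2(CA_CERTIFICATE_PATH));
        }
    }

    static async Task Main()
    {
        InstallCACert();
        var deviceClient = DeviceClient.CreateFromConnectionString(ConnectionString, TransportType.Mqtt);
        var message = new Message(Encoding.UTF8.GetBytes("{\"temperature\": 21.5}"));
        await deviceClient.SendEventAsync(message);
        Console.WriteLine("Message sent through the gateway");
    }
}
```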

CA_CERTIFICATE_PATH – path on the file system to the root certificate
ConnectionString – connection string retrieved from IoT Hub (step 3a) with ';GatewayHostName=hostname_from_config.yaml' appended. Without GatewayHostName, messages would be sent directly to IoT Hub after the initial authentication.
Running this code results in:

Picture 5. Downstream Device, Azure IoT Edge and Azure function outputs. Messages are passed through the Azure IoT Edge to IoT Hub and then consumed by Azure Function

Things are slightly different for downstream devices with the CA authentication type.
1. First, in the Azure portal and IoT Hub, a parent-child relation must be set between the edge device and the downstream devices.
Side note: At the time of writing this post, the documentation does not state whether this is always required, even for "Sas" devices. For "Sas" devices it works without this setting, but for the "CA" authentication type it is a must.

Picture 6.

This sets an attribute called deviceScope in the device twin of the downstream device, which matches the same attribute in the device twin of the edge device.

Picture 7.

Side note: At the time of writing this post, there is no public API or SDK that enables setting this attribute from code. This is also a limitation for the Device Provisioning Service, which is currently not able to provision downstream devices.
2. Not only the root certificate but also the device certificate needs to be installed in the local certificate store. Also, the gateway hostname needs to be specified:
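A sketch with DeviceAuthenticationWithX509Certificate; the hostnames, file names, and password are placeholders, not values from the original sample:

```csharp
using System.Security.Cryptography.X509Certificates;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class Program
{
    static async Task Main()
    {
        // Device certificate (with private key) issued by the root CA.
        var deviceCert = new X509Certificate2("<device certificate>.pfx", "<password>");
        var auth = new DeviceAuthenticationWithX509Certificate("<deviceId>", deviceCert);

        // The gateway hostname routes the connection through IoT Edge.
        var deviceClient = DeviceClient.Create(
            "<iot-hub-name>.azure-devices.net",
            "<gateway hostname from config.yaml>",
            auth,
            TransportType.Mqtt);

        await deviceClient.SendEventAsync(
            new Message(Encoding.UTF8.GetBytes("{\"humidity\": 45}")));
    }
}
```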

Side note: x.509 authentication is enabled by default as of IoT Edge v1.0.6. In version 1.0.5 it was not enabled by default, and an environment variable had to be added to the Edge hub module:

Picture 8.

Picture 9.

The code is available on GitHub.


Azure IoT Edge has evolved with new features and capabilities. Authentication of downstream devices with x.509 certificates is definitely one of them, considering that many IoT solutions using Azure IoT Edge also have devices that use certificates for authentication. As a next step, it would be nice to have the Device Provisioning Service support 'zero-touch' provisioning of downstream devices. That way, 'zero-touch' installations with downstream devices and IoT Edge gateways would be possible in the field. I believe that Microsoft is heading towards the 'zero-touch' provisioning concept. The expectation is that all information about device relations is synced with IoT Hub after the initial authentication, so that code changes are avoided during installation in the field. Looking forward to new releases!


1. IoT Edge as a gateway:
2. Create a transparent gateway:
3. Connect downstream device:
4. Setup x.509 security in IoT Hub:
5. Connect a downstream device to an Azure IoT Edge Gateway: