In this tutorial you will learn how to integrate Azure services with machine learning on the NVIDIA Jetson Nano (an ARM64 device) using Python. By the end, you will have a low-cost DIY solution for object detection within a space and a solid understanding of integrating an ARM64 platform with Azure IoT services and machine learning.
- Phase One focuses on setting up the related services and environment.
- Phase Two focuses on setting up the Jetson Nano to be ready for IoT Edge Development with Azure and ONNX Runtime.
- Phase Three focuses on deploying an IoT Edge module to the device (combining phases one and two together).
- Phase Four focuses on visualizing the data in Power BI via Azure Blob Storage.
You will get the most out of this tutorial if you are comfortable with:
- Working with Azure
- Understanding the fundamentals of machine learning
- Working with ARM64 devices (such as the Jetson Nano)
- Coding in Python
We expect the full tutorial to take an experienced developer ~3 hours.
- Phase one: ~60 minutes (~20 minutes of work, the rest is load time)
- Phase two: ~60 minutes (~20 minutes of work, the rest is load time)
- Phase three: ~30 minutes (~10 minutes of work, the rest is load time)
- Phase four: ~20 minutes (depending on your visualization needs)
This phase focuses on setting up the Azure cloud components and the Visual Studio Code environment. You can skip it if you already have a working setup for Visual Studio Code and Azure services.
Phase two focuses on setting up the Jetson Nano to be ready for IoT Edge Development with Azure and ONNX Runtime.
- 1 Jetson Nano
- 1 32+ GB microSD card and a microSD to SD card adapter
- 1 USB Mouse and USB Keyboard
- 1 Ethernet cable (cat 6)
- 1 USB camera or 1 IP camera
- 1 Laptop with SD card capability
- Highly recommended
- 1 female-to-female jumper
- 1 5V 4A (20W) barrel jack power supply
- Important: If you are using the 5V barrel jack power supply, short the two pins on the Jetson directly above the power jack (this is what the female-to-female jumper is for) so the board draws power from the barrel jack instead of USB.
- Follow these instructions from NVIDIA to flash your Jetson.
- When you insert the microSD card into your computer, several windows and drive prompts may pop up. Ignore or close them and continue with the flashing instructions.
- Once flashing is finished, turn on the Jetson and open the terminal (Ctrl + Alt + T).
- Attach the Ethernet cable, power supply, female-to-female jumper, USB keyboard, USB mouse, and HDMI cable to the Jetson.
- Plug the Ethernet cable into a working Ethernet port.
- Plug the power supply into a reliable outlet.
- Plug the other end of the HDMI cable into the monitor and power it on.
- If a loading bar appears after logging on and does not move after a minute, select Cancel.
- Ensure that the Jetson is connected to Ethernet.
- Note the IP address of the Jetson by running:
ifconfig
- Run the following command in the terminal on the Jetson to disable the GUI:
sudo systemctl set-default multi-user.target
Note: This change is critical for preserving RAM and GPU memory for deploying the ONNX model in phase three.
- Reboot to apply this change:
sudo reboot
- Once the Jetson starts up again, you should see a black monitor with white text in the top corner.
- On your own computer, type the following into the terminal:
ssh <Jetson user name>@<IP address>
For example: ssh myusername@10.123.12.12
- Install missing packages with sudo apt-get install <PACKAGENAME>; some packages come preinstalled while others may not.
- OpenCV 3.3.1 should come preinstalled on the image provided by NVIDIA, but if it is not, or you need a newer version, you will have to build it from source for ARM64.
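A quick way to confirm what is actually on the image before moving on, assuming the NVIDIA image ships Python 3 with the cv2 bindings (a minimal check, not part of the official setup):

```python
# Quick sanity check for the preinstalled OpenCV bindings and the USB camera.
import cv2

print("OpenCV version:", cv2.__version__)

# Assumes a USB camera on index 0; adjust the index if you use a different camera.
cap = cv2.VideoCapture(0)
print("Camera opened:", cap.isOpened())
cap.release()
```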
First, we will install the daemon and some prerequisites in order to register the Jetson as an IoT Edge device.
- Update and upgrade the system packages:
sudo apt-get update -y && sudo apt-get upgrade -y
- Note: This may take ~10 minutes. If prompted to restart, you should restart.
- Install curl:
sudo apt-get install curl
- Fix any missing or broken dependencies:
sudo apt-get install -f
- Install the iotedge daemon by following these instructions:
- Register the Microsoft key and software repository feed for Ubuntu Server 18.04:
curl https://packages.microsoft.com/config/ubuntu/18.04/multiarch/prod.list > ./microsoft-prod.list
- Copy the generated list:
sudo cp ./microsoft-prod.list /etc/apt/sources.list.d/
- Install the Microsoft GPG public key:
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo cp ./microsoft.gpg /etc/apt/trusted.gpg.d/
- Perform an apt update:
sudo apt-get update
- Install the security daemon (the package is installed at /etc/iotedge/):
sudo apt-get install iotedge
(Source for the above instructions.)
- (Optional) Verify your Linux kernel for Moby compatibility.
- Register a new IoT Edge device in your IoT Hub by following this tutorial.
- In the Azure Portal, your IoT Hub should look like this:
- Be sure to copy the Primary Connection String of your device, as you will need it for the next step.
- From the SSH terminal, run this command:
sudo vim /etc/iotedge/config.yaml
- Scroll until you see:
device_connection_string: "<ADD DEVICE CONNECTION STRING HERE>"
and paste the Primary Connection String between the quotes. This step establishes a connection from the IoT Hub to the Jetson Nano as an IoT Edge device.
- Save and exit the file, then restart iotedge:
sudo systemctl restart iotedge
For the edgeHub module to run, a deployment needs to be configured. Right now there is none, so edgeHub will not be running. Later, when you build and push your modules to the device, a deployment will be generated for edgeHub. Since we have not done this yet, but want to ensure both edgeHub and edgeAgent run, we will create a blank deployment.
- In the portal under the device, select Set Modules
- Do not change anything, just select Next twice and then Submit
You should see the modules running successfully; keep in mind edgeHub can take several minutes to start. If not, see the Troubleshooting section at the end of this phase.
Running sudo iotedge list on the Nano should yield something like:
To fix internet connectivity issues where the agent module (and other launched containers) are unable to connect to IoT Hub:
- Run sudo ln -sf /run/systemd/resolve/resolv.conf /etc/resolv.conf on the host device. This points name resolution at the correct DNS servers.
If running sudo iotedge list did not yield output like the image above, try one of these commands:
- Run sudo systemctl status iotedge to view the status of the IoT Edge Security Manager.
- Run sudo journalctl -u iotedge -f to view the logs of the IoT Edge Security Manager.
If you get a 400 error for your edgeAgent module, there is a chance you typed something incorrectly in the Advanced Runtime Settings. Make sure everything there is correct word for word.
For more help on troubleshooting Azure IoT Edge, go here.
Phase three of this tutorial focuses on deploying an object detection model on your Jetson.
- Clone this repo to your local drive / computer.
If you completed phases one and two, you should have the following dependencies already. Feel free to double check them.
- Install the Azure Account, Azure IoT Edge, and Azure IoT Hub Toolkit extensions in VS Code.
- These will allow you to build and push IoT Edge solutions to a container registry and view output from the Jetson in VS Code.
- On your computer (not the Jetson), open the folder for this repo in VS Code.
- Note: If you downloaded the repo as a zip file, there may be two onnxruntime-iot-edge-master folders when you unzip, one nested inside the other. Open the inner one.
- Select View > Command Palette to open the VS Code command palette.
- In the command palette, enter and run the command Azure: Sign in and follow the instructions to sign into your Azure account.
- Open the .env file and edit the variables with the credentials of the container registry and storage account that you set up in phase one. You can find the details in the Azure Portal:
- The container registry credentials are under Your resource group > Your container registry > Access keys.
- The storage account details are on the Access keys tab of the Storage account page.
- Note the Username and Login server. Enable Admin user; this generates two passwords, and the first one is the password you need. Fill in the .env file so that it has the following entries filled with your specific resources:
CONTAINER_REGISTRY_USERNAME="<username>"
CONTAINER_REGISTRY_PASSWORD="<password>"
CONTAINER_REGISTRY_ADDRESS="<Login server>"
MY_STORAGE_ACCOUNT_NAME="<Storage account name>"
MY_STORAGE_ACCOUNT_KEY="<access key>"
MY_STORAGE_CONNECTION_STRING="<Connection string>"
- In the CameraCaptureModule directory, edit the file camerainfo.csv so that each line holds the camera number and the name of the camera delimited with a ','. The current csv is set for a camera with the name cam1 and camera number 0.
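For reference, here is a minimal sketch of how a capture module could consume a camerainfo.csv in that format with OpenCV. This is illustrative only, not the repo's actual CameraCaptureModule code, and it assumes the csv sits alongside the script:

```python
# Illustrative sketch: read camerainfo.csv (camera number, camera name) and open each camera.
import csv
import cv2

cameras = {}
with open("camerainfo.csv") as f:
    for row in csv.reader(f):
        if not row:
            continue  # skip blank lines
        cam_number, cam_name = int(row[0]), row[1].strip()
        cameras[cam_name] = cv2.VideoCapture(cam_number)

# Grab one frame from each camera to confirm it is reachable.
for name, cap in cameras.items():
    ok, _frame = cap.read()
    print(f"{name}: frame captured = {ok}")
    cap.release()
```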
- Within the InferenceModule directory, main.py is also where blob storage is set up. By default we use blob storage, and the necessary resources for it were created in phase one. If you do not wish to use it, change the variable CLOUD_STORAGE to False.
- Then, in your deployment.template.json file, find the last occurrence of azureblobstorageoniotedge. This is where the device twin properties of your blob storage module are set.
- Change the cloudStorageConnectionString variable to your cloud storage connection string where it says "<insert cloud storage connection string here>". You can find the connection string in the portal, in your storage account under the Access keys tab.
- Change the variable LOCAL_STORAGE_ACCOUNT_NAME to the container you created in your storage account during phase one (i.e. storagetestlocal).
- Change the variable LOCAL_STORAGE_ACCOUNT_KEY to your generated local storage account key. You can use this generator here.
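If you prefer to generate the local storage account key yourself instead of using the linked generator, any sufficiently long base64 string works; a minimal sketch (the 64-byte length is an assumption, not a repo requirement):

```python
# Generate a random base64-encoded key suitable for the local blob storage account.
import base64
import os

key = base64.b64encode(os.urandom(64)).decode("ascii")
print(key)
```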
- In the InferenceModule directory, adjust the variable block_blob_service in main.py to hold the connection string to the local blob storage account. You can find information about configuring connection strings here, or just replace the given < > placeholders with what is required.
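For orientation, this is roughly the shape that configuration takes with the azure-storage-blob 2.x SDK. It is a sketch only; main.py already defines block_blob_service, and the endpoint below uses the module name azureblobstorageoniotedge and the default port 11002 documented for Azure Blob Storage on IoT Edge, so adjust it to match your deployment:

```python
# Sketch of connecting to the local blob storage module from another IoT Edge module.
from azure.storage.blob import BlockBlobService  # azure-storage-blob 2.x

LOCAL_CONNECTION_STRING = (
    "DefaultEndpointsProtocol=http;"
    "BlobEndpoint=http://azureblobstorageoniotedge:11002/<LOCAL_STORAGE_ACCOUNT_NAME>;"
    "AccountName=<LOCAL_STORAGE_ACCOUNT_NAME>;"
    "AccountKey=<LOCAL_STORAGE_ACCOUNT_KEY>"
)

block_blob_service = BlockBlobService(connection_string=LOCAL_CONNECTION_STRING)
```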
- Run the following in the SSH terminal:
sudo mkdir /home/storagedata
- Before deploying, if you are using a 10 Watt power supply, it will not reliably provide the 10 Watts the board needs, so put the Jetson in low power mode (5 Watt mode).
- In the SSH terminal, run:
sudo nvpmodel -m 1
- To check which power mode the device is in, run:
sudo nvpmodel -q
- If you are using the 20 Watt barrel jack power supply we recommend, ensure the Jetson is running at full strength.
- To check which power mode the device is in, run:
sudo nvpmodel -q
- If the device is not in 10 Watt mode, run:
sudo nvpmodel -m 0
- Right click on deployment.template.json, then select Build and Push IoT Edge Solution. Behind the scenes, this runs two Docker commands: one to build your container and another to push it to the container registry. This step may take some time (~15 minutes).
- Note: Every time changes are made and you want to re-deploy the modules, the module version must be incremented or changed. In module.json, change the version number before selecting Build and Push IoT Edge Solution.
- At the bottom left corner of VS Code, you should see a drop-down menu labeled AZURE IOT HUB. Expand it and select IoT Hub. Follow the prompts that appear in the command palette at the top and select the IoT Hub you created.
- After selecting the hub, click on the Devices drop-down menu. You should be able to see your device like this:
- Right click on the device and select Create Deployment for Single Device. This will open a File Explorer window. Navigate into the config folder and select the deployment file.
- The File Explorer should close and the output terminal in VS Code should look something like this:
- Either in the Jetson terminal itself, or in a terminal that is SSH'd into the Jetson, you can verify that the PreModule and InferenceModule are running by typing sudo iotedge list. It should yield something like this (your module versions may be different):
- To view the output of the model in VS Code, select the device in the Azure IoT Hub device menu and select Start Monitoring Built-in Event Endpoint. Your terminal should look like this:
- You should be able to see the output. You can select the lock icon in the top right corner to lock the toggle at the bottom of the terminal window; now you can see the output in real time:
- Once your modules are up and running on your Jetson, you should be able to see inference outputs in the portal in your storage account!
- Go to your storage account and select the Blobs tab.
- There should be a storage container called storagetest. Select it and you will see your results stored as blobs!
- To see what is in a blob, select it and then select Edit Blob.
- The output displayed only shows the labels and confidence scores for objects above a threshold confidence score.
- The output of the TinyYOLO model contains more information, such as the confidence scores for each of the 20 labels and the coordinates of the detected objects in the frame. If you would like to see this additional information, feel free to modify the inference Python file in the ARM64_EdgeSolution.
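As a rough illustration of that post-processing step, here is a minimal sketch of filtering decoded detections by a confidence threshold. The detection structure, field names, and threshold value are hypothetical; the repo's inference code has its own decoding of the raw Tiny YOLO output tensor:

```python
# Illustrative post-processing sketch -- not the repo's inference code.
CONFIDENCE_THRESHOLD = 0.5  # hypothetical threshold

# Detections assumed already decoded from the Tiny YOLO output tensor.
detections = [
    {"label": "person", "confidence": 0.84, "box": (120, 45, 260, 300)},
    {"label": "chair", "confidence": 0.31, "box": (300, 200, 380, 330)},
]

# Keep only detections above the threshold, mirroring what gets stored as output.
filtered = [d for d in detections if d["confidence"] >= CONFIDENCE_THRESHOLD]
for d in filtered:
    print(f'{d["label"]}: {d["confidence"]:.2f} at {d["box"]}')
```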
If you don't see your module as 'Running':
- Run sudo journalctl -u iotedge -f and see if the image is being pulled. The message should look like this:
Jul 02 10:15:49 tutorial-jetson iotedged[10175]: 2019-07-02T17:15:49Z [INFO] - Pulling image tutorialregistryjetson.azurecr.io/<modulename>:0.0.1-arm64...
- If it is still not working, restart iotedge with sudo systemctl restart iotedge, then check again.
For further debugging, you can try these commands:
- Run sudo systemctl status iotedge to view the status of the IoT Edge Security Manager.
- Run sudo journalctl -u iotedge -f to view the logs of the IoT Edge Security Manager.
- Run sudo docker logs <module name> to view specific error logs for a module.
For more help on troubleshooting Azure IoT Edge, go here.
This phase focuses on visualizing the data gathered from the IoT device, based on the model's inference results stored in Azure Blob Storage, using Power BI.
Congratulations! In this tutorial you learned how to integrate Azure with machine learning on the Jetson Nano (an ARM64 device) using Python and completed the following tasks:
- Set up all relevant Azure cloud components
- Configured a Jetson Nano for IoT Edge Development with Azure and ONNX Runtime
- Deployed an object detection model on the Jetson Nano
- Visualized data in Power BI via Azure Blob Storage
- Deploy your own model!
- Check out ONNX's pre-made model zoo here for models to download and deploy.
- Create your own model using Azure Machine Learning or Custom Vision.
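If you swap in a different model, a quick local check with ONNX Runtime before building the module can save a deployment cycle. A minimal sketch, where the file name is a placeholder and the dummy input shape (1x3x416x416, typical for Tiny YOLO v2) must be adjusted to whatever model you download:

```python
# Local sanity check for a downloaded ONNX model before packaging it into the module.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx")  # placeholder path
input_meta = session.get_inputs()[0]
print("Input:", input_meta.name, input_meta.shape)

# Feed a dummy tensor shaped for the model (adjust to your model's expected input).
dummy = np.random.rand(1, 3, 416, 416).astype(np.float32)
outputs = session.run(None, {input_meta.name: dummy})
print("Output shapes:", [o.shape for o in outputs])
```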
- Create a dashboard for your Power BI report by following this tutorial.
If you are seeking additional help, please visit:
This project was created with active contributions from Abhinav Ayalur, Angela Martin, Kaden Dippe, Kelly Lin, Lindsey Cleary, and Priscilla Lui.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Security issues and bugs should be reported privately, via email, to the Microsoft Security Response Center (MSRC) at secure@microsoft.com. You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Further information, including the MSRC PGP key, can be found in the Security TechCenter.
Copyright (c) Microsoft Corporation. All rights reserved.
MIT License
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.