
Sensorial fusion via QNNs

Project info

  • Team number: xohw20_215

  • Project name: Sensorial fusion via QNNs

  • Date: 30/06/2020

  • Version of uploaded archive: 1.0

  • University name: University of Zaragoza

  • Supervisor name: Nicolas Medrano

  • Supervisor e-mail: nmedrano@unizar.es

  • Participant(s): Daniel Enériz Orta

  • Email: eneriz@unizar.es

  • Board used: PYNQ-Z2

  • Software Version: PYNQ image 2.5.1

  • Brief description of project: This project presents the design, implementation and usage of a quantized neural network implemented on the PYNQ's FPGA to perform sensorial fusion, specifically the combination of 16 gas sensors to estimate the concentration levels of two gases in the atmosphere.

  • Link to project repository: https://github.com/eneriz-daniel/sensorialfusionQNNs

  • Link to YouTube Video(s): https://youtu.be/NJG7mib3UBc

Archive description

Instructions to build and test the project

[Figure: Raw data]

The Virtualization notebook guides you through obtaining the quantized neural network parameters. The only additional thing you may need is the HLS simulation of the model using fixed-point data types; for that, you can jump to the next step and then come back to the second training stage.

[Figure: Quantized PyTorch prediction results]
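As an orientation for this step, here is a minimal sketch (not the notebook's actual code) of rounding trained PyTorch parameters to a fixed-point grid before exporting them; the number of fractional bits and the placeholder model are assumptions.

```python
import torch

# Assumed fixed-point format: 8 fractional bits (illustrative, not necessarily the project's format)
FRAC_BITS = 8
SCALE = 2 ** FRAC_BITS

def quantize_to_fixed_point(t: torch.Tensor) -> torch.Tensor:
    """Round a float tensor to the nearest value representable with FRAC_BITS fractional bits."""
    return torch.round(t * SCALE) / SCALE

# Example: quantize every parameter of a (placeholder) trained model in place
model = torch.nn.Linear(16, 2)  # stands in for the trained gas-estimation network
with torch.no_grad():
    for param in model.parameters():
        param.copy_(quantize_to_fixed_point(param))
```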

In order to implement the HLS version of the network, you must create a Vivado HLS project using gas-nn.cpp as a source file, defining its gas_nn function as the top-level function. For the test bench, just add gas-nn-tb.cpp as the test bench source. Finally, select the PYNQ-Z2 board file as your FPGA.

There you can run the simulation and synthesis on your own. The final exported design we have used has its directives embedded in the source file as #pragmas.
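If you want to generate your own test vectors for the C simulation, the reference predictions can be dumped from Python as sketched below; note that the repository's test bench may already bring its own data, so the file names and layout here are purely hypothetical.

```python
import numpy as np
import torch

# Hypothetical file names and layout; gas-nn-tb.cpp defines the actual format it expects
model = torch.nn.Linear(16, 2)   # stands in for the trained, quantized network
samples = torch.rand(100, 16)    # 100 frames of the 16 sensor readings

with torch.no_grad():
    reference = model(samples)

# One sample per line, space-separated, easy to parse from a C++ test bench
np.savetxt("tb_inputs.dat", samples.numpy(), fmt="%.6f")
np.savetxt("tb_reference.dat", reference.numpy(), fmt="%.6f")
```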

[Figure: HLS prediction results]

Once you have exported your IP, you have to create a Vivado project and create a block design. In the Vivado settings, under the IP repository locations, you can add the HLS-exported neural network IP. Then, using the block autoconnection feature, you can connect the ZYNQ processing system and the gas_nn block like this:

[Figure: Vivado IP integrator screenshot]

Then you are ready to generate the bitstream.

To use the customized overlay on PYNQ, it is necessary to upload it to the overlays folder. For that you can use the Windows File Explorer, connecting to the PYNQ's file system. In the directory \\pynq\xilinx\pynq\overlays\gas_nn you must add the bitstream, the .tcl and the .hwh files.

[Figure: PYNQ's file system]
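As an optional sanity check (not part of the original instructions), you can verify from the board side that the three files are in place; the Windows share path above corresponds to /home/xilinx/pynq/overlays/gas_nn on the board, assuming the default PYNQ image layout.

```python
import os

# The samba share \\pynq\xilinx maps to /home/xilinx on the default PYNQ image
overlay_dir = "/home/xilinx/pynq/overlays/gas_nn"
print(os.listdir(overlay_dir))  # expect the .bit, .tcl and .hwh files here
```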

Finally, the Usage notebook shows you how to use the driver. The only thing you may need to customize yours is the HLS-generated file with the port information, which is located in your HLS project directory at .../[HLSproject]/[solutionX]/impl/misc/drivers/[ip_block_name]/src/x[ip_block_name]_hw.h

Then you can load the trained parameters and predict the time series.

[Figure: PYNQ's predictions]
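To give an idea of what the Usage notebook does, below is a hedged sketch of loading the overlay and talking to the gas_nn block through PYNQ's MMIO class; the bitstream path, the IP instance name (gas_nn_0) and the register offsets are assumptions, the real offsets being those listed in the x[ip_block_name]_hw.h file mentioned above.

```python
from pynq import MMIO, Overlay

# Load the custom overlay (the bitstream name/path is an assumption)
overlay = Overlay("/home/xilinx/pynq/overlays/gas_nn/gas_nn.bit")

# Base address and range of the gas_nn AXI-Lite interface; "gas_nn_0" is the assumed instance name
base = overlay.ip_dict["gas_nn_0"]["phys_addr"]
length = overlay.ip_dict["gas_nn_0"]["addr_range"]
mmio = MMIO(base, length)

# Hypothetical register offsets: replace them with the values from x[ip_block_name]_hw.h
CTRL_OFFSET = 0x00    # ap_ctrl register (bit 0 = ap_start, bit 1 = ap_done)
INPUT_OFFSET = 0x10   # first input register
OUTPUT_OFFSET = 0x80  # first output register

mmio.write(INPUT_OFFSET, 0)                  # write a fixed-point encoded sensor sample
mmio.write(CTRL_OFFSET, 0x01)                # raise ap_start
while (mmio.read(CTRL_OFFSET) & 0x2) == 0:   # poll ap_done
    pass
print(mmio.read(OUTPUT_OFFSET))              # read back a concentration estimate
```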

