V.A.I. - Visual Aided Interface | HackMIT 2020

Backend - Robotics led by Gokulraj K.S
Frontend - GUI and design led by Matt Thomas / API implementations led by Ipshita Joshi

Contents

Introduction
Additional GUI Server Dependencies
Usage
Footage

Introduction

We aim to assist the daily lives of people who are restricted to wheelchairs or otherwise impaired by paralysis through gesture controls, image processing, and voice commands. Interfacing in this way lets the user move between rooms, make emergency calls, and order food and transportation services. This project is the frontend GUI aid for remotely controlling the simulated robotics: it is written in HTML, CSS, and JS, and wrapped with Python via a Flask server. Built for the HackMIT 2020 Health Tech track.
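As a rough illustration of that structure, app.py can be thought of as a thin Flask wrapper serving the GUI's HTML/CSS/JS. The following is a minimal sketch, assuming the assets live in Flask's default templates/ and static/ folders; the actual file in this repo may differ:

# app.py (hypothetical minimal version; the real file may differ)
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def index():
    # Serve the main GUI page from templates/index.html;
    # its CSS/JS are expected to be referenced from static/.
    return render_template("index.html")

if __name__ == "__main__":
    # Local demo server, matching the `python app.py` command in Usage below.
    app.run(debug=True)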

Additional GUI Server Dependencies

$ pip install -U Flask
$ pip install twilio
$ pip install pizzaapy
$ pip install xmltodict
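The twilio package covers the emergency-call feature mentioned in the introduction. Below is a minimal sketch of how an alert could be sent from the server; the account SID, auth token, phone numbers, and helper name are placeholders, and the actual integration in this repo may look different:

# Hypothetical emergency-alert helper built on the twilio dependency.
from twilio.rest import Client

ACCOUNT_SID = "ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"  # placeholder credentials
AUTH_TOKEN = "your_auth_token"
TWILIO_NUMBER = "+15551111111"      # Twilio-provisioned number (placeholder)
CAREGIVER_NUMBER = "+15550000000"   # number to notify (placeholder)

def send_emergency_alert(body="Emergency: assistance needed."):
    # Send an SMS to the caregiver when the GUI's emergency action is triggered.
    client = Client(ACCOUNT_SID, AUTH_TOKEN)
    return client.messages.create(to=CAREGIVER_NUMBER, from_=TWILIO_NUMBER, body=body)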

Usage

Intended for use with eye-blink tracking software such as OpenCV with dlib, and with robotics simulated via Gazebo, RViz, Tinkercad, and ROS packages. Some functions are purely hypothetical without that configuration, so footage of the backend's simulated robotics is included in their place (a sketch of the blink detection is shown after the run command below). The server can be demoed (without the automated tabbing) with mouse clicks by simply cloning the project and running:

$ python app.py
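The eye-blink tracking referenced above is commonly implemented with dlib's 68-point facial landmarks and the eye aspect ratio (EAR). The following is a sketch under that assumption; the model path, threshold, and function names are illustrative and not part of this repo:

# Sketch of OpenCV + dlib blink detection via the eye aspect ratio (EAR).
import cv2
import dlib
from scipy.spatial import distance

EAR_THRESHOLD = 0.21  # below this the eye is treated as closed (tune per user)

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(eye):
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)
    a = distance.euclidean(eye[1], eye[5])
    b = distance.euclidean(eye[2], eye[4])
    c = distance.euclidean(eye[0], eye[3])
    return (a + b) / (2.0 * c)

def frame_has_closed_eye(frame):
    # Returns True if either eye in the frame is closed (a candidate blink).
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray, 0):
        shape = predictor(gray, face)
        points = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        ears = (eye_aspect_ratio(points[36:42]),   # landmarks 36-41: one eye
                eye_aspect_ratio(points[42:48]))   # landmarks 42-47: other eye
        if min(ears) < EAR_THRESHOLD:
            return True
    return False

Counting consecutive below-threshold frames would distinguish intentional blinks from noise; those blink events are what would drive the automated tabbing that the mouse clicks stand in for in the demo.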

Footage: