We are a group of four University of Waterloo CS students. This project was built for the ConUHacks 2024 hackathon.
- Connects seamlessly to a virtual call window (Zoom, Google Meet, Teams, etc.)
- Specially trained machine learning models analyze each call participant's face to estimate both emotion and attentiveness (a rough sketch of this pipeline follows this list)
- Rolling-average and radar graphs smoothly display each participant's emotion and attentiveness data over time (see the rolling-average sketch below)
- Easily toggle meeting overlays that draw face-detection squares around participants
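The snippet below is a minimal sketch, not the project's actual code, of how per-frame face analysis and the overlay squares could fit together: OpenCV's bundled Haar cascade finds faces, a placeholder `score_face` function stands in for the trained emotion/attentiveness models, and the optional overlay draws a rectangle around each detected face.

```python
# Minimal sketch (assumed implementation, not the project's real pipeline):
# detect faces in a video frame, score each one, and optionally draw
# face-detection squares on the frame.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def score_face(face_img):
    """Placeholder for the trained emotion/attentiveness models."""
    return {"emotion": "neutral", "attentiveness": 0.5}

def analyze_frame(frame, draw_overlay=True):
    """Return per-face scores for one frame; optionally draw overlay squares."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        results.append(score_face(frame[y:y + h, x:x + w]))
        if draw_overlay:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    return results
```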
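The graphs smooth the raw per-frame scores before plotting; one simple way to do this (an assumed approach, not taken from the source) is a fixed-window rolling average kept per participant:

```python
# Minimal sketch (assumed implementation): smooth per-participant scores with a
# fixed-size rolling window so the graphs do not jump from frame to frame.
from collections import defaultdict, deque

class RollingAverage:
    def __init__(self, window=30):
        # One bounded history per participant; old samples fall off automatically.
        self.history = defaultdict(lambda: deque(maxlen=window))

    def update(self, participant_id, score):
        """Record the latest score and return the smoothed value to plot."""
        samples = self.history[participant_id]
        samples.append(score)
        return sum(samples) / len(samples)
```

Each new attentiveness or emotion score feeds `update()`, and the returned mean is what the graph renders, e.g. `RollingAverage().update("alice", 0.8)`.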