-
This script creates a graphical user interface (GUI) that captures video from the default camera, analyzes the emotion of the person in the video frame using DeepFace, and provides a recommendation based on the detected emotion. The user inputs their name and age, which are displayed on the video feed along with the detected emotion and recommendation.
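The per-frame flow described above could be sketched roughly as follows. This is a hypothetical helper, not the script's actual code: only `DeepFace.analyze` (with `actions=["emotion"]`, returning a `dominant_emotion` key) and `cv2.putText` come from the installed libraries; the function names and overlay format are assumptions.

```python
# Sketch of the per-frame pipeline: detect the emotion, then draw the
# user's details and the result onto the frame. Helper names are
# hypothetical.

def format_overlay(name: str, age: int, emotion: str) -> str:
    """Build the label drawn on the video feed (format is an assumption)."""
    return f"{name} ({age}): {emotion}"

def annotate_frame(frame, name: str, age: int):
    # Imported lazily so the pure helper above works without the
    # heavy dependencies installed.
    import cv2
    from deepface import DeepFace

    # enforce_detection=False keeps the loop alive on frames with no face.
    result = DeepFace.analyze(frame, actions=["emotion"], enforce_detection=False)
    emotion = result[0]["dominant_emotion"]
    cv2.putText(frame, format_overlay(name, age, emotion),
                (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame, emotion
```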
-
It uses DeepFace for emotion detection and Gemini AI for generating recommendations. The user details and the detected emotion with the recommendation are stored in a Firestore database.
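The Gemini call and the Firestore write might look roughly like this. This is a sketch under assumptions: the prompt wording, the `users` collection name, the model name, and the record fields are all illustrative, not taken from the script.

```python
from datetime import datetime, timezone

def build_prompt(emotion: str) -> str:
    """Prompt sent to Gemini; the exact wording is an assumption."""
    return f"Suggest one short, supportive activity for someone feeling {emotion}."

def build_record(name: str, age: int, emotion: str, recommendation: str) -> dict:
    """Document stored in Firestore for each detection (fields are assumed)."""
    return {
        "name": name,
        "age": age,
        "emotion": emotion,
        "recommendation": recommendation,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# With the clients initialised (hypothetical setup), the calls would be:
#   model = genai.GenerativeModel("gemini-1.5-flash")
#   recommendation = model.generate_content(build_prompt(emotion)).text
#   db.collection("users").add(build_record(name, age, emotion, recommendation))
```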
-
The script integrates emotion detection with a recommendation system, making it useful for applications in mental health, personalized user experiences, and more.
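As a purely illustrative example of mapping emotions to recommendations, a minimal offline fallback (the real script asks Gemini; these canned suggestions and the mapping itself are assumptions) could look like:

```python
# DeepFace's seven emotion labels mapped to canned suggestions.
# Illustrative fallback for when the Gemini API is unreachable.
FALLBACK = {
    "happy": "Keep doing what you're doing!",
    "sad": "Consider a short walk or a chat with a friend.",
    "angry": "A few slow, deep breaths can help.",
    "fear": "Grounding exercises may ease anxiety.",
    "surprise": "Take a moment to process before reacting.",
    "disgust": "Stepping away from the trigger can help.",
    "neutral": "A good time for a focused task.",
}

def recommend(emotion: str) -> str:
    """Return a canned recommendation, with a safe default."""
    return FALLBACK.get(emotion, "No recommendation available.")
```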
-
Camera Access: Ensure your system has a camera and that it is accessible.
-
Internet Connection: Ensure your system is connected to the internet as the script will interact with Firebase and the Gemini model.
-
Open the Flutter App Version folder in an IDE (for example, VS Code).
-
Using the IDE terminal, create a virtual environment to manage dependencies:
python -m venv venv
- In cmd.exe:
venv\Scripts\activate.bat
- In PowerShell:
venv\Scripts\Activate.ps1
- In Linux and macOS:
source venv/bin/activate
- Install the necessary libraries:
pip install firebase-admin
pip install google-cloud-firestore
pip install opencv-python
pip install deepface
pip install python-dotenv
pip install flet
pip install google-generativeai
-
Run the script using the following command:
python main.py
-
Closing the video feed: Press the q key on your keyboard while the video window is focused.
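The quit handling relies on OpenCV's `waitKey`, roughly like this (a sketch, not the script's exact loop; the window title is an assumption):

```python
QUIT_KEY = ord("q")  # pressing q closes the video feed

def run_feed():
    # Lazy import so the constant above is usable without OpenCV installed.
    import cv2

    cap = cv2.VideoCapture(0)  # default camera
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            cv2.imshow("Emotion Feed", frame)
            # waitKey returns the pressed key's code; masking to 8 bits
            # keeps the comparison portable across platforms.
            if cv2.waitKey(1) & 0xFF == QUIT_KEY:
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```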