Welcome to the Emotion Based Music Player project, an application that uses a Convolutional Neural Network (CNN) for emotion detection. The CNN, trained on the fer2013 dataset, recognizes the user's facial expression, and the player then dynamically plays music that matches the detected emotion.
- fer2013 Dataset
  - Description: The fer2013 dataset is used to train and test the emotion detection model. It contains 48x48 grayscale facial images, each labeled with one of seven emotions (a minimal model sketch follows this list).
  - Download Link: fer2013 Dataset
- Music Dataset
  - Description: The curated music dataset complements the emotion detection model with a diverse set of audio files, pre-classified by emotion.
  - Download Link: Music Dataset
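
The exact network layers are not documented in this section, so the following is only a minimal sketch of a CNN suited to fer2013-style input (48x48 grayscale images, 7 emotion classes). The layer sizes, counts, and hyperparameters are illustrative assumptions, not necessarily the architecture that produced the accuracy figures below.

```python
# Minimal sketch of a CNN for fer2013-style input (48x48 grayscale, 7 emotion classes).
# Layer sizes and counts are illustrative assumptions, not this repo's exact architecture.
from tensorflow.keras import layers, models

def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=7):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(128, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Calling `build_emotion_cnn().summary()` prints the assumed layer stack.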
The trained CNN reports the following accuracy metrics (a minimal evaluation sketch follows the list):
- Testing Accuracy: 81.71%
- Training Accuracy: 85.37%
- Validation Accuracy: 81.71%
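
Figures like these are typically obtained by evaluating the saved model on a held-out fer2013 split. The sketch below assumes the standard `fer2013.csv` layout (emotion, pixels, Usage columns, with the PrivateTest rows as the test split) and a hypothetical weights file named `emotion_model.h5`; the repo's actual file names and split convention may differ.

```python
# Sketch of evaluating a trained model on the fer2013 test split.
import numpy as np
import pandas as pd
from tensorflow.keras.models import load_model
from tensorflow.keras.utils import to_categorical

df = pd.read_csv("fer2013.csv")
test = df[df["Usage"] == "PrivateTest"]  # assumed test split

# Each "pixels" entry is a space-separated string of 48*48 grayscale values.
x_test = np.stack([
    np.asarray(p.split(), dtype="float32").reshape(48, 48, 1) / 255.0
    for p in test["pixels"]
])
y_test = to_categorical(test["emotion"], num_classes=7)

model = load_model("emotion_model.h5")  # hypothetical path to the trained weights
loss, accuracy = model.evaluate(x_test, y_test, verbose=0)
print(f"Testing accuracy: {accuracy:.2%}")
```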
To experience the capabilities of the trained model, follow these steps:
- Ensure you have the necessary dependencies installed.
- Download the fer2013 dataset and the music dataset.
- Run the application with:

```bash
python app.py
```
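
The contents of `app.py` are not reproduced here; the following is a rough, single-frame illustration of the detect-then-play flow described above. The weights file name, the `music/<emotion>/` folder layout, and the use of OpenCV and pygame are assumptions about the setup, not confirmed details of this repository.

```python
# Rough single-frame illustration of the detect-then-play flow (not the actual app.py).
import os
import random

import cv2
import numpy as np
import pygame
from tensorflow.keras.models import load_model

# fer2013 label order: 0..6
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

model = load_model("emotion_model.h5")  # hypothetical weights file
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Grab one frame from the default webcam.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces):
        x, y, w, h = faces[0]
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        emotion = EMOTIONS[int(np.argmax(probs))]

        # Assumed layout: music/<emotion>/*.mp3, one folder per emotion label.
        folder = os.path.join("music", emotion)
        song = random.choice(os.listdir(folder))
        pygame.mixer.init()
        pygame.mixer.music.load(os.path.join(folder, song))
        pygame.mixer.music.play()
```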