Neuro-GPT: Towards a Foundation Model for EEG (paper)
We propose Neuro-GPT, a foundation model consisting of an EEG encoder and a GPT model. The foundation model is pre-trained on a large-scale dataset with a self-supervised task that learns to reconstruct masked EEG segments. We then fine-tune the model on a Motor Imagery Classification task to validate its performance in a low-data regime (9 subjects). Our experiments demonstrate that applying a foundation model significantly improves classification performance compared to a model trained from scratch.
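For intuition, below is a minimal, self-contained sketch of the masked-segment reconstruction objective described above. It is not the released implementation: the segment encoder, layer sizes, masking strategy, and all names (`NeuroGPTSketch`, `recon_head`, etc.) are illustrative assumptions; see the repository code for the actual model.

```python
# Hedged sketch (assumptions, not the authors' code): EEG is split into segments,
# some segments are masked, an encoder embeds each segment, a GPT-style causal
# transformer processes the sequence, and the loss is the reconstruction error
# on the masked segments only.
import torch
import torch.nn as nn

class NeuroGPTSketch(nn.Module):
    def __init__(self, n_channels=22, seg_len=250, d_model=256, n_layers=4, n_heads=8):
        super().__init__()
        # Segment encoder: flattens one EEG segment (channels x time) into an embedding.
        self.encoder = nn.Sequential(
            nn.Flatten(start_dim=2),                      # (B, S, C*T)
            nn.Linear(n_channels * seg_len, d_model),
            nn.GELU(),
        )
        # Learned token that replaces the embeddings of masked segments.
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        # GPT-style causal transformer over the segment sequence.
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.gpt = nn.TransformerEncoder(layer, n_layers)
        # Head that reconstructs the raw segment from the transformer output.
        self.recon_head = nn.Linear(d_model, n_channels * seg_len)

    def forward(self, segments, mask):
        # segments: (B, S, C, T) EEG split into S segments; mask: (B, S) bool, True = masked.
        B, S, C, T = segments.shape
        emb = self.encoder(segments)                                   # (B, S, d_model)
        emb = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(emb), emb)
        causal = nn.Transformer.generate_square_subsequent_mask(S).to(segments.device)
        hidden = self.gpt(emb, mask=causal)                            # (B, S, d_model)
        recon = self.recon_head(hidden).view(B, S, C, T)               # reconstructed segments
        # Self-supervised objective: MSE computed only on the masked segments.
        loss = ((recon - segments) ** 2).mean(dim=(2, 3))[mask].mean()
        return loss, recon

# Toy usage: 2 recordings, 8 segments of 22 channels x 250 samples, last segment masked.
if __name__ == "__main__":
    x = torch.randn(2, 8, 22, 250)
    m = torch.zeros(2, 8, dtype=torch.bool)
    m[:, -1] = True
    loss, _ = NeuroGPTSketch()(x, m)
    loss.backward()
    print(loss.item())
```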
The pre-trained foundation model is available here.
git clone git@github.com:wenhui0206/NeuroGPT.git
cd NeuroGPT
pip install -r requirements.txt
cd scripts
./train.sh
This project builds on the following open-source repositories: