
Repository for "Inter and Intra Signal Variance in Feature Extraction and Classification of Affective State" AICS 2022


ZacDair/Emo_Phys_Eval


Inter and Intra Signal Variance in Feature Extraction and Classification of Affective State

Z. Dair, S. Dockray, R. O'Reilly
AICS 2022, part of the Communications in Computer and Information Science book series (CCIS, volume 1662).
Available Here

⚠️ POST AICS 2022 - Update In Progress ⚠️

The codebase will be updated shortly following the AICS 2022 conference to align with the code used in the publication.

Variance and Performance of ECG and PPG Signals in Classifying Affective State

Second version of the physiological signal evaluation platform. Now includes automated experiment running from YAML configurations.

Provided Functions

  • Automated/Compartmentalised experiment running via YAML files
  • ECG and PPG Feature Extraction - leveraging HeartPy
  • ECG and PPG Windowing and labelling in accordance with dataset author instructions
  • Cardiac feature difference analysis between ECG and PPG of the same dataset
  • Cardiac feature importance from emotive ECG and PPG, using SHAP and a selected Machine Learning classifier
  • Model selection including cross-validation (several base Sklearn models supported)

Instructions

  1. Download the supported datasets, unzip and place in the Data/Datasets directory
  2. Ensure a working Python installation is available
  3. Load the datasets, window the signals, extract features, and label them accordingly using CASE_Features.py and WESAD_Features.py respectively
  4. ML operations, including model selection, feature importance, and plotting ROC curves, are handled in ML.py
  5. Feature difference analysis is provided by Signal_Wide_Stats.py
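Step 3 windows each signal and labels every window per the dataset authors' instructions. As an illustrative sketch (not the repository's actual implementation), fixed-length windowing with majority-vote labelling could look like:

```python
# Hypothetical sketch of fixed-length windowing with majority-vote
# labelling; the function name and signature are illustrative.
def window_signal(signal, labels, sample_rate, window_seconds):
    """Split a signal into non-overlapping windows and label each
    window with its most frequent sample label."""
    size = int(sample_rate * window_seconds)
    windows, window_labels = [], []
    for start in range(0, len(signal) - size + 1, size):
        windows.append(signal[start:start + size])
        segment_labels = labels[start:start + size]
        # Majority vote over the labels inside this window
        window_labels.append(max(set(segment_labels), key=segment_labels.count))
    return windows, window_labels
```

Overlapping windows would only require a smaller step in the `range` call; the repository's scripts may differ in these details.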

Supported Datasets

CASE - The Continuously Annotated Signals of Emotion (CASE) dataset
WESAD - WESAD (Wearable Stress and Affect Detection) Data Set

Note: Datasets following the same structure as the above may also be supported.

The expected structure is a root directory containing subdirectories per participant, which in turn include either individual files for ECG, PPG and Labels (as in CASE) or a combined file (as in WESAD). Reading from .pkl and .csv files is supported.
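A minimal loader for the layout described above could be sketched as follows. This is an assumption-laden illustration, not the repository's loader: the function name and the per-file handling are hypothetical.

```python
import csv
import pickle
from pathlib import Path

# Hypothetical loader for a root directory with one subdirectory per
# participant, each holding .pkl and/or .csv files (illustrative only).
def load_participants(root):
    data = {}
    for participant in sorted(Path(root).iterdir()):
        if not participant.is_dir():
            continue
        files = {}
        for f in participant.iterdir():
            if f.suffix == ".pkl":
                with open(f, "rb") as fh:
                    files[f.stem] = pickle.load(fh)
            elif f.suffix == ".csv":
                with open(f, newline="") as fh:
                    files[f.stem] = list(csv.reader(fh))
        data[participant.name] = files
    return data
```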

Documentation

For generated documentation see here
Each individual module is documented as follows:

Runtime configurations are located in Configs.config.py - these include the HeartPy feature names, a dictionary of Sklearn models, and the YAML experiment directory.

Main.py acts as the entry point, which runs all experiments located in the YAML Experiment directory one by one.
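The entry-point loop described above might be sketched as below. This is a hypothetical rendering, not Main.py itself: the function name, directory handling, and `runner` callback are illustrative, and PyYAML is an assumed dependency.

```python
import glob
import yaml  # PyYAML - assumed dependency for the experiment configs

# Hypothetical sketch of the Main.py loop: load each experiment YAML
# from the experiment directory and hand it to a runner, one by one.
def run_all_experiments(experiment_dir, runner):
    results = []
    for path in sorted(glob.glob(f"{experiment_dir}/*.yaml")):
        with open(path) as fh:
            config = yaml.safe_load(fh)
        results.append(runner(config))
    return results
```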

Future Development

V0.2 Feature List

Experiments:

  • Isolate ROC functionality to its own experiment

General:

  • Makefile for current dependencies
  • YAML template (requirements and datatypes) + validator
  • Config to point to YAML locations, dataset locations, etc.
  • Unit testing

Automation:

  • Encapsulate data loading functions
  • Encapsulate signal processing, windowing, feature extraction
  • Encapsulate individual experiment procedures
  • YAML file design to automate ISSC work
  • YAML main loop

V0.3 Feature List

Experiments:

  • Continuous arousal/valence value classification
  • Semi-Supervised Approaches
  • Intra/Inter personal variances of emotion - re-align and compare windows of emotion
  • Expand feature importance to include Sklearn methods

Signal Processing:

  • Add a signal processing stage with basic HeartPy filtering
  • Add further methods based on literature
  • Anomaly detection to identify electrode disconnection, or noise

Feature Extraction:

  • Add support for NeuroKit features

General:

  • Implement starting from windowed data
  • Summarise YAML, giving an experiment description
  • Physiological signal metadata identification - take the sample rate, identify the signal length, and compare it to other signals, the expected length, the label length, etc.
  • Signal-wide processing (might fit with anomaly detection) - take the whole signal and identify at which points emotion is present

UI:

  • CLI or dashboard UI creation
  • Results logging through Google Sheets or an alternative + YAML description

API:

  • Send signal window - retrieve emotion label (requires phys data window, phys signal name, sample rate, signal processing mode, feature extraction mode)
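The request described above could carry a payload like the following sketch. The API is listed as future work, so the field names here are purely hypothetical and only mirror the parenthesised requirements.

```python
# Hypothetical request payload for the proposed labelling endpoint.
# Field names are illustrative; the API is not yet implemented.
request = {
    "signal_window": [0.01, 0.02, 0.015],  # raw physiological samples
    "signal_name": "ECG",                  # or "PPG"
    "sample_rate": 700,                    # Hz
    "processing_mode": "heartpy_filter",   # signal processing mode
    "feature_mode": "heartpy",             # feature extraction mode
}
```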