This repository presents a classification algorithm for Power Quality (PQ) events using Artificial Neural Networks (ANNs). It utilizes four spectrum analysis techniques: (i) Short-Time Fourier Transform (STFT), (ii) Continuous Wavelet Transform (CWT), (iii) Discrete Stockwell Transform (DST), and (iv) Hilbert-Huang Transform (HHT). These spectra are used as input to a Convolutional Neural Network (CNN) for event classification.
PQ events were simulated using a numerical model (refer to [1]). The simulated events include:
- Sag: Voltage dips between 0.1 and 0.9 pu
- Swell: Voltage rises between 1.1 and 1.8 pu
- Interruption: Voltage drops below 0.1 pu
- Transient: Sudden voltage deviations
- Notch: Rapid drops and recoveries in voltage, often due to switching
- Harmonics: Multiple frequency components causing signal distortions
- Flicker: Long-term voltage variations causing visible light fluctuations
Equations for these events follow the standard parametric models for synthetic PQ disturbances; they are implemented in `src/data_gen.py` and summarized in the table below, where $u(t)$ is the unit step, $\omega = 2\pi f$ the fundamental angular frequency, and $t_1$, $t_2$ the start and end times of the disturbance.
Name | Equation | Parameter range |
---|---|---|
Normal | $v(t) = A\sin(\omega t)$ | $A = 1$ pu |
Sag | $v(t) = A\left[1 - \alpha\left(u(t - t_1) - u(t - t_2)\right)\right]\sin(\omega t)$ | $0.1 \le \alpha \le 0.9$ |
Swell | $v(t) = A\left[1 + \beta\left(u(t - t_1) - u(t - t_2)\right)\right]\sin(\omega t)$ | $0.1 \le \beta \le 0.8$ |
Interruption | $v(t) = A\left[1 - \rho\left(u(t - t_1) - u(t - t_2)\right)\right]\sin(\omega t)$ | $0.9 \le \rho \le 1$ |
Transient | $v(t) = A\left[\sin(\omega t) + \gamma\, e^{-(t - t_1)/\tau}\sin(\omega_n (t - t_1))\left(u(t - t_1) - u(t - t_2)\right)\right]$ | $0.1 \le \gamma \le 0.8$ |
Notch | $v(t) = A\left[\sin(\omega t) - \operatorname{sign}(\sin(\omega t))\, K\left(u(t - t_1) - u(t - t_2)\right)\right]$ | $0.1 \le K \le 0.4$ |
Harmonics | $v(t) = A\left[\alpha_1\sin(\omega t) + \alpha_3\sin(3\omega t) + \alpha_5\sin(5\omega t) + \alpha_7\sin(7\omega t)\right]$ | $\textstyle\sum_i \alpha_i^2 = 1$ |
Flicker | $v(t) = A\left[1 + \lambda\sin(\beta_f\, \omega t)\right]\sin(\omega t)$ | $0.1 \le \lambda \le 0.2$ |
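For example, a sag waveform can be synthesized directly from the table's model. This is a minimal sketch; the `sag` helper and its parameter values are illustrative, and the actual generation code lives in `src/data_gen.py`:

```python
import numpy as np

def sag(t, alpha=0.5, t1=0.04, t2=0.12, A=1.0, f=50.0):
    """Voltage sag: amplitude dips by alpha between t1 and t2 (table model)."""
    u = lambda x: (x >= 0).astype(float)            # unit step u(t)
    return A * (1 - alpha * (u(t - t1) - u(t - t2))) * np.sin(2 * np.pi * f * t)

t = np.linspace(0, 0.2, 3200, endpoint=False)       # 10 cycles at 50 Hz, 16 kHz sampling
v = sag(t)                                          # 0.5 pu dip between 40 ms and 120 ms
```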
STFT provides time-frequency analysis by taking Fourier transforms over sliding windows. It is computed using librosa in `src/signal_processing/compute_stft`:
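A minimal sketch of a librosa-based implementation (the window parameters here are illustrative, not necessarily those used by `compute_stft`):

```python
import numpy as np
import librosa

def compute_stft(signal, n_fft=256, hop_length=32):
    """Magnitude spectrogram of a 1-D signal via sliding-window FFT."""
    # librosa.stft returns a complex matrix of shape (1 + n_fft/2, n_frames)
    spectrum = librosa.stft(signal.astype(np.float32), n_fft=n_fft, hop_length=hop_length)
    return np.abs(spectrum)
```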
CWT analyzes how frequency content evolves by correlating the signal with scaled wavelets, trading time and frequency resolution more flexibly than the fixed-window STFT. It is computed in `src/signal_processing/compute_cwt`:
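A minimal sketch, assuming PyWavelets with a Morlet mother wavelet (both the library and the wavelet choice are assumptions, not confirmed by the repository):

```python
import numpy as np
import pywt

def compute_cwt(signal, fs=16000.0, n_scales=64):
    """Scalogram of a 1-D signal using a Morlet mother wavelet."""
    scales = np.arange(1, n_scales + 1)
    # pywt.cwt returns (coefficients, frequencies); coefficients has
    # shape (n_scales, len(signal))
    coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)
    return np.abs(coeffs)
```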
DST combines the advantages of STFT and CWT: a frequency-dependent window width like the CWT, with an absolutely referenced phase like the STFT. It is computed in `src/signal_processing/compute_st`:
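The mainstream scientific Python stack does not ship a Stockwell transform, so here is a minimal FFT-based sketch of the discrete ST following Stockwell's frequency-domain formulation (`compute_st` in the repository may be implemented differently):

```python
import numpy as np

def compute_st(signal):
    """Discrete Stockwell Transform via the FFT-based formulation.

    Returns a complex matrix of shape (N//2, N): rows are frequencies
    1..N/2 in cycles per record, columns are time samples.
    """
    N = len(signal)
    X = np.fft.fft(signal)
    m = np.fft.fftfreq(N) * N                      # symmetric frequency indices
    S = np.zeros((N // 2, N), dtype=complex)
    for n in range(1, N // 2 + 1):
        gaussian = np.exp(-2.0 * np.pi**2 * m**2 / n**2)   # frequency-domain Gaussian window
        S[n - 1] = np.fft.ifft(np.roll(X, -n) * gaussian)  # shift spectrum by n, window, invert
    return S
```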
HHT analyzes nonlinear and non-stationary signals in two stages: Empirical Mode Decomposition (EMD) splits the signal into Intrinsic Mode Functions (IMFs), and the Hilbert Transform then extracts the instantaneous amplitude and frequency of each IMF. It is computed in `src/signal_processing/compute_hht`:
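A minimal sketch, assuming the PyEMD package (`EMD-signal` on PyPI) for the decomposition and SciPy for the Hilbert transform; both library choices are assumptions:

```python
import numpy as np
from PyEMD import EMD                  # pip install EMD-signal
from scipy.signal import hilbert

def compute_hht(signal, fs=16000.0):
    """Hilbert spectrum: instantaneous amplitude/frequency of each IMF."""
    imfs = EMD()(signal)               # decompose into IMFs, shape (n_imfs, n_samples)
    analytic = hilbert(imfs)           # analytic signal of each IMF (row-wise)
    amplitude = np.abs(analytic)       # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic), axis=1)
    # Instantaneous frequency in Hz (one sample shorter due to the difference)
    inst_freq = np.diff(phase, axis=1) * fs / (2.0 * np.pi)
    return amplitude, inst_freq
```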
The CNN architecture used for classification consists of the following layers:
Layer Type | Parameters | Activation |
---|---|---|
Input Layer | Input_Size | - |
Convolution Layer | filters=20, kernel_size=3 | ReLU |
Max Pooling Layer | pool_size=2 | - |
Convolution Layer | filters=10, kernel_size=3 | ReLU |
Max Pooling Layer | pool_size=2 | - |
Flatten Layer | - | - |
Dense Layer | units=64 | ReLU |
Dense Layer | units=num_classes (8) | Softmax |
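A minimal sketch of this architecture, assuming TensorFlow/Keras and 2-D spectrum inputs (the table specifies neither the framework nor the input dimensionality, so both are assumptions):

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_cnn(input_shape, num_classes=8):
    """CNN matching the layer table above: two conv/pool stages, then a dense head."""
    return keras.Sequential([
        keras.Input(shape=input_shape),                   # e.g. (freq_bins, time_frames, 1)
        layers.Conv2D(20, kernel_size=3, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Conv2D(10, kernel_size=3, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),  # one unit per PQ class
    ])

model = build_cnn((128, 128, 1))
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```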