Code for "Steerable Pyramid Transform Enables Robust Left Ventricle Quantification"

This is an end-to-end framework for accurate and robust quantification of left ventricle (LV) indices, including the cavity and myocardium areas, six regional wall thicknesses, and three directional dimensions.

The proposed method first decomposes a CMR image into directional frequency subbands via the Steerable Pyramid Transform. A CNN model then extracts a deep representation from each direction separately, and an LSTM module models the temporal dynamics across the cardiac cycle. Finally, we exploit the multidirectional relationships among features, indices, and directional subbands to optimize the quantification system, as sketched below.
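
For concreteness, here is a minimal sketch of that pipeline, not the authors' implementation: the steerable pyramid step is assumed to have already produced the directional subbands, each subband goes through its own small CNN encoder, an LSTM aggregates the cardiac sequence, and a linear head regresses the 11 indices. All module names and sizes (`DirectionalBranch`, `RobustLVSketch`, `num_dirs=4`, `feat_dim=64`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DirectionalBranch(nn.Module):
    """CNN encoder applied to one directional subband (illustrative)."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )

    def forward(self, x):  # x: (B*T, 1, H, W)
        return self.net(x)

class RobustLVSketch(nn.Module):
    """Per-direction CNNs -> LSTM over the cardiac cycle -> 11 LV indices."""
    def __init__(self, num_dirs=4, feat_dim=64, num_indices=11):
        super().__init__()
        self.branches = nn.ModuleList(
            [DirectionalBranch(feat_dim) for _ in range(num_dirs)])
        self.lstm = nn.LSTM(num_dirs * feat_dim, 128, batch_first=True)
        self.head = nn.Linear(128, num_indices)

    def forward(self, subbands):  # subbands: (B, T, num_dirs, H, W)
        B, T, D, H, W = subbands.shape
        # Encode each directional subband with its own CNN branch.
        feats = [branch(subbands[:, :, d].reshape(B * T, 1, H, W))
                 for d, branch in enumerate(self.branches)]
        seq = torch.cat(feats, dim=-1).view(B, T, -1)  # fuse directions
        out, _ = self.lstm(seq)                        # temporal dynamics
        return self.head(out)                          # (B, T, num_indices)

# Example: a batch of 2 cine sequences, 20 frames, 4 directional subbands.
model = RobustLVSketch()
print(model(torch.randn(2, 20, 4, 80, 80)).shape)  # torch.Size([2, 20, 11])
```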

Requirements

Create a conda environment and install dependencies:

cd RobustLV

conda create -n RobustLV python=3.7
conda activate RobustLV

# Install matching versions of PyTorch and torchvision
conda install pytorch torchvision cudatoolkit
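
After installation, a quick optional check that the environment sees PyTorch and the GPU:

```python
# Print the installed PyTorch version and whether CUDA is available.
import torch
print(torch.__version__, torch.cuda.is_available())
```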

If you want to test the Mamba module, please refer to VMamba to set up the environment.

Datasets

The dataset we used comes from the MICCAI 2018/2019 Left Ventricle Full Quantification Challenge, available as an open dataset on Kaggle. Place it under the './data/' directory.
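
The challenge data is distributed as MATLAB .mat files. As a hedged sanity-check sketch (the filenames and keys printed are whatever your download contains; nothing below is specific to this repo), you can inspect what landed under './data/':

```python
# List each .mat file under ./data/ and the variables it contains.
import os
from scipy.io import loadmat

data_dir = './data/'
for fname in sorted(os.listdir(data_dir)):
    if fname.endswith('.mat'):
        mat = loadmat(os.path.join(data_dir, fname))
        print(fname, [k for k in mat if not k.startswith('__')])
```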

Training

Train the model with the following command:

CUDA_VISIBLE_DEVICES=0 python train.py

To run experiments, modify the corresponding hyperparameters in the 'config.py' file.
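
As a hypothetical illustration only (the actual attribute names in 'config.py' may differ), these are the kinds of knobs one would typically adjust before launching train.py:

```python
# Hypothetical config shape -- check config.py for the real names.
class Config:
    data_root = './data/'  # where the challenge dataset was placed
    num_dirs = 4           # number of steerable pyramid orientations
    batch_size = 16
    lr = 1e-4
    epochs = 100
    gpu = 0                # matches CUDA_VISIBLE_DEVICES above
```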

Acknowledgement

We thank VMamba and MTLearn for sharing their source code.
