Small fixes and more unit tests (#3)
Fix small bugs, change URLs of the pretrained networks, add more unit tests, and support the newest pydicom.
eeulig authored Oct 10, 2024
1 parent 36821f6 commit ddea399
Showing 16 changed files with 489 additions and 162 deletions.
3 changes: 2 additions & 1 deletion .gitignore
@@ -8,4 +8,5 @@ wandb/
 site/
 *.DS_Store
 __pycache__/
-*.psd
+*.psd
+.coverage
32 changes: 24 additions & 8 deletions README.md
@@ -9,6 +9,7 @@
 [![PyPI - Version](https://img.shields.io/pypi/v/ldct-benchmark?color=blue&cacheSeconds=!%5BPyPI%20-%20Version%5D(https%3A%2F%2Fimg.shields.io%2Fpypi%2Fv%2Fldct-benchmark))](https://pypi.org/project/ldct-benchmark/)
 ![License](https://img.shields.io/badge/MIT-blue?label=License)
 [![arXiv](https://img.shields.io/badge/2401.04661-red?label=arXiv)](https://arxiv.org/abs/2401.04661)
+[![DOI](https://img.shields.io/badge/10.1002%2Fmp.17379-red?label=MedPhys)](https://doi.org/10.1002/mp.17379)
 
 [GitHub](https://github.com/eeulig/ldct-benchmark) | [Documentation](https://eeulig.github.io/ldct-benchmark/)
 
@@ -49,15 +50,30 @@ We welcome contributions of novel denoising algorithms. For details on how to do
 
 ## Reference
 If you find this project useful for your work, please cite our [arXiv preprint](https://arxiv.org/abs/2401.04661):
-> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low Dose CT Image Denoising Algorithms. arXiv, 2401.04661.
+> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms. arXiv, 2401.04661.
 ```bibtex
-@article{eulig2024ldctbench,
-  title={Benchmarking Deep Learning-Based Low Dose CT Image Denoising Algorithms},
-  author={Elias Eulig and Björn Ommer and Marc Kachelrieß},
-  year={2024},
-  eprint={2401.04661},
-  archivePrefix={arXiv},
-  primaryClass={physics.med-ph}
+@article{ldctbench-arxiv,
+  title = {Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms},
+  author = {Elias Eulig and Björn Ommer and Marc Kachelrieß},
+  year = {2024},
+  eprint = {2401.04661},
+  archivePrefix = {arXiv},
+  primaryClass = {physics.med-ph}
 }
 ```
+
+or [Medical Physics paper](https://doi.org/10.1002/mp.17379):
+> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms. Medical Physics.
+```bibtex
+@article{ldctbench-medphys,
+  title = {Benchmarking deep learning-based low-dose CT image denoising algorithms},
+  author = {Eulig, Elias and Ommer, Björn and Kachelrieß, Marc},
+  journal = {Medical Physics},
+  year = {2024},
+  doi = {https://doi.org/10.1002/mp.17379},
+  url = {https://aapm.onlinelibrary.wiley.com/doi/abs/10.1002/mp.17379},
+  eprint = {https://aapm.onlinelibrary.wiley.com/doi/pdf/10.1002/mp.17379},
+}
+```
33 changes: 24 additions & 9 deletions README_PYPI.md
@@ -36,15 +36,30 @@ We welcome contributions of novel denoising algorithms. For details on how to do
 
 ## Reference
 If you find this project useful for your work, please cite our [arXiv preprint](https://arxiv.org/abs/2401.04661):
-> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low Dose CT Image Denoising Algorithms. arXiv, 2401.04661.
+> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms. arXiv, 2401.04661.
 ```bibtex
-@article{eulig2024ldctbench,
-  title={Benchmarking Deep Learning-Based Low Dose CT Image Denoising Algorithms},
-  author={Elias Eulig and Björn Ommer and Marc Kachelrieß},
-  year={2024},
-  eprint={2401.04661},
-  archivePrefix={arXiv},
-  primaryClass={physics.med-ph}
+@article{ldctbench-arxiv,
+  title = {Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms},
+  author = {Elias Eulig and Björn Ommer and Marc Kachelrieß},
+  year = {2024},
+  eprint = {2401.04661},
+  archivePrefix = {arXiv},
+  primaryClass = {physics.med-ph}
 }
 ```
-```
+
+or [Medical Physics paper](https://doi.org/10.1002/mp.17379):
+> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms. Medical Physics.
+```bibtex
+@article{ldctbench-medphys,
+  title = {Benchmarking deep learning-based low-dose CT image denoising algorithms},
+  author = {Eulig, Elias and Ommer, Björn and Kachelrieß, Marc},
+  journal = {Medical Physics},
+  year = {2024},
+  doi = {https://doi.org/10.1002/mp.17379},
+  url = {https://aapm.onlinelibrary.wiley.com/doi/abs/10.1002/mp.17379},
+  eprint = {https://aapm.onlinelibrary.wiley.com/doi/pdf/10.1002/mp.17379},
+}
+```
33 changes: 24 additions & 9 deletions docs/index.md
@@ -38,16 +38,31 @@ Therefore, the **aim** of this project is to
 We welcome contributions of novel denoising algorithms. For details on how to do so, please check out our [contributing guide](https://github.com/eeulig/ldct-benchmark/blob/main/CONTRIBUTING.md){:target="_blank"} or reach out to [me](mailto:elias.eulig@dkfz.de).
 
 ## Reference
-If you find this project useful for your work, please cite our [arXiv preprint](https://arxiv.org/abs/2401.04661){:target="_blank"}:
-> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low Dose CT Image Denoising Algorithms. arXiv, 2401.04661.
+If you find this project useful for your work, please cite our [arXiv preprint](https://arxiv.org/abs/2401.04661):
+> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms. arXiv, 2401.04661.
 ```bibtex
-@article{eulig2024ldctbench,
-  title={Benchmarking Deep Learning-Based Low Dose CT Image Denoising Algorithms},
-  author={Elias Eulig and Björn Ommer and Marc Kachelrieß},
-  year={2024},
-  eprint={2401.04661},
-  archivePrefix={arXiv},
-  primaryClass={physics.med-ph}
+@article{ldctbench-arxiv,
+  title = {Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms},
+  author = {Elias Eulig and Björn Ommer and Marc Kachelrieß},
+  year = {2024},
+  eprint = {2401.04661},
+  archivePrefix = {arXiv},
+  primaryClass = {physics.med-ph}
 }
 ```
+
+or [Medical Physics paper](https://doi.org/10.1002/mp.17379):
+> Elias Eulig, Björn Ommer, & Marc Kachelrieß (2024). Benchmarking Deep Learning-Based Low-Dose CT Image Denoising Algorithms. Medical Physics.
+```bibtex
+@article{ldctbench-medphys,
+  title = {Benchmarking deep learning-based low-dose CT image denoising algorithms},
+  author = {Eulig, Elias and Ommer, Björn and Kachelrieß, Marc},
+  journal = {Medical Physics},
+  year = {2024},
+  doi = {https://doi.org/10.1002/mp.17379},
+  url = {https://aapm.onlinelibrary.wiley.com/doi/abs/10.1002/mp.17379},
+  eprint = {https://aapm.onlinelibrary.wiley.com/doi/pdf/10.1002/mp.17379},
+}
+```
8 changes: 4 additions & 4 deletions ldctbench/data/LDCTMayo.py
@@ -213,10 +213,10 @@ def __getitem__(self, idx: int) -> Dict[str, torch.Tensor]:
         # Load image
         sample = self.samples[idx]
         f_name = self._idx2filename(sample["slice"], sample["n_slices"])
-        x = pydicom.read_file(
+        x = pydicom.filereader.dcmread(
             os.path.join(self.path, sample["input"][2:], f_name)
         ).pixel_array.astype("float32")
-        y = pydicom.read_file(
+        y = pydicom.filereader.dcmread(
             os.path.join(self.path, sample["target"][2:], f_name)
         ).pixel_array.astype("float32")
 
@@ -242,10 +242,10 @@ def __init__(self, datafolder, data_norm):
         )
         for s in range(patient_dict["n_slices"]):
             f_name = self._idx2filename(s + 1, patient_dict["n_slices"])
-            x = pydicom.read_file(
+            x = pydicom.filereader.dcmread(
                 os.path.join(datafolder, patient_dict["input"][2:], f_name)
             ).pixel_array.astype("float32")
-            y = pydicom.read_file(
+            y = pydicom.filereader.dcmread(
                 os.path.join(datafolder, patient_dict["target"][2:], f_name)
             ).pixel_array.astype("float32")
             self.samples[-1]["x"].append(x)
4 changes: 3 additions & 1 deletion ldctbench/data/prepare_dataset.py
@@ -40,7 +40,9 @@
 
 
 def load_dicom(path):
-    slices = [pydicom.read_file(os.path.join(path, s)) for s in os.listdir(path)]
+    slices = [
+        pydicom.filereader.dcmread(os.path.join(path, s)) for s in os.listdir(path)
+    ]
     slices.sort(key=lambda x: float(x.ImagePositionPatient[2]))
     image = np.stack([s.pixel_array for s in slices])
     return image
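The patched `load_dicom` stacks slices in spatial order by sorting on the z-component of each dataset's `ImagePositionPatient` tag. The ordering logic can be sketched with stand-in objects (real code would use `pydicom.filereader.dcmread`; the `SimpleNamespace` stubs here are only so the sketch runs without DICOM files):

```python
from types import SimpleNamespace

# Stand-ins for pydicom datasets: only the attribute used by the
# sort key, ImagePositionPatient, is modeled here.
slices = [
    SimpleNamespace(ImagePositionPatient=[0.0, 0.0, z])
    for z in [12.5, -3.0, 4.75]
]

# Same ordering logic as load_dicom: sort by the z-component
# (index 2) of ImagePositionPatient before stacking.
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))

print([s.ImagePositionPatient[2] for s in slices])  # -> [-3.0, 4.75, 12.5]
```

Sorting by geometry rather than by filename is what guarantees a consistent slice order regardless of how the files are named on disk.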
16 changes: 8 additions & 8 deletions ldctbench/hub/checkpoints.json
@@ -1,6 +1,6 @@
 {
     "resnet": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=p3CIkV4O",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/resnet.pt",
         "checksum": "4fe7137fe5abe7c381d746914528eaf23f061204580d8f027faf4d5eaaca6714",
         "model_name": "Model",
         "args": {},
@@ -11,55 +11,55 @@
         "normalization": "meanstd"
     },
     "bilateral": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=YqiokoAv",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/bilateral.pt",
         "checksum": "3e0722758e31c92fd11cc8bb9f5d2e93714a885bbe21183228fbe8946e79d5ff",
         "model_name": "Model",
         "args": {"cuda": true},
         "kwargs": {},
         "normalization": "meanstd"
     },
     "cnn10": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=LGCIEJ1u",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/cnn10.pt",
         "checksum": "ee96e494c8eda02e946464df29cf43c14163e63bc4b1df6d8907ee33a1a18c3e",
         "model_name": "Model",
         "args": {},
         "kwargs": {},
         "normalization": "meanstd"
     },
     "dugan": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=ljCok94w",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/dugan.pt",
         "checksum": "c6faad8376994fe22eadc0a723f81705fda6f435f40ef0f999666e871216ac5e",
         "model_name": "Model",
         "args": {},
         "kwargs": {"out_ch": 96},
         "normalization": "meanstd"
     },
     "qae": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=5LiIE1Em",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/qae.pt",
         "checksum": "ca6ce543b476d4c3673acd2f3d1284d47d98483c41a61cc912f5214c80921363",
         "model_name": "Model",
         "args": {},
         "kwargs": {},
         "normalization": "meanstd"
     },
     "redcnn": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=2GCokzaQ",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/redcnn.pt",
         "checksum": "058ae58264a00116e5620ed88047e687c7e93c8b732bfd6d7796e76bae0c74f3",
         "model_name": "Model",
         "args": {},
         "kwargs": {"out_ch": 96},
         "normalization": "meanstd"
     },
     "transct": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=6aiIEj6Q",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/transct.pt",
         "checksum": "37b5af1cb4fdfaa0c1d67d5992a5e9c6da5b6afda59bf5274d1fe5db163f959e",
         "model_name": "Model",
         "args": {},
         "kwargs": {},
         "normalization": "meanstd"
     },
     "wganvgg": {
-        "url": "https://my.hidrive.com/api/sharelink/download?id=GHCoEb4H",
+        "url": "https://downloads.eeulig.com/ldct-benchmark/wganvgg.pt",
         "checksum": "1ad87d16aefea99a9eeb474c33127971131cc21a07e129bf464c764cf9ca6003",
         "model_name": "Model",
         "args": {},
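Each entry in `checkpoints.json` pairs the new download URL with a SHA-256 checksum, so a fetched checkpoint can be verified before it is loaded. A minimal sketch of such a verification step (the helper name `sha256_of_file` is illustrative, not part of the package):

```python
import hashlib


def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hex SHA-256 of a file, read in chunks to keep memory flat."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# A downloaded checkpoint would then be compared against the
# "checksum" field of its entry, e.g.:
#   assert sha256_of_file("resnet.pt") == entry["checksum"]
```

Verifying the digest guards against both truncated downloads and a URL that silently starts serving different content.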
43 changes: 29 additions & 14 deletions ldctbench/hub/utils.py
@@ -10,6 +10,11 @@
 from ldctbench.hub import Methods, load_model
 from ldctbench.utils.test_utils import denormalize, normalize
 
+SUPPORTED_TRANSFER_SYNTAX_UIDS = [
+    pydicom.uid.ExplicitVRLittleEndian,
+    pydicom.uid.RLELossless,
+]
+
 
 @torch.no_grad()
 def denoise_numpy(
@@ -162,9 +167,12 @@ def denoise_dicom(
     files = None
     if os.path.isdir(dicom_path):
         files = [
-            os.path.join(dicom_path, file)
-            for file in os.listdir(dicom_path)
-            if pydicom.misc.is_dicom(os.path.join(dicom_path, file))
+            os.path.join(dicom_path, item)
+            for item in os.listdir(dicom_path)
+            if (
+                not os.path.isdir(os.path.join(dicom_path, item))
+                and pydicom.misc.is_dicom(os.path.join(dicom_path, item))
+            )
         ]
     elif pydicom.misc.is_dicom(dicom_path):
         files = [dicom_path]
@@ -173,25 +181,24 @@
 
     # Iterate over filepaths
     for file in tqdm(files, desc="Denoise DICOMs", disable=disable_progress):
-        ds = pydicom.read_file(file)
+        ds = pydicom.filereader.dcmread(file)
 
-        # Training data had RescaleIntercept: -1024 HU and RescaleSlope: 1.0. raise a warning if given dicom has different values
+        # Training data had RescaleIntercept: -1024 HU and RescaleSlope: 1.0. Raise a warning if given dicom has different values
         intercept = getattr(ds, "RescaleIntercept", None)
         slope = getattr(ds, "RescaleSlope", None)
-        if not intercept or int(intercept) != -1024:
+        if intercept is None or int(intercept) != -1024:
             warnings.warn(
                 f"Expected DICOM data to have intercept -1024 but got {intercept} instead!"
             )
-        if not intercept or float(slope) != 1.0:
+        if slope is None or float(slope) != 1.0:
             warnings.warn(
                 f"Expected DICOM data to have slope 1.0 but got {slope} instead!"
             )
 
-        # Might be that this function doesn't properly handle DICOM files with compressed pixel data. Raise a warning.
-        if not ds.file_meta.TransferSyntaxUID == pydicom.uid.ExplicitVRLittleEndian:
-            ds.decompress()
-            warnings.warn(
-                f"TransferSyntaxUID (0002, 0010) must be {pydicom.uid.ExplicitVRLittleEndian} ({pydicom.uid.ExplicitVRLittleEndian.name}) but got {ds.file_meta.TransferSyntaxUID} ({ds.file_meta.TransferSyntaxUID.name}) instead! Used pydicoms .decompress()"
+        # Raise error if TransferSyntaxUID is not supported
+        if not ds.file_meta.TransferSyntaxUID in SUPPORTED_TRANSFER_SYNTAX_UIDS:
+            raise ValueError(
+                f"TransferSyntaxUID (0002, 0010) must be one of {', '.join([f'{item} ({item.name})' for item in SUPPORTED_TRANSFER_SYNTAX_UIDS])}) but got {ds.file_meta.TransferSyntaxUID} ({ds.file_meta.TransferSyntaxUID.name}) instead!"
             )
 
         # Denoise image data
@@ -208,8 +215,16 @@
             out=x_denoised,
         )
 
-        # Overwrite existing PixelData
-        ds.PixelData = x_denoised.astype(x.dtype).tobytes()
+        # Update PixelData, compressed if needed
+        if ds.file_meta.TransferSyntaxUID.is_compressed:
+            ds.compress(
+                ds.file_meta.TransferSyntaxUID,
+                x_denoised.astype(x.dtype),
+                generate_instance_uid=False,
+            )
+            print(f"Compressed again using {ds.file_meta.TransferSyntaxUID.name}")
+        else:
+            ds.PixelData = x_denoised.astype(x.dtype).tobytes()
 
         # Save as a new file
         new_filepath = os.path.join(savedir, os.path.basename(file))
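The rescale checks in `denoise_dicom` matter because stored DICOM pixel values map to Hounsfield units via `HU = RescaleSlope * stored_value + RescaleIntercept`, and the training data assumed slope 1.0 and intercept -1024. The switch from `not intercept` to `intercept is None` avoids treating a legitimate value of 0 as missing (and the old slope branch tested `intercept` by mistake, which this commit fixes). A small sketch of the rescale relation, independent of pydicom (the function name is illustrative):

```python
def pixels_to_hu(stored_value: float, slope: float = 1.0, intercept: float = -1024.0) -> float:
    """DICOM rescale: HU = RescaleSlope * stored_value + RescaleIntercept."""
    return slope * stored_value + intercept


# Under the training data's rescale settings (slope 1.0, intercept -1024):
print(pixels_to_hu(1024))  # -> 0.0 (water)
print(pixels_to_hu(0))     # -> -1024.0 (air)
```

If a file arrived with a different slope or intercept, the network would see inputs on a shifted scale, which is exactly why the code warns instead of silently proceeding.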
6 changes: 6 additions & 0 deletions ldctbench/utils/auxiliaries.py
@@ -26,6 +26,12 @@ def load_json(path: str):
     return d
 
 
+def save_json(content, path: str):
+    """Save python object to json file"""
+    with open(path, "w") as f:
+        json.dump(content, f)
+
+
 def dump_config(args: Namespace, path: str):
     """Save argparse.Namespace to yaml file"""
     with open(os.path.join(path, "args.yaml"), "w") as outfile:
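The new `save_json` helper complements the existing `load_json`. A quick round-trip sketch (the helper body mirrors the committed code; the temp-file path and example payload are illustrative):

```python
import json
import os
import tempfile


def save_json(content, path: str):
    """Save python object to json file (as added in this commit)."""
    with open(path, "w") as f:
        json.dump(content, f)


# Round-trip a small config dict through a temporary file.
path = os.path.join(tempfile.mkdtemp(), "args.json")
save_json({"lr": 1e-4, "epochs": 50}, path)
with open(path) as f:
    restored = json.load(f)

print(restored)  # -> {'lr': 0.0001, 'epochs': 50}
```

Note that JSON round-trips dict keys as strings and numbers as int/float, which is sufficient for the configuration payloads used here.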