
Deep Local-flatness Manifold Embedding (DLME)

The code includes the following modules:

  • Datasets (Digits, Coil20, Coil100, MNIST, EMNIST, KMNIST, Colon, Activity, MCA, Gast10k, Samusik, HCL)
  • Training for DLME
  • Evaluation metrics
  • Visualisation

Requirements

  • pytorch == 1.11.0
  • pytorch-lightning == 1.4.8
  • torchvision == 0.12.0
  • scipy == 1.8.0
  • numpy == 1.18.5
  • scikit-learn == 1.0
  • matplotlib == 3.4.3
  • wandb == 0.12.5
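
To confirm the pinned versions are installed before training, a quick convenience check (not part of the repository):

import torch, pytorch_lightning, torchvision, scipy, numpy, sklearn, matplotlib, wandb

# Print installed versions to compare against the pins above.
for mod in (torch, pytorch_lightning, torchvision, scipy, numpy, sklearn, matplotlib, wandb):
    print(mod.__name__, mod.__version__)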

Description

  • ./eval
    • ./eval/eval_core.py -- The code for evaluating the embeddings
  • ./Loss -- Loss computation
    • ./Loss/dmt_loss_aug.py -- The DLME loss
    • ./Loss/dmt_loss_source.py -- The loss-function template
  • ./sweep -- The YAML files for grid search
  • ./nuscheduler.py -- Learning-rate scheduling
  • ./main.py -- End-to-end training of the DLME model (see the sketch after this list)
  • ./load_data_f -- The dataloaders
    • ./load_data_f/source.py -- The dataset template
    • ./load_data_f/dataset.py -- The DLME dataset
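
To illustrate how these modules fit together, below is a minimal training sketch in plain PyTorch. The identifiers DLMEDataset, DMTLossAug, and NuScheduler are assumptions made for illustration and may not match the actual names in this repository; main.py is the authoritative entry point.

import torch
from torch.utils.data import DataLoader

# Hypothetical imports: the real classes live in load_data_f/dataset.py,
# Loss/dmt_loss_aug.py, and nuscheduler.py, possibly under different names.
from load_data_f.dataset import DLMEDataset  # assumed name
from Loss.dmt_loss_aug import DMTLossAug     # assumed name
from nuscheduler import NuScheduler          # assumed name

dataset = DLMEDataset(name="Digits")         # one of the supported datasets
loader = DataLoader(dataset, batch_size=256, shuffle=True)

encoder = torch.nn.Sequential(               # stand-in encoder; main.py defines the real model
    torch.nn.Linear(64, 256), torch.nn.ReLU(), torch.nn.Linear(256, 2))
loss_fn = DMTLossAug()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
scheduler = NuScheduler(optimizer)           # adjusts the learning rate over training

for x, x_aug in loader:                      # DLME builds pairs via data augmentation
    emb, emb_aug = encoder(x), encoder(x_aug)
    loss = loss_fn(emb, emb_aug)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()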

Baseline Methods

The compared methods include two manifold learning methods (UMAP and t-SNE) and three deep manifold learning methods (PHATE, ivis, and parametric UMAP (P-UMAP)).
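
As a point of reference, the classical baselines can be reproduced with off-the-shelf libraries. Below is a minimal sketch using scikit-learn's t-SNE on the Digits dataset (scikit-learn is already pinned in the requirements; UMAP would additionally need the umap-learn package, which is not listed above).

from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

# Embed the 1797 x 64 Digits dataset into two dimensions with t-SNE.
X, y = load_digits(return_X_y=True)
embedding = TSNE(n_components=2, init="pca", random_state=0).fit_transform(X)
print(embedding.shape)  # (1797, 2)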

Dataset

The datasets include six simple image datasets (Digits, Coil20, Coil100, MNIST, EMNIST, KMNIST) and six biological datasets (Colon, Activity, MCA, Gast10k, Samusik, and HCL).
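
The simple image datasets are available through standard loaders; for example, MNIST and KMNIST can be fetched with torchvision, which is already in the requirements (the repository's own dataloaders live in ./load_data_f). The biological datasets are not bundled with torchvision and must be obtained separately.

import torchvision
from torchvision import transforms

# Download two of the image datasets with torchvision; the root path is illustrative.
to_tensor = transforms.ToTensor()
mnist = torchvision.datasets.MNIST(root="./data", train=True, download=True, transform=to_tensor)
kmnist = torchvision.datasets.KMNIST(root="./data", train=True, download=True, transform=to_tensor)
print(len(mnist), len(kmnist))  # 60000 60000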

Running the code

  1. Install the required dependency packages and configure wandb following the official instructions.

  2. To get the grid search results, run

wandb sweep sweep/sweep_base_line.yaml

and the terminal will print the sweep ID:

(torch1.8) root@4fisk2abvqo3c-0:/zangzelin/project/dlme_eccv2022# wandb sweep sweep/sweep_base_line.yaml 
wandb: Creating sweep from: sweep/sweep_base_line.yaml
wandb: Created sweep with ID: 1frm0208
wandb: View sweep at: https://wandb.ai/cairi/DLME_ECCV2022/sweeps/1frm0208
wandb: Run sweep agent with: wandb agent cairi/DLME_ECCV2022/1frm0208

Here cairi/DLME_ECCV2022/1frm0208 is the sweep ID; start the search by running the agent command printed in the output, i.e. wandb agent cairi/DLME_ECCV2022/1frm0208.
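
The sweep can also be driven from Python with wandb's public API. A minimal sketch, assuming a train() function that reads its hyperparameters from wandb.config ("lr" below is an assumed sweep parameter; the sweep ID is the one printed above):

import wandb

def train():
    # Each agent run receives one hyperparameter combination via wandb.config.
    run = wandb.init()
    lr = wandb.config.get("lr", 1e-3)  # assumed sweep parameter
    # ... build and train the model with this configuration ...
    run.finish()

# Launch an agent that pulls runs from the sweep created above.
wandb.agent("cairi/DLME_ECCV2022/1frm0208", function=train, count=5)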

  3. My replication results:

https://www.wolai.com/zangzelin/gsKxT6fHMtnuwTrLprJmWb

If you find this file useful in your research, please consider citing:

@article{zang2022dlme,
  title={DLME: Deep Local-flatness Manifold Embedding},
  author={Zang, Zelin and Li, Siyuan and Wu, Di and Wang, Ge and Shang, Lei and Sun, Baigui and Li, Hao and Li, Stan Z},
  journal={arXiv preprint arXiv:2207.03160},
  year={2022}
}
@InProceedings{10.1007/978-3-031-19803-8_34,
author="Zang, Zelin
and Li, Siyuan
and Wu, Di
and Wang, Ge
and Wang, Kai
and Shang, Lei
and Sun, Baigui
and Li, Hao
and Li, Stan Z.",
editor="Avidan, Shai
and Brostow, Gabriel
and Ciss{\'e}, Moustapha
and Farinella, Giovanni Maria
and Hassner, Tal",
title="DLME: Deep Local-Flatness Manifold Embedding",
booktitle="Computer Vision -- ECCV 2022",
year="2022",
publisher="Springer Nature Switzerland",
address="Cham",
pages="576--592",
abstract="Manifold learning (ML) aims to seek low-dimensional embedding from high-dimensional data. The problem is challenging on real-world datasets, especially with under-sampling data, and we find that previous methods perform poorly in this case. Generally, ML methods first transform input data into a low-dimensional embedding space to maintain the data's geometric structure and subsequently perform downstream tasks therein. The poor local connectivity of under-sampling data in the former step and inappropriate optimization objectives in the latter step leads to two problems: structural distortion and underconstrained embedding. This paper proposes a novel ML framework named Deep Local-flatness Manifold Embedding (DLME) to solve these problems. The proposed DLME constructs semantic manifolds by data augmentation and overcomes the structural distortion problem using a smoothness constrained based on a local flatness assumption about the manifold. To overcome the underconstrained embedding problem, we design a loss and theoretically demonstrate that it leads to a more suitable embedding based on the local flatness. Experiments on three types of datasets (toy, biological, and image) for various downstream tasks (classification, clustering, and visualization) show that our proposed DLME outperforms state-of-the-art ML and contrastive learning methods.",
isbn="978-3-031-19803-8"
}

License

DLME is released under the MIT license.
