This repository contains the main code for the paper "Tactile Functasets: Neural Implicit Representations of Tactile Datasets" by Sikai Li, Samanta Rodriguez, Yiming Dou, Andrew Owens, Nima Fazeli.
🦾 A tactile functaset (TactFunc) reconstructs the high-dimensional raw tactile dataset by training neural implicit functions, producing compact representations that capture the underlying structure of the tactile sensory inputs. We demonstrate the efficacy of this representation on the downstream task of in-hand object pose estimation, achieving improved performance over image-based methods while simplifying downstream models.
This codebase contains implementations of:
- Meta-learning for Bubble and Gelslim tactile datasets.
- Conversion from raw tactile datasets to functasets.
- Inference over tactile functasets.
- Downstream models for in-hand object pose estimation.
- Baselines: ResNet-18, Variational Autoencoder (VAE), and T3.
The Bubble and Gelslim datasets are from "Touch2Touch: Cross-Modal Tactile Generation for Object Manipulation" and can be found here.
Clone the repository and cd into its root directory.
Set up a Python environment with the required dependencies using conda:
conda create -n <name> python=3.10
conda activate <name>
pip install -r requirements.txt
# To use GPU (NVIDIA, CUDA 12) with jax, install jax with
pip install -U "jax[cuda12]"
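To confirm that the GPU backend is active, a quick check with the standard JAX API:

# Sanity check that JAX was installed with CUDA support.
import jax
print(jax.default_backend())  # "gpu" if the CUDA backend is active, otherwise "cpu"
print(jax.devices())          # e.g. [CudaDevice(id=0)]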
The project directory should be structured as follows:
.
├── assets
├── baselines                     # implementations of baselines: VAE, T3 and ResNet-18
├── data
│   ├── datasets                  # folders to set up bubble/gelslim
│   │   ├── bubble
│   │   │   ├── bubble.py
│   │   │   └── checksum.tsv
│   │   └── gelslim
│   │       ├── gelslim.py
│   │       └── checksum.tsv
│   ├── functasets                # tactile functasets
│   │   ├── bubble
│   │   ├── combined
│   │   └── gelslim
│   └── meta_learned              # meta-learned trunk model checkpoints
│       ├── bubble_pt_dataset
│       ├── combined_pt_dataset
│       └── gelslim_pt_dataset
├── data_utils.py
├── function_reps.py
├── helpers.py
├── pytree_conversions.py
├── README.md
├── requirements.txt
├── tactile_functaset_writer.py
└── tactile_meta_learning.py
- Download the Bubble and Gelslim datasets from here.
- Preprocess the datasets. The steps below assume both datasets are saved under the following paths:
# Bubble Dataset
/data/functa/bubble/ # All bubble data points are under this folder, with file names from "data_0.pt" to "data_16127.pt"
# Gelslim Dataset
/data/functa/gelslim/ # All gelslim data points are under this folder, with file names from "data_0.pt" to "data_16127.pt"
The bubble tactile sensor data is transformed into a TensorFlow Dataset (tfds) format, enabling seamless integration with JAX/Haiku models while providing access to tfds's diverse data processing capabilities. The input is derived by normalizing the difference between post-contact and pre-contact images from the left-hand-side sensor, which is paired with the corresponding in-hand object pose vector for downstream tasks.
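As a rough sketch of that input construction (the .pt field names left_post, left_pre, and pose below are hypothetical; check the Touch2Touch data layout for the real keys):

# Sketch of the bubble input construction; field names are assumptions.
import torch

sample = torch.load("/data/functa/bubble/data_0.pt")
# Difference between post-contact and pre-contact left-sensor images.
diff = sample["left_post"].float() - sample["left_pre"].float()
# Normalize the difference image before writing it into the tfds records.
diff = (diff - diff.min()) / (diff.max() - diff.min() + 1e-8)
pose = sample["pose"]  # paired in-hand object pose vector for downstream tasks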
Run:
cd data/datasets/bubble
tfds build --register_checksums
The gelslim tactile sensor data is likewise transformed into TensorFlow Dataset (tfds) format. Here the input is derived by normalizing the difference between post-contact and pre-contact images from the left-hand-side sensor and converting it to a single grayscale channel, again paired with the corresponding in-hand object pose vector for downstream tasks.
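The grayscale step can be sketched the same way (field names are again hypothetical, and the repo may use standard luminance weights rather than a plain channel mean):

# Gelslim variant: difference, grayscale conversion, then normalization.
import torch

sample = torch.load("/data/functa/gelslim/data_0.pt")
diff = sample["left_post"].float() - sample["left_pre"].float()  # (3, H, W) RGB difference
gray = diff.mean(dim=0, keepdim=True)  # channel mean as a simple grayscale proxy
gray = (gray - gray.min()) / (gray.max() - gray.min() + 1e-8)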
Run:
cd data/datasets/gelslim
tfds build --register_checksums
After setting up the Bubble and Gelslim datasets, set exp.dataset.name in tactile_meta_learning.py's get_config() to bubble_pt_dataset, gelslim_pt_dataset, or combined_pt_dataset, depending on the dataset you want. The combined dataset merges the bubble and gelslim datasets and is handled in data_utils.py.
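As an illustration only, the dataset switch inside get_config() might look like the snippet below, assuming an ml_collections-style ConfigDict (the exact nesting in the actual file may differ):

# Minimal stand-in for get_config(); only the dataset switch is shown.
from ml_collections import config_dict

def get_config():
    config = config_dict.ConfigDict()
    config.exp = config_dict.ConfigDict()
    config.exp.dataset = config_dict.ConfigDict()
    config.exp.dataset.name = "bubble_pt_dataset"  # or "gelslim_pt_dataset" / "combined_pt_dataset"
    return config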
Configure the parameters in tactile_meta_learning.py's get_config() and run:
python -m tactile_meta_learning --config=tactile_meta_learning.py
After training, the weights of the meta-learned trunk network are checkpointed to both ./tmp/training/{exp.dataset.name}/checkpoint.npz (overwritten on each run) and ./data/meta_learned/{exp.dataset.name}/checkpoint_{exp.model.width}w_{exp.model.depth}d_{exp.model.latent_dim}ld.npz for clearer bookkeeping.
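To sanity-check a saved checkpoint, something like the following works; the width/depth/latent-dim values in the filename are placeholders for whatever you configured:

# Inspect the arrays stored in a trunk checkpoint (filename values are examples).
import numpy as np

ckpt = np.load("./data/meta_learned/bubble_pt_dataset/checkpoint_256w_10d_128ld.npz", allow_pickle=True)
print(ckpt.files)  # names of the stored weight arrays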
Configure the paths for the pretrained trunk network weights and the output functaset in tactile_functaset_writer.py, and run:
python -m tactile_functaset_writer --type=DATA_TYPE # DATA_TYPE should be "bubble", "combined" or "gelslim"
After running, the functaset is saved as a .npz file to the directory you configured.
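Downstream models then consume this file instead of raw tactile images. A minimal loading sketch, assuming (hypothetically) that the archive stores latent modulations and paired poses under these keys:

# Load a written functaset; the filename and key names are assumptions.
import numpy as np

functaset = np.load("./data/functasets/bubble/bubble_functaset.npz", allow_pickle=True)
print(functaset.files)              # inspect what was actually stored
latents = functaset["modulations"]  # (N, latent_dim) compact representations
poses = functaset["poses"]          # (N, pose_dim) in-hand object pose labels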
Coming soon!
All compared images are the same size; apparent size differences in the renderings below are display artifacts only.
[Figure: meta-learned initialization | reconstruction (PSNR = 38.32) | target]
[Figure: meta-learned initialization | reconstruction (PSNR = 36.83) | target]
[Figure: reconstruction (PSNR = 27.32) | target]
[Figure: reconstruction (PSNR = 28.71) | target]
[Figure: reconstruction (PSNR = 13.86) | target]
[Figure: reconstruction (PSNR = 24.83) | target]
@misc{li2024tactilefunctasetsneuralimplicit,
title={Tactile Functasets: Neural Implicit Representations of Tactile Datasets},
author={Sikai Li and Samanta Rodriguez and Yiming Dou and Andrew Owens and Nima Fazeli},
year={2024},
eprint={2409.14592},
archivePrefix={arXiv},
primaryClass={cs.RO},
url={https://arxiv.org/abs/2409.14592},
}
This work is supported by NSF GRFP #2241144, NSF CAREER Awards #2339071 and #2337870, and NSF NRI #2220876.
The source code is licensed under Apache 2.0.
For more information, please contact skevinci@umich.edu.