
Commit

add arxiv link

BenediktAlkin committed Feb 20, 2024
1 parent d6baccd commit 633ad5e
Showing 3 changed files with 22 additions and 43 deletions.
58 changes: 16 additions & 42 deletions README.md
@@ -1,53 +1,27 @@
-Code and instructions are coming soon.
-
-
-[[Project Page](https://ml-jku.github.io/UPT)]
+[[Project Page](https://ml-jku.github.io/UPT)] [[Paper (arxiv)](https://arxiv.org/abs/2402.12365)] [[BibTeX](https://github.com/ml-jku/UPT#citation)]
 
 
 
 **U**niversal **P**hysics **T**ransformers (UPTs) are a novel learning paradigm that can model a wide range of
 spatio-temporal problems - both for Lagrangian and Eulerian discretization schemes.
 
+# Train your own models
 
-<p align="center">
-<img width="100%" alt="schematic" src="https://raw.githubusercontent.com/ml-jku/UPT/main/.github/imgs/schematic.png">
-</p>
-
-
-The architecture of UPT consists of an encoder, an approximator and a decoder. The encoder is responsible to encode
-the physics domain into a latent representation, the approximator propagates the latent representation forward in time
-and the decoder transforms the latent representation back to the physics domain.
+Instructions to setup the codebase on your own environment are provided in
+[SETUP_CODE](https://github.com/ml-jku/MIM-Refiner/blob/main/SETUP_CODE.md),
+[SETUP_DATA](https://github.com/ml-jku/MIM-Refiner/blob/main/SETUP_DATA.md).
 
-<p align="center">
-<img width="80%" alt="architecture1" src="https://raw.githubusercontent.com/ml-jku/UPT/main/.github/imgs/architecture1.svg">
-</p>
-
-
-To enforce the responsibilities of each component, inverse encoding and decoding tasks are added.
+Configurations to train, evaluate or analyze models can be found [here](https://github.com/ml-jku/MIM-Refiner/tree/main/src/yamls).
 
+# Citation
 
-<p align="center">
-<img width="80%" alt="architecture2" src="https://raw.githubusercontent.com/ml-jku/UPT/main/.github/imgs/architecture2.svg">
-</p>
-
-
-
-UPTs can model transient flow simulations (Eulerian discretization scheme) as indicated by test loss and rollout performance (measured via correlation time):
+If you like our work, please consider giving it a star :star: and cite us
 
-<p align="center">
-<img width="48%" alt="cfd_scaling_testloss" src="https://raw.githubusercontent.com/ml-jku/UPT/main/.github/imgs/cfd_scaling_testloss.svg">
-<img width="48%" alt="cfd_scaling_corrtime" src="https://raw.githubusercontent.com/ml-jku/UPT/main/.github/imgs/cfd_scaling_corrtime.svg">
-</p>
-
-
-<p align="center">
-<img width="100%" alt="cfd_rollout" src="https://raw.githubusercontent.com/ml-jku/UPT/main/.github/imgs/cfd_rollout.png">
-</p>
-
-
-UPTs can also model the flow-field of particle based simulations (Lagrangian discretization scheme):
-
-<p align="center">
-<img width="100%" alt="lagrangian_field" src="https://raw.githubusercontent.com/ml-jku/UPT/main/.github/imgs/lagrangian_field.png">
-</p>
-Particles show the ground truth velocities of particles and the white arrows show the learned velocity field of a UPT model evaluated on the positions of a regular grid.
+```
+@article{alkin2024upt,
+title={Universal Physics Transformers},
+author={Benedikt Alkin and Andreas Fürst and Simon Schmid and Lukas Gruber and Markus Holzleitner and Johannes Brandstetter},
+journal={arXiv preprint arXiv:2402.12365},
+year={2024}
+}
+```
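The encoder / approximator / decoder pipeline described in the README text (encode the physics domain into a latent representation, propagate it forward in time, decode back to the physics domain) can be sketched roughly as follows. This is a toy illustration only: the function names, the hand-written "latent" features, and the decay dynamics are stand-ins, not the actual learned UPT components.

```python
# Toy sketch of the UPT encode -> approximate -> decode structure.
# All names and dynamics here are illustrative, not the real model.

def encode(state):
    # Compress a physics state (a list of node values) into a small latent.
    # A learned encoder would produce latent tokens; here we hand-craft them.
    mean = sum(state) / len(state)
    spread = max(state) - min(state)
    return (mean, spread, len(state))

def approximate(latent):
    # Propagate the latent one step forward in time. A learned approximator
    # would model the latent dynamics; here a simple decay stands in.
    mean, spread, n = latent
    return (0.9 * mean, 0.9 * spread, n)

def decode(latent):
    # Map the latent back to the physics domain at the query positions.
    mean, spread, n = latent
    return [mean for _ in range(n)]

def rollout(state, steps):
    # Multi-step rollouts stay entirely in latent space and decode only
    # once at the end, instead of re-encoding after every step.
    latent = encode(state)
    for _ in range(steps):
        latent = approximate(latent)
    return decode(latent)

print(rollout([1.0, 2.0, 3.0], steps=2))
```

The inverse encoding and decoding tasks mentioned in the old README text would, in this picture, train `decode(encode(state))` to reconstruct `state`, keeping each component's responsibility well separated.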
5 changes: 5 additions & 0 deletions SETUP_DATA.md
@@ -37,5 +37,10 @@ rm -rf ./training_data/param8/c5079a5b8d59220bc3fb0d224baae2a
 Preprocess the data by using the folder of the downloaded dataset as SRC_FOLDER.
 `python analysis/data/shapenetcar/preprocess --src <SRC_FOLDER> --dst <DST_FOLDER>`
 
+# Transient Flow Dataset
+
+Coming shortly.
+
+
 # Lagrangian
 No preprocessing needed. Datasets will be downloaded automatically when using a Lagrangian dataset.
2 changes: 1 addition & 1 deletion docs/index.md
@@ -1,6 +1,6 @@
 
 
-[[Code](https://github.com/ml-jku/UPT)]
+[[Code](https://github.com/ml-jku/UPT)] [[Paper (arxiv)](https://arxiv.org/abs/2402.12365)] [[BibTeX](https://github.com/ml-jku/UPT#citation)]
 
 
 **U**niversal **P**hysics **T**ransformers (UPTs) are a novel learning paradigm that can model a wide range of
