updated references

This commit is contained in:
N_T 2025-02-20 09:49:35 +08:00
parent 16f0f351ac
commit 3b73717017
5 changed files with 35 additions and 8 deletions


@@ -98,7 +98,7 @@ definitely the most powerful set of tools we have for non-linear problems.
That not all the details are fully worked out and nicely written up
shouldn't stop us from including these powerful methods in our numerical toolbox.
-### Reconciling DL and simulations
+### Reconciling AI and simulations
Taking a step back, the aim of this book is to build on all the powerful techniques that we have
at our disposal for numerical simulations, and use them wherever we can in conjunction
@@ -108,9 +108,9 @@ As such, a central goal is to _reconcile_ the AI viewpoint with physical simulat
```{admonition} Goals of this document
:class: tip
The key aspects that we will address in the following are:
-- explain how to use deep learning techniques to solve PDE problems,
+- how to use deep learning techniques to **solve PDE** problems,
 - how to combine them with **existing knowledge** of physics,
-- without **discarding** our knowledge about numerical methods.
+- without **discarding** numerical methods.
At the same time, it's worth noting what we won't be covering:
- there's no in-depth **introduction** to deep learning and numerical simulations (other great works already cover these topics),


@@ -7,7 +7,7 @@ Many simulation problems like fluid flows are often poorly represented by a sing
## Diffusion Graph Nodes
-In the following, we'll demonstrate these capabilities based on the _diffusion graph net_ (DGN) approach {cite}`lino2024dgn`, the full source code for which [can be found here](https://github.com/tum-pbs/dgn4cfd/).
+In the following, we'll demonstrate these capabilities based on the _diffusion graph net_ (DGN) approach {cite}`lino2025dgn`, the full source code for which [can be found here](https://github.com/tum-pbs/dgn4cfd/).
To learn the probability distribution of dynamical states of physical systems, defined by their discretization mesh and their physical parameters, the DDPM and flow matching frameworks can be applied directly to the mesh nodes. Additionally, DGN introduces a second model variant that operates in a pre-trained semantic _latent space_ rather than directly in the physical space (this variant will be called LDGN).
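To make the node-based training idea concrete, here is a minimal NumPy sketch of conditional flow matching applied directly to per-node features of a mesh. The shapes, the zero "model", and all variable names are illustrative assumptions, not the DGN implementation:

```python
# Hypothetical sketch: flow matching on mesh-node features (not the DGN code).
import numpy as np

rng = np.random.default_rng(0)

num_nodes, feat_dim = 128, 3                      # assumed mesh size / channels
x1 = rng.standard_normal((num_nodes, feat_dim))   # stand-in for a "data" state
x0 = rng.standard_normal((num_nodes, feat_dim))   # Gaussian noise sample

t = rng.uniform()                                 # sampled flow time in [0, 1]
xt = (1.0 - t) * x0 + t * x1                      # linear interpolation path
v_target = x1 - x0                                # flow-matching velocity target

# A real DGN would evaluate a graph network on (xt, t, mesh edges);
# a zero "model" here only illustrates the per-node regression loss.
v_pred = np.zeros_like(v_target)
loss = np.mean((v_pred - v_target) ** 2)
```

The key point is that the probability path and loss are defined per node, so the same recipe works for any mesh discretization.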


@@ -14,10 +14,38 @@
-@inproceedings{lino2024dgn,
+@inproceedings{lino2025dgn,
 title={Diffusion Graph Networks},
 author={Mario Lino and Tobias Pfaff and Nils Thuerey},
-booktitle={International Conference on Machine Learning},
+booktitle={International Conference on Learning Representations},
 year={2025}
 }
+@inproceedings{liu2025config,
+title={ConFIG: Towards Conflict-free Training of Physics Informed Neural Networks},
+author={Qiang Liu and Mengyu Chu and Nils Thuerey},
+booktitle={International Conference on Learning Representations},
+year={2025}
+}
+@inproceedings{bhatia2025prdp,
+title={Progressively Refined Differentiable Physics},
+author={Kanishk Bhatia and Felix Koehler and Nils Thuerey},
+booktitle={International Conference on Learning Representations},
+year={2025}
+}
+@inproceedings{shehata2025trunc,
+title={Truncation Is All You Need: Improved Sampling Of Diffusion Models For Physics-Based Simulations},
+author={Youssef Shehata and Benjamin Holzschuh and Nils Thuerey},
+booktitle={International Conference on Learning Representations},
+year={2025}
+}
+@inproceedings{schnell2025td,
+title={Temporal Difference Learning: Why It Can Be Fast and How It Will Be Faster},
+author={Patrick Schnell and Luca Guastoni and Nils Thuerey},
+booktitle={International Conference on Learning Representations},
+year={2025}
+}
@@ -28,7 +56,6 @@
year={2024}
}
@inproceedings{liu2024airfoils,
title={Uncertainty-aware Surrogate Models for Airfoil Flow Simulations with Denoising Diffusion Probabilistic Models},
author={Liu, Qiang and Thuerey, Nils},

Binary file not shown.


@@ -75,7 +75,7 @@ A 3x3 convolution (orange) shown for differently deformed regular multi-block gr
For unstructured data, graph neural networks (GNNs) are a good choice. While they're often discussed in terms of _message-passing_ operations,
they share a lot of similarities with structured grids: the basic operation of a message-passing step on a GNN is equivalent to a convolution on a grid {cite}`sanchez2020learning`.
-Hierarchies can likewise be constructed by graph coarsening {cite}`lino2024dgn`. Hence, while we'll primarily discuss grids below, keep in mind that the approaches carry over to GNNs. As dealing with graph structures makes the implementation more complicated, we won't go into details until later.
+Hierarchies can likewise be constructed by graph coarsening {cite}`lino2025dgn`. Hence, while we'll primarily discuss grids below, keep in mind that the approaches carry over to GNNs. As dealing with graph structures makes the implementation more complicated, we won't go into details until later.
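The stated equivalence between a message-passing step and a grid convolution can be checked numerically. This is a hypothetical NumPy sketch (made-up weights, 1D chain graph), not code from the cited works:

```python
# Sketch: one shared-weight message-passing step on a 1D chain graph
# reproduces a 3-tap convolution on the corresponding grid.
import numpy as np

x = np.arange(8, dtype=float)                  # node features = grid values
w_left, w_center, w_right = 0.25, 0.5, 0.25    # shared kernel weights

# Grid view: valid 1D convolution with kernel [w_left, w_center, w_right].
conv = w_left * x[:-2] + w_center * x[1:-1] + w_right * x[2:]

# Graph view: each interior node aggregates messages from its two neighbors.
edges = [(i, i + 1) for i in range(7)]         # undirected chain i -- i+1
msg = np.zeros_like(x)
for i, j in edges:
    msg[j] += w_left * x[i]                    # message from left neighbor
    msg[i] += w_right * x[j]                   # message from right neighbor
mp = (w_center * x + msg)[1:-1]                # interior nodes only

assert np.allclose(conv, mp)                   # both views agree
```

On an irregular graph the neighbor count and edge weights vary per node, which is exactly where message passing generalizes beyond the fixed convolution stencil.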
```{figure} resources/arch02.jpg
---