Welcome ...
============================
Welcome to the Physics-based Deep Learning Book 👋
**TL;DR**: This document covers a variety of ways to combine physical simulations with deep learning.
As much as possible, the algorithms come with hands-on code examples so you can get started quickly.
Beyond standard _supervised_ learning from data, we'll look at physical loss constraints and
more tightly coupled learning algorithms with differentiable simulations.
```{figure} resources/teaser.png
---
height: 220px
name: pbdl-teaser
---
Some visual examples of hybrid solvers, i.e. numerical simulators that are enhanced by trained neural networks.
```
% Teaser, simple version:
% ![Teaser, simple version](resources/teaser.png)
## Coming up
As a _sneak preview_, the next chapters will show:

- How to train networks to infer fluid flow solutions around shapes like airfoils in one go, i.e., without needing a simulator.
- How to use model equations as a residual loss to train networks that represent solutions, and how to improve upon this behavior by using differentiable simulations (a minimal sketch of such a residual loss follows after this list).
- How to even more tightly couple a full, if _rough_, simulator for control problems. E.g., we'll demonstrate how to circumvent the convergence problems of standard reinforcement learning techniques by leveraging simulators in the training loop.
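To make the second point a bit more concrete, here is a minimal sketch of such a physics-residual loss. It is purely illustrative (a toy ODE, du/dx = -u with u(0) = 1, and a small PyTorch network chosen just for this intro, not code from the later chapters): the network is trained so that the residual of the model equation, evaluated with automatic differentiation, vanishes at randomly sampled points.

```python
# Illustrative sketch only: train a network u(x) to satisfy du/dx = -u with u(0) = 1,
# i.e., to approximate exp(-x), purely from the residual of the model equation.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.rand(128, 1, requires_grad=True)       # collocation points in [0,1]
    u = net(x)
    du_dx, = torch.autograd.grad(u.sum(), x, create_graph=True)
    residual = du_dx + u                              # model equation: du/dx + u = 0
    u0 = net(torch.zeros(1, 1))                       # initial condition u(0) = 1
    loss = (residual ** 2).mean() + ((u0 - 1.0) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The later chapters apply this principle to actual PDEs and, as announced above, combine it with differentiable simulations.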
This _book_, where "book" stands for a collection of text, equations, images, and code examples,
is maintained by the
[TUM Physics-based Simulation Group](https://ge.in.tum.de). Feel free to contact us via
[old-fashioned email](mailto:i15ge@cs.tum.edu) if you have any comments.
If you find mistakes, please also let us know! We're aware that this document is far from perfect,
and we're eager to improve it. Thanks in advance!
This collection of materials is a living document, and will grow and change over time.
Feel free to contribute 😀
We also maintain a [link collection](https://github.com/thunil/Physics-Based-Deep-Learning) with recent research papers.
```{admonition} Code, executable, right here, right now
:class: tip
We focus on Jupyter notebooks, a key advantage of which is that all code examples
can be executed _on the spot_, right from your browser. You can modify things and
immediately see what happens -- give it a try...
<br><br>
Oh, and it's great because it's [literate programming](https://en.wikipedia.org/wiki/Literate_programming).
```
---
## Thanks!
The contents of the following chapters would not have been possible without the help of many people. Here's an alphabetical list -- big kudos to everyone 🙏
- [Li-wei Chen](https://ge.in.tum.de/about/dr-liwei-chen/)
- [Philipp Holl](https://ge.in.tum.de/about/)
- [Patrick Schnell](https://ge.in.tum.de/about/patrick-schnell/)
- [Nils Thuerey](https://ge.in.tum.de/about/n-thuerey/)
- [Kiwon Um](https://ge.in.tum.de/about/kiwon/)
---
## TODOs, planned content
A loose collection of notes and TODOs:

General physics & DL, intro & textual overview

- Intro phys loss example, parabola example

Supervised, simple starting point

- AIAA supervised learning, idp_weissenov/201019-upd-arxiv-v2/ {cite}`thuerey2020deepFlowPred`
  skepticism?, started colab -> https://colab.research.google.com/drive/11KUe5Ybuprd7_qmNTe1nvQVUz3W6gRUo
  torch version 1.7 [upd from Liwei?]
- surrogates, shape opt?

Physical losses

- vs. PINNs [alt.: neural ODEs, PDE net?], all using GD (optional, PINNs could use BFGS)
  [PINNs], phiflow example -> convert to colab
- PINNs -> are unsupervised a la Tompson; all DL NNs are "supervised" during learning, unsup just means not precomputed and goes through a function
- add image | NN | <> | Loss |, backprop; (bring back every section, add variants for other methods?)
- discuss CG solver, Tompson as basic "unsupervised" example?
Diff phys, start with overview of idea: gradients via autodiff, then run GD (see the sketch after this list)

- illustrate and discuss gradients -> mult. for chain rule; (later: more general PG chain w/ func composition)
- Differentiable Physics (w/o network), {cite}`holl2019pdecontrol`
  -> phiflow colab notebook good start, but needs updates (see above, Jan 2)
- SOL_201019-finals_Solver-in-the-Loop-Main-final.pdf, {cite}`um2020sol`
  numerical errors, how to include in jupyter / colab?
- ICLR_190925-ICLR-final_1d8cf33bb3c8825e798f087d6cd35f2c7c062fd4.pdf alias
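A minimal sketch of the differentiable-physics idea above, purely for illustration (a toy point-mass simulator with hypothetical names, written in PyTorch so autodiff provides the gradients): the simulator is expressed with differentiable operations, and plain gradient descent optimizes a control parameter through the full rollout.

```python
# Illustrative sketch only: a tiny differentiable "simulator" (explicit Euler steps of a
# point mass) plus gradient descent on a control force, back-propagating through the rollout.
import torch

def simulate(force, steps=20, dt=0.1):
    """Roll out a 1D point mass driven by a constant control force; fully differentiable."""
    pos = torch.zeros(1)
    vel = torch.zeros(1)
    for _ in range(steps):
        vel = vel + dt * force   # acceleration from the control force (unit mass)
        pos = pos + dt * vel     # explicit Euler position update
    return pos

force = torch.zeros(1, requires_grad=True)    # control parameter to optimize
target = torch.tensor([1.0])                   # desired final position
optimizer = torch.optim.SGD([force], lr=0.1)

for it in range(100):
    loss = (simulate(force) - target).pow(2).sum()  # loss on the final simulated state
    optimizer.zero_grad()
    loss.backward()                                  # gradients flow through all solver steps
    optimizer.step()
```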
PDE control, control focused

- https://github.com/holl-/PDE-Control -> update to new version?

beyond GD: re-cap Newton & co (see the update formulas below)

- Phys grad (PGs) as fundamental improvement, PNAS case; add more complex one?
- PG update of poisson eq? see PNAS-template-main.tex.bak01-poissonUpdate, explicitly lists GD and PG updates
- PGa 2020 Sept, content: ML & opt
  Gradients.pdf, -> overleaf-physgrad/
- PGb 201002-beforeVac, content: v1, v2, old - more PG focused
  -> general intro versions
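As a quick reminder for the "beyond GD" recap, the standard updates to contrast with the PG update (textbook formulas, nothing PG-specific): for a scalar loss $L(x)$ and step size $\eta$,

$$
\text{GD:} \quad x_{k+1} = x_k - \eta \, \nabla_x L(x_k),
\qquad
\text{Newton:} \quad x_{k+1} = x_k - \eta \, \big(\nabla^2_x L(x_k)\big)^{-1} \nabla_x L(x_k).
$$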
TODO, for version 2.x add:

- [time series, sequence prediction?] {cite}`wiewel2019lss,bkim2019deep,wiewel2020lsssubdiv`
  include DeepFluids variant?
- [BAYES, prob?]
  include results Jakob
- [unstruct / lagrangian] {cite}`prantl2019tranquil,ummenhofer2019contconv`
  include ContConv / Lukas
---
_Misc Jupyter Book TODOs_
- Fix LaTeX PDF output