updates intro & motivation

This commit is contained in:
NT 2021-01-04 19:34:34 +08:00
parent 4703b0ee07
commit 4c61648d7e
2 changed files with 157 additions and 14 deletions


@ -9,6 +9,16 @@ As much as possible, the algorithms will come with hands-on code examples to qui
Beyond standard _supervised_ learning from data, we'll look at loss constraints, and
more tightly coupled learning algorithms with differentiable simulations.
```{figure} ./resources/teaser.png
---
height: 220px
name: pbdl-teaser
---
Some examples ... preview teaser ...
```
As a _sneak preview_, in the next chapters we'll show:
- How to train networks to infer fluid flow solutions around shapes like airfoils in one go, i.e., without needing a simulator.
@ -24,7 +34,35 @@ is maintained by the
If you find mistakes, please also let us know! We're aware that this document is far from perfect,
and we're eager to improve it. Thanks in advance!
TODO, add teaser pic
This collection of materials is a living document, and will grow and change over time.
Feel free to contribute 😀
[TUM Physics-based Simulation Group](https://ge.in.tum.de).
We also maintain a [link collection](https://github.com/thunil/Physics-Based-Deep-Learning) with recent research papers.
```{admonition} Code, executable, right here, right now
:class: tip
We focus on Jupyter notebooks, a key advantage of which is that all code examples
can be executed _on the spot_, right in your browser. You can modify things and
immediately see what happens -- give it a try...
<br><br>
Oh, and it's great because it's [literate programming](https://en.wikipedia.org/wiki/Literate_programming).
```
## Specifically
To be a bit more specific: _physics_ is a huge field, and we can't cover everything...
```{note}
For now our focus is:
- field-based simulations (less focus on Lagrangian methods)
- simulations, not experiments
- the combination with _deep learning_ (there are plenty of other interesting ML techniques)
```
---
## Thanks!
@ -34,17 +72,26 @@ The contents of the following files would not have been possible without the hel
- Ms. y
- ...
% ----------------
---
## Planned content

Loose collection of notes and TODOs:
General physics & DL, intro & textual overview
more general intro: https://github.com/thunil/Physics-Based-Deep-Learning
Supervised? Airfoils? Liwei, simple example? app: optimization, shape opt w surrogates
@ -86,7 +133,23 @@ PGa 2020 Sept, content: ML & opt
PGb 201002-beforeVac, content: v1,v2,old - more PG focused
-> general intro versions
TODO, for version 2.x add:
- time series, sequence prediction? {cite}`wiewel2019lss,bkim2019deep,wiewel2020lsssubdiv`
include DeepFluids variant?
[BAYES , prob?]
include results Jakob
[unstruct / lagrangian] {cite}`prantl2019tranquil,ummenhofer2019contconv`
Outlook
include ContConv / Lukas
---
_Misc jupyter book TODOs_
- Fix latex PDF output
- How to include links in references?


@ -5,17 +5,62 @@ The following "book" targets _"Physics-Based Deep Learning"_ techniques
(PBDL), i.e., the field of methods combining physical modeling and
deep learning (DL) techniques. Here, DL will typically refer to methods based
on artificial neural networks. The general direction of PBDL represents a very
active, quickly growing and exciting field of research.
## Motivation
....
From weather forecasts (?) to quantum physics (?),
... more ...
using numerical analysis to obtain solutions for physical models has
become an integral part of science.
Among others, GPT-3
has recently demonstrated that learning models can
achieve astounding accuracy for processing natural language.
Also: AlphaGo; closer to physics: protein folding...
This is a vibrant, quickly developing field with vast possibilities.
At the same time, machine
learning technologies, and deep neural networks in particular,
have given rise to concerns that this technology has the potential
to replace the traditional, simulation-driven approach to
science. Instead of relying on models that are carefully crafted
from first principles, can data collections of sufficient size
be processed to provide the correct answers instead?
Very clear advantages of data-driven approaches would lead
to a "yes" here ... but that's not where we stand as of this writing.
Given the current state of the art, such clear breakthroughs
are still outstanding: the proposed techniques are novel,
sometimes difficult to apply, and
significant difficulties in combining physics and DL persist.
Also, many
fundamental theoretical questions remain unaddressed, most importantly
regarding data efficiency and generalization.
Over the course of the last decades,
highly specialized and accurate discretization schemes have
been developed to solve fundamental model equations such
as the Navier-Stokes, Maxwell's, or Schrödinger's equations.
Seemingly trivial changes to the discretization can determine
whether key phenomena are visible in the solutions or not.
```{admonition} Goal of this document
:class: tip
Thus, a key aspect that we want to address in the following is:
- explain how to use DL,
- and how to combine it with existing knowledge of physics and simulations,
- **without throwing away** all existing numerical knowledge and techniques!
```
Rather, we want to build on all the neat techniques that we have
at our disposal, and use them as
much as possible. I.e., our goal is to _reconcile_ the data-centered
viewpoint and the physical simulation viewpoint.
Also interesting: from a math standpoint ...
"just" non-linear optimization ...
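The "non-linear optimization" view can be written out. With the notation from the table further below ($x$ for NN inputs, $y$ for outputs, $\theta$ for the parameters), supervised training of a network $f$ amounts to

$$
\underset{\theta}{\text{arg min}} \; \sum_i \big\| f(x_i; \theta) - y_i \big\|_2^2 ,
$$

i.e., a non-linear least-squares problem over $\theta$, typically tackled with variants of stochastic gradient descent.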
## Categorization
@ -66,4 +111,39 @@ starting points with code examples, and illustrate pros and cons of the
different approaches. In particular, it's important to know in which scenarios
each of the different techniques is particularly useful.
## Deep Learning and Neural Networks
Very brief intro, basic equations... approximate $f(x)=y$ with NN ...
Details in [Deep Learning book](https://www.deeplearningbook.org)
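The "approximate $f(x)=y$ with an NN" idea can be made concrete in a few lines. Below is a minimal sketch (pure NumPy, hand-written backpropagation; all names are hypothetical, this is not one of the book's examples): a one-hidden-layer network fitted to $y=\sin(x)$ by gradient descent.

```python
# Minimal sketch: approximate f(x) = y with a small neural network.
# Pure NumPy, hand-written backprop; all names here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 256).reshape(-1, 1)  # NN input
y = np.sin(x)                                       # target NN output

# network parameters "theta": one hidden layer with 32 tanh units
w1 = rng.normal(0.0, 0.5, (1, 32)); b1 = np.zeros(32)
w2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)

lr = 0.1  # learning rate
for step in range(2000):
    h = np.tanh(x @ w1 + b1)   # hidden activations
    y_hat = h @ w2 + b2        # network prediction f(x; theta)
    err = y_hat - y            # gradient of the squared error (up to a factor)
    # backpropagate the error to all parameters
    grad_w2 = h.T @ err / len(x); grad_b2 = err.mean(0)
    dh = (err @ w2.T) * (1.0 - h**2)  # tanh' = 1 - tanh^2
    grad_w1 = x.T @ dh / len(x); grad_b1 = dh.mean(0)
    # gradient descent update
    w1 -= lr * grad_w1; b1 -= lr * grad_b1
    w2 -= lr * grad_w2; b2 -= lr * grad_b2

mse = float(np.mean((np.tanh(x @ w1 + b1) @ w2 + b2 - y) ** 2))
print(f"final MSE: {mse:.5f}")
```

Real applications would of course use a framework such as TensorFlow or PyTorch, but the underlying principle -- adjusting $\theta$ to minimize a loss -- is the same.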
## Notation and Abbreviations
Unify notation... TODO ...
Math notation:
| Symbol | Meaning |
| --- | --- |
| $x$ | NN input |
| $y$ | NN output |
| $\theta$ | NN params |
Quick summary of the most important abbreviations:
| Abbreviation | Meaning |
| --- | --- |
| CNN | Convolutional neural network |
| DL | Deep learning |
| NN | Neural network |
| PBDL | Physics-based deep learning |