intro update

NT
2021-01-26 13:04:57 +08:00
parent b8f381b14a
commit e3e72982a7
3 changed files with 110 additions and 44 deletions


@@ -9,6 +9,7 @@ As much as possible, the algorithms will come with hands-on code examples to qui
Beyond standard _supervised_ learning from data, we'll look at loss constraints, and
more tightly coupled learning algorithms with differentiable simulations.
```{figure} resources/teaser.png
---
height: 220px
@@ -51,9 +52,9 @@ immediately see what happens -- give it a try...
Oh, and it's great because it's [literate programming](https://en.wikipedia.org/wiki/Literate_programming).
```
## More Specifically
To be a bit more specific, _physics_ is a huge field, and we can't cover everything...
```{note}
For now our focus is:
@@ -69,16 +70,13 @@ For now our focus is:
The contents of the following files would not have been possible without the help of many people. Here's an alphabetical list. Big kudos to everyone 🙏
- [Li-wei Chen](https://ge.in.tum.de/about/dr-liwei-chen/)
- [Philipp Holl](https://ge.in.tum.de/about/)
- [Patrick Schnell](https://ge.in.tum.de/about/patrick-schnell/)
- [Nils Thuerey](https://ge.in.tum.de/about/n-thuerey/)
- [Kiwon Um](https://ge.in.tum.de/about/kiwon/)
<!-- % some markdown tests follow ...
---
@@ -91,26 +89,30 @@ See also... Test link: {doc}`supervised`
✅ Do this , ❌ Don't do this
% ---------------- -->
---
## TODOs, Planned content
Loose collection of notes and TODOs:
General physics & DL, intro & textual overview
- Intro phys loss example, parabola example
Supervised simple starting point
- AIAA supervised learning, idp_weissenov/201019-upd-arxiv-v2/ {cite}`thuerey2020deepFlowPred`
skepticism? , started colab -> https://colab.research.google.com/drive/11KUe5Ybuprd7_qmNTe1nvQVUz3W6gRUo
torch version 1.7 [upd from Liwei?]
- surrogates, shape opt?
Physical losses
- vs. PINNs [alt.: neural ODEs , PDE net?] , all using GD (optional, PINNs could use BFGS)
[PINNs], phiflow example -> convert to colab
- PINNs -> are unsupervised a la Tompson; all DL NNs are "supervised" during learning, unsup just means not precomputed and goes through function
@@ -120,7 +122,6 @@ vs. PINNs [alt.: neural ODEs , PDE net?] , all using GD (optional, PINNs could u
- discuss CG solver, Tompson as basic ''unsupervised'' example?
Diff phys, start with overview of idea: gradients via autodiff, then run GD
(TODO include squared func Patrick?)
- illustrate and discuss gradients -> mult. for chain rule; (later: more general PG chain w func composition)
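The "gradients via autodiff, then run GD" idea above can be sketched in a few lines. This is a hypothetical toy example, not the planned notebook: a stand-in "simulator" `P`, a squared loss on its output, and the chain rule multiplied through by hand.

```python
def P(x):
    # Toy differentiable "physics step" (stand-in for a real simulator)
    return x * x - 2.0

def dP_dx(x):
    # Its analytic derivative (what autodiff would provide)
    return 2.0 * x

def loss(x):
    # Squared deviation of the simulated state from zero
    return P(x) ** 2

def grad(x):
    # Chain rule: dL/dx = dL/dP * dP/dx
    return 2.0 * P(x) * dP_dx(x)

x = 1.0                    # initial guess
for _ in range(200):       # plain gradient descent
    x -= 0.05 * grad(x)
# x converges towards sqrt(2), where P(x) = 0
```

The same loop works unchanged if `grad` comes from an autodiff framework instead of the hand-derived chain rule.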
@@ -164,4 +165,3 @@ _Misc jupyter book TODOs_
- Fix latex PDF output
- How to include links to papers in the bibtex references?


@@ -10,34 +10,46 @@ active, quickly growing and exciting field of research -- we want to provide
a starting point for new researchers as well as a hands-on introduction into
state-of-the-art research topics.
```{figure} resources/overview-pano.jpg
---
height: 240px
name: overview-pano
---
Understanding our environment, and predicting how it will evolve is one of the key challenges of humankind.
```
## Motivation
From weather and climate forecasts {cite}`stocker2014climate`,
over quantum physics {cite}`o2016scalable`,
to the control of plasma fusion {cite}`maingi2019fesreport`,
using numerical analysis to obtain solutions for physical models has
become an integral part of science.
At the same time, machine learning technologies and deep neural networks in particular,
have led to impressive achievements in a variety of fields:
from image classification {cite}`krizhevsky2012` over
natural language processing {cite}`radford2019language`,
and more recently also for protein folding {cite}`alquraishi2019alphafold`.
The field is very vibrant, and quickly developing, with the promise of vast possibilities.
At the same time, the successes of deep learning (DL) approaches
have given rise to concerns that this technology has
the potential to replace the traditional, simulation-driven approach to
science. Instead of relying on models that are carefully crafted
from first principles, can data collections of sufficient size
be processed to provide the correct answers instead?
In short: this concern is unfounded. As we'll show in the next chapters,
it is crucial to bring together both worlds: _classical numerical techniques_
and _deep learning_.
One central reason for the importance of this combination is
that DL approaches are simply not powerful enough by themselves.
Given the current state of the art, the clear breakthroughs of DL
in physical applications are outstanding, the proposed techniques are novel,
sometimes difficult to apply, and
significant practical difficulties combining physics and DL persist.
Also, many fundamental theoretical questions remain unaddressed, most importantly
regarding data efficiency and generalization.
@@ -47,22 +59,23 @@ been developed to solve fundamental model equations such
as the Navier-Stokes, Maxwell's, or Schrödinger's equations.
Seemingly trivial changes to the discretization can determine
whether key phenomena are visible in the solutions or not.
Rather than discarding the powerful methods that have been
carefully developed in the field of numerical mathematics, it
is highly beneficial for DL to use them as much as possible.
```{admonition} Goals of this document
:class: tip
Thus, the key aspects that we want to address in the following are:
- explain how to use DL,
- and how to combine it with existing knowledge of physics and simulations,
- **without throwing away** all existing numerical knowledge and techniques.
```
Thus, we want to build on all the powerful techniques that we have
at our disposal, and use them wherever we can.
I.e., our goal is to _reconcile_ the data-centered
viewpoint and the physical simulation viewpoint.
## Categorization
@@ -124,8 +137,9 @@ each of the different techniques is particularly useful.
A brief look at our _Notation_ won't hurt in both cases, though!
```
---
<!-- ## A brief history of PBDL in the context of Fluids
First:
@@ -135,7 +149,7 @@ Chu, descriptors, early but not used
Ling et al. isotropic turb, small FC, unused?
PINNs ... and more ... -->
## Deep Learning and Neural Networks
@@ -160,4 +174,5 @@ we only deal with _regression_ problems in the following.
maximum likelihood estimation
Also interesting: from a math standpoint ''just'' non-linear optimization ...
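The "maximum likelihood estimation" note above can be made concrete: for a regression model with additive Gaussian noise, minimizing the L2 loss picks out exactly the same parameters as maximizing the data likelihood. A minimal numerical check, with a hypothetical toy data set and a 1D model `y = w*x`:

```python
import math

# Hypothetical toy regression data, roughly y = 2*x
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 2.0, 3.9, 6.1]

def mse(w):
    # Mean squared error of the linear model y = w*x
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def gauss_log_likelihood(w, sigma=1.0):
    # Summed log p(y | x, w) for y ~ N(w*x, sigma^2)
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (y - w * x) ** 2 / (2 * sigma ** 2)
               for x, y in zip(xs, ys))

# Brute-force search over candidate slopes w in [1.0, 3.0]
candidates = [i / 100 for i in range(100, 301)]
w_mse = min(candidates, key=mse)
w_mle = max(candidates, key=gauss_log_likelihood)
# Both criteria select the same w: the L2 minimizer is the Gaussian MLE
```

Since the log-likelihood is a constant minus a positive multiple of the summed squared error, the two searches agree for any fixed noise scale `sigma`.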


@@ -787,9 +787,60 @@
publisher={Elsevier}
}
@book{stocker2014climate,
title={Climate change 2013: the physical science basis: Working Group I contribution to the Fifth assessment report of the Intergovernmental Panel on Climate Change},
author={Stocker, Thomas},
year={2014},
publisher={Cambridge University Press}
}
@article{maingi2019fesreport,
title={Summary of the FESAC transformative enabling capabilities panel report},
author={Maingi, Rajesh and Lumsdaine, Arnold and Allain, Jean Paul and Chacon, Luis and Gourlay, SA and others},
journal={Fusion Science and Technology},
volume={75},
number={3},
pages={167--177},
year={2019},
publisher={Taylor \& Francis}
}
@article{o2016scalable,
title={Scalable quantum simulation of molecular energies},
author={O'Malley, Peter JJ and Babbush, Ryan and Kivlichan, Ian D and Romero, Jonathan and McClean, Jarrod R and Barends, Rami and Kelly, Julian and Roushan, Pedram and Tranter, Andrew and Ding, Nan and others},
journal={Physical Review X},
volume={6},
number={3},
pages={031007},
year={2016},
publisher={APS}
}
@inproceedings{krizhevsky2012,
title = {ImageNet Classification with Deep Convolutional Neural Networks},
author = {Krizhevsky, Alex and Sutskever, Ilya and Hinton, Geoffrey E},
booktitle = {Advances in Neural Information Processing Systems},
year = {2012},
}
@article{radford2019language,
title={Language models are unsupervised multitask learners},
author={Radford, Alec and Wu, Jeffrey and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
journal={OpenAI blog},
volume={1},
number={8},
pages={9},
year={2019}
}
@article{alquraishi2019alphafold,
title={AlphaFold at CASP13},
author={AlQuraishi, Mohammed},
journal={Bioinformatics},
volume={35},
number={22},
pages={4862--4865},
year={2019},
publisher={Oxford University Press}
}