intro update
intro.md | 36
@@ -19,14 +19,7 @@ reinforcement learning and uncertainty modeling.
 We live in exciting times: these methods have a huge potential to fundamentally change what we can achieve
 with simulations.
 
 
-```{figure} resources/teaser.jpg
----
-height: 220px
-name: pbdl-teaser
----
-Some visual examples of numerically simulated time sequences. In this book, we explain how to realize algorithms that use neural networks alongside numerical solvers.
-```
 
 ## Coming up
 
@@ -47,15 +40,6 @@ will be discussed. It's important to know in which scenarios each of the
 different techniques is particularly useful.
 
 
-## Comments and suggestions
-
-This _book_, where "book" stands for a collection of digital texts and code examples,
-is maintained by the
-[TUM Physics-based Simulation Group](https://ge.in.tum.de). Feel free to contact us
-if you have any comments, e.g., via [old fashioned email](mailto:i15ge@cs.tum.edu).
-If you find mistakes, please also let us know! We're aware that this document is far from perfect,
-and we're eager to improve it. Thanks in advance 😀! Btw., we also maintain a [link collection](https://github.com/thunil/Physics-Based-Deep-Learning) with recent research papers.
-
 ```{admonition} Executable code, right here, right now
 :class: tip
 We focus on Jupyter notebooks, a key advantage of which is that all code examples
@@ -66,7 +50,24 @@ Plus, Jupyter notebooks are great because they're a form of [literate programmin
 ```
 
 
-
+## Comments and suggestions
+
+This _book_, where "book" stands for a collection of digital texts and code examples,
+is maintained by the
+[TUM Physics-based Simulation Group](https://ge.in.tum.de). Feel free to contact us
+if you have any comments, e.g., via [old fashioned email](mailto:i15ge@cs.tum.edu).
+If you find mistakes, please also let us know! We're aware that this document is far from perfect,
+and we're eager to improve it. Thanks in advance 😀! Btw., we also maintain a [link collection](https://github.com/thunil/Physics-Based-Deep-Learning) with recent research papers.
+
+```{figure} resources/divider-mult.jpg
+---
+height: 220px
+name: divider-mult
+---
+Some visual examples of numerically simulated time sequences. In this book, we explain how to realize algorithms that use neural networks alongside numerical solvers.
+```
+
 
 ## Thanks!
@@ -89,7 +90,6 @@ Chloe Paillard for proofreading parts of the document.
 % future:
 % - [Georg Kohl](https://ge.in.tum.de/about/georg-kohl/)
 
-
 ## Citation
 
 If you find this book useful, please cite it via:
overview.md | 42
@@ -35,6 +35,8 @@ natural language processing {cite}`radford2019language`,
 and more recently also for protein folding {cite}`alquraishi2019alphafold`.
 The field is very vibrant and quickly developing, with the promise of vast possibilities.
 
+### Replacing traditional simulations?
+
 These success stories of deep learning (DL) approaches
 have given rise to concerns that this technology has
 the potential to replace the traditional, simulation-driven approach to science.
@@ -67,6 +69,37 @@ Rather than discarding the powerful methods that have been
 developed in the field of numerical mathematics, it
 is highly beneficial for DL to use them as much as possible.
 
+### Black boxes and magic?
+
+People who are unfamiliar with DL methods often associate neural networks
+with _black boxes_, and see the training processes as something that is beyond the grasp
+of human understanding. However, these viewpoints typically stem from
+relying on hearsay and not dealing with the topic enough.
+
+Rather, the situation is a very common one in science: we are facing a new class of methods,
+and "all the gritty details" are not yet fully worked out. However, this is pretty common
+for scientific advances.
+Numerical methods themselves are a good example. Around 1950, numerical approximations
+and solvers had a tough standing. E.g., to cite H. Goldstine,
+numerical instabilities were considered to be a "constant source of
+anxiety in the future" {cite}`goldstine1990history`.
+By now we have a pretty good grasp of these instabilities, and numerical methods
+are ubiquitous, and well established.
+
+Thus, it is important to be aware of the fact that -- in a way -- there is nothing
+magical or otherworldly to deep learning methods. They're simply another set of
+numerical tools. That being said, they're clearly fairly new, and right now
+definitely the most powerful set of tools we have for non-linear problems.
+Just because all the details aren't fully worked out and nicely written up,
+that shouldn't stop us from including these powerful methods in our numerical toolbox.
+
+### Reconciling DL and simulations
+
+Taking a step back, the aim of this book is to build on all the powerful techniques that we have
+at our disposal for numerical simulations, and use them wherever we can in conjunction
+with deep learning.
+As such, a central goal is to _reconcile_ the data-centered viewpoint with physical simulations.
+
 ```{admonition} Goals of this document
 :class: tip
 The key aspects that we will address in the following are:
@@ -75,11 +108,6 @@ The key aspects that we will address in the following are:
 - without **discarding** our knowledge about numerical methods.
 ```
 
-Thus, our aim is to build on all the powerful techniques that we have
-at our disposal, and use them wherever we can.
-As such, a central goal of this book is to _reconcile_ the data-centered
-viewpoint with physical simulations.
-
 The resulting methods have a huge potential to improve
 what can be done with numerical methods: in scenarios
 where a solver targets cases from a certain well-defined problem
@@ -142,9 +170,9 @@ that leverage _differentiable physics_ allow for very tight integration
 of deep learning and numerical simulation methods.
 
 
-## More specifically
+## Looking ahead
 
-_Physical simulations_ are a huge field, and we won't cover all possible types of physical models and simulations in the following.
+_Physical simulations_ are a huge field, and we won't be able to cover all possible types of physical models and simulations.
 
 ```{note} Rather, the focus of this book lies on:
 - _Field-based simulations_ (no Lagrangian methods)
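The tight integration of solvers and learning mentioned in the hunk above rests on one idea: gradients of a loss can be backpropagated through the steps of a numerical solver. As a hypothetical sketch (not code from this book), the "solver" below is ten explicit Euler steps of the decay ODE dx/dt = -k*x, and the reverse pass is written out by hand; autodiff frameworks such as JAX or PyTorch automate exactly this chain of products.

```python
# Hypothetical sketch of the "differentiable physics" idea:
# differentiate a loss w.r.t. a physical parameter k through a solver.

DT, STEPS = 0.1, 10

def euler_step(x, k):
    # one explicit Euler update: x_{t+1} = x_t - DT * k * x_t
    return x - DT * k * x

def loss_and_grad(k, x0=1.0, target=0.5):
    # forward pass: run the solver, storing intermediate states
    xs = [x0]
    for _ in range(STEPS):
        xs.append(euler_step(xs[-1], k))
    loss = (xs[-1] - target) ** 2

    # reverse pass: walk back through every solver step
    dx = 2.0 * (xs[-1] - target)   # dL/dx_final
    dk = 0.0
    for x_prev in reversed(xs[:-1]):
        dk += dx * (-DT * x_prev)  # this step's direct dependence on k
        dx *= 1.0 - DT * k         # adjoint through x_{t+1} = (1-DT*k)*x_t
    return loss, dk

loss, grad = loss_and_grad(0.7)
```

With an autodiff framework the reverse pass comes for free, which is what makes coupling full simulators with neural network training practical.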
@@ -797,6 +797,12 @@
 
 % ----------------- external --------------------
 
+@book{goldstine1990history,
+  title={A history of scientific computing},
+  author={Goldstine, H},
+  publisher={ACM},
+  year={1990}
+}
 
 @inproceedings{tompson2017,
   title = {Accelerating Eulerian Fluid Simulation With Convolutional Networks},
Binary image changed (Before: 78 KiB, After: 78 KiB).