added outlook chapter
This commit is contained in:
parent 18d4f3bc75
commit f0324e7148

1 _toc.yml
@@ -31,6 +31,7 @@
   sections:
   - file: physgrad-comparison.ipynb
   - file: physgrad-discuss.md
+  - file: outlook.md
 - file: old-phiflow1.md
   sections:
   - file: overview-burgers-forw-v1.ipynb
@@ -1,4 +1,4 @@
-Outlook
+Summary and Discussion
 =======================
 
 The previous sections have explained the differentiable physics approach for deep learning, and have given a range of examples: from a very basic gradient calculation, all the way to complex learning setups powered by simulations. This is a good time to pause and take a step back, to take a look at what we have: in the end, the _differentiable physics_ part is not too complicated. It's largely based on existing numerical methods, with a focus on efficiently using those methods to not only do a forward simulation, but also to compute gradient information. What's more exciting is the combination of these methods with deep learning.
@@ -14,16 +14,9 @@ One key component for these hybrids to work well is to let the NN _interact_ wit
 
 ## Generalization
 
-The hybrid approach also bears particular promise for simulators: it improves generalizing capabilities of the trained models by letting the PDE-solver handle large-scale changes to the data distribution such that the learned model can focus on localized structures not captured by the discretization. While physical models generalize very well, learned models often specialize in data distributions seen at training time. This was, e.g., shown for the models reducing numerical errors of the previous chapter: the trained models can deal with solution manifolds with significant amounts of varying physical behavior, while simpler training variants quickly deteriorate over the course of recurrent time steps.
+The hybrid approach also bears particular promise for simulators: it improves generalizing capabilities of the trained models by letting the PDE-solver handle large-scale _changes to the data distribution_ such that the learned model can focus on localized structures not captured by the discretization. While physical models generalize very well, learned models often specialize in data distributions seen at training time. This was, e.g., shown for the models reducing numerical errors of the previous chapter: the trained models can deal with solution manifolds with significant amounts of varying physical behavior, while simpler training variants quickly deteriorate over the course of recurrent time steps.
 
-## Possibilities
+---
 
-We've just scratched the surface regarding the possibilities of this combination. The examples with Burgers equation and Navier-Stokes solvers are non-trivial, and good examples for advection-diffusion-type PDEs. However, there's a wide variety of other potential combinations, to name just a few examples:
+Despite being a very powerful method, the DP approach is clearly not the end of the line. In the next chapters we'll consider further improvements and extensions.
 
-* PDEs for chemical reactions often show complex behavior due to the interactions of multiple species. Here, an especially interesting direction is to train models that quickly learn to predict the evolution of an experiment or machine, and adjust control knobs to stabilize it, i.e., an online _control_ setting.
-
-* Plasma simulations share a lot with vorticity-based formulations for fluids, but additionally introduce terms to handle electric and magnetic interactions within the material. Likewise, controllers for plasma fusion experiments and generators are an excellent topic with plenty of potential for DL with differentiable physics.
-
-* Finally, weather and climate are crucial topics for humanity, and highly complex systems of fluid flows interacting with a multitude of phenomena on the surface of our planet. Accurately modeling all these interacting systems and predicting their long-term behavior shows a lot of promise to benefit from DL approaches that can interface with numerical simulations.
-
-So overall, there's lots of exciting research work left to do - the next years and decades definitely won't be boring 👍
@@ -8,8 +8,8 @@
 | $A$ | matrix |
 | $\eta$ | learning rate or step size |
 | $\Gamma$ | boundary of computational domain $\Omega$ |
-| $f^{*}()$ | generic function to be approximated, typically unknown |
-| $f()$ | approximate version of $f^{*}$ |
+| $f^{*}$ | generic function to be approximated, typically unknown |
+| $f$ | approximate version of $f^{*}$ |
 | $\Omega$ | computational domain |
 | $\mathcal P^*$ | continuous/ideal physical model |
 | $\mathcal P$ | discretized physical model, PDE |
16 outlook.md (new file)
@@ -0,0 +1,16 @@
+Outlook
+=======================
+
+Despite the lengthy discussions and numerous examples,
+we've really just barely scratched the surface regarding the possibilities that arise in the context
+of physics-based deep learning.
+
+The examples with Burgers equation and Navier-Stokes solvers are non-trivial, and good examples for advection-diffusion-type PDEs. However, there's a wide variety of other potential combinations. To name just a few promising examples from other fields:
+
+* PDEs for chemical reactions often show complex behavior due to the interactions of multiple species. Here, an especially interesting direction is to train models that quickly learn to predict the evolution of an experiment or machine, and adjust control knobs to stabilize it, i.e., an online _control_ setting.
+
+* Plasma simulations share a lot with vorticity-based formulations for fluids, but additionally introduce terms to handle electric and magnetic interactions within the material. Likewise, controllers for plasma fusion experiments and generators are an excellent topic with plenty of potential for DL with differentiable physics.
+
+* Finally, weather and climate are crucial topics for humanity, and highly complex systems of fluid flows interacting with a multitude of phenomena on the surface of our planet. Accurately modeling all these interacting systems and predicting their long-term behavior shows a lot of promise to benefit from DL approaches that can interface with numerical simulations.
+
+So overall, there's lots of exciting research work left to do - the next years and decades definitely won't be boring 👍
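The "very basic gradient calculation" that the chapter text above refers to can be illustrated by differentiating through a simple PDE solver. The following is a hypothetical minimal sketch (not code from this repository): one explicit finite-difference step of Burgers' equation, unrolled for a few steps, with the gradient of a loss on the final state taken with respect to the initial state via JAX autodiff. All names and parameter values are illustrative, chosen only for numerical stability of the explicit step.

```python
import jax
import jax.numpy as jnp

def burgers_step(u, dt=0.01, dx=0.1, nu=0.02):
    """One explicit step of u_t + u u_x = nu u_xx on a periodic domain."""
    u_x = (jnp.roll(u, -1) - jnp.roll(u, 1)) / (2 * dx)          # central difference
    u_xx = (jnp.roll(u, -1) - 2 * u + jnp.roll(u, 1)) / dx**2    # second derivative
    return u + dt * (-u * u_x + nu * u_xx)

def loss(u0, target, steps=10):
    """Unroll the solver and compare the final state to a target."""
    u = u0
    for _ in range(steps):
        u = burgers_step(u)
    return jnp.mean((u - target) ** 2)

u0 = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, 64, endpoint=False))
target = jnp.zeros(64)

# Gradient of the simulation outcome w.r.t. the initial state,
# obtained by backpropagating through all unrolled solver steps:
grad_u0 = jax.grad(loss)(u0, target)
print(grad_u0.shape)  # (64,)
```

This per-state gradient is exactly the signal that the differentiable-physics training setups in the earlier chapters feed back into a neural network; replacing `u0` with the output of a network makes the whole pipeline trainable end-to-end.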