updated intro and outlook

This commit is contained in:
N_T 2025-02-19 09:52:46 +08:00
parent 3f8c7bc672
commit c59992f349
2 changed files with 14 additions and 21 deletions

View File

@ -10,20 +10,12 @@ name: pbdl-logo-large
Welcome to the _Physics-based Deep Learning Book_ (v0.3, the _GenAI_ edition) 👋
**TL;DR**:
This document contains a practical and comprehensive introduction to everything
related to deep learning in the context of physical simulations.
As much as possible, all topics come with hands-on code examples in the
form of Jupyter notebooks to quickly get started.
Beyond standard _supervised_ learning from data,
we'll look at _physical loss_ constraints and _differentiable simulations_,
diffusion-based approaches for _probabilistic, generative models_,
as well as
reinforcement learning and neural network architectures.
We live in exciting times: these methods have a huge potential to fundamentally change what humans can achieve via computer simulations.
This document is a hands-on, comprehensive guide to deep learning in the realm of physical simulations. Rather than just theory, we emphasize practical application: every concept is paired with interactive Jupyter notebooks to get you up and running quickly. Beyond traditional supervised learning, we dive into physical _loss-constraints_, _differentiable_ simulations, _diffusion-based_ approaches for _probabilistic generative AI_, as well as reinforcement learning and advanced neural network architectures. These foundations are paving the way for the next generation of scientific _foundation models_.
We are living in an era of rapid transformation. These methods have the potential to redefine what's possible in computational science.
```{note}
_What's new in v0.3?_
Most importantly, this version has a large new chapter on generative modeling, offering a deep dive into topics such as denoising, flow-matching, autoregressive learning, the integration of physics-based constraints, and diffusion-based graph networks. Additionally, a new section explores neural architectures tailored for physics simulations, while all code examples have been updated to use the latest frameworks.
This latest edition takes things even further with a major new chapter on generative modeling, covering cutting-edge techniques like denoising, flow-matching, autoregressive learning, physics-integrated constraints, and diffusion-based graph networks. We've also introduced a dedicated section on neural architectures specifically designed for physics simulations. All code examples have been updated to leverage the latest frameworks.
```
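Of the techniques listed above, flow-matching is perhaps the quickest to summarize in code. As a rough, framework-agnostic sketch (the function names are illustrative, not taken from the book's notebooks): the network's regression target is simply the constant velocity of a straight path between a noise sample and a data sample.

```python
import numpy as np

def flow_matching_pair(x0, x1, rng):
    """One (input, target) pair for flow-matching regression:
    a point on the straight path from noise x0 to data x1 at a random
    time t, and the constant velocity of that path as the target."""
    t = rng.uniform()                # time sampled uniformly in [0, 1]
    x_t = (1.0 - t) * x0 + t * x1   # linear interpolation between the samples
    v_target = x1 - x0              # velocity the network v(x_t, t) should match
    return t, x_t, v_target

rng = np.random.default_rng(0)
x0 = rng.normal(size=8)             # "noise" sample
x1 = np.ones(8)                     # stand-in for a data sample
t, x_t, v = flow_matching_pair(x0, x1, rng)
```

Because the path is straight, integrating the target velocity from `x0` over t in [0, 1] recovers the data sample exactly; a trained network approximates this velocity field averaged over many such pairs.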
---
@ -83,7 +75,7 @@ Some visual examples of numerically simulated time sequences. In this book, we e
## Thanks!
This project would not have been possible without the help of many people who contributed. Thanks to everyone 🙏 Here's an alphabetical list:
This project would not have been possible without the help of the many people who contributed to it. A big thanks to everyone 🙏 Here's an alphabetical list:
- [Benjamin Holzschuh](https://ge.in.tum.de/about/)
- [Philipp Holl](https://ge.in.tum.de/about/philipp-holl/)
@ -116,4 +108,6 @@ If you find this book useful, please cite it via:
}
```
## Time to get started
The future of simulation is being rewritten, and with the following tools, you'll be at the forefront of these developments. Let's dive in!

View File

@ -1,32 +1,31 @@
Outlook
=======================
Despite the lengthy discussions and numerous examples, we've really just barely scratched the surface regarding the possibilities that arise in the context of physics-based deep learning.
Despite the in-depth discussions and diverse examples we've explored, we've really only begun to tap into the vast potential of physics-based deep learning. The techniques covered in the previous chapters aren't just useful; they have the power to reshape computational methods for decades to come. As we've seen in the code examples, there's no magic at play; rather, deep learning provides an incredibly powerful new tool to work with complex, non-linear functions.
Most importantly, the techniques that were explained in the previous chapter have an enormous potential to influence all computational methods of the next decades. As demonstrated many times in the code examples, there's no magic involved, but deep learning gives us very powerful tools to represent and approximate non-linear functions. And deep learning by no means makes existing numerical methods deprecated. Rather, the two are an ideal combination.
Crucially, deep learning doesn't replace traditional numerical methods. Instead, it enhances them. Together, they form a groundbreaking synergy, with a huge potential to unlock new frontiers in simulation and modeling. One aspect we haven't yet touched upon is perhaps the most profound: at its core, our ultimate goal is to deepen human understanding of the world. The notion of neural networks as impenetrable “black boxes” is outdated. Instead, they should be seen as just another numerical tool, one that is as interpretable as traditional simulations when used correctly.
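To make the "no magic" point concrete, here is a minimal sketch (not from the book's notebooks; all names are illustrative) of what a differentiable simulation boils down to: the gradient of a loss through many solver steps is just the chain rule applied step by step. For explicit Euler diffusion each step is linear with a symmetric stencil, so the backward (adjoint) pass can reuse the forward step itself:

```python
import numpy as np

def step(u, nu=0.1, dt=0.1):
    """One explicit Euler step of 1D diffusion u_t = nu * u_xx (periodic, dx = 1)."""
    lap = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)
    return u + dt * nu * lap

def simulate(u0, n_steps=20):
    u = u0
    for _ in range(n_steps):
        u = step(u)
    return u

def grad_wrt_u0(u0, target, n_steps=20):
    """Gradient of L = 0.5 * ||u_N - target||^2 w.r.t. the initial state u0.
    Each step is u_{n+1} = A u_n with a symmetric matrix A, so backpropagating
    through all steps applies the very same stencil to the final residual."""
    residual = simulate(u0, n_steps) - target  # dL/du_N
    return simulate(residual, n_steps)         # (A^T)^N residual == A^N residual
```

The same principle, carried out by automatic differentiation instead of a hand-derived adjoint, is what differentiable-physics frameworks provide for general non-linear solvers, where the backward pass is no longer a simple reuse of the forward one.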
A topic that we have not touched at all so far is that -- of course -- in the end our goal is to improve human understanding of our world. And here the view of neural networks as "black boxes" is clearly outdated. It is simply another numerical method that humans can employ, and the physical fields predicted by a network are as interpretable as the outcome of a traditional simulation. Nonetheless, it is important to further improve the tools for analyzing learned networks, and to extract condensed formulations of the patterns and regularities the networks have found in the solution manifolds.
Looking ahead, one of the most exciting challenges is to refine our ability to analyze learned networks. By distilling the patterns and structures these networks uncover, we move closer to extracting fundamental, human-readable insights from their solution manifolds. The future of differentiable simulation isn't just about better predictions; it's about revealing the hidden order of the physical world in ways we've never imagined.
![Divider](resources/divider2.jpg)
## Some specific directions
Beyond this long term outlook, there are many interesting and immediate steps.
And while the examples with Burgers equation and Navier-Stokes solvers are clearly non-trivial, there's a wide variety of other potential PDE models that the techniques of this book can be applied to. To name just a few promising examples from other fields:
Beyond this long-term vision, there are plenty of exciting and immediate next steps. While our deep dives into Burgers' equation and Navier-Stokes solvers have tackled non-trivial challenges, they represent just a fraction of the landscape of PDE models and operators that these techniques can improve. Here are just a few promising directions from other fields:
* PDEs for chemical reactions often show complex behavior due to the interactions of multiple species. Here, an especially interesting direction is to train models that quickly learn to predict the evolution of an experiment or machine, and adjust control knobs to stabilize it, i.e., an online _control_ setting.
* Chemical Reaction PDEs often exhibit intricate behaviors due to multi-species interactions. A particularly exciting avenue is training models that can rapidly predict experimental or industrial processes and dynamically adjust control parameters to stabilize them, enabling real-time, intelligent control.
* Plasma simulations share a lot with vorticity-based formulations for fluids, but additionally introduce terms to handle electric and magnetic interactions within the material. Likewise, controllers for plasma fusion experiments and generators are an excellent topic with plenty of potential for DL with differentiable physics.
* Plasma Simulations share similarities with vorticity-based fluid formulations but introduce additional complexities due to electric and magnetic interactions. This makes them a prime candidate for deep learning methods, especially for plasma fusion experiments and energy generators, where differentiable physics could be a game-changer.
* Finally, weather and climate are crucial topics for humanity, and highly complex systems of fluid flows interacting with a multitude of phenomena on the surface of our planet. Accurately modeling all these interacting systems and predicting their long-term behavior shows a lot of promise to benefit from DL approaches that can interface with numerical simulations.
* Weather and Climate Modeling remain among the most critical scientific challenges for humanity. These highly complex, multi-scale systems involve fluid flows intertwined with countless environmental factors. Leveraging deep learning to enhance numerical simulations in this space holds immense potential, not just for more accurate forecasts but for unlocking deeper insights into the dynamics of our planet.
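The _control_ setting from the first bullet can be illustrated with a toy example (purely hypothetical, not tied to any chemical model): once a simulator is differentiable, a control knob can be tuned by plain gradient descent through the rollout.

```python
def rollout(c, x0=0.0, n_steps=10, dt=0.1):
    """Toy controlled system: the state relaxes toward the control value c."""
    x = x0
    for _ in range(n_steps):
        x = x + dt * (c - x)
    return x

def grad_c(c, target, x0=0.0, n_steps=10, dt=0.1):
    """Gradient of (x_N - target)^2 w.r.t. c via the chain rule: x_N depends
    linearly on c with sensitivity s = 1 - (1 - dt)**n_steps, so the gradient
    of the rollout is available in closed form here."""
    s = 1.0 - (1.0 - dt) ** n_steps
    return 2.0 * (rollout(c, x0, n_steps, dt) - target) * s

# tune the control knob by gradient descent through the rollout
c, target = 0.0, 1.5
for _ in range(200):
    c -= 0.5 * grad_c(c, target)
```

In a real online-control setting a neural network would replace the single scalar `c`, and automatic differentiation would replace the hand-derived gradient, but the optimization loop looks the same.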
![Divider](resources/divider3.jpg)
## Closing remarks
So overall, there's lots of exciting research work left to do - the next years and decades definitely won't be boring. 👍
These are just a few examples, but they illustrate the incredible breadth of opportunities where differentiable physics and deep learning can make an impact. There's lots of exciting research work left to do - the next years and decades definitely won't be boring. 🤗 👍
```{figure} resources/logo.jpg
---