PG conclusions, list formatting
@@ -26,13 +26,13 @@ actual solver in the training loop via a DP approach.
 
 To summarize the pros and cons of training NNs via differentiable physics:
 
 ✅ Pro:
-- uses physical model and numerical methods for discretization
-- efficiency of selected methods carries over to training
-- tight coupling of physical models and NNs possible
+- Uses the physical model and numerical methods for the discretization.
+- Efficiency of the selected methods carries over to training.
+- Tight coupling of physical models and NNs is possible.
 
 ❌ Con:
-- not compatible with all simulators (need to provide gradients)
-- require more heavy machinery (in terms of framework support) than previously discussed methods
+- Not compatible with all simulators (they need to provide gradients).
+- Requires heavier machinery (in terms of framework support) than the previously discussed methods.
 
 Especially the last point is bound to improve in a fairly short time, but for now it's important to keep in mind that not every simulator is suitable for DP training out of the box. Hence, in this book we'll focus on examples using phiflow, which was designed for interfacing with deep learning frameworks.
 
 Next, we can target some more complex scenarios to showcase what can be achieved with differentiable physics.
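The gradient flow that the Pro list relies on can be made concrete with a toy example. The sketch below (plain Python, not phiflow; all names are hypothetical) differentiates a loss through every step of a tiny explicit-Euler solver for the decay ODE du/dt = -k*u, so the solver itself sits inside the training loop:

```python
# Hedged sketch (plain Python, not phiflow; all names hypothetical):
# a differentiable "solver" for du/dt = -k*u. With explicit Euler,
# u_N = u0 * (1 - k*dt)**N, so the loss gradient w.r.t. the physical
# parameter k can be propagated through all solver steps via the chain rule.

def simulate(k, u0=1.0, dt=0.1, steps=10):
    u = u0
    for _ in range(steps):
        u = u - dt * k * u              # one explicit Euler step
    return u

def grad_k(k, u_target, u0=1.0, dt=0.1, steps=10):
    # d(u_N)/dk = u0 * N * (1 - k*dt)**(N - 1) * (-dt), accumulated over steps
    u_end = simulate(k, u0, dt, steps)
    du_dk = u0 * steps * (1.0 - k * dt) ** (steps - 1) * (-dt)
    return 2.0 * (u_end - u_target) * du_dk   # gradient of (u_N - u_target)**2

# Recover the decay rate that produced an observed end state.
k_true = 0.5
u_obs = simulate(k_true)
k = 0.1                                  # initial guess
for _ in range(200):
    k -= 0.5 * grad_k(k, u_obs)          # gradient descent through the solver
```

In a real DP setup, the simulator's backpropagated gradients play this role; the derivative is written out by hand here only to keep the sketch dependency-free.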
@@ -1,14 +1,27 @@
 Discussion
 =======================
 
-The training via physical gradients,
-... **TODO** ...
+In a way, learning via physical gradients provides the tightest possible coupling
+of physics and NNs: the full non-linear process of the PDE model directly steers
+the optimization of the NN.
+
+Naturally, this comes at a cost: invertible simulators are more difficult to build
+(and less common) than the first-order gradients commonly used for learning
+processes and adjoint optimizations. Nonetheless, if they're available, they can
+speed up convergence and yield models with inherently better performance. Thus,
+once trained, these models can give a performance that we simply can't obtain by,
+e.g., training longer with a simpler approach. So, if we plan to evaluate these
+models often (e.g., ship them in an application), this increased one-time cost can
+pay off in the long run.
 
 ## Summary
 
 ✅ Pro:
-- ...
+- Very accurate "gradient" information for learning and optimization.
+- Improved convergence and model performance.
+- Tightest possible coupling of model PDEs and learning.
 
 ❌ Con:
-- ...
+- Requires inverse simulators (at least local ones).
+- Less widespread availability than, e.g., differentiable physics simulators.
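What an inverse simulator buys over first-order gradients can be illustrated with a toy stand-in (hypothetical example, not an actual PDE solver): here y = x**3 plays the role of a strongly non-linear forward simulation, and its closed-form inverse plays the role of an inverse simulator.

```python
# Hedged sketch (hypothetical toy): first-order gradients vs. an inverse
# simulator. y = f(x) = x**3 stands in for a non-linear forward simulation.

def f(x):                 # forward "simulator"
    return x ** 3

def f_inv(y):             # inverse "simulator", exact by construction
    return y ** (1.0 / 3.0) if y >= 0 else -((-y) ** (1.0 / 3.0))

y_target = 8.0            # desired simulator output (true solution: x = 2)

# First-order route: gradient descent on L = (f(x) - y_target)**2. The tiny
# Jacobian of f near the initial guess makes progress very slow.
x_gd = 0.1
for _ in range(100):
    grad = 2.0 * (f(x_gd) - y_target) * 3.0 * x_gd ** 2   # dL/dx
    x_gd -= 1e-3 * grad

# Inverse-simulator route: pull the target back through f_inv; a single
# update lands (up to round-off) exactly on the solution.
x_pg = f_inv(y_target)
```

After 100 gradient steps the first-order iterate is still far from x = 2, while the inverse-simulator update reaches it in one step; this is the "very accurate gradient information" listed under Pro, and the price is that f_inv must exist.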
@@ -57,14 +57,14 @@ Bringing these numerical methods back into the picture will be one of the central
 goals of the next sections.
 
 ✅ Pro:
-- uses physical model
-- derivatives can be conveniently compute via backpropagation
+- Uses the physical model.
+- Derivatives can be conveniently computed via backpropagation.
 
 ❌ Con:
-- quite slow ...
-- physical constraints are enforced only as soft constraints
-- largely incompatible _classical_ numerical methods
-- accuracy of derivatives relies on learned representation
+- Quite slow ...
+- Physical constraints are enforced only as soft constraints.
+- Largely incompatible with _classical_ numerical methods.
+- Accuracy of the derivatives relies on the learned representation.
 
 Next, let's look at how we can leverage numerical methods to improve the DL accuracy and efficiency
 by making use of differentiable solvers.
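The "soft constraints" item can be made concrete with a small sketch (plain Python, hypothetical names, not code from the book): the PDE residual enters the loss as a penalty term next to the data term, so it is reduced during training but never enforced exactly.

```python
# Hedged sketch (hypothetical toy): physics as a soft constraint. The residual
# of the ODE du/dt + u = 0 is added to the loss as a penalty. The "network" is
# just a linear ansatz u(t) = a + b*t, which cannot represent the true
# solution exp(-t), so the residual cannot vanish either; this mirrors the
# point that derivative accuracy relies on the learned representation.

def u(t, a, b):
    return a + b * t

def du_dt(t, a, b):
    return b                      # analytic derivative of the ansatz

def loss(a, b, data, ts, lam=1.0):
    data_term = sum((u(t, a, b) - y) ** 2 for t, y in data) / len(data)
    # soft physics constraint: squared PDE residual at collocation points
    phys_term = sum((du_dt(t, a, b) + u(t, a, b)) ** 2 for t in ts) / len(ts)
    return data_term + lam * phys_term

data = [(0.0, 1.0), (1.0, 0.37)]      # sparse observations of exp(-t)
ts = [0.0, 0.25, 0.5, 0.75, 1.0]      # collocation points for the residual

# The weight lam only trades the two terms off; it never makes the
# constraint exact.
l_small = loss(1.0, -0.6, data, ts, lam=0.1)
l_big = loss(1.0, -0.6, data, ts, lam=10.0)
```

Raising lam penalizes the residual more strongly, but for any finite lam the constraint remains soft: the combined loss stays strictly positive here because the ansatz cannot satisfy the ODE.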
@@ -96,18 +96,18 @@ avoid overfitting.
 
 ## Supervised Training in a nutshell
 
-To summarize:
+To summarize, supervised training has the following properties.
 
 ✅ Pros:
-- very fast training
-- stable and simple
-- great starting point
+- Very fast training.
+- Stable and simple.
+- A great starting point.
 
 ❌ Cons:
-- lots of data needed
-- sub-optimal performance, accuracy and generalization
+- Lots of data needed.
+- Sub-optimal performance, accuracy, and generalization.
 
-Outlook: interactions with external "processes" (such as embedding into a solver) are tricky with supervised training.
+Outlook: any interactions with external "processes" (such as embedding into a solver) are tricky with supervised training.
 First, we'll look at bringing model equations into the picture via soft constraints, and afterwards
 we'll revisit the challenges of bringing together numerical simulations and learned approaches.
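A minimal sketch of this setup (hypothetical toy example, plain Python): precomputed input-target pairs, an L2 loss, and gradient descent are all that enter the training loop, which is exactly what makes it fast, stable, and simple.

```python
# Hedged sketch (hypothetical toy): the plain supervised setup in full.
# The "model" is a single weight w, and the targets come from y = 2*x,
# so training should recover w = 2.

data = [(0.1 * i, 2.0 * 0.1 * i) for i in range(100)]   # lots of data needed

w = 0.0
lr = 0.01
for _ in range(50):                     # fast, stable, and simple to set up
    grad = sum(2.0 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad
```

Nothing in this loop knows about a physical model or a solver, which is why embedding such a model into an external process is where the difficulties start.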