updated latex code for pdf output
This commit is contained in:
parent 4576bc09f3
commit 55a602e814
@@ -57,32 +57,41 @@
"$\\newcommand{\\vr}[1]{\\mathbf{r}_{#1}} \\vr{t+n}$. \n",
"This is what we will address with an NN in the following.\n",
"\n",
"We'll use an $L^2$-norm in the following to quantify the deviations, i.e., an error function \n",
"$\n",
"\\newcommand{\\loss}{e} \n",
"We'll use an $L^2$-norm in the following to quantify the deviations, i.e., \n",
"an error function $\\newcommand{\\loss}{e} \n",
"\\newcommand{\\corr}{\\mathcal{C}} \n",
"\\newcommand{\\vc}[1]{\\mathbf{s}_{#1}} \n",
"\\newcommand{\\vr}[1]{\\mathbf{r}_{#1}} \n",
"\\loss (\\vc{t},\\project \\vr{t})=\\Vert\\vc{t}-\\project \\vr{t}\\Vert_2$. \n",
"Our learning goal is to train at a correction operator $\\corr ( \\vc{} )$ such that \n",
"a solution to which the correction is applied has a lower error than the original unmodified (source) solution: \n",
"$\\loss ( \\pdec( \\corr (\\project \\vr{t_0}) ) , \\project \\vr{t_1}) < \\loss ( \\pdec( \\project \\vr{t_0} ), \\project \\vr{t_1})$. \n",
"Our learning goal is to train at a correction operator \n",
"$\\newcommand{\\corr}{\\mathcal{C}} \n",
"\\corr ( \\vc{} )$ such that \n",
"a solution to which the correction is applied has a lower error than the original unmodified (source) \n",
"solution: $\\newcommand{\\loss}{e} \n",
"\\newcommand{\\pdec}{\\pde_{s}}\n",
"\\newcommand{\\corr}{\\mathcal{C}} \n",
"\\newcommand{\\project}{\\mathcal{T}} \n",
"\\newcommand{\\vr}[1]{\\mathbf{r}_{#1}} \n",
"\\loss ( \\pdec( \\corr (\\project \\vr{t_0}) ) , \\project \\vr{t_1}) < \\loss ( \\pdec( \\project \\vr{t_0} ), \\project \\vr{t_1})$. \n",
"\n",
"The correction function \n",
"$\\newcommand{\\vcN}{\\mathbf{s}} \\newcommand{\\corr}{\\mathcal{C}} \\corr (\\vcN | \\theta)$ \n",
"is represented as a deep neural network with weights $\\theta$\n",
"and receives the state $\\vcN$ to infer an additive correction field with the same dimension.\n",
"To distinguish the original states $\\vcN$ from the corrected ones, we'll denote the latter with an added tilde\n",
"$\\newcommand{\\vctN}{\\tilde{\\mathbf{s}}} \\vctN$.\n",
"and receives the state $\\mathbf{s}$ to infer an additive correction field with the same dimension.\n",
"To distinguish the original states $\\mathbf{s}$ from the corrected ones, we'll denote the latter with an added tilde $\\tilde{\\mathbf{s}}$.\n",
"The overall learning goal now becomes\n",
"\n",
"$\n",
"\\newcommand{\\pdec}{\\pde_{s}}\n",
"\\newcommand{\\corr}{\\mathcal{C}} \n",
"\\newcommand{\\project}{\\mathcal{T}} \n",
"\\newcommand{\\vr}[1]{\\mathbf{r}_{#1}} \n",
"\\text{argmin}_\\theta | ( \\pdec \\corr )^n ( \\project \\vr{t} ) - \\project \\vr{t}|^2\n",
"$\n",
"\n",
"A crucial bit here that's easy to overlook is that the correction depends on the modified states, i.e.\n",
"it is a function of\n",
"$\\newcommand{\\vctN}{\\tilde{\\mathbf{s}}} \\vctN$, so we have \n",
"$\\tilde{\\mathbf{s}}$, so we have \n",
"$\\newcommand{\\vctN}{\\tilde{\\mathbf{s}}} \\newcommand{\\corr}{\\mathcal{C}} \\corr (\\vctN | \\theta)$.\n",
"These states actually evolve over time when training. They don't exist beforehand.\n",
"\n",
@@ -203,9 +212,9 @@
"source": [
"## Simulation Setup\n",
"\n",
"Now we can set up the _source_ simulation $\\newcommand{\\pdec}{\\pde_{s}} \\pdec$. \n",
"Now we can set up the _source_ simulation $\\mathcal{P}_{s}$. \n",
"Note that we won't deal with \n",
"$\\newcommand{\\pder}{\\pde_{r}} \\pder$\n",
"$\\mathcal{P}_{r}$\n",
"below: the downsampled reference data is contained in the training data set. It was generated with a four times finer discretization. Below we're focusing on the interaction of the source solver and the NN. \n",
"\n",
"This code block and the next ones will define lots of functions, that will be used later on for training.\n",