fixed smaller issues

NT 2022-04-15 13:26:29 +02:00
parent 31aed1f58c
commit 7664c5a6a0
3 changed files with 7 additions and 4 deletions

View File

@@ -1357,7 +1357,7 @@
 "err_source = vx_ref - vx_src \n",
 "err_hybrid = vx_ref - vx_hyb \n",
 "v = np.concatenate([err_source,err_hybrid], axis=1)\n",
-"axes[3].imshow( v , origin='lower', cmap='magma')\n",
+"axes[3].imshow( v , origin='lower', cmap='cividis')\n",
 "axes[3].set_title(f\" Errors: Source & Learned\")\n",
 "\n",
 "pylab.tight_layout()\n"

View File

@@ -20,9 +20,11 @@ python3.7 json-cleanup-for-pdf.py
 /Users/thuerey/Library/Python/3.7/bin/jupyter-book build . --builder pdflatex
 cd _build/latex
-#mv book.pdf book-xetex.pdf # failed anyway
+#mv book.pdf book-xetex.pdf # not necessary, failed anyway
+# this generates book.tex
 rm -f book-in.tex sphinxmessages-in.sty book-in.aux book-in.toc
+# rename book.tex -> book-in.tex (this is the original output!)
 mv book.tex book-in.tex
 mv sphinxmessages.sty sphinxmessages-in.sty
 mv book.aux book-in.aux
@@ -30,9 +32,10 @@ mv book.toc book-in.toc
 #mv sphinxmanual.cls sphinxmanual-in.cls
 python3.7 ../../fixup-latex.py
-# generates book-in2.tex
+# reads book-in.tex -> writes book-in2.tex
 # remove unicode chars via unix iconv
+# reads book-in2.tex -> writes book.tex
 iconv -c -f utf-8 -t ascii book-in2.tex > book.tex
 # finally run pdflatex, now it should work:
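The new comments clarify the file flow: book.tex -> book-in.tex -> (fixup-latex.py) -> book-in2.tex -> (iconv) -> book.tex. For reference, the `iconv -c -f utf-8 -t ascii` step simply drops every character without an ASCII equivalent; a rough Python equivalent of just that step (file names as in the script, `fixup-latex.py` itself is not reproduced here):

```python
# Drop all non-ASCII characters, mirroring `iconv -c -f utf-8 -t ascii`.
with open("book-in2.tex", encoding="utf-8") as src:
    text = src.read()
with open("book.tex", "w", encoding="ascii") as dst:
    dst.write(text.encode("ascii", errors="ignore").decode("ascii"))
```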

View File

@@ -28,7 +28,7 @@ $$ (learn-l2)
 We typically optimize, i.e. _train_,
 with a stochastic gradient descent (SGD) optimizer of choice, e.g. Adam {cite}`kingma2014adam`.
-We'll rely on auto-diff to compute the gradient w.r.t. weights, $\partial f / \partial \theta$,
+We'll rely on auto-diff to compute the gradient of a scalar loss $L$ w.r.t. the weights, $\partial L / \partial \theta$.
 We will also assume that $e$ denotes a _scalar_ error function (also
 called cost, or objective function).
 It is crucial for the efficient calculation of gradients that this function is scalar.
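The corrected sentence is about auto-diff computing $\partial L / \partial \theta$ for a scalar loss. A minimal sketch of that training step, here in PyTorch with an illustrative linear model and an L2 loss (the text itself is framework-agnostic at this point):

```python
import torch

# Illustrative model f(x; theta); any differentiable network works the same way.
model = torch.nn.Linear(3, 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # SGD-type optimizer of choice

x = torch.randn(8, 3)       # mini-batch of inputs
y_ref = torch.randn(8, 1)   # reference outputs y*

loss = ((model(x) - y_ref) ** 2).sum()  # scalar loss L; must be scalar for backward()
loss.backward()             # auto-diff fills p.grad = dL/dp for every weight p
opt.step()                  # one optimizer update using those gradients
opt.zero_grad()             # clear gradients before the next step
```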