em dash; sentence case

This commit is contained in:
jverzani
2025-07-27 15:26:00 -04:00
parent c3b221cd29
commit 33c6e62d68
59 changed files with 385 additions and 243 deletions

View File

@@ -1,4 +1,4 @@
-# Curve Sketching
+# Curve sketching
{{< include ../_common_code.qmd >}}

View File

@@ -1,4 +1,4 @@
-# Implicit Differentiation
+# Implicit differentiation
{{< include ../_common_code.qmd >}}

View File

@@ -1,4 +1,4 @@
-# L'Hospital's Rule
+# L'Hospital's rule
{{< include ../_common_code.qmd >}}

View File

@@ -1,4 +1,4 @@
-# The mean value theorem for differentiable functions.
+# The mean value theorem for differentiable functions
{{< include ../_common_code.qmd >}}

View File

@@ -368,7 +368,7 @@ $$
So we should consider `f(x_n)` an *approximate zero* when it is on the scale of $f'(\alpha) \cdot \alpha \delta$. That $\alpha$ factor means we consider a *relative* tolerance for `f`.
-> For checking if $f(x_n) \approx 0$ both a relative and absolute error should be used--the relative error involving the size of $x_n$.
+> For checking if $f(x_n) \approx 0$ both a relative and absolute error should be used---the relative error involving the size of $x_n$.
A good condition to check if `f(x_n)` is small is
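Such a mixed test can be sketched in code; a minimal version, with an illustrative function name and default tolerances that are my choices, not taken from the text:

```julia
# Hedged sketch: declare f(xₙ) approximately zero using both an absolute
# tolerance and a relative one that scales with the size of the iterate xₙ.
function approx_zero(fx, xn; atol = 1e-12, rtol = 1e-12)
    abs(fx) <= atol + rtol * abs(xn)
end

approx_zero(1e-15, 1.0)    # true: |f| is tiny on the scale of xₙ
approx_zero(1e-3, 1e10)    # true only because |xₙ| is enormous
```

The `atol + rtol * abs(xn)` pattern is in the spirit of the mixed absolute/relative comparison Julia's `isapprox` performs.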

View File

@@ -10,15 +10,6 @@ This section uses the `TermInterface` add-on package.
using TermInterface
```
-```{julia}
-#| echo: false
-const frontmatter = (
-title = "Symbolic derivatives",
-description = "Calculus with Julia: Symbolic derivatives",
-tags = ["CalculusWithJulia", "derivatives", "symbolic derivatives"],
-);
-```
----

View File

@@ -1,4 +1,4 @@
-# Taylor Polynomials and other Approximating Polynomials
+# Taylor polynomials and other approximating polynomials
{{< include ../_common_code.qmd >}}
@@ -104,7 +104,7 @@ $$
tl(x) = f(c) + f'(c) \cdot(x - c).
$$
-The key is the term multiplying $(x-c)$ -- for the secant line this is an approximation to the related term for the tangent line. That is, the secant line approximates the tangent line, which is the linear function that best approximates the function at the point $(c, f(c))$.
+The key is the term multiplying $(x-c)$---for the secant line this is an approximation to the related term for the tangent line. That is, the secant line approximates the tangent line, which is the linear function that best approximates the function at the point $(c, f(c))$.
This is quantified by the *mean value theorem* which states under our assumptions on $f(x)$ that there exists some $\xi$ between $x$ and $c$ for which:
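The theorem's claim is easy to check numerically for a particular function; a sketch with $f = \sin$ (my choice of function and interval, not the text's):

```julia
# MVT sketch: the secant slope of sin over [c, x] equals cos(ξ) = f'(ξ)
# for some ξ strictly between c and x.
f(u) = sin(u)
c, x = 0.0, 1.0
secant = (f(x) - f(c)) / (x - c)
ξ = acos(secant)        # solve f'(ξ) = cos(ξ) = secant for ξ
c < ξ < x               # true: ξ ≈ 0.572 lies inside (0, 1)
```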
@@ -194,7 +194,7 @@ function divided_differences(f, x, xs...)
end
```
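Only the tail of `divided_differences` appears in this hunk; for orientation, here is one common recursive implementation (a sketch that may differ from the text's exact definition), along with the forward-difference check used below:

```julia
# Recursive divided differences: f[x] = f(x) and
# f[x₀, …, xₙ] = (f[x₁, …, xₙ] - f[x₀, …, xₙ₋₁]) / (xₙ - x₀)
divided_differences(f, x) = f(x)
function divided_differences(f, x, xs...)
    ys = (x, xs...)
    (divided_differences(f, xs...) - divided_differences(f, ys[1:end-1]...)) /
        (ys[end] - ys[1])
end

# 2·f[c, c+h, c+2h] is the second-order forward difference for f''(c):
g(u) = u^2
c, h = 0.0, 0.1
2 * divided_differences(g, c, c + h, c + 2h)    # ≈ g''(c) = 2
```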
-In the following--even though it is *type piracy*--by adding a `getindex` method, we enable the `[]` notation of Newton to work with symbolic functions, like `u()` defined below, which is used in place of $f$:
+In the following---even though it is *type piracy*---by adding a `getindex` method, we enable the `[]` notation of Newton to work with symbolic functions, like `u()` defined below, which is used in place of $f$:
```{julia}
@@ -215,7 +215,7 @@ Now, let's look at:
ex₂ = u[c, c+h, c+2h]
```
-If we multiply by $2$ and simplify, a discrete approximation for the second derivative--the second order forward [difference equation](http://tinyurl.com/n4235xy)--is seen:
+If we multiply by $2$ and simplify, a discrete approximation for the second derivative---the second order forward [difference equation](http://tinyurl.com/n4235xy)---is seen:
```{julia}
simplify(2ex₂)
@@ -794,7 +794,7 @@ This is re-expressed as $2s + s \cdot p$ with $p$ given by:
```{julia}
-cancel((a_b - 2s)/s)
+p = cancel((a_b - 2s)/s)
```
Now, $2s = m - s\cdot m$, so the above can be reworked to be $\log(1+m) = m - s\cdot(m-p)$.
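From $2s = m - s\cdot m$ one can solve $s = m/(2+m)$; a quick numeric check of that identity, and of how close $2s$ already is to $\log(1+m)$ (the value of $m$ is an arbitrary choice, and the formula for $s$ is inferred from the identity, not quoted from the text):

```julia
# Check the identity 2s = m - s·m for s = m/(2 + m), and the size of the
# remainder that the s·p correction term accounts for.
m = 0.3
s = m / (2 + m)
2s ≈ m - s * m          # true: exact identity
log(1 + m) - 2s         # small remainder, ≈ 0.0015
```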
@@ -807,7 +807,7 @@ How big can the error be between this *approximation* and $\log(1+m)$? The expr
```{julia}
-Max = (v/(2+v))(v => sqrt(2) - 1)
+Max = (x/(2+x))(x => sqrt(2) - 1)
```
The error term is like $2/19 \cdot \xi^{19}$ which is largest at this value of $M$. Large is relative - it is really small:
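Evaluating the bound $2/19 \cdot \xi^{19}$ at $\xi = \sqrt{2} - 1$ (the value of $M$ above) makes the point concrete; a quick check:

```julia
# The error bound 2/19 · ξ^19 at ξ = √2 - 1 ≈ 0.414 is on the order of 1e-9.
ξ = sqrt(2) - 1
bound = 2 / 19 * ξ^19
bound < 1e-8            # true: bound ≈ 5.6e-9
```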