Merge pull request #89 from fangliu-tju/main

Update taylor_series_polynomials.qmd
john verzani 2023-05-10 13:43:19 -04:00 committed by GitHub
commit 59ea928ea7


@@ -267,7 +267,7 @@ $$
f[c] + f[c, c+h](x-c).
$$
-As mentioned, we can verify directly that it interpolates the points $(c,f(c))$ and $(c+h, f(c+h))$. He we let `SymPy` do the algebra:
+As mentioned, we can verify directly that it interpolates the points $(c,f(c))$ and $(c+h, f(c+h))$. Here we let `SymPy` do the algebra:
```{julia}
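# The cell is truncated in this hunk; what follows is a sketch of the check,
# assuming the symbols and the generic function u are set up as in the chapter:
using SymPy
@syms x c h
u = SymFunction("u")
p₁ = u(c) + (u(c+h) - u(c))/h * (x - c)       # f[c] + f[c, c+h](x - c)
simplify(p₁(x => c) - u(c)), simplify(p₁(x => c+h) - u(c+h))   # (0, 0)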
@@ -304,7 +304,7 @@ Hmm, doesn't seem correct - that was supposed to be $0$. The issue isn't the mat
simplify(p₂(x => c+2h) - u(c+2h))
```
-By contrast, at the point $x=c+3h$ we have no guarantee of interpolation, and indeed don't, as this expression is non always zero:
+By contrast, at the point $x=c+3h$ we have no guarantee of interpolation, and indeed don't, as this expression is not always zero:
```{julia}
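# Sketch of the truncated cell: evaluate the quadratic interpolant p₂ at c + 3h,
# where no interpolation is guaranteed
simplify(p₂(x => c + 3h) - u(c + 3h))   # not identically 0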
@@ -418,7 +418,7 @@ By ignoring friction, the total energy is conserved giving:
$$
-K = \frac{1}{2}m v^2 + mgR \cdot (1 - \cos(\theta) =
+K = \frac{1}{2}m v^2 + mgR \cdot (1 - \cos(\theta)) =
\frac{1}{2} m R^2 (\frac{d\theta}{dt})^2 + mgR \cdot (1 - \cos(\theta)).
$$
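The connection to Taylor polynomials: for small $\theta$, $\cos(\theta)$ is well approximated by $1 - \theta^2/2$, which turns the potential term $mgR \cdot (1 - \cos(\theta))$ into $mgR \cdot \theta^2/2$. A quick numeric check of that approximation (a sketch, not a cell from the original):

```julia
# Compare 1 - cos(θ) with its quadratic Taylor approximation θ^2/2
[(θ, 1 - cos(θ), θ^2/2) for θ in (0.1, 0.2, 0.5)]
```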
@@ -461,7 +461,7 @@ The polynomial in question will be the Taylor polynomial of degree $2$:
$$
-T_2(x_k) = f(x_k) + f'(x_k)(x-x_k) + \frac{f''(x_k)}{2}(x - x_k)^2
+T_2(x) = f(x_k) + f'(x_k)(x-x_k) + \frac{f''(x_k)}{2}(x - x_k)^2
$$
The vertex of this quadratic polynomial will be when its derivative is $0$ which can be solved for $x_{k+1}$ giving:
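Setting $T_2'(x) = f'(x_k) + f''(x_k)(x - x_k) = 0$ and solving yields $x_{k+1} = x_k - f'(x_k)/f''(x_k)$. A minimal sketch of the resulting iteration (the function below is illustrative, not the chapter's code):

```julia
# Iterate x_{k+1} = x_k - f'(x_k)/f''(x_k), the vertex of the degree-2
# Taylor polynomial at x_k; fp and fpp are f' and f''
function vertex_iterate(fp, fpp, x0; steps=10)
    x = x0
    for _ in 1:steps
        x -= fp(x) / fpp(x)
    end
    x
end

# Example: f(x) = x^4 - 2x^2 has a local minimum at x = 1
vertex_iterate(x -> 4x^3 - 4x, x -> 12x^2 - 4, 1.5)  # ≈ 1.0
```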
@@ -612,7 +612,7 @@ plot(d1, 𝒂, 𝒃; color=:blue, label="interpolating")
plot!(d2; color=:green, label="Taylor")
```
-The graph should be $0$ at each of the the points in `xs`, which we can verify in the graph above. Plotting over a wider region shows a common phenomenon that these polynomials approximate the function near the values, but quickly deviate away:
+The graph should be $0$ at each of the points in `xs`, which we can verify in the graph above. Plotting over a wider region shows a common phenomenon that these polynomials approximate the function near the values, but quickly deviate away:
In this graph we make a plot of the Taylor polynomial for different sizes of $n$ for the function $f(x) = 1 - \cos(x)$:
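The cell producing that figure is truncated in this hunk; a sketch of one way to draw such a graph using `SymPy`'s `series` (the degrees chosen here are illustrative):

```julia
# Taylor polynomials of f(x) = 1 - cos(x) about 0, degrees 2, 4, 6, 8
using Plots, SymPy
@syms x
fx = 1 - cos(x)
plt = plot(fx, -2pi, 2pi; label="f", ylims=(-1, 3))
for n in (2, 4, 6, 8)
    Tn = series(fx, x, 0, n + 1).removeO()   # drop the O(x^(n+1)) term
    plot!(plt, Tn, -2pi, 2pi; label="T$n")
end
plt
```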
@@ -769,7 +769,7 @@ But how? One can see details of a possible way [here](https://github.com/musm/Am
First, there is usually a reduction stage. In this phase, the problem is transformed into one involving only a fixed interval of values. For this, values $k$ and $m$ are found so that $x = 2^k \cdot (1+m)$ *and* $\sqrt{2}/2 < 1+m < \sqrt{2}$. If these are found, then $\log(x)$ can be computed with $k \cdot \log(2) + \log(1+m)$. The first term - a multiplication - can easily be computed using a pre-computed value of $\log(2)$; the second then *reduces* the problem to a fixed interval.
-Now, for this problem a further trick is utilized, writing $s= f/(2+f)$ so that $\log(1+m)=\log(1+s)-\log(1-s)$ for some small range of $s$ values. These combined make it possible to compute $\log(x)$ for any real $x$.
+Now, for this problem a further trick is utilized, writing $s= m/(2+m)$ so that $\log(1+m)=\log(1+s)-\log(1-s)$ for some small range of $s$ values. These combined make it possible to compute $\log(x)$ for any real $x$.
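A minimal sketch of the reduction stage itself, using `Base.frexp` (the helper name is made up; the linked implementation differs in detail):

```julia
# Write x = 2^k * (1 + m) with sqrt(2)/2 < 1 + m < sqrt(2), so that
# log(x) = k*log(2) + log(1 + m).  frexp gives (f, e) with x = f*2^e, f ∈ [1/2, 1)
function reduce_log(x::Float64)
    f, e = frexp(x)
    if f < sqrt(2)/2
        f, e = 2f, e - 1      # nudge f into (sqrt(2)/2, sqrt(2))
    end
    e, f - 1                  # k, m
end

k, m = reduce_log(1234.5)
k * log(2) + log(1 + m) ≈ log(1234.5)  # true
```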
To compute $\log(1\pm s)$, we can find a Taylor polynomial. Let's go out to degree $19$ and use `SymPy` to do the work:
@@ -786,7 +786,7 @@ This is re-expressed as $2s + s \cdot p$ with $p$ given by:
```{julia}
-cancel(a_b - 2s/s)
+cancel((a_b - 2s)/s)
```
Now, $2s = m - s\cdot m$, so the above can be reworked to be $\log(1+m) = m - s\cdot(m-p)$.
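The identity $2s = m - s\cdot m$ follows from clearing the denominator in $s = m/(2+m)$; a quick symbolic check (a sketch, not part of the original):

```julia
using SymPy
@syms m
s = m / (2 + m)
simplify(2s - (m - s*m))   # 0
```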
@@ -850,8 +850,8 @@ For notational purposes, let $g(x)$ be the inverse function for $f(x)$. Assume *
\begin{align*}
-f(x_0 + \Delta_x) &= f(x_0) + a_1 \Delta_x + a_2 (\Delta_x)^2 + \cdots a_n + (\Delta_x)^n + \dots\\
-g(y_0 + \Delta_y) &= g(y_0) + b_1 \Delta_y + b_2 (\Delta_y)^2 + \cdots b_n + (\Delta_y)^n + \dots
+f(x_0 + \Delta_x) &= f(x_0) + a_1 \Delta_x + a_2 (\Delta_x)^2 + \cdots a_n (\Delta_x)^n + \dots\\
+g(y_0 + \Delta_y) &= g(y_0) + b_1 \Delta_y + b_2 (\Delta_y)^2 + \cdots b_n (\Delta_y)^n + \dots
\end{align*}
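For example, with $f(x) = e^x$ and $x_0 = 0$ (so $y_0 = 1$), the coefficients are $a_n = 1/n!$; the inverse $g(y) = \log(y)$ satisfies $g(1 + \Delta_y) = \log(1 + \Delta_y) = \Delta_y - \Delta_y^2/2 + \Delta_y^3/3 - \cdots$, so $b_n = (-1)^{n+1}/n$.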
@@ -881,7 +881,7 @@ Solving for $b_n = g^{(n)}(y_0) / n!$ gives the formal expression:
$$
g^{(n)}(y_0) = n! \cdot \lim_{\Delta_x \rightarrow 0}
\frac{\Delta_x - \sum_{i=1}^{n-1} b_i \left(\sum_{j=1}^n a_j (\Delta_x)^j \right)^i}{
-\left(\sum_{j=1}^n a_j \left(\Delta_x^j\right)^i\right)^n}
+\left(\sum_{j=1}^n a_j \left(\Delta_x \right)^j\right)^n}
$$
(This is following [Liptaj](https://vixra.org/pdf/1703.0295v1.pdf)).
@@ -921,7 +921,7 @@ end
gᵏs
```
-We can see the expected `g' = 1/f'` (where the point of evalution is $g(y) = 1/f'(f^{-1}(y))$ is not written). In addition, we get 3 more formulas, hinting that the answers grow rapidly in terms of their complexity.
+We can see the expected `g' = 1/f'` (where the point of evalution is $g'(y) = 1/f'(f^{-1}(y))$ is not written). In addition, we get 3 more formulas, hinting that the answers grow rapidly in terms of their complexity.
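For instance, with $f = \exp$ the inverse is $g = \log$, and the formula gives $g'(y) = 1/f'(\log(y)) = 1/y$; a quick numeric check (a sketch, not the chapter's code):

```julia
# g'(y) = 1 / f'(g(y)) for f = exp, g = log
fp(x) = exp(x)   # f'
g(y) = log(y)
y = 2.5
(1 / fp(g(y)), 1 / y)   # both 0.4
```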
In the above, for each `n`, the code above sets up the two sides, `left` and `right`, of an equation involving the higher-order derivatives of $g$. For example, when `n=2` we have: