more pset2 problems

This commit is contained in:
Steven G. Johnson 2023-02-22 10:16:21 -05:00
parent eab03cfce5
commit 4c63b638e3


@@ -28,12 +28,67 @@
"Compare the eigenvalues of $B$ (`eigvals(B)`) to the singular values of $A$ (`svdvals(A)`). What do you notice? Explain it by using the SVD $A = U \\Sigma V^T$ to construct eigenvalues and eigenvectors of $B$."
]
},
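The comparison in this problem can be checked numerically. A minimal sketch, *assuming* $B$ is the symmetric block matrix $\begin{pmatrix} 0 & A \\ A^T & 0 \end{pmatrix}$ constructed earlier in the problem (substitute the problem's actual definition of $B$ if it differs):

```julia
using LinearAlgebra

A = randn(4, 3)            # any real m×n matrix
m, n = size(A)
# assumed construction of B — check against the problem's actual definition:
B = [zeros(m, m)  A;
     A'  zeros(n, n)]
λ = eigvals(Symmetric(B))  # real eigenvalues, sorted in ascending order
σ = svdvals(A)             # singular values, sorted in descending order
@show λ
@show σ                    # each σᵢ shows up as a pair ±σᵢ of eigenvalues of B
```

For a full-rank random `A`, the nonzero eigenvalues of `B` come in $\pm\sigma_i$ pairs, which is the pattern the problem asks you to explain via the SVD.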
{
"cell_type": "markdown",
"id": "8110f592",
"metadata": {},
"source": [
"## Problem 2\n",
"\n",
"**(a)** For any $m \\times n$ real matrix $A$ and any real unitary $m \\times m$ matrix $Q_1$ and any real unitary $n \\times n$ matrix $Q_2$, show that $\\Vert A \\Vert$ = $\\Vert Q_1 A Q_2 \\Vert$ for the norms:\n",
"$$\n",
"\\Vert A \\Vert_2 = \\max_{x\\ne 0} \\frac{\\Vert A x \\Vert_2}{\\Vert x \\Vert_2}\\, , \\; \\; \\Vert A \\Vert_F = \\sqrt{\\text{tr}(A^T A) } \\, .\n",
"$$\n",
"Do *not* use the relationships of these norms to the singular values of $A$, from class; use only the definitions above. (Hint: a change of variables may be useful for the first norm, and the cyclic property of the trace for the second.)\n",
"\n",
"**(b)** Using the full SVD $A = U \\Sigma V^T$ and the unitary invariance from part (a), show that $\\Vert A \\Vert_2 = \\sigma_1$ and $\\Vert A \\Vert_F = \\sqrt{\\sum_k \\sigma_k^2}$."
]
},
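Both parts of this problem can be sanity-checked numerically with random matrices. A quick sketch (in Julia, `opnorm` is the induced norm $\Vert A \Vert_2$ and `norm` of a matrix is the Frobenius norm):

```julia
using LinearAlgebra

m, n = 5, 3
A = randn(m, n)
Q1 = Matrix(qr(randn(m, m)).Q)    # random m×m orthogonal ("real unitary") matrix
Q2 = Matrix(qr(randn(n, n)).Q)    # random n×n orthogonal matrix
# part (a): both norms are unchanged by unitary multiplication
@show opnorm(A, 2), opnorm(Q1 * A * Q2, 2)
@show norm(A), norm(Q1 * A * Q2)
# part (b): the norms match the singular-value formulas
σ = svdvals(A)
@show σ[1], sqrt(sum(σ .^ 2))
```

Numerical agreement is not a proof, of course, but it is a useful check that you have the right identities before writing them up.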
{
"cell_type": "markdown",
"id": "32250444",
"metadata": {},
"source": [
"## Problem 3\n",
"\n",
"Find a closest-rank-1 matrix (in the Frobenius norm, for example) to:\n",
"\n",
"**(a)** $A = \\begin{pmatrix} 0 & 3 \\\\ 2 & 0 \\end{pmatrix}$\n",
"\n",
"**(b)** $A = \\begin{pmatrix} \\cos\\theta & -\\sin\\theta \\\\ \\sin\\theta & \\cos\\theta \\end{pmatrix}$ (where $\\theta$ is some real number).\n",
"\n",
"You should be able to do your calculations completely by hand (it's not too hard, honest!), but of course you may use Julia to check your answers if you wish."
]
},
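If you want to verify a hand calculation of this type in Julia, the closest rank-1 matrix comes from truncating the SVD (the Eckart–Young theorem from class). A sketch on a random matrix, so as not to give away the answers to (a) and (b):

```julia
using LinearAlgebra

A = randn(4, 4)
U, σ, V = svd(A)
A₁ = σ[1] * U[:, 1] * V[:, 1]'    # truncated SVD = closest rank-1 matrix
@assert rank(A₁) == 1
# Frobenius distance to A = square root of the discarded squared singular values
@show norm(A - A₁), sqrt(sum(σ[2:end] .^ 2))
```

Running the same three lines on your matrices from (a) and (b) will confirm (or refute) your by-hand answers.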
{
"cell_type": "markdown",
"id": "d438a8e0",
"metadata": {},
"source": [
"## Problem 4\n",
"\n",
"For an $m \\times m$ real-symmetric matrix $S = S^T$, we know that we have real eigenvalues $\\lambda_1,\\ldots,\\lambda_m$ and can find an orthonormal basis of eigenvectors $Q = \\begin{pmatrix} q_1 & \\cdots & q_m \\end{pmatrix}$. So, we can write any vector $x \\in \\mathbb{R}^m$ as $x = Qc = q_1 c_1 + \\cdots q_m c_m$ for some coefficient $c$.\n",
"\n",
"**(a)** Show that $x^T x = c_1^2 + \\cdots + c_m^2$ and $x^T S x = \\lambda_1 c_1^2 + \\cdots + \\lambda_m c_m^2$.\n",
"\n",
"**(b)** Show that the **Rayleigh quotient**\n",
"$$\n",
"R(x) = \\frac{x^T S x}{x^T x}\n",
"$$\n",
"is *maximized* (over *any* possible $x \\ne 0$, not just eigenvectors) by $R(q_1) = \\lambda_1$.\n",
"\n",
"**(c)** If $A$ is any $m \\times n$ real matrix, we know that the squared singular values $\\sigma_i^2$ are the nonzero eigenvalues of $A^T A$. Use this fact, combined with part (b), to give an alternative proof of why\n",
"$$\n",
"\\Vert A \\Vert_2 = \\max_{x\\ne 0} \\frac{\\Vert A x \\Vert_2}{\\Vert x \\Vert_2} = \\sigma_1\n",
"$$"
]
},
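The claims in (b) and (c) are easy to test numerically. A sketch (note that Julia's `eigen` sorts eigenvalues in *ascending* order, so the problem's $\lambda_1$ and $q_1$ correspond to `λ[end]` and `Q[:, end]` below):

```julia
using LinearAlgebra

m = 6
X = randn(m, m)
S = Symmetric(X + X')           # random real-symmetric matrix
λ, Q = eigen(S)                 # eigenvalues sorted ascending
R(x) = (x' * S * x) / (x' * x)  # Rayleigh quotient
@show R(Q[:, end]), λ[end]      # R at the top eigenvector = largest eigenvalue
# random trial vectors never exceed the largest eigenvalue:
@show maximum(R(randn(m)) for _ in 1:1000) ≤ λ[end] + 1e-9
# part (c): ‖A‖₂ = σ₁ = sqrt of the largest eigenvalue of AᵀA
A = randn(5, 3)
@show opnorm(A, 2), svdvals(A)[1], sqrt(maximum(eigvals(Symmetric(A' * A))))
```

The random trials only sample the maximization in (b), so they support the claim rather than prove it; the proof is what the problem asks for.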
{
"cell_type": "markdown",
"id": "88ac965b",
"metadata": {},
"source": [
"## Problems 2, 3, etc: coming soon"
"## Problems 5, 6, etc: coming soon"
]
},
{