Lecture 9: Four Ways to Solve Least Squares Problems
Description
In this lecture, Professor Strang details the four ways to solve least-squares problems. Solving least-squares problems comes into play in the many applications that rely on data fitting.
Summary
- Solve \(A^{\mathtt{T}} Ax = A^{\mathtt{T}}b\) to minimize \(\Vert Ax - b \Vert^2\).
- Gram-Schmidt \(A = QR\) leads to \(x = R^{-1} Q^{\mathtt{T}}b\).
- The pseudoinverse directly multiplies \(b\) to give \(x\).
- The best \(x\) is the limit of \((A^{\mathtt{T}}A + \delta I)^{-1} A^{\mathtt{T}}b\) as \(\delta \rightarrow 0\). (All four methods are illustrated in the code sketch below.)
Related section in textbook: II.2
Instructor: Prof. Gilbert Strang
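To make the summary concrete, here is a minimal NumPy sketch (an illustration added here, not part of the lecture itself) that computes the same least-squares solution by all four routes; the matrix \(A\) and vector \(b\) are made-up examples with independent columns.

```python
import numpy as np

# A made-up overdetermined system (more equations than unknowns).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# 1) Normal equations: solve A^T A x = A^T b.
x1 = np.linalg.solve(A.T @ A, A.T @ b)

# 2) Gram-Schmidt / QR: A = QR gives x = R^{-1} Q^T b.
Q, R = np.linalg.qr(A)              # reduced QR factorization
x2 = np.linalg.solve(R, Q.T @ b)

# 3) Pseudoinverse: x = A^+ b.
x3 = np.linalg.pinv(A) @ b

# 4) Ridge limit: x = (A^T A + delta*I)^{-1} A^T b as delta -> 0.
delta = 1e-10
x4 = np.linalg.solve(A.T @ A + delta * np.eye(A.shape[1]), A.T @ b)

print(x1, x2, x3, x4)               # all four agree to rounding error
```

For this full-column-rank example all four answers agree; the pseudoinverse and the \(\delta \rightarrow 0\) limit are the routes that still make sense when \(A^{\mathtt{T}}A\) is singular.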
Problems for Lecture 9
From textbook Section II.2
2. Why do \(A\) and \(A^+\) have the same rank? If \(A\) is square, do \(A\) and \(A^+\) have the same eigenvectors? What are the eigenvalues of \(A^+\)?
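As a hedged aside (an addition, not from the textbook), the claims in Problem 2 are easy to experiment with numerically; the rank-1 matrix below is an arbitrary choice.

```python
import numpy as np

# An arbitrary rank-1 square matrix to experiment with.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

A_plus = np.linalg.pinv(A)      # pseudoinverse, computed via the SVD

# A and A^+ have the same rank (1 here).
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A_plus))

# A is square: compare eigenvalues of A and A^+.
# For this symmetric example they are {0, 5} and {0, 1/5}.
print(np.linalg.eigvals(A), np.linalg.eigvals(A_plus))
```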
8. What multiple of \(a = \left[\begin{matrix}1\\1\end{matrix}\right]\) should be subtracted from \(b = \left[\begin{matrix}4\\0\end{matrix}\right]\) to make the result \(A_2\) orthogonal to \(a\)? Sketch a figure to show \(a\), \(b\), and \(A_2\).
9. Complete the Gram-Schmidt process in Problem 8 by computing \(q_1=a/\|a\|\) and \(q_2=A_2/\|A_2\|\) and factoring into \(QR\): $$\left[\begin{matrix}1 & 4\\ 1 & 0\end{matrix}\right] = \left[\begin{matrix}q_1 & q_2\end{matrix}\right]\left[\begin{matrix}\|a\| & ?\\ 0 & \|A_2\|\end{matrix}\right]$$ In MATLAB, the backslash command \(A\backslash b\) is engineered to find the least-squares solution when \(A\) is rectangular.
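For Problems 8 and 9, a short numerical cross-check (again an addition, not part of the problem set) can confirm a hand computation; note that `np.linalg.qr` may choose opposite signs for the columns of \(Q\), so the projection is done explicitly here.

```python
import numpy as np

# Columns a and b from Problem 8.
A = np.array([[1.0, 4.0],
              [1.0, 0.0]])
a, b = A[:, 0], A[:, 1]

# Subtract from b its projection onto a; A2 is orthogonal to a.
A2 = b - (a @ b) / (a @ a) * a
q1 = a / np.linalg.norm(a)
q2 = A2 / np.linalg.norm(A2)

Q = np.column_stack([q1, q2])
R = Q.T @ A                    # upper triangular, with ||a|| and ||A2|| on the diagonal
print(np.allclose(Q @ R, A))   # True: confirms the QR factorization
```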