Section 1.7-1.8:

- Understand the LU decomposition/Gaussian elimination, with and without pivoting.
- Be able to calculate the LU decomposition.
- Given P, L, and U, know how to solve Ax=b.
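As a sketch of the last two items, here is a hypothetical worked example in Python (the matrix and right-hand side are made up): it computes PA = LU by Gaussian elimination with partial pivoting, then solves Ax=b with one forward substitution (Ly = Pb) and one back substitution (Ux = y).

```python
import numpy as np

# Made-up 3x3 system for illustration; any nonsingular A would do.
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
b = np.array([1., 2., 3.])

# Factor PA = LU with partial pivoting.
n = len(A)
P, L, U = np.eye(n), np.eye(n), A.copy()
for k in range(n - 1):
    p = k + np.argmax(np.abs(U[k:, k]))   # row with the largest pivot
    U[[k, p]] = U[[p, k]]
    P[[k, p]] = P[[p, k]]
    L[[k, p], :k] = L[[p, k], :k]         # swap already-computed multipliers
    for i in range(k + 1, n):
        L[i, k] = U[i, k] / U[k, k]       # multiplier
        U[i, k:] -= L[i, k] * U[k, k:]    # eliminate below the pivot

# Solve L(Ux) = Pb: forward substitution, then back substitution.
pb = P @ b
y, x = np.zeros(n), np.zeros(n)
for i in range(n):
    y[i] = pb[i] - L[i, :i] @ y[:i]
for i in reversed(range(n)):
    x[i] = (y[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
```

Note the pivoting convention: with PA = LU, you must permute b before the forward substitution.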

- Understand the condition number: how to calculate it, what it means, its relationship with error magnification, etc.
- Be able to bound errors due to perturbations in b or A, using Theorems 2.2.4, 2.3.3, and/or 2.3.8.
- Use the residual to bound error in the solution (Theorem 2.4.1).
- Understand how floating point arithmetic works, and be able to do calculations. (You do not need to know the IEEE standard.)
- Be able to explain precisely why multiplication and division are benign, but addition/subtraction can lead to large errors.
- Understand what backwards stability means, and how to combine backwards stability with the earlier error estimates to show a priori accuracy of solutions.
- Be able to explain why small pivots in Gaussian elimination are bad, and how this relates to ill-conditioned matrices.
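A hypothetical numerical illustration of the condition number as an error magnifier (the matrix and perturbation are made up): for kappa(A) = ||A|| ||A^-1||, a relative perturbation in b can be magnified in x by a factor up to roughly kappa(A).

```python
import numpy as np

# Nearly singular, hence ill-conditioned, 2x2 matrix (made up).
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
kappa = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)

b  = np.array([2.0, 2.0001])   # exact solution is x = (1, 1)
db = np.array([0.0, 1e-5])     # tiny perturbation in b

x  = np.linalg.solve(A, b)
dx = np.linalg.solve(A, b + db) - x

rel_b = np.linalg.norm(db) / np.linalg.norm(b)
rel_x = np.linalg.norm(dx) / np.linalg.norm(x)
# The perturbation bound: rel_x <= kappa * rel_b.
# Here rel_b is about 3.5e-6, yet rel_x is about 0.1.
```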
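And a small made-up demonstration of why subtraction of nearby numbers is the dangerous operation: simulating limited precision by rounding each operand barely changes a product, but can destroy all significant digits of a difference (catastrophic cancellation).

```python
# Two nearby numbers (made up), stored to only 3 decimal places
# to mimic limited floating point precision.
a, b = 1.2345678, 1.2345555
a3, b3 = round(a, 3), round(b, 3)

# Multiplication: relative error stays around the rounding level.
prod_err = abs(a * b - a3 * b3) / abs(a * b)

# Subtraction: the leading digits cancel, so the rounding error
# dominates what remains -- here 100% relative error.
diff_err = abs((a - b) - (a3 - b3)) / abs(a - b)
```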

- Understand how to set up least squares problems, with arbitrary basis functions phi_i(t), and given data.
- Understand how reflections work, and be able to find requested reflections.
- Be able to calculate QR decompositions for given matrices.
- Be able to solve least squares problems given the QR decomposition.
- Be able to prove theorems about orthogonal matrices, inner products, orthogonal complements, subspaces, orthogonal projections, null spaces, ranges, etc., as in sections 3.2 and 3.5, of similar difficulty as those on the homework.
- Know how to set up the normal equations, and how to use them to solve the least squares problem.
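On finding a requested reflection: here is a sketch (with a made-up vector x) of the Householder reflector Q = I - 2uu^T, ||u|| = 1, chosen to map x onto a multiple of e_1. The sign convention v = x + sign(x_1)||x|| e_1 avoids cancellation.

```python
import numpy as np

x = np.array([3., 4.])                       # made-up vector, ||x|| = 5
v = x.copy()
v[0] += np.sign(x[0]) * np.linalg.norm(x)    # v = x + sign(x_1)||x|| e_1
u = v / np.linalg.norm(v)                    # unit normal of the mirror
Q = np.eye(2) - 2 * np.outer(u, u)           # reflector: symmetric, orthogonal
# Q maps x to -sign(x_1)||x|| e_1, i.e. (-5, 0) here.
```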
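The least squares pipeline above can be sketched end to end with hypothetical data and the basis phi_1(t) = 1, phi_2(t) = t (a line fit; any basis functions phi_j give the same setup, with A[i,j] = phi_j(t_i)). Both solution routes from the list appear: QR and the normal equations.

```python
import numpy as np

# Made-up data points (t_i, y_i).
t = np.array([0., 1., 2., 3.])
y = np.array([1.1, 1.9, 3.2, 3.9])

# Design matrix: column j holds phi_j evaluated at the data points.
A = np.column_stack([np.ones_like(t), t])

# Route 1: QR.  With A = QR (Q has orthonormal columns),
# the least squares solution solves the triangular system R c = Q^T y.
Q, R = np.linalg.qr(A)
c_qr = np.linalg.solve(R, Q.T @ y)

# Route 2: normal equations A^T A c = A^T y.
c_ne = np.linalg.solve(A.T @ A, A.T @ y)

# Both give the same coefficients; QR is the numerically safer route,
# since cond(A^T A) = cond(A)^2.
```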

As usual, this may not be an exhaustive list, but it covers almost everything I could ask you.