Friday Oct 14 lecture summary:

We talked about what will be on the first midterm. The topics include Sections 1.2, 1.3, 1.4, 1.5, 8.1, and 8.2.
With regard to ODE/PDE, you mainly just need to know the two approximations: u'(x_i) ≈ ( u(x_{i+1}) - u(x_{i-1}) )/(2h) and u''(x_i) ≈ ( u(x_{i+1}) - 2u(x_i) + u(x_{i-1}) )/(h^2), and how to use them. The test will focus on these topics, but I reserve the right to ask questions on anything we talked about in class or that is covered in the book in those sections.
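A quick way to see these two central-difference formulas in action is to check them against a function whose derivatives are known. The sketch below (a hypothetical test, not from the lecture) uses u(x) = sin(x), for which u'(x) = cos(x) and u''(x) = -sin(x):

```python
import numpy as np

# Central-difference approximations on a uniform grid, checked against
# a function whose derivatives we know: u(x) = sin(x).
h = 1e-4
x = 0.7  # an arbitrary interior point (hypothetical test value)

u = np.sin
d1 = (u(x + h) - u(x - h)) / (2 * h)           # approximates u'(x)  = cos(x)
d2 = (u(x + h) - 2 * u(x) + u(x - h)) / h**2   # approximates u''(x) = -sin(x)

print(abs(d1 - np.cos(x)))   # small: error is O(h^2)
print(abs(d2 + np.sin(x)))   # small: error is O(h^2)
```

Both formulas are second-order accurate, so halving h should cut each error by roughly a factor of four (until floating-point roundoff takes over for very small h).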

We discussed how, when using an iterative method, you can sometimes avoid actually writing down the matrices you want to use, which can simplify programming.
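As an illustration of this matrix-free idea, here is a sketch of Jacobi iteration for the 1D Poisson problem -u'' = f on (0,1) with u(0) = u(1) = 0, using the second-derivative stencil above. The tridiagonal matrix is never formed; the stencil is applied directly. The right-hand side f = pi^2 sin(pi x), with exact solution u = sin(pi x), is a hypothetical test problem, not one from the lecture:

```python
import numpy as np

# Matrix-free Jacobi for -u'' = f on (0,1), u(0) = u(1) = 0.
# Jacobi update from the three-point stencil:
#   u_i <- ( u_{i-1} + u_{i+1} + h^2 f_i ) / 2
n = 50                       # number of interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x)   # hypothetical test right-hand side

u = np.zeros(n)
for _ in range(20000):       # plain Jacobi converges slowly
    u_pad = np.concatenate(([0.0], u, [0.0]))   # attach boundary zeros
    u = (u_pad[:-2] + u_pad[2:] + h**2 * f) / 2.0

err = np.max(np.abs(u - np.sin(np.pi * x)))
print(err)                   # error at the level of the discretization
```

Note that each sweep only touches neighboring entries of u, so no matrix storage or matrix-vector multiply routine is needed.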

We discussed the Gauss-Seidel method, which is an improvement on Jacobi's method. It converges in about half the iterations and uses the same number of flops per iteration. The code is only one line different from that for Jacobi's method, line 14 in this code.
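The one-line difference is that Gauss-Seidel uses each updated value immediately, instead of reading from a saved copy of the previous iterate as Jacobi does. A sketch for the same hypothetical 1D Poisson test problem (this is not the code referenced above, just an illustration of the idea):

```python
import numpy as np

# Gauss-Seidel for -u'' = f on (0,1), u(0) = u(1) = 0.
n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x)   # hypothetical test right-hand side

u = np.zeros(n + 2)                # entries 0 and n+1 are the boundary zeros
for _ in range(10000):
    for i in range(1, n + 1):
        # Jacobi would read u[i-1] and u[i+1] from a saved copy of the
        # previous sweep; G-S reads the freshly updated neighbors in place.
        u[i] = (u[i - 1] + u[i + 1] + h**2 * f[i - 1]) / 2.0

err = np.max(np.abs(u[1:-1] - np.sin(np.pi * x)))
print(err)
```

Because updates happen in place, G-S needs only one copy of u, whereas Jacobi needs old and new iterates.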

G-S is not inherently parallel, but we briefly discussed a scheme to get around this: red-black G-S.
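The idea behind red-black G-S, sketched on the same hypothetical 1D test problem: color the grid points alternately red and black, so each red point depends only on black neighbors and vice versa. Each half-sweep then updates one color all at once, which is fully parallel (here expressed as vectorized NumPy operations):

```python
import numpy as np

# Red-black Gauss-Seidel for -u'' = f on (0,1), u(0) = u(1) = 0.
n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x)   # hypothetical test right-hand side

u = np.zeros(n + 2)                # entries 0 and n+1 are the boundary zeros
for _ in range(10000):
    # red half-sweep: odd interior indices 1, 3, 5, ... all at once
    u[1:-1:2] = (u[0:-2:2] + u[2::2] + h**2 * f[0::2]) / 2.0
    # black half-sweep: even interior indices 2, 4, 6, ... all at once
    u[2:-1:2] = (u[1:-2:2] + u[3::2] + h**2 * f[1::2]) / 2.0

err = np.max(np.abs(u[1:-1] - np.sin(np.pi * x)))
print(err)
```

Within each half-sweep there are no data dependencies, so the points of one color could be distributed across processors.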

We then discussed successive overrelaxation, an improvement on G-S. Essentially, at each step of G-S, you change the guess by MORE than G-S would tell you to. This can make convergence go 10 to 20 times faster. See the book for more details.
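A sketch of SOR on the same hypothetical test problem: compute the G-S value at each point, then move the guess further in that direction by a factor omega > 1. For this 1D model problem the optimal choice omega = 2/(1 + sin(pi h)) is known, which is what makes the dramatic speedup possible:

```python
import numpy as np

# Successive overrelaxation (SOR) for -u'' = f on (0,1), u(0) = u(1) = 0.
n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x)       # hypothetical test right-hand side
omega = 2.0 / (1.0 + np.sin(np.pi * h))  # optimal omega for this model problem

u = np.zeros(n + 2)                    # entries 0 and n+1 are boundary zeros
for _ in range(500):                   # far fewer sweeps than Jacobi or G-S
    for i in range(1, n + 1):
        gs = (u[i - 1] + u[i + 1] + h**2 * f[i - 1]) / 2.0  # G-S value
        u[i] = u[i] + omega * (gs - u[i])                   # overshoot it

err = np.max(np.abs(u[1:-1] - np.sin(np.pi * x)))
print(err)
```

With omega = 1 this reduces exactly to G-S; the whole improvement is in overshooting the G-S update by the factor omega.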