Unit 8.1.3 What you will learn

This week, you are introduced to additional techniques for solving sparse linear systems (or any linear system where computing a matrix-vector multiplication with the matrix is cheap). We discuss descent methods in general and the Conjugate Gradient Method in particular, which is the most important member of this family of algorithms.

Upon completion of this week, you should be able to

  • Relate solving a linear system of equations \(A x = b \text{,}\) where \(A \) is symmetric positive definite (SPD), to finding the minimum of the function \(f( x ) = \frac{1}{2} x^T A x - x^T b \text{.}\)

  • Solve \(A x = b \) via descent methods including the Conjugate Gradient Method.

  • Exploit properties of A-conjugate search directions to morph the Method of Steepest Descent into a practical Conjugate Gradient Method.

  • Recognize that while in exact arithmetic the Conjugate Gradient Method solves \(A x = b \) in a finite number of iterations, in practice it is an iterative method due to error introduced by floating point arithmetic.

  • Accelerate the Method of Steepest Descent and the Conjugate Gradient Method by applying a preconditioner, which implicitly defines a new problem with the same solution and a better condition number.
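To make these objectives concrete, the following is a minimal sketch of the (unpreconditioned) Conjugate Gradient Method in plain Python. It is illustrative rather than the course's reference implementation; the names (`cg`, `matvec`, `dot`) and the small test system are assumptions introduced here. Note how the residual doubles as the first search direction, and how the scalar `beta` is what keeps successive search directions A-conjugate.

```python
def matvec(A, x):
    """Multiply matrix A (a list of rows) by vector x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def dot(u, v):
    """Inner product of two vectors."""
    return sum(u_i * v_i for u_i, v_i in zip(u, v))

def cg(A, b, tol=1e-10, max_iters=None):
    """Solve A x = b for SPD A via the Conjugate Gradient Method.

    In exact arithmetic CG finishes in at most n iterations; in floating
    point arithmetic it is used as an iterative method with a tolerance.
    """
    n = len(b)
    if max_iters is None:
        max_iters = n
    x = [0.0] * n            # initial guess x^(0) = 0
    r = b[:]                 # residual r = b - A x = b when x = 0
    p = r[:]                 # first search direction is the residual
    rs_old = dot(r, r)
    for _ in range(max_iters):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)   # exact line search along p
        x = [x_i + alpha * p_i for x_i, p_i in zip(x, p)]
        r = [r_i - alpha * Ap_i for r_i, Ap_i in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new ** 0.5 < tol:
            break
        beta = rs_new / rs_old        # makes the new direction A-conjugate
        p = [r_i + beta * p_i for r_i, p_i in zip(r, p)]
        rs_old = rs_new
    return x

# Small SPD example: exact solution is x = [1/11, 7/11],
# reached here in two iterations (n = 2).
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = cg(A, b)
```

Replacing the line that computes `beta` with `p = r[:]` (restarting from the residual every step) turns this into the Method of Steepest Descent, which makes the "morphing" relationship between the two methods easy to experiment with.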