Subsection 4.4.2 Via Gram-Schmidt QR factorization

In Section 3.2, you were introduced to the (Classical and Modified) Gram-Schmidt process and how it is equivalent to computing a QR factorization of the matrix, \(A \text{,}\) whose columns are the linearly independent vectors being orthonormalized. The resulting \(Q \) and \(R \) can be used to solve the linear least squares problem by first computing \(y = Q^H b \) and then solving \(R \hat x = y \text{.}\)
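
One way to see why these two steps yield the solution: because the columns of \(A \) are linearly independent, \(R \) is nonsingular, and because the columns of \(Q \) are orthonormal, \(Q^H Q = I \text{.}\) Hence the normal equations \(A^H A \hat x = A^H b \) become

\begin{equation*}
R^H Q^H Q R \hat x = R^H Q^H b
\quad \mbox{or, equivalently,} \quad
R^H R \hat x = R^H Q^H b \text{,}
\end{equation*}

and multiplying both sides by \(R^{-H} \) leaves \(R \hat x = Q^H b \text{.}\)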

Starting with \(A \in \Cmxn \text{,}\) let's explicitly state the steps required to solve the LLS problem via either CGS or MGS and analyze their cost:

  • From Homework 3.2.6.1 or Homework 3.2.6.2, factoring \(A = Q R \) via CGS or MGS costs, approximately, \(2 mn^2 \) flops.

  • Compute \(y = Q^H b \text{:}\) \(2 m n \) flops.

  • Solve \(R \hat x = y \text{:}\) \(n^2 \) flops.

Total: \(2 m n^2 + 2 m n + n^2 \) flops.
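
To make these three steps concrete, here is a minimal NumPy sketch (not code from the text): MGS factorization of \(A \text{,}\) forming \(y = Q^H b \text{,}\) and back substitution to solve \(R \hat x = y \text{.}\) The function names mgs_qr and lls_via_qr are made up for this illustration, and the columns of \(A \) are assumed to be linearly independent so that the diagonal of \(R \) is nonzero.

import numpy as np

def mgs_qr(A):
    """QR factorization via Modified Gram-Schmidt (approximately 2 m n^2 flops)."""
    m, n = A.shape
    Q = A.astype(complex)          # Q starts as a copy of A and is orthonormalized in place
    R = np.zeros((n, n), dtype=complex)
    for j in range(n):
        R[j, j] = np.linalg.norm(Q[:, j])      # assumes linearly independent columns
        Q[:, j] = Q[:, j] / R[j, j]
        for k in range(j + 1, n):
            R[j, k] = Q[:, j].conj() @ Q[:, k]             # project remaining columns
            Q[:, k] = Q[:, k] - R[j, k] * Q[:, j]          # and subtract the component
    return Q, R

def lls_via_qr(A, b):
    """Solve min_x || b - A x ||_2 via Gram-Schmidt QR."""
    Q, R = mgs_qr(A)               # ~2 m n^2 flops
    y = Q.conj().T @ b             # y = Q^H b: 2 m n flops
    n = R.shape[0]
    xhat = np.zeros(n, dtype=complex)
    for i in reversed(range(n)):   # back substitution for R xhat = y: n^2 flops
        xhat[i] = (y[i] - R[i, i + 1:] @ xhat[i + 1:]) / R[i, i]
    return xhat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 4)) + 1j * rng.standard_normal((10, 4))
    b = rng.standard_normal(10) + 1j * rng.standard_normal(10)
    # Compare against NumPy's built-in least squares solver.
    print(np.allclose(lls_via_qr(A, b), np.linalg.lstsq(A, b, rcond=None)[0]))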