
Unit 10.2.4 Cost of a simple QR algorithm

The QR algorithms that we have discussed incur the following approximate costs per iteration for an \(m \times m \) Hermitian matrix.

  • \(A \rightarrow Q R \) (QR factorization): \(\frac{4}{3} m^3 \) flops.

  • \(A := R Q \text{.}\) A naive implementation would take advantage of \(R \) being upper triangular, but not of the fact that the result will again be Hermitian, at a cost of \(m^3 \) flops. If one also takes advantage of the Hermitian structure of the result, that cost is reduced to \(\frac{1}{2}m^3 \) flops.

  • If the eigenvectors are desired (which is usually the case), the update \(V := V Q \) requires an additional \(2 m^3\) flops.

Any other costs, like shifting the matrix, are inconsequential.
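The steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the text's own code: it uses `np.linalg.qr` for the factorization (so it does not exploit the Hermitian structure discussed above), and the function name `simple_qr_iteration` is ours.

```python
import numpy as np

def simple_qr_iteration(A, num_iters=100):
    """Sketch of the simple (unshifted) QR algorithm: repeatedly
    factor A = Q R, form A := R Q, and accumulate V := V Q."""
    m = A.shape[0]
    V = np.eye(m, dtype=A.dtype)
    for _ in range(num_iters):
        Q, R = np.linalg.qr(A)   # QR factorization: O(m^3) flops
        A = R @ Q                # A := R Q, unitarily similar to the old A
        V = V @ Q                # accumulate eigenvector estimates: O(m^3)
    return A, V
```

For a Hermitian input with well-separated eigenvalues, the iterates converge to a diagonal matrix whose entries are the eigenvalues, with the columns of \(V \) approximating the eigenvectors.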

Thus, the cost per iteration equals approximately \((\frac{4}{3} + \frac{1}{2}) m^3 = \frac{11}{6} m^3 \) flops if only the eigenvalues are to be computed. If the eigenvectors are also required, then the cost increases by \(2 m^3 \) flops per iteration, to \(\frac{23}{6} m^3 \) flops.

Let us now consider adding deflation. The rule of thumb is that a few iterations are needed for each eigenvalue that is found; let's say \(K \) iterations per eigenvalue. Every time an eigenvalue is found, the problem deflates, decreasing in size by one. The cost then becomes

\begin{equation*} \sum_{i=1}^{m} K \frac{23}{6} i^3 \approx K \frac{23}{6} \int_{0}^{m} x^3 dx = K \frac{23}{6} \left. \frac{x^4}{4} \right|_0^m = K \frac{23}{24} m^4. \end{equation*}
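A quick numeric check (illustrative, not from the text) confirms that replacing the sum by the integral is accurate for large \(m \): the exact sum \(\sum_{i=1}^{m} \frac{23}{6} i^3 \) and the estimate \(\frac{23}{24} m^4 \) agree to within a fraction of a percent.

```python
# Compare the exact flop count (per unit K) with the integral estimate.
m = 1000
exact = sum(23 / 6 * i**3 for i in range(1, m + 1))
approx = 23 / 24 * m**4
print(exact / approx)  # ratio tends to 1 as m grows
```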

The bottom line is that the computation requires \(O(m^4) \) flops. All other factorizations we have encountered so far require at most \(O(m^3) \) flops. Generally \(O(m^4)\) is considered prohibitively expensive. We need to do better!