Subsection 2.3 Question 3

Let \(A \in \mathbb R^{m \times n} \) and \(x \in \mathbb R^n \text{.}\) How many floating point operations (flops) are required to compute the matrix-vector multiplication \(A x \text{?}\)

Answer

Exactly \(m n \) multiplies and \(m (n-1) \) adds, for a total of \(2 m n - m \) flops, which is approximately \(2 m n \) flops.

1 Solution

Let

\begin{equation*} A = \left( \begin{array}{ c c c c} \alpha_{0,0} \amp \alpha_{0,1} \amp \cdots \amp \alpha_{0,n-1} \\ \alpha_{1,0} \amp \alpha_{1,1} \amp \cdots \amp \alpha_{1,n-1} \\ \vdots \amp \vdots \amp \amp \vdots \\ \alpha_{m-1,0} \amp \alpha_{m-1,1} \amp \cdots \amp \alpha_{m-1,n-1} \\ \end{array} \right) \mbox{ and } x = \left( \begin{array}{c} \chi_0 \\ \chi_1 \\ \vdots \\ \chi_{n-1} \end{array} \right). \end{equation*}

Then

\begin{equation*} \begin{array}{rcl} A x \amp = \amp \left( \begin{array}{ c c c c} \alpha_{0,0} \amp \alpha_{0,1} \amp \cdots \amp \alpha_{0,n-1} \\ \alpha_{1,0} \amp \alpha_{1,1} \amp \cdots \amp \alpha_{1,n-1} \\ \vdots \amp \vdots \amp \amp \vdots \\ \alpha_{m-1,0} \amp \alpha_{m-1,1} \amp \cdots \amp \alpha_{m-1,n-1} \\ \end{array} \right) \left( \begin{array}{c} \chi_0 \\ \chi_1 \\ \vdots \\ \chi_{n-1} \end{array} \right)\\ \amp = \amp \left( \begin{array}{r r r r r} \alpha_{0,0} \chi_0 + \amp \alpha_{0,1} \chi_1 + \amp \cdots + \amp \alpha_{0,n-1} \chi_{n-1} \\ \alpha_{1,0} \chi_0 + \amp \alpha_{1,1} \chi_1 + \amp \cdots + \amp \alpha_{1,n-1} \chi_{n-1} \\ \vdots ~~~~ \amp \vdots ~~~~ \amp \amp \vdots ~~~~ \\ \alpha_{m-1,0} \chi_0 + \amp \alpha_{m-1,1} \chi_1 + \amp \cdots + \amp \alpha_{m-1,n-1} \chi_{n-1} \end{array} \right). \end{array} \end{equation*}

This exposes \(m n \) multiplications and \(m (n-1) \) additions, for a total of \(2 m n - m \) flops. We usually would state this as "approximately \(2 m n\) flops."
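The count above can be checked mechanically. The following sketch (not part of the text; the function name and the use of Python are our own choices) performs the matrix-vector multiplication entry by entry, tallying each multiply and add:

```python
import random

def matvec_flop_count(m, n):
    """Multiply a random m x n matrix by a length-n vector,
    counting multiplies and adds explicitly."""
    A = [[random.random() for _ in range(n)] for _ in range(m)]
    x = [random.random() for _ in range(n)]
    multiplies = adds = 0
    y = []
    for i in range(m):
        # Accumulate alpha_{i,0} chi_0 + alpha_{i,1} chi_1 + ... + alpha_{i,n-1} chi_{n-1}
        acc = A[i][0] * x[0]
        multiplies += 1
        for j in range(1, n):
            acc += A[i][j] * x[j]
            multiplies += 1
            adds += 1
        y.append(acc)
    return multiplies, adds

mults, adds = matvec_flop_count(5, 4)
assert mults == 5 * 4                  # m n multiplies
assert adds == 5 * (4 - 1)             # m (n-1) adds
assert mults + adds == 2 * 5 * 4 - 5   # 2 m n - m flops in total
```

Running this for any \(m\) and \(n \geq 1\) reproduces the counts in the solution.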

2 Alternative solution

A dot product \(x^T y \) with vectors of size \(n\) requires \(n \) multiplies and \(n-1 \) adds for \(2n-1 \) flops. A matrix-vector multiplication with an \(m \times n \) matrix requires \(m \) dot products with vectors of size \(n \text{,}\) for a total of \(m ( 2n-1) = 2mn - m\) flops.
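This row-by-row view translates directly into a cost formula. A minimal sketch (function names are ours, chosen for illustration):

```python
def dot_product_flops(n):
    """Flops for a dot product of two length-n vectors:
    n multiplies plus (n - 1) adds."""
    return 2 * n - 1

def matvec_flops(m, n):
    """Flops for A x with an m x n matrix A, counted as
    m dot products, each with vectors of size n."""
    return m * dot_product_flops(n)

# Agrees with the direct count 2 m n - m from the first solution:
assert matvec_flops(5, 4) == 2 * 5 * 4 - 5
```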

3 Relevance

Matrix computations are at the core of many scientific computing and data science applications. Such applications often require substantial computational resources. Thus, reducing the cost (in time) of matrix computations is important, as is the ability to estimate that cost.

4 Resources

The cost of various linear algebra operations is discussed throughout LAFF.