The probability that it will be sunny the day after tomorrow and sunny tomorrow is \(0.4 \times 0.4 \text{.}\)
The probability that it will be sunny the day after tomorrow and cloudy tomorrow is \(0.3 \times 0.4 \text{.}\)
The probability that it will be sunny the day after tomorrow and rainy tomorrow is \(0.1 \times 0.2 \text{.}\)
Thus, the probability that it will be sunny the day after tomorrow, if it is sunny today, is \(0.4 \times 0.4 + 0.3 \times 0.4 + 0.1 \times 0.2
= 0.30 \text{.}\) Notice that this is the inner product of the vector that is the row for "Tomorrow is sunny" and the column for "Today is sunny". By similar arguments, the probability that it is cloudy the day after tomorrow is 0.40 and the probability that it is rainy is 0.30.
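This inner-product view is easy to check numerically. A minimal sketch in Python (NumPy), with the states ordered (sunny, cloudy, rainy) as in the transition table for this example:

```python
import numpy as np

# Transition matrix: entry (i, j) is the probability of weather i tomorrow
# given weather j today, with states ordered (sunny, cloudy, rainy).
P = np.array([[0.4, 0.3, 0.1],
              [0.4, 0.3, 0.6],
              [0.2, 0.4, 0.3]])

# Row for "tomorrow is sunny" dotted with the column for "today is sunny"
# gives the probability of sun the day after tomorrow.
p_sunny = P[0, :] @ P[:, 0]   # 0.4*0.4 + 0.3*0.4 + 0.1*0.2 = 0.30
print(round(p_sunny, 2))
```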
Ponder This 4.1.1.3.
If today is cloudy, what is the probability that a week from today it is sunny? cloudy? rainy?
Think about this for at most two minutes, and then look at the answer.
We will not answer this question until a little later in this unit. We insert it to make the point that things can get messy! (Hopefully you didn't waste too much time.)
When things get messy, it helps to introduce some notation.
Let \(\chi_s^{(k)} \) denote the probability that it will be sunny \(k\) days from now (on day \(k \)).
Let \(\chi_c^{(k)} \) denote the probability that it will be cloudy \(k \) days from now.
Let \(\chi_r^{(k)} \) denote the probability that it will be rainy \(k \) days from now.
The probabilities for what the weather may be on day \(k \) and the table that summarizes the transition probabilities are often represented as a (state) vector, \(x^{(k)} \text{,}\) and transition matrix, \(P \text{,}\) respectively:
\begin{equation*}
x^{(k)} = \left( \begin{array}{c} \chi_s^{(k)} \\ \chi_c^{(k)} \\ \chi_r^{(k)} \end{array} \right)
\quad \mbox{and} \quad
P = \left( \begin{array}{c c c} 0.4 & 0.3 & 0.1 \\ 0.4 & 0.3 & 0.6 \\ 0.2 & 0.4 & 0.3 \end{array} \right) \text{.}
\end{equation*}
Repeating this process (preferably using Python rather than by hand), we can find the probabilities for the weather for the next seven days, under the assumption that today is cloudy, by completing the calculations
\begin{equation*}
x^{(k+1)} = P x^{(k)}, \quad k = 0, 1, \ldots, 6,
\end{equation*}
in the obvious way. Notice that going two days forward applies \(P \) twice:
\begin{equation*}
x^{(k+2)} = P ( P x^{(k)} ) = Q x^{(k)},
\end{equation*}
where \(Q \) is the transition matrix that tells us how the weather today predicts the weather the day after tomorrow. (Well, actually, we don't yet know that applying a matrix to a vector twice is a linear transformation... We'll learn that later this week.)
Now, just like \(P \) is simply the matrix of values from the original table that showed how the weather tomorrow is predicted from today's weather, \(Q \) is the matrix of values for the above table.
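The seven-day computation can be sketched in Python (NumPy); the matrix entries come from the table earlier in this section, and today is assumed to be cloudy:

```python
import numpy as np

# Transition matrix from the table: entry (i, j) is the probability of
# weather i tomorrow given weather j today, ordered (sunny, cloudy, rainy).
P = np.array([[0.4, 0.3, 0.1],
              [0.4, 0.3, 0.6],
              [0.2, 0.4, 0.3]])

# Today is cloudy: all probability mass on the "cloudy" state.
x = np.array([0.0, 1.0, 0.0])

# One application of P per day: x^{(k+1)} = P x^{(k)}.
for k in range(1, 8):
    x = P @ x
    print(f"day {k}: sunny={x[0]:.4f} cloudy={x[1]:.4f} rainy={x[2]:.4f}")
```

Running this answers the Ponder This question above: the day-7 vector holds the probabilities for the weather a week from a cloudy day.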
Homework 4.1.1.5.
Given
\begin{equation*}
\begin{array}{c c | c c c}
& & \multicolumn{3}{c}{\mbox{Today}} \\
& & \mbox{sunny} & \mbox{cloudy} & \mbox{rainy} \\ \hline
& \mbox{sunny} & 0.4 & 0.3 & 0.1 \\
\mbox{Tomorrow} & \mbox{cloudy} & 0.4 & 0.3 & 0.6 \\
& \mbox{rainy} & 0.2 & 0.4 & 0.3
\end{array}
\end{equation*}
fill in the following table, which predicts the weather the day after tomorrow, given the weather today:
\begin{equation*}
\begin{array}{c c | c c c}
& & \multicolumn{3}{c}{\mbox{Today}} \\
& & \mbox{sunny} & \mbox{cloudy} & \mbox{rainy} \\ \hline
& \mbox{sunny} & & & \\
\mbox{Day after tomorrow} & \mbox{cloudy} & & & \\
& \mbox{rainy} & & &
\end{array}
\end{equation*}
Now here is the hard part: Do so without using your knowledge about how to perform a matrix-matrix multiplication, since you won't learn about that until later this week...
By now surely you have noticed that the \(j\)th column of a matrix \(A\text{,}\) \(a_j \text{,}\) equals \(A e_j \text{.}\) So, the \(j \)th column of \(Q\) equals \(Q e_j \text{.}\) Now, using \(e_0 \) as an example,