Following a discussion using matrix algebra to show computation in a Multivariate Analysis of Variance, a doctoral student asked me,
“Professor, when will I ever use this? Why do I need to know this?”
He had a valid point. I’m always asking myself why I’m teaching something. Is it because it interests me personally, because it is in the textbook, or because students really need to know it?
Let’s look at some things about matrix algebra we always teach students in statistics.
What conformable means and why it might matter
Two matrices are conformable if they can be multiplied together, which requires that the number of columns in the first matrix equal the number of rows in the second. To multiply, you take a row of the first matrix and a column of the second, multiply them element by element, and sum the products; that sum is one element of the result. You repeat this for every row of the first matrix and every column of the second.
So: you can multiply a 3 x 2 matrix by a 2 x 3 matrix, but you cannot multiply a 3 x 2 matrix by another 3 x 2 matrix, because two columns don’t match three rows.
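A minimal NumPy sketch of the conformability rule (the matrices here are just placeholder values for illustration):

```python
import numpy as np

# A 3 x 2 matrix and a 2 x 3 matrix are conformable:
# the 2 columns of A match the 2 rows of B
A = np.arange(6).reshape(3, 2)
B = np.arange(6).reshape(2, 3)
print((A @ B).shape)  # (3, 3)

# Two 3 x 2 matrices are NOT conformable: 2 columns != 3 rows
try:
    A @ A
    print("conformable")
except ValueError:
    print("not conformable")
```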
Multiplying a matrix of dimension a x b by a matrix of dimension b x d (the inner dimensions must match) gives you a resulting matrix with a rows and d columns, that is, of dimension a x d.
This can give you results that sometimes seem counter-intuitive, like that the product of a 3 x 1 matrix and a 1 x 3 matrix is a 3 x 3 matrix, while the product taken in the other order, a 1 x 3 times a 3 x 1, is just a 1 x 1 matrix.
It may seem weird that the result of matrix multiplication can either be a larger matrix than both of the matrices you multiplied, or smaller than both of them, but there it is.
If both matrices are square, that is, of dimension n x n, then the resulting product will also be an n x n matrix.
And, of course, any matrix can be multiplied by its transpose because the transpose of an m x n matrix will always be n x m .
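The shape rules above are easy to check directly; here is a short NumPy sketch using all-ones matrices as placeholders:

```python
import numpy as np

col = np.ones((3, 1))  # 3 x 1
row = np.ones((1, 3))  # 1 x 3

# The product can be larger than either factor...
print((col @ row).shape)  # (3, 3)
# ...or smaller than either factor
print((row @ col).shape)  # (1, 1)

# A matrix times its own transpose is always conformable:
# (4 x 2) times (2 x 4), and (2 x 4) times (4 x 2)
M = np.ones((4, 2))
print((M @ M.T).shape)  # (4, 4)
print((M.T @ M).shape)  # (2, 2)
```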
If a square matrix is of full rank, it means that no row is a linear combination of the other rows. If you DO have linear dependence, it means you have redundant measures. Now, I could go on to prove this mathematically, and all of it is very interesting to me.
I question, though, whether you really need to know anything about matrix algebra to understand that redundant measures are a bad thing.
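For what it’s worth, the redundancy shows up immediately in software without any proofs. A small sketch with made-up data, where the third measure is just the sum of the first two:

```python
import numpy as np

# Three columns of "measures"; the third column is the sum of the
# first two, so one measure is redundant (linearly dependent)
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 1.0, 5.0],
              [2.0, 2.0, 4.0]])

# Rank is 2, not 3 -- the matrix is not of full rank
print(np.linalg.matrix_rank(X))
```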
Do you need matrix algebra to explain that we are going to apply coefficients (do you even need to refer to it as a vector?) to the values of each variable for each record and get a predicted score such that
predicted score = b0 + b1X1 + b2X2 + … + bnXn
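In code, that whole equation collapses into one matrix product, whether or not you ever call anything a vector. A sketch with hypothetical coefficients:

```python
import numpy as np

# Hypothetical coefficients: intercept b0 = 1.0, slopes b1 = 0.5, b2 = -2.0
b = np.array([1.0, 0.5, -2.0])

# Two records, one per row; the leading column of 1s carries the intercept
X = np.array([[1.0, 10.0, 3.0],
              [1.0,  8.0, 5.0]])

# Same as computing b0 + b1*X1 + b2*X2 for each record
predicted = X @ b
print(predicted)  # [ 0. -5.]
```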
When I was in graduate school, calculators that did statistical analyses, even something as simple as regression, cost a few hundred dollars, which was the equivalent of three months of my car payment. Computer time was charged to your department by the hour. So … for my first few courses, I did all of my homework problems using pencil and paper, transposing and inverting matrices – and it was a huge pain in the ass.
Then, I got a job as a research assistant and one of the perks was hours of computer time. I thought I’d died and gone to heaven. It took me less than half an hour to get all of my homework done using SAS (which ran on a mini-computer and spit out printouts that I had to walk across campus to pick up).
My students are learning in a completely different environment. So … do they need to learn the same things in the same way I did? This is a question I ponder a lot.