I was recently introduced to the geometric interpretation of correlation and linear regression.
Originally due to the famous statistician R. A. Fisher, the idea is that if each variable's n
observations are viewed as a vector in n-dimensional space, then the correlation between two
variables is the cosine of the angle between their mean-centred vectors.
This can be demonstrated in Maple as follows:
First, we represent each variable as a vector and transform it so that it is centred at its
mean and has a length equal to the standard deviation of the original vector.
Here is a simple procedure which does that:
trf := proc(a::Vector)
# transform a vector into a new vector centred
# about its mean, with length equal to the
# original vector's standard deviation
uses LinearAlgebra, Statistics;
local b;
b := a -~ Mean(a);
return StandardDeviation(a) / Norm(b, 2) * b;
end proc:
Then the correlation is extremely simple to find: it is the dot product of the two transformed
vectors divided by the product of their lengths.
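Maple is not needed to see this work. Here is the same idea sketched in Python with NumPy; the data vectors are made up purely for illustration:

```python
import numpy as np

def trf(a):
    # centre the vector about its mean, then rescale so its
    # length equals the sample standard deviation of the original
    b = a - a.mean()
    return b / np.linalg.norm(b) * a.std(ddof=1)

x = np.array([1.0, 2.0, 4.0, 5.0, 7.0])
y = np.array([0.5, 1.1, 1.4, 2.0, 2.3])

xt, yt = trf(x), trf(y)
# correlation = cosine of the angle between the centred vectors
r = np.dot(xt, yt) / (np.linalg.norm(xt) * np.linalg.norm(yt))
print(r)
```

The value printed agrees with NumPy's own Pearson correlation, `np.corrcoef(x, y)[0, 1]`, since rescaling a vector does not change the angle it makes with another.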
This can then be readily extended to ordinary least squares regression, because the slope
coefficient is simply the scalar coefficient of the orthogonal projection of the centred
response vector onto the centred explanatory vector:
LinearFit([1,t], x, y, t);
0.2675455672 + 0.1759074773 t
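The projection view can be checked against an ordinary least-squares fit. A NumPy sketch, again with made-up data (the slope is the dot product of the centred vectors divided by the squared length of the centred predictor):

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 5.0, 7.0])
y = np.array([0.5, 1.1, 1.4, 2.0, 2.3])

xc = x - x.mean()          # centred predictor
yc = y - y.mean()          # centred response
# projection coefficient of yc onto xc = regression slope
slope = np.dot(xc, yc) / np.dot(xc, xc)
intercept = y.mean() - slope * x.mean()

# compare with a standard least-squares fit
b, a = np.polyfit(x, y, 1)  # returns slope, then intercept
print(slope, intercept)
```

The two coefficients match `np.polyfit` exactly, confirming that the projection is the least-squares solution.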