A very important factorisation of a square, diagonalisable matrix $A$ is:
$$ \begin{equation} A = X\Lambda X^{-1} \end{equation} $$

where $X$ contains the eigenvectors of $A$ and $\Lambda$ is a diagonal matrix containing the eigenvalues. This can be seen from:
$$ \begin{equation} AX = X\Lambda \end{equation} $$

since when $A$ multiplies any eigenvector $x_1$ (any column of $X$) we get the original vector scaled by the corresponding eigenvalue $\lambda_1$:

$$ \begin{equation} Ax_1 = x_1\lambda_1 \end{equation} $$

Therefore, 'move' the $X$ to the right-hand side like so:
$$ \begin{equation} AX = X\Lambda\\ AXX^{-1} = X\Lambda X^{-1}\\ A = X\Lambda X^{-1} \end{equation} $$

This shows why $A$ must be square and diagonalisable: the eigenvectors must be linearly independent for $X^{-1}$ to exist.
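For a concrete illustration (a small example of my own, not taken from the original code below), take

$$ \begin{equation} A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}, \qquad X = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}, \qquad \Lambda = \begin{bmatrix} 3 & 0 \\ 0 & 1 \end{bmatrix} \end{equation} $$

Here $Ax_1 = 3x_1$ and $Ax_2 = x_2$, and since $A$ is symmetric $X^{-1} = X^{T}$, so multiplying out $X\Lambda X^{T}$ recovers $A$ exactly.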
We compute the eigenvalues and eigenvectors of $A$, and store the eigenvalues on the diagonal of a matrix 'eigval_mat' (the matrix $\Lambda$ above). Next, we need to compute the inverse of the eigenvector matrix. If the matrix $A$ is symmetric, then we know that the eigenvectors will be orthogonal, so once they are normalised to unit length the inverse is simply the transpose. If not, then we have to compute the inverse from scratch.
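In Python with NumPy, these steps might look something like the following (a minimal sketch; the name 'eigdiag' and its exact structure are assumptions based on the description, not the original implementation):

```python
import numpy as np

def eigdiag(A):
    """Eigendecomposition helper: return X, Lambda (eigval_mat) and the inverse of X."""
    eigvals, eigvecs = np.linalg.eig(A)        # columns of eigvecs are the eigenvectors x_i
    eigval_mat = np.diag(eigvals)              # Lambda: eigenvalues on the diagonal
    if np.allclose(A, A.T):
        # Symmetric matrix: orthonormal eigenvectors, so the inverse is just the transpose
        eigvecs_inv = eigvecs.T
    else:
        # General case: compute the inverse directly
        eigvecs_inv = np.linalg.inv(eigvecs)
    return eigvecs, eigval_mat, eigvecs_inv
```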
We create a matrix, call the 'eigdiag' method and print the results. We also 'reconstruct' the matrix $A$ from $X\Lambda X^{-1}$ (note there are some rounding errors):
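A sketch of that step, using the 'eigdiag' helper above and an illustrative matrix (the actual matrix and printed output from the original post are not reproduced here):

```python
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

X, eigval_mat, X_inv = eigdiag(A)
print("Eigenvalue matrix (Lambda):\n", eigval_mat)
print("Eigenvector matrix (X):\n", X)

# Reconstruct A from X Lambda X^{-1}; matches A up to rounding error
A_reconstructed = X @ eigval_mat @ X_inv
print("Reconstructed A:\n", A_reconstructed)
```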
Outputs: