We know linear algebra has a vast variety of applications: animation, data analysis, networking and so on. And we are often bound to perform operations like finding the inverse of some huge matrix, or, for that matter, finding a really big power of a simple 3×3 matrix. It can become cumbersome to pull off such a feat manually… (But… why do I need to know that calculation… I have a computer!!!)

Imagine yourself in a world devoid of computers or any calculating machine, or maybe your computer/calculator is not powerful enough to handle that kind of computation. But you HAVE to find a solution… because, probably, your life depends on it (just kidding here :)). Anyway, for some reason, you desperately need to pull off that tough calculation!! Now what???

Here come the characteristic polynomials to our rescue! Oh… my superhero… my messiah… the characteristic polynomials!!! (Give it a big round of applause, please.)

Yes… they let you do exactly those herculean feats without bursting your grey cells. (Thank God!!! We are saved.)

Here is what this special saviour looks like:

Δ(t) = det(tI − A)

where Δ(t) is the characteristic polynomial, t is an indeterminate, I is the n×n identity matrix, and A is the n×n square matrix for which we have to do one of those herculean feats.
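If you do happen to have a computer handy, you can check a characteristic polynomial numerically. Here is a minimal sketch using NumPy (the 2×2 matrix below is my own example, not from the post): expanding det(tI − A) for A = [[2, 1], [1, 2]] by hand gives (t − 2)(t − 2) − 1 = t² − 4t + 3.

```python
import numpy as np

# My own 2x2 example: det(tI - A) = (t - 2)(t - 2) - 1 = t^2 - 4t + 3.
A = np.array([[2, 1],
              [1, 2]])

# np.poly returns the coefficients of the characteristic polynomial,
# highest power first: [1, -4, 3] stands for t^2 - 4t + 3.
coeffs = np.poly(A)
print(coeffs)  # [ 1. -4.  3.]
```

Handy for double-checking the hand computation, even though the whole point of this post is surviving without the machine!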

Friends, this introduction was necessary because we will be studying the Cayley-Hamilton theorem and eigenvectors, which will help us with the diagonalisation of matrices. And we know diagonal matrices have some nice properties that are easy to work with (because we love taking shortcuts and getting things done really fast).
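To give a small taste of the shortcut the Cayley-Hamilton theorem buys us: for a 2×2 matrix it says A² − tr(A)·A + det(A)·I = 0, so every power of A is just a linear combination c₁·A + c₀·I. Here is a sketch of that trick (the matrix and function name are my own illustration, not from the post):

```python
import numpy as np

# Cayley-Hamilton for 2x2: A^2 = tr(A)*A - det(A)*I, so any power A^n
# can be written as c1*A + c0*I. We update the pair (c1, c0) instead of
# repeatedly multiplying matrices.
def power_via_cayley_hamilton(A, n):
    tr, det = np.trace(A), np.linalg.det(A)
    c1, c0 = 1.0, 0.0            # A^1 = 1*A + 0*I
    for _ in range(n - 1):
        # (c1*A + c0*I) @ A = c1*A^2 + c0*A, then reduce A^2 via the theorem.
        c1, c0 = c1 * tr + c0, -c1 * det
    return c1 * A + c0 * np.eye(2)

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(power_via_cayley_hamilton(A, 5))  # matches np.linalg.matrix_power(A, 5)
```

Notice that the loop only juggles two scalars, which is exactly the kind of pencil-and-paper-friendly shortcut we are after.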

In the next post (yesss… in the next post, dear friend… just to keep this post short and simple, before your mind wanders off), we will do some practice problems on finding the characteristic polynomial. Watch this space for more updates. Till then, explore a bit more about it. Have a nice time!!

Folks, how is this post? Comment here to pour your thoughts!

If you liked this post, then please do spread a word. (Please read copyrights before sharing.)

Hello everyone, those of you who have ever had a course in linear algebra might have heard of this alien-like name. (Yes… I heard it in my linear algebra class and imagined it to be an alien from some undiscovered planet EFX23145. Eewwwwww!!!!)

Eigensomething!!!! What is this gross thing??

Or some of you might have fantasized that it is the name of some beautiful lass!! (Ooooooooo… love you, Ms. Eigen❤)

And the pious ones amongst you might be remembering it as PAP..(Aiyyyyooo..auvvvvvaaa.. What has paap n punya got to do in algebra?)

But sadly, dear friend, it's none of that. Yes… it's none of the above. So don't get knocked off… because here I am going to reveal what the hell this eigensomething is!!

And those of you who managed to sneak a little bit of information about this eigensomething into your brain might remember that it had something to do with vectors… vector spaces, linear transformations and some diagonalisation.

First of all, recall that (linear) transformations are actions (changes) on a space, which can do numerous things to the space under consideration… like…

Or maybe…

But they keep the grid lines of the coordinate plane parallel and evenly spaced, and they keep the origin fixed, i.e. the origin doesn't move about under the transformation. (Now, that is what a linear transformation means.)
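Those picture-level properties come from two simple algebraic rules: T(u + v) = T(u) + T(v) and T(c·u) = c·T(u). A linear transformation is just "multiply by a matrix", and we can check the rules numerically. The shear matrix below is my own example, chosen for illustration:

```python
import numpy as np

# A linear transformation is multiplication by a matrix; here, a shear.
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])

u = np.array([1.0, 2.0])
v = np.array([0.0, 2.0])

print(T @ np.zeros(2))              # the origin stays put: [0. 0.]
print(T @ (u + v), T @ u + T @ v)   # additivity: both give the same vector
print(T @ (3 * u), 3 * (T @ u))     # scaling: again the same vector
```

Keeping the origin fixed and grid lines parallel is exactly what these two rules look like when you draw them.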

But in all this process of transformation, imagine what happens to the vectors sitting in this vector space!!

See what happened to the vectors [1, 2]′ and [0, 2]′ (in purple) displayed on the black axes… Yes, as the axes moved to their new position, i.e. the red axes, the vectors too changed their position (now they are the blue vectors). They were knocked off from their original place!!!

But this situation is not true for all vectors. Some do hold on to their own line… 😎 (Really!!!🤔)

Yes… some vectors stay on the line they started on, at most getting stretched or squished along it. They are called 'eigenvectors'.

(I know you are curious to see a transformation having eigenvectors. Beware… it's a rough sketch and the scale is not up to the mark.)

Look, the red vector in the first quadrant moved to take on the position of the black vector, whereas the red vector on the x-axis has retained its position.

So there, my friends, is an eigenvector… sticking to its line!!!
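The picture above can be checked with a quick computation. A shear matrix like [[1, 1], [0, 1]] (my own choice, picked to match the sketch described in the post) leaves any vector on the x-axis untouched, while knocking a generic vector off its original direction:

```python
import numpy as np

# A shear: most vectors get knocked off their line,
# but vectors along the x-axis stay exactly where they were.
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([1.0, 0.0])   # sits on the x-axis
w = np.array([1.0, 2.0])   # a generic vector

print(T @ v)   # [1. 0.]  -- unchanged: v is an eigenvector (eigenvalue 1)
print(T @ w)   # [3. 2.]  -- a different direction: w is not an eigenvector
```

Here the eigenvector happens to be left completely fixed because its eigenvalue is 1; in general an eigenvector may be stretched or squished, but it never leaves its line.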

That’s it, friends!! This was just a crude introduction to eigenvectors. See you soon with more posts about this beautiful concept. Till then, explore a bit about this idea.

Image/gif sources: Giphy.com
