Eigenvalue

{{subpages}}
In [[linear algebra]] an '''eigenvalue''' of a (square) [[matrix]] <math>A</math> is a number <math>\lambda</math> that satisfies the eigenvalue equation,
:<math>\text{det}(A-\lambda I)=0\ ,</math>
where det means the [[determinant]], <math>I</math> is the [[identity matrix]] of the same [[dimension]] as <math>A</math>,
and in general <math>\lambda</math> can be [[complex number|complex]].
The left-hand side of this equation, viewed as a function of <math>\lambda</math>, is the [[characteristic polynomial]] of <math>A</math>; the equation itself arises from the [[eigenvalue problem]], which is to find the eigenvalues and associated [[eigenvectors]] of <math>A</math>.
That is, to find a number <math>\lambda</math> and a vector <math>\scriptstyle\vec{v}</math> that together satisfy
:<math>A\vec{v}=\lambda\vec{v}\ .</math>
What this equation says is that even though <math>A</math> is a matrix, its action on <math>\scriptstyle\vec{v}</math> is the same as multiplying the vector by the number <math>\lambda</math>.
This means that the vector <math>\scriptstyle\vec{v}</math> and the vector <math>\scriptstyle A\vec{v}</math> are [[parallel]] (or [[anti-parallel]] if <math>\lambda</math> is negative).
Note that generally this will ''not'' be true.  This is most easily seen with a quick example.  Suppose
:<math>A=\begin{pmatrix}a_{11} & a_{12} \\ a_{21} & a_{22}\end{pmatrix}</math> and <math>\vec{v}=\begin{pmatrix} v_1 \\ v_2 \end{pmatrix}\ .</math>
Then their matrix product is
:<math>A\vec{v}=\begin{pmatrix} a_{11}v_1+a_{12}v_2 \\ a_{21}v_1+a_{22}v_2 \end{pmatrix}\ ,</math>
whereas the [[scalar]] product is
:<math>\lambda\vec{v}=\begin{pmatrix} \lambda v_1 \\ \lambda v_2 \end{pmatrix}\ .</math>
Obviously then <math>\scriptstyle A\vec{v}\neq \lambda\vec{v}</math> unless
<math>\lambda v_1 = a_{11}v_1+a_{12}v_2</math> and [[simultaneous equations|simultaneously]] <math>\lambda v_2 = a_{21}v_1+a_{22}v_2</math>.
For a given <math>\lambda</math>, it is easy to pick numbers for the entries of <math>A</math> and <math>\scriptstyle\vec{v}</math> such that this is not satisfied.
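The parallelism condition is easy to test numerically. As an illustrative sketch (the function name and the sample matrix <math>\begin{pmatrix}2&1\\1&2\end{pmatrix}</math> are choices made here, not part of the article), the following Python function checks whether a 2&times;2 matrix maps a vector to a scalar multiple of itself, using the fact that two plane vectors are parallel exactly when their 2D cross product vanishes:

```python
def is_eigenvector(a11, a12, a21, a22, v1, v2, tol=1e-12):
    """Check whether (v1, v2) is mapped by the matrix ((a11, a12), (a21, a22))
    to a scalar multiple of itself, i.e. whether A*v and v are parallel."""
    w1 = a11 * v1 + a12 * v2   # first component of A*v
    w2 = a21 * v1 + a22 * v2   # second component of A*v
    # 2D cross product w x v is zero exactly when w and v are parallel.
    return abs(w1 * v2 - w2 * v1) < tol

# (1, 1) is mapped by ((2, 1), (1, 2)) to (3, 3), a multiple of itself ...
print(is_eigenvector(2, 1, 1, 2, 1, 1))   # True
# ... but (1, 0) is mapped to (2, 1), which is not parallel to (1, 0).
print(is_eigenvector(2, 1, 1, 2, 1, 0))   # False
```

The zero vector would pass this test for any matrix, which is one reason the eigenvalue problem below explicitly excludes it.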


==The eigenvalue equation==
So where did the eigenvalue equation <math>\text{det}(A-\lambda I)=0</math> come from?  Well, we assume that we know the matrix <math>A</math> and want to find a number <math>\lambda</math> and a non-zero vector <math>\scriptstyle\vec{v}</math> so that <math>\scriptstyle A\vec{v}=\lambda\vec{v}</math>.  (Note that if <math>\scriptstyle\vec{v}=\vec{0}</math> then the equation is always true, and therefore uninteresting.)  So now we have
<math>\scriptstyle A\vec{v}-\lambda\vec{v}=\vec{0}</math>.  It doesn't make sense to subtract a number from a matrix, but we can factor out the vector if we first multiply the right-hand term by the identity, giving us
:<math>(A-\lambda I)\vec{v}=\vec{0}\ .</math>
Now we have to remember the fact that <math>A-\lambda I</math> is a square matrix, and so it might be [[matrix inverse|invertible]].
If it were invertible then we could simply multiply on the left by its inverse to get
:<math>\vec{v}=(A-\lambda I)^{-1}\vec{0}=\vec{0}</math>
but we have already said that <math>\scriptstyle\vec{v}</math> can't be the zero vector!  The only way around this is if <math>A-\lambda I</math> is in fact non-invertible.  It can be shown that a square matrix is non-invertible if and only if its [[determinant]] is zero.  That is, we require
:<math>\text{det}(A-\lambda I)=0\ ,</math>
which is the eigenvalue equation stated above.
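For a 2&times;2 matrix the eigenvalue equation is just a quadratic in <math>\lambda</math>, namely <math>\lambda^2-(a_{11}+a_{22})\lambda+(a_{11}a_{22}-a_{12}a_{21})=0</math>, so it can be solved directly with the quadratic formula. A minimal Python sketch (the function name and the sample matrix are illustrative choices, not from the article):

```python
import cmath

def eigenvalues_2x2(a11, a12, a21, a22):
    """Roots of det(A - lambda*I) = lambda^2 - trace*lambda + det
    for the 2x2 matrix A = ((a11, a12), (a21, a22))."""
    trace = a11 + a22
    det = a11 * a22 - a12 * a21
    # cmath.sqrt handles a negative discriminant: eigenvalues
    # of a real matrix can be complex, as noted in the introduction.
    disc = cmath.sqrt(trace * trace - 4 * det)
    return (trace + disc) / 2, (trace - disc) / 2

# The matrix ((2, 1), (1, 2)) has trace 4 and determinant 3,
# so its eigenvalues are the roots of lambda^2 - 4*lambda + 3 = 0.
print(eigenvalues_2x2(2, 1, 1, 2))   # eigenvalues 3 and 1 (as complex numbers)
```

For larger matrices the characteristic polynomial has higher degree and is no longer solved in closed form this way; numerical libraries use iterative methods instead.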
==A more technical approach==
So far we have looked at eigenvalues in terms of square matrices.  As usual in [[mathematics]] though we like things to be as general as possible, since then anything we prove will be true in as many different applications as possible.  So instead we can define eigenvalues in the following way.


Definition: Let <math>V</math> be a [[vector space]] over a [[field]] <math>F</math>, and let <math>\scriptstyle A:V\to V</math> be a [[linear map]].  An '''eigenvalue''' associated with <math>A</math> is an element <math>\scriptstyle\lambda\in F</math> for which there exists a non-zero vector <math>\scriptstyle\vec{v}\in V</math> such that
:<math>A(\vec{v})=\lambda\vec{v}\ .</math>
Then <math>\scriptstyle\vec{v}</math> is called an '''eigenvector''' of <math>A</math> associated with <math>\lambda</math>.
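To connect the abstract definition back to matrices: once an eigenvalue <math>\lambda</math> is known, an associated eigenvector is any non-zero solution of <math>(A-\lambda I)\vec{v}=\vec{0}</math>. A hypothetical Python sketch for the 2&times;2 case; the closed-form choice <math>(a_{12},\,\lambda-a_{11})</math> for the eigenvector, and the fallbacks, are conventions adopted here for illustration:

```python
import cmath

def eigenpair_2x2(a11, a12, a21, a22):
    """Return one (eigenvalue, eigenvector) pair for the 2x2 matrix
    ((a11, a12), (a21, a22)), via the characteristic polynomial."""
    trace = a11 + a22
    det = a11 * a22 - a12 * a21
    lam = (trace + cmath.sqrt(trace * trace - 4 * det)) / 2
    # (A - lam*I)v = 0 is satisfied by v = (a12, lam - a11); this follows
    # from the first row, and the second row holds because lam is a root
    # of the characteristic polynomial.
    v = (a12, lam - a11)
    if v == (0, 0):
        v = (lam - a22, a21)   # use the second row instead
    if v == (0, 0):
        v = (1, 0)             # scalar matrix: every non-zero vector works
    return lam, v

lam, (v1, v2) = eigenpair_2x2(2, 1, 1, 2)
print(lam, (v1, v2))
```

A quick check that the pair satisfies <math>A\vec{v}=\lambda\vec{v}</math> component-wise confirms the construction: both components of <math>A\vec{v}</math> equal <math>\lambda</math> times the corresponding component of <math>\vec{v}</math>.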

Latest revision as of 16:00, 10 August 2024
