The concept of determinants plays a crucial role in linear algebra, particularly in understanding the properties of matrices and their associated linear transformations. One of the fundamental connections between determinants and eigenvalues is that the determinant of a square matrix equals the product of its eigenvalues, counted with algebraic multiplicity. This theorem has significant implications in various applications, including solving systems of linear equations, analyzing stability in dynamical systems, and understanding the geometry of linear transformations. In this article, we will work through the proof of this theorem, highlighting its key steps and providing insights into its applications.
The Determinant and Eigenvalues
Before diving into the proof, let's define the key terms involved.
Determinant: The determinant of a square matrix A, denoted as det(A) or |A|, is a scalar value that captures certain properties of the matrix, such as its invertibility. It is computed from the entries of the matrix, for example by cofactor (Laplace) expansion along a row or column.
Eigenvalues: An eigenvalue of a square matrix A is a scalar λ for which there exists a non-zero vector v, called an eigenvector, such that:
Av = λv
This equation signifies that when the matrix A acts on the eigenvector v, it simply scales v by a factor of λ.
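As a quick numerical illustration of this definition (a minimal sketch using NumPy; the matrix below is chosen arbitrarily), we can check that Av = λv holds for an eigenpair returned by numpy.linalg.eig:

```python
import numpy as np

# An arbitrary symmetric matrix, chosen purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

lam = eigenvalues[0]    # first eigenvalue
v = eigenvectors[:, 0]  # matching eigenvector (first column)

# Verify the defining relation Av = λv, up to floating-point error.
print(np.allclose(A @ v, lam * v))  # True
```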
The Theorem: Proving the Connection
Now, let's prove the theorem that the determinant of a matrix is equal to the product of its eigenvalues. We will demonstrate this using the characteristic polynomial.
Characteristic Polynomial: The characteristic polynomial of a square matrix A is a polynomial obtained by subtracting λ from the diagonal elements of A and then calculating the determinant of the resulting matrix. It is given by:
P(λ) = det(A - λI)
where I is the identity matrix of the same size as A.
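NumPy can compute the coefficients of this polynomial directly (a minimal sketch; note that numpy.poly uses the convention det(λI - A), which differs from det(A - λI) by a factor of (-1)<sup>n</sup>):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly(A) returns the coefficients of det(λI - A), highest degree first.
# For a 2×2 matrix this agrees with det(A - λI), since (-1)² = 1.
coeffs = np.poly(A)
print(coeffs)  # [ 1. -4.  3.], i.e. λ² - 4λ + 3
```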
Proof:
- Eigenvalues and Roots of the Characteristic Polynomial: The eigenvalues of A are precisely the roots of its characteristic polynomial. This is because if λ is an eigenvalue of A, then there exists a non-zero eigenvector v such that Av = λv. Rearranging this equation, we get (A - λI)v = 0. Since v is non-zero, this implies that the matrix (A - λI) is singular, meaning its determinant is zero. This condition is equivalent to P(λ) = 0, indicating that λ is a root of the characteristic polynomial.
- Factoring the Characteristic Polynomial: Over the complex numbers, the characteristic polynomial P(λ) factors completely into linear factors whose roots are the eigenvalues of A, counted with algebraic multiplicity. Let λ<sub>1</sub>, λ<sub>2</sub>, ..., λ<sub>n</sub> be the eigenvalues of A. Since det(A - λI) has leading coefficient (-1)<sup>n</sup>, we can write:
P(λ) = (-1)<sup>n</sup>(λ - λ<sub>1</sub>)(λ - λ<sub>2</sub>)...(λ - λ<sub>n</sub>) = (λ<sub>1</sub> - λ)(λ<sub>2</sub> - λ)...(λ<sub>n</sub> - λ)
- Determinant as a Special Case: Setting λ = 0 in the characteristic polynomial, we get:
P(0) = det(A - 0I) = det(A)
On the other hand, substituting λ = 0 into the factored form of the characteristic polynomial gives:
P(0) = (λ<sub>1</sub> - 0)(λ<sub>2</sub> - 0)...(λ<sub>n</sub> - 0) = λ<sub>1</sub>λ<sub>2</sub>...λ<sub>n</sub>
- Conclusion: Equating the two expressions for P(0), we obtain:
det(A) = λ<sub>1</sub>λ<sub>2</sub>...λ<sub>n</sub>
This proves that the determinant of A is equal to the product of its eigenvalues, counted with algebraic multiplicity.
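We can check this identity numerically (a sketch; the random 5×5 matrix is purely illustrative, and its eigenvalues are complex in general):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))  # an arbitrary real 5×5 matrix

det_A = np.linalg.det(A)
eigenvalues = np.linalg.eigvals(A)  # generally complex, with multiplicity

# The product of the eigenvalues matches the determinant (up to
# floating-point error); for a real matrix the imaginary parts
# of the product cancel.
print(np.allclose(det_A, np.prod(eigenvalues)))  # True
```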
Illustrative Example
Let's illustrate this theorem with a simple example. Consider the matrix:
A = [2 1; 1 2]
The characteristic polynomial is:
P(λ) = det(A - λI) = det([2-λ 1; 1 2-λ]) = (2-λ)² - 1 = λ² - 4λ + 3 = (λ - 1)(λ - 3)
The eigenvalues of A are λ<sub>1</sub> = 1 and λ<sub>2</sub> = 3.
Now, let's calculate the determinant of A:
det(A) = (2)(2) - (1)(1) = 3
We can see that the determinant of A is indeed equal to the product of its eigenvalues:
det(A) = λ<sub>1</sub>λ<sub>2</sub> = (1)(3) = 3
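The same computation can be reproduced symbolically (a sketch using SymPy, assuming it is available):

```python
from sympy import Matrix, symbols

lam = symbols('lambda')
A = Matrix([[2, 1],
            [1, 2]])

print(A.charpoly(lam).as_expr())  # lambda**2 - 4*lambda + 3
print(A.eigenvals())              # {1: 1, 3: 1} (eigenvalue: multiplicity)
print(A.det())                    # 3, equal to 1 * 3
```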
Applications
The equality of the determinant and the product of the eigenvalues has applications across a variety of fields:
Linear Algebra:
- Invertibility: A matrix is invertible if and only if its determinant is non-zero. The theorem therefore implies that a matrix is invertible if and only if all of its eigenvalues are non-zero (see the sketch after this list).
- Eigenbasis: If an n×n matrix has n distinct eigenvalues, its eigenvectors form a basis for the vector space. This fact is crucial for diagonalizing matrices and simplifying linear transformations.
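A short check of the invertibility criterion (a sketch; the matrix below is constructed to be singular, with one row a multiple of the other):

```python
import numpy as np

# Row 2 is twice row 1, so the matrix is singular.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.eigvals(A))  # 0 and 5: a zero eigenvalue appears
print(np.linalg.det(A))      # 0.0, the product of the eigenvalues
```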
Dynamical Systems:
- Stability Analysis: In analyzing the stability of a linear system x' = Ax, the eigenvalues of the system matrix play the central role: the system is asymptotically stable if and only if all eigenvalues of A have negative real parts. The determinant supplies a quick necessary check, since det(A) = 0 means some eigenvalue is zero and asymptotic stability is impossible.
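A minimal stability check for a continuous-time linear system x' = Ax (a sketch; the system matrix below is chosen for illustration):

```python
import numpy as np

# A damped second-order system, written in companion form.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)                   # -1 and -2
print(np.all(eigenvalues.real < 0))  # True: asymptotically stable

# Quick necessary condition via the theorem: det(A) is the product of
# the eigenvalues, so det(A) = 0 would mean a zero eigenvalue and
# rule out asymptotic stability.
print(np.linalg.det(A))              # 2.0, nonzero
```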
Geometry:
- Volume Scaling: The absolute value of the determinant of a matrix representing a linear transformation is the factor by which the transformation scales volumes; the sign of the determinant indicates whether orientation is preserved. The theorem shows that this scaling factor equals the absolute value of the product of the eigenvalues of the transformation matrix.
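To see this concretely (a sketch: the triangular matrix below makes the eigenvalues visible on the diagonal, and the unit square maps to a parallelogram of area |det(A)|):

```python
import numpy as np

# Triangular, so the eigenvalues are the diagonal entries 3 and 2.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# The unit square maps to the parallelogram spanned by A's columns;
# its area is |det(A)|, the product of the eigenvalues.
print(abs(np.linalg.det(A)))               # 6.0
print(np.prod(np.linalg.eigvals(A)).real)  # 6.0
```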
Conclusion
The theorem that the determinant of a matrix equals the product of its eigenvalues establishes a fundamental connection between two central concepts in linear algebra. It has significant implications across linear algebra, dynamical systems, and geometry, providing a powerful tool for analyzing the behavior of matrices and their associated linear transformations. Understanding this theorem gives deeper insight into the properties and applications of matrices in many disciplines.