
LOVELY PROFESSIONAL UNIVERSITY

TERM PAPER
MATHS

PROVE THAT THE EIGENVALUES OF A
HERMITIAN MATRIX ARE ALWAYS REAL. GIVE
EXAMPLES

SUBMITTED TO :-
MS. PRIYANKA SINGH
DEPTT. OF MATHS

SUBMITTED BY :-
NAME – SHIVAM SHARMA
ROLL NO. – A59
SECTION – K6004
REGD. NO. – 11013138
ACKNOWLEDGEMENT

Words are not enough to express my gratitude to those who helped me in producing this project. Still,
I would like to add a few words for the people who were a part of this term paper in numerous
ways, people who gave unending support right from the stage the idea was conceived. In
particular, I wish to thank our teacher, MS. PRIYANKA SINGH, without whose support this
project would have been impossible. She has not only given guidance but also
reviewed this project with painstaking attention to detail. I would like to take this
opportunity to thank all the staff members for the unending support they have
provided in many ways. Last but not least, I would like to thank all my classmates for their
overwhelming support throughout the making of this term paper.

SHIVAM SHARMA

TABLE OF CONTENTS
1. ABSTRACT

2. HERMITIAN MATRIX

3. PROPERTIES

4. EXAMPLES

5. HERMITIAN MATRICES HAVE REAL EIGENVALUES

6. REFERENCES

ABSTRACT
What is a Hermitian matrix? ----- Properties of Hermitian matrices ----- Examples of Hermitian
matrices ----- Proof of the statement that Hermitian matrices have real eigenvalues
HERMITIAN MATRIX

In mathematics, a Hermitian matrix (or self-adjoint matrix) is a square
matrix with complex entries that is equal to its own conjugate transpose – that is, the element
in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th
row and i-th column, for all indices i and j:

a_ij = (a_ji)*

If the conjugate transpose of a matrix A is denoted by A^H, then the Hermitian property can be
written concisely as

A = A^H

Hermitian matrices can be understood as the complex extension of real symmetric matrices.

Hermitian matrices are named after Charles Hermite, who demonstrated in 1855 that matrices
of this form share with real symmetric matrices the property of always having real eigenvalues.

PROPERTIES

The entries on the main diagonal (top left to bottom right) of any Hermitian matrix are
necessarily real. A matrix that has only real entries is Hermitian if and only if it is a symmetric
matrix, i.e., if it is symmetric with respect to the main diagonal. A real symmetric matrix
is simply a special case of a Hermitian matrix.

Every Hermitian matrix is normal, so the finite-dimensional spectral theorem applies. It says
that any Hermitian matrix can be diagonalized by a unitary matrix, and that the resulting
diagonal matrix has only real entries. This means that all eigenvalues of a Hermitian matrix
are real and, moreover, that eigenvectors with distinct eigenvalues are orthogonal. It is possible
to find an orthonormal basis of C^n consisting only of eigenvectors.
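As a quick numerical illustration of the spectral theorem (a sketch using NumPy; the 2-by-2 Hermitian matrix below is an arbitrary example, not one from the text):

```python
import numpy as np

# A small Hermitian matrix: real diagonal, conjugate-mirrored off-diagonals.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)  # A = A^H

# np.linalg.eigh is specialized for Hermitian matrices: it returns
# real eigenvalues and a unitary matrix U whose columns are eigenvectors.
w, U = np.linalg.eigh(A)

print(np.isrealobj(w))                              # True: eigenvalues are real
print(np.allclose(U.conj().T @ U, np.eye(2)))       # True: U is unitary
print(np.allclose(U @ np.diag(w) @ U.conj().T, A))  # True: A = U diag(w) U^H
```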

The sum of any two Hermitian matrices is Hermitian, and the inverse of an invertible
Hermitian matrix is Hermitian as well. However, the product of two Hermitian
matrices A and B is Hermitian only if they commute, i.e., if AB = BA. Thus A^n is
Hermitian if A is Hermitian and n is an integer.

The Hermitian n-by-n matrices form a vector space over the real numbers (but not over the
complex numbers). The dimension of this space is n² (one degree of freedom per main-
diagonal element, and two degrees of freedom per element above the main diagonal).
The eigenvectors of a Hermitian matrix are orthogonal, i.e., its eigendecomposition is

A = U Λ U^H, where U U^H = I.

Since right and left inverses are the same, we also have

U^H U = I,

and therefore

A = Σ_i σ_i u_i u_i^H,

where σ_i are the eigenvalues and u_i the eigenvectors (the columns of U).

Additional properties of Hermitian matrices include:

 The sum of a square matrix and its conjugate transpose, C + C^H, is
Hermitian.
 The difference of a square matrix and its conjugate transpose, C − C^H,
is skew-Hermitian (also called antihermitian).
 An arbitrary square matrix C can be written as the sum of a Hermitian
matrix A and a skew-Hermitian matrix B:

C = A + B, with A = (1/2)(C + C^H) and B = (1/2)(C − C^H).

The determinant of a Hermitian matrix is real:

Proof: det(A) = det(A^T), and hence det(A^H) = det(A)*.

Therefore, if A = A^H, then det(A) = det(A)*, so det(A) is real.
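The decomposition and the determinant property are easy to check numerically (a sketch assuming NumPy; the random complex matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
C = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))  # arbitrary square matrix

A = (C + C.conj().T) / 2   # Hermitian part
B = (C - C.conj().T) / 2   # skew-Hermitian part

assert np.allclose(A, A.conj().T)    # A = A^H
assert np.allclose(B, -B.conj().T)   # B = -B^H
assert np.allclose(A + B, C)         # C = A + B

# The determinant of the Hermitian part is real (up to rounding error).
print(abs(np.linalg.det(A).imag) < 1e-10)  # True
```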

EXAMPLES

For example, the 3-by-3 matrix

    [ 2     2+i   4 ]
    [ 2−i   3     i ]
    [ 4     −i    1 ]

is a Hermitian matrix: its main-diagonal entries are real, and each off-diagonal entry is the
complex conjugate of its mirror-image entry.
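Whether a given matrix is Hermitian can be tested directly against the definition (a sketch using NumPy; the matrix is a standard illustrative example, not necessarily the one originally printed here):

```python
import numpy as np

M = np.array([[2, 2 + 1j, 4],
              [2 - 1j, 3, 1j],
              [4, -1j, 1]])

# Hermitian test: M equals its own conjugate transpose ...
assert np.allclose(M, M.conj().T)
# ... which forces the main-diagonal entries to be real.
assert np.allclose(np.diag(M).imag, 0)
```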

HERMITIAN MATRICES HAVE REAL EIGEN VALUES


Given a matrix A of dimension m × k (where m denotes the number of rows and k denotes the
number of columns) and a matrix B of dimension k × n, the matrix product AB is defined as
the m × n matrix with the components

(AB)_ij = Σ_{l=1}^{k} A_il B_lj

for i ranging from 1 to m and for j ranging from 1 to n. Notice that matrix multiplication is
not generally commutative, i.e., the product AB is not generally equal to the product BA.

The transpose A^T of the matrix A is defined as the k × m matrix with the components

(A^T)_ij = A_ji

for i ranging from 1 to k and for j ranging from 1 to m. Notice that transposition is
distributive over addition, i.e., we have (A + B)^T = A^T + B^T.

Combining the preceding definitions, the transpose of the matrix product AB has the
components

((AB)^T)_ij = (AB)_ji = Σ_l A_jl B_li = Σ_l (B^T)_il (A^T)_lj

Hence we've shown that

(AB)^T = B^T A^T.     (1)
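Identity (1) can be spot-checked numerically for rectangular matrices (a sketch assuming NumPy; the shapes mirror the m × k and k × n matrices above):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(2, 4))  # an m x k matrix
B = rng.normal(size=(4, 3))  # a  k x n matrix

# (AB)^T = B^T A^T: transposition reverses the order of the factors.
assert np.allclose((A @ B).T, B.T @ A.T)
```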

We can also define the complex conjugate A* of the matrix A as the m × k matrix with the
components

(A*)_ij = (A_ij)*

Notice that the matrix A can be written as the sum A_R + iA_I, where A_R and A_I are real-valued
matrices. The complex conjugate of A can then be written in the form

A* = A_R − iA_I

We also note that transposition and complex conjugation commute, i.e., we have
(A^T)* = (A*)^T. Hence the composition of these two operations (in either order) gives the
same result, called the Hermitian conjugate (named for the French mathematician Charles
Hermite, 1822-1901) and denoted by A^H. We can express the components of A^H as follows:

(A^H)_ij = (A_ji)*
A Hermitian matrix is defined as a matrix that is equal to its Hermitian conjugate. In other
words, the matrix A is Hermitian if and only if A = A^H. Obviously a Hermitian matrix must
be square, i.e., it must have dimension m × m for some integer m.

The Hermitian conjugate of a general matrix product satisfies an identity similar to (1). To
prove this, we begin by writing the product AB in the form

AB = (A_R B_R − A_I B_I) + i(A_R B_I + A_I B_R)

Thus the Hermitian conjugate can be written as

(AB)^H = (A_R B_R − A_I B_I)^T − i(A_R B_I + A_I B_R)^T

Applying identity (1) to the transposed products, we have

(AB)^H = B_R^T A_R^T − B_I^T A_I^T − i(B_I^T A_R^T + B_R^T A_I^T)

We recognize the right hand side as the product of the Hermitian conjugates of B and A,
i.e.,

(B_R^T − iB_I^T)(A_R^T − iA_I^T) = B^H A^H

Consequently we have the identity

(AB)^H = B^H A^H.     (2)
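Identity (2) admits the same kind of numerical spot-check, now with complex entries (a sketch assuming NumPy; the matrices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(2, 4)) + 1j * rng.normal(size=(2, 4))
B = rng.normal(size=(4, 3)) + 1j * rng.normal(size=(4, 3))

H = lambda M: M.conj().T  # the Hermitian conjugate M^H

# (AB)^H = B^H A^H: the Hermitian conjugate also reverses the factors.
assert np.allclose(H(A @ B), H(B) @ H(A))
```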
We're now in a position to prove an interesting property of Hermitian matrices, namely,


that their eigenvalues are necessarily real. Recall that the scalar λ is an eigenvalue of the
(square) matrix A if and only if there is a non-zero column vector X such that

AX = λX

Taking the Hermitian conjugate of both sides, applying identity (2), and noting that
multiplication by a scalar is commutative, we have

X^H A^H = λ* X^H

Now, if A is Hermitian, we have (by definition) A^H = A, so this becomes

X^H A = λ* X^H

If we multiply both sides of this equation on the right by X, and if we multiply both sides of the
original eigenvalue equation on the left by X^H, we get

X^H A X = λ* X^H X    and    X^H A X = λ X^H X

Since the left hand sides are equal, and since X^H X is a positive real number for non-zero X, we
have λ = λ*, and therefore λ is purely real.
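The argument above can be mirrored numerically: the Rayleigh quotient X^H A X / X^H X equals the eigenvalue, and for a Hermitian A it must come out real (a sketch assuming NumPy; the Hermitian matrix is arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2 + 1j],
              [2 - 1j, -3.0]])  # Hermitian: A = A^H
assert np.allclose(A, A.conj().T)

w, V = np.linalg.eig(A)  # a generic (non-Hermitian-aware) eigensolver
X = V[:, 0]              # one eigenvector

# lambda = X^H A X / X^H X; for Hermitian A this is purely real.
lam = (X.conj() @ A @ X) / (X.conj() @ X)
print(abs(lam.imag) < 1e-10)  # True
```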

Of course, the converse is not true. A matrix with real eigenvalues is not necessarily
Hermitian. This is easily seen by examining the general 2 × 2 matrix

    [ a+iα   b+iβ ]
    [ c+iγ   d+iδ ]

with real parameters a, b, c, d and α, β, γ, δ. The roots of the characteristic polynomial

λ² − [(a+d) + i(α+δ)]λ + [(a+iα)(d+iδ) − (b+iβ)(c+iγ)] = 0

are

λ = ( (a+d) + i(α+δ) ± √( [(a−d) + i(α−δ)]² + 4(b+iβ)(c+iγ) ) ) / 2

Assuming the trace is real (i.e., α + δ = 0), the necessary and sufficient condition for the roots
to be purely real is that both of the following relations are satisfied

(a−d)² − (α−δ)² + 4(bc − βγ) ≥ 0    and    (a−d)(α−δ) + 2(bγ + cβ) = 0

If the matrix is Hermitian we have α = δ = 0, b = c, and β = −γ, in which case the left hand
expression reduces to a sum of squares (necessarily non-negative) and the right hand
expression vanishes. However, it is also possible for these two relations to be satisfied
even if the original matrix is not Hermitian. For example, the matrix

    [ 0   r ]
    [ s   0 ]

is not Hermitian if r ≠ s, but it has the eigenvalues

λ = ±√(rs)

which are purely real provided only that rs ≥ 0.
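The counterexample is easy to reproduce (a sketch assuming NumPy, with sample values r = 2, s = 8 so that r ≠ s and rs > 0):

```python
import numpy as np

r, s = 2.0, 8.0               # r != s, so the matrix is not Hermitian
M = np.array([[0.0, r],
              [s, 0.0]])

w = np.linalg.eigvals(M)
# Characteristic polynomial: lambda^2 - rs = 0, so eigenvalues are +-sqrt(rs) = +-4.
assert np.allclose(sorted(w), [-4.0, 4.0])
```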

Returning to Hermitian matrices, we can also show that they possess another very
interesting property, namely, that their eigenvectors are mutually orthogonal (assuming
distinct eigenvalues) in a sense to be defined below. To prove this, let λ1 and λ2 denote two
distinct eigenvalues of the Hermitian matrix A with the corresponding eigenvectors
X1 and X2. (These subscripts signify vector designations, not component indices.) Then
we have

AX1 = λ1 X1    and    AX2 = λ2 X2

Taking the Hermitian conjugate of both sides of the left hand equation, replacing A^H with
A, noting that λ1* = λ1, and multiplying both sides on the right by X2 gives

X1^H A X2 = λ1 X1^H X2

Now we multiply both sides of the right hand equation on the left by X1^H to give

X1^H A X2 = λ2 X1^H X2

The left hand sides of these last two equations are identical, so subtracting one from the
other gives

(λ1 − λ2) X1^H X2 = 0

Since the eigenvalues are distinct, this implies

X1^H X2 = 0

which shows that the "dot product" of X2 with the complex conjugate of X1 vanishes. In
general this inner product can be applied to arbitrary vectors, and we sometimes use the
bra/ket notation introduced by Paul Dirac

⟨X|Y⟩ = Σ_i x_i* y_i

where, as always, the asterisk superscript signifies the complex conjugate. (The subscripts
denote component indices.) Terms of the form ⟨X|X⟩ are a suitable "squared norm" for the same
reason that the squared norm of an individual complex number z = a + bi is not z², but
rather z*z = a² + b².
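Orthogonality of eigenvectors belonging to distinct eigenvalues can likewise be verified numerically (a sketch assuming NumPy; np.vdot conjugates its first argument, matching the bra/ket inner product above):

```python
import numpy as np

A = np.array([[2.0, 1j],
              [-1j, 2.0]])        # Hermitian; eigenvalues are 1 and 3
w, U = np.linalg.eigh(A)
X1, X2 = U[:, 0], U[:, 1]

assert not np.isclose(w[0], w[1])  # the eigenvalues are distinct

# <X1|X2> = sum_i (X1_i)* (X2_i) vanishes for distinct eigenvalues.
print(abs(np.vdot(X1, X2)) < 1e-12)  # True
```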

Hermitian matrices have found an important application in modern physics, as the


representations of measurement operators in Heisenberg's version of quantum mechanics.
To each observable parameter of a physical system there corresponds a Hermitian matrix
whose eigenvalues are the possible values that can result from a measurement of that
parameter, and whose eigenvectors are the corresponding states of the system following a
measurement. Since a measurement yields precisely one real value and leaves the system
in precisely one of a set of mutually exclusive (i.e., "orthogonal") states, it's natural that
the measurement values should correspond to the eigenvalues of Hermitian matrices, and
the resulting states should be the eigenvectors.

REFERENCES
1. MathPages

2. Wikipedia
