TERM PAPER
MATHS
SUBMITTED TO: MS. PRIYANKA SINGH, DEPTT. OF MATHS
SUBMITTED BY: SHIVAM SHARMA (ROLL NO. A59, SECTION K6004, REGD. NO. 11013138)
ACKNOWLEDGEMENT
Words are not enough to express my gratitude to those who helped me in producing this project. Still, I would like to add a few words for the people who were a part of this term paper in numerous ways, people who gave unending support right from the stage the idea was conceived. In particular, I wish to thank our teacher, MS. PRIYANKA SINGH, without whose support this project would have been impossible. She not only gave guidance but also reviewed this project with painstaking attention to detail. I would also like to take this opportunity to thank all the staff members for the unending support they provided in many ways. Last but not least, I would like to thank all my classmates for their overwhelming support throughout the making of this term paper.
SHIVAM SHARMA
TABLE OF CONTENTS
1. ABSTRACT
2. HERMITIAN MATRIX
3. PROPERTIES
4. EXAMPLES
5. REFERENCES
ABSTRACT
What is a Hermitian matrix? Properties of Hermitian matrices. Examples of Hermitian matrices. Proof that Hermitian matrices have real eigenvalues.
HERMITIAN MATRIX
A Hermitian matrix is a square matrix with complex entries that is equal to its own conjugate transpose. If the conjugate transpose of a matrix A is denoted by A^H, then the Hermitian property can be written concisely as

A = A^H.

Hermitian matrices can be understood as the complex extension of real symmetric matrices. Hermitian matrices are named after Charles Hermite, who demonstrated in 1855 that matrices of this form share with real symmetric matrices the property of always having real eigenvalues.
PROPERTIES
The entries on the main diagonal (top left to bottom right) of any Hermitian matrix are necessarily real. A matrix that has only real entries is Hermitian if and only if it is a symmetric matrix, i.e., if it is symmetric with respect to the main diagonal. A real symmetric matrix is simply a special case of a Hermitian matrix.
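As a quick numerical sanity check (a sketch using NumPy, with an arbitrarily chosen matrix), we can confirm that a Hermitian matrix equals its conjugate transpose and that its diagonal is real:

```python
import numpy as np

# An arbitrarily chosen 2x2 Hermitian matrix: real diagonal entries,
# off-diagonal entries that are complex conjugates of each other.
A = np.array([[2.0, 2 + 1j],
              [2 - 1j, 3.0]])

# A equals its own conjugate transpose (the Hermitian property).
assert np.allclose(A, A.conj().T)

# The main-diagonal entries are necessarily real.
assert np.allclose(np.diag(A).imag, 0.0)
```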
Every Hermitian matrix is normal, so the finite-dimensional spectral theorem applies. It says that any Hermitian matrix can be diagonalized by a unitary matrix, and that the resulting diagonal matrix has only real entries. This means that all eigenvalues of a Hermitian matrix are real, and, moreover, eigenvectors with distinct eigenvalues are orthogonal. It is therefore possible to find an orthonormal basis of C^n consisting only of eigenvectors.
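A minimal NumPy sketch of the spectral theorem in action (matrix chosen arbitrarily): np.linalg.eigh, which is specialized for Hermitian input, returns real eigenvalues and a unitary matrix of orthonormal eigenvectors.

```python
import numpy as np

A = np.array([[1.0, 1j],
              [-1j, 2.0]])          # a Hermitian matrix chosen for illustration

w, U = np.linalg.eigh(A)            # eigh assumes Hermitian input

# The eigenvalues come back as a real-valued array.
assert w.dtype.kind == 'f'

# U is unitary: its columns are an orthonormal basis of eigenvectors.
assert np.allclose(U.conj().T @ U, np.eye(2))

# A is diagonalized by U: A = U diag(w) U^H, with real diagonal.
assert np.allclose(A, U @ np.diag(w) @ U.conj().T)
```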
The sum of any two Hermitian matrices is Hermitian, and the inverse of an invertible Hermitian matrix is Hermitian as well. However, the product of two Hermitian matrices A and B will only be Hermitian if they commute, i.e., if AB = BA. Thus A^n is Hermitian if A is Hermitian and n is an integer (with A invertible if n is negative).
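These closure properties are easy to probe numerically; a sketch with arbitrarily chosen matrices:

```python
import numpy as np

def is_hermitian(M):
    """True when M equals its conjugate transpose (up to rounding)."""
    return np.allclose(M, M.conj().T)

A = np.array([[2.0, 1j], [-1j, 1.0]])          # Hermitian, invertible (det = 1)
B = np.array([[1.0, 2 - 1j], [2 + 1j, 0.0]])   # Hermitian

assert is_hermitian(A + B)                         # sums stay Hermitian
assert is_hermitian(np.linalg.inv(A))              # inverse stays Hermitian
assert is_hermitian(np.linalg.matrix_power(A, 3))  # A commutes with itself

# The product A @ B is Hermitian only when AB = BA; these two do not commute.
assert not is_hermitian(A @ B)
```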
The Hermitian n-by-n matrices form a vector space over the real numbers (but not over the complex numbers). The dimension of this space is n^2 (one degree of freedom per main diagonal element, and two degrees of freedom per element above the main diagonal).
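A brief sketch of the real-vector-space claim (matrices chosen arbitrarily): real linear combinations of Hermitian matrices stay Hermitian, while multiplying by i does not.

```python
import numpy as np

def is_hermitian(M):
    return np.allclose(M, M.conj().T)

A = np.array([[1.0, 1j], [-1j, 2.0]])    # Hermitian
B = np.array([[0.0, 1.0], [1.0, 3.0]])   # Hermitian (real symmetric)

# Closed under real linear combinations: a real vector space.
assert is_hermitian(2.5 * A - 3.0 * B)

# Not closed under complex scalars: i*A has imaginary diagonal entries.
assert not is_hermitian(1j * A)
```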
The eigenvectors of a Hermitian matrix are orthogonal, i.e., its eigendecomposition is A = U Λ U^(-1) where U is unitary, U U^H = I. Since the right- and left-inverse are the same, we also have U^(-1) = U^H, and therefore

A = U Λ U^H = Σ_i σ_i u_i u_i^H,

where σ_i are the eigenvalues and u_i the eigenvectors.
The sum of a square matrix and its conjugate transpose, A + A^H, is Hermitian.

The difference of a square matrix and its conjugate transpose, A - A^H, is skew-Hermitian (also called antihermitian).

An arbitrary square matrix C can be written as the sum of a Hermitian matrix A and a skew-Hermitian matrix B:

C = A + B, with A = (1/2)(C + C^H) and B = (1/2)(C - C^H).

Proof: A^H = (1/2)(C + C^H)^H = (1/2)(C^H + C) = A, and B^H = (1/2)(C - C^H)^H = (1/2)(C^H - C) = -B. Therefore C = A + B is the required decomposition.
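The decomposition just proved can be sketched numerically (arbitrary matrix C chosen for illustration):

```python
import numpy as np

C = np.array([[1 + 2j, 3.0],
              [1j, 4 - 1j]])        # an arbitrary square matrix

A = (C + C.conj().T) / 2            # Hermitian part
B = (C - C.conj().T) / 2            # skew-Hermitian part

assert np.allclose(A, A.conj().T)   # A^H = A
assert np.allclose(B, -B.conj().T)  # B^H = -B
assert np.allclose(C, A + B)        # C is recovered exactly
```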
EXAMPLES
For example,

[ 2      2+i ]
[ 2-i     3  ]

is a Hermitian matrix: its diagonal entries are real, and its off-diagonal entries are complex conjugates of each other.
Consider an m × k matrix A and a k × n matrix B. Their product AB is the m × n matrix with the components

(AB)_mn = Σ_j A_mj B_jn

for m ranging from 1 to m and for n ranging from 1 to n. Notice that matrix multiplication is not generally commutative, i.e., the product AB is not generally equal to the product BA.

The transpose A^T of the matrix A is defined as the k × m matrix with the components

(A^T)_km = A_mk

for m ranging from 1 to m and for k ranging from 1 to k. Notice that transposition is distributive, i.e., we have (A + B)^T = A^T + B^T.

Combining the preceding definitions, the transpose of the matrix product AB has the components

((AB)^T)_nm = Σ_j A_mj B_jn = Σ_j (B^T)_nj (A^T)_jm,

and therefore (AB)^T = B^T A^T.    (1)
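The transpose rules above, including the order reversal (AB)^T = B^T A^T, can be spot-checked with random matrices (a NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))     # an m x k matrix
A2 = rng.standard_normal((3, 4))    # same shape, for the sum rule
B = rng.standard_normal((4, 2))     # a k x n matrix

# Transposition distributes over sums ...
assert np.allclose((A + A2).T, A.T + A2.T)

# ... and reverses the order of products.
assert np.allclose((A @ B).T, B.T @ A.T)
```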
We can also define the complex conjugate A* of the matrix A as the m × k matrix with the components

(A*)_mk = (A_mk)*.

Notice that the matrix A can be written as the sum A_R + i A_I where A_R and A_I are real-valued matrices. The complex conjugate of A can then be written in the form A* = A_R - i A_I. We also note that transposition and complex conjugation are commutative, i.e., we have (A^T)* = (A*)^T. Hence the composition of these two operations (in either order) gives the same result, called the Hermitian conjugate (named for the French mathematician Charles Hermite, 1822-1901) and denoted by A^H. We can express the components of A^H as follows:

(A^H)_km = (A_mk)*.
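Since transposition and conjugation commute, either order yields the Hermitian conjugate; a quick NumPy check with a random complex matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))

# (A^T)* == (A*)^T: both compositions give the Hermitian conjugate A^H.
assert np.allclose(A.T.conj(), A.conj().T)

# Component-wise: (A^H)_km = (A_mk)*.
assert A.conj().T[0, 1] == A[1, 0].conjugate()
```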
A Hermitian matrix is defined as a matrix that is equal to its Hermitian conjugate. In other words, the matrix A is Hermitian if and only if A = A^H. Obviously a Hermitian matrix must be square, i.e., it must have dimension m × m for some integer m.
The Hermitian conjugate of a general matrix product satisfies an identity similar to (1). To prove this, we begin by writing the product AB in the form

(AB)^H = ((AB)*)^T = (A* B*)^T = (B*)^T (A*)^T.

We recognize the right hand side as the product of the Hermitian conjugates of B and A, i.e.,

(AB)^H = B^H A^H.    (2)

Now let A be a Hermitian matrix with eigenvalue λ and eigenvector X, so that AX = λX. Multiplying both sides on the left by X^H gives

X^H A X = λ X^H X.

Taking the Hermitian conjugate of both sides, applying identity (2), and noting that multiplication by a scalar is commutative, we have

X^H A^H X = λ* X^H X,

and A^H = A because A is Hermitian. Since the left hand sides are equal, and since X^H X is a nonzero scalar for any eigenvector X, we have λ = λ*, and therefore λ is purely real.
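Both facts above, the product identity (2) and the realness of eigenvalues, can be sketched numerically with random matrices:

```python
import numpy as np

rng = np.random.default_rng(2)

def H(M):
    """Hermitian conjugate (conjugate transpose)."""
    return M.conj().T

# Identity (2): the Hermitian conjugate reverses a product.
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
assert np.allclose(H(A @ B), H(B) @ H(A))

# A Hermitian matrix has purely real eigenvalues.
M = A + H(A)                        # A + A^H is always Hermitian
w = np.linalg.eigvals(M)            # generic solver, no Hermitian assumption
assert np.allclose(w.imag, 0.0)
```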
Of course, the converse is not true. A matrix with real eigenvalues is not necessarily Hermitian. This is easily seen by examining the general 2 × 2 matrix

[ a+iα   b+iβ ]
[ c+iγ   d+iδ ]

whose eigenvalues are the roots of its characteristic polynomial. The necessary and sufficient condition for the roots to be purely real is that the trace and determinant are real, i.e.,

α + δ = 0   and   aδ + dα = bγ + cβ,

and that the discriminant is non-negative, i.e.,

(a - d)^2 + 4(bc - βγ) ≥ (α - δ)^2.

If the matrix is Hermitian we have α = δ = 0, b = c, and β = -γ, in which case the left hand expression reduces to a sum of squares (necessarily non-negative) and the right hand expression vanishes. However, it is also possible for these relations to be satisfied even if the original matrix is not Hermitian. For example, the matrix

[ 1  1 ]
[ 0  2 ]

is triangular with real entries, so its eigenvalues 1 and 2 are real, even though it is not symmetric and hence not Hermitian.
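The converse failure is easy to exhibit numerically; a sketch with a triangular (hence non-Hermitian) matrix whose eigenvalues are nevertheless real:

```python
import numpy as np

M = np.array([[1.0, 1.0],
              [0.0, 2.0]])          # upper triangular, not symmetric

w = np.linalg.eigvals(M)

# Real eigenvalues (the diagonal of a triangular matrix) ...
assert np.allclose(np.sort(w.real), [1.0, 2.0])
assert np.allclose(w.imag, 0.0)

# ... yet M is not Hermitian.
assert not np.allclose(M, M.conj().T)
```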
Returning to Hermitian matrices, we can also show that they possess another very interesting property, namely, that their eigenvectors are mutually orthogonal (assuming distinct eigenvalues) in a sense to be defined below. To prove this, let λ1 and λ2 denote two distinct eigenvalues of the Hermitian matrix A with the corresponding eigenvectors X1 and X2. (These subscripts signify vector designations, not component indices.) Then we have

A X1 = λ1 X1,    A X2 = λ2 X2.

Taking the Hermitian conjugate of both sides of the left hand equation, replacing A^H with A, noting that λ1* = λ1, and multiplying both sides on the right by X2 gives

X1^H A X2 = λ1 X1^H X2.

Now we multiply both sides of the right hand equation by X1^H to give

X1^H A X2 = λ2 X1^H X2.

The left hand sides of these last two equations are identical, so subtracting one from the other gives

(λ1 - λ2) X1^H X2 = 0,

and since λ1 ≠ λ2 it follows that X1^H X2 = 0,
which shows that the "dot product" of X2 with the complex conjugate of X1 vanishes. In general this inner product can be applied to arbitrary vectors, and we sometimes use the bra/ket notation introduced by Paul Dirac:

<X1|X2> = Σ_i (X1)_i* (X2)_i,

where, as always, the asterisk superscript signifies the complex conjugate. (The subscripts denote component indices.) Terms of this form are a suitable "squared norm" for the same reason that the squared norm of an individual complex number z = a + bi is not z^2, but rather z*z = a^2 + b^2.
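Finally, the orthogonality of eigenvectors for distinct eigenvalues, in exactly the inner-product sense just described, can be sketched with NumPy (matrix chosen arbitrarily):

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])       # Hermitian with distinct eigenvalues (1 and 4)

w, U = np.linalg.eigh(A)
x1, x2 = U[:, 0], U[:, 1]

assert abs(w[0] - w[1]) > 1e-9      # the eigenvalues are distinct

# <X1|X2> = Σ_i (X1)_i* (X2)_i; np.vdot conjugates its first argument.
assert abs(np.vdot(x1, x2)) < 1e-10
```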
REFERENCES
1. MATHPAGES
2. WIKIPEDIA