A set of mn numbers (real or complex) arranged in the form of a rectangular array having m rows and n columns is called an m x n matrix [to be read as ‘m by n’ matrix].
An m × n matrix is usually written as:
A = [ a₁₁ a₁₂ … a₁ₙ
      a₂₁ a₂₂ … a₂ₙ
      ⋮
      aₘ₁ aₘ₂ … aₘₙ ]
In compact form, the above matrix is represented by A = [aᵢⱼ], i = 1, 2, …, m; j = 1, 2, …, n, or simply by A = [aᵢⱼ]ₘₓₙ.
We write the general element aᵢⱼ of the matrix and enclose it in brackets of the type [ ] or of the type ( ).
(i) Square Matrices
Definition: An m × n matrix for which m = n (i.e., the number of rows is equal to the number of columns) is called a square matrix of order n. It is also called an n-rowed square matrix. The elements aᵢⱼ of a square matrix for which i = j (i.e., a₁₁, a₂₂, …, aₙₙ) are called its diagonal elements, and the diagonal on which they lie is called the principal diagonal.
(ii) Unit Matrix or Identity Matrix
Definition: A square matrix, each of whose diagonal elements is 1 and whose non-diagonal elements are 0, is called a Unit Matrix or an Identity Matrix and is denoted by I. Iₙ will denote a unit matrix of order n.
Thus, a square matrix A = [aᵢⱼ] is a unit matrix if aᵢⱼ = 1 when i = j and aᵢⱼ = 0 when i ≠ j.
(iii) Null Matrix:
Definition: The matrix whose elements are all 0 is called the null matrix (or zero matrix) of the type m × n. It is usually denoted by O, or more clearly by Oₘₓₙ. Often, a null matrix is simply denoted by the symbol 0, read as ‘zero’.
(iv) Row Matrix and Column Matrix
Definition: Any matrix which has only one row and n columns, i.e., a 1 × n matrix, is called a row matrix or row vector.
Similarly, any matrix which has m rows and only one column, i.e., an m × 1 matrix, is a column matrix or a column vector.
Equality of Two Matrices
Definition: Two matrices A = [aᵢⱼ] and B = [bᵢⱼ] are said to be equal if:
They are the same size, and
The elements in the corresponding places of the two matrices are the same, i.e., aᵢⱼ = bᵢⱼ for all i, j.
If two matrices A and B are equal, we write A = B.
If two matrices A and B are not equal, we write A ≠ B.
If two matrices are not of the same size, they cannot be equal.
Addition of Matrices
Definition: The sum of two matrices A and B of the same order is obtained by adding their corresponding elements.
If A = [aᵢⱼ] and B = [bᵢⱼ] are two matrices of the same order m×n, then their sum is given by:
A + B = [aᵢⱼ + bᵢⱼ]ₘₓₙ
for all values of i and j.
Conditions for Addition:
The two matrices must have the same order (same number of rows and columns).
The addition is performed element-wise.
Example:
Let
A = [ 1 2      B = [ 5 6
      3 4 ],         7 8 ]
Then,
A + B = [ 1+5 2+6    = [ 6  8
          3+7 4+8 ]      10 12 ]
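The element-wise rule above can be sketched in plain Python; `mat_add` is an illustrative name, not from the text:

```python
def mat_add(A, B):
    # Addition is defined only for matrices of the same order.
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "same order required"
    # (A + B)[i][j] = a_ij + b_ij
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))  # [[6, 8], [10, 12]]
```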
Multiplication of Two Matrices
Definition: The product AB of two matrices A and B is defined only if the number of columns of the first matrix A is equal to the number of rows of the second matrix B.
If A = [aᵢⱼ] is of order m × n and B = [bⱼₖ] is of order n × p, then their product AB is a matrix of order m × p, where each element cᵢₖ is calculated as:
cᵢₖ = aᵢ₁b₁ₖ + aᵢ₂b₂ₖ + … + aᵢₙbₙₖ
Conditions for Multiplication:
The number of columns of the first matrix must be equal to the number of rows of the second matrix.
The resulting matrix will have the same number of rows as the first matrix and the same number of columns as the second matrix.
Example:
Let
A = [ 1 2      B = [ 0 1
      3 4 ],         1 0 ]
Now, calculating AB:
AB = [ 1·0+2·1  1·1+2·0    = [ 2 1
       3·0+4·1  3·1+4·0 ]      4 3 ]
Key Points:
Matrix multiplication is NOT commutative, i.e., AB ≠ BA in general.
Associative Property: (AB)C = A(BC).
Distributive Property: A(B + C) = AB + AC.
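The row-by-column rule and the failure of commutativity can be sketched in plain Python; `mat_mul` is an illustrative name, not from the text:

```python
def mat_mul(A, B):
    m, n, n2, p = len(A), len(A[0]), len(B), len(B[0])
    assert n == n2, "columns of A must equal rows of B"
    # c_ik = sum over j of a_ij * b_jk
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(p)]
            for i in range(m)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))  # [[2, 1], [4, 3]]
print(mat_mul(B, A))  # [[3, 4], [1, 2]]  -- AB != BA in general
```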
Triangular, Diagonal and Scalar Matrices
(i) Upper Triangular Matrix
Definition: A square matrix A = [aᵢⱼ] is called an upper triangular matrix if aᵢⱼ = 0 whenever i > j.
(ii) Lower Triangular Matrix
Definition: A square matrix A = [aᵢⱼ] is called a lower triangular matrix if aᵢⱼ = 0 whenever i < j.
(iii) Diagonal Matrix
Definition: A square matrix whose elements above and below the principal diagonal are all zero, i.e., aᵢⱼ = 0 for all i ≠ j, is called a diagonal matrix.
(iv) Scalar Matrix
Definition: A diagonal matrix whose diagonal elements are all equal is called a scalar matrix.
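These four definitions are easy to check mechanically; a minimal Python sketch, with illustrative predicate names:

```python
def is_upper_triangular(A):
    # a_ij = 0 whenever i > j
    return all(A[i][j] == 0 for i in range(len(A)) for j in range(len(A)) if i > j)

def is_diagonal(A):
    # a_ij = 0 for all i != j
    return all(A[i][j] == 0 for i in range(len(A)) for j in range(len(A)) if i != j)

def is_scalar(A):
    # diagonal, with all diagonal elements equal
    return is_diagonal(A) and all(A[i][i] == A[0][0] for i in range(len(A)))

print(is_upper_triangular([[1, 2], [0, 3]]))  # True
print(is_scalar([[5, 0], [0, 5]]))            # True
```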
Idempotent Matrix
A matrix A such that A² = A is called an idempotent matrix.
Involutory Matrix
A matrix A is said to be an involutory matrix if A² = I (Unit Matrix).
Nilpotent Matrix
A matrix A is said to be a nilpotent matrix if Aᵖ = O (null matrix), where p is a positive integer. If p is the least positive integer for which Aᵖ = O, then p is called the index of the nilpotent matrix.
Periodic Matrix
A matrix A is said to be a periodic matrix if Aᵏ⁺¹ = A, where k is a positive integer. If k is the least positive integer for which Aᵏ⁺¹ = A, then k is called the period of A.
If we choose k = 1, then A² = A, and we call it an idempotent matrix.
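The power conditions above can be verified numerically. A small Python sketch with standard 2 × 2 examples (the matrices are illustrative choices, not from the text):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

# N is nilpotent of index 2: N^2 = O
N = [[0, 1], [0, 0]]
print(mat_mul(N, N))  # [[0, 0], [0, 0]]

# E is idempotent: E^2 = E
E = [[1, 0], [0, 0]]
print(mat_mul(E, E) == E)  # True
```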
Lecture 1
Transpose of a Matrix
Definition:
Let A = [aᵢⱼ]ₘₓₙ; then the n×m matrix obtained from A by changing its rows into columns and its columns into rows is called the transpose of A and is denoted by the symbol A′ or Aᵀ.
Theorems:
If A′ and B′ be the transposes of matrices A and B respectively, then:
(A′)′ = A
(A + B)′ = A′ + B′, A and B being of the same size
(kA)′ = kA′, where k is any complex number
(AB)′ = B′A′, where A and B are conformable for multiplication
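The transpose and the reversal law for products can be sketched and checked numerically in Python; `transpose` and `mat_mul` are illustrative helper names:

```python
def transpose(A):
    # rows become columns: (A')[j][i] = A[i][j]
    return [list(row) for row in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
# reversal law: (AB)' = B'A'
print(transpose(mat_mul(A, B)) == mat_mul(transpose(B), transpose(A)))  # True
```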
Orthogonal Matrix
Definition:
A square matrix A is said to be Orthogonal if:
A′A = AA′ = I
where A′ is the transpose of A and I is the identity matrix.
Theorem 1
If A and B are n-rowed orthogonal matrices, then the products AB and BA are also orthogonal matrices.
Theorem 2
If A is an orthogonal matrix, then its transpose A′ and its inverse A⁻¹ are also orthogonal.
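The defining condition A′A = I can be tested numerically. A Python sketch using a rotation matrix, the standard example of an orthogonal matrix (helper names are illustrative):

```python
import math

def transpose(A):
    return [list(row) for row in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def is_orthogonal(A, tol=1e-12):
    # A is orthogonal iff A'A = I (up to floating-point tolerance)
    n = len(A)
    P = mat_mul(transpose(A), A)
    return all(abs(P[i][j] - (1 if i == j else 0)) < tol
               for i in range(n) for j in range(n))

t = math.pi / 6
R = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]  # rotation matrix
print(is_orthogonal(R))  # True
```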
Lecture 2
Q1: Determine the values of the unknown entries for which the given matrix is orthogonal.
Q2: Show that the matrix is orthogonal.
Answer: For the solution, see the video given below.
Q3: Show that the matrix is orthogonal.
Answer: For the solution, see the video given below.
Q4: Verify that the matrix is orthogonal.
Answer: For the solution, see the video given below.
Lecture 3
Conjugate of a Matrix
If i = √−1, then z = x + iy is called a complex number, where x and y are any real numbers.
If z = x + iy, then z̅ = x − iy is called the Conjugate of the complex number z.
We have z z̅ = (x + iy)(x − iy) = x² + y², i.e., z z̅ is real.
Also, if z = z̅, then x + iy = x − iy, i.e., 2iy = 0, i.e., y = 0, i.e., z is real.
Conversely, if z is real, then z̅ = z.
If z = x + iy, then z̅ = x − iy and (z̅)̅ = x + iy = z.
If z₁ and z₂ are two complex numbers, then it can be easily seen that (z₁ + z₂)̅ = z̅₁ + z̅₂ and (z₁z₂)̅ = z̅₁ z̅₂.
Definition: The matrix obtained from any given matrix A on replacing its elements by the corresponding conjugate complex numbers is called the conjugate of A and is denoted by Ā.
Thus if A = [aᵢⱼ]ₘₓₙ, then Ā = [āᵢⱼ]ₘₓₙ, where āᵢⱼ denotes the conjugate complex of aᵢⱼ.
If A is a matrix over the field of real numbers, then obviously Ā coincides with A.
Theorems:
If Ā and B̅ be the conjugates of A and B respectively, then
(i) (Ā)̅ = A
(ii) (A + B)̅ = Ā + B̅, A and B being of the same size
(iii) (kA)̅ = k̅Ā, k being any complex number
(iv) (AB)̅ = Ā B̅, A and B being conformable for multiplication
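Python's built-in complex numbers make the conjugate easy to sketch; `conjugate` below is an illustrative helper, not from the text:

```python
def conjugate(A):
    # replace each element by its complex conjugate
    return [[z.conjugate() for z in row] for row in A]

A = [[1 + 2j, 3 - 1j], [0 + 1j, 4]]
print(conjugate(A))  # [[(1-2j), (3+1j)], [-1j, 4]]
```

Note that a real element (like the 4 above) is unchanged by conjugation, matching the remark that Ā coincides with A for real matrices.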
Transposed Conjugate of a Matrix
Definition: The transpose of the conjugate of a matrix A is called the transposed conjugate of A and is denoted by Aᶿ or by A*.
Obviously, the conjugate of the transpose of A is the same as the transpose of the conjugate of A, i.e.,
(Ā)′ = (A′)̅ = Aᶿ
If A = [aᵢⱼ]ₘₓₙ, then Aᶿ = [bᵢⱼ]ₙₓₘ,
where bᵢⱼ = āⱼᵢ, i.e., the (i, j)ᵗʰ element of Aᶿ = the conjugate complex of the (j, i)ᵗʰ element of A.
Theorems:
If Aᶿ and Bᶿ be the transposed conjugates of A and B respectively, then
(Aᶿ)ᶿ = A
(A + B)ᶿ = Aᶿ + Bᶿ, where A and B are of the same size
(kA)ᶿ = k̅Aᶿ, where k is any complex number
(AB)ᶿ = Bᶿ Aᶿ, where A and B are conformable to multiplication
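The reversal law (AB)ᶿ = BᶿAᶿ and the scalar rule can be checked numerically in Python; `ctranspose` and `mat_mul` are illustrative helper names:

```python
def ctranspose(A):
    # A^theta: transpose of the conjugate
    return [[z.conjugate() for z in col] for col in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1 + 1j, 2], [0, 3 - 1j]]
B = [[1j, 1], [1, -1j]]
# reversal law: (AB)^theta = B^theta A^theta
print(ctranspose(mat_mul(A, B)) == mat_mul(ctranspose(B), ctranspose(A)))  # True
k = 2 + 1j
# scalar rule: (kA)^theta = conj(k) * A^theta
print(ctranspose([[k * z for z in row] for row in A]) ==
      [[k.conjugate() * z for z in row] for row in ctranspose(A)])  # True
```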
Symmetric and Skew-symmetric Matrices
Symmetric Matrix (Definition):
A square matrix A = [aᵢⱼ] is said to be symmetric if its (i, j)ᵗʰ element is the same as its (j, i)ᵗʰ element, i.e., if aᵢⱼ = aⱼᵢ for all i, j.
Example Matrices:
Skew-Symmetric Matrix (Definition):
A square matrix A = [aᵢⱼ] is said to be skew-symmetric if its (i, j)ᵗʰ element is the negative of its (j, i)ᵗʰ element, i.e., if aᵢⱼ = −aⱼᵢ for all i, j.
Example Matrices:
Properties of Skew-Symmetric Matrices:
If A is a skew-symmetric matrix, then aᵢᵢ = −aᵢᵢ, i.e., 2aᵢᵢ = 0, i.e., aᵢᵢ = 0 for all i.
Thus, the diagonal elements of a skew-symmetric matrix are all zero.
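Both conditions can be checked directly in Python; the predicate names and the example matrix are illustrative:

```python
def is_symmetric(A):
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))

def is_skew_symmetric(A):
    n = len(A)
    return all(A[i][j] == -A[j][i] for i in range(n) for j in range(n))

S = [[0, 2, -3], [-2, 0, 5], [3, -5, 0]]  # skew-symmetric
print(is_skew_symmetric(S))                # True
print(all(S[i][i] == 0 for i in range(3)))  # True: diagonal elements are all zero
```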
Hermitian and Skew-Hermitian Matrices
Hermitian Matrix (Definition):
A square matrix A = [aᵢⱼ] is said to be Hermitian if its (i, j)ᵗʰ element is equal to the conjugate complex of its (j, i)ᵗʰ element, i.e., if aᵢⱼ = āⱼᵢ for all i, j.
Example Matrices:
Note:
If A is a Hermitian matrix, then aᵢᵢ = āᵢᵢ for all i.
By definition, aᵢᵢ is then real for all i; thus every diagonal element of a Hermitian matrix must be real.
Skew-Hermitian Matrix (Definition):
A square matrix A = [aᵢⱼ] is said to be Skew-Hermitian if its (i, j)ᵗʰ element is equal to the negative of the conjugate complex of its (j, i)ᵗʰ element, i.e., if aᵢⱼ = −āⱼᵢ for all i, j.
Example Matrices:
Properties of a Skew-Hermitian Matrix:
Symmetry Condition: aᵢⱼ = −āⱼᵢ for all i, j, i.e., Aᶿ = −A.
Diagonal Elements:
If A is a Skew-Hermitian matrix, then aᵢᵢ = −āᵢᵢ, i.e., aᵢᵢ + āᵢᵢ = 0, so the real part of each aᵢᵢ is zero.
This implies that the diagonal elements of a Skew-Hermitian matrix must be purely imaginary or zero.
Note:
If A is a Skew-Hermitian matrix, then aᵢᵢ = −āᵢᵢ.
By definition, aᵢᵢ must be either a pure imaginary number or zero.
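Both definitions can be verified element-wise with Python's complex numbers; the predicate names and the example matrices are illustrative:

```python
def is_hermitian(A):
    # a_ij equals the conjugate of a_ji
    n = len(A)
    return all(A[i][j] == A[j][i].conjugate() for i in range(n) for j in range(n))

def is_skew_hermitian(A):
    # a_ij equals the negative conjugate of a_ji
    n = len(A)
    return all(A[i][j] == -A[j][i].conjugate() for i in range(n) for j in range(n))

H = [[2, 3 - 1j], [3 + 1j, 5]]    # Hermitian: diagonal elements real
K = [[1j, 2 + 1j], [-2 + 1j, 0]]  # Skew-Hermitian: diagonal purely imaginary or zero
print(is_hermitian(H))       # True
print(is_skew_hermitian(K))  # True
```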
Lecture 4
Q1: Show that the matrix is Skew-Hermitian.
Answer: For the solution, see the video given below.
Q2: Express the given matrix as the sum of a Hermitian and a Skew-Hermitian matrix.
Q3: Show that the given matrix is skew-symmetric.
Q4: Show that the matrix A, where
Answer: For the solution, see the video given below.
Q5: If A is the given matrix, prove that A is a Hermitian matrix.
Answer: For the solution, see the video given below.
Lecture 5
Unitary Matrix
Definition: A square matrix A is said to be unitary if AᶿA = I.
Since |Aᶿ| = |A̅| and |AᶿA| = |Aᶿ||A|, therefore if AᶿA = I, we have |A̅||A| = 1.
Thus, the determinant of a unitary matrix is of unit modulus.
If A is a unitary matrix, then |A| ≠ 0, and so A is non-singular and invertible.
Hence, AᶿA = I implies Aᶿ = A⁻¹.
Thus, A is a unitary matrix if and only if AAᶿ = AᶿA = I.
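The condition AᶿA = I can be tested numerically. A Python sketch using a standard 2 × 2 unitary matrix (the helper names and the example matrix are illustrative):

```python
def ctranspose(A):
    # transposed conjugate A^theta
    return [[z.conjugate() for z in col] for col in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def is_unitary(A, tol=1e-12):
    # A is unitary iff A^theta A = I (up to floating-point tolerance)
    n = len(A)
    P = mat_mul(ctranspose(A), A)
    return all(abs(P[i][j] - (1 if i == j else 0)) < tol
               for i in range(n) for j in range(n))

s = 2 ** -0.5
U = [[s, s * 1j], [s * 1j, s]]  # (1/sqrt(2)) * [[1, i], [i, 1]], a unitary matrix
print(is_unitary(U))  # True
```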
Q1: Prove that the given matrix is unitary.
Answer: For the solution, see the video given below.
Q2: Show that the matrix
is unitary if α² + β² + γ² + δ² = 1.
Answer: For the solution, see the video given below.
Q3: Prove that the matrix
[ (1+i)/2   (−1+i)/2
  (1+i)/2   (1−i)/2 ]
is unitary.
Answer: For the solution, see the video given below.