Matrices (2nd Sem)


Chapter: Matrices


Matrices:

A set of mn numbers (real or complex) arranged in the form of a rectangular array having m rows and n columns is called an m x n matrix [to be read as ‘m by n’ matrix].

An m x n matrix is usually written as:

A = \begin{bmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ a_{31} & a_{32} & \dots & a_{3n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \dots & a_{mn} \end{bmatrix}

In a compact form, the above matrix is represented by:

A = [aᵢⱼ], i = 1, 2, 3, …, m; j = 1, 2, 3, …, n

or simply by

[aᵢⱼ]ₘₓₙ

We write the general element of the matrix and enclose it in brackets of the type [ ] or ( ).
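The indexing convention above can be mirrored in software. The short sketch below is not part of the lecture; it assumes Python with NumPy is available, builds a 2 × 3 matrix, and reads off one element, keeping in mind that NumPy indexes from 0 while the notation aᵢⱼ above starts from 1.

```python
import numpy as np

# A 2 x 3 matrix: m = 2 rows, n = 3 columns
A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.shape)   # (2, 3) -> the order m x n
# The element a_12 (row 1, column 2 in the 1-based notation above)
# sits at index [0, 1] because NumPy indexing is 0-based.
print(A[0, 1])   # 2
```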

(i) Square Matrices

Definition: An m × n matrix for which m = n (i.e., the number of rows is equal to the number of columns) is called a square matrix of order n. It is also called an n-rowed square matrix. The elements aᵢⱼ of a square matrix A = [aᵢⱼ]ₙₓₙ for which i = j are called its diagonal elements, and together they form the principal diagonal of the matrix.

A = \begin{bmatrix} 1 & 2 & 5 \\ 1 & 3 & 3 \\ 3 & 4 & 1 \end{bmatrix}_{3 \times 3}

(ii) Unit Matrix or Identity Matrix

Definition: A square matrix, each of whose diagonal elements is 1 and whose non-diagonal elements are 0, is called a Unit Matrix or an Identity Matrix and is denoted by I.
Iₙ will denote a unit matrix of order n.

Thus, a square matrix A = [aᵢⱼ] is a unit matrix if

aᵢⱼ = 1 when i = j, and aᵢⱼ = 0 when i ≠ j.

I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}

(iii) Null Matrix:

Definition: The m × n matrix whose elements are all 0 is called the null matrix (or zero matrix) of the type m × n. It is usually denoted by O, or more clearly by O_{m,n}. Often, a null matrix is simply denoted by the symbol 0, read as 'zero'.

O_{3 \times 3} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \quad O_{3 \times 2} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix}

(iv) Row Matrix and Column Matrix

Definition: Any 1 × n matrix, which has only one row and n columns, is called a row matrix or row vector.
Similarly, any m × 1 matrix, which has m rows and only one column, is called a column matrix or column vector.

Y = \begin{bmatrix} 2 \\ 5 \\ 4 \end{bmatrix}_{3 \times 1} \quad X = \begin{bmatrix} 4 & 7 & 1 \end{bmatrix}_{1 \times 3}
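As an informal check of the special matrices above, here is a small NumPy sketch (illustrative only, not part of the lecture) that constructs an identity matrix, a null matrix, a row vector and a column vector, and prints their orders.

```python
import numpy as np

I3 = np.eye(3)                  # 3 x 3 unit (identity) matrix
O32 = np.zeros((3, 2))          # 3 x 2 null matrix
X = np.array([[4, 7, 1]])       # 1 x 3 row matrix
Y = np.array([[2], [5], [4]])   # 3 x 1 column matrix

for M in (I3, O32, X, Y):
    print(M.shape)
# (3, 3), (3, 2), (1, 3), (3, 1)
```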

Equality of Two Matrices

Definition: Two matrices A = [aᵢⱼ] and B = [bᵢⱼ] are said to be equal if:

  1. They are the same size, and
  2. The elements in the corresponding places of the two matrices are the same, i.e., aᵢⱼ = bᵢⱼ for each pair of subscripts i and j.

If two matrices A and B are equal, we write A = B.
If two matrices A and B are not equal, we write A ≠ B.
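Both conditions of the definition, same order first and then element-by-element agreement, can be tested numerically. The sketch below is an illustration only and assumes NumPy.

```python
import numpy as np

def matrices_equal(A, B):
    # Condition 1: same order (same number of rows and columns)
    if A.shape != B.shape:
        return False
    # Condition 2: corresponding elements are equal
    return bool((A == B).all())

A = np.array([[1, 2], [3, 4]])
B = np.array([[1, 2], [3, 4]])
C = np.array([[1, 2, 0], [3, 4, 0]])

print(matrices_equal(A, B))  # True  -> A = B
print(matrices_equal(A, C))  # False -> A != C (different order)
```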

Addition of Matrices

Definition: The sum of two matrices A and B of the same order is obtained by adding their corresponding elements.

If A = [aᵢⱼ] and B = [bᵢⱼ] are two matrices of the same order m × n, then their sum is given by:

A + B = [aᵢⱼ + bᵢⱼ]

for all values of i and j.

Conditions for Addition:

  • The two matrices must have the same order (same number of rows and columns).
  • The addition is performed element-wise.

Example:

Let

A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}

Then,

A + B = \begin{bmatrix} 1+5 & 2+6 \\ 3+7 & 4+8 \end{bmatrix} = \begin{bmatrix} 6 & 8 \\ 10 & 12 \end{bmatrix}
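The same sum can be checked with NumPy, where `+` on arrays is exactly the element-wise addition defined above. This is a sketch for illustration, not part of the original notes.

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A + B)
# [[ 6  8]
#  [10 12]]
```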

Multiplication of Two Matrices

Definition: The product of two matrices A and B is defined if the number of columns of the first matrix A is equal to the number of rows of the second matrix B.

If A is of order m × n and B is of order n × p, then their product AB is a matrix of order m × p, where each element is calculated as:

(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}

Conditions for Multiplication:

  1. The number of columns of the first matrix must be equal to the number of rows of the second matrix.
  2. The resulting matrix will have the same number of rows as the first matrix and the same number of columns as the second matrix.

Example:

Let

A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}

Now, calculating ABAB:

AB = \begin{bmatrix} (1 \times 5 + 2 \times 7) & (1 \times 6 + 2 \times 8) \\ (3 \times 5 + 4 \times 7) & (3 \times 6 + 4 \times 8) \end{bmatrix}
= \begin{bmatrix} (5+14) & (6+16) \\ (15+28) & (18+32) \end{bmatrix}
= \begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix}

Key Points:

  • Matrix multiplication is NOT commutative, i.e., AB ≠ BA in general (see the sketch below).
  • Associative Property: (AB)C = A(BC).
  • Distributive Property: A(B + C) = AB + AC.
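A brief NumPy sketch (illustrative, not part of the lecture) reproduces the worked product above via the rule (AB)ᵢⱼ = Σₖ aᵢₖ bₖⱼ and shows that AB and BA differ, i.e., that multiplication is not commutative.

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A @ B)
# [[19 22]
#  [43 50]]

print(B @ A)
# [[23 34]
#  [31 46]]   -> different from A @ B, so AB != BA in general
```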

Triangular, Diagonal and Scalar Matrices

(i) Upper Triangular Matrix

Definition: A square matrix A = [aᵢⱼ] is called an upper triangular matrix if aᵢⱼ = 0 whenever i > j.

\begin{bmatrix} 1 & 3 & 4 \\ 0 & 2 & 6 \\ 0 & 0 & 7 \end{bmatrix}

(ii) Lower Triangular Matrix

Definition: A square matrix A = [aᵢⱼ] is called a lower triangular matrix if aᵢⱼ = 0 whenever i < j.

\begin{bmatrix} 1 & 0 & 0 \\ 2 & 2 & 0 \\ 4 & 7 & 6 \end{bmatrix}
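NumPy's `triu` and `tril` keep the upper and lower triangular parts of a square matrix, which matches the conditions aᵢⱼ = 0 for i > j and for i < j respectively. A small illustrative sketch, with an arbitrary matrix M chosen only for demonstration:

```python
import numpy as np

M = np.array([[1, 3, 4],
              [2, 2, 6],
              [4, 7, 7]])

print(np.triu(M))  # upper triangular: entries below the diagonal set to 0
print(np.tril(M))  # lower triangular: entries above the diagonal set to 0
```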

Lecture 1


Transpose of a Matrix

Definition:

Let A = [aᵢⱼ]ₘₓₙ; then the n × m matrix obtained from A by changing its rows into columns and its columns into rows is called the transpose of A and is denoted by the symbol A′ or Aᵀ.

A = \begin{bmatrix} 1 & 2 & 6 \\ 3 & 4 & 7 \\ 5 & 2 & 8 \end{bmatrix} \quad A^T = \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 2 \\ 6 & 7 & 8 \end{bmatrix}

Theorems:

If A′ and B′ be the transposes of matrices A and B respectively, then:

  1. (A′)′ = A
  2. (A + B)′ = A′ + B′
  3. (kA)′ = kA′, where k is any complex number.
  4. (AB)′ = B′A′, where A and B are conformable for multiplication (verified numerically in the sketch below).
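Theorem 4, the reversal law (AB)′ = B′A′, is easy to spot-check numerically. The sketch below is illustrative only; A is the example matrix above and B is an arbitrary matrix chosen just for the check.

```python
import numpy as np

A = np.array([[1, 2, 6],
              [3, 4, 7],
              [5, 2, 8]])
B = np.array([[2, 0, 1],
              [1, 3, 0],
              [0, 1, 4]])

lhs = (A @ B).T     # (AB)'
rhs = B.T @ A.T     # B'A'
print(np.array_equal(lhs, rhs))  # True
```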

Orthogonal Matrix

Definition:

A square matrix A is said to be orthogonal if:

A^T A = I

where A^T is the transpose of A and I is the identity matrix.

Theorem 1

If A and B are n-rowed orthogonal matrices, then the products AB and BA are also orthogonal matrices.

Theorem 2

If A is an orthogonal matrix, then its transpose A′ and its inverse A⁻¹ are also orthogonal.
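The defining condition AᵀA = I can be verified numerically. The sketch below (an illustration, assuming NumPy; the angle 0.7 is arbitrary) checks it for a rotation matrix, a standard example of an orthogonal matrix, and also confirms that its transpose is orthogonal, in line with Theorem 2.

```python
import numpy as np

theta = 0.7  # any angle
A = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

I = np.eye(2)
print(np.allclose(A.T @ A, I))   # True: A is orthogonal
print(np.allclose(A @ A.T, I))   # True: A' (which equals A^-1 here) is also orthogonal
```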

Lecture 2



Q1: Determine the values of α, β, γ for which A is orthogonal, where

A = \begin{bmatrix} 0 & 2\beta & \gamma \\ \alpha & \beta & -\gamma \\ \alpha & -\beta & \gamma \end{bmatrix}

Answer: For the solution, see the video given below.

Q2: Show that the matrix is orthogonal.

A = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}

Answer: For the solution, see the video given below.

Q3: Show that the matrix is orthogonal.

A = \frac{1}{3} \begin{bmatrix} -1 & 2 & 2 \\ 2 & -1 & 2 \\ 2 & 2 & -1 \end{bmatrix}

Q4: Verify that the matrix is orthogonal.

A = \frac{1}{3} \begin{bmatrix} 1 & 2 & 2 \\ 2 & 1 & -2 \\ -2 & 2 & -1 \end{bmatrix}

Lecture 3



Conjugate of a Complex Number

If i = √−1, then z = x + iy is called a complex number, where x and y are any real numbers.
If z = x + iy, then z̅ = x − iy is called the conjugate of the complex number z.

We have z z̅ = (x + iy)(x − iy) = x² + y², i.e., z z̅ is real.

Also, if z = z̅, then x + iy = x − iy, i.e., 2iy = 0, i.e., y = 0, i.e., z is real.

Conversely, if z is real, then z̅ = z.

If z = x + iy, then z̅ = x − iy, and the conjugate of z̅ is x + iy = z.

If z₁ and z₂ are two complex numbers, then it can easily be seen that

(i) \overline{z_1 + z_2} = \bar{z}_1 + \bar{z}_2
(ii) \overline{z_1 z_2} = \bar{z}_1 \bar{z}_2

Conjugate of a Matrix

Definition: The matrix obtained from any given matrix A on replacing its elements by the corresponding conjugate complex numbers is called the conjugate of A and is denoted by Ā.

Thus if A = [aᵢⱼ]ₘₓₙ, then Ā = [āᵢⱼ]ₘₓₙ, where āᵢⱼ denotes the conjugate complex of aᵢⱼ.

If A be a matrix over the field of real numbers, then obviously Ā coincides with A.

A = \begin{bmatrix} 2 + 3i & 4 - 7i & 8 \\ -i & 6 & 9 + i \end{bmatrix} \quad \bar{A} = \begin{bmatrix} 2 - 3i & 4 + 7i & 8 \\ i & 6 & 9 - i \end{bmatrix}

Theorems:

If Ā and B̄ be the conjugates of A and B respectively, then

(i) \overline{\bar{A}} = A

(ii) \overline{A + B} = \bar{A} + \bar{B}, A and B being of the same size

(iii) \overline{kA} = \bar{k}\,\bar{A}, k being any complex number

(iv) \overline{AB} = \bar{A}\,\bar{B}, A and B being conformable to multiplication (checked numerically in the sketch below)
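Property (iv), that the conjugate of a product is the product of the conjugates, can be spot-checked with NumPy's element-wise `conj`. This is an illustrative check, not a proof, and the matrices A and B are arbitrary complex examples.

```python
import numpy as np

A = np.array([[2 + 3j, 4 - 7j],
              [-1j,    6 + 0j]])
B = np.array([[1 + 1j, 0 + 2j],
              [3 - 1j, 5 + 0j]])

lhs = np.conj(A @ B)           # conjugate of AB
rhs = np.conj(A) @ np.conj(B)  # (conj A)(conj B)
print(np.allclose(lhs, rhs))   # True
```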

Transposed Conjugate of a Matrix

Definition: The transpose of the conjugate of a matrix A is called the transposed conjugate of A and is denoted by Aᶿ or by A*.
Obviously, the conjugate of the transpose of A is the same as the transpose of the conjugate of A, i.e.,

\overline{(A')} = (\bar{A})' = A^\theta

If A = [aᵢⱼ]ₘₓₙ, then Aᶿ = [bᵢⱼ]ₙₓₘ,

where bᵢⱼ = āⱼᵢ, i.e., the (i, j)ᵗʰ element of Aᶿ is the conjugate complex of the (j, i)ᵗʰ element of A.

A = \begin{bmatrix} 1 + 2i & 2 - 3i & 3 + 4i \\ 4 - 5i & 5 + 6i & 6 - 7i \\ 8 & 7 + 8i & 7 \end{bmatrix} \quad A' = \begin{bmatrix} 1 + 2i & 4 - 5i & 8 \\ 2 - 3i & 5 + 6i & 7 + 8i \\ 3 + 4i & 6 - 7i & 7 \end{bmatrix} \quad A^\theta = \overline{A'} = \begin{bmatrix} 1 - 2i & 4 + 5i & 8 \\ 2 + 3i & 5 - 6i & 7 - 8i \\ 3 - 4i & 6 + 7i & 7 \end{bmatrix}

Theorems:

If Aᶿ and Bᶿ be the transposed conjugates of A and B respectively, then

  1. (Aᶿ)ᶿ = A
  2. (A + B)ᶿ = Aᶿ + Bᶿ, where A and B are of the same size
  3. (kA)ᶿ = k̄Aᶿ, where k is any complex number
  4. (AB)ᶿ = BᶿAᶿ, where A and B are conformable to multiplication (verified numerically in the sketch below)
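In NumPy the transposed conjugate Aᶿ is obtained by combining `conj` with `.T`, and theorem 4, (AB)ᶿ = BᶿAᶿ, can be checked the same way. An illustrative sketch with arbitrary complex matrices:

```python
import numpy as np

def theta(M):
    """Transposed conjugate M^theta: the conjugate of the transpose."""
    return M.conj().T

A = np.array([[1 + 2j, 2 - 3j],
              [4 - 5j, 5 + 6j]])
B = np.array([[0 + 1j, 2 + 0j],
              [3 - 2j, 1 + 1j]])

print(np.allclose(theta(A @ B), theta(B) @ theta(A)))  # True
```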

Symmetric and Skew-symmetric Matrices

Symmetric Matrix (Definition):

A square matrix A = [aᵢⱼ] is said to be symmetric if its (i, j)ᵗʰ element is the same as its (j, i)ᵗʰ element, i.e., if aᵢⱼ = aⱼᵢ for all i, j.

Example Matrices:

\begin{bmatrix} 2 & 4 \\ 4 & 3 \end{bmatrix}
\begin{bmatrix} 1 & i & -2i \\ i & -2 & 4 \\ -2i & 4 & 3 \end{bmatrix}

Skew-symmetric Matrix (Definition):

A square matrix A = [aᵢⱼ] is said to be skew-symmetric if aᵢⱼ = −aⱼᵢ for all i, j. Taking i = j gives aᵢᵢ = −aᵢᵢ, so every diagonal element of a skew-symmetric matrix is zero, e.g.,

\begin{bmatrix} 0 & 2 & -3 \\ -2 & 0 & 4 \\ 3 & -4 & 0 \end{bmatrix}

Hermitian and Skew-Hermitian Matrices

Hermitian Matrix (Definition):

A square matrix A = [aᵢⱼ] is said to be Hermitian if its (i, j)ᵗʰ element is equal to the conjugate complex of the (j, i)ᵗʰ element of A, i.e., if

aᵢⱼ = āⱼᵢ for all i, j.

Example Matrices:

\begin{bmatrix} a & b + ic \\ b - ic & d \end{bmatrix}
\begin{bmatrix} 1 & 2 - 3i & 3 + 4i \\ 2 + 3i & 0 & 4 - 5i \\ 3 - 4i & 4 + 5i & 2 \end{bmatrix}

Note:

If A is a Hermitian matrix, then

aᵢⱼ = āⱼᵢ for all i, j.

Putting j = i gives aᵢᵢ = āᵢᵢ, so aᵢᵢ is real for all i; thus every diagonal element of a Hermitian matrix must be real.
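A Hermitian check in NumPy amounts to comparing A with its transposed conjugate. The sketch below (illustrative only) does this for the 3 × 3 example above and confirms that its diagonal is real.

```python
import numpy as np

A = np.array([[1,      2 - 3j, 3 + 4j],
              [2 + 3j, 0,      4 - 5j],
              [3 - 4j, 4 + 5j, 2     ]])

print(np.allclose(A, A.conj().T))       # True: A is Hermitian
print(np.allclose(np.diag(A).imag, 0))  # True: every diagonal element is real
```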

Skew-Hermitian Matrix (Definition):

A square matrix A = [aᵢⱼ] is said to be Skew-Hermitian if its (i, j)ᵗʰ element is equal to the negative of the conjugate complex of the (j, i)ᵗʰ element of A, i.e., if

aᵢⱼ = −āⱼᵢ for all i, j.

Example Matrices:

\begin{bmatrix} 0 & -2 - i \\ 2 - i & 0 \end{bmatrix}
\begin{bmatrix} 0 & -i & 3 + 4i \\ -i & 0 & -3 + 4i \\ -3 + 4i & 3 + 4i & 0 \end{bmatrix}

Properties of a Skew-Hermitian Matrix:

  1. Defining condition: aᵢⱼ = −āⱼᵢ for all i, j.
  2. Diagonal elements: if A is a Skew-Hermitian matrix, then aᵢᵢ = −āᵢᵢ (by definition), i.e., aᵢᵢ + āᵢᵢ = 0. This implies that the diagonal elements of a Skew-Hermitian matrix must be purely imaginary or zero.

Note:

If A is a Skew-Hermitian matrix, then

aᵢⱼ = −āⱼᵢ for all i, j.

In particular, taking j = i gives aᵢᵢ = −āᵢᵢ, so each diagonal element aᵢᵢ must be either a pure imaginary number or zero.
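A skew-Hermitian matrix satisfies Aᶿ = −A. The sketch below (illustrative only) checks this for the 2 × 2 example above and confirms that its diagonal entries are purely imaginary or zero.

```python
import numpy as np

A = np.array([[0,      -2 - 1j],
              [2 - 1j,  0     ]])

print(np.allclose(A.conj().T, -A))      # True: A is skew-Hermitian
print(np.allclose(np.diag(A).real, 0))  # True: diagonal is purely imaginary or zero
```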

Lecture 4



Q1: Show that the matrix is Skew-Hermitian.

\begin{bmatrix} i & 3 + 2i & -2 - i \\ -3 + 2i & 0 & 3 - 4i \\ 2 - i & -3 - 4i & -2i \end{bmatrix}

Q2: Express the following matrix as the sum of a Hermitian and a Skew-Hermitian matrix:

\begin{bmatrix} -2 + 3i & 1 - i & 2 + i \\ 3 & 4 - 5i & 5 \\ 1 & 1 + i & -2 + 2i \end{bmatrix}

Answer: For the solution, see the video given below.
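For Q2, the standard decomposition writes A = P + Q with P = (A + Aᶿ)/2 Hermitian and Q = (A − Aᶿ)/2 skew-Hermitian. The video gives the full working; the NumPy sketch below only illustrates the idea on the matrix above.

```python
import numpy as np

A = np.array([[-2 + 3j, 1 - 1j,  2 + 1j],
              [ 3 + 0j, 4 - 5j,  5 + 0j],
              [ 1 + 0j, 1 + 1j, -2 + 2j]])

A_theta = A.conj().T
P = (A + A_theta) / 2   # Hermitian part
Q = (A - A_theta) / 2   # skew-Hermitian part

print(np.allclose(P, P.conj().T))   # True: P is Hermitian
print(np.allclose(Q, -Q.conj().T))  # True: Q is skew-Hermitian
print(np.allclose(P + Q, A))        # True: A = P + Q
```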

Q3: Show that the matrix is skew-symmetric

Q4: Show that the matrix A, where

Q5: If

A = \begin{bmatrix} 3 & 2 - 3i & 3 + 5i \\ 2 + 3i & 5 & i \\ 3 - 5i & -i & 7 \end{bmatrix}

prove that A is a Hermitian matrix.

Lecture 5



Unitary Matrix

Definition: A square matrix A is said to be unitary if Aᶿ A = I.

Since |Aᶿ| = \overline{|A|} (the conjugate complex of |A|) and |Aᶿ A| = |Aᶿ| |A|, therefore if Aᶿ A = I, we have \overline{|A|}\,|A| = 1.

Thus, the determinant of a unitary matrix is of unit modulus.

If A is a unitary matrix, then \overline{|A|}\,|A| = 1 and so |A| ≠ 0, i.e., A is non-singular and invertible.

Hence, Aᶿ A = I implies A Aᶿ = I.

Thus, A is a unitary matrix if and only if Aᶿ A = I = A Aᶿ.

Q1: Prove that B = \frac{1}{\sqrt{3}} \begin{bmatrix} 1 & 1 + i \\ 1 - i & -1 \end{bmatrix} is unitary.

Answer: For the solution, see the video given below.
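A numerical check of Q1 (an illustration, not a proof): multiplying Bᶿ by B should give the identity matrix if B is unitary.

```python
import numpy as np

B = (1 / np.sqrt(3)) * np.array([[1,      1 + 1j],
                                 [1 - 1j, -1    ]])

print(np.allclose(B.conj().T @ B, np.eye(2)))  # True: B is unitary
```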

Q2: Show that the matrix

A = \begin{bmatrix} \alpha + i\gamma & -\beta + i\delta \\ \beta + i\delta & \alpha - i\gamma \end{bmatrix}

is unitary if \alpha^2 + \beta^2 + \gamma^2 + \delta^2 = 1.
Answer: For the solution, see the video given below.

Q3: Prove that the matrix \begin{bmatrix} \frac{1 + i}{2} & \frac{-1 + i}{2} \\ \frac{1 + i}{2} & \frac{1 - i}{2} \end{bmatrix} is unitary.
Answer: For the solution, see the video given below.

Lecture 6

