Tag: inverse of a matrix and linear equations

Questions related to the inverse of a matrix and linear equations

Let $A$ be a square matrix of order $n \times  n$. A constant $\lambda $ is said to be a characteristic root of $A$ if there exists a non-zero $n \times  1$ matrix $X$ such that $AX=\lambda X$.

Let $P$ be a non-singular matrix. Then which of the following pairs of matrices have the same characteristic roots?

  1. $A$ and $AP$

  2. $A$ and $PA$

  3. $A$ and $P^{-1}AP$

  4. none of these


Correct Option: C
Explanation:

Since $X\neq 0$ satisfies $(A-\lambda I)X=0$, we must have $|A-\lambda I|=0$, i.e., $A-\lambda I$ is singular. (If $A-\lambda I$ were non-singular, then $(A-\lambda I)X=0$ would force $X=0$.)
If $\lambda= 0$, we get $|A|=0\Rightarrow A$ is singular.
We have $A^2 X=A(AX) =A(\lambda X) = \lambda(AX)$
$=\lambda^2X$,
$A^3 X=A(A^2X) =A(\lambda^2 X)$
$=\lambda^2(AX)=\lambda^2(\lambda X)=\lambda^3 X$
Continuing in this way, we obtain
$A^n X=\lambda^n X\quad \forall\ n\in \mathbb{N}$
Also, $|P^{-1}AP -\lambda I|=|P^{-1}(A -\lambda I)P|$
$=|P^{-1}|\,|A -\lambda I|\,|P|=|A -\lambda I|$,
so $A$ and $P^{-1}AP$ have the same characteristic roots.
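The similarity argument above can be checked numerically. A minimal NumPy sketch (the random matrices are arbitrary illustrative choices; a generic random matrix is non-singular):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))          # generic random P is non-singular
similar = np.linalg.inv(P) @ A @ P       # P^{-1}AP

eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_S = np.sort_complex(np.linalg.eigvals(similar))
assert np.allclose(eig_A, eig_S)         # same characteristic roots

# In contrast, AP generally has different characteristic roots from A:
eig_AP = np.sort_complex(np.linalg.eigvals(A @ P))
assert not np.allclose(eig_A, eig_AP)
```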

Let $A$ and $B$ be two non-null square matrices. If the product $AB$ is a null matrix, then

  1. $A$ is singular

  2. $B$ is singular

  3. $A$ is non-singular

  4. $B$ is non-singular


Correct Option: A,B
Explanation:

Let $B$ be non-singular, then ${ B }^{ -1 }$ exists.

Now, $AB=0$ (given) $\Rightarrow \left( AB \right) { B }^{ -1 }=0\cdot { B }^{ -1 }$
(post-multiplying both sides by ${ B }^{ -1 }$)
$\Rightarrow A\left( B{ B }^{ -1 } \right) =0$     (by associativity)
$\Rightarrow A{ I } _{ n }=0\quad \quad \quad \left( \because B{ B }^{ -1 }={ I } _{ n } \right) $
$\Rightarrow A=0$
But $A$ is a non-null matrix, which is a contradiction.
Hence $B$ is a singular matrix.
Similarly, it can be shown that $A$ is a singular matrix.
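A concrete illustration of the result (the two rank-1 matrices below are arbitrary choices):

```python
import numpy as np

# Two non-null matrices whose product is the null matrix:
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])    # rank 1, hence singular
B = np.array([[ 1.0,  1.0],
              [-1.0, -1.0]])  # rank 1, hence singular

assert np.allclose(A @ B, 0)            # AB is the null matrix
assert np.isclose(np.linalg.det(A), 0)  # A is singular
assert np.isclose(np.linalg.det(B), 0)  # B is singular
```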

Let $A=\begin{bmatrix}x+\lambda& x&x\\x &x+\lambda&x\\x&x&x+\lambda  \end{bmatrix}$, then $A^{-1}$ exists if

  1. $x\neq 0$

  2. $\lambda \neq 0$

  3. $3x+\lambda \neq 0, \lambda \neq 0$

  4. $x\neq 0, \lambda \neq 0$


Correct Option: C
Explanation:

$A=\left[ \begin{matrix} x+\lambda  & x & x \\ x & x+\lambda  & x \\ x & x & x+\lambda  \end{matrix} \right] $
${ A }^{ -1 }$ exists if $\left| A \right| \neq 0$
$\left| A \right| =\left| \begin{matrix} x+\lambda  & x & x \\ x & x+\lambda  & x \\ x & x & x+\lambda  \end{matrix} \right| \neq 0$
${ R } _{ 1 }\rightarrow { R } _{ 1 }+{ R } _{ 2 }+{ R } _{ 3 }$, then taking $\left( 3x+\lambda \right)$ common from ${ R } _{ 1 }$:
$\left( 3x+\lambda  \right) \left| \begin{matrix} 1 & 1 & 1 \\ x & x+\lambda  & x \\ x & x & x+\lambda  \end{matrix} \right| \neq 0$
${ C } _{ 2 }\rightarrow { C } _{ 2 }-{ C } _{ 1 }$ and ${ C } _{ 3 }\rightarrow { C } _{ 3 }-{ C } _{ 1 }$
$\left( 3x+\lambda  \right) \left| \begin{matrix} 1 & 0 & 0 \\ x & \lambda  & 0 \\ x & 0 & \lambda  \end{matrix} \right| \neq 0$
$\left( 3x+\lambda  \right) { \lambda  }^{ 2 }\neq 0$
$\Rightarrow 3x+\lambda \neq 0,\ \lambda \neq 0$
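A quick numerical spot-check of the factored determinant (the sample values of $x$ and $\lambda$ are arbitrary):

```python
import numpy as np

x, lam = 2.0, 3.0   # arbitrary sample values
A = np.array([[x + lam, x,       x      ],
              [x,       x + lam, x      ],
              [x,       x,       x + lam]])
det = np.linalg.det(A)
assert np.isclose(det, (3 * x + lam) * lam ** 2)   # |A| = (3x + λ)λ²
```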

If $adj\, B=A$ and $|P|=|Q|=1$, then $adj\left( { Q }^{ -1 }{ BP }^{ -1 } \right)$ is equal to?

  1. $APQ$

  2. $PAQ$

  3. $B$

  4. $A$


Correct Option: B
Explanation:

$adj\left( { A }^{ -1 } \right) =\dfrac { A }{ \left| A \right|  } $

Using $adj\left( XY \right) =adj\left( Y \right) \, adj\left( X \right) $, we get

$adj\left( { Q }^{ -1 }{ BP }^{ -1 } \right) =adj\left( { P }^{ -1 } \right) \, adj\left( B \right) \, adj\left( { Q }^{ -1 } \right) =\dfrac { P }{ \left| P \right|  } \, A\, \dfrac { Q }{ \left| Q \right|  } =PAQ$, since $\left| P \right| =\left| Q \right| =1$.
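This identity can be verified numerically. The sketch below computes the adjugate via $adj\left( M \right) = \left| M \right| { M }^{ -1 }$ (valid for non-singular $M$), with random matrices rescaled so that $|P|=|Q|=1$:

```python
import numpy as np

def adj(M):
    # Adjugate via adj(M) = det(M) * inv(M); valid when M is non-singular.
    return np.linalg.det(M) * np.linalg.inv(M)

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
# Rescale so |P| = |Q| = 1 (np.cbrt handles negative determinants).
P = rng.standard_normal((3, 3)); P /= np.cbrt(np.linalg.det(P))
Q = rng.standard_normal((3, 3)); Q /= np.cbrt(np.linalg.det(Q))
A = adj(B)   # adj B = A, as in the question

lhs = adj(np.linalg.inv(Q) @ B @ np.linalg.inv(P))
assert np.allclose(lhs, P @ A @ Q)   # adj(Q⁻¹BP⁻¹) = PAQ when |P| = |Q| = 1
```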

Matrices $A$ and $B$ will be inverse of each other only if

  1. $AB=BA$

  2. $AB=0,BA=I$

  3. $AB=BA=0$

  4. $AB=BA=I$


Correct Option: D
Explanation:

We know that if $A$ is a square matrix of order $m$, and there exists another square matrix $B$ of the same order $m$ such that $AB=BA=I$, then $B$ is said to be the inverse of $A$.

In this case, it is clear that $A$ is the inverse of $B$.
Thus , matrices $A$ and $B$ will be inverses of each other only if $AB=BA=I.$
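A small numerical illustration of the definition (the matrix chosen is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)                  # candidate inverse of A
assert np.allclose(A @ B, np.eye(2))  # AB = I
assert np.allclose(B @ A, np.eye(2))  # BA = I, so A and B are inverses
```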


Let $A= \left[\begin{array}{lll}1 & 0 & 0\\2 & 1 & 0\\3 & 2 & 1\end{array}\right]$. If $\mathrm{u _1}$ and $\mathrm{u} _{2}$ are column matrices such that $\mathrm{Au _{1}}=\left[\begin{array}{l}1\\0\\0\end{array}\right]$ and $\mathrm{Au _{2}}=\left[\begin{array}{l}0\\1\\0\end{array}\right]$, then $\mathrm{u _{1}+u _{2}}$ is equal to:

  1. $\left[\begin{array}{l}-1\\1\\0\end{array}\right]$

  2. $\left[\begin{array}{l}-1\\1\\-1\end{array}\right]$

  3. $\left[\begin{array}{l}-1\\-1\\0\end{array}\right]$

  4. $\left[\begin{array}{l}1\\-1\\-1\end{array}\right]$


Correct Option: D
Explanation:
Given: Matrix
$A=\left[ \begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right]$

$A{u} _{1}=\left[ \begin{matrix} 1  \\  0 \\ 0 \end{matrix} \right]$ and 
$A{u} _{2}=\left[ \begin{matrix} 0  \\  1 \\ 0 \end{matrix} \right]$

To find: Matrix ${u} _{1}+{u} _{2}$

Since both $A{u} _{1}$ and $A{u} _{2}$ are given, hence adding them, we get

$A{u} _{1}+A{u} _{2}=\left[ \begin{matrix} 1  \\  0 \\ 0 \end{matrix} \right]+\left[ \begin{matrix} 0  \\  1 \\ 0 \end{matrix} \right]$

$A\left({u} _{1}+{u} _{2}\right)=\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix} \right]$

Since $A$ is a non-singular matrix, we have
$\left|A\right|\neq\,0$

Hence, pre-multiplying both sides by ${A}^{-1}$, we get

${A}^{-1}A\left({u} _{1}+{u} _{2}\right)={A}^{-1}\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$

${u} _{1}+{u} _{2}={\left[ \begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right]}^{-1}\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$     ..........$(1)$

Now, $\left|A\right|=\left|\begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right|$

$=1\left| \begin{matrix} 1 & 0   \\ 2 & 1  \end{matrix} \right|-0+0$(by expanding the determinant along row $1$)

$\Rightarrow\,\left|A\right|=1$

Now, co-factor matrix of $A$ (i.e., the matrix in which every element is replaced by corresponding co-factor)

$=\left[\begin{matrix} \left| \begin{matrix} 1 & 0   \\ 2 & 1  \end{matrix} \right| & -\left| \begin{matrix} 2 & 0   \\ 3 & 1  \end{matrix} \right|  & \left| \begin{matrix} 2 & 1 \\ 3 & 2 \end{matrix} \right| \\ -\left| \begin{matrix} 0 & 0   \\ 2 & 1  \end{matrix} \right| & \left| \begin{matrix} 1 & 0   \\ 3 & 1  \end{matrix} \right|  & -\left| \begin{matrix} 1 & 0   \\ 3 & 2 \end{matrix} \right| \\ \left| \begin{matrix} 0 & 0   \\ 1 & 0  \end{matrix} \right| & -\left| \begin{matrix} 1 & 0   \\ 2 & 0  \end{matrix} \right|  & \left| \begin{matrix} 1 & 0   \\ 2 & 1  \end{matrix} \right| \end{matrix} \right]$

$=\left[\begin{matrix} 1 & -2 & 1 \\ 0 & 1  & -2 \\ 0 & 0 & 1 \end{matrix} \right]$
$\therefore\,adj\left(A\right)={\left[\begin{matrix} 1 & -2 & 1 \\ 0 & 1  & -2 \\ 0 & 0 & 1 \end{matrix} \right]}^{T}=\left[\begin{matrix} 1 & 0 & 0 \\ -2 & 1  & 0 \\ 1 & -2 & 1 \end{matrix} \right]$

$\Rightarrow\,{A}^{-1}=\dfrac{adj\left(A\right)}{\left|A\right|}$

$=\left[\begin{matrix} 1 & 0  & 0 \\ -2 & 1  & 0 \\ 1 & -2  & 1 \end{matrix} \right]\,\,\,\because\left|A\right|=1$

From eqn$(1)$ we get

${u} _{1}+{u} _{2}={\left[\begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right]}^{-1}\times\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$ 

$=\left[\begin{matrix} 1 & 0  & 0 \\ -2 & 1  & 0 \\ 1 & -2  & 1 \end{matrix} \right]\times\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$ 

$=\left[ \begin{matrix} 1+0+0  \\  -2+1+0 \\ 1-2+0 \end{matrix}\right]$ 

$\therefore\,{u} _{1}+{u} _{2}=\left[ \begin{matrix} 1  \\  -1 \\ -1 \end{matrix}\right]$ 
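Rather than inverting $A$ by hand, the same result can be obtained with a linear solver; a quick NumPy check:

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [2.0, 1.0, 0.0],
              [3.0, 2.0, 1.0]])
# Solve Au₁ = (1,0,0)ᵀ and Au₂ = (0,1,0)ᵀ directly.
u1 = np.linalg.solve(A, np.array([1.0, 0.0, 0.0]))
u2 = np.linalg.solve(A, np.array([0.0, 1.0, 0.0]))
assert np.allclose(u1 + u2, [1.0, -1.0, -1.0])   # matches option 4
```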

If a $3\times 3$ matrix $A$ has its inverse equal to $A$, then ${A}^{2}$ is equal to

  1. $\begin{bmatrix} 0 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 0 \end{bmatrix}$

  2. $\begin{bmatrix} 1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1 \end{bmatrix}$

  3. $\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$

  4. $\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}$


Correct Option: C
Explanation:

Given $A^{-1}=A$

$AA^{-1}=I$
$\implies A\cdot A=I$
$\implies A^{2}=I=\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$
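A concrete sanity check with an involutory matrix (the diagonal example is an arbitrary choice of a non-identity matrix equal to its own inverse):

```python
import numpy as np

A = np.diag([1.0, -1.0, 1.0])            # A⁻¹ = A (involutory matrix)
assert np.allclose(np.linalg.inv(A), A)  # A is its own inverse
assert np.allclose(A @ A, np.eye(3))     # hence A² = I
```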

If $A$ is a $3\times 3$ non-singular matrix such that $AA'=A'A$ and $B=A^{-1}A'$, then $BB'$ is equal to?

  1. $I+B^{-1}$

  2. $(B^{-1})$

  3. $I+B$

  4. $I$


Correct Option: D
Explanation:

Since $(A^{-1})'=(A')^{-1}$, we have $B'=\left( A^{-1}A' \right)' =A{ \left( A' \right)  }^{ -1 }$.
$\therefore BB'=A^{-1}A'A{ \left( A' \right)  }^{ -1 }=A^{-1}AA'{ \left( A' \right)  }^{ -1 }\quad \left( \because A'A=AA' \right) $
$=I\cdot I=I$
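A quick numerical check suggests $BB'=I$. The circulant matrix below is an arbitrary illustrative choice; circulant matrices satisfy $AA'=A'A$, which the code also verifies:

```python
import numpy as np

# A 3×3 circulant matrix (satisfies AA' = A'A) with non-zero determinant.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0]])
assert np.allclose(A @ A.T, A.T @ A)     # AA' = A'A

B = np.linalg.inv(A) @ A.T               # B = A⁻¹A'
assert np.allclose(B @ B.T, np.eye(3))   # BB' = I
```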

${( -A )}^{ -1 }$ is always equal to (where $A$ is an $n$th-order square matrix)

  1. ${ (-1) }^{ n }{ A }^{ -1 }$

  2. ${ -A }^{ -1 }$

  3. ${( -1) }^{ n-1 }{ A }^{ -1 }$

  4. none of these


Correct Option: B
Explanation:

We know that if $A^{-1}$ exists, then $(cA)^{-1}=\dfrac{1}{c}A^{-1}$, where $c$ is a non-zero constant.
Hence $(-A)^{-1}=\dfrac{1}{-1}A^{-1}=-A^{-1}$
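The scalar rule is easy to confirm numerically (the random matrix is an arbitrary choice; a generic random matrix is invertible):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))          # generic A is invertible
# (cA)⁻¹ = (1/c)A⁻¹ with c = −1 gives (−A)⁻¹ = −A⁻¹, for any order n:
assert np.allclose(np.linalg.inv(-A), -np.linalg.inv(A))
```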

If $A\left( \alpha ,\beta  \right) =\left[ \begin{matrix} \cos { \alpha  }  & \sin { \alpha  }  & 0 \\ -\sin { \alpha  }  & \cos { \alpha  }  & 0 \\ 0 & 0 & { e }^{ \beta  } \end{matrix} \right]$, then ${ A\left( \alpha ,\beta  \right)  }^{ -1 }$ is equal to

  1. $A{ \left( -\alpha ,-\beta \right) }$

  2. $A{ \left( -\alpha ,\beta \right) }$

  3. $A{ \left(\alpha ,-\beta \right) }$

  4. $A{ \left(\alpha ,\beta \right) }$


Correct Option: A
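The result follows because the rotation block inverts by negating $\alpha$ and ${ \left( { e }^{ \beta  } \right)  }^{ -1 }={ e }^{ -\beta  }$. A numerical check (sample values of $\alpha$ and $\beta$ are arbitrary):

```python
import numpy as np

def A(alpha, beta):
    # The block matrix from the question: a 2×2 rotation block and e^β.
    return np.array([[ np.cos(alpha), np.sin(alpha), 0.0],
                     [-np.sin(alpha), np.cos(alpha), 0.0],
                     [ 0.0,           0.0,           np.exp(beta)]])

alpha, beta = 0.7, 1.3   # arbitrary sample values
assert np.allclose(np.linalg.inv(A(alpha, beta)), A(-alpha, -beta))
```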