Dark Calculator
In mathematics, matrices are fundamental structures used to represent and manipulate data. Here is a list of different types of matrices used extensively in mathematics:
A matrix all of whose elements are zero is called a zero (or null) matrix. It is notated as \(0_{mn}\), where \(m\) is the number of rows and \(n\) is the number of columns of the matrix.
A matrix that has an equal number of rows and columns is called a square matrix. The order of the matrix is the number of rows in the matrix. A square matrix of order \(3\) has \(3\) rows and \(3\) columns. The matrix can also be expressed using its size, in this case \(3\times 3\).
A matrix that has an unequal number of rows and columns is called a rectangular matrix. If the number of rows exceeds the number of columns, it is called a vertical matrix; conversely, if the number of columns exceeds the number of rows, it is called a horizontal matrix.
A diagonal matrix has non-zero elements only on the main diagonal. It is usually expressed using the notation: $$D=\text{diag}(d_1,d_2,...,d_n)$$ An example of such a matrix is: $$\left[\begin{matrix}2&0\\0&4\end{matrix}\right]$$
An identity matrix is a special diagonal matrix in which all diagonal elements are \(1\). It is expressed with the symbol \(I_n\), where \(n\) is the order of the matrix. The identity matrix of order \(3\times 3\) is notated \(I_3\) and looks like: $$\left[\begin{matrix}1&0&0\\0&1&0\\0&0&1\end{matrix}\right]$$
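As a quick numerical sketch of the identity property \(I_nA=AI_n=A\) (NumPy and the sample matrix are my own assumptions; the original text contains no code):

```python
import numpy as np

# Build the 3x3 identity matrix I_3 described above
I3 = np.eye(3)

# Multiplying any 3x3 matrix by I_3 leaves it unchanged
A = np.array([[2., 0., 1.],
              [1., 3., 0.],
              [0., 4., 5.]])
left = I3 @ A
right = A @ I3
```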
An upper triangular matrix has all elements below the main diagonal (also known as the principal diagonal) equal to zero. Usually represented by \(U_n\), where \(n\) is the order of the matrix, it looks like: $$\left[\begin{matrix}1&-5&4\\0&1&-3\\0&0&2\end{matrix}\right]$$
A lower triangular matrix has all elements above the main diagonal equal to zero. Usually represented by \(L_n\), where \(n\) is the order of the matrix, it looks like: $$\left[\begin{matrix}1&0&0\\-5&1&0\\-4&6&3\end{matrix}\right]$$
A transpose of a matrix is a new matrix formed by swapping rows with columns. If you have an original matrix \(A\) with dimensions \(m\times n\) (where \(m\) is the number of rows and \(n\) is the number of columns), the transpose \(A^T\) will have dimensions \(n\times m\). To compute the transpose, you simply rewrite the matrix such that the rows become columns and vice versa. For example, if \(A\) is: $$\begin{bmatrix} 1 & 2 & 3 \\4 & 5 & 6 \end{bmatrix} $$ The transpose \(A^T\) would be: $$\begin{bmatrix} 1 & 4\\2 & 5 \\3 & 6\end{bmatrix}$$
Key points about matrix transpose:
The transpose of a transpose (\((A^T)^T\)) is the original matrix \(A\).
If \(A\) is a square matrix (same number of rows and columns), then the order of the matrix and the transpose matrix is the same.
Note that a transpose matrix has no independent existence; it can only be produced from another matrix.
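The transpose example above can be sketched numerically (NumPy is my assumption here, not part of the original text):

```python
import numpy as np

# The 2x3 example matrix A from the text
A = np.array([[1, 2, 3],
              [4, 5, 6]])

# Transposing swaps rows with columns, giving a 3x2 matrix
At = A.T
```

Transposing again recovers the original matrix, matching the key point \((A^T)^T=A\).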
A symmetric matrix is a square matrix that equals its own transpose. In mathematical terms, an \(n \times n\) matrix \(A\) is symmetric if and only if $$A=A^T$$ where \(A^T\) denotes the transpose of matrix \(A\).
A matrix \(A=[a_{ij}]\) is symmetric if \(a_{ij}=a_{ji}\) for all \(i,j\). This means that the element at the \(i\)-th row and \(j\)-th column is equal to the element at the \(j\)-th row and \(i\)-th column.
Symmetric matrices are always square matrices, meaning they have the same number of rows and columns (i.e.,\(n\times n\) matrices).
Main Diagonal: The main diagonal elements \(a_{ii}\) (where \(i=1,2,…,n\)) trivially satisfy the symmetry condition (since \(a_{ii}=a_{ii}\)).
Real Eigenvalues: Symmetric matrices have real eigenvalues. This property is crucial in many applications, including mechanics, physics, and optimization.
Diagonalizability: Symmetric matrices are diagonalizable, meaning they can be expressed in terms of a diagonal matrix \(D\) and an orthogonal matrix \(P\) such that \(A=PDP^T\). Orthogonal matrices \(P\) preserve lengths and angles, which ensures that the transformation represented by \(A\) preserves these geometric properties.
Addition and Multiplication: The sum of symmetric matrices is symmetric: if \(A\) and \(B\) are symmetric, so is \(A+B\). The product \(AB\), however, is symmetric if and only if \(A\) and \(B\) commute, i.e., \(AB=BA\).
Note that:
Identity Matrix: The identity matrix is a symmetric matrix.
Diagonal Matrix: A diagonal matrix where all off-diagonal elements are zero is symmetric.
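A minimal sketch of the symmetry test described above (the helper name `is_symmetric` and the sample matrix are my own, not from the text):

```python
import numpy as np

def is_symmetric(M):
    """A square matrix is symmetric iff it equals its own transpose."""
    return M.shape[0] == M.shape[1] and np.array_equal(M, M.T)

S = np.array([[1, 7, 3],
              [7, 4, 5],
              [3, 5, 6]])

# Real symmetric matrices have real eigenvalues
evals = np.linalg.eigvals(S)
```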
A skew-symmetric matrix (or antisymmetric matrix) is a square matrix that satisfies the condition \(A=-A^T\). In other words, the transpose of the matrix is equal to its negative. This implies that the elements on the main diagonal of a skew-symmetric matrix must be zero, and the elements above the diagonal are the negatives of the corresponding elements below the diagonal.
A matrix \(A=[a_{ij}]\) is skew-symmetric if \(a_{ij}=-a_{ji}\) for all \(i,j\). This means that the element at the \(i\)-th row and \(j\)-th column is the negative of the element at the \(j\)-th row and \(i\)-th column.
Skew-symmetric matrices are always square matrices, meaning they have the same number of rows and columns (i.e., \(n\times n\) matrices).
The main diagonal elements \(a_{ii}\) of a skew-symmetric matrix are always zero, because \(a_{ii}=-a_{ii}\) implies \(2a_{ii}=0\), and thus \(a_{ii}=0\).
As an example, we have: $$ A=\begin{bmatrix} 0&2&-1\\-2&0&-4\\1&4&0 \end{bmatrix} $$ In this matrix, each element \(a_{ij}\) is the negative of \(a_{ji}\), and the main diagonal elements are zero.
Odd-Order Skew-Symmetric Matrix: If \(A\) is a skew-symmetric matrix of odd order (i.e., the number of rows and columns is odd), then its determinant is zero. This follows because \(\det(A)=\det(A^T)=\det(-A)=(-1)^n\det(A)\); for odd \(n\) this gives \(\det(A)=-\det(A)\), which implies \(\det(A)=0\).
Eigenvalues: The eigenvalues of a skew-symmetric matrix are either zero or purely imaginary (if the matrix is over the real numbers).
Note that:
Zero Matrix: The zero matrix is a trivial example of a skew-symmetric matrix.
\(2\times2\) Skew-Symmetric Matrix: Any \(2\times2\) skew-symmetric matrix has the form: $$ \begin{bmatrix} 0&a\\-a&0 \end{bmatrix} $$ where \(a\) is a real number.
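The skew-symmetry condition and the odd-order determinant property can be checked on the \(3\times 3\) example above (NumPy and the helper name `is_skew_symmetric` are my own assumptions):

```python
import numpy as np

def is_skew_symmetric(M):
    """A is skew-symmetric iff A = -A^T."""
    return np.array_equal(M, -M.T)

# The 3x3 example from the text
A = np.array([[ 0,  2, -1],
              [-2,  0, -4],
              [ 1,  4,  0]])

# Odd order, so the determinant must be zero
d = np.linalg.det(A)
```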
An orthogonal matrix is a square matrix with real entries whose columns and rows are orthonormal vectors. This means that the matrix satisfies the condition: $$QQ^T=Q^TQ=I$$ where \(Q\) is the orthogonal matrix, \(Q^T\) is its transpose, and \(I\) is the identity matrix. In other words, a matrix \(Q\) is orthogonal if its transpose is equal to its inverse: \(Q^T=Q^{-1}\). This implies that multiplying \(Q\) by its transpose results in the identity matrix.
Orthonormal Columns and Rows: The columns (and rows) of an orthogonal matrix are orthonormal, meaning they are both orthogonal (perpendicular to each other) and normalized (having unit length). For any two columns (or rows) \(Q_i\) and \(Q_j\) of \(Q\):
\(Q_i\cdot Q_j=0\) for \(i\neq j\) (orthogonality)
\(Q_i\cdot Q_i=1\) (normalization)
Consider the following matrix: $$Q=\begin{bmatrix}1&0&0\\0&0&-1\\0&1&0\end{bmatrix}$$ To verify if \(Q\) is orthogonal, we calculate \(Q^T\) and \(Q^TQ\): $$Q^T=\begin{bmatrix}1&0&0\\0&0&1\\0&-1&0\end{bmatrix}$$ $$Q^TQ=\begin{bmatrix}1&0&0\\0&0&1\\0&-1&0\end{bmatrix}\begin{bmatrix}1&0&0\\0&0&-1\\0&1&0\end{bmatrix}=\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$$ Since \(Q^TQ=I\), \(Q\) is indeed orthogonal.
Determinant: The determinant of an orthogonal matrix is either \(1\) or \(-1\). If the determinant is \(+1\), the matrix represents a proper rotation (preserving orientation). If the determinant is \(-1\), it represents an improper rotation or a reflection.
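The orthogonality check worked through above can be reproduced numerically (NumPy is an assumption; the matrix \(Q\) is the one from the text):

```python
import numpy as np

# The example matrix Q from the text
Q = np.array([[1., 0.,  0.],
              [0., 0., -1.],
              [0., 1.,  0.]])

# Orthogonality check: Q^T Q should equal the identity matrix
product = Q.T @ Q
det = np.linalg.det(Q)
```

Here the determinant comes out to \(+1\), so this particular \(Q\) is a proper rotation.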
An idempotent matrix is a matrix that, when multiplied by itself, yields the same matrix. In other words, a matrix \(A\) is idempotent if \(A^2=A\). This property implies that applying the matrix operation repeatedly does not change the result beyond the initial application. Consider the following matrix: $$A=\begin{bmatrix}1&0\\0&0\end{bmatrix}$$To check if \(A\) is idempotent, we compute \(A^2\): $$A^2=\begin{bmatrix}1&0\\0&0\end{bmatrix}\begin{bmatrix}1&0\\0&0\end{bmatrix}=\begin{bmatrix}1&0\\0&0\end{bmatrix}$$ Since \(A^2=A\), \(A\) is idempotent.
Eigenvalues: The eigenvalues of an idempotent matrix are either 0 or 1. This follows from the equation \(A^2=A\), which leads to $$\lambda=\lambda^2$$giving \(\lambda(\lambda-1)=0\) or \(\lambda=0,1\).
Trace: The trace of an idempotent matrix (the sum of its diagonal elements) is equal to the rank of the matrix. This is because the trace is the sum of the eigenvalues, and the eigenvalues are \(0\) or \(1\).
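Both the idempotency check and the trace-equals-rank property can be verified on the example matrix (NumPy is an assumption here):

```python
import numpy as np

# The example matrix from the text
A = np.array([[1, 0],
              [0, 0]])

A2 = A @ A                       # A^2 = A, so A is idempotent
rank = np.linalg.matrix_rank(A)  # trace equals rank for idempotent matrices
```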
An involutory matrix is a square matrix that is its own inverse. In other words, a matrix \(A\) is involutory if \(A^2=I\), where \(I\) is the identity matrix. This property means that applying the matrix operation twice returns the original value, essentially making it a "self-inverse."
Consider the following matrix: $$A=\begin{bmatrix}0&1\\1&0\end{bmatrix}$$ To verify if \(A\) is involutory, we compute \(A^2\): $$ A^2=\begin{bmatrix}0&1\\1&0\end{bmatrix}\begin{bmatrix}0&1\\1&0\end{bmatrix}=\begin{bmatrix}1&0\\0&1\end{bmatrix} $$ Since \(A^2=I\), \(A\) is involutory.
Eigenvalues: The eigenvalues of an involutory matrix are \(\pm 1\). This follows from the fact that: $$A^2=I\Rightarrow\lambda^2=1\Rightarrow\lambda=\pm 1$$
Symmetric Involutory Matrices: If a matrix is both symmetric and involutory, it is orthogonally diagonalizable with eigenvalues of \(\pm 1\).
Inverse: The inverse of an involutory matrix \(A\) is \(A\) itself, simplifying computations involving the matrix inverse.
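The self-inverse property and the \(\pm 1\) eigenvalues can be confirmed on the example matrix (NumPy is an assumption here):

```python
import numpy as np

# The example matrix from the text (a row-swap permutation matrix)
A = np.array([[0, 1],
              [1, 0]])

A2 = A @ A                    # A^2 = I, so A is its own inverse
evals = np.linalg.eigvals(A)  # eigenvalues must be +1 or -1
```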
A diagonal matrix whose entries are all the same is called a scalar matrix. The constant value of the diagonal entries is called the scalar constant. The determinant of an \(n\times n\) scalar matrix is the \(n\)-th power of this scalar constant. As an example, a \(3\times 3\) scalar matrix with scalar constant \(7\) is: $$\begin{bmatrix}7&0&0\\0&7&0\\0&0&7\end{bmatrix}$$
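The determinant property of a scalar matrix can be sketched as follows (NumPy is an assumption; the constant \(7\) matches the example above):

```python
import numpy as np

n, c = 3, 7
S = c * np.eye(n)  # scalar matrix: constant 7 on the main diagonal

# det(S) = c**n, the n-th power of the scalar constant
d = np.linalg.det(S)
```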
A nilpotent matrix is a square matrix \(A\) for which some positive integer power \(k\) results in the zero matrix, i.e., \(A^k=0\). The smallest such positive integer \(k\) is called the index of nilpotency. Consider the following matrix: $$A=\begin{bmatrix}0&1\\0&0\end{bmatrix}$$ To check if \(A\) is nilpotent, we compute \(A^2\): $$A^2=\begin{bmatrix}0&1\\0&0\end{bmatrix}\begin{bmatrix}0&1\\0&0\end{bmatrix}=\begin{bmatrix}0&0\\0&0\end{bmatrix}$$ Since \(A^2=0\), \(A\) is nilpotent with an index of nilpotency \(k=2\).
Trace: The trace (sum of the diagonal elements) of a nilpotent matrix is always zero.
Determinant: The determinant of a nilpotent matrix is zero because the product of the eigenvalues (which are all zero) is zero.
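The nilpotency of the example matrix, along with its zero trace and zero determinant, can be checked directly (NumPy is an assumption here):

```python
import numpy as np

# The example matrix from the text
A = np.array([[0, 1],
              [0, 0]])

A2 = A @ A  # A^2 = 0, so A is nilpotent with index of nilpotency k = 2
```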
A matrix made up of rows and columns arranged in the form of a magic square is called a magic square matrix.