
Dark Calculator

Operations:
Generate Any Matrix
Additive Operations of Matrix
Multiply Matrices
Inverse-Determinant-Trace of Matrices
Rank-Nullity of a Matrix
Eigenvalues and Eigenvectors of a Matrix
Adjoint-Minor-Cofactor of a Matrix
Characteristic Polynomial of a Matrix
Diagonalization of a Matrix
Reduced Row Echelon and Row Echelon Form of a Matrix
LU Decomposition of a Matrix
QR Decomposition of a Matrix
Learn More About Matrices
See Examples on Different Matrices

Generate Matrices as You Like

To input a square root, press the square root button to activate it. To insert a fraction, convert it into a decimal first; for example, \(\frac{3}{4}\) should be entered as \(0.75\). The output will likewise be in decimals. Similarly, to work with irrational numbers such as \(2+\sqrt3\), you must enter the value in decimal form, using a full stop \((.)\) as the decimal point.

Choose the order of the matrix \(m\times n\)


The value of \(m\times n\) cannot exceed \(16\).

Criteria for a Square Matrix



Top Tier Criteria

These criteria are given the highest priority. Choosing any of these conditions prevents you from combining it with the other criteria.




Types of Matrices:

Criteria A: [Does not work with Criteria B]






Criteria B: [Does not work with Criteria A]




Criteria for Any Matrix


Value of an Entry: [Works only for Definite Trace and Definite Determinant]

If you want, you can set the value of any entry yourself. In the text field below, use a comma to move to the next entry and type the value you want.

For example, if you wish to set the value of \(a_{32}\) as \(10\), use this: ,,,,,,,\(10\) [\(7\) commas before \(10\), since the entries are counted row by row and \(a_{32}\) is the eighth entry of a matrix with \(3\) columns].


Illegal inputs will be filtered out automatically.





An orthogonal matrix must have every entry within the range \([-1, 1]\). Therefore, the range you set here is the range of the entries of the matrix after the common factor has been taken out as a coefficient of the matrix.
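For instance, in the following orthogonal matrix (an illustrative example, not one generated by the site) the common factor \(\frac{1}{\sqrt 2}\) is taken out as a coefficient, and the remaining entries \(\pm 1\) are what the range above applies to: $$\frac{1}{\sqrt 2}\begin{bmatrix}1&-1\\1&1\end{bmatrix}=\begin{bmatrix}\frac{1}{\sqrt 2}&-\frac{1}{\sqrt 2}\\\frac{1}{\sqrt 2}&\frac{1}{\sqrt 2}\end{bmatrix}$$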

From: To:




$$ \left[ \begin{matrix} a_{11} \end{matrix} \right] $$



Theory [Types of Matrices]


In mathematics, matrices are fundamental structures used to represent and manipulate data. Here is a list of different types of matrices used extensively in mathematics:

\(1.\) Zero Matrix [Null Matrix]

A matrix all of whose elements are zero. It is notated as \(0_{mn}\), where \(m\) is the number of rows and \(n\) is the number of columns of the matrix.

\(2.\) Square Matrix

A matrix that has an equal number of rows and columns is called a square matrix. The order of the matrix is the number of its rows. A square matrix of order \(3\) has \(3\) rows and \(3\) columns. The matrix can also be expressed using its size, in this case \(3\times 3\).

\(3.\) Rectangular Matrix

A matrix that has an unequal number of rows and columns is called a rectangular matrix. If the number of rows is greater than the number of columns, it is called a vertical matrix. Conversely, if the number of columns is greater than the number of rows, it is called a horizontal matrix.

\(4.\) Diagonal Matrix

A square matrix whose non-zero elements lie only on the main diagonal. It is usually expressed using the notation: $$D=\text{diag}(d_1,d_2,...,d_n)$$ An example of such a matrix is: $$\left[\begin{matrix}2&0\\0&4\end{matrix}\right]$$
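As an informal check outside the calculator (assuming the NumPy library is available), the example above can be built directly from its diagonal entries:

```python
import numpy as np

# Build D = diag(2, 4), matching the example above.
D = np.diag([2, 4])
print(D)                                  # [[2 0]
                                          #  [0 4]]

# Defining property: every off-diagonal entry is zero.
off_diagonal = D[~np.eye(2, dtype=bool)]
print(np.all(off_diagonal == 0))          # True
```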

\(5.\) Identity Matrix

A special diagonal matrix in which all diagonal elements are \(1\). It is expressed with the symbol \(I_n\), where \(n\) is the order of the matrix. The identity matrix of order \(3\) is notated \(I_3\) and looks like: $$\left[\begin{matrix}1&0&0\\0&1&0\\0&0&1\end{matrix}\right] $$

\(6.\) Upper Triangular Matrix

All elements below the main diagonal [also known as the principal diagonal] are zero. Usually represented by \(U_n\), where \(n\) is the order of the matrix, it looks like: $$\left[\begin{matrix}1&-5&4\\0&1&-3\\0&0&2\end{matrix}\right]$$

\(7.\) Lower Triangular Matrix

All elements above the main diagonal [also known as the principal diagonal] are zero. Usually represented by \(L_n\), where \(n\) is the order of the matrix, it looks like: $$\left[\begin{matrix}1&0&0\\-5&1&0\\-4&6&3\end{matrix}\right]$$
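As a quick illustration outside the site (assuming NumPy is available), a matrix is upper or lower triangular exactly when it equals its own upper or lower triangular part:

```python
import numpy as np

# The two example matrices shown above.
U = np.array([[1, -5,  4],
              [0,  1, -3],
              [0,  0,  2]])
L = np.array([[ 1, 0, 0],
              [-5, 1, 0],
              [-4, 6, 3]])

# np.triu / np.tril zero out the entries below / above the main diagonal.
print(np.array_equal(U, np.triu(U)))   # True: U is upper triangular
print(np.array_equal(L, np.tril(L)))   # True: L is lower triangular
```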

\(8.\) Transpose Matrix

A transpose of a matrix is a new matrix formed by swapping rows with columns. If you have an original matrix \(A\) with dimensions \(m\times n\) (where \(m\) is the number of rows and \(n\) is the number of columns), the transpose \(A^T\) will have dimensions \(n\times m\). To compute the transpose, you simply rewrite the matrix such that the rows become columns and vice versa. For example, if \(A\) is: $$\begin{bmatrix} 1 & 2 & 3 \\4 & 5 & 6 \end{bmatrix} $$ The transpose \(A^T\) would be: $$\begin{bmatrix} 1 & 4\\2 & 5 \\3 & 6\end{bmatrix}$$

Key points about matrix transpose:
  The transpose of a transpose (\((A^T)^T\)) is the original matrix \(A\).
  If \(A\) is a square matrix (same number of rows and columns), then \(A\) and its transpose \(A^T\) have the same order.

It is to note that a transpose matrix has no independent existence; it can only be produced from another matrix.
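The example above can be reproduced with a minimal NumPy sketch (NumPy is assumed to be available; this is not the site's own code):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])          # a 2 x 3 matrix

print(A.T)                         # its 3 x 2 transpose: [[1 4], [2 5], [3 6]]
print(np.array_equal(A.T.T, A))    # True: the transpose of the transpose is A
```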

\(9.\) Symmetric Matrix

A symmetric matrix is a square matrix that equals its own transpose. In mathematical terms, for an \(n \times n\) matrix \(A\), it is symmetric if and only if: $$A=A^T$$, where \(A^T\) denotes the transpose of matrix \(A\).

A matrix \(A=[a_{ij}]\) is symmetric if \(a_{ij}=a_{ji}\) for all \(i,j\). This means that the element at the \(i\)-th row and \(j\)-th column is equal to the element at the \(j\)-th row and \(i\)-th column.

Symmetric matrices are always square matrices, meaning they have the same number of rows and columns (i.e.,\(n\times n\) matrices).

Main Diagonal: The main diagonal elements \(a_{ii}\) (where \(i=1,2,…,n\)) satisfy the symmetry condition trivially (i.e., \(a_{ii}=a_{ii}\)).

Real Eigenvalues: Symmetric matrices have real eigenvalues. This property is crucial in many applications, including mechanics, physics, and optimization.

Diagonalizability: Symmetric matrices are diagonalizable, meaning they can be expressed in terms of a diagonal matrix \(D\) and an orthogonal matrix \(P\) such that \(A=PDP^T\). Since orthogonal matrices preserve lengths and angles, this decomposition expresses the transformation represented by \(A\) as a change to the eigenvector basis, a scaling by the eigenvalues, and a change back.

Addition and Multiplication: The sum of symmetric matrices is also symmetric: if \(A\) and \(B\) are symmetric matrices, then so is \(A+B\). The product \(AB\), however, is symmetric only when \(A\) and \(B\) commute, i.e. when \(AB=BA\).

It is to note that:
  Identity Matrix: The identity matrix is a symmetric matrix.
  Diagonal Matrix: A diagonal matrix where all off-diagonal elements are zero is symmetric.
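An informal NumPy check of these properties on a small symmetric matrix (the matrix below is just an illustrative choice, and NumPy is assumed to be available):

```python
import numpy as np

A = np.array([[2, 1, 0],
              [1, 3, 4],
              [0, 4, 5]])

print(np.array_equal(A, A.T))         # True: A equals its transpose
print(np.linalg.eigvalsh(A))          # real eigenvalues (eigvalsh is meant for symmetric matrices)

# Orthogonal diagonalization A = P D P^T.
eigenvalues, P = np.linalg.eigh(A)
D = np.diag(eigenvalues)
print(np.allclose(P @ D @ P.T, A))    # True
```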

\(10.\) Skew-Symmetric Matrix

A skew-symmetric matrix (or antisymmetric matrix) is a square matrix that satisfies the condition \(A=-A^T\). In other words, the transpose of the matrix is equal to its negative. This implies that the elements on the main diagonal of a skew-symmetric matrix must be zero, and the elements above the diagonal are the negatives of the corresponding elements below the diagonal.

A matrix \(A=[a_{ij}]\) is skew-symmetric if \(a_{ij}=-a_{ji}\) for all \(i,j\). This means that the element at the \(i\)-th row and \(j\)-th column is the negative of the element at the \(j\)-th row and \(i\)-th column.

Skew-symmetric matrices are always square matrices, meaning they have the same number of rows and columns (i.e., \(n\times n\) matrices).

The main diagonal elements \(a_{ii}\) of a skew-symmetric matrix are always zero, because \(a_{ii}=-a_{ii}\) implies \(2a_{ii}=0\), and thus \(a_{ii}=0\).

As an example, we have: $$ A=\begin{bmatrix} 0&2&-1\\-2&0&-4\\1&4&0 \end{bmatrix} $$ In this matrix, each element \(a_{ij}\) is the negative of \(a_{ji}\), and the main diagonal elements are zero.

Odd-Order Skew-Symmetric Matrix: If \(A\) is a skew-symmetric matrix of odd order (i.e., the number of rows and columns is odd), then its determinant is zero. This follows from \(\det(A)=\det(A^T)=\det(-A)=(-1)^n\det(A)\); since \(A\) is of odd order, \(\det(A)=-\det(A)\), which implies \(\det(A)=0\).

Eigenvalues: The eigenvalues of a skew-symmetric matrix are either zero or purely imaginary (if the matrix is over the real numbers).

It is to note that:

  Zero Matrix: The zero matrix is a trivial example of a skew-symmetric matrix.

  \(2\times2\) Skew-Symmetric Matrix: Any \(2\times2\) skew-symmetric matrix has the form: $$ \begin{bmatrix} 0&a\\-a&0 \end{bmatrix} $$ where \(a\) is a real number.
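A short NumPy check (illustrative only, assuming NumPy is available) of the example matrix given earlier in this section:

```python
import numpy as np

A = np.array([[ 0,  2, -1],
              [-2,  0, -4],
              [ 1,  4,  0]])

print(np.array_equal(A.T, -A))    # True: A is skew-symmetric
print(np.linalg.det(A))           # ~0: an odd-order skew-symmetric matrix is singular
print(np.linalg.eigvals(A))       # eigenvalues are zero or purely imaginary
```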

\(11.\) Orthogonal Matrix

An orthogonal matrix is a square matrix with real entries whose columns and rows are orthonormal vectors. This means that the matrix satisfies the condition: $$QQ^T=QQ^{-1}=I$$ where \(Q\) is the orthogonal matrix, \(Q^T\) is its transpose, and \(I\) is the identity matrix. In other words, a matrix \(Q\) is orthogonal if its transpose is equal to its inverse. Mathematically, \(Q^T=Q^{-1}\). This implies that multiplying \(Q\) by its transpose results in the identity matrix.

Orthonormal Columns and Rows: The columns (and rows) of an orthogonal matrix are orthonormal, meaning they are both orthogonal (perpendicular to each other) and normalized (having unit length). For any two columns (or rows) \(Q_i\) and \(Q_j\) of \(Q\):

\(Q_i\cdot Q_j=0\) for \(i\neq j\)  (orthogonality)
\(Q_i\cdot Q_i=1\)  (normalization)

Consider the following matrix: $$Q=\begin{bmatrix}1&0&0\\0&0&-1\\0&1&0\end{bmatrix}$$ To verify if \(Q\) is orthogonal, we calculate \(Q^T\) and \(Q^TQ\): $$Q^T=\begin{bmatrix}1&0&0\\0&0&1\\0&-1&0\end{bmatrix}$$ $$Q^TQ=\begin{bmatrix}1&0&0\\0&0&1\\0&-1&0\end{bmatrix}\begin{bmatrix}1&0&0\\0&0&-1\\0&1&0\end{bmatrix}=\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$$ Since \(Q^TQ=I\), \(Q\) is indeed orthogonal.

Determinant: The determinant of an orthogonal matrix is either \(1\) or \(-1\). If the determinant is \(+1\), the matrix represents a proper rotation (preserving orientation). If the determinant is \(-1\), it represents an improper rotation or a reflection.
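The verification above can be mirrored numerically with NumPy (a sketch for checking the properties, not the site's implementation):

```python
import numpy as np

Q = np.array([[1, 0,  0],
              [0, 0, -1],
              [0, 1,  0]])

print(np.allclose(Q.T @ Q, np.eye(3)))      # True: Q^T Q = I, so Q is orthogonal
print(np.allclose(np.linalg.inv(Q), Q.T))   # True: the inverse equals the transpose
print(round(np.linalg.det(Q)))              # 1: this Q is a proper rotation
```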

\(12.\) Idempotent Matrix

An idempotent matrix is a matrix that, when multiplied by itself, yields the same matrix. In other words, a matrix \(A\) is idempotent if \(A^2=A\). This property implies that applying the matrix operation repeatedly does not change the result beyond the initial application. Consider the following matrix: $$A=\begin{bmatrix}1&0\\0&0\end{bmatrix}$$To check if \(A\) is idempotent, we compute \(A^2\): $$A^2=\begin{bmatrix}1&0\\0&0\end{bmatrix}\begin{bmatrix}1&0\\0&0\end{bmatrix}=\begin{bmatrix}1&0\\0&0\end{bmatrix}$$ Since \(A^2=A\), \(A\) is idempotent.

Eigenvalues: The eigenvalues of an idempotent matrix are either 0 or 1. This follows from the equation \(A^2=A\), which leads to $$\lambda=\lambda^2$$giving \(\lambda(\lambda-1)=0\) or \(\lambda=0,1\).

Trace: The trace of an idempotent matrix (the sum of its diagonal elements) is equal to the rank of the matrix. This is because the trace is the sum of the eigenvalues, and the eigenvalues are \(0\) or \(1\).
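A quick NumPy sketch checking these facts on the example above (NumPy is assumed to be available):

```python
import numpy as np

A = np.array([[1, 0],
              [0, 0]])

print(np.array_equal(A @ A, A))                  # True: A is idempotent
print(np.linalg.eigvals(A))                      # eigenvalues are 1 and 0
print(np.trace(A), np.linalg.matrix_rank(A))     # trace 1 equals rank 1
```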

\(13.\) Involutory Matrix

An involutory matrix is a square matrix that is its own inverse. In other words, a matrix \(A\) is involutory if \(A^2=I\), where \(I\) is the identity matrix. This property means that applying the matrix operation twice returns the original value, essentially making it a "self-inverse."

Consider the following matrix: $$A=\begin{bmatrix}0&1\\1&0\end{bmatrix}$$ To verify if \(A\) is involutory, we compute \(A^2\): $$ A^2=\begin{bmatrix}0&1\\1&0\end{bmatrix}\begin{bmatrix}0&1\\1&0\end{bmatrix}=\begin{bmatrix}1&0\\0&1\end{bmatrix} $$ Since \(A^2=I\), \(A\) is involutory.

Eigenvalues: The eigenvalues of an involutory matrix are \(\pm 1\). This follows from the fact that: $$A^2=I\Rightarrow\lambda^2=1\Rightarrow\lambda=\pm 1$$

Symmetric Involutory Matrices: If a matrix is both symmetric and involutory, it implies that it is orthogonally diagonalizable with eigenvalues of \(\pm 1\).

Inverse: The inverse of an involutory matrix \(A\) is \(A\) itself, simplifying computations involving the matrix inverse.
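A brief NumPy check of the example above (illustrative only):

```python
import numpy as np

A = np.array([[0, 1],
              [1, 0]])

print(np.allclose(A @ A, np.eye(2)))      # True: A^2 = I, so A is involutory
print(np.linalg.eigvals(A))               # eigenvalues are +1 and -1
print(np.allclose(np.linalg.inv(A), A))   # True: A is its own inverse
```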

\(14.\) Scalar Matrix

A diagonal matrix whose diagonal entries are all the same is called a scalar matrix. The constant value of the diagonal entries is called the scalar constant. The determinant of an \(n\times n\) scalar matrix is the \(n\)-th power of this scalar constant (for a \(3\times 3\) matrix, its cube). As an example, a \(3\times 3\) scalar matrix with scalar constant \(7\) is: $$\begin{bmatrix}7&0&0\\0&7&0\\0&0&7\end{bmatrix}$$
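A tiny NumPy sketch of the example (assuming NumPy is available):

```python
import numpy as np

S = 7 * np.eye(3)               # 3 x 3 scalar matrix with scalar constant 7
print(S)
print(round(np.linalg.det(S)))  # 343 = 7 ** 3
```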

\(15.\) Nilpotent Matrix

A nilpotent matrix is a square matrix \(A\) for which some positive integer power \(k\) results in the zero matrix, i.e. \(A^k=0\). The smallest such positive integer \(k\) is called the index of nilpotency. Consider the following matrix: $$A=\begin{bmatrix}0&1\\0&0\end{bmatrix}$$ To check if \(A\) is nilpotent, we compute \(A^2\): $$A^2=\begin{bmatrix}0&1\\0&0\end{bmatrix}\begin{bmatrix}0&1\\0&0\end{bmatrix}=\begin{bmatrix}0&0\\0&0\end{bmatrix}$$ Since \(A^2=0\), \(A\) is nilpotent with an index of nilpotency \(k=2\).

Trace: The trace (sum of the diagonal elements) of a nilpotent matrix is always zero.

Determinant: The determinant of a nilpotent matrix is zero because the product of the eigenvalues (which are all zero) is zero.
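A short NumPy check of the example above (illustrative only):

```python
import numpy as np

A = np.array([[0, 1],
              [0, 0]])

print(np.array_equal(A @ A, np.zeros((2, 2))))  # True: A^2 = 0, so the index of nilpotency is 2
print(np.trace(A))                              # 0
print(np.linalg.det(A))                         # 0.0
```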

\(16.\) Magic Square Matrix

A square matrix whose entries are arranged in the form of a magic square, so that every row, every column, and both main diagonals sum to the same constant, is called a magic square matrix. Have a look at the detailed study here.
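As an illustration (the classic \(3\times 3\) magic square below is an illustrative example, not taken from the linked study), the constant-sum property can be checked with NumPy:

```python
import numpy as np

M = np.array([[2, 7, 6],
              [9, 5, 1],
              [4, 3, 8]])                   # a classic 3 x 3 magic square

print(M.sum(axis=1))                        # row sums:    [15 15 15]
print(M.sum(axis=0))                        # column sums: [15 15 15]
print(np.trace(M), np.trace(np.fliplr(M)))  # both diagonals also sum to 15
```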

Have a function in mind that is not on our site? Contact Us.
