Symmetric and Skew-Symmetric Matrices
Symmetric Matrix
A symmetric matrix is a square matrix that is equal to its transpose. For a matrix \( \mathbf{A} \) to be symmetric, it must satisfy the condition:
\[ \mathbf{A}^\top = \mathbf{A}. \]
This means that for all elements \( a_{ij} \) of the matrix, the following condition holds:
\[ a_{ij} = a_{ji}. \]
This property indicates that the elements of a symmetric matrix are symmetric about the main diagonal (from the top left to the bottom right).
Example
Consider the following \( 3 \times 3 \) matrix \( \mathbf{A} \):
To check if this matrix is symmetric, we verify that \( a_{ij} = a_{ji} \) for all \( i \) and \( j \):
- \( a_{12} = -1 \) and \( a_{21} = -1 \)
- \( a_{13} = 3 \) and \( a_{31} = 3 \)
- \( a_{23} = 4 \) and \( a_{32} = 4 \)
Since all the elements satisfy \( a_{ij} = a_{ji} \), matrix \( \mathbf{A} \) is symmetric.
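The definitional check \( \mathbf{A} = \mathbf{A}^\top \) is easy to verify numerically. Below is a minimal NumPy sketch; the off-diagonal entries match the example above, while the diagonal entries (1, 2, 5) are chosen arbitrarily for illustration, since any values on the diagonal are compatible with symmetry.

```python
import numpy as np

# Off-diagonal entries taken from the example; the diagonal (1, 2, 5) is
# an arbitrary illustrative choice, since symmetry places no constraint on it.
A = np.array([[ 1, -1,  3],
              [-1,  2,  4],
              [ 3,  4,  5]])

def is_symmetric(M: np.ndarray) -> bool:
    """Return True if M is square and equal to its transpose."""
    return M.shape[0] == M.shape[1] and np.array_equal(M, M.T)

print(is_symmetric(A))  # True
```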
Skew-Symmetric Matrix
A skew-symmetric matrix is a square matrix that is equal to the negative of its transpose. For a matrix \( \mathbf{A} \) to be skew-symmetric, it must satisfy the condition:
\[ \mathbf{A}^\top = -\mathbf{A}. \]
This means that for all elements \( a_{ij} \) of the matrix, the following condition holds:
\[ a_{ij} = -a_{ji}. \]
Additionally, for a skew-symmetric matrix, the diagonal elements must all be zero. This is because for any diagonal element \( a_{ii} \):
\[ a_{ii} = -a_{ii} \implies 2a_{ii} = 0 \implies a_{ii} = 0. \]
Thus, all diagonal elements \( a_{ii} \) of a skew-symmetric matrix are zero.
Example
Consider the following \( 3 \times 3 \) matrix:
\[ \mathbf{A} = \begin{bmatrix} 0 & 2 & -3 \\ -2 & 0 & 5 \\ 3 & -5 & 0 \end{bmatrix}. \]
To check if this matrix is skew-symmetric, we verify that \( a_{ij} = -a_{ji} \) for all \( i \) and \( j \), and that the diagonal elements are zero:
- \( a_{12} = 2 \) and \( a_{21} = -2 \)
- \( a_{13} = -3 \) and \( a_{31} = 3 \)
- \( a_{23} = 5 \) and \( a_{32} = -5 \)
- Diagonal elements: \( a_{11} = 0 \), \( a_{22} = 0 \), \( a_{33} = 0 \)
Since all the conditions are satisfied, matrix \( \mathbf{A} \) is skew-symmetric.
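The same kind of numerical check works for skew-symmetry. The minimal NumPy sketch below uses the example matrix above and also confirms that its diagonal is zero.

```python
import numpy as np

# The skew-symmetric example above, with zeros forced on the diagonal.
A = np.array([[ 0,  2, -3],
              [-2,  0,  5],
              [ 3, -5,  0]])

def is_skew_symmetric(M: np.ndarray) -> bool:
    """Return True if M is square and equal to the negative of its transpose."""
    return M.shape[0] == M.shape[1] and np.array_equal(M, -M.T)

print(is_skew_symmetric(A))      # True
print(np.all(np.diag(A) == 0))   # True: the diagonal is forced to zero
```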
Properties of Symmetric and Skew-Symmetric Matrices
- Transpose of a Symmetric Matrix:
- If \( \mathbf{A} \) is symmetric, then \( \mathbf{A}^\top \) is also symmetric.
- Proof: Since \( \mathbf{A} = \mathbf{A}^\top \), transposing \( \mathbf{A} \) again gives \( (\mathbf{A}^\top)^\top = \mathbf{A} = \mathbf{A}^\top \).
- Transpose of a Skew-Symmetric Matrix:
- If \( \mathbf{A} \) is skew-symmetric, then \( \mathbf{A}^\top \) is also skew-symmetric.
- Proof: For a skew-symmetric matrix, \( \mathbf{A}^\top = -\mathbf{A} \). Transposing again gives \( (\mathbf{A}^\top)^\top = (-\mathbf{A})^\top = -\mathbf{A}^\top \), so \( \mathbf{A}^\top \) equals the negative of its own transpose, confirming that \( \mathbf{A}^\top \) is skew-symmetric.
- Scalar Multiplication of a Symmetric Matrix:
- If \( \mathbf{A} \) is symmetric, then \( k\mathbf{A} \) is also symmetric for any scalar \( k \).
- Proof: \( (k\mathbf{A})^\top = k\mathbf{A}^\top = k\mathbf{A} \), which means \( k\mathbf{A} \) is symmetric.
- Scalar Multiplication of a Skew-Symmetric Matrix:
- If \( \mathbf{A} \) is skew-symmetric, then \( k\mathbf{A} \) is also skew-symmetric for any scalar \( k \).
- Proof: \( (k\mathbf{A})^\top = k\mathbf{A}^\top = k(-\mathbf{A}) = -k\mathbf{A} \), showing \( k\mathbf{A} \) is skew-symmetric.
- Square of a Symmetric Matrix:
- If \( \mathbf{A} \) is symmetric, then \( \mathbf{A}^2 \) is also symmetric.
- Proof: \( (\mathbf{A}^2)^\top = (\mathbf{A} \mathbf{A})^\top = \mathbf{A}^\top \mathbf{A}^\top = \mathbf{A} \mathbf{A} = \mathbf{A}^2 \), proving \( \mathbf{A}^2 \) is symmetric.
- Square of a Skew-Symmetric Matrix:
- If \( \mathbf{A} \) is skew-symmetric, then \( \mathbf{A}^2 \) is symmetric.
- Proof: \( (\mathbf{A}^2)^\top = (\mathbf{A} \mathbf{A})^\top = \mathbf{A}^\top \mathbf{A}^\top = (-\mathbf{A})(-\mathbf{A}) = \mathbf{A}^2 \), showing \( \mathbf{A}^2 \) is symmetric.
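The six properties above are easy to spot-check numerically. Below is a minimal NumPy sketch that builds a symmetric matrix and a skew-symmetric matrix from a random matrix \( \mathbf{B} \) (a construction justified by the decomposition property discussed next) and verifies the transpose, scalar-multiple, and square statements.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
S = B + B.T          # symmetric
K = B - B.T          # skew-symmetric

# Transposes preserve symmetry and skew-symmetry.
assert np.allclose(S.T, S) and np.allclose(K.T, -K)

# Scalar multiples preserve symmetry and skew-symmetry.
k = 3.7
assert np.allclose((k * S).T, k * S)
assert np.allclose((k * K).T, -(k * K))

# The square of a symmetric matrix is symmetric, and the square of a
# skew-symmetric matrix is symmetric as well.
assert np.allclose((S @ S).T, S @ S)
assert np.allclose((K @ K).T, K @ K)
print("all checks passed")
```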
- Decomposition of a Matrix into a Sum of Symmetric and Skew-Symmetric Matrices:
For any square matrix \( \mathbf{A} \), which may not necessarily be symmetric or skew-symmetric, the matrix \( \mathbf{A} + \mathbf{A}^\top \) is always symmetric, and the matrix \( \mathbf{A} - \mathbf{A}^\top \) is always skew-symmetric. To see why this is true, consider the transpose of \( \mathbf{A} + \mathbf{A}^\top \):
\[ (\mathbf{A} + \mathbf{A}^\top)^\top = \mathbf{A}^\top + (\mathbf{A}^\top)^\top = \mathbf{A}^\top + \mathbf{A}. \]Since \( (\mathbf{A} + \mathbf{A}^\top)^\top = \mathbf{A} + \mathbf{A}^\top \), the matrix \( \mathbf{A} + \mathbf{A}^\top \) is symmetric. Now, consider the transpose of \( \mathbf{A} - \mathbf{A}^\top \):
\[ (\mathbf{A} - \mathbf{A}^\top)^\top = \mathbf{A}^\top - (\mathbf{A}^\top)^\top = \mathbf{A}^\top - \mathbf{A}. \]Since \( (\mathbf{A} - \mathbf{A}^\top)^\top = -(\mathbf{A} - \mathbf{A}^\top) \), the matrix \( \mathbf{A} - \mathbf{A}^\top \) is skew-symmetric.
This leads to the decomposition theorem, which states that any square matrix \( \mathbf{A} \) can be uniquely decomposed into the sum of a symmetric matrix and a skew-symmetric matrix:
\[ \mathbf{A} = \frac{1}{2}(\mathbf{A} + \mathbf{A}^\top) + \frac{1}{2}(\mathbf{A} - \mathbf{A}^\top). \]Here, \( \frac{1}{2}(\mathbf{A} + \mathbf{A}^\top) \) is symmetric, and \( \frac{1}{2}(\mathbf{A} - \mathbf{A}^\top) \) is skew-symmetric. This decomposition shows that any square matrix can be represented as the sum of a symmetric matrix and a skew-symmetric matrix in a unique way.
Uniqueness:
To understand why this decomposition is unique, suppose we claim that \( \mathbf{A} \) has two different decompositions: \( \mathbf{A} = \mathbf{P} + \mathbf{Q} \), where \( \mathbf{P} \) is symmetric and \( \mathbf{Q} \) is skew-symmetric, and \( \mathbf{A} = \mathbf{R} + \mathbf{S} \), where \( \mathbf{R} \) is symmetric and \( \mathbf{S} \) is skew-symmetric.
This would mean that:
\[ \mathbf{P} + \mathbf{Q} = \mathbf{R} + \mathbf{S}. \]Rearranging the terms, we have:
\[ \mathbf{P} - \mathbf{R} = \mathbf{S} - \mathbf{Q}. \]Now, notice that \( \mathbf{P} - \mathbf{R} \) is the difference between two symmetric matrices, so it is symmetric. Similarly, \( \mathbf{S} - \mathbf{Q} \) is the difference between two skew-symmetric matrices, so it is skew-symmetric.
Thus, \( \mathbf{P} - \mathbf{R} \) is both symmetric and skew-symmetric. Similarly, \( \mathbf{S} - \mathbf{Q} \) is both symmetric and skew-symmetric. The only matrix that is both symmetric and skew-symmetric is the zero matrix, since any matrix \( \mathbf{M} \) with \( \mathbf{M}^\top = \mathbf{M} \) and \( \mathbf{M}^\top = -\mathbf{M} \) satisfies \( \mathbf{M} = -\mathbf{M} \), forcing \( \mathbf{M} = 0 \). Therefore, we must have:
\[ \mathbf{P} - \mathbf{R} = 0 \quad \text{and} \quad \mathbf{S} - \mathbf{Q} = 0. \]This implies \( \mathbf{P} = \mathbf{R} \) and \( \mathbf{Q} = \mathbf{S} \). Hence, it is not possible to have two different decompositions of \( \mathbf{A} \) into a symmetric and a skew-symmetric matrix. Therefore, the decomposition is unique.
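This decomposition is straightforward to verify numerically. The minimal NumPy sketch below splits a random square matrix into its symmetric and skew-symmetric parts and checks that they recover the original matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

sym_part  = 0.5 * (A + A.T)   # symmetric part of A
skew_part = 0.5 * (A - A.T)   # skew-symmetric part of A

assert np.allclose(sym_part.T, sym_part)          # symmetric
assert np.allclose(skew_part.T, -skew_part)       # skew-symmetric
assert np.allclose(sym_part + skew_part, A)       # the two parts recover A
print("decomposition verified")
```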
- Given any matrix \( \mathbf{A} \), the matrices \( \mathbf{A} \mathbf{A}^\top \) and \( \mathbf{A}^\top \mathbf{A} \) are always symmetric.
To see why \( \mathbf{A} \mathbf{A}^\top \) is symmetric, consider its transpose:
\[ (\mathbf{A} \mathbf{A}^\top)^\top = (\mathbf{A}^\top)^\top \mathbf{A}^\top. \]Since the transpose of a transpose returns the original matrix, \( (\mathbf{A}^\top)^\top = \mathbf{A} \). Thus,
\[ (\mathbf{A} \mathbf{A}^\top)^\top = \mathbf{A} \mathbf{A}^\top. \]This shows that \( \mathbf{A} \mathbf{A}^\top \) is equal to its transpose and therefore is symmetric.
Similarly, to show that \( \mathbf{A}^\top \mathbf{A} \) is symmetric, consider its transpose:
\[ (\mathbf{A}^\top \mathbf{A})^\top = \mathbf{A}^\top (\mathbf{A}^\top)^\top. \]Again, since \( (\mathbf{A}^\top)^\top = \mathbf{A} \), we have:
\[ (\mathbf{A}^\top \mathbf{A})^\top = \mathbf{A}^\top \mathbf{A}. \]Thus, \( \mathbf{A}^\top \mathbf{A} \) is equal to its transpose and is therefore symmetric.
This proves that for any matrix \( \mathbf{A} \), both \( \mathbf{A} \mathbf{A}^\top \) and \( \mathbf{A}^\top \mathbf{A} \) are symmetric matrices.
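A short NumPy sketch illustrates this; note that \( \mathbf{A} \) need not even be square for \( \mathbf{A}\mathbf{A}^\top \) and \( \mathbf{A}^\top\mathbf{A} \) to be symmetric.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 5))   # A need not be square

G1 = A @ A.T   # 3 x 3
G2 = A.T @ A   # 5 x 5

assert np.allclose(G1.T, G1)
assert np.allclose(G2.T, G2)
print("A A^T and A^T A are symmetric")
```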
- Given non-zero matrices \( \mathbf{A} \) and \( \mathbf{B} \), several properties arise depending on whether \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric or skew-symmetric.
If \( \mathbf{A} \) and \( \mathbf{B} \) are both symmetric matrices, then their sum \( \mathbf{A} + \mathbf{B} \) and their difference \( \mathbf{A} - \mathbf{B} \) are also symmetric. To see why, consider that if \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric, then \( \mathbf{A}^\top = \mathbf{A} \) and \( \mathbf{B}^\top = \mathbf{B} \). Therefore,
\[ (\mathbf{A} + \mathbf{B})^\top = \mathbf{A}^\top + \mathbf{B}^\top = \mathbf{A} + \mathbf{B}, \]\[ (\mathbf{A} - \mathbf{B})^\top = \mathbf{A}^\top - \mathbf{B}^\top = \mathbf{A} - \mathbf{B}. \]Since the transpose of the sum and difference equals the sum and difference themselves, both \( \mathbf{A} + \mathbf{B} \) and \( \mathbf{A} - \mathbf{B} \) are symmetric.
If \( \mathbf{A} \) and \( \mathbf{B} \) are both skew-symmetric matrices, then their sum \( \mathbf{A} + \mathbf{B} \) and their difference \( \mathbf{A} - \mathbf{B} \) are also skew-symmetric. This is because if \( \mathbf{A} \) and \( \mathbf{B} \) are skew-symmetric, then \( \mathbf{A}^\top = -\mathbf{A} \) and \( \mathbf{B}^\top = -\mathbf{B} \). Thus,
\[ (\mathbf{A} + \mathbf{B})^\top = \mathbf{A}^\top + \mathbf{B}^\top = -\mathbf{A} - \mathbf{B} = -(\mathbf{A} + \mathbf{B}), \]\[ (\mathbf{A} - \mathbf{B})^\top = \mathbf{A}^\top - \mathbf{B}^\top = -\mathbf{A} + \mathbf{B} = -(\mathbf{A} - \mathbf{B}). \]Since the transpose of the sum and difference is the negative of the sum and difference, both \( \mathbf{A} + \mathbf{B} \) and \( \mathbf{A} - \mathbf{B} \) are skew-symmetric.
However, if \( \mathbf{A} \) is symmetric and \( \mathbf{B} \) is skew-symmetric (both non-zero), then neither \( \mathbf{A} + \mathbf{B} \) nor \( \mathbf{A} - \mathbf{B} \) is symmetric or skew-symmetric. To see this, note that if \( \mathbf{A} \) is symmetric (\( \mathbf{A}^\top = \mathbf{A} \)) and \( \mathbf{B} \) is skew-symmetric (\( \mathbf{B}^\top = -\mathbf{B} \)), then
\[ (\mathbf{A} + \mathbf{B})^\top = \mathbf{A}^\top + \mathbf{B}^\top = \mathbf{A} - \mathbf{B} \neq \mathbf{A} + \mathbf{B}, \]\[ (\mathbf{A} - \mathbf{B})^\top = \mathbf{A}^\top - \mathbf{B}^\top = \mathbf{A} + \mathbf{B} \neq -(\mathbf{A} - \mathbf{B}), \]where the first inequality holds because \( \mathbf{B} \neq 0 \) and the second because \( \mathbf{A} \neq 0 \). Therefore, \( \mathbf{A} + \mathbf{B} \) and \( \mathbf{A} - \mathbf{B} \) are neither symmetric nor skew-symmetric.
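The following minimal NumPy sketch illustrates these three cases with randomly generated symmetric and skew-symmetric matrices.

```python
import numpy as np

rng = np.random.default_rng(3)
B, C = rng.standard_normal((2, 4, 4))
S1, S2 = B + B.T, C + C.T       # two symmetric matrices
K1, K2 = B - B.T, C - C.T       # two skew-symmetric matrices

assert np.allclose((S1 + S2).T, S1 + S2)        # symmetric + symmetric is symmetric
assert np.allclose((K1 - K2).T, -(K1 - K2))     # skew - skew is skew-symmetric

M = S1 + K1                                     # symmetric + skew-symmetric
# For non-zero S1 and K1 the sum is generically neither symmetric nor skew.
print(np.allclose(M.T, M), np.allclose(M.T, -M))  # False False
```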
- Given two square matrices \( \mathbf{A} \) and \( \mathbf{B} \), if it is claimed that \( \mathbf{A} \) is symmetric, \( \mathbf{B} \) is skew-symmetric, and \( \mathbf{A} + \mathbf{B} \) is symmetric, then \( \mathbf{B} \) must be the null matrix (a matrix with all entries equal to zero).
To see why this is true, recall that a matrix \( \mathbf{A} \) is symmetric if \( \mathbf{A}^\top = \mathbf{A} \), and a matrix \( \mathbf{B} \) is skew-symmetric if \( \mathbf{B}^\top = -\mathbf{B} \). Now consider the transpose of \( \mathbf{A} + \mathbf{B} \):
\[ (\mathbf{A} + \mathbf{B})^\top = \mathbf{A}^\top + \mathbf{B}^\top. \]Since \( \mathbf{A} \) is symmetric and \( \mathbf{B} \) is skew-symmetric, this becomes:
\[ (\mathbf{A} + \mathbf{B})^\top = \mathbf{A} + (-\mathbf{B}) = \mathbf{A} - \mathbf{B}. \]If \( \mathbf{A} + \mathbf{B} \) is also symmetric, then it must satisfy \( (\mathbf{A} + \mathbf{B})^\top = \mathbf{A} + \mathbf{B} \). Therefore,
\[ \mathbf{A} - \mathbf{B} = \mathbf{A} + \mathbf{B}. \]Simplifying this equation:
\[ -\mathbf{B} = \mathbf{B} \implies 2\mathbf{B} = 0 \implies \mathbf{B} = 0. \]Thus, \( \mathbf{B} \) must be the null matrix.
Similarly, if it is claimed that \( \mathbf{A} \) is symmetric, \( \mathbf{B} \) is skew-symmetric, and \( \mathbf{A} + \mathbf{B} \) is skew-symmetric, then \( \mathbf{A} \) must be the null matrix.
To prove this, consider the transpose of \( \mathbf{A} + \mathbf{B} \):
\[ (\mathbf{A} + \mathbf{B})^\top = \mathbf{A}^\top + \mathbf{B}^\top. \]Using the properties of \( \mathbf{A} \) and \( \mathbf{B} \), this becomes:
\[ (\mathbf{A} + \mathbf{B})^\top = \mathbf{A} + (-\mathbf{B}) = \mathbf{A} - \mathbf{B}. \]If \( \mathbf{A} + \mathbf{B} \) is skew-symmetric, then \( (\mathbf{A} + \mathbf{B})^\top = -(\mathbf{A} + \mathbf{B}) \). Therefore,
\[ \mathbf{A} - \mathbf{B} = -(\mathbf{A} + \mathbf{B}). \]Simplifying this equation:
\[ \mathbf{A} - \mathbf{B} = -\mathbf{A} - \mathbf{B} \implies 2\mathbf{A} = 0 \implies \mathbf{A} = 0. \]Thus, \( \mathbf{A} \) must be the null matrix.
In summary, if \( \mathbf{A} \) is symmetric, \( \mathbf{B} \) is skew-symmetric, and \( \mathbf{A} + \mathbf{B} \) is symmetric, then \( \mathbf{B} \) must be the null matrix. Similarly, if \( \mathbf{A} + \mathbf{B} \) is skew-symmetric, then \( \mathbf{A} \) must be the null matrix.
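Numerically, this is reflected in the fact that the asymmetric part of \( \mathbf{A} + \mathbf{B} \) recovers \( \mathbf{B} \) exactly, so \( \mathbf{A} + \mathbf{B} \) can only be symmetric when \( \mathbf{B} = 0 \). A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
D = rng.standard_normal((4, 4))
A = D + D.T        # symmetric
B = D - D.T        # skew-symmetric, non-zero in general

M = A + B
# (A + B) - (A + B)^T = 2B, so the asymmetry of A + B measures B exactly.
assert np.allclose(M - M.T, 2 * B)
# Hence A + B is symmetric only if B is the zero matrix.
print(np.allclose(M.T, M))   # False here, because B != 0
```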
- If \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric matrices, then the product \( \mathbf{AB} \) is symmetric if and only if \( \mathbf{A} \) and \( \mathbf{B} \) commute, that is, \( \mathbf{AB} = \mathbf{BA} \).
Proof:
To prove this, we need to show two things:
1. If \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric and \( \mathbf{AB} \) is symmetric, then \( \mathbf{AB} = \mathbf{BA} \).
2. If \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric and \( \mathbf{AB} = \mathbf{BA} \), then \( \mathbf{AB} \) is symmetric.
1. If \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric and \( \mathbf{AB} \) is symmetric, then \( \mathbf{AB} = \mathbf{BA} \):
Suppose \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric matrices, so \( \mathbf{A}^\top = \mathbf{A} \) and \( \mathbf{B}^\top = \mathbf{B} \). If \( \mathbf{AB} \) is symmetric, then \( (\mathbf{AB})^\top = \mathbf{AB} \). Let's calculate the transpose of \( \mathbf{AB} \):
\[ (\mathbf{AB})^\top = \mathbf{B}^\top \mathbf{A}^\top. \]Since \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric, this becomes:
\[ (\mathbf{AB})^\top = \mathbf{B} \mathbf{A}. \]If \( \mathbf{AB} \) is symmetric, then:
\[ \mathbf{AB} = (\mathbf{AB})^\top = \mathbf{B} \mathbf{A}. \]Therefore, \( \mathbf{AB} = \mathbf{BA} \).
2. If \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric and \( \mathbf{AB} = \mathbf{BA} \), then \( \mathbf{AB} \) is symmetric:
Suppose \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric matrices, so \( \mathbf{A}^\top = \mathbf{A} \) and \( \mathbf{B}^\top = \mathbf{B} \), and suppose \( \mathbf{AB} = \mathbf{BA} \). We want to show that \( \mathbf{AB} \) is symmetric.
Calculate the transpose of \( \mathbf{AB} \):
\[ (\mathbf{AB})^\top = \mathbf{B}^\top \mathbf{A}^\top. \]Since \( \mathbf{A} \) and \( \mathbf{B} \) are symmetric:
\[ (\mathbf{AB})^\top = \mathbf{B} \mathbf{A}. \]Given \( \mathbf{AB} = \mathbf{BA} \), we substitute:
\[ (\mathbf{AB})^\top = \mathbf{AB}. \]Therefore, \( \mathbf{AB} \) is symmetric.
If \( \mathbf{A} \) and \( \mathbf{B} \) are both skew-symmetric matrices and they commute (i.e., \( \mathbf{AB} = \mathbf{BA} \)), then the product \( \mathbf{AB} \) is symmetric.
If \( \mathbf{A} \) is symmetric and \( \mathbf{B} \) is skew-symmetric, and they commute (i.e., \( \mathbf{AB} = \mathbf{BA} \)), then the product \( \mathbf{AB} \) is skew-symmetric.
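The sketch below illustrates both directions numerically: two random symmetric matrices generically fail to commute and their product fails to be symmetric, while a symmetric matrix always commutes with a polynomial in itself, and that product is symmetric.

```python
import numpy as np

rng = np.random.default_rng(5)
X, Y = rng.standard_normal((2, 4, 4))
A = X + X.T                  # symmetric
B = Y + Y.T                  # symmetric, but generically does not commute with A

# Non-commuting symmetric matrices: the product is generically not symmetric.
print(np.allclose(A @ B, B @ A), np.allclose((A @ B).T, A @ B))  # False False

# A commuting pair: A and a polynomial in A, e.g. C = A^2 + 2A.
C = A @ A + 2 * A
assert np.allclose(A @ C, C @ A)         # they commute
assert np.allclose((A @ C).T, A @ C)     # so the product is symmetric
print("commuting case verified")
```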
- If \( \mathbf{A} \) is a symmetric (or skew-symmetric) matrix and \( \mathbf{P} \) is any matrix of compatible dimensions, then the matrices \( \mathbf{PAP}^\top \) and \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) are also symmetric (or skew-symmetric, respectively).
Proof:
Let \( \mathbf{P} \) be any matrix and \( \mathbf{A} \) be a symmetric (or skew-symmetric) matrix. We want to show that the matrices \( \mathbf{PAP}^\top \) and \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) are also symmetric (or skew-symmetric).
Case 1: Symmetric Matrices
If \( \mathbf{A} \) is symmetric, then \( \mathbf{A}^\top = \mathbf{A} \).
- To prove \( \mathbf{PAP}^\top \) is symmetric:
Consider the transpose of \( \mathbf{PAP}^\top \):
\[ (\mathbf{PAP}^\top)^\top = (\mathbf{P}^\top)^\top \mathbf{A}^\top \mathbf{P}^\top. \]Since \( (\mathbf{P}^\top)^\top = \mathbf{P} \) and \( \mathbf{A}^\top = \mathbf{A} \) because \( \mathbf{A} \) is symmetric, we have:
\[ (\mathbf{PAP}^\top)^\top = \mathbf{P} \mathbf{A} \mathbf{P}^\top = \mathbf{PAP}^\top. \]Therefore, \( \mathbf{PAP}^\top \) is symmetric.
- To prove \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) is symmetric:
Consider the transpose of \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \):
\[ (\mathbf{P}^\top \mathbf{A} \mathbf{P})^\top = \mathbf{P}^\top \mathbf{A}^\top (\mathbf{P}^\top)^\top. \]Since \( \mathbf{A}^\top = \mathbf{A} \) and \( (\mathbf{P}^\top)^\top = \mathbf{P} \), we get:
\[ (\mathbf{P}^\top \mathbf{A} \mathbf{P})^\top = \mathbf{P}^\top \mathbf{A} \mathbf{P}. \]Therefore, \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) is symmetric.
Case 2: Skew-Symmetric Matrices
If \( \mathbf{A} \) is skew-symmetric, then \( \mathbf{A}^\top = -\mathbf{A} \).
- To prove \( \mathbf{PAP}^\top \) is skew-symmetric:
Consider the transpose of \( \mathbf{PAP}^\top \):
\[ (\mathbf{PAP}^\top)^\top = (\mathbf{P}^\top)^\top \mathbf{A}^\top \mathbf{P}^\top. \]Since \( (\mathbf{P}^\top)^\top = \mathbf{P} \) and \( \mathbf{A}^\top = -\mathbf{A} \) because \( \mathbf{A} \) is skew-symmetric, we have:
\[ (\mathbf{PAP}^\top)^\top = \mathbf{P} (-\mathbf{A}) \mathbf{P}^\top = -\mathbf{P} \mathbf{A} \mathbf{P}^\top. \]Therefore, \( \mathbf{PAP}^\top \) is skew-symmetric.
- To prove \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) is skew-symmetric:
Consider the transpose of \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \):
\[ (\mathbf{P}^\top \mathbf{A} \mathbf{P})^\top = \mathbf{P}^\top \mathbf{A}^\top (\mathbf{P}^\top)^\top. \]Since \( \mathbf{A}^\top = -\mathbf{A} \) and \( (\mathbf{P}^\top)^\top = \mathbf{P} \), we get:
\[ (\mathbf{P}^\top \mathbf{A} \mathbf{P})^\top = \mathbf{P}^\top (-\mathbf{A}) \mathbf{P} = -(\mathbf{P}^\top \mathbf{A} \mathbf{P}). \]Therefore, \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) is skew-symmetric.
Conclusion
If \( \mathbf{A} \) is symmetric (skew-symmetric), then both \( \mathbf{PAP}^\top \) and \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) are also symmetric (skew-symmetric).
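Both cases can be checked together with a short NumPy sketch; here \( \mathbf{P} \) is taken square so that \( \mathbf{PAP}^\top \) and \( \mathbf{P}^\top \mathbf{A} \mathbf{P} \) are both defined for the same \( \mathbf{A} \).

```python
import numpy as np

rng = np.random.default_rng(6)
P = rng.standard_normal((4, 4))
D = rng.standard_normal((4, 4))
S = D + D.T          # symmetric
K = D - D.T          # skew-symmetric

# For symmetric A the congruences satisfy M^T = M; for skew-symmetric A, M^T = -M.
for A, sign in [(S, +1), (K, -1)]:
    for M in (P @ A @ P.T, P.T @ A @ P):
        assert np.allclose(M.T, sign * M)
print("congruence preserves (skew-)symmetry")
```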
- If \( \mathbf{P} \) is a row matrix (\( 1 \times n \)) and \( \mathbf{A} \) is an \( n \times n \) skew-symmetric matrix, then the matrix \( \mathbf{PAP}^\top \) is a \( 1 \times 1 \) matrix, which means it is a scalar. We want to prove that \( \mathbf{PAP}^\top = [0] \).
Proof:
Given that \( \mathbf{A} \) is a skew-symmetric matrix, it satisfies the property \( \mathbf{A}^\top = -\mathbf{A} \).
Now, consider \( \mathbf{PAP}^\top \). Since \( \mathbf{P} \) is a \( 1 \times n \) row matrix, and \( \mathbf{A} \) is an \( n \times n \) skew-symmetric matrix, the product \( \mathbf{PAP}^\top \) results in a \( 1 \times 1 \) matrix, or a scalar.
From the previous property, we know that if \( \mathbf{A} \) is skew-symmetric, then \( \mathbf{PAP}^\top \) is also skew-symmetric. Since \( \mathbf{PAP}^\top \) is a \( 1 \times 1 \) matrix, and a \( 1 \times 1 \) skew-symmetric matrix must satisfy \( x = -x \), it follows that:
\[ \mathbf{PAP}^\top = -\mathbf{PAP}^\top. \]Adding \( \mathbf{PAP}^\top \) to both sides, we get:
\[ 2\mathbf{PAP}^\top = 0. \]Therefore, \( \mathbf{PAP}^\top = 0 \), which means \( \mathbf{PAP}^\top = [0] \).
This proves that if \( \mathbf{P} \) is a row matrix and \( \mathbf{A} \) is a skew-symmetric matrix, then \( \mathbf{PAP}^\top = [0] \).
Conversely, if for a square matrix \( \mathbf{A} \), the matrix \( \mathbf{PAP}^\top = [0] \) for all possible row matrices \( \mathbf{P} \), then \( \mathbf{A} \) must be skew-symmetric.
Example:
Let
\[ \mathbf{A} = \begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{bmatrix} , \quad \mathbf{P} = \begin{bmatrix} x & y & z \end{bmatrix}. \]To calculate \( \mathbf{PAP}^\top \):
\[ \mathbf{PAP}^\top = \begin{bmatrix} x & y & z \end{bmatrix} \begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix}. \]Multiplying out the matrices:
\[ \mathbf{PAP}^\top = \begin{bmatrix} x & y & z \end{bmatrix} \begin{bmatrix} xa_1 + yb_1 + zc_1 \\ xa_2 + yb_2 + zc_2 \\ xa_3 + yb_3 + zc_3 \end{bmatrix}. \]This gives:
\[ \mathbf{PAP}^\top = x(xa_1 + yb_1 + zc_1) + y(xa_2 + yb_2 + zc_2) + z(xa_3 + yb_3 + zc_3). \]Expanding and rearranging, we get:
\[ \mathbf{PAP}^\top = x^2a_1 + xy(a_2 + b_1) + xz(a_3 + c_1) + y^2b_2 + yz(b_3 + c_2) + z^2c_3. \]For \( \mathbf{PAP}^\top = 0 \) to hold for all possible row matrices \( \mathbf{P} = [x \ y \ z] \), the coefficient of every term must be zero. This implies \( a_1 = b_2 = c_3 = 0 \), \( a_2 = -b_1 \), \( a_3 = -c_1 \), and \( b_3 = -c_2 \), which are exactly the conditions for \( \mathbf{A} \) to be skew-symmetric.
This argument extends directly to \( n \times n \) matrices, giving a general proof of the converse.
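A quick numerical check of the forward direction, using a random skew-symmetric matrix and a random vector, is sketched below; mathematically the quadratic form is exactly zero, and in floating point it comes out as zero up to rounding error.

```python
import numpy as np

rng = np.random.default_rng(7)
D = rng.standard_normal((3, 3))
A = D - D.T                      # skew-symmetric
x = rng.standard_normal(3)       # plays the role of the row matrix P = [x y z]

# The quadratic form of a skew-symmetric matrix vanishes for every vector.
print(np.isclose(x @ A @ x, 0.0))   # True
```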