Let $S=\{A_1,A_2,\dots,A_k\}$ be a set of row-equivalent square matrices.
Prove or disprove: if some linear combination of the elements of $S$ is an invertible matrix, then every element of $S$ is invertible as well.
I know that if two matrices are row equivalent, then one can be transformed into the other by a sequence of elementary row operations; this means I can take any $A_m \in S$ and apply a sequence of elementary row operations until $A_m$ becomes $A_1$.
A linear combination of the elements of $S$ has the form $A_C=\lambda_1 A_1+\lambda_2 A_2+\dots+\lambda_k A_k$ ($C$ for combination). If we may treat every matrix as equal to $A_1$, then $A_C=(\lambda_1+\lambda_2+\dots+\lambda_k)A_1$.
Now, multiplying an invertible matrix by a nonzero scalar yields an invertible matrix, so if $A_C$ is invertible, then $A_1$ is invertible. Every element of $S$ is row equivalent to $A_1$, and is therefore invertible as well.
Are my claims correct? How should we solve this question?
Please help (I'm preparing for a test). Thank you!
Answer
What do we know about row equivalence, what do we know about invertibility of matrices, and can we make them join together somewhere?
Two row-equivalent matrices have the same row space. Also, an $n\times n$ matrix is invertible iff its row space is the whole space $\Bbb R^n$. So there you go! If there is a linear combination $A_C$ of matrices in $S$ which is invertible, then the row space of $A_C$ must be $\Bbb R^n$.
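Both facts are easy to check numerically; here is a minimal sketch with NumPy (the matrix $A$ and the row operations are my own illustrative choices, not from the question):

```python
import numpy as np

n = 3
# An invertible 3x3 matrix (det = 3, so its rank is n and its row space is R^3)
A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

# A row-equivalent matrix: apply elementary row operations to A
B = A.copy()
B[[0, 1]] = B[[1, 0]]   # swap rows 0 and 1
B[2] += 2 * B[0]        # add 2 * (new) row 0 to row 2

# Invertible iff rank == n, i.e. the row space is all of R^n
assert np.linalg.matrix_rank(A) == n
# Row equivalence preserves the row space, hence the rank
assert np.linalg.matrix_rank(B) == n
```

This is only a sanity check on one example, of course, not a proof of either fact.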
But a linear combination of matrices with a common row space cannot extend that row space (each row of $A_C$ is just a linear combination of rows from matrices in $S$, and row spaces, like any vector space, are closed under linear combinations). So if $A_C$ has row space $\Bbb R^n$, the matrices of $S$ must have that row space too, and thus they are invertible.
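The whole statement can also be illustrated numerically. The sketch below (my own hypothetical example, not from the answer) builds row-equivalent matrices $E_iA$ by multiplying a common matrix $A$ on the left by invertible elementary matrices $E_i$, takes a linear combination, and checks that whenever the combination is invertible, so is every element of $S$:

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

# Elementary (hence invertible) matrices acting on the left = row operations
E1 = np.eye(3); E1[[0, 1]] = E1[[1, 0]]   # swap rows 0 and 1
E2 = np.eye(3); E2[2, 0] = 2.0            # add 2 * row 0 to row 2
E3 = np.eye(3); E3[1, 1] = 5.0            # scale row 1 by 5

S = [E1 @ A, E2 @ A, E3 @ A]              # all row equivalent to A

# A linear combination of the elements of S
lam = [1.0, 2.0, -1.0]
A_C = sum(l * M for l, M in zip(lam, S))

if abs(np.linalg.det(A_C)) > 1e-9:        # if A_C is invertible ...
    # ... the claim says every element of S is invertible too
    assert all(abs(np.linalg.det(M)) > 1e-9 for M in S)
```

With these particular choices $A_C$ does come out invertible, so the assertion is exercised; this is a consistency check on one example, not a replacement for the row-space argument above.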