Monday, July 16, 2018

linear algebra - If row spaces of two $m\times n$ matrices are equal, then the matrices are row-equivalent



I'm trying to prove that if two $m\times n$ matrices $A$ and $B$ over a field $F$ have the same row space, then they are row-equivalent.



Intuitively, it makes sense. Indeed, if $\mathrm{row}(A) = \mathrm{row}(B)$, then each row $[a_{i1}, ..., a_{in}]$ of $A$ is generated by the rows of $B$ and vice versa; that is, for any $i = 1,...,m$ we have

$$[a_{i1},...,a_{in}] = c_{i1}[b_{i_11},...,b_{i_1n}] + ... + c_{it_i}[b_{i_{t_i}1},...,b_{i_{t_i}n}]$$
and
$$[b_{i1},...,b_{in}] = d_{i1}[a_{i'_11},...,a_{i'_1n}] + ... + d_{ip_i}[a_{i'_{p_i}1},...,a_{i'_{p_i}n}].$$



This seems similar to the condition of an elementary operation being applied to a row, but I can't seem to make the final step that would make this rigorous. One idea was to prove by induction that if $A_k$ is row-equivalent to $A$, then so is $A_{k+1}$ (so that eventually $A_{m+1} = B$ is row-equivalent to $A$), where $(A_k)_{ij}$ is equal to $b_{ij}$ if $i < k$ and to $a_{ij}$ otherwise, but I have a feeling that this is wrong.



Edit: This question has been marked as a duplicate of another question for some reason. However, that question asked how to prove that two row-equivalent matrices have the same row space, while my question asks how to prove the converse.


Answer



First of all, elementary row operations can be realized as left multiplication by elementary matrices, that is, matrices differing from the identity by an elementary row operation. Such matrices are invertible.
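For instance (a concrete illustration of my own, not part of the original answer), in the $3\times 3$ case the operation "add $c$ times row $2$ to row $1$" corresponds to left multiplication by
$$
E=\begin{bmatrix} 1 & c & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix},
$$
whose inverse is the elementary matrix of the same shape with $-c$ in place of $c$.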




Also, elementary row operations don't change the row space.
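To see why (a brief sketch along standard lines, not spelled out in the original answer): if an operation replaces row $r_i$ by $r_i+c\,r_j$, every new row is a linear combination of the old rows, so
$$
\mathrm{span}(r_1,\dots,r_i+c\,r_j,\dots,r_m)\subseteq\mathrm{span}(r_1,\dots,r_m),
$$
and applying the inverse operation gives the reverse inclusion. The same two-sided argument handles row swaps and scaling a row by a nonzero constant.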



If you perform Gaussian elimination on $A$, the matrix in row echelon form you get has $k$ nonzero rows and the others (at the bottom) are zero, where $k$ is the dimension of the row space. This echelon form $U$ is row-equivalent to $A$ and $U=SA$, for some invertible matrix $S$.
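As a minimal concrete illustration (with matrices of my own choosing, not from the original answer):
$$
A=\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \end{bmatrix},\qquad
S=\begin{bmatrix} 1 & 0 \\ -2 & 1 \end{bmatrix},\qquad
U=SA=\begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \end{bmatrix},
$$
so here $k=1$ and $S$ encodes the single elementary operation "subtract twice row $1$ from row $2$".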



Similarly, you can get a row echelon form $V=TB$ from $B$. So it's not restrictive to assume that $A$ and $B$ are in row echelon form: indeed, if $V=LU$ for an invertible matrix $L$, then $TB=LSA$ and $B=T^{-1}LSA$.



Note also that the nonzero rows of a matrix in row echelon form are linearly independent.
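A quick way to see this (a standard argument, not spelled out in the original answer): if $r_1,\dots,r_k$ are the nonzero rows, listed from top to bottom, and
$$
\lambda_1 r_1+\lambda_2 r_2+\dots+\lambda_k r_k=0,
$$
then the pivot column of $r_1$ has zero entries in all the rows below it, so $\lambda_1=0$; repeating the argument with $r_2,\dots,r_k$ gives $\lambda_2=\dots=\lambda_k=0$.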



Since $A$ and $B$ have the same row space, they have the same number of nonzero rows. If $a_1,a_2,\dots,a_k$ and $b_1,b_2,\dots,b_k$ are the nonzero rows in $A$ and $B$, then
$$
a_i=\sum_{j=1}^k c_{ij}b_j
$$

and the $k\times k$ matrix $C=[c_{ij}]$ is invertible (why?). Now consider the invertible block matrix
$$
L=\begin{bmatrix} C & 0 \\ 0 & I_{m-k} \end{bmatrix}
$$

(the bottom right corner is the identity matrix; if $k=m$, just consider $L=C$).
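As for why $C$ is invertible (one possible justification, not spelled out in the original answer): write $A_1$ and $B_1$ for the $k\times n$ matrices whose rows are $a_1,\dots,a_k$ and $b_1,\dots,b_k$ (this notation is mine). The relation above reads $A_1=CB_1$, and symmetrically $B_1=DA_1$ for some $k\times k$ matrix $D$, so
$$
(CD-I_k)A_1=0.
$$
A $k\times k$ matrix annihilating $A_1$ on the left must be zero, because its rows give linear relations among the linearly independent rows of $A_1$; hence $CD=I_k$ and $C$ is invertible with inverse $D$.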



Compute $LB$.
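Spelling out the suggested computation (using the block notation $A_1$, $B_1$ introduced above, which is not in the original answer): since the zero rows sit at the bottom,
$$
LB=\begin{bmatrix} C & 0 \\ 0 & I_{m-k} \end{bmatrix}
\begin{bmatrix} B_1 \\ 0 \end{bmatrix}
=\begin{bmatrix} CB_1 \\ 0 \end{bmatrix}
=\begin{bmatrix} A_1 \\ 0 \end{bmatrix}
=A.
$$
Since $L$ is invertible and every invertible matrix is a product of elementary matrices, $A$ and $B$ are row-equivalent; undoing the initial reduction to echelon form then shows that the original matrices are row-equivalent as well.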

