Consider a matrix $A \in \mathbb{R}^{m \times n}$. If $\operatorname{rank}(A) = n$, then $A$ has a left inverse $A_l^{-1} = (A^\top A)^{-1}A^\top$, satisfying $A_l^{-1}A = I_{n\times n}$.
Similarly, if $\operatorname{rank}(A) = m$, then $A$ has a right inverse $A_r^{-1} = A^\top(A A^\top)^{-1}$, satisfying $A A_r^{-1} = I_{m\times m}$.
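As a quick numerical sketch of these two identities (using NumPy; the matrix sizes and seed here are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall matrix, full column rank (rank = n = 3): left inverse exists.
A = rng.standard_normal((5, 3))
A_left = np.linalg.inv(A.T @ A) @ A.T        # (A^T A)^{-1} A^T
assert np.allclose(A_left @ A, np.eye(3))    # A_l^{-1} A = I_{n x n}

# Wide matrix, full row rank (rank = m = 3): right inverse exists.
B = rng.standard_normal((3, 5))
B_right = B.T @ np.linalg.inv(B @ B.T)       # B^T (B B^T)^{-1}
assert np.allclose(B @ B_right, np.eye(3))   # B B_r^{-1} = I_{m x m}
```

Note that $A_l^{-1} A = I$ but $A A_l^{-1} \ne I$ in general (it is only a projection onto the column space), and symmetrically for the right inverse.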
I understand how the concept of a left inverse naturally follows in, say, the solution of a least-squares problem $Ax = b$ with $\operatorname{rank}(A) = n$. But how does the right inverse arise? I expect it involves the fact that when $\operatorname{rank}(A) = m < n$ there are infinitely many solutions to $Ax = b$, and that the right inverse somehow picks out the solution which minimises the length of the solution vector?
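That intuition can be checked numerically: for a full-row-rank underdetermined system, $x = A_r^{-1}b = A^\top(AA^\top)^{-1}b$ does solve $Ax = b$ and is shorter than any other solution (which differs from it by a null-space vector). A small sketch, with arbitrary illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))   # rank(A) = m = 3 < n = 5: underdetermined
b = rng.standard_normal(3)

# Minimum-norm solution via the right inverse.
x_min = A.T @ np.linalg.inv(A @ A.T) @ b
assert np.allclose(A @ x_min, b)            # it solves Ax = b

# Any other solution = x_min + (null-space vector), and is strictly longer,
# because x_min lies in the row space, orthogonal to the null space.
v_null = np.linalg.svd(A)[2][-1]            # a unit null-space direction
x_other = x_min + v_null
assert np.allclose(A @ x_other, b)
assert np.linalg.norm(x_min) < np.linalg.norm(x_other)
```

In this full-row-rank case the right inverse coincides with the Moore-Penrose pseudoinverse, so `x_min` agrees with `np.linalg.pinv(A) @ b`.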