Thursday, May 31, 2018

linear algebra - Practical use of matrix right inverse

Consider a matrix $A \in \mathbb{R}^{m \times n}$. If $rank(A) = n$ (so $m \ge n$), then a left inverse exists: $A_l^{-1} = (A^\top A)^{-1}A^\top$, satisfying $A_l^{-1}A = I_{n\times n}$.


Similarly, if $rank(A) = m$ (so $m \le n$), then a right inverse exists: $A_r^{-1} = A^\top(A A^\top)^{-1}$, satisfying $A A_r^{-1} = I_{m\times m}$.
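These two formulas can be checked numerically. A minimal NumPy sketch (the matrices here are random examples, so they are full rank with probability one):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall matrix, full column rank: left inverse A_l = (A^T A)^{-1} A^T
A = rng.standard_normal((5, 3))
A_l = np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(A_l @ A, np.eye(3)))  # A_l A = I_{3x3}

# Wide matrix, full row rank: right inverse B_r = B^T (B B^T)^{-1}
B = rng.standard_normal((3, 5))
B_r = B.T @ np.linalg.inv(B @ B.T)
print(np.allclose(B @ B_r, np.eye(3)))  # B B_r = I_{3x3}
```

Note that $A_l^{-1} A = I$ but $A A_l^{-1} \ne I$ in general (it is a projection), and symmetrically for the right inverse.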


I understand how the concept of a left inverse naturally arises in, say, the solution of a least-squares problem $Ax = b$ with $rank(A) = n$. But what is a practical use of the right inverse?

I expect the answer to involve the fact that in this case $rank(A) = m < n$, so there are infinitely many solutions to $Ax = b$, and that the right inverse somehow picks out the solution which minimises the length of the solution vector. Is that correct?
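The guess above can be tested directly: for a full-row-rank underdetermined system, $x = A^\top(AA^\top)^{-1}b$ coincides with the minimum-norm solution returned by the pseudoinverse, and adding any null-space component only makes the solution longer. A minimal NumPy sketch with a random example system:

```python
import numpy as np

rng = np.random.default_rng(1)

# Underdetermined system: A is 2x4 with full row rank,
# so Ax = b has infinitely many solutions.
A = rng.standard_normal((2, 4))
b = rng.standard_normal(2)

# Solution via the right inverse A_r = A^T (A A^T)^{-1}
x_right = A.T @ np.linalg.inv(A @ A.T) @ b

# Minimum-norm solution via the Moore-Penrose pseudoinverse
x_pinv = np.linalg.pinv(A) @ b

print(np.allclose(x_right, x_pinv))  # the two coincide
print(np.allclose(A @ x_right, b))   # and it solves the system

# Any other solution (right-inverse solution plus a null-space
# vector) is strictly longer, since x_right is orthogonal to null(A).
null_vec = np.linalg.svd(A)[2][-1]   # a unit vector in null(A)
x_other = x_right + null_vec
print(np.allclose(A @ x_other, b))   # still a solution
print(np.linalg.norm(x_other) > np.linalg.norm(x_right))
```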

