Let $L : V \to V$ be a linear map such that $L^3 = 0$ (i.e. $L^3$ is the zero matrix). Show that $I - L$ is invertible and find $(I - L)^{-1}$ in terms of a polynomial in $L$.
This question is giving me fits. How do I show this? Furthermore, how do I find the inverse in terms of a polynomial in $L$? I know, by the Invertible Matrix Theorem, that the following are equivalent for an $n \times n$ square matrix $A$:
- $A$ is an invertible matrix.
- $A$ is row equivalent to the $n \times n$ identity matrix.
- $A$ has $n$ pivot positions.
- $Ax = 0$ has only the trivial solution.
- The columns of $A$ form a linearly independent set.
- The linear transformation $x \mapsto Ax$ is one-to-one.
- The columns of $A$ span $\mathbb{R}^n$.
- The linear transformation $x \mapsto Ax$ maps $\mathbb{R}^n$ onto $\mathbb{R}^n$.
- There is an $n \times n$ matrix $C$ such that $CA = I$.
- There is an $n \times n$ matrix $D$ such that $AD = I$.
- $A^T$ is an invertible matrix.
and so on.
I'm new to linear algebra; usually I can give a bit more in my questions.
Any help is appreciated.
Answer
Suppose
$$L^k = 0, \qquad k \ge 1; \tag{1}$$
then consider the identity, which holds for any m≥1,
$$L^m - I = (L - I)\left(\sum_{j=0}^{m-1} L^j\right) = (L - I)\left(L^{m-1} + L^{m-2} + \cdots + L + I\right); \tag{2}$$
this equation may easily be proved (by induction on $m$ if you like; distributing $L - I$ over the sum gives $\sum_{j=0}^{m-1} L^{j+1} - \sum_{j=0}^{m-1} L^j$, which telescopes to $L^m - I$), and is quite likely familiar to the reader either from high-school algebra or from the study of roots of unity in field theory. Be that as it may, with (1) in place we see that (2) becomes, with $m = k$,
$$-I = (L - I)\left(L^{k-1} + L^{k-2} + \cdots + L + I\right),$$
which, after multiplying both sides by $-1$, shows that $I - L$ is invertible with inverse
$$(I - L)^{-1} = L^{k-1} + L^{k-2} + \cdots + L + I.$$
The particular case at hand may be resolved by taking $k = 3$, which gives $(I - L)^{-1} = I + L + L^2$; indeed, $(I - L)(I + L + L^2) = I - L^3 = I$.
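If you want a concrete sanity check, here is a minimal NumPy sketch; the particular nilpotent matrix below (a $3 \times 3$ shift matrix) is just one convenient example I chose, not part of the original problem.

```python
import numpy as np

# One convenient nilpotent example: the 3x3 shift matrix, which satisfies L^3 = 0.
L = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

assert np.all(np.linalg.matrix_power(L, 3) == 0)  # L^3 is the zero matrix

I = np.eye(3)
candidate = I + L + L @ L  # the claimed inverse, I + L + L^2

# (I - L)(I + L + L^2) should be the identity, on both sides.
print(np.allclose((I - L) @ candidate, I))  # True
print(np.allclose(candidate @ (I - L), I))  # True
```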