Monday, July 11, 2016

linear algebra - Does the inverse of a polynomial matrix have polynomial growth?




Let M : \mathbb{R}^n \to \mathbb{R}^{n \times n} be a matrix-valued function whose entries m_{ij}(x_1, \dots, x_n) are all multivariate polynomials with real coefficients. Suppose that M(\mathbf{x}) is invertible for every \mathbf{x} \in \mathbb{R}^n. I would like to show that M^{-1} has at most polynomial growth.



That is, let \|\cdot\| be your favorite matrix norm on \mathbb{R}^{n \times n}. I want to show there are constants C, N such that \|M^{-1}(\mathbf{x})\| \le C(1 + |\mathbf{x}|)^N.
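For a concrete illustration (this example is mine, not part of the original question), take n = 2 and M(x_1, x_2) = \begin{pmatrix} 1 & x_1 \\ 0 & 1 \end{pmatrix}. This is invertible for every \mathbf{x}, with M^{-1}(\mathbf{x}) = \begin{pmatrix} 1 & -x_1 \\ 0 & 1 \end{pmatrix}, so \|M^{-1}(\mathbf{x})\| genuinely grows (linearly in |x_1|) and the claimed bound holds with N = 1 but not N = 0. The content of the question is that the growth can never be worse than polynomial.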





One possible approach starts as follows. Let p(\lambda, \mathbf{x}) = \det(M(\mathbf{x}) - \lambda I) be the characteristic polynomial of M(\mathbf{x}). Then, if we consider M as a matrix over the commutative ring \mathbb{R}[\mathbf{x}] = \mathbb{R}[x_1, \dots, x_n], we can apply the Cayley-Hamilton theorem to conclude that p(M(\mathbf{x}), \mathbf{x}) = 0. Since M(\mathbf{x}) is invertible for every \mathbf{x}, we have that p(0, \mathbf{x}) = \det(M(\mathbf{x})) is a polynomial with no zeros; let us suppose p(0, \mathbf{x}) > 0. Then by writing p(\lambda, \mathbf{x}) = \lambda r(\lambda, \mathbf{x}) + p(0, \mathbf{x}) for some other polynomial r, we get that M(\mathbf{x})^{-1} = -\frac{r(M(\mathbf{x}), \mathbf{x})}{p(0,\mathbf{x})}.

The numerator certainly has polynomial growth, so it would remain to estimate 1/p(0,\mathbf{x}) = 1/\det(M(\mathbf{x})). I am not sure of an easy way to do that; I considered applying Hilbert's 17th problem (as solved by Artin) to write p(0,\mathbf{x}) as a sum of squares of rational functions, but that seems much too complicated.
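To make the construction concrete, here is the 2 \times 2 case worked out (my addition, not part of the original question). There p(\lambda, \mathbf{x}) = \lambda^2 - \operatorname{tr}(M(\mathbf{x}))\,\lambda + \det(M(\mathbf{x})), so r(\lambda, \mathbf{x}) = \lambda - \operatorname{tr}(M(\mathbf{x})), and the displayed formula becomes M(\mathbf{x})^{-1} = \frac{\operatorname{tr}(M(\mathbf{x}))\,I - M(\mathbf{x})}{\det(M(\mathbf{x}))}, the usual adjugate formula for 2 \times 2 matrices. In general -r(M(\mathbf{x}), \mathbf{x}) is the adjugate of M(\mathbf{x}), whose entries are polynomials in the entries of M(\mathbf{x}) and hence in \mathbf{x}; this is exactly why the numerator has polynomial growth, leaving only the denominator \det(M(\mathbf{x})) to control.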



I feel like there must be something much simpler that I am missing.


Answer



David Speyer's answer here shows that the reciprocal of a polynomial with no real zeros has at most polynomial growth, as a consequence of Stengle's Positivstellensatz. So the approach described in the question works, and the answer to the question is positive.
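For intuition (an illustrative example of my own, not taken from the linked answer), the statement being used is that a polynomial q with q(\mathbf{x}) > 0 for all \mathbf{x} \in \mathbb{R}^n satisfies q(\mathbf{x}) \ge c\,(1 + |\mathbf{x}|)^{-N} for some constants c > 0 and N. This cannot be strengthened to a uniform positive lower bound: q(x, y) = (xy - 1)^2 + x^2 is strictly positive on \mathbb{R}^2, yet q(x, 1/x) = x^2 \to 0 as x \to 0, so its infimum is 0; along that curve q decays like |y|^{-2}, consistent with the polynomial-rate bound. Applied to q = p(0, \cdot) = \det(M(\cdot)), this is precisely the estimate on 1/\det(M(\mathbf{x})) that the question required.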

