Inspired by some of the greats on this site, I've been trying to improve my residue theorem skills. I've come across the integral
$$\int_0^\infty \frac{x^n - 2x + 1}{x^{2n} - 1}\,dx,$$
where $n$ is a positive integer with $n \ge 2$, and I'd like to evaluate it with the residue theorem. By non-complex methods I know that the integral is $0$ for all $n \ge 2$, and I know that it can also be done with the residue theorem.
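(As a quick numerical sanity check of that claim, here is a minimal sketch using SciPy; the helper name `integrand` and the split at the removable singularity $x = 1$ are just illustrative choices, not anything from the problem itself.)

```python
import numpy as np
from scipy.integrate import quad

def integrand(x, n):
    # (x^n - 2x + 1) / (x^(2n) - 1); at x = 1 both numerator and
    # denominator vanish, so fall back to the limiting value (n - 2) / (2n).
    den = x**(2 * n) - 1.0
    if abs(den) < 1e-12:
        return (n - 2) / (2.0 * n)
    return (x**n - 2.0 * x + 1.0) / den

for n in range(2, 7):
    # Split the range at the removable singularity x = 1 to help the quadrature.
    left, _ = quad(integrand, 0, 1, args=(n,))
    right, _ = quad(integrand, 1, np.inf, args=(n,))
    print(f"n = {n}: integral ~ {left + right:.2e}")
```

For each $n$ the printed value should be zero up to quadrature error.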
The trouble comes in choosing a contour. We're probably going to use some pie-slice contour, perhaps with an angle small enough to avoid any of the $2n$th roots of unity, and it's clear that the contribution from the outer circular arc will vanish. But I'm having trouble evaluating the integral along the rest of the contour, or getting the pieces to cancel.
Can you help? (Also, do you have a book reference for collections of integrals evaluated with the residue theorem that might contain similar examples?)
Answer
We want to prove that the integral is $0$ for $n > 1$. Since
$$\frac{x^n - 2x + 1}{x^{2n} - 1} = \frac{x^n - 1}{x^{2n} - 1} - \frac{2(x - 1)}{x^{2n} - 1} = \frac{1}{x^n + 1} - \frac{2(x - 1)}{x^{2n} - 1},$$
this is the same thing as showing that
$$\int_0^\infty \frac{dx}{x^n + 1} = 2\int_0^\infty \frac{x - 1}{x^{2n} - 1}\,dx.$$
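(Independently of the contour argument below, this identity can be cross-checked against two standard real-variable formulas, namely $\int_0^\infty \frac{dx}{x^n+1} = \frac{\pi}{n\sin(\pi/n)}$ and $\mathrm{PV}\!\int_0^\infty \frac{x^m}{x^{2n}-1}\,dx = -\frac{\pi}{2n}\cot\frac{(m+1)\pi}{2n}$ for $0 \le m < 2n-1$; taking those as known,
$$2\int_0^\infty \frac{x-1}{x^{2n}-1}\,dx
= 2\left(\mathrm{PV}\!\int_0^\infty \frac{x\,dx}{x^{2n}-1} - \mathrm{PV}\!\int_0^\infty \frac{dx}{x^{2n}-1}\right)
= \frac{\pi}{n}\left(\cot\frac{\pi}{2n} - \cot\frac{\pi}{n}\right)
= \frac{\pi}{n\sin(\pi/n)},$$
using $\cot\theta - \cot 2\theta = 1/\sin 2\theta$ with $\theta = \pi/(2n)$, which agrees with $\int_0^\infty \frac{dx}{x^n+1}$.)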
Notice that $\int_C f(z)\,dz = 0$ (the contour integral is always taken counterclockwise), since $f$ is holomorphic inside $C$, and that $\left|\int_{C_2} f(z)\,dz\right| = O(r^{-1}) \to 0$ as $r \to \infty$, where $C_2$ is the outer circular arc of radius $r$.
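(Whatever the precise choice of $f$ here, the natural candidates being $\frac{1}{z^n+1}$ or $\frac{z-1}{z^{2n}-1}$, the decay on the outer arc follows from the standard ML estimate: for a rational $f$ whose denominator degree exceeds the numerator degree by at least $2$, as both candidates do once $n \ge 2$,
$$\left|\int_{C_2} f(z)\,dz\right| \le \operatorname{length}(C_2)\cdot\max_{z \in C_2}|f(z)| = O(r)\cdot O(r^{-2}) = O(r^{-1}).)$$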