Question:$$\int\limits_0^1\mathrm dx\,\frac {\log\log\frac 1x}{(1+x)^2}=\frac 12\log\frac {\pi}2-\frac {\gamma}2$$
I’ve had some practice with similar integrals, but this one eludes me for some reason. I first made the substitution $x\mapsto-\log x$ to get rid of the nested logarithm, which gives$$\mathfrak{I}=\int\limits_0^{\infty}\mathrm dx\,\frac {e^{-x}\log x}{(1+e^{-x})^2}$$
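To spell that step out: putting $u=-\log x$, so that $x=e^{-u}$ and $\mathrm dx=-e^{-u}\,\mathrm du$, the endpoints $x=0^+$ and $x=1$ become $u=\infty$ and $u=0$, and$$\int\limits_0^1\mathrm dx\,\frac {\log\log\frac 1x}{(1+x)^2}=\int\limits_0^{\infty}\mathrm du\,\frac {e^{-u}\log u}{(1+e^{-u})^2},$$which is the integral above after relabelling $u$ as $x$.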
Expanding the integrand as $\frac {e^{-x}}{(1+e^{-x})^2}=\sum\limits_{n\geq0}(-1)^n(n+1)e^{-(n+1)x}$ and integrating term by term gives$$\mathfrak{I}=\sum\limits_{n\geq0}(n+1)(-1)^n\int\limits_0^{\infty}\mathrm dx\, e^{-x(n+1)}\log x$$The inner integral, I thought, could be evaluated by differentiating the gamma function, giving$$\int\limits_0^{\infty}\mathrm dt\, e^{-t(n+1)}\log t=-\frac {\gamma}{n+1}-\frac {\log(n+1)}{n+1}$$
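To spell out the gamma-function step: differentiating the Laplace integral $\int\limits_0^{\infty}\mathrm dt\, e^{-st}t^{z-1}=\Gamma(z)s^{-z}$ (valid for $s>0$, $\Re z>0$) with respect to $z$ and then setting $z=1$, $s=n+1$ gives$$\int\limits_0^{\infty}\mathrm dt\, e^{-t(n+1)}\log t=\frac{\partial}{\partial z}\Bigl[\Gamma(z)\,(n+1)^{-z}\Bigr]\Bigg|_{z=1}=\frac{\Gamma'(1)-\log(n+1)}{n+1}=-\frac{\gamma+\log(n+1)}{n+1},$$using $\Gamma(1)=1$ and $\Gamma'(1)=-\gamma$.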
However, when I simplify everything and split the sum, neither resulting series converges. If the alternating series is interpreted as a Cesàro sum, then I know for sure that$$\sum\limits_{n\geq0}(-1)^n=\frac 12,$$which eventually does give the right answer. But I’m not sure we’re really allowed to do that, especially since, taken at face value, neither series converges.
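For concreteness, the formal computation I have in mind (interpreting both divergent series in the Abel/Cesàro sense, and using the Dirichlet eta function $\eta(s)=\sum\limits_{m\geq1}(-1)^{m-1}m^{-s}$ together with the known value $\eta'(0)=\frac 12\log\frac {\pi}2$) is$$\mathfrak{I}=\sum\limits_{n\geq0}(-1)^n\bigl(-\gamma-\log(n+1)\bigr)=-\gamma\sum\limits_{n\geq0}(-1)^n-\sum\limits_{m\geq1}(-1)^{m-1}\log m=-\frac {\gamma}2+\eta'(0)=\frac 12\log\frac {\pi}2-\frac {\gamma}2$$So the stated answer does come out, but only after assigning values to series that do not converge in the ordinary sense, and that is exactly the step I would like to justify.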