My question specifically deals with certain real indefinite integrals such as ∫e^(−x²) dx and ∫√(1+x³) dx
Answer
The trick is to make precise the meaning of "elementary": essentially, these are the functions expressible as finite combinations of polynomials, exponentials and logarithms. It is then possible to show (by an algebraically tedious case analysis, though not necessarily invoking much of differential Galois theory - see e.g. Rosenlicht's paper on the Liouville-Ostrowski theorem) that a function admitting an elementary antiderivative can always be written as the sum of a simple derivative and a linear combination of logarithmic derivatives.

One consequence of this is the notable criterion that a (real or complex) function of the form x ↦ f(x)e^(g(x)), where f, g are rational functions, admits an elementary antiderivative in the above sense if and only if the differential equation y′ + g′y = f admits a rational solution. The problem of showing that e^(x²) and the lot have no elementary indefinite integrals is then reduced to simple algebra. In any case, this isn't an unsolved problem and there is not much mystery to it once you've seen the material.
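To illustrate the "simple algebra", here is a sketch of the criterion applied to e^(x²), i.e. f = 1 and g = x², written out in LaTeX:

```latex
% Liouville's criterion for f(x) e^{g(x)} with f = 1, g = x^2:
% an elementary antiderivative exists iff some rational R satisfies
\[
  R' + 2xR = 1, \qquad R \in \mathbb{C}(x).
\]
% Step 1: R can have no pole. At a pole of order n, the term R'
% has a pole of order n + 1, which neither 2xR (order n) nor the
% constant 1 can cancel. Hence R must be a polynomial.
% Step 2: if R is a nonzero polynomial of degree d, then
\[
  \deg(R' + 2xR) = d + 1 \ \geq\ 1, \quad\text{but}\quad \deg(1) = 0,
\]
% and R = 0 gives 0 = 1. So no rational solution exists, and
% \int e^{x^2}\,dx is not elementary.
```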
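For a machine check, SymPy ships a partial implementation of the Risch algorithm; a minimal sketch, assuming SymPy is installed. Note that `risch_integrate` handles only the purely transcendental case, so it applies to e^(x²) but not to the algebraic integrand √(1+x³):

```python
# Minimal check with SymPy's partial Risch algorithm.
# risch_integrate returns a NonElementaryIntegral object when it
# can prove that no elementary antiderivative exists.
from sympy import exp, symbols
from sympy.integrals.risch import NonElementaryIntegral, risch_integrate

x = symbols('x')

result = risch_integrate(exp(x**2), x)
# The returned object is a proof of non-elementarity, not a failure to integrate.
print(isinstance(result, NonElementaryIntegral))
```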