Let $f:(a,b)\to\mathbb R$ be increasing, bounded and continuous on $(a,b)$. Prove that $f$ is uniformly continuous on $(a,b)$.
Since $f$ is bounded, there exists an $M\in\mathbb R$ such that $|f(x)|\le M$ for all $x\in(a,b)$,
but I am not sure how to show uniform continuity from there.
Answer
Since $f$ is continuous on $(a,b)$, bounded and increasing, there's a unique continuous extension of $f$ to $[a,b]$. This works because both limits $f(b):=\lim_{x\to b^-}f(x)$ and $f(a):=\lim_{x\to a^+}f(x)$ are guaranteed to exist, since every bounded and increasing (respectively bounded and decreasing) sequence converges. To prove this, simply observe that for an increasing and bounded sequence, all $x_m$ with $m>n$ have to lie within $[x_n,M]$, where $M=\sup_n x_n$. Add to that the fact that, by the very definition of $\sup$, there are $x_n$ arbitrarily close to $M$.
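Concretely, for an increasing $f$ the one-sided limits are just an infimum and a supremum, so the extension (written $\tilde f$ here only to distinguish it from $f$) can be given explicitly:
$$\tilde f(x) := \begin{cases} \lim_{t\to a^+} f(t) = \inf_{t\in(a,b)} f(t) & \text{if } x=a,\\[2pt] f(x) & \text{if } x\in(a,b),\\[2pt] \lim_{t\to b^-} f(t) = \sup_{t\in(a,b)} f(t) & \text{if } x=b, \end{cases}$$
and both the $\inf$ and the $\sup$ are finite precisely because $f$ is bounded. This is also where boundedness is genuinely needed: for example $f(x)=\tan x$ on $(-\frac\pi2,\frac\pi2)$ is increasing and continuous but unbounded, and it is not uniformly continuous.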
You can then use the fact that continuity on a compact set implies uniform continuity, and you're done: restricting a uniformly continuous function to a subset keeps it uniformly continuous, so in particular the original $f$ is uniformly continuous on $(a,b)$. This theorem, by the way, isn't hard to prove either (and the proof shows how powerful the compactness property can be). The proof goes like this:
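To fix notation, the claim to be proved for the extended $f$ is the usual $\epsilon$–$\delta$ form of uniform continuity on $[a,b]$:
$$\forall\,\epsilon>0\ \exists\,\delta>0\ \forall\, u,v\in[a,b]:\quad |u-v|<\delta \implies |f(u)-f(v)|<\epsilon.$$
The point, compared with ordinary continuity, is that a single $\delta$ has to work for every pair $u,v$ at once. So fix an arbitrary $\epsilon>0$; the rest of the argument produces a matching $\delta$.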
First, recall that if $f$ is continuous then the preimage of an open set, and in particular of an open interval, is open. Thus, for every $x\in[a,b]$, the sets
$$C_x := f^{-1}\Big(\big(f(x)-\tfrac\epsilon2,\; f(x)+\tfrac\epsilon2\big)\Big)$$
are open. The crucial property of these $C_x$ is that for all $y\in C_x$ you have $|f(y)-f(x)|<\frac\epsilon2$, and thus
$$|f(u)-f(v)| = |(f(u)-f(x))-(f(v)-f(x))| \le \underbrace{|f(u)-f(x)|}_{<\epsilon/2} + \underbrace{|f(v)-f(x)|}_{<\epsilon/2} < \epsilon \quad\text{for all } u,v\in C_x.$$
Now recall that an open set contains an open interval around each of its points. Each $C_x$ thus contains an open interval around $x$, and you may wlog assume that it's symmetric around $x$ (just make it smaller if it isn't). Thus, there are $\delta_x>0$ such that
$$B_x := \big(x-\tfrac{\delta_x}2,\, x+\tfrac{\delta_x}2\big) \subset (x-\delta_x,\, x+\delta_x) \subset C_x.$$
Note how we made $B_x$ artificially smaller than seems necessary; that will simplify the last stage of the proof. Since each $B_x$ contains $x$, the $B_x$ form an open cover of $[a,b]$, i.e. $\bigcup_{x\in[a,b]} B_x \supset [a,b]$.
Now we invoke compactness. Behold! Since $[a,b]$ is compact, every cover by open sets contains a finite subcover. We can thus pick finitely many $x_i\in[a,b]$ such that we still have $\bigcup_{1\le i\le n} B_{x_i} \supset [a,b]$.
We're nearly there; all that remains are a few applications of the triangle inequality. Since we're only dealing with finitely many $x_i$ now, we can take the minimum of all their $\delta_{x_i}$. As in the definition of the $B_x$, we leave ourselves a bit of room to maneuver later, and actually set
$$\delta := \min_{1\le i\le n} \frac{\delta_{x_i}}{2}.$$
Now pick arbitrary $u,v\in[a,b]$ with $|u-v|<\delta$.
Since our $B_{x_1},\dots,B_{x_n}$ form a cover of $[a,b]$, there's an $i\in\{1,\dots,n\}$ with $u\in B_{x_i}$, and thus $|u-x_i|<\frac{\delta_{x_i}}2$. Having been conservative in the definitions of $B_x$ and $\delta$ pays off, because we get
$$|v-x_i| = |(v-u)+(u-x_i)| \le \underbrace{|v-u|}_{<\,\delta\,\le\,\delta_{x_i}/2} + \underbrace{|u-x_i|}_{<\,\delta_{x_i}/2} < \delta_{x_i}.$$
This doesn't imply $v\in B_{x_i}$ (the distance would have to be less than $\frac{\delta_{x_i}}2$ for that), but it does imply $v\in C_{x_i}$! We thus have $u\in B_{x_i}\subset C_{x_i}$ and $v\in C_{x_i}$, and by the definition of $C_x$ (see the remark about the crucial property of $C_x$ above) thus $|f(u)-f(v)|<\epsilon$.
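In summary, with $\delta:=\min_{1\le i\le n}\frac{\delta_{x_i}}2$ we have shown
$$u,v\in[a,b],\ |u-v|<\delta \;\implies\; |f(u)-f(v)|<\epsilon,$$
which is exactly uniform continuity of the extension on $[a,b]$; restricting back to $(a,b)$, as noted at the beginning, answers the original question.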