If $f(x)$ takes a finite real value for all $x$ on the closed interval $[a,b]$, must there be a real number $M$ such that $M \ge f(x)$ for all $x$ on this interval? It seems that if not, there must be a point $c \in [a,b]$ such that $\lim_{x \to c} f(x) = +\infty$, and so $f(x)$ must be undefined at some point on this interval, but I don't know how to make this rigorous.
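To spell out the step I have in mind (only a sketch of the intuition, not a proof): if no such $M$ exists, then
$$\forall n \in \mathbb{N}\ \exists x_n \in [a,b] \text{ such that } f(x_n) > n,$$
and by Bolzano–Weierstrass some subsequence of $(x_n)$ converges to a point $c \in [a,b]$, so $f$ is unbounded on every neighbourhood of $c$. The step I cannot justify is passing from "unbounded near $c$" to $\lim_{x \to c} f(x) = +\infty$.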
Edit: I see that $f(0) = 0$, $f(x) = 1/x$ on $(0,1]$ is a counterexample (written out in display form below). I also see that I have been imprecise with terminology. Let me modify the question: Is there always a sub-interval $[a',b']$ with $a \le a' < b' \le b$ on which $f$ is bounded?
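For concreteness, the counterexample above is
$$f(x) = \begin{cases} 0, & x = 0, \\ 1/x, & 0 < x \le 1, \end{cases}$$
which takes a finite value at every point of $[0,1]$, yet no $M$ works: for any $M > 0$, the point $x = 1/(M+1)$ lies in $(0,1]$ and $f(x) = M + 1 > M$.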