Calculus/Limits


Introducing Limits

Let's consider the following expression: {\frac {n}{n+1}}

As n gets larger and larger, the fraction gets closer and closer to 1.

\left({\tfrac {1}{2}},{\tfrac {2}{3}},{\tfrac {3}{4}},\dots {\tfrac {999}{1000}}\right)

As n approaches infinity, the expression will evaluate to fractions where the difference between them and 1 becomes negligible. The expression itself approaches 1. As mathematicians would say, the limit of the expression as n\! goes to infinity is 1, or in symbols: \lim _{n\rightarrow \infty }{\frac {n}{n+1}}=1.
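To see this numerically, here is a minimal Python sketch (the sample values of n are arbitrary) that evaluates the fraction for increasingly large n:

    # n/(n+1) approaches 1 as n grows.
    for n in (1, 10, 1000, 10**6):
        print(n, n / (n + 1))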

An interesting sequence is S_{n}={\tfrac {1}{n}}. As n\! gets bigger (in symbols, n\rightarrow \infty ), we get smaller and smaller values of S_{n}\!, for

S_{1}=1\!,S_{2}={\tfrac {1}{2}},S_{3}={\tfrac {1}{3}},S_{4}={\tfrac {1}{4}},S_{5}={\tfrac {1}{5}}

and so on. Clearly, S_{n}\! can't be smaller than zero (for if {\tfrac {1}{n}}<0, then n\! would have to be less than zero). Then we may say that \lim _{n\rightarrow \infty }S_{n}=0. Continuing with this sequence, we might want to study what happens when the variable gets near to zero, and later what happens as negative values go near to zero. Usually, the letter n\! is reserved for integer values, so we redefine our sequence as the function S(x)={\frac {1}{x}}. If we take a sequence of values of x\!, say

x=1,x={\tfrac {1}{2}},x={\tfrac {1}{4}},x={\tfrac {1}{8}},x={\tfrac {1}{16}}\!

We see that the respective values of S\! grow indefinitely, for

S(1)=1\!
S({\tfrac {1}{2}})=2\!
S({\tfrac {1}{4}})=4\!
S({\tfrac {1}{8}})=8\!
S({\tfrac {1}{16}})=16\!

In this case, we say that \lim _{x\rightarrow 0^{+}}S(x)=\infty ; in words, the limit of S\! as x\! goes to zero from the right diverges (or tends to infinity, or is unbounded, but we never say that it is infinity or equals infinity). Another case occurs if we study sequences of values of x\! in which every element is less than zero and the sequence increases toward zero without ever reaching it. One example of such a sequence is:

x=-1\!, with S(-1)=-1\!
x=-{\frac {1}{2}}, with S(-{\frac {1}{2}})=-2
x=-{\frac {1}{4}}, with S(-{\frac {1}{4}})=-4
x=-{\frac {1}{8}}, with S(-{\frac {1}{8}})=-8
x=-{\frac {1}{16}}, with S(-{\frac {1}{16}})=-16

The values of S(x)\! decrease without bound. Then we say that \lim _{x\rightarrow 0^{-}}S(x)=-\infty , or that S(x)\! tends to minus infinity as x\! goes to zero from the left.
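A quick numerical check makes both one-sided behaviors visible; this is a minimal Python sketch, with arbitrarily chosen sample points:

    # S(x) = 1/x: sample x approaching 0 from the right, then from the left.
    for x in (1.0, 0.5, 0.25, 0.125, 0.0625):
        print(x, 1 / x)    # values grow without bound
    for x in (-1.0, -0.5, -0.25, -0.125, -0.0625):
        print(x, 1 / x)    # values decrease without bound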

Basic Limits

For some limits (when the function is continuous at the point being approached), the variable can be replaced with its value directly:
\lim _{n\rightarrow x}f(n)=f(x)
For example,
\lim _{x\rightarrow 5}{\frac {x+10}{2x}}={\frac {5+10}{2\cdot 5}}={\frac {3}{2}}
and
\lim _{m\rightarrow b}{\frac {m+b}{m}}={\frac {b+b}{b}}=2 (with b\neq 0)

Others are somewhat more complicated:
\lim _{x\rightarrow \infty }{\frac {5x+1}{x}}=\lim _{x\rightarrow \infty }\left(5+{\frac {1}{x}}\right)=5+0=5

Note that in this limit, one may not immediately set x equal to \infty because this would result in the expression evaluating to
{\frac {\infty }{\infty }}

which is an undefined expression. However, one may reduce the expression by separating the terms into separate fractions (in this case, {\frac {5x}{x}} and {\frac {1}{x}}), which can be evaluated directly.
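Numerically, the original and reduced forms agree; here is a minimal Python sketch (the sample values are chosen arbitrarily):

    # (5x + 1)/x = 5 + 1/x approaches 5 as x grows.
    for x in (10, 1000, 10**6):
        print(x, (5 * x + 1) / x)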

Right and Left Hand Limits

Sometimes, we want to calculate the limit of a function as a variable approaches a certain value from only one side; that is, from the left or the right. These are denoted, respectively, by \lim _{x\rightarrow a^{-}} and \lim _{x\rightarrow a^{+}}. If the left-hand and right-hand limits do not both exist, or are not equal to each other, then the limit does not exist.
The following limit does not exist: \lim _{x\rightarrow 0}{\frac {1}{x}}
It doesn't because the left-hand and right-hand limits are unequal:
\lim _{x\rightarrow 0^{-}}{\frac {1}{x}}=-\infty
\lim _{x\rightarrow 0^{+}}{\frac {1}{x}}=\infty

Note that if the function is undefined at the point where we are trying to find the limit, it doesn't mean that the limit of the function at that point does not exist. For example, let's consider g(x)=1/x^{2} as x approaches 0.

Left-hand limit: \lim _{x\rightarrow 0^{-}}{\frac {1}{x^{2}}}=\infty
Right-hand limit: \lim _{x\rightarrow 0^{+}}{\frac {1}{x^{2}}}=\infty

Since \lim _{x\rightarrow 0^{-}}{\frac {1}{x^{2}}}=\infty =\lim _{x\rightarrow 0^{+}}{\frac {1}{x^{2}}}, we conclude:

\lim _{x\rightarrow 0}{\frac {1}{x^{2}}}=\infty
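A numerical check (a Python sketch with arbitrary sample points) shows both sides blowing up in the same direction:

    # 1/x**2 grows without bound as x approaches 0 from either side.
    for x in (0.1, -0.1, 0.01, -0.01, 0.001, -0.001):
        print(x, 1 / x**2)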

Some Formal Definitions and Properties

Until now, limits have been discussed informally, but it shouldn't be all intuition, for we need to be sure of certain assertions. Take, for example, the limit

\lim _{n\rightarrow \infty }S_{n}=0

We have seen that the sequence decreases as n increases, but how do we guarantee that there isn't a value y, say

y=0.000000000000000000000000000001

such that S_{n} is never smaller than y? If there were such a y, we might want to say that the limit is y, not zero, and we can't test every single possible value of y (for there are infinitely many possibilities). We must then find a mathematical way of proving that no such y exists, and for that we need to define formally what a limit is.


Right Limits, Left Limits, Limits and Continuity

Let f\! be a real-valued function. We say that

\lim _{x\rightarrow x_{0}^{-}}f(x)=c

if for every \epsilon >0 there is a \delta >0 such that, for every x'\! between x_{0}-\delta and x_{0},

|f(x')-c|<\epsilon

where |x| is the absolute value of x.

This is the formal definition of convergence from the left. It means that for each possible error bigger than zero, we are able to find an interval such that, for all x in that interval, the distance between the value of the function and the constant c is less than the error.

TODO: Graphics illustrating this.

In an analogous fashion, we say that

\lim _{x\rightarrow x_{0}^{+}}f(x)=c

if for every \epsilon >0 there is a \delta >0 such that, for every x'\! between x_{0} and x_{0}+\delta , |f(x')-c|<\epsilon .

And to finish the necessary definitions,

\lim _{x\rightarrow x_{0}}f(x)=c

if

\lim _{x\rightarrow x_{0}^{-}}f(x)=c

and

\lim _{x\rightarrow x_{0}^{+}}f(x)=c.

Example:

\lim _{x\rightarrow 0}x^{2}=0

This is an assertion that must be proved. First, let's study the behavior of f(x)=x^{2} near zero:

|x^{2}-0|<\epsilon \Rightarrow |x^{2}|<\epsilon \Rightarrow x^{2}<\epsilon \Rightarrow |x|<{\sqrt {\epsilon }}

where the arrow pointing right means implies. So, define the function

\delta (\epsilon )={\sqrt {\epsilon }}

If |x-0|<\delta (\epsilon ), then |f(x)-0|<\epsilon . We have shown how to find the delta required by the definition, proving that the limit of f\! as x\! tends to zero is zero.
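The recipe can be tested numerically. Below is a minimal Python sketch (the helper check and its sampling scheme are illustrative, not part of the argument) that verifies a proposed \delta for a given \epsilon :

    import math

    def check(f, x0, c, eps, delta, samples=1000):
        # Sample points in (x0 - delta, x0 + delta) and verify |f(x) - c| < eps.
        for i in range(1, samples):
            x = x0 - delta + 2 * delta * i / samples
            if x != x0 and abs(f(x) - c) >= eps:
                return False
        return True

    # For f(x) = x**2 at x0 = 0, the delta found above is sqrt(eps).
    eps = 1e-4
    print(check(lambda x: x * x, 0.0, 0.0, eps, math.sqrt(eps)))  # prints True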

In fact, for any real number x_{0},

\lim _{x\rightarrow x_{0}}x^{2}=x_{0}^{2}

Let's see how to construct a suitable function \delta (\epsilon ). Since

|x^{2}-x_{0}^{2}|=|x-x_{0}||x+x_{0}|

we first bound the factor |x+x_{0}|: if |x-x_{0}|<1, then |x+x_{0}|\leq |x-x_{0}|+2|x_{0}|<1+2|x_{0}|, so

|x-x_{0}|<1\Rightarrow |x^{2}-x_{0}^{2}|<|x-x_{0}|\left(1+2|x_{0}|\right)

So, define

\delta (\epsilon )=\min \left(1,{\frac {\epsilon }{1+2|x_{0}|}}\right)

implying that |x-x_{0}|<\delta (\epsilon ) makes |x^{2}-x_{0}^{2}|<\epsilon for any x_{0}.
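For a concrete instance (the numbers are chosen purely for illustration): with x_{0}=3 and \epsilon =0.1, the recipe gives

\delta (0.1)=\min \left(1,{\frac {0.1}{1+2\cdot 3}}\right)={\frac {0.1}{7}}\approx 0.014

and indeed every x\! within 0.014 of 3 keeps x^{2}\! within 0.1 of 9.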

Functions with the property that

\lim _{x\rightarrow x_{0}}f(x)=f(x_{0})

are called continuous, and they arise very naturally in the physical sciences. Beware that, contrary to most people's intuition, not every function is continuous.

Properties of Limits

Property one: If \lim _{x\rightarrow x_{0}}f(x)=c, then

\lim _{x\rightarrow x_{0}}kf(x)=kc

for any constant k.
Proof: If k=0\!, then kf\! is identically zero and the result is immediate. Otherwise, construct the function \delta (\epsilon ) for f. Then

|x-x_{0}|<\delta (\epsilon )\Rightarrow |f(x)-c|<\epsilon

So

|x-x_{0}|<\delta \left({\tfrac {\epsilon }{|k|}}\right)\Rightarrow |kf(x)-kc|=|k||f(x)-c|<|k|\cdot {\frac {\epsilon }{|k|}}=\epsilon

Then the limit of kf is kc, for the delta function of kf is

\delta '(\epsilon )=\delta \left({\tfrac {\epsilon }{|k|}}\right)

QED.
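As a quick application, combining this property with the earlier result \lim _{x\rightarrow x_{0}}x^{2}=x_{0}^{2} gives, for instance,

\lim _{x\rightarrow 2}5x^{2}=5\lim _{x\rightarrow 2}x^{2}=5\cdot 2^{2}=20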

TODO: Demonstrate main properties of limits (uniqueness, etc.)

L'Hôpital's Rule

L'Hôpital's Rule is used when a limit approaches an indeterminate form. The two main indeterminate forms are {\frac {0}{0}} and {\frac {\infty }{\infty }}. Other indeterminate forms, such as \infty -\infty , can be manipulated algebraically into one of these.

L'Hôpital's Rule states that if a limit of {\frac {f(x)}{g(x)}} approaches an indeterminate form as x approaches a, then (provided f\! and g\! are differentiable near a\! and the limit on the right-hand side exists):
\lim _{x\rightarrow a}{\frac {f(x)}{g(x)}}=\lim _{x\rightarrow a}{\frac {f'(x)}{g'(x)}}

Example: \lim _{x\rightarrow 0}{\frac {\sin(x)}{x}}
Both the numerator and the denominator approach zero as x approaches zero, so the limit is in indeterminate form and l'Hôpital's Rule can be applied. (Note: this limit can also be evaluated with the Sandwich Theorem.)
\lim _{x\rightarrow 0}{\frac {\sin(x)}{x}}=\lim _{x\rightarrow 0}{\frac {(\sin(x))'}{(x)'}}=\lim _{x\rightarrow 0}{\frac {\cos(x)}{1}}
Now the limit is in a usable form: it evaluates to \cos(0)=1.
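This result can be checked symbolically; the sketch below uses the SymPy library (assuming it is installed):

    import sympy

    x = sympy.symbols('x')
    print(sympy.limit(sympy.sin(x) / x, x, 0))  # prints 1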

If the limit resulting from applying l'Hôpital's Rule is still one of the two indeterminate forms mentioned above, we may apply the rule again (to the limit obtained), and again, until a usable form is encountered.

Where does it come from

To obtain l'Hôpital's Rule for a limit of {\frac {f(x)}{g(x)}} which approaches {\frac {0}{0}} as x approaches a, we simply decompose both f(x) and g(x) in terms of their Taylor expansions (centered around a). The constant terms of both expansions must be 0 (because both f(x) and g(x) approach 0), so if we divide both f(x) and g(x) by (x-a) (or, equivalently, take their derivatives), our limit stops being indeterminate.

It could be the case that the Taylor expansions of both the numerator and the denominator have a 0 as coefficient of the (x-a) term, thus yielding an indeterminate. This is the same case mentioned above where the trick was to repeat the process until a suitable limit was found.
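As a concrete illustration of this argument, using the well-known expansion of \sin(x) around 0:

{\frac {\sin(x)}{x}}={\frac {x-{\frac {x^{3}}{6}}+\cdots }{x}}=1-{\frac {x^{2}}{6}}+\cdots \rightarrow 1{\text{ as }}x\rightarrow 0

matching the value obtained earlier via l'Hôpital's Rule.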

The case of a limit which approaches {\frac {\infty }{\infty }} can be transformed to the case above by exchanging {\frac {f(x)}{g(x)}} with {\frac {1/g(x)}{1/f(x)}}, which obviously approaches {\frac {0}{0}}.

Calculus/Limits/Exercises
