## Terms/Theorems (Differentiation)

- Differentiable Function
- The function $f: (a,b)\to \mathbb{R}$ is **differentiable at $x$** if $\lim_{t\to x} \frac{f(t)-f(x)}{t-x}=L$ exists. By definition this means $L$ is a real number and for each $\epsilon > 0$ $\exists$ $\delta > 0$ such that if $0 < |t-x| < \delta$, then the difference quotient above differs from $L$ by less than $\epsilon$.
- The Rules of Differentiation

(a) Differentiability implies continuity.

(b) If $f$ and $g$ are differentiable at $x$ then so is $f + g$, the derivative being $(f + g)'(x) = f'(x) + g'(x)$.

(c) If $f$ and $g$ are differentiable at $x$ then so is $f \cdot g$, the derivative being $(f \cdot g)'(x) = f'(x) \cdot g(x) + f(x) \cdot g'(x)$.

(d) The derivative of a constant is zero, $c' = 0$.

(e) If $f$ and $g$ are differentiable at $x$ and $g(x) \neq 0$ then $f / g$ is differentiable at $x$, the derivative being $(f / g)'(x) = \frac{(f'(x) \cdot g(x) - f(x) \cdot g'(x))}{g(x)^2}$.

(f) If $f$ is differentiable at $x$ and $g$ is differentiable at $y = f(x)$ then $g \circ f$ is differentiable at $x$, the derivative being $(g \circ f)'(x) = g'(y) f'(x)$.
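The product rule (c) and chain rule (f) above can be sanity-checked numerically. The sketch below uses a central difference quotient to approximate derivatives; the sample functions $f = \sin$ and $g(x) = x^2$ are arbitrary illustrative choices, not part of the notes.

```python
import math

def diff(f, x, h=1e-6):
    """Central difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = math.sin          # f'(x) = cos(x)
g = lambda x: x**2    # g'(x) = 2x
x = 1.3

# Product rule (c): (f * g)'(x) = f'(x) g(x) + f(x) g'(x)
assert abs(diff(lambda t: f(t) * g(t), x)
           - (math.cos(x) * g(x) + f(x) * 2 * x)) < 1e-6

# Chain rule (f): (g o f)'(x) = g'(f(x)) * f'(x)
assert abs(diff(lambda t: g(f(t)), x)
           - (2 * f(x) * math.cos(x))) < 1e-6
```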

- Mean Value Theorem
- A continuous function $f: [a,b]\to \mathbb{R}$ that is differentiable on the interval $(a,b)$ has the **mean value property**: there exists a point $c \in (a,b)$ such that $f(b)-f(a)=f'(c)(b-a)$.
- Smoothness Classes
- If $f$ is differentiable and its derivative function $f'(x)$ is a continuous function of $x$, then $f$ is continuously differentiable, and $f$ is of class $C^1$. If $f$ is $r^{th}$ order differentiable and $f^{(r)}(x)$ is a continuous function of $x$, then $f$ is continuously $r^{th}$ order differentiable and $f$ is of class $C^r$. If $f$ is of class $C^r$ for all finite $r$, then $f$ is smooth and we say that $f$ is of class $C^\infty$. A continuous function is of class $C^0$.
- Uniform Continuity
- A function $f$ is uniformly continuous on an interval $[a,b]$ or $(a,b)$ if for each $\epsilon > 0$ $\exists$ $\delta > 0$ such that if $0 < |x-t| < \delta$, then $|f(x)-f(t)|<\epsilon$. Note that $\delta$ can only depend on $\epsilon$ and not on $x$, unlike ordinary continuity, where $\delta$ can depend on both $x$ and $\epsilon$.
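The Mean Value Theorem above can also be illustrated numerically. The sketch below takes the arbitrary sample function $f(x) = x^3$ on $[0, 2]$ and locates the point $c$ with $f'(c) = \frac{f(b)-f(a)}{b-a}$ by bisection (here the exact answer is $c = \sqrt{4/3}$).

```python
def f(x):
    return x**3

def fprime(x):
    return 3 * x**2

a, b = 0.0, 2.0
slope = (f(b) - f(a)) / (b - a)   # secant slope = 4

# f' is increasing on [0, 2] with f'(a) < slope < f'(b),
# so bisection on f'(c) - slope finds the MVT point c.
lo, hi = a, b
for _ in range(60):
    mid = (lo + hi) / 2
    if fprime(mid) - slope < 0:
        lo = mid
    else:
        hi = mid
c = (lo + hi) / 2

assert a < c < b
assert abs(fprime(c) - slope) < 1e-9
assert abs(c - (4 / 3) ** 0.5) < 1e-9   # exact value c = sqrt(4/3)
```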

## Terms/Theorems (Series)

- Absolute Convergence
- $\sum a_n$ converges **absolutely** if $\sum |a_n|$ converges. If a series converges absolutely, then every rearrangement converges to the same value.
- Comparison Test
- If $\sum b_n$ converges to $\beta$ and $|a_n| \leq b_n$, then $\sum a_n$ converges to some $\alpha$ with $|\alpha| \leq \beta$. (Note: convergence is still guaranteed if $|a_n| \leq b_n$ holds only for $n>N$.)
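A quick numerical illustration of the comparison test, with the illustrative choices $a_n = \sin(n)/n^2$ and $b_n = 1/n^2$ (so $|a_n| \leq b_n$ for every $n$):

```python
import math

N = 100000
a_terms = [math.sin(n) / n**2 for n in range(1, N + 1)]   # |a_n| <= 1/n^2
b_terms = [1 / n**2 for n in range(1, N + 1)]

# The comparison hypothesis |a_n| <= b_n holds term by term.
assert all(abs(a) <= b for a, b in zip(a_terms, b_terms))

# The partial sums reflect the conclusion |alpha| <= beta.
alpha = sum(a_terms)
beta = sum(b_terms)
assert abs(alpha) <= beta
```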
- Geometric series
- $\sum_{k=0}^\infty \lambda^k = 1 + \lambda + \cdots + \lambda^n + \cdots$, which converges to $\frac{1}{1-\lambda}$ if and only if $|\lambda| < 1$.
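A minimal numerical check that the partial sums approach the closed form $\frac{1}{1-\lambda}$; the value $\lambda = 0.5$ is an arbitrary sample with $|\lambda| < 1$.

```python
lam = 0.5   # any |lam| < 1 works; 0.5 is an illustrative choice
partial_sums = [sum(lam**k for k in range(n + 1)) for n in (5, 20, 50)]
closed_form = 1 / (1 - lam)   # = 2 here

# Partial sums approach 1/(1 - lam), and the error shrinks with n.
assert abs(partial_sums[-1] - closed_form) < 1e-12
assert abs(partial_sums[0] - closed_form) > abs(partial_sums[-1] - closed_form)
```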
- Harmonic Series
- $\sum_{k=1}^\infty \frac{1}{k} = 1 +\frac{1}{2} + \frac{1}{3} + \cdots$, which diverges.
- p-Test
- The series $\sum_{n=1}^\infty \frac{1}{n^p}$ converges if and only if $p>1$. This follows from the integral test: if the area under the curve $y = 1/x^p$ for $x \geq 1$ is finite, the series converges; if the area is infinite, the series diverges.
- Rearrangement Theorem

(1) Let $\sum_{n=1}^\infty a_n$ be an absolutely convergent series. Then any rearrangement of the terms of that series results in a new series that is also absolutely convergent to the same limit.

(2) Let $\sum_{n=1}^\infty a_n$ be a conditionally convergent series. Then, for any real number $c$, there is a rearrangement of the series such that the new resulting series converges to $c$.
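Part (2) is constructive, and the standard greedy construction can be sketched in code. Below it is applied to the conditionally convergent alternating harmonic series $\sum (-1)^{n+1}/n$: take positive terms while the partial sum is below the target $c$, negative terms while it is above. The target $c = 0.3$ is an arbitrary illustrative choice.

```python
def rearranged_partial_sum(c, num_terms):
    """Greedy rearrangement of sum (-1)^(n+1)/n steering toward c."""
    pos = iter(1.0 / n for n in range(1, 10**7, 2))    # 1, 1/3, 1/5, ...
    neg = iter(-1.0 / n for n in range(2, 10**7, 2))   # -1/2, -1/4, ...
    s = 0.0
    for _ in range(num_terms):
        # Add a positive term if below the target, a negative one if above.
        s += next(pos) if s < c else next(neg)
    return s

# The rearranged partial sums approach the chosen target, not ln(2).
s = rearranged_partial_sum(0.3, 100000)
assert abs(s - 0.3) < 1e-3
```

The partial sums oscillate around $c$, and since the unused terms shrink to $0$, the overshoot does too, which is why the rearranged series converges to $c$.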

## Terms/Theorems (Sequences/Series of Functions)

- Analytic Function
- A function is analytic if it has a convergent power series expansion about every point of its domain. Example: $\sin(x)$.
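For the $\sin(x)$ example, the partial sums of its power series about $0$, namely $x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots$, already match the function to near machine precision after a handful of terms:

```python
import math

def sin_taylor(x, num_terms):
    """Partial sum of the power series sin x = x - x^3/3! + x^5/5! - ..."""
    return sum((-1)**k * x**(2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(num_terms))

# 15 terms suffice for these sample points.
for x in (0.0, 0.5, 1.0, 2.0):
    assert abs(sin_taylor(x, 15) - math.sin(x)) < 1e-12
```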
- Pointwise Convergence
- A sequence of functions $f_n : [a,b] \to \mathbb{R}$ **converges pointwise** to a limit function $f : [a,b] \to \mathbb{R}$ if for each $x \in [a,b]$, $\lim_{n\to \infty} f_n(x) = f(x)$.
- Power Series
- $\sum_{k=0}^\infty a_k x^k = a_0+a_1x+a_2x^2+\cdots+a_nx^n+\cdots$
- Uniform Convergence
- A sequence $(f_n)$ converges uniformly to a function $f$ if $\forall \epsilon>0$, $\exists N$ such that for all $n>N$, $|f_n(x)-f(x)|<\epsilon$ $\forall x \in$ domain.
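The classic example separating the two notions is $f_n(x) = x^n$ on $[0,1)$: it converges pointwise to the zero function, but not uniformly, because $\sup_{x} |f_n(x) - 0|$ never drops below $\frac{1}{2}$ (witnessed by $x_n = (1/2)^{1/n}$).

```python
def f_n(n, x):
    return x**n

# Pointwise convergence: at each fixed x in [0, 1), x^n -> 0.
assert f_n(1000, 0.5) < 1e-12
assert f_n(1000, 0.99) < 1e-4

# Failure of uniformity: for every n the point x_n = (1/2)^(1/n)
# lies in [0, 1) and satisfies f_n(x_n) = 1/2, so no single N works
# for epsilon < 1/2 simultaneously for all x.
for n in (10, 100, 1000):
    x_n = 0.5 ** (1 / n)
    assert abs(f_n(n, x_n) - 0.5) < 1e-9
```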
- Uniform Convergence Theorem
- If the sequence $f_n$ converges uniformly to $f$ and each $f_n$ is continuous, then $f$ is continuous.