Subsection 3.1.1 Derivative of Power Functions, General Power Rule
For power functions, where the variable is raised to a constant power, there is a simple rule for differentiation.
Theorem 3.1.1. Power rule for derivatives.
If \(f(x) = x^r\) for some \(r \in \mathbb{R}\text{,}\) then \(f'(x) = rx^{r-1}\text{.}\) In other words,
\begin{equation*}
\boxed{\frac{d}{dx} x^r = rx^{r-1}}
\end{equation*}
That is, to differentiate a power of the variable \(x\text{,}\) “bring down” the exponent \(r\) to the front as a coefficient, and subtract 1 from the exponent.
This theorem is widely used, because it allows for differentiation of polynomials, but also radicals (i.e. expressions of the form \(\sqrt[b]{x^a} = x^{a/b}\)), and in general any term that can be represented as a power of the variable \(x\text{.}\) However, it is difficult to prove in full generality without using more advanced rules introduced later on.
Example 3.1.2.
Notice that the derivatives,
\begin{equation*}
\frac{d}{dx} x = 1 \qquad \frac{d}{dx} x^2 = 2x \qquad \frac{d}{dx} x^3 = 3x^2 \qquad \frac{d}{dx} \brac{\frac{1}{x}} = -\frac{1}{x^2}
\end{equation*}
are special cases of this rule, with \(r = 1, 2, 3\) and \(r = -1\text{,}\) respectively.
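The power rule also handles radicals, once they are rewritten as fractional powers. For instance,
\begin{equation*}
\frac{d}{dx} \sqrt{x} = \frac{d}{dx} x^{1/2} = \frac{1}{2} x^{-1/2} = \frac{1}{2\sqrt{x}}
\end{equation*}
which is the case \(r = 1/2\text{.}\)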
Subsection 3.1.2 Sum, Difference, and Constant Multiple Rules
Theorem 3.1.3.
Let \(f, g\) be functions, differentiable at \(x\text{,}\) and let \(k \in \mathbb{R}\text{.}\) Then, the functions \(f + g\text{,}\) \(f - g\text{,}\) and \(kf\) are all differentiable at \(x\text{,}\) and,
Sum/difference rule.
\begin{equation*}
\boxed{(f + g)'(x) = f'(x) + g'(x)}
\end{equation*}
The derivative of a sum/difference is the sum/difference of the derivatives of the individual functions.
Constant multiple rule.
\begin{equation*}
\boxed{(kf)'(x) = kf'(x)}
\end{equation*}
The derivative of a constant multiple times a function is that constant times the derivative of the function.
This theorem is stated for a single point \(x\text{,}\) but of course it can be generalized to all points in an interval \((a,b)\text{.}\) Intuitively, the first statement means that the slope of a sum of functions is the sum of the slopes of the individual functions. The second says that vertically stretching or compressing a function by a constant factor applies the same scaling to its slopes.
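For instance, combining both rules with the power rule,
\begin{equation*}
\frac{d}{dx} \brac{5x^3 - x^2} = 5 \frac{d}{dx} x^3 - \frac{d}{dx} x^2 = 15x^2 - 2x
\end{equation*}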
The sum rule generalizes naturally to \(n\) functions, which means that derivatives can be determined term-by-term.
Corollary 3.1.4.
Let \(f_1, \dots, f_n\) be functions, differentiable at \(x\text{.}\) Then, \(f_1 + \dots + f_n\) is differentiable at \(x\text{,}\) and,
\begin{equation*}
\boxed{(f_1 + \dots + f_n)'(x) = f_1'(x) + \dots + f_n'(x)}
\end{equation*}
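For instance, with \(f_1(x) = x\text{,}\) \(f_2(x) = x^2\text{,}\) and \(f_3(x) = x^3\text{,}\)
\begin{equation*}
\frac{d}{dx} \brac{x + x^2 + x^3} = 1 + 2x + 3x^2
\end{equation*}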
Subsection 3.1.3 Differentiation of Polynomials
By combining all of the previous results, we can show that any polynomial is differentiable on \(\mathbb{R}\text{,}\) and determine its derivative. Recall that a polynomial is a function of the form,
\begin{equation*}
f(x) = a_n x^n + a_{n-1} x^{n-1} + \dots + a_1 x + a_0
\end{equation*}
where \(a_n, \dots, a_0 \in \mathbb{R}\text{,}\) \(n \in \mathbb{N}\text{.}\) In other words, it is a sum of monomials of the form \(a_k x^k\text{.}\) Using the power rule (for positive integers) and constant multiple rule, we can differentiate each monomial.
\begin{equation*}
\frac{d}{dx} (a_k x^k) = a_k \frac{d}{dx} x^k = a_k k x^{k-1}
\end{equation*}
Then, using the sum rule, we can determine the derivative of a polynomial,
\begin{align*}
f'(x) \amp = \frac{d}{dx} \brac{a_n x^n + a_{n-1} x^{n-1} + \dots + a_1 x + a_0}\\
\amp = \frac{d}{dx} (a_n x^n) + \frac{d}{dx} (a_{n-1} x^{n-1}) + \dots + \frac{d}{dx} (a_1 x) + \frac{d}{dx} a_0 \amp\amp \text{by the sum rule}\\
\amp = a_n n x^{n-1} + a_{n-1} (n - 1) x^{n-2} + \dots + a_1 \amp\amp \text{by the power rule, since } \tfrac{d}{dx} a_0 = 0
\end{align*}
Notice that the derivative of a polynomial of degree \(n\) is a polynomial of degree \(n-1\text{.}\) For example, the derivative of a cubic polynomial is quadratic, and the derivative of a quadratic polynomial is linear.
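For instance, for the cubic polynomial \(f(x) = 2x^3 - 5x + 7\text{,}\)
\begin{equation*}
f'(x) = 2 \cdot 3x^2 - 5 \cdot 1 + 0 = 6x^2 - 5
\end{equation*}
which is quadratic, as expected.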