
Section 3.1 Basic Differentiation Rules

Subsection 3.1.1 Derivative of Power Functions, General Power Rule

For power functions, where the variable is raised to a constant power, there is a simple rule for differentiation: if \(r\) is a real number, then
\begin{equation*} \frac{d}{dx} x^r = r x^{r-1}\text{.} \end{equation*}
In other words, to differentiate a function that is some power of the variable \(x\text{,}\) “bring down” the exponent \(r\) into the “front” as a coefficient, and subtract 1 from the exponent.
This theorem is widely used because it allows for differentiation not only of polynomials, but also of radicals (i.e. expressions of the form \(\sqrt[b]{x^a} = x^{a/b}\)) and, in general, any term that can be represented as a power of the variable \(x\text{.}\) However, it is difficult to prove in full generality without more advanced rules introduced later on.
Notice that the familiar derivatives,
\begin{equation*} \frac{d}{dx} x = 1 \qquad \frac{d}{dx} x^2 = 2x \qquad \frac{d}{dx} x^3 = 3x^2 \qquad \frac{d}{dx} \left(\frac{1}{x}\right) = -\frac{1}{x^2} \end{equation*}
are special cases of this rule, with \(r = 1, 2, 3\) and \(r = -1\text{,}\) respectively.
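For example, the power rule applies to a radical such as \(\sqrt{x}\) once it is rewritten as a power of \(x\text{:}\)
\begin{align*} \frac{d}{dx} \sqrt{x} \amp = \frac{d}{dx} x^{1/2}\\ \amp = \frac{1}{2} x^{1/2 - 1}\\ \amp = \frac{1}{2} x^{-1/2} = \frac{1}{2\sqrt{x}}\text{,} \end{align*}
which corresponds to the case \(r = 1/2\text{.}\)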

Subsection 3.1.2 Sum, Difference, and Constant Multiple Rules

If \(f\) and \(g\) are differentiable at \(x\) and \(c\) is a constant, then
\begin{equation*} (f + g)'(x) = f'(x) + g'(x) \qquad \text{and} \qquad (cf)'(x) = c f'(x)\text{.} \end{equation*}
This theorem is about a single point \(x\text{,}\) but it can of course be generalized to all points in an interval \((a,b)\text{.}\) Intuitively, the first statement means that the slope of a sum of functions is the sum of the slopes of the individual functions. The second says that scaling a function by a vertical stretch or compression scales the slopes of the function by the same factor.
The sum rule generalizes naturally to \(n\) functions: the derivative of a sum of \(n\) differentiable functions is the sum of their derivatives. In particular, derivatives can be determined term-by-term.
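For instance, combining the sum and constant multiple rules with the power rule,
\begin{align*} \frac{d}{dx} \left(3x^2 + 5x\right) \amp = 3 \frac{d}{dx} x^2 + 5 \frac{d}{dx} x\\ \amp = 3(2x) + 5(1) = 6x + 5\text{.} \end{align*}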

Subsection 3.1.3 Differentiation of Polynomials

By combining all of the previous results, we can show that any polynomial is differentiable on \(\mathbb{R}\text{,}\) and determine its derivative. Recall that a polynomial is a function of the form,
\begin{equation*} f(x) = a_n x^n + a_{n-1} x^{n-1} + \dots + a_1 x + a_0 \end{equation*}
where \(a_n, \dots, a_0 \in \mathbb{R}\text{,}\) \(n \in \mathbb{N}\text{.}\) In other words, it is a sum of monomials of the form \(a_k x^k\text{.}\) Using the power rule (for positive integers) and constant multiple rule, we can differentiate each monomial.
\begin{equation*} \frac{d}{dx} (a_k x^k) = a_k \frac{d}{dx} x^k = a_k k x^{k-1} \end{equation*}
Then, using the sum rule, we can determine the derivative of a polynomial,
\begin{align*} f'(x) \amp = \frac{d}{dx} \left(a_n x^n + a_{n-1} x^{n-1} + \dots + a_1 x + a_0\right)\\ \amp = \frac{d}{dx} (a_n x^n) + \frac{d}{dx} (a_{n-1} x^{n-1}) + \dots + \frac{d}{dx} (a_1 x) + \frac{d}{dx} a_0 \amp\amp \text{by the sum rule}\\ \amp = a_n n x^{n-1} + a_{n-1} (n - 1) x^{n-2} + \dots + a_1 \amp\amp \text{by the constant multiple and power rules} \end{align*}
Notice that the derivative of a polynomial of degree \(n\) is a polynomial of degree \(n-1\text{.}\) For example, the derivative of a cubic polynomial is quadratic, and the derivative of a quadratic polynomial is linear.
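For example, for the cubic polynomial \(f(x) = 2x^3 - 5x^2 + 7x - 4\text{,}\) differentiating term-by-term gives,
\begin{align*} f'(x) \amp = 2(3x^2) - 5(2x) + 7(1) + 0\\ \amp = 6x^2 - 10x + 7\text{,} \end{align*}
which is a quadratic polynomial, as expected.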