Higher Order Derivatives






Differentiating the Derivative


The derivative $f'$ is a function in its own right, and functions can be differentiated. Applying the differentiation process to $f'$ produces the second derivative $f''$, which measures how the slope of $f$ changes. Differentiating again yields $f'''$, then $f^{(4)}$, and so on without limit—provided the relevant limits exist at each stage.

Each successive derivative captures a finer layer of a function's behavior. The first derivative controls direction. The second controls concavity. The third and beyond govern subtler aspects of curvature and, in physics, correspond to jerk, snap, and higher kinematic quantities. At the theoretical level, the entire collection of higher-order derivatives at a point encodes the local shape of a function through its Taylor series.



Definition and Notation


The second derivative of $f$ is the derivative of $f'$:

f''(x) = \frac{d}{dx}[f'(x)]


The third derivative is the derivative of $f''$, and so on. The $n$th derivative, denoted $f^{(n)}(x)$, is obtained by differentiating $f$ a total of $n$ times.

In Lagrange notation, the first few derivatives use primes: $f'$, $f''$, $f'''$. Beyond the third, parenthetical superscripts replace primes to avoid clutter: $f^{(4)}$, $f^{(5)}$, $f^{(n)}$. The parentheses distinguish the derivative order from an exponent—$f^{(4)}(x)$ is the fourth derivative, not $f$ raised to the fourth power.

In Leibniz notation, the $n$th derivative of $y$ with respect to $x$ is written

\frac{d^n y}{dx^n}


The symbol $d^n y / dx^n$ is a single operator applied to $y$, not a fraction with $d^n y$ in the numerator and $dx^n$ in the denominator. For the second derivative specifically: $\frac{d^2 y}{dx^2}$ reads "d-two-y, d-x-squared" and represents $\frac{d}{dx}\left(\frac{dy}{dx}\right)$.
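The definitions above can be checked numerically. The sketch below uses central-difference approximations (an illustration, not part of the original text); the test function $f(x) = x^3$ is an arbitrary choice with easily verified derivatives.

```python
def first_derivative(f, x, h=1e-5):
    # f'(x) ≈ (f(x+h) - f(x-h)) / (2h), the central difference quotient
    return (f(x + h) - f(x - h)) / (2 * h)

def second_derivative(f, x, h=1e-4):
    # f''(x) = d/dx[f'(x)] ≈ (f(x+h) - 2 f(x) + f(x-h)) / h^2
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

f = lambda x: x**3
print(first_derivative(f, 2.0))   # exact value: 3 * 2^2 = 12
print(second_derivative(f, 2.0))  # exact value: 6 * 2 = 12
```

The second-difference formula is exactly what the notation $\frac{d}{dx}\left(\frac{dy}{dx}\right)$ suggests: a difference quotient of difference quotients.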

The Second Derivative


The second derivative $f''(x)$ measures the rate of change of the slope. Where $f''(x) > 0$, the slope $f'(x)$ is increasing—the function bends upward (concave up). Where $f''(x) < 0$, the slope is decreasing—the function bends downward (concave down).

This information is independent of whether $f$ itself is increasing or decreasing. A function can rise while decelerating ($f' > 0$, $f'' < 0$) or fall while accelerating in the negative direction ($f' < 0$, $f'' < 0$). The first and second derivatives describe different aspects of behavior.

The second derivative also powers the second derivative test for classifying critical points. At a point where $f'(c) = 0$: if $f''(c) > 0$, the critical point is a local minimum; if $f''(c) < 0$, a local maximum; if $f''(c) = 0$, the test is inconclusive.

Inflection points—where concavity reverses—occur where $f''$ changes sign. The condition $f''(c) = 0$ is necessary but not sufficient; the sign of $f''$ must actually switch across $c$.
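A small worked example of the second derivative test, using $f(x) = x^3 - 3x$ (an illustrative function chosen here, not from the text). Its derivative $f'(x) = 3x^2 - 3$ vanishes at $x = \pm 1$, and the sign of $f''(x) = 6x$ classifies each critical point.

```python
def classify(fpp_at_c):
    # Second derivative test at a critical point c where f'(c) = 0
    if fpp_at_c > 0:
        return "local minimum"
    if fpp_at_c < 0:
        return "local maximum"
    return "inconclusive"

fpp = lambda x: 6 * x          # f''(x) for f(x) = x^3 - 3x
for c in (-1.0, 1.0):          # the two critical points
    print(c, classify(fpp(c)))
```

At $x = -1$ the test reports a local maximum ($f'' = -6 < 0$); at $x = 1$, a local minimum ($f'' = 6 > 0$). The same $f''$ also changes sign at $x = 0$, which is this function's inflection point.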

Physical Interpretation


In kinematics, the first three derivatives of position $s(t)$ have standard names.

The first derivative $s'(t)$ is velocity: the rate of change of position. It tells how fast an object moves and in which direction.

The second derivative $s''(t)$ is acceleration: the rate of change of velocity. The sign of the acceleration relative to the sign of the velocity determines whether the object is speeding up or slowing down: when the two signs agree, speed increases; when they differ, speed decreases.

The third derivative $s'''(t)$ is jerk: the rate of change of acceleration. Jerk is felt physically as a sudden push or lurch—smooth motion has low jerk, while abrupt starts and stops produce high jerk. Elevator design, roller coaster engineering, and vehicle ride comfort all involve controlling jerk.

Beyond the third derivative, the terms snap ($s^{(4)}$), crackle ($s^{(5)}$), and pop ($s^{(6)}$) are used in specialized engineering contexts but rarely appear in standard calculus.
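The kinematic chain can be written out for a concrete trajectory. The cubic $s(t) = t^3 - 6t^2 + 9t$ is a hypothetical example (not from the text); its derivatives are computed by hand with the power rule.

```python
# Position and its first three derivatives for s(t) = t^3 - 6t^2 + 9t
s    = lambda t: t**3 - 6 * t**2 + 9 * t
v    = lambda t: 3 * t**2 - 12 * t + 9   # velocity s'(t)
a    = lambda t: 6 * t - 12              # acceleration s''(t)
jerk = lambda t: 6.0                     # jerk s'''(t): constant for a cubic

print(v(1.0), a(1.0), jerk(1.0))  # 0.0 -6.0 6.0
```

At $t = 1$ the velocity is zero while the acceleration is negative, so the object is momentarily at rest and about to move in the negative direction; the jerk of a cubic is constant, and every derivative beyond the third is zero.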

Patterns in Repeated Differentiation — Polynomials


Polynomials terminate under repeated differentiation. Each differentiation reduces the degree by one, so a polynomial of degree $n$ reaches a constant after $n$ differentiations and becomes zero after $n + 1$.

For $f(x) = x^5$:

f'(x) = 5x^4, \quad f''(x) = 20x^3, \quad f'''(x) = 60x^2, \quad f^{(4)}(x) = 120x, \quad f^{(5)}(x) = 120, \quad f^{(6)}(x) = 0


The coefficient at the $n$th derivative of $x^n$ is $n! = n(n-1)(n-2) \cdots 1$, accumulated from the power rule applied $n$ times. Specifically, $\frac{d^n}{dx^n}[x^n] = n!$ and $\frac{d^k}{dx^k}[x^n] = 0$ for all $k > n$.

For a general polynomial $p(x) = a_n x^n + \cdots + a_1 x + a_0$, the $n$th derivative is $n! \cdot a_n$, a constant. This relationship becomes central in Taylor series, where the coefficient $a_n$ is recovered as $\frac{f^{(n)}(a)}{n!}$.
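The termination pattern is easy to mechanize. The sketch below represents a polynomial as a coefficient list in ascending powers (a representation chosen here for illustration) and differentiates $x^5$ repeatedly, reproducing the chain shown above.

```python
from math import factorial

def diff_poly(coeffs):
    # d/dx of sum_k a_k x^k is sum_k (k * a_k) x^(k-1):
    # multiply each coefficient by its power, then drop the constant slot
    return [k * a for k, a in enumerate(coeffs)][1:]

p = [0, 0, 0, 0, 0, 1]  # x^5 in ascending-power form
for n in range(7):
    print(n, p)          # the nth derivative's coefficients
    p = diff_poly(p)
```

After five differentiations the list collapses to `[120]`, the constant $5! = 120$; the sixth differentiation empties it, the coefficient-list form of the zero polynomial.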

Patterns in Repeated Differentiation — Exponentials


The natural exponential $e^x$ is unchanged by differentiation:

\frac{d^n}{dx^n}[e^x] = e^x \quad \text{for all } n \geq 1


Every derivative of $e^x$ is $e^x$. Up to constant multiples, no other function has this property: the solutions of $f' = f$ are exactly $f(x) = ce^x$, including the trivial case $c = 0$.

For the general exponential $e^{ax}$, the chain rule introduces a factor of $a$ at each step:

\frac{d^n}{dx^n}[e^{ax}] = a^n e^{ax}


Each differentiation multiplies by $a$. After $n$ differentiations, the accumulated constant is $a^n$. This pattern appears in solutions to differential equations, where $e^{ax}$ satisfies equations whose characteristic root is $a$.

For $a^x$ with arbitrary base: since $a^x = e^{x \ln a}$, the $n$th derivative is $(\ln a)^n \cdot a^x$. The factor $\ln a$ replaces $a$ in the exponential pattern.
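The formula $\frac{d^n}{dx^n}[e^{ax}] = a^n e^{ax}$ can be spot-checked against repeated central differences. The values $a = 0.5$ and $x = 1.0$ below are arbitrary test inputs; the step size $h$ is kept moderately large because repeated differencing amplifies rounding error.

```python
import math

def nth_diff(f, x, n, h=1e-2):
    # nth derivative by applying the central difference n times
    if n == 0:
        return f(x)
    g = lambda t: (f(t + h) - f(t - h)) / (2 * h)
    return nth_diff(g, x, n - 1, h)

a, x = 0.5, 1.0
f = lambda t: math.exp(a * t)
for n in range(4):
    # numerical nth derivative vs. the closed form a^n e^(a x)
    print(n, nth_diff(f, x, n), a**n * math.exp(a * x))
```

Each level of differencing contributes one factor of $a$, which is exactly how the constant $a^n$ accumulates in the closed form.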

Patterns in Repeated Differentiation — Sine and Cosine


The derivatives of $\sin x$ cycle with period four:

\sin x \to \cos x \to -\sin x \to -\cos x \to \sin x \to \cdots


The $n$th derivative of $\sin x$ depends on $n \bmod 4$:

\frac{d^n}{dx^n}[\sin x] = \begin{cases} \sin x & n \equiv 0 \pmod{4} \\ \cos x & n \equiv 1 \pmod{4} \\ -\sin x & n \equiv 2 \pmod{4} \\ -\cos x & n \equiv 3 \pmod{4} \end{cases}


A compact formula captures all four cases: $\frac{d^n}{dx^n}[\sin x] = \sin\left(x + \frac{n\pi}{2}\right)$. The same holds for cosine: $\frac{d^n}{dx^n}[\cos x] = \cos\left(x + \frac{n\pi}{2}\right)$.

For $\sin(ax)$, the chain rule introduces a factor of $a$ per differentiation: $\frac{d^n}{dx^n}[\sin(ax)] = a^n \sin\left(ax + \frac{n\pi}{2}\right)$. The cycle in the trigonometric part persists; only the amplitude grows as $a^n$.

This four-fold periodicity distinguishes trigonometric derivatives from polynomial derivatives (which terminate) and exponential derivatives (which replicate).
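The phase-shift formula and the explicit four-cycle can be checked against each other. The sketch below evaluates both at an arbitrary point ($x = 0.7$) for the first eight derivative orders.

```python
import math

def nth_deriv_sin(x, n):
    # Compact form: d^n/dx^n sin x = sin(x + n*pi/2)
    return math.sin(x + n * math.pi / 2)

# The four-cycle, indexed by n mod 4: sin, cos, -sin, -cos
cycle = [math.sin, math.cos,
         lambda x: -math.sin(x), lambda x: -math.cos(x)]

x = 0.7
for n in range(8):
    print(n, nth_deriv_sin(x, n), cycle[n % 4](x))
```

The two columns agree for every $n$: shifting the argument by $\pi/2$ turns sine into cosine, and two shifts flip the sign, which is the four-cycle in disguise.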

The nth Derivative of Specific Forms


Several standard functions have known closed-form $n$th derivatives.

For $f(x) = \frac{1}{x}$: rewriting as $x^{-1}$ and applying the power rule repeatedly gives

f^{(n)}(x) = \frac{(-1)^n \cdot n!}{x^{n+1}}


Each differentiation multiplies by one more negative integer, producing the factorial and the alternating sign.

For $f(x) = \ln x$: since $f'(x) = x^{-1}$, the higher derivatives follow the pattern above shifted by one:

f^{(n)}(x) = \frac{(-1)^{n-1} \cdot (n-1)!}{x^n} \qquad n \geq 1


For $f(x) = x^m$ where $m$ is a positive integer and $n \leq m$:

f^{(n)}(x) = \frac{m!}{(m-n)!} \cdot x^{m-n}


The coefficient $\frac{m!}{(m-n)!}$ is the falling factorial, counting the multipliers accumulated over $n$ applications of the power rule.

These closed-form expressions are useful for computing specific high-order derivatives without performing each differentiation step individually.
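The closed forms for $1/x$ and $\ln x$ translate directly into code. The sketch below implements both formulas and checks the shifted-by-one relationship between them at an arbitrary point $x = 2$.

```python
from math import factorial

def nth_deriv_recip(x, n):
    # d^n/dx^n [1/x] = (-1)^n n! / x^(n+1)
    return (-1)**n * factorial(n) / x**(n + 1)

def nth_deriv_ln(x, n):
    # d^n/dx^n [ln x] = (-1)^(n-1) (n-1)! / x^n, valid for n >= 1
    return (-1)**(n - 1) * factorial(n - 1) / x**n

x = 2.0
print(nth_deriv_recip(x, 3))  # (-1)^3 * 3! / 2^4 = -0.375
print(nth_deriv_ln(x, 1))     # 0! / 2 = 0.5, matching 1/x at x = 2
```

Since $(\ln x)' = 1/x$, the $(n+1)$st derivative of $\ln x$ equals the $n$th derivative of $1/x$, which is exactly the "shifted by one" pattern stated above.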

Higher-Order Derivatives and Taylor Series


The Taylor series of $f$ centered at $x = a$ is

f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x - a)^n


Each coefficient depends on a higher-order derivative evaluated at the center point $a$. The zeroth derivative $f^{(0)}(a) = f(a)$ gives the constant term. The first derivative gives the linear term. The second derivative gives the quadratic correction. Each successive term captures finer detail about how $f$ deviates from the lower-order approximation.

The Taylor polynomial of degree $k$ truncates the series after $k + 1$ terms, providing a polynomial approximation to $f$ near $a$. The quality of the approximation improves with $k$—more derivatives mean a closer fit over a wider interval.

The connection between higher-order derivatives and Taylor series gives these derivatives their deepest significance: the complete collection $\{f^{(n)}(a)\}_{n=0}^{\infty}$ determines $f$ locally (for analytic functions). Knowing all derivatives at a single point reconstructs the entire function in a neighborhood of that point.
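The coefficient formula $f^{(n)}(a)/n!$ can be exercised on $e^x$ at $a = 0$, where every derivative equals $1$ (an example chosen here because the coefficients are simplest). The sketch builds the degree-$k$ Taylor polynomial and compares it with the true value.

```python
from math import exp, factorial

def taylor_exp(x, degree):
    # Taylor polynomial of e^x at a = 0: f^(n)(0) = 1 for all n,
    # so each coefficient is 1/n!
    return sum(x**n / factorial(n) for n in range(degree + 1))

x = 1.0
for k in (2, 5, 10):
    print(k, taylor_exp(x, k), exp(x))  # approximation vs. true e^1
```

As the degree grows the truncated sum closes in on $e$, a concrete instance of higher-order derivatives reconstructing the function near the center point.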

Existence and Smoothness Classes


A function may be differentiable once but not twice. The function $f(x) = x|x|$ has $f'(x) = 2|x|$, which is continuous but not differentiable at $x = 0$. So $f$ is in class $C^1$ (continuously differentiable) but not $C^2$.

The smoothness classes organize functions by how many continuous derivatives they possess. A function belongs to $C^n$ if $f, f', f'', \ldots, f^{(n)}$ all exist and are continuous. Class $C^0$ is simply the continuous functions. Class $C^\infty$ consists of infinitely differentiable functions—called smooth functions—where derivatives of all orders exist and are continuous.

Polynomials, $e^x$, $\sin x$, $\cos x$, and their compositions are all $C^\infty$. Piecewise functions typically belong to a finite $C^n$ class determined by how smoothly the pieces join at their boundaries: matching values gives $C^0$, matching first derivatives gives $C^1$, and so on.

There exist functions that are $C^\infty$ but not analytic—their Taylor series converges but not to the function itself. The standard example is $f(x) = e^{-1/x^2}$ for $x \neq 0$ and $f(0) = 0$: every derivative at $x = 0$ is zero, so the Taylor series is identically zero, yet the function is not zero away from the origin. Smooth does not automatically mean representable by a power series.
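A quick numerical illustration of why every Taylor coefficient of this function vanishes at the origin: $e^{-1/x^2}$ is "flatter than any polynomial" near $0$, meaning $f(x)/x^k \to 0$ as $x \to 0$ for every fixed power $k$ (shown below for $k = 10$, an arbitrary choice).

```python
import math

def f(x):
    # The standard smooth-but-not-analytic example, with f(0) = 0
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

for x in (0.5, 0.2, 0.1):
    # f(x)/x^10 shrinks rapidly as x -> 0, despite dividing by x^10
    print(x, f(x), f(x) / x**10)
```

Because the function vanishes faster than any power of $x$, every difference quotient defining a derivative at $0$ tends to zero, which is why the Taylor series at the origin is identically zero.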