

Inner Product






Measuring Length, Angle, and Distance

The dot product assigns a scalar to every pair of vectors, encoding their lengths, the angle between them, and whether they are perpendicular. It is one instance of a broader concept — the inner product — that carries the same geometric structure into polynomial spaces, function spaces, and matrix spaces. Every notion of orthogonality on this site traces back to an inner product.



The Dot Product

The dot product of two vectors $\mathbf{u} = (u_1, \dots, u_n)$ and $\mathbf{v} = (v_1, \dots, v_n)$ in $\mathbb{R}^n$ is

$$\mathbf{u} \cdot \mathbf{v} = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n = \mathbf{u}^T\mathbf{v}$$


The result is a scalar, not a vector. The matrix form $\mathbf{u}^T\mathbf{v}$ treats $\mathbf{u}^T$ as a $1 \times n$ row and $\mathbf{v}$ as an $n \times 1$ column, making the dot product a $1 \times 1$ matrix multiplication.

For $\mathbf{u} = (2, -1, 3, 0)$ and $\mathbf{v} = (1, 4, -2, 5)$: $\mathbf{u} \cdot \mathbf{v} = 2(1) + (-1)(4) + 3(-2) + 0(5) = 2 - 4 - 6 + 0 = -8$.
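
The arithmetic above can be checked numerically; a quick sketch with NumPy (assuming it is available):

```python
import numpy as np

u = np.array([2, -1, 3, 0])
v = np.array([1, 4, -2, 5])

# Componentwise products, summed: 2(1) + (-1)(4) + 3(-2) + 0(5)
dot = u @ v          # same as np.dot(u, v)
print(dot)           # -8
```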

The dot product is the standard inner product on $\mathbb{R}^n$. It is the measuring tool that defines lengths, angles, distances, and perpendicularity throughout finite-dimensional linear algebra.

Properties of the Dot Product

The dot product satisfies three fundamental properties.

Symmetry: $\mathbf{u} \cdot \mathbf{v} = \mathbf{v} \cdot \mathbf{u}$. The order does not matter.

Linearity: $(c\mathbf{u} + d\mathbf{w}) \cdot \mathbf{v} = c(\mathbf{u} \cdot \mathbf{v}) + d(\mathbf{w} \cdot \mathbf{v})$. The dot product distributes over addition and pulls scalars out. Combined with symmetry, it is linear in both arguments (bilinear).

Positive definiteness: $\mathbf{v} \cdot \mathbf{v} \geq 0$ for all $\mathbf{v}$, with equality if and only if $\mathbf{v} = \mathbf{0}$. The quantity $\mathbf{v} \cdot \mathbf{v} = v_1^2 + v_2^2 + \cdots + v_n^2$ is a sum of squares, which is zero only when every component is zero.

These three properties — symmetry, linearity, positive definiteness — are not just useful observations. They are the axioms that define an inner product in the abstract setting.
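
All three properties are easy to spot-check numerically; a small sketch with NumPy on random vectors of my choosing (np.isclose handles floating-point roundoff):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 4))   # three random vectors in R^4
c, d = 2.0, -3.0

symmetric = np.isclose(u @ v, v @ u)
bilinear = np.isclose((c*u + d*w) @ v, c*(u @ v) + d*(w @ v))
nonnegative = v @ v >= 0
print(symmetric, bilinear, nonnegative)   # True True True
```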

Length

The length (or norm) of a vector $\mathbf{v}$ is

$$\|\mathbf{v}\| = \sqrt{\mathbf{v} \cdot \mathbf{v}} = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$$


This is the Euclidean norm — the generalization of the Pythagorean theorem to $n$ dimensions. In $\mathbb{R}^2$, $\|(3, 4)\| = \sqrt{9 + 16} = 5$. In $\mathbb{R}^3$, $\|(1, 2, 2)\| = \sqrt{1 + 4 + 4} = 3$.

The norm satisfies $\|\mathbf{v}\| \geq 0$ with equality only for $\mathbf{v} = \mathbf{0}$, and $\|c\mathbf{v}\| = |c|\|\mathbf{v}\|$ — scaling a vector scales its length by the absolute value of the scalar.

A unit vector has $\|\mathbf{v}\| = 1$. Any nonzero vector can be normalized — replaced by $\hat{\mathbf{v}} = \mathbf{v}/\|\mathbf{v}\|$, which points in the same direction with length $1$. Normalization preserves direction and discards magnitude.
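
The norm and normalization of the $(1, 2, 2)$ example above, sketched in NumPy:

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
norm = np.linalg.norm(v)      # sqrt(1 + 4 + 4) = 3
v_hat = v / norm              # unit vector, same direction

print(norm)                       # 3.0
print(np.linalg.norm(v_hat))      # 1.0
```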

Distance

The distance between two vectors $\mathbf{u}$ and $\mathbf{v}$ is the length of their difference:

$$d(\mathbf{u}, \mathbf{v}) = \|\mathbf{u} - \mathbf{v}\| = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + \cdots + (u_n - v_n)^2}$$


This is the Euclidean distance — the straight-line separation between two points in $\mathbb{R}^n$.

Distance satisfies the properties of a metric: $d(\mathbf{u}, \mathbf{v}) \geq 0$ with equality only when $\mathbf{u} = \mathbf{v}$; $d(\mathbf{u}, \mathbf{v}) = d(\mathbf{v}, \mathbf{u})$ (symmetry); and $d(\mathbf{u}, \mathbf{w}) \leq d(\mathbf{u}, \mathbf{v}) + d(\mathbf{v}, \mathbf{w})$ (triangle inequality). Every inner product induces a distance through this formula, and the resulting distance always satisfies these metric properties.
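
A small numerical check of the distance formula and the metric properties, on illustrative points of my choosing:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([4.0, 6.0])
w = np.array([0.0, -1.0])

d_uv = np.linalg.norm(u - v)    # sqrt(3^2 + 4^2) = 5
sym = np.isclose(d_uv, np.linalg.norm(v - u))
tri = np.linalg.norm(u - w) <= np.linalg.norm(u - v) + np.linalg.norm(v - w) + 1e-12
print(d_uv, sym, tri)           # 5.0 True True
```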

Angle Between Vectors

For nonzero vectors $\mathbf{u}$ and $\mathbf{v}$, the angle $\theta$ between them satisfies

$$\cos\theta = \frac{\mathbf{u} \cdot \mathbf{v}}{\|\mathbf{u}\|\|\mathbf{v}\|}$$


The Cauchy-Schwarz inequality (next section) guarantees that the right-hand side lies between $-1$ and $1$, so the formula always produces a well-defined angle $\theta \in [0, \pi]$.

When $\cos\theta = 1$ ($\theta = 0$), the vectors point in the same direction. When $\cos\theta = -1$ ($\theta = \pi$), they point in opposite directions. When $\cos\theta = 0$ ($\theta = \pi/2$), the vectors are orthogonal.

The orthogonality condition $\mathbf{u} \cdot \mathbf{v} = 0$ is the case $\theta = 90°$. The dot product encodes the full metric geometry of $\mathbb{R}^n$: length from $\mathbf{v} \cdot \mathbf{v}$, angle from $\mathbf{u} \cdot \mathbf{v}$, and distance from $(\mathbf{u} - \mathbf{v}) \cdot (\mathbf{u} - \mathbf{v})$.

For $\mathbf{u} = (1, 2, 3)$ and $\mathbf{v} = (4, -1, 2)$: $\mathbf{u} \cdot \mathbf{v} = 4 - 2 + 6 = 8$, $\|\mathbf{u}\| = \sqrt{14}$, $\|\mathbf{v}\| = \sqrt{21}$. So $\cos\theta = 8/\sqrt{294} \approx 0.467$, giving $\theta \approx 62.2°$.
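
The same worked example in NumPy (np.clip guards against roundoff pushing the cosine slightly outside $[-1, 1]$):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta_deg = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
print(round(cos_theta, 3), round(theta_deg, 1))   # 0.467 62.2
```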

The Cauchy-Schwarz Inequality

For all vectors $\mathbf{u}$ and $\mathbf{v}$ in $\mathbb{R}^n$:

$$|\mathbf{u} \cdot \mathbf{v}| \leq \|\mathbf{u}\| \, \|\mathbf{v}\|$$


Equality holds if and only if one vector is a scalar multiple of the other — they are parallel.

The proof considers the expression $\|\mathbf{u} - t\mathbf{v}\|^2 \geq 0$ for all $t \in \mathbb{R}$. Expanding: $\|\mathbf{u}\|^2 - 2t(\mathbf{u} \cdot \mathbf{v}) + t^2\|\mathbf{v}\|^2 \geq 0$. This is a quadratic in $t$ that is non-negative everywhere, so its discriminant must be non-positive: $4(\mathbf{u} \cdot \mathbf{v})^2 - 4\|\mathbf{u}\|^2\|\mathbf{v}\|^2 \leq 0$. Rearranging gives Cauchy-Schwarz.

The inequality is what makes the angle formula legitimate. It guarantees $-1 \leq \frac{\mathbf{u} \cdot \mathbf{v}}{\|\mathbf{u}\|\|\mathbf{v}\|} \leq 1$, so $\cos\theta$ takes a valid value. Without Cauchy-Schwarz, the angle formula could produce numbers outside $[-1, 1]$, and the geometric interpretation would collapse.
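
A brute-force numerical check of both the inequality and its equality case, on random vectors of my choosing:

```python
import numpy as np

rng = np.random.default_rng(0)

# The bound |u.v| <= ||u|| ||v|| holds for every random pair tried
all_hold = all(
    abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v) + 1e-12
    for u, v in (rng.standard_normal((2, 5)) for _ in range(1000))
)

# Equality when one vector is a scalar multiple of the other
u = np.array([1.0, 2.0, 3.0])
v = -2.0 * u
gap = np.linalg.norm(u) * np.linalg.norm(v) - abs(u @ v)
print(all_hold, gap)   # True and ~0
```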

The Triangle Inequality

For all vectors $\mathbf{u}$ and $\mathbf{v}$ in $\mathbb{R}^n$:

$$\|\mathbf{u} + \mathbf{v}\| \leq \|\mathbf{u}\| + \|\mathbf{v}\|$$


The length of one side of a triangle never exceeds the sum of the other two. Equality holds if and only if $\mathbf{u}$ and $\mathbf{v}$ point in the same direction (one is a non-negative scalar multiple of the other).

The proof follows from Cauchy-Schwarz. Square both sides: $\|\mathbf{u} + \mathbf{v}\|^2 = \|\mathbf{u}\|^2 + 2\mathbf{u} \cdot \mathbf{v} + \|\mathbf{v}\|^2 \leq \|\mathbf{u}\|^2 + 2\|\mathbf{u}\|\|\mathbf{v}\| + \|\mathbf{v}\|^2 = (\|\mathbf{u}\| + \|\mathbf{v}\|)^2$. The key step uses $\mathbf{u} \cdot \mathbf{v} \leq |\mathbf{u} \cdot \mathbf{v}| \leq \|\mathbf{u}\|\|\mathbf{v}\|$.

The triangle inequality is essential for the distance function $d(\mathbf{u}, \mathbf{v}) = \|\mathbf{u} - \mathbf{v}\|$ to satisfy the metric axioms. It ensures that going from $\mathbf{u}$ to $\mathbf{w}$ directly is never longer than going via $\mathbf{v}$.
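
A quick numerical illustration of the inequality and its equality case, with vectors of my choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(4)
v = rng.standard_normal(4)

holds = np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v) + 1e-12

# Equality when one vector is a non-negative multiple of the other
w = 3.0 * u
eq_gap = (np.linalg.norm(u) + np.linalg.norm(w)) - np.linalg.norm(u + w)
print(holds, eq_gap)   # True and ~0
```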

General Inner Products

An inner product on a vector space $V$ is a function $\langle \cdot, \cdot \rangle: V \times V \to \mathbb{R}$ satisfying three axioms:

Symmetry: $\langle \mathbf{u}, \mathbf{v} \rangle = \langle \mathbf{v}, \mathbf{u} \rangle$.

Linearity in the first argument: $\langle c\mathbf{u} + d\mathbf{w}, \mathbf{v} \rangle = c\langle \mathbf{u}, \mathbf{v} \rangle + d\langle \mathbf{w}, \mathbf{v} \rangle$.

Positive definiteness: $\langle \mathbf{v}, \mathbf{v} \rangle > 0$ for all $\mathbf{v} \neq \mathbf{0}$.

A vector space equipped with an inner product is called an inner product space. Every inner product induces a norm ($\|\mathbf{v}\| = \sqrt{\langle \mathbf{v}, \mathbf{v} \rangle}$), a distance ($d(\mathbf{u}, \mathbf{v}) = \|\mathbf{u} - \mathbf{v}\|$), and a notion of orthogonality ($\mathbf{u} \perp \mathbf{v}$ iff $\langle \mathbf{u}, \mathbf{v} \rangle = 0$). The Cauchy-Schwarz inequality, the triangle inequality, and the Pythagorean theorem all hold in any inner product space.

The standard dot product on $\mathbb{R}^n$ is one inner product. But the definition admits many others, each defining a different geometry on the same set of vectors.

Examples of Inner Products

The weighted inner product on $\mathbb{R}^n$ is $\langle \mathbf{u}, \mathbf{v} \rangle = \mathbf{u}^T W \mathbf{v}$, where $W$ is a symmetric positive definite matrix. This distorts the standard geometry — unit circles become ellipses, and "perpendicular" directions depend on $W$. When $W = I$, it reduces to the standard dot product.
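
A sketch of one such weighted inner product, with a small symmetric positive definite $W$ of my choosing (its eigenvalues are $(5 \pm \sqrt{5})/2 > 0$):

```python
import numpy as np

W = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # symmetric positive definite

def inner(u, v):
    return u @ W @ v             # <u, v> = u^T W v

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# The standard axes are no longer perpendicular in this geometry:
print(inner(e1, e2))             # 1.0, not 0
# With W = I it reduces to the ordinary dot product:
print(e1 @ np.eye(2) @ e2)       # 0.0
```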

On the polynomial space $\mathcal{P}_n$, the inner product $\langle p, q \rangle = \int_{-1}^{1} p(x)q(x)\,dx$ defines orthogonality via integration. The polynomials $1$, $x$, and $\frac{1}{2}(3x^2 - 1)$ are orthogonal under this product — these are the first three Legendre polynomials.
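
The pairwise orthogonality of these three polynomials can be verified with a simple midpoint-rule approximation of the integral (a hand-rolled sketch, not a library quadrature routine):

```python
import numpy as np

# Midpoint-rule approximation of <p, q> = integral of p(x) q(x) over [-1, 1]
n = 20000
edges = np.linspace(-1.0, 1.0, n + 1)
xm = 0.5 * (edges[:-1] + edges[1:])   # cell midpoints
dx = 2.0 / n

def inner(p, q):
    return np.sum(p(xm) * q(xm)) * dx

P0 = lambda x: np.ones_like(x)
P1 = lambda x: x
P2 = lambda x: 0.5 * (3 * x**2 - 1)

print(inner(P0, P1), inner(P0, P2), inner(P1, P2))   # all ~0
```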

On the function space $C[0, 2\pi]$, the inner product $\langle f, g \rangle = \int_0^{2\pi} f(x)g(x)\,dx$ makes sines and cosines orthogonal: $\int_0^{2\pi} \sin(mx)\cos(nx)\,dx = 0$ for all integers $m, n$. This is the foundation of Fourier analysis.
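
The same midpoint-rule approximation checks the trigonometric case; for instance $\sin(2x) \perp \cos(3x)$, while $\langle \sin(2x), \sin(2x) \rangle = \pi$:

```python
import numpy as np

# Midpoint-rule approximation of <f, g> = integral of f(x) g(x) over [0, 2*pi]
n = 20000
edges = np.linspace(0.0, 2.0 * np.pi, n + 1)
xm = 0.5 * (edges[:-1] + edges[1:])
dx = 2.0 * np.pi / n

def inner(f, g):
    return np.sum(f(xm) * g(xm)) * dx

cross = inner(lambda x: np.sin(2 * x), lambda x: np.cos(3 * x))
self_ip = inner(lambda x: np.sin(2 * x), lambda x: np.sin(2 * x))
print(cross, self_ip)   # ~0 and ~pi
```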

The Frobenius inner product on matrices is $\langle A, B \rangle = \operatorname{tr}(A^TB) = \sum_{ij} a_{ij}b_{ij}$, which treats $m \times n$ matrices as vectors of $mn$ entries.
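
A quick check that the trace form and the entrywise sum agree, on small matrices of my choosing:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

ip_trace = np.trace(A.T @ B)   # tr(A^T B)
ip_sum = np.sum(A * B)         # sum of entrywise products: 5+12+21+32
print(ip_trace, ip_sum)        # 70.0 70.0
```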

Each inner product defines its own geometry, but the linear algebra — projections, Gram-Schmidt, least squares — works identically in all of them.

The Pythagorean Theorem

If $\mathbf{u}$ and $\mathbf{v}$ are orthogonal ($\mathbf{u} \cdot \mathbf{v} = 0$), then

$$\|\mathbf{u} + \mathbf{v}\|^2 = \|\mathbf{u}\|^2 + \|\mathbf{v}\|^2$$


The proof is a one-line expansion: $\|\mathbf{u} + \mathbf{v}\|^2 = \mathbf{u} \cdot \mathbf{u} + 2\mathbf{u} \cdot \mathbf{v} + \mathbf{v} \cdot \mathbf{v} = \|\mathbf{u}\|^2 + 0 + \|\mathbf{v}\|^2$.

The theorem extends to any number of mutually orthogonal vectors: if $\mathbf{v}_1, \dots, \mathbf{v}_k$ are pairwise orthogonal, then

$$\|\mathbf{v}_1 + \mathbf{v}_2 + \cdots + \mathbf{v}_k\|^2 = \|\mathbf{v}_1\|^2 + \|\mathbf{v}_2\|^2 + \cdots + \|\mathbf{v}_k\|^2$$


All cross terms vanish because every pair has dot product zero. This is not a special property of $\mathbb{R}^2$ or $\mathbb{R}^3$ — it holds in any inner product space. The Pythagorean theorem is a direct consequence of the inner product axioms, and it is the reason that orthogonal decompositions are so computationally clean: lengths decompose into independent, additive contributions from each perpendicular direction.
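
A numerical illustration with three pairwise orthogonal vectors in $\mathbb{R}^3$ (my choice of vectors):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])

# Pairwise orthogonal, so every cross term in the expansion vanishes
assert v1 @ v2 == 0 and v1 @ v3 == 0 and v2 @ v3 == 0

lhs = np.linalg.norm(v1 + v2 + v3) ** 2
rhs = sum(np.linalg.norm(v) ** 2 for v in (v1, v2, v3))
print(lhs, rhs)   # both 8.0 (up to roundoff)
```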