The dot product assigns a scalar to every pair of vectors, encoding their lengths, the angle between them, and whether they are perpendicular. It is one instance of a broader concept — the inner product — that carries the same geometric structure into polynomial spaces, function spaces, and matrix spaces. Every notion of orthogonality on this site traces back to an inner product.
The Dot Product
The dot product of two vectors u = (u₁,…,uₙ) and v = (v₁,…,vₙ) in Rn is
u⋅v = u₁v₁ + u₂v₂ + ⋯ + uₙvₙ = uᵀv
The result is a scalar, not a vector. The matrix form uᵀv treats uᵀ as a 1×n row and v as an n×1 column, making the dot product a 1×1 matrix multiplication.
For u=(2,−1,3,0) and v=(1,4,−2,5): u⋅v=2(1)+(−1)(4)+3(−2)+0(5)=2−4−6+0=−8.
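This arithmetic is easy to check in code; a minimal sketch in plain Python (the variable names are illustrative):

```python
# Dot product as a sum of componentwise products: u·v = u₁v₁ + ⋯ + uₙvₙ
u = [2, -1, 3, 0]
v = [1, 4, -2, 5]

dot = sum(ui * vi for ui, vi in zip(u, v))
print(dot)  # → -8
```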
The dot product is the standard inner product on Rn. It is the measuring tool that defines lengths, angles, distances, and perpendicularity throughout finite-dimensional linear algebra.
Properties of the Dot Product
The dot product satisfies three fundamental properties.
Symmetry: u⋅v=v⋅u. The order does not matter.
Linearity: (cu+dw)⋅v=c(u⋅v)+d(w⋅v). The dot product distributes over addition and pulls scalars out. Combined with symmetry, it is linear in both arguments (bilinear).
Positive definiteness: v⋅v≥0 for all v, with equality if and only if v=0. The quantity v⋅v = v₁² + v₂² + ⋯ + vₙ² is a sum of squares, which is zero only when every component is zero.
These three properties — symmetry, linearity, positive definiteness — are not just useful observations. They are the axioms that define an inner product in the abstract setting.
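All three axioms can be spot-checked numerically; a sketch with a small `dot` helper of our own (not a library function) and arbitrary sample vectors:

```python
import random

random.seed(0)
n = 4
u = [random.uniform(-1, 1) for _ in range(n)]
v = [random.uniform(-1, 1) for _ in range(n)]
w = [random.uniform(-1, 1) for _ in range(n)]
c, d = 2.5, -1.5

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Symmetry: u·v = v·u
assert abs(dot(u, v) - dot(v, u)) < 1e-12

# Linearity in the first argument: (cu + dw)·v = c(u·v) + d(w·v)
cu_dw = [c * x + d * y for x, y in zip(u, w)]
assert abs(dot(cu_dw, v) - (c * dot(u, v) + d * dot(w, v))) < 1e-12

# Positive definiteness: v·v > 0 for nonzero v, and 0·0 = 0
assert dot(v, v) > 0 and dot([0] * n, [0] * n) == 0
```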
Length
The length (or norm) of a vector v is
∥v∥ = √(v⋅v) = √(v₁² + v₂² + ⋯ + vₙ²)
This is the Euclidean norm — the generalization of the Pythagorean theorem to n dimensions. In R2, ∥(3,4)∥ = √(9+16) = 5. In R3, ∥(1,2,2)∥ = √(1+4+4) = 3.
The norm satisfies ∥v∥≥0 with equality only for v=0, and ∥cv∥=∣c∣∥v∥ — scaling a vector scales its length by the absolute value of the scalar.
A unit vector has ∥v∥=1. Any nonzero vector can be normalized — replaced by v̂ = v/∥v∥, which points in the same direction with length 1. Normalization preserves direction and discards magnitude.
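Both formulas translate directly into code; a short sketch with a `norm` helper of our own:

```python
import math

def norm(v):
    # ‖v‖ = √(v·v)
    return math.sqrt(sum(x * x for x in v))

print(norm([3, 4]))     # → 5.0
print(norm([1, 2, 2]))  # → 3.0

# Normalization: same direction, length 1 (up to floating-point rounding)
v = [1, 2, 2]
v_hat = [x / norm(v) for x in v]
assert abs(norm(v_hat) - 1.0) < 1e-12
```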
Distance
The distance between two vectors u and v is the length of their difference:
d(u,v) = ∥u−v∥ = √((u₁−v₁)² + (u₂−v₂)² + ⋯ + (uₙ−vₙ)²)
This is the Euclidean distance — the straight-line separation between two points in Rn.
Distance satisfies the properties of a metric: d(u,v)≥0 with equality only when u=v; d(u,v)=d(v,u) (symmetry); and d(u,w)≤d(u,v)+d(v,w) (triangle inequality). Every inner product induces a distance through this formula, and the resulting distance always satisfies these metric properties.
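A sketch of the distance formula, including a spot-check of the triangle inequality (the `dist` helper and the sample points are ours):

```python
import math

def dist(u, v):
    # d(u, v) = ‖u − v‖
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(u, v)))

# Straight-line distance between (1, 2) and (4, 6): √(9 + 16) = 5
print(dist([1, 2], [4, 6]))  # → 5.0

# Triangle inequality: going u → w directly never beats going via v
u, v, w = [0, 0], [1, 3], [4, 1]
assert dist(u, w) <= dist(u, v) + dist(v, w)
```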
Angle Between Vectors
For nonzero vectors u and v, the angle θ between them satisfies
cos θ = (u⋅v) / (∥u∥ ∥v∥)
The Cauchy-Schwarz inequality (next section) guarantees that the right-hand side lies between −1 and 1, so the formula always produces a well-defined angle θ∈[0,π].
When cosθ=1 (θ=0), the vectors point in the same direction. When cosθ=−1 (θ=π), they point in opposite directions. When cosθ=0 (θ=π/2), the vectors are orthogonal.
The orthogonality condition u⋅v=0 is the case θ=90°. The dot product encodes the full metric geometry of Rn: length from √(v⋅v), angle from u⋅v, and distance from √((u−v)⋅(u−v)).
For u=(1,2,3) and v=(4,−1,2): u⋅v = 4−2+6 = 8, ∥u∥ = √14, ∥v∥ = √21. So cos θ = 8/√294 ≈ 0.467, giving θ ≈ 62.2°.
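The same computation in Python, using an `angle_deg` helper of our own built from `math.acos`:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def angle_deg(u, v):
    # cos θ = (u·v) / (‖u‖‖v‖), with θ in [0, π]
    cos_theta = dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))
    return math.degrees(math.acos(cos_theta))

theta = angle_deg([1, 2, 3], [4, -1, 2])
print(round(theta, 1))  # → 62.2
```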
The Cauchy-Schwarz Inequality
For all vectors u and v in Rn:
∣u⋅v∣≤∥u∥∥v∥
Equality holds if and only if one vector is a scalar multiple of the other — they are parallel.
The proof considers the expression ∥u − tv∥² ≥ 0 for all t∈R. Expanding: ∥u∥² − 2t(u⋅v) + t²∥v∥² ≥ 0. This is a quadratic in t that is non-negative everywhere, so its discriminant must be non-positive: 4(u⋅v)² − 4∥u∥²∥v∥² ≤ 0. Rearranging gives Cauchy-Schwarz.
The inequality is what makes the angle formula legitimate. It guarantees −1 ≤ (u⋅v)/(∥u∥ ∥v∥) ≤ 1, so cosθ takes a valid value. Without Cauchy-Schwarz, the angle formula could produce numbers outside [−1,1], and the geometric interpretation would collapse.
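The inequality can be spot-checked over random vectors, along with the equality case for parallel ones (the helpers are ours):

```python
import math
import random

random.seed(1)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

# |u·v| ≤ ‖u‖‖v‖ for many random vectors
for _ in range(1000):
    u = [random.gauss(0, 1) for _ in range(5)]
    v = [random.gauss(0, 1) for _ in range(5)]
    assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-12

# Equality exactly when one vector is a scalar multiple of the other
u = [1.0, -2.0, 3.0]
v = [-0.5 * x for x in u]  # parallel: v = −0.5u
assert math.isclose(abs(dot(u, v)), norm(u) * norm(v))
```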
The Triangle Inequality
For all vectors u and v in Rn:
∥u+v∥≤∥u∥+∥v∥
The length of one side of a triangle never exceeds the sum of the other two. Equality holds if and only if u and v point in the same direction (one is a non-negative scalar multiple of the other).
The proof follows from Cauchy-Schwarz. Square both sides: ∥u+v∥² = ∥u∥² + 2u⋅v + ∥v∥² ≤ ∥u∥² + 2∥u∥∥v∥ + ∥v∥² = (∥u∥+∥v∥)². The key step uses u⋅v ≤ ∣u⋅v∣ ≤ ∥u∥∥v∥.
The triangle inequality is essential for the distance function d(u,v)=∥u−v∥ to satisfy the metric axioms. It ensures that going from u to w directly is never longer than going via v.
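A quick numerical check of both the inequality and its equality case (the sample vectors are arbitrary):

```python
import math

def norm(v):
    return math.sqrt(sum(x * x for x in v))

# ‖u + v‖ ≤ ‖u‖ + ‖v‖
u, v = [3.0, 4.0], [-1.0, 2.0]
s = [x + y for x, y in zip(u, v)]
assert norm(s) <= norm(u) + norm(v)

# Equality when one vector is a non-negative multiple of the other
w = [2 * x for x in u]           # same direction as u
s2 = [x + y for x, y in zip(u, w)]
print(norm(s2), norm(u) + norm(w))  # → 15.0 15.0
```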
General Inner Products
An inner product on a vector space V is a function ⟨⋅,⋅⟩ : V×V → R satisfying three axioms:
Symmetry: ⟨u,v⟩=⟨v,u⟩.
Linearity in the first argument: ⟨cu+dw,v⟩=c⟨u,v⟩+d⟨w,v⟩.
Positive definiteness: ⟨v,v⟩ > 0 for all v ≠ 0.
A vector space equipped with an inner product is called an inner product space. Every inner product induces a norm (∥v∥ = √⟨v,v⟩), a distance (d(u,v) = ∥u−v∥), and a notion of orthogonality (u⊥v iff ⟨u,v⟩=0). The Cauchy-Schwarz inequality, the triangle inequality, and the Pythagorean theorem all hold in any inner product space.
The standard dot product on Rn is one inner product. But the definition admits many others, each defining a different geometry on the same set of vectors.
Examples of Inner Products
The weighted inner product on Rn is ⟨u,v⟩ = uᵀWv, where W is a symmetric positive definite matrix. This distorts the standard geometry — unit circles become ellipses, and "perpendicular" directions depend on W. When W=I, it reduces to the standard dot product.
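A minimal sketch with one illustrative choice of W (any symmetric positive definite matrix would do):

```python
# Weighted inner product ⟨u, v⟩ = uᵀWv; this W is an arbitrary example
W = [[2.0, 0.5],
     [0.5, 1.0]]

def inner_w(u, v):
    return sum(u[i] * W[i][j] * v[j] for i in range(2) for j in range(2))

# The standard basis vectors are orthogonal under the dot product,
# but not under W:
u, v = [1.0, 0.0], [0.0, 1.0]
print(inner_w(u, v))  # → 0.5

# Positive definiteness still holds, so lengths remain well defined
assert inner_w(u, u) > 0 and inner_w(v, v) > 0
```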
On the polynomial space Pn, the inner product ⟨p,q⟩ = ∫_{−1}^{1} p(x)q(x) dx defines orthogonality via integration. The polynomials 1, x, and ½(3x² − 1) are orthogonal under this product — these are the first three Legendre polynomials.
On the function space C[0,2π], the inner product ⟨f,g⟩ = ∫_0^{2π} f(x)g(x) dx makes sines and cosines orthogonal: ∫_0^{2π} sin(mx)cos(nx) dx = 0 for all integers m, n. This is the foundation of Fourier analysis.
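Both integral inner products can be verified with simple numerical quadrature; a sketch using a midpoint rule of our own:

```python
import math

def integral(f, a, b, n=20_000):
    # Midpoint-rule quadrature: accurate enough for these smooth integrands
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

# Legendre: 1, x, ½(3x² − 1) are pairwise orthogonal on [−1, 1]
p0 = lambda x: 1.0
p1 = lambda x: x
p2 = lambda x: 0.5 * (3 * x * x - 1)
assert abs(integral(lambda x: p0(x) * p2(x), -1, 1)) < 1e-6
assert abs(integral(lambda x: p1(x) * p2(x), -1, 1)) < 1e-6

# Fourier: sin(mx) is orthogonal to cos(nx) on [0, 2π]
m, k = 3, 5
assert abs(integral(lambda x: math.sin(m * x) * math.cos(k * x),
                    0, 2 * math.pi)) < 1e-6
```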
The Frobenius inner product on matrices is ⟨A,B⟩ = tr(AᵀB) = ∑ᵢⱼ aᵢⱼbᵢⱼ, which treats n×n matrices as vectors of n² entries.
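A sketch confirming that the trace form and the entrywise sum agree (the matrices are arbitrary examples):

```python
# Frobenius inner product: ⟨A, B⟩ = tr(AᵀB) = Σᵢⱼ aᵢⱼbᵢⱼ
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

# Entrywise sum of products
entrywise = sum(A[i][j] * B[i][j] for i in range(2) for j in range(2))

# Trace form: build AᵀB, then sum its diagonal
AtB = [[sum(A[k][i] * B[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
trace = AtB[0][0] + AtB[1][1]

print(entrywise, trace)  # → 70 70
```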
Each inner product defines its own geometry, but the linear algebra — projections, Gram-Schmidt, least squares — works identically in all of them.
The Pythagorean Theorem
If u and v are orthogonal (u⋅v=0), then
∥u+v∥² = ∥u∥² + ∥v∥²
The proof is a one-line expansion: ∥u+v∥² = u⋅u + 2u⋅v + v⋅v = ∥u∥² + 0 + ∥v∥².
The theorem extends to any number of mutually orthogonal vectors: if v₁,…,vₖ are pairwise orthogonal, then
∥v₁ + v₂ + ⋯ + vₖ∥² = ∥v₁∥² + ∥v₂∥² + ⋯ + ∥vₖ∥²
All cross terms vanish because every pair has dot product zero. This is not a special property of R2 or R3 — it holds in any inner product space. The Pythagorean theorem is a direct consequence of the inner product axioms, and it is the reason that orthogonal decompositions are so computationally clean: lengths decompose into independent, additive contributions from each perpendicular direction.
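A quick check with three pairwise orthogonal vectors in R3 (an arbitrary example):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Three pairwise orthogonal vectors
v1, v2, v3 = [1, 1, 0], [1, -1, 0], [0, 0, 2]
assert dot(v1, v2) == dot(v1, v3) == dot(v2, v3) == 0

# ‖v1 + v2 + v3‖² equals the sum of the squared lengths
s = [a + b + c for a, b, c in zip(v1, v2, v3)]
lhs = dot(s, s)
rhs = dot(v1, v1) + dot(v2, v2) + dot(v3, v3)
print(lhs, rhs)  # → 8 8
```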