Dimension of Vector Space






The Number That Classifies a Vector Space

Every basis for a given vector space has the same number of elements. This number — the dimension — is the single most important invariant of the space. It governs how many vectors can be independent, how large a spanning set must be, how subspaces nest inside one another, and whether two spaces are structurally identical.



Definition

The dimension of a vector space $V$, written $\dim(V)$, is the number of vectors in any basis for $V$.

This definition relies on a nontrivial fact: all bases for $V$ have the same size. Without this guarantee, "the number of vectors in a basis" would depend on which basis was chosen, and the definition would be meaningless. The basis size theorem, proved in the next section, ensures that the count is intrinsic to the space itself.

By convention, the zero space $\{\mathbf{0}\}$ has dimension $0$. Its only "basis" is the empty set, which contains no vectors, is vacuously independent, and spans $\{\mathbf{0}\}$ since the empty linear combination produces $\mathbf{0}$.

The Basis Size Theorem

If a vector space $V$ has a basis with $n$ elements, then every basis for $V$ has exactly $n$ elements.

The proof rests on two supporting facts. First, in a space spanned by $n$ vectors, any set of more than $n$ vectors is dependent. Second, any spanning set contains at least as many vectors as any independent set.

Suppose $\mathcal{B}$ has $n$ elements and $\mathcal{C}$ has $m$ elements, and both are bases. Since $\mathcal{B}$ spans $V$ and $\mathcal{C}$ is independent, we get $m \leq n$. Since $\mathcal{C}$ spans $V$ and $\mathcal{B}$ is independent, we get $n \leq m$. Together, $m = n$.

This theorem is what makes dimension well-defined. It converts a property of a particular basis (its size) into a property of the space itself.

Dimensions of Standard Spaces

The dimension of $\mathbb{R}^n$ is $n$. The standard basis $\{\mathbf{e}_1, \dots, \mathbf{e}_n\}$ has $n$ elements, and any other basis for $\mathbb{R}^n$ also has exactly $n$ elements.

The matrix space $\mathbb{R}^{m \times n}$ has dimension $mn$. The standard basis consists of the $mn$ matrix units $E_{ij}$, one for each entry position.

The polynomial space $\mathcal{P}_n$ has dimension $n + 1$. The monomial basis $\{1, x, x^2, \dots, x^n\}$ contains $n + 1$ elements. Note the offset: the space of polynomials of degree at most $2$ is three-dimensional, not two-dimensional, because the constant term counts as a basis direction.
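As a numerical sketch, $\mathcal{P}_2$ can be identified with $\mathbb{R}^3$ through coefficient vectors $(a_0, a_1, a_2)$. The snippet below (an assumed example using NumPy's `matrix_rank`) confirms that two different bases of $\mathcal{P}_2$ both contain three elements:

```python
import numpy as np

# P_2 identified with R^3 via coefficient vectors (a0, a1, a2)
# for a0 + a1*x + a2*x^2 (assumed example polynomials).
monomials = np.eye(3)                      # columns: coordinates of 1, x, x^2
print(np.linalg.matrix_rank(monomials))    # 3, so dim(P_2) = 3

# A second basis: {1, 1 + x, 1 + x + x^2}
B = np.array([[1., 1., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
print(np.linalg.matrix_rank(B))            # also 3: every basis has 3 elements
```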

The space $\mathcal{P}$ of all polynomials (no degree bound) is infinite-dimensional. For any proposed finite basis $\{p_1, \dots, p_k\}$, let $N$ be the maximum degree among them. Then $x^{N+1}$ cannot be written as a combination of $p_1, \dots, p_k$, so the set does not span $\mathcal{P}$.

The space $C[a, b]$ of continuous functions on $[a, b]$ is also infinite-dimensional, for similar reasons — no finite set of continuous functions can generate all others through linear combinations.

Finite vs. Infinite Dimension

A vector space is finite-dimensional if it has a finite basis, and infinite-dimensional otherwise.

In finite-dimensional spaces, the theory is clean and complete. Every independent set can be extended to a basis. Every spanning set can be reduced to one. The dimension is a finite integer that governs the entire structure of the space.

In infinite-dimensional spaces, the situation is more delicate. Bases still exist (assuming the axiom of choice), but they are called Hamel bases and are often unwieldy — a Hamel basis for $C[0, 1]$ is uncountable and cannot be written down explicitly. In practice, infinite-dimensional spaces are handled with topological tools: Schauder bases (which allow infinite convergent sums), Hilbert space theory, and functional analysis.

On this site, all vector spaces are finite-dimensional unless explicitly stated otherwise. The finite-dimensional theory covers $\mathbb{R}^n$, $\mathcal{P}_n$, $\mathbb{R}^{m \times n}$, and solution spaces of linear ODEs with constant coefficients — the spaces that appear in a standard linear algebra course.

Dimension and Independence

Dimension places hard constraints on how large an independent set can be.

In an $n$-dimensional space, any set of more than $n$ vectors is dependent. This is the absolute ceiling: independence cannot survive past $n$ vectors, no matter how cleverly they are chosen. In $\mathbb{R}^3$, four or more vectors are always dependent. In $\mathcal{P}_2$, four or more polynomials are always dependent.

An independent set of exactly $n$ vectors is automatically a basis. Spanning comes for free once the count reaches the dimension. This is one of the most useful shortcuts in practice: to verify that $n$ vectors in an $n$-dimensional space form a basis, it suffices to check independence alone.
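This shortcut translates directly into a rank computation. The snippet below (an assumed example using NumPy) places three vectors of $\mathbb{R}^3$ as columns of a matrix; rank $3$ alone certifies independence, and hence that the vectors form a basis:

```python
import numpy as np

# Three assumed example vectors in R^3, placed as columns of a matrix.
A = np.array([[1., 0., 1.],
              [2., 1., 0.],
              [0., 1., 3.]])
rank = np.linalg.matrix_rank(A)
print(rank == 3)   # True: independent, hence automatically a basis
```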

An independent set of fewer than $n$ vectors can always be extended to a basis by adding $n - k$ more vectors, where $k$ is the current count. The extension is not unique — there are many ways to complete the set — but the final count is always $n$.
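One way to sketch the extension process is a greedy loop that appends standard basis vectors whenever they raise the rank. The helper `extend_to_basis` below is a hypothetical illustration, not a standard library routine:

```python
import numpy as np

def extend_to_basis(independent_cols, n):
    """Greedily extend an independent set in R^n to a basis by
    appending standard basis vectors that raise the rank."""
    basis = [np.asarray(v, dtype=float) for v in independent_cols]
    for i in range(n):
        candidate = basis + [np.eye(n)[:, i]]
        # Keep the candidate only if it stays independent of the rest.
        if np.linalg.matrix_rank(np.column_stack(candidate)) > len(basis):
            basis = candidate
        if len(basis) == n:
            break
    return basis

# Start with k = 1 independent vector in R^3; extension adds 3 - 1 = 2 more.
basis = extend_to_basis([[1, 1, 0]], 3)
print(len(basis))   # 3
```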

Dimension and Spanning

Dimension also constrains spanning sets from the other direction.

In an $n$-dimensional space, any set of fewer than $n$ vectors cannot span the space. The span has dimension at most $k < n$, so it is a proper subspace — some vectors in $V$ are unreachable. In $\mathbb{R}^3$, two vectors can span at most a plane, never all of $\mathbb{R}^3$.

A spanning set of exactly $n$ vectors is automatically a basis. Independence comes for free once the count hits the dimension. This is the mirror image of the independence shortcut: to verify that $n$ vectors span an $n$-dimensional space, it suffices to check spanning alone, and independence follows.

A spanning set of more than $n$ vectors contains redundancies. At least $k - n$ vectors can be removed without shrinking the span, where $k$ is the size of the set. Removing all redundant vectors produces a basis.
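The reduction can be sketched the same way: scan the spanning set and keep a vector only when it enlarges the span of those kept so far. The helper `reduce_to_basis` below is a hypothetical illustration using NumPy:

```python
import numpy as np

def reduce_to_basis(spanning_cols):
    """Remove redundant vectors from a spanning set: keep a vector
    only if it enlarges the span of the vectors kept so far."""
    kept = []
    for v in spanning_cols:
        candidate = kept + [np.asarray(v, dtype=float)]
        if np.linalg.matrix_rank(np.column_stack(candidate)) > len(kept):
            kept = candidate
    return kept

# Four assumed vectors spanning R^3; at least 4 - 3 = 1 is redundant.
vectors = [[1, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1]]
print(len(reduce_to_basis(vectors)))   # 3
```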

These two shortcuts — "$n$ independent vectors in an $n$-dimensional space form a basis" and "$n$ spanning vectors in an $n$-dimensional space form a basis" — are the workhorses of basis verification.

Dimension of Subspaces

If $W$ is a subspace of a finite-dimensional space $V$, then

$$\dim(W) \leq \dim(V)$$


The inequality is strict ($\dim(W) < \dim(V)$) for every proper subspace — a subspace that is not all of $V$. The only subspace with $\dim(W) = \dim(V)$ is $W = V$ itself. This follows because any basis for $W$ is an independent set in $V$ with $\dim(W)$ elements, and extending it to a basis for $V$ adds $\dim(V) - \dim(W)$ vectors. If no vectors need adding, the basis for $W$ already spans all of $V$, so $W = V$.

In $\mathbb{R}^3$, the possible subspace dimensions are $0$ (the zero vector), $1$ (a line through the origin), $2$ (a plane through the origin), and $3$ ($\mathbb{R}^3$ itself). There is no subspace of dimension $\frac{3}{2}$ or $\pi$ — dimension is always a non-negative integer.

For an $m \times n$ matrix $A$, the column space is a subspace of $\mathbb{R}^m$ with dimension equal to the rank, and the null space is a subspace of $\mathbb{R}^n$ with dimension $n - \text{rank}(A)$. Both dimensions are bounded by the dimensions of their ambient spaces.

The Dimension Formula for Subspace Sums

For two subspaces $W_1$ and $W_2$ of a finite-dimensional space $V$, the dimension of their sum satisfies

$$\dim(W_1 + W_2) = \dim(W_1) + \dim(W_2) - \dim(W_1 \cap W_2)$$


This is the linear algebra analogue of the inclusion-exclusion formula for counting elements in the union of two sets. The intersection is subtracted because vectors in $W_1 \cap W_2$ are counted once in $\dim(W_1)$ and once in $\dim(W_2)$, but should only contribute once to the dimension of the sum.

When $W_1 \cap W_2 = \{\mathbf{0}\}$, the formula simplifies to $\dim(W_1 + W_2) = \dim(W_1) + \dim(W_2)$. In this case the sum is called a direct sum, written $W_1 \oplus W_2$. Every vector in a direct sum has a unique decomposition as $\mathbf{w}_1 + \mathbf{w}_2$ with $\mathbf{w}_1 \in W_1$ and $\mathbf{w}_2 \in W_2$.

For example, in $\mathbb{R}^3$, let $W_1$ be the $xy$-plane (dimension $2$) and $W_2$ be the $z$-axis (dimension $1$). Their intersection is $\{\mathbf{0}\}$, so $\dim(W_1 + W_2) = 2 + 1 - 0 = 3$. The sum is all of $\mathbb{R}^3$, and the decomposition $\mathbb{R}^3 = W_1 \oplus W_2$ splits every vector into its $xy$-component and its $z$-component.
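The formula can also be checked numerically: the dimension of a span is the rank of the matrix whose columns generate it, and the sum $W_1 + W_2$ is spanned by the combined generator lists. The example below (assumed vectors, using NumPy) takes $W_1$ as the $xy$-plane and $W_2$ as a plane containing the $z$-axis, so the intersection this time is a line rather than $\{\mathbf{0}\}$:

```python
import numpy as np

def dim_span(cols):
    # Dimension of a span = rank of the matrix whose columns generate it.
    return int(np.linalg.matrix_rank(np.column_stack(cols))) if cols else 0

W1 = [[1, 0, 0], [0, 1, 0]]    # xy-plane
W2 = [[0, 0, 1], [1, 1, 0]]    # a plane containing the z-axis
d1, d2 = dim_span(W1), dim_span(W2)
d_sum = dim_span(W1 + W2)      # generators of W1 + W2: both lists combined
d_int = d1 + d2 - d_sum        # solve the dimension formula for the intersection
print(d1, d2, d_sum, d_int)    # 2 2 3 1
```

Here the intersection is the line spanned by $(1, 1, 0)$, which the formula detects without computing the intersection directly.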

The Rank-Nullity Theorem as a Dimension Statement

For an $m \times n$ matrix $A$, the rank-nullity theorem states

$$\dim(\text{Col}(A)) + \dim(\text{Null}(A)) = n$$


The column space is a subspace of $\mathbb{R}^m$ with dimension $\text{rank}(A)$. The null space is a subspace of $\mathbb{R}^n$ with dimension $n - \text{rank}(A)$. Their dimensions add up to $n$, the dimension of the domain $\mathbb{R}^n$.

Interpreted through the lens of linear transformations, this says that the $n$ dimensions of the domain split between the image (what the map hits) and the kernel (what the map kills). No dimensions are lost or created — they are redistributed.

The rank-nullity theorem is the fundamental bridge between abstract dimension theory and concrete matrix computation. It connects the number of pivot columns (rank) to the number of free variables (nullity), and it guarantees that every question about the dimension of a null space or column space reduces to row reduction and pivot counting.
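A quick numerical check of the theorem, on an assumed $3 \times 4$ example whose third row is the sum of the first two (so the rank is $2$):

```python
import numpy as np

# Assumed example: a 3x4 matrix with row3 = row1 + row2, hence rank 2.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])
m, n = A.shape
rank = np.linalg.matrix_rank(A)    # dim(Col(A))
nullity = n - rank                 # dim(Null(A))
print(rank, nullity, rank + nullity == n)   # 2 2 True
```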

Dimension and Isomorphism

Two finite-dimensional vector spaces over the same field are isomorphic — structurally identical as vector spaces — if and only if they have the same dimension.

The forward direction is straightforward: an isomorphism maps a basis to a basis, preserving the count. The reverse direction is the deeper fact: given any two $n$-dimensional spaces $V$ and $W$, choosing a basis for each creates coordinate maps $V \to \mathbb{R}^n$ and $W \to \mathbb{R}^n$. Composing one with the inverse of the other gives a direct isomorphism $V \to W$.
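The coordinate-map idea can be sketched concretely for $\mathcal{P}_2 \to \mathbb{R}^3$ relative to the monomial basis $\{1, x, x^2\}$; the `coords` helper below is a hypothetical illustration of how the map turns polynomial arithmetic into vector arithmetic:

```python
import numpy as np

# Coordinate map P_2 -> R^3 relative to {1, x, x^2}:
# a0 + a1*x + a2*x^2  |->  (a0, a1, a2).
def coords(poly_coeffs):
    return np.array(poly_coeffs, dtype=float)

p = coords([1, 2, 0])    # 1 + 2x
q = coords([0, 1, 3])    # x + 3x^2
# Adding polynomials corresponds to adding coordinate vectors:
# (1 + 2x) + (x + 3x^2) = 1 + 3x + 3x^2.
print(np.allclose(p + q, coords([1, 3, 3])))   # True
```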

This means dimension is the single complete invariant for finite-dimensional vector spaces. It captures everything about the linear-algebraic structure — independence, span, basis size, subspace behavior — in a single integer. Two spaces with the same dimension may contain very different objects (polynomials vs. matrices vs. functions), but from the perspective of linear algebra, they are indistinguishable.

$\mathbb{R}^3$, $\mathcal{P}_2$, and the solution space of a third-order homogeneous linear ODE are all three-dimensional. They are isomorphic as vector spaces. Every theorem that holds in one holds in all three — that is the payoff of the axiomatic approach.