Basis and dimension

The basis-construction procedure introduced in the previous lecture is difficult to analyze if it never terminates. While vector spaces can have infinite bases, we will focus on those with finite bases in this course. However, rather than assuming that every vector space under discussion has a finite basis, we adopt the weaker assumption that it is finitely generated. This is preferable because having a finite basis is a stronger condition than being finitely generated, and we generally want to start from the weaker assumption.

Since we have already begun to use the term, we should first define what a basis for a vector space is.

Definition. A basis $\beta$ for a vector space $V$ is a linearly independent subset of $V$ that generates $V$. If $\beta$ is a basis for $V$, we shall often say that the elements of $\beta$ form a basis for $V$.

Example. The standard basis for $F^n$ is the set $\{(1,0,\ldots,0),(0,1,\ldots,0),\ldots,(0,0,\ldots,1)\}$.

Example. $\{(1,1,0),(0,1,1),(1,0,1)\}$ forms a basis for $\mathbb{R}^3$.
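
This last example can be checked directly: three vectors form a basis of $\mathbb{R}^3$ precisely when the matrix having them as rows is invertible, i.e. has nonzero determinant. A minimal numerical check with numpy (illustrative only, not part of the notes' development):

```python
import numpy as np

# Rows are the candidate basis vectors from the example above.
B = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])

# A nonzero determinant means the rows are linearly independent, and three
# linearly independent vectors in R^3 necessarily span it.
print(np.linalg.det(B))   # approximately 2.0, so the set is a basis of R^3
```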

Example. $P_n(F)$ has a basis $\{1, x, x^2, \ldots, x^n\}$.

Example. $P(F)$ has a basis $\{1, x, x^2, \ldots\}$.

Lemma. Let $S \subseteq V$ be a finite set that generates the vector space $V$. Then $V$ has a finite basis.

proof. Let’s use the following process to construct a basis from S={x1,,xn}.

S = {x_1, ..., x_n}
for i = 1, 2, ..., n:
    if x_i is in span(S \ {x_i}):
        remove x_i from S
return S

Each removal leaves the span unchanged: an element is removed only when it already lies in the span of the elements remaining at that moment. Hence the procedure returns a set that still generates $V$. Moreover, every element $x$ of the resulting set satisfies $x \notin \operatorname{span}(S \setminus \{x\})$. Now suppose, for contradiction, that the resulting set $S = \{x_1, \ldots, x_k\}$ (relabeling the surviving elements) is linearly dependent. Then there exist $a_1, \ldots, a_k \in F$, not all zero, such that

$$0 = a_1 x_1 + \cdots + a_k x_k.$$

Without loss of generality (reordering and rescaling if necessary), we may assume $a_1 = 1$. Then $x_1 = -a_2 x_2 - \cdots - a_k x_k$ is a linear combination of the other elements, contradicting the property of $S$ noted above. Therefore $S$ is linearly independent and, consequently, a basis.
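
The pruning procedure in the proof can be sketched numerically for vectors given by coordinates in $\mathbb{R}^n$, with a rank comparison standing in for the span-membership test. This is only an illustration (floating-point rank tests are approximate), not part of the proof:

```python
import numpy as np

def prune_to_basis(vectors):
    """Drop every vector that lies in the span of the vectors that remain."""
    S = [np.asarray(v, dtype=float) for v in vectors]
    i = 0
    while i < len(S):
        rest = S[:i] + S[i + 1:]
        if not rest:
            break  # only one vector left; nothing to compare against
        # S[i] lies in span(rest) exactly when appending it does not raise the rank.
        if np.linalg.matrix_rank(np.vstack(rest + [S[i]])) == np.linalg.matrix_rank(np.vstack(rest)):
            S.pop(i)   # redundant: remove it and re-examine the same position
        else:
            i += 1     # needed: keep it and move on
    return S

# The four generators below are redundant; pruning leaves a basis of R^3.
print(prune_to_basis([(1, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 1)]))
```

As in the proof, which vectors survive depends on the order in which they are examined; any surviving set is a basis of the span of the original vectors.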


Lemma. Let $V$ be a vector space, and let $\beta$ be a finite basis for $V$. If $S$ is a finite subset of $V$ with $|S| > |\beta|$, then $S$ is linearly dependent.

Proof. Let $S = \{y_1, \ldots, y_m\}$ and $\beta = \{x_1, \ldots, x_n\}$, where $m > n$. Since $\beta$ is a basis for $V$, each $y_i$ can be expressed as a linear combination of the basis elements; that is, for each $y_i$ there exist scalars $a_{i1}, \ldots, a_{in} \in F$ such that

$$y_i = a_{i1} x_1 + \cdots + a_{in} x_n.$$

Let’s consider the m×n matrix A=(aij). Since m>n, A can be reduced to a triangular system through Gaussian elimination, resulting in a zero row vector at the bottom. This implies that S is linearly dependent.


Lemma. Let $V$ be a vector space having a basis $\beta$ containing exactly $n$ elements. Let $S = \{y_1, \ldots, y_m\}$ be a linearly independent subset of $V$ containing exactly $m$ elements, where $m \le n$. Then there exists a subset $S_1$ of $\beta$ containing exactly $n - m$ elements such that $S \cup S_1$ generates $V$.

Proof. We proceed by induction on $m$. The statement is trivially true when $S$ is empty: take $S_1 = \beta$. For the inductive step, assume the statement holds whenever $|S| = m \le n$; we must prove it when $|S| = m + 1 \le n$, so we may assume $m < n$. (When $m = n$ there is nothing further to prove; in that case the previous lemma shows that $S \cup \{v\}$ is linearly dependent for every $v \notin S$, so every vector of $V$ lies in $\operatorname{span}(S)$ and $S$ is already a basis.) Let $S = \{y_1, \ldots, y_m, y_{m+1}\}$. By the induction hypothesis applied to the linearly independent set $\{y_1, \ldots, y_m\}$, there exists $\{x_1, \ldots, x_k\} \subseteq \beta$ with $k = n - m$ such that $\{y_1, \ldots, y_m, x_1, \ldots, x_k\}$ generates $V$. In particular, there exist scalars $a_1, \ldots, a_m, b_1, \ldots, b_k \in F$ such that

$$y_{m+1} = a_1 y_1 + \cdots + a_m y_m + b_1 x_1 + \cdots + b_k x_k.$$

Note that $b_1, \ldots, b_k$ are not all zero; otherwise $y_{m+1}$ would be a linear combination of $y_1, \ldots, y_m$, contradicting the linear independence of $S$. Without loss of generality (reordering the $x_j$ if necessary), we may assume $b_k \neq 0$. This implies

$$x_k = -\frac{a_1}{b_k} y_1 - \cdots - \frac{a_m}{b_k} y_m + \frac{1}{b_k} y_{m+1} - \frac{b_1}{b_k} x_1 - \cdots - \frac{b_{k-1}}{b_k} x_{k-1}.$$

Thus $x_k$ lies in the span of $\{y_1, \ldots, y_{m+1}, x_1, \ldots, x_{k-1}\}$, and since $\{y_1, \ldots, y_m, x_1, \ldots, x_k\}$ generates $V$, the set $\{y_1, \ldots, y_{m+1}, x_1, \ldots, x_{k-1}\}$ generates $V$ as well. Taking $S_1 = \{x_1, \ldots, x_{k-1}\}$, a subset of $\beta$ with $k - 1 = n - (m + 1)$ elements, completes the inductive step. The statement therefore holds by induction.
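
The induction can also be read as a procedure: starting from the independent set $S$, walk through the basis $\beta$ and adjoin each basis vector that is not yet in the span of what has been collected. The sketch below (hypothetical vectors, with a floating-point rank test standing in for exact span membership) adds rather than exchanges, but it ends with exactly the kind of set $S \cup S_1$ that the lemma promises:

```python
import numpy as np

def extend_to_spanning_set(S, beta):
    """Adjoin elements of the basis `beta` to the independent set `S`
    until the collection spans the whole space."""
    current = [np.asarray(v, dtype=float) for v in S]
    n = len(beta)                                   # dimension of the space
    for x in beta:
        if np.linalg.matrix_rank(np.vstack(current)) == n:
            break                                   # already generates V
        candidate = current + [np.asarray(x, dtype=float)]
        # keep x only if it raises the rank, i.e. lies outside the current span
        if np.linalg.matrix_rank(np.vstack(candidate)) > np.linalg.matrix_rank(np.vstack(current)):
            current = candidate
    return current

# S = {(1, 1, 0)} is linearly independent (m = 1) and beta is the standard
# basis of R^3 (n = 3); the result has n vectors and generates R^3.
standard_basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(extend_to_spanning_set([(1, 1, 0)], standard_basis))
```

Here two of the three standard basis vectors are adjoined, matching $|S_1| = n - m = 2$; each adjoined vector raises the rank by one, so the final collection is independent as well as spanning.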