# Basis – Serlo


In the last two articles we got to know the two concepts of generator and linear independence. In this article, we combine the two concepts and introduce the notion of a basis of a vector space.

## Motivation

### Via linear independence

We want to work towards the concept of dimension. Intuitively, we can think of it as the maximum number of linearly independent directions in a space. Sticking to this intuition, we can give the following preliminary definition of dimension: The dimension of a vector space is the maximum number of linearly independent vectors that we can simultaneously choose in it. To find this number, we need to find a maximal system of linearly independent vectors.

To see whether this preliminary definition makes sense, let us try to apply it to ${\displaystyle \mathbb {R} ^{3}}$: Clearly, ${\displaystyle \mathbb {R} ^{3}}$ should be three-dimensional. This means we have to ask ourselves two questions: Do three linearly independent vectors fit into ${\displaystyle \mathbb {R} ^{3}}$, and is there no fourth vector that is linearly independent of such a system of three? Let us try this out: We take any first vector, e.g. ${\displaystyle v_{1}:=(2,-1,0)^{T}}$. Now we want to find a vector linearly independent of it, i.e. a vector that is not a multiple of ${\displaystyle v_{1}}$ and hence does not lie in ${\displaystyle \operatorname {span} (\{v_{1}\})}$. An example is ${\displaystyle v_{2}:=(2,1,0)^{T}}$.

If we want to follow the intuition that ${\displaystyle \mathbb {R} ^{3}}$ is three-dimensional, we still have to find a third vector that is linearly independent of the system ${\displaystyle \{v_{1},v_{2}\}}$. Now ${\displaystyle v_{1}}$ and ${\displaystyle v_{2}}$ span a plane in which the last component is always zero. Thus ${\displaystyle v_{3}=(0,0,1)^{T}}$ is a vector linearly independent of ${\displaystyle \{v_{1},v_{2}\}}$.

Now the question is whether with these three vectors the maximum number of linearly independent vectors has already been reached. To answer this, let us first consider the vector ${\displaystyle (1,0,1)^{T}}$ as an example. We want to check whether we can add this vector to ${\displaystyle v_{1},v_{2}}$ and ${\displaystyle v_{3}}$ and still obtain a system of linearly independent vectors. First we note that

${\displaystyle {\begin{pmatrix}1\\0\\0\end{pmatrix}}={\frac {1}{4}}\cdot {\begin{pmatrix}2\\-1\\0\end{pmatrix}}+{\frac {1}{4}}\cdot {\begin{pmatrix}2\\1\\0\end{pmatrix}}}$

So we can write the above vector as

${\displaystyle {\begin{pmatrix}1\\0\\1\end{pmatrix}}={\frac {1}{4}}\cdot {\begin{pmatrix}2\\-1\\0\end{pmatrix}}+{\frac {1}{4}}\cdot {\begin{pmatrix}2\\1\\0\end{pmatrix}}+1\cdot {\begin{pmatrix}0\\0\\1\end{pmatrix}}}$

So ${\displaystyle (1,0,1)^{T}\in \operatorname {span} (\{v_{1},v_{2},v_{3}\})}$. Now let us ask the same question for any vector ${\displaystyle (a,b,c)^{T}\in \mathbb {R} ^{3}}$ with ${\displaystyle a,b,c\in \mathbb {R} }$. For this we first get

{\displaystyle {\begin{aligned}{\begin{pmatrix}1\\0\\0\end{pmatrix}}&={\frac {1}{4}}\cdot {\begin{pmatrix}2\\1\\0\end{pmatrix}}+{\frac {1}{4}}\cdot {\begin{pmatrix}2\\-1\\0\end{pmatrix}}\\[0.5em]{\begin{pmatrix}0\\1\\0\end{pmatrix}}&={\frac {1}{2}}\cdot {\begin{pmatrix}2\\1\\0\end{pmatrix}}-{\frac {1}{2}}\cdot {\begin{pmatrix}2\\-1\\0\end{pmatrix}}\end{aligned}}}

With this consideration we obtain for ${\displaystyle (a,b,c)^{T}}$ the representation

{\displaystyle {\begin{aligned}{\begin{pmatrix}a\\b\\c\end{pmatrix}}&=a\cdot {\begin{pmatrix}1\\0\\0\end{pmatrix}}+b\cdot {\begin{pmatrix}0\\1\\0\end{pmatrix}}+c\cdot {\begin{pmatrix}0\\0\\1\end{pmatrix}}\\[0.5em]&=a\cdot \left({\frac {1}{4}}\cdot {\begin{pmatrix}2\\1\\0\end{pmatrix}}+{\frac {1}{4}}\cdot {\begin{pmatrix}2\\-1\\0\end{pmatrix}}\right)+b\cdot \left({\frac {1}{2}}\cdot {\begin{pmatrix}2\\1\\0\end{pmatrix}}-{\frac {1}{2}}\cdot {\begin{pmatrix}2\\-1\\0\end{pmatrix}}\right)+c\cdot {\begin{pmatrix}0\\0\\1\end{pmatrix}}\\[0.5em]&={\frac {a+2b}{4}}\cdot {\begin{pmatrix}2\\1\\0\end{pmatrix}}+{\frac {a-2b}{4}}\cdot {\begin{pmatrix}2\\-1\\0\end{pmatrix}}+c\cdot {\begin{pmatrix}0\\0\\1\end{pmatrix}}\end{aligned}}}

Thus ${\displaystyle (a,b,c)^{T}\in \operatorname {span} (\{v_{1},v_{2},v_{3}\})}$. Since this vector was arbitrarily chosen, every vector from ${\displaystyle \mathbb {R} ^{3}}$ is representable as a linear combination of the linearly independent vectors ${\displaystyle v_{1},v_{2}}$ and ${\displaystyle v_{3}}$. Thus ${\displaystyle \{v_{1},v_{2},v_{3}\}}$ is a generator of ${\displaystyle \mathbb {R} ^{3}}$. Therefore, we cannot add any further vector to ${\displaystyle v_{1},v_{2}}$ and ${\displaystyle v_{3}}$ such that the system remains linearly independent, since every other vector from ${\displaystyle \mathbb {R} ^{3}}$ can be represented as a linear combination of ${\displaystyle v_{1},v_{2}}$ and ${\displaystyle v_{3}}$. In other words, ${\displaystyle v_{1},v_{2}}$ and ${\displaystyle v_{3}}$ form a maximal system of linearly independent vectors.

In summary, we proceeded as follows to find a maximal system of linearly independent vectors: We start with a vector that is not the zero vector. (That means we exclude the zero vector space here.) Then we proceed step by step: Once we have found linearly independent vectors ${\displaystyle v_{1},...,v_{n}}$, we form the span of these vectors. If this is the entire vector space, we have found a generator and are done. Otherwise, we choose a vector ${\displaystyle v_{n+1}}$ that is not in ${\displaystyle \operatorname {span} (\{v_{1},...,v_{n}\})}$. This vector contributes a new direction, so the system ${\displaystyle v_{1},...,v_{n+1}}$ is again linearly independent. We repeat this until we find a generator. We thus obtain the characterisation that a maximal system of linearly independent vectors is a generator consisting of linearly independent vectors.
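The stepwise extension procedure above can be sketched in code. The following Python snippet is an illustration, not part of the original text: it uses exact arithmetic with `Fraction` and a rank computation to decide whether a candidate vector adds a new direction. The names `rank` and `extend_to_maximal` are our own choices.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors, via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def extend_to_maximal(candidates):
    """Keep each candidate vector only if it adds a new direction,
    i.e. if it is not in the span of the vectors kept so far."""
    kept = []
    for v in candidates:
        if rank(kept + [v]) > rank(kept):
            kept.append(v)
    return kept

# v1, v2 from the text, a redundant vector (v1 + v2), and the vectors discussed above
vs = [(2, -1, 0), (2, 1, 0), (4, 0, 0), (0, 0, 1), (1, 0, 1)]
print(extend_to_maximal(vs))  # [(2, -1, 0), (2, 1, 0), (0, 0, 1)]
```

The redundant candidates are skipped because they do not increase the rank; the three kept vectors form a maximal linearly independent system.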

### Via generators

So far we have started with a system of linearly independent vectors (which is not yet a generator) and extended it until it became a maximal system of linearly independent vectors. Now we want to investigate what happens when we reverse the direction. That is, we start with a generator (which is not linearly independent) and reduce it until we obtain a minimal (i.e., linearly independent) generator.

Let us consider

${\displaystyle E=\left\{{\begin{pmatrix}2\\1\\1\end{pmatrix}},{\begin{pmatrix}-1\\0\\-1\end{pmatrix}},{\begin{pmatrix}2\\1\\2\end{pmatrix}},{\begin{pmatrix}1\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\0\\1\end{pmatrix}}\right\}}$

First, we establish that ${\displaystyle E}$ is a generator of ${\displaystyle \mathbb {R} ^{3}}$ by the following calculation: For a vector ${\displaystyle (\alpha ,\beta ,\gamma )^{T}\in \mathbb {R} ^{3}}$ with ${\displaystyle \alpha ,\beta ,\gamma \in \mathbb {R} }$ we have that

${\displaystyle {\begin{pmatrix}\alpha \\\beta \\\gamma \end{pmatrix}}=(\alpha -\beta -\gamma )\cdot {\begin{pmatrix}2\\1\\1\end{pmatrix}}+\beta \cdot {\begin{pmatrix}-1\\0\\-1\end{pmatrix}}+(\beta +\gamma -\alpha )\cdot {\begin{pmatrix}2\\1\\2\end{pmatrix}}+\beta \cdot {\begin{pmatrix}1\\1\\0\end{pmatrix}}+\alpha \cdot {\begin{pmatrix}1\\0\\1\end{pmatrix}}}$

So we can represent any vector ${\displaystyle (\alpha ,\beta ,\gamma )^{T}\in \mathbb {R} ^{3}}$ as a linear combination of vectors from the generator.
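As a sanity check of this coefficient formula, one can let a computer recompute the linear combination for a few sample vectors. The short Python script below is an illustration; the names `E` and `combine` are our own.

```python
# The generator E and the coefficient formula from the calculation above
E = [(2, 1, 1), (-1, 0, -1), (2, 1, 2), (1, 1, 0), (1, 0, 1)]

def combine(coeffs, vectors):
    """Componentwise linear combination sum_i coeffs[i] * vectors[i]."""
    return tuple(sum(c * v[k] for c, v in zip(coeffs, vectors)) for k in range(3))

for a, b, c in [(1, 0, 0), (0, 1, 0), (0, 0, 1), (3, -2, 5), (-1, 4, 2)]:
    coeffs = [a - b - c, b, b + c - a, b, a]  # coefficients from the formula above
    assert combine(coeffs, E) == (a, b, c)
print("the coefficient formula reproduces every test vector")
```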

Now we ask ourselves whether we can reduce the above generator without losing the property of it being a generator. The vector ${\displaystyle (-1,0,-1)^{T}}$ is a multiple of ${\displaystyle (1,0,1)^{T}}$. That is, the direction represented by ${\displaystyle (-1,0,-1)^{T}}$ is the same as the direction represented by ${\displaystyle (1,0,1)^{T}}$. Hence, we can remove this vector from ${\displaystyle E}$ and obtain a new generator

${\displaystyle E_{1}=\left\{{\begin{pmatrix}2\\1\\1\end{pmatrix}},{\begin{pmatrix}2\\1\\2\end{pmatrix}},{\begin{pmatrix}1\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\0\\1\end{pmatrix}}\right\}}$

Can we reduce the size of this generator, as well? Yes, since

${\displaystyle {\begin{pmatrix}2\\1\\1\end{pmatrix}}={\begin{pmatrix}1\\0\\1\end{pmatrix}}+{\begin{pmatrix}1\\1\\0\end{pmatrix}}}$

So ${\displaystyle (2,1,1)^{T}}$ adds no new direction that is not already spanned by ${\displaystyle (1,0,1)^{T}}$ and ${\displaystyle (1,1,0)^{T}}$. We thus obtain a smaller generator

${\displaystyle E_{2}=\left\{{\begin{pmatrix}2\\1\\2\end{pmatrix}},{\begin{pmatrix}1\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\0\\1\end{pmatrix}}\right\}}$

Now we cannot reduce the generator any further without losing the property of it being a generator: if we remove any of the three vectors in ${\displaystyle E_{2}}$, the removed vector is not in the span of the remaining two. For ${\displaystyle (1,0,1)^{T}}$, for example, we see this as follows: Suppose we had some ${\displaystyle \lambda ,\mu \in \mathbb {R} }$ such that

${\displaystyle {\begin{pmatrix}1\\0\\1\end{pmatrix}}=\lambda {\begin{pmatrix}1\\1\\0\end{pmatrix}}+\mu {\begin{pmatrix}2\\1\\2\end{pmatrix}}}$

Then ${\displaystyle \lambda =-\mu }$ would have to hold because the second component must be zero on both sides. Because of the third component, ${\displaystyle \mu ={\frac {1}{2}}}$ must be true. Thus we obtain the contradiction

${\displaystyle {\begin{pmatrix}1\\0\\1\end{pmatrix}}=-{\frac {1}{2}}{\begin{pmatrix}1\\1\\0\end{pmatrix}}+{\frac {1}{2}}{\begin{pmatrix}2\\1\\2\end{pmatrix}}={\begin{pmatrix}0.5\\0\\1\end{pmatrix}}}$

So ${\displaystyle (1,0,1)^{T}}$ is not in ${\displaystyle \operatorname {span} ((1,1,0)^{T},(2,1,2)^{T})}$. So we have found a generator containing linearly independent vectors (a minimal generator).

In summary, we proceeded as follows: We started with a generator of vectors ${\displaystyle v_{1},\dots ,v_{n}}$ and reduced it according to the following algorithm: If ${\displaystyle v_{1},\dots ,v_{n}}$ is a system of linearly independent vectors, we cannot remove any vector without losing the property of being a generator. That means we are done and have found a minimal generator. Otherwise, we find a vector ${\displaystyle v_{i}}$ with ${\displaystyle i\in \{1,\dots ,n\}}$ that is in ${\displaystyle \operatorname {span} (\{v_{1},\dots ,v_{i-1},v_{i+1},\dots ,v_{n}\})}$. This vector can be omitted and we obtain a new generator consisting of ${\displaystyle (n-1)}$ vectors. With this generator, we repeat the same steps until we reach a system of linearly independent vectors.

We thus obtain the characterisation that a minimal generator is a generator consisting of linearly independent vectors.
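This reduction algorithm can also be sketched in code. The Python snippet below is an illustration with self-chosen names `rank` and `reduce_to_minimal`: it repeatedly drops a vector that lies in the span of the others. Applied to the generator ${\displaystyle E}$ from above, it ends with the three vectors of ${\displaystyle E_{2}}$.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors, via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def reduce_to_minimal(gens):
    """Drop vectors that lie in the span of the remaining ones, until none does."""
    gens = list(gens)
    i = 0
    while i < len(gens):
        rest = gens[:i] + gens[i + 1:]
        if rank(rest) == rank(gens):  # gens[i] adds no new direction
            gens = rest               # remove it and restart the scan
            i = 0
        else:
            i += 1
    return gens

E = [(2, 1, 1), (-1, 0, -1), (2, 1, 2), (1, 1, 0), (1, 0, 1)]
print(reduce_to_minimal(E))  # [(2, 1, 2), (1, 1, 0), (1, 0, 1)]: the generator E2 from above
```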

### Motivation - Conclusion

We have found the characterisations of linearly independent generators as minimal generators and as maximal linearly independent subsets. Thus the property of being a linearly independent generator is a special property of both linearly independent sets and generators. A set with both properties is called a basis in Linear Algebra.

Since bases are generators, every vector has a representation as a linear combination of basis vectors. This representation is unique because bases are linearly independent. So we have found another way of describing bases:

A basis is a subset such that every vector has a unique representation as a linear combination of basis vectors.

## Definition: Basis of a vector space

Definition (Basis of a vector space)

Let ${\displaystyle K}$ be a field and ${\displaystyle V}$ a ${\displaystyle K}$-vector space. If ${\displaystyle B\subseteq V}$ is a generator of ${\displaystyle V}$, whose vectors are linearly independent, then ${\displaystyle B}$ is called a basis of ${\displaystyle V}$.

Hint

We have defined a basis as a set of vectors. This does not determine the order of the vectors. Alternatively, one can define a basis as a tuple of vectors. In this case, the order of the vectors is fixed: changing the order results in a different basis.

## Equivalent definitions of basis

Theorem (Equivalent definitions of basis)

Let ${\displaystyle V}$ be a ${\displaystyle K}$-vector space. For a subset ${\displaystyle B\subseteq V}$ the following four statements are equivalent:

1. ${\displaystyle B}$ is a linearly independent generator of ${\displaystyle V}$.
2. ${\displaystyle B}$ is a maximal linearly independent subset of ${\displaystyle V}$. This means that if another element ${\displaystyle c\in V\setminus B}$ is added to ${\displaystyle B}$, the new set ${\displaystyle B\cup \lbrace c\rbrace }$ is no longer linearly independent.
3. Each element of ${\displaystyle V}$ can be uniquely represented as a linear combination of vectors from ${\displaystyle B}$.
4. ${\displaystyle B}$ is a minimal generator of ${\displaystyle V}$. This means: ${\displaystyle B}$ is a generator of ${\displaystyle V}$. But if an element is removed from ${\displaystyle B}$, the remaining set is no longer a generator of ${\displaystyle V}$.

Proof (Equivalent definitions of basis)

We prove the equivalence of these statements by a "circle of implications" of the kind ${\displaystyle 1\implies 2\implies 3\implies 4\implies 1}$:

Proof step: ${\displaystyle 1\implies 2}$

We show this using a proof by contradiction: We assume that statement 1 is true and statement 2 is false, and derive a contradiction. So statement 2 cannot be false, i.e., it must be true whenever statement 1 holds.

So let us assume that ${\displaystyle B:=\{b_{1},\dots ,b_{n}\}}$ is a linearly independent generator but not a maximal linearly independent subset of ${\displaystyle V}$. Thus there is a ${\displaystyle c\in V\setminus B}$ such that the vectors of ${\displaystyle B\cup \lbrace c\rbrace }$ are linearly independent. (This is exactly the opposite of statement 2.) However, since ${\displaystyle B}$ is a generating system according to statement 1, we can write ${\displaystyle c}$ as a linear combination of elements from ${\displaystyle B}$:

${\displaystyle c=\lambda _{1}b_{1}+\dots +\lambda _{n}b_{n}}$

with ${\displaystyle \lambda _{1},...,\lambda _{n}\in K}$. At least one ${\displaystyle \lambda _{i}\neq 0}$ with ${\displaystyle 1\leq i\leq n}$, because ${\displaystyle B\cup \lbrace c\rbrace }$ is linearly independent and therefore ${\displaystyle c\neq 0}$. The above equation can be transformed into:

${\displaystyle 0=-c+\lambda _{1}b_{1}+\dots +\lambda _{n}b_{n}}$

Thus the set ${\displaystyle B\cup \lbrace c\rbrace }$ is linearly dependent. This is a contradiction to our assumption that ${\displaystyle B\cup \lbrace c\rbrace }$ is linearly independent. That means, we have verified ${\displaystyle 1\implies 2}$ by contradiction.

Proof step: ${\displaystyle 2\implies 3}$

The proof is divided into two steps. In the first step we show that under the assumption of statement 2 a linear combination exists for every element from ${\displaystyle V}$ with elements from ${\displaystyle B}$. In the second step we show the uniqueness of this linear combination. Both steps must be shown, because statement 3 requires the existence of such a linear combination as well as its uniqueness.

Let ${\displaystyle B:=\{b_{1},\dots ,b_{n}\}}$ be a maximal linearly independent subset of ${\displaystyle V}$ and let ${\displaystyle c\in V}$ be any element of our vector space. We will now show that we can write ${\displaystyle c}$ as a linear combination of elements from ${\displaystyle B}$. We distinguish two cases; together they cover all possibilities, since we first consider ${\displaystyle c\in B}$ and then ${\displaystyle c\in V\setminus B}$.

Case 1:

We assume that ${\displaystyle c\in B}$, i.e., the vector we are looking for is part of our maximal linearly independent subset. The linear combination is trivial in this case because ${\displaystyle c}$ is already in ${\displaystyle B}$. Thus we can simply write

${\displaystyle c=1\cdot c}$ and we are done.

Case 2:

We assume that ${\displaystyle c\notin B}$, so ${\displaystyle c\in V\setminus B}$. We now exploit the maximal linear independence of ${\displaystyle B}$. To do this, we consider the set ${\displaystyle B\cup \lbrace c\rbrace }$. Because of ${\displaystyle c\in V\setminus B}$ and the maximality of ${\displaystyle B}$, the set ${\displaystyle B\cup \lbrace c\rbrace }$ is linearly dependent. Linear dependence means that there exists a linear combination resulting in zero in which not all coefficients are zero:

${\displaystyle 0=\lambda _{0}c+\lambda _{1}b_{1}+\dots +\lambda _{n}b_{n}}$

Here at least one ${\displaystyle \lambda _{i}\neq 0}$. By rearranging, we obtain a linear combination of ${\displaystyle c}$ in terms of the elements from ${\displaystyle B}$:

${\displaystyle c=-{\frac {1}{\lambda _{0}}}\cdot (\lambda _{1}b_{1}+\dots +\lambda _{n}b_{n})}$

We still need to show that ${\displaystyle \lambda _{0}\neq 0}$ because we need ${\displaystyle {\tfrac {1}{\lambda _{0}}}}$. If we had ${\displaystyle \lambda _{0}=0}$, then the above line would form a linear combination of zero only with elements from ${\displaystyle B}$. This would be a contradiction to ${\displaystyle B}$ being linearly independent. Thus we have that also ${\displaystyle \lambda _{0}\neq 0}$.

Thus we have finished the consideration of both cases and get a linear combination of ${\displaystyle c}$ using elements from ${\displaystyle B}$. Since ${\displaystyle c}$ was arbitrarily chosen from ${\displaystyle V}$, we have now already shown that every element from ${\displaystyle V}$ can be represented as a linear combination of vectors from ${\displaystyle B}$. This shows the existence of the linear combination.

In the next and final step we have to prove the uniqueness of this linear combination. We show this again via a proof by contradiction. We assume that there is a ${\displaystyle c\in V}$ which can be represented by two different linear combinations. A contradiction to statement 2 will then be inferred from this representation ambiguity. Let

{\displaystyle {\begin{aligned}c&=\lambda _{1}b_{1}+\dots +\lambda _{n}b_{n}\\c&=\kappa _{1}b_{1}+\dots +\kappa _{n}b_{n}\end{aligned}}}

with ${\displaystyle \lambda _{1},...,\lambda _{n},\kappa _{1},...,\kappa _{n}\in K}$. Since both linear combinations are different, there is at least one ${\displaystyle j}$ for which ${\displaystyle \lambda _{j}-\kappa _{j}\neq 0}$. If this were not true, all coefficients would be identical and thus both linear combinations would be identical. We now subtract these two equations and get the following representation of the zero vector:

${\displaystyle 0=(\lambda _{1}-\kappa _{1})b_{1}+\dots +(\lambda _{n}-\kappa _{n})b_{n}}$

Let us define the difference coefficients ${\displaystyle \eta _{i}:=\lambda _{i}-\kappa _{i}}$. Since ${\displaystyle \lambda _{j}-\kappa _{j}\neq 0}$, we also have ${\displaystyle \eta _{j}\neq 0}$. Thus we can rewrite the above equation as:

${\displaystyle 0=\eta _{1}b_{1}+\dots +\eta _{n}b_{n}}$

With ${\displaystyle \eta _{j}}$ there is at least one non-zero coefficient. So this is a non-trivial linear combination of zero with elements from ${\displaystyle B}$. It follows that ${\displaystyle B}$ is linearly dependent. This is a contradiction to statement 2, according to which ${\displaystyle B}$ is linearly independent. So our ambiguity assumption was false and the linear combination must be unique. Therefore we have that ${\displaystyle 2\implies 3}$.

Proof step: ${\displaystyle 3\implies 4}$

We carry out the proof in two steps. First we show that, assuming statement 3, ${\displaystyle B}$ is a generator. Then we show that it is minimal. By the definition of a generator, ${\displaystyle B}$ must be a subset of ${\displaystyle V}$ and it must span ${\displaystyle V}$ (i.e. ${\displaystyle \operatorname {span} (B)=V}$). The first condition holds since ${\displaystyle B\subseteq V}$ by assumption. Since we can represent every element from ${\displaystyle V}$ as a linear combination of elements from ${\displaystyle B}$, we also have that ${\displaystyle B}$ spans the vector space ${\displaystyle V}$. Thus ${\displaystyle B}$ is a generator.

In the next step we want to show that ${\displaystyle B}$ is minimal. We again perform a proof by contradiction and assume that ${\displaystyle B}$ is not a minimal generator. This leads us to a contradiction to statement 3. If ${\displaystyle B}$ is not a minimal generator, there exists a proper subset ${\displaystyle B'\subsetneq B}$ which is also a generator. Let now ${\displaystyle b\in B\setminus B'}$, so ${\displaystyle b\notin B'}$. Then there exist two different linear combinations for ${\displaystyle b}$ built from vectors in ${\displaystyle B}$. On the one hand, ${\displaystyle b}$ can be "trivially" represented:

${\displaystyle b=1\cdot b}$

or we use the representation of ${\displaystyle b}$ by elements from ${\displaystyle B'}$ (which can be done because ${\displaystyle B'}$ forms a generator):

${\displaystyle b=\lambda _{1}b'_{1}+\cdots +\lambda _{n}b'_{n}}$

with ${\displaystyle \lambda _{1},...,\lambda _{n}\in K}$ and ${\displaystyle b'_{1},...,b'_{n}\in B'\subseteq B}$. Since, according to statement 3, every element from ${\displaystyle V}$ has a unique linear combination of vectors from ${\displaystyle B}$, the existence of these two different linear combinations contradicts statement 3. Thus ${\displaystyle B}$ must be a minimal generator. This concludes the proof by contradiction and we have that ${\displaystyle 3\implies 4}$.

Proof step: ${\displaystyle 4\implies 1}$

We show this again using a proof by contradiction. So we assume that statement 4 holds but statement 1 is false. Since ${\displaystyle B}$ is a generator by statement 4, this means that ${\displaystyle B}$ is not linearly independent. That is, there exists a ${\displaystyle b_{1}\in B}$ that can be represented as a linear combination of vectors ${\displaystyle b_{2},...,b_{n}}$ from ${\displaystyle B\setminus \{b_{1}\}}$:

${\displaystyle b_{1}=\lambda _{2}b_{2}+\cdots +\lambda _{n}b_{n}}$

Here, at least one coefficient ${\displaystyle \lambda _{i}\neq 0}$. We now derive a contradiction to statement 4 by showing that ${\displaystyle B}$ is then not a minimal generator.

Let ${\displaystyle c\in V}$ be any vector. Since ${\displaystyle B}$ is a generator, there is a linear combination of ${\displaystyle c}$ with vectors ${\displaystyle b_{1},...,b_{n}}$ from ${\displaystyle B}$:

${\displaystyle c=\kappa _{1}b_{1}+\cdots +\kappa _{n}b_{n}}$

Now we plug the above linear combination of ${\displaystyle b_{1}}$ into the linear combination of ${\displaystyle c}$:

{\displaystyle {\begin{aligned}c&=\kappa _{1}(\lambda _{2}b_{2}+\cdots +\lambda _{n}b_{n})+\kappa _{2}b_{2}+\cdots +\kappa _{n}b_{n}\\&=\kappa _{1}\lambda _{2}b_{2}+\cdots +\kappa _{1}\lambda _{n}b_{n}+\kappa _{2}b_{2}+\cdots +\kappa _{n}b_{n}\\&=\kappa _{1}\lambda _{2}b_{2}+\kappa _{2}b_{2}+\cdots +\kappa _{1}\lambda _{n}b_{n}+\kappa _{n}b_{n}\\&=(\kappa _{2}+\kappa _{1}\lambda _{2})b_{2}+\cdots +(\kappa _{n}+\kappa _{1}\lambda _{n})b_{n}\end{aligned}}}

We have thus found a linear combination of the arbitrary vector ${\displaystyle c\in V}$ using only vectors from ${\displaystyle B\setminus \{b_{1}\}}$. Thus ${\displaystyle B\setminus \{b_{1}\}}$ is a generator of ${\displaystyle V}$. This contradicts statement 4, according to which ${\displaystyle B}$ is a minimal generator. So we have also verified ${\displaystyle 4\implies 1}$ by contradiction.

## Examples

### Canonical basis in coordinate space

The vector space ${\displaystyle K^{n}}$ of ${\displaystyle n}$-tuples over the field ${\displaystyle K}$ has a so-called standard basis or canonical basis

${\displaystyle B=\lbrace \underbrace {(1,0,\ldots ,0)^{T}} _{n{\text{ components}}},\,\underbrace {(0,1,\ldots ,0)^{T}} _{n{\text{ components}}},\,\cdots ,\,\underbrace {(0,0,\ldots ,1)^{T}} _{n{\text{ components}}}\rbrace }$

For instance, ${\displaystyle \mathbb {R} ^{3}}$ has the canonical basis ${\displaystyle B_{3}=\lbrace (1,0,0)^{T},\,(0,1,0)^{T},\,(0,0,1)^{T}\rbrace }$ and the ${\displaystyle \mathbb {R} ^{2}}$ the canonical basis ${\displaystyle B_{2}=\lbrace (1,0)^{T},\,(0,1)^{T}\rbrace }$.
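For concreteness, the canonical basis can be generated programmatically. The following Python snippet is a small illustration of the pattern above; the function name `canonical_basis` is our own choice.

```python
def canonical_basis(n):
    """The n standard unit vectors e_1, ..., e_n of K^n (entries 0 and 1)."""
    return [tuple(1 if j == i else 0 for j in range(n)) for i in range(n)]

print(canonical_basis(3))  # [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
```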

Exercise (Basis of the plane)

Show that the set ${\displaystyle B_{2}=\lbrace (1,0)^{T},\,(0,1)^{T}\rbrace }$ is a basis of ${\displaystyle \mathbb {R} ^{2}}$.

Summary of proof (Basis of the plane)

We first show that any vector can be represented as a linear combination of the two given canonical basis vectors, so they form a generator. Then we show that they are linearly independent.

Solution (Basis of the plane)

Proof step: ${\displaystyle B_{2}}$ is generator of ${\displaystyle \mathbb {R} ^{2}}$

Let ${\displaystyle (x_{1},x_{2})^{T}\in \mathbb {R} ^{2}}$ be arbitrary. Then,

${\displaystyle {\begin{pmatrix}x_{1}\\x_{2}\end{pmatrix}}\,=\,x_{1}\cdot {\begin{pmatrix}1\\0\end{pmatrix}}+x_{2}\cdot {\begin{pmatrix}0\\1\end{pmatrix}}}$

So ${\displaystyle B_{2}}$ is a generator of ${\displaystyle \mathbb {R} ^{2}}$.

Proof step: ${\displaystyle B_{2}}$ is linearly independent

We assume

${\displaystyle \rho \cdot {\begin{pmatrix}1\\0\end{pmatrix}}+\mu \cdot {\begin{pmatrix}0\\1\end{pmatrix}}={\begin{pmatrix}0\\0\end{pmatrix}}}$

it follows

${\displaystyle {\begin{pmatrix}\rho \\0\end{pmatrix}}+{\begin{pmatrix}0\\\mu \end{pmatrix}}={\begin{pmatrix}\rho \\\mu \end{pmatrix}}={\begin{pmatrix}0\\0\end{pmatrix}}}$

So ${\displaystyle \rho =0}$ and ${\displaystyle \mu =0}$. Hence, the vectors of ${\displaystyle B_{2}}$ are linearly independent.

### A different basis of ${\displaystyle \mathbb {R} ^{3}}$

Exercise

Show that the vectors ${\displaystyle v_{1}=(2,0,0)^{T},\,v_{2}=(3,2,3)^{T},\,v_{3}=(0,1,0)^{T}}$ form a basis of ${\displaystyle \mathbb {R} ^{3}}$.

Summary of proof

In a first step, one shows that ${\displaystyle v_{1},\,v_{2},\,v_{3}}$ is a generator of ${\displaystyle \mathbb {R} ^{3}}$. In a second step, linear independence of these vectors is established.

Solution

Proof step: generator

Let ${\displaystyle (x,y,z)^{T}}$ be any vector of ${\displaystyle \mathbb {R} ^{3}}$. Then,

${\displaystyle {\begin{pmatrix}x\\y\\z\end{pmatrix}}={\frac {x-z}{2}}{\begin{pmatrix}2\\0\\0\end{pmatrix}}+{\frac {z}{3}}{\begin{pmatrix}3\\2\\3\end{pmatrix}}+(y-{\frac {2}{3}}z){\begin{pmatrix}0\\1\\0\end{pmatrix}}}$

is a linear combination of ${\displaystyle (x,y,z)^{T}}$ using the vectors ${\displaystyle v_{1},\,v_{2},\,v_{3}}$ so these vectors are a generator of ${\displaystyle \mathbb {R} ^{3}}$.

Proof step: Linear independence

We represent zero as a linear combination of the three vectors and check for which coefficients zero can be obtained:

{\displaystyle {\begin{aligned}&\beta _{1}{\begin{pmatrix}2\\0\\0\end{pmatrix}}+\beta _{2}{\begin{pmatrix}3\\2\\3\end{pmatrix}}+\beta _{3}{\begin{pmatrix}0\\1\\0\end{pmatrix}}={\begin{pmatrix}0\\0\\0\end{pmatrix}}\\[1em]\implies &\ {\begin{pmatrix}{2\beta _{1}}\\0\\0\end{pmatrix}}+{\begin{pmatrix}{3\beta _{2}}\\{2\beta _{2}}\\{3\beta _{2}}\end{pmatrix}}+{\begin{pmatrix}0\\{1\beta _{3}}\\0\end{pmatrix}}={\begin{pmatrix}0\\0\\0\end{pmatrix}}\\[1em]\implies &\ {\begin{pmatrix}{2\beta _{1}+3\beta _{2}}\\{2\beta _{2}+\beta _{3}}\\{3\beta _{2}}\end{pmatrix}}={\begin{pmatrix}0\\0\\0\end{pmatrix}}\\[1em]\end{aligned}}}

This results in the following system of equations:

{\displaystyle {\begin{aligned}2\beta _{1}+3\beta _{2}&=0\\2\beta _{2}+\beta _{3}&=0\\3\beta _{2}&=0\end{aligned}}}

The solution of this system of equations is ${\displaystyle \beta _{1}=\beta _{2}=\beta _{3}=0}$. Thus the vectors are linearly independent.
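As an alternative cross-check, not used in the solution above, one can apply the standard fact that three vectors in ${\displaystyle \mathbb {R} ^{3}}$ are linearly independent exactly if the matrix with these vectors as columns has non-zero determinant. A small Python sketch (the helper `det3` is our own):

```python
def det3(u, v, w):
    """Determinant of the 3x3 matrix with columns u, v, w (Laplace expansion)."""
    (a, d, g), (b, e, h), (c, f, i) = u, v, w
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

v1, v2, v3 = (2, 0, 0), (3, 2, 3), (0, 1, 0)
print(det3(v1, v2, v3))  # -6: nonzero, so the homogeneous system has only the trivial solution
```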

### Example: complex numbers

The set of complex numbers ${\displaystyle \mathbb {C} }$ is a vector space over ${\displaystyle \mathbb {R} }$, with multiplication by real numbers:

${\displaystyle \cdot \,\colon \mathbb {R} \times \mathbb {C} \to \mathbb {C} ;(\lambda ,z)\mapsto \lambda \cdot z}$

Then ${\displaystyle \mathbb {C} }$ has as an ${\displaystyle \mathbb {R} }$-vector space the basis ${\displaystyle \lbrace 1,\mathrm {i} \rbrace }$, because for ${\displaystyle z\in \mathbb {C} }$ we have unique ${\displaystyle \lambda ,\mu \in \mathbb {R} }$ with ${\displaystyle z=\lambda +\mathrm {i} \mu =\lambda \cdot 1+\mu \cdot \mathrm {i} }$.

If we consider ${\displaystyle \mathbb {C} }$ as a ${\displaystyle \mathbb {C} }$-vector space, ${\displaystyle \{1,\mathrm {i} \}}$ is no longer a basis of ${\displaystyle \mathbb {C} }$, since

${\displaystyle 0=\mathrm {i} \cdot 1+(-1)\cdot \mathrm {i} }$

For ${\displaystyle \mathbb {C} }$ as a ${\displaystyle \mathbb {C} }$-vector space we have the one-element basis ${\displaystyle \{1\}}$. As an ${\displaystyle \mathbb {R} }$-vector space the complex numbers have a two-element basis (the dimension is ${\displaystyle 2}$) and as a ${\displaystyle \mathbb {C} }$-vector space a one-element basis (the dimension is ${\displaystyle 1}$). So be careful: it matters over which field we consider a given vector space!

### Abstract example

The vector space ${\displaystyle \mathbb {R} [x]}$ of polynomials with coefficients from ${\displaystyle \mathbb {R} }$ has a basis with infinitely many elements. An example of a basis is given by the powers of ${\displaystyle x}$:

${\displaystyle B=\{1,x,x^{2},x^{3},...\}}$

This is a generator, because for a polynomial ${\displaystyle f}$ of degree ${\displaystyle n}$ we have a representation

${\displaystyle f=\lambda _{n}x^{n}+\dots +\lambda _{1}x+\lambda _{0}}$

where ${\displaystyle \lambda _{i}\in \mathbb {R} }$ for all ${\displaystyle 0\leq i\leq n}$. Thus every polynomial is a finite linear combination of elements from ${\displaystyle B}$. Consequently, ${\displaystyle B}$ is a generator.

For linear independence we consider the following: With ${\displaystyle \lambda _{i}\in \mathbb {R} }$ let:

${\displaystyle 0=\lambda _{n}x^{n}+\dots +\lambda _{1}x+\lambda _{0}}$

We can also write the zero polynomial as ${\displaystyle 0=0\cdot x^{n}+\dots +0\cdot x+0}$. A comparison of coefficients yields that

${\displaystyle \lambda _{0}=\dots =\lambda _{n}=0}$

So ${\displaystyle B}$ is a basis.
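The unique representation in this basis is just the coefficient list of the polynomial. The small Python sketch below (representation and function name are our own choices) evaluates such a linear combination of the monomials ${\displaystyle 1,x,x^{2},\dots }$:

```python
def eval_poly(coeffs, x):
    """Evaluate the linear combination coeffs[0]*1 + coeffs[1]*x + coeffs[2]*x^2 + ..."""
    return sum(c * x ** i for i, c in enumerate(coeffs))

f = [5, 0, -3, 2]       # the polynomial 5 - 3x^2 + 2x^3
print(eval_poly(f, 2))  # 5 - 12 + 16 = 9
```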

## Bases are not unique

### Example: Basis of the plane is not unique

The blue vector v can be represented by two different bases.

We will show by using the plane ${\displaystyle \mathbb {R} ^{2}}$ that the basis of a vector space is not unique. Let us look at the (canonical) basis for ${\displaystyle \mathbb {R} ^{2}}$ consisting of the unit vectors:

${\displaystyle B_{1}=\left\{{\begin{pmatrix}1\\0\end{pmatrix}},{\begin{pmatrix}0\\1\end{pmatrix}}\right\}}$

These vectors obviously form a generator:

${\displaystyle {\begin{pmatrix}a\\b\end{pmatrix}}=a\cdot {\begin{pmatrix}1\\0\end{pmatrix}}+b\cdot {\begin{pmatrix}0\\1\end{pmatrix}}}$

They are also linearly independent because it is impossible to find a non-trivial linear combination of them resulting in zero. Thus ${\displaystyle B_{1}}$ is a basis. However, the plane also has many other bases. An example is the following set:

${\displaystyle B_{2}=\left\{{\begin{pmatrix}4\\-2\end{pmatrix}},{\begin{pmatrix}-2\\4\end{pmatrix}}\right\}}$

We can generate all vectors with these two vectors:

${\displaystyle {\begin{pmatrix}a\\b\end{pmatrix}}=\left({\frac {a}{3}}+{\frac {b}{6}}\right)\cdot {\begin{pmatrix}4\\-2\end{pmatrix}}+\left({\frac {b}{3}}+{\frac {a}{6}}\right)\cdot {\begin{pmatrix}-2\\4\end{pmatrix}}}$

These vectors are linearly independent because neither is a multiple of the other (two vectors are linearly dependent if and only if one is a multiple of the other). Thus ${\displaystyle B_{2}}$ is also a basis. These two examples show that the basis of ${\displaystyle \mathbb {R} ^{2}}$ is not unique. Indeed, one can find many other bases, for instance by stretching or rotating these two vectors.
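The coefficient formula for ${\displaystyle B_{2}}$ can again be verified mechanically. In the Python sketch below (the helper `rep` is our own), exact fractions avoid rounding issues:

```python
from fractions import Fraction

def rep(a, b):
    """Coefficients of (a, b) in the basis B2, taken from the formula above."""
    return Fraction(a, 3) + Fraction(b, 6), Fraction(b, 3) + Fraction(a, 6)

for a in range(-3, 4):
    for b in range(-3, 4):
        s, t = rep(a, b)
        # recombine with the basis vectors (4, -2) and (-2, 4)
        assert (4 * s - 2 * t, -2 * s + 4 * t) == (a, b)
print("B2 spans every test vector of the plane")
```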

### Building a further basis from an existing one

In general, from any basis of an ${\displaystyle \mathbb {R} }$-vector space we can construct a second, different basis: Consider a two-dimensional vector space ${\displaystyle V}$ with a basis ${\displaystyle B_{1}=\{b_{1},b_{2}\}}$. Then ${\displaystyle B_{2}=\{b_{1},b_{2}-b_{1}\}}$ is also a basis of ${\displaystyle V}$. The same argument of "substituting the last vector" also works in higher dimensions. We first show that ${\displaystyle B_{2}}$ is a generator and then that its vectors are linearly independent.

Let ${\displaystyle c\in V}$ be any vector and ${\displaystyle c=\lambda _{1}b_{1}+\lambda _{2}b_{2}}$ a linear combination of it using elements of the basis ${\displaystyle B_{1}}$. Then a linear combination of ${\displaystyle c}$ can also be found via vectors from ${\displaystyle B_{2}}$:

${\displaystyle c=(\lambda _{1}+\lambda _{2})\cdot b_{1}+\lambda _{2}\cdot (b_{2}-b_{1})}$

Thus ${\displaystyle B_{2}}$ is a generator because the vector ${\displaystyle c}$ was arbitrarily chosen.
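The rewriting step above can be checked numerically with concrete (arbitrarily chosen) vectors for ${\displaystyle b_{1},b_{2}}$ and coefficients ${\displaystyle \lambda _{1},\lambda _{2}}$:

```python
# Quick numerical check of the rewriting step:
# l1*b1 + l2*b2  equals  (l1+l2)*b1 + l2*(b2 - b1).
b1, b2 = (1.0, 2.0), (3.0, -1.0)
l1, l2 = 0.5, -2.0
lhs = tuple(l1 * x + l2 * y for x, y in zip(b1, b2))
rhs = tuple((l1 + l2) * x + l2 * (y - x) for x, y in zip(b1, b2))
assert lhs == rhs
```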

We show that the vectors of ${\displaystyle B_{2}}$ are linearly independent by contraposition: we prove that if ${\displaystyle B_{2}}$ is linearly dependent, then ${\displaystyle B_{1}}$ must also be linearly dependent. It then follows that ${\displaystyle B_{2}}$ is linearly independent whenever ${\displaystyle B_{1}}$ is linearly independent (which must be the case, since ${\displaystyle B_{1}}$ is a basis). If ${\displaystyle B_{2}}$ is linearly dependent, there is a non-trivial representation of zero:

${\displaystyle 0=\lambda _{1}\cdot b_{1}+\lambda _{2}\cdot (b_{2}-b_{1})}$

Here ${\displaystyle \lambda _{1}\neq 0}$ or ${\displaystyle \lambda _{2}\neq 0}$. Now we also find a representation of the zero with the basis ${\displaystyle B_{1}}$:

${\displaystyle 0=(\lambda _{1}-\lambda _{2})\cdot b_{1}+\lambda _{2}\cdot b_{2}}$

We still have to show that one of the coefficients ${\displaystyle \lambda _{1}-\lambda _{2}}$ or ${\displaystyle \lambda _{2}}$ is not equal to zero. As a premise we have ${\displaystyle \lambda _{1}\neq 0}$ or ${\displaystyle \lambda _{2}\neq 0}$. The case ${\displaystyle \lambda _{2}\neq 0}$ leads directly to the non-trivial representation of the zero, since this factor also appears in the second equation.

If ${\displaystyle \lambda _{1}\neq 0}$ holds, we have to distinguish again. If ${\displaystyle \lambda _{1}\neq 0}$ and ${\displaystyle \lambda _{1}=\lambda _{2}}$, then ${\displaystyle \lambda _{2}\neq 0}$ and hence one of the coefficients is not zero. If ${\displaystyle \lambda _{1}\neq 0}$ and ${\displaystyle \lambda _{1}\neq \lambda _{2}}$ hold, then ${\displaystyle \lambda _{1}-\lambda _{2}\neq 0}$ and hence the first coefficient is non-zero.

It follows that one of the coefficients is always non-zero. Thus the vectors of ${\displaystyle B_{1}}$ are also linearly dependent. By contraposition, ${\displaystyle B_{2}}$ is linearly independent if ${\displaystyle B_{1}}$ is linearly independent. ${\displaystyle B_{2}}$ is thus a new basis constructed from the first one.

This principle also applies to bases of higher-dimensional spaces and shows that the basis of a vector space is in general not unique: every vector space of dimension 2 or larger has several bases.

## Proving existence and constructing a basis

### Existence of a basis

We have not yet answered the question of whether every vector space has a basis at all. This is not self-evident, as bases may have to be infinite! Nevertheless, you can rejoice, because the answer is: Yes, every vector space has (at least) one basis.

Of course, we still have to justify this answer. For the case of finitely generated vector spaces, i.e. all vector spaces that have a finite generator, we will prove this in a moment. For infinitely generated vector spaces, i.e. vector spaces that do not have a finite generating system, the proof is much more complicated and uses the axiom of choice.

Theorem (Basis theorem)

Let ${\displaystyle V}$ be a finitely generated ${\displaystyle K}$-vector space and ${\displaystyle E}$ a finite generator of ${\displaystyle V}$. Then there is a subset ${\displaystyle B\subseteq E}$ that is a basis of ${\displaystyle V}$.

Proof (Basis theorem)

We want to remove vectors from ${\displaystyle E}$ until we find a linearly independent subset of ${\displaystyle E}$ that is still a generator. We do this as follows: If ${\displaystyle E}$ is linearly dependent, then ${\displaystyle E}$ is not a basis. So, according to the theorem about equivalent characterisations of a basis, there exists a subset ${\displaystyle E'\subsetneq E}$ that is a generator of ${\displaystyle V}$. This has fewer elements than ${\displaystyle E}$.

We now inductively find generators ${\displaystyle E_{i}\subsetneq E}$ such that ${\displaystyle E_{i+1}}$ has fewer elements than ${\displaystyle E_{i}}$ as follows:

• set ${\displaystyle E_{0}:=E}$ and
• for ${\displaystyle i\geq 1}$, choose a generator ${\displaystyle E_{i}\subsetneq E_{i-1}}$ of ${\displaystyle V}$ if ${\displaystyle E_{i-1}}$ is not a minimal generator.

Since we start with a finite set, we get a minimal generator after finitely many steps.

Theorem (Existence of a basis)

Every finitely generated vector space has at least one basis.

Proof (Existence of a basis)

We take a finite generator. With the basis theorem, there is a subset of it that is a basis of the vector space. In particular, the vector space has at least one basis.

### Construction of a basis by removing vectors

Now we know that every vector space has a basis, but how can we find a basis for a given vector space? For finitely generated vector spaces, the proof of the theorem on the existence of a basis gives a procedure for constructing a basis in finitely many steps (it is not applicable for infinitely generated vector spaces). Following the proof of the basis theorem, we can proceed as follows:

1. Find a finite generator ${\displaystyle E}$ of the vector space.
2. Check whether ${\displaystyle E}$ is linearly independent.
• If yes: We are done and the generator is a basis.
• If no: Find a smaller generator ${\displaystyle E'\subsetneq E}$ of the vector space and repeat step 2.

We now need an explicit way to obtain a smaller generator ${\displaystyle E'\subsetneq E}$ from a finite generator ${\displaystyle E=\{e_{1},\dots ,e_{n}\}}$ that is not minimal. Since ${\displaystyle E}$ is not a minimal generator, ${\displaystyle E}$ is linearly dependent. So we find ${\displaystyle \lambda _{1},\dots ,\lambda _{n}\in K}$, not all equal to zero, such that

${\displaystyle \lambda _{1}e_{1}+\dots +\lambda _{n}e_{n}=0}$

Now we choose an ${\displaystyle i\in \{1,\dots ,n\}}$ with ${\displaystyle \lambda _{i}\neq 0}$ and set

${\displaystyle E':=E\setminus \{e_{i}\}=\{e_{1},\dots ,e_{i-1},e_{i+1},\dots ,e_{n}\}}$

We now want to show that ${\displaystyle E'}$ is also a generator. In the article linear independence of vectors we have proven that

${\displaystyle e_{i}=-{\frac {\lambda _{1}}{\lambda _{i}}}e_{1}-\dots -{\frac {\lambda _{i-1}}{\lambda _{i}}}e_{i-1}-{\frac {\lambda _{i+1}}{\lambda _{i}}}e_{i+1}-\dots -{\frac {\lambda _{n}}{\lambda _{i}}}e_{n}}$

Since ${\displaystyle E}$ is a generator of ${\displaystyle V}$, we find for a vector ${\displaystyle v\in V}$ scalars ${\displaystyle \mu _{1},\dots ,\mu _{n}\in K}$ such that

{\displaystyle {\begin{aligned}v&=\mu _{1}e_{1}+\dots +\mu _{i}e_{i}+\dots +\mu _{n}e_{n}\\&=\mu _{1}e_{1}+\dots +\mu _{i-1}e_{i-1}+\mu _{i}\left(-{\frac {\lambda _{1}}{\lambda _{i}}}e_{1}-\dots -{\frac {\lambda _{i-1}}{\lambda _{i}}}e_{i-1}-{\frac {\lambda _{i+1}}{\lambda _{i}}}e_{i+1}-\dots -{\frac {\lambda _{n}}{\lambda _{i}}}e_{n}\right)+\mu _{i+1}e_{i+1}+\dots +\mu _{n}e_{n}\\&=\left(\mu _{1}-\mu _{i}\cdot {\frac {\lambda _{1}}{\lambda _{i}}}\right)e_{1}+\dots +\left(\mu _{i-1}-\mu _{i}\cdot {\frac {\lambda _{i-1}}{\lambda _{i}}}\right)e_{i-1}+\left(\mu _{i+1}-\mu _{i}\cdot {\frac {\lambda _{i+1}}{\lambda _{i}}}\right)e_{i+1}+\dots +\left(\mu _{n}-\mu _{i}\cdot {\frac {\lambda _{n}}{\lambda _{i}}}\right)e_{n}\end{aligned}}}

Thus ${\displaystyle v\in \operatorname {span} (E')}$. Since ${\displaystyle v\in V}$ was arbitrarily chosen, it follows that ${\displaystyle E'}$ is a generator. With the proof of the basis theorem we now get the following procedure for determining a basis:

How to get to the proof? (Construction of a basis for finitely generated vector spaces)

1. Find a finite generator of the given vector space.
2. Try to find a non-trivial linear combination of the zero vector from this generator. (For this, you need to solve a linear system of equations.) If no non-trivial linear combination exists, the generator is also linearly independent and we are done.
3. If a non-trivial linear combination exists, remove one of the vectors from the generator whose coefficient in the linear combination is not zero. Then, go back to Step 2.
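The three steps above can be sketched in Python. This is a minimal numerical sketch, not part of the article: it finds a non-trivial combination of zero via the SVD null space (the function name `shrink_to_basis` and the use of floating-point rank are illustrative assumptions), and it may delete a different vector than a hand computation would, since any vector with a non-zero coefficient is allowed.

```python
import numpy as np

# Repeatedly find a non-trivial linear combination of zero and delete one
# vector with a non-zero coefficient, until the remaining vectors are
# linearly independent (rank equals the number of vectors).
def shrink_to_basis(vectors):
    vs = [np.asarray(v, dtype=float) for v in vectors]
    while np.linalg.matrix_rank(np.column_stack(vs)) < len(vs):
        # the last right-singular vector lies in the null space,
        # i.e. it gives coefficients of a non-trivial combination of zero
        _, _, vt = np.linalg.svd(np.column_stack(vs))
        coeffs = vt[-1]
        i = int(np.argmax(np.abs(coeffs)))  # some coefficient that is not zero
        del vs[i]
    return vs

basis = shrink_to_basis([(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)])
assert len(basis) == 3
```

The loop terminates because each step removes one vector from a finite set while preserving the span.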

### Construction of a basis by adding vectors

Alternatively, we can proceed as in the section Motivation via linear independence; that is, we start with a linearly independent set and extend it until it is maximal, i.e., a basis.

Theorem (Basis completion theorem)

For every linearly independent subset ${\displaystyle U}$ of a finitely generated vector space ${\displaystyle V}$ there is a basis ${\displaystyle B}$ of ${\displaystyle V}$ with ${\displaystyle U\subseteq B}$.

Proof (Basis completion theorem)

Let ${\displaystyle V}$ be a finitely generated vector space and ${\displaystyle U\subseteq V}$ any linearly independent subset of ${\displaystyle V}$. Since ${\displaystyle V}$ is finitely generated, we find a finite generator ${\displaystyle E}$ of ${\displaystyle V}$. We now want to add elements from ${\displaystyle E}$ to the set ${\displaystyle U}$ until this new set is a generator. When adding the vectors, we want to keep the linear independence, so that the new set is a linearly independent generator and thus a basis.

If ${\displaystyle U}$ is a generator, then ${\displaystyle U}$ is already a basis of ${\displaystyle V}$. Else we find that since ${\displaystyle \operatorname {span} (U)\subsetneq V=\operatorname {span} (E)}$, there is some ${\displaystyle v\in E}$ with ${\displaystyle v\not \in \operatorname {span} (U)}$. Now we set ${\displaystyle U':=U\cup \{v\}}$. To show the linear independence of ${\displaystyle U'}$, we assume ${\displaystyle U'}$ would be linearly dependent. Then a non-trivial linear combination would exist:

${\displaystyle \mu v+\sum _{u\in U}\lambda _{u}u=0}$

with ${\displaystyle \mu ,\lambda _{u}\in K}$, not all of them zero. Since ${\displaystyle U}$ is linearly independent, ${\displaystyle \mu \neq 0}$ must hold: otherwise the equation would be a non-trivial representation of zero using only vectors of ${\displaystyle U}$. So we get

${\displaystyle v=-\sum _{u\in U}{\frac {\lambda _{u}}{\mu }}\cdot u\in \operatorname {span} (U)}$

This is a contradiction to ${\displaystyle v\not \in \operatorname {span} (U)}$. Thus ${\displaystyle U'}$ is linearly independent.

We now inductively construct linearly independent sets ${\displaystyle U_{i}\supseteq U}$ by adding elements from ${\displaystyle E}$ according to the above procedure until ${\displaystyle U_{i}}$ has become a basis:

• start with ${\displaystyle U_{0}:=U}$
• and inductively set ${\displaystyle U_{i+1}:=U_{i}\cup \{v\}}$ for some ${\displaystyle v\in E}$ with ${\displaystyle v\not \in \operatorname {span} (U_{i})}$, if ${\displaystyle U_{i}}$ is not yet a generator of ${\displaystyle V}$.

Since we can only add finitely many vectors from ${\displaystyle E}$ to ${\displaystyle U=U_{0}}$, after finitely many steps we reach an ${\displaystyle i}$ at which no vector of ${\displaystyle E}$ can be added to ${\displaystyle U_{i}}$. Then ${\displaystyle E\subseteq \operatorname {span} (U_{i})}$, so ${\displaystyle \operatorname {span} (U_{i})\supseteq \operatorname {span} (E)=V}$. Hence ${\displaystyle B:=U_{i}}$ is a generator and therefore a basis of ${\displaystyle V}$.

This proof gives you another method to determine a basis of a finitely generated vector space in finitely many steps (this method is also only applicable for finitely generated vector spaces):

1. Choose a finite generator and start with the empty set as your first linearly independent set.
2. Try to find a vector from your generator that is not in the span of your previous linearly independent set. If you don't find one, you're done.
3. Add the vector you found to your linearly independent set and go back to Step 2.
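The extension procedure above can also be sketched in Python. Again a minimal numerical sketch with an illustrative name (`grow_to_basis`): span membership is tested via the matrix rank, which is one of several possible implementations.

```python
import numpy as np

# Greedily extend a linearly independent set: a generator vector is added
# exactly when it is not in the span of the vectors collected so far,
# i.e. when adding it increases the rank.
def grow_to_basis(generator):
    basis = []
    for v in generator:
        candidate = basis + [np.asarray(v, dtype=float)]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis = candidate  # v was independent of the current set
    return basis

basis = grow_to_basis([(1, 1, 1), (2, 2, 2), (1, 0, 0), (0, 1, 0)])
assert len(basis) == 3
```

Note that `(2, 2, 2)` is skipped because it already lies in the span of `(1, 1, 1)`, exactly as in step 2 of the procedure.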

In the next chapter we will see that every two bases of the same finitely generated vector space have the same cardinality. We get that every linearly independent set that has as many elements as a basis is automatically already a maximally linearly independent subset. Therefore, we can change step 2 in the above procedure as follows: "Try to find a vector from your vector space that is not in the span of your previous linearly independent set. If you don't find one, you're done." In this variant of the procedure, you do not have to choose a generator in Step 1.

### Examples: Construction of a basis by removing vectors

Example (Construction of a basis by removing vectors 1)

Let ${\displaystyle B=\{(1,0,0)^{T},(0,1,0)^{T},(0,0,1)^{T},(1,1,1)^{T}\}}$. Then, we have ${\displaystyle \operatorname {span} (B)=\mathbb {R} ^{3}}$, since for all ${\displaystyle x_{1},x_{2},x_{3}\in \mathbb {R} }$ we have that

${\displaystyle {\begin{pmatrix}x_{1}\\x_{2}\\x_{3}\end{pmatrix}}=x_{1}\cdot {\begin{pmatrix}1\\0\\0\end{pmatrix}}+x_{2}\cdot {\begin{pmatrix}0\\1\\0\end{pmatrix}}+x_{3}\cdot {\begin{pmatrix}0\\0\\1\end{pmatrix}}+0\cdot {\begin{pmatrix}1\\1\\1\end{pmatrix}}}$

That means, ${\displaystyle B}$ is a generator. Now, ${\displaystyle B}$ is not a basis of ${\displaystyle \mathbb {R} ^{3}}$ because we have the non-trivial representation of the zero vector

${\displaystyle {\begin{pmatrix}0\\0\\0\end{pmatrix}}=1\cdot {\begin{pmatrix}1\\0\\0\end{pmatrix}}+1\cdot {\begin{pmatrix}0\\1\\0\end{pmatrix}}+1\cdot {\begin{pmatrix}0\\0\\1\end{pmatrix}}+(-1)\cdot {\begin{pmatrix}1\\1\\1\end{pmatrix}}}$

Since the prefactor of ${\displaystyle (0,0,1)^{T}}$ in this linear combination is not zero, by the above procedure for constructing a basis from a generator, ${\displaystyle B':=B\setminus \{(0,0,1)^{T}\}=\{(1,0,0)^{T},(0,1,0)^{T},(1,1,1)^{T}\}}$ is also a generator. We now need to check whether ${\displaystyle B'}$ is linearly independent. So we look at how the zero vector can be combined from vectors of this set:

${\displaystyle {\begin{pmatrix}0\\0\\0\end{pmatrix}}=\lambda _{1}\cdot {\begin{pmatrix}1\\0\\0\end{pmatrix}}+\lambda _{2}\cdot {\begin{pmatrix}0\\1\\0\end{pmatrix}}+\lambda _{3}\cdot {\begin{pmatrix}1\\1\\1\end{pmatrix}}}$

If we set up a linear system of equations and solve it, it follows that ${\displaystyle \lambda _{1}=\lambda _{2}=\lambda _{3}=0}$. Thus ${\displaystyle B'}$ is linearly independent and hence a basis of ${\displaystyle \mathbb {R} ^{3}}$.
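Instead of solving the system by hand, one can also observe that it has only the trivial solution exactly when the matrix formed from the three vectors is invertible. A quick check (not part of the article's argument):

```python
import numpy as np

# The system l1*(1,0,0) + l2*(0,1,0) + l3*(1,1,1) = 0 has only the
# trivial solution iff the matrix of the three column vectors is
# invertible, i.e. has non-zero determinant.
A = np.column_stack([(1, 0, 0), (0, 1, 0), (1, 1, 1)])
assert abs(np.linalg.det(A)) > 0.5  # determinant is 1, so B' is a basis
```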

Example (Abstract example: Construction of a basis by removing vectors)

Let ${\displaystyle K:=\mathbb {Z} /2\mathbb {Z} }$ be the two-element field. We consider the ${\displaystyle K}$-vector space ${\displaystyle V:=\operatorname {Fun} (K,K)}$ of all mappings from ${\displaystyle K}$ to ${\displaystyle K}$. This function space consists of ${\displaystyle 4}$ elements, so ${\displaystyle V=\{c_{0},c_{1},\operatorname {id} ,f\}}$. Here ${\displaystyle c_{0}}$ is the constant zero mapping, ${\displaystyle c_{1}}$ is the constant one mapping, ${\displaystyle \operatorname {id} }$ is the identity and ${\displaystyle f}$ is the mapping that interchanges ${\displaystyle {\bar {0}}}$ and ${\displaystyle {\bar {1}}}$:

${\displaystyle f(x):={\begin{cases}{\bar {1}},&x={\bar {0}}\\{\bar {0}},&x={\bar {1}}\end{cases}}}$

As initial generating system ${\displaystyle E_{0}}$ we choose the entire vector space, i.e. ${\displaystyle E_{0}:=V}$. This is linearly dependent, because we have the following non-trivial representation of the zero vector

${\displaystyle 0={\bar {1}}\cdot c_{0}+{\bar {0}}\cdot c_{1}+{\bar {0}}\cdot \operatorname {id} +{\bar {0}}\cdot f}$

So we get a new generator ${\displaystyle E_{1}:=E_{0}\setminus \{c_{0}\}}$ of ${\displaystyle V}$ since the prefactor of ${\displaystyle c_{0}}$ is non-zero. Specifically, ${\displaystyle E_{1}=\{c_{1},\operatorname {id} ,f\}}$. Now ${\displaystyle E_{1}}$ is also linearly dependent because we have

${\displaystyle 0={\bar {1}}\cdot c_{1}+{\bar {1}}\cdot \operatorname {id} +{\bar {1}}\cdot f}$

Since the prefactor of ${\displaystyle c_{1}}$ is non-zero, we choose as a new generator ${\displaystyle E_{2}:=E_{1}\setminus \{c_{1}\}}$. This set is linearly independent. From ${\displaystyle 0=\lambda \cdot \operatorname {id} +\mu \cdot f}$ we obtain (by inserting the two possible arguments):

{\displaystyle {\begin{aligned}{\bar {0}}&=\lambda \cdot \operatorname {id} ({\bar {0}})+\mu \cdot f({\bar {0}})=\lambda \cdot {\bar {0}}+\mu \cdot {\bar {1}}=\mu \\{\bar {0}}&=\lambda \cdot \operatorname {id} ({\bar {1}})+\mu \cdot f({\bar {1}})=\lambda \cdot {\bar {1}}+\mu \cdot {\bar {0}}=\lambda \end{aligned}}}

So ${\displaystyle \lambda =\mu ={\bar {0}}}$ and thus ${\displaystyle E_{2}}$ is linearly independent. With ${\displaystyle E_{2}=\{\operatorname {id} ,f\}}$ we have found a basis of our vector space.
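This finite example can be replayed in Python by encoding each mapping ${\displaystyle K\to K}$ as its value table ${\displaystyle (g({\bar {0}}),g({\bar {1}}))}$, with addition componentwise mod 2 (the encoding is an illustrative choice, not from the article):

```python
# Functions K -> K over the two-element field, encoded as value tables
# (g(0), g(1)); vector addition is componentwise addition mod 2.
c0, c1, ident, f = (0, 0), (1, 1), (0, 1), (1, 0)

def add(g, h):
    return ((g[0] + h[0]) % 2, (g[1] + h[1]) % 2)

# the non-trivial representation of zero used above: c1 + id + f = 0
assert add(add(c1, ident), f) == c0
# id and f together already generate c1, since id + f = c1
assert add(ident, f) == c1
```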

### Example: Construction of a basis by adding vectors

Example (Construction of a basis by adding vectors)

We consider the space ${\displaystyle \mathbb {R} ^{3}}$ and start with the linearly independent set ${\displaystyle U:=\emptyset }$. Then ${\displaystyle (1,1,1)^{T}\not \in \operatorname {span} (U)=\{(0,0,0)^{T}\}}$. That is, we get the linearly independent set ${\displaystyle U':=\{(1,1,1)^{T}\}}$. Further

${\displaystyle {\begin{pmatrix}1\\0\\0\end{pmatrix}}\not \in \left\{{\begin{pmatrix}x\\x\\x\end{pmatrix}};x\in \mathbb {R} \right\}=\operatorname {span} (U')}$

That means ${\displaystyle U'':=U'\cup \{(1,0,0)^{T}\}=\{(1,1,1)^{T},(1,0,0)^{T}\}}$ is linearly independent. Now

${\displaystyle \operatorname {span} (U'')=\left\{\lambda \cdot {\begin{pmatrix}1\\0\\0\end{pmatrix}}+\mu \cdot {\begin{pmatrix}1\\1\\1\end{pmatrix}};\lambda ,\mu \in \mathbb {R} \right\}=\left\{{\begin{pmatrix}\lambda +\mu \\\mu \\\mu \end{pmatrix}};\lambda ,\mu \in \mathbb {R} \right\}=\left\{{\begin{pmatrix}\lambda \\\mu \\\mu \end{pmatrix}};\lambda ,\mu \in \mathbb {R} \right\}}$

Thus ${\displaystyle (0,1,0)^{T}\not \in \operatorname {span} (U'')}$ and ${\displaystyle U''':=U''\cup \{(0,1,0)^{T}\}=\{(1,0,0)^{T},(0,1,0)^{T},(1,1,1)^{T}\}}$ is linearly independent. Further, ${\displaystyle \operatorname {span} (U''')=\mathbb {R} ^{3}}$, because

${\displaystyle {\begin{pmatrix}x_{1}\\x_{2}\\x_{3}\end{pmatrix}}=(x_{1}-x_{3})\cdot {\begin{pmatrix}1\\0\\0\end{pmatrix}}+(x_{2}-x_{3})\cdot {\begin{pmatrix}0\\1\\0\end{pmatrix}}+x_{3}\cdot {\begin{pmatrix}1\\1\\1\end{pmatrix}}}$

Thus ${\displaystyle U'''}$ is a basis of ${\displaystyle \mathbb {R} ^{3}}$.
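The coefficients ${\displaystyle x_{1}-x_{3}}$, ${\displaystyle x_{2}-x_{3}}$ and ${\displaystyle x_{3}}$ can again be verified mechanically (a quick check, not part of the argument):

```python
# Check that the stated coefficients reproduce an arbitrary (x1, x2, x3)
# from U''' = {(1,0,0), (0,1,0), (1,1,1)}.
def combine(x1, x2, x3):
    a, b, c = x1 - x3, x2 - x3, x3
    return (a * 1 + c * 1, b * 1 + c * 1, c * 1)

assert combine(5, -2, 3) == (5, -2, 3)
```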

Example (Abstract example: Construction of a basis by adding vectors)

We consider the two-element field ${\displaystyle K:=\mathbb {Z} /2\mathbb {Z} }$ and the ${\displaystyle K}$-vector space ${\displaystyle V:=\operatorname {Fun} (K,K)}$ of all mappings from ${\displaystyle K}$ to ${\displaystyle K}$. As seen above, ${\displaystyle V}$ consists of 4 elements, that is, in the above notation ${\displaystyle V=\{c_{0},c_{1},\operatorname {id} ,f\}}$. Let again ${\displaystyle U:=\emptyset }$. Then ${\displaystyle \operatorname {id} \not \in \operatorname {span} (U)=\{c_{0}\}}$. Thus ${\displaystyle U':=U\cup \{\operatorname {id} \}=\{\operatorname {id} \}}$ is linearly independent. Now

${\displaystyle f\not \in \operatorname {span} (U')=\{\lambda \cdot \operatorname {id} ;\lambda \in K\}=\{c_{0},\operatorname {id} \}}$

This means ${\displaystyle U'':=U'\cup \{f\}=\{\operatorname {id} ,f\}}$ is linearly independent. Further we have that

${\displaystyle \operatorname {span} (U'')=\{\lambda \cdot \operatorname {id} +\mu \cdot f;\lambda ,\mu \in K\}=\{c_{0},\operatorname {id} ,f,\operatorname {id} +f\}=\{c_{0},c_{1},\operatorname {id} ,f\}=V}$

Thus ${\displaystyle U''}$ is a basis of ${\displaystyle V}$.