Basis – Serlo

In the last two articles we got to know the two concepts generator and linear independence. In this article we combine the two concepts and introduce the notion of a basis of a vector space.


Motivation

Via linear independence

We want to work towards the concept of dimension. Intuitively, we can think of it as the maximum number of linearly independent directions in a space. If we stick to this intuition, we can give the following preliminary definition of dimension: The dimension of a vector space is the maximum number of linearly independent vectors that we can simultaneously choose in a vector space. To find this, we need to find a maximal system of linearly independent vectors.

To see if this preliminary definition makes sense, let's try to apply it in the $\mathbb{R}^3$: Clearly, the $\mathbb{R}^3$ should be three-dimensional. This means we have to ask ourselves two questions: Do three linearly independent vectors fit into the $\mathbb{R}^3$, and is there no fourth vector that, together with three such independent vectors, still forms a linearly independent system? Let us now try this out: We take any first vector, e.g. $e_1 = (1,0,0)^T$. Now we want to find a linearly independent vector, i.e. a vector that is not a multiple of $e_1$ or, equivalently, does not lie in $\operatorname{span}(e_1)$. An example for this is $e_2 = (0,1,0)^T$.

If we want to follow the intuition that $\mathbb{R}^3$ is three-dimensional, we still have to find a third vector that is linearly independent of the system $\{e_1, e_2\}$. Now $e_1$ and $e_2$ span a plane in which the last component is always zero. Thus $e_3 = (0,0,1)^T$ is a vector linearly independent of $e_1$ and $e_2$.

Now the question is whether with these three vectors the maximum number of linearly independent vectors has already been reached. To answer this, let us first consider the vector $(1,2,3)^T$ as an example. We want to check whether we can add this vector to $e_1$, $e_2$ and $e_3$ and still obtain a system of linearly independent vectors. First we note that

$$\begin{pmatrix}1\\2\\3\end{pmatrix} = \begin{pmatrix}1\\0\\0\end{pmatrix} + \begin{pmatrix}0\\2\\0\end{pmatrix} + \begin{pmatrix}0\\0\\3\end{pmatrix}$$

So we can write the above vector as

$$\begin{pmatrix}1\\2\\3\end{pmatrix} = 1 \cdot e_1 + 2 \cdot e_2 + 3 \cdot e_3$$

So $(1,2,3)^T \in \operatorname{span}(e_1, e_2, e_3)$. Now let us ask the same question for any vector $v = (v_1, v_2, v_3)^T$ with $v_1, v_2, v_3 \in \mathbb{R}$. For this we first get

$$\begin{pmatrix}v_1\\v_2\\v_3\end{pmatrix} = \begin{pmatrix}v_1\\0\\0\end{pmatrix} + \begin{pmatrix}0\\v_2\\0\end{pmatrix} + \begin{pmatrix}0\\0\\v_3\end{pmatrix}$$

With this consideration we obtain for $v$ the representation

$$v = v_1 \cdot e_1 + v_2 \cdot e_2 + v_3 \cdot e_3$$

Thus $v \in \operatorname{span}(e_1, e_2, e_3)$. Since this vector was arbitrarily chosen, every vector from $\mathbb{R}^3$ is representable as a linear combination of the linearly independent vectors $e_1$, $e_2$ and $e_3$. Thus $\{e_1, e_2, e_3\}$ is a generator of $\mathbb{R}^3$. Therefore, we cannot add another vector to $e_1$, $e_2$ and $e_3$ such that the system remains linearly independent, since every other vector from $\mathbb{R}^3$ can be represented as a linear combination of $e_1$, $e_2$ and $e_3$. In other words, $e_1$, $e_2$ and $e_3$ form a maximal system of linearly independent vectors.

In summary, we proceeded as follows to find a maximal system of linearly independent vectors: We start with a vector $v_1$ that is not the zero vector. (That means we should not consider the null vector space here.) Then we proceed step by step: Once we have found linearly independent vectors $v_1, \dots, v_k$, we form the span $\operatorname{span}(v_1, \dots, v_k)$ of these vectors. If this is the entire vector space, we have found a generator and are done. Otherwise, we choose a vector $v_{k+1}$ that is not in $\operatorname{span}(v_1, \dots, v_k)$. This vector contributes a new direction, and the system $v_1, \dots, v_{k+1}$ is again linearly independent. Then we repeat this step until we find a generator. We thus obtain the characterisation that a maximal system of linearly independent vectors is a generator consisting of linearly independent vectors.
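
This reasoning can also be checked numerically. The following Python sketch is not part of the original article and assumes numpy is available; it uses the matrix rank as an independence test, since the rank of a matrix equals the maximal number of linearly independent columns:

```python
# Numerical sanity check of the motivation above: the three unit vectors of
# R^3 are linearly independent, and appending any fourth vector keeps the
# rank at 3, i.e. the enlarged system is linearly dependent.
import numpy as np

e1, e2, e3 = np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])

# Rank 3 for three column vectors means they are linearly independent.
print(np.linalg.matrix_rank(np.column_stack([e1, e2, e3])))  # 3

# An arbitrary fourth vector, e.g. (1, 2, 3)^T, cannot increase the rank.
v = np.array([1, 2, 3])
print(np.linalg.matrix_rank(np.column_stack([e1, e2, e3, v])))  # still 3
```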

Via generators

So far we have started with a system of linearly independent vectors (which is not yet a generator) and extended it until it became a maximal system of linearly independent vectors. Now we want to investigate what happens when we reverse the direction. That is, we start with a generator (which is not linearly independent) and reduce it until we find a minimal (i.e., linearly independent) generator.

Let us consider the set

$$E = \left\{ \begin{pmatrix}1\\0\\0\end{pmatrix},\ \begin{pmatrix}0\\1\\0\end{pmatrix},\ \begin{pmatrix}0\\0\\1\end{pmatrix},\ \begin{pmatrix}0\\0\\2\end{pmatrix},\ \begin{pmatrix}1\\1\\0\end{pmatrix} \right\} \subseteq \mathbb{R}^3$$

First, we establish that $E$ is a generator of $\mathbb{R}^3$ by the following calculation: For a vector $v = (v_1, v_2, v_3)^T$ with $v_1, v_2, v_3 \in \mathbb{R}$ we have that

$$v = v_1 \begin{pmatrix}1\\0\\0\end{pmatrix} + v_2 \begin{pmatrix}0\\1\\0\end{pmatrix} + v_3 \begin{pmatrix}0\\0\\1\end{pmatrix} + 0 \cdot \begin{pmatrix}0\\0\\2\end{pmatrix} + 0 \cdot \begin{pmatrix}1\\1\\0\end{pmatrix}$$

So we can represent any vector as a linear combination of vectors from the generator.

Now we ask ourselves whether we can reduce the above generator without losing the property of it being a generator. The vector $(0,0,2)^T$ is a multiple of $(0,0,1)^T$. That is, the direction represented by $(0,0,2)^T$ is the same as the direction represented by $(0,0,1)^T$. Hence, we can remove this vector from $E$ and obtain a new generator

$$E' = \left\{ \begin{pmatrix}1\\0\\0\end{pmatrix},\ \begin{pmatrix}0\\1\\0\end{pmatrix},\ \begin{pmatrix}0\\0\\1\end{pmatrix},\ \begin{pmatrix}1\\1\\0\end{pmatrix} \right\}$$

Can we reduce the size of this generator as well? Yes, since

$$\begin{pmatrix}1\\1\\0\end{pmatrix} = \begin{pmatrix}1\\0\\0\end{pmatrix} + \begin{pmatrix}0\\1\\0\end{pmatrix}$$

So $(1,1,0)^T$ adds no new direction that is not already spanned by $(1,0,0)^T$ and $(0,1,0)^T$. We thus obtain a smaller generator

$$E'' = \left\{ \begin{pmatrix}1\\0\\0\end{pmatrix},\ \begin{pmatrix}0\\1\\0\end{pmatrix},\ \begin{pmatrix}0\\0\\1\end{pmatrix} \right\}$$

Now we cannot reduce the generator any further without losing the property of it being a generator. For if we remove any of the three vectors in $E''$, it is no longer in the span of the remaining two vectors. For $(1,0,0)^T$, for example, we see this as follows: Suppose we had some $\lambda_2, \lambda_3 \in \mathbb{R}$, so that

$$\begin{pmatrix}1\\0\\0\end{pmatrix} = \lambda_2 \begin{pmatrix}0\\1\\0\end{pmatrix} + \lambda_3 \begin{pmatrix}0\\0\\1\end{pmatrix}$$

Then $\lambda_2 = 0$ would have to hold, because the second component must be zero on both sides. Because of the third component, $\lambda_3 = 0$ must be true. Thus we obtain the contradiction

$$\begin{pmatrix}1\\0\\0\end{pmatrix} = 0 \cdot \begin{pmatrix}0\\1\\0\end{pmatrix} + 0 \cdot \begin{pmatrix}0\\0\\1\end{pmatrix} = \begin{pmatrix}0\\0\\0\end{pmatrix}$$

So $(1,0,0)^T$ is not in $\operatorname{span}\left( (0,1,0)^T, (0,0,1)^T \right)$. So we have found a generator consisting of linearly independent vectors (a minimal generator).

In summary, we proceeded as follows: We started with a generator of finitely many vectors and reduced it according to the following algorithm: If the current generator is a system of linearly independent vectors, we cannot remove any vector without losing the property of being a generator. That means we are done and have found a minimal generator. In the converse case, we find a vector $w$ in the generator that lies in the span of the remaining vectors. This vector can be omitted and we obtain a new generator consisting of one vector fewer. With this generator, we do the same steps again until we have found a system of linearly independent vectors.

We thus obtain the characterisation that a minimal generator is a generator consisting of linearly independent vectors.

Motivation - Conclusion

We have found the characterisations of linearly independent generators as minimal generators and as maximal linearly independent subsets. Thus the property of being a linearly independent generator is a special property of both linearly independent sets and generators. A set with both properties is called a basis in Linear Algebra.

Since bases are generators, every vector has a representation as a linear combination of basis vectors. This representation is unambiguous because bases are linearly independent. So we found another way of describing bases:

A basis is a subset $B \subseteq V$ such that every vector $v \in V$ has a unique representation as a linear combination of basis vectors:

$$v = \lambda_1 b_1 + \dots + \lambda_n b_n \quad \text{with } \lambda_1, \dots, \lambda_n \in K \text{ and } b_1, \dots, b_n \in B$$

Definition: Basis of a vector space

Definition (Basis of a vector space)

Let $K$ be a field and $V$ a $K$-vector space. If $B \subseteq V$ is a generator of $V$ whose vectors are linearly independent, then $B$ is called a basis of $V$.

Hint

We have defined the basis as a set of vectors. This does not determine the order of the vectors. Alternatively, one can define the basis as a tuple of vectors. In this case, the order of the vectors is fixed, and changing the order results in a different basis.

Equivalent definitions of basis

Theorem (Equivalent definitions of basis)

For a subset $B$ of a $K$-vector space $V$ the following four statements are equivalent:

  1. $B$ is a linearly independent generator of $V$.
  2. $B$ is a maximal linearly independent subset of $V$. This means that if any further element of $V$ is added to $B$, the new set is no longer linearly independent.
  3. Each element of $V$ can be uniquely represented as a linear combination of vectors from $B$.
  4. $B$ is a minimal generator of $V$. This means: $B$ is a generator of $V$, but if an element is removed from $B$, the remaining set is no longer a generator of $V$.

Proof (Equivalent definitions of basis)

We prove the equivalence of these statements by a "circle of implications" of the kind $1 \Rightarrow 2 \Rightarrow 3 \Rightarrow 4 \Rightarrow 1$:

Proof step: $1 \Rightarrow 2$

We show this using a proof by contradiction. We assume that statement 1 is true and statement 2 is false. Then we show that it follows that statement 1 must also be false (which is a contradiction). So statement 2 cannot be false, i.e., it must be true.

So let us assume that $B$ is a linearly independent generator but not a maximal linearly independent subset of $V$. Thus there is a $w \in V \setminus B$ such that the vectors of $B \cup \{w\}$ are linearly independent. (This is exactly the opposite of statement 2.) However, since $B$ is a generator according to statement 1, we can write $w$ as a linear combination of elements from $B$:

$$w = \lambda_1 b_1 + \dots + \lambda_n b_n$$

with $\lambda_1, \dots, \lambda_n \in K$ and $b_1, \dots, b_n \in B$. At least one $\lambda_i \neq 0$, because $B \cup \{w\}$ is linearly independent and therefore $w \neq 0$. The above equation can be transformed to:

$$0 = -w + \lambda_1 b_1 + \dots + \lambda_n b_n$$

Since the coefficient of $w$ is $-1 \neq 0$, this is a non-trivial linear combination of zero. Thus the set $B \cup \{w\}$ is linearly dependent. This is a contradiction to our assumption that $B \cup \{w\}$ is linearly independent. That means, we have verified $1 \Rightarrow 2$ by contradiction.

Proof step: $2 \Rightarrow 3$

The proof is divided into two steps. In the first step we show that, under the assumption of statement 2, every element from $V$ has a linear combination with elements from $B$. In the second step we show the uniqueness of this linear combination. Both steps must be shown, because statement 3 requires the existence of such a linear combination as well as its uniqueness.

Let $B$ be a maximal linearly independent subset of $V$ and let $v$ be any element from our vector space. We will now show that we can write $v$ as a linear combination of elements from $B$. We distinguish two cases; it is important that they cover all possibilities. Here this is obvious, because we first consider the case $v \in B$ and then its complement $v \notin B$.

Case 1: $v \in B$

We assume that $v \in B$, i.e., the vector we are looking for is part of our maximal linearly independent subset. The linear combination is trivial in this case because $v$ is already in $B$. Thus we can simply write

$$v = 1 \cdot v$$

and we are done.

Case 2: $v \notin B$

We assume that $v \notin B$, so $v \in V \setminus B$. We now exploit the maximal linear independence of $B$. To do this, we consider the set $B \cup \{v\}$. Because of $v \notin B$ and because of the maximal linear independence of $B$, the set $B \cup \{v\}$ is linearly dependent. Linear dependence means that there exists a linear combination resulting in zero, but with not all coefficients being zero:

$$0 = \lambda v + \lambda_1 b_1 + \dots + \lambda_n b_n$$

Here at least one coefficient is non-zero. We first convince ourselves that $\lambda \neq 0$, because we need to divide by $\lambda$: if we had $\lambda = 0$, the above line would form a non-trivial linear combination of zero only with elements from $B$. This would be a contradiction to $B$ being linearly independent. Thus $\lambda \neq 0$, and by transforming we obtain a linear combination of $v$ by the elements from $B$:

$$v = -\frac{\lambda_1}{\lambda} b_1 - \dots - \frac{\lambda_n}{\lambda} b_n$$

Thus we have finished the consideration of both cases and obtain in each case a linear combination of $v$ using elements from $B$. Since $v$ was arbitrarily chosen from $V$, we have now shown that every element from $V$ can be represented as a linear combination of vectors from $B$. This shows the existence of the linear combination.

In the next and final step we have to prove the uniqueness of this linear combination. We show this again via a proof by contradiction. We assume that there is a $v \in V$ which can be represented by two different linear combinations; a contradiction to statement 2 will then be inferred from this ambiguity. Let

$$v = \lambda_1 b_1 + \dots + \lambda_n b_n = \mu_1 b_1 + \dots + \mu_n b_n$$

with $\lambda_i, \mu_i \in K$ and $b_i \in B$. (By adding coefficients equal to zero we may assume that both combinations use the same vectors $b_1, \dots, b_n$.) Since both linear combinations are different, there is at least one $j$ for which $\lambda_j \neq \mu_j$. If this were not true, all coefficients would be identical and thus both linear combinations would be identical. We now subtract these two representations and get the following representation of the zero vector:

$$0 = (\lambda_1 - \mu_1) b_1 + \dots + (\lambda_n - \mu_n) b_n$$

Let us define the difference coefficients $\nu_i = \lambda_i - \mu_i$. Since $\lambda_i, \mu_i \in K$, we have that also $\nu_i \in K$. Thus we can rewrite the above equation as:

$$0 = \nu_1 b_1 + \dots + \nu_n b_n$$

With $\nu_j = \lambda_j - \mu_j \neq 0$ there is at least one non-zero coefficient. So this is a non-trivial linear combination of zero with elements from $B$. It follows that $B$ is linearly dependent. This is a contradiction to statement 2, according to which $B$ is linearly independent. So our ambiguity assumption was false and the linear combination must be unique. Therefore we have that $2 \Rightarrow 3$.

Proof step: $3 \Rightarrow 4$

We carry out the proof in two steps. First we show that, assuming statement 3, $B$ is a generator. Then we show that it is minimal. By the definition of a generator, $B$ must be a subset of $V$ and it must span $V$ (i.e. $\operatorname{span}(B) = V$). By statement 3, $B \subseteq V$. Since we can represent every element from $V$ as a linear combination using elements from $B$, we also have that $B$ spans the vector space $V$. Thus $B$ is a generator.

In the next step we want to show that $B$ is minimal. We again perform a proof by contradiction and assume that $B$ is not a minimal generator; this leads us to a contradiction to statement 3. If $B$ is not a minimal generator, there exists a proper subset $B' \subsetneq B$ which is also a generator. Let now $b \in B \setminus B'$, so $b \in B$ but $b \notin B'$. Then there exist two different linear combinations for $b$ constructed by vectors in $B$. First, $b$ can be "trivially" represented:

$$b = 1 \cdot b$$

or we use the representation of $b$ by elements from $B'$ (which can be done because $B'$ forms a generator):

$$b = \lambda_1 b_1 + \dots + \lambda_n b_n$$

with $\lambda_i \in K$ and $b_1, \dots, b_n \in B' \subseteq B$. The second linear combination does not use $b$, so the two representations are indeed different. Since, according to statement 3, for every element from $V$ there is a unique linear combination of vectors from $B$, the existence of the two linear combinations contradicts statement 3. Thus $B$ must be a minimal generator. This concludes the proof by contradiction and we have that $3 \Rightarrow 4$.

Proof step: $4 \Rightarrow 1$

We show this again using a proof by contradiction. So we assume that statement 1 is false, while statement 4 holds. By statement 4, $B$ is a generator, so $B$ cannot be a linearly independent generator, i.e., $B$ must be linearly dependent. Then there exists a $b \in B$ that can be represented as a linear combination of the other vectors from $B$:

$$b = \lambda_1 b_1 + \dots + \lambda_n b_n$$

with $\lambda_i \in K$ and $b_1, \dots, b_n \in B \setminus \{b\}$. We now lead this to a contradiction with statement 4 by showing that $B$ is then no longer a minimal generator, because $B \setminus \{b\}$ is also a generator.

Let $w \in V$ be any vector. Since $B$ is a generator, there is a linear combination of $w$ with vectors from $B$:

$$w = \mu b + \mu_1 b_1 + \dots + \mu_n b_n$$

with $\mu, \mu_i \in K$. Now we plug the above linear combination of $b$ into the linear combination of $w$:

$$w = \mu (\lambda_1 b_1 + \dots + \lambda_n b_n) + \mu_1 b_1 + \dots + \mu_n b_n = (\mu \lambda_1 + \mu_1) b_1 + \dots + (\mu \lambda_n + \mu_n) b_n$$

We have now found a linear combination for the arbitrary vector $w$ that uses only vectors from $B \setminus \{b\}$. Thus $B \setminus \{b\}$ is a generator of $V$. This contradicts statement 4, according to which $B$ is a minimal generator. So we also verified $4 \Rightarrow 1$ by contradiction.

Examples

Canonical basis in coordinate space

The vector space $K^n$ of the $n$-tuples over the field $K$ has a so-called standard basis or canonical basis

$$\{e_1, \dots, e_n\} = \left\{ \begin{pmatrix}1\\0\\\vdots\\0\end{pmatrix},\ \begin{pmatrix}0\\1\\\vdots\\0\end{pmatrix},\ \dots,\ \begin{pmatrix}0\\0\\\vdots\\1\end{pmatrix} \right\}$$

For instance, $\mathbb{R}^2$ has the canonical basis $\{(1,0)^T, (0,1)^T\}$ and the $\mathbb{R}^3$ the canonical basis $\{(1,0,0)^T, (0,1,0)^T, (0,0,1)^T\}$.

Exercise (Basis of the plane)

Show that the set $B = \{(1,0)^T, (0,1)^T\}$ is a basis of $\mathbb{R}^2$.

Summary of proof (Basis of the plane)

We first show that any vector can be represented as a linear combination of the two given canonical basis vectors, so they form a generator. Then we show that they are linearly independent.

Solution (Basis of the plane)

Proof step: $B$ is a generator of $\mathbb{R}^2$

Let $(x, y)^T \in \mathbb{R}^2$ be arbitrary. Then,

$$\begin{pmatrix}x\\y\end{pmatrix} = x \begin{pmatrix}1\\0\end{pmatrix} + y \begin{pmatrix}0\\1\end{pmatrix}$$

So $B$ is a generator of $\mathbb{R}^2$.

Proof step: $B$ is linearly independent

We assume

$$\lambda_1 \begin{pmatrix}1\\0\end{pmatrix} + \lambda_2 \begin{pmatrix}0\\1\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}$$

It follows

$$\begin{pmatrix}\lambda_1\\\lambda_2\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}$$

So $\lambda_1 = 0$ and $\lambda_2 = 0$. Hence, the vectors of $B$ are linearly independent.

A different basis of $\mathbb{R}^3$

Exercise

Show that the vectors $v_1 = (1,1,0)^T$, $v_2 = (1,0,1)^T$ and $v_3 = (0,1,1)^T$ provide a basis of $\mathbb{R}^3$.

Summary of proof

In a first step, one shows that $\{v_1, v_2, v_3\}$ is a generator of $\mathbb{R}^3$. In a second step, linear independence of these vectors is established.

Solution

Proof step: generator

Let $(x, y, z)^T$ be any vector of $\mathbb{R}^3$. Then,

$$\begin{pmatrix}x\\y\\z\end{pmatrix} = \frac{x+y-z}{2} \begin{pmatrix}1\\1\\0\end{pmatrix} + \frac{x-y+z}{2} \begin{pmatrix}1\\0\\1\end{pmatrix} + \frac{-x+y+z}{2} \begin{pmatrix}0\\1\\1\end{pmatrix}$$

is a linear combination of $(x,y,z)^T$ using the vectors $v_1, v_2, v_3$, so these vectors are a generator of $\mathbb{R}^3$.

Proof step: Linear independence

We represent the zero vector as a linear combination of the three vectors and check for which coefficients a zero can be obtained:

$$\lambda_1 \begin{pmatrix}1\\1\\0\end{pmatrix} + \lambda_2 \begin{pmatrix}1\\0\\1\end{pmatrix} + \lambda_3 \begin{pmatrix}0\\1\\1\end{pmatrix} = \begin{pmatrix}0\\0\\0\end{pmatrix}$$

This results in the following system of equations:

$$\lambda_1 + \lambda_2 = 0, \qquad \lambda_1 + \lambda_3 = 0, \qquad \lambda_2 + \lambda_3 = 0$$

The only solution of this system of equations is $\lambda_1 = \lambda_2 = \lambda_3 = 0$. Thus the vectors are linearly independent.
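
As a quick cross-check, one can let a computer algebra system do the computation. The following sketch assumes sympy is available and uses the vectors $v_1, v_2, v_3$ chosen above as matrix columns; three vectors form a basis of $\mathbb{R}^3$ exactly if this matrix has non-zero determinant:

```python
# Verify the exercise: the columns of A are v1, v2, v3 from above.
from sympy import Matrix

A = Matrix([[1, 1, 0],
            [1, 0, 1],
            [0, 1, 1]])
print(A.det())        # -2 != 0, so the columns form a basis of R^3
print(A.nullspace())  # [] -- only the trivial combination of zero exists
```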

Example: complex numbers

The set $\mathbb{C}$ of complex numbers is a vector space over $\mathbb{R}$, with multiplication by real numbers:

$$\lambda \cdot (a + bi) = \lambda a + (\lambda b) i \quad \text{for } \lambda, a, b \in \mathbb{R}$$

Then $\mathbb{C}$ has as an $\mathbb{R}$-vector space the basis $\{1, i\}$, because for every $z \in \mathbb{C}$ we have unique $a, b \in \mathbb{R}$ with $z = a \cdot 1 + b \cdot i$.

If we consider $\mathbb{C}$ as a $\mathbb{C}$-vector space, $\{1, i\}$ is no longer a basis of $\mathbb{C}$, since

$$i \cdot 1 + (-1) \cdot i = 0$$

is a non-trivial linear combination of zero with the coefficients $i, -1 \in \mathbb{C}$.

For $\mathbb{C}$ as a $\mathbb{C}$-vector space we have the one-element basis $\{1\}$. As an $\mathbb{R}$-vector space the complex numbers have a two-element basis (the dimension is $2$) and as a $\mathbb{C}$-vector space a one-element basis (the dimension is $1$). So be cautious: it is important over which field we take a fixed vector space!

Abstract example

The vector space $\mathbb{R}[x]$ of the polynomials with coefficients from $\mathbb{R}$ has a basis with infinitely many elements. An example for a basis are the powers of $x$:

$$B = \{1, x, x^2, x^3, \dots\}$$

This is a generator, because for a polynomial $p$ of degree $n$ we have a representation

$$p = a_n x^n + a_{n-1} x^{n-1} + \dots + a_1 x + a_0$$

where $a_k \in \mathbb{R}$ for all $k$. Thus every polynomial is a finite linear combination of elements from $B$. Consequently, $B$ is a generator.

For linear independence we consider the following: With $a_0, \dots, a_n \in \mathbb{R}$ let:

$$a_n x^n + \dots + a_1 x + a_0 = 0$$

We can also write the zero polynomial as $0 = 0 \cdot x^n + \dots + 0 \cdot x + 0$. A comparison of coefficients yields that

$$a_0 = a_1 = \dots = a_n = 0$$

So $B$ is a basis.
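
The uniqueness of the coefficients with respect to the monomial basis is exactly what a computer algebra system returns when asked for the coefficients of a polynomial. A small illustration (assuming sympy is available):

```python
# The coefficients of p with respect to the basis {1, x, x^2, ...} are unique.
from sympy import Poly, symbols

x = symbols('x')
p = Poly(3*x**3 - 2*x + 5, x)
# Coefficients from the highest power downwards: [3, 0, -2, 5], i.e.
# p = 3*x^3 + 0*x^2 + (-2)*x + 5 -- one coefficient per basis vector used.
print(p.all_coeffs())
```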

Bases are not unique

Example: Basis of the plane is not unique

The blue vector v can be represented by two different bases.

We will show, using the plane $\mathbb{R}^2$, that the basis of a vector space is not unique. Let us look at the (canonical) basis for $\mathbb{R}^2$ consisting of the unit vectors:

$$B_1 = \left\{ \begin{pmatrix}1\\0\end{pmatrix},\ \begin{pmatrix}0\\1\end{pmatrix} \right\}$$

These vectors obviously form a generator:

$$\begin{pmatrix}x\\y\end{pmatrix} = x \begin{pmatrix}1\\0\end{pmatrix} + y \begin{pmatrix}0\\1\end{pmatrix}$$

They are also linearly independent because it is impossible to find a linear combination of zero with non-trivial coefficients. Thus $B_1$ is a basis. However, for the plane there are also a lot of other bases. An example is the following set:

$$B_2 = \left\{ \begin{pmatrix}1\\1\end{pmatrix},\ \begin{pmatrix}1\\-1\end{pmatrix} \right\}$$

We can generate all vectors with these two vectors:

$$\begin{pmatrix}x\\y\end{pmatrix} = \frac{x+y}{2} \begin{pmatrix}1\\1\end{pmatrix} + \frac{x-y}{2} \begin{pmatrix}1\\-1\end{pmatrix}$$

These vectors are linearly independent because neither vector is a multiple of the other (two vectors are linearly dependent exactly if one vector is a multiple of the other). Thus $B_2$ is also a basis. These two examples show that the basis for $\mathbb{R}^2$ is not unique. And one can indeed find a lot of other bases, for instance by stretching and rotating the vectors of a known basis.
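
For two vectors in $\mathbb{R}^2$ there is a convenient criterion: they form a basis exactly if the matrix having them as columns has non-zero determinant. A short check of the set $B_2$ from above (a sketch, assuming numpy is available):

```python
# The columns of B2 are the two candidate basis vectors (1,1)^T and (1,-1)^T.
import numpy as np

B2 = np.array([[1, 1],
               [1, -1]])
print(np.linalg.det(B2))  # -2.0 != 0, so the columns of B2 form a basis
```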

Building a further basis from an existing one

In general, for every $K$-vector space with a basis, we can construct a second, different basis: Consider a two-dimensional vector space $V$ with a basis $B = \{b_1, b_2\}$. Then $B' = \{b_1, b_1 + b_2\}$ is also a basis of $V$. The same argument of "substituting the last vector" also works in higher dimensions. We first show that $B'$ is a generator and then that its vectors are linearly independent.

Let $v \in V$ be any vector and $v = \lambda_1 b_1 + \lambda_2 b_2$ a linear combination of it using elements of the basis $B$. Then a linear combination of $v$ can also be found via vectors from $B'$:

$$v = (\lambda_1 - \lambda_2) b_1 + \lambda_2 (b_1 + b_2)$$

Thus $B'$ is a generator because the vector $v$ was arbitrarily chosen.

We show that the vectors of $B'$ are linearly independent via proof by contraposition. To do this, we prove that if $B'$ is linearly dependent, then $B$ must also be linearly dependent. By contraposition, it thus follows that $B'$ is linearly independent if $B$ is linearly independent (which must be the case, as $B$ is a basis). If $B'$ is linearly dependent, there is a non-trivial representation of zero:

$$\mu_1 b_1 + \mu_2 (b_1 + b_2) = 0$$

Here $\mu_1 \neq 0$ or $\mu_2 \neq 0$. Now we also find a representation of the zero with the basis $B$:

$$(\mu_1 + \mu_2) b_1 + \mu_2 b_2 = 0$$

We still have to show that one of the coefficients $\mu_1 + \mu_2$ or $\mu_2$ is not equal to zero. As a premise we have $\mu_1 \neq 0$ or $\mu_2 \neq 0$. The case $\mu_2 \neq 0$ leads directly to a non-trivial representation of the zero, since this coefficient also appears in the second equation.

If $\mu_2 = 0$ holds, then $\mu_1 \neq 0$ must hold, and we get $\mu_1 + \mu_2 = \mu_1 \neq 0$. Hence the first coefficient is non-zero.

It follows that one of the coefficients is always non-zero. Thus the vectors of the basis $B$ are also linearly dependent. It follows (by contraposition) that $B'$ is linearly independent if $B$ is linearly independent. $B'$ is thus a new basis constructed from the first basis $B$.

This principle can also be applied to larger bases and shows: The basis of a vector space is (usually) not unique. A vector space with dimension equal to or larger than 2 has several bases.

Proving existence and constructing a basis

Existence of a basis

We have not yet answered the question of whether there is a basis for every vector space at all. Attention, this is not self-evident, as bases may be infinitely large! Nevertheless, you can rejoice, because the answer is: Yes, every vector space has (at least) one basis.

Of course, we still have to justify this answer. For the case of finitely generated vector spaces, i.e. all vector spaces that have a finite generator, we will prove this in a moment. For infinitely generated vector spaces, i.e. vector spaces that do not have a finite generator, the proof is much more complicated and uses the axiom of choice.

Theorem (Basis theorem)

Let $V$ be a finitely generated $K$-vector space and $E$ a finite generator of $V$. Then there is a subset $B \subseteq E$ that is a basis of $V$.

Proof (Basis theorem)

We want to remove vectors from $E$ until we find a linearly independent subset of $E$ that is still a generator. We do this as follows: If $E$ is linearly dependent, then $E$ is not a basis. So, according to the theorem about equivalent characterisations of a basis, $E$ is not a minimal generator, and there exists a proper subset $E' \subsetneq E$ that is a generator of $V$. This $E'$ has fewer elements than $E$.

We now inductively find generators $E_0 \supsetneq E_1 \supsetneq E_2 \supsetneq \dots$ such that $E_{k+1}$ has fewer elements than $E_k$, as follows:

  • set $E_0 = E$ and
  • for $k \geq 0$, choose a proper subset $E_{k+1} \subsetneq E_k$ that is a generator of $V$, if $E_k$ is not a minimal generator.

Since we start with a finite set, we get a minimal generator after finitely many steps. By the theorem on equivalent definitions of a basis, this minimal generator is a basis of $V$.

Theorem (Existence of a basis)

Every finitely generated vector space has at least one basis.

Proof (Existence of a basis)

We take a finite generator. With the basis theorem, there is a subset of it that is a basis of the vector space. In particular, the vector space has at least one basis.

Construction of a basis by removing vectors

Now we know that every vector space has a basis, but how can you find a basis for a given vector space? For finitely generated vector spaces, the proof of the theorem on the existence of a basis gives you a procedure for constructing a basis in finitely many steps (it is not applicable for infinitely generated vector spaces). According to the basis theorem, we can proceed as follows:

  1. Find a finite generator $E$ of the vector space.
  2. Check whether $E$ is linearly independent.
    • If yes: We are done and the generator $E$ is a basis.
    • If no: Find a smaller generator of the vector space and repeat step 2.

We now need an explicit way to get a smaller generator from a finite generator $E = \{v_1, \dots, v_n\}$, which is not a minimal generator. Since $E$ is not a minimal generator, $E$ is linearly dependent. So we find $\lambda_1, \dots, \lambda_n \in K$ so that not all $\lambda_i$ are zero and we have that

$$\lambda_1 v_1 + \lambda_2 v_2 + \dots + \lambda_n v_n = 0$$

Now we choose an $i$ with $\lambda_i \neq 0$ and set

$$E' = E \setminus \{v_i\} = \{v_1, \dots, v_{i-1}, v_{i+1}, \dots, v_n\}$$

We now want to show that $E'$ is also a generator. In the article on linear independence of vectors we have proven that in this situation

$$v_i = -\frac{\lambda_1}{\lambda_i} v_1 - \dots - \frac{\lambda_{i-1}}{\lambda_i} v_{i-1} - \frac{\lambda_{i+1}}{\lambda_i} v_{i+1} - \dots - \frac{\lambda_n}{\lambda_i} v_n$$

so $v_i \in \operatorname{span}(E')$. Since $E$ is a generator of $V$, we find for a vector $w \in V$ scalars $\mu_1, \dots, \mu_n \in K$ such that

$$w = \mu_1 v_1 + \dots + \mu_n v_n$$

If we replace $v_i$ in this representation by the linear combination above, we obtain a representation of $w$ that uses only vectors from $E'$. Thus $w \in \operatorname{span}(E')$. Since $w$ was arbitrarily chosen, it follows that $E'$ is a generator. With the proof of the basis theorem, we now get the following procedure for determining a basis:

How to get to the proof? (Construction of a basis for finitely generated vector spaces)

  1. Find a finite generator of the given vector space.
  2. Try to find a non-trivial linear combination of the zero vector from this generator. (For this, you need to solve a linear system of equations.) If no non-trivial linear combination exists, the generator is also linearly independent, and we are done.
  3. If a non-trivial linear combination exists, remove one of the vectors from the generator whose coefficient in the linear combination is not zero. Then, go back to Step 2.
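
The following Python sketch implements this procedure for vectors in $\mathbb{Q}^n$ or $\mathbb{R}^n$ (an illustration, not part of the original article; it assumes sympy is available). The nullspace of the matrix with the generator as columns encodes exactly the non-trivial linear combinations of the zero vector:

```python
# Sketch of the removal procedure: while there is a non-trivial linear
# combination of zero, drop one vector whose coefficient is non-zero.
from sympy import Matrix

def reduce_to_basis(vectors):
    vectors = list(vectors)
    while True:
        A = Matrix.hstack(*[Matrix(v) for v in vectors])
        null = A.nullspace()  # basis of all combinations giving zero
        if not null:
            return vectors    # linearly independent: a basis of the span
        coeffs = null[0]
        # Remove some vector whose coefficient is non-zero (step 3 above).
        i = next(j for j in range(len(vectors)) if coeffs[j] != 0)
        del vectors[i]

# Example: reduce the generator {(1,0), (0,1), (1,1)} of R^2 to a basis.
print(reduce_to_basis([[1, 0], [0, 1], [1, 1]]))  # [[0, 1], [1, 1]]
```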

Construction of a basis by adding vectors

Alternatively, we can proceed as in the section Motivation via linear independence; that is, we start with a linearly independent set and extend it until it is maximal, i.e., a basis.

Theorem (Basis completion theorem)

For every linearly independent subset $M$ of a finitely generated vector space $V$ there is a basis $B$ of $V$ with $M \subseteq B$.

Proof (Basis completion theorem)

Let $V$ be a finitely generated vector space and $M$ be any linearly independent subset of $V$. Since $V$ is finitely generated, we find a finite generator $E = \{v_1, \dots, v_n\}$ of $V$. We now want to add elements from $E$ to the set $M$ until this new set is a generator. When adding the vectors, we want to keep the linear independence, so that the new set is a linearly independent generator and thus a basis.

If $M$ is a generator, then $M$ is already a basis of $V$. Otherwise we have $\operatorname{span}(M) \neq V$, and then there is some $v_i \in E$ with $v_i \notin \operatorname{span}(M)$: if all $v_i$ were in $\operatorname{span}(M)$, we would have $V = \operatorname{span}(E) \subseteq \operatorname{span}(M)$. Now we set $M' = M \cup \{v_i\}$. To show the linear independence of $M'$, we assume $M'$ were linearly dependent. Then a non-trivial linear combination of zero would exist:

$$\lambda v_i + \lambda_1 u_1 + \dots + \lambda_m u_m = 0$$

with $\lambda, \lambda_1, \dots, \lambda_m \in K$ and $u_1, \dots, u_m \in M$. Since $M$ is linearly independent, $\lambda \neq 0$ must hold. So we get

$$v_i = -\frac{\lambda_1}{\lambda} u_1 - \dots - \frac{\lambda_m}{\lambda} u_m \in \operatorname{span}(M)$$

This is a contradiction to $v_i \notin \operatorname{span}(M)$. Thus $M'$ is linearly independent.

We now inductively construct linearly independent sets $M_0 \subseteq M_1 \subseteq M_2 \subseteq \dots$ by adding elements from $E$ according to the above procedure until the set has become a basis:

  • start with $M_0 = M$
  • and inductively set $M_{k+1} = M_k \cup \{v\}$ for some $v \in E$ with $v \notin \operatorname{span}(M_k)$, if $M_k$ is not yet a generator of $V$.

Since we can only add finitely many vectors from $E$ to $M$, there is an $m$ at which we cannot add a vector from $E$ to $M_m$. Then every vector of $E$ lies in $\operatorname{span}(M_m)$, so $M_m$ is a generator and hence also a basis $B \supseteq M$ of $V$.

This proof gives you another method to determine a basis of a finitely generated vector space in finitely many steps (this method is also only applicable for finitely generated vector spaces):

  1. Choose a finite generator and start with the empty set as your first linearly independent set.
  2. Try to find a vector from your generator that is not in the span of your previous linearly independent set. If you don't find one, you're done.
  3. Add the vector you found to your linearly independent set and go back to Step 2.

In the next chapter we will see that every two bases of the same finitely generated vector space have the same cardinality. It follows that every linearly independent set that has as many elements as a basis is automatically already a maximal linearly independent subset. Therefore, we can change step 2 in the above procedure as follows: "Try to find a vector from your vector space that is not in the span of your previous linearly independent set. If you don't find one, you're done." In this variant of the procedure, you do not have to choose a generator in Step 1.
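
The following Python sketch implements the extension procedure for vectors in $\mathbb{R}^n$ (again only an illustration, assuming numpy is available): a vector lies outside the span of the current set exactly if appending it as a column increases the matrix rank.

```python
# Sketch of the extension procedure: keep exactly those generator vectors
# that enlarge the span of what has been collected so far.
import numpy as np

def extend_to_basis(independent, generator):
    basis = list(independent)
    for v in generator:
        # v is outside span(basis) iff adding it increases the rank.
        if np.linalg.matrix_rank(np.column_stack(basis + [v])) > len(basis):
            basis.append(v)
    return basis

# Extend the linearly independent set {(1,1,0)^T} using the canonical
# generator of R^3; the result is a basis containing the starting vector.
gens = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])]
print(extend_to_basis([np.array([1, 1, 0])], gens))
```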

Examples: Construction of a basis by removing vectors

Example (Construction of a basis by removing vectors 1)

Let $E = \left\{ \begin{pmatrix}1\\0\end{pmatrix},\ \begin{pmatrix}0\\1\end{pmatrix},\ \begin{pmatrix}1\\1\end{pmatrix} \right\}$. Then, we have $\operatorname{span}(E) = \mathbb{R}^2$, since for all $(x,y)^T \in \mathbb{R}^2$ we have that

$$\begin{pmatrix}x\\y\end{pmatrix} = x \begin{pmatrix}1\\0\end{pmatrix} + y \begin{pmatrix}0\\1\end{pmatrix} + 0 \cdot \begin{pmatrix}1\\1\end{pmatrix}$$

That means, $E$ is a generator. Now, $E$ is not a basis of $\mathbb{R}^2$ because we have the non-trivial representation of the zero vector

$$1 \cdot \begin{pmatrix}1\\0\end{pmatrix} + 1 \cdot \begin{pmatrix}0\\1\end{pmatrix} + (-1) \cdot \begin{pmatrix}1\\1\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}$$

Since in this linear combination the prefactor of $(1,1)^T$ is not zero, according to the above procedure for constructing a basis from a generator, $B' = \left\{ \begin{pmatrix}1\\0\end{pmatrix},\ \begin{pmatrix}0\\1\end{pmatrix} \right\}$ is also a generator. We now need to check whether $B'$ is linearly dependent. So we look at how the zero vector can be combined using vectors from this set:

$$\lambda_1 \begin{pmatrix}1\\0\end{pmatrix} + \lambda_2 \begin{pmatrix}0\\1\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}$$

If we set up the corresponding linear system of equations and solve it, it follows that $\lambda_1 = \lambda_2 = 0$. Thus $B'$ is linearly independent and hence a basis of $\mathbb{R}^2$.

Example (Abstract example: Construction of a basis by removing vectors)

Let $K = \{0, 1\}$ be the two-element field. We consider the $K$-vector space $V$ of all mappings from $\{0,1\}$ to $\{0,1\}$. This is a function space consisting of $4$ elements. Thus $V = \{f_0, f_1, \mathrm{id}, s\}$. Here $f_0$ is the constant zero mapping, $f_1$ is the constant one mapping, $\mathrm{id}$ is the identity and $s$ is the mapping that interchanges $0$ and $1$:

$$f_0(x) = 0, \qquad f_1(x) = 1, \qquad \mathrm{id}(x) = x, \qquad s(0) = 1,\ s(1) = 0$$
As initial generator we choose the entire vector space, i.e. $E_0 = V = \{f_0, f_1, \mathrm{id}, s\}$. This is linearly dependent, because we have the following non-trivial representation of the zero vector (note that the zero vector of $V$ is the zero mapping $f_0$):

$$1 \cdot f_0 = f_0$$

So we get a new generator of $V$ since the prefactor of $f_0$ is non-zero. Specifically, $E_1 = \{f_1, \mathrm{id}, s\}$. Now $E_1$ is also linearly dependent because we have

$$f_1 + \mathrm{id} + s = f_0$$

(check this by inserting the arguments $0$ and $1$, and remember that $1 + 1 = 0$ in $K$). Since the prefactor of $s$ is non-zero, we choose as a new generator $E_2 = \{f_1, \mathrm{id}\}$. This set is linearly independent. From $\lambda_1 f_1 + \lambda_2 \, \mathrm{id} = f_0$ we obtain (by inserting the two possible arguments):

$$\lambda_1 \cdot 1 + \lambda_2 \cdot 0 = 0 \quad \text{and} \quad \lambda_1 \cdot 1 + \lambda_2 \cdot 1 = 0$$

So $\lambda_1 = \lambda_2 = 0$ and thus $E_2$ is linearly independent. With $E_2 = \{f_1, \mathrm{id}\}$ we have found a basis of our vector space.
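
Because this vector space has only $4$ elements and the field only $2$, the whole example can be verified by brute force. A sketch in plain Python (no libraries beyond the standard library), encoding a mapping $f$ as the pair $(f(0), f(1))$:

```python
# Brute-force check of the F_2 example: compute the span of {f_1, id} by
# running through all coefficient combinations over the field {0, 1}.
from itertools import product

f0, f1, fid, fs = (0, 0), (1, 1), (0, 1), (1, 0)  # zero, one, identity, swap

def span(vectors):
    result = set()
    for coeffs in product([0, 1], repeat=len(vectors)):
        # Pointwise linear combination, with arithmetic modulo 2.
        combo = tuple(sum(c * v[k] for c, v in zip(coeffs, vectors)) % 2
                      for k in range(2))
        result.add(combo)
    return result

print(span([f1, fid]) == {f0, f1, fid, fs})  # True: {f_1, id} is a generator
# 4 distinct combinations from 4 coefficient pairs also shows independence.
```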

Example: Construction of a basis by adding vectors

Example (Construction of a basis by adding vectors)

We consider the space $\mathbb{R}^3$ and start with the linearly independent set $M_1 = \{(1,1,0)^T\}$. Then $(0,1,0)^T \notin \operatorname{span}(M_1)$, since $(0,1,0)^T$ is not a multiple of $(1,1,0)^T$. That is, we get the linearly independent set $M_2 = \{(1,1,0)^T, (0,1,0)^T\}$. Further

$$\operatorname{span}(M_2) = \left\{ \lambda_1 \begin{pmatrix}1\\1\\0\end{pmatrix} + \lambda_2 \begin{pmatrix}0\\1\\0\end{pmatrix} : \lambda_1, \lambda_2 \in \mathbb{R} \right\} = \left\{ \begin{pmatrix}x\\y\\0\end{pmatrix} : x, y \in \mathbb{R} \right\}$$

That means $M_2$ is linearly independent but not yet a generator. Now

$$\begin{pmatrix}0\\0\\1\end{pmatrix} \notin \operatorname{span}(M_2)$$

Thus $M_3 = \{(1,1,0)^T, (0,1,0)^T, (0,0,1)^T\}$ is linearly independent. Further, $\operatorname{span}(M_3) = \mathbb{R}^3$, because

$$\begin{pmatrix}x\\y\\z\end{pmatrix} = x \begin{pmatrix}1\\1\\0\end{pmatrix} + (y - x) \begin{pmatrix}0\\1\\0\end{pmatrix} + z \begin{pmatrix}0\\0\\1\end{pmatrix}$$

Thus $M_3$ is a basis of $\mathbb{R}^3$.

Example (Abstract example: Construction of a basis by adding vectors)

We consider the two-element field and the -vector space of all mappings from to . As seen above, consists of 4 elements, that is, in the above notation . Let again . Then . Thus linearly independent. Now

This means is linearly independent. Further we have that

Thus is a basis of .