Exercises Linear Maps – Serlo

We have compiled some tasks on linear maps here. The proof structures can help you to solve other similar tasks. As a reminder, here is the definition of a linear map:

Definition (Linear map)

Let $f\colon V \to W$ be a mapping between the two $K$-vector spaces $V$ and $W$. We call $f$ a linear map from $V$ to $W$ if the following two properties are satisfied:

  1. Additivity: For all $v, w \in V$ we have that $f(v + w) = f(v) + f(w)$.
  2. Homogeneity: For all $v \in V$ and $\lambda \in K$ we have that $f(\lambda v) = \lambda f(v)$.
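
For illustration, consider the hypothetical map $f\colon \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = 2x + 3y$ (chosen here only as an example, not taken from the exercises below). Both conditions can be checked directly:

\begin{align*}
f\big((x_1,y_1)+(x_2,y_2)\big) &= 2(x_1+x_2) + 3(y_1+y_2) = (2x_1+3y_1) + (2x_2+3y_2) = f(x_1,y_1) + f(x_2,y_2),\\
f\big(\lambda (x,y)\big) &= 2\lambda x + 3\lambda y = \lambda\,(2x+3y) = \lambda\, f(x,y).
\end{align*}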

Showing linearity of a mapping

Linear maps from to

Exercise (Linear map into a field)

Let be defined by . Show that the map is linear.

How to get to the proof? (Linear map into a field)

First you have to show the additivity and then homogeneity of the map.

Solution (Linear map into a field)

Proof step: Additivity

For this step, let and .

Thus is additive.

Proof step: Homogeneity

Let and .

Thus is homogeneous and is linear.

Exercise (Linear map from to )

Show that the map with is linear.

How to get to the proof? (Linear map from to )

You have to show that for and it holds true that

And you have to show that for , it holds true that

Solution (Linear map from to )

Current goal: Additivity

Current goal: Scaling

Exercise (Linearity of the embedding)

Show that for , the map is linear.

Solution (Linearity of the embedding)

Let and , as well as . By definition of the map , we have that

So is linear.
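
As an illustration of an embedding of this kind, one may take (as one possible assumed choice) the inclusion $\iota\colon \mathbb{R}^n \to \mathbb{R}^{n+1}$, $(x_1,\dots,x_n) \mapsto (x_1,\dots,x_n,0)$. For this map, both properties follow componentwise:

\begin{align*}
\iota(x + y) &= (x_1+y_1,\dots,x_n+y_n,0) = (x_1,\dots,x_n,0) + (y_1,\dots,y_n,0) = \iota(x)+\iota(y),\\
\iota(\lambda x) &= (\lambda x_1,\dots,\lambda x_n,0) = \lambda\,(x_1,\dots,x_n,0) = \lambda\,\iota(x).
\end{align*}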


We consider an example of a linear map from to :

with

Exercise (Linearity of )

Show that the map is linear.

Proof (Linearity of )

is an -vector space. In addition, the map is well-defined.

Proof step: additivity

Let and be any vectors from the plane . Then, we have:


Proof step: homogeneity

Let and . Then:

Thus the map is linear.


Important special cases

Exercise (The identity is a linear map)

Let $V$ be a $K$-vector space. Prove that the identity $\operatorname{id}_V\colon V \to V$ with $\operatorname{id}_V(v) = v$ is a linear map.

Proof (The identity is a linear map)

The identity is additive: Let $v, w \in V$. Then $\operatorname{id}_V(v + w) = v + w = \operatorname{id}_V(v) + \operatorname{id}_V(w)$.

The identity is homogeneous: Let $v \in V$ and $\lambda \in K$. Then $\operatorname{id}_V(\lambda v) = \lambda v = \lambda\, \operatorname{id}_V(v)$.


Exercise (The map to zero is a linear map)

Let $V$ and $W$ be two $K$-vector spaces. Show that the map to zero $0\colon V \to W$, which maps all vectors $v \in V$ to the zero vector $0_W \in W$, is linear.

Proof (The map to zero is a linear map)

The map to zero is additive: let $v, w$ be vectors in $V$. Then $0(v + w) = 0_W = 0_W + 0_W = 0(v) + 0(w)$.

The map to zero is homogeneous: let $\lambda \in K$ and let $v \in V$. Then $0(\lambda v) = 0_W = \lambda \cdot 0_W = \lambda \cdot 0(v)$.

Thus, the map to zero is linear.


Linear maps between function spaces

Exercise (Mapping on a function space)

Consider the function space of all functions from to , as well as the map

Show that is linear.

Solution (Mapping on a function space)

The operations on the function space are defined element-wise in each case. That means: for , and we have that and . In particular, this is true for , which implies

and

Thus, we have established linearity.
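
A typical map of this kind (taken here only as an assumed illustration) is the evaluation map $\operatorname{ev}_{x_0}\colon \operatorname{Map}(\mathbb{R},\mathbb{R}) \to \mathbb{R}$, $f \mapsto f(x_0)$, for a fixed $x_0 \in \mathbb{R}$. With the pointwise operations described above, its linearity reads:

\begin{align*}
\operatorname{ev}_{x_0}(f+g) &= (f+g)(x_0) = f(x_0) + g(x_0) = \operatorname{ev}_{x_0}(f) + \operatorname{ev}_{x_0}(g),\\
\operatorname{ev}_{x_0}(\lambda f) &= (\lambda f)(x_0) = \lambda\, f(x_0) = \lambda\, \operatorname{ev}_{x_0}(f).
\end{align*}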

Exercise (The precomposition with a map is linear.)

Let be a vector space, let be sets, and let and be the vector spaces of functions from and to , respectively. Let be arbitrary but fixed. We consider the mapping

Show that is linear.

It is important that you follow the definitions exactly. Note that is a map that assigns to every map from to a map from to . These maps, which are elements of and , respectively, need not themselves be linear, since there is no vector space structure on the sets and .

Summary of proof (The precomposition with a map is linear.)

In order to prove the linearity of , we need to check the two properties again:

  1. is additive: for all
  2. is homogeneous: for all and

So in both cases an equality of maps is to be shown. For this, we evaluate the maps at every element.

Proof (The precomposition with a map is linear.)

Let .

Proof step: additivity

For all we have that

Thus we have shown , i.e., is additive.

Let and .

Proof step: homogeneity

For all we have that

Thus we have shown , i.e., is homogeneous.

Now, additivity and homogeneity of imply that is a linear map.
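
As a compact summary in assumed notation ($F\colon \operatorname{Map}(Y,V) \to \operatorname{Map}(X,V)$, $F(f) = f \circ g$ for the fixed map $g\colon X \to Y$), both properties reduce to evaluating at an arbitrary point $x \in X$:

\begin{align*}
\big(F(f_1+f_2)\big)(x) &= (f_1+f_2)\big(g(x)\big) = f_1\big(g(x)\big) + f_2\big(g(x)\big) = \big(F(f_1) + F(f_2)\big)(x),\\
\big(F(\lambda f)\big)(x) &= (\lambda f)\big(g(x)\big) = \lambda\, f\big(g(x)\big) = \big(\lambda\, F(f)\big)(x).
\end{align*}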


Exercise (Sequence space)

Let be the -vector space of all real-valued sequences. Show that the map

is linear.

How to get to the proof? (Sequence space)

To show linearity, two properties need to be checked:

  1. is additive: for all
  2. is homogeneous: for all and

The vectors and are sequences of real numbers, i.e. they are of the form and with for all .

Proof (Sequence space)

Proof step: additivity

Let and . Then, we have

It follows that is additive.

Proof step: homogeneity

Let and . Then, we have

So is homogeneous.

Thus we have proved that is a -linear map.
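
One concrete map of this type (assumed here purely for illustration) is the left shift $S$ on the sequence space, $S\big((a_1, a_2, a_3, \dots)\big) = (a_2, a_3, a_4, \dots)$. Since the operations on sequences are defined term by term, the proof pattern above gives:

\begin{align*}
S\big((a_n)_n + (b_n)_n\big) &= (a_{n+1} + b_{n+1})_n = (a_{n+1})_n + (b_{n+1})_n = S\big((a_n)_n\big) + S\big((b_n)_n\big),\\
S\big(\lambda\, (a_n)_n\big) &= (\lambda a_{n+1})_n = \lambda\,(a_{n+1})_n = \lambda\, S\big((a_n)_n\big).
\end{align*}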


Construction of a linear map from given values[Bearbeiten]

Exercise (Construction of a linear map)

Let .

Further, consider .

Find a linear map with for all .

How to get to the proof? (Construction of a linear map)

Hint: Use the principle of linear continuation.

Solution (Construction of a linear map)

We see that is a basis of , namely the standard basis.

According to the theorem of linear continuation, we can construct a linear map

defined by

Now we only have to check if is satisfied. It is true that , so

Thus the condition is satisfied for each . The mapping is linear by construction, so we are done.
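
For reference, the principle of linear continuation used here reads as follows (in standard notation, assumed here): if $(b_1,\dots,b_n)$ is a basis of $V$ and $w_1,\dots,w_n \in W$ are arbitrary vectors, then there is exactly one linear map $f\colon V \to W$ with $f(b_i) = w_i$ for all $i$, namely

\[
f\Big(\sum_{i=1}^n \lambda_i b_i\Big) = \sum_{i=1}^n \lambda_i w_i \qquad \text{for all } \lambda_1,\dots,\lambda_n \in K.
\]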

Exercise (Linear maps under some conditions)

Let and . Is there an -linear map that satisfies ?

How to get to the proof? (Linear maps under some conditions)

First, you should check whether the vectors are linearly independent. If this is the case, then is a basis of because of . Using the principle of linear continuation, the existence of such a linear map would follow. So let :

But then and so must also be fulfilled. However, this equation does not only have the "trivial" solution . In fact, the above equation is also satisfied for . Thus, one obtains

For such a map , the relation would then have to hold, which is a contradiction to

Solution (Linear maps under some conditions)

Let us first assume that such a linear map exists. By the following calculation

we see that should hold. But this contradicts the other conditions, because those would imply

So there is no such .


Linear independence of two preimages[Bearbeiten]

Exercise

Let be a linear map and let and be two distinct vectors from , both mapped to a vector with . Prove that and are linearly independent.

How to get to the proof?

We show that the two vectors cannot be linearly dependent. So assume that were linearly dependent. Then there would be a such that . We now map these two dependent vectors into the vector space using the linear map . This yields

Since, by assumption, , this is a contradiction, and our assumption of linear dependence must be false.

Solution

Assume that and were linearly dependent. Then there would be a with and . Since the map is linear, it follows that

Thus

Since by assumption , we must have . But this contradicts our assumption . So the assumption of linear dependence is false, and the vectors are linearly independent.
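
Written out in standard notation (assuming, as in the solution, $f(v_1) = f(v_2) = w \neq 0$ and the dependence $v_2 = \lambda v_1$), the chain of equalities is:

\[
w = f(v_2) = f(\lambda v_1) = \lambda\, f(v_1) = \lambda w
\;\Longrightarrow\; (1-\lambda)\, w = 0
\;\overset{w \neq 0}{\Longrightarrow}\; \lambda = 1
\;\Longrightarrow\; v_2 = v_1,
\]

contradicting the assumption that $v_1$ and $v_2$ are distinct.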

Exercises: Isomorphisms

Exercise (complex -vector spaces)

Let be a finite-dimensional -vector space. Show that (interpreted as -vector spaces).

Solution (complex -vector spaces)

Set . We choose a basis of . Define for all .

We have to show that is an -basis of . Then, . According to a theorem above, we have as -vector spaces.

We now show -linear independence.

Proof step: is -linearly independent

Let and assume that . We substitute the definition for , combine the sums and obtain . By -linear independence of we obtain for all . Thus, for all . This establishes the -linear independence.

Now only one step is missing:

Proof step: is a generator with respect to

Let be arbitrary.

Since is a -basis of , we can find some , such that . We write with for all . Then we obtain

So is inside the -span of . This establishes the assertion.
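
In standard notation (assumed here): if $(b_1,\dots,b_n)$ is a $\mathbb{C}$-basis of $V$, the candidate $\mathbb{R}$-basis is $(b_1,\dots,b_n,\, i b_1,\dots, i b_n)$, because every vector with complex coordinates $z_j = x_j + i y_j$ splits into real coordinates:

\[
v = \sum_{j=1}^n z_j\, b_j = \sum_{j=1}^n (x_j + i y_j)\, b_j = \sum_{j=1}^n x_j\, b_j + \sum_{j=1}^n y_j\, (i b_j), \qquad x_j, y_j \in \mathbb{R}.
\]

Hence $\dim_{\mathbb{R}} V = 2n$ and $V \cong \mathbb{R}^{2n}$ as $\mathbb{R}$-vector spaces.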


Exercise (Isomorphic coordinate spaces)

Let be a field and consider . Prove that holds if and only if .

Solution (Isomorphic coordinate spaces)

We know that for all . We use the theorem above, which states that finite-dimensional vector spaces are isomorphic if and only if their dimensions coincide. So holds if and only if .

Exercise (Isomorphism criteria for endomorphisms)

Let be a field, a finite-dimensional -vector space and a -linear map. Prove that the following three statements are equivalent:

(i) is an isomorphism.

(ii) is injective.

(iii) is surjective.

(Note: For this task, it may be helpful to know the terms kernel and image of a linear map. Using the dimension theorem, this exercise becomes much easier. However, we give a solution here that works without the dimension theorem.)

Solution (Isomorphism criteria for endomorphisms)

(i) $\Rightarrow$ (ii) and (iii): According to the definition of an isomorphism, is bijective, i.e. injective and surjective. Therefore (ii) and (iii) hold.

(ii) $\Rightarrow$ (i): Let be an injective mapping. We need to show that is also surjective. The image of is a subspace of . This can be verified by calculation. We now define a mapping that does the same thing as , except that it will be surjective by definition. This mapping is defined as follows:

The surjectivity comes from the fact that every element can be written as , for a suitable . Moreover, the mapping is injective and linear. This is because already has these two properties. So and are isomorphic. Therefore, and have the same finite dimension. Since is a subspace of , holds. This can be seen by choosing a basis in , for instance the basis given by the vectors . These are also linearly independent in , since . And since and have the same dimension, the are also a basis in . So the two vector spaces and must now be the same, because all elements from them are -linear combinations formed with the . Thus we have shown that is surjective.

(iii) $\Rightarrow$ (i): Now suppose is surjective. We need to show that is also injective. Let be the kernel of the mapping . You may convince yourself by calculation that this kernel is a subspace of . Let be a basis of . We can complete this (small) basis to a (large) basis of by including the additional vectors . We will now show that are linearly independent. So let coefficients be given such that

By linearity of we conclude: . This means that the linear combination

is in the kernel of . But we already know a basis of . Therefore there are coefficients , such that

Because of the linear independence of it now follows that . Therefore, the are linearly independent. Next, we will show that these vectors also form a basis of . To do this, we show that each vector in can be written as a linear combination of the . Let . Because of the surjectivity of , there is a , with . Since the form a basis of , there are coefficients such that

If we now apply to this equation, we get:

Here we used the linearity of . Since the first elements of our basis are in the kernel, their images are . So we get the desired representation of :

Thus we have shown that form a linearly independent generator of . So these vectors form a basis of . Now, if were not , two finite bases of would not contain equally many elements. This cannot be the case. Therefore, , so the kernel is the trivial vector space and is indeed injective.
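
For comparison, here is the shortcut mentioned in the note above, using the dimension formula (rank–nullity theorem) for a linear map $f\colon V \to V$ on a finite-dimensional space:

\[
\dim V = \dim \ker f + \dim \operatorname{im} f,
\qquad\text{so}\qquad
\ker f = \{0\} \iff \dim \operatorname{im} f = \dim V \iff \operatorname{im} f = V.
\]

The left-hand condition is injectivity, the right-hand one surjectivity; each of them alone therefore already forces bijectivity.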

Exercise (Function spaces)

Let be a finite set with elements and let be a field. We have seen that the set of functions from to forms a -vector space, denoted by . Show that .

Solution (Function spaces)

We already know from a theorem above that two finite-dimensional vector spaces are isomorphic if and only if they have the same dimension. So we just need to show that holds.

To show this, we first need a basis of . For this, let be the elements of the set . We define by

We now show that the functions indeed form a basis of .

Proof step: are linearly independent

Let with being the zero function. If we apply this function to any with , then we obtain: . By definition of it follows that

.

Since was arbitrary and must hold for all , it follows that . So we have shown that are linearly independent.

Proof step: generate

Let be arbitrary. We now want to write as a linear combination of . For this we show , i.e., is a linear combination of with coefficients . We now verify that for all . Let be arbitrary. By definition of we obtain:

.

Since equality holds for all , the functions agree at every point and are therefore identical. So we have shown that generate .

Thus we have proved that is a basis of . Since we have basis elements of , it follows that .
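
Written out with basis functions $e_1,\dots,e_n$ defined by $e_i(x_j) = 1$ for $i = j$ and $e_i(x_j) = 0$ for $i \neq j$ (the notation is assumed here), the two proof steps amount to:

\[
\Big(\sum_{i=1}^n \lambda_i e_i\Big)(x_j) = \lambda_j \quad\text{for each } j,
\qquad\text{and}\qquad
f = \sum_{i=1}^n f(x_i)\, e_i \quad\text{for every } f\colon X \to K,
\]

since both sides of the last equation take the value $f(x_j)$ at every point $x_j \in X$.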

Exercises: Images

Exercise (Associating image spaces to figures)

We consider the following four subspaces of the vector space , given as images of the linear maps


Match these four subspaces to the subspaces shown in the figures below.

Solution (Associating image spaces to figures)

First we look for the image of : To find , we can apply a theorem from above: If is a generator of , then holds. We take the standard basis as the generator of . Then

Now we apply to the standard basis

The vectors generate the image of . Moreover, they are linearly independent and thus a basis of . Therefore . So .

Next, we want to find the image of . Instead of the method above, it is also possible to compute the image directly from the definition, which we will demonstrate here.

So the image of is spanned by the vector . Thus .


Now we determine the image of using, for example, the same method as for . That means we apply to the standard basis:

The two vectors are linearly dependent. So it follows that and thus .


Finally, we determine the image of . For this we proceed for example as with .

So the image of is spanned by the vector . Thus is the -axis, so .

Exercise (Image of a matrix)

  1. Consider the matrix and the mapping induced by it. What is the image ?
  2. Now let be any matrix over a field , where denote the columns of . Consider the mapping induced by . Show that holds. So the image of a matrix is the span of its columns.

Solution (Image of a matrix)

Solution sub-exercise 1:

We know that the image of the linear map is a subspace of . Since the -vector space has dimension , a subspace can only have dimension or . In the first case the subspace is the null vector space, in the second case it is already all of . So has only the two subspaces and . Since holds, we have that . Thus, .

Solution sub-exercise 2:

Proof step: "$\subseteq$"

Let . Then, there is some with . We can write as . Plugging this into the equation , we get

Since , we obtain .

Proof step: "$\supseteq$"

Let with for . We want to find with . So let us define . The same calculation as in the first step of the proof then shows
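
The calculation referred to in both proof steps is the column expansion of a matrix–vector product (in the usual notation $A = (a_1 \mid \dots \mid a_n)$ with columns $a_j \in K^m$):

\[
A x = A \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix} = x_1 a_1 + x_2 a_2 + \dots + x_n a_n \in \operatorname{span}(a_1,\dots,a_n).
\]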

Exercise (Surjectivity and dimension of and )

Let and be two finite-dimensional vector spaces. Show that there exists a surjective linear map if and only if .

How to get to the proof? (Surjectivity and dimension of and )

We want to estimate the dimensions of and against each other. The dimension is defined as the cardinality of a basis. That is, if is a basis of and is a basis of , we must show that holds if and only if there exists a surjective linear map. "if and only if" means that we need to establish two directions ($\Rightarrow$ and $\Leftarrow$).

Given a surjective linear map , we must show that the dimension of is at least . Now, bases are maximal linearly independent subsets. That is, to estimate the dimension from below, we need to construct a linearly independent subset with elements. In the image of , we already have a linearly independent subset with elements, namely the basis . Because is surjective, we can lift these to vectors with . Now we need to verify that are linearly independent in . We see this by converting a linear combination via into a linear combination and exploiting the linear independence of .

Conversely, if holds, we must construct a surjective linear map . Following the principle of linear continuation, we can construct the linear map by specifying how acts on a basis of . For this, we need elements of to which the basis vectors can be sent. We have already chosen a basis of above. Therefore, it is convenient to define as follows:

Then the image of is spanned by the vectors . However, these vectors also span all of and thus is surjective.

Solution (Surjectivity and dimension of and )

Proof step: "$\Rightarrow$"

Suppose there is a suitable surjective mapping . We show that the dimension of cannot be larger than the dimension of (this is true for any linear map). Because of the surjectivity of , it follows that .

So let be linearly independent. There exists with for . We show that are also linearly independent: Let with . Then we also have that

By linear independence of , it follows that . So are also linearly independent. Overall, we have shown that

In particular, it holds that a basis of (a maximal linearly independent subset of ) must contain at least as many elements as a basis of , that is, .

Proof step: "$\Leftarrow$"

Assume that . We use that a linear map is already uniquely determined by the images of the basis vectors. Let be a basis of and be a basis of . Define the surjective linear map by

This works, since by assumption, holds. The mapping constructed in this way is surjective, since by construction, . As the image of is a subspace containing these vectors, the subspace generated by them, i.e., , also lies in the image of . Accordingly, holds and is surjective.
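
One way to spell out this construction (in assumed notation: $\dim V = n \geq m = \dim W$, basis $(b_1,\dots,b_n)$ of $V$, basis $(c_1,\dots,c_m)$ of $W$) is, by linear continuation,

\[
f(b_i) = \begin{cases} c_i & \text{for } 1 \leq i \leq m,\\ 0 & \text{for } m < i \leq n, \end{cases}
\qquad\text{so that}\qquad
\operatorname{im} f \supseteq \operatorname{span}(c_1,\dots,c_m) = W.
\]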

Exercises: Kernel

Exercise

We consider the linear map . Determine the kernel of .

Solution

We are looking for vectors such that . Let be any vector in for which is true. We now examine what properties this vector must have. It holds that

So and . From this we conclude . So any vector in the kernel of satisfies the condition . Now take a vector with . Then

We see that . In total

Check your understanding: Can you visualize in the plane? What does the image of look like? How do the kernel and the image relate to each other?

Figure: The kernel of f

We have already seen that

Now we determine the image of by applying to the canonical basis.

So holds. We see that the two vectors are linearly dependent. That is, we can generate the image with only one vector: .

In our example, the image and the kernel of the linear map are straight lines through the origin. The two lines intersect only in the zero vector and together span all of .
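
A map with exactly this qualitative picture (taken here only as an assumed illustration) is $f\colon \mathbb{R}^2 \to \mathbb{R}^2$, $f(x,y) = (x+y,\, x+y)$, for which

\[
\ker f = \operatorname{span}\begin{pmatrix} 1 \\ -1 \end{pmatrix},
\qquad
\operatorname{im} f = \operatorname{span}\begin{pmatrix} 1 \\ 1 \end{pmatrix},
\]

two distinct lines through the origin that intersect only in the zero vector and together span $\mathbb{R}^2$.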

Exercise

Let be a vector space, , and be a nilpotent linear map, i.e., there is some such that

is the zero mapping. Show that holds.

Does the converse also hold, that is, is any linear map with nilpotent?

Solution

Proof step: nilpotent

We prove the statement by contraposition. That is, we show: If , then is not nilpotent.

Let . Then is injective, and as a composition of injective functions, is also injective. By induction, it follows that the function is injective for all . But then also for all . Since the kernel of the zero mapping would be all of , the map cannot be the zero mapping for any . Consequently, is not nilpotent.

Proof step: The converse implication

The converse implication does not hold. There are mappings that are neither injective nor nilpotent. For example, we can define

This mapping is not injective, because . But it is also not nilpotent, because we have for all .
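
One map with the required properties (assumed here as an illustration) is the projection $f\colon \mathbb{R}^2 \to \mathbb{R}^2$, $f(x,y) = (x, 0)$:

\[
f(0,1) = (0,0) = f(0,0) \quad\text{(so $f$ is not injective, i.e. $\ker f \neq \{0\}$)},
\qquad
f^n = f \neq 0 \quad\text{for all } n \geq 1,
\]

since $f$ is idempotent ($f \circ f = f$); hence $f$ is not nilpotent.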

Exercise (Injectivity and dimension of and )

Let and be two finite-dimensional vector spaces. Show that there exists an injective linear map if and only if .

How to get to the proof? (Injectivity and dimension of and )

To prove the equivalence, we need to show two implications. For the first direction, we use that every monomorphism preserves linear independence: if is a basis of , then the vectors are linearly independent. For the converse direction, we need to construct a monomorphism from to using the assumption . To do this, we choose bases in and and then use the principle of linear continuation to define a monomorphism by the images of the basis vectors.

Solution (Injectivity and dimension of and )

Proof step: There is a monomorphism

Let be a monomorphism and a basis of . Then is in particular linearly independent and therefore is linearly independent. Thus, it follows that . So is a necessary condition for the existence of a monomorphism from to .

Proof step: there is a monomorphism

Conversely, in the case we can construct a monomorphism: Let be a basis of and be a basis of . Then . We define a linear map by setting

for all . According to the principle of linear continuation, such a linear map exists and is uniquely determined. We now show that is injective by proving that holds. Let . Because is a basis of , there exist some with

Thus, we get

Since are linearly independent, must hold for all . So it follows for that

We have shown that holds and thus is a monomorphism.