Let $K$ be a field and let $V$ and $W$ be an $n$-dimensional and an $m$-dimensional $K$-vector space, respectively. We have already seen that, after choosing ordered bases, we can represent linear maps from $V$ to $W$ as matrices. So let $B = (v_1, \ldots, v_n)$ be an ordered basis of $V$ and $C = (w_1, \ldots, w_m)$ an ordered basis of $W$.
The space $\operatorname{Hom}(V, W)$ of linear maps from $V$ to $W$ is also a $K$-vector space. The representing matrix of a linear map $f \colon V \to W$ with respect to the bases $B$ and $C$ is an $m \times n$-matrix $M^B_C(f)$. We will now try to transfer the vector space structure of $\operatorname{Hom}(V, W)$ to the space $K^{m \times n}$ of $m \times n$-matrices over $K$.
So we ask the question: Can we find an addition $\oplus$ and a scalar multiplication $\odot$ on $K^{m \times n}$, such that $M^B_C(f + g) = M^B_C(f) \oplus M^B_C(g)$ and $M^B_C(\lambda \cdot f) = \lambda \odot M^B_C(f)$ for all linear maps $f, g \colon V \to W$ and all $\lambda \in K$?
On $K^{m \times n}$, is there perhaps even a vector space structure, such that for all finite-dimensional vector spaces $V$ and $W$ and all ordered bases $B$ of $V$ and $C$ of $W$, the mapping $M^B_C \colon \operatorname{Hom}(V, W) \to K^{m \times n}$ is linear?
It is best to think about these questions yourself. There is an exercise for matrix addition and one for scalar multiplication that can help you with this.
A first step towards answering these questions is the following theorem:

Theorem (Bijective maps induce vector space structures)

Let $V$ be a $K$-vector space, let $M$ be a set and let $f \colon V \to M$ be a bijective map. Then there is exactly one vector space structure on $M$, consisting of an addition $\oplus$ and a scalar multiplication $\odot$, with respect to which $f$ is linear.
Proof (Bijective maps induce vector space structures)
Proof step: Existence
For $x, y \in M$ and $\lambda \in K$ we define

$x \oplus y := f\big(f^{-1}(x) + f^{-1}(y)\big), \qquad \lambda \odot x := f\big(\lambda \cdot f^{-1}(x)\big).$
$M$ is closed under these operations, since the outer application of $f$ always returns us to $M$.
That $M$ forms a vector space with these operations follows directly from the vector space structure of $V$. One can view $M$ simply as a renaming of the elements of $V$.
For example, commutativity of the addition on $M$ follows from commutativity of the addition on $V$ as follows:

$x \oplus y = f\big(f^{-1}(x) + f^{-1}(y)\big) = f\big(f^{-1}(y) + f^{-1}(x)\big) = y \oplus x.$
Associativity of the addition on $M$ also follows from associativity of the addition on $V$:

$(x \oplus y) \oplus z = f\big(f^{-1}(x \oplus y) + f^{-1}(z)\big) = f\big((f^{-1}(x) + f^{-1}(y)) + f^{-1}(z)\big) = f\big(f^{-1}(x) + (f^{-1}(y) + f^{-1}(z))\big) = f\big(f^{-1}(x) + f^{-1}(y \oplus z)\big) = x \oplus (y \oplus z).$
The verification of the other vector space axioms works analogously. Thus we have found a vector space structure on $M$. Let us now show that $f$ is linear with respect to $\oplus$ and $\odot$. Since $f$ is bijective, it suffices to show that the inverse map $f^{-1}$ is linear with respect to $\oplus$ and $\odot$ (see the article on isomorphisms). We have

$f^{-1}(x \oplus y) = f^{-1}\Big(f\big(f^{-1}(x) + f^{-1}(y)\big)\Big) = f^{-1}(x) + f^{-1}(y)$

and

$f^{-1}(\lambda \odot x) = f^{-1}\Big(f\big(\lambda \cdot f^{-1}(x)\big)\Big) = \lambda \cdot f^{-1}(x).$
Thus $f^{-1}$ is linear and hence $f$ is also linear.
Proof step: Uniqueness
Suppose we have a second vector space structure on $M$, with addition $\boxplus$ and scalar multiplication $\boxdot$, such that $f$ is linear. Then $f^{-1}$ is the inverse function of a bijective linear function and hence also linear. Therefore, for all $x, y \in M$ and $\lambda \in K$ we have that
$x \boxplus y = f\big(f^{-1}(x \boxplus y)\big) = f\big(f^{-1}(x) + f^{-1}(y)\big) = x \oplus y,$

$\lambda \boxdot x = f\big(f^{-1}(\lambda \boxdot x)\big) = f\big(\lambda \cdot f^{-1}(x)\big) = \lambda \odot x.$
That is, any vector space structure on $M$ with respect to which $f$ is linear must be our previously defined vector space structure.
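A classical illustration of this theorem, not needed for what follows, is the bijection $\exp \colon \mathbb{R} \to \mathbb{R}_{>0}$: it transfers the vector space structure of $\mathbb{R}$ to the set of positive real numbers. The induced operations are

$x \oplus y = \exp\big(\ln(x) + \ln(y)\big) = x \cdot y, \qquad \lambda \odot x = \exp\big(\lambda \cdot \ln(x)\big) = x^{\lambda},$

so $\mathbb{R}_{>0}$ becomes an $\mathbb{R}$-vector space whose "addition" is the ordinary multiplication of positive numbers.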
We would now like to explicitly determine the vector space structure induced on $K^{m \times n}$.
Let $B = (v_1, \ldots, v_n)$ be a basis of $V$, and $C = (w_1, \ldots, w_m)$ a basis of $W$.
We define the addition $\oplus$ induced by $M^B_C$ on the space of matrices as in the last theorem:
$A \oplus A' := M^B_C\Big(\big(M^B_C\big)^{-1}(A) + \big(M^B_C\big)^{-1}(A')\Big).$
Now let $A = (a_{ij}), A' = (a'_{ij}) \in K^{m \times n}$ be arbitrary and let $f := (M^B_C)^{-1}(A)$ and $f' := (M^B_C)^{-1}(A')$ be the linear maps associated with $A$ and with $A'$.
Then

$A \oplus A' = M^B_C(f + f').$
We now calculate this matrix $M^B_C(f + f') =: (c_{ij})$: in the $j$-th column, the coordinates of $(f + f')(v_j)$ with respect to $C$ must appear, i.e.,

$(f + f')(v_j) = \sum_{i=1}^{m} c_{ij} \cdot w_i$

must hold.
However, by definition of $f + f'$,

$(f + f')(v_j) = f(v_j) + f'(v_j) = \sum_{i=1}^{m} a_{ij} \cdot w_i + \sum_{i=1}^{m} a'_{ij} \cdot w_i = \sum_{i=1}^{m} (a_{ij} + a'_{ij}) \cdot w_i.$
Since the representation of $(f + f')(v_j)$ is unique with respect to $C$, it follows that $c_{ij} = a_{ij} + a'_{ij}$.
That is, the addition induced by $M^B_C$ on $K^{m \times n}$ is the component-wise addition.
Let us now examine the scalar multiplication $\odot$ induced by $M^B_C$.
Let again $A = (a_{ij}) \in K^{m \times n}$ with associated linear map $f = (M^B_C)^{-1}(A)$, let $\lambda \in K$, and consider $\lambda \odot A$. We have that

$\lambda \odot A = M^B_C\Big(\lambda \cdot \big(M^B_C\big)^{-1}(A)\Big) = M^B_C(\lambda \cdot f).$
Furthermore we have

$(\lambda \cdot f)(v_j) = \lambda \cdot f(v_j) = \lambda \cdot \sum_{i=1}^{m} a_{ij} \cdot w_i = \sum_{i=1}^{m} (\lambda \cdot a_{ij}) \cdot w_i.$
Since the $j$-th column of $M^B_C(\lambda \cdot f) =: (c_{ij})$ contains the coordinates of $(\lambda \cdot f)(v_j)$ with respect to $C$, we obtain

$(\lambda \cdot f)(v_j) = \sum_{i=1}^{m} c_{ij} \cdot w_i.$
Thus, from the uniqueness of the representation it follows that $c_{ij} = \lambda \cdot a_{ij}$. We see that the scalar multiplication induced from $\operatorname{Hom}(V, W)$ by $M^B_C$ on $K^{m \times n}$ is the component-wise scalar multiplication.
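In summary, the two induced operations read as follows (this recap just collects the two results above):

$(a_{ij}) \oplus (a'_{ij}) = (a_{ij} + a'_{ij}), \qquad \lambda \odot (a_{ij}) = (\lambda \cdot a_{ij}).$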
We also see here that the induced vector space structure is independent of our choice of the bases $B$ and $C$.
We have just seen: To define a meaningful vector space structure on the matrices, we need to perform the operations component-wise. So we define addition and scalar multiplication as follows:
Definition (Addition of matrices)
Let $K$ be a field and let $A = (a_{ij})$ and $B = (b_{ij})$ be matrices of the same type $m \times n$ over $K$. Then

$A + B := (a_{ij} + b_{ij})_{1 \le i \le m,\, 1 \le j \le n}.$
Written out explicitly in terms of matrices, this definition looks as follows:

$\begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix} + \begin{pmatrix} b_{11} & \cdots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \cdots & b_{mn} \end{pmatrix} = \begin{pmatrix} a_{11} + b_{11} & \cdots & a_{1n} + b_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} + b_{m1} & \cdots & a_{mn} + b_{mn} \end{pmatrix}$
Example (Addition of matrices)
We are in $\mathbb{R}^{2 \times 2}$. For instance,

$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} + \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} = \begin{pmatrix} 1 + 5 & 2 + 6 \\ 3 + 7 & 4 + 8 \end{pmatrix} = \begin{pmatrix} 6 & 8 \\ 10 & 12 \end{pmatrix}.$
Definition (Scalar multiplication of matrices)

Let $K$ be a field, $\lambda \in K$, and let $A = (a_{ij})$ be an $m \times n$-matrix over $K$. Then

$\lambda \cdot A := (\lambda \cdot a_{ij})_{1 \le i \le m,\, 1 \le j \le n}.$

Example (Multiplication by a field element)

As an example we take the matrix $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and as field element the real number $\lambda = 2$. Then

$2 \cdot \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 2 \cdot 1 & 2 \cdot 2 \\ 2 \cdot 3 & 2 \cdot 4 \end{pmatrix} = \begin{pmatrix} 2 & 4 \\ 6 & 8 \end{pmatrix}.$
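As a quick consistency check (using the matrices from the two examples above), the two operations interact as expected; for instance,

$2 \cdot \left( \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} + \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \right) = 2 \cdot \begin{pmatrix} 6 & 8 \\ 10 & 12 \end{pmatrix} = \begin{pmatrix} 12 & 16 \\ 20 & 24 \end{pmatrix} = 2 \cdot \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} + 2 \cdot \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix},$

which is an instance of one of the distributivity axioms verified in the following proof.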
Theorem (Matrices form a vector space)

The set $K^{m \times n}$, equipped with the component-wise addition and scalar multiplication defined above, is a $K$-vector space.

Proof (Matrices form a vector space)
Proof step: Component-wise addition and scalar multiplication form a vector space structure on $K^{m \times n}$

All vector space axioms can be checked entry by entry, where they reduce to the field axioms of $K$. For example, commutativity of the addition: $A + B = (a_{ij} + b_{ij}) = (b_{ij} + a_{ij}) = B + A$.
Proof step: The zero matrix $0 := (0)_{ij}$ is the neutral element of the addition

For every $A = (a_{ij}) \in K^{m \times n}$ we have $A + 0 = (a_{ij} + 0) = (a_{ij}) = A$.
Proof step: Every matrix has an additive inverse

For $A = (a_{ij})$ set $-A := (-a_{ij})$; then $A + (-A) = (a_{ij} - a_{ij}) = (0)_{ij} = 0$.
If we consider matrices just as tables of numbers (without considering them as mapping matrices), we see the following:
Matrices are nothing more than a special way of writing elements of $K^{m \cdot n}$, since $m \times n$-matrices have $m \cdot n$ entries.
Just as in $K^{m \cdot n}$, the vector space structure for matrices is defined component-wise.
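Concretely, for $2 \times 2$-matrices this identification can be taken to be the bijection (the precise way of lining up the entries is a matter of convention)

$K^{2 \times 2} \to K^4, \qquad \begin{pmatrix} a & b \\ c & d \end{pmatrix} \mapsto (a, b, c, d),$

under which the component-wise operations on matrices correspond exactly to those of $K^4$.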
So we alternatively get the following significantly shorter proof: the identification of $K^{m \times n}$ with $K^{m \cdot n}$ is a bijection under which the component-wise operations on matrices correspond exactly to the vector space operations of $K^{m \cdot n}$. By the theorem on bijective maps inducing vector space structures, $K^{m \times n}$ is therefore a $K$-vector space.
Dimension of $K^{m \times n}$
By the above identification of $K^{m \times n}$ with $K^{m \cdot n}$ we obtain a canonical basis of $K^{m \times n}$: for $1 \le i \le m$ and $1 \le j \le n$, let $E_{ij}$ be the matrix with

$(E_{ij})_{kl} := \begin{cases} 1 & \text{if } k = i \text{ and } l = j, \\ 0 & \text{otherwise.} \end{cases}$
Example
In $K^{2 \times 2}$, the basis elements are given by

$E_{11} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad E_{12} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad E_{21} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad E_{22} = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.$
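To see the basis property in this small case, note that every $2 \times 2$-matrix decomposes uniquely as a linear combination of these four matrices:

$\begin{pmatrix} a & b \\ c & d \end{pmatrix} = a \cdot E_{11} + b \cdot E_{12} + c \cdot E_{21} + d \cdot E_{22}.$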
Thus, $K^{m \times n}$ is an $(m \cdot n)$-dimensional $K$-vector space. We constructed the vector space structure on $K^{m \times n}$ such that for an $n$-dimensional vector space $V$ and an $m$-dimensional vector space $W$ with bases $B$ and $C$, respectively, the map

$M^B_C \colon \operatorname{Hom}(V, W) \to K^{m \times n}$
is a linear isomorphism. So $\operatorname{Hom}(V, W)$ is also an $(m \cdot n)$-dimensional $K$-vector space. This result can also be found in the article vector space of a linear map.
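As a numerical illustration of this dimension formula: the space $\operatorname{Hom}(\mathbb{R}^3, \mathbb{R}^2)$ is isomorphic to $\mathbb{R}^{2 \times 3}$ and therefore

$\dim \operatorname{Hom}(\mathbb{R}^3, \mathbb{R}^2) = 2 \cdot 3 = 6.$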