# Quotient space – Serlo

In this article we consider the quotient space ${\displaystyle V/U}$ of a ${\displaystyle K}$-vector space ${\displaystyle V}$ with respect to a subspace ${\displaystyle U}$. The quotient space ${\displaystyle V/U}$ is a vector space in which we can compute as in ${\displaystyle V}$, up to addition of arbitrary elements of ${\displaystyle U}$.

## Introduction

### Computations with solutions of a linear system

We consider the matrix

${\displaystyle A={\begin{pmatrix}1&1&0\\-1&-1&1\end{pmatrix}}.}$

We now want to solve the linear system of equations ${\displaystyle Ax=b}$ for different vectors ${\displaystyle b\in \mathbb {R} ^{2}}$. For example, taking ${\displaystyle b_{1}=(3,-7)^{T}}$, we get a solution ${\displaystyle x_{1}=(1,2,-4)^{T}}$, and for ${\displaystyle b_{2}=(2,1)^{T}}$, we get a solution ${\displaystyle x_{2}=(2,0,3)^{T}}$. That is, ${\displaystyle Ax_{1}=b_{1}}$ and ${\displaystyle Ax_{2}=b_{2}}$ hold. What then is a solution for ${\displaystyle Ax=b_{1}+b_{2}}$? To find out, we can use the linearity of ${\displaystyle A}$: We just have to add our previous solutions, since ${\displaystyle A(x_{1}+x_{2})=Ax_{1}+Ax_{2}=b_{1}+b_{2}}$. Thus, a solution to ${\displaystyle Ax=b_{1}+b_{2}}$ is given by ${\displaystyle x_{1}+x_{2}=(3,2,-1)^{T}}$.

The solution to the above system of equations is not unique. For instance, the system ${\displaystyle Ax=b_{1}}$ is also solved by ${\displaystyle x_{1}'=(2,1,-4)^{T}}$ and the system ${\displaystyle Ax=b_{2}}$ is also solved by ${\displaystyle x_{2}'=(-1,3,3)^{T}}$. The solutions ${\displaystyle x_{1}}$ and ${\displaystyle x_{1}'}$, as well as ${\displaystyle x_{2}}$ and ${\displaystyle x_{2}'}$ differ from each other. Their differences are ${\displaystyle x_{1}=x_{1}'+(-1,1,0)^{T}}$ and ${\displaystyle x_{2}=x_{2}'+(3,-3,0)^{T}}$. Both ${\displaystyle (-1,1,0)^{T}}$ and ${\displaystyle (3,-3,0)^{T}}$ are solutions to the (homogeneous) linear system ${\displaystyle Ax=0}$. That is, they lie in the kernel of ${\displaystyle A}$.
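The computations above can be verified numerically. Here is a small sketch using NumPy (an external library, our choice for illustration), with the matrix and solution vectors from the text:

```python
import numpy as np

A = np.array([[1, 1, 0],
              [-1, -1, 1]])

# Particular solutions taken from the text
x1 = np.array([1, 2, -4])    # solves A x = b1
x1p = np.array([2, 1, -4])   # a second solution of A x = b1
x2 = np.array([2, 0, 3])     # solves A x = b2
x2p = np.array([-1, 3, 3])   # a second solution of A x = b2

# Linearity: the sum of solutions solves the system with right-hand side b1 + b2
assert np.array_equal(A @ (x1 + x2), A @ x1 + A @ x2)

# Two solutions of the same system differ by an element of ker(A)
for d in (x1 - x1p, x2 - x2p):
    assert np.array_equal(A @ d, np.array([0, 0]))
```

The loop at the end confirms that the differences ${\displaystyle (-1,1,0)^{T}}$ and ${\displaystyle (3,-3,0)^{T}}$ indeed lie in the kernel of ${\displaystyle A}$.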

This "kernel property" holds in general: if ${\displaystyle x}$ and ${\displaystyle x'}$ are two solutions of ${\displaystyle Ax=b}$, they differ exactly by an element of the kernel of ${\displaystyle A}$, because ${\displaystyle A(x-x')=Ax-Ax'=b-b=0}$. Since the kernel of ${\displaystyle A}$ is important, we give it the separate name ${\displaystyle U}$ in the following. In particular, whenever we have two solutions ${\displaystyle x_{1}+x_{2}}$ and ${\displaystyle x_{1}'+x_{2}'}$ of ${\displaystyle Ax=b_{1}+b_{2}}$, their difference ${\displaystyle (x_{1}+x_{2})-(x_{1}'+x_{2}')}$ lies in the kernel ${\displaystyle U}$. So once a single solution is found, the kernel can be used to find all solutions of the system. Put differently, we can consider two vectors whose difference lies in ${\displaystyle U}$ as equivalent, since if one of them solves ${\displaystyle Ax=b}$, then so does the other.

For scalar multiplication by ${\displaystyle \lambda \in \mathbb {R} }$, we can use linearity of ${\displaystyle A}$ again: We have a solution ${\displaystyle x_{1}}$ of ${\displaystyle Ax=b_{1}}$ and want to solve ${\displaystyle Ax=\lambda b_{1}}$ without recalculating. We have ${\displaystyle A(\lambda x_{1})=\lambda (Ax_{1})=\lambda b_{1}}$, so ${\displaystyle \lambda x_{1}}$ is a solution to ${\displaystyle Ax=\lambda b_{1}}$. This also works for the second solution ${\displaystyle x_{1}'}$: ${\displaystyle x=\lambda x_{1}'}$ is a solution of ${\displaystyle Ax=\lambda b_{1}}$. Again, the difference of the two (equivalent) solutions ${\displaystyle \lambda x_{1}}$ and ${\displaystyle \lambda x_{1}'}$ lies in ${\displaystyle U}$. So we can scale solutions of linear systems to obtain solutions of the scaled systems, and while scaling, the differences stay in ${\displaystyle \ker(A)=U}$, so both solutions remain equivalent. A different way to say that two vectors are equivalent is to say that they are equal modulo ${\displaystyle U}$, meaning they differ only by some vector in ${\displaystyle U}$. For example, the solutions ${\displaystyle x_{1}+x_{2}}$ and ${\displaystyle x_{1}'+x_{2}'}$ of the system of equations ${\displaystyle Ax=b_{1}+b_{2}}$ are equal modulo ${\displaystyle U}$, since ${\displaystyle (x_{1}+x_{2})-(x_{1}'+x_{2}')\in U}$. When calculating with solutions of systems of linear equations, we therefore calculate modulo ${\displaystyle U}$.

### Construction of the quotient space

In the example, we made calculations in a vector space ${\displaystyle V}$, but only looked at the results up to differences in a subspace ${\displaystyle U}$. That is, we considered two vectors ${\displaystyle v}$ and ${\displaystyle v'}$ in ${\displaystyle V}$ as equivalent whenever ${\displaystyle v-v'\in U}$. To formalise these "calculations up to some element in ${\displaystyle U}$", we identify vectors using an equivalence relation ${\displaystyle \sim }$ defined by

${\displaystyle v\sim v'\quad :\iff \quad v-v'\in U.}$

This is exactly the relation we used to define cosets of a subspace; there, we also checked that ${\displaystyle \sim }$ is an equivalence relation. The set of all equivalence classes is denoted by ${\displaystyle V/U}$.

We will now show that on ${\displaystyle V/U}$, we can define a natural vector space structure. To do so, we introduce an addition ${\displaystyle \boxplus }$ and a scalar multiplication ${\displaystyle \boxdot }$ on ${\displaystyle V/U}$: For ${\displaystyle v+U,w+U\in V/U}$ and ${\displaystyle \lambda \in K}$ we define

{\displaystyle {\begin{aligned}(v+U)\boxplus (w+U)&:=(v+w)+U\\\lambda \boxdot (v+U)&:=(\lambda \cdot v)+U\end{aligned}}}

These definitions make use of representatives: we take one element from each of the cosets involved to define ${\displaystyle \boxplus }$ and ${\displaystyle \boxdot }$. We therefore still have to show that the definitions are independent of the chosen representatives and thus make sense; we give this proof further below. The property that a mathematical definition makes sense is also called well-definedness.

We also need to show that ${\displaystyle V/U}$ is a vector space with this addition and scalar multiplication, which we will do below.

## Definition

In the previous section, we considered what a vector space ${\displaystyle V/U}$ might look like, in which we can calculate modulo ${\displaystyle U}$. The elements of ${\displaystyle V/U}$ are the cosets ${\displaystyle v+U}$. We want to define the vector space structure using representatives. Further below, we then show that the definition makes mathematical sense, that is, that the vector space structure is well-defined.

To distinguish addition and scalar multiplication on ${\displaystyle V/U}$ from that on ${\displaystyle V}$, we refer to the operations on ${\displaystyle V/U}$ as "${\displaystyle \boxplus }$" and "${\displaystyle \boxdot }$" in this article. Other articles and sources mostly use "${\displaystyle +}$" and "${\displaystyle \cdot }$" for the vector space operations.

Definition (Quotient space)

Let ${\displaystyle V}$ be a ${\displaystyle K}$-vector space and ${\displaystyle U\subseteq V}$ be a subspace of ${\displaystyle V}$ with

${\displaystyle V/U=\{v+U\mid v\in V\}}$

being the set of cosets of ${\displaystyle U}$ in ${\displaystyle V}$. Further, let ${\displaystyle v,w\in V}$.

We define the addition in ${\displaystyle V/U}$ by:

{\displaystyle {\begin{aligned}&\boxplus :(V/U\times V/U)\to V/U\\&(v+U)\boxplus (w+U):=(v+w)+U.\end{aligned}}}

Analogously, we define scalar multiplication on ${\displaystyle V/U}$ as:

{\displaystyle {\begin{aligned}&\boxdot :(K\times V/U)\to V/U\\&\lambda \boxdot (v+U):=(\lambda \cdot v)+U.\end{aligned}}}

### Explanation of the definition

A short explanation concerning the brackets appearing in ${\displaystyle (v+U)\boxplus (w+U):=(v+w)+U}$ and ${\displaystyle \lambda \boxdot (v+U):=(\lambda \cdot v)+U}$: To define the addition ${\displaystyle \boxplus }$ in ${\displaystyle V/U}$, we need two vectors from ${\displaystyle V/U}$. Vectors in ${\displaystyle V/U}$ are cosets, so ${\displaystyle (v+U)}$ and ${\displaystyle (w+U)}$ denote cosets given by ${\displaystyle v,w\in V}$. The expression ${\displaystyle (v+w)+U}$ is also a coset, namely the one associated with ${\displaystyle v+w}$:

${\displaystyle {\color {OliveGreen}\underbrace {{\color {Blue}\underbrace {v+U} _{\text{coset}}}\boxplus {\color {Blue}\underbrace {w+U} _{\text{coset}}}} _{{\text{addition in }}V/U}}={\color {Blue}\underbrace {{\color {OliveGreen}\underbrace {(v+w)} _{{\text{addition in }}V}}+U} _{\text{coset}}}}$

The scalar multiplication works similarly: For a scalar ${\displaystyle \lambda \in K}$ and a coset ${\displaystyle v+U}$ with ${\displaystyle v\in V}$ we want to define ${\displaystyle \lambda \boxdot (v+U)}$. For this we first calculate the scalar product ${\displaystyle \lambda \cdot v}$ in ${\displaystyle V}$ and then turn to the associated coset ${\displaystyle (\lambda \cdot v)+U}$:

${\displaystyle {\color {OliveGreen}\underbrace {\lambda \boxdot {\color {Blue}\underbrace {(v+U)} _{\text{coset}}}} _{{\text{scalar multiplication in }}V/U}}={\color {Blue}\underbrace {{\color {OliveGreen}\underbrace {(\lambda \cdot v)} _{{\text{scalar multiplication in }}V}}+U} _{\text{coset}}}}$

So we first execute the addition or scalar multiplication of the representatives in ${\displaystyle V}$ and then turn to the coset to get the addition or scalar multiplication on ${\displaystyle V/U}$. Mathematically, we also say that the vector space structure on ${\displaystyle V}$ "induces" the structure on ${\displaystyle V/U}$.
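These induced operations can be sketched in code. The following is a minimal Python illustration in the finite space ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$ with ${\displaystyle U}$ spanned by ${\displaystyle (1,1)^{T}}$ (this space reappears in the examples below; the class name `Coset` and the canonical-representative trick are our own implementation choices, not part of the definition):

```python
# Cosets of U = span{(1,1)} in (Z/3Z)^2, with the operations defined
# on representatives, exactly as in the definition above.
U = {(0, 0), (1, 1), (2, 2)}  # the subspace spanned by (1,1) over Z/3Z

class Coset:
    def __init__(self, v):
        # Store the whole coset v + U as a frozen set, so that equality
        # of cosets does not depend on the chosen representative.
        self.elements = frozenset(((v[0] + u[0]) % 3, (v[1] + u[1]) % 3)
                                  for u in U)
        self.rep = min(self.elements)  # a canonical representative

    def __add__(self, other):          # (v+U) ⊞ (w+U) := (v+w) + U
        v, w = self.rep, other.rep
        return Coset(((v[0] + w[0]) % 3, (v[1] + w[1]) % 3))

    def __rmul__(self, lam):           # λ ⊡ (v+U) := (λ·v) + U
        v = self.rep
        return Coset(((lam * v[0]) % 3, (lam * v[1]) % 3))

    def __eq__(self, other):
        return self.elements == other.elements

    def __hash__(self):
        return hash(self.elements)

# The quotient space (Z/3Z)^2 / U has exactly 9 / 3 = 3 elements:
quotient = {Coset((a, b)) for a in range(3) for b in range(3)}
assert len(quotient) == 3
```

Because each `Coset` stores the whole set ${\displaystyle v+U}$, equality of cosets does not depend on which representative was used to build them; this mirrors the well-definedness shown in the next section.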

### Well-defined operations in the quotient space

We want to check whether the operations ${\displaystyle \boxplus }$ and ${\displaystyle \boxdot }$ are independent of the choice of representatives, i.e., whether they are well-defined.

Theorem (Well-defined operations in the quotient space)

Let ${\displaystyle V}$ be a ${\displaystyle K}$-vector space and ${\displaystyle U\subseteq V}$ a subspace. Then addition and scalar multiplication on ${\displaystyle V/U}$ are well-defined.

Proof (Well-defined operations in the quotient space)

For well-definedness, we need to show the following: If in the definition we plug in different representatives of the coset(s) on the left-hand side, we end up with the same coset on the right-hand side. Mathematically, we have to show:

• For ${\displaystyle \boxplus }$: If ${\displaystyle v+U=v'+U}$ and ${\displaystyle w+U=w'+U}$, then ${\displaystyle (v+w)+U=(v'+w')+U}$.
• For ${\displaystyle \boxdot }$: If ${\displaystyle v+U=v'+U}$ and ${\displaystyle \lambda \in K}$, then ${\displaystyle (\lambda \cdot v)+U=(\lambda \cdot v')+U}$.

Proof step: Well-defined addition

By definition of a coset, we have to show that ${\displaystyle (v+w)-(v'+w')\in U}$ holds. Since ${\displaystyle (v+w)-(v'+w')=v+w-v'-w'=(v-v')+(w-w')}$, this is equivalent to ${\displaystyle (v-v')+(w-w')\in U}$. Now ${\displaystyle v}$ and ${\displaystyle v'}$, as well as ${\displaystyle w}$ and ${\displaystyle w'}$, each represent the same coset modulo ${\displaystyle U}$. Thus, ${\displaystyle v-v',w-w'\in U}$. Since ${\displaystyle U}$ is a subspace of ${\displaystyle V}$, it follows that ${\displaystyle (v-v')+(w-w')\in U}$. So the addition ${\displaystyle \boxplus }$ is indeed independent of the choice of representatives.

Proof step: Well-defined scalar multiplication

Well-definedness of the scalar multiplication ${\displaystyle \boxdot }$ can be seen in the same way: In the above notation, we have to show that ${\displaystyle \lambda \cdot v-\lambda \cdot v'=\lambda \cdot (v-v')\in U}$. Since ${\displaystyle v}$ and ${\displaystyle v'}$ represent the same coset modulo ${\displaystyle U}$, we have ${\displaystyle v-v'\in U}$. And since ${\displaystyle U}$ is a subspace, we also have ${\displaystyle \lambda \cdot (v-v')\in U}$. So the scalar multiplication ${\displaystyle \boxdot }$ is also independent of the choice of representative.
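This representative independence can also be observed numerically. A small self-contained Python sketch in the space ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$ with ${\displaystyle U}$ spanned by ${\displaystyle (1,1)^{T}}$ (the concrete vectors are our own illustrative choice):

```python
# U = span{(1,1)} in (Z/3Z)^2; a coset v + U is computed as a set.
U = [(0, 0), (1, 1), (2, 2)]

def coset(v):
    return frozenset(((v[0] + u[0]) % 3, (v[1] + u[1]) % 3) for u in U)

def add(v, w):
    return ((v[0] + w[0]) % 3, (v[1] + w[1]) % 3)

# v and v' represent the same coset, as do w and w':
v, vp = (1, 0), (2, 1)
w, wp = (0, 1), (1, 2)
assert coset(v) == coset(vp) and coset(w) == coset(wp)

# The sums of the different representatives land in the same coset:
assert coset(add(v, w)) == coset(add(vp, wp))
```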

### Establishing the vector space axioms

We show that the quotient space is again a ${\displaystyle K}$-vector space by taking the axioms valid for ${\displaystyle V}$ and inferring those axioms of ${\displaystyle V/U}$. Hence, taking quotient spaces is a way to generate new vector spaces from an existing ${\displaystyle K}$-vector space, just like taking subspaces.

Exercise (Proof of the vector space axioms in quotient space)

Let ${\displaystyle V}$ be a ${\displaystyle K}$-vector space and ${\displaystyle U\subseteq V}$ a subspace of it. Then ${\displaystyle V/U}$ with the operations ${\displaystyle \boxplus }$ and ${\displaystyle \boxdot }$ defined above is also a ${\displaystyle K}$-vector space.

Solution (Proof of the vector space axioms in quotient space)

Proof step: Establishing properties of a commutative additive group (also called an Abelian group).

We first consider the properties of addition. For this let ${\displaystyle v,w,z\in V}$.

1. Associativity:

We trace associativity in ${\displaystyle V/U}$ back to associativity in ${\displaystyle V}$:

{\displaystyle {\begin{aligned}(v+U)\boxplus ((w+U)\boxplus (z+U))&=(v+U)\boxplus ((w+z)+U)\\[0.3em]&=(v+(w+z))+U\\[0.3em]&\quad {\color {OliveGreen}\left\downarrow \ {\text{associativity in the vector space }}V\right.}\\[0.3em]&=((v+w)+z)+U\\[0.3em]&=((v+w)+U)\boxplus (z+U)\\[0.3em]&=((v+U)\boxplus (w+U))\boxplus (z+U)\end{aligned}}}

2. Commutativity

We also trace commutativity back to commutativity in ${\displaystyle V}$

{\displaystyle {\begin{aligned}(v+U)\boxplus (w+U)&=(v+w)+U\\[0.3em]&\quad {\color {OliveGreen}\left\downarrow \ {\text{commutativity in the vector space }}V\right.}\\[0.3em]&=(w+v)+U\\[0.3em]&=(w+U)\boxplus (v+U)\end{aligned}}}

3. Existence of a neutral element

Since we are considering displacements of ${\displaystyle U}$, the coset ${\displaystyle 0+U}$ should be the neutral element with respect to addition. We can verify this by using that ${\displaystyle 0}$ is the neutral element in ${\displaystyle V}$:

${\displaystyle (v+U)\boxplus (0+U)=(v+0)+U=v+U}$

4. Existence of an inverse

We consider the coset ${\displaystyle v+U}$. For the inverse ${\displaystyle v'+U}$ of ${\displaystyle v+U}$, we need that

${\displaystyle (v+U)\boxplus (v'+U)=0+U.}$

We trace the inverse of ${\displaystyle v+U}$ back to the inverse in ${\displaystyle V}$: Let ${\displaystyle v}$ be a representative of ${\displaystyle v+U}$ and ${\displaystyle -v}$ its inverse in ${\displaystyle V}$. Then,

${\displaystyle (v+U)\boxplus ((-v)+U)=(v-v)+U=0+U.}$

Thus, the element inverse to ${\displaystyle v+U}$ is ${\displaystyle (-v)+U}$: the addition of an element with its inverse indeed yields the neutral element ${\displaystyle 0+U}$.

Proof step: Distributive laws

1. Scalar Distributive Law

Multiplying a vector of the quotient space (i.e., a coset) by a sum of scalars yields:

{\displaystyle {\begin{aligned}(\lambda +\mu )\boxdot (v+U)&=((\lambda +\mu )\cdot v)+U\\[0.3em]&\quad {\color {OliveGreen}\left\downarrow \ {\text{distributive law in the vector space }}V\right.}\\[0.3em]&=(\lambda \cdot v+\mu \cdot v)+U\\[0.3em]&=((\lambda \cdot v)+U)\boxplus ((\mu \cdot v)+U)\\[0.3em]&=\lambda \boxdot (v+U)\boxplus \mu \boxdot (v+U)\end{aligned}}}

2. Vector Distributive Law

Likewise, we can show that the distributive law also holds for the multiplication of a scalar with the sum of two vectors (i.e., with two cosets in the quotient space):

{\displaystyle {\begin{aligned}\lambda \boxdot ((v+U)\boxplus (w+U))&=\lambda \boxdot ((v+w)+U)\\[0.3em]&=(\lambda \cdot (v+w))+U\\[0.3em]&\quad {\color {OliveGreen}\left\downarrow \ {\text{distributive law in the vector space }}V\right.}\\[0.3em]&=(\lambda \cdot v+\lambda \cdot w)+U\\[0.3em]&=((\lambda \cdot v)+U)\boxplus ((\lambda \cdot w)+U)\\[0.3em]&=\lambda \boxdot (v+U)\boxplus \lambda \boxdot (w+U)\end{aligned}}}

Proof step: Properties of scalar multiplication

We now show that the scalar multiplication of cosets also satisfies the corresponding vector space axioms. Again, we trace properties in the quotient space back to the corresponding properties in ${\displaystyle V}$. To this end, let ${\displaystyle \lambda ,\mu \in K}$ and ${\displaystyle v\in V}$. Then the following axioms hold:

1. Associative law for scalars

The scalar multiplication is associative, since

{\displaystyle {\begin{aligned}(\lambda \cdot \mu )\boxdot (v+U)&=((\lambda \cdot \mu )\cdot v)+U\\&\quad {\color {OliveGreen}\left\downarrow \ {\text{associative law in the vector space }}V\right.}\\&=(\lambda \cdot (\mu \cdot v))+U\\&=\lambda \boxdot ((\mu \cdot v)+U)\\&=\lambda \boxdot (\mu \boxdot (v+U))\end{aligned}}}

2. Neutral element of scalar multiplication

We want to prove that ${\displaystyle 1\in K}$ is also the neutral element for ${\displaystyle \boxdot }$. That is, ${\displaystyle 1\boxdot (v+U)=v+U}$ must hold. Since ${\displaystyle 1}$ is the neutral element of scalar multiplication in ${\displaystyle V}$, i.e., ${\displaystyle 1\cdot v=v}$, we get

${\displaystyle 1\boxdot (v+U)=(1\cdot v)+U=v+U}$

So ${\displaystyle 1\in K}$ is the neutral element of scalar multiplication and ${\displaystyle V/U}$ is indeed a ${\displaystyle K}$-vector space.

## Examples

### Satellite images

Example (Satellite images)

We imagine that we are standing on a vantage point in New York City from which we are looking at the skyline. In this situation, we will see the city in three dimensions. So objects (e.g. skyscrapers) can be identified with vectors in ${\displaystyle V=\mathbb {R} ^{3}}$. However, there are also situations where we want to look at the city in only two dimensions, for instance, when taking a virtual tour using a map or a satellite image of New York.

If we want to create a map or a satellite image, we need to "project" information from three dimensions into two dimensions. This process can mathematically be described by a reduction to some quotient space ${\displaystyle V/U}$.

For example, let us take a look at the edge of a skyscraper. On the oblique image, we see that an edge reaches about 600 feet up into the air. However, on the satellite image, the edge is just displayed as a dot. So all points (= vectors) on the edge are identified with this one dot. The dot is then a coset in ${\displaystyle V/U}$. The subspace ${\displaystyle U}$ is given by the (1-dimensional) ${\displaystyle x_{3}}$-axis, since after adding a vector on the ${\displaystyle x_{3}}$-axis (i.e., shifting a point up or down), we end up at the same dot on the satellite image. The space ${\displaystyle V/U}$ contains all dots, i.e., it corresponds to the (2-dimensional) map.

### Example in finite vector space

Now we turn to a more abstract mathematical example that involves some donuts.

Example (Quotient space in ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$)

In the vector space article, we considered the abstract mathematical set ${\displaystyle (\mathbb {Z} /5\mathbb {Z} )^{2}}$, which can be seen as lattice points on a torus (= surface of a donut). Using the same method, we can think of ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$ as 9 lattice points on a torus as well:

We obtain a torus from a square by gluing its edges as follows:

In other words, the surface of a donut is the same as a square where, if you walk out over one edge, you immediately re-enter from the opposite edge. Thus we may visualize ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$ as follows: On the torus, we draw nine points in lattice form. We then get the following picture:

The subspace ${\displaystyle U\subseteq (\mathbb {Z} /3\mathbb {Z} )^{2}}$ generated by ${\displaystyle (1,1)^{T}}$ corresponds to a discrete straight line. We put this line through the above points.

We now have points lying on two different sides directly next to our line. Some points are lying directly to the right of the line; that is, they are displaced from the straight line by ${\displaystyle +(1,0)^{T}}$. Some other points lie directly to the left of our line; that is, they are displaced by ${\displaystyle -(1,0)^{T}}$. In the picture it looks like this:

The vector space ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$ also allows for "adding the points on the donut": Here, we get the following relations:

1. If we add a point on the left and a point on the right of the line, we get a point on the line: For example, ${\displaystyle {\color {Red}(0,1)^{T}}+{\color {Blue}(1,0)^{T}}={\color {Purple}(1,1)^{T}}}$.
2. If we add two points on the left of the line, we get a point on the right: For example, we have ${\displaystyle {\color {Red}(0,1)^{T}}+{\color {Red}(2,0)^{T}}={\color {Blue}(2,1)^{T}}}$.
3. If we add two points on the right of the line, we get a point on the left: For example, ${\displaystyle {\color {Blue}(0,2)^{T}}+{\color {Blue}(1,0)^{T}}={\color {Red}(1,2)^{T}}}$.

If we form the quotient space ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}/U}$ (with 3 cosets), we see that two points have the same position with respect to the line (on/left/right) if they are in the same coset. Each coset then consists of 3 points. Furthermore, our addition relations above just represent the addition on the quotient space ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}/U}$.
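The three addition relations can be checked by brute force over the cosets. A short Python sketch (the names `on_line`, `right`, and `left` follow the text's picture):

```python
# The three cosets of U = span{(1,1)} in (Z/3Z)^2
on_line = {(0, 0), (1, 1), (2, 2)}   # U itself
right   = {(1, 0), (2, 1), (0, 2)}   # U shifted by +(1,0)
left    = {(2, 0), (0, 1), (1, 2)}   # U shifted by -(1,0) = +(2,0)

def add(v, w):
    return ((v[0] + w[0]) % 3, (v[1] + w[1]) % 3)

# 1. left + right lands on the line
assert all(add(v, w) in on_line for v in left for w in right)
# 2. left + left lands on the right
assert all(add(v, w) in right for v in left for w in left)
# 3. right + right lands on the left
assert all(add(v, w) in left for v in right for w in right)
```

This is exactly the addition table of the three-element quotient space ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}/U}$.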

## Relationship between quotient space and complement

In the quotient space ${\displaystyle V/U}$ we calculate with vectors in ${\displaystyle V}$ up to arbitrary modifications from ${\displaystyle U}$. We know another construction that can be interpreted similarly: The complement. A complement of a subspace ${\displaystyle U\subseteq V}$ is a subspace ${\displaystyle W\subseteq V}$ such that ${\displaystyle U\oplus W=V}$. Here ${\displaystyle U\oplus W}$ denotes the inner direct sum of ${\displaystyle U}$ and ${\displaystyle W}$ in ${\displaystyle V}$, that is, ${\displaystyle U\oplus W=U+W}$ and ${\displaystyle U\cap W=\{0\}}$. A vector ${\displaystyle v\in V}$ can then be decomposed uniquely as ${\displaystyle v=u+w}$, where ${\displaystyle u\in U}$ and ${\displaystyle w\in W}$. But the complement itself is then not unique! There can be different subspaces ${\displaystyle W,W'\subseteq V}$, with ${\displaystyle U\oplus W=V=U\oplus W'}$.

For the quotient space, we "forget" the part of ${\displaystyle v}$ that is in ${\displaystyle U}$ by identifying ${\displaystyle v}$ with the coset ${\displaystyle v+U}$:

${\displaystyle V\to V/U,\quad v\mapsto v+U}$

If ${\displaystyle W}$ is a complement of ${\displaystyle U}$ and ${\displaystyle v=u+w}$ with unique ${\displaystyle u\in U}$ and ${\displaystyle w\in W}$, then we can analogously forget the ${\displaystyle U}$-part by mapping ${\displaystyle v}$ to its ${\displaystyle W}$-part ${\displaystyle w}$:

${\displaystyle V=U\oplus W\to W,\quad v=u+w\mapsto w}$

So ${\displaystyle V/U}$ and the complement ${\displaystyle W}$ appear to be similar. Can we identify the two vector spaces ${\displaystyle V/U}$ and ${\displaystyle W}$, i.e., are they isomorphic? Yes, they are, as we prove in the following theorem.

Theorem (Isomorphism between complement and quotient space)

Let ${\displaystyle W}$ be a complement of ${\displaystyle U}$ in ${\displaystyle V}$. Then the projection ${\displaystyle f\colon W\to V/U;w\mapsto w+U}$ is a linear isomorphism between ${\displaystyle W}$ and the quotient space ${\displaystyle V/U}$.

Proof (Isomorphism between complement and quotient space)

We want to show that ${\displaystyle f}$ is linear, i.e., compatible with addition and scalar multiplication, and bijective.

Proof step: Linearity of ${\displaystyle f}$

Since ${\displaystyle W\subseteq V}$ is a subspace and addition and scalar multiplication on ${\displaystyle V/U}$ are defined via representatives, ${\displaystyle f}$ is compatible with addition and scalar multiplication. That is, for ${\displaystyle \lambda \in K}$ and ${\displaystyle v,w\in W}$, we have

${\displaystyle f(v+w)=(v+w)+U=(v+U)\boxplus (w+U)=f(v)\boxplus f(w)}$

and

${\displaystyle f(\lambda \cdot v)=(\lambda \cdot v)+U=\lambda \boxdot (v+U)=\lambda \boxdot f(v).}$

Proof step: Surjectivity of ${\displaystyle f}$

Let ${\displaystyle v+U\in V/U}$. Since ${\displaystyle W}$ is a complement of ${\displaystyle U}$, we find ${\displaystyle u\in U}$ and ${\displaystyle w\in W}$ with ${\displaystyle v=u+w}$. Then

${\displaystyle f(w)=w+U{\overset {(*)}{=}}(w+u)+U=v+U,}$

where we used in ${\displaystyle (*)}$ that ${\displaystyle (w+u)-w=u\in U}$ and thus ${\displaystyle w+U=(w+u)+U}$ holds. So ${\displaystyle f}$ is surjective.

Proof step: Injectivity of ${\displaystyle f}$

We show ${\displaystyle \ker(f)=\{0\}}$. Let ${\displaystyle w\in \ker(f)}$, i.e. ${\displaystyle w\in W}$ with ${\displaystyle f(w)=0+U}$. So ${\displaystyle w+U=f(w)=0+U}$ holds. Thus ${\displaystyle w=w-0\in U}$. Since ${\displaystyle W}$ is a complement of ${\displaystyle U}$, we have ${\displaystyle U\cap W=\{0\}}$. Further, ${\displaystyle w\in U}$ and ${\displaystyle w\in W}$ imply ${\displaystyle w\in U\cap W=\{0\}}$, so ${\displaystyle w=0}$.
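In the finite example from above, this isomorphism can be observed concretely: with ${\displaystyle U}$ spanned by ${\displaystyle (1,1)^{T}}$ in ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$, the subspace spanned by ${\displaystyle (1,0)^{T}}$ is a complement ${\displaystyle W}$, and the map ${\displaystyle w\mapsto w+U}$ hits each of the three cosets exactly once (a Python sketch; the choice of ${\displaystyle W}$ is ours, any complement works):

```python
U = [(0, 0), (1, 1), (2, 2)]   # U = span{(1,1)} in (Z/3Z)^2
W = [(0, 0), (1, 0), (2, 0)]   # a complement: W = span{(1,0)}

def coset(v):
    return frozenset(((v[0] + u[0]) % 3, (v[1] + u[1]) % 3) for u in U)

# f: W -> V/U, w ↦ w + U
images = [coset(w) for w in W]

# f is injective (all images are distinct) ...
assert len(set(images)) == len(W)
# ... and surjective (every coset of U is hit)
all_cosets = {coset((a, b)) for a in range(3) for b in range(3)}
assert set(images) == all_cosets
```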

We have seen that ${\displaystyle V/U}$ is isomorphic to any complement of ${\displaystyle U}$. So it should also behave like a complement, i.e. ${\displaystyle U\oplus V/U=V}$ should hold. But be careful: Because ${\displaystyle V/U}$ is not a subspace of ${\displaystyle V}$, we cannot form the inner direct sum with ${\displaystyle U}$. However, we can still consider the outer direct sum of ${\displaystyle U}$ and ${\displaystyle V/U}$:

${\displaystyle U\oplus V/U=\{(u,v+U)\mid u\in U,v+U\in V/U\}}$

In general, this outer direct sum is not equal to ${\displaystyle V}$, but it is isomorphic to ${\displaystyle V}$, as we now show.

Theorem (${\displaystyle U\oplus V/U\cong V}$)

Let ${\displaystyle U}$ be a subspace of a ${\displaystyle K}$-vector space ${\displaystyle V}$. Then, ${\displaystyle U\oplus V/U\cong V}$ holds.

Proof (${\displaystyle U\oplus V/U\cong V}$)

Let ${\displaystyle W}$ be a complement of ${\displaystyle U}$, i.e. ${\displaystyle U\cap W=\{0\}}$ and ${\displaystyle U+W=V}$. From the previous theorem we know that the function

${\displaystyle f\colon W\to V/U,\quad w\mapsto w+U}$

is an isomorphism. We use this to show that

${\displaystyle g\colon U\oplus W\to U\oplus V/U,\quad (u,w)\mapsto (u,w+U)}$

is an isomorphism, where ${\displaystyle U\oplus W=\{(u,w)\mid u\in U,w\in W\}}$ denotes the outer direct sum.

Proof step: ${\displaystyle g}$ is linear

We have ${\displaystyle g(u,w)=(\operatorname {id} _{U}(u),f(w))}$ for all ${\displaystyle (u,w)\in U\oplus W}$. It follows directly that ${\displaystyle g}$ is linear, since addition and scalar multiplication on ${\displaystyle U\oplus W}$ are defined component-wise and both ${\displaystyle \operatorname {id} _{U}}$ and ${\displaystyle f}$ are linear.

Proof step: ${\displaystyle g}$ is bijective

This also follows from ${\displaystyle g(u,w)=(\operatorname {id} _{U}(u),f(w))}$ for all ${\displaystyle (u,w)\in U\oplus W}$, since the identity ${\displaystyle \operatorname {id} _{U}}$ and ${\displaystyle f}$ are bijective.

Thus we have ${\displaystyle U\oplus V/U\cong U\oplus W}$. By this theorem, the inner direct sum of the subspaces ${\displaystyle U}$ and ${\displaystyle W}$ is isomorphic to their outer direct sum. So ${\displaystyle V=U\oplus _{I}W\cong U\oplus W\cong U\oplus V/U}$, where ${\displaystyle U\oplus _{I}W}$ denotes the inner direct sum of ${\displaystyle U}$ and ${\displaystyle W}$.
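In the finite example ${\displaystyle (\mathbb {Z} /3\mathbb {Z} )^{2}}$ with ${\displaystyle U}$ spanned by ${\displaystyle (1,1)^{T}}$, this statement can at least be confirmed on the level of cardinalities: the outer direct sum ${\displaystyle U\oplus V/U}$ has ${\displaystyle |U|\cdot |V/U|=3\cdot 3=9}$ elements, matching ${\displaystyle |V|=9}$ (a Python sketch that only counts; it does not construct the isomorphism):

```python
# Cardinality check of U ⊕ V/U ≅ V in the finite example (Z/3Z)^2:
U = [(0, 0), (1, 1), (2, 2)]

def coset(v):
    return frozenset(((v[0] + u[0]) % 3, (v[1] + u[1]) % 3) for u in U)

V = [(a, b) for a in range(3) for b in range(3)]
quotient = {coset(v) for v in V}

# Elements of the outer direct sum are pairs (u, v + U):
outer_sum = [(u, c) for u in U for c in quotient]
assert len(outer_sum) == len(V)  # 3 · 3 = 9
```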


## Exercises

Exercise (The projection is linear)

Let ${\displaystyle V}$ be a ${\displaystyle K}$-vector space and ${\displaystyle U\subseteq V}$ a subspace. Show that the canonical projection

${\displaystyle \pi \colon V\to V/U,\quad v\mapsto v+U}$

is linear.

Solution (The projection is linear)

Let ${\displaystyle v,w\in V}$ and ${\displaystyle \lambda \in K}$ be arbitrary. We again write ${\displaystyle \boxplus }$ and ${\displaystyle \boxdot }$ for the vector space structure on ${\displaystyle V/U}$. We then have

{\displaystyle {\begin{aligned}&\pi (\lambda v+w)\\[0.3em]&{\color {OliveGreen}\left\downarrow \ {\text{definition of }}\pi \right.}\\[0.3em]=&(\lambda v+w)+U\\[0.3em]&{\color {OliveGreen}\left\downarrow \ {\text{definition of }}\boxplus \right.}\\[0.3em]=&((\lambda v)+U)\boxplus (w+U)\\[0.3em]&{\color {OliveGreen}\left\downarrow \ {\text{definition of }}\boxdot \right.}\\[0.3em]=&(\lambda \boxdot (v+U))\boxplus (w+U)\\[0.3em]&{\color {OliveGreen}\left\downarrow \ {\text{definition of }}\pi \right.}\\[0.3em]=&(\lambda \boxdot \pi (v))\boxplus \pi (w).\end{aligned}}}

So ${\displaystyle \pi }$ is linear.