We have compiled some exercises on linear maps here. The proof structures can help you to solve other similar exercises.
As a reminder, here is the definition of a linear map:
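Stated in generic notation (with \(K\) a field and \(V, W\) two \(K\)-vector spaces; the concrete symbols may differ from those used elsewhere on this page): a map \(f\colon V \to W\) is called linear if for all \(v, w \in V\) and all \(\lambda \in K\)
\[ f(v + w) = f(v) + f(w) \qquad \text{(additivity)} \]
and
\[ f(\lambda \cdot v) = \lambda \cdot f(v) \qquad \text{(homogeneity)}. \]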
Showing linearity of a mapping
Linear maps from
to 
Exercise (Linear map into a field)
Let
be defined by
.
Show that the map
is linear.
How to get to the proof? (Linear map into a field)
First you have to show the additivity of the map and then its homogeneity.
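As an illustration of this pattern (the map used here, \(f\colon \mathbb{R}^2 \to \mathbb{R},\ f(x_1, x_2) = x_1 + x_2\), is only an assumed example and not necessarily the map from the exercise), both properties follow by direct computation:
\[ f\big((x_1, x_2) + (y_1, y_2)\big) = (x_1 + y_1) + (x_2 + y_2) = (x_1 + x_2) + (y_1 + y_2) = f(x_1, x_2) + f(y_1, y_2), \]
\[ f\big(\lambda \cdot (x_1, x_2)\big) = \lambda x_1 + \lambda x_2 = \lambda (x_1 + x_2) = \lambda \cdot f(x_1, x_2). \]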
Solution (Linear map into a field)
Exercise (Linear map from

to

)
Show that the map
with
is linear.
Solution (Linear map from

to

)
Current goal: Additivity
Current goal: Scaling
Exercise (Linearity of the embedding)
Show that for
, the map
is linear.
Solution (Linearity of the embedding)
Let
and
, as well as
. By definition of the map
, we have that
So
is linear.
We consider an example of a linear map from
to
:
with
Exercise (Linearity of

)
Show that the map
is linear.
Proof (Linearity of

)
is an
-vector space. In addition, the map is well-defined.
Proof step: homogeneity
Let
and
. Then:
Thus the map is linear.
Important special cases
Proof (The identity is a linear map)
The identity is additive: Let \(v, w \in V\). Then \(\operatorname{id}_V(v + w) = v + w = \operatorname{id}_V(v) + \operatorname{id}_V(w)\).
The identity is homogeneous: Let \(\lambda \in K\) and \(v \in V\). Then \(\operatorname{id}_V(\lambda v) = \lambda v = \lambda \cdot \operatorname{id}_V(v)\).
Proof (The map to zero is a linear map)
The zero map is additive: Let \(v, w\) be vectors in \(V\). Then \(0(v + w) = 0_W = 0_W + 0_W = 0(v) + 0(w)\).
The zero map is homogeneous: Let \(\lambda \in K\) and let \(v \in V\). Then \(0(\lambda v) = 0_W = \lambda \cdot 0_W = \lambda \cdot 0(v)\).
Thus, the map to zero is linear.
Linear maps between function spaces
Exercise (Mapping on a function space)
Consider the function space
of all functions from
to
, as well as the map
Show that
is linear.
Solution (Mapping on a function space)
The operations on the function space are defined element-wise in each case.
That means: for
,
and
we have that
and
.
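Written out in generic notation (functions \(f, g\) in the function space, a scalar \(\lambda\), and a point \(x\) of the domain; these symbols are assumed here), element-wise means:
\[ (f + g)(x) = f(x) + g(x) \qquad \text{and} \qquad (\lambda \cdot f)(x) = \lambda \cdot f(x). \]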
In particular, this is true for
, which implies
and
Thus, we have established linearity.
Exercise (The precomposition with a map is linear.)
Let
be a vector space, let
be sets, and let
or
be the vector space of functions from
or
to
. Let
be arbitrary but fixed. We consider the mapping
Show that
is linear.
Proof (The precomposition with a map is linear.)
Let
.
Let
and
.
Now, additivity and homogeneity of
implies that
is a linear map.
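A sketch of the key computation, under the assumption that the fixed map is \(\varphi\colon X \to Y\) and that the mapping in question sends \(f \in \operatorname{Abb}(Y, V)\) to \(f \circ \varphi \in \operatorname{Abb}(X, V)\) (notation assumed here): for all \(f, g \in \operatorname{Abb}(Y, V)\), \(\lambda \in K\) and \(x \in X\),
\[ \big((f + g) \circ \varphi\big)(x) = (f + g)\big(\varphi(x)\big) = f\big(\varphi(x)\big) + g\big(\varphi(x)\big) = \big(f \circ \varphi + g \circ \varphi\big)(x), \]
\[ \big((\lambda \cdot f) \circ \varphi\big)(x) = \lambda \cdot f\big(\varphi(x)\big) = \big(\lambda \cdot (f \circ \varphi)\big)(x). \]
Since this holds at every point \(x\), the functions agree, which gives additivity and homogeneity.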
Exercise (Sequence space)
Let
be the
-vector space of all real-valued sequences. Show that the map
is linear.
How to get to the proof? (Sequence space)
To show linearity, two properties need to be checked:
is additive:
for all 
is homogeneous:
for all
and 
The vectors
and
are sequences of real numbers, i.e. they are of the form
and
with
for all
.
Proof (Sequence space)
Proof step: additivity
Let
and
. Then, we have
It follows that
is additive.
Proof step: homogeneity
Let
and
. Then, we have
So
is homogeneous.
Thus we have proved that
is a
-linear map.
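As an illustration of the proof pattern in the sequence space (the left-shift map \(L\colon (a_1, a_2, a_3, \dots) \mapsto (a_2, a_3, a_4, \dots)\) is only an assumed example and not necessarily the map from the exercise):
\[ L\big((a_n)_n + (b_n)_n\big) = (a_{n+1} + b_{n+1})_n = (a_{n+1})_n + (b_{n+1})_n = L\big((a_n)_n\big) + L\big((b_n)_n\big), \]
\[ L\big(\lambda \cdot (a_n)_n\big) = (\lambda a_{n+1})_n = \lambda \cdot (a_{n+1})_n = \lambda \cdot L\big((a_n)_n\big). \]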
Construction of a linear map from given values
Exercise (Construction of a linear map)
Let
.
Further, consider
.
Find a linear map
with
for all
.
Solution (Construction of a linear map)
We see that
is a basis of
, namely the standard basis.
According to the theorem of linear continuation, we can construct a linear map
defined by
Now we only have to check if
is satisfied. It is true that
, so
Thus the condition
is satisfied for each
. The mapping
is linear by definition, so we are done.
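A small illustration of the principle of linear continuation, with assumed data: prescribing values \(w_1, w_2 \in \mathbb{R}^2\) on the standard basis \(e_1, e_2\) of \(\mathbb{R}^2\) determines exactly one linear map
\[ f\colon \mathbb{R}^2 \to \mathbb{R}^2, \qquad f(x_1, x_2) = f(x_1 e_1 + x_2 e_2) = x_1 w_1 + x_2 w_2, \]
and this \(f\) indeed satisfies \(f(e_1) = w_1\) and \(f(e_2) = w_2\).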
Exercise (Linear maps under some conditions)
Let
and
.
Is there an
-linear map
that satisfies
?
How to get to the proof? (Linear maps under some conditions)
First you should check if the vectors
are linearly independent. If this is the case,
is a basis of
because of
. Using the principle of linear continuation, the existence of such a linear map would follow
. Let thus
:
But then also
and so
must be fulfilled. However, this equation does not only have the "trivial" solution
. In fact, the above equation is satisfied for
. Thus, one obtains
For such a map
, the relation
would then have to hold, which is a contradiction to
Linear independence of two preimages
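A minimal example of the same obstruction, with assumed vectors (not those from the exercise): for \(v_1 = (1,0)\), \(v_2 = (0,1)\), \(v_3 = (1,1)\) and prescribed values \(w_1, w_2, w_3\) with \(w_3 \neq w_1 + w_2\), no linear map \(f\) with \(f(v_i) = w_i\) can exist, because \(v_3 = v_1 + v_2\) would force
\[ w_3 = f(v_3) = f(v_1 + v_2) = f(v_1) + f(v_2) = w_1 + w_2. \]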
Exercises: Isomorphisms
Solution (complex

-vector spaces)
Set
.
We choose a
basis
of
.
Define
for all
.
We have to show that
is an
-basis of
.
Then,
.
According to a theorem above, we have
as
-vector spaces.
We now show
-linear independence.
Proof step:
is
-linearly independent
Let
and assume that
.
We substitute the definition for
, combine the sums, and obtain
.
By
-linear independence of
we obtain
for all
.
Thus,
for all
.
This establishes the
-linear independence.
Now only one step is missing:
Proof step:
is a generator with respect to 
Let
be arbitrary.
Since
is a
-basis of
, we can find some
,
such that
.
We write
with
for all
.
Then we obtain
So
is inside the
-span of
.
This establishes the assertion.
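A sketch of the two computations with assumed notation (a \(\mathbb{C}\)-basis \(b_1, \dots, b_n\) of \(V\) and real coefficients \(\lambda_k, \mu_k\)): the candidate \(\mathbb{R}\)-basis is \(b_1, \dots, b_n, i b_1, \dots, i b_n\). For linear independence,
\[ \sum_{k=1}^n \lambda_k b_k + \sum_{k=1}^n \mu_k (i b_k) = \sum_{k=1}^n (\lambda_k + i \mu_k) b_k = 0 \]
forces \(\lambda_k + i \mu_k = 0\), hence \(\lambda_k = \mu_k = 0\). For the generating property, write \(v = \sum_k z_k b_k\) with \(z_k = \lambda_k + i \mu_k \in \mathbb{C}\); then \(v = \sum_k \lambda_k b_k + \sum_k \mu_k (i b_k)\). This gives \(\dim_{\mathbb{R}} V = 2 \dim_{\mathbb{C}} V\).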
Exercise (Isomorphism criteria for endomorphisms)
Let
be a field,
a finite-dimensional
-vector space and
a
-linear map.
Prove that the following three statements are equivalent:
(i)
is an isomorphism.
(ii)
is injective.
(iii)
is surjective.
(Note: For this exercise, it may be helpful to know the terms kernel and image of a linear map. Using the dimension theorem, this exercise becomes much easier. However, we give a solution here that works without the dimension theorem.)
Solution (Isomorphism criteria for endomorphisms)
(i)
(ii) and (iii): According to the definition of an isomorphism,
is bijective, i.e. injective and surjective. Therefore (ii) and (iii) hold.
(ii)
(i): Let
be an injective mapping. We need to show that
is also surjective. The image
of
is a subspace of
. This can be verified by calculation. We now define a mapping
that does the same thing as
, except that it will be surjective by definition. This mapping is defined as follows:
The surjectivity comes from the fact that every element
can be written as
, for a suitable
. Moreover, the mapping
is injective and linear. This is because
already has these two properties. So
and
are isomorphic. Therefore,
and
have the same finite dimension. Since
is a subspace of
,
holds. This can be seen by choosing a basis in
, for instance the basis given by the vectors
. These
are also linearly independent in
, since
. And since
and
have the same dimension, the
are also a basis in
. So the two vector spaces
and
must now be the same, because all of their elements are
-linear combinations formed with the
. Thus we have shown that
is surjective.
(iii)
(i): Now suppose
is surjective. We need to show that
is also injective. Let
be the kernel of the mapping
. You may convince yourself by calculation that this kernel is a subspace of
. Let
be a basis of
. We can complete this (small) basis to a (large) basis of
, by including the additional vectors
. We will now show that
are linearly independent. So let coefficients
be given such that
By linearity of
we conclude:
. This means that the linear combination
is in the kernel of
. But we already know a basis of
. Therefore there are coefficients
, such that
Because of the linear independence of
it now follows that
. Therefore, the
are linearly independent. Next, we will show that these vectors also form a basis of
. To do this, we show that each vector in
can be written as a linear combination of the
. Let
. Because of the surjectivity of
, there is a
, with
. Since the
form a basis of
, there are coefficients
such that
If we now apply
to this equation, we get:
Here we used the linearity of
. Since the first
elements of our basis are in the kernel, their images are
. So we get the desired representation of
:
Thus we have shown that
forms a linearly independent generator of
. So these vectors form a basis of
. Now if
were not
, two finite bases in
would not contain equally many elements. This cannot be the case. Therefore,
, so
is the trivial vector space and
is indeed injective.
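For comparison, the shortcut via the dimension theorem mentioned in the note above (written with generic notation \(f\colon V \to V\)): from
\[ \dim V = \dim \ker f + \dim \operatorname{im} f \]
one reads off directly that \(\ker f = \{0\}\) holds if and only if \(\dim \operatorname{im} f = \dim V\), i.e. if and only if \(\operatorname{im} f = V\). So injectivity and surjectivity are equivalent for an endomorphism of a finite-dimensional vector space.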
Solution (Function spaces)
We already know from a theorem above that two finite-dimensional vector spaces are isomorphic if and only if they have the same dimension. So we just need to show that
holds.
To show this, we first need a basis of
. For this, let
be the elements of the set
. We define
by
We now show that the functions
indeed form a basis of
.
Proof step:
are linearly independent
Let
with
being the zero function. If we apply this function to any
with
, then we obtain:
. By definition of
it follows that
.
Since
was arbitrary and
must hold for all
, it follows that
. So we have shown that
are linearly independent.
Proof step:
generate 
Let
be arbitrary. We now want to write
as a linear combination of
. For this we show
, i.e.,
is a linear combination of
with coefficients
.
We now verify that
for all
. Let
be arbitrary. By definition of
we obtain:
.
Since equality holds for all
, the functions agree at every point and are therefore identical. So we have shown that
generate
.
Thus we have proved that
is a basis of
. Since we have
basis elements of
, it follows that
.
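A sketch of the construction with assumed notation (a finite set \(X = \{x_1, \dots, x_n\}\) and functions into a field \(K\)): the basis functions can be taken as
\[ e_i(x_j) = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{otherwise,} \end{cases} \]
and every \(f\) in the function space satisfies \(f = \sum_{i=1}^n f(x_i) \cdot e_i\), since both sides take the value \(f(x_j)\) at each point \(x_j\). Evaluating a vanishing linear combination \(\sum_i \lambda_i e_i = 0\) at \(x_j\) gives \(\lambda_j = 0\), so the \(e_i\) are linearly independent and the dimension equals \(n = |X|\).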
Exercises: Images
Solution (Associating image spaces to maps)
First we look for the image of
:
To find
, we can apply a theorem from above: If
is a generator of
, then
holds. We take the standard basis
as the generator of
. Then
Now we apply
to the standard basis
The vectors
generate the image of
. Moreover, they are linearly independent and thus a basis of
.
Therefore
. So
.
Next, we want to find the image of
. However, it is also possible to compute the image
directly by definition, which we will demonstrate here.
So the image of
is spanned by the vector
. Thus
.
Now we determine the image of
using, for example, the same method as for
. That means we apply
to the standard basis:
The two vectors are linearly dependent. So it follows that
and thus
.
Finally, we determine the image of
. For this we proceed for example as with
.
So the image of
is spanned by the vector
. Thus
is the
-axis, so
.
Exercise (Image of a matrix)
- Consider the matrix
and the mapping
induced by it. What is the image
?
- Now let
be any matrix over a field
, where
denote the columns of
. Consider the mapping
induced by
. Show that
holds. So the image of a matrix is the span of its columns.
Solution (Image of a matrix)
Solution sub-exercise 2:
Proof step: "
"
Let
. Then, there is some
with
. We can write
as
. Plugging this into the equation
, we get.
Since
, we obtain
.
Proof step: "
"
Exercise (Surjectivity and dimension of

and

)
Let
and
be two finite-dimensional vector spaces. Show that there exists a surjective linear map
if and only if
.
How to get to the proof? (Surjectivity and dimension of

and

)
We want to estimate the dimensions of
and
against each other. The dimension is defined as the cardinality of a basis. That is, if
is a basis of
and
is a basis of
, we must show that
holds if and only if there exists a surjective linear map. "if and only if" means that we need to establish two directions (
).
Given a surjective linear map
, we must show that the dimension of
is at least
. Now bases are maximal linearly independent subsets. That is, to estimate the dimension from below, we need to construct a linearly independent subset with
elements. In the image, we already have a linearly independent subset with
elements, which is the basis
. Because
is surjective, we can lift these to vectors
with
. Now we need to verify that
are linearly independent in
. We see this, by converting a linear combination
via
into a linear combination
and exploiting the linear independence of
.
Conversely, if
holds, we must construct a surjective linear map
. Following the principle of linear continuation, we can construct the linear map
by specifying how
acts on a basis of
. For this we need elements of
on which we can send
. We have already chosen a basis of
above. Therefore, it is convenient to define
as follows:
Then the image of
is spanned by the vectors
. However, these vectors also span all of
and thus
is surjective.
Solution (Surjectivity and dimension of

and

)
Proof step: "
"
Suppose there is a suitable surjective mapping
. We show that the dimension of
cannot be larger than the dimension of
(this is true for any linear map). Because of the surjectivity of
, it follows that
.
So let
be linearly independent. There exists
with
for
. We show that
are also linearly independent: Let
with
. Then we also have that
By linear independence of
, it follows that
. So
are also linearly independent. Overall, we have shown that
In particular, it holds that a basis of
(a maximal linearly independent subset of
) must contain at least as many elements as a basis of
, that is,
.
Proof step: "
"
Assume that
. We use that a linear map is already uniquely determined by the images of the basis vectors. Let
be a basis of
and
be a basis of
. Define the surjective linear map
by
This works, since by assumption,
holds. The mapping constructed in this way is surjective, since by construction,
. As the image of
is a subspace of
, the subspace generated by these vectors, i.e.,
, also lies in the image of
. Accordingly,
holds and
is surjective.
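The construction in this step, written out with assumed notation (a basis \(v_1, \dots, v_n\) of \(V\), a basis \(w_1, \dots, w_m\) of \(W\), and \(n \geq m\)):
\[ f(v_i) = \begin{cases} w_i & \text{for } 1 \leq i \leq m, \\ 0 & \text{for } m < i \leq n. \end{cases} \]
Then \(w_1, \dots, w_m\) all lie in the image of \(f\), so the image contains \(\operatorname{span}(w_1, \dots, w_m) = W\), and \(f\) is surjective.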
Exercises: Kernel
Exercise
We consider the linear map
. Determine the kernel of
.
Solution
We are looking for vectors
such that
. Let
be any vector in
for which
is true. We now examine what properties this vector must have. It holds that
So
and
. From this we conclude
. So any vector
in the kernel of
satisfies the condition
.
Now take a vector
with
. Then
We see that
. In total
Check your understanding: Can you visualize

in the plane? What does the image of

look like? How do the kernel and the image relate to each other?
We have already seen that
Now we determine the image of
by applying
to the canonical basis.
So
holds.
We see that the two vectors are linearly dependent. That is, we can generate the image with only one vector:
.
In our example, the image and the kernel of the linear map
are straight lines through the origin. The two lines intersect only in the zero vector and together span the whole
.
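A concrete instance that fits this picture (the map \(f\colon \mathbb{R}^2 \to \mathbb{R}^2,\ f(x, y) = (x - y,\ y - x)\) is an assumed example, not necessarily the map from the exercise): its kernel is the line \(\operatorname{span}\{(1,1)\}\), since \(f(x,y) = 0\) exactly when \(x = y\), and its image is the line \(\operatorname{span}\{(1,-1)\}\), since \(f(x,y) = (x-y)\cdot(1,-1)\). The two lines meet only in the origin and together span \(\mathbb{R}^2\).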
Solution
Proof step:
nilpotent 
Proof step: The converse implication
The converse implication does not hold. There are mappings that are neither injective nor nilpotent. For example we can define
This mapping is not injective, because
. But it is also not nilpotent, because we have
for all
.
Exercise (Injectivity and dimension of

and

)
Let
and
be two finite-dimensional vector spaces. Show that there exists an injective linear map
if and only if
.
How to get to the proof? (Injectivity and dimension of

and

)
To prove the equivalence, we need to show two implications. To do so, we use that every monomorphism
preserves linear independence: If
is a basis of
, then the
vectors
are linearly independent. For the converse direction, we need to construct a monomorphism from
to
using the assumption
. To do this, we choose bases in
and
and then use the principle of linear continuation to define a monomorphism by the images of the basis vectors.
Solution (Injectivity and dimension of

and

)
Proof step: There is a monomorphism 
Let
be a monomorphism and
a basis of
. Then
is in particular linearly independent and therefore
is linearly independent. Thus, it follows that
. So
is a necessary criterion for the existence of a monomorphism from
to
.
Proof step:
there is a monomorphism
Conversely, in the case
we can construct a monomorphism: Let
be a basis of
and
be a basis of
. Then
. We define a linear map
by setting
for all
. According to the principle of linear continuation, such a linear map exists and is uniquely determined. We now show that
is injective by proving that
holds. Let
. Because
is a basis of
, there exist some
with
Thus, we get
Since
are linearly independent,
must hold for all
. So it follows for
that
We have shown that
holds and thus
is a monomorphism.
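The construction and the kernel argument, written out with assumed notation (a basis \(v_1, \dots, v_n\) of \(V\), a basis \(w_1, \dots, w_m\) of \(W\), and \(n \leq m\)): define \(f\) by \(f(v_i) = w_i\) for \(1 \leq i \leq n\). If \(v = \sum_{i=1}^n \lambda_i v_i\) lies in the kernel, then
\[ 0 = f(v) = \sum_{i=1}^n \lambda_i f(v_i) = \sum_{i=1}^n \lambda_i w_i, \]
and the linear independence of \(w_1, \dots, w_n\) forces \(\lambda_i = 0\) for all \(i\), hence \(v = 0\). So the kernel is trivial and \(f\) is injective.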