Section VS Vector Spaces
In this section we present a formal definition of a vector space, which will lead to an extra increment of abstraction. Once defined, we study its most basic properties.
Subsection VS Vector Spaces
Here is one of the two most important definitions in the entire course.
Definition VS Vector Space
Suppose that $V$ is a set upon which we have defined two operations: (1) vector addition, which combines two elements of $V$ and is denoted by “+”, and (2) scalar multiplication, which combines a complex number with an element of $V$ and is denoted by juxtaposition. Then $V$, along with the two operations, is a vector space over $\complexes$ if the following ten properties hold.
- AC Additive Closure
If $\vect{u},\,\vect{v}\in V$, then $\vect{u}+\vect{v}\in V$.
- SC Scalar Closure
If $\alpha\in\complex{\null}$ and $\vect{u}\in V$, then $\alpha\vect{u}\in V$.
- C Commutativity
If $\vect{u},\,\vect{v}\in V$, then $\vect{u}+\vect{v}=\vect{v}+\vect{u}$.
- AA Additive Associativity
If $\vect{u},\,\vect{v},\,\vect{w}\in V$, then $\vect{u}+\left(\vect{v}+\vect{w}\right)=\left(\vect{u}+\vect{v}\right)+\vect{w}$.
- Z Zero Vector
There is a vector, $\zerovector$, called the zero vector, such that $\vect{u}+\zerovector=\vect{u}$ for all $\vect{u}\in V$.
- AI Additive Inverses
If $\vect{u}\in V$, then there exists a vector $\vect{-u}\in V$ so that $\vect{u}+(\vect{-u})=\zerovector$.
- SMA Scalar Multiplication Associativity
If $\alpha,\,\beta\in\complex{\null}$ and $\vect{u}\in V$, then $\alpha(\beta\vect{u})=(\alpha\beta)\vect{u}$.
- DVA Distributivity across Vector Addition
If $\alpha\in\complex{\null}$ and $\vect{u},\,\vect{v}\in V$, then $\alpha(\vect{u}+\vect{v})=\alpha\vect{u}+\alpha\vect{v}$.
- DSA Distributivity across Scalar Addition
If $\alpha,\,\beta\in\complex{\null}$ and $\vect{u}\in V$, then $(\alpha+\beta)\vect{u}=\alpha\vect{u}+\beta\vect{u}$.
- O One
If $\vect{u}\in V$, then $1\vect{u}=\vect{u}$.
The objects in $V$ are called vectors, no matter what else they might really be, simply by virtue of being elements of a vector space.
Now, there are several important observations to make. Many of these will be easier to understand on a second or third reading, and especially after carefully studying the examples in Subsection VS.EVS.
An axiom is often a “self-evident” truth, something so fundamental that we all agree it is true and accept it without proof. Typically, it would be the logical underpinning that we would begin to build theorems upon. Some might refer to the ten properties of Definition VS as axioms, implying that a vector space is a very natural object and the ten properties are the essence of a vector space. We will instead emphasize that we will begin with a definition of a vector space. After studying the remainder of this chapter, you might return here and remind yourself how all our forthcoming theorems and definitions rest on this foundation.
As we will see shortly, the objects in $V$ can be anything, even though we will call them vectors. We have been working with vectors frequently, but we should stress here that these have so far just been column vectors — scalars arranged in a columnar list of fixed length. In a similar vein, you have used the symbol “+” for many years to represent the addition of numbers (scalars). We have extended its use to the addition of column vectors and to the addition of matrices, and now we are going to recycle it even further and let it denote vector addition in any possible vector space. So when describing a new vector space, we will have to define exactly what “+” is. Similar comments apply to scalar multiplication. Conversely, we can define our operations any way we like, so long as the ten properties are fulfilled (see Example CVS).
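To see just how much freedom this allows, here is a classic illustration (our own aside, separate from Example CVS): let $V$ be the set of positive real numbers, viewed as a vector space with scalars taken from the real numbers (see the comments on fields below), and define the two operations by
\begin{align*}
\vect{u}+\vect{v}&=uv&\alpha\vect{u}&=u^{\alpha}
\end{align*}
so that “vector addition” is ordinary multiplication and “scalar multiplication” is exponentiation. All ten properties can be checked to hold. For instance, the zero vector is the number $1$, since $u\cdot 1=u$, the additive inverse of $u$ is $1/u$, and Property DSA becomes the familiar rule $u^{\alpha+\beta}=u^{\alpha}u^{\beta}$.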
In Definition VS, the scalars do not have to be complex numbers. They can come from what are known in more advanced mathematics as “fields”. Examples of fields are the set of complex numbers, the set of real numbers, the set of rational numbers, and even the finite set of “binary numbers”, $\set{0,\,1}$. There are many, many others. When the scalars come from a field $F$, we call $V$ a vector space over (the field) $F$.
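As a quick aside on the binary field (a fact of basic field arithmetic, not something we will pursue here): in $\set{0,\,1}$ both operations are performed modulo $2$, so for instance
\begin{equation*}
1+1=0
\end{equation*}
and every sum and product stays inside the set, so the set is closed under both operations, one of the requirements of a field.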
A vector space is composed of three objects: a set and two operations. Some would explicitly state in the definition that $V$ must be a nonempty set, but we can infer this from Property Z, since an empty set could not contain a vector that behaves as the zero vector. Also, we usually use the same symbol for both the set and the vector space itself. Do not let this convenience fool you into thinking the operations are secondary!
This discussion has either convinced you that we are really embarking on a new level of abstraction, or it has seemed cryptic, mysterious or nonsensical. You might want to return to this section in a few days and give it another read then. In any case, let us look at some concrete examples now.
Subsection EVS Examples of Vector Spaces
Our aim in this subsection is to give you a storehouse of examples to work with, to become comfortable with the ten vector space properties and to convince you that the multitude of examples justifies (at least initially) making such a broad definition as Definition VS. Some of our claims will be justified by reference to previous theorems, we will prove some facts from scratch, and we will do one nontrivial example completely. In other places, our usual thoroughness will be neglected, so grab paper and pencil and play along.
Example VSCV The vector space $\complex{m}$
Example VSM The vector space of matrices, $M_{mn}$
So, the set of all matrices of a fixed size forms a vector space. That entitles us to call a matrix a vector, since a matrix is an element of a vector space. For example, if $A,\,B\in M_{34}$ then we call $A$ and $B$ “vectors,” and we even use our previous notation for column vectors to refer to $A$ and $B$. So we could legitimately write expressions like \begin{equation*} \vect{u}+\vect{v}=A+B=B+A=\vect{v}+\vect{u} \end{equation*} This could lead to some confusion, but it is not too great a danger. But it is worth comment.
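For instance (a small concrete check, with the two matrices chosen arbitrarily), in $M_{22}$ we have
\begin{equation*}
\begin{bmatrix}1&-2\\0&3\end{bmatrix}+\begin{bmatrix}4&1\\5&-1\end{bmatrix}
=\begin{bmatrix}5&-1\\5&2\end{bmatrix}
=\begin{bmatrix}4&1\\5&-1\end{bmatrix}+\begin{bmatrix}1&-2\\0&3\end{bmatrix}
\end{equation*}
which is Property C in action for these two “vectors.”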
The previous two examples may be less than satisfying. We made all the relevant definitions long ago. And the required verifications were all handled by quoting old theorems. However, it is important to consider these two examples first. We have been studying vectors and matrices carefully (Chapter V, Chapter M), and both objects, along with their operations, have certain properties in common, as you may have noticed in comparing Theorem VSPCV with Theorem VSPM. Indeed, it is these two theorems that motivate us to formulate the abstract definition of a vector space, Definition VS. Now, if we prove some general theorems about vector spaces (as we will shortly in Subsection VS.VSP), we can then instantly apply the conclusions to both $\complex{m}$ and $M_{mn}$. Notice too, how we have taken six definitions and two theorems and reduced them down to two examples. With greater generalization and abstraction our old ideas get downgraded in stature.
Let us look at some more examples, now considering some new vector spaces.
Example VSP The vector space of polynomials, $P_n$
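For readers peeking ahead of the example's details, the operations in $P_n$ are the natural ones (a sketch, assuming the usual definitions): if $p(x)=a_0+a_1x+\cdots+a_nx^n$ and $q(x)=b_0+b_1x+\cdots+b_nx^n$, then
\begin{align*}
(p+q)(x)&=(a_0+b_0)+(a_1+b_1)x+\cdots+(a_n+b_n)x^n\\
(\alpha p)(x)&=\alpha a_0+(\alpha a_1)x+\cdots+(\alpha a_n)x^n
\end{align*}
so both operations reduce to arithmetic on the coefficients.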
Example VSIS The vector space of infinite sequences
Example VSF The vector space of functions
Here is a unique example.
Example VSS The singleton vector space
Perhaps some of the above definitions and verifications seem obvious or like splitting hairs, but the next example should convince you that they are necessary. We will study this one carefully. Ready? Check your preconceptions at the door.
Example CVS The crazy vector space
Subsection VSP Vector Space Properties
Subsection VS.EVS has provided us with an abundance of examples of vector spaces, most of them containing useful and interesting mathematical objects along with natural operations. In this subsection we will prove some general properties of vector spaces. Some of these results will again seem obvious, but it is important to understand why it is necessary to state and prove them. A typical hypothesis will be “Let $V$ be a vector space.” From this we may assume the ten properties of Definition VS, and nothing more. It is like starting over, as we discover what can happen in this new algebra. But the power of this careful approach is that we can apply these theorems to any vector space we encounter: those in the previous examples, new ones we have not yet contemplated, or perhaps new ones that nobody has ever contemplated. We will illustrate some of these results with examples from the crazy vector space (Example CVS), but mostly we are stating theorems and doing proofs. These proofs do not get too involved, but they are not trivial either, so these are good theorems to try proving yourself before you study the proof given here. (See Proof Technique P.)
First we show that there is just one zero vector. Notice that the properties only require there to be at least one, and say nothing about there possibly being more. That is because we can use the ten properties of a vector space (Definition VS) to learn that there can never be more than one. To require that this extra condition be stated as an eleventh property would make the definition of a vector space more complicated than it needs to be.
Theorem ZVU Zero Vector is Unique
Suppose that $V$ is a vector space. The zero vector, $\zerovector$, is unique.
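Here is a sketch of one standard argument, using nothing beyond the ten properties (the official proof may be organized differently). Suppose $\zerovector$ and $\zerovector'$ both behave as zero vectors. Then
\begin{align*}
\zerovector'&=\zerovector'+\zerovector&&\text{Property Z, with zero vector }\zerovector\\
&=\zerovector+\zerovector'&&\text{Property C}\\
&=\zerovector&&\text{Property Z, with zero vector }\zerovector'
\end{align*}
so the two candidates are equal.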
Theorem AIU Additive Inverses are Unique
Suppose that $V$ is a vector space. For each $\vect{u}\in V$, the additive inverse, $\vect{-u}$, is unique.
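A sketch of one standard argument (again reasoning only from Definition VS): suppose $\vect{v}$ and $\vect{w}$ are both additive inverses of $\vect{u}$, so $\vect{u}+\vect{v}=\zerovector$ and $\vect{u}+\vect{w}=\zerovector$. Then
\begin{align*}
\vect{v}&=\vect{v}+\zerovector&&\text{Property Z}\\
&=\vect{v}+(\vect{u}+\vect{w})&&\text{hypothesis}\\
&=(\vect{v}+\vect{u})+\vect{w}&&\text{Property AA}\\
&=(\vect{u}+\vect{v})+\vect{w}&&\text{Property C}\\
&=\zerovector+\vect{w}&&\text{hypothesis}\\
&=\vect{w}+\zerovector&&\text{Property C}\\
&=\vect{w}&&\text{Property Z}
\end{align*}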
As obvious as the next three theorems appear, nowhere have we guaranteed that the zero scalar, scalar multiplication and the zero vector all interact this way. Until we have proved it, anyway.
Theorem ZSSM Zero Scalar in Scalar Multiplication
Suppose that $V$ is a vector space and $\vect{u}\in V$. Then $0\vect{u}=\zerovector$.
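One way to see this (a sketch; only the ten properties are used, writing $-(0\vect{u})$ for the additive inverse of $0\vect{u}$):
\begin{align*}
\zerovector&=0\vect{u}+(-(0\vect{u}))&&\text{Property AI}\\
&=(0+0)\vect{u}+(-(0\vect{u}))&&\text{since }0+0=0\\
&=(0\vect{u}+0\vect{u})+(-(0\vect{u}))&&\text{Property DSA}\\
&=0\vect{u}+(0\vect{u}+(-(0\vect{u})))&&\text{Property AA}\\
&=0\vect{u}+\zerovector&&\text{Property AI}\\
&=0\vect{u}&&\text{Property Z}
\end{align*}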
Here is another theorem that looks like it should be obvious, but is still in need of a proof.
Theorem ZVSM Zero Vector in Scalar Multiplication
Suppose that $V$ is a vector space and $\alpha\in\complex{\null}$. Then $\alpha\zerovector=\zerovector$.
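The argument mirrors the one sketched for Theorem ZSSM, trading Property DSA for Property DVA:
\begin{align*}
\zerovector&=\alpha\zerovector+(-(\alpha\zerovector))&&\text{Property AI}\\
&=\alpha(\zerovector+\zerovector)+(-(\alpha\zerovector))&&\text{Property Z}\\
&=(\alpha\zerovector+\alpha\zerovector)+(-(\alpha\zerovector))&&\text{Property DVA}\\
&=\alpha\zerovector+(\alpha\zerovector+(-(\alpha\zerovector)))&&\text{Property AA}\\
&=\alpha\zerovector+\zerovector&&\text{Property AI}\\
&=\alpha\zerovector&&\text{Property Z}
\end{align*}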
Here is another one that sure looks obvious. But understand that we have chosen to use certain notation because it makes the theorem's conclusion look so nice. The theorem is not true because the notation looks so good; it still needs a proof. If we had really wanted to make this point, we might have used notation like $\vect{u}^\sharp$ for the additive inverse of $\vect{u}$. Then we would have written the defining property, Property AI, as $\vect{u}+\vect{u}^\sharp=\zerovector$. This theorem would become $\vect{u}^\sharp=(-1)\vect{u}$. Not really quite as pretty, is it?
Theorem AISM Additive Inverses from Scalar Multiplication
Suppose that $V$ is a vector space and $\vect{u}\in V$. Then $\vect{-u}=(-1)\vect{u}$.
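A sketch of the key computation: we show that $(-1)\vect{u}$ does the job an additive inverse is supposed to do,
\begin{align*}
\vect{u}+(-1)\vect{u}&=1\vect{u}+(-1)\vect{u}&&\text{Property O}\\
&=(1+(-1))\vect{u}&&\text{Property DSA}\\
&=0\vect{u}\\
&=\zerovector&&\text{Theorem ZSSM}
\end{align*}
and then Theorem AIU guarantees that $(-1)\vect{u}$ must be the additive inverse $\vect{-u}$.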
Because of this theorem, we can now write linear combinations like $6\vect{u}_1+(-4)\vect{u}_2$ as $6\vect{u}_1-4\vect{u}_2$, even though we have not formally defined an operation called vector subtraction.
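In other words (a notational convention, consistent with the sentence above), whenever we write $\vect{u}-\vect{v}$ it should be read as
\begin{equation*}
\vect{u}-\vect{v}=\vect{u}+(-1)\vect{v}=\vect{u}+(\vect{-v})
\end{equation*}
so subtraction is just addition of an additive inverse.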
Our next theorem is a bit different from several of the others in the list. Rather than making a declaration (“the zero vector is unique”) it is an implication (“if…, then…”) and so can be used in proofs to convert a vector equality into two possibilities, one a scalar equality and the other a vector equality. It should remind you of the situation for complex numbers. If $\alpha,\,\beta\in\complexes$ and $\alpha\beta=0$, then $\alpha=0$ or $\beta=0$. This critical property is the driving force behind using a factorization to solve a polynomial equation.
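As a quick reminder of that driving force (routine arithmetic, included only as an illustration): to solve $x^2-3x+2=0$ we factor,
\begin{equation*}
x^2-3x+2=(x-1)(x-2)=0
\end{equation*}
and the property above lets us conclude $x-1=0$ or $x-2=0$, so $x=1$ or $x=2$.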
Theorem SMEZV Scalar Multiplication Equals the Zero Vector
Suppose that $V$ is a vector space, $\alpha\in\complex{\null}$, and $\vect{u}\in V$. If $\alpha\vect{u}=\zerovector$, then either $\alpha=0$ or $\vect{u}=\zerovector$.
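A sketch of one standard argument: if $\alpha=0$ we are done, so suppose $\alpha\neq 0$, which means $\frac{1}{\alpha}$ exists. Then
\begin{align*}
\vect{u}&=1\vect{u}&&\text{Property O}\\
&=\left(\frac{1}{\alpha}\,\alpha\right)\vect{u}&&\alpha\neq 0\\
&=\frac{1}{\alpha}\left(\alpha\vect{u}\right)&&\text{Property SMA}\\
&=\frac{1}{\alpha}\,\zerovector&&\text{hypothesis}\\
&=\zerovector&&\text{Theorem ZVSM}
\end{align*}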
Example PCVS Properties for the Crazy Vector Space
Subsection RD Recycling Definitions
When we say that $V$ is a vector space, we then know we have a set of objects (the “vectors”), but we also know we have been provided with two operations (“vector addition” and “scalar multiplication”) and these operations behave with these objects according to the ten properties of Definition VS. One combines two vectors and produces a vector; the other takes a scalar and a vector, producing a vector as the result. So if $\vect{u}_1,\,\vect{u}_2,\,\vect{u}_3\in V$ then an expression like \begin{equation*} 5\vect{u}_1+7\vect{u}_2-13\vect{u}_3 \end{equation*} would be unambiguous in any of the vector spaces we have discussed in this section. And the resulting object would be another vector in the vector space. If you were tempted to call the above expression a linear combination, you would be right. Four of the definitions that were central to our discussions in Chapter V were stated in the context of vectors being column vectors, but were purposely kept broad enough that they could be applied in the context of any vector space. They only rely on the presence of scalars, vectors, vector addition and scalar multiplication to make sense. We will restate them shortly, unchanged, except that their titles and acronyms no longer refer to column vectors, and the hypothesis of being in a vector space has been added. Take the time now to look forward and review each one, and begin to form some connections to what we have done earlier and what we will be doing in subsequent sections and chapters. Specifically, compare the following pairs of definitions: