
Order relations

Any subset of the vectors in a vector space, keeping the same scalar coefficients, which is a vector space all by itself, is called a subspace. Of course, the coefficients could be changed too, to pass from a complex space to a real space, for instance, but that is a specialty not usually considered part of an introduction to vector space theory. Apart from closure, finding zero, and locating negatives, none of the other axioms depends on whether a subset is being considered or not. Checking the three additive requirements at once can be accomplished by considering closure under differences: v - v yields zero, 0 - v yields negatives, and u - (-v) recovers sums; closure under scalar multiples must still be checked on its own.
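
To make the criterion concrete, here is a minimal sketch, assuming numpy is available; the plane x + y + z = 0 in three dimensions, the membership test, and the random spot checks are all illustrative choices rather than anything prescribed above.

    import numpy as np

    def in_plane(v, tol=1e-9):
        # Membership test for the sample subspace {(x, y, z) : x + y + z = 0}.
        return abs(v.sum()) < tol

    rng = np.random.default_rng(0)
    for _ in range(1000):
        # Manufacture two members of the plane by forcing each coordinate sum to zero.
        u = rng.standard_normal(3); u -= u.mean()
        w = rng.standard_normal(3); w -= w.mean()
        c = rng.standard_normal()
        assert in_plane(u - w)  # closed under differences: zero, negatives, sums follow
        assert in_plane(c * u)  # closure under scalar multiples, checked separately
    print("the plane passed all subspace spot checks")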

Vector subspaces are familiar enough: in three dimensions, just think of lines and planes through the origin. The important point is that in any vector space, the subspaces are ordered by inclusion. Moreover, there is a smallest subspace, containing just zero itself (the origin), which is contained in every other subspace. Apparently trivial, but still important for consistency, the whole space is itself a subspace, indeed the largest one, within which every other is contained.
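
The inclusion ordering can even be tested mechanically when each subspace is presented by a spanning set (the columns of a matrix). A minimal sketch, assuming numpy: span(A) lies inside span(B) exactly when adjoining the columns of A to those of B leaves the rank unchanged. The helper name and the sample line and plane are illustrative.

    import numpy as np

    def contained_in(A, B):
        # span(A) <= span(B) iff the pooled columns span nothing new.
        return np.linalg.matrix_rank(np.hstack([B, A])) == np.linalg.matrix_rank(B)

    line  = np.array([[1.0], [2.0], [3.0]])                  # a line in three dimensions
    plane = np.array([[1.0, 0.0], [2.0, 1.0], [3.0, 0.0]])   # a plane containing that line

    print(contained_in(line, plane))   # True: the line sits inside the plane
    print(contained_in(plane, line))   # False: the plane is strictly larger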

Comparison between pairs of subspaces can be attempted, yet pairs are rarely arranged so that one is contained within the other. Instead, there is a largest subspace common to both, as well as a smallest subspace containing the two. The first of these is just the setwise intersection of the two, but the second requires some construction: it consists of all linear combinations of vectors drawn from the two subspaces, not just their union (the union of two lines, for instance, is almost never closed under sums). Just keep on thinking of lines and planes.
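
Both bounds can be computed outright when the two subspaces arrive as spanning sets. A minimal sketch, assuming numpy: the smallest subspace containing both is spanned by pooling the columns, while the intersection falls out of the null space of the block matrix [A | -B], because a solution of A s = B t names a vector lying in both subspaces at once. The function meet_and_join and the two sample planes are illustrative.

    import numpy as np

    def meet_and_join(A, B, tol=1e-9):
        join = np.hstack([A, B])              # pooled spanning set for the sum
        M = np.hstack([A, -B])                # solve A s = B t via the kernel of [A | -B]
        _, sing, Vt = np.linalg.svd(M)
        rank = int(np.sum(sing > tol))
        null = Vt[rank:]                      # rows spanning the null space of M
        meet = A @ null[:, :A.shape[1]].T     # the common vectors A s
        return meet, join

    # Two distinct planes through the origin meet in a line and join to all of space.
    P1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])   # the xy-plane
    P2 = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])   # the xz-plane
    meet, join = meet_and_join(P1, P2)
    print(np.linalg.matrix_rank(meet))   # 1: the intersection is the x-axis
    print(np.linalg.matrix_rank(join))   # 3: together they span everything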

It isn't just subspaces which have upper and lower bounds; mere subsets can be included in the hierarchy by looking for the smallest subspace containing a given subset, or the largest subspace (if any) which it contains. This idea gives rise to bases: start with any non-zero vector, and find the smallest subspace containing it. That will be the set of all its scalar multiples, or in short, a line passing through the vector (and evidently also passing through the origin).
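
In coordinates, membership in such a line is just a collinearity test. A minimal sketch, assuming numpy; the helper name is an illustrative choice:

    import numpy as np

    def on_line_through(w, v, tol=1e-9):
        # w lies in the span of v exactly when the pair has rank at most 1.
        return np.linalg.matrix_rank(np.column_stack([v, w]), tol=tol) <= 1

    v = np.array([1.0, 2.0, 3.0])
    print(on_line_through(2.5 * v, v))              # True: a scalar multiple
    print(on_line_through(v + [1.0, 0.0, 0.0], v))  # False: pushed off the line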

Supposing that some vectors of the full space are still left outside this line, choose one of them and repeat the process. But now the first vector has to be accounted for as well, so it is a good idea to find the smallest subspace containing both vectors, which is just the set of all sums of vectors taken from the two lines. We might as well call that the sum of the two subspaces. Of course, there is some verification involved, to ensure that everything is well defined and consistent.

Maybe there are still vectors not accounted for; so start all over again, finding the smallest subspace containing all three vectors, and so on as long as the entire space hasn't been generated. If the process terminates, the chosen vectors are said to form a basis, and the dimension of the space is the number of independent vectors the basis contains (linear dependence means that one vector is a linear combination of the others; equivalently, some sum of multiples of all of them, with not all coefficients zero, gives zero). The interesting thing is that the dimension always comes out the same, whatever sequence of vectors is chosen for the construction. This is not hard to prove: just replace the first basis by the second, one vector at a time, substituting all the previous results into each new expression.
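
The whole construction can be phrased as a greedy loop over candidate vectors. A minimal sketch, assuming numpy: a vector is kept only when it enlarges the span so far, that is, when it raises the rank of the collection already kept, and the count of kept vectors is the dimension. The function greedy_basis and the sample list are illustrative.

    import numpy as np

    def greedy_basis(vectors):
        basis = []
        for v in vectors:
            # Keep v only if it is independent of everything kept so far.
            if np.linalg.matrix_rank(np.column_stack(basis + [v])) > len(basis):
                basis.append(v)
        return basis

    vecs = [np.array([1.0, 0.0, 1.0]),
            np.array([2.0, 0.0, 2.0]),    # dependent: twice the first
            np.array([0.0, 1.0, 0.0]),
            np.array([1.0, 1.0, 1.0])]    # dependent: first plus third
    print(len(greedy_basis(vecs)))        # 2, in whatever order the list is fed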

If the process of exhausting the supply of vectors never terminates, the space is not finite-dimensional, turning the search for bases into a much more intricate activity.

