Vector space

Vector spaces (or linear spaces) are spaces whose elements, known as vectors, can be scaled and added; all linear combinations can be formed. They provide the correct context within which to study linear phenomena, are the basic object of study in linear algebra, and are used extensively in almost every area of mathematics and many branches of the sciences.

The most familiar vector spaces are spaces of geometrical vectors, usually depicted as arrows with magnitude and direction. These can also be represented more formally as ordered n-tuples of numbers. Both representations admit the two important operations of vector addition and scalar multiplication. In general, a vector space is any abstract mathematical structure on which these operations, satisfying their natural axioms, are defined.

Under this abstract definition, the vectors need not be geometric vectors in the normal sense of arrows, but can be elements of any mathematical set that satisfies the axioms. For example, the polynomials with real coefficients form a vector space. This abstract quality reflects the need to use the theory of vector spaces in many areas of modern mathematics.
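The polynomial example can be made concrete with a short sketch in Python (a hedged illustration; the helper names `poly_add` and `poly_scale` are ours, not from any library): representing a polynomial by its list of coefficients, vector addition and scalar multiplication act coefficientwise.

```python
# Polynomials with real coefficients, represented as coefficient lists
# (index i holds the coefficient of x**i), form a vector space under
# coefficientwise addition and scalar multiplication.

def poly_add(p, q):
    """Vector addition: add coefficients, padding the shorter list with zeros."""
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_scale(c, p):
    """Scalar multiplication: multiply every coefficient by the scalar c."""
    return [c * a for a in p]

# (1 + 2x) + 3 * (x + x^2)  ->  1 + 5x + 3x^2
print(poly_add([1.0, 2.0], poly_scale(3.0, [0.0, 1.0, 1.0])))  # [1.0, 5.0, 3.0]
```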

Formal definition

Let F be a field (such as the field of real numbers or the field of complex numbers), called the field of scalars. Then a vector space over the field F is a set V of vectors together with two operations,

  • vector addition: V × V → V, denoted v + w, where v, w ∈ V, and
  • scalar multiplication: F × V → V, denoted a v, where a ∈ F and v ∈ V,

such that the axioms below are satisfied. Four of them require vector addition to make V an abelian group, two are distributive laws, one gives the associativity of scalar multiplication, and one governs scalar multiplication by the multiplicative identity. The following is the list of the eight axioms:

  1. Vector addition is associative:

    For all u, v, w ∈ V, we have u + (v + w) = (u + v) + w.

  2. Vector addition is commutative:

    For all v, w ∈ V, we have v + w = w + v.

  3. Vector addition has an identity element:

    There exists an element 0 ∈ V, called the zero vector, such that v + 0 = v for all v ∈ V.

  4. Vector addition has inverse elements:

    For all v ∈ V, there exists an element w ∈ V, called the additive inverse of v, such that v + w = 0.

  5. Distributivity holds for scalar multiplication over vector addition:

    For all a ∈ F and v, w ∈ V, we have a (v + w) = a v + a w.

  6. Distributivity holds for scalar multiplication over field addition:

    For all a, b ∈ F and v ∈ V, we have (a + b) v = a v + b v.

  7. Scalar multiplication is associative:

    For all a, b ∈ F and v ∈ V, we have a (b v) = (ab) v.

  8. Scalar multiplication has an identity element:

    For all v ∈ V, we have 1 v = v, where 1 denotes the multiplicative identity in F.
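As an illustrative sketch (not a proof), the eight axioms can be spot-checked numerically for V = R² over F = R. The sample values below are chosen to be exactly representable as floats, so the equality tests are exact.

```python
# Spot-check the eight vector space axioms for V = R^2 over F = R
# on a handful of sample vectors and scalars.
import itertools

def add(v, w):
    """Vector addition in R^2."""
    return (v[0] + w[0], v[1] + w[1])

def smul(a, v):
    """Scalar multiplication in R^2."""
    return (a * v[0], a * v[1])

zero = (0.0, 0.0)
vecs = [(1.0, 2.0), (-3.0, 0.5), (0.0, 4.0)]
scalars = [2.0, -1.0, 0.5]

for u, v, w in itertools.product(vecs, repeat=3):
    assert add(u, add(v, w)) == add(add(u, v), w)   # 1. associativity
    assert add(v, w) == add(w, v)                   # 2. commutativity
for v in vecs:
    assert add(v, zero) == v                        # 3. identity element
    assert add(v, smul(-1.0, v)) == zero            # 4. inverse elements
for a, b in itertools.product(scalars, repeat=2):
    for v, w in itertools.product(vecs, repeat=2):
        assert smul(a, add(v, w)) == add(smul(a, v), smul(a, w))  # 5.
        assert smul(a + b, v) == add(smul(a, v), smul(b, v))      # 6.
        assert smul(a, smul(b, v)) == smul(a * b, v)              # 7.
for v in vecs:
    assert smul(1.0, v) == v                        # 8. unit scalar
print("all eight axioms hold on the samples")
```

Passing such checks does not prove the axioms, of course; it only illustrates what each one asserts.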

These are exactly the axioms of a module, so the definition can be summarized concisely: a vector space is a module over a ring that happens to be a field. The module axioms can also be phrased somewhat more abstractly. For each a ∈ F, denote by fₐ the map V → V given by fₐ(v) = a v, that is, scalar multiplication by a viewed as a map from V to V, and let f be the map taking a to fₐ. The first four axioms say that (V, +) is an abelian group, the next says that each fₐ is a group homomorphism, and the last three say that f is a ring homomorphism from F to the endomorphism ring End(V).

Note that some sources may choose to also include two axioms of closure:

  1. Vector addition is closed:

    If u, v ∈ V, then u + v ∈ V.

  2. Scalar multiplication is closed:

    If a ∈ F and v ∈ V, then a v ∈ V.

However, the modern formal understanding of the operations as maps with codomain V makes these axioms hold by definition, and thus obviates the need to list them as independent axioms.

Note that expressions of the form “v a”, where v ∈ V and a ∈ F, are, strictly speaking, not defined. Because of the commutativity of the underlying field, however, “a v” and “v a” may be treated synonymously, and this is often done in practice.

Like the concept of a field itself, the formal definition of a vector space is entirely abstract. It is analogous to the concept of a module over a ring, of which it is a specialization. To determine if a set V is a vector space, one only has to specify the set V, a field F, and define vector addition and scalar multiplication on V. Then V is a vector space over the field F if and only if it satisfies the eight axioms listed above.

Elementary properties

There are a number of properties that follow easily from the vector space axioms.

  • The zero vector 0 ∈ V is unique:

    If 0₁ and 0₂ are zero vectors in V, such that 0₁ + v = v and 0₂ + v = v for all v ∈ V, then 0₁ = 0₂ = 0.

  • Scalar multiplication with the zero vector yields the zero vector:

    For all a ∈ F, we have a 0 = 0.

  • Scalar multiplication by zero yields the zero vector:

    For all v ∈ V, we have 0 v = 0, where 0 denotes the additive identity in F.

  • No other scalar multiplication yields the zero vector:

    We have a v = 0 if and only if a = 0 or v = 0.

  • The additive inverse −v of a vector v is unique:

    If w₁ and w₂ are additive inverses of v ∈ V, such that v + w₁ = 0 and v + w₂ = 0, then w₁ = w₂. We call the inverse −v and define w − v to mean w + (−v).

  • Scalar multiplication by negative unity yields the additive inverse of the vector:

    For all v ∈ V, we have (−1) v = −v, where 1 denotes the multiplicative identity in F.

  • Negation commutes freely:

    For all a ∈ F and v ∈ V, we have (−a) v = a (−v) = − (a v).
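As an example of how such properties follow from the axioms, here is the standard derivation that scalar multiplication by the field's additive identity yields the zero vector:

```latex
% 0v = 0 follows from distributivity over field addition (axiom 6)
% together with cancellation in the abelian group (V, +).
\begin{align*}
0\,v &= (0 + 0)\,v = 0\,v + 0\,v && \text{(axiom 6)}\\
0\,v + \bigl(-(0\,v)\bigr) &= \bigl(0\,v + 0\,v\bigr) + \bigl(-(0\,v)\bigr) && \text{(add the inverse of } 0\,v\text{)}\\
\mathbf{0} &= 0\,v && \text{(axioms 1, 3, 4)}
\end{align*}
```

The other properties in the list are proved by similar short manipulations.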

Examples

See Examples of vector spaces for a list of standard examples.

Subspaces and bases

Main articles: Linear subspace, Basis

Given a vector space V, any nonempty subset W of V which is closed under addition and scalar multiplication is called a subspace of V. It is easy to see that subspaces of V are vector spaces (over the same field) in their own right. The intersection of all subspaces containing a given set of vectors is called their span; if no vector can be removed without diminishing the span, the set is described as being linearly independent. A linearly independent set whose span is the whole space is called a basis for V.
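To make linear independence concrete, here is a small sketch in Python (the helper names `rank` and `independent` are illustrative, not from any particular library): vectors in Rⁿ are linearly independent exactly when Gaussian elimination on them produces no zero row, i.e. when the rank equals the number of vectors.

```python
# Test linear independence of vectors in R^n via Gaussian elimination.

def rank(rows, eps=1e-12):
    """Row-reduce a list of row vectors and count the nonzero rows."""
    rows = [list(r) for r in rows]
    r = 0  # index of the next pivot row
    for c in range(len(rows[0])):
        pivot = next((i for i in range(r, len(rows)) if abs(rows[i][c]) > eps), None)
        if pivot is None:
            continue  # no pivot in this column
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(r + 1, len(rows)):
            f = rows[i][c] / rows[r][c]
            rows[i] = [x - f * y for x, y in zip(rows[i], rows[r])]
        r += 1
    return r

def independent(vectors):
    """True iff the given vectors are linearly independent."""
    return rank(vectors) == len(vectors)

print(independent([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # False: third = first + second
print(independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True: the standard basis of R^3
```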

Using Zorn’s Lemma (which is equivalent to the axiom of choice), it can be proved that every vector space has a basis. Using the ultrafilter lemma (which is strictly weaker than the axiom of choice), one can show that all bases of a given vector space have the same cardinality. Thus vector spaces over a given field are determined up to isomorphism by a single cardinal number (called the dimension of the vector space), the cardinality of any basis. For instance, the real vector spaces are, up to isomorphism, just R⁰, R¹, R², R³, …. As you would expect, the dimension of the real vector space R³ is three.

A basis makes it possible to express every vector of the space uniquely as a finite linear combination of basis vectors with coefficients from the field. Sometimes, vector spaces are introduced from this coordinatised viewpoint.
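The coordinatised viewpoint can be sketched for R² (the function `coords_2d` is a hypothetical helper, not a library routine): given a basis {b₁, b₂}, the coordinates of a vector v are found by solving a 2×2 linear system, here via Cramer's rule.

```python
# Find the coordinates of v in the basis {b1, b2} of R^2,
# i.e. the unique (a, b) with a*b1 + b*b2 = v.

def coords_2d(b1, b2, v):
    """Solve a*b1 + b*b2 = v by Cramer's rule."""
    det = b1[0] * b2[1] - b2[0] * b1[1]
    if det == 0:
        raise ValueError("b1, b2 do not form a basis")
    a = (v[0] * b2[1] - b2[0] * v[1]) / det
    b = (b1[0] * v[1] - v[0] * b1[1]) / det
    return (a, b)

# Express (5, 7) in the basis {(1, 1), (1, -1)}:
# a + b = 5 and a - b = 7, so (a, b) = (6, -1).
print(coords_2d((1, 1), (1, -1), (5, 7)))  # (6.0, -1.0)
```

Uniqueness of the coordinates is exactly the linear independence of the basis: det ≠ 0.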

One often considers vector spaces which also carry a compatible topology; compatible here means that addition and scalar multiplication are continuous operations. This requirement in fact ensures that the topology gives rise to a uniform structure. When the dimension is infinite, there is generally more than one inequivalent topology, which makes the study of topological vector spaces richer than that of general vector spaces.

Only in such topological vector spaces can one consider infinite sums of vectors, i.e. series, through the notion of convergence. This is of importance, e.g., in quantum mechanics, where the states of physical systems form Hilbert spaces, and in other areas where Fourier expansions are used.

Linear transformations

Main article: Linear transformation

Given two vector spaces V and W over the same field F, one can define linear transformations or “linear maps” from V to W. These are maps from V to W which are compatible with the relevant structure — i.e., they preserve sums and scalar products. The set of all linear maps from V to W, denoted L(V, W), is also a vector space over F. When bases for both V and W are given, linear maps can be expressed in terms of components as matrices.
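As a sketch of this correspondence (with an illustrative helper `apply`), a linear map on R² with respect to the standard basis is just matrix–vector multiplication, and the two preservation properties are easy to spot-check:

```python
# A linear map R^2 -> R^2 in coordinates: apply a matrix (list of rows)
# to a vector, then spot-check that sums and scalar products are preserved.

def apply(M, v):
    """Apply the matrix M to the vector v."""
    return tuple(sum(row[i] * v[i] for i in range(len(v))) for row in M)

M = [[0, -1],
     [1,  0]]  # rotation by 90 degrees

v, w, a = (1.0, 2.0), (3.0, -1.0), 2.0
# preserves sums: M(v + w) = Mv + Mw
lhs = apply(M, (v[0] + w[0], v[1] + w[1]))
rhs = tuple(x + y for x, y in zip(apply(M, v), apply(M, w)))
assert lhs == rhs
# preserves scalar products: M(a v) = a (M v)
assert apply(M, (a * v[0], a * v[1])) == tuple(a * x for x in apply(M, v))
print(apply(M, (1.0, 0.0)))  # (0.0, 1.0): e1 rotates to e2
```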

An isomorphism is a linear map that is one-to-one and onto. If there exists an isomorphism between V and W, we call the two spaces isomorphic; they are then essentially identical.

The vector spaces over a fixed field F, together with the linear maps, form a category.

Generalizations and additional structures

It is common to study vector spaces with certain additional structures. This is often necessary for recovering ordinary notions from geometry.

The definition of a vector space makes perfectly good sense if one replaces the field of scalars F by a general ring R. The resulting structure is called a module over R. In other words, a vector space is nothing but a module over a field.

An affine space is a set with a transitive vector space action — informally, a vector space that has forgotten its origin.
