# Zero vector

In mathematics, the zero vector is a special vector of a vector space, namely the uniquely determined neutral element with respect to vector addition. Examples of zero vectors are the number zero, the zero matrix and the zero function. In an inner product space, the zero vector is orthogonal to all vectors of the space; in a normed space it is the only vector with norm zero. Every subspace of a vector space contains at least the zero vector, the smallest such subspace being the zero vector space. The zero vector is used to define central concepts of linear algebra such as linear independence, basis, and kernel. It plays an important role in the solution structure of linear equations.

## Definition

The zero vector $0_V \in V$ of a vector space $V$ is the uniquely determined vector for which

$$v + 0_V = 0_V + v = v$$

holds for all vectors $v \in V$. It is therefore the neutral element with respect to vector addition.

## Notation

The zero vector is usually denoted by $\vec{0}$, $\mathbf{0}$, or simply $0$. In general, however, the zero vector differs from the zero element of the scalar field $K$ of the vector space, which is represented by the same symbol $0$. If there is a risk of confusion, the zero vector is written $0_V$ and the scalar zero $0_K$. Occasionally the zero vector is also denoted by $\vec{o}$, $\mathbf{o}$, or $\mathfrak{o}$.

The zero vector is the only vector of the Euclidean plane that cannot be represented graphically by an arrow, since neither a direction nor a length can be assigned to it.

## Examples

• In the vector space $\mathbb{R}$ of real numbers, the zero vector is the number $0$ and is thus equal to the zero of the scalar field.
• In the vector space $\mathbb{C}$ of complex numbers, the zero vector is the number $0 + 0i$ and thus also corresponds to the scalar zero.
• In the coordinate space $K^n$, the zero vector is the $n$-tuple $(0_K, \ldots, 0_K)$ consisting of the zero elements of the field $K$.
• In the matrix space $K^{m \times n}$, the zero vector is the zero matrix, all of whose entries are equal to $0_K$.
• In the sequence space $\mathbb{K}^{\mathbb{N}}$, the zero vector is the sequence $(0_{\mathbb{K}}, 0_{\mathbb{K}}, \ldots)$; it is not to be confused with the concept of a null sequence.
• In a function space, that is, a vector space of functions from a set $A$ into a vector space $W$, the zero vector is the zero function $f \equiv 0_W$, where $0_W$ is the zero vector of the target space.
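The coordinate-space and matrix-space examples above can be sketched numerically; the following is a minimal illustration (NumPy and the specific arrays are assumptions chosen purely for this sketch):

```python
import numpy as np

# Zero vector of the coordinate space R^3: the triple (0, 0, 0)
zero_vec = np.zeros(3)

# Zero vector of the matrix space R^{2x3}: the 2x3 zero matrix
zero_mat = np.zeros((2, 3))

v = np.array([1.0, -2.0, 3.0])

# The zero vector is the neutral element of vector addition: v + 0 = v
print(np.array_equal(v + zero_vec, v))  # True
```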

## Properties

### Uniqueness

The zero vector of a vector space is unique: if $0$ and $\bar{0}$ were two zero vectors, then

$$0 = 0 + \bar{0} = \bar{0}$$

would hold immediately, and thus the two vectors would be equal.

### Scalar multiplication

For all scalars $\alpha \in K$ of the scalar field,

$$\alpha \cdot 0_V = 0_V,$$

and analogously, for all vectors $v \in V$ of the vector space,

$$0_K \cdot v = 0_V,$$

both of which follow directly from the two distributive laws in vector spaces by choosing $\alpha = \beta = 0_K$ or $u = v = 0_V$, respectively. Together these give

$$\alpha \cdot v = 0_V \Leftrightarrow \alpha = 0_K \text{ or } v = 0_V,$$

because from $\alpha \cdot v = 0_V$ it follows either that $\alpha = 0_K$, or that $\alpha \neq 0_K$ and then $v = \alpha^{-1} \cdot (\alpha \cdot v) = \alpha^{-1} \cdot 0_V = 0_V$.
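Both scalar-multiplication identities are easy to check numerically; here is a small sketch (NumPy and the chosen scalar and vector are illustrative assumptions):

```python
import numpy as np

v = np.array([1.0, -2.0, 3.0])
zero = np.zeros(3)

# alpha * 0_V = 0_V for every scalar alpha
print(np.array_equal(2.5 * zero, zero))  # True

# 0_K * v = 0_V for every vector v
print(np.array_equal(0.0 * v, zero))     # True
```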

### Special spaces

In an inner product space, that is, a vector space with a scalar product, the zero vector is orthogonal to all vectors of the space; that is, for all vectors $v \in V$,

$$\langle 0_V, v \rangle = \langle v, 0_V \rangle = 0_K,$$

which follows from the linearity or semilinearity of the scalar product. In particular, the zero vector is therefore also orthogonal to itself. In a normed vector space, the norm of the zero vector satisfies

$$\| 0_V \| = 0_{\mathbb{R}},$$

and the zero vector is the only vector with this property, which follows from the definiteness and the absolute homogeneity of the norm.

In a seminormed space there can be more than one vector whose norm is zero, and such a vector is sometimes also called a null vector. In a Minkowski space, lightlike vectors are likewise referred to as null vectors. In these cases, however, the concept does not match the definition above.
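For the standard scalar product and Euclidean norm on $\mathbb{R}^2$, these two properties look as follows (a minimal NumPy sketch; the vector $(3, 4)$ is an arbitrary illustrative choice):

```python
import numpy as np

v = np.array([3.0, 4.0])
zero = np.zeros(2)

# The zero vector is orthogonal to every vector: <0, v> = 0
print(np.dot(zero, v))       # 0.0

# ...and it is the only vector with norm zero
print(np.linalg.norm(zero))  # 0.0
print(np.linalg.norm(v))     # 5.0
```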

### Cross product

In three-dimensional Euclidean space $V = \mathbb{R}^3$, the cross product of any vector with the zero vector $0 \in \mathbb{R}^3$ again yields the zero vector; that is,

$$v \times 0 = 0 \times v = 0.$$

The same holds for the cross product of a vector with itself:

$$v \times v = 0.$$

Furthermore, the Jacobi identity holds; that is, the cyclic sum of iterated cross products also yields the zero vector:

$$u \times (v \times w) + v \times (w \times u) + w \times (u \times v) = 0.$$
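These cross-product identities can be verified directly; the sketch below uses NumPy's `np.cross`, with the three sample vectors chosen arbitrarily for illustration:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = np.array([1.0, 2.0, 3.0])
zero = np.zeros(3)

# v x 0 = 0 and v x v = 0
print(np.array_equal(np.cross(v, zero), zero))  # True
print(np.array_equal(np.cross(v, v), zero))     # True

# Jacobi identity: u x (v x w) + v x (w x u) + w x (u x v) = 0
jacobi = (np.cross(u, np.cross(v, w))
          + np.cross(v, np.cross(w, u))
          + np.cross(w, np.cross(u, v)))
print(np.allclose(jacobi, zero))                # True
```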

## Use

### Linear combinations

For a given family of vectors $(v_i)_{i \in I}$ with index set $I$, the zero vector can always be expressed as a linear combination

$$0_V = \sum_{i \in I} \alpha_i \cdot v_i.$$

The vectors are linearly independent if and only if all coefficients in this linear combination must satisfy $\alpha_i = 0_K$. The zero vector can therefore never be part of a basis of a vector space, because by itself it is already linearly dependent. Every subspace of a vector space contains at least the zero vector. The set $\{0_V\}$, which consists only of the zero vector, forms the smallest possible subspace of a vector space, the zero vector space; its basis is the empty set $\emptyset$, because the empty sum of vectors gives, by definition, the zero vector; that is,

$$\sum_{i \in \emptyset} v_i = 0_V.$$
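The independence criterion can be tested numerically: vectors (taken as columns of a matrix) are linearly independent exactly when the matrix has full column rank, i.e. when only the trivial combination gives the zero vector. A sketch under that standard rank characterization, with illustrative sample vectors:

```python
import numpy as np

# Vectors as columns of A; they are linearly independent iff
# A @ alpha = 0 has only the trivial solution alpha = 0,
# i.e. iff A has full column rank.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.matrix_rank(A) == A.shape[1])  # True -> independent

# Any family containing the zero vector is linearly dependent
B = np.column_stack([A[:, 0], np.zeros(3)])
print(np.linalg.matrix_rank(B) == B.shape[1])  # False -> dependent
```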

### Linear maps

A linear map $T \colon V \to W$ between two vector spaces $V$ and $W$ over the same scalar field $K$ always maps the zero vector to the zero vector, because

$$T(0_V) = T(0_K \cdot 0_V) = 0_K \cdot T(0_V) = 0_W.$$

However, further vectors from $V$ may also be mapped to the zero vector of the target space $W$. The set of all such vectors is called the kernel of the linear map; it forms a subspace of $V$. A linear map is injective if and only if its kernel consists only of the zero vector.
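For a linear map given by a matrix, a basis of the kernel can be read off from a singular value decomposition: the right-singular vectors belonging to (numerically) zero singular values span it. A sketch of that standard construction, with an illustrative rank-1 matrix and tolerance:

```python
import numpy as np

# Kernel (null space) of the linear map x -> A @ x via SVD
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1, so the kernel is 2-dimensional

_, s, vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
kernel_basis = vt[rank:].T          # columns span the kernel

print(kernel_basis.shape[1])        # 2
# Every kernel vector is mapped to the zero vector
print(np.allclose(A @ kernel_basis, 0.0))  # True
```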

### Linear equations

A homogeneous linear equation

$$T(v) = 0_W$$

therefore always has at least the zero vector $v = 0_V$ as a solution. It is uniquely solvable if and only if the kernel of the linear operator $T$ consists only of the zero vector. Conversely, an inhomogeneous linear equation

$$T(v) = w$$

with $w \neq 0_W$ is never solved by the zero vector. An inhomogeneous linear equation, if it is solvable at all, is uniquely solvable if and only if the corresponding homogeneous equation has only the zero vector as a solution, which is a consequence of the superposition principle.
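This solution structure can be illustrated with an invertible matrix, whose kernel is $\{0\}$, so both the homogeneous and the inhomogeneous equation are uniquely solvable (a minimal NumPy sketch; the matrix and right-hand side are illustrative assumptions):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])  # invertible, so the kernel is {0}

# Homogeneous equation A x = 0: the only solution is the zero vector
x_hom = np.linalg.solve(A, np.zeros(2))
print(np.allclose(x_hom, 0.0))  # True

# Inhomogeneous equation A x = b with b != 0: a unique solution,
# and it is not the zero vector
b = np.array([4.0, 9.0])
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))    # True
```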