Definition
The mathematically precise way to define a vector requires the algebraic structure of a vector space: a vector is simply an element of a vector space. A vector space captures the concept of a
linear combination of its elements.
Vector Space (Linear Space)

A vector space $\mathbb{V}=\{V,\mathbb{F}\}$ consists of an additive abelian (commutative) group $V$, whose elements $u,v,\ldots$ are called vectors, together with a field $\mathbb{F}$ whose elements are called scalars. The law of composition $u+v$ defining the abelian group is called vector addition. There is also an operation $\mathbb{F} \times V \to V$ called scalar multiplication, which assigns a vector $au \in V$ to any pair $(a,u) \in \mathbb{F} \times V$. The identity element $0$ for vector addition is called the zero vector, and the inverse of any vector $u$ is denoted $-u$. A vector space $\{V,\mathbb{F}\}$ is often referred to as a vector space $V$ over a field $\mathbb{F}$. If $\mathbb{F}=\mathbb{R}$, $V$ is called a real vector space; similarly, if $\mathbb{F}=\mathbb{Q}$ or $\mathbb{F}=\mathbb{C}$, $V$ is called a rational or complex vector space. Vectors are often given a distinct notation such as $\vec u$ to distinguish them from scalars.
The axioms of the operations on vectors are summarized below.
Addition

Closure

\[\vec x,\vec y \in V \Rightarrow \vec x + \vec y \in V\]

Associativity

\[\vec x + (\vec y + \vec z) = (\vec x + \vec y) + \vec z\]

Commutativity

\[\vec x + \vec y=\vec y + \vec x\]

There exists an additive identity $\vec 0 \in V$ such that:

\[\vec x \in V \Rightarrow \vec x + \vec 0 = \vec x\]

Every vector $\vec x$ has an additive inverse $-\vec x$ such that:

\[\vec x + (-\vec x) = \vec 0\]
Multiplication

Closure

\[\alpha \in \mathbb{F}, \vec x \in V \Rightarrow \alpha\vec x \in V\]

Associativity

\[(\alpha\beta)\vec x=\alpha(\beta \vec x)\]

Distributivity

\[\alpha(\vec x + \vec y)=\alpha \vec x + \alpha \vec y, \qquad (\alpha+\beta)\vec x =\alpha \vec x + \beta \vec x\]

There exists a multiplicative identity, $1 \in \mathbb{F}$ such that:

\[1\vec x=\vec x\]
The definition of a vector space has no notion of position (points), distance, or angle. It is just a structure where we can perform arithmetic operations on its elements, as in the number systems. Vectors have no fixed position; they float around in space and are therefore sometimes called free vectors.
The set $\mathbb{R}^n$ of all $n$-tuples $\vec x=(x^1,x^2,\ldots,x^n)$ with $x^i \in \mathbb{R}$ is a vector space with the operations of addition and scalar multiplication defined as follows:
\begin{align*}
\boldsymbol{x}+\boldsymbol{y}&=(x^1+y^1,x^2+y^2,\ldots,x^n+y^n)\\
a\boldsymbol{x}&=(ax^1,ax^2,\ldots,ax^n)
\end{align*}
The axioms of the vector space follow directly from the axioms of the real numbers. This vector space is called the $n$-dimensional real coordinate space.
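As a sketch in plain Python (no external libraries; the function names `add` and `scale` are illustrative choices, not a standard API), the two operations on $\mathbb{R}^n$ act componentwise on $n$-tuples:

```python
def add(x, y):
    """Vector addition: componentwise sum of two n-tuples."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def scale(a, x):
    """Scalar multiplication: multiply every component by the scalar a."""
    return tuple(a * xi for xi in x)

x = (1.0, 2.0, 3.0)
y = (4.0, 5.0, 6.0)

print(add(x, y))      # (5.0, 7.0, 9.0)
print(scale(2.0, x))  # (2.0, 4.0, 6.0)

# A few of the axioms, checked numerically:
assert add(x, y) == add(y, x)                # commutativity
assert add(x, (0.0, 0.0, 0.0)) == x          # additive identity
assert scale(2.0, add(x, y)) == add(scale(2.0, x), scale(2.0, y))  # distributivity
```

Each axiom reduces to the corresponding property of real-number arithmetic applied in every component.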
Linearity
We define first the vocabulary used for important relations between the objects of a vector space which all relate to the concept of linear combinations.
Linear combination

Let $\mathbb{V}=(V,\mathbb{F})$ be a vector space, $\{\vec{v}_i \}\subset V$ and $\lambda_i \in \mathbb{F}$; then we say that $\vec x$ is a linear combination of $\{\vec{v}_i\}$ if:
\[
\vec x = \sum_{i=1}^n \lambda_i\vec{v}_i
\]
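Numerically, forming a linear combination is just a weighted sum of vectors; a small sketch in NumPy (the vectors and coefficients are arbitrary example values):

```python
import numpy as np

# A linear combination x = sum_i lambda_i v_i of three vectors in R^3.
v = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 1.0])]
lam = [2.0, -1.0, 3.0]

x = sum(l * vi for l, vi in zip(lam, v))
print(x)  # [5. 2. 3.]
```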
Linearly dependent /linearly independent

Let $\mathbb{V}=(V,\mathbb{F})$ be a vector space and $\{\vec{v}_i \}\subset V$; then we say $\{\vec{v}_i\}$ is linearly dependent if there exists a set of scalars $\{\lambda_i\}$, not all zero, such that
\[
\vec 0 = \sum_{i=1}^n \lambda_i\vec{v}_i
\]
If, on the other hand, $\vec 0 = \sum_{i=1}^n \lambda_i\vec{v}_i$ implies that $\lambda_i=0$ for each $i$, then the set $\{\vec{v}_i\}$ is linearly independent.
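For vectors in $\mathbb{R}^n$, linear independence can be tested numerically: the set is independent exactly when the matrix having the vectors as columns has rank equal to the number of vectors. A sketch using NumPy (the helper name is an assumption):

```python
import numpy as np

def linearly_independent(vectors):
    """Return True iff the given vectors in R^n are linearly independent.

    The vectors are independent exactly when the matrix with the
    vectors as columns has rank equal to the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(linearly_independent([e1, e2]))            # True
print(linearly_independent([e1, e2, e1 + e2]))   # False: e1 + e2 depends on e1, e2
```

Note that rank computations use a floating-point tolerance, so this is a numerical test, not an exact algebraic one.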
There exists a relation between these two concepts, which is formulated in the following proposition.
Proposition 1

$\{\vec{x}_i\}$ is linearly dependent $\Leftrightarrow$ there exists an $\vec{x}_k$ with $2\leq k\leq n$ which is a linear combination of the preceding ones.

$\Rightarrow$: $\{\vec{x}_i\}$ is linearly dependent and we must prove that for some $k$, $\vec{x}_k$ is a linear combination of the preceding ones. Let $k$ be the first integer for which $\vec{x}_1, \ldots,\vec{x}_k$ are linearly dependent. Then
\[\lambda_1\vec{x}_1+\dots+\lambda_k\vec{x}_k=0\]
with the $\lambda_i$ not all zero. Furthermore $\lambda_k \neq 0$, because $k$ is the first integer for which $\vec{x}_1, \ldots,\vec{x}_k$ are linearly dependent; if $\lambda_k$ were zero, $\vec{x}_1, \ldots,\vec{x}_{k-1}$ would already be dependent. Hence $\vec{x}_k$ is a linear combination of the preceding ones:
\[
\vec{x}_k=-\frac{1}{\lambda_k}\sum_{i=1}^{k-1}\lambda_i \vec{x}_i
\]
$\Leftarrow$: $\vec{x}_k=\sum_{i=1}^{k-1}\lambda_i\vec{x}_i$ and we must prove that $\{\vec{x}_i\}$ is linearly dependent.
But this is easy as
\[
\sum_{i=1}^{k-1}\lambda_i \vec{x}_i - \vec{x}_k + \sum_{i=k+1}^{n}0\,\vec{x}_i = \vec 0
\]
with coefficients not all zero, since the coefficient of $\vec{x}_k$ is $-1$.
Q.E.D.
Basis

A linear basis or coordinate system of a vector space $\mathbb{V}$ is a set $\{\vec{b}_i\}$ of linearly independent vectors such that every vector of $\mathbb{V}$ is a linear combination of the elements of $\{\vec{b}_i\}$:
\[
\vec x = \sum_i x^i\vec{b}_i=x^i\vec{b}_i
\]
The $x^i$ are called the components or coordinates of $\vec x$ in the basis $\{\vec{b}_i\}$. A vector space is finite-dimensional if it has a finite basis. The number of vectors in a basis is called the dimension of the finite-dimensional vector space $\mathbb{V}$.
We used the Einstein summation convention: if the same index appears as a superscript and as a subscript in the same term, summation over this index is assumed, which avoids writing the summation sign.
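The convention maps directly onto NumPy's `einsum`, which sums over every index repeated in its subscript string; the basis and components below are arbitrary example values:

```python
import numpy as np

# x = x^i b_i: the repeated index i is summed over.
b = np.array([[1.0, 0.0],    # row 0: basis vector b_1
              [1.0, 1.0]])   # row 1: basis vector b_2
coords = np.array([2.0, 3.0])  # components x^1, x^2

x = np.einsum('i,ij->j', coords, b)   # x_j = x^i (b_i)_j
print(x)  # [5. 3.]
```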
The $x^i$ are uniquely determined by the basis. This follows from the following argument: let $\vec x=x^i\vec{b}_i=x'^i\vec{b}_i$; then we must have $(x^i-x'^i)\vec{b}_i=\vec 0$, and as the $\{\vec{b}_i\}$ are linearly independent we must have $x^i=x'^i$.
A natural basis for the $n$-dimensional vector space $\mathbb{R}^n$ is defined by:
\[
\vec{e}_i, \text{ with } e_i^j=\delta_{ij} (i=1,\ldots,n)
\]
Where $\delta_{ij}$ is the
Kronecker delta defined by:
\[
\delta_{ij}=
\begin{cases}
1, & \mbox{if } i=j\\
0, & \mbox{if } i\neq j
\end{cases}
\]
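In NumPy the natural basis is simply the identity matrix, whose entries realize the Kronecker delta, and the components of any $\vec x \in \mathbb{R}^n$ in this basis are just its entries:

```python
import numpy as np

n = 3
E = np.eye(n)   # row i is the natural basis vector e_i, so E[i, j] = delta_ij

x = np.array([4.0, -1.0, 2.5])
# Expand x in the natural basis: x = x^i e_i.
reconstructed = sum(x[i] * E[i] for i in range(n))
assert np.allclose(reconstructed, x)   # components equal the entries of x
```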
Some new vocabulary now.

Let $\mathbb{V}$ and $\mathbb{W}$ be two vector spaces over the same field $\mathbb{F}$ and let $T:\mathbb{V} \to \mathbb{W}$ be a map. $T$ is called linear, or a vector space homomorphism, if:
\[
T(a \vec x+b \vec y) = aT(\vec x) + b T(\vec y)
\]
for all $a,b \in \mathbb{F}$ and all $\vec x, \vec y \in \mathbb{V}$. $T$ is called a linear operator if it is a linear map $T:\mathbb{V} \to \mathbb{V}$. If $T$ is also bijective (one-to-one and onto) then it is called a vector space isomorphism and the two vector spaces are called isomorphic. Then the inverse map $T^{-1}$ also exists and is an isomorphism. A bijective linear map $T:\mathbb{V} \to \mathbb{V}$ is called a linear transformation.
A homomorphism preserves the basic operations of vector addition and scalar multiplication between two vector spaces. Two vector spaces that are isomorphic are essentially identical in all their properties.
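On $\mathbb{R}^n$, every matrix defines a linear map, which makes the defining property easy to check numerically; a sketch (the matrix and vectors are random example data):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # an arbitrary 3x3 matrix

def T(v):
    """The map v -> A v is linear from R^3 to R^3."""
    return A @ v

x, y = rng.standard_normal(3), rng.standard_normal(3)
a, b = 2.0, -0.5

# The defining property of a homomorphism: T(a x + b y) = a T(x) + b T(y).
print(np.allclose(T(a * x + b * y), a * T(x) + b * T(y)))  # True
```

If $A$ is additionally invertible, the map $T$ is bijective and hence an isomorphism of $\mathbb{R}^3$ with itself.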
Proposition 2

Every $n$-dimensional vector space $\mathbb{V}$ over a field $\mathbb{F}$ is isomorphic to $\mathbb{F}^n$.

Let $\{\vec{b}_1,\ldots,\vec{b}_n\}$ be a basis for $\mathbb{V}$. We must show that there exists a map $T:\mathbb{V} \to \mathbb{F}^n$ which is bijective and linear. Every vector $\vec x$ is a unique linear combination of $\{\vec{b}_1,\ldots,\vec{b}_n\}$, so $\vec x=\lambda^i\vec{b}_i$. Then $T(\vec x)=(\lambda^1,\ldots,\lambda^n)$ is a well-defined bijective map, since the components $\lambda^i$ are uniquely determined by $\vec x$ and every $n$-tuple in $\mathbb{F}^n$ determines a vector. Let $\vec y=\mu^i\vec{b}_i$; then it follows that $T(\alpha \vec x + \beta \vec y)=(\alpha \lambda^1+\beta \mu^1,\ldots, \alpha \lambda^n+\beta \mu^n)=\alpha T(\vec x) + \beta T(\vec y)$. So $T$ is bijective and linear.
Q.E.D.
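The isomorphism $T$ of Proposition 2 can be sketched concretely for $\mathbb{R}^2$: with the basis vectors as columns of a matrix $B$, computing $T(\vec x)$ means solving $B\lambda=\vec x$ for the coordinates $\lambda$, and $T^{-1}$ is multiplication by $B$. The basis below is an arbitrary example:

```python
import numpy as np

# Columns of B are the basis vectors b_1 = (1, 1) and b_2 = (0, 1).
B = np.array([[1.0, 0.0],
              [1.0, 1.0]])

def T(x):
    """Map x to its coordinate tuple (lambda^1, lambda^2) in the basis B."""
    return np.linalg.solve(B, x)

def T_inv(coords):
    """Inverse isomorphism: rebuild x from its coordinates."""
    return B @ coords

x = np.array([3.0, 5.0])
coords = T(x)
print(coords)  # [3. 2.]  since x = 3 b_1 + 2 b_2

# T is linear and invertible:
y = np.array([-1.0, 2.0])
assert np.allclose(T(2 * x + 3 * y), 2 * T(x) + 3 * T(y))
assert np.allclose(T_inv(coords), x)
```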
We could formulate the whole theory of vector spaces in $\mathbb{F}^n$. However, one then loses some generality, because $\mathbb{F}^n$ is equipped with a straight-line coordinate system.
Geometric representations of linear objects are lines and planes in $\mathbb{R}^2$ or $\mathbb{R}^3$. A more abstract form of linear objects are called subspaces or linear manifolds. These are linear objects, like lines and planes, which pass through the origin. The formal definition follows.
Subspace

A nonempty subset $\mathbb{M}$ of the vector space $\mathbb{V}$ is a subspace or linear manifold if and only if:
\[
\forall \vec x, \vec y \in \mathbb{M},\ \forall \alpha \in \mathbb{F}: \quad \vec x + \alpha \vec y \in \mathbb{M}
\]
A subspace is a vector space itself. $\{\vec 0\}$ and $\mathbb{V}$ are both subspaces of $\mathbb{V}$.
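The one-line subspace criterion can be checked numerically; a sketch for the plane $\mathbb{M}=\{(x^1,x^2,0)\}\subset\mathbb{R}^3$ through the origin (random sample points are an illustrative choice):

```python
import numpy as np

def in_M(v):
    """Membership test for M = {(x1, x2, 0)}, a plane through the origin."""
    return np.isclose(v[2], 0.0)

rng = np.random.default_rng(1)
for _ in range(100):
    x = np.array([*rng.standard_normal(2), 0.0])
    y = np.array([*rng.standard_normal(2), 0.0])
    a = rng.standard_normal()
    # The criterion: x, y in M and alpha in F imply x + alpha y in M.
    assert in_M(x + a * y)

# The shifted plane {(x1, x2, 1)} fails the criterion: it does not even
# contain the zero vector, so it is not a subspace.
```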