Linear combinations and independence, span, and bases

The idea of linear dependence is a central one in linear algebra: it underlies concepts such as the span of a set of vectors (and by extension the dimensionality of a vector space), the basis of a vector space, and so on. It is also the first idea we'll be discussing that gives us an appreciation of the importance of the underlying scalar field.

A linear combination of $n$ vectors $v_1,...,v_n$ is a vector of the form $a_1v_1+...+a_nv_n$, where the $a_i$ are scalars. In other words, it's a vector that can be obtained by scaling each of these vectors by an arbitrary amount and adding up the results. We immediately observe a few properties of this idea:
  • If $x_1,...,x_n$ are all linear combinations of a set of vectors $v_1,...,v_n$, then so is any linear combination of the $x$-vectors.
  • Each vector $v_i$ of $v_1,...,v_n$ is itself a linear combination of the vectors, namely the one with coefficients $(a_1,...,a_i,...,a_n)=(0,...,1,...,0)$.
  • The zero vector is always a linear combination of any set of vectors (take all the coefficients to be zero).
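As a small numerical sketch of the definition and the properties above (assuming NumPy; the particular vectors and coefficients are arbitrary illustrative choices):

```python
import numpy as np

# Two arbitrary vectors in R^3 (illustrative choices)
v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 3.0])

# A linear combination a1*v1 + a2*v2
a1, a2 = 2.0, -0.5
x = a1 * v1 + a2 * v2
print(x)  # [ 2.   3.5 -1.5]

# Each v_i is itself a linear combination, with coefficients (1, 0) or (0, 1)
assert np.allclose(1.0 * v1 + 0.0 * v2, v1)

# The zero vector is always a linear combination: take all coefficients to be zero
assert np.allclose(0.0 * v1 + 0.0 * v2, np.zeros(3))
```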
We can state the idea of a linear combination more formally in terms of sets -- given some vector space $V$ over an underlying field of scalars $F$: for every finite set of vectors $X\subseteq V$ with $n$ elements $v_1,...,v_n$, we can construct a set $Y\subseteq V$ such that (a) $\forall y\in Y$, there exists a finite sequence $a_1,...,a_n$ with every $a_i\in F$ and $\sum a_iv_i=y$, and (b) conversely, given any finite sequence $(a_1,...,a_n)\in F^n$, the sum $\sum a_iv_i\in Y$.

The set $Y$ in the above definition is known as the span of the set of vectors $X$. The span is, quite literally, the entire subspace of $V$ that can be spanned by scaling and adding the vectors in $X$.
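One way to test membership in a span numerically is to check whether appending the candidate vector as an extra column changes the rank. A minimal sketch, assuming NumPy (`in_span` is a hypothetical helper, not a library function):

```python
import numpy as np

def in_span(y, vectors, tol=1e-10):
    """y lies in the span of the given vectors iff appending y as an
    extra column does not increase the rank of the matrix they form."""
    A = np.column_stack(vectors)
    return (np.linalg.matrix_rank(np.column_stack([A, y]), tol=tol)
            == np.linalg.matrix_rank(A, tol=tol))

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 3.0])

print(in_span(2 * v1 - 0.5 * v2, [v1, v2]))          # True: built from v1 and v2
print(in_span(np.array([1.0, 0.0, 0.0]), [v1, v2]))  # False: leaves the plane spanned by v1, v2
```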

A linear subspace, or when the context is clear, a "subspace", is a subset of a linear vector space (or simply a "linear space") that carries the same operations of addition and scalar multiplication and is closed under them, i.e. it is a linear space in its own right. The only subspaces of $\mathbb{R}^n$ are flat sets -- points, lines, planes and their higher-dimensional analogues -- of dimension $n$ or lower passing through the origin (can you explain why this is so? Prove it formally). For instance, in the case of $\mathbb{R}^3$, all planes and lines passing through the origin, as well as the set $\{0\}$, are linear subspaces.
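As a quick numerical spot-check (not a proof) of the closure requirement, assuming NumPy; the plane $x+y+z=0$ in $\mathbb{R}^3$ is an illustrative choice:

```python
import numpy as np

# Two vectors on the plane x + y + z = 0, which passes through the origin
u = np.array([1.0, -1.0, 0.0])
w = np.array([2.0, 1.0, -3.0])

# Closure under addition and scalar multiplication: the results stay on the plane
assert abs(np.sum(u + w)) < 1e-12
assert abs(np.sum(-4.0 * u)) < 1e-12

# A shifted plane such as x + y + z = 1 fails: it does not contain the zero vector,
# so it cannot be closed under scaling by 0 and is not a subspace
```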

The set $v_1,...,v_n$ is said to be linearly independent if none of the vectors can be written as a linear combination of the others. In other words, as soon as even one of the vectors can be written as a linear combination of the others, the set is not linearly independent (note, however, that this does not mean every vector in a dependent set can be written this way -- why?). This idea will be very useful to us later on when we talk about defining a basis for a vector space, etc. An alternative formulation of linear independence can be obtained through some simple algebraic manipulation:

The vectors $v_1,...,v_n$ are linearly independent if and only if the only solution to the equation $a_1v_1+...+a_nv_n=0$ is $(a_1,...,a_n)=(0,...,0)$.
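A minimal sketch of checking this criterion numerically, assuming NumPy (`are_independent` is a hypothetical helper): the equation $a_1v_1+...+a_nv_n=0$ has only the zero solution exactly when the matrix whose columns are the vectors has full column rank.

```python
import numpy as np

def are_independent(vectors, tol=1e-10):
    """Vectors are linearly independent iff the matrix with these vectors
    as columns has rank equal to the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A, tol=tol) == len(vectors)

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
print(are_independent([v1, v2]))               # True
print(are_independent([v1, v2, v1 + 3 * v2]))  # False: the third is a combination of the first two
```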

Explain why this criterion works geometrically in $\mathbb R^n$. Keep in mind that the geometric explanation is usually closely related or equivalent to the algebraic one, much like formal proofs usually just formalise our intuition about something.

This idea of linear dependence and independence might remind you of systems of linear equations. Indeed, linear algebra provides us with several tools for solving such equations, which we will study in future.

If you return to this article once you have a good understanding of matrix multiplication, you'll notice that the condition $a_1v_1+...+a_nv_n=0$ for linear dependence is equivalent to the statement that the matrix-vector equation $\begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix}\begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix}=0$ (which we will write as $Bw=0$) has a solution other than the zero vector. A square matrix that maps some non-zero vector to the zero vector is called a singular matrix, and has a determinant of zero. In other words, having linearly dependent columns (and, it turns out, rows) is equivalent to being a singular matrix.
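A sketch of the same idea in code, assuming NumPy; the particular vectors are illustrative choices, with the third deliberately made a combination of the first two:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 3 * v2             # deliberately a combination of v1 and v2

B = np.column_stack([v1, v2, v3])
print(np.linalg.det(B))      # ~0: B is singular because its columns are dependent

# A non-zero solution w of B w = 0 can be read off the SVD:
# the right singular vector belonging to the (near-)zero singular value
_, s, Vt = np.linalg.svd(B)
w = Vt[-1]
print(s[-1])                 # smallest singular value, ~0
print(B @ w)                 # ~[0, 0, 0]: a non-zero w mapped to the zero vector
```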

An important theorem regarding span and linear independence is that the maximum cardinality of a linearly independent set of vectors in $\mathbb{R}^n$ is $n$. In fact, this is generally taken as the very definition of the dimension of a linear space -- the maximum cardinality of a linearly independent set of vectors in it.

Another way of saying this is that the span of $n$ linearly independent vectors in $\mathbb{R}^n$ is always $\mathbb{R}^n$.
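A numerical illustration of both statements, assuming NumPy; the vectors are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Any 4 vectors in R^3 are linearly dependent: the rank can be at most 3
four_vectors = rng.standard_normal((3, 4))   # columns are vectors in R^3
print(np.linalg.matrix_rank(four_vectors))   # at most 3, never 4

# 3 linearly independent vectors in R^3 span R^3:
# any target vector can be written as a combination of them
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 0.0]])              # columns are independent (det != 0)
target = np.array([5.0, -2.0, 7.0])
coeffs = np.linalg.solve(A, target)          # coefficients a_1, a_2, a_3
print(np.allclose(A @ coeffs, target))       # True
```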

Try to examine why this property is true by graphing the span of two vectors in $\mathbb{R}^2$. The following diagram should be of help.

[Figure: Span of two vectors -- draw parallel copies of the span of $v_2$ at each point on the span of $v_1$.]

Do the same for $\mathbb{R}^3$. Notice that if two vectors in $\mathbb{R}^2$ lie on the same line, they do not span the entire space. Similarly, if three vectors in $\mathbb{R}^3$ lie on a common plane, they do not span the entire space.
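One way to do the graphing exercise programmatically is to sample linear combinations over a grid of coefficients. A sketch, assuming NumPy and Matplotlib; the vectors are arbitrary illustrative choices:

```python
import numpy as np
import matplotlib.pyplot as plt

v1 = np.array([1.0, 0.5])
v2 = np.array([-0.5, 1.0])     # try v2 = 2 * v1 to see the span collapse to a line

# Sample linear combinations a1*v1 + a2*v2 over a grid of coefficients
a1, a2 = np.meshgrid(np.linspace(-3, 3, 31), np.linspace(-3, 3, 31))
points = a1[..., None] * v1 + a2[..., None] * v2   # shape (31, 31, 2)

plt.scatter(points[..., 0], points[..., 1], s=4)
plt.quiver([0, 0], [0, 0], [v1[0], v2[0]], [v1[1], v2[1]],
           angles='xy', scale_units='xy', scale=1, color='red')
plt.gca().set_aspect('equal')
plt.show()
```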

Compare this theorem with the idea that a system of $n$ linear equations has a unique solution only if there are $n$ variables (and the equations are independent). Indeed, this idea -- of switching between the "vector" interpretation and the "linear equation" interpretation -- is one that we will revisit later on with what are called the "row picture" and the "column picture".

We conclude with perhaps the most essential part of this article: if a vector space or subspace is the span of a linearly independent set of vectors, that set is called a basis of the space. For example, any two vectors in $\mathbb{R}^2$ that do not lie on the same line through the origin can be used as a basis for $\mathbb{R}^2$. We will often use, by default, the basis of mutually orthogonal coordinate unit vectors in $\mathbb{R}^n$, called the standard basis, for reasons that will eventually become apparent (although most of the linear algebra we'll study holds independent of the basis used).
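A small sketch of expressing the same vector in two different bases, assuming NumPy; the basis vectors here are an arbitrary illustrative choice:

```python
import numpy as np

# A non-standard basis of R^2: two vectors not lying on the same line through the origin
b1 = np.array([1.0, 1.0])
b2 = np.array([-1.0, 2.0])
B = np.column_stack([b1, b2])

x = np.array([3.0, 4.0])           # coordinates in the standard basis

# Coordinates of the same vector with respect to the basis {b1, b2}
coords = np.linalg.solve(B, x)
print(coords)                       # approximately [3.333, 0.333]
print(np.allclose(coords[0] * b1 + coords[1] * b2, x))  # True: same vector, different description
```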

However, it is important to understand that multiple bases -- and therefore multiple representations of the same vector -- exist, as this idea will eventually be studied in detail and has important applications in physics.
