Linear dependence/independence

Review: linear combination

Recall that a linear combination (weighted sum) of two vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ in $\mathbb{R}^n$ is an expression of the form \[ a \mathbf{v}_1 + b \mathbf{v}_2, \] where $a$ and $b$ are scalars (numbers).

In general, a linear combination of a set of vectors $\{ \mathbf{v}_1, \dots, \mathbf{v}_m \}$ in $\mathbb{R}^n$ is an expression of the form \[ c_1 \mathbf{v}_1 + \dots + c_m \, \mathbf{v}_m, \] where $c_1,\dots,c_m$ are scalars known as coefficients.
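
For example, taking $c_1 = 2$ and $c_2 = -1$ gives the linear combination \[ 2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} - 1 \begin{bmatrix} 1 \\ 3 \end{bmatrix} = \begin{bmatrix} 2 \\ 0 \end{bmatrix} + \begin{bmatrix} -1 \\ -3 \end{bmatrix} = \begin{bmatrix} 1 \\ -3 \end{bmatrix} \] of the vectors $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $\begin{bmatrix} 1 \\ 3 \end{bmatrix}$ in $\mathbb{R}^2$.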

Of course, for this to be meaningful, we require all the vectors to be of the same shape. Indeed, we will soon learn that this operation only makes sense when all the vectors come from the same "vector space".

Geometric interpretation

Vectors in $\mathbb{R}^2$ can represent geometric (free) vectors in the plane.

It is often convenient to visualize these vectors as directed line segments (arrows) in the plane whose initial points are at the origin.
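
For instance, the vector $\begin{bmatrix} 2 \\ 1 \end{bmatrix}$ is visualized as the arrow from the origin $(0,0)$ to the point $(2,1)$.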

Can we visualize linear combinations of two vectors?

Recall that grayscale images can be represented as vectors. E.g., an image of $100 \times 100$ pixels can be represented as a vector in $\mathbb{R}^{10000}$.

Following this interpretation, suppose we have two grayscale images represented by $\mathbf{v}_1$ and $\mathbf{v}_2$ respectively. What do you think is the meaning of the linear combination \[ \frac{1}{2} \mathbf{v}_1 + \frac{1}{2} \mathbf{v}_2 \, ? \]

Linear dependence/independence (set of 2)

$\{ \mathbf{v}_1, \mathbf{v}_2 \}$ is said to be linearly dependent if one of them is a scalar multiple of the other, including the cases where one or both are $\mathbf{0}$.

In contrast, $\{ \mathbf{v}_1, \mathbf{v}_2 \}$ is linearly independent if neither is a scalar multiple of the other. (In particular, neither vector can be $\mathbf{0}$.)

That is, a set $S$ of two vectors is linearly independent if it is not linearly dependent. Equivalently, $S$ is linearly dependent if it is not linearly independent.
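
For example, $\left\{ \begin{bmatrix} 3 \\ 6 \end{bmatrix} \,,\, \begin{bmatrix} 1 \\ 2 \end{bmatrix} \right\}$ is linearly dependent, since $\begin{bmatrix} 3 \\ 6 \end{bmatrix} = 3 \begin{bmatrix} 1 \\ 2 \end{bmatrix}$, whereas $\left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix} \,,\, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}$ is linearly independent, since neither vector is a scalar multiple of the other.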

In the following list of sets of vectors, which ones are linearly independent? Which ones are linearly dependent?
\[ S_1 = \left\{ \begin{bmatrix} 2 \\ 4 \end{bmatrix} \,,\, \begin{bmatrix} 1 \\ 2 \end{bmatrix} \right\} \] \[ S_2 = \left\{ \begin{bmatrix} 2 \\ 4 \end{bmatrix} \,,\, \begin{bmatrix} 1 \\ 5 \end{bmatrix} \right\} \]
\[ S_3 = \left\{ \begin{bmatrix} 2 \\ 4 \end{bmatrix} \,,\, \begin{bmatrix} 0 \\ 0 \end{bmatrix} \right\} \] \[ S_4 = \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix} \,,\, \begin{bmatrix} 0 \\ 1 \end{bmatrix} \right\} \]

Geometric interpretation

Vectors in $\mathbb{R}^2$ can represent geometric (free) vectors in the plane (think displacement).

Explain what it means for two vectors in $\mathbb{R}^2$ to be linearly independent (vs. linearly dependent).
Determine if the following sets are linearly independent or linearly dependent. \[ \begin{aligned} \left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} , \begin{bmatrix} 3 \\ 2 \\ 1 \end{bmatrix} \right\} && \left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} , \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \right\} && \left\{ \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix} , \begin{bmatrix} -3 \\ -6 \\ -3 \end{bmatrix} \right\} \end{aligned} \]

Linear dependence

A nonempty set of vectors $\{ \mathbf{v}_1, \dots, \mathbf{v}_m \} \subset \mathbb{R}^n$ is said to be linearly dependent if there are real numbers $c_1,\dots,c_m$, not all zero, such that \[ c_1 \mathbf{v}_1 + \cdots + c_m \mathbf{v}_m = \mathbf{0}. \]

A simpler way to say this is that a set of vectors is linearly dependent if...

  • One vector is a linear combination of the other vectors, or
  • At least one of them is the zero vector (this is actually included in the previous case).
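
As an illustration of the definition above, the set $\left\{ \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} , \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} , \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix} \right\} \subset \mathbb{R}^3$ is linearly dependent, since \[ 1 \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + 1 \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} - 1 \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix} = \mathbf{0}, \] a linear combination with coefficients $1, 1, -1$, not all zero. Equivalently, the third vector is the sum of the first two.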

Linear independence

A set that is not linearly dependent is said to be linearly independent.

More explicitly: a set of vectors $\{ \mathbf{v}_1, \dots, \mathbf{v}_m \}$ is linearly independent if the equation \[ c_1 \mathbf{v}_1 + \cdots + c_m \mathbf{v}_m = \mathbf{0}, \] with real numbers $c_1,\dots,c_m$, implies that $c_1 = \cdots = c_m = 0$.

That is, for a linearly independent set of vectors, the only way to create $\mathbf{0}$ as a linear combination of this set is to use all zero coefficients.
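
For example, to check that $\left\{ \begin{bmatrix} 1 \\ 2 \end{bmatrix} \,,\, \begin{bmatrix} 3 \\ 4 \end{bmatrix} \right\}$ is linearly independent, suppose \[ c_1 \begin{bmatrix} 1 \\ 2 \end{bmatrix} + c_2 \begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \] Comparing components gives $c_1 + 3 c_2 = 0$ and $2 c_1 + 4 c_2 = 0$. The first equation gives $c_1 = -3 c_2$; substituting into the second gives $-6 c_2 + 4 c_2 = -2 c_2 = 0$, so $c_2 = 0$ and hence $c_1 = 0$. Since only the all-zero coefficients work, the set is linearly independent.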

The case of the empty set

In the above, we specifically required the set to be nonempty. So one question remains: "what about $\{ \, \}$?" Should the empty set be considered linearly dependent or linearly independent?

The convention is that the empty set $\varnothing = \{\,\}$ is linearly independent.

This is actually consistent with, and implied by, the definition of linear independence stated above: with no vectors there are no coefficients to choose, so the condition defining linear dependence can never be satisfied, and the empty set is (vacuously) linearly independent.

Can 3 vectors in $\mathbb{R}^2$ be linearly independent? Explain your reasoning.
What about $n+1$ vectors in $\mathbb{R}^n$? Explain your reasoning.

Recall that grayscale images can be represented as vectors. E.g., an image of $100 \times 100$ pixels can be represented as a vector in $\mathbb{R}^{10000}$.

In this context, what does it mean for a set of two images (considered as vectors) to be linearly independent?