Linear Independence
Linear independence is a property of a set of vectors that describes whether any of the vectors can be expressed in terms of the others, that is, as a sum of scalar multiples of the other vectors.
Linear Combinations
A linear combination of elements in a set \(S\) of vectors in some vector space \(V\) is a finite sum of scalar multiples of vectors in \(S\). If \(n\) is a positive integer, \(a_1, \, a_2, \, \dots, \, a_n\) are nonzero elements of the underlying field, and \(v_1, \, v_2, \, \dots, \, v_n\) are vectors in \(V\), then
\[a_1 v_1 + a_2 v_2 + \dots + a_n v_n\]
is a linear combination.
Note that \(n = 0\) is allowed: the empty sum equals the zero vector \(\textbf{0}\), so for any set \(S\), the zero vector is a linear combination of elements of \(S\). By definition, this is the trivial linear combination.
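As a concrete illustration (the scalars and vectors below are arbitrary choices, not taken from the discussion above), here is a minimal NumPy sketch that forms a linear combination in \(\mathbb{R}^3\):

```python
import numpy as np

# Arbitrary illustrative scalars a_1, a_2, a_3 and vectors v_1, v_2, v_3 in R^3.
a = [2.0, -1.0, 0.5]
v = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 1.0])]

# The linear combination a_1 v_1 + a_2 v_2 + a_3 v_3.
combination = sum(a_i * v_i for a_i, v_i in zip(a, v))
print(combination)  # [ 2.5 -0.5  0.5]
```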
Linear combinations capture the idea of "reachable" vectors: those that can be reached from the elements of \(S\) by finitely many vector space operations (addition and scalar multiplication). The set of all linear combinations of \(S\) is therefore exactly the set of vectors reachable from \(S\), and it is a vector space in its own right, a subspace of \(V\). This set is known as the span of \(S\).
The question of whether or not a vector is a linear combination of other vectors arises in the discussion of the kernel and image of a linear transformation.
Is \((1,\,2,\,3)\) a linear combination of \((3,\,2,\,3)\) and \((2,\,2,\,3)?\)
Note that \((1,\,2,\,3) = -1 \cdot (3,\,2,\,3) + 2 \cdot (2,\,2,\,3)\). So it is a linear combination of those two vectors. \(_\square\)
Questions like these can be answered more generally with row reduction.
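For the example above, this amounts to row reducing the augmented matrix whose columns are the candidate vectors \((3,\,2,\,3)\) and \((2,\,2,\,3)\), with the target \((1,\,2,\,3)\) as the augmented column:
\[\left[\begin{array}{cc|c} 3 & 2 & 1 \\ 2 & 2 & 2 \\ 3 & 3 & 3 \end{array}\right] \longrightarrow \left[\begin{array}{cc|c} 1 & 0 & -1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{array}\right].\]
The system is consistent with unique solution \((-1,\,2)\), recovering the coefficients found above; a row of the form \([\,0 \;\; 0 \mid c\,]\) with \(c \neq 0\) would instead show that the target vector is not a linear combination of the given vectors.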
Linearly Dependent Sets
A set \(S\) is said to be linearly dependent if one of its elements is a linear combination of the other elements. In other words, there exists some vector \(v \in S\) such that \(v \in \text{Span}\big(S \setminus \{v\}\big)\).
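For a finite list of vectors, linear dependence can be tested numerically: the list is linearly dependent exactly when the rank of the matrix formed by the vectors is smaller than the number of vectors. Below is a minimal sketch of this check using NumPy; the helper function name is invented for illustration.

```python
import numpy as np

def is_linearly_dependent(vectors):
    """Illustrative helper: True if the given finite list of vectors
    (the rows of a matrix) is linearly dependent, i.e. rank < number of vectors."""
    M = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(M) < len(vectors)

print(is_linearly_dependent([[1, 0], [0, 1]]))   # False: independent
print(is_linearly_dependent([[1, 2], [2, 4]]))   # True: (2, 4) = 2 * (1, 2)
print(is_linearly_dependent([[1, 0, 0],
                             [0, 1, 0],
                             [1, 1, 0]]))        # True: third = first + second
```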