Linearly Independent Vectors

What is Linear Independence?

Linear independence is an important property of a set of vectors. A set of vectors is called linearly independent if no vector in the set can be expressed as a linear combination of the other vectors in the set. If any of the vectors can be expressed as a linear combination of the others, then the set is said to be linearly dependent.

Linearly independent sets are vital in linear algebra because a set of n linearly independent vectors spans an n-dimensional space. Any point in that space can be described as a unique linear combination of those n vectors.

An example should clarify the definition. Consider the two-dimensional Cartesian plane. We can define a point on the plane as (x, y). We could also write this as xî + yĵ, where î = (1, 0) and ĵ = (0, 1). î and ĵ are linearly independent. They also happen to be orthonormal, but this isn't necessarily the case for all linearly independent sets of vectors; if we define k̂ = (2, 1), then {î, k̂} is a linearly independent set, even though î and k̂ aren't orthogonal and k̂ isn't normalized. We can still define any point on the plane in terms of î and k̂ -- for example, the point (3, 2) can be expressed as -î + 2k̂, since -(1, 0) + 2(2, 1) = (3, 2).
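To make this concrete, here is a minimal sketch in Python (assuming NumPy is available) that recovers the coefficients of (3, 2) in the basis {î, k̂} by solving the corresponding linear system:

import numpy as np

# The basis vectors i_hat = (1, 0) and k_hat = (2, 1), stacked as columns.
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])

target = np.array([3.0, 2.0])

# Solve B @ coeffs = target for the coefficients (a, b) in a*i_hat + b*k_hat.
coeffs = np.linalg.solve(B, target)
print(coeffs)  # [-1.  2.], i.e. (3, 2) = -i_hat + 2*k_hat

Because {î, k̂} is linearly independent, B is invertible and the solution is unique.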


A Mathematical Example:

[Figure: composing the target vector as a linear combination of the basis vectors b^{(x)} and b^{(y)}. Chart from TheCleverMachine.]


The basis vectors b^{(x)} and b^{(y)} are linearly independent: there is no way to compose b^{(x)} as a linear combination of b^{(y)}, nor the other way around. When we look at this chart in 2D vector space, the blue and red lines are perpendicular to each other (orthogonal). However, linear independence can't always be checked visually in 2D space. To formally determine whether two column vectors are linearly independent, we calculate the column rank of a matrix A, which we form by concatenating the two vectors:


A = [b^{(x)}, b^{(y)}] = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}


The rank of a matrix is the number of linearly independent columns it contains. If the rank of A equals the number of columns in the matrix, then the columns of A form a linearly independent set of vectors. In our example, the rank of A is two, and the number of columns is two as well, so these basis vectors are linearly independent. The same rank-based test can also verify whether higher-dimensional vectors are linearly independent. Whenever we want a model to be uniquely defined, we will care about the linear independence of the basis set.
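To illustrate, here is a minimal sketch of this rank-based test in Python, again assuming NumPy is available; the helper name is_linearly_independent is ours, not a standard library function:

import numpy as np

def is_linearly_independent(vectors):
    # Stack the vectors as the columns of a matrix and compare its rank
    # to the number of columns.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

# The basis vectors b^(x) = (1, 0) and b^(y) = (0, 1) from the example above.
print(is_linearly_independent([np.array([1, 0]), np.array([0, 1])]))  # True

# A dependent pair: (2, 4) is just 2 * (1, 2).
print(is_linearly_independent([np.array([1, 2]), np.array([2, 4])]))  # False

The same function works unchanged for vectors of any dimension, which is exactly why the rank test is preferred over visual inspection.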
