
Building with Vectors: Unit Vectors, Linear Combinations, and Span Explained

5 min read · Jun 1, 2025


In the previous article, we explored what vectors are and how some basic operations on them work.

Now, let’s try performing some of those operations ourselves and see what they lead to.

But before we do that, let me introduce a special type of vector: the Unit Vector.

Unit Vector

When we represent a vector in 2D space, we’re essentially describing how to reach a point if we’re only allowed to move along the x-axis and y-axis. But a vector can also take us to that same point without having to follow those strict, perpendicular directions.

By the way, we’re using 2D space throughout this explanation because it’s easier to visualize and understand — but everything we talk about here applies just as well to higher dimensions.

Let’s revisit something we already touched on earlier.

Vector u can be written as the sum of vector v and vector w, meaning we can reach point B in space by first moving along vector v and then along vector w. Mathematically, that looks like:

u = v + w
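Here’s a minimal sketch of that idea in Python. The coordinates v = (4, 0) and w = (0, 2) are illustrative values, chosen to match the v = 4a and w = 2b scaling that comes up later in this article:

```python
def add(v, w):
    """Component-wise addition of two 2D vectors."""
    return (v[0] + w[0], v[1] + w[1])

v = (4, 0)  # move 4 units along the x-axis
w = (0, 2)  # then 2 units along the y-axis

u = add(v, w)
print(u)  # (4, 2): the point B we reach by following v, then w
```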

Now, imagine two vectors in 2D space:

  • One pointing along the x-axis
  • One pointing along the y-axis

And both have a magnitude of 1 — meaning they lie exactly one unit away from the origin in their respective directions.

Let’s call them vector a and vector b.

Does this remind you of anything?

Yep — they’re very similar to vector v and vector w from earlier! If you recall the concept of scaling and squishing vectors, this should feel familiar.

In fact, vector v is just 4 times vector a, which can be written as:

v = 4a

Similarly, vector w is 2 times vector b:

w = 2b

Notice something interesting?

Any vector that lies purely along the x-axis or y-axis is just a scaled version of vector a or vector b. You can scale them up or down using any number — positive or negative.
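In code, scaling the two unit vectors looks like this (a minimal sketch; `scale` is a hypothetical helper, not a library function):

```python
def scale(c, v):
    """Multiply a 2D vector by a scalar c."""
    return (c * v[0], c * v[1])

a = (1, 0)  # unit vector along the x-axis
b = (0, 1)  # unit vector along the y-axis

print(scale(4, a))   # (4, 0): vector v from earlier
print(scale(2, b))   # (0, 2): vector w from earlier
print(scale(-3, a))  # (-3, 0): negative scalars flip the direction
```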

These kinds of vectors are called Unit Vectors. Strictly speaking, a unit vector is any vector with a magnitude of exactly 1; the ones pointing along the axes are the standard unit vectors. In any N-dimensional space, there are exactly N standard unit vectors, one pointing along each axis of that space.

They also have special names and notation:

  • “i-hat” — lies along the x-axis, 1 unit from the origin.
  • “j-hat” — lies along the y-axis, 1 unit from the origin.

These are also known as the standard basis vectors in 2D space.

Think of a basis as the ultimate set of building blocks — you can use them to describe any point or vector in that space. Just combine and scale them, and you’ve got every position on the plane covered.

We’ll revisit the idea of basis vectors more deeply later. For now, just remember this:

  • Every unit vector is a basis vector, but not every basis vector is a unit vector.

Linear Combination

Now that we’ve established that any vector along the x-axis or y-axis can be represented as some scalar multiple of a unit vector, let’s explore whether all other vectors can be represented in a similar way using unit vectors.

Earlier, we saw that:

u = v + w

But vector v and vector w can also be written as:

v = 4a and w = 2b

Which implies:

u = 4a + 2b

The equation above expresses vector u as a linear combination of vectors a and b.

Here’s a formal definition of linear combinations:

  • A linear combination of vectors is any expression made by multiplying those vectors by scalars (numbers) and then adding the results.

We don’t always have to define linear combinations in terms of unit vectors. Take any two arbitrary vectors, say p and q. A linear combination could look like this:

c1·p + c2·q

No unit vectors involved — and voilà, it still works perfectly.
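The definition above can be sketched directly in Python. The `linear_combination` helper is hypothetical, and the specific numbers are illustrative:

```python
def linear_combination(scalars, vectors):
    """Scale each 2D vector by its scalar, then add the results."""
    x = sum(c * v[0] for c, v in zip(scalars, vectors))
    y = sum(c * v[1] for c, v in zip(scalars, vectors))
    return (x, y)

# With unit vectors: u = 4*a + 2*b
print(linear_combination([4, 2], [(1, 0), (0, 1)]))  # (4, 2)

# With two arbitrary vectors: no unit vectors involved
print(linear_combination([2, -1], [(1, 3), (2, 5)]))  # (0, 1)
```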

What Does a Linear Combination Represent?

A linear combination represents a way to create a new vector by scaling and adding other vectors.

For example, given vectors v1 and v2, a linear combination is:

c1·v1 + c2·v2

Depending on the values of c1 and c2, this combination creates different vectors, each one a point in the 2D plane.
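Sweeping over a few values of c1 and c2 makes this concrete. Here v1 and v2 are taken to be the standard unit vectors, and the scalar choices are purely illustrative:

```python
v1, v2 = (1, 0), (0, 1)

for c1, c2 in [(1, 1), (2, -3), (0.5, 0.5)]:
    # Each choice of scalars lands on a different point in the 2D plane.
    point = (c1 * v1[0] + c2 * v2[0], c1 * v1[1] + c2 * v2[1])
    print(f"c1={c1}, c2={c2} -> {point}")
```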

What’s “Linear” About Linear Combinations?

The term linear refers to a very specific kind of simplicity and structure in math:

Scaling + Adding (Only!)

A linear combination only allows:

  • Multiplying vectors by scalars (scaling),
  • Adding those scaled vectors together.

There are no powers, no products of vectors, no trig functions, no curves — just straight-line math.

That’s the “linear” part.

Span

So far, we’ve seen what unit vectors are and how we can reach different points in the plane by using linear combinations of two vectors.

To define the span:

  • The span of a set of vectors is the set of all possible vectors that can be created by taking all possible linear combinations of those vectors.

Example:

If you have two vectors, v and w, the span of vector v and vector w is all the vectors you can reach by scaling and adding vector v and vector w.

Geometrically,

In 2D space:
If you have two non-parallel vectors, the span of those vectors is the entire 2D plane. You can reach any point in the plane by appropriately scaling and adding those two vectors.

If the vectors are parallel, their span is just a line, since all combinations of the two vectors will lie along that line.
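One way to check this in code is the 2D determinant test: two vectors are parallel exactly when v_x·w_y - v_y·w_x equals zero. A minimal sketch (the helper name is hypothetical):

```python
def is_parallel_2d(v, w):
    """Two 2D vectors are parallel iff their 'cross' determinant is zero."""
    return v[0] * w[1] - v[1] * w[0] == 0

print(is_parallel_2d((1, 2), (2, 4)))  # True: the span is just a line
print(is_parallel_2d((1, 0), (0, 1)))  # False: the span is the whole 2D plane
```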

In 3D space:
If you have three non-coplanar vectors (i.e., they don’t all lie on the same plane), their span is the entire 3D space. Essentially, these vectors can “fill” the space, and any point in 3D can be reached by appropriately scaling and adding them.

If the vectors are coplanar (i.e., they lie on the same plane), the span will only cover a 2D plane within the 3D space. No matter how much you scale or combine them, you’ll always be confined to that plane.
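The 3D analogue of the parallel test is the scalar triple product u · (v × w): it is zero exactly when the three vectors are coplanar. A sketch with a hypothetical helper and illustrative vectors:

```python
def triple_product(u, v, w):
    """Scalar triple product u . (v x w); zero iff u, v, w are coplanar."""
    cx = v[1] * w[2] - v[2] * w[1]  # cross product v x w, x-component
    cy = v[2] * w[0] - v[0] * w[2]  # y-component
    cz = v[0] * w[1] - v[1] * w[0]  # z-component
    return u[0] * cx + u[1] * cy + u[2] * cz

print(triple_product((1, 0, 0), (0, 1, 0), (0, 0, 1)))  # 1: spans all of 3D
print(triple_product((1, 0, 0), (0, 1, 0), (1, 1, 0)))  # 0: all lie in the xy-plane
```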

We may explore this in more detail in later blogs.

Why is This Important for Learning AI?

Understanding vectors, linear combinations, and span is crucial in AI, particularly for machine learning and data science. Here’s why:

  • Feature Representation: Data is often represented as vectors, and linear combinations help model complex relationships for predictions.
  • Dimensionality Reduction: Span helps reduce high-dimensional data to lower dimensions while retaining important information (e.g., PCA).
  • Optimization: Many AI algorithms, like neural networks, involve optimizing functions in high-dimensional spaces, which requires understanding linear combinations and span.
  • Geometric Intuition: Machine learning involves geometric structures (e.g., decision boundaries), and understanding span and linear combinations aids in visualizing these structures in higher dimensions.

Conclusion

In this article, we’ve explored the fundamental concepts of unit vectors, linear combinations, and span, which form the basis for understanding vector spaces.

These principles are not just theoretical; they play a key role in AI, helping us model data, reduce its dimensionality, and visualize complex relationships.

As you continue learning AI, these foundational concepts will be essential for understanding machine learning algorithms, optimizations, and how intelligent systems make decisions.
