
Linear Algebra: The LEGO blocks of AI

3 min read · May 18, 2025


If you’re new to AI or have tried learning machine learning (ML) or artificial intelligence (AI) before and struggled (like I did), I have a suggestion: don’t rush into learning the models — don’t even start with the so-called easier ones, like regression and classification. Instead, try something different: learn Linear Algebra first (and later, probability and statistics as well).

Why Linear Algebra?

So, why Linear Algebra? What makes it so interesting, and how does it help you become an expert in machine learning and AI?

Many of us first met linear algebra in high school: matrices, vectors, and some maths around them. Believe it or not, that's the foundation of machine learning.

How Does a Human “Learn”?

To understand this better, let’s imagine how kids learn about the world around them. As their guardians, we might point to a cat and say, “This is a cat.” We repeat this process with different types of cats: black, white, fat, thin. We don’t explain each individual part of the cat — like its pointy ears, fluffy fur, or whiskers. They just learn intuitively what a cat looks like, and if they see a red cat one day, they’ll recognise it as a cat.

How does a Computer “Learn”?

Step 1: Pictures Become Numbers

Now, let’s think about how a computer perceives the world. Unlike us, computers can’t see things in their physical form. Instead, they translate everything into numbers. For images, each pixel becomes a number representing its brightness or colour, so a 28×28-pixel image is simply a grid of 784 numbers.

In this case, Linear Algebra helps us handle these large grids of numbers. These grids are usually represented as vectors or matrices.
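To make this concrete, here is a small sketch (using NumPy, with a randomly generated stand-in for a real image) of how a 28×28 grid of pixels becomes a matrix, and then a vector of 784 numbers:

```python
import numpy as np

# A toy 28x28 "image": each entry is a pixel brightness from 0 to 255.
# (Random numbers here, standing in for a real photo of a cat.)
image = np.random.randint(0, 256, size=(28, 28))

# The grid itself is a matrix with 28 rows and 28 columns...
print(image.shape)   # (28, 28)

# ...and flattening it gives a single vector of 28 * 28 = 784 numbers,
# the form many models actually consume.
vector = image.flatten()
print(vector.shape)  # (784,)
```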

Step 2: The AI Tries to Learn from the Picture

Once the computer has these numbers, it tries to figure out what’s in the picture. To do this, the AI looks for patterns in the numbers — such as edges, shapes, or colors — that typically indicate “cat.”

How does it recognise these patterns? By multiplying, adding, and transforming these numbers.

And guess what tools it uses to do that? Vectors and matrices — again, linear algebra!
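A rough way to picture this "multiplying and adding" (a simplified sketch, not how a real cat detector works): store each candidate pattern as a row of weights in a matrix, and one matrix-vector multiplication then scores every pattern against the image at once.

```python
import numpy as np

rng = np.random.default_rng(0)

# 784 pixel values for one flattened 28x28 image (random stand-in data).
pixels = rng.random(784)

# Each row is one "pattern detector": 784 weights, one per pixel.
# Here we imagine 10 detectors (edges, curves, whiskers, ...).
patterns = rng.random((10, 784))

# One matrix-vector multiplication computes, for every detector at once,
# a weighted sum of the pixels: multiply, add, done.
scores = patterns @ pixels
print(scores.shape)  # (10,) - one score per pattern
```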

Step 3: The AI Learns by Adjusting Numbers

Just like when you try a recipe for the first time and adjust ingredients to make it tastier, the AI keeps adjusting its weights to make its predictions more accurate.

These weights are stored as vectors or matrices too.

So, each time the AI “learns,” it’s:

- Performing matrix multiplications to transform the data (vectors).

- Calculating dot products to measure how similar two vectors are.

- Adjusting vector values to nudge the weights toward better predictions.

You guessed it — more linear algebra.
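The three bullets above can be sketched in a few lines. This is a minimal gradient-descent loop on made-up numbers (two features, three examples), not any particular model, but every step is one of the operations just listed:

```python
import numpy as np

# Hypothetical data: two features per example, one target value each.
X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.0]])
y = np.array([5.0, 4.5, 7.0])

w = np.zeros(2)   # the weights the model will adjust
lr = 0.01         # learning rate: how big each adjustment is

for _ in range(1000):
    pred = X @ w                  # matrix multiplication: make predictions
    error = pred - y              # how far off we are
    grad = X.T @ error / len(y)   # dot products: which way to adjust
    w -= lr * grad                # adjust the weight vector slightly

# After enough small adjustments, X @ w is close to y.
```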

Real-World Applications: Where AI Meets Linear Algebra

Now, you might be wondering: “How does all of this apply to real-world AI?” Well, here are some areas where linear algebra is essential:

- Image Recognition: AI models use convolutional neural networks (CNNs) to identify patterns in images — like distinguishing a cat from a dog.

- Predicting House Prices: Linear regression models use matrix operations to predict prices based on features like size and location.

- Recommender Systems: Ever wonder how Netflix recommends shows? It uses matrix factorization, which relies heavily on linear algebra.
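As a taste of the house-price example, here is a hedged sketch using ordinary least squares on invented numbers (sizes, distances, and prices are all made up for illustration):

```python
import numpy as np

# Hypothetical listings: [size in square metres, distance to centre in km]
features = np.array([
    [50.0, 10.0],
    [80.0,  5.0],
    [120.0, 2.0],
    [65.0,  8.0],
])
prices = np.array([150.0, 260.0, 420.0, 200.0])  # in thousands

# Add a column of ones so the model can learn a base price (intercept).
X = np.column_stack([np.ones(len(features)), features])

# Ordinary least squares: matrix operations find the best-fit weights.
weights, *_ = np.linalg.lstsq(X, prices, rcond=None)

# One matrix multiplication predicts the price of every house at once.
predicted = X @ weights
```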

Conclusion: Linear Algebra is the Foundation of AI

Linear algebra is at the heart of AI and machine learning because it allows us to work with large sets of data (like images, sound, or text) and extract useful patterns. It’s the “LEGO blocks” that allow AI to take raw data and learn from it to make predictions or decisions.

So, before diving into machine learning models, spend some time getting comfortable with linear algebra. Once you do, everything in AI and ML will start to make a lot more sense!
