
The DeepMind Quest

Clearing the Linear Algebra chapter of "Mathematics for Machine Learning" and the strategy keeping me on track.


There is a seductive path in modern AI learning: jump straight into Python, import Scikit-learn or PyTorch, run a tutorial script, and watch the accuracy numbers go up. I took that path. It felt productive. I was "doing AI."

But there was always a nagging feeling, an imposter syndrome hovering in the background. I knew how to call the functions, but I didn’t truly understand why they worked. I was building a mental skyscraper on a foundation of sand. When things broke, or when I wanted to move beyond canned tutorials, I hit a wall. I realized something crucial was missing: the deep, intuitive understanding of the mathematics underneath the code.

My ultimate ambition is high: I want to work on cutting-edge AI at a place like Google DeepMind. To get there, "good enough" isn't good enough. I needed to stop, rewind, and build the foundation properly.

The Map: Mathematics for Machine Learning

I decided to tackle the definitive textbook: Mathematics for Machine Learning by Deisenroth, Faisal, and Ong. My goal is to internalize the "grammar" of ML before writing more "paragraphs" of code.

Today, I’m celebrating a significant milestone: I have completed Chapter 2: Linear Algebra. (The book starts counting content at Ch. 2).

This wasn't just about memorizing formulas. It was a journey from computation to intuition:

  1. The Mechanics: It started with the nuts and bolts: what a matrix is, how multiplication works, and how to solve systems of linear equations with Gaussian elimination.

  2. The Structure: Then came the abstraction: vector spaces, subspaces, and bases, and an understanding of where these mathematical objects live.

  3. The Breakthrough (Linear Mappings): This was the most critical shift. I stopped seeing matrices as just static grids of numbers and started seeing them as active transformations, machines that stretch, rotate, and squash vectors from one coordinate system to another.
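To make step 1 concrete, here is a toy Gaussian elimination in Python, the language I was already using before this detour. This is a minimal sketch for small, square, non-singular systems with partial pivoting, not a production solver:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by forward elimination and back-substitution.

    Toy implementation: assumes A is square and non-singular.
    """
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: zero out the entries below each pivot.
    for k in range(n):
        # Partial pivoting: bring the largest remaining pivot up.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back-substitution: solve for x from the last row upward.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(gaussian_elimination(A, b))  # [0.8 1.4]
```

Writing it out by hand like this, instead of calling `np.linalg.solve`, is exactly the computation-to-intuition shift the chapter is about.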

Understanding that matrix multiplication is essentially a transformation of space is a game-changer for visualizing what neural networks are actually doing.

$$\begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = x \begin{bmatrix} a \\ c \end{bmatrix} + y \begin{bmatrix} b \\ d \end{bmatrix} = \begin{bmatrix} ax + by \\ cx + dy \end{bmatrix}$$
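The two sides of that identity can be checked directly in NumPy (the numbers here are arbitrary examples of mine, not from the book):

```python
import numpy as np

# A @ v equals the weighted sum of A's columns shown above:
# x * (first column) + y * (second column).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([5.0, 6.0])

as_product = A @ v
as_column_mix = v[0] * A[:, 0] + v[1] * A[:, 1]

print(as_product)     # [17. 39.]
print(as_column_mix)  # [17. 39.]
```

Seeing the output vector as a blend of the matrix's columns is the "transformation of space" picture in code form.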

The Strategy: How I Stay Consistent

Math is hard. Self-studying math is even harder. Relying solely on willpower doesn't work for long-term goals like this. I needed a system.

My routine relies on a few key pillars to keep the momentum going:

  • The Pomodoro Technique: I break my study sessions into focused intervals. It prevents burnout and keeps my brain fresh for dense topics like null spaces and affine subspaces.

  • Virtual Accountability (James Scholz): Studying alone can be isolating. I sync my sessions with James Scholz’s "Study With Me" streams on YouTube. There is a powerful, unspoken motivation in working alongside thousands of others, even virtually.

  • Daily Inspiration (5amoljen): Following accounts like 5amoljen on Instagram provides that necessary jolt of daily discipline. Seeing others wake up early and grind reminds me that the work needs to be done, regardless of how I feel.

What’s Next

Clearing the Linear Algebra chapter feels great, but I’m not rushing to the next topic yet. Knowledge without application fades quickly.

My next step is to dive into the practice problems in the book. I need to prove to myself that I can apply these concepts (calculating ranks, finding kernels, and performing changes of basis) before moving on.
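Each of those three skills has a NumPy counterpart I can check my hand calculations against. A sketch, with a deliberately rank-deficient example matrix of my own choosing:

```python
import numpy as np

# Rank: the second row is 2x the first, so the rank is 1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
rank = np.linalg.matrix_rank(A)  # 1

# Kernel (null space) via the SVD: the right singular vectors
# with (numerically) zero singular values span ker(A).
_, s, Vt = np.linalg.svd(A)
kernel = Vt[rank:].T
print(np.allclose(A @ kernel, 0))  # True

# Change of basis: if the columns of B form a new basis, the
# coordinates of v in that basis are the solution of B c = v.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])
coords = np.linalg.solve(B, v)
print(np.allclose(B @ coords, v))  # True
```

The point of doing the exercises by hand first is that these one-liners stop being magic.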

Once I’m confident, I will begin Chapter 3: Analytic Geometry, where we will add lengths, angles, and distances to the vector spaces I just learned about.
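As a small preview of where Chapter 3 is headed (my own example, not the book's): the dot product is what equips a vector space with geometry, and lengths, distances, and angles all fall out of it.

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([4.0, 3.0])

length = np.linalg.norm(u)        # ||u|| = sqrt(u . u) = 5.0
distance = np.linalg.norm(u - v)  # ||u - v||

# The angle between u and v comes from cos(theta) = u.v / (||u|| ||v||).
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
angle = np.arccos(cos_theta)      # in radians

print(length, distance, angle)
```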

The journey to DeepMind is long, and I'm still near the starting line. But for the first time, I feel like I’m running on solid ground.