    I think I finally understand what a neural network is

    Last night I was consumed with watching lessons from the online machine learning class at Stanford. The topic was neural networks, or rather the finer points of forward and back propagation.

    Every time I tried to learn how neural networks work, it just didn't click. Anyone I asked mostly just waved their hands and said that something magical happens ... but let's start at the beginning.

    The basic idea of a neural network is that you have neurons that are connected to other neurons. At one end you feed in your input data and at the other end the network produces some numbers according to what it has learned. Everyone can imagine this much and it's not really difficult to visualise.

    It looks like a bunch of circles with arrows:

    [Image: simple neural network, Italian nouns]

    Anyone who's studied this a tad further can tell you the connections between neurons are very important and the weights associated with them are somehow used in calculating stuff. Not a hard concept to grasp - every neuron outputs a number that is multiplied by the weight on each connection before being fed as an input into the next neuron.
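
    To make that concrete, here's a tiny Python sketch with made-up numbers showing what one neuron receives from the layer before it:

        # Made-up outputs from two neurons in the previous layer
        previous_outputs = [1.0, 0.5]
        # Made-up weights on the connections into the next neuron
        weights = [0.5, -2.0]

        # Each output is multiplied by its connection's weight,
        # and the next neuron receives the sum of those products
        input_to_next_neuron = sum(o * w for o, w in zip(previous_outputs, weights))
        print(input_to_next_neuron)  # 1.0*0.5 + 0.5*(-2.0) = 0.5 - 1.0 = -0.5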

    Where it always got a bit hairy for me was trying to understand what the neurons actually do. The most I could get out of anyone who supposedly knew this stuff was that "it calculates stuff". Yes, but how? What does it do? What exactly?

    Nobody knew.

    Last night I finally figured it out! Neurons don't do anything. They don't even exist per se. In fact a neural network looks pretty damn odd inside a computer: it's really just a bunch of weight matrices, one per layer.

    What happens when you're doing forward propagation (using a learned network) is simply this:

    1. Take the outputs from the previous layer (a vector of numbers)
    2. Multiply by the layer's matrix of weights (the arrows)
    3. Apply the activation function (the result becomes the next layer)

    Then you just repeat this for all the layers and that's that. That is literally all that happens.

    In the end you are left with a vector of numbers representing the output layer, which you then just have to correctly interpret.
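
    Here's the whole forward pass as a short Python sketch. The network shape, the weights and the input are all made up, and I'm using a sigmoid as the activation function - it's just to show that the "network" is nothing but matrix multiplications with a squashing function in between:

        import numpy as np

        def sigmoid(z):
            return 1 / (1 + np.exp(-z))

        # Made-up weight matrices for a tiny 3-2-2 network (bias terms left out for brevity)
        W1 = np.array([[0.2, -0.5,  1.0],
                       [0.7,  0.1, -0.3]])   # hidden layer: 2 neurons, 3 inputs each
        W2 = np.array([[ 0.6, -1.1],
                       [-0.4,  0.9]])        # output layer: 2 neurons, 2 hidden inputs each

        x = np.array([1.0, 0.5, -1.0])       # made-up input vector

        # Forward propagation: multiply by the weights, apply the activation, repeat
        hidden = sigmoid(W1 @ x)             # steps 1-3 for the hidden layer
        output = sigmoid(W2 @ hidden)        # same again for the output layer

        print(output)                        # a vector of numbers you still have to interpret,
        print(np.argmax(output))             # e.g. by picking the index with the largest value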

    The part I don't have completely figured out yet is backpropagation. This is the bit where neural networks learn how to do their magic. Basically backpropagation sets those weights from step 2 via gradient descent, which is essentially hill climbing in reverse ... it is a way to calculate the gradient of the cost function so that you can change the weights to get an ever smaller difference between what the network is supposed to output and what it actually outputs. Eventually you hope to reach the global minimum, but you're only guaranteed to reach a local minimum, without being able to tell whether it's the global one.
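
    I don't claim to fully get the gradient calculation yet, but the weight-update part on its own is simple. Here's a toy Python sketch with one made-up weight and a squared-error cost, just to show the "nudge the weight against the gradient" step that backpropagation repeats for every weight in the network:

        # Toy gradient descent: learn one weight w so that w * x matches a target.
        x, target = 2.0, 3.0     # one made-up training example: input 2.0 should give 3.0
        w = 0.0                  # start with an arbitrary weight
        alpha = 0.1              # learning rate

        for step in range(50):
            prediction = w * x
            error = prediction - target
            cost = error ** 2                 # the thing we're trying to minimise
            gradient = 2 * error * x          # d(cost)/dw
            w = w - alpha * gradient          # move downhill, against the gradient

        print(w)  # ends up close to 1.5, since 1.5 * 2.0 == 3.0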

    That's it. That is really all there is to it. Neural networks are just a nice way to visualise a sequence of matrix multiplications. And I guess it's easier to get grants for "neural networks" than "sequence of matrix multiplications" ...

    Published on November 9th, 2011 in Artificial intelligence, Artificial neural network, Learning, Neural network, People, Stanford, Uncategorized
