Swizec Teller - a geek with a hat (swizec.com)

I think I finally understand what a neural network is

Last night I was consumed with watching lessons from the online machine learning class at Stanford. The topic was neural networks, or rather the finer points of forward and back propagation.

Every time I have wanted to learn how neural networks work it just didn't click. Anyone I asked mostly just waved their hands and said that something magical happens and ... but let's start at the beginning.

The basics of neural networks is that you have neurons that are connected to other neurons. At one end you enter your input data and on the other end the neural network produces some numbers according to what it has learned. Everyone can imagine this much and it's not really difficult to visualise.

It looks like a bunch of circles with arrows:

[Image: a simple neural network classifying Italian nouns]

Anyone who's studied this a tad further can tell you the connections between neurons are very important and the weights associated with them are somehow used in calculating stuff. Not a hard concept to grasp - every neuron outputs a number that is multiplied with the weight on each connection before being fed as an input into the next neuron.
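That multiply-and-feed-forward step for a single neuron is just a weighted sum. A quick sketch with made-up numbers:

```python
inputs = [0.5, -1.0, 2.0]   # outputs of the previous layer's neurons
weights = [0.8, 0.1, 0.4]   # one weight per incoming connection (arrow)

# each output is scaled by its connection's weight, then summed up:
# 0.5*0.8 + (-1.0)*0.1 + 2.0*0.4 = 1.1
weighted_sum = sum(x * w for x, w in zip(inputs, weights))
```

That sum is what gets fed into the next neuron.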

Where it always got a bit hairy for me was trying to understand what the neurons actually do. The most I could get out of anyone supposedly knowing this stuff was that "it calculates stuff". Yes, but how? What does it calculate? What exactly?

Nobody knew.

Last night I finally figured it out! Neurons don't do anything. They don't even exist per se. In fact, a neural network looks pretty damn odd inside a computer: it's really just a matrix of weights.

What happens when you're doing forward propagation (using a learned network) is simply this:

  1. Take the outputs from the previous layer (a vector of numbers)
  2. Multiply by the matrix of weights (the arrows)
  3. Apply the activation function, e.g. the sigmoid (the result is the new layer)

Then you just repeat this for all the layers and that's that. That is literally all that happens.
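Here's what that repetition looks like in numpy. The layer sizes and weights below are made up, and the sigmoid is standing in as the per-layer squashing function:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weight_matrices):
    """Run the input through every layer: multiply, squash, repeat."""
    a = x
    for W in weight_matrices:
        a = sigmoid(W @ a)  # steps 2 and 3 for one layer
    return a

# a made-up network: 3 inputs -> 4 hidden neurons -> 2 outputs
rng = np.random.default_rng(42)
weight_matrices = [rng.standard_normal((4, 3)),
                   rng.standard_normal((2, 4))]

output = forward(np.array([1.0, 0.5, -0.2]), weight_matrices)
# output is a vector of 2 numbers, each squashed between 0 and 1
```

Notice there are no neuron objects anywhere, just one matrix per layer.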

In the end you are left with a vector of numbers representing the output layer, which you then just have to correctly interpret.

The part I haven't completely figured out yet is backpropagation. This is the bit where neural networks learn how to do their magic. Basically, backpropagation sets those weights from step 2 via gradient descent ... it is essentially a way to calculate the gradient of the cost function so you can nudge the weights to shrink the difference between what the network should output and what it actually outputs. Eventually you hope to reach a global minimum, but you are only guaranteed to reach a local minimum, with no way to tell whether it's global.
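A minimal sketch of that weight-nudging idea, on a single sigmoid neuron with a made-up OR-style target (this is plain gradient descent on the cost, not the full multi-layer backpropagation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# made-up training data: inputs and the outputs we want
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

w = np.zeros(2)   # the weights we're learning
b = 0.0           # bias term
learning_rate = 1.0

for _ in range(1000):
    predictions = sigmoid(X @ w + b)
    error = predictions - y        # what we got vs what we wanted
    grad_w = X.T @ error / len(y)  # gradient of the cost w.r.t. weights
    grad_b = error.mean()
    w -= learning_rate * grad_w    # step downhill along the gradient
    b -= learning_rate * grad_b

# after enough steps the rounded predictions match the targets
```

Real backpropagation does the same downhill stepping, but propagates the error backwards through every layer's weight matrix at once.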

That's it. That is really all there is to it. Neural networks are just a nice way to visualise a sequence of matrix multiplications. And I guess it's easier to get grants for "neural networks" than "sequence of matrix multiplications" ...


Published on November 9th, 2011 in Artificial intelligence, Artificial neural network, Learning, Neural network, People, Stanford, Uncategorized
