In this video, I discuss how gradient descent can be used to adjust the weights during backpropagation in my "toy" JavaScript neural network library. A rough sketch of the idea is included below.
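As a minimal sketch of the weight-update idea (not the actual API of the toy library shown in the video), here is a single sigmoid neuron trained with gradient descent in plain JavaScript; all names (sigmoid, dsigmoid, learningRate) are illustrative assumptions:

```javascript
// Sigmoid activation and its derivative expressed in terms of the output y
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}
function dsigmoid(y) {
  return y * (1 - y);
}

const learningRate = 0.1;
let weights = [Math.random(), Math.random()];
let bias = Math.random();

// Tiny training set: learn the OR function
const data = [
  { inputs: [0, 0], target: 0 },
  { inputs: [0, 1], target: 1 },
  { inputs: [1, 0], target: 1 },
  { inputs: [1, 1], target: 1 },
];

for (let epoch = 0; epoch < 10000; epoch++) {
  for (const { inputs, target } of data) {
    // Feedforward: weighted sum plus bias, passed through the activation
    let sum = bias;
    for (let i = 0; i < weights.length; i++) sum += weights[i] * inputs[i];
    const output = sigmoid(sum);

    // Gradient descent: error times the slope of the activation
    const error = target - output;
    const gradient = error * dsigmoid(output);

    // Adjust each weight in proportion to its input, scaled by the learning rate
    for (let i = 0; i < weights.length; i++) {
      weights[i] += learningRate * gradient * inputs[i];
    }
    bias += learningRate * gradient;
  }
}

console.log(weights, bias);
```

In a multi-layer network, the same update is applied at each layer, with the error for hidden layers computed by propagating the output error backwards through the weights.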

Next video:

This video is part of Chapter 10 of The Nature of Code:

This video is also part of session 4 of my Spring 2017 ITP "Intelligence and Learning" course:

Support this channel on Patreon:
To buy Coding Train merchandise:
To donate to the Processing Foundation:

Send me your questions and coding challenges!:

Contact:
Twitter:
The Coding Train website:

Links discussed in this video:
The Coding Train on Amazon:
Deeplearn.js:
Backpropagation on Wikipedia:
Machine Learning for Artists:
Matrix Math website:

Videos mentioned in this video:
My Neural Networks series:
3Blue1Brown Neural Networks playlist:
3Blue1Brown’s Linear Algebra playlist:
My Video on Gradient Descent:
My Video on Perceptron:
My Video on Linear Regression:

Source Code for all the Video Lessons:

p5.js:
Processing:

The Nature of Code playlist:
For More Coding Challenges:
For More Intelligence and Learning: