This live stream is a continuation of my Neural Networks series. I attempt to discuss backpropagation, but it mostly goes haywire.

30:20 – Reading Random Numbers
36:22 – Backpropagation Part 1
1:06:56 – Backpropagation Part 2
1:40:25 – Learning with Gradient Descent
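
Below is a minimal sketch (plain JavaScript, not code from the stream) of a single gradient descent step for one sigmoid neuron, the idea behind the backpropagation and gradient descent chapters above. Every name in it (weight, bias, learningRate, and so on) is an illustrative assumption, not something taken from the video.

// Sigmoid activation and its derivative (expressed via the output y = sigmoid(x)).
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}
function dsigmoid(y) {
  return y * (1 - y);
}

let weight = 0.5;            // illustrative starting weight
let bias = 0.1;              // illustrative starting bias
const learningRate = 0.1;    // illustrative learning rate

// One training example: input x with desired output target.
const x = 1.0;
const target = 0.0;

// Forward pass.
const output = sigmoid(weight * x + bias);

// Backward pass: squared-error gradient pushed through the sigmoid (chain rule).
const error = output - target;
const gradient = error * dsigmoid(output);

// Gradient descent update.
weight -= learningRate * gradient * x;
bias -= learningRate * gradient;

console.log(output, weight, bias);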

Schedule and topics:

Support this channel on Patreon:
To buy Coding Train merchandise:
To support the Processing Foundation:

Send me your questions and coding challenges!:

Contact:
Twitter:
The Coding Train website:

Links discussed in this video:
SFPC’s Learning to Teach 2018:
Michael Nielsen’s Book on Neural Networks:
To support me in my half marathon for charity:
The Coding Train on Amazon:
Deeplearn.js:
Sigmoid function on Wikipedia:

Videos mentioned:
My Neural Networks series:
3Blue1Brown Neural Networks playlist:
3Blue1Brown’s Linear Algebra playlist:
Gradient Descent by 3Blue1Brown:
Gradient Descent by Siraj Raval:
My Video on Gradient Descent:

Source Code for all the Video Lessons:

p5.js:
Processing:

For an Introduction to Programming:
For More Live Streams:
For More Coding Challenges: