In this live stream, I continue my Machine Learning series, covering Linear Regression with Gradient Descent. I also discuss some concepts from calculus: the power rule, the chain rule, and partial derivatives.
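Below is a minimal sketch of the idea in plain JavaScript (the live-coded version uses p5.js for visualization). The example data, variable names, and learning rate are my own illustrative choices, not code from the stream:

```javascript
// Minimal linear regression with gradient descent (illustrative data, not from the stream).
const data = [
  { x: 0.1, y: 0.2 },
  { x: 0.4, y: 0.5 },
  { x: 0.7, y: 0.6 },
  { x: 0.9, y: 0.9 }
];

let m = 0;                 // slope of the line y = m*x + b
let b = 0;                 // y-intercept
const learningRate = 0.1;  // step size (assumed value for this sketch)

// Loss: L = (1/N) * sum((m*x + b - y)^2)
// The power rule and chain rule give the partial derivatives:
//   dL/dm = (2/N) * sum((m*x + b - y) * x)
//   dL/db = (2/N) * sum((m*x + b - y))
function gradientDescentStep() {
  let dm = 0;
  let db = 0;
  const n = data.length;
  for (const point of data) {
    const guess = m * point.x + b;
    const error = guess - point.y;
    dm += (2 / n) * error * point.x;
    db += (2 / n) * error;
  }
  // Move the parameters a small step against the gradient.
  m -= learningRate * dm;
  b -= learningRate * db;
}

// Run many steps, then print the fitted line.
for (let i = 0; i < 1000; i++) {
  gradientDescentStep();
}
console.log(`y = ${m.toFixed(3)}x + ${b.toFixed(3)}`);
```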
This video is part of the third session of my ITP “Intelligence and Learning” course:
Edited videos:
Linear Regression with Gradient Descent:
Calculus – Power Rule:
Calculus – Chain Rule:
Calculus – Partial Derivative:
22:25 – Linear Regression with Gradient Descent
1:05:30 – Intro to Calculus – Power Rule
1:30:24 – Chain Rule
1:47:05 – Partial Derivative
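For reference, here is a compact summary of how the three calculus ideas in the timestamps above combine to give the gradient used in the linear regression segment. The notation (m for slope, b for intercept, N data points) is my own shorthand, not a transcript of the board work:

```latex
% Power rule: derivative of x^n
\frac{d}{dx} x^n = n\,x^{n-1}

% Chain rule: derivative of a composition
\frac{d}{dx} f\big(g(x)\big) = f'\big(g(x)\big)\,g'(x)

% Applying both to the squared-error loss of the line y = mx + b,
% differentiating with respect to m and b in turn (partial derivatives):
L(m, b) = \frac{1}{N} \sum_{i=1}^{N} \big(m x_i + b - y_i\big)^2

\frac{\partial L}{\partial m} = \frac{2}{N} \sum_{i=1}^{N} \big(m x_i + b - y_i\big)\,x_i
\qquad
\frac{\partial L}{\partial b} = \frac{2}{N} \sum_{i=1}^{N} \big(m x_i + b - y_i\big)
```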
To support this channel:
Patreon:
Coding Train Merchandise:
Send me your questions and coding challenges!:
Contact:
Twitter:
The Coding Train website:
Links discussed in this video:
Session 3 of Intelligence and Learning:
Nature of Code:
kwichmann’s Linear Regression Diagnostics:
Craig Reynolds’ Steering Behaviors:
Siraj Raval’s YouTube channel:
3Blue1Brown’s YouTube channel:
This.dot song:
Kitten song:
Books:
Make Your Own Neural Network:
Calculus Made Easy:
Source Code for all the Video Lessons:
p5.js:
Processing:
For more Intelligence and Learning videos:
For an Introduction to Programming:
For my Nature of Code videos:
For More Live Streams:
For More Coding Challenges:
Help us caption & translate this video!