- 1. Specialization & Course Introduction
- 2. System of linear equations: 2 variables
- Quiz
- 3. System of linear equations: 3 variables
- Quiz
- Notes
- ungraded lab 1
- ungraded lab 2
- 1. Solving system of linear equations: Elimination
- Quiz
- 2. Solving systems of linear equations: Row echelon form and rank
- Lab
- Notes
- Quiz
- Lab
- Programming Assignment: System of Linear Equations
- 1. Vector algebra
- Lab
- Quiz
- 2. Linear transformations
- Quiz
- Assignment
- Lab
- Linear Transformations
- Matrix Multiplication
- Notes
- 1. Determinants In-depth
- Quiz
- 2. Eigenvalues and Eigenvectors
- Quiz
- Assignment
- Notes
- 1. Lesson 1 - Derivatives
- 1.Course Introduction by Andrew Ng.mp4
- 2.Course Introduction by Luis Serrano.mp4
- 3.Machine Learning Motivation.mp4
- 4.Motivation to Derivatives - Part I.mp4
- 5.Derivatives and Tangents.mp4
- 6.Slopes, maxima and minima.mp4
- 7.Derivatives and their notation.mp4
- 8.Some common derivatives - Lines.mp4
- 9.Some common Derivatives - Quadratics.mp4
- 10.Some common derivatives - Higher degree polynomials.mp4
- 11.Some common derivatives - Other power functions.mp4
- 12.The inverse function and its derivative.mp4
- 13.Derivative of trigonometric functions.mp4
- 14.Meaning of the Exponential (e).mp4
- 15.The derivative of e^x.mp4
- 16.The derivative of log(x).mp4
- 17.Existence of the derivative.mp4
- 18.Properties of the derivative: Multiplication by scalars.mp4
- 19.Properties of the derivative: The sum rule.mp4
- 20.Properties of the derivative: The product rule.mp4
- 21.Properties of the derivative: The chain rule.mp4
- Quiz
- 2. Lesson 2 - Optimization
- 1.Introduction to Optimization.mp4
- 2.Optimization of squared loss - The one powerline problem.mp4
- 3.Optimization of squared loss - The two powerline problem.mp4
- 4.Optimization of squared loss - The three powerline problem.mp4
- 5.Optimization of log-loss - Part 1.mp4
- 6.Optimization of log-loss - Part 2.mp4
- 7.Week 1 - Conclusion.mp4
- Quiz
- Assignment
- Lab
- Notes
- 1. Lesson 1 - Gradients
- Quiz
- 2. Lesson 2 - Gradient Descent
- 1.Optimization using Gradient Descent in one variable - Part 1.mp4
- 2.Optimization using Gradient Descent in one variable - Part 2.mp4
- 3.Optimization using Gradient Descent in one variable - Part 3.mp4
- 4.Optimization using Gradient Descent in two variables - Part 1.mp4
- 5.Optimization using Gradient Descent in two variables - Part 2.mp4
- 6.Optimization using Gradient Descent - Least squares.mp4
- 7.Optimization using Gradient Descent - Least squares with multiple observations.mp4
- 8.Week 2 - Conclusion.mp4
- Assignment
- Lab
- Optimization Using Gradient Descent in One Variable
- Optimization Using Gradient Descent in Two Variables
- Quiz
- Notes
- 1. Lesson 1 - Optimization in Neural Networks
- 1.Regression with a perceptron.mp4
- 2.Regression with a perceptron - Loss function.mp4
- 3.Regression with a perceptron - Gradient Descent.mp4
- 4.Classification with Perceptron.mp4
- 5.Classification with Perceptron - The sigmoid function.mp4
- 6.Classification with Perceptron - Gradient Descent.mp4
- 7.Classification with Perceptron - Calculating the derivatives.mp4
- 8.Classification with a Neural Network.mp4
- 9.Classification with a Neural Network - Minimizing log-loss.mp4
- 10.Gradient Descent and Backpropagation.mp4
- Lab
- Classification with Perceptron
- Regression with Perceptron
- Quiz
- 2. Lesson 2 - Newton's Method
- Assignment
- Lab
- Quiz
- Notes