What is Machine Learning?

Fundamental Concepts


Training: Model, Loss, and Optimization

Overfitting and Underfitting

Validation, Testing, and Deployment

Supervised, Semi-Supervised, and Unsupervised Learning

Types of Tasks

Optional: Inductive Bias (Maybe Next Year)

PyTorch Introduction

Building Intuitions about Neural Networks


From Perceptron to Multi-Layer Perceptron

Neural Network Is a Layered Approximator

Neural Network Can Fold Paper

Neural Network Consists of Logic Gates

Neural Network as Template Matching

Why Depth Instead of Width?

Code - MLP Implementation
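
Since this item is an MLP implementation, here is a minimal sketch of what such a model could look like in PyTorch (the framework introduced above); the layer sizes and depth are illustrative assumptions, not the course's actual code.

```python
import torch.nn as nn

class MLP(nn.Module):
    """A small multi-layer perceptron: Linear -> ReLU blocks, then a Linear head."""
    def __init__(self, in_dim=1, hidden_dim=32, out_dim=1, n_hidden=2):
        super().__init__()
        layers, dim = [], in_dim
        for _ in range(n_hidden):
            layers += [nn.Linear(dim, hidden_dim), nn.ReLU()]
            dim = hidden_dim
        layers.append(nn.Linear(dim, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)
```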

Code - Assign Parameters to Neural Network
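
A hedged sketch of setting parameters by hand, assuming PyTorch: writing into `weight` and `bias` under `torch.no_grad()` is one standard way to do it. The values are illustrative.

```python
import torch
import torch.nn as nn

layer = nn.Linear(2, 1)
with torch.no_grad():
    layer.weight.copy_(torch.tensor([[1.0, -1.0]]))  # shape (out_features, in_features)
    layer.bias.fill_(0.5)

# Check: y = 1*2.0 + (-1)*1.0 + 0.5 = 1.5
print(layer(torch.tensor([[2.0, 1.0]])))
```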

Code - Fit a Function using MLP
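
A minimal sketch of this exercise, assuming PyTorch; the target `sin(x)`, the architecture, and the hyperparameters are illustrative choices.

```python
import torch
import torch.nn as nn

# Illustrative target: fit y = sin(x) on [-3, 3]
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(x)

model = nn.Sequential(nn.Linear(1, 32), nn.ReLU(),
                      nn.Linear(32, 32), nn.ReLU(),
                      nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

print(loss.item())  # small if the fit succeeded
```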

Optimization - Numerical Optimization


Optimization in General

Gradient Calculation and Gradient Descent for General Functions

Gradient-based Algorithm (Old)

Read - Grid Search
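
A small sketch of grid search as a derivative-free baseline: evaluate the objective on an even grid and keep the best point. The objective and grid bounds are illustrative.

```python
import numpy as np

def grid_search(f, lo=-5.0, hi=5.0, n=1001):
    """Evaluate f on an even grid and return the minimizing grid point."""
    xs = np.linspace(lo, hi, n)
    return xs[np.argmin([f(x) for x in xs])]

print(grid_search(lambda x: (x - 2) ** 2))  # 2.0 (the grid happens to contain it)
```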

Read - Newton's Method
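
A one-dimensional sketch of Newton's method for minimization, x <- x - f'(x)/f''(x); on a quadratic it reaches the minimizer in a single step, as the illustrative example shows.

```python
def newton_minimize(df, d2f, x0, steps=20):
    """Newton's method for 1-D minimization: x <- x - f'(x) / f''(x)."""
    x = x0
    for _ in range(steps):
        x = x - df(x) / d2f(x)
    return x

# Minimize f(x) = (x - 2)**2 + 1, so f'(x) = 2(x - 2) and f''(x) = 2.
print(newton_minimize(lambda x: 2 * (x - 2), lambda x: 2.0, x0=10.0))  # 2.0
```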

Read - Finite Difference Gradient Approximation (needs refinement)
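
A sketch of the central-difference approximation f'(x) ≈ (f(x + eps) - f(x - eps)) / (2 eps); the step size is an illustrative choice.

```python
def finite_diff_grad(f, x, eps=1e-5):
    """Central-difference approximation of f'(x)."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

print(finite_diff_grad(lambda x: x ** 2, 3.0))  # ~6.0
```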

Read - Gradient Descent
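
A minimal sketch of gradient descent on a one-dimensional function; the learning rate and step count are illustrative.

```python
def gradient_descent(df, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient: x <- x - lr * f'(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * df(x)
    return x

# Minimize f(x) = (x - 2)**2, so f'(x) = 2(x - 2).
print(gradient_descent(lambda x: 2 * (x - 2), x0=10.0))  # ~2.0
```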

Optimization - Neural Network Optimization


Loss Function: The Bridge between Neural Networks and Optimization

Example: Gradient Descent for Logistic Regression
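
A sketch of the example this item names, assuming NumPy: for the mean binary cross-entropy loss, the weight gradient is X^T (sigmoid(Xw) - y) / n, which makes the update loop short. The dataset below is illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression_gd(X, y, lr=0.1, steps=1000):
    """Gradient descent on the mean cross-entropy loss of logistic regression."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)  # dL/dw
        w -= lr * grad
    return w

X = np.array([[0.0, 1.0], [1.0, 0.0], [2.0, 2.0], [3.0, 1.0]])
y = np.array([0, 0, 1, 1])
print(logistic_regression_gd(X, y))
```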

Convexity and Neural Network Loss Surface

Batch and Stochastic Gradient Descent
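
A sketch contrasting full-batch descent with mini-batch SGD: shuffle the data each epoch and step on one batch's gradient at a time. `grad_fn`, the data layout (a NumPy array of examples), and the hyperparameters are illustrative placeholders.

```python
import numpy as np

def sgd(grad_fn, w, data, lr=0.01, batch_size=32, epochs=5, seed=0):
    """Mini-batch SGD: shuffle each epoch, step on each batch's gradient."""
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        idx = rng.permutation(len(data))
        for start in range(0, len(data), batch_size):
            batch = data[idx[start:start + batch_size]]
            w = w - lr * grad_fn(w, batch)
    return w
```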

Hand - Optimize a Neural Network

Read - Inversion of Multiplication

Read - Jacobian Matrix and Weight Gradient Matrix (Next Year)

Optimization - Backpropagation


Computational Graph and Chain Rule

Gradient Calculation using Backpropagation
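
A by-hand sketch of backpropagation on the tiny graph y = (w*x + b)^2: run the forward pass, then multiply local derivatives along the graph from the output back. All numbers are illustrative.

```python
# Forward pass through y = (w*x + b)**2
w, x, b = 3.0, 2.0, 1.0
z = w * x + b          # z = 7
y = z ** 2             # y = 49

# Backward pass (chain rule), starting from dy/dy = 1
dy_dz = 2 * z          # 14
dy_dw = dy_dz * x      # dz/dw = x -> 28
dy_db = dy_dz * 1.0    # dz/db = 1 -> 14
print(dy_dw, dy_db)    # 28.0 14.0
```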

Multipath Backpropagation
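
When a variable reaches the output along several paths, backpropagation sums the contribution of each path. A minimal PyTorch check with illustrative values:

```python
import torch

x = torch.tensor(3.0, requires_grad=True)
y = x * x          # x feeds the product along two paths
y.backward()
print(x.grad)      # tensor(6.) -- the per-path contributions, 3 + 3, are summed
```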

Subgradient for Non-Differentiable Function
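
ReLU is not differentiable at 0, so frameworks pick a valid subgradient there; PyTorch uses 0 at the kink. A quick probe, with illustrative inputs:

```python
import torch

x = torch.tensor([-1.0, 0.0, 2.0], requires_grad=True)
torch.relu(x).sum().backward()
print(x.grad)  # tensor([0., 0., 1.]) -- subgradient 0 is chosen at x = 0
```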

Retain Graph and Checkpoint Scheme (Next Year)

(Deprecated) Modularized Functions

Read - Symbolic Library

Read - Reverse Automatic Differentiation: A Tutorial with NumPy Implementation
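
As a taste of what the linked tutorial builds (there with NumPy arrays), here is a pure-Python scalar sketch of reverse-mode autodiff; the `Node` class and its two operations are illustrative, not the tutorial's code.

```python
class Node:
    """A scalar in a computational graph, with reverse-mode autodiff."""
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __add__(self, other):
        return Node(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        # local gradients: d(uv)/du = v and d(uv)/dv = u
        return Node(self.value * other.value,
                    ((self, other.value), (other, self.value)))

    def backward(self, upstream=1.0):
        self.grad += upstream  # accumulate: multiple paths sum up
        for parent, local_grad in self.parents:
            parent.backward(upstream * local_grad)

x = Node(3.0)
y = x * x + x      # dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.grad)      # 7.0
```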

Optimization - Gradient Descent Improvements