1. Introduction
- Deduction, Abduction & Induction
- Induction in Machine Learning
- Probability Formulae
- Types of Machine Learning
- Model Selection vs Parameter Optimization
2. Supervised Learning - Regression
- Regression
- Squared Error Function
- Empirical Error (Parameter Optimization)
- Regularization
- Perceptron in Mathematical Notation
3. Probabilistic Modelling
- Aleatoric vs Epistemic Uncertainty
- Using Likelihood to Model Probability
- Bayesian Inference
- Maximum a Posteriori Estimation for a Gaussian-Distributed Data Model with Regularization
- MAP = Regularized Least Squares
- Predictive Distribution
4. Incremental Bayesian Learning
5. Error Minimization
- Generalized Learning Rule
- Finding Hyperparameters
- Cross Validation
- Bias-Variance Decomposition
- Bias vs Variance
- Double Descent
6. Models
7. Definition of Decision Boundary
- Decision Boundary and Hyperplane
- Three Approaches to Classification