Machine Learning

Jul 10, 2025

1. Introduction

  • Deduction, Abduction & Induction
  • Induction in Machine Learning
  • Probability Formulae
  • Types of Machine Learning
  • Model Selection vs Parameter Optimization

2. Supervised Learning - Regression

  • Regression
  • Squared Error Function
  • Empirical Error (Parameter Optimization)
  • Regularisation
  • Perceptron in Mathematical Notation

3. Probabilistic Modelling

  • Aleatoric vs Epistemic Uncertainty
  • Using Likelihood to Model Probability
  • Bayesian Inference
  • MAP estimation for a Gaussian distributed data model using regularization
  • MAP = Regularized Least Squares
  • Predictive vs Bayesian Predictive Distribution

4. Incremental Bayesian Learning

  • Incremental Bayesian Learning

5. Error Minimization

  • Generalized Learning Rule
  • Finding Hyperparameters
  • Cross Validation
  • Bias-Variance decomposition
  • Bias vs Variance
  • Double Descent

6. Models

  • General Model Classes
  • Linear Learning Models
  • Finding Optimum Parameters for Linear Learning Model
  • Gradient Descent
  • Radial Basis Function
  • Weighted Linear Regression
  • Unified Model

7. Classification

  • Decision Boundary and Hyperplane
  • Three approaches to classification
    • Discriminant Function (Fisher LDF as Example)
    • Direct Posterior Modeling

8. Concept Learning

  • Validity of Fundamental Equivalence between Induction and Deduction

