Ashu's Working Notes

Machine Learning

Jun 10, 2025 · 1 min read

1. Introduction

  • Deduction, Abduction & Induction
  • Induction in Machine Learning
  • Probability Formulae
  • Types of Machine Learning
  • Model Selection vs Parameter Optimization

2. Supervised Learning - Regression

  • Regression
  • Squared Error Function
  • Empirical Error (Parameter Optimization)
  • Regularisation
  • Perceptron in Mathematical Notation

3. Probabilistic Modelling

  • Aleatoric vs Epistemic Uncertainty
  • Using Likelihood to Model Probability
  • Bayesian Inference
  • Maximum a posteriori estimation for a Gaussian data model with regularisation
  • MAP = Regularized Least Squares
  • Predictive Distribution
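
The bullet "MAP = Regularized Least Squares" can be written out explicitly. Assuming Gaussian observation noise with variance $\sigma^2$ and a zero-mean Gaussian prior on the weights with variance $\sigma_w^2$ (this notation is my own sketch of the standard result, not taken from the notes themselves), maximising the posterior is equivalent to minimising a regularised squared error:

$$
\mathbf{w}_{\text{MAP}} = \arg\max_{\mathbf{w}} \; p(\mathbf{w}\mid \mathcal{D})
= \arg\min_{\mathbf{w}} \; \sum_{n=1}^{N}\bigl(y_n - \mathbf{w}^\top\boldsymbol{\phi}(x_n)\bigr)^2 + \lambda \lVert \mathbf{w}\rVert^2,
\qquad \lambda = \frac{\sigma^2}{\sigma_w^2}
$$

The regularisation strength $\lambda$ is thus fixed by the ratio of noise variance to prior variance rather than being a free tuning knob.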

4. Incremental Bayesian Learning

  • Incremental Bayesian Learning

5. Error Minimization

  • Generalized Learning Rule
  • Finding Hyperparameters
  • Cross Validation
  • Bias-Variance decomposition
  • Bias vs Variance
  • Double Descent
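
As a concrete companion to the cross-validation bullet, here is a minimal k-fold sketch (the data, the least-squares model, and all parameter values below are my own illustrative choices, not from the notes):

```python
import numpy as np

def k_fold_cv(X, y, fit, predict, k=5, seed=0):
    """Estimate generalisation error by k-fold cross-validation:
    hold out each fold in turn, train on the rest, average test MSE."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])
        errors.append(np.mean((predict(model, X[test]) - y[test]) ** 2))
    return np.mean(errors)

# Plug in ordinary least squares as the model to validate
fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
predict = lambda w, X: X @ w

# Toy data: y = 1 + 2x plus Gaussian noise
X = np.c_[np.ones(100), np.linspace(-1, 1, 100)]
y = X @ np.array([1.0, 2.0]) + 0.1 * np.random.default_rng(1).normal(size=100)
cv_error = k_fold_cv(X, y, fit, predict)
```

With noise standard deviation 0.1, the averaged test MSE should land near the noise variance of 0.01, which is the irreducible error the model cannot remove.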

6. Models

  • Linear Learning Models
  • Finding Optimum Parameters for Linear Learning Model
  • Gradient Descent
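
The last bullet can be illustrated with a minimal gradient-descent fit of a linear model (the toy data, learning rate, and step count are my own assumptions for the sketch):

```python
import numpy as np

def gradient_descent(X, y, lr=0.5, steps=500):
    """Minimise the mean squared error ||Xw - y||^2 / n by gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the MSE w.r.t. w
        w -= lr * grad
    return w

# Noise-free toy data: y = 1 + 2x, with a bias column in X
X = np.c_[np.ones(50), np.linspace(0, 1, 50)]
y = X @ np.array([1.0, 2.0])
w = gradient_descent(X, y)  # converges towards [1.0, 2.0]
```

For a linear model the squared-error surface is convex, so with a small enough learning rate the iterates approach the unique least-squares solution.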

7. Definition of Decision Boundary

  • Decision Boundary and Hyperplane
  • Three approaches to classification
    • Discriminant Function (Fisher LDF as Example)
    • Direct Posterior Modeling

Concept Learning

  • Validity of Fundamental Equivalence between Induction and Deduction


Created with Quartz v4.5.1 © 2025