Using GMM for Regression

Gaussian Mixture Regression (GMR) uses a Gaussian Mixture Model (GMM) to perform regression. The process is:

  1. Model the Joint Space: Combine the input and output data into a single vector, $\mathbf{z} = [\mathbf{x}^\top, \mathbf{y}^\top]^\top$. Train a GMM on this joint data space to learn the distribution $p(\mathbf{x}, \mathbf{y})$.
  2. Conditioning: For a new input query $\mathbf{x}^*$, calculate the conditional distribution $p(\mathbf{y} \mid \mathbf{x}^*)$ from the learned joint GMM. Because every component of the joint GMM is Gaussian, each component's conditional is again Gaussian, so the conditional distribution is itself a Gaussian mixture.
  3. Predict: The regression output for $\mathbf{x}^*$ is the expected value (mean) of this conditional distribution. The analytical solution for the predicted mean is $\hat{\mathbf{y}}(\mathbf{x}^*) = \sum_{k=1}^{K} h_k(\mathbf{x}^*)\left[\boldsymbol{\mu}_{k,y} + \boldsymbol{\Sigma}_{k,yx}\boldsymbol{\Sigma}_{k,xx}^{-1}(\mathbf{x}^* - \boldsymbol{\mu}_{k,x})\right]$, where $h_k(\mathbf{x}^*) \propto \pi_k\,\mathcal{N}(\mathbf{x}^* \mid \boldsymbol{\mu}_{k,x}, \boldsymbol{\Sigma}_{k,xx})$ are normalized weights corresponding to the responsibilities (see the sketch after this list).

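A minimal GMR sketch of these three steps, assuming scikit-learn's `GaussianMixture` for step 1 and a hand-rolled conditioning step; the toy data, component count, and the helper name `gmr_predict` are illustrative choices, not from the original text:

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

# Toy 1-D regression data: y = sin(x) + noise (illustrative assumption)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, size=(200, 1))
y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)

# 1. Model the joint space: stack inputs and outputs, fit a GMM on p(x, y)
z = np.hstack([x, y])
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0).fit(z)
d_x = x.shape[1]  # number of input dimensions

def gmr_predict(x_query):
    """Conditional mean E[y | x_query] under the learned joint GMM."""
    cond_means, weights = [], []
    for k in range(gmm.n_components):
        mu, cov = gmm.means_[k], gmm.covariances_[k]
        mu_x, mu_y = mu[:d_x], mu[d_x:]
        cov_xx, cov_yx = cov[:d_x, :d_x], cov[d_x:, :d_x]
        # 2. Conditioning: Gaussian conditional mean of component k
        cond_means.append(mu_y + cov_yx @ np.linalg.solve(cov_xx, x_query - mu_x))
        # Responsibility-style weight h_k(x*) proportional to
        # pi_k * N(x* | mu_{k,x}, Sigma_{k,xx})
        weights.append(gmm.weights_[k]
                       * multivariate_normal.pdf(x_query, mean=mu_x, cov=cov_xx))
    weights = np.asarray(weights)
    weights /= weights.sum()
    # 3. Predict: responsibility-weighted sum of per-component conditional means
    return np.sum(weights[:, None] * np.asarray(cond_means), axis=0)

print(gmr_predict(np.array([1.5])))  # roughly sin(1.5) ~= 1.0
```
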
GMR has the same mathematical form as Locally Weighted Regression (LWR) and can be viewed as a special case of a broader unified model for regression, expressed as a weighted sum of linear sub-models: $f(\mathbf{x}) = \sum_{e=1}^{E} w_e(\mathbf{x})\,(\mathbf{A}_e\mathbf{x} + \mathbf{b}_e)$. GMR fits into this framework, demonstrating its connection to a wide family of regression algorithms, including those based on mixtures of linear models and weighted sums of basis functions.
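
Read this way (an identification spelled out here for illustration rather than taken verbatim from the original text), each mixture component contributes one local linear model, and the responsibilities act as the input-dependent weights:

$$
w_e(\mathbf{x}) = h_e(\mathbf{x}), \qquad \mathbf{A}_e = \boldsymbol{\Sigma}_{e,yx}\boldsymbol{\Sigma}_{e,xx}^{-1}, \qquad \mathbf{b}_e = \boldsymbol{\mu}_{e,y} - \boldsymbol{\Sigma}_{e,yx}\boldsymbol{\Sigma}_{e,xx}^{-1}\boldsymbol{\mu}_{e,x}.
$$

Setting $\mathbf{A}_e = \mathbf{0}$ leaves $f(\mathbf{x}) = \sum_e w_e(\mathbf{x})\,\mathbf{b}_e$, a weighted sum of basis functions, which is how the same framework also covers basis-function-based regressors.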