brianspiering/gaussian_mixture_models

A Gentle Intro to Gaussian Mixture Models (GMM) & the Expectation–Maximization (EM) Algorithm

You know and love the Gaussian / Normal / bell curve. It is very common, appearing in almost every domain; in particular, it is the workhorse of statistics.

If one Gaussian distribution is awesome, what would be more awesome? TWO GAUSSIAN DISTRIBUTIONS!

That is what Gaussian Mixture Models (GMMs) are: take your old friend, the single Gaussian distribution, and mix in another Gaussian distribution.
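To make the "mixing" concrete, here is a minimal NumPy sketch of sampling from a two-Gaussian mixture. The weights, means, and standard deviations are made-up illustrative values, not anything from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the talk):
# component weights must sum to 1.
weights = np.array([0.3, 0.7])
means = np.array([-2.0, 3.0])
stds = np.array([0.5, 1.0])

n = 10_000
# Sampling a mixture is a two-step process: first pick which
# component each point comes from, then draw from that Gaussian.
components = rng.choice(2, size=n, p=weights)
samples = rng.normal(means[components], stds[components])
```

A histogram of `samples` would show two bumps, one per component, which is exactly the shape a single Gaussian cannot capture.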

How does that dark magic happen? The Expectation–Maximization (EM) Algorithm.

This technical talk will start with a quick review of the Gaussian, then move on to GMMs, and discuss how to estimate a GMM with the EM algorithm. An introductory level of statistics is assumed. If you need a refresher, check out https://galvanizeopensource.github.io/stats-shortcourse/ or https://www.khanacademy.org/math/statistics-probability/modeling-distributions-of-data


Suggested Preparation Materials:

Challenge Preparation Materials:
