Built on top of Scikit-learn and PyMC3
Built with the broader community
Pymc-learn is open source and freely available. It is built on top of Scikit-learn and PyMC3.
Scikit-learn is a popular Python library for machine learning, providing a simple API that makes it easy for users to train, score, save, and load models in production.
PyMC3 is a Python package for Bayesian statistical modeling and Probabilistic Machine Learning focusing on advanced Markov chain Monte Carlo (MCMC) and variational inference (VI).
Familiar for Scikit-Learn users
Easy to get started
You don't have to completely rewrite your scikit-learn ML code.
Pymc-learn provides models built on top of the scikit-learn API, so you can easily and quickly instantiate, train, score, save, and load models just like in scikit-learn. Learn More »
- Scikit-learn syntax
    # Linear regression in scikit-learn
    from sklearn.linear_model import LinearRegression
    lr = LinearRegression()
    lr.fit(X, y)
- Pymc-learn syntax
    # Linear regression in pymc-learn
    from pmlearn.linear_model import LinearRegression
    lr = LinearRegression()
    lr.fit(X, y)
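Because pymc-learn mirrors the scikit-learn interface, the familiar instantiate/train/score/save/load workflow carries over unchanged. As a minimal sketch of that workflow, here it is with scikit-learn itself (the toy data and pickle-based persistence are illustrative choices, not requirements):

```python
import pickle

import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y = 2x + 1 plus a little noise (made up for the example)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.1, size=100)

lr = LinearRegression()
lr.fit(X, y)            # train
r2 = lr.score(X, y)     # score (coefficient of determination, R^2)

blob = pickle.dumps(lr)          # save the fitted model
restored = pickle.loads(blob)    # load it back
```

Swapping the import for `pmlearn.linear_model` is intended to leave the rest of the code untouched.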
Models should know when they don't know
Quantify the degree of uncertainty in model parameters and predictions.
Probability is the fundamental mathematical principle for quantifying uncertainty. Pymc-learn provides probabilistic models that represent and process uncertain values using Bayesian inference. Why Uncertainty Quantification is Important »
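A Bayesian model returns a full posterior distribution rather than a single point estimate. As a minimal, library-free illustration of what that buys you, the sketch below summarizes a set of simulated posterior draws with a point estimate and a central 95% interval (the draw-generating parameters are invented for the example):

```python
import random
import statistics

random.seed(42)

# Stand-in for posterior draws of a model coefficient; in pymc-learn these
# would come from MCMC or variational inference. Here they are simulated.
draws = [random.gauss(2.0, 0.5) for _ in range(10_000)]

point_estimate = statistics.mean(draws)

# statistics.quantiles with n=40 yields cut points at 2.5%, 5%, ..., 97.5%,
# so the first and last cut points bound a central 95% interval.
cuts = statistics.quantiles(draws, n=40)
lower, upper = cuts[0], cuts[-1]
```

The width of the interval is exactly the "degree of uncertainty" the surrounding text refers to: a model that has seen little relevant data reports a wide interval instead of a confidently wrong number.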
Scale up to Big Data
Using Variational Inference
Recent research has led to the development of variational inference algorithms that are fast and almost as flexible as MCMC.
Instead of drawing samples from the posterior, these algorithms fit a distribution (e.g., a normal) to the posterior, turning a sampling problem into an optimization problem. ADVI – Automatic Differentiation Variational Inference – is implemented in PyMC3. Learn About Variational Inference »
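To make the "sampling becomes optimization" idea concrete, here is a toy sketch (not PyMC3's ADVI, just the underlying principle): a Gaussian q = N(m, s) is fitted by gradient descent to minimize its KL divergence to a known Gaussian "posterior", recovering the target's mean and standard deviation without drawing a single sample. The target parameters are invented for the example:

```python
import math

# Toy "posterior": N(mu_p, sigma_p). In real VI the posterior is intractable;
# a Gaussian target keeps the KL divergence and its gradients in closed form.
mu_p, sigma_p = 3.0, 2.0

# Variational parameters of q = N(m, exp(log_s)); the log scale keeps s > 0.
m, log_s = 0.0, 0.0
lr = 0.1
for _ in range(1000):
    s = math.exp(log_s)
    # Gradients of KL(q || p) for two Gaussians, from the closed form
    # KL = log(sigma_p / s) + (s**2 + (m - mu_p)**2) / (2 * sigma_p**2) - 0.5
    grad_m = (m - mu_p) / sigma_p**2
    grad_log_s = s**2 / sigma_p**2 - 1.0
    m -= lr * grad_m
    log_s -= lr * grad_log_s

# After optimization, m is close to 3.0 and exp(log_s) is close to 2.0:
# the optimizer recovered the posterior's parameters.
```

Real VI replaces the closed-form KL with a Monte Carlo estimate of the ELBO and automatic differentiation, which is what ADVI automates in PyMC3, but the structure of the computation is the same: iterate an optimizer over the parameters of an approximating distribution.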