
pymc3 provides this for Python in a way that is very concise and modular (certainly much more concise than tensorflow-probability) -- and it is an open question whether TensorFlow will replace Theano as the backend execution engine in future versions.

In particular, pymc3's workflow is incredibly impressive: it automatically transforms bounded or otherwise constrained random variables into unconstrained continuous ones, runs an auto-tuned variational Bayes (ADVI) initialization to infer good starting values and scaling for NUTS, and then hands the problem to an optimized NUTS implementation for the MCMC sampling.
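
As a rough sketch of how that surfaces in the API (the model and data here are made up; init='advi+adapt_diag' is one of pymc3's initialization options for pm.sample):

    import numpy as np
    import pymc3 as pm

    data = np.random.exponential(scale=2.0, size=200)  # toy data for illustration

    with pm.Model():
        # HalfNormal is constrained to be positive; pymc3 samples an
        # unconstrained, log-transformed version of it under the hood.
        rate = pm.HalfNormal('rate', sd=5.0)
        pm.Exponential('obs', lam=rate, observed=data)

        # Auto-tuned ADVI supplies starting values and mass-matrix scaling
        # to NUTS, which then does the actual MCMC sampling.
        trace = pm.sample(2000, tune=1000, init='advi+adapt_diag')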

For most problems you use a simple pymc3 context manager, and from there on it acts a bit like a mutually recursive let block in some functional languages: you define random and deterministic variables that depend on one another and are specified by their distributions, with your observational data attached to indicate which values enter the likelihood portion of the model.
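
For example, a simple linear regression might look like this (variable names and data are illustrative):

    import numpy as np
    import pymc3 as pm

    x = np.linspace(0, 1, 100)                          # toy predictor
    y = 2.5 * x + 1.0 + np.random.normal(0, 0.3, 100)   # toy response

    with pm.Model() as model:
        intercept = pm.Normal('intercept', mu=0.0, sd=10.0)
        slope = pm.Normal('slope', mu=0.0, sd=10.0)
        noise = pm.HalfNormal('noise', sd=1.0)

        # A deterministic variable defined in terms of the random ones above
        mu = pm.Deterministic('mu', intercept + slope * x)

        # The observed data ties the model to the likelihood
        pm.Normal('y', mu=mu, sd=noise, observed=y)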

After the context manager exits, you can just start drawing samples from the posterior distribution right away.
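
Continuing the sketch above, you can either re-enter the context or pass the model explicitly (pm.sample accepts a model keyword):

    # Either re-enter the context...
    with model:
        trace = pm.sample(2000, tune=1000)

    # ...or pass the model explicitly
    trace = pm.sample(2000, tune=1000, model=model)

    print(pm.summary(trace))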

I've used it with great success for several large-scale hierarchical regression problems.
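
For what it's worth, the hierarchical case is just a small extension of the same pattern -- here is a sketch of partial pooling with group-level intercepts (names and data shapes are made up):

    import numpy as np
    import pymc3 as pm

    n_groups = 8
    group_idx = np.random.randint(0, n_groups, size=500)                 # toy group labels
    x = np.random.normal(size=500)
    y = 1.0 + 0.5 * group_idx + 2.0 * x + np.random.normal(0, 0.5, 500)  # toy response

    with pm.Model() as hier_model:
        # Hyperpriors shared across groups
        mu_a = pm.Normal('mu_a', mu=0.0, sd=10.0)
        sigma_a = pm.HalfNormal('sigma_a', sd=5.0)

        # One intercept per group, partially pooled toward mu_a
        a = pm.Normal('a', mu=mu_a, sd=sigma_a, shape=n_groups)
        b = pm.Normal('b', mu=0.0, sd=10.0)
        noise = pm.HalfNormal('noise', sd=1.0)

        pm.Normal('y', mu=a[group_idx] + b * x, sd=noise, observed=y)
        trace = pm.sample(2000, tune=2000, target_accept=0.9)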



The plan of record is already to build pymc4 on top of TensorFlow: https://medium.com/@pymc_devs/theano-tensorflow-and-the-futu...



