Bayesian Deep Learning - Resources
This fall in my graduate program I am taking STAT578: Advanced Bayesian Modelling. Coming from a deep learning background, it was only natural to question the usefulness of the new material: what is up with all these priors and posteriors I have never used in any of my deep models? Is there something like Bayesian Deep Learning where the best of both worlds meet? In other words, rather than settling for point estimates of the weights and biases of a deep neural network, can we place a prior on them and get a posterior, and a posterior predictive interval, instead?
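To make that question concrete, here is a minimal NumPy sketch for the simplest possible case, a one-weight linear model with a Gaussian prior on its parameters, where the posterior over the weights and a 95% posterior predictive interval come out in closed form. Everything numeric here (the toy data, the prior precision alpha, the noise precision beta) is my own illustrative choice, not something from the resources below; a real Bayesian deep net replaces the closed-form update with approximate inference (variational methods, MC dropout, MCMC).

```python
import numpy as np

# Toy data for the simplest possible "network": a linear model y = w*x + b.
rng = np.random.default_rng(0)
N = 50
X = rng.uniform(-1, 1, size=N)
y = 2.0 * X + 0.3 * rng.standard_normal(N)

Phi = np.column_stack([np.ones(N), X])   # design matrix: bias column + inputs
alpha = 1.0                              # prior precision: b, w ~ N(0, 1/alpha)
beta = 1.0 / 0.3**2                      # noise precision (assumed known here)

# Closed-form Gaussian posterior over the weights: N(m, S).
S = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m = beta * S @ Phi.T @ y

# Posterior predictive for a new input x* = 0.5: mean, variance, 95% interval.
phi_star = np.array([1.0, 0.5])
pred_mean = phi_star @ m
pred_var = 1.0 / beta + phi_star @ S @ phi_star
half = 1.96 * np.sqrt(pred_var)
print("posterior mean of [b, w]:", m)
print("95% predictive interval at x*=0.5:", (pred_mean - half, pred_mean + half))
```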
As a matter of fact, Bayesian Deep Learning is one of the hottest topics right now! Don't believe me? Believe NIPS: starting just last year (2016!) it has hosted a dedicated workshop on the subject; you can find out more at http://bayesiandeeplearning.org.
Here are a few other useful links:
- Dustin Tran, the lead developer of Edward, a library for probabilistic modeling, inference, and criticism: http://dustintran.com
- 2017 O'Reilly talk by Yarin Gal: http://mlg.eng.cam.ac.uk/yarin/PDFs/2017_OReilly_talk.pdf (video: https://www.safaribooksonline.com/library/view/oreilly-artificial-intelligence/9781491976289/video311817.html)
- Building a Bayesian Deep Learning Classifier: https://medium.com/towards-data-science/building-a-bayesian-deep-learning-classifier-ece1845bc09
- Awesome Bayesian Deep Learning: https://github.com/robi56/awesome-bayesian-deep-learning
- Edward, a library for probabilistic modeling, inference, and criticism: http://edwardlib.org
- Andrew Rowan - Bayesian Deep Learning with Edward (and a trick using Dropout; a sketch of that trick follows this list): https://www.youtube.com/watch?v=I09QVNrUS3Q
- Bayesian Deep Learning - NIPS 2016 workshop: http://bayesiandeeplearning.org/2016/index.html
- PyCon 2017 - Bayesian Machine Learning: https://github.com/UnataInc/PyCon2017
- While My MCMC Gently Samples: http://twiecki.github.io/blog/2016/06/01/bayesian-deep-learning/
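As far as I understand, the "trick using Dropout" in the Yarin Gal and Andrew Rowan talks above is Monte Carlo dropout: keep dropout switched on at prediction time and treat the spread of many stochastic forward passes as a rough posterior predictive. Here is a minimal sketch of that idea using tf.keras; the architecture, dropout rate, and toy data are my own illustrative choices, not taken from those talks.

```python
import numpy as np
import tensorflow as tf

# Toy 1-D regression data, purely illustrative: y = sin(x) + noise.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(200, 1)).astype("float32")
y = (np.sin(x) + 0.1 * rng.standard_normal((200, 1))).astype("float32")

# A small network with dropout after each hidden layer.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.1),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.1),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=200, verbose=0)

# Monte Carlo dropout: call the model with training=True so dropout stays
# active, then aggregate many stochastic forward passes.
x_test = np.linspace(-3, 3, 100, dtype="float32").reshape(-1, 1)
samples = np.stack([model(x_test, training=True).numpy() for _ in range(100)])
pred_mean = samples.mean(axis=0)   # predictive mean
pred_std = samples.std(axis=0)     # spread = a rough uncertainty estimate
print(pred_mean[:3].ravel(), pred_std[:3].ravel())
```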