
Sampling-free variational inference of Bayesian neural networks by variance backpropagation

Title Sampling-free variational inference of Bayesian neural networks by variance backpropagation
Author Haußmann, M., Hamprecht, F. A., Kandemir, M.
Publication Date 2019
Publisher Association for Uncertainty in Artificial Intelligence (AUAI)
Type Document
Language English
Digital Yes
Manuscript No
Library Özyeğin Üniversitesi
Accession Number 2-s2.0-85084012503
Record Number 3482b6a0-79c3-4955-9dbc-a336086f2de8
Location Computer Science
Date 2019
Abstract We propose a new Bayesian neural net formulation that affords variational inference for which the evidence lower bound is analytically tractable subject to a tight approximation. We achieve this tractability by (i) decomposing ReLU nonlinearities into the product of an identity and a Heaviside step function, and (ii) introducing a separate path that separates the neural net expectation from its variance. We demonstrate formally that introducing separate latent binary variables to the activations allows representing the neural network likelihood as a chain of linear operations. Performing variational inference on this construction enables a sampling-free computation of the evidence lower bound, which is a more effective approximation than the widely applied Monte Carlo sampling and CLT-related techniques. We evaluate the model on a range of regression and classification tasks against BNN inference alternatives, showing competitive or improved performance over the current state-of-the-art.
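The core idea in the abstract — writing ReLU(h) = h · H(h) with a latent binary gate, then pushing means and variances (rather than samples) through each layer — can be illustrated with a small moment-propagation sketch. This is an illustrative reading under a mean-field Gaussian assumption on weights and pre-activations, not the paper's exact derivation; all function names are hypothetical.

```python
import math
import numpy as np


def norm_cdf(x):
    # Standard normal CDF, vectorized via math.erf (no SciPy needed).
    return 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))


def linear_moments(mu_x, var_x, mu_w, var_w):
    """Mean and elementwise variance of h = W x for independent W and x
    with factorized (mean-field) moments. Uses the identity
    Var[w*x] = E[w]^2 Var[x] + Var[w] E[x]^2 + Var[w] Var[x]."""
    mu_h = mu_w @ mu_x
    var_h = (mu_w ** 2) @ var_x + var_w @ (mu_x ** 2) + var_w @ var_x
    return mu_h, var_h


def relu_moments(mu_h, var_h, eps=1e-12):
    """ReLU(h) = z * h with z = Heaviside(h). Treating the gate as a
    Bernoulli variable with p = P(h > 0) under a Gaussian assumption on h
    (the latent-binary-variable view), the output moments stay closed-form:
    E[f] = p * E[h] and E[f^2] = p * E[h^2]."""
    p = norm_cdf(mu_h / np.sqrt(var_h + eps))
    mu_f = p * mu_h
    var_f = p * (mu_h ** 2 + var_h) - mu_f ** 2
    return mu_f, np.maximum(var_f, 0.0)  # clamp tiny negative round-off
```

Stacking `linear_moments` and `relu_moments` layer by layer yields an analytic forward pass over means and variances, which is what makes a sampling-free evidence lower bound possible in this family of methods.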
