
Variational closed-Form deep neural net inference

Title: Variational closed-Form deep neural net inference
Author: Kandemir, Melih
Publication Date: 2018-09
Publisher: Elsevier
Subjects: Bayesian Neural Networks, Variational Bayes, Online learning, Active learning
Type: Periodical
Language: English
Digital: Yes
Manuscript: No
Library: Özyeğin Üniversitesi
Inventory Number: 0167-8655
Record Number: bafad6f4-fd58-46ac-8b4c-98cf9e4ee511
Location: Computer Science
Date: 2018-09
Abstract: We introduce a Bayesian construction for deep neural networks that is amenable to mean-field variational inference operating solely by closed-form update rules. Hence, it requires no manually tuned learning rate. We show that by this virtue our model makes effective deep learning possible in three setups where conventional neural nets are known to perform suboptimally: i) online learning, ii) learning from small data, and iii) active learning. We compare our approach to earlier Bayesian neural network inference techniques, spanning from expectation propagation to gradient-based variational Bayes, as well as to deterministic neural nets with various activation functions. We observe our approach to improve on all these alternatives on two mainstream vision benchmarks and two medical data sets: diabetic retinopathy screening and exudate detection from eye fundus images.
DOI: 10.1016/j.patrec.2018.07.001
Volume: 112
Go to source: Özyeğin Üniversitesi
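The abstract's central claim is that mean-field variational inference with closed-form update rules needs no manually tuned learning rate. A minimal sketch of that principle, on the simplest conjugate case rather than the paper's actual deep model (the Gaussian-mean setup, function name, and all numbers here are illustrative assumptions, not taken from the article):

```python
import numpy as np

# Conjugate Gaussian-mean inference: the posterior follows from a single
# closed-form update, with no gradient steps and no learning rate.
# Prior: mu ~ N(m0, s0^2); likelihood: x_i ~ N(mu, s^2) with s known.

def closed_form_posterior(x, m0=0.0, s0=10.0, s=1.0):
    """Exact posterior mean and variance of mu given observations x."""
    n = len(x)
    post_precision = 1.0 / s0**2 + n / s**2        # prior + data precision
    post_var = 1.0 / post_precision
    post_mean = post_var * (m0 / s0**2 + np.sum(x) / s**2)
    return post_mean, post_var

# Example: data drawn around mu = 2.0; the closed-form update recovers it
# in one shot, which is why no step-size tuning is involved.
rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=1000)
post_mean, post_var = closed_form_posterior(data)
```

In the conjugate case the update is exact; the article's contribution, per the abstract, is a deep-network construction that keeps every mean-field update in this closed form.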
