Dropout regularization in hierarchical mixture of experts | Kütüphane.osmanlica.com

Dropout regularization in hierarchical mixture of experts

Name Dropout regularization in hierarchical mixture of experts
Author Alpaydın, Ahmet İbrahim Ethem
Publication Date 2021-01-02
Publisher Elsevier
Subject Dropout, Hierarchical models, Mixture of experts, Regularization
Type Periodical
Language English
Digital Yes
Manuscript No
Library Özyeğin Üniversitesi
Inventory Number 0925-2312
Record Number ac1faa23-e384-4c11-94da-01238a514c35
Location Computer Science
Date 2021-01-02
Sample Text Dropout is a very effective method for preventing overfitting and has become the go-to regularizer for multi-layer neural networks in recent years. A hierarchical mixture of experts is a hierarchically gated model that defines a soft decision tree: leaves correspond to experts and decision nodes correspond to gating models that softly choose between their children, so the model defines a soft hierarchical partitioning of the input space. In this work, we propose a variant of dropout for hierarchical mixtures of experts that is faithful to the tree hierarchy defined by the model, as opposed to the flat, unit-wise independent application of dropout used with multi-layer perceptrons. We show on synthetic regression data and on the MNIST, CIFAR-10, and SSTB datasets that our proposed dropout mechanism prevents overfitting on trees with many levels, improving generalization and providing smoother fits.
DOI 10.1016/j.neucom.2020.08.052
Volume 419
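The abstract describes a soft decision tree in which leaves are expert models, internal nodes are soft gates, and dropout respects the tree hierarchy rather than dropping units independently. The following is a minimal NumPy sketch of one plausible reading of that mechanism (dropping an entire subtree at a gate during training); all class and parameter names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Node:
    """A soft decision tree node: internal nodes gate, leaves are linear experts."""
    def __init__(self, dim, depth):
        self.w = rng.normal(size=dim)        # gating weights or expert weights
        if depth == 0:
            self.left = self.right = None    # leaf: a linear expert
        else:
            self.left = Node(dim, depth - 1)
            self.right = Node(dim, depth - 1)

    def predict(self, x, drop_p=0.0, train=False):
        if self.left is None:
            return x @ self.w                # expert output
        g = sigmoid(x @ self.w)              # soft gate: weight of the left child
        if train and drop_p > 0.0 and rng.random() < drop_p:
            # Hierarchy-aware dropout (sketch): drop one subtree at this gate
            # and route all of the probability mass to the surviving child,
            # instead of zeroing individual units independently.
            child = self.left if rng.random() < 0.5 else self.right
            return child.predict(x, drop_p, train)
        return (g * self.left.predict(x, drop_p, train)
                + (1.0 - g) * self.right.predict(x, drop_p, train))

tree = Node(dim=3, depth=2)                  # depth-2 tree: 4 experts, 3 gates
x = rng.normal(size=3)
y = tree.predict(x)                          # test time: fully soft, deterministic
```

At test time no subtree is dropped, so the prediction is the usual gate-weighted mixture over all experts; only during training does a stochastic path replace the full mixture.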
