Time-correlated sparsification for communication-efficient federated learning | Kütüphane.osmanlica.com

Title: Time-correlated sparsification for communication-efficient federated learning
Author: Özfatura, E., Özfatura, Ahmet Kerem, Gündüz, D.
Publication Date: 2021
Place of Publication: IEEE
Type: Document
Language: English
Digital: Yes
Manuscript: No
Library: Özyeğin Üniversitesi
Accession Number: 978-153868209-8
Record Number: 61750359-461f-4a47-a58f-c465853dc910
Date: 2021
Notes: Engineering and Physical Sciences Research Council; European Research Council
Abstract: Federated learning (FL) enables multiple clients to collaboratively train a shared model, with the help of a parameter server (PS), without disclosing their local datasets. However, due to the increasing size of the trained models, the communication load arising from the iterative exchanges between the clients and the PS often becomes a performance bottleneck. Sparse communication is often employed to reduce this load, where only a small subset of the model updates is communicated from the clients to the PS. In this paper, we introduce a novel time-correlated sparsification (TCS) scheme, which builds upon the notion that a sparse communication framework can be viewed as identifying the most significant elements of the underlying model. Hence, TCS exploits the correlation between the sparse representations at consecutive iterations in FL, so that the overhead due to encoding of the sparse representation can be significantly reduced without compromising the test accuracy. Through extensive simulations on the CIFAR-10 dataset, we show that TCS can achieve centralized training accuracy with 100-fold sparsification, and up to a 2000-fold reduction in the communication load when employed with quantization.
DOI: 10.1109/ISIT45174.2021.9518221
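The abstract above describes the core idea: each client sends only its top-k model updates, and because the significant coordinates change slowly across rounds, most selected positions can be reused from the previous iteration, so only a few new indices need to be encoded. The following is a minimal illustrative sketch of that idea in NumPy; the function names, the split into `k` reused and `k_new` fresh positions, and all parameters are hypothetical simplifications, not the paper's actual TCS algorithm.

```python
import numpy as np

def topk_mask(x, k):
    """Boolean mask selecting the k largest-magnitude entries of x."""
    idx = np.argpartition(np.abs(x), -k)[-k:]
    mask = np.zeros(x.shape, dtype=bool)
    mask[idx] = True
    return mask

def tcs_step(update, prev_mask, k, k_new):
    """Illustrative time-correlated sparsification step (not the paper's
    exact scheme): keep the k largest entries *inside* the previous mask,
    whose positions the server already knows and which therefore need no
    index encoding, plus k_new fresh entries from *outside* it, whose
    indices must be transmitted explicitly."""
    inside = np.where(prev_mask, update, 0.0)    # candidates at reused positions
    outside = np.where(prev_mask, 0.0, update)   # candidates at new positions
    mask = topk_mask(inside, k) | topk_mask(outside, k_new)
    sparse = np.where(mask, update, 0.0)
    return sparse, mask

# Toy demonstration: a 1000-dim update, a previous-round mask of 10
# positions, and a new mask reusing 8 of them plus 2 fresh indices.
rng = np.random.default_rng(0)
update = rng.normal(size=1000)
prev_mask = topk_mask(update + 0.1 * rng.normal(size=1000), 10)
sparse, mask = tcs_step(update, prev_mask, k=8, k_new=2)
```

Only the `k_new` positions outside the previous mask require explicit index encoding here, which is the source of the encoding-overhead reduction the abstract refers to; the published scheme additionally combines this with quantization to reach the reported 2000-fold reduction.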
