Improving Computational Efficiency in Federated Learning via Powerpropagation

Guastella, Adriano (2024) Improving Computational Efficiency in Federated Learning via Powerpropagation. [Master's degree thesis (Laurea magistrale)], Università di Bologna, Degree Programme in Computer Engineering [LM-DM270], full-text document not available
The full text is not available at the author's request. (Contact the author)

Abstract

Federated Learning (FL) represents a paradigm shift in machine learning: instead of centralizing data for training, the model is sent directly to the peripheral devices that hold the data. This approach preserves data privacy and gives the model access to a wide range of information without transferring sensitive data. However, FL introduces challenges such as computational constraints on client devices and communication limitations. Sparse models offer a solution by reducing both communication costs and computational load. This work introduces Power-SWAT, a training accelerator that combines the Powerpropagation and SWAT techniques to improve computational efficiency in FL. By integrating Powerpropagation, we address some of the primary challenges associated with sparse networks in FL, and by slightly modifying the SWAT pipeline we apply a high sparsity ratio to the model with minimal or no loss of performance, reaching sparsity levels of up to 99.9%. This results in a 145x reduction in communication costs, while the approach remains easy to implement. Experimental results demonstrate significant reductions in computational operations and communication costs, making FL more scalable and efficient.
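
For readers unfamiliar with the two techniques named above, the following is a minimal, hypothetical sketch of how Powerpropagation (reparameterizing each weight as w = θ·|θ|^(α−1) and training θ) can be combined with a SWAT-style top-k weight sparsification in the forward pass. It assumes SWAT refers to Sparse Weight Activation Training and uses PyTorch; the class name PowerPropLinear and the alpha and sparsity parameters are illustrative assumptions, not the thesis' actual Power-SWAT implementation.

    # Hypothetical sketch: Powerpropagation + SWAT-style top-k weight sparsity.
    # Not the thesis' Power-SWAT code; names and defaults are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PowerPropLinear(nn.Module):
        """Linear layer with Powerpropagation reparameterization and
        SWAT-style top-k weight sparsification in the forward pass."""

        def __init__(self, in_features, out_features, alpha=2.0, sparsity=0.9):
            super().__init__()
            self.alpha = alpha        # Powerpropagation exponent (alpha >= 1)
            self.sparsity = sparsity  # fraction of weights dropped each forward pass
            self.theta = nn.Parameter(torch.empty(out_features, in_features))
            self.bias = nn.Parameter(torch.zeros(out_features))
            nn.init.kaiming_uniform_(self.theta, a=5 ** 0.5)

        def effective_weight(self):
            # Powerpropagation: w = theta * |theta|^(alpha - 1).
            # Gradients flow to theta, so low-magnitude weights receive small
            # updates, which concentrates magnitude and eases pruning.
            return self.theta * self.theta.abs().pow(self.alpha - 1)

        def forward(self, x):
            w = self.effective_weight()
            # SWAT-style sparsification: keep only the top-k weights by
            # magnitude for this pass; the dense parameters are still trained.
            k = max(1, int(w.numel() * (1.0 - self.sparsity)))
            threshold = w.abs().flatten().kthvalue(w.numel() - k + 1).values
            mask = (w.abs() >= threshold).to(w.dtype)
            return F.linear(x, w * mask, self.bias)

    # Usage example (illustrative sizes):
    layer = PowerPropLinear(784, 10, alpha=2.0, sparsity=0.99)
    out = layer(torch.randn(32, 784))

In an FL round, only the weights surviving such a mask (plus their indices) would need to be exchanged with the server, which is presumably how the reduction in communication cost reported in the abstract is realized; the exact Power-SWAT pipeline may differ.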

Document type: Master's degree thesis (Laurea magistrale)
Thesis author: Guastella, Adriano
Thesis supervisor:
Thesis co-supervisor:
School:
Degree programme:
Curriculum: CURRICULUM INGEGNERIA INFORMATICA
Degree programme regulations: DM270
Keywords: Federated Learning, Powerpropagation, SWAT, Sparsification, Machine Learning, Deep Learning
Thesis defence date: 19 March 2024
URI:
