
A Resource-Aware Federated Learning Simulation Platform

Growing concerns over users' data privacy increasingly render distributed Machine
Learning applications, which are usually data-hungry, infeasible. Federated
Learning has emerged as a privacy-preserving distributed machine learning paradigm
in which each client's dataset is kept locally and only the local model parameters
are transmitted to the central server.
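
To make the paradigm concrete, the sketch below illustrates one Federated Averaging (FedAvg)
round of the kind described above: each client trains on its private data and sends only model
parameters, which the server averages. This is a generic illustration, not the thesis's
platform; the local_update routine, the least-squares objective, and the toy client data are
assumptions made for the example.

```python
import numpy as np

def local_update(params, data, lr=0.1, epochs=1):
    """Hypothetical local training step on a client's private dataset."""
    x, y = data
    for _ in range(epochs):
        grad = x.T @ (x @ params - y) / len(y)   # least-squares gradient
        params = params - lr * grad
    return params

def fedavg_round(global_params, clients):
    """One round: broadcast, local training, sample-weighted averaging."""
    updates, weights = [], []
    for data in clients:
        updates.append(local_update(global_params.copy(), data))
        weights.append(len(data[1]))              # weight by local sample count
    weights = np.array(weights, dtype=float) / sum(weights)
    return sum(w * u for w, u in zip(weights, updates))

# Toy usage: three clients holding private linear-regression data.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(3)]
theta = np.zeros(3)
for _ in range(10):
    theta = fedavg_round(theta, clients)
```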
However, adopting the Federated Learning paradigm introduces new edge computing
challenges, since it assumes that computationally intensive training tasks can be
executed locally by each device. The diverse hardware resources in a population of
edge devices (e.g., different smartphone models) can negatively impact Federated
Learning performance at both the global and local levels.
This thesis contributes a hardware-aware Federated Learning simulation platform that
provides comprehensive support for analyzing the impact of hardware heterogeneity on
Federated Learning performance metrics by modeling the computation and communication
costs associated with training tasks.
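
Since the full text is embargoed, the following is only a minimal sketch of how per-round
computation and communication costs might be modeled for heterogeneous devices. The device
profiles, throughput and bandwidth figures, and cost formulas are assumptions for
illustration, not the thesis's actual model.

```python
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    flops_per_sec: float      # effective training throughput of the device (assumed)
    uplink_bps: float         # uplink bandwidth to the central server (assumed)

def round_time(device: DeviceProfile, train_flops: float, model_bits: float) -> float:
    """Estimated wall-clock time for one client: local training + model upload."""
    compute_t = train_flops / device.flops_per_sec
    comm_t = model_bits / device.uplink_bps
    return compute_t + comm_t

# In a synchronous round, the server waits for the slowest (straggler) client.
devices = [
    DeviceProfile(flops_per_sec=2e9,  uplink_bps=5e6),   # low-end phone (assumed)
    DeviceProfile(flops_per_sec=1e10, uplink_bps=2e7),   # high-end phone (assumed)
]
train_flops = 5e10          # cost of one local training pass (assumed)
model_bits = 8 * 4e6        # 4 MB of model parameters (assumed)
print(max(round_time(d, train_flops, model_bits) for d in devices))
```

Under such a model, the round completion time is dominated by the weakest devices, which is
one way hardware heterogeneity degrades global Federated Learning performance.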

Identifier: oai:union.ndltd.org:kaust.edu.sa/oai:repository.kaust.edu.sa:10754/670246
Date: 07 1900
Creators: Leandro, Fellipe
Contributors: Salama, Khaled N., Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division, Eltawil, Ahmed, Zhang, Xiangliang
Source Sets: King Abdullah University of Science and Technology
Language: English
Detected Language: English
Type: Thesis
Rights: 2022-07-14. At the time of archiving, the student author of this thesis opted to temporarily restrict access to it. The full text of this thesis will become available to the public after the expiration of the embargo on 2022-07-14.