Author: Reddy, Sashlin
Date accessioned: 2021-04-30
Date available: 2021-04-30
Date issued: 2020
Citation: Reddy, Sashlin (2020). A comparative analysis of dynamic averaging techniques in federated learning. University of the Witwatersrand, Johannesburg. https://hdl.handle.net/10539/31059
Description: A dissertation submitted in fulfilment of the requirements for the degree Master of Science in the School of Computer Science and Applied Mathematics, Faculty of Science, University of the Witwatersrand, 2020.

Abstract: Due to advancements in mobile technology and user privacy concerns, federated learning has emerged as a popular machine learning (ML) method that pushes the training of statistical models to the edge. Federated learning trains a shared model under the coordination of a centralized server across a federation of participating clients. In practice, federated learning methods must overcome large network delays and bandwidth limits. To overcome these communication bottlenecks, recent works propose methods that reduce the communication frequency with negligible impact on model accuracy (also referred to as model performance). Naive methods simply reduce the number of communication rounds. However, communication can be invested more efficiently through dynamic communication protocols, termed dynamic averaging. Few works have addressed such protocols, and fewer still base the dynamic averaging protocol on the diversity of the data and the loss. In this work, we introduce dynamic averaging frameworks based on both the diversity of the data and the loss encountered by each client. This removes the assumption that each client participates equally and addresses the properties of federated learning. Results show that the overall communication overhead is reduced with a negligible decrease in accuracy.

Format: Online resource (108 leaves)
Language: en
Subjects: Web-based instruction; Machine learning
Title: A comparative analysis of dynamic averaging techniques in federated learning
Type: Thesis
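
The abstract describes a dynamic averaging protocol in which clients communicate only when a condition on their local state is met, rather than on every round. The dissertation's exact protocol is not reproduced here; the following is a minimal Python sketch of the general idea under stated assumptions: a linear model trained with squared loss, and a hypothetical loss-aware trigger in which a client uploads its model only when its divergence from the last global model exceeds a threshold scaled by its local loss. The names `dynamic_fedavg`, `div_threshold`, and the trigger rule itself are illustrative assumptions, not the author's method.

    import numpy as np

    def local_update(weights, X, y, lr=0.1, epochs=1):
        """One or more epochs of local gradient descent on a linear model (squared loss)."""
        w = weights.copy()
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)
            w -= lr * grad
        return w

    def local_loss(weights, X, y):
        """Mean squared error of the linear model on this client's data."""
        return float(np.mean((X @ weights - y) ** 2))

    def dynamic_fedavg(clients, rounds=20, div_threshold=0.5):
        """
        Hypothetical dynamic averaging loop: each round, every client trains
        locally, but only uploads its model when its divergence from the last
        global model exceeds a loss-scaled threshold. The communication count
        can be compared against an always-communicate FedAvg baseline.
        """
        dim = clients[0][0].shape[1]
        global_w = np.zeros(dim)
        comms = 0
        for _ in range(rounds):
            updates, sizes = [], []
            for X, y in clients:
                w = local_update(global_w, X, y)
                divergence = np.linalg.norm(w - global_w)
                # Assumed trigger: high-loss clients face a lower effective
                # threshold, so struggling clients synchronize more often.
                if divergence > div_threshold / (1.0 + local_loss(w, X, y)):
                    updates.append(w)
                    sizes.append(len(y))
                    comms += 1
            if updates:
                # Average only the uploaded models, weighted by client data size.
                global_w = np.average(updates, axis=0, weights=sizes)
        return global_w, comms

A small usage example on synthetic data, to show that the loop runs and to read off the communication count:

    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    clients = []
    for _ in range(5):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=50)
        clients.append((X, y))
    w, comms = dynamic_fedavg(clients)
    print(w, comms)  # learned weights and total uploads across all rounds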