GRADUAL FEDERATED LEARNING USING SIMULATED ANNEALING
Cited 2 times in Web of Science · Cited 2 times in Scopus
- Authors
- Issue Date
- 2021-06
- Publisher
- IEEE
- Citation
- 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), pp.3090-3094
- Abstract
- Federated learning is a machine learning framework that enables training AI models over a network of user devices without revealing the data stored on those devices. A popular technique for enhancing the learning performance of user devices is to compute a global model at the server by averaging the locally trained models of the devices. This global model is then sent back to the user devices, and every device applies it in the next training iteration. However, this average-based model is not always better than a device's own local update. In this work, we put forth a new update strategy based on the simulated annealing (SA) algorithm, in which each user device probabilistically chooses its training parameters between the global evaluation model and its local model. The proposed technique, dubbed simulated annealing-based federated learning (SAFL), is effective for a wide class of federated learning problems. Numerical experiments demonstrate that SAFL outperforms the conventional approach on different benchmark datasets, achieving an accuracy improvement of 50% within a few iterations.
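- The probabilistic choice between the global and local models described in the abstract can be sketched with a Metropolis-style acceptance test and a cooling temperature schedule, as is standard in simulated annealing. This is a minimal, hypothetical illustration; the function names, loss-based acceptance criterion, and cooling parameters are assumptions, not the authors' implementation.

```python
import math
import random

def choose_update(local_loss, global_loss, temperature):
    """Pick which model a device trains from next round (hypothetical sketch).

    If the global (averaged) model is at least as good as the device's
    local model, take it. Otherwise, still accept it with probability
    exp(-delta / T), so early rounds (high T) explore while later
    rounds (low T) mostly keep the better local model.
    """
    if global_loss <= local_loss:
        return "global"
    delta = global_loss - local_loss
    if random.random() < math.exp(-delta / temperature):
        return "global"
    return "local"

def temperature_schedule(round_idx, t0=1.0, cooling=0.9):
    """Geometric cooling: T decays by a constant factor each round."""
    return t0 * (cooling ** round_idx)
```

A device would call `choose_update` once per federated round with its own validation losses, using `temperature_schedule(round_idx)` as the temperature, so the probability of adopting a worse global model shrinks as training progresses.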
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.