Information-theoretic privacy in federated submodel learning

Cited 0 times in Web of Science; cited 2 times in Scopus
Authors

Kim, Minchul; Lee, Jungwoo

Issue Date
2022-01
Publisher
Korean Institute of Communications and Information Sciences (KICS)
Citation
ICT Express
Abstract
We consider information-theoretic privacy in federated submodel learning, where a global server holds multiple submodels. Compared with the privacy guarantee of conventional federated submodel learning, in which secure aggregation is adopted to ensure privacy, information-theoretic privacy provides stronger protection for the local machine's submodel selection. We propose an achievable scheme that partially adopts a conventional private information retrieval (PIR) scheme achieving the minimum download cost. We compare the achievable scheme with a naïve approach to federated submodel learning with information-theoretic privacy in terms of computation and communication overhead.
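To illustrate the kind of submodel-selection privacy the abstract refers to, below is a minimal sketch of a generic two-server information-theoretic PIR download step, assuming two non-colluding servers that each hold replicas of all submodels. This is a textbook XOR-based construction shown only for illustration, not the scheme proposed in the paper; the function names (pir_query, pir_answer, pir_reconstruct) are hypothetical.

import os
import secrets

def pir_query(num_submodels: int, wanted: int):
    """Client side: split the indicator vector of `wanted` into two random XOR shares."""
    share1 = [secrets.randbits(1) for _ in range(num_submodels)]
    share2 = share1.copy()
    share2[wanted] ^= 1  # share1 XOR share2 equals the indicator of `wanted`
    return share1, share2

def pir_answer(database, share):
    """Server side: XOR together the submodels selected by the share bits."""
    size = len(database[0])
    acc = bytearray(size)
    for bit, submodel in zip(share, database):
        if bit:
            for j in range(size):
                acc[j] ^= submodel[j]
    return bytes(acc)

def pir_reconstruct(ans1, ans2):
    """Client side: XOR the two answers to recover the wanted submodel."""
    return bytes(a ^ b for a, b in zip(ans1, ans2))

if __name__ == "__main__":
    # Toy database: 4 submodels of 8 bytes each, replicated on both servers.
    database = [os.urandom(8) for _ in range(4)]
    wanted = 2
    q1, q2 = pir_query(len(database), wanted)
    a1 = pir_answer(database, q1)  # each single server sees only a uniformly random share
    a2 = pir_answer(database, q2)
    assert pir_reconstruct(a1, a2) == database[wanted]

Because each server observes only a uniformly random bit vector, neither learns which submodel was selected, which is the information-theoretic guarantee on submodel selection; the paper's scheme additionally addresses the download cost and the computation and communication overhead of applying such ideas to federated submodel learning.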
ISSN
2405-9595
URI
https://hdl.handle.net/10371/184013
DOI
https://doi.org/10.1016/j.icte.2022.02.008