Information-theoretic privacy in federated submodel learning
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Minchul | - |
dc.contributor.author | Lee, Jungwoo | - |
dc.date.accessioned | 2022-06-24T07:14:18Z | - |
dc.date.available | 2022-06-24T07:14:18Z | - |
dc.date.created | 2022-05-09 | - |
dc.date.issued | 2022-01 | - |
dc.identifier.citation | ICT Express | - |
dc.identifier.issn | 2405-9595 | - |
dc.identifier.uri | https://hdl.handle.net/10371/184013 | - |
dc.description.abstract | © 2022 The Authors. We consider information-theoretic privacy in federated submodel learning, where a global server holds multiple submodels. Compared to the privacy guarantee of conventional federated submodel learning, which relies on secure aggregation, information-theoretic privacy provides stronger protection of the submodel selection made by the local machine. We propose an achievable scheme that partially adopts the conventional private information retrieval (PIR) scheme achieving the minimum download cost. We compare the computation and communication overhead of the achievable scheme against a naïve approach to federated submodel learning with information-theoretic privacy. | - |
dc.language | English | - |
dc.publisher | Korean Institute of Communications and Information Sciences (KICS) | - |
dc.title | Information-theoretic privacy in federated submodel learning | - |
dc.type | Article | - |
dc.identifier.doi | 10.1016/j.icte.2022.02.008 | - |
dc.citation.journaltitle | ICT Express | - |
dc.identifier.scopusid | 2-s2.0-85126069987 | - |
dc.description.isOpenAccess | N | - |
dc.contributor.affiliatedAuthor | Lee, Jungwoo | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
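The abstract refers to a conventional information-theoretic PIR scheme as a building block: the client retrieves one submodel from replicated servers without any single server learning which submodel was selected. The paper's own achievable scheme is not reproduced in this record; as a minimal illustrative sketch only, the classic two-server XOR-based PIR protocol (assuming non-colluding servers holding identical replicas, with hypothetical names `pir_2server` and `server_answer`) shows the core idea:

```python
import secrets

def pir_2server(database, i):
    """Toy 2-server information-theoretic PIR (XOR scheme).

    database: list of equal-length byte records, replicated on both servers.
    i: index the client wants, hidden from each (non-colluding) server.
    """
    K = len(database)
    # Client: uniformly random 0/1 query vector for server 1;
    # server 2 gets the same vector with bit i flipped.
    q1 = [secrets.randbelow(2) for _ in range(K)]
    q2 = q1.copy()
    q2[i] ^= 1

    def server_answer(q):
        # Each server XORs together the records its query vector selects.
        # Since each vector is marginally uniform, neither server alone
        # learns anything about i (information-theoretic privacy).
        ans = bytes(len(database[0]))
        for bit, rec in zip(q, database):
            if bit:
                ans = bytes(a ^ b for a, b in zip(ans, rec))
        return ans

    a1, a2 = server_answer(q1), server_answer(q2)
    # The two selected sets differ only in record i, so XORing the
    # answers cancels every record except the one requested.
    return bytes(x ^ y for x, y in zip(a1, a2))
```

Note the download overhead this sketch makes visible: the client downloads one record-sized answer from each server to recover a single record, which is why minimizing download cost (the property of the PIR scheme the abstract adopts) is the central efficiency question.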