TY - GEN
T1 - Fed2KD
T2 - 17th Conference on Wireless On-Demand Network Systems and Services, WONS 2022
AU - Sun, Chuanneng
AU - Jiang, Tingcong
AU - Zonouz, Saman
AU - Pompili, Dario
N1 - Funding Information:
Acknowledgement: This work was supported by the NSF RTML Award No. CCF-1937403.
Publisher Copyright:
© 2022 IFIP.
PY - 2022
Y1 - 2022
N2 - The world has suffered greatly from the COVID-19 pandemic. Although vaccines have been developed, we must still be prepared for its variants and for possible future pandemics. To provide people with pandemic risk assessments without violating their privacy, a Federated Learning (FL) framework is envisioned. However, most existing FL frameworks only work with homogeneous models, i.e., models with the same configuration, ignoring users' preferences and the varying capabilities of their devices. To this end, we propose a novel two-way knowledge distillation-based FL framework, Fed2KD. Knowledge is exchanged between the global and local models by distilling information into or out of a tiny model with a unified configuration. Such distillation, however, cannot be conducted without a common dataset. To overcome this bottleneck, we leverage a Conditional Variational Autoencoder (CVAE) to generate data that serves as a proxy dataset for distillation. The proposed framework is first evaluated on benchmark datasets (MNIST and FashionMNIST) to compare its performance against existing methods such as Federated Averaging (FedAvg). Under non-independent and identically distributed (non-IID) data, Fed2KD improves performance over FedAvg by up to 30% on MNIST and 18% on FashionMNIST. Fed2KD is then evaluated on pandemic risk assessment tasks through a mobile app we developed, DP4coRUna, which provides indoor risk prediction.
AB - The world has suffered greatly from the COVID-19 pandemic. Although vaccines have been developed, we must still be prepared for its variants and for possible future pandemics. To provide people with pandemic risk assessments without violating their privacy, a Federated Learning (FL) framework is envisioned. However, most existing FL frameworks only work with homogeneous models, i.e., models with the same configuration, ignoring users' preferences and the varying capabilities of their devices. To this end, we propose a novel two-way knowledge distillation-based FL framework, Fed2KD. Knowledge is exchanged between the global and local models by distilling information into or out of a tiny model with a unified configuration. Such distillation, however, cannot be conducted without a common dataset. To overcome this bottleneck, we leverage a Conditional Variational Autoencoder (CVAE) to generate data that serves as a proxy dataset for distillation. The proposed framework is first evaluated on benchmark datasets (MNIST and FashionMNIST) to compare its performance against existing methods such as Federated Averaging (FedAvg). Under non-independent and identically distributed (non-IID) data, Fed2KD improves performance over FedAvg by up to 30% on MNIST and 18% on FashionMNIST. Fed2KD is then evaluated on pandemic risk assessment tasks through a mobile app we developed, DP4coRUna, which provides indoor risk prediction.
KW - COVID-19 Risk Assessment
KW - Federated Learning
KW - Knowledge Distillation
UR - http://www.scopus.com/inward/record.url?scp=85130283952&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85130283952&partnerID=8YFLogxK
U2 - 10.23919/WONS54113.2022.9764443
DO - 10.23919/WONS54113.2022.9764443
M3 - Conference contribution
AN - SCOPUS:85130283952
T3 - 17th Conference on Wireless On-Demand Network Systems and Services, WONS 2022
BT - 17th Conference on Wireless On-Demand Network Systems and Services, WONS 2022
A2 - Welzl, Michael
A2 - Karlsson, Gunnar
A2 - Alay, Ozgu
A2 - Peng, Chunyi
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 30 March 2022 through 1 April 2022
ER -