TY - GEN
T1 - Integrating Homomorphic Encryption and Synthetic Data in FL for Privacy and Learning Quality
AU - Wang, Yenan
AU - Chiasserini, Carla Fabiana
AU - Schiller, Elad Michael
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - Federated learning (FL) enables collaborative training of machine learning models without sharing sensitive client data, making it a cornerstone for privacy-critical applications. However, FL faces the dual challenge of ensuring learning quality and robust privacy protection while keeping resource consumption low, particularly when using computationally expensive techniques such as homomorphic encryption (HE). In this work, we enhance an HE-based privacy-preserving FL process by integrating it with synthetic data generation and an interleaving strategy. Specifically, our solution, named Alternating Federated Learning (Alt-FL), alternates between local training on authentic data (authentic rounds) and on synthetic data (synthetic rounds), transferring encrypted model parameters in authentic rounds and plaintext model parameters in synthetic rounds. Our approach improves learning quality (e.g., model accuracy) through datasets enhanced with synthetic data, preserves client data privacy via HE, and keeps encryption and decryption costs manageable through the interleaving strategy. We evaluate our solution against data leakage attacks, such as the DLG attack, demonstrating robust privacy protection. Moreover, Alt-FL provides 13.4% higher model accuracy and decreases HE-related costs by up to 48% compared with Selective HE.
AB - Federated learning (FL) enables collaborative training of machine learning models without sharing sensitive client data, making it a cornerstone for privacy-critical applications. However, FL faces the dual challenge of ensuring learning quality and robust privacy protection while keeping resource consumption low, particularly when using computationally expensive techniques such as homomorphic encryption (HE). In this work, we enhance an HE-based privacy-preserving FL process by integrating it with synthetic data generation and an interleaving strategy. Specifically, our solution, named Alternating Federated Learning (Alt-FL), alternates between local training on authentic data (authentic rounds) and on synthetic data (synthetic rounds), transferring encrypted model parameters in authentic rounds and plaintext model parameters in synthetic rounds. Our approach improves learning quality (e.g., model accuracy) through datasets enhanced with synthetic data, preserves client data privacy via HE, and keeps encryption and decryption costs manageable through the interleaving strategy. We evaluate our solution against data leakage attacks, such as the DLG attack, demonstrating robust privacy protection. Moreover, Alt-FL provides 13.4% higher model accuracy and decreases HE-related costs by up to 48% compared with Selective HE.
KW - Federated learning
KW - Homomorphic encryption
KW - Privacy protection
KW - Resource consumption
UR - https://www.scopus.com/pages/publications/105017614004
U2 - 10.1109/LANMAN66415.2025.11154574
DO - 10.1109/LANMAN66415.2025.11154574
M3 - Conference contribution
AN - SCOPUS:105017614004
T3 - IEEE Workshop on Local and Metropolitan Area Networks
BT - 2025 IEEE 31st International Symposium on Local and Metropolitan Area Networks, LANMAN 2025
PB - Institute of Electrical and Electronics Engineers
T2 - 31st IEEE International Symposium on Local and Metropolitan Area Networks, LANMAN 2025
Y2 - 7 July 2025 through 8 July 2025
ER -