Knowledge-distillation based personalized federated learning with distribution constraints

Ziyang Zhang, Chang Mu, K. Guo, Xiang Tian, Xiangming Xu

Published 2025 in Neural Networks

ABSTRACT

Personalized federated learning (PFL) seeks to develop models that are tailored to the unique data distributions of individual clients. While some methods rely on a global server model to guide personalization, more recent methods focus on directly learning personalized models. Among these, leveraging inter-client correlations has become a widely adopted strategy for personalized model generation. pFedGraph exemplifies this by constructing client relationships based on model similarity for personalized model aggregation, achieving outstanding personalized performance. However, pFedGraph overlooks category distribution information, a critical aspect reflecting data distribution heterogeneity, even though such information has been extensively applied in machine learning and federated learning tasks. Category distribution can serve as a direct and informative metric for measuring inter-client data divergence. Furthermore, pFedGraph underutilizes the global knowledge derived from diverse client datasets, limiting its personalization ability. To address these limitations, we incorporate category distribution constraints into the computation of client-specific aggregation weights, enabling the generation of personalized models enriched with distribution-aware information. Additionally, to mitigate the risk of overfitting to local data and to enhance the use of global knowledge, we align the outputs of personalized models with those of the global model, which is obtained through the classical federated averaging algorithm, thereby effectively transferring shared global knowledge to personalized models. The proposed method consistently outperforms state-of-the-art approaches across diverse data types and distribution scenarios, demonstrating its effectiveness.
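The two ingredients the abstract describes — distribution-aware aggregation weights and distillation from a FedAvg global model — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exact form of the category-distribution constraint is not given in the abstract, so cosine similarity between clients' label histograms and a standard temperature-scaled KL distillation loss are used here as plausible stand-ins.

```python
import numpy as np

def distribution_similarity(p, q):
    # Cosine similarity between two clients' category distributions
    # (an assumed divergence metric; the paper's exact constraint
    # formulation is not specified in the abstract).
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(p @ q / (np.linalg.norm(p) * np.linalg.norm(q)))

def aggregation_weights(category_dists):
    # Client-specific aggregation weights from pairwise
    # category-distribution similarity, row-normalized so each
    # client's weights over all clients sum to 1.
    n = len(category_dists)
    W = np.array([[distribution_similarity(category_dists[i],
                                           category_dists[j])
                   for j in range(n)] for i in range(n)])
    return W / W.sum(axis=1, keepdims=True)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(personal_logits, global_logits, T=2.0):
    # KL(global || personal) on temperature-softened outputs:
    # aligns the personalized model's outputs with the FedAvg
    # global model to transfer shared global knowledge.
    p_g = softmax(np.asarray(global_logits, dtype=float) / T)
    p_l = softmax(np.asarray(personal_logits, dtype=float) / T)
    return float(np.sum(p_g * (np.log(p_g) - np.log(p_l))) * T * T)
```

For example, a client whose label histogram is `[0.7, 0.2, 0.1]` receives a larger aggregation weight toward a client with histogram `[0.6, 0.3, 0.1]` than toward one with `[0.1, 0.2, 0.7]`, and the distillation loss vanishes when the personalized and global models agree.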
