Do Larger Batch Sizes Always Help? Revisiting Incentive-Driven Differential Privacy Federated Learning
Jin Xu, Huiqun Yu, Guisheng Fan, Hengrun Zhang
Published 2026 in IEEE Transactions on Cognitive Communications and Networking

ABSTRACT
Differential Privacy Federated Learning (DP-FL) combines Differential Privacy (DP) with Federated Learning (FL), enabling multiple clients to collaboratively train a shared model while protecting data privacy. However, introducing DP into FL adds noise to model parameters, which typically deteriorates model convergence. Although recent work has revealed a compensation effect obtained by increasing the total batch size, it overlooks the “generalization gap” phenomenon, which is induced by excessively large batch sizes and has long been discussed in the machine learning field. To avoid this other extreme, we strengthen several core components and propose an Incentive-driven Differential Privacy Federated Learning (IDP-FL) framework. First, instead of relying solely on batch size, the proposed framework jointly considers the non-IID degree of local data and each client’s privacy budget, minimizing the difference between the optimal batch size for each selected client and its corresponding critical batch size. Second, we reconfigure the batch size for each selected client by balancing the negative impact of DP noise on convergence against that of the “generalization gap” phenomenon. Finally, we design a Stackelberg game-based incentive mechanism that encourages clients to contribute computational resources, and prove the existence of a Stackelberg equilibrium to guarantee stability. Numerical evaluations on real-world datasets show that the IDP-FL framework outperforms existing algorithms in terms of test accuracy and utility, and ablation studies confirm the effectiveness of each component.
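The trade-off the abstract describes, that DP noise hurts convergence but a larger batch dilutes it, can be illustrated with a minimal DP-SGD-style sketch. This is not the paper's algorithm; it is a generic per-sample clipping plus Gaussian noise step (the function name `dp_average_gradient` and all parameter values are illustrative assumptions), showing that the per-coordinate noise standard deviation on the averaged gradient scales as `noise_multiplier * clip_norm / batch_size`:

```python
import numpy as np

def dp_average_gradient(per_sample_grads, clip_norm, noise_multiplier, rng):
    """Differentially private gradient averaging (DP-SGD style):
    clip each per-sample gradient to clip_norm, sum, add Gaussian
    noise calibrated to the clipping norm, then average over the batch."""
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_sample_grads)

rng = np.random.default_rng(0)
grads = rng.normal(size=(512, 10))  # toy per-sample gradients
small_batch = dp_average_gradient(grads[:64], 1.0, 1.0, rng)
large_batch = dp_average_gradient(grads, 1.0, 1.0, rng)
# The noise added to the average has std noise_multiplier * clip_norm / B,
# so growing B from 64 to 512 shrinks the DP noise on the update 8x --
# the "compensation effect"; the generalization gap of very large batches
# is the opposing cost the IDP-FL framework balances against.
```

With zero noise the function reduces to plain clipped averaging, which makes its behavior easy to check; the batch-size dependence of the noise term is the quantity the paper trades off against the large-batch generalization gap.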