Bayesian Optimization for Online Bandit Model Partitioning in Split Federated Learning

Junxia You, Jia Yan, Zhenjiang Li, Liuqing Yang

Published 2025 in 2025 IEEE/CIC International Conference on Communications in China (ICCC)

ABSTRACT

Federated learning (FL) has been recognized as a promising paradigm to support distributed AI model training among wireless devices (WDs) under the coordination of an edge server (ES) without sharing local datasets. To alleviate the computation burden of resource-limited WDs, model partitioning that leverages the computing capability of the ES is further integrated into FL, yielding the split FL (SFL) framework. In this paper, we study online bandit model partitioning for SFL over dynamic wireless networks, aiming to minimize the overall energy-latency cost (ELC). Unlike prior works focusing on offline static or online gradient-based model splitting, we consider a practical setting where the analytical expression of the ELC function is unavailable and only the function values at queried points are revealed. To tackle this challenging problem, a novel Bayesian optimization (BO)-based approach is put forth, relying on a Gaussian process (GP) surrogate model to actively select the model splitting points per round via acquisition function optimization. Structural information specific to the training model is incorporated into the kernel design of the GP surrogate to better capture network dynamics. Numerical tests demonstrate that the proposed BO-based approach outperforms contemporary baselines under various practical SFL settings.
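The abstract's core loop can be illustrated with a minimal sketch: maintain a GP surrogate over candidate split points, query one point per round, observe only the (noisy) cost value, and pick the next query by minimizing a lower-confidence-bound acquisition. This is a generic discrete-BO illustration, not the paper's algorithm; the kernel is a plain RBF (the paper's model-specific kernel is not reproduced), and `toy_elc`, its parameters, and all shapes are hypothetical stand-ins for the unknown ELC function.

```python
import numpy as np

def rbf_kernel(a, b, ls=2.0, var=1.0):
    # Squared-exponential kernel over scalar split-point indices.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_obs, y_obs, x_cand, noise=1e-2):
    # Standard GP regression posterior mean/variance at candidate points.
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_cand)
    Kss = rbf_kernel(x_cand, x_cand)
    alpha = np.linalg.solve(K, y_obs)
    mu = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.diag(Kss - Ks.T @ v)
    return mu, np.maximum(var, 1e-12)

def bo_split_selection(cost_fn, n_layers=10, rounds=20, beta=2.0, seed=0):
    # Bandit BO loop: query one split point per round, observe only the
    # noisy cost value, refit the GP, and select the next split by the
    # lower-confidence-bound acquisition (we are minimizing cost).
    rng = np.random.default_rng(seed)
    cands = np.arange(1, n_layers)         # cut after layer 1..n_layers-1
    x_obs = [int(rng.choice(cands))]       # random initial query
    y_obs = [cost_fn(x_obs[0], rng)]
    for _ in range(rounds - 1):
        mu, var = gp_posterior(np.array(x_obs, float), np.array(y_obs),
                               cands.astype(float))
        acq = mu - beta * np.sqrt(var)     # LCB: low mean or high uncertainty
        x_obs.append(int(cands[int(np.argmin(acq))]))
        y_obs.append(cost_fn(x_obs[-1], rng))
    i = int(np.argmin(y_obs))
    return x_obs[i], y_obs[i]

# Hypothetical stand-in for the unknown ELC: convex in the split point
# (shallow cuts offload heavy traffic, deep cuts overload the WD), with
# noise mimicking channel dynamics.
def toy_elc(split, rng):
    return (split - 6.5) ** 2 / 10 + 0.05 * rng.standard_normal()

best_split, best_cost = bo_split_selection(toy_elc)
print(best_split, best_cost)
```

After a handful of rounds the LCB acquisition concentrates queries near the cost minimum while still occasionally probing uncertain splits, which mirrors the explore/exploit trade-off the bandit setting demands.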
