IJCAI 2025 Tutorial: Federated Compositional and Bilevel Optimization

  • Date: TBD

  • Time: TBD

  • Location: TBD

Overview

Federated learning has attracted significant attention in recent years, resulting in the development of numerous methods. However, most of these methods focus solely on traditional minimization problems and fail to address new learning paradigms in machine learning. This tutorial therefore focuses on learning paradigms that can be formulated as stochastic compositional optimization (SCO) problems and stochastic bilevel optimization (SBO) problems, since they cover a wide variety of machine learning models beyond traditional minimization, such as model-agnostic meta-learning, imbalanced data classification, contrastive self-supervised learning, graph neural networks, and neural architecture search. The compositional and bilevel structures bring unique computational and communication challenges to federated learning. To address these challenges, a series of federated compositional optimization and federated bilevel optimization methods have been developed in the past few years, yet these advances have not been widely disseminated. This tutorial therefore introduces the unique challenges, recent advances, and practical applications of federated SCO and SBO. The audience will gain a deeper understanding of federated SCO and SBO algorithms and learn how to apply them to real-world applications.
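For readers unfamiliar with the two problem classes, they are typically written as follows (standard formulations; the exact notation and assumptions vary across papers):

```latex
% Stochastic compositional optimization (SCO):
% the objective is a composition of two expectations,
% so an unbiased stochastic gradient of F is not directly available.
\min_{x \in \mathbb{R}^{d}} \; F(x) \triangleq f\big(g(x)\big),
\quad \text{where } f(\cdot) = \mathbb{E}_{\xi}\big[f_{\xi}(\cdot)\big],
\;\; g(\cdot) = \mathbb{E}_{\zeta}\big[g_{\zeta}(\cdot)\big].

% Stochastic bilevel optimization (SBO):
% the upper-level objective depends on the minimizer of a lower-level problem.
\min_{x \in \mathbb{R}^{d}} \; F(x) \triangleq \mathbb{E}_{\xi}\big[f_{\xi}\big(x, y^{*}(x)\big)\big]
\quad \text{s.t.} \quad
y^{*}(x) \in \arg\min_{y \in \mathbb{R}^{p}} \; \mathbb{E}_{\zeta}\big[g_{\zeta}(x, y)\big].
```

In the federated setting, each of these expectations is additionally distributed across clients, which is the source of the communication challenges discussed in the tutorial.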

Tutorial Outline

  • Section I: Introduction (15 min).

  • Section II: Federated Compositional Optimization (35 min).

  • Section III: Federated Bilevel Optimization (35 min).

  • Section IV: Summary and Future Directions (10 min).

Presenters


Hongchang Gao is an assistant professor in the Department of Computer and Information Sciences at Temple University. His research interests include machine learning, optimization, and biomedical data science, with a special focus on distributed optimization and federated learning. His work has been published in top venues such as ICML, NeurIPS, AISTATS, KDD, AAAI, and IJCAI. He currently serves as an Associate Editor for the Journal of Combinatorial Optimization and regularly acts as an Area Chair for ICML and NeurIPS. He is a recipient of the NSF CAREER Award (2024), the AAAI New Faculty Highlights (2023), and the Cisco Faculty Research Award (2023).


Xinwen Zhang is a Ph.D. student in the Department of Computer and Information Sciences at Temple University. Her research primarily focuses on stochastic minimax optimization and stochastic compositional optimization, as well as their applications to real-world data mining tasks. She has published multiple pioneering works on federated compositional optimization in top machine learning venues, such as ICML and NeurIPS.