ICDM 2025 Tutorial: Federated Stochastic Compositional and Bilevel Optimization

  • Date: TBD

  • Time: TBD

  • Location: TBD

Overview

In recent years, Federated Learning has emerged as a rapidly growing area of research, sparking the development of a wide range of algorithmic approaches. Yet, the majority of these efforts have been confined to tackling conventional optimization problems, often overlooking broader machine learning paradigms. This tutorial shifts the focus to two increasingly important problem formulations: stochastic compositional optimization (SCO) and stochastic bilevel optimization (SBO). These frameworks encompass a variety of advanced learning scenarios that go well beyond standard objective minimization, including model-agnostic meta-learning, classification with imbalanced data, contrastive self-supervised learning, graph-based neural models, and neural architecture search.

The inherently nested structure of SCO and SBO poses unique challenges, particularly in the Federated Learning setting, where both computational and communication constraints must be carefully managed. In response, a new line of research has emerged, aiming to adapt and extend optimization techniques to the federated setting for these complex problems. Despite this progress, the resulting methodologies remain relatively underexplored in the broader machine learning and data mining communities.

This tutorial seeks to bridge that gap. We will provide a comprehensive overview of the theoretical foundations, algorithmic innovations, and practical applications of federated SCO and SBO. Participants will leave with a clear understanding of the challenges unique to these problems, the latest techniques developed to address them, and actionable insights on applying these methods to real-world federated learning applications.
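To make the two problem classes concrete, the standard formulations can be sketched as follows (the notation here is illustrative and not tied to any specific paper covered in the tutorial; the nested expectations are what distinguish these problems from conventional objective minimization):

```latex
% Stochastic compositional optimization (SCO):
% minimize a composition of two expectation-valued functions.
\min_{x \in \mathbb{R}^d} \; F(x) \triangleq f\bigl(g(x)\bigr),
\quad \text{where} \quad
f(y) = \mathbb{E}_{\xi}\bigl[f_{\xi}(y)\bigr],
\quad
g(x) = \mathbb{E}_{\zeta}\bigl[g_{\zeta}(x)\bigr].

% Stochastic bilevel optimization (SBO):
% the upper-level objective depends on the lower-level minimizer.
\min_{x \in \mathbb{R}^d} \; F(x) \triangleq \mathbb{E}_{\xi}\bigl[f_{\xi}\bigl(x, y^{*}(x)\bigr)\bigr]
\quad \text{s.t.} \quad
y^{*}(x) \in \arg\min_{y} \; \mathbb{E}_{\zeta}\bigl[g_{\zeta}(x, y)\bigr].
```

In the federated setting, each expectation is additionally an average over local objectives held by $N$ clients, so an unbiased stochastic gradient of the composed objective cannot be computed from a single client's mini-batch — this is the core difficulty the tutorial's algorithms address.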

Tutorial Outline

  • Section I: Introduction (15 min).

  • Section II: Federated Stochastic Compositional Optimization (30 min).

  • Section III: Federated AUC Maximization (30 min).

  • Section IV: Federated Stochastic Bilevel Optimization (30 min).

  • Section V: Summary and Future Directions (5 min).

Presenters

Hongchang Gao is an assistant professor in the Department of Computer and Information Sciences at Temple University. His research interests include machine learning, optimization, and biomedical data science, with a special focus on distributed optimization and federated learning. His work has been published in top venues such as ICML, NeurIPS, AISTATS, KDD, AAAI, and IJCAI. He currently serves as an Associate Editor for the Journal of Combinatorial Optimization and regularly acts as an Area Chair for ICML and NeurIPS. He is a recipient of the NSF CAREER Award (2024), the AAAI New Faculty Highlights (2023), and the Cisco Faculty Research Award (2023).

Xinwen Zhang is a Ph.D. student in the Department of Computer and Information Sciences at Temple University. Her research primarily focuses on stochastic minimax optimization and stochastic compositional optimization, as well as their applications to real-world data mining tasks. She has published multiple pioneering works on federated compositional optimization in top machine learning venues, such as ICML and NeurIPS. She is a recipient of the KDD 2023 Student Travel Award, the NeurIPS 2023 Scholar Award, the WiML 2023 Travel Award, and the ICML 2024 Travel Award.