In recent years, Federated Learning has emerged as a rapidly growing area of research, sparking the development of a wide range of algorithmic approaches.
Yet the majority of these efforts have been confined to conventional optimization problems, often overlooking broader machine learning paradigms.
This tutorial shifts the focus to two increasingly important problem formulations: stochastic compositional optimization (SCO) and stochastic bilevel optimization (SBO).
These frameworks encompass a variety of advanced learning scenarios that go well beyond standard objective minimization, including model-agnostic meta-learning, classification with imbalanced data, contrastive self-supervised learning, graph-based neural models, and neural architecture search.
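In symbols, these two problem classes are commonly written as follows (the notation here is generic and illustrative; the tutorial's own notation may differ):

\[
\text{(SCO)}\quad \min_{x} \; \mathbb{E}_{\xi}\!\left[ f_{\xi}\!\big( \mathbb{E}_{\zeta}[\, g_{\zeta}(x) \,] \big) \right],
\]
\[
\text{(SBO)}\quad \min_{x} \; f\big(x, y^{*}(x)\big) \quad \text{s.t.} \quad y^{*}(x) = \arg\min_{y} \; g(x, y).
\]

In both cases the outer objective depends on the solution of an inner expectation or inner minimization, which is the nested structure referred to below.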
The inherently nested structure of SCO and SBO poses unique challenges, particularly in the Federated Learning setting, where both computational and communication constraints must be carefully managed.
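One concrete consequence of the nested structure is that the naive stochastic gradient, obtained by sampling the inner expectation and applying the chain rule, is biased whenever the outer function is nonlinear. The toy functions below (f(u) = u^3, g(x; ζ) = x + ζ, ζ ~ N(0, 1)) are illustrative choices, not taken from the tutorial:

```python
import numpy as np

# Compositional objective F(x) = f(E[g(x; zeta)]) with
# f(u) = u**3 and g(x; zeta) = x + zeta, zeta ~ N(0, 1).
# Since E[g(x; zeta)] = x, we have F(x) = x**3 and F'(x) = 3 * x**2.
rng = np.random.default_rng(0)
x = 1.0
true_grad = 3 * x**2  # exact gradient: 3.0

# Naive estimator: differentiate the sampled composition f(g(x; zeta)),
# i.e. 3 * (x + zeta)**2. Its expectation is 3 * (x**2 + 1), not F'(x),
# because E[f'(g)] != f'(E[g]) for nonlinear f.
zeta = rng.standard_normal(1_000_000)
naive_grads = 3 * (x + zeta) ** 2

print(true_grad)           # 3.0
print(naive_grads.mean())  # close to 6.0 -> biased, even with many samples
```

Averaging more samples of the naive estimator does not remove this bias, which is why SCO and SBO require dedicated algorithmic machinery (e.g., tracking the inner quantity across iterations) rather than plain SGD, and why the federated setting adds further difficulty.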
In response, a new line of research has emerged, aiming to adapt and extend optimization techniques to the federated setting for these complex problems.
Despite this progress, the resulting methodologies remain relatively underexplored in the broader machine learning and data mining communities.
This tutorial seeks to bridge that gap. We will provide a comprehensive overview of the theoretical foundations, algorithmic innovations, and practical applications of federated SCO and SBO.
Participants will leave with a clear understanding of the challenges unique to these problems, the latest techniques developed to address them, and actionable insights on applying these methods to real-world federated learning applications.