Monday July 21, 2025 4:15pm - 5:30pm PDT
Session: Recent advances in federated learning
Chair: Minseok Ryu
Cluster: Optimization Applications (Communication, Energy, Health, ML, ...)

Talk 1: FedSpaLLM: Federated Pruning of Large Language Models
Speaker: Yijiang Li
Abstract: Federated Learning (FL) has attracted significant interest for training AI models in distributed computing environments, owing to its ability to preserve the privacy of the participating parties' sensitive data. However, challenges remain in effectively handling participating parties with heterogeneous computational power, such as edge devices. In this work, we propose a federated framework with an adaptive global pruning scheme that enables collaborative training of large models, such as LLMs, on parties with heterogeneous computational power.
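The heterogeneity problem the abstract describes can be illustrated with a minimal capacity-aware magnitude-pruning sketch. This is not the paper's adaptive global pruning scheme; the client names, capacity values, and the sparsity-equals-one-minus-capacity rule are all hypothetical, chosen only to show weaker devices receiving sparser models.

```python
import numpy as np

rng = np.random.default_rng(0)

def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of smallest-magnitude entries."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) > threshold, weights, 0.0)

# Hypothetical clients with heterogeneous compute: weaker devices
# keep a smaller fraction of the global model's weights.
client_capacity = {"phone": 0.25, "laptop": 0.5, "server": 1.0}
global_weights = rng.standard_normal((4, 4))

pruned = {
    name: magnitude_prune(global_weights, sparsity=1.0 - cap)
    for name, cap in client_capacity.items()
}
```

Each client then trains its own sparse sub-model, and the server's job (the hard part the paper addresses) is aggregating updates computed under different masks.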

Talk 2: Balancing uneven client participation in asynchronous Federated Learning
Speaker: Charikleia Iakovidou
Abstract: Asynchronous communication is a popular approach for speeding up the convergence of Federated Learning (FL) in the presence of slowly updating clients. Existing asynchronous FL methods typically provide convergence guarantees under the assumption that each client is equally likely to participate in a global aggregation. In practice, however, due to variations in computation or communication capabilities, clients may update the server at different frequencies. We demonstrate theoretically that under uneven client participation and non-iid local data, vanilla asynchronous FedAvg cannot achieve convergence to an arbitrarily small neighborhood of the optimum of the global loss function, even when a diminishing stepsize sequence is adopted. We introduce AREA, a new asynchronous FL method that employs a memory-based correction mechanism for balancing uneven client participation, and supports a wide variety of deterministic and stochastic aggregation protocols. Without the strong assumptions of bounded maximum client delay and bounded gradients, we establish theoretically optimal convergence rates for AREA for (i) strongly convex and smooth functions, (ii) convex and nonsmooth functions, and (iii) nonconvex and smooth functions.
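The bias caused by uneven participation, and the idea of correcting it with a per-client memory, can be shown in a toy one-dimensional experiment. This is an illustrative inverse-frequency reweighting, not the AREA algorithm; the two-client setup, participation probabilities, and stepsize schedule are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-iid setup: each client's local loss 0.5*(x - opt)^2 has a
# different optimum, so the global optimum of the mean loss is 0.0.
local_optima = {"fast": 1.0, "slow": -1.0}
update_prob = {"fast": 0.9, "slow": 0.1}   # uneven participation

def run(rounds, correct=False):
    x = 0.0
    counts = {c: 0 for c in local_optima}  # memory: how often each client arrived
    for t in range(rounds):
        lr = 1.0 / (t + 10)                # diminishing stepsize
        for c, p in update_prob.items():
            if rng.random() < p:
                counts[c] += 1
                grad = x - local_optima[c]
                # Reweight by inverse observed frequency so rarely
                # participating clients are not drowned out.
                w = (t + 1) / (len(counts) * counts[c]) if correct else 1.0
                x -= lr * w * grad
    return x

biased = run(5000)                 # drifts toward the fast client's optimum
corrected = run(5000, correct=True)
```

Vanilla averaging settles near the participation-weighted point 0.9*(1.0) + 0.1*(-1.0) = 0.8 rather than the global optimum 0.0, matching the abstract's claim that a diminishing stepsize alone does not remove the bias; the frequency-corrected run approaches 0.0.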

Talk 3: Federated Communication-Efficient Multi-Objective Optimization
Speaker: Baris Askin
Abstract: We study a federated version of multi-objective optimization (MOO), where a single model is trained to optimize multiple objective functions. MOO has been extensively studied in the centralized setting but is less explored in federated or distributed settings. We propose FedCMOO, a novel communication-efficient federated multi-objective optimization (FMOO) algorithm that improves the error convergence performance of the model compared to existing approaches. Unlike prior works, the communication cost of FedCMOO does not scale with the number of objectives, as each client sends a single aggregated gradient, obtained using randomized SVD (singular value decomposition), to the central server. We provide a convergence analysis of the proposed method for smooth non-convex objective functions under milder assumptions than in prior work. In addition, we introduce a variant of FedCMOO that allows users to specify a preference over the objectives in terms of a desired ratio of the final objective values. Through extensive experiments, we demonstrate the superiority of our proposed method over baseline approaches.
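The claimed communication saving, sending one low-rank factorization instead of one gradient per objective, can be sketched with a randomized range finder. This illustrates the randomized-SVD compression idea only, not FedCMOO's exact procedure; the dimensions and the synthetic low-rank gradient matrix are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def randomized_lowrank(G, rank):
    """Randomized rank-`rank` factorization of G (K objectives x d params).

    Returns factors (Q, B) with G ~= Q @ B; transmitting them costs
    K*rank + rank*d floats instead of the K*d cost of sending every
    per-objective gradient.
    """
    d = G.shape[1]
    omega = rng.standard_normal((d, rank))  # random test matrix
    Q, _ = np.linalg.qr(G @ omega)          # orthonormal basis for range(G)
    B = Q.T @ G
    return Q, B

K, d, rank = 4, 1000, 2
# Synthetic per-objective gradient matrix with low effective rank.
G = rng.standard_normal((K, rank)) @ rng.standard_normal((rank, d))

Q, B = randomized_lowrank(G, rank)
G_hat = Q @ B                               # server-side reconstruction
```

Because the factor sizes depend on `rank` and `d` but the message count does not grow with `K`, the per-round communication stays flat as objectives are added, which is the property the abstract highlights.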

Speakers

Yijiang Li

Charikleia Iakovidou

Baris Askin
Taper Hall (THH) 118 3501 Trousdale Pkwy, 118, Los Angeles, CA 90089

