Posts by Collection

portfolio

publications

FLECS-CGD: A Federated Learning Second-Order Framework via Compression and Sketching with Compressed Gradient Differences

Published in HOOML 2022: Order Up! Workshop at NeurIPS, 2022

In the recent paper FLECS (Agafonov et al., "FLECS: A Federated Learning Second-Order Framework via Compression and Sketching"), the second-order framework FLECS was proposed for the Federated Learning problem. This method utilizes compression of sketched Hessians to keep communication costs low. However, the main bottleneck of FLECS is that gradients are communicated without compression. In this paper, we propose a modification of FLECS with compressed gradient differences, which we call FLECS-CGD (FLECS with Compressed Gradient Differences), and make it applicable to stochastic optimization. Convergence guarantees are provided in the strongly convex and nonconvex cases. Experiments show the practical benefit of the proposed approach.

Recommended citation: Artem Agafonov, Brahim Erraji, Martin Takáč. (2022). "FLECS-CGD: A Federated Learning Second-Order Framework via Compression and Sketching with Compressed Gradient Differences." HOOML 2022: Order Up! Workshop at NeurIPS.
Download Paper
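The core idea behind compressed gradient differences can be illustrated in a few lines: instead of transmitting the full gradient, a client sends a compressed version of the *difference* between its fresh gradient and a running estimate that the server keeps in sync. The sketch below is purely illustrative (the function names `top_k` and `cgd_message` are my own, not from the paper), using a standard top-k sparsifier as the compressor.

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero out the rest
    (a standard sparsifying compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def cgd_message(grad, memory, k):
    """Client side: compress the difference between the fresh gradient
    and the running estimate, then update the estimate.  The server
    applies the same delta to its own copy, so both stay in sync."""
    delta = top_k(grad - memory, k)
    memory = memory + delta
    return delta, memory
```

With a full-rank compressor (k equal to the dimension) the estimate matches the gradient after one round; smaller k trades accuracy for bandwidth, and the memory term lets the error be corrected over subsequent rounds.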

Fairness-aware Reweighting in Federated Learning

Published in Conférence sur l'Apprentissage automatique, 2024

We address the problem of enforcing group fairness in federated learning. To this end, we first adapt FairGrad, a recently proposed fairness method in centralized learning, to the federated setting. This comes with a large communication cost, as it requires sharing the fairness level of the learned models after each gradient descent step. Unfortunately, using the usual federated learning trick of taking several local steps before communicating with the server leads to models with low utility. To tackle this issue, we propose an alternative approach that combines ideas from FairGrad and cost-sensitive fairness methods. Our preliminary experiments show that this new method is either competitive with or better than several state-of-the-art approaches.

Recommended citation: Paul Andrey, Brahim Erraji, Michaël Perrot. (2024). "Fairness-aware Reweighting in Federated Learning." Conférence sur l'Apprentissage automatique.
Download Paper
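The reweighting principle underlying FairGrad-style methods can be sketched compactly: group weights are nudged up when a group is disadvantaged relative to a target rate and down otherwise. This is a minimal illustrative sketch, not the paper's actual update rule; the function name `update_group_weights` and the choice of fairness statistic are assumptions for illustration only.

```python
import numpy as np

def update_group_weights(weights, group_rates, target_rate, lr=0.1):
    """One illustrative reweighting step: increase the weight of groups
    whose rate (e.g. positive-prediction rate) falls below the target,
    decrease it for groups above the target."""
    violation = target_rate - group_rates   # > 0 when a group is disadvantaged
    weights = weights + lr * violation
    return np.clip(weights, 0.0, None)      # keep weights non-negative
```

In a centralized setting this update can run after every gradient step; the communication issue discussed above arises because, in the federated setting, computing the group rates requires aggregating statistics across clients at the same frequency.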

Loss Gaps Parity for Fairness in Heterogeneous Federated Learning

Published in AISTATS 2026 Conference, 2026

While clients may join federated learning to improve performance on data they rarely observe locally, they often remain self-interested, expecting the global model to perform well on their own data. This motivates an objective that ensures all clients achieve a similar loss gap—the difference in performance between the global model and the best model they could train using only their local data. To this end, we propose EAGLE, a novel federated learning algorithm that explicitly regularizes the global model to minimize disparities in loss gaps across clients. Our approach is particularly effective in heterogeneous settings, where clients’ optimal local models may be misaligned. Unlike existing methods that encourage loss parity, potentially degrading performance for many clients, EAGLE targets fairness in relative improvements. We provide theoretical convergence guarantees for EAGLE under non-convex loss functions, and characterize how its iterates perform relative to the standard federated learning objective using a novel heterogeneity measure. Empirically, we demonstrate that EAGLE reduces the disparity in loss gaps among clients by prioritizing those furthest from their local optimal loss, while maintaining competitive utility in both convex and non-convex cases compared to strong baselines.

Recommended citation: Brahim Erraji, Michaël Perrot, Aurélien Bellet. (2026). "Loss Gaps Parity for Fairness in Heterogeneous Federated Learning." AISTATS 2026.
Download Paper
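The quantity EAGLE equalizes, the loss gap, is simple to state: each client's gap is the global model's loss on its data minus the loss of the best model trainable from local data alone. A minimal sketch of the gap computation and one possible disparity measure (variance across clients) is given below; the function name and the choice of variance as the disparity measure are illustrative assumptions, not the paper's exact regularizer.

```python
import numpy as np

def loss_gap_disparity(global_losses, local_opt_losses):
    """Per-client loss gap (global-model loss minus best achievable
    local loss) and a simple disparity measure: the variance of the
    gaps across clients."""
    gaps = np.asarray(global_losses) - np.asarray(local_opt_losses)
    return gaps, gaps.var()
```

Targeting parity of these gaps, rather than parity of raw losses, means a client with intrinsically hard data is not forced to the same loss as one with easy data; only their *relative improvement* over local training is equalized.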

talks

teaching
