The Department of Applied Mathematics weekly seminar features talks by scholars and researchers working in applied mathematics, broadly interpreted.
Title: FedCBO: Reaching Group Consensus in Clustered Federated Learning and Robustness to Backdoor Adversarial Attacks
Abstract: Federated learning is an important framework in modern machine learning that seeks to coordinate the training of learning models across multiple users, each with their own local data set, in a way that respects the users’ data privacy and communication cost constraints. In clustered federated learning, one assumes an additional, unknown group structure among the users, and the goal is to train a model that is useful for the users in each group, rather than a single global model for all users.
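For concreteness, one standard way to write the clustered objective (stated here only as an illustrative formulation, not necessarily the exact one used in the talk) assumes N users with local losses f_i, an unknown partition of the users into K groups G_1, ..., G_K, and one model theta_k per group:

\min_{\theta_1,\dots,\theta_K} \; \sum_{k=1}^{K} \sum_{i \in G_k} f_i(\theta_k), \qquad \text{with the group assignments } G_1,\dots,G_K \text{ not known in advance.}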
In the first part of this talk, I will present a novel solution to the problem of clustered federated learning inspired by ideas from consensus-based optimization (CBO). Our new CBO-type method is based on a system of interacting particles that is oblivious to group memberships. The algorithm is accompanied by theoretical justification and validated in experiments on real data. I will then discuss an additional issue of concern in federated learning: the vulnerability of federated learning protocols to “backdoor” adversarial attacks. This discussion motivates the introduction of a second, improved particle system with enhanced robustness properties, one that, at an abstract level, can be interpreted as a bi-level optimization algorithm based on interacting particle dynamics.
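As background for the method described above, the following is a minimal sketch of a standard consensus-based optimization update, the kind of interacting-particle dynamics that FedCBO builds on; the function cbo_step, its parameters, and the toy quadratic objective are illustrative assumptions rather than the algorithm presented in the talk.

import numpy as np

def cbo_step(X, f, alpha=50.0, lam=1.0, sigma=0.7, dt=0.05, rng=None):
    # One Euler-Maruyama step of (anisotropic) CBO dynamics -- illustrative sketch.
    # X: (N, d) array of particle positions (candidate model parameters).
    # f: vectorized objective, f(X) -> (N,) array of values; lower is better.
    rng = np.random.default_rng() if rng is None else rng
    fx = f(X)
    w = np.exp(-alpha * (fx - fx.min()))          # Gibbs weights (shifted for numerical stability)
    v = (w[:, None] * X).sum(axis=0) / w.sum()    # weighted consensus point
    drift = -lam * (X - v) * dt                   # pull every particle toward the consensus point
    noise = sigma * (X - v) * np.sqrt(dt) * rng.standard_normal(X.shape)  # exploration, scaled by distance to consensus
    return X + drift + noise

# Toy usage: 100 particles in 2D concentrating near the minimizer of a shifted quadratic.
rng = np.random.default_rng(0)
X = 3.0 * rng.standard_normal((100, 2))
quad = lambda Z: ((Z - np.array([1.0, -2.0])) ** 2).sum(axis=1)
for _ in range(200):
    X = cbo_step(X, quad, rng=rng)
print(X.mean(axis=0))  # particles cluster near the minimizer (1, -2)

In the clustered federated setting, dynamics of this flavor would act on the users' model parameters, with consensus points formed from the users' local losses; the toy example above only illustrates the basic consensus mechanism on a single objective.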
This talk is based on joint works with Sixu Li, Yuhua Zhu, Konstantin Riedl, and Jose Carrillo.