CPE Master of Science Thesis Defense by Rishit Velabhai Prajapati
Topic: Secure Federated Learning against Data Poison Attack: Aggregator-Level Neuron Masking with Oracle-Labeled Clients
Abstract:
Federated learning (FL) protects privacy in deep learning by enabling clients to train a model collaboratively without sharing their data. However, it is vulnerable to poisoning attacks when adversaries craft malicious client updates. This thesis studies data poisoning attacks under current aggregator mechanisms with clients holding non-independent and identically distributed (non-IID) data, investigating whether security can be achieved without breaking the privacy boundary.
Our approach treats the post-aggregation model update as structured data, detects directional fingerprints of collusion using median-centered statistics and the median absolute deviation, and applies a soft, neuron-wise mask only where evidence is strongest. A lightweight class-conditional divergence check on a small public validation set flags suspicious rounds, and a short pseudo-label reconstruction step repairs flagged contributions so that the benign signal is not discarded. All analysis and mitigation occur after the secure-aggregation masks cancel, so per-client updates remain private, the aggregation outcome is otherwise unchanged, and the defense adds one small server pass with no extra rounds or client metadata.
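The detection-and-masking step described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: it assumes the post-aggregation update has been flattened to a vector, and the function name, threshold, and soft-mask shape are all illustrative choices.

```python
import numpy as np

def soft_neuron_mask(update, z_thresh=3.0, eps=1e-12):
    """Shrink coordinates of an aggregated update whose robust
    (median/MAD-based) z-score exceeds z_thresh.

    update: 1-D array of post-aggregation parameter deltas.
    Returns the softly masked update.
    """
    med = np.median(update)
    mad = np.median(np.abs(update - med))   # median absolute deviation
    scale = 1.4826 * mad + eps              # consistency factor for Gaussian data
    z = np.abs(update - med) / scale        # robust per-coordinate z-scores
    # Soft mask: leave coordinates below the threshold untouched,
    # attenuate smoothly as the evidence of anomaly grows.
    mask = np.where(z <= z_thresh, 1.0, z_thresh / z)
    return update * mask
```

Because the mask is multiplicative and close to 1 except where the robust score is extreme, benign (merely non-IID) coordinates pass through unchanged while outlying directions are attenuated rather than zeroed.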
Experiments on popular image classification benchmarks such as MNIST, with compact CNNs and staged label flipping, demonstrate the effectiveness of our proposed approach. Specifically, we observe the expected failure pattern for undefended training, partial recovery with a directional screen alone, and stable recovery with defended aggregation. Mitigation concentrates in the classifier head and, at times, the deepest convolutional layer, consistent with the attack mechanism.
Although the study is limited to a small number of clients and a standard dataset, the results indicate that privacy and security are not in conflict. Treating the aggregate update as the unit of analysis and the neuron as the unit of action reduces adversarial influence while preserving useful signal from non-IID clients and keeping secure aggregation intact.
Advisor(s): Dr. Hong Liu, Commonwealth Professor, Dept. of Electrical & Computer Engineering, UMass Dartmouth
Committee members:
- Dr. Lance Fiondella, Professor, Dept. of Electrical & Computer Engineering, UMass Dartmouth
- Dr. Sezer Goren, Associate Teaching Professor, Dept. of Electrical & Computer Engineering, UMass Dartmouth
Note: All ECE graduate students are encouraged to attend. The defense is open to the public, and all interested parties are invited.
*For further information, please contact Dr. Hong Liu.
Lester W. Cory Conference Room, Science & Engineering Building (SENG), Room 213A
Zoom: https://umassd.zoom.us/j/93281343753 | Meeting ID: 932 8134 3753 | Passcode: 518247
Hong Liu
508.999.8841
hliu@umassd.edu