#6292. Adversarial interference and its mitigations in privacy-preserving collaborative machine learning

Publication date: September 2026
Proposal available till: 20-05-2025
Total number of authors per manuscript: 4

The journal's title is available only to authors who have already paid.
Journal’s subject area:
Computer Networks and Communications;
Human-Computer Interaction;
Software;
Artificial Intelligence;
Computer Vision and Pattern Recognition;
Places in the authors' list:
Place 1: free (for sale), 2350 $ (Contract6292.1)
Place 2: free (for sale), 1200 $ (Contract6292.2)
Place 3: free (for sale), 1050 $ (Contract6292.3)
Place 4: free (for sale), 900 $ (Contract6292.4)

Abstract:
Despite the rapid increase of data available to train machine-learning algorithms in many domains, several applications suffer from a paucity of representative and diverse data. The medical and financial sectors are, for example, constrained by legal, ethical, regulatory and privacy concerns preventing data sharing between institutions. Collaborative learning systems, such as federated learning, are designed to circumvent such restrictions and provide a privacy-preserving alternative by eschewing data sharing and relying instead on the distributed remote execution of algorithms. However, such systems are susceptible to malicious adversarial interference attempting to undermine their utility or divulge confidential information. Here we present an overview and analysis of current adversarial attacks and their mitigations in the context of collaborative machine learning. We discuss the applicability of attack vectors to specific learning contexts and attempt to formulate a generic foundation for adversarial influence and mitigation mechanisms. We moreover show that a number of context-specific learning conditions are exploited in similar fashion across all settings. Lastly, we provide a focused perspective on open challenges and promising areas of future research in the field.
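The abstract describes collaborative systems such as federated learning, in which clients train locally and only model updates, never raw data, are shared. A minimal sketch of the canonical aggregation step (federated averaging) is shown below; the function name, data layout, and weighting scheme are illustrative assumptions, not details taken from the manuscript.

```python
# Minimal sketch of federated averaging (FedAvg): the server combines
# locally trained parameter vectors, weighted by each client's data size.
# This is an illustrative assumption about the aggregation step, not the
# paper's specific method.

def fed_avg(client_weights, client_sizes):
    """Aggregate per-client model parameters into a global model.

    client_weights: list of parameter vectors (lists of floats),
                    one per client, trained on private local data.
    client_sizes:   number of local training examples per client,
                    used to weight each client's contribution.
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_weights = [0.0] * dim
    for weights, n in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            global_weights[i] += (n / total) * w
    return global_weights

# Example: two clients; only parameters leave the clients, never raw data.
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [100, 300]
print(fed_avg(clients, sizes))  # -> [2.5, 3.5]
```

Because the server trusts whatever updates clients send, a malicious participant can submit poisoned parameters, which is exactly the class of adversarial interference the paper surveys.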
Keywords:

Contacts: