#11536. Privacy-preserving federated learning in medical diagnosis with homomorphic re-encryption
Publication date: August 2026
Proposal available till: 16-05-2025
Total number of authors per manuscript: 4 | 0 $
The journal title is available only to authors who have already paid.
Journal’s subject area:
Law;
Computer Science (all);
Computer Science Applications;
Software;
Hardware and Architecture
Places in the authors’ list:
1st place - free (for sale)
2nd place - free (for sale)
3rd place - free (for sale)
4th place - free (for sale)
More details about the manuscript: Science Citation Index Expanded and/or Social Sciences Citation Index
Abstract:
Unlike traditional centralized machine learning, distributed machine learning offers more efficient and practical application scenarios. However, distributed learning may not satisfy certain security requirements. In medical treatment and diagnosis, for example, a growing number of people use IoT devices to record personal data, and when that medical data is used for training, users are unwilling to reveal their private data to the training party. How to collect and train on these data securely has therefore become the main problem to resolve. Federated learning can combine large amounts of scattered data for training while protecting user data, and compared with general distributed learning it is better suited to training on scattered data. In this paper, we propose a privacy-preserving federated learning scheme based on the cryptographic primitive of homomorphic re-encryption: user data are protected through homomorphic re-encryption, and the model is trained through batch gradient descent (BGD). In our scheme, the IoT device encrypts and uploads user data, the fog node collects the uploaded data, and the server completes data aggregation and re-encryption.
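A minimal sketch of the data flow described in the abstract is shown below. It uses the python-paillier (phe) library as a stand-in for the paper's homomorphic re-encryption primitive; the function names, the linear model, the fog/server split, and the direct decryption of the aggregate at the server are illustrative assumptions, not the authors' actual scheme.

```python
# Sketch: federated BGD with encrypted gradient aggregation.
# Assumes python-paillier (phe) as a stand-in for homomorphic re-encryption;
# in the real scheme the server would re-encrypt the aggregate rather than
# decrypt it directly.
import numpy as np
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

def local_gradient(w, X, y):
    # One batch-gradient-descent (BGD) step on the IoT device:
    # gradient of mean squared error for a linear model.
    return X.T @ (X @ w - y) / len(y)

def device_encrypt(grad):
    # IoT device encrypts each gradient component before upload.
    return [public_key.encrypt(float(g)) for g in grad]

def fog_aggregate(encrypted_grads):
    # Fog node sums ciphertexts component-wise; additive homomorphism
    # means it never sees any plaintext gradient.
    return [sum(components) for components in zip(*encrypted_grads)]

def server_update(w, aggregated, n_devices, lr=0.1):
    # Server recovers only the aggregate gradient and updates the model.
    avg = np.array([private_key.decrypt(c) for c in aggregated]) / n_devices
    return w - lr * avg

# Toy run with two simulated devices holding private data.
rng = np.random.default_rng(0)
w = np.zeros(3)
devices = [(rng.normal(size=(8, 3)), rng.normal(size=8)) for _ in range(2)]
for _ in range(5):
    uploads = [device_encrypt(local_gradient(w, X, y)) for X, y in devices]
    w = server_update(w, fog_aggregate(uploads), n_devices=len(devices))
print("model after 5 federated BGD rounds:", w)
```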
Keywords:
Federated learning; Gradient descent; Homomorphic re-encryption; Privacy-preserving
Contacts: