Privacy-aware Federated Learning with Global Differential Privacy

dc.contributor.author: Airody Suresh, Spoorthi
dc.contributor.committeechair: Yi, Yang
dc.contributor.committeemember: Yang, Yaling
dc.contributor.committeemember: Liu, Lingjia
dc.contributor.department: Electrical and Computer Engineering
dc.date.accessioned: 2023-02-01T09:00:16Z
dc.date.available: 2023-02-01T09:00:16Z
dc.date.issued: 2023-01-31
dc.description.abstract: There is an increasing need for low-power neural systems as neural networks become more widely used in embedded devices with limited resources. Spiking neural networks (SNNs) are proving to be a more energy-efficient alternative to conventional artificial neural networks (ANNs), which are recognized as computationally heavy. Despite this significance, training SNNs within large-scale distributed machine learning frameworks such as Federated Learning (FL) has received little attention. Because federated learning involves many energy-constrained devices, there is a significant opportunity to exploit the energy efficiency offered by SNNs. Real-world communication constraints in an FL system must still be addressed, however, and this is done through three communication reduction techniques: model compression, partial device participation, and periodic aggregation. The convergence of federated learning systems is further affected by data heterogeneity. Federated learning keeps clients' raw data away from adversaries, yet confidential information can still be revealed by analyzing the uploaded client parameters. To counter such privacy attacks, various attempts have been made to incorporate differential privacy within the FL framework. In this thesis, we investigate the trade-offs between communication costs and training variance in a federated learning system with differential privacy applied at the parameter server (curator model).
dc.description.abstractgeneral: Federated Learning is a decentralized method of training neural network models: several participating devices independently learn a model on their local data partitions, and these local models are then aggregated at a central server to achieve performance comparable to training the model centrally. Federated learning systems, however, incur a communication overhead, and various communication reduction techniques can be used to lower these costs. Spiking neural networks, as an energy-efficient alternative to artificial neural networks, are well suited to federated learning systems, which consist of networks of energy-constrained devices. Federated learning also helps preserve the privacy of data in the system, yet an attacker can still extract meaningful information from the parameters transmitted during a training session. To this end, differential privacy techniques are applied to address privacy concerns in federated learning systems. In this thesis, we compare and contrast the communication costs and parameters of a federated learning system with differential privacy applied to it.
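
Illustrative sketch: the abstracts describe federated averaging with partial device participation, periodic aggregation, and differential privacy applied at a trusted parameter server (curator model). The Python sketch below shows one generic round of that setup; it is not the thesis implementation, and the client count, sampling fraction, clipping bound, and noise multiplier are assumed values chosen only for illustration.

# Minimal sketch (not the thesis code): one round of federated averaging with
# global differential privacy added at the parameter server (curator model).
# All hyperparameters below are illustrative assumptions.
import numpy as np

NUM_CLIENTS = 100        # total devices registered with the server (assumed)
SAMPLE_FRACTION = 0.1    # partial device participation per round (assumed)
CLIP_NORM = 1.0          # L2 bound on each client update (assumed)
NOISE_MULTIPLIER = 1.1   # Gaussian noise scale relative to CLIP_NORM (assumed)
MODEL_DIM = 10           # toy model size for the sketch

def local_update(global_model, client_id):
    """Placeholder for local training; returns the client's model delta."""
    # A real client would run several local epochs of (S)GD on its partition.
    return np.random.randn(MODEL_DIM) * 0.01  # stand-in for a trained delta

def clip(update, clip_norm):
    """Clip the update to a bounded L2 norm so the DP noise calibration holds."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))

def federated_round(global_model, rng):
    # Partial device participation: the server samples a fraction of clients.
    num_sampled = max(1, int(SAMPLE_FRACTION * NUM_CLIENTS))
    sampled = rng.choice(NUM_CLIENTS, size=num_sampled, replace=False)

    # Each sampled client trains locally and sends back a clipped update.
    updates = [clip(local_update(global_model, cid), CLIP_NORM) for cid in sampled]

    # Curator model: the trusted server aggregates the updates and adds
    # Gaussian noise once, before releasing the new global model.
    aggregate = np.mean(updates, axis=0)
    noise_std = NOISE_MULTIPLIER * CLIP_NORM / num_sampled
    noisy_aggregate = aggregate + rng.normal(0.0, noise_std, size=MODEL_DIM)

    return global_model + noisy_aggregate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = np.zeros(MODEL_DIM)
    for _ in range(5):   # periodic aggregation: clients sync once per round
        model = federated_round(model, rng)
    print("global model after 5 rounds:", model)

Because the noise is injected only at the server, clients must trust the curator; larger noise multipliers strengthen the privacy guarantee but increase training variance, which is the trade-off the thesis studies.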
dc.description.degree: Master of Science
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:36318
dc.identifier.uri: http://hdl.handle.net/10919/113600
dc.language.iso: en
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Differential Privacy
dc.subject: Federated learning
dc.subject: Communication Constraints
dc.title: Privacy-aware Federated Learning with Global Differential Privacy
dc.type: Thesis
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science

Files

Original bundle
Name: Airody_Suresh_S_T_2023.pdf
Size: 669.11 KB
Format: Adobe Portable Document Format