Machine Learning Model Watermarking through DRAM PUFs
dc.contributor.author | Khatun, Arju | en |
dc.contributor.committeechair | Xiong, Wenjie | en |
dc.contributor.committeemember | Wang, Haining | en |
dc.contributor.committeemember | Nazhandali, Leyla | en |
dc.contributor.department | Electrical and Computer Engineering | en |
dc.date.accessioned | 2025-06-07T08:02:16Z | en |
dc.date.available | 2025-06-07T08:02:16Z | en |
dc.date.issued | 2025-05-22 | en |
dc.description.abstract | Neural networks are now central to a wide range of applications, including social media, healthcare, navigation, and personal assistance. Modern networks contain billions of parameters, making training costly in both resources and finances. With rising training costs, security concerns over model theft have also emerged: an adversarial party may replicate a pre-trained model without authorization and deploy it for its own advantage. In such scenarios, watermarking allows the legitimate owner to prove ownership of the stolen model. Researchers have developed various watermarking schemes for neural networks, typically by modifying the training code. In this thesis, I develop a hardware-based watermarking scheme that exploits the Physical Unclonable Function (PUF) characteristics of DRAM modules. PUFs serve as strong hardware-based security fingerprints, and DRAM has been shown to exhibit inherent PUF behavior. One way to generate a PUF from DRAM is to disable the refresh mechanism, which lets the stored charge decay and causes bit flips in the data. In my work, a machine learning model is trained on a PUF-enabled DRAM platform where the model parameters are stored directly on the decaying DRAM cells. This process integrates the DRAM's PUF response into the model parameters and embeds a robust watermark without any modification to the training code. | en |
dc.description.abstractgeneral | Neural networks are crucial in many areas of our lives, with applications in fields such as social media, healthcare, navigation, and personal assistance. Before a neural network can be used, it must be trained. Modern neural networks are large and require significant resources to train, making the training process costly. As training costs rise, there is growing concern over model theft, where someone could illegally copy a pre-trained model and use it as their own. In such scenarios, watermarks give the original owner a way to identify their models and claim copyright. Most current research on neural network watermarking requires changes to the training code. In this work, I developed a hardware-based watermarking method that uses a hardware feature called a Physical Unclonable Function (PUF). DRAM PUFs occur naturally in computer memory (DRAM) when refresh operations are turned off, causing small-scale random changes in the stored data. I built a system in which a neural network is trained on such a memory platform, with the model's parameters stored directly on the decaying memory. This embeds a hidden, unique signature into the model that serves as a watermark, all without changing the training code. | en |
dc.description.degree | Master of Science | en |
dc.format.medium | ETD | en |
dc.identifier.other | vt_gsexam:43877 | en |
dc.identifier.uri | https://hdl.handle.net/10919/135400 | en |
dc.language.iso | en | en |
dc.publisher | Virginia Tech | en |
dc.rights | In Copyright | en |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | en |
dc.subject | Neural-Network | en |
dc.subject | DRAM | en |
dc.subject | Watermark | en |
dc.subject | FPGA | en |
dc.subject | Hardware Acceleration | en |
dc.title | Machine Learning Model Watermarking through DRAM PUFs | en |
dc.type | Thesis | en |
thesis.degree.discipline | Computer Engineering | en |
thesis.degree.grantor | Virginia Polytechnic Institute and State University | en |
thesis.degree.level | masters | en |
thesis.degree.name | Master of Science | en |