Machine Learning Model Watermarking through DRAM PUFs

dc.contributor.author: Khatun, Arju
dc.contributor.committeechair: Xiong, Wenjie
dc.contributor.committeemember: Wang, Haining
dc.contributor.committeemember: Nazhandali, Leyla
dc.contributor.department: Electrical and Computer Engineering
dc.date.accessioned: 2025-06-07T08:02:16Z
dc.date.available: 2025-06-07T08:02:16Z
dc.date.issued: 2025-05-22
dc.description.abstract: In the modern day, neural networks are of utmost importance, with applications across a wide range of areas including social media, healthcare, navigation, and personal assistance. Modern neural networks are large-scale and contain billions of parameters, so training them is costly in both compute resources and money. With the rising cost of training, security concerns over model theft have also emerged: an adversarial party may replicate a pre-trained model without authorization and deploy it for their own benefit. Watermarking gives the legitimate owner a way, in such scenarios, to assert ownership of the stolen model. Researchers have developed various watermarking schemes for neural networks, typically by modifying the training code. In this thesis, I develop a hardware-based watermarking scheme that uses the PUF (Physical Unclonable Function) characteristics of DRAM modules. PUFs can serve as strong hardware-based security fingerprints, and DRAM has been shown to exhibit inherent PUF behavior. One way to generate a PUF from DRAM is to disable the DRAM refresh mechanism, which lets the stored charge decay and causes bit-flips in the stored data. In my work, a machine learning model is trained on a PUF-enabled DRAM platform where the model parameters are stored directly on the decaying DRAM cells. This integrates the DRAM's PUF into the model parameters and embeds a robust watermark without any modification to the training code. (A minimal simulation sketch of this decay-based imprinting appears after the record fields below.)
dc.description.abstractgeneral: In the modern day, neural networks are crucial in many areas of our lives, with applications in fields such as social media, healthcare, navigation, and personal assistance. Before a neural network is ready to be used, it must be trained. Modern neural networks are large and require significant resources to train, making the training process costly. As training costs rise, there is growing concern over model theft, where someone could illegally copy a pre-trained model and use it as their own. In such scenarios, watermarks give the original owner a way to identify their models and claim copyright. Most current research on neural network watermarking requires changes to the training code. In this work, I developed a hardware-based watermarking method that uses a hardware feature called a Physical Unclonable Function (PUF). DRAM PUFs occur naturally in computer memory (DRAM) when refresh operations are turned off, causing small random changes (bit-flips) in the stored data. I built a system in which a neural network is trained on such a memory platform, with the model's parameters stored directly on the decaying memory. This embeds a hidden, unique signature into the model that serves as a watermark, without changing the training code.
dc.description.degree: Master of Science
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:43877
dc.identifier.uri: https://hdl.handle.net/10919/135400
dc.language.iso: en
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Neural-Network
dc.subject: DRAM
dc.subject: Watermark
dc.subject: FPGA
dc.subject: Hardware Acceleration
dc.title: Machine Learning Model Watermarking through DRAM PUFs
dc.type: Thesis
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science
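The decay-as-watermark mechanism described in the abstracts can be illustrated with a short simulation. The Python sketch below is not the thesis's actual FPGA/DRAM system; it is a toy model under stated assumptions: a seeded random generator stands in for one physical DRAM module's fixed set of weak cells, FLIP_RATE and the tensor shape are made-up parameters, and flips are restricted to mantissa bits so the toy loop stays numerically stable (real decay can flip any bit).

```python
# Toy simulation of DRAM-decay watermarking (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(seed=42)   # one seed ~= one physical DRAM module

SHAPE = (256, 256)                     # toy parameter tensor
FLIP_RATE = 1e-4                       # assumed fraction of bits that decay

# A real DRAM PUF's weak cells are fixed by manufacturing variation; here we
# fix them once per "device" by drawing them from the seeded generator.
n_words = SHAPE[0] * SHAPE[1]
n_flips = int(n_words * 32 * FLIP_RATE)
flip_words = rng.integers(0, n_words, size=n_flips)
# Mantissa bits (0-22) only, so the toy loop avoids NaN/inf from exponent flips.
flip_masks = np.uint32(1) << rng.integers(0, 23, size=n_flips).astype(np.uint32)

def apply_dram_decay(weights: np.ndarray) -> np.ndarray:
    """XOR the device's fixed (word, bit) positions into the float32 bit
    pattern of `weights`, mimicking charge decay with refresh disabled."""
    raw = weights.astype(np.float32).ravel().view(np.uint32)
    np.bitwise_xor.at(raw, flip_words, flip_masks)  # duplicate-index-safe flips
    return raw.view(np.float32).reshape(weights.shape)

# Stand-in training loop: the parameters live in the decaying memory, so the
# device-specific flip pattern is re-imprinted after every update, while the
# update rule itself is never modified.
w = rng.standard_normal(SHAPE).astype(np.float32)
for step in range(100):
    w = w - 0.01 * rng.standard_normal(SHAPE).astype(np.float32)  # fake gradient step
    w = apply_dram_decay(w)                                       # PUF imprint

# Verification idea (also a sketch): the owner, who holds the physical module,
# re-derives the decayed positions and checks a suspect model's weight bits
# for the same device-specific pattern.
```

The point of the sketch is the design property the abstract claims: because the parameters physically reside in the decaying memory, the device-specific bit-flip pattern is folded into them on every update while the training code itself stays untouched.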

Files

Original bundle
Name: Khatun_A_T_2025.pdf
Size: 1.96 MB
Format: Adobe Portable Document Format
