Practical Federated Recommendation Model Learning Using ORAM with Controlled Privacy

Date

2025-03-30

Publisher

ACM

Abstract

Training high-quality recommendation models requires collecting sensitive user data. Federated learning (FL), the popular privacy-enhancing training method, cannot be used practically for these models because of their large embedding tables. This paper introduces FEDORA, a system for training recommendation models with FL. FEDORA allows each user to download, train, and upload only a small subset of the large tables based on their private data, while hiding the access pattern using oblivious RAM (ORAM). FEDORA reduces ORAM's prohibitive latency and memory overheads by (1) introducing ε-FDP, a formal way to balance ORAM's privacy against its performance, and (2) placing the large ORAM in a power- and cost-efficient SSD with SSD-friendly optimizations. Additionally, FEDORA is carefully designed to (3) support modern FL operation modes. FEDORA achieves high model accuracy by using private features during training, while achieving, on average, 5× lower latency and 158× longer SSD lifetime than the baseline.
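To give a rough intuition for the access-pattern problem the abstract describes: if a client fetches only the embedding rows its private data touches, the server learns those rows from the request itself. One naive mitigation (not FEDORA's ORAM, and far weaker than it) is to pad every fetch to a fixed size with randomly chosen dummy rows, so each request looks identical in size. A minimal sketch, with all names hypothetical:

```python
import random

def padded_fetch(table, real_ids, pad_to):
    """Fetch a fixed-size set of embedding rows, mixing real ids with
    random dummy ids so the request size never reveals how many rows
    (or which specific rows) the client actually needs."""
    assert len(real_ids) <= pad_to <= len(table)
    dummies = random.sample(sorted(set(table) - set(real_ids)),
                            pad_to - len(real_ids))
    request = sorted(set(real_ids) | set(dummies))  # same size every round
    return {i: table[i] for i in request}

# Toy embedding table: row id -> 4-dimensional vector.
table = {i: [float(i)] * 4 for i in range(100)}
rows = padded_fetch(table, real_ids={3, 42}, pad_to=16)
```

Padding alone still leaks which rows appear across repeated requests; ORAM schemes go further by re-encrypting and reshuffling storage so repeated accesses to the same row are unlinkable, which is where the latency and memory overheads FEDORA targets come from.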
