Nonparametric distributed learning under general designs
dc.contributor.author | Liu, Meimei | en |
dc.contributor.author | Shang, Zuofeng | en |
dc.contributor.author | Cheng, Guang | en |
dc.contributor.department | Statistics | en |
dc.date.accessioned | 2021-01-12T15:10:19Z | en |
dc.date.available | 2021-01-12T15:10:19Z | en |
dc.date.issued | 2020-08-21 | en |
dc.description.abstract | This paper focuses on distributed learning in the nonparametric regression framework. With sufficient computational resources, the efficiency of distributed algorithms improves as the number of machines increases. We aim to analyze how the number of machines affects statistical optimality. We establish an upper bound on the number of machines under which statistical minimaxity is achieved in two settings: nonparametric estimation and hypothesis testing. Our framework is more general than existing work: we build a unified framework for distributed inference across various regression problems, including thin-plate splines and additive regression under random design with univariate, multivariate, and diverging-dimensional covariates. The main tool for achieving this is a tight bound on an empirical process, obtained by introducing the Green function for equivalent kernels. Thorough numerical studies support the theoretical findings. | en |
dc.description.notes | This work was completed while Cheng was a member of the Institute for Advanced Study, Princeton, in the fall of 2019. Cheng would like to acknowledge the hospitality of the IAS, as well as financial support from NSF DMS-1712907, DMS-1811812, and DMS-1821183, the Office of Naval Research (ONR N00014-18-2759), and the Adobe Data Science Fund. | en |
dc.description.sponsorship | National Science Foundation (NSF) [DMS-1712907, DMS-1811812, DMS-1821183]; Office of Naval Research [ONR N00014-18-2759]; Adobe Data Science Fund | en |
dc.format.mimetype | application/pdf | en |
dc.identifier.doi | https://doi.org/10.1214/20-EJS1733 | en |
dc.identifier.issn | 1935-7524 | en |
dc.identifier.issue | 2 | en |
dc.identifier.uri | http://hdl.handle.net/10919/101855 | en |
dc.identifier.volume | 14 | en |
dc.language.iso | en | en |
dc.rights | Creative Commons Attribution 4.0 International | en |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | en |
dc.subject | Computational limit | en |
dc.subject | divide and conquer | en |
dc.subject | kernel ridge regression | en |
dc.subject | minimax optimality | en |
dc.subject | nonparametric testing | en |
dc.title | Nonparametric distributed learning under general designs | en |
dc.title.serial | Electronic Journal of Statistics | en |
dc.type | Article - Refereed | en |
dc.type.dcmitype | Text | en |
dc.type.dcmitype | StillImage | en |
Files
Original bundle
1 - 1 of 1
- Name: euclid.ejs.1597975224.pdf
- Size: 471.81 KB
- Format: Adobe Portable Document Format