Authors: Al Daas, Hussam; Ballard, Grey; Cazeaux, Paul; Hallman, Eric; Miedlar, Agnieszka; Pasha, Mirjeta; Reid, Tim W.; Saibaba, Arvind K.

Title: Randomized Algorithms for Rounding in the Tensor-Train Format
Type: Article - Refereed
Journal: SIAM Journal on Scientific Computing, Volume 45, Issue 1, Pages A74-A95 (22 pages)
Date issued: 2023-01-27
Date available: 2024-02-12
ISSN: 1064-8275 (print); 1095-7197 (online)
DOI: https://doi.org/10.1137/21M1451191
Handle: https://hdl.handle.net/10919/117944
Format: application/pdf
Language: en
Rights: In Copyright
Keywords: high-dimensional problems; randomized algorithms; tensor decompositions; tensor-train format

Abstract: The tensor-train (TT) format is a highly compact low-rank representation for high-dimensional tensors. TT is particularly useful when representing approximations to the solutions of certain types of parametrized partial differential equations. For many of these problems, computing the solution explicitly would require an infeasible amount of memory and computational time. While the TT format makes these problems tractable, iterative techniques for solving the PDEs must be adapted to perform arithmetic while maintaining the implicit structure. The fundamental operation used to maintain feasible memory and computational time is called rounding, which truncates the internal ranks of a tensor already in TT format. We propose several randomized algorithms for this task that are generalizations of randomized low-rank matrix approximation algorithms and provide significant reduction in computation compared to deterministic TT-rounding algorithms. Randomization is particularly effective in the case of rounding a sum of TT-tensors (where we observe a 20× speedup), which is the bottleneck computation in the adaptation of GMRES to vectors in TT format. We present the randomized algorithms and compare their empirical accuracy and computational time with deterministic alternatives.
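
The abstract describes the randomized TT-rounding algorithms as generalizations of randomized low-rank matrix approximation. As an illustration of that matrix-level building block only (not the paper's TT-rounding algorithm itself), the following is a minimal NumPy sketch of a standard randomized range-finder with oversampling; the function name, the oversampling default, and the synthetic test matrix are illustrative assumptions, not taken from the paper.

import numpy as np

def randomized_low_rank(A, rank, oversample=10, rng=None):
    """Randomized low-rank approximation (range-finder style sketch).

    Sketch A with a Gaussian test matrix, orthonormalize the sample to
    obtain a range basis Q, then compress: A is approximated by Q (Q^T A),
    truncated to the target rank.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    # Gaussian sketch of the column space of A
    Omega = rng.standard_normal((n, rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)   # orthonormal basis for the sampled range
    B = Q.T @ A                      # small (rank + oversample) x n matrix
    # Truncate to the target rank via an SVD of the small matrix
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ U[:, :rank], s[:rank], Vt[:rank, :]

# Usage: approximate a matrix of exact rank 20
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 400))
U, s, Vt = randomized_low_rank(X, rank=20, rng=rng)
print(np.linalg.norm(X - U @ (s[:, None] * Vt)) / np.linalg.norm(X))

Roughly speaking, the paper's contribution is to generalize this kind of sketch-then-truncate primitive to the cores of a TT-tensor, so that the internal TT ranks can be truncated without ever forming the full high-dimensional tensor.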