| JOURNAL OF COMPUTATIONAL PHYSICS | Volume: 383 |
| Reduction of multivariate mixtures and its applications | |
| Article | |
| Beylkin, Gregory [1]; Monzon, Lucas [1]; Yang, Xinshuo [1,2] | |
| [1] Univ Colorado, Dept Appl Math, UCB 526, Boulder, CO 80309 USA | |
| [2] Colorado Sch Mines, Dept Elect Engn, Golden, CO 80401 USA | |
| Keywords: Multivariate mixtures; Reduction algorithms; Integral equations; KDE; Far-field summation in high dimensions | |
| DOI : 10.1016/j.jcp.2019.01.015 | |
| Source: Elsevier | |
【 Abstract 】
We consider fast deterministic algorithms to identify the best linearly independent terms in multivariate mixtures and use them to compute, up to a user-selected accuracy, an equivalent representation with fewer terms. Importantly, the multivariate mixtures do not have to be a separated representation of a function. One algorithm employs a pivoted Cholesky decomposition of the Gram matrix constructed from the terms of the mixture to select what we call skeleton terms, and the other uses orthogonalization for the same purpose. Both algorithms require O(r^2 N + p(d) rN) operations, where N is the initial number of terms in the multivariate mixture, r is the number of selected linearly independent terms, and p(d) is the cost of computing the inner product between two terms of a mixture in d variables. For general Gaussian mixtures p(d) ~ d^3, since we need to diagonalize a d x d matrix, whereas for separated representations p(d) ~ d (there is no need for diagonalization). Due to conditioning issues, the resulting accuracy is limited to about one half of the available significant digits for both algorithms. We also describe an alternative algorithm that is capable of achieving higher accuracy but is applicable only in low dimensions or to multivariate mixtures in separated form. We describe a number of initial applications of these algorithms to solving partial differential and integral equations and to addressing several problems in data science. For data science applications in high dimensions, we consider the kernel density estimation (KDE) approach for constructing a probability density function (PDF) of a cloud of points, a far-field kernel summation method, and the construction of equivalent sources for non-oscillatory kernels (used in both computational physics and data science), and, finally, we show how to use the new algorithm to produce seeds for subdividing a cloud of points into groups. (C) 2019 Elsevier Inc. All rights reserved.
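The core step described in the abstract, selecting skeleton terms through a pivoted Cholesky decomposition of the Gram matrix and re-expressing the remaining terms in that skeleton basis, can be illustrated with a minimal sketch. The sketch below is not the authors' code: it assumes one-dimensional, L2-normalized Gaussian terms, builds the full dense Gram matrix (so it does not attain the O(r^2 N + p(d) rN) cost of the paper's algorithms), and all function and variable names (gaussian_inner, reduce_mixture, eps) are illustrative.

```python
# Hedged sketch: reduce a 1-D Gaussian mixture by picking "skeleton" terms with
# a pivoted Cholesky decomposition of the Gram matrix, then folding the other
# terms' coefficients into the skeleton terms.
import numpy as np

def gaussian_inner(mu1, s1, mu2, s2):
    """L2 inner product of two unit-norm 1-D Gaussians exp(-(x-mu)^2 / (2 s^2))."""
    v = s1**2 + s2**2
    return np.sqrt(2.0 * s1 * s2 / v) * np.exp(-(mu1 - mu2) ** 2 / (2.0 * v))

def reduce_mixture(coeffs, mus, sigmas, eps=1e-7):
    N = len(coeffs)
    # Dense Gram matrix of the normalized terms (O(N^2) inner products here;
    # the paper's algorithms only touch the pivoted columns).
    G = np.array([[gaussian_inner(mus[i], sigmas[i], mus[j], sigmas[j])
                   for j in range(N)] for i in range(N)])
    # Pivoted Cholesky: greedily take the term with the largest residual
    # diagonal entry until the residual drops below eps^2.
    d = np.diag(G).copy()
    L = np.zeros((N, 0))
    pivots = []
    while d.max() > eps**2 and len(pivots) < N:
        k = int(np.argmax(d))
        pivots.append(k)
        col = (G[:, k] - L @ L[k, :]) / np.sqrt(d[k])
        L = np.column_stack([L, col])
        d = np.maximum(d - col**2, 0.0)
    # Project every term onto the skeleton terms and fold the original
    # coefficients into an equivalent representation with r = len(pivots) terms.
    Gsk = G[np.ix_(pivots, pivots)]
    X = np.linalg.solve(Gsk, G[pivots, :])      # r x N projection coefficients
    new_coeffs = X @ np.asarray(coeffs)
    return pivots, new_coeffs

# Toy usage: the third term duplicates the first, so two skeleton terms suffice
# and the duplicated coefficients are merged.
pivots, c = reduce_mixture([1.0, 0.5, 2.0], mus=[0.0, 1.0, 0.0], sigmas=[1.0, 0.5, 1.0])
```

Note that the skeleton Gram matrix Gsk is ill-conditioned whenever the retained terms are nearly dependent, which is consistent with the abstract's remark that conditioning limits the achievable accuracy to roughly half of the available significant digits.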
【 License 】
Free
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| 10_1016_j_jcp_2019_01_015.pdf | 2185KB | PDF | |