JOURNAL OF COMPUTATIONAL PHYSICS | Volume 254
Massively parallel Monte Carlo for many-particle simulations on GPUs
Article
Anderson, Joshua A. [1]; Jankowski, Eric [1]; Grubb, Thomas L. [2]; Engel, Michael [1]; Glotzer, Sharon C. [1,2]
[1] Univ Michigan, Dept Chem Engn, Ann Arbor, MI 48109 USA
[2] Univ Michigan, Dept Mat Sci & Engn, Ann Arbor, MI 48109 USA
Keywords: Monte Carlo; Parallel algorithm; Detailed balance; GPGPU; CUDA; Hard disk system
DOI: 10.1016/j.jcp.2013.07.023
Source: Elsevier
【 Abstract 】
Current trends in parallel processors call for the design of efficient massively parallel algorithms for scientific computing. Parallel algorithms for Monte Carlo simulations of thermodynamic ensembles of particles have received little attention because of the inherent serial nature of the statistical sampling. In this paper, we present a massively parallel method that obeys detailed balance and implement it for a system of hard disks on the GPU. We reproduce results of serial high-precision Monte Carlo runs to verify the method. This is a good test case because the hard disk equation of state over the range where the liquid transforms into the solid is particularly sensitive to small deviations away from the balance conditions. On a Tesla K20, our GPU implementation executes over one billion trial moves per second, which is 148 times faster than on a single Intel Xeon E5540 CPU core, enables 27 times better performance per dollar, and cuts energy usage by a factor of 13. With this improved performance we are able to calculate the equation of state for systems of up to one million hard disks. These large system sizes are required in order to probe the nature of the melting transition, which has been debated for the last forty years. In this paper we present the details of our computational method, and discuss the thermodynamics of hard disks separately in a companion paper. (C) 2013 Elsevier Inc. All rights reserved.
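The elementary operation the abstract refers to is the hard-disk trial move: displace one disk by a small random amount and reject the move if the disk would overlap any other. As a minimal serial illustration of this step (not the authors' GPU implementation, and with the function name `trial_move` and all parameters chosen here for the sketch), one such move in a periodic square box might look like:

```python
import random

def trial_move(positions, i, sigma, box, max_disp, rng):
    """Attempt to displace disk i; accept only if no overlap results.

    positions: list of (x, y) disk centers in a periodic box of side `box`
    sigma: disk diameter (overlap when center distance < sigma)
    max_disp: maximum displacement per coordinate
    """
    x, y = positions[i]
    nx = (x + rng.uniform(-max_disp, max_disp)) % box
    ny = (y + rng.uniform(-max_disp, max_disp)) % box
    for j, (ox, oy) in enumerate(positions):
        if j == i:
            continue
        # minimum-image convention for periodic boundaries
        dx = nx - ox
        dx -= box * round(dx / box)
        dy = ny - oy
        dy -= box * round(dy / box)
        if dx * dx + dy * dy < sigma * sigma:
            return False  # overlap: reject, leave positions unchanged
    positions[i] = (nx, ny)  # no overlap: accept
    return True
```

Because hard disks have no finite interaction energy, the usual Metropolis acceptance ratio reduces to this binary overlap test; the difficulty the paper addresses is performing many such moves in parallel while still satisfying detailed balance.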
【 License 】
Free
【 Preview 】
Files | Size | Format | View
---|---|---|---
10_1016_j_jcp_2013_07_023.pdf | 328KB | PDF | download