Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities.

Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a bright spot amid turmoil in the computer industry.

Computation continues to measure and reduce the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring the causes of outages so that resources can be applied where they matter most, and by automating most account and access authorization and management processes. These improvements allow more dollars to go toward fielding the best supercomputers for science while operating them at lower cost and with greater responsiveness to customers.

Computation led diverse efforts across many LLNL mission areas and made important strides in application development. New large-text-corpus analysis technologies reduce from weeks to hours the time key customers in the intelligence community need to exploit large data sets. LLNL staff specified and validated codes used to track and monitor nuclear materials in the Russian Federation, thus fulfilling key U.S. treaty goals.
A new workflow-based analysis engine automatically processes data from National Ignition Facility (NIF) shot diagnostics, enabling the National Ignition Campaign goal of three shots per day. LLNL's premier data analysis and visualization tool, VisIt, supplanted other, less capable tools in the research community and became open source while preserving traditional support for funded customers. Computation's research this year led to new insights in hydrogen fuel efficiency, improved laser-optics maintenance, and the most effective automated prediction of intruder and malicious traffic on our networks yet achieved. In addition, new algorithms are enabling scientists to more effectively apply the massively parallel computing platforms of today and tomorrow to the radiation-transport problem, a key capability for securing the nation through radiation detector networks and portal monitoring.