The College of Computing at Georgia Tech and Chapman & Hall/CRC Press today announced at the SC07 conference the launch of “Petascale Computing: Algorithms and Applications”, the first published collection on petascale techniques for computational science and engineering. Edited by David A. Bader, associate professor of computing and executive director of high-performance computing at Georgia Tech, the collection represents an academic milestone in the high-performance computing industry and is the first work to be released through Chapman & Hall/CRC Press’ new Computational Science series.
“High-performance computing will enable breakthrough science and engineering in the 21st century,” said Bader. “My goal in developing this book was to inspire members of the high-performance computing community to solve computational grand challenges that will help our society, protect our environment, and improve our understanding in fundamental ways, all through the efficient use of petascale computing.”
Featuring contributions from the world’s leading experts in computational science, “Petascale Computing: Algorithms and Applications” discusses expected breakthroughs in the computational science and engineering field and covers a breadth of topics in petascale computing, including architectures, software, programming methodologies, tools, scalable algorithms, performance evaluation and application development. Covering a wide range of issues critical to the advancement of the high-performance computing/supercomputing industry, this edited collection illustrates the application of petascale computing to space and Earth science missions, biological systems and climate science, among others, and details the simulation of multiphysics, cosmological evolution, molecular dynamics and biomolecules.
“In the same way as petascale computing will open up new and unprecedented opportunities for research in computational science, I expect this current book to lead to a deeper understanding and appreciation of research in computational science and engineering,” said Horst Simon, associate laboratory director for computing sciences, Lawrence Berkeley National Laboratory and editor of Chapman & Hall/CRC Press’ new Computational Science book series.
Hmm. Exascale is next. The power and environmental requirements make my head swim, though, even with the reductions we've been talking about from the next generation (or two) of technologies. Imagine a single 'node' requires 100 W. Now imagine there are 10 million of them...
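A quick back-of-envelope sketch of that scaling, treating the 100 W per node and 10 million nodes above purely as the commenter's hypothetical figures:

```python
# Back-of-envelope power estimate using the hypothetical numbers from the comment above:
# 100 W per node and 10 million nodes (both are just illustrative guesses, not real specs).
watts_per_node = 100
node_count = 10_000_000

total_watts = watts_per_node * node_count
print(f"Total compute draw: {total_watts / 1e9:.1f} GW")  # prints 1.0 GW
```

A gigawatt for the compute alone, before cooling and other facility overhead, is on the order of the output of a large power plant, which is the scale behind the power and environmental worry.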
computronium *snorks*
Dude. I made Greg come check this out. I now seem to have lost him for the evening... Note to self. Don't encourage him in the future.
How did he get so engrossed? The HPC stuff or the goofy computronium bit?
ReplyDelete"Goofy computronium bit"?
Also, aren't we still a year or two away from actual petascale computing? Or am I missing something?
Doug M.
"Goofy computronium bit"?
Based on everything that I am seeing via stuff I cannot share with you all (*mutter* NDA *mutter*), computronium is such an inane fantasy that it's starting to boil my blood. Heat and energy requirements are going to completely nuke the idea.
The Singularity will be consumed by its own waste heat and insane, impossible energy requirements.
Also, aren't we still a year or two away from actual petascale computing?
Yeah, we're about a year away, but people are using virtual machines to simulate some of the issues so they can develop algorithms ahead of the hardware being available. It's the perennial HPC problem: the hardware comes first, the software spends the next x years catching up, and then the new hardware arrives. They're trying to fix that now. We'll see how well it works.