Scientific computing has often been called the third approach to scientific discovery, emerging as a peer to experimentation and theory. Historically, the synergy between experimentation and theory has been well understood: experiments give insight into possible theories, theories inspire experiments, experiments reinforce or invalidate theories, and so on. As scientific computing has evolved to produce results that meet or exceed the quality of experimental and theoretical results, it has become indispensable.
Parallel processing has been an enabling technology in scientific computing for more than 20 years. This book is the first in-depth discussion of parallel computing in 10 years; it reflects the mix of topics that mathematicians, computer scientists, and computational scientists focus on to make parallel processing effective for scientific problems. Presently, the impact of parallel processing on scientific computing varies greatly across disciplines, but it plays a vital role in most problem domains and is absolutely essential in many of them.
Parallel Processing for Scientific Computing is divided into four parts: The first concerns performance modeling, analysis, and optimization; the second focuses on parallel algorithms and software for an array of problems common to many modeling and simulation applications; the third emphasizes tools and environments that can ease and enhance the process of application development; and the fourth provides a sampling of applications that require parallel computing to scale to the larger, more realistic models that can advance science and engineering.
Some of my coworkers contributed to this book. My boss's boss's boss's boss (Horst) was one of the editors. If you're going to be doing HPC parallel programming, I'd recommend it. I'll probably be picking up a copy myself. I haven't done much coding for a while, and that's something I'd like to address again.
It ain't cheap though.