NHSE ReviewTM: Comments · Archive · Search
Unfortunately, standardization is hard to achieve in the world of HPC. Vendor companies are struggling to exist, not contemplating how standards might improve the future usability of their systems. There is a widespread notion that "software doesn't sell HPC machines," so vendors do not perceive a real economic advantage to implementing features that would make their machines consistent with others. Other constraints, such as the limited window-of-opportunity for introducing HPC standards and mixed responses from the user community, have also inhibited standardization. Until unequivocal requirements are included consistently in procurements of HPC machines, vendors are unlikely to take user requests for standardization seriously.
In 1995, a national task force was convened to address this situation. Endorsed by the National Coordinating Office for HPCC and the Parallel Tools Consortium, the group set out to establish the basic requirements for a "standard software infrastructure" to support the development of HPC applications. Representatives from both the HPC industry and major HPC user sites participated in the effort. They drafted standard verbiage for system software and tools requirements, for use in preparing RFPs and other procurement documents.
This article begins by discussing why standardization is important for the future of HPC. It goes on to describe the comprehensive discussions held by the task force on all aspects of system software and tools - from distributed file systems to interactive debuggers and parallelizing compilers. Hyperlinks refer to pertinent sections of the task force's formal documentation. The rationale for how the group established guidelines is presented, including where elements were omitted because the technology was not considered sufficiently mature for standardization or because the need was specific to only some user sites. Attention is drawn to the group's discussions on obstacles to standardization and promising areas for further work. A concluding section discusses the outcomes of the task force efforts and reflects on what can be done to improve the quality and consistency of HPC software support.
Pancake's research area is software support for high performance computing, with emphasis on usability requirements and user interface design. Her previous career involved extensive ethnographic fieldwork, where she applied cross-cultural techniques to study social change among Guatemalan Indians. Since receiving a Ph.D. in Computer Engineering from Auburn University in 1986, she has applied both ethnographic and engineering techniques to the problem of how software tools can support users' conceptual models.
Focusing on the needs of HPC users, Pancake conducted much of the seminal work identifying how the needs of scientists and engineers differ from those of the computer science community. The visual idioms and language modifications she developed in response to those needs are used by several HPC manufacturers. Pancake is on the editorial boards of Communications of the ACM, the NHSE Review, and Scientific Programming. She is director of the Northwest Alliance for Computational Science and Engineering, chair of the Parallel Tools Consortium, and a co-chair of the High Performance Debugging Forum.