Satoshi Matsuoka Mocks 12 Myths of High-Performance Computing
insideHPC reports that Satoshi Matsuoka, head of Japan's largest supercomputing center, has co-authored a high-performance computing paper challenging conventional wisdom. In the paper, titled "Myths and Legends in High-Performance Computing" and appearing this week on the arXiv site, Matsuoka and four colleagues offer opinions and analysis on issues such as quantum computing replacing classical HPC, the zettascale timeline, disaggregated computing, domain-specific languages (DSLs) versus Fortran, and the cloud subsuming HPC, among other topics. "We believe (these myths and legends) represent the zeitgeist of the current era of massive change, driven by the end of many scaling laws, such as Dennard scaling and Moore's law," the authors write. In this way they join the growing "end of" discussions in HPC. For example, as the industry moves through 3nm, 2nm, and 1.4nm chips, what comes next? Will accelerators displace CPUs altogether? What's next after overburdened electrical I/O interconnects? How do we get more memory per core?

The paper's abstract promises a "humorous and thought-provoking" discussion - for example, on the possibility of quantum computing taking over high-performance computing. ("Once a quantum state is constructed, it can often be "used" only once because measurements destroy superposition. A second limitation stems from the lack of algorithms with high speedups....") The paper also tackles myths like "all high-performance computing will be subsumed by the clouds" and "everything will be deep learning." Thanks to guest reader for submitting the article.
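For readers wanting the math behind that quoted measurement limitation, here is a minimal sketch in standard textbook notation (not drawn from the paper itself). A single qubit in superposition is

\[ |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1. \]

Measuring it in the computational basis yields outcome 0 with probability \(|\alpha|^2\), collapsing the state to \(|0\rangle\), or outcome 1 with probability \(|\beta|^2\), collapsing it to \(|1\rangle\). Either way the amplitudes \(\alpha\) and \(\beta\) are destroyed, so the state must be prepared from scratch before it can be "used" again - the single-use property the authors describe.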