IEEE Spectrum ran an interesting article the other day, titled “Next-Generation Supercomputers,” with the subtitle: “Supercomputers are now running our search engines and social networks. But the heady days of stunning performance increases are over.”
The article was written by one of the authors of a 278-page DARPA report on the future of supercomputing, one of whose conclusions was that the accelerating increase in supercomputer power we’ve seen in the past is unlikely to continue. To quote:
to construct an exaflops-class supercomputer that occupies a large building or a petaflops-class one that fits in a couple of refrigerator-size racks … would require engineers to rethink entirely how they construct number crunchers in the future.
The problem identified isn’t computer architecture per se but rather power consumption. The authors estimate that building an exaflops-class supercomputer by simply scaling up current technology would yield a machine drawing more than 0.1 percent of the total capacity of the U.S. power grid, about 1.5 gigawatts. Oops.
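As a rough sanity check on those numbers (the 1 exaflops and 1.5 gigawatt figures come from the article; the 20-megawatt comparison budget is my own illustrative assumption, not something the report is quoted on here), the implied energy cost per operation is easy to work out:

```python
# Back-of-envelope: energy per floating-point operation implied by
# the article's numbers (1 exaflops sustained at 1.5 GW of power).
EXAFLOPS = 1e18        # floating-point operations per second
POWER_WATTS = 1.5e9    # 1.5 gigawatts

joules_per_flop = POWER_WATTS / EXAFLOPS
print(f"{joules_per_flop * 1e9:.2f} nJ per flop")  # -> 1.50 nJ per flop

# For comparison, an assumed "practical" machine budget of 20 MW
# would demand about 20 pJ per flop -- roughly 75x more efficient.
TARGET_WATTS = 20e6
print(f"{TARGET_WATTS / EXAFLOPS * 1e12:.0f} pJ per flop")  # -> 20 pJ per flop
```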
Singularity skeptics are often heard claiming that technology development is going to be an S-curve rather than an ongoing exponential explosion. Are we finally hitting the “slowdown” part of the S?
Of course, that’s not the conclusion DARPA’s report actually drew. Rather, its advice was basically to get with the times and focus on dividing problems up among a larger set of smaller computers, rather than on building individual super-fast machines. And even if the speed of the fastest supercomputer does decelerate, what about, for instance, the total amount of computing power on Earth? I haven’t seen any good estimates of this (let me know if you find one!) but I’ll bet you it’s accelerating steadily and dramatically.
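To make that “divide it up” strategy concrete, here is a toy sketch (my own illustration, not code from the report) that splits one big computation across a pool of worker processes instead of leaning on a single fast processor:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum of squares over one chunk of the full range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    N = 10_000_000
    workers = 8
    # Carve the one big problem into independent chunks...
    step = N // workers
    chunks = [(w * step, N if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    # ...and farm the chunks out to a pool of smaller "machines".
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same answer as sum(i * i for i in range(N)), in parallel
```

The same decomposition scales from processes on one box to clusters of many commodity machines, which is essentially the direction the report points.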
Another lesson from this report is the synergy between various advanced technologies. The bottleneck in supercomputer speed is neither human ingenuity at designing new computers nor the laws of physics; it’s the current state of electrical power generation technology. Or to put it differently, a breakthrough in power generation (plus some “business as usual” engineering) would yield a breakthrough in supercomputer speed.
And then, of course, there is the emerging area of low-power supercomputing, which uses technology similar to that found in mobile devices and aims specifically to sidestep these power problems.
Still, this sort of data is a caution against approaching the future via mindless curve-plotting and trend extrapolation. There are many trends one can look at, and ultimately one needs to interpret them in the context of one’s holistic understanding of the future. Numeric extrapolation and qualitative extrapolation need to proceed in synergy, a methodology that strips away the illusion that we can do any kind of fully objective futurology — but leaves plenty of room for real understanding.