Supercomputing on a Desktop
Supercomputing is the engine that drives our science, commerce, and communication. Giant search engines trawl the net with billions of queries, molecules are modeled and modified in massive simulations, and deep under Wall Street hulking processors trade massive blocks of money at the speed of light. But the era of the giant, room-spanning supercomputers may soon have some serious competition in the form of a small Calgary-based startup bringing supercomputing to the masses.
Calgary is known throughout Canada as the center of the country’s oil and gas industry, a booming city buoyed by the bubble-swelled profits of 20th century fossil fuel excess. It’s a prosperous if not especially cosmopolitan urban hub on the Canadian prairies. But tucked away in an unassuming office in the suburbs of this middling city, Tycrid Platform Technologies has been quietly putting together desktop systems boasting teraflop-level horsepower under the hood since it was founded in 2008.
Miniaturization and Moore’s law have shrunk the components, but supercomputers still share one trait with their ancient, less powerful brethren: the ability to take up an entire room. The world’s fastest and most complex thinking machines may crunch stupendous amounts of data, but they also take up a lot of space and consume enormous amounts of energy.
“Desktop supercomputing really changes the way people can work. In science, it allows those companies that need access to High Performance Computing (HPC) resources to have HPC performance affordably. What this does is let the researcher become more creative in the development of new science. Algorithms that used to be considered a ‘holy grail’ for HPC can be attainable at the desktop level,” Tycrid president Chris Heier said.
Supercomputing as a term can be a little misleading. Though current, room-spanning facilities are capable of incredible feats of simulation, at base they are simply many personal computers networked together and dedicated to a specific task. But trends in personal computing are bringing similar capabilities to the desktop.
The technological advances that made desktop supercomputing possible weren’t a result of big science or government necessity. Instead, the visceral, polygon-soaked world of gaming pushed graphics cards to new levels of speed and performance, and in the process put supercomputer-level power in every desktop. But some tinkering was required before a typical desktop could move beyond first person shooters and into gene sequencing and subatomic modeling. As Heier put it: “If it wasn’t for game developers’ desire to push things forward with more beautiful graphics, and thus programmable shading models, companies like NVIDIA may not have pushed an architecture so reprogrammable, it could easily be called a CPU itself.”
Another key enabler for HPC is OpenCL, Apple’s bid to avoid being tied down to a single vendor when building its Mac Pro systems. OpenCL is the general computing equivalent of OpenGL, the API that enables high-speed processing and rapid-fire gaming on any system regardless of vendor or underlying hardware. “Because companies are looking beyond just the CPU, there really needed to be a standardized platform for developing for all of these new processing architectures,” Heier said.
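To give a sense of the programming model OpenCL standardized, here is a minimal sketch: a vector-addition kernel held as source text, paired with a plain-Python reference of the same data-parallel operation. The kernel is illustrative only; actually dispatching it would require an OpenCL runtime (such as the pyopencl library) and a compatible device, neither of which is assumed here.

```python
# A minimal OpenCL kernel, held as source text. Each work-item handles
# one index of the input vectors -- the data-parallel style that OpenCL
# standardized across GPUs and CPUs from any vendor.
VECTOR_ADD_KERNEL = """
__kernel void vector_add(__global const float *a,
                         __global const float *b,
                         __global float *result)
{
    int gid = get_global_id(0);   /* this work-item's index */
    result[gid] = a[gid] + b[gid];
}
"""

def cpu_vector_add(a, b):
    """Plain-Python reference for what the kernel computes per element."""
    return [x + y for x, y in zip(a, b)]

if __name__ == "__main__":
    print(cpu_vector_add([1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))
```

On a real device, the runtime would launch one instance of `vector_add` per element, all running in parallel; the CPU loop above does the same work one element at a time.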
You can make the argument that all current supercomputing is simply networking personal computers together, and Heier pointed to applications such as Folding@Home and the widespread adoption of cloud computing as two obvious examples of such activity on the macro scale. “In a way, if you run Folding@Home, you can consider your computer to be part of a multi-petaflop supercomputing cluster, the most powerful distributed supercomputer in the world. I’ve seen CG studios automatically kick their workstations into compute mode if they have been sitting idle for a period of time. This way, all of the powerful workstations being used for day to day activities contribute to rendering movie frames,” Heier said.
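The idle-workstation pattern Heier describes can be sketched in a few lines. This is a hypothetical illustration, not Tycrid’s or any studio’s actual scheduler: the threshold value and the `fetch_work`/`do_work` hooks stand in for whatever idle detection and work units a real deployment would use.

```python
import time

IDLE_THRESHOLD = 300  # seconds of inactivity before switching modes (assumed value)

def should_enter_compute_mode(last_input_time, now=None):
    """Return True once the workstation has been idle past the threshold."""
    now = time.time() if now is None else now
    return (now - last_input_time) >= IDLE_THRESHOLD

def run_cycle(last_input_time, fetch_work, do_work, now=None):
    """One pass of a hypothetical control loop: while the user is away,
    the workstation pulls compute work (e.g. a movie frame to render);
    otherwise it stays available for interactive use."""
    if should_enter_compute_mode(last_input_time, now):
        do_work(fetch_work())
        return "compute"
    return "interactive"
```

A real deployment would loop over `run_cycle`, reset `last_input_time` on any keyboard or mouse event, and checkpoint work so an interrupted frame can resume later.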
Concentrating a taste of that power on the desktop brings other advantages, according to Heier. Over ninety percent of Tycrid’s clients are universities, including the Applied Physics Laboratory at Johns Hopkins University, the University of Montana’s NIH lab, and Canadian universities like Dalhousie and McGill. And their machines are being used for everything from astrophysics research to molecular simulations and machine vision. But Heier definitely sees potential in applying desktop supercomputing to the mobile space, and said the technology, if not the market, is here today to create a mobile supercomputer. He’s equally optimistic about supercomputing at a much smaller scale.
“I could imagine blood cell sized computers at the rate technology is moving. There are already experiments going on today which look at utilizing cell sized structures as the next version of the transistor. It is something that has to be looked at carefully and implemented properly in order to be viable, and I don’t see that happening for another 10 to 20 years,” Heier said.
The miniaturization of computing from giant mainframes to mobile devices that fit in your pocket hasn’t just been an amazing technical achievement. Massive social changes, from the turmoil currently roiling the media industry to Twitter revolutions in Iran, are all due to computer chips getting smaller and more powerful. Supercomputing will follow the same path, whether Tycrid is the company to push it forward or not. The more supercomputers there are on desktops, the more people will have access to previously unattainable computing power, and with that power will come transformational change swelling from the bottom up, in a million directions at once and faster than ever before. By that point, blood cell sized computers might seem quaint.