Microsoft Speeds Up AI with New Programmable FPGA Chips
September 27, 2016
In 2012, Microsoft chief executive Steve Ballmer and computer chip researcher Doug Burger believed they had found the future of computing: chips that can be programmed for specific tasks, known as field-programmable gate arrays (FPGAs). Project Catapult, as the effort was called, was intended to shift the underlying technology of all Microsoft servers in that direction. FPGAs now form the basis of Bing, and soon the specialized chips will run artificial intelligence workloads at tremendous speed, cutting task times from four seconds to 23 milliseconds.
Wired notes that FPGAs also “drive Azure, the company’s cloud computing service,” and that, “in the coming years, almost every new Microsoft server will include an FPGA.”
“This gives us massive capacity and enormous flexibility, and the economics work,” said Burger. “This is now Microsoft’s standard, worldwide architecture.”
Microsoft — which spends between $5 billion and $6 billion a year on hardware — isn’t the only company working on new, specialized chips. “All the Internet giants are supplementing their standard server chips — central processing units, or CPUs — with alternative silicon that can keep pace with the rapid changes in AI.” As Microsoft chief executive Satya Nadella said, this is “no longer just research … [but] an essential priority.”
Artificial intelligence is driving the need for specialized chips, even as “services like Bing have outstripped Moore’s Law, the canonical notion that the number of transistors in a processor doubles every 18 months.” Even that pace of improvement is no longer sufficient, which makes FPGAs — customizable, faster, and less power-hungry — a better fit than CPUs alone.
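The “doubling every 18 months” formulation cited above is simple exponential growth. A minimal sketch of that arithmetic (function name and numbers are illustrative, not from the article):

```python
# Illustrative arithmetic for the "doubling every 18 months" formulation
# of Moore's Law mentioned above. Function name is hypothetical.

def moores_law_factor(months: float, doubling_period: float = 18.0) -> float:
    """Growth factor after `months`, doubling every `doubling_period` months."""
    return 2 ** (months / doubling_period)

# After 3 years (36 months), transistor counts quadruple: 2 ** (36 / 18) = 4.
print(moores_law_factor(36))          # 4.0
# After 5 years (60 months), roughly a 10x increase.
print(round(moores_law_factor(60)))   # 10
```

Even at that rate, per the article, general-purpose CPU improvements cannot keep up with the compute demands of services like Bing.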
For example, Microsoft’s Catapult hardware “costs less than 30 percent of everything else in the server, consumes less than 10 percent of the power, and processes data twice as fast as the company could without it.” The FPGA chips, which enable deep neural nets, will soon provide encryption, compression and machine learning for Office 365’s 23.1 million users.
Google has taken a different path, building custom tensor processing units (TPUs) for executing neural nets; because a TPU’s design is fixed in silicon, a change in neural networking models can require a new chip. Microsoft, however, “is playing a longer game.” Its FPGAs are not as fast as Google’s TPUs, but they can be reprogrammed for any task.
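The tradeoff described above can be sketched in a few lines. This is a purely conceptual model — all class and model names are hypothetical — of the difference between a fixed-function accelerator, whose supported design is set at fabrication, and an FPGA, which can be reloaded with a new configuration:

```python
# Conceptual sketch of the ASIC-vs-FPGA tradeoff described in the article.
# All names are illustrative; this models the economics, not real hardware.

class FixedFunctionChip:
    """Models a TPU-style ASIC: the supported design is fixed at fabrication."""
    def __init__(self, supported_model: str):
        self.supported_model = supported_model

    def run(self, model: str) -> str:
        if model != self.supported_model:
            # A changed model can mean fabricating new silicon.
            raise RuntimeError("new chip required for " + model)
        return "ran " + model

class FPGA:
    """Models an FPGA: loading a new bitstream retargets the same chip."""
    def __init__(self):
        self.bitstream = None

    def reprogram(self, bitstream: str) -> None:
        self.bitstream = bitstream   # reconfiguration, not new hardware

    def run(self, model: str) -> str:
        if self.bitstream != model:
            self.reprogram(model)    # adapt in place when the model changes
        return "ran " + model

tpu = FixedFunctionChip("cnn_v1")
fpga = FPGA()
print(fpga.run("cnn_v1"))   # runs the original model
print(fpga.run("rnn_v2"))   # still runs after reprogramming
try:
    tpu.run("rnn_v2")
except RuntimeError as err:
    print(err)               # the fixed-function chip cannot adapt
```

The sketch captures why Microsoft calls this “a longer game”: the FPGA trades peak speed for the ability to follow model changes without new silicon.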
Wired notes that because the data centers of Microsoft, Google and Amazon already power so much of the world’s technology, the alternative chips they are building “will drive the wider universe of apps and online services.” Microsoft, meanwhile, is looking beyond Project Catapult to a future of quantum computing and its “ultrafast computers.”
Related:
Microsoft CEO Satya Nadella on How AI Will Transform His Company, TechCrunch, 9/26/16