Microsoft rose to dominance in the 1980s and 1990s thanks to the success of its Windows operating system running on Intel processors, a cozy relationship nicknamed “Wintel.”

Now Microsoft hopes that yet another hardware–software combination will help it recapture that success – and catch rivals Amazon and Google in the race to provide advanced artificial intelligence through the cloud.

Microsoft hopes to expand the popularity of its Azure cloud platform with a new type of computer chip designed for the age of AI. Starting today, Microsoft Azure is offering customers access to chips made by the British start-up Graphcore.

Graphcore, founded in 2016 in Bristol, UK, has attracted considerable attention from AI researchers – and several hundred million dollars in investment – on the promise that its chips will accelerate the computations that make AI work. Until now, the company had neither made its chips publicly available nor shown results from early testers.

Microsoft, which invested its own money in Graphcore last December as part of a $200 million funding round, is looking for hardware that will make its cloud services more attractive to the growing number of customers running AI applications.

Unlike most chips used for AI, Graphcore’s processors were designed from scratch to support the calculations that let machines recognize faces, understand speech, parse language, drive cars, and train robots. Graphcore expects the chips to appeal to companies running mission-critical operations on AI, such as self-driving-car startups, trading firms, and businesses that process large volumes of video and audio. Those working on next-generation AI algorithms may also want to explore the platform’s advantages.

Microsoft and Graphcore today published benchmarks suggesting that the chip matches or exceeds the performance of the top AI chips from Nvidia and Google when running algorithms written for those competing platforms. Code written specifically for Graphcore’s hardware may be even more efficient.

The companies claim that certain image-processing tasks, for example, run many times faster on Graphcore’s chips than on rivals’ hardware using existing code. They also say they were able to train a popular AI model for language processing, called BERT, at rates matching those of other leading hardware.

BERT has become enormously important for AI applications involving language. Google recently said it uses BERT to power its core search business. Microsoft says it now uses Graphcore’s chips for internal AI research projects involving natural language processing.

Karl Freund, who follows the AI chip market at Moor Insights, says the results show that the chip is cutting-edge yet still flexible. A highly specialized chip can outperform an Nvidia or Google chip but would not be programmable enough for engineers to develop new applications. “They have done well and made it programmable,” he says. “Good performance in both training and inference is something that they have always said they would do, but it is really, really hard.”

Freund adds that the deal with Microsoft is crucial for Graphcore because it gives customers a new way to try the hardware. The chip may be superior to existing hardware for some applications, but it takes considerable effort to port AI code to a new platform. With a few exceptions, Freund says, the chip’s benchmarks are not striking enough to lure companies and researchers away from the hardware and software they are already comfortable using.

Graphcore has created a software framework called Poplar for porting existing AI programs to its hardware. Even so, many existing algorithms may remain better suited to software running on rival hardware. Google’s TensorFlow framework has become the de facto standard for AI programs in recent years and was written specifically with Nvidia and Google chips in mind. Nvidia is also expected to release a new AI chip next year, which is likely to deliver better performance.


Nigel Toon, co-founder and CEO of Graphcore, says the two companies began working together a year after his company launched, through Microsoft Research Cambridge in the UK. His company’s chips are particularly well suited to tasks involving very large AI models or temporal data, he says. One customer in finance reportedly saw a 26-fold performance improvement in an algorithm used to analyze market data thanks to Graphcore’s hardware.

A handful of other, smaller companies also announced today that they are working with Graphcore chips through Azure. They include Citadel, which will use the chips to analyze financial data, and Qwant, a European search engine that wants the hardware to run an image-recognition algorithm known as ResNeXt.

The AI boom has shaken up the market for computer chips in recent years. The best algorithms perform parallel mathematical calculations, which can be done more efficiently on graphics chips (GPUs), with hundreds of simple processing cores, than on conventional chips (CPUs), with a few complex processing cores.

GPU maker Nvidia has ridden the AI wave to riches, and Google announced in 2017 that it would develop its own chip, the Tensor Processing Unit, which is architecturally similar to a GPU but optimized for TensorFlow.

Graphcore’s chips, which it calls Intelligence Processing Units (IPUs), have many more cores than GPUs or TPUs. They also keep memory on the chip itself, removing a bottleneck associated with moving data onto and off a chip for processing.

Facebook is also working on its own AI chips. Microsoft has previously touted reconfigurable chips made by Intel and modified by its engineers for AI applications. A year ago, Amazon revealed that it, too, was making chips, albeit a more general-purpose processor optimized for Amazon’s cloud services.

More recently, the AI boom has fueled a flurry of hardware startups developing more specialized chips. Some are optimized for specific applications such as autonomous driving or surveillance cameras. Graphcore and a few others offer much more flexible chips, which are crucial for developing new AI applications but also far more challenging to produce. The company’s latest investment round gave it a valuation of $1.7 billion.

Graphcore’s chips may first find traction with top AI experts who can write the code needed to reap their benefits. Several prominent AI researchers have invested in Graphcore, including Demis Hassabis, co-founder of DeepMind; Zoubin Ghahramani, a professor at Cambridge University and head of Uber’s AI lab; and Pieter Abbeel, a professor at UC Berkeley who specializes in AI and robotics. In an interview with WIRED last December, AI visionary Geoffrey Hinton discussed the potential of Graphcore’s chips to advance fundamental research.

Before long, more mainstream companies may be tempted to try the technology, too. As Graphcore CEO Toon puts it, “Everyone is trying to innovate, trying to find an advantage.”

This story originally appeared on wired.com.

Featured image by Graphcore
