![](http://4.bp.blogspot.com/-SsS3Nvce7_A/WZ7FlpGyucI/AAAAAAAAPQY/xqtdKO6u7_UU9gDTTWdCTnG-dnxEHOhqACLcBGAs/s1600/MicrosoftIntroducesRealTimeAISystem.jpg)
Microsoft showed off Project Brainwave, an AI system that runs workloads in real time using Intel's 14nm Stratix 10 FPGA chip.
Microsoft's latest system, dubbed Project Brainwave, uses field-programmable gate arrays (FPGAs) from Intel to process artificial intelligence (AI) workloads in real time, a capability that is coming soon to the Redmond, Wash., software giant's cloud.
"By attaching high-performance FPGAs directly to the datacenter network, it can serve DNNs [deep neural networks] as hardware microservices, where a DNN can be mapped to a pool of remote FPGAs and called by a server with no software in the loop," Microsoft explained in a blog post announcing the system. "This system architecture both reduces latency, since the CPU does not need to process incoming requests, and allows very high throughput, with the FPGA processing requests as fast as the network can stream them."
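The core idea is that a calling server treats a pool of network-attached FPGAs as a single hardware microservice: it picks an endpoint and streams serialized requests straight to it, with none of the model code running on its own CPU. The sketch below illustrates that calling pattern in Python; the endpoint addresses, port, and wire format are invented for illustration only, since the article does not describe Brainwave's actual protocol.

```python
# Illustrative sketch only: endpoints, port, and message framing here are
# hypothetical; the article does not specify Brainwave's wire protocol.
import itertools
import socket
import struct


class FpgaPool:
    """A pool of remote FPGA endpoints that together serve one DNN.

    Mirrors the idea in the article: a DNN is mapped to a pool of FPGAs
    attached directly to the datacenter network, and a calling server
    simply picks an endpoint and streams requests to it.
    """

    def __init__(self, endpoints):
        self._round_robin = itertools.cycle(list(endpoints))

    def next_endpoint(self):
        # Simple round-robin selection; a real scheduler would also
        # account for load and failures.
        return next(self._round_robin)


def _recv_exact(conn, n):
    # Read exactly n bytes from the socket.
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed early")
        buf += chunk
    return buf


def infer(pool, features):
    """Send one inference request directly to an FPGA endpoint.

    No model code runs on the calling CPU: the request is serialized,
    streamed over the network, and the raw result bytes come back.
    """
    host, port = pool.next_endpoint()
    payload = struct.pack(f"{len(features)}f", *features)
    with socket.create_connection((host, port)) as conn:
        conn.sendall(struct.pack("I", len(payload)) + payload)
        length = struct.unpack("I", _recv_exact(conn, 4))[0]
        return _recv_exact(conn, length)


# Hypothetical pool of three FPGA endpoints serving one DNN.
pool = FpgaPool([("10.0.0.11", 9000), ("10.0.0.12", 9000), ("10.0.0.13", 9000)])
# result = infer(pool, [0.1, 0.5, 0.9])  # requires reachable endpoints
```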
Project Brainwave also features a so-called "soft" DNN processing unit (DPU) that exploits the flexibility of commercially available FPGAs to match or surpass the performance of hard-coded DPUs.
Project Brainwave supports Microsoft's own deep learning framework, the Microsoft Cognitive Toolkit, along with Google's TensorFlow. Microsoft plans to extend support to many other frameworks.
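To give a sense of what framework support means in practice, the snippet below defines a small DNN in TensorFlow, one of the supported frameworks, and exports it in the standard SavedModel format. Only stock TensorFlow calls are used; the step that would compile such a model for Brainwave's FPGAs is not described in the article and is left out.

```python
# A minimal TensorFlow sketch of the kind of DNN a framework user would hand
# off to an accelerator toolchain. Only standard TensorFlow APIs appear here;
# any Brainwave-specific compile/deploy step is not covered by the article.
import tensorflow as tf

# A small feed-forward network: 784 inputs -> 256 hidden units -> 10 classes.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# After training, export the graph in the standard SavedModel format; an
# accelerator toolchain would typically start from an artifact like this.
tf.saved_model.save(model, "dnn_savedmodel")
```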
Microsoft Azure customers will soon be able to run their AI workloads using the Project Brainwave system. Users of the company's other services, including Bing, will indirectly benefit from the performance gains the technology delivers.
Alibaba, too, has high hopes for FPGAs in cloud data centers. In March, the Chinese web services provider announced that it had teamed with Intel to launch a cloud-based workload acceleration service that uses Intel Arria 10 FPGAs.
More information on Project Brainwave, including the record-setting results of tests on the system, is available in this blog post.
Credit: eWEEK