Microsoft Azure CTO Mark Russinovich has revealed major advances in Microsoft's hyperscale deployment of Intel field programmable gate arrays (FPGAs). These advances have enabled the industry's fastest public cloud network, along with new technology for accelerating deep neural networks (DNNs), which replicate "thinking" in a way that is conceptually similar to how the human brain works.
These advances deliver flexibility and scale by using ultra-low-latency networking built on the world's largest cloud investment in FPGAs. The higher networking speeds will help businesses, governments, healthcare organizations, and universities process Big Data workloads more effectively. Moreover, Azure's FPGA-based Accelerated Networking reduces inter-virtual-machine latency by up to 10 times while freeing Intel Xeon processors for other tasks.
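For readers who want to try the feature described above, Accelerated Networking can be enabled on an Azure network interface through the Azure CLI. This is a minimal sketch using the real `--accelerated-networking` flag of `az network nic create`; the resource group, VNet, subnet, and NIC names are placeholders, and the command assumes you are already logged in with `az login` and are using a VM size that supports the feature:

```shell
#!/bin/sh
# Sketch: create a NIC with Accelerated Networking enabled (SR-IOV to the VM),
# then attach it when creating a VM. Names below are illustrative placeholders.
az network nic create \
  --resource-group myResourceGroup \
  --name myAccelNic \
  --vnet-name myVnet \
  --subnet mySubnet \
  --accelerated-networking true

# Attach the NIC to a VM of a size that supports Accelerated Networking
az vm create \
  --resource-group myResourceGroup \
  --name myVm \
  --image Ubuntu2204 \
  --size Standard_D4s_v3 \
  --nics myAccelNic
```

Because Accelerated Networking bypasses the host's virtual switch for the data path, enabling it at NIC creation time is the straightforward route; the flag cannot be applied to every VM size, so checking the supported sizes for your region first avoids a failed deployment.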
Microsoft Outlines Advances in the Hyperscale Deployment of Intel FPGAs (Image Courtesy: Intel)
In addition, Russinovich outlined a new cloud acceleration framework called Hardware Microservices, whose underlying infrastructure is built on Intel FPGAs. The technology allows accelerated computing services such as deep neural networks to run in the cloud with no software in the loop, yielding large gains in speed and efficiency.
“Application and server acceleration requires more processing power today to handle large and diverse workloads, as well as a careful blending of low power and high performance—or performance per watt, which FPGAs are known for,” said Dan McNamara, Corporate Vice President and General Manager, Programmable Solutions Group, Intel. “Whether used to solve an important business problem, or decode a genomics sequence to help cure a disease, this kind of computing in the cloud, enabled by Microsoft with help from Intel FPGAs, provides a large benefit.”