Azure Updates: VMs for AI; Innovate anywhere; Databases
Microsoft senior program manager Sherry Wang profiled the NC A100 v4 VMs for AI, which are now generally available. Built on NVIDIA A100 80GB PCIe Tensor Core GPUs and 3rd Gen AMD EPYC processors, the new VMs are intended to accelerate AI training and inference workloads, particularly for scenarios such as object detection, autonomous driving, image classification, and speech recognition. Wang wrote:
For AI training workloads, customers will experience between 1.5 to 3.5 times the performance boost compared to the previous NC generation (NCv3) with NVIDIA Volta architecture-based GPUs. Similar performance applies to AI Inference workloads. Moreover, customers will experience between 1.5 to five times performance boost with seven independent GPU instances on a single A100 GPU through the multi-instance GPU (MIG) feature. Customers will experience increased performance gains with smaller batch sizes.
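To give a sense of how the new series fits into an existing workflow, here is a minimal provisioning sketch using the Azure SDK for Python. It is not Microsoft's own example: the subscription ID, resource group, network interface, SSH key, and region are placeholders, the network interface is assumed to already exist, and regional availability of the NC A100 v4 sizes varies.

```python
# Minimal sketch: create a single NC A100 v4 VM with the Azure SDK for Python.
# Subscription, resource group, NIC, and SSH key below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"          # placeholder
RESOURCE_GROUP = "ai-training-rg"              # hypothetical resource group
NIC_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/ai-training-rg"
    "/providers/Microsoft.Network/networkInterfaces/a100-nic"
)                                              # assumed pre-created network interface

compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

poller = compute.virtual_machines.begin_create_or_update(
    RESOURCE_GROUP,
    "nc-a100-v4-vm",
    {
        "location": "eastus",  # pick a region where the series is offered
        # Smallest size in the NC A100 v4 series (one A100 80GB PCIe GPU)
        "hardware_profile": {"vm_size": "Standard_NC24ads_A100_v4"},
        "storage_profile": {
            "image_reference": {
                "publisher": "Canonical",
                "offer": "0001-com-ubuntu-server-jammy",
                "sku": "22_04-lts-gen2",
                "version": "latest",
            }
        },
        "os_profile": {
            "computer_name": "nc-a100-v4-vm",
            "admin_username": "azureuser",
            "linux_configuration": {
                "disable_password_authentication": True,
                "ssh": {
                    "public_keys": [{
                        "path": "/home/azureuser/.ssh/authorized_keys",
                        "key_data": "<ssh-public-key>",  # placeholder
                    }]
                },
            },
        },
        "network_profile": {"network_interfaces": [{"id": NIC_ID}]},
    },
)
vm = poller.result()
print(f"Provisioned {vm.name} ({vm.hardware_profile.vm_size})")
```

Note that the seven-way MIG partitioning Wang describes is configured inside the guest with NVIDIA's tooling rather than through the Azure provisioning call shown here.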
Peter Carlin, CVP for Azure Database Services, discussed the theme of "innovate anywhere" with Azure Arc. Microsoft has added automatic feature updates, multi-layered security with Transparent Data Encryption and Always Encrypted, elastic scale, and a more streamlined DevOps experience. Microsoft is now adding a Business Critical tier for Arc-enabled data services, failover within the same Kubernetes cluster, and feature parity with SQL Server Enterprise Edition.
Making the case for Azure databases, Carlin cited a recent ESG study in which customers who migrated their data from on-premises systems to Azure VMs reduced their costs by up to 47 percent. He also shared study data suggesting that Azure VMs are priced more competitively than comparable AWS offerings.