Azure Updates: Phi-3; VMs; AI; Latin America market changes
Microsoft hosted its annual Build conference in Seattle, unveiling new details about its products and services. CVP Misha Bilenko announced new Phi-3 models: the previously announced Phi-3-small and Phi-3-medium models are now available on Azure, alongside Phi-3-mini. All three models are also available on Hugging Face and are optimized to run across a variety of infrastructure, including ONNX Runtime, DirectML, NVIDIA NIM microservices, and Intel accelerators.
“Phi-3 models are the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and next size up across a variety of language, reasoning, coding, and math benchmarks. They are trained using high quality training data, as explained in Tiny but mighty: The Phi-3 small language models with big potential. The availability of Phi-3 models expands the selection of high-quality models for Azure customers, offering more practical choices as they compose and build generative AI applications,” stated Bilenko.
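For readers wanting to experiment, a Phi-3 model deployed on Azure is typically called through an OpenAI-style chat-completions request. The sketch below builds such a request body; the endpoint URL, deployment key handling, and parameter values are illustrative placeholders, not details from the announcement. Sending the payload would be a single authenticated HTTP POST to your own deployment.

```python
import json

# Hypothetical endpoint: substitute the URL of your own Phi-3 deployment.
ENDPOINT = "https://<your-resource>.inference.ai.azure.com/v1/chat/completions"

def build_chat_request(user_prompt: str, max_tokens: int = 256) -> str:
    """Build a chat-completions payload of the shape commonly accepted
    by hosted Phi-3 deployments (parameter values are examples only)."""
    payload = {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_prompt},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }
    return json.dumps(payload)

# Construct (but do not send) a sample request body.
body = build_chat_request("Summarize the Phi-3 model family in one sentence.")
print(body)
```

The same messages/max_tokens shape is what most SDKs serialize under the hood, so swapping in an official client later requires no change to the prompt structure.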