Azure Updates: LLMOps; FOCUS; Data Manager for Energy
Microsoft executives emphasized different aspects of the company’s AI vision in a series of recent blog posts. CVP John Montgomery discussed the opportunity to create repeatable value with LLMOps. Large language model operations, or LLMOps for short, builds on capabilities like Azure AI prompt flow, announced at this year’s Build event. In September, Microsoft released a public preview of a code-first prompt flow experience via the Azure AI SDK, VS Code, and command-line interface (CLI).
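To give a sense of what that code-first experience can look like, the sketch below defines a single prompt flow node as a Python function. It assumes the open-source promptflow package; the function name, parameters, and logic are hypothetical and only illustrate the pattern, not Microsoft's documented samples.

```python
# A minimal, illustrative prompt flow node (assumes the promptflow package;
# the function and parameter names here are hypothetical).
from promptflow import tool


@tool
def format_answer(answer: str, max_chars: int = 500) -> str:
    """Trim and tidy a raw LLM answer before the flow returns it."""
    answer = answer.strip()
    if len(answer) > max_chars:
        answer = answer[:max_chars].rstrip() + "..."
    return answer
```

A node like this would be wired into a flow definition and, under the code-first workflow, tested locally from the CLI (for example, with `pf flow test`) before being promoted further.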
“Unlike the traditional ML models which often have more predictable output, the LLMs can be non-deterministic, which forces us to adopt a different way to work with them. A data scientist today might be used to control the training and testing data, setting weights, using tools like the responsible AI dashboard in Azure Machine Learning to identify biases, and monitoring the model in production,” Montgomery wrote.
According to Montgomery, setting up LLMOps involves three phases. In the startup phase, a customer might create an AI Search index over its data and ground a model such as GPT-4 on that information. Next comes refinement, in which the customer iterates on and refines a flow. Finally comes production, in which Azure AI is used to monitor performance.
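As a rough illustration of that startup phase, the following sketch grounds a GPT-4 chat completion on documents retrieved from an Azure AI Search index. It assumes the azure-search-documents and openai Python packages; the endpoint, keys, index name, field name, and deployment name are placeholders, not values from Montgomery's post.

```python
# Illustrative retrieval-augmented call: fetch context from an Azure AI Search
# index, then pass it to a GPT-4 deployment via Azure OpenAI.
# Endpoints, keys, index and deployment names below are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint="https://<search-service>.search.windows.net",
    index_name="<index-name>",
    credential=AzureKeyCredential("<search-key>"),
)
llm = AzureOpenAI(
    azure_endpoint="https://<aoai-resource>.openai.azure.com",
    api_key="<aoai-key>",
    api_version="2023-12-01-preview",
)

question = "What does our travel policy say about rail bookings?"

# Startup phase: ground the model on the top search hits.
hits = search_client.search(search_text=question, top=3)
context = "\n\n".join(doc["content"] for doc in hits)  # assumes a "content" field

response = llm.chat.completions.create(
    model="<gpt-4-deployment>",  # name of the GPT-4 deployment
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)
```

In the refinement phase this single call would typically grow into a flow that is iterated on and evaluated, with Azure AI monitoring its performance once it reaches production.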
Senior product marketing manager Andy Beatman explored the value of AI input prompts, which help convey a user’s intent and expectations to the AI. Explaining how to get value from prompts, Beatman advised customers to start with a simple, open-ended question, set boundaries, provide context, and iterate.
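A hedged sketch of how those guidelines might look in practice: the message list below starts from an open-ended question and uses the system message to set boundaries and provide context. The wording and the "Contoso" scenario are hypothetical, not drawn from Beatman's post.

```python
# Illustrative chat messages applying the prompting tips above
# (open-ended question, boundaries, context); the content is hypothetical.
messages = [
    {
        "role": "system",
        "content": (
            "You are a support assistant for Contoso's HR portal. "      # context
            "Answer in three sentences or fewer, cite only the employee "  # boundaries
            "handbook, and say 'I don't know' if the handbook is silent."
        ),
    },
    # Start simple and open-ended, then iterate on the wording based on results.
    {"role": "user", "content": "How do I request parental leave?"},
]
```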