Microsoft sheds more light on Dynamics 365 Copilot's security and privacy guardrails

May 22 2023


Source: Microsoft

Microsoft's vision for AI-driven Copilot tools in business applications will rely on computing services like the large language models (LLMs) available through the Azure OpenAI Service. But these copilots will also require businesses to apply those LLMs to their critical business data. Organizations store, create, and process huge amounts of information every day, and if Microsoft and other vendors execute successfully on their plans, virtually all of it could become input to an AI-based application or service in the future.

With the reveal of Dynamics 365 Copilot, Microsoft now has plans for a collection of AI-based tools that will work in the context of Dynamics 365 applications. The options will be limited at first to pre-built tools: suggested responses to customer emails, drafts of new product descriptions based on existing item attributes, and supply chain alerts based on publicly available weather and news events, for example. But as more data becomes available to LLMs, the risk rises as well. If an AI model in a public cloud can operate on an organization's data, how much can that resource really see, what can it do with the data it processes, and what does it all mean for an organization's security and regulatory posture?

Walter Sun, Microsoft's vice president of applied AI for the Business Applications and Platform group, offered answers to some of the fundamental questions about how Dynamics 365 Copilot works with an organization's data and security policies. His recent blog post explained some of the guardrails Microsoft has put in place and some of the concepts that Microsoft believes will allow customers to use copilots while maintaining their existing policies around secure access to data.

Sun explains the multi-stage process through which a Dynamics 365 app or Power App takes a business user's prompt, then uses grounding to ensure the relevance of the prompt based on the user's role-based access to the organization's Microsoft data sources, such as Microsoft Graph, Dataverse, and documents.
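To make the grounding concept concrete, here is a minimal, hypothetical sketch of how role-based grounding can work in principle: before any business data reaches an LLM, the retrieval step filters records by the requesting user's roles, so the model only ever sees data that user is already authorized to view. This is an illustration of the general pattern, not Microsoft's implementation; the names (`Record`, `ground_prompt`) and the role model are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Record:
    """A piece of business data with an access-control list of roles."""
    content: str
    allowed_roles: set

def ground_prompt(user_roles, question, records):
    """Build an LLM prompt grounded only in records the user may access.

    Records whose allowed_roles do not intersect the user's roles are
    filtered out before the prompt is assembled, so the model never
    receives data outside the user's existing permissions.
    """
    visible = [r.content for r in records
               if r.allowed_roles & set(user_roles)]
    context = "\n".join(visible)
    return f"Context:\n{context}\n\nQuestion: {question}"

# Example: a sales user sees CRM notes but not payroll data.
records = [
    Record("Q2 pipeline notes for Contoso account", {"sales", "manager"}),
    Record("Payroll figures for March", {"hr"}),
]
prompt = ground_prompt(["sales"], "Summarize the Contoso account.", records)
```

In this sketch the payroll record never enters the prompt for a sales user, which mirrors the guardrail Sun describes: the copilot's answers can only draw on data the user could already reach through their role-based permissions.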

About Jason Gumpert

As the editor, Jason oversees all editorial content on the site and at our events, as well as site management and strategy.

Prior to co-founding the site, Jason was a Principal Software Consultant at Parametric Technology Corporation (PTC), where he implemented solutions, trained customers, managed software development, and spent time in the pre-sales engineering organization. He has also held consulting positions at CSC Consulting and Monitor Group.
