[This article is prerelease documentation and is subject to change.]
These frequently asked questions (FAQ) describe the AI impact of Copilot’s summarize feature in Business Central.
Important
- This is a production-ready preview feature.
- Production-ready previews are subject to supplemental terms of use.
What is summarize?
Business Central users often need an overview of their data and of what urgently requires their attention so that they can decide how best to proceed with their tasks.
Microsoft Copilot is an AI-powered assistant that sparks creativity, boosts productivity, and eliminates tedious tasks. Copilot uses AI to generate an insightful summary of any record, making it effortless for people to learn what is important or urgent.
What are the capabilities of summarize?
A concise summary is displayed as a FactBox on most card and document pages in Business Central. It typically provides the top two or three insightful points as brief sentences.
From the initial summary, users can select Show more to get more information about the record. The Copilot pane opens and generates additional points of interest.
When any summary is displayed, Copilot makes it easy to review and learn about the facts referenced by the summary. These references are hyperlinked, so that users can quickly view or navigate to the source where they can explore details and take action.
While the summary text is AI-generated, each insight is grounded in factual data from Business Central. Copilot inherits the user's data permissions and can’t read any more data than the user already has access to.
What is the intended use of summarize?
This feature helps people understand their business data, reduces the time it takes to sift through Business Central’s rich dataset, and makes Business Central easier to use. It isn't designed to make decisions on your behalf, provide recommendations, or advise on how to act or optimize your business.
Summaries don't automatically trigger actions on behalf of the user or the organization, and they don't persist to the database for other automated functions to use.
How was summarize evaluated? What metrics are used to measure performance?
This feature is built in accordance with Microsoft's Responsible AI Standard. Learn more about responsible AI from Microsoft in Empowering responsible AI practices.
The feature underwent extensive AI testing using Business Central's demonstration data, supplemented with additional fictitious business data and large data volumes. Testing primarily covered various fields and pages from Business Central’s base application, with some testing also carried out on custom pages and fields. Copilot’s output was evaluated for accuracy of ranked insights, use of language, grounding of values in database data, and other metrics.
To ensure customer safety and data protection, this feature underwent rigorous testing to detect and deflect harmful content, jailbreaks, and other risks.
How does Microsoft monitor the quality of generated content?
Microsoft has various automated systems to ensure that output from Copilot is of the highest quality. Automated systems also detect abuse and ensure safety for our customers and their data by filtering harmful content.
Microsoft might turn off Copilot features for specific customers if abuse is detected.
Users can provide feedback on every Copilot response and report inaccurate or inappropriate content to help Microsoft improve this feature. If you encounter inappropriate content, report it to Microsoft using this feedback form: Report abuse. We analyze user feedback and use it to improve responses.
You provide feedback by using the like (thumbs up) or dislike (thumbs down) icons that are displayed alongside generated content.
What are the AI limitations of summarize? How can users minimize the impact of the limitations when using the system?
General AI limitations
AI systems are valuable tools, but they're nondeterministic. The content they generate might not be accurate. It's important to use your judgment to review and verify responses before making decisions that could affect stakeholders like customers and partners.
Geographic and language availability
This Copilot feature is validated and supported in specific languages. While it can be used in other languages, it might not function as intended. Language quality might vary based on the user's interaction or system settings, which might impact accuracy and the user experience. Learn more about geographic and language availability at Copilot international availability.
Certain industry, product, and subject limitations
Organizations that operate in certain business domains, such as the medical, pharmaceutical, legal, and weapons domains, might experience lower quality or limited output from Copilot because of the sensitive nature of those domains.
Summaries about people
Summaries about people, such as customers, vendors, or employees in Business Central, might result in limited output from Copilot because of other safety mechanisms designed to reduce the risk of inaccuracies. Even though Copilot isn't designed to provide recommendations or conclusions about people, you should use your judgment to review and verify responses before making decisions.
What data does Microsoft collect and how is it used?
The Microsoft Privacy Statement applies to Dynamics 365 Business Central and other Microsoft products and services. The information below provides additional transparency about what data is collected and how it is used when you access Microsoft’s Copilot and agent features in Business Central.
Diagnostic data: All prompts (inputs) and responses (outputs) are stored securely in Business Central for 20 days to help Microsoft respond to support requests. This diagnostic data is stored alongside your Business Central company data within the same geographic and compliance boundary, and it doesn't count against database quotas. Prompts and responses might include company data and personal data. The data can only be accessed by Microsoft personnel as part of a customer support request. Organizations that implement Customer Lockbox policies receive a data access request for their approval before Microsoft can access this data as part of a support request. Learn more about lockbox in Business Central security.
Microsoft doesn't use your company data, prompts, or responses to train AI models. Learn more in Dynamics 365 terms for Azure OpenAI-powered features.
Usage data: Microsoft collects the minimum data required to operate and improve the Business Central service, including anonymized data about your use of AI. Usage data doesn't include prompts (inputs) or responses (outputs), and it doesn't include customer data or personal data.
Feedback: Microsoft collects feedback that users provide using the like (thumbs up) or dislike (thumbs down) icons in the UI. When you give feedback, we record whether you liked or disliked something, the dislike reason (if you share one), and which AI feature your feedback is about.
We don't automatically collect your prompt (input) or the AI response (output) when you give feedback.
Related information
Summarize records with Copilot
FAQ for Copilot data security and privacy
Azure OpenAI Service and Business Central data
Copilot data movement across geographies