[This article is prerelease documentation and is subject to change.]
These frequently asked questions (FAQ) describe the AI impact of the analysis assist feature in Business Central.
Important
- This is a production-ready preview feature.
- Production-ready previews are subject to supplemental terms of use.
What is analysis assist?
Analysis assist is a copilot that provides assistance for working with the data analysis mode in Business Central. The data analysis mode enables you to organize, aggregate, and summarize data on pages and queries to make it more suitable for analyzing and extracting meaningful insights. With analysis assist, you can automatically construct the view of the data you want to analyze by expressing your needs in simple, natural language, like "show vendors by location sorted by number of purchases." Analysis assist makes it easier to work with data without the need for complex technical skills.
What are the capabilities of analysis assist?
Analysis assist converts natural language instructions into a structured design for displaying data in the analysis mode, without creating, modifying, or updating customer business data itself.
What is the intended use of analysis assist?
Analysis assist helps create analysis tabs in the data analysis mode to present data in a manner that makes it easier for you to draw conclusions. However, it's important to note that analysis assist doesn't provide direct insights or conclusions about the data. It's a tool to help users organize and view their data. It's up to the user to extract actionable information, discover trends, and make informed decisions to drive business value.
How was analysis assist evaluated? What metrics are used to measure performance?
The feature underwent extensive testing based on Business Central's demonstration data and other fictitious business data. Copilot was given numerous prompts in the supported languages that covered a broad range of instructions and styles of expressing intent. The outcomes were evaluated for accuracy, relevance, and safety.
The feature is built in accordance with Microsoft's Responsible AI Standard. Learn more about responsible AI from Microsoft.
How does Microsoft monitor the quality of generated content?
Microsoft has various systems in place to ensure that content generated by Copilot is of the highest quality, to detect abuse, and to ensure the safety of our customers and their data.
Users have the opportunity to provide feedback on every Copilot response and to report inaccurate or inappropriate content to help Microsoft improve this feature.
If you encounter inappropriate generated content, report it to Microsoft by using this feedback form: Report abuse
We analyze user feedback on the feature and use it to help us improve responses.
You provide feedback by using the like (thumbs up) or dislike (thumbs down) icon on the Copilot pane in Business Central.
Microsoft might disable the Copilot features for selected customers if abuse of the functionality is detected.
What are the limitations of analysis assist? How can users minimize the impact of the analysis assist limitations when using the system?
General AI limitations:
AI systems are valuable tools, but they're nondeterministic. The content they generate might not be accurate. It's important to use your judgment to review and verify responses before making decisions that could affect stakeholders like customers and partners.
Geographic and language availability:
This Copilot feature is available in all supported Business Central countries/regions. However, the feature uses Microsoft Azure OpenAI Service, which is currently available for Business Central in some geographies. If your environment is located in a country/region where Azure OpenAI Service isn't available, administrators must allow data to move across geographies. Learn more at Copilot data movement across geographies.
This feature was validated and is supported in specific languages. While it can be used in other languages, it might not function as intended. Language quality can vary based on the user's interaction or system settings, which might affect accuracy and the user experience.
Learn more about geographic and language availability at Copilot international availability.
Certain industry, product, and subject limitations:
Organizations that operate in some business domains, such as medical, drugs, legal, and weapons, might experience lower quality of service.
What data does Microsoft collect and how is it used?
The Microsoft Privacy Statement applies to Dynamics 365 Business Central and other Microsoft products and services. The information below provides additional transparency about what data is collected and how it is used when you access Microsoft’s Copilot and agent features in Business Central.
Diagnostic data: All prompts (inputs) and responses (outputs) are stored securely in Business Central for a period of 20 days to facilitate responding to support requests. This diagnostic data is stored alongside your Business Central company data within the same geographic and compliance boundary. The data doesn't count against database quotas. Prompts and responses might include company data and personal data. The data can only be accessed by Microsoft personnel as part of a customer support request. Organizations implementing Customer Lockbox policies will receive a data access request for their approval before Microsoft can access this data as part of a support request. Learn more about lockbox in Business Central security.
Microsoft doesn't use your company data, prompts, or responses to train AI models. Learn more in Dynamics 365 terms for Azure OpenAI-powered features.
Usage data: Microsoft collects the minimum data required to operate and improve the Business Central service, including anonymized data about your use of AI. Usage data doesn't include prompts (inputs) or responses (outputs), and it doesn't include customer data or personal data.
Feedback: Microsoft collects feedback that users provide using the like (thumbs up) or dislike (thumbs down) icons in the UI. When you give feedback, we record whether you liked or disliked something, the dislike reason (if you share one), and which AI feature your feedback is about.
We don't automatically collect your prompt (input) or the AI response (output) when you give feedback.
Related information
Analyze data with Copilot (preview)
Learn more about Copilot data movement across geographies