Microsoft Security Copilot Frequently Asked Questions

General information

What is Microsoft Security Copilot?

Microsoft Security Copilot is a generative AI-powered assistant for security and IT. It provides tailored insights and recommendations using global threat intelligence, industry best practices, and organizations’ data from Microsoft and partner tools. Teams can also automate workflows with autonomous Security Copilot agents, accelerating responses, prioritizing risks, and reducing manual workloads - all while remaining firmly in control.

What are the use cases and capabilities that Security Copilot unlocks for customers?

Security Copilot focuses on making the following use cases easy to accomplish.

  • Investigating and remediating security threats
  • Building KQL queries and analyzing suspicious scripts
  • Understanding risks and managing your organization’s security posture
  • Troubleshooting IT issues faster
  • Defining and managing security policies
  • Configuring secure lifecycle workflows
  • Developing reports for stakeholders
  • Automating tasks with autonomous agents

Visit the Security Copilot adoption hub to delve deeper into how Security Copilot benefits roles like CISOs, threat intelligence analysts, IT admins, data security admins, and more.

Does Microsoft Security Copilot work with other Microsoft products?

Yes. Security Copilot is integrated with and embedded in other Microsoft Security products. These products include, but aren't limited to:

  • Azure Firewall
  • Microsoft Defender Attack Surface Management
  • Microsoft Defender for Cloud
  • Microsoft Defender Threat Intelligence
  • Microsoft Defender XDR
  • Microsoft Intune
  • Microsoft Purview
  • Microsoft Sentinel

Security Copilot can access data from these products and provide generative AI assistance and agentic automation to increase the effectiveness and efficiency of security professionals using those solutions.

Does Security Copilot include access to Microsoft Defender Threat Intelligence (Defender TI)?

Yes*. When prompted, Security Copilot reasons over all content and data in Microsoft Defender Threat Intelligence (Defender TI) to return crucial context around activity groups, tooling, and vulnerabilities. Customers also have tenant-level Defender TI premium workbench access, enabling them to access Defender TI's full breadth of intelligence - Intel profiles, threat analysis, internet data sets, and more - to do a deeper dive into the content surfaced in Security Copilot.

*This access doesn't include the Defender TI API, which remains separately licensed.

Who are the intended users of Security Copilot?

SOC analysts, compliance analysts, IT admins, data security admins, identity admins, and CISOs are some of the intended users of Security Copilot. Visit the Security Copilot adoption hub to learn about the key scenarios.

What languages are supported?

Security Copilot supports multiple languages. The model is available in eight languages* and the user experience is available in 25 languages.** 

*Model: English, German, Japanese, Spanish, French, Italian, Portuguese, and Chinese

**UX: The above languages plus Korean, Dutch, Swedish, Polish, Norwegian, Turkish, Danish, Finnish, and more.

For more information, see Supported languages.

What is the difference between Security Copilot and a generic LLM?

A generic LLM lacks key benefits such as:

  • Real-time signal processing with structured log data, to detect and analyze threats as they happen rather than after the fact.
  • Investigative reasoning to trace the source and impact of incidents. AI needs to pivot across many data points, just as human analysts would, and adjust conclusions as new information emerges.
  • Precision with evidence to ensure that the AI is accurate and backed by reliable data.
  • Persistent data collection to maintain a continuous, comprehensive view.

Without these capabilities and a deep understanding of your specific security environment, a generic model simply can’t provide the depth of analysis required.

Microsoft is uniquely positioned to tackle the challenges of using AI in security. Our solution uses the most advanced models available and builds on:

  • Hyperscale infrastructure, providing the scalability needed to process massive volumes of data in real time.
  • A security-specific orchestrator that streamlines, contextualizes, and coordinates responses across tools and teams.
  • Seamless integration through plugins, connecting your existing systems and extending what our solution can do.
  • Evergreen threat intelligence, informed by 84 trillion daily signals, for unmatched visibility across your whole threat landscape.
  • Built-in cyber skills training and promptbooks to empower teams with the knowledge and resources they need.

By building on all these components, we provide an AI security solution that understands your security environment, delivers insights in natural language, and handles sensitive data securely.

Purchasing and billing information

Are there any prerequisites to purchase?

An Azure subscription and Microsoft Entra ID (formerly known as Azure Active Directory) are prerequisites for using Security Copilot; there are no other product prerequisites. For more information, see Get started with Security Copilot.

Is deployed Microsoft Entra ID (formerly known as Azure Active Directory) a requirement for Security Copilot?

Yes. Security Copilot is a SaaS application and requires Microsoft Entra ID to authenticate the users who have access.

How is Security Copilot priced?

Security Copilot is priced based on Security Compute Units (SCUs).

  • Provisioned SCUs support regular workloads and are billed monthly.
  • Overage SCUs offer flexible, on-demand capacity and are billed only when used.

Use the in-product dashboard to monitor SCU usage and adjust capacity as needed.

For more information, see Microsoft Security Copilot - Pricing.

Does Security Copilot support tenant or subscription transfers?

No. At this time, Security Copilot doesn't support moving Security Copilot resources across Microsoft Entra tenants or across subscriptions.

Can SCUs be shared between a tenant’s Security Copilot Workspaces?

SCUs, whether provisioned or overage, can't be shared between Workspaces. For example, if an organization has Workspace A (1 provisioned, 3 overage) and Workspace B (2 provisioned, 5 overage), and Workspace A exhausts its total of 4 SCUs, it can't use Workspace B’s SCUs to avoid throttling. Similarly, Workspace B can't use Workspace A’s SCU capacity if it exhausts its own SCUs.
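
To make the isolation rule concrete, here's a minimal Python sketch of the per-Workspace capacity model described above; the Workspace class and its methods are illustrative assumptions, not part of any Security Copilot API.

```python
from dataclasses import dataclass

@dataclass
class Workspace:
    """Illustrative model of a Workspace's SCU pool (not a real API)."""
    name: str
    provisioned: int  # always-on SCUs
    overage: int      # on-demand SCUs, billed only when used

    def total_capacity(self) -> int:
        # A Workspace can draw only on its own provisioned + overage SCUs.
        return self.provisioned + self.overage

    def can_serve(self, requested_scus: int) -> bool:
        return requested_scus <= self.total_capacity()

# The example from the FAQ: SCU pools are strictly per-Workspace.
workspace_a = Workspace("A", provisioned=1, overage=3)
workspace_b = Workspace("B", provisioned=2, overage=5)

print(workspace_a.can_serve(4))  # True: within its own 4-SCU total
print(workspace_a.can_serve(5))  # False: B's spare SCUs can't be borrowed
```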

How can I estimate SCU provisioning and budget for Security Copilot?

Estimating SCU needs and budgeting depends on how your organization uses generative AI across Microsoft Security products. Use the SCU capacity calculator to get a starting point based on the number of users and workloads across Microsoft Defender, Microsoft Intune, Microsoft Purview, Microsoft Entra, and the Security Copilot standalone experience. Because every prompt and workflow varies in complexity, SCU consumption isn't fixed. Start small, experiment, and refine based on your actual usage. Track usage in real time with the in-product dashboard, which helps you monitor SCU consumption and adjust capacity as needed.

How do I interpret the results on the SCU capacity calculator?

The results show the maximum expected number of SCUs per hour, based on the number of monthly users per experience and the inclusion of automation through Logic Apps and Promptbooks. In addition to the maximum SCUs per hour, the calculator also displays all possible combinations of provisioned and overage SCUs, along with the corresponding monthly cost range. The highest monthly cost is calculated based on continuous 24/7 capacity usage.
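
As a rough sketch of that upper-bound arithmetic, the snippet below derives the highest monthly cost from the calculator's maximum SCUs per hour. The hourly rate is a placeholder assumption, not the actual price; see the pricing page for real figures.

```python
# Placeholder rate for illustration only; check the official pricing page.
PRICE_PER_SCU_HOUR = 4.00  # hypothetical USD per SCU per hour
HOURS_PER_MONTH = 730      # average hours in a month (24 * 365 / 12)

def highest_monthly_cost(max_scus_per_hour: int) -> float:
    """Upper bound: the peak hourly SCU estimate running 24/7 all month."""
    return max_scus_per_hour * HOURS_PER_MONTH * PRICE_PER_SCU_HOUR

print(f"${highest_monthly_cost(3):,.2f}")  # e.g. 3 SCUs/hour -> $8,760.00
```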

What if my needs aren't covered by the SCU capacity calculator (for example, I have more than 50 Intune users)?

Feel free to Contact us or reach out to your Microsoft account manager for more information.

What are the minimum provisioning requirements for SCUs in Workspaces?

To use Microsoft Security Copilot, each tenant must provision a minimum of one SCU. The one provisioned SCU enables access to at least one Workspace.

  • Provisioned SCUs are required to activate capacity and are billed hourly.
  • Overage SCUs can be set from 0 to 999 and are billed only when used.

Technical and product information

What partner tools are integrated with Security Copilot?

Microsoft Security Copilot supports many plugins, including Microsoft and non-Microsoft plugins. For more information, see Plugins.

Note

Products that integrate with Security Copilot need to be purchased separately.

Does Security Copilot make recommendations for IoT/OT scenarios?

No, Security Copilot doesn't currently support IoT/OT.

Does Security Copilot offer dashboarding?

Security Copilot offers an in-product usage dashboard where customers can dive deep into their SCU consumption.

Can Security Copilot execute workflows - from triaging, to using pinned messages, to governing how an incident should be labeled and whether it should be closed?

Security Copilot provides Promptbooks: ready-to-use workflows that serve as templates for automating repetitive steps, for instance in incident response or investigations. Security Copilot also offers connectors, which wrap the API so that developers and users can call the Microsoft Security Copilot platform to perform specialized tasks. For example, the Logic Apps connector allows you to call into Copilot from an Azure Logic Apps workflow. Similarly, the Copilot Studio connector enables you to access Security Copilot to perform actions like ‘submit a Security Copilot prompt’ or ‘fetch a Security Copilot prompt status’.

What role-based access control or delegation features does Security Copilot have? How are user permissions kept in Security Copilot aligned to user permission configurations in other solutions?

Copilot uses on-behalf-of authentication to access security-related data through active Microsoft plugins. Specific Security Copilot roles must be assigned in order for a group or individual to access the Security Copilot platform. For more information, see Understand authentication.
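
For readers unfamiliar with on-behalf-of (OBO) authentication, here's a minimal sketch of the general OBO token exchange using the MSAL Python library. The client ID, secret, and scope are placeholders; Security Copilot performs its own exchange internally, so this only illustrates the pattern.

```python
import msal  # pip install msal

# Placeholder values; in a real app these come from your app registration.
app = msal.ConfidentialClientApplication(
    client_id="<app-client-id>",
    client_credential="<app-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)

def exchange_token(incoming_user_token: str) -> dict:
    """Trade the signed-in user's token for one scoped to a downstream API,
    so data access is evaluated against that user's own permissions."""
    return app.acquire_token_on_behalf_of(
        user_assertion=incoming_user_token,
        scopes=["https://graph.microsoft.com/.default"],  # example scope
    )
```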

How is Security Copilot dealing with a "token limit"?

Large language models (LLMs), including GPT, have limits on how much information they can process at once. This limit is known as a “token limit”; a token roughly corresponds to three-quarters of a word (about 1.3 tokens per word). Security Copilot uses the latest GPT models from Azure OpenAI to ensure it can process as much information as possible in a single session. In some cases, large prompts, long sessions, or verbose plugin output can overflow the token space. When this happens, Security Copilot attempts to apply mitigations to ensure an output is always available, even if the content of that output isn’t optimal. Those mitigations aren’t always effective, and it might be necessary to stop processing the request and direct the user to try a different prompt or plugin.
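
As a back-of-the-envelope illustration of token budgeting (the tokens-per-word ratio and context size below are rough assumptions, not Security Copilot's actual limits):

```python
TOKENS_PER_WORD = 1.3    # rough heuristic; real tokenizers give exact counts
TOKEN_LIMIT = 128_000    # hypothetical context window; varies by model

def estimate_tokens(text: str) -> int:
    return int(len(text.split()) * TOKENS_PER_WORD)

def fits_in_context(prompt: str, plugin_output: str) -> bool:
    """Check whether a prompt plus verbose plugin output would overflow
    the token space, which is when mitigations would kick in."""
    return estimate_tokens(prompt) + estimate_tokens(plugin_output) <= TOKEN_LIMIT
```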

What are Security Copilot agents?

Microsoft Security Copilot agents enhance security and IT operations with autonomous and adaptive automation. Agents integrate seamlessly with Microsoft Security solutions and the third-party partner ecosystem to handle high-volume security tasks. Purpose-built for security, these agents learn from feedback, adapt to organizational workflows while keeping your team fully in control, and operate securely within Microsoft’s Zero Trust framework, accelerating responses, prioritizing risks, and driving efficiency. By reducing manual workloads, they enhance operational effectiveness and strengthen your organization’s overall security posture.

Visit the adoption hub to learn more about Security Copilot agents.

What compute resources do Security Copilot agents use?

Agents consume SCUs to operate, just like other features in Security Copilot. They integrate seamlessly with Microsoft Security solutions and the broader supported partner ecosystem.

Where can I find Security Copilot agents?

You can easily discover Microsoft Security Copilot agents from both the standalone and embedded experiences.

For the standalone experience, select Go to agents from the banner, or navigate to the agent library from the home menu. In the embedded experiences, you can see agents within the portal and explore their capabilities.

Visit the adoption hub to learn more.

What are connectors in Security Copilot?

The connectors in Security Copilot wrap the API, allowing developers and users to call the Microsoft Security Copilot platform to perform specialized tasks. Currently, Logic Apps and Copilot Studio connectors are supported. For more information, see Connectors.

How does the Copilot Studio connector work?

The Copilot Studio connector enables you to access Security Copilot while creating automation workflows. Using the Security Copilot connector, you can:

  • Submit a Security Copilot prompt - Submit a natural language prompt to create a new Security Copilot investigation. After completion, the evaluation result is returned to your workflow.
  • Fetch a Security Copilot prompt status - Check the status of a previously submitted Security Copilot evaluation. After completion, the evaluation result is returned to your workflow.

How does the Logic Apps connector work?

The Microsoft Security Copilot Logic Apps connector allows you to call into Copilot from an Azure Logic Apps workflow. The connector exposes two connector actions:

  • Submit a Security Copilot prompt - Submit a natural language prompt to create a new Security Copilot investigation. After completion, the evaluation result is returned to your workflow.
  • Submit a Security Copilot promptbook - Given a promptbook, invoke a new Security Copilot promptbook evaluation and return the output to your Azure Logic Apps workflow.
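
Both connectors wrap the same underlying submit-then-poll pattern. The sketch below illustrates that pattern in Python; the endpoint, field names, and helper functions are hypothetical placeholders, not the actual Security Copilot API surface.

```python
import time
import requests  # pip install requests

BASE = "https://example.contoso.com/security-copilot"  # placeholder URL

def submit_prompt(prompt: str, token: str) -> str:
    """Submit a natural language prompt; return a hypothetical evaluation ID."""
    resp = requests.post(
        f"{BASE}/evaluations",
        headers={"Authorization": f"Bearer {token}"},
        json={"prompt": prompt},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["evaluationId"]

def wait_for_result(evaluation_id: str, token: str, poll_seconds: int = 10) -> dict:
    """Poll the evaluation status until completion, then return the result."""
    while True:
        resp = requests.get(
            f"{BASE}/evaluations/{evaluation_id}",
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        body = resp.json()
        if body["status"] in ("completed", "failed"):
            return body
        time.sleep(poll_seconds)
```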

What is a Security Copilot Workspace?

In the context of Security Copilot, Workspaces help teams manage resources, optimize workflows, and maintain compliance with organizational policies. You can also configure them for the specific needs of teams and groups, including designating access, assigning capacity, configuring specific plugins, deploying agents, and adding promptbooks to tailor the experience to the unique requirements of each team or group.

Workspaces provide a flexible way to segment environments, making it easier to align access and capacity with organizational needs, legal structures, or compliance requirements. For more information, see Workspaces overview.

Data and privacy information

Is Customer Data used to train Azure OpenAI Service foundation models?

No, Customer Data isn’t used to train Azure OpenAI Service foundation models, and this commitment is documented in our Product Terms. For more information on data sharing in the context of Security Copilot, see Privacy and data security.

What is the GDPR Guidance for EU Markets?

Microsoft complies with all laws and regulations applicable to its provision of the Products and Services, including security breach notification laws and Data Protection Requirements (as defined in the Microsoft DPA). However, Microsoft isn’t responsible for compliance with any laws or regulations applicable to Customer or Customer’s industry that aren’t generally applicable to information technology service providers. Microsoft doesn’t determine whether Customer’s data includes information subject to any specific law or regulation. For more information, see the Microsoft Products and Services Data Protection Addendum (DPA).

Are US Government Cloud (GCC) customers eligible?

Currently, Security Copilot isn't designed for use by customers using US government clouds, including but not limited to GCC, GCC High, DoD, and Microsoft Azure Government. For more information, check with your Microsoft representative.

Are US and Canada health care customers eligible?

US and Canada health and life sciences (HLS) customers are eligible to purchase Security Copilot. Microsoft Security Copilot is now listed and covered by a Business Associate Agreement (“BAA”), which is important to healthcare providers who are subject to regulations under HIPAA. Additional information on compliance offerings currently covered for Microsoft Security Copilot can be found in the Service Trust Portal.

How do I export or delete data from Security Copilot?

You will need to contact support. For more information, see Contact support.

Where can I find more information on Data Protection and Privacy?

You can learn more at the Microsoft Trust Center.

The Azure OpenAI Service code of conduct includes “Responsible AI Mitigation Requirements”. How do those requirements apply to Security Copilot customers?

These requirements don’t apply to Security Copilot customers because Security Copilot implements these mitigations.

Why does Security Copilot transfer data to a Microsoft tenant?

Security Copilot is a SaaS (Software as a Service) offering that runs in the Azure production tenant. Users enter prompts, and Security Copilot provides responses based on insights sourced from other products such as Microsoft Defender XDR, Microsoft Sentinel, and Microsoft Intune. Security Copilot stores past prompts and responses for a user, which the user can access through the in-product experience. Data from a customer is logically isolated from the data of other customers. This data doesn't leave the Azure production tenant and is stored until customers ask to delete it or offboard from the product.

How is the transferred data secured in transit and at rest?

The data is encrypted both in transit and at rest as described in the Microsoft Products and Services Data Protection Addendum.

How is the transferred data protected from unauthorized access and what testing was done for this scenario?

By default, no human users have access to the database, and network access is restricted to the private network where the Security Copilot application is deployed. If a human needs access to respond to an incident, the on-call engineer needs elevated access and network access approved by authorized Microsoft employees.

Apart from regular feature testing, Microsoft also completed penetration testing. Microsoft Security Copilot complies with all Microsoft privacy, security, and compliance requirements.

In "My Sessions" when an individual session is deleted, what happens to the session data?

Session data is stored for runtime purposes (to operate the service) and also in logs. In the runtime database, when a session is deleted via the in-product UX, all data associated with that session is marked as deleted and the time to live (TTL) is set to 30 days. After that TTL expires, queries can't access that data, and a background process physically deletes it. In addition to the 'live' runtime database, there are periodic database backups. The backups age out; they have short-lived retention periods (currently set to four days).

Logs, which contain session data, aren't affected when a session is deleted via the in-product UX. These logs have a retention period of up to 90 days.
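
As an illustration of that lifecycle, here's a minimal sketch of soft deletion with a TTL. The class and method names are hypothetical, not Security Copilot internals.

```python
from datetime import datetime, timedelta, timezone

RUNTIME_TTL = timedelta(days=30)  # per the FAQ's runtime-database TTL

class SessionRecord:
    """Hypothetical model: in-product deletion marks the record, queries
    lose access once the TTL expires, and a background job then purges it."""

    def __init__(self, session_id: str):
        self.session_id = session_id
        self.deleted_at: datetime | None = None

    def soft_delete(self) -> None:
        """In-product deletion: mark the record and start the TTL clock."""
        self.deleted_at = datetime.now(timezone.utc)

    def ttl_expired(self, now: datetime) -> bool:
        return self.deleted_at is not None and now - self.deleted_at >= RUNTIME_TTL

    def is_queryable(self, now: datetime) -> bool:
        # Once the TTL expires, queries can no longer access the data.
        return not self.ttl_expired(now)

    def purge_due(self, now: datetime) -> bool:
        """A background process physically deletes data after the TTL."""
        return self.ttl_expired(now)
```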

What Product Terms apply to Security Copilot? Is Security Copilot a "Microsoft Generative AI Service" within the meaning of Microsoft's Product Terms?

The following Product Terms govern Security Copilot customers:

  • Universal License Terms for Online Services terms in the Product Terms, which include the Microsoft Generative AI Services terms and the Customer Copyright Commitment.

  • Privacy & Security Terms in the Microsoft Product Terms, which include the Data Protection Addendum.

Security Copilot is a Generative AI Service within the definition of the Product Terms. Additionally, Security Copilot is a "Covered Product" for purposes of the Customer Copyright Commitment. At this time, in the Product Terms there are no product-specific terms unique to Security Copilot.

In addition to the Product Terms, customers' agreements with Microsoft, such as MBSA/EA and MCA agreements, govern the parties' relationship. If a customer has specific questions about its agreements with Microsoft, engage the CE, the deal manager, or the local CELA supporting the deal.

The Microsoft Customer Copyright Commitment is a new commitment that extends Microsoft's existing intellectual property indemnity support to certain commercial Copilot services. The Customer Copyright Commitment applies to Security Copilot. If a third party sues a commercial customer for copyright infringement for using Microsoft's Copilots or the output they generate, Microsoft will defend the customer and pay the amount of any adverse judgments or settlements that result from the lawsuit, provided that the customer used the guardrails and content filters built into our products.

Can Security Copilot customers opt out of Azure OpenAI Service abuse monitoring? Does Security Copilot engage in any content filtering or abuse monitoring?

Azure OpenAI abuse monitoring is currently disabled service-wide for all customers.

Does Security Copilot make any location of data processing or data residency commitments?

For more information on Customer Data storage location and processing, see Privacy and data security.

Is Security Copilot a Microsoft EU Data Boundary service?

At the time of GA, all Microsoft Security Services are out of scope for EU data residency requirements and Security Copilot won't be listed as an EUDB service.

Where is EU customer data stored?

Security Copilot stores Customer Data and Personal Data, such as user prompts and Microsoft Entra Object IDs, in the tenant Geo. If a customer provisions their tenant in the EU and isn’t opted in to data sharing, all Customer Data and pseudonymized personal data are stored at rest within the EU. Processing of Customer Data and Personal Data prompts can occur in the designated Security GPU Geo. For more information on Security GPU geography selection, see Get started with Security Copilot. If a customer is opted in to data sharing, prompts can be stored outside of the EU Data Boundary. For more information on data sharing, see Privacy and data security.

Are customer prompts (such as input content from the customer) considered Customer Data within the terms of the DPA and the Product Terms?

Yes, customer prompts are considered Customer Data. Under the Product Terms, customer prompts are considered Inputs. Inputs are defined as "all Customer Data that Customer provides, designates, selects, or inputs for use by a generative artificial intelligence technology to generate or customize an output".

Is "Output Content" considered Customer Data within the terms of the DPA and the Product Terms?

Yes, Output Content is Customer Data under the Product Terms.

Is there a transparency note or transparency documentation for Security Copilot?

Yes, the Responsible AI transparency document can be found here: Responsible AI FAQ.

What are the Compliance Offerings for Microsoft Security Copilot?

Microsoft Security Copilot is dedicated to upholding the highest standards of security, privacy, and operational excellence, as demonstrated by its extensive array of industry certifications. These include ISO 27001 for information security management, ISO 27018 for the protection of personal data in the cloud, ISO 27017 for cloud-specific security controls, and ISO 27701 for privacy information management.

Additionally, Security Copilot holds certifications for ISO 20000-1 in IT service management, ISO 9001 in quality management, and ISO 22301 in business continuity management. It also complies with SOC2 requirements for security, availability, and confidentiality, underscoring our commitment to delivering secure and reliable services. For healthcare-related services, Security Copilot is certified under the HiTrust CSF framework, further enhancing its security and compliance stance, and is covered by HIPAA Business Associate Agreements (BAA), ensuring adherence to healthcare regulations and the protection of sensitive health information.

For more information on compliance offerings currently covered for Microsoft Security Copilot, see the Service Trust Portal.

Partner information

What are the use cases for Partners?  

Partners can provide signals or build complementary solutions around Security Copilot scenarios.

If a customer works with a managed security service provider (MSSP), can the MSSP use and manage Security Copilot on the customer's behalf?

Yes, MSSPs that provide SOC services for customers are able to access the customer's Security Copilot environment if the customer elects to provide access. Available options include:

  • Azure Lighthouse
  • B2B Collaboration / Guest Accounts
  • Granular Delegated Admin Privileges (GDAP)

There currently isn't a CSP or reseller multitenant model for MSSPs. Each customer is responsible for purchasing their own SCUs and setting up their MSSPs with the necessary access if the MSSP works within the customer’s tenant. If Azure Lighthouse delegated access is provided to the customer’s Microsoft Sentinel Workspaces, then the partner can use their partner capacity plan SCUs to perform prompting against the customer’s Microsoft Sentinel Workspace data.

If an MSSP prompts across a customer’s tenant, using Azure Lighthouse, will the MSSP use their own SCUs or the customer’s SCUs?

Azure Lighthouse allows partners to gain Security Copilot permissions for customers’ Microsoft Sentinel Workspaces and other supported Azure resources. The capacity plan (SCUs) used is the partner tenant's capacity plan.

Can MSSPs use a single instance of Security Copilot to manage multiple tenants?

Azure Lighthouse is supported for invoking Sentinel-based skills from the partner tenant against one customer's Microsoft Sentinel Workspace at a time, where the partner has been granted delegated access through Azure Lighthouse. The partner tenant uses its own SCUs to run Security Copilot-invoked Microsoft Sentinel skills against the customer's tenant, without the customer tenant needing to be provisioned for Security Copilot or to have SCUs of its own.

Are there third-party integrations available today?

Microsoft Security Copilot supports many plugins, including Microsoft and non-Microsoft plugins. For more information, see Non-Microsoft plugins overview. Additionally, there are agents from our partners that are available in Security Copilot. For more information, see Partner agents.

Note

Products that integrate with Security Copilot need to be purchased separately.

Is there a marketplace for the plugins or services? 

There isn't a plugin marketplace. ISVs can publish their solutions to GitHub. All partners are required to publish their solutions or managed services to the Microsoft Commercial Marketplace. For more information on publishing to the Microsoft Commercial Marketplace, see:

  • MSSP specific: Must have a security designation in the Microsoft AI Cloud Partner Program.

  • SaaS specific:

What if MSSPs aren't using Microsoft Defender XDR or Microsoft Sentinel? 

Microsoft Security Copilot doesn't have any specific Microsoft security product requirement for provisioning or use, since the solution is built on aggregating data sources from both Microsoft and third-party services. That said, there's significant value in having Microsoft Defender XDR and Microsoft Sentinel enabled as supported plugins for enriching investigations. Security Copilot only uses skills and accesses data from enabled plugins.

Does an MSSP's SOC solution need to be hosted on Azure?

Hosting the solution on Azure is recommended, but it's not required.

Is there a product roadmap that can be shared with Partners? 

Not at this time.