defender-xdr/responsible-ai-copilot-defender.md
5 additions & 7 deletions
@@ -18,7 +18,7 @@ ms.topic: concept-article
 search.appverid:
   - MOE150
   - MET150
-ms.date: 03/21/2025
+ms.date: 03/25/2025
 #customer intent: I want to learn about how Microsoft applies responsible AI principles to Microsoft Copilot in Microsoft Defender.
 ---
@@ -32,9 +32,9 @@ Responsible AI FAQs are part of a broader effort to put Microsoft's AI principle

 ## Responsible AI FAQs

-### What is Microsoft Copilot in Defender?
+### What is Microsoft Copilot in Microsoft Defender?

-Microsoft Copilot in Defender is a security solution that uses AI to help security analysts investigate and respond to threats. Copilot in Defender is designed to help security analysts work more efficiently and effectively by providing them with relevant information and recommendations.
+Microsoft Copilot in Defender is the integration of Security Copilot in the Microsoft Defender portal. It is a security solution that uses AI to help security analysts investigate and respond to threats. Copilot in Defender is designed to help security analysts work more efficiently and effectively by providing them with relevant information and recommendations.

 Copilot in Defender draws context from the data in the workloads that it monitors, and uses that context to provide recommendations to security analysts.
@@ -59,10 +59,6 @@ Now that it is released, user feedback is critical in helping Microsoft improve
59
59
60
60
### What are the limitations of Copilot in Defender? How can users minimize the impact of Copilot in Defender’s limitations when using the system?
61
61
62
-
- The Early Access Program is designed to give customers the opportunity to get early access to Copilot in Defender and provide feedback about the platform. Preview features aren’t meant for production use and might have limited functionality.
63
-
64
-
- Like any AI-powered technology, Copilot in Defender doesn’t get everything right. However, you can help improve its responses by providing your observations using the feedback tool, which is built into the platform.
65
-
66
62
- The system is designed to generate responses and respond to prompts related to the security domain like incident investigation and threat intelligence. Prompts outside the scope of security might result in responses that lack accuracy and comprehensiveness.
67
63
68
64
- Copilot in Defender might generate code or include code in responses, which could potentially expose sensitive information or vulnerabilities if not used carefully. Responses might appear to be valid but might not actually be semantically or syntactically correct or might not accurately reflect the intent of the developer. Users should always take the same precautions as they would with any code they write that uses material users didn't independently originate, including precautions to ensure its suitability. These include rigorous testing, IP scanning, and checking for security vulnerabilities.
@@ -73,6 +69,8 @@ Now that it is released, user feedback is critical in helping Microsoft improve

 - Use of the platform might be subject to usage limits or capacity throttling. Even with shorter prompts, generating responses and checking them before displaying them to the user can take time (up to several minutes) and require high GPU capacity.

+- Like any AI-powered technology, Copilot in Defender doesn’t get everything right. However, you can help improve its responses by providing your observations using the feedback tool, which is built into the platform.
+
 ### How is Microsoft approaching responsible AI for Copilot in Defender?

 At Microsoft, we take our commitment to responsible AI seriously. Security Copilot is being developed in accordance with our [AI principles](https://www.microsoft.com/ai/principles-and-approach). We're working with OpenAI to deliver an experience that encourages responsible use. For example, we have and will continue to collaborate with OpenAI on foundational model work. We have designed the Copilot in Defender user experience to keep humans at the center. We developed a safety system that is designed to mitigate failures and prevent misuse with things like harmful content annotation, operational monitoring, and other safeguards. The invite-only early access program is also a part of our approach to responsible AI. We're taking user feedback from those with early access to Security Copilot to improve the tool before making it broadly available.