This Cybersecurity Awareness Month, we are kicking off a blog series to answer some of the most common security questions we hear from our audience. Because we work across IT, cybersecurity, web development, and Microsoft business applications, security is naturally a big focus in everything we do. That means we spend a lot of time thinking about, researching, and discussing how new technologies affect your data.
To start the series, we are diving into one of the hottest topics right now: Microsoft Copilot. In a recent webinar about maximizing Microsoft 365 Copilot for everyday work and writing effective prompts, the top concern we heard was about data security. So, in this post, we are exploring some of the risks of using generative AI at work and clarifying the security differences between the free, public Copilot experience and the enterprise-licensed Microsoft 365 Copilot.
Free Microsoft Copilot: What you need to know
Microsoft Copilot is the public, free AI platform that answers questions, generates content, and creates images based on users’ prompts.
Key security points:
Data usage for AI training
Your voice and text interactions can be used to improve Copilot for the broader community.
Personal data protections
Microsoft scrubs identifiable information, like names, phone numbers, email addresses, and device IDs, before using it for model training.
Opting out
You can prevent your conversations from being used for AI model training through settings on copilot.microsoft.com, in the Windows app, and in the mobile app. Here’s how:
- On copilot.microsoft.com: Select your profile icon, select your profile name, then select Privacy > Model training on text/voice.
- In Copilot for Windows: Select your profile icon, then select Settings > Privacy > Model training on text/voice.
- In the Copilot mobile app: Open the menu, select your profile icon, then select Account > Privacy > Model training on text/voice.
- Learn more here.
Bottom line: While your personal data is generally protected, confidential company information or sensitive personal data shared with the free Copilot may be at risk if used for AI training.
Microsoft 365 Copilot: Enterprise security built in
Microsoft 365 Copilot is a licensed AI assistant integrated into apps like Word, Excel, Outlook, and PowerPoint. It draws from content in Microsoft Graph—your work emails, chats, and documents—to provide personalized responses.
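For readers who want to see what “content in Microsoft Graph” means in practice, here is a minimal Python sketch (ours, not an official Microsoft sample) that queries the same kind of work data Copilot grounds its answers in. It assumes you already have a delegated access token, for example one acquired with MSAL and the usual read scopes; the placeholder token and the specific endpoints are our choices. The takeaway: with delegated access, Graph, and by extension Copilot, only returns items the signed-in user can already open.

```python
# A minimal sketch (not an official Microsoft sample) of pulling the same kind
# of work content Microsoft 365 Copilot grounds its answers in. Assumes a
# delegated access token acquired elsewhere (e.g., with MSAL); the token value
# below is a hypothetical placeholder.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<delegated-access-token>"  # placeholder, not a real token
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Recently used files: Graph only returns items this user can already open.
recent = requests.get(f"{GRAPH}/me/drive/recent", headers=HEADERS, timeout=30)
recent.raise_for_status()
for item in recent.json().get("value", []):
    print("file:", item.get("name"))

# Subjects of the newest messages in the signed-in user's own mailbox.
mail = requests.get(
    f"{GRAPH}/me/messages?$select=subject&$top=5", headers=HEADERS, timeout=30
)
mail.raise_for_status()
for msg in mail.json().get("value", []):
    print("mail:", msg.get("subject"))
```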
Key advantages:
No model training with your data
Prompts, responses, and the business data Copilot accesses through Microsoft Graph are not used to train the underlying foundation models.
Enterprise-grade security
Microsoft 365 Copilot inherits your tenant’s existing security, compliance, and privacy commitments, and it only surfaces content the signed-in user already has permission to access.
Governance matters
Because Copilot respects existing permissions, oversharing and stale access in SharePoint, Teams, and OneDrive can expose sensitive content to the wrong people inside your organization.
Bottom line: Microsoft 365 Copilot is more secure for enterprises, but it’s only as safe as the underlying governance of your Microsoft 365 environment. Learn more about Microsoft 365 Copilot security.
Key risks of using generative AI at work
Even with enterprise AI, there are risks you should consider:
Accidental Data Exposure
Employees may paste sensitive information, like financials, customer records, or source code, into AI prompts. With the free Microsoft Copilot, that data leaves the organization; with the licensed Microsoft 365 Copilot, the risk is internal exposure if permissions aren’t configured correctly.
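To make the risk concrete, here is a small, purely illustrative Python sketch of a pre-prompt check that flags obvious sensitive patterns before text is pasted into a public AI tool. The pattern names and regular expressions are hypothetical examples; real data loss prevention tooling, such as Microsoft Purview, is far more capable.

```python
# A hypothetical pre-prompt check: flag obvious sensitive patterns before text
# is pasted into a public AI tool. Real data loss prevention (e.g., Microsoft
# Purview) is far more thorough; this only illustrates the idea.
import re

SENSITIVE_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card-like number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "possible secret key": re.compile(r"\b(?:sk|key|token)[-_][A-Za-z0-9]{16,}"),
}

def flag_sensitive(prompt: str) -> list[str]:
    """Return the names of any sensitive patterns found in the prompt text."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(prompt)]

draft = "Summarize Q3 revenue and email the results to jane.doe@contoso.com"
found = flag_sensitive(draft)
if found:
    print("Review before sending -- found:", ", ".join(found))
```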
Shadow IT Adoption
Using free AI tools without IT approval can bypass company policies, creating compliance gaps and leaving data unprotected.
Incorrect or Misleading AI Output
Generative AI can produce confident-sounding answers that are wrong, outdated, or fabricated, and unchecked output can find its way into reports, contracts, or customer communications.
Access Misconfigurations
Overly broad sharing in SharePoint, Teams, or OneDrive can let Copilot surface files to employees who were never meant to see them.
Best practices for safe AI use
- Create an AI governance policy that gives employees clear acceptable-use cases and guidelines for AI.
- Audit user permissions regularly in Microsoft 365 (see the sketch after this list for one way to spot-check file sharing).
- Use enterprise AI like Microsoft 365 Copilot whenever possible to keep sensitive data within secured environments.
- Validate AI outputs before using them in business reports or to support decision-making.
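For the permissions audit in particular, here is a minimal Python sketch of one way to spot-check who can access files via Microsoft Graph. It assumes a delegated access token with an appropriate Files.Read scope; the placeholder token and the decision to scan only the OneDrive root are ours. A production audit would lean on SharePoint admin reports or Microsoft Purview rather than an ad-hoc script.

```python
# A minimal sketch of spot-checking file sharing with Microsoft Graph. Assumes
# a delegated token with an appropriate Files.Read* scope; the placeholder
# token and the choice to scan only the OneDrive root are ours.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<delegated-access-token>"  # placeholder, not a real token
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Walk the top-level items in the signed-in user's OneDrive.
items = requests.get(f"{GRAPH}/me/drive/root/children", headers=HEADERS, timeout=30)
items.raise_for_status()

for item in items.json().get("value", []):
    # List who (or which sharing links) can access each item.
    perms = requests.get(
        f"{GRAPH}/me/drive/items/{item['id']}/permissions", headers=HEADERS, timeout=30
    )
    perms.raise_for_status()
    for perm in perms.json().get("value", []):
        user = perm.get("grantedToV2", {}).get("user", {}).get("displayName")
        scope = perm.get("link", {}).get("scope")  # e.g. "anonymous" or "organization"
        print(f"{item['name']}: user={user}, link scope={scope}")
```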
Get Expert Guidance on Microsoft Copilot Security & Licensing
Want to ensure your business is using Copilot securely and maximizing Microsoft 365 licensing? Contact us today to understand the differences between free and enterprise Copilot, secure your deployment, and get guidance on licensing options.
Related articles
Cybersecurity Q&A Series: Are CAPTCHAs Enough to Stop Bots from Spamming Web Forms?
We explore what CAPTCHA is, why it is becoming less effective, and what alternate strategies your business can adopt to protect web forms.
Features to Impact: Microsoft 365 Copilot AI Solutions for Sales, Service, & Finance
A look at the new Microsoft 365 Copilot AI solutions for Sales, Finance, and Service. Plus how to plan user adoption and change management!
How to Create Effective Microsoft 365 Copilot Prompts: Lessons from our Webinar
In this webinar recap, we go over 6 building blocks to create strong and effective Microsoft 365 Copilot prompts.


