This Cybersecurity Awareness Month, we are kicking off a blog series to answer some of the most common security questions we hear from our audience. Because we work across IT, cybersecurity, web development, and Microsoft business applications, security is, naturally, a big focus in everything we do. That means we spend a lot of time thinking about, researching, and discussing how new technologies affect your data.

To start the series, we are diving into one of the hottest topics right now: Microsoft Copilot. In a recent webinar about maximizing Microsoft 365 Copilot for everyday work and writing effective prompts, the top concern we heard was about data security. So, in this post, we are exploring some of the risks of using generative AI at work and clarifying the security differences between the public/free Copilot experience and the enterprise-licensed Microsoft 365 Copilot.


Free Microsoft Copilot: What you need to know

Microsoft Copilot is the public, free AI platform that answers questions, generates content, and creates images based on users’ prompts.

Key security points:

Data usage for AI training

Your voice and text interactions can be used to improve Copilot for the broader community.

Personal data protections

Microsoft scrubs identifiable information, like names, phone numbers, email addresses, and device IDs, before using it for model training.

Opting out

You can prevent your conversations from being used for AI model training in settings across copilot.microsoft.com, the Windows app, and the mobile app. Here’s how you can do it:

  • On copilot.microsoft.com: Select your profile icon, select your profile name, then select Privacy > Model training on text/voice.
  • In Copilot for Windows: Select your profile icon, then select Settings > Privacy > Model training on text/voice.
  • In the Copilot mobile app: Open the menu, select your profile icon, then select Account > Privacy > Model training on text/voice.

Bottom line: While your personal data is generally protected, confidential company information or sensitive personal data shared with the free Copilot may be at risk if used for AI training.


Microsoft 365 Copilot: Enterprise security built in

Microsoft 365 Copilot is a licensed AI assistant integrated into apps like Word, Excel, Outlook, and PowerPoint. It draws from content in Microsoft Graph—your work emails, chats, and documents—to provide personalized responses.
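
To make "draws from Microsoft Graph" concrete, here is a minimal Python sketch of the kind of tenant data Graph exposes, using real Graph endpoints but a placeholder access token. This is illustrative only; it is not how Copilot itself is implemented, and acquiring the token (for example, via MSAL) is assumed.

```python
# Illustrative sketch only -- not how Copilot is implemented internally.
# Shows the kind of tenant data Microsoft Graph exposes for the signed-in
# user: recent files and mail subjects.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
GRAPH_TOKEN = "<delegated-access-token>"  # placeholder; obtain via MSAL
headers = {"Authorization": f"Bearer {GRAPH_TOKEN}"}

# Recently used files from the user's OneDrive/SharePoint
recent = requests.get(f"{GRAPH}/me/drive/recent", headers=headers).json()
for item in recent.get("value", [])[:5]:
    print("File:", item.get("name"))

# Subjects of the five most recent emails
mail = requests.get(
    f"{GRAPH}/me/messages",
    headers=headers,
    params={"$top": 5, "$select": "subject"},
).json()
for msg in mail.get("value", []):
    print("Mail:", msg.get("subject"))
```

Because Copilot grounds its answers in this same Graph content, it can only see what the signed-in user can already access, which is why the permissions hygiene discussed below matters so much.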

Key advantages:

No model training with your data

Prompts, responses, and the data Copilot retrieves through Microsoft Graph remain within your Microsoft 365 tenant and are not used to train the underlying foundation models.

Enterprise-grade security

Runs on the Azure OpenAI Service and inherits Microsoft 365’s compliance, privacy, and encryption framework.

Governance matters

Even licensed Copilot depends on proper Microsoft 365 permissions. Misconfigured access could expose sensitive documents internally.

Bottom line: Microsoft 365 Copilot is more secure for enterprises, but it’s only as safe as the underlying governance of your Microsoft 365 environment. Learn more about Microsoft 365 Copilot security. 


Key risks of using generative AI at work

Even with enterprise AI, there are risks you should consider:

Accidental data exposure

Employees may paste sensitive information like financials, customer records, or source code into AI prompts. With the free Copilot, that data leaves the organization entirely; with licensed Microsoft 365 Copilot, it risks internal exposure if permissions aren’t configured correctly.

Shadow IT adoption

Using free AI tools without IT approval can bypass company policies, creating compliance gaps and leaving data unprotected.

Incorrect or misleading AI output

AI tools, whether free or licensed, can generate convincing but inaccurate content. Relying on these outputs without validation can lead to mistakes in business decisions or communications.

Access misconfigurations

Microsoft 365 Copilot respects existing permissions. Misconfigured access in SharePoint, Teams, or OneDrive can expose sensitive documents internally, emphasizing the need for proactive governance.
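
To illustrate, here is a hypothetical Python spot-check, again using real Microsoft Graph endpoints with placeholder values, that flags files in a single document library carrying "anyone" or organization-wide sharing links. A real audit would page through results and cover every SharePoint site and OneDrive in the tenant; tools such as Microsoft Purview offer more complete coverage.

```python
# Hypothetical spot-check, not a full audit: flag files in one document
# library that carry anonymous or org-wide sharing links. GRAPH_TOKEN and
# DRIVE_ID are placeholders you would supply.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
GRAPH_TOKEN = "<access-token>"  # needs Files.Read.All or Sites.Read.All
DRIVE_ID = "<drive-id>"         # a SharePoint library or OneDrive
headers = {"Authorization": f"Bearer {GRAPH_TOKEN}"}

items = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=headers
).json().get("value", [])

for item in items:
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=headers,
    ).json().get("value", [])
    for perm in perms:
        link = perm.get("link") or {}
        # "anonymous" and "organization" scopes are common over-sharing culprits
        if link.get("scope") in ("anonymous", "organization"):
            print(f"Broadly shared: {item['name']} ({link['scope']} link)")
```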

Best practices for safe AI use

  • Create an AI governance policy that provides your employees with clear AI acceptable use cases and guidelines.
  • Audit user permissions in Microsoft 365 regularly.
  • Use enterprise AI like Microsoft 365 Copilot whenever possible to keep sensitive data within secured environments.
  • Validate AI outputs before using them in business reports or to support decision-making.


Get expert guidance on Microsoft Copilot security and licensing

Want to ensure your business is using Copilot securely and maximizing Microsoft 365 licensing? Contact us today to understand the differences between free and enterprise Copilot, secure your deployment, and get guidance on licensing options.
