The chief of the very agency responsible for national information security (CISA) recently got in hot water for loading sensitive information into the publicly available AI engine ChatGPT. 

You may not be surprised at the incompetence, but it should raise alarm bells for you as a business owner with valuable information all over your network. Everyone is experimenting with AI. But how is your own team using it? Do you have guidelines, policies, and procedures for AI usage in your business? I know, I know – it seems WAY down the list of “things I need to do.” 

But this can be simple to implement. And it also involves one tool – Microsoft 365 Copilot – which generally goes underutilized in businesses, even though there is significant potential in training your folks how to use it. 

Elsewhere we’ve discussed best practices for when to use each AI engine, along with some warnings about absolute no-gos (like uploading any sensitive or proprietary information into a public AI engine). 

Copilot works within your MS 365 environment, and it is a much safer option when you are dealing with any proprietary information (especially since that information is likely stored there anyway). 

Below are some further tips on what it does and does not do, along with some suggestions for how to use it effectively. 

How Copilot Works with Your Company Data 

Copilot uses your existing Microsoft 365 permissions. If you can access a file, Copilot can help you work with it. If you cannot access it, Copilot cannot either. 

This applies to: 

  • SharePoint 
  • OneDrive 
  • Teams 
  • Outlook 
  • Your calendar 
  • Any file or library you have permission to view 

Copilot does not automatically search your company’s entire environment. It only uses the files you open, the items you explicitly reference, and the permissions tied to your account. If you don’t have permission to see it, then Copilot can’t either (and your administrator may place further restrictions on its usage). 

What Copilot Does NOT Do 

  • It cannot access data you don’t have permission to view 
  • It does not override privacy or security settings 
  • It does not automatically read all files across Teams or SharePoint 
  • It does not learn from or store your business data outside Microsoft 365 

Three Skills for Good Prompts 

Skill 1 — Give clear prompts 

Make sure your prompts include a clear task, relevant context, sufficient (but not distracting) detail, and, where needed, your desired output structure. If you are working with material inside your MS 365 environment, go ahead and reference or attach that material directly. 

Example: 

“Summarize the last 10 emails from the client ‘Premier HVAC’ and provide a list of action items in bullet points.” 

This skill alone significantly improves your Copilot outcomes. Again, be as detailed as possible, while emphasizing what is important and leaving out distracting details. 

Skill 2 — Iterate like you would with a coworker 

Ask follow-up questions to get more specific, pointed outcomes. The output improves the more you refine it. 

  • “Shorten this.” 
  • “Make it more professional.” 
  • “Rewrite in simpler language.” 
  • “Give me three versions.” 

Skill 3 — Verify and finalize 

Copilot is a drafting tool, not an autopilot. Always check tone, accuracy, and numbers. You are the final editor: make sure everything makes sense, and never treat Copilot-drafted material as a finished product. 

Everyday Copilot Use Cases by App 

Outlook 

  • Summarize long email threads 
  • Draft replies 
  • Highlight tasks and deadlines 
  • Convert emails into meeting agendas or follow-ups 

Word 

  • Draft proposals, SOPs, summaries, letters 
  • Rewrite documents for clarity or professionalism 
  • Convert rough notes into finished content 
  • Extract key points from long reports 

Excel 

  • Explain formulas 
  • Clean and structure data 
  • Create charts and insights 
  • Find trends or anomalies in data 

PowerPoint 

  • Generate slide decks from documents 
  • Redesign slide formatting 
  • Create speaker notes 
  • Simplify or expand content 

Teams 

  • Summarize meeting recordings 
  • Generate action items 
  • Answer questions about what was discussed 
  • Draft follow-up emails 

Conclusion

AI isn’t going away, and neither is your team’s curiosity about using it to work faster and smarter. The real risk isn’t that AI exists; it’s that AI gets used casually, inconsistently, and without guardrails. If a national security agency can make headlines for mishandling sensitive information, it’s a reminder that all organizations need clarity on how and when to use AI engines (and which type of engine). 

Microsoft 365 Copilot gives businesses a safer, more controlled way to use AI where most of their data already lives, but only if it’s paired with clear expectations, basic training, and the understanding that AI is a drafting assistant, not a decision-maker. A few simple policies, some training, and a reminder that humans still own the final output can dramatically reduce risk while unlocking real productivity gains. 

This is one of those small, unglamorous steps that quietly prevents big problems later. And like most good IT decisions, it’s far easier to put in place now than to clean up after something goes wrong.