AI skeptics are not alone in warning users not to blindly trust the output of models. AI companies themselves say so in their terms of service.
Take Microsoft, which is currently focused on getting corporate customers to pay for Copilot. But the company's Copilot Terms of Use, which appear to have been last updated on 24 October 2025, are also making waves on social media.
“Copilot is for entertainment purposes only,” the company warns. “It may make mistakes and may not work as intended. Do not rely on Copilot for important advice. Use Copilot at your own risk.”
A Microsoft spokesperson told PCMag that the company will update what it describes as “legacy language.”
“As the product has evolved, that language no longer reflects how Copilot is used today and will be replaced with our next update,” the spokesperson said.
Tom’s Hardware noted that Microsoft isn’t the only company to use this type of disclaimer for AI. For example, both OpenAI and xAI caution users against relying on model output as “the truth” (to quote xAI) or as “the sole source of truth or factual information” (OpenAI).