Microsoft's Own Terms of Service Call Copilot "For Entertainment Purposes Only"
What Happened
Microsoft's Terms of Use for Copilot include the line: "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice." The language, reportedly present since October 2025, went viral over the weekend. Microsoft called it "legacy language" that will be updated, but the disclaimer currently applies to a product sold to enterprises at $30 per user per month.
My Take
This is not a legal technicality. This is the quiet part said loud. Microsoft is selling Copilot as an enterprise productivity suite while its own lawyers say not to rely on it for anything important. The "legacy language" defense does not hold: someone at Microsoft decided it was less risky to disclaim the product than to stand behind it. That tells you everything about where the reliability bar actually sits.

For product engineers integrating Copilot into workflows, the real question is: what happens when something goes wrong and your company points to a tool that Microsoft's own ToS told you not to trust? AI gives people capability before it gives them calibration, and right now Microsoft is selling the capability at enterprise prices while legally disclaiming the calibration. If you are a PM evaluating AI tools for your org, screenshot this ToS before it gets "updated." It is the most honest thing any AI company has said about its product this year.