Microsoft Labels Copilot an Entertainment Tool and Warns Users to Proceed at Their Own Risk

The420 Web Desk

Microsoft has updated its Copilot terms and conditions to state that the artificial intelligence tool is for “entertainment purposes only” and that users should use it at their own risk, according to screenshots of the updated terms. The report says the change was made quietly last fall, even as Copilot continued to be promoted as a productivity tool to a mass user base.

Terms of use raise fresh questions

The screenshots show Microsoft’s guidelines stating that Copilot “can make mistakes” and “may not work as intended.” Users are told not to rely on the tool for important advice and are warned that they assume the risks of using it.

The article visible in the screenshots says this language may suggest that Microsoft does not fully trust its own product. It adds that, while such a disclaimer may make technical sense to those familiar with large language models, ordinary users could feel misled by the gap between the company’s marketing and the wording of its user agreement.

Liability warning goes beyond accuracy concerns

Another section shown in the screenshots says Microsoft makes no warranty or representation of any kind about Copilot. It states that the company cannot promise that Copilot’s responses will not infringe others’ rights, including copyrights, trademarks or privacy rights, or defame them.

The same text says users are solely responsible if they choose to publish or share Copilot’s responses publicly or with any other person. That warning broadens the issue beyond simple errors, placing responsibility on users for the legal and reputational consequences of what the tool generates.

Public reaction and product pullback

The screenshots also show social media reactions questioning Microsoft’s position. One user is quoted as saying that describing a flagship AI product as being “for entertainment purposes only” should be a major story. Another comments that a company with extensive legal resources should have produced a better explanation. A third says AI companies publicly present their products as transformative while privately warning users that the technology may produce “complete and utter nonsense” if relied on for serious work.

The screenshots further show that Windows Vice President Pavan Davuluri recently acknowledged that the company had gone too far in pushing its AI products, frustrating users. In a blog post quoted in the report, he said Microsoft was reducing unnecessary Copilot entry points, starting with apps such as Snipping Tool, Photos, Widgets and Notepad.
