Microsoft’s AI assistant, Copilot, has come under scrutiny following the resurfacing of its terms and conditions, first updated in October 2022. The terms, which have gained significant attention on social media platforms such as Reddit, emphasize that Copilot is intended for “entertainment purposes only” and caution users against relying on it for important advice. Microsoft also states that it cannot guarantee that responses generated by Copilot will not infringe the rights of others, including copyright and trademark rights. Users are warned that they are solely responsible for any consequences stemming from publishing or sharing Copilot’s outputs.
Privacy expert Frith Tweedie described the language in the terms as a “fairly standard approach” for free AI tools. “Microsoft is essentially pointing to the limitations of the tool, which are—or should be—well known, particularly in respect to hallucinations and other accuracy challenges,” Tweedie noted. She highlighted how the updated terms reflect a growing recognition of the inaccuracies prevalent in generative AI models.
Experts are divided on whether the terms are overly alarming or a necessary precaution. An AI specialist at a prominent consulting firm remarked, “It’s freaking wild,” while others argue that similar disclaimers are standard across free AI chatbots. For businesses, Microsoft offers a separate set of terms under M365 Copilot that includes stronger data protection measures, aimed at ensuring more secure handling of corporate data.
Dr. Andrew Lensen, a senior lecturer in artificial intelligence at Victoria University, said the terms reflect the realities of the technology. “We are seeing many people take the advice from these AI language models as gospel, when they can be wrong, often subtly,” he said. He also raised concerns about mixed messaging from Microsoft, which simultaneously promotes Copilot as a productivity tool for individual users.
According to Tweedie, businesses utilizing M365 Copilot benefit from enhanced privacy and security protections, but the cautionary language about the potential for mistakes and unreliability remains applicable to all generative AI chatbots. “This appears to be an attempt by Microsoft to limit potential liability by pointing out the unreliability of Copilot outputs,” she explained.
Rick Shera, a partner at Lowndes Jordan, cautioned businesses against using free AI versions. He stated, “Anyone using them in a business or who is generally concerned over issues such as confidentiality and privacy should steer clear of free versions.” Shera emphasized the importance of assurances provided by paid versions regarding security and privacy, especially in light of recent incidents suggesting that AI platforms could be compelled to disclose user prompts.
The conversation around AI and its limitations is increasingly relevant as more organizations adopt these technologies. Despite the assurances offered through paid versions of AI tools, risks remain, particularly with how companies navigate the usage of free AI platforms. “Businesses need to pay proper attention to the accuracy problems that are fundamental to large language models, a risk that I often see downplayed,” Tweedie cautioned.
Ultimately, while Microsoft’s updated terms for Copilot aim to clarify the limitations and risks of its use, experts agree that users, both individuals and businesses, should approach AI tools with caution. As the technology continues to evolve, ongoing dialogue about its reliability and ethical implications will be crucial for anyone looking to integrate AI into their workflows.