Microsoft: Copilot is for entertainment purposes only

(microsoft.com)

165 points | by lpcvoid 2 hours ago

27 comments

  • everdrive 1 hour ago
    Lawyers are playing Calvinball again. I have no idea why the law finds this kind of argumentation compelling. "I clearly intentionally deceived, but I stashed some bullshit legalese into a document no one will read so my deception is completely OK."
    • torginus 1 hour ago
      My two cents is that if it didn't, 'I didn't know that was illegal/breach of contract' would be a valid legal defense.

      Although intentionally saying things that contradict what's in the contract might be legally objectionable.

      • crote 1 hour ago
        On the other hand: imagine someone putting "by agreeing to this, you owe us $1,000,000,000 - unless you opt out in writing within 90 days" halfway down the 100-page EULA of some cookie-cutter smartphone app.

        It is not at all uncommon for such absurd contract terms to be unenforceable - especially in B2C contracts, although it might even be tricky for B2B clickthrough ones.

        The idea being that most contracts are fairly standard, so a lot of people will just skim through them. Putting a landmine in them is obviously in bad faith, so making it enforceable would basically make it impossible to do any kind of business at all.

        • observationist 1 hour ago
          On the other other hand, they can put whatever they want in there, and because they've forced everything into arbitration with "third party" mediation and carved out their own little niche of the justice system, they'll never actually go to court, they'll just settle and evolve their ToS and contracts and word games accordingly.
      • ryandrake 1 hour ago
        I wish we lived in more of a "spirit of the law" world than a "letter of the law" world, where everything needs to be spelled out, but we don't. A small minority of people enjoy Rules Lawyering their way through life, insisting on trying to "gotcha" counterparties who are acting in good faith, so as a consequence, we all have to be Rules Lawyers and everything needs to be spelled out.
        • d3ckard 46 minutes ago
          No, you don’t. It only sounds nice. In practice this enables all kinds of spontaneous prosecution with any possible motive.
        • WesolyKubeczek 27 minutes ago
          Theoretically, courts and judges exist precisely to balance the word and the spirit, and find and judge the actual intent. In practice, I'm in awe that good judgments still happen, despite everything.
      • marcosdumay 1 hour ago
        When the contract is purposefully obtuse and hard to understand, that should be a valid legal defense.

        When it's huge, falls upon people that can't justify a lawyer, and keeps changing all the time, one shouldn't even need to claim it. It should be automatically invalid.

        • voxic11 1 hour ago
          > Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.

          Seems pretty clear to me, do you really think people need a lawyer to understand that?

          • andy81 53 minutes ago
            The only thing "clear" about that license agreement is that it contradicts all their other marketing about Copilot.

            So either that document is fraudulent or everyone else at Microsoft is committing fraud daily.

            Examples from the first search result: https://support.microsoft.com/en-us/topic/microsoft-365-copi...

            Support page with ~25 tutorials provided by Microsoft about how to "Create a document with Copilot" or "Create a branded presentation from a file" or "Start a Loop workspace from a Teams meeting".

            Do you actually believe that creating branded presentations (from Microsoft's own examples) is something people do for "entertainment purposes"?

          • jon-wood 1 hour ago
            If Copilot is for entertainment purposes only then why is https://office.com all about how you can use Copilot, and closes with the small print "Copilot Chat in the Microsoft 365 Copilot app is available for Microsoft 365 Enterprise, Academic, SMB, Personal and Family subscribers with a work, education, or personal account."

            Why would they include a product for entertainment purposes only in the product they sell to large companies for doing work?

            • WesolyKubeczek 26 minutes ago
              Microsoft is pivoting to become an entertainment company, with Copilot being the final form of what Microsoft Bob always wanted to become.
          • Sharlin 1 hour ago
            Sure, if you make that clear in all of your marketing rather than lying your ass off and then trying the "lol we didn’t really mean it" defense.
          • lazide 1 hour ago
            If it’s in a locked cabinet in the downstairs bathroom with the ‘out of order’ sign on the door, guarded by a leopard?
            • recursive 1 hour ago
              A disused lavatory?
              • lazide 1 hour ago
                We can neither confirm nor deny on advice of counsel.
    • ThrowawayR2 1 hour ago
      "Our software developers clearly were negligent, but we stashed some bullshit legalese saying 'No warranty express or implied' into a document no one will read so our bug-infested software is completely OK."

      People in glass houses shouldn't throw stones.

  • wowoc 8 minutes ago
    Anthropic does a somewhat similar thing. If you visit their ToS (the one for Max/Pro plans) from a European IP address, they replace one section with this:

    Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.

    It's funny that a plan called "Pro" cannot be used professionally.

    https://www.anthropic.com/legal/consumer-terms

  • jeffwask 2 hours ago
    I can hear the lawyers huddled around a conference table rolling the bones and chanting the sacred words to come up with that "get out of trouble free" card. It told your son he had terminal cancer and should kill himself... sorry, it clearly says for Entertainment Purposes only.
  • _trampeltier 8 minutes ago
    Just this afternoon, I read a bit through Adobe's EULA and saw that most of Adobe's software is not allowed to be used by children. I guess most of today's software isn't allowed for children because of all the user tracking and spying.
  • sgbeal 1 hour ago
    The section titled

    > IMPORTANT DISCLOSURES & WARNINGS

    Tells us:

    > You may stop using Copilot at any time.

    That's an odd thing to include in a ToS.

    • throwa356262 1 hour ago
      I am working really hard to not start using Copilot.

      And believe me, if you use any Microsoft products or services, they really make it hard to avoid accidentally using the damn thing.

      Including adding it to your office plan and then charging you 2x.

      • Junk_Collector 1 hour ago
        Gotta love how they moved the "Create Email/meeting" buttons in Outlook mobile and stuck the Copilot button there so that you will hit it accidentally.
      • qubex 45 minutes ago
        I’m a Mac user and the only way to get Office 365 is a monthly subscription. Since there’s no subscription that doesn’t include Copilot, and since they hiked the price with the excuse that they’d added this thing I didn’t want, I just cancelled my subscription. A customer lost: hardly an issue, but if enough people do it, maybe they’ll get a clue and stop ramming this unwelcome abomination down our throats.
    • banannaise 1 hour ago
      104.3a A player can concede the game at any time.
    • monegator 1 hour ago
      Like when I went to my GitHub account to withdraw all Copilot consents - which I never used anyway

      just to be greeted with an email that welcomed me to Copilot and the free plan. No button or link to disable the thing.

      • sgbeal 1 hour ago
        > No button or link to disable the thing.

        The line I initially quoted:

        > You may stop using Copilot at any time.

        Was incomplete. It continues with what initially appears to be a non sequitur:

        > You may stop using Copilot at any time. If you want to close your Microsoft Account, please see the Microsoft Services Agreement.

        It may not be a non sequitur, but may well be the only way to "opt out" of Copilot.

    • xnorswap 1 hour ago
      I doubt it is odd, I suspect almost every ToS has something similar.
      • Mordisquitos 1 hour ago
        I really hope so. Now I must peruse all the ToS that I have agreed to in the past to ensure that they have an equivalent clause. I hope I'm not contractually obliged to keep using some random website or whatever for the rest of my life.
  • yoyohello13 1 hour ago
    I've been reading Jurassic Park recently. Hammond's monologue about expensive technology only being fundable via Entertainment seems very relevant.
  • Raed667 1 hour ago
    a blanket "entertainment only" disclaimer likely wouldn't survive scrutiny for a product actively/relentlessly marketed as a productivity tool
    • varispeed 44 minutes ago
      depends how much judges are interested in bling.
  • anshumankmr 9 minutes ago
    If it is for entertainment purposes only, why am I not laughing when I use it?
  • ar0 2 hours ago
    To be clear this is only for the standalone Copilot chat or app and website; not for the “Copilot” services integrated into Office 365 etc.
    • sgbeal 1 hour ago
      > To be clear this is only for the standalone Copilot chat or app and website; not for the “Copilot” services integrated into Office 365 etc.

      The section titled "WHEN & WHERE THESE TERMS APPLY" includes:

      > Conversations you have with Copilot through other Microsoft apps and websites

      • rdsubhas 1 hour ago
        Would be nice to know if it includes GitHub Copilot. I can't understand how to interpret "Copilot branded apps".
        • sgbeal 1 hour ago
          It says "through other Microsoft apps and websites," i.e. they reserve the right to include or remove it when and where they like throughout their whole product line (which includes GitHub, of course), as well as:

          - Conversations you have with Copilot through third-party apps and platforms

          - Other Copilot-branded apps and services that link to these Terms

          That first point (#4 in the original list) can cover all software, Copilot-branded or otherwise, which, even internally, uses Copilot (perhaps without your knowing so).

          GitHub Copilot (to take your specific example) is both "other Microsoft apps and websites" and "Copilot-branded". So, yeah, those ToS undoubtedly apply to GitHub Copilot.

  • LurkandComment 1 hour ago
    I thought a year ago, when I bought a new laptop with 365 and Copilot integrated, that they would make better use of AI and its integration. I can't think of a time I actually used it, and I cancelled any subscription associated with it. On the other hand, I use ChatGPT all the time.
  • caycep 3 minutes ago
    I should ask it to produce an image of Satya Nadella in Maximus garb yelling "are you not entertained?!"
  • classified 4 minutes ago
    So they finally admit that it's just a toy? Where does that leave all the mega-"productive" developers?
  • nerdjon 1 hour ago
    Can I get this on a sticker to pass out anyone tries to shove copilot down my throat at work?

    Maybe a shirt, could sell it on the Microsoft store even. Now that would be entertainment.

  • giancarlostoro 1 hour ago
    How does this affect Copilot in VS 2022 / VS 2026? Because this is kind of insulting to a professional. I really wish Microsoft would learn to name things correctly. There's Copilot the ChatGPT-like service, then there's Copilot for Visual Studio which is not the same as far as I can tell.
  • monegator 1 hour ago
    > Copilot may include advertising
  • wxw 2 hours ago
    > Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.

    > We don’t own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf.

    lol

    • Junk_Collector 1 hour ago
      This is as good as when the engineer from the Claude team said they load their website in such a way as to protect against hostile actions such as scraping.
  • jrochkind1 1 hour ago
    No way that holds up in court when they are marketing it for things other than entertainment.
  • staticautomatic 1 hour ago
    Guys they're just disclaiming warranties relax
  • tech_ken 26 minutes ago
    Another bingo square for that 'AI is gambling' post (https://news.ycombinator.com/item?id=47428541)
  • maieuticagent 1 hour ago
    They're just trying to pick up that Disney deal (Clippy rhymes with Mickey)
  • j45 28 minutes ago
    Non-exact software will be causing sleepless nights for non-exact legal writers.
  • ortusdux 2 hours ago
    It worked for Fox News
  • ratelimitsteve 1 hour ago
    I like the way that when AI does something good, of course the people who built it should make a lot of money, but when it does something bad, no one is responsible.
    • bradleyankrom 1 hour ago
      Lots of that going around these days (and for many of the previous days, at least in the US)
  • Simulacra 2 hours ago
    If it's for entertainment purposes only then why is it being shoved down our throats at every opportunity???
    • sheikhnbake 2 hours ago
      ARE YOU NOT ENTERTAINED?
    • boothby 1 hour ago
      It's not for your entertainment, silly, it's for theirs.
    • ranger_danger 2 hours ago
      Mandatory Fun (TM)
  • ashleyn 1 hour ago
    Ah yes, the new "for tobacco use only" of tech.
  • anthk 1 hour ago
    I told you so, dear LLM evangelists.
  • Handy-Man 2 hours ago
    Seems fine to me for the consumer facing product terms lol