NHacker Next
Microsoft: Copilot is for entertainment purposes only (microsoft.com)
wowoc 1 days ago [-]
Anthropic does a somewhat similar thing. If you visit their ToS (the one for Max/Pro plans) from a European IP address, they replace one section with this:

Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.

It's funny that a plan called "Pro" cannot be used professionally.

https://www.anthropic.com/legal/consumer-terms

giobox 1 days ago [-]
Ha, out of curiosity I loaded that same consumer terms URL on both a USA and a UK VPN exit node - sure enough, the UK terms inject that extra clause you quoted banning commercial usage, which is not present for USA users.

diff of the changes between US and UK:

https://www.diffchecker.com/BtqVrR9p/

There's the usual expected legal boilerplate differences. However, the UK version injects the additional clause at line 134 that has no analog in the US version.
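The comparison is easy to reproduce with Python's difflib on two saved copies of the page (the text below is abbreviated to the relevant section; the file names are hypothetical):

```python
import difflib

# Two saved copies of the same terms URL, fetched from different
# VPN exit nodes (contents abbreviated to the relevant section).
us_terms = """Limitation of liability.
Our liability is limited as described below.
"""

uk_terms = """Limitation of liability.
Non-commercial use only. You agree not to use our Services for any
commercial or business purposes.
Our liability is limited as described below.
"""

# A unified diff makes the injected clause stand out as "+" lines.
diff = list(difflib.unified_diff(
    us_terms.splitlines(), uk_terms.splitlines(),
    fromfile="terms-us.txt", tofile="terms-uk.txt", lineterm=""))

# Keep only the added lines (skip the "+++ terms-uk.txt" header).
added = [line[1:] for line in diff
         if line.startswith("+") and not line.startswith("+++")]
print("\n".join(added))
```

Running this prints exactly the clause that exists only in the UK version.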

rkagerer 21 hours ago [-]
Wow, if you brought a paper contract to court that mutated itself depending on which way you looked at it, I wonder what a judge would think of that?

Personally I would crumple it up and pitch it out the window. I don't know why they can't simply be clear about which clauses apply to which geographies. An IP address should not be assumed to be a reliable indicator of the jurisdiction in which an end user resides. (E.g., in addition to VPNs and unexpected routing, what happens if you travel?)

adonovan 19 hours ago [-]
I once wrote a contract document in PostScript that changed the wording based on the date. Two parties could cryptographically sign an agreement in the document, which would change when printed on a later date.

One of the reasons we don’t use PostScript so much any more.
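For illustration, the same trick can be sketched outside PostScript. Here is a minimal Python stand-in (the cutoff date and fee terms are made up): the signed artifact is the source, but the rendered wording depends on the date at "print" time.

```python
from datetime import date

def render_contract(source_terms: str, print_date: date) -> str:
    """Render the contract text; the wording mutates after a cutoff
    date, mimicking a PostScript program that branches on the clock."""
    cutoff = date(2025, 1, 1)  # hypothetical cutoff baked into the document
    fee = "no fee" if print_date < cutoff else "a 10% fee"
    return source_terms.format(fee=fee)

# The "document" both parties sign is this template, which never changes.
terms = "The provider charges {fee} for early termination."

before = render_contract(terms, date(2024, 6, 1))
after = render_contract(terms, date(2025, 6, 1))
# Same signed source, different rendered text on a later date.
```

A signature over the source stays cryptographically valid the whole time, which is exactly what makes the trick nasty.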

someguyornotidk 13 hours ago [-]
Isn't PDF also programmable? I think it supports JavaScript, and even a subset of PostScript if I'm not mistaken.
ytoawwhra92 16 hours ago [-]
It's perfectly normal for contracts in different jurisdictions to use different wording and include different clauses.

Even within the US, employment contracts with the same organisation may contain different wording depending on the state in which the employment is occurring.

solid_fuel 16 hours ago [-]
> It's perfectly normal for contracts in different jurisdictions to use different wording and include different clauses.

Before signing, yes, but once signed the contract stays constant. Mutating terms of service are weird - I would expect them to be locked to a canonical URL at least, like "https://.../tos?region=eu" or ideally something that locks the version too, like "https://.../tos?version=eu-002".
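The version-locked idea could be sketched as a tiny immutable lookup, where the canonical URL encodes both region and version so the exact text a user agreed to can always be retrieved verbatim (the route and version identifiers here are hypothetical):

```python
# Each (region, version) pair maps to a frozen document. Once published,
# an entry is never edited; new terms get a new version identifier.
TERMS = {
    ("eu", "002"): "Non-commercial use only. ...",
    ("us", "002"): "Commercial use permitted. ...",
}

def canonical_url(region: str, version: str) -> str:
    """The URL a user agrees to, pinning both region and version."""
    return f"https://example.com/tos?version={region}-{version}"

def terms_for(region: str, version: str) -> str:
    # Immutable lookup: the same URL always yields the same text.
    return TERMS[(region, version)]
```

With something like this, the contract presented at signup is unambiguous regardless of which IP address the user happens to have later.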

Let me pose a question from a different angle - these are legal contracts we are talking about, and the version they present to the user apparently changes based only on the client IP address. So if the terms in the EU ToS are better than in the US ToS, what would prevent me from signing up with an EU IP Address the first time? I would expect to be bound to the contract I actually agree to, not just the one they "intended" to show me.

hsbauauvhabzb 13 hours ago [-]
What if I sign the contract in the US, then fly to the UK?
przemub 13 hours ago [-]
Ask to sign a separate one for providing services in the UK, or include terms that vary depending on where you are in the first place.
hsbauauvhabzb 11 hours ago [-]
That would be sane. I did not get any TOS prompts last time I visited Europe.
graemep 1 days ago [-]
In the UK there seem to be separate commercial and consumer terms.

In the UK the consumer terms say it's subject to English law and the courts of the UK jurisdiction you live in.

The commercial terms say that in the UK, Switzerland and the EEA there will be binding arbitration by an arbitrator in Ireland appointed by the President of the Law Society of Ireland.

giobox 1 days ago [-]
The UK commercial terms explicitly do not apply to individual user plans. The US also has a separate terms sheet for commercial plans.

We are comparing like for like - an individual user using a Claude Pro subscription. A US user can use it for commercial use and be in compliance with the terms, the UK user cannot.

conductr 22 hours ago [-]
> A US user can use it for commercial use and be in compliance with the terms, the UK user cannot.

But why? My guess is the liability exposure is what they’re trying to control. So you probably can if you’re ok with no liability. It’s still noncompliant to how they wrote it but I would guess it’s the motivation. Unless they really just want to force the UK to pay for all commercial uses, which I suppose is possible.

graemep 22 hours ago [-]
I think it's because the law in the UK limits exclusions of liability in consumer contracts far more than in business contracts (in general, consumer law has a LOT of protections that do not apply to business contracts). If you look at the clauses excluding liability, they are very different. I think the same applies to many other countries, so they will also have separate consumer contracts.
n1tro_lab 9 hours ago [-]
So the two biggest AI companies are telling paying customers their products are toys. Meanwhile every enterprise sales deck says AI will transform your business. Pick one.
wowoc 5 hours ago [-]
OpenAI's ToS seems OK to me, it's just Microsoft and Anthropic.
motbus3 5 hours ago [-]
Sam Altman already solved AGI 1834 years ago according to himself
tmtvl 19 hours ago [-]
It'd be a dream come true if the TOS said code generated by Claude is licensed under the AGPL.
zombot 13 hours ago [-]
Not after all the training data was just stolen. That would be legalizing the biggest heist in human history after the fact.
rubyfan 21 hours ago [-]
It's only called Pro so they can charge you a Pro rate.
biker142541 21 hours ago [-]
Should just be called the “Funsies” and “More Funsies” plans
SoftTalker 1 days ago [-]
Software in general has disclaimed any warranties or fitness for purpose for as long as I can remember. This is nothing new.
mcmcmc 1 days ago [-]
Will be interesting to see if those still hold any weight (in the US at least) after the latest Meta rulings established defective design as a valid reason to sue big tech for damages
dylan604 23 hours ago [-]
That's not set yet though. It hasn't made it to appeals yet.
chrisjj 1 days ago [-]
> Software in general has disclaimed any warranties or fitness for purpose

This is not such a disclaimer. If Copilot fails its purpose of entertaining you, you can sue. /i

wat10000 1 days ago [-]
Prohibiting the user from using it for any commercial or business purposes is definitely new!
klez 1 days ago [-]
Have you really never seen any software saying "for non-commercial use only"?
flawi 24 hours ago [-]
When I'm paying for said software? No, I don't think I have.
zombot 13 hours ago [-]
The Home/Personal edition of Mathematica is for non-commercial use only although it's a paid subscription. The world around you is not bound by your ignorance.
wat10000 3 hours ago [-]
Does Wolfram advertise that edition of Mathematica as being "for work" like MS does the individual Copilot plan?
wat10000 24 hours ago [-]
Only when it's a free/cheap consumer version of something with a pricey business version.
zombot 13 hours ago [-]
No, it's not.
naikrovek 1 days ago [-]
Show me any that have claimed they were for entertainment purposes only. SQL Server has never had that in its EULA. The GPL does not say that the software is for entertainment purposes only.
phyzome 22 hours ago [-]
That's not what this is, though.
jrecyclebin 23 hours ago [-]
Well, except that, in this case, Copilot really is for entertainment purposes only.
lenerdenator 1 days ago [-]
Well, there's your rationale as to why AI cannot replace you.

When sh!t hits the fan, Anthropic will immediately point to this clause. Who knows, maybe a court would see it as valid.

Meanwhile, your customer (and thus, your management) is looking for someone to blame for excrement making contact with the impellers. And that someone's gonna be you.

wowoc 1 days ago [-]
Well, OpenAI doesn't seem to have clauses like this. Europeans are allowed to use it for commercial purposes under the ToS. (But check it yourself, I'm not a lawyer).

I reimplemented my startup idea from scratch with Codex a few months ago, just for peace of mind.

tjmtjmtjm 22 hours ago [-]
Honest question, what peace of mind does this give you? If my idea could be implemented from scratch by one of these agentic harnesses I would be concerned about the viability of it more than anything.
jmalicki 1 days ago [-]
But you have limited funds for them to go after in a lawsuit; realistically the worst they can do is fire you. It's not like being blameable somehow makes you more valuable.
motbus3 5 hours ago [-]
The moment they can, each token will be 10, 100, 1000x more expensive.
throwawaytea 1 days ago [-]
Employees often make mistakes that cost companies thousands of dollars. And there's no shortage of stories where employees cost companies tens of thousands and millions.

When a construction guy messes up measurements and thousands of dollars of work has to be removed and redone, no one thinks of taking the employee to court. Why would you want to take your AI to court?

antihipocrat 23 hours ago [-]
When the construction worker messes up a job that then causes injury or damages the property they absolutely get sued. The state can even get involved if the mistake is deemed criminal negligence.

In your example the owners will often take the construction company or small business owner to court. Most trades people negotiate and redo the work for free or much reduced cost to avoid this.

In office settings if you expose PII you will likely be fired.

throwawaytea 16 hours ago [-]
I am really losing faith in Hacker News intelligence levels, or at least reading comprehension.

We were talking about people suing AI for mistakes.

Employees do not get sued by their employer for mistakes. If your employer wants you to dig a foundation per plan, and you measure it wrong and dig it in the wrong orientation on the lot, you might get fired, but you will not pay the $50k+ to rip out the cement and pour a new foundation.

hubertdinsk 1 days ago [-]
What the hell are you on about? Have you ever been employed? Employees do get reprimanded for their mistakes. Employers just don't sue via the courts, for the same reason you don't sue your spouse first thing when they break a plate. They settle via internal penalties first.

(Not only that, employees who get a reprimand that's too heavy-handed can sue back. Plenty of cases around.)

"AI" companies provide a service. It might or might not be adequate; that's not the point. The point is that the ability to sue them must always be on the cards if the agreed-upon terms aren't met.

throwawaytea 16 hours ago [-]
I have no idea what you are talking about. I've been employed my whole adult life and I have never seen an employee get sued for $50k because his mistake caused the company to lose $50k.
saghm 8 hours ago [-]
"Surely our customers understand it's just Latin! Claude _for_ you!"
everdrive 1 days ago [-]
Lawyers are playing Calvinball again. I have no idea why the law finds this kind of argumentation compelling. "I clearly intentionally deceived, but I stashed some bullshit legalese into a document no one will read so my deception is completely OK."
BrandoElFollito 1 days ago [-]
Some 20 years ago there was a story about a guy who was opening a bank account. The bank sent the contract, and the guy amended it with things like "you will give me unlimited credit that I do not need to repay" (if my memory serves me right).

He signed, sent both copies, and got his bank-signed copy back.

He went to the bank, the bank sued him, he won (the judge told the bank that when you play dirty games you sometimes lose), and they ultimately settled.

lucianbr 1 days ago [-]
https://www.rt.com/business/man-outsmarts-banks-wins-court-2...

I can never find an article that mentions the final outcome.

Mathnerd314 23 hours ago [-]
It is on Wikipedia under T-Bank, this seems the best source that announces the resolution: https://web.archive.org/web/20220905212700/https://www.tinko...
tmtvl 7 hours ago [-]
From the Wikipedia article:

> The legal action was later withdrawn by both the parties after an undisclosed settlement was reached.

gamblor956 17 hours ago [-]
...In Russia.

That wouldn't work in the U.S. Changes to material terms in a contract generally require informed consent (meaning that the modifications are actually disclosed to the counterparty before they sign) or specific consent (such as initialing the sections of the contract where the modifications occur). This is a basic part of the UCC, which all states have adopted in some form.

There are a lot of people on the internet claiming that you can get away with surreptitious material changes to a contract before it is signed. None of them are lawyers.

lucianbr 16 hours ago [-]
It's depressing to see how the system works. Sure, now there are different kinds of terms in a contract, some are material terms and some are... immaterial? And conveniently, you can change some but not others, in such a way that the banks and powerful corporations always come out on top.

I never heard of a corporation being forced to point out explicitly which lines in their long terms and conditions document have changed. But it's a well known obligation for regular citizens, because material terms.

> that the modifications are actually disclosed to the counterparty before they sign

Does Microsoft explicitly draw your attention to the fact that Copilot is for entertainment purposes? No, it buries that in a long document hoping you won't see it, and advertises it as the complete opposite, but it's ok when they do it, because those are not material terms, whatever that means. It means it's ok when the big guys do it, in the end.

Veserv 1 days ago [-]
I have frequently proposed an objective legal standard for false advertising that handles that: "Technically, your honor". If somebody says that in court, they lose.

The words they used, as commonly understood by the target audience, were intentionally crafted to be interpreted differently than what they were going to say they meant in court. They spent time, effort, and money, ran focus groups, and carefully selected and curated their words to be incorrectly interpreted by the target audience to reach knowingly false conclusions.

The correct standard should be that they spent time, effort, and money, ran focus groups, and carefully selected and curated their words to be correctly interpreted by the target audience to reach true conclusions. Their statements should only be accidentally incorrect in proportion to the time and effort spent crafting and distributing them.

"Technically, your honor", should be treated as the ethical abomination it is.

protocolture 21 hours ago [-]
I know there's some tort caselaw in Australia that weighs both parties' actual understanding of the contract against the written word. We went over a few of these cases in high school commerce. It's been further enshrined by the ACCC, which tends to take the view that the verbal understanding provided at the point of sale can often supersede terms and conditions.
torginus 1 days ago [-]
My two cents is that if it didn't, 'I didn't know that was illegal/a breach of contract' would be a valid legal defense.

Although intentionally saying things that contradict what's in the contract might be legally objectionable.

crote 1 days ago [-]
On the other hand: imagine someone putting "by agreeing to this, you owe us $1,000,000,000 - unless you opt out in writing within 90 days" halfway down the 100-page EULA of some cookie-cutter smartphone app.

It is not at all uncommon for such absurd contract terms to be unenforceable - especially in B2C contracts, although it might even be tricky for B2B clickthrough ones.

The idea being that most contracts are fairly standard, so a lot of people will just skim through them. Putting a landmine in them is obviously in bad faith, so making it enforceable would basically make it impossible to do any kind of business at all.

disillusioned 1 days ago [-]
FullStory just tried to pull this with their renewal. We had a multi-year contract that started with a two-page order form, on which the words "renewal" or "cancellation" never once appear. During negotiations, it was never discussed that the plan would renew, or that there was a cancellation window. Instead, buried at the very bottom of the form (which they send via CongaSign, and wasn't clickable or obvious), was a line about their subscription agreement being linked to their terms and conditions page. On THAT page, they mention the plan will auto-renew and must be cancelled with 60 days notice.

We cancelled at T-45 or so days before renewal, having determined it wasn't a fit for our client anymore, and they insisted "well, actually, you've renewed anyway!" which, no, we haven't. Absolutely absurd to try to "clickwrap" buried renewal terms in a 20+ page T&C/privacy document rather than as a material point of fact on the actual order form being executed.

Feels like the height of absurdity to try to bully your client into using your services against their will when they still gave ample notice that they were cancelling and when there was no material loss to the business, but it's always felt like their revenue team has been unhinged in general: exploding offers, insane terms, super high-pressure sales... part of the reason we left them in the first place.

observationist 1 days ago [-]
On the other other hand, they can put whatever they want in there, and because they've forced everything into arbitration with "third party" mediation and carved out their own little niche of the justice system, they'll never actually go to court, they'll just settle and evolve their ToS and contracts and word games accordingly.
graemep 1 days ago [-]
Not going to work in a lot of countries, again, especially with regard to consumer contracts.
jerf 1 days ago [-]
Nominally, Common Law, the system of law that to a first approximation is used in countries descended from the UK, has a lot of protections of that sort. You can't put "unconscionable" terms in a contract, e.g., it is simply illegal to sell yourself into total slavery in common-law derived systems. All signatories to a contract must consent, must not be under duress, the contract can not be one-sided (this doesn't mean "the contract is 'fair' from a 3rd-party point of view" but "the contract can't result in only one side giving things but the other doesn't"), and a variety of other common sense rules.

In practice, availing yourself of any of these protections is a massively uphill battle. Judges tend to presume that these common law matters are already embedded into the de facto legal system because the people writing the laws already operated under those assumptions while framing the law. Personally, I disagree and think a lot of these protections have eroded away into either nothing, or so little that it might as well be nothing, but you have a 0% chance of drawing me as a judge in your case so that won't help you much if you try.

ryandrake 1 days ago [-]
I wish we lived in more of a "spirit of the law" world than a "letter of the law" world, where everything needs to be spelled out, but we don't. A small minority of people enjoy Rules Lawyering their way through life, insisting on trying to "gotcha" counterparties who are acting in good faith, so as a consequence, we all have to be Rules Lawyers and everything needs to be spelled out.
xboxnolifes 1 days ago [-]
We live in neither. Many things spelled out are unenforceable. Many things not spelled out are implied.

We live in a world where advertising boneless chicken does not actually mean the chicken does not contain bones.

d3ckard 1 days ago [-]
No, you don’t. It only sounds nice. In practice this enables all kinds of spontaneous prosecution with any possible motive.
NetMageSCW 1 days ago [-]
I think a “spirit of the law” world would result in judges that already abuse their absurd powers way too much have free rein over any abuse they want to do, and there would be no system for ensuring everyone is treated equally or fairly.
vkou 1 days ago [-]
The current alternative is the corpos abuse their absurd powers, and there's no system for ensuring everyone is treated equally or fairly.
WesolyKubeczek 1 days ago [-]
Theoretically, courts and judges exist precisely to balance the word and the spirit, and find and judge the actual intent. In practice, I'm in awe that good judgments still happen, despite everything.
marcosdumay 1 days ago [-]
When the contract is purposefully obtuse and hard to understand, that should be a valid legal defense.

When it's huge, falls upon people that can't justify a lawyer, and keeps changing all the time, one shouldn't even need to claim it. It should be automatically invalid.

SoftTalker 1 days ago [-]
Contract language is obtuse and hard to understand precisely because of previous challenges over meaning. There are stock phrasings and clauses in contracts that have established (by precedent) legal meanings. That's why contracts seem to be walls of boilerplate.

If you just wrote them in "plain language" there would be far too much ambiguity and arguing over what was really meant or implied or agreed to.

hananova 9 hours ago [-]
A company acting in good faith could provide both a binding version, and a version in plain language.
voxic11 1 days ago [-]
> Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.

Seems pretty clear to me, do you really think people need a lawyer to understand that?

andy81 1 days ago [-]
The only thing "clear" about that License agreement is it contradicts all their other marketing about Copilot.

So either that document is fraudulent or everyone else at Microsoft is committing fraud daily.

Examples from the first search result: https://support.microsoft.com/en-us/topic/microsoft-365-copi...

Support page with ~25 tutorials provided by Microsoft about how to "Create a document with Copilot" or "Create a branded presentation from a file" or "Start a Loop workspace from a Teams meeting".

Do you actually believe that creating branded presentations (from Microsoft's own examples) is something people do for "entertainment purposes"?

NetMageSCW 1 days ago [-]
Did Microsoft force you to follow the tutorials and use CoPilot for business?
adgjlsfhk1 23 hours ago [-]
By advertising Copilot as capable of doing something they are guaranteeing the product is capable of it.
jon-wood 1 days ago [-]
If Copilot is for entertainment purposes only then why is https://office.com all about how you can use Copilot, and closes with the small print "Copilot Chat in the Microsoft 365 Copilot app is available for Microsoft 365 Enterprise, Academic, SMB, Personal and Family subscribers with a work, education, or personal account."

Why would they include a product for entertainment purposes only in the product they sell to large companies for doing work?

WesolyKubeczek 1 days ago [-]
Microsoft is pivoting to become an entertainment company, the Copilot being the final form of what Microsoft Bob has always wanted to become.
Sharlin 1 days ago [-]
Sure, if you make that clear in all of your marketing rather than lying your ass off and then trying the "lol we didn’t really mean it" defense.
marcosdumay 1 days ago [-]
There are 1698 words before that phrase.

Granted, this one document has surprisingly clear language, but no, it's still not reasonable. Also, it was changed less than 6 months ago.

lazide 1 days ago [-]
If it’s in a locked cabinet in the downstairs bathroom with the ‘out of order’ sign on the door, guarded by a leopard?
recursive 1 days ago [-]
A disused lavatory?
lazide 1 days ago [-]
We can neither confirm nor deny on advice of counsel.
ThrowawayR2 1 days ago [-]
"Our software developers clearly were negligent, but we stashed some bullshit legalese saying 'No warranty express or implied' into a document no one will read so our bug-infested software is completely OK."

People in glass houses shouldn't throw stones.

owenm 1 days ago [-]
As far as I can tell, this is only for the free personal plan, not any of the business offerings (i.e. not Copilot for M365), and GitHub Copilot is under a separate set of terms.

“These Terms don’t apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply.”

Think of Copilot being a suite of different products under the same overall banner and it starts to make (a bit) more sense.

pwarner 19 hours ago [-]
M365 Copilot is neither useful nor entertaining then.
brunoborges 1 days ago [-]
This should be the top comment.
harvey9 1 days ago [-]
Not really since the clause in full is "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk."

Are you saying that the business version cannot make mistakes and can be relied upon for important advice?

owenm 1 days ago [-]
No, I’m saying that MS have different terms for their business and personal offerings (as do OpenAI and Anthropic).

To be fair to them, MS are quite open about accuracy for the business offerings, see here as one example:

https://learn.microsoft.com/en-us/copilot/microsoft-365/micr...

jeffwask 1 days ago [-]
I can hear the lawyers huddled around a conference table rolling the bones and chanting the sacred words to come up with that "get out of trouble free" card. It told your son he had terminal cancer and should kill himself... sorry, it clearly says for Entertainment Purposes only.
argee 19 hours ago [-]
"Are you not entertained?"
sgbeal 1 days ago [-]
The section titled

> IMPORTANT DISCLOSURES & WARNINGS

Tells us:

> You may stop using Copilot at any time.

That's an odd thing to include in a ToS.

throwa356262 1 days ago [-]
I am working really hard to not start using Copilot.

And believe me, if you use any Microsoft products or services they really make it hard to avoid accidentally using the damn thing.

Including adding it to your Office plan and then charging you 2x.

Junk_Collector 1 days ago [-]
Gotta love how they moved the "Create Email/meeting" buttons in Outlook mobile and stuck the Copilot button there so that you will hit it accidentally.
qubex 1 days ago [-]
I’m a Mac user and the only way to get Office 365 is a monthly subscription. Since there’s no subscription that doesn’t include CoPilot and since they hiked the price with the excuse that they’d added this thing I didn’t want, I just cancelled my subscription. A customer lost: hardly an issue, but if enough people do it, maybe they’ll get a clue and stop ramming this unwelcome abomination down our throats.
NetMageSCW 1 days ago [-]
I’ve only used it once, for WorkFlow creation, and it seemed really useful there, though that may be more of an indictment of WorkFlow than an endorsement of CoPilot.
monegator 1 days ago [-]
Like when I went to my GitHub account to withdraw all Copilot consents - which I never used anyway -

just to be greeted with an email that welcomed me to Copilot and the free plan. No button or link to disable the thing.

sgbeal 1 days ago [-]
> No button or link to disable the thing.

The line I initially quoted:

> You may stop using Copilot at any time.

Was incomplete. It continues with what initially appears to be a non sequitur:

> You may stop using Copilot at any time. If you want to close your Microsoft Account, please see the Microsoft Services Agreement.

It may not be a non sequitur, but may well be the only way to "opt out" of Copilot.

monegator 1 days ago [-]
I don't even have a Microsoft account! Anything Microsoft I had (like the ancient Hotmail address) was deleted years ago.
banannaise 1 days ago [-]
104.3a A player can concede the game at any time.
mindcrime 1 days ago [-]
But according to the Birmingham modifications of 1973, subsection 12.b, stroke 7a, a player so conceding is not deemed to have actually conceded unless they be within a finite number of hops from Mornington Crescent station at the time of the concession.
d1sxeyes 1 days ago [-]
No, as part of the Cameron rules of 2016, concession means concession, regardless of anything else (including whether or not it’s a good idea).
NetMageSCW 1 days ago [-]
Does that mean I can get an ice cream?
d1sxeyes 6 hours ago [-]
I think so, I think they proposed ice creams only for those who haven’t been in Nidd since the 2021 Paris incident, but that got voted down.
mcv 1 days ago [-]
> > You may stop using Copilot at any time.

> That's an odd thing to include in a ToS.

Maybe it's the only Microsoft product for which that's true? (It certainly feels that way, sometimes.)

xnorswap 1 days ago [-]
I doubt it is odd, I suspect almost every ToS has something similar.
Mordisquitos 1 days ago [-]
I really hope so. Now I must peruse all the ToS that I have agreed to in the past to ensure that they have an equivalent clause. I hope I'm not contractually obliged to keep using some random website or whatever for the rest of my life.
kylehotchkiss 1 days ago [-]
Were you able to stop using Adobe software at any time before they got in trouble for making cancelling so hard?
sgbeal 23 hours ago [-]
> Were you able to stop using Adobe software at any time before they got in trouble for making cancelling so hard?

To be fair, "stop using" does not automatically imply "stop paying for a subscription".

polyamid23 1 days ago [-]
They should tell Copilot then! It seems to disagree.

- "Are you for entertainment purposes only?"

- "Not at all — unless you want me to be. The short version: I'm not 'for entertainment only.'"

Edit: OK, I see it is legal framing to avoid being held liable, but can they just do that via ToS and let the tool itself promote something else?

nashashmi 1 days ago [-]
That’s exactly what an LLM made for entertainment purposes only would say!
lateforwork 1 days ago [-]
Go to https://www.copilot.com/ and ask a question. You'll see from the answers that it is indeed for entertainment only. It is ridiculously behind ChatGPT, and I don't know how that can happen since Microsoft has access to the same models.
lousken 1 days ago [-]
It's not as bad as in the GPT-4.1 days, but I am wondering if it is just the system prompt or what is going on.
commandlinefan 23 hours ago [-]
Are you not entertained?!
nunez 1 days ago [-]
FYI: This is only for the "Cortana replacement" Copilot, not the other Copilots. This language doesn't appear in GitHub Copilot's Consumer Agreement, for example.
therein 1 days ago [-]
Maybe they shouldn't name everything Copilot.
aleph_minus_one 1 days ago [-]
In the past, Microsoft named everything ".NET" [1] or "Windows Live" [2]. And before naming everything "Copilot", Microsoft named everything "Microsoft 365" [3].

[1] https://en.wikipedia.org/wiki/Microsoft_.NET_strategy

[2] https://en.wikipedia.org/wiki/Windows_Live

[3] https://en.wikipedia.org/wiki/Microsoft_365

brookst 24 hours ago [-]
I'm old enough to remember when everything was "Active". Active Directory, ActiveX, etc.
przemub 13 hours ago [-]
And Active Directory is probably the only still relevant one. Is it going to become Copilot Directory soon?
aleph_minus_one 22 hours ago [-]
Now that you mention it: in addition to your mentioned Active Directory and ActiveX:

- (Microsoft) Active Accessibility (MSAA): https://en.wikipedia.org/wiki/Microsoft_Active_Accessibility

- Active Channel: https://en.wikipedia.org/wiki/Active_Channel

- ActiveX Data Objects (ADO): https://en.wikipedia.org/wiki/ActiveX_Data_Objects

- Active Desktop: https://en.wikipedia.org/wiki/Active_Desktop

- ActiveMovie: https://en.wikipedia.org/wiki/ActiveMovie

- Active Server Pages (ASP): https://en.wikipedia.org/wiki/Active_Server_Pages

- Active Setup: https://en.wikipedia.org/wiki/Active_Setup

- ActiveSync: https://en.wikipedia.org/wiki/ActiveSync

---

In particular at the time around Windows Vista, Microsoft named a lot of technologies "Windows ... Foundation", for example:

- Windows Communication Foundation (WCF)

- Windows Driver Foundation (today: Windows Driver Frameworks): https://en.wikipedia.org/w/index.php?title=Windows_Driver_Fr...

- Windows Identity Foundation

- Windows Presentation Foundation (WPF)

- Windows Workflow Foundation (WF)

---

Also the "Windows Media ..." branding was big for media technologies at the respective time:

- Windows Media Audio (WMA)

- Windows Media Center, Windows Media Connect (both abbreviated to WMC): https://en.wikipedia.org/wiki/Windows_Media_Connect

- (Windows) Media Center Extender (MCX): https://en.wikipedia.org/wiki/Windows_Media_Center_Extender

- Windows Media Device Manager: https://learn.microsoft.com/en-us/windows/win32/wmdm/windows...

- Windows Media DRM

- Windows Media Encoder (WME)

- Windows Media Player (WMP)

- Windows Media Services (WMS): https://en.wikipedia.org/wiki/Windows_Media_Services

- Windows Media Video (WMV)

javadocmd 1 days ago [-]
Copilot copilot Copilot copilot copilot copilot Copilot copilot.

https://en.wikipedia.org/wiki/Buffalo_buffalo_Buffalo_buffal...

Raed667 1 days ago [-]
a blanket "entertainment only" disclaimer likely wouldn't survive scrutiny for a product actively/relentlessly marketed as a productivity tool
varispeed 1 days ago [-]
depends how much judges are interested in bling.
yoyohello13 1 days ago [-]
I've been reading Jurassic Park recently. Hammond's monologue about expensive technology only being fundable via Entertainment seems very relevant.
chrisjj 1 days ago [-]
These terms too are pretty entertaining.

we can’t promise that any Copilot’s Responses won’t infringe someone else’s rights (like their copyrights, trademarks, or rights of privacy) or defame them.

You agree to indemnify us and hold us harmless (including our affiliates, employees and any other agents) from and against any claims, losses, and expenses (including attorneys' fees) arising from or relating to your use of Copilot

i-e-b 1 days ago [-]
"Don’t use bots or scrapers"

Says the bot based on scraped data

giancarlostoro 1 days ago [-]
How does this affect Copilot in VS 2022 / VS 2026? Because this is kind of insulting to a professional. I really wish Microsoft would learn to name things correctly. There's Copilot the ChatGPT-like service, then there's Copilot for Visual Studio which is not the same as far as I can tell.
adambb 1 days ago [-]
https://docs.github.com/en/copilot/responsible-use/chat-in-y...

They do seem to word this at a more professional level in this context (the terms linked are for individuals using Copilot in Windows, probably?)

Smalltalker-80 1 days ago [-]
Cool, I'm going to put this disclaimer in my work email signature. So I'm never accountable for any mistakes.
anshumankmr 1 days ago [-]
If it is for entertainment purposes only, why am I not laughing when I use it?
SoftTalker 1 days ago [-]
Some people find being whipped while bound in leather to be entertaining.
cindyllm 1 days ago [-]
[dead]
noisy_boy 19 hours ago [-]
They didn't say whose entertainment. It certainly isn't yours. Meanwhile they are laughing their way to the bank.
pwdisswordfishy 1 days ago [-]
You need a better sense of humour apparently.
AdieuToLogic 16 hours ago [-]
Other products which have a disclaimer similar to Copilot's:

  Copilot is for entertainment purposes only.
- Magic 8 Ball[0]

- Tarot cards[1]

- Reality TV shows

- Psychic hotlines

Caveat emptor.

0 - https://en.wikipedia.org/wiki/Magic_8_Ball

1 - https://en.wikipedia.org/wiki/Tarot

wxw 1 days ago [-]
> Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.

> We don’t own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf.

lol

Junk_Collector 1 days ago [-]
This is as good as when the engineer from the Claude team said they load their website in such a way as to protect against hostile actions such as scraping.
LurkandComment 1 days ago [-]
I thought a year ago, when I bought a new laptop with 365 and Copilot integrated, that they would make better use of AI and its integration. I can't think of a time I actually used it, and I cancelled any subscription associated with it. On the other hand, I use ChatGPT all the time.
rubyfan 21 hours ago [-]
Is this like how companies sell drugs or lab products and say they're "not for human consumption" or "for research purposes only"?
kklisura 1 days ago [-]
> Other people may send similar Prompts as yours, and they could get the same, similar, or different Responses and Creations.

This is why I'm skeptical about all this AI coding thing...

nerdjon 1 days ago [-]
Can I get this on a sticker to pass out anyone tries to shove copilot down my throat at work?

Maybe a shirt, could sell it on the Microsoft store even. Now that would be entertainment.

ar0 1 days ago [-]
To be clear this is only for the standalone Copilot chat or app and website; not for the “Copilot” services integrated into Office 365 etc.
sgbeal 1 days ago [-]
> To be clear this is only for the standalone Copilot chat or app and website; not for the “Copilot” services integrated into Office 365 etc.

The section titled "WHEN & WHERE THESE TERMS APPLY" includes:

> Conversations you have with Copilot through other Microsoft apps and websites

rdsubhas 1 days ago [-]
Would be nice to know if it includes Github Copilot. I can't understand how to interpret "Copilot branded apps".
sgbeal 1 days ago [-]
It says "through other Microsoft apps and websites," i.e. they reserve the right to include or remove it when and where they like throughout their whole product line (which includes github, of course), as well as:

- Conversations you have with Copilot through third-party apps and platforms

- Other Copilot-branded apps and services that link to these Terms

That first point (#4 in the original list) can cover all software, Copilot-branded or otherwise, which, even internally, uses Copilot (perhaps without your knowing so).

Github Copilot (to take your specific example) is both "other Microsoft apps and websites" and "Copilot-branded". So, yeah, those ToS undoubtedly apply to Github Copilot.

convexly 22 hours ago [-]
If your tool is only for entertainment then stop putting it in every product I use for work...
satisfice 20 hours ago [-]
Are you not entertained??
convexly 19 hours ago [-]
Apparently not enough to justify the subscription :\
ortusdux 1 days ago [-]
It worked for Fox News
_trampeltier 1 days ago [-]
Just this afternoon, I read a bit through Adobe's EULA and saw that most of Adobe's software is not allowed to be used by children. I guess most (of today's) software isn't allowed for children because of all the user tracking and spying.
mghackerlady 1 days ago [-]
It could also be that minors aren't allowed to sign contracts, which a EULA could maybe be considered (I'm not a lawyer)
monegator 1 days ago [-]
> Copilot may include advertising
stuaxo 9 hours ago [-]
Interested to see these put to the test.
IFC_LLC 20 hours ago [-]
Yeah, so what exactly do you call entertainment?

Is slapping "Made for AI" label on every device in the known universe entertaining?

snu 1 days ago [-]
Hilariously, immediately after I read this, my boss sends a global message to us reminding us that we 'need to be trying to integrate copilot into our jobs.'
jmugan 1 days ago [-]
I thought the title was a joke until I actually read the thing.
none_to_remain 21 hours ago [-]
I simply no longer consider myself morally bound by any contract I'm not seriously expected to read
TheRoque 22 hours ago [-]
Sure, but which copilot ? The one in my code editor ? The one in my OS ? The one in my sheets editor ?
staticautomatic 1 days ago [-]
Guys they're just disclaiming warranties relax
osmsucks 1 days ago [-]
To us, all the profit. To you, all the risk.
alok-g 1 days ago [-]
+1

Software in general is usually provided on an "as is" basis with the creator not taking responsibility for anything going wrong.

soupfordummies 1 days ago [-]
Worth noting that this is in the terms of use as of October 2025. This isn't "new".
jrochkind1 1 days ago [-]
No way that holds up in court when they are marketing it for things other than entertainment.
oldnetguy 23 hours ago [-]
Then get it off my laptop
Surac 23 hours ago [-]
April 1st is near. Perhaps Copilot allowed itself to prank us all.
oytis 1 days ago [-]
I might be alone with this, but I don't find it very entertaining.
maieuticagent 1 days ago [-]
They're just trying to pick up that Disney deal (Clippy rhymes with Mickey)
Simulacra 1 days ago [-]
If it's for entertainment purposes only then why is it being shoved down our throats at every opportunity???
sheikhnbake 1 days ago [-]
ARE YOU NOT ENTERTAINED?
kotaru 1 days ago [-]
I legit laughed for couple minutes, thank you for this comment.
boothby 1 days ago [-]
It's not for your entertainment, silly, it's for theirs.
ranger_danger 1 days ago [-]
Mandatory Fun (TM)
rkagerer 21 hours ago [-]
I really hate these so-called Terms of Service that are more a set of instructions than they are a contract.

Eg. "If you see something wrong or inappropriate from Copilot, use the Report or Feedback features in Copilot to let us know."

Does that bind me to a commitment to report? Am I in breach of the Terms if I see something wrong and I don't?

I really hope judges take the way these "contemporary" terms are written as a signal of just how little bargaining power end-users have and start nullifying unfair clauses. It's getting ridiculous how one-sided these so-called agreements are. Companies don't want to take any responsibility whatsoever for the slop they're shilling.

ibejoeb 1 days ago [-]
They'll unironically relaunch it as Xbox Copilot tomorrow...
jamesgill 23 hours ago [-]
So is Windows.
OfirMarom 1 days ago [-]
That one line is…doing A LOT of legal work.
ratelimitsteve 1 days ago [-]
i like the way that when ai does something good of course the people who built it should make a lot of money but when it does something bad no one is responsible
bradleyankrom 1 days ago [-]
Lots of that going around these days (and for many of the previous days, at least in the US)
tech_ken 1 days ago [-]
Another bingo square for that 'AI is gambling' post (https://news.ycombinator.com/item?id=47428541)
tasuki 1 days ago [-]
Well, I'm entertained!
mihaaly 1 days ago [-]
My employer does not allow me to use software with entertainment functions on company hardware.

Now what?! Do I have to uninstall Windows?

NetMageSCW 1 days ago [-]
No, but you can’t use CoPilot any more.
cartoonfoxes 24 hours ago [-]
And yet my licensed Visual Studio 2026 that's supposed to be for serious-face commercial development is rife with Copilot.

> You may stop using Copilot at any time.

But how? Microsoft has shoved it into so many products that I don't see how it's possible, without dropping them altogether.

brookst 24 hours ago [-]
The link is pretty clear that it is only about copilot.com, not every product that has copilot in the name:

> These Terms apply to your use of “Copilot,” which includes:

> The standalone Copilot apps on your computer or mobile device

> The Copilot service we offer at copilot.microsoft.com, copilot.com, and copilot.ai

> Conversations you have with Copilot through other Microsoft apps and websites

> Conversations you have with Copilot through third-party apps and platforms

> Other Copilot-branded apps and services that link to these Terms

> These Terms don’t apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply.

HackerThemAll 21 hours ago [-]
"Copilot is for entertainment purposes"

Nah, I don't think so. It sucks at that.

pseudosavant 1 days ago [-]
I can't help but be reminded of Joe Pesci in Goodfellas:

"Funny how? I mean, funny like I’m a clown? I amuse you? I make you laugh? I’m here to fuckin’ amuse you? What do you mean funny, funny how?"

SoftTalker 1 days ago [-]
I've read the claim that he ad-libbed a lot of that too.
OrvalWintermute 1 days ago [-]
One of the most toxic TOS I have ever had the misfortune of reading.
classified 1 days ago [-]
So they finally admit that it's just a toy? Where does that leave all the mega-"productive" developers?
j45 1 days ago [-]
Non-exact software will be causing sleepless nights for non-exact legal writers.
ashleyn 1 days ago [-]
Ah yes, the new "for tobacco use only" of tech.
caycep 1 days ago [-]
I should ask it to produce an image of Satya Nadella in Maximus garb yelling "are you not entertained?!"
soupfordummies 1 days ago [-]
[dead]
catlikesshrimp 1 days ago [-]
The ownership section is hilarious (tldr your content is not ours, but we can do anything you could do with it except being liable)

"We don’t own Your Content... By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf."

anthk 1 days ago [-]
I told you so, dear LLM evangelists.
tempodox 1 days ago [-]
But they know better. They probably asked an LLM.
bustah 19 hours ago [-]
[dead]
Handy-Man 1 days ago [-]
Seems fine to me for the consumer facing product terms lol
hn_acc1 1 days ago [-]
Do not taunt Happy Copilot Ball.