
Just a warning, the license [1] specifically blocks EU use:

> 3. Conditions for License Grant. You represent and warrant that You will not, access, download, install, run, deploy, integrate, modify, or otherwise use the Model, directly or indirectly, within the European Union.

[1] https://gitcode.com/ascend-tribe/pangu-pro-moe-model/blob/ma...


What’s the reason behind this? What am I missing?

Most likely the EU AI Act regulations, which they don't see any value in bothering with.

Even so, why would the licensor put it in and force it through a license? It's on the licensee to check the laws and regulations they themselves operate in.

The EU AI Act is supposed to affect all AI "providers", which includes any "natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge" [0].

This would plausibly include anyone developing an LLM, even if they aren't selling access to it or building applications based on it. There are several exemptions, and the Act ostensibly avoids creating burdens for most general-purpose LLMs, but the point is that Huawei wants to avoid any worry by not "plac[ing] it on the market" in the first place.

[0] https://artificialintelligenceact.eu/article/3/


Hence, the lack of European innovation in AI

I don't agree. Tools like DeepL were better than Google Translate long before chatbots became a thing, and still are. The French-made Mistral AI is pretty decent as well.

Saw some benchmarks recently that put Mistral well behind basically every other competitor. Don't have them on hand unfortunately.

FWIW, I refactored 500+ JUnit 4 tests to JUnit 5 with a locally running Mistral 8B on an M3 MBP. It worked flawlessly, but of course I cannot attest to other use cases.

Edit: it was 8B, not 7B.
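
For context, most of a JUnit 4 → JUnit 5 migration is mechanical renames (packages, annotations, assertion style), which is probably why a small local model copes well. A made-up before/after sketch of the kind of change involved (the ListTest class here is hypothetical, not from that codebase):

    // JUnit 4 (before) -- hypothetical example
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;
    import java.util.ArrayList;
    import java.util.List;

    public class ListTest {
        @Test
        public void addGrowsList() {
            List<String> items = new ArrayList<>();
            items.add("a");
            assertEquals(1, items.size());
        }

        // JUnit 4 expresses expected exceptions via the annotation
        @Test(expected = IndexOutOfBoundsException.class)
        public void getOnEmptyListThrows() {
            new ArrayList<String>().get(0);
        }
    }

    // JUnit 5 (after) -- jupiter packages, visibility modifiers optional,
    // expected exceptions move into assertThrows
    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertThrows;
    import java.util.ArrayList;
    import java.util.List;

    class ListTest {
        @Test
        void addGrowsList() {
            List<String> items = new ArrayList<>();
            items.add("a");
            assertEquals(1, items.size());
        }

        @Test
        void getOnEmptyListThrows() {
            assertThrows(IndexOutOfBoundsException.class,
                    () -> new ArrayList<String>().get(0));
        }
    }

Most of such a migration is this kind of rename-and-reshuffle, which seems like a plausible fit for a small local model given a clear prompt.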


They are not the best, sure, but they are very inexpensive, offer better privacy, and are super fast.

What specifically is it about the law that slows innovation?

Is it something these companies do that they worry violates it?


The company is Chinese, I presume that's why.


Worse, it's American!

And who thinks, for even a second, that a European (in this case) will not download, install, and try to run this just because the LICENSE says you can't?

FYI, this is not intended to be offensive to Europeans; I am European myself. That is not the point. The point is, who gives a damn about the LICENSE in reality, on their PERSONAL computer? Serious question.


The licence is not there for enforcement from their side. It's legal protection for Huawei. Essentially: "We told you it's not for the EU. If you get sued, don't try to put it on us."

Also, any company of serious size will have lawyers interested in the licences of everything you're running.


I am not talking about companies. I edited my comment. Emphasis on "[their] PERSONAL" computer.

I know that companies would probably not. But individuals?


It's probably the inverse:

they might license it to companies in the US, but don't want to have to deal with the changes and bureaucracy needed to support individuals.

The statement's purpose is to say the equivalent of: "if you're a European and do run it, it's on you; this is not a product we release or support for the European market, so don't expect support, liability, etc."


Why would they open themselves up to liability in the rest of the world where it is allowed?

I get that. What I do not get is some other commenters "scaring" Europeans attempting (or thinking) to run this product.

I mean, this other commenter literally said:

> You'll be both breaking their licence and potentially your local European data laws.


I'm really torn on the whole thing. I consider myself a patriotic American and would never do anything to undermine the security of my country or its allies (using the same definition of national security that the serious sworn oaths use, "all enemies foreign and domestic", which makes NSA backdoors that compromise American devices squarely a "domestic enemy").

But loyalties don't change facts, and China is where serious hackers are rising on merit, doing a lot with limited resources, giving zero fucks about empty slick talk.

If we wanted to hobble the PRC's technical rise, we should have subsidized wasteful NVIDIA use and put Altman/YC in charge: they'd still be glad-handing about how to pump their portfolio companies' sticker prices and avoid "systemic shocks" to the stock market anchored on NVDA.


Just for the record, I would never run this product, but it has nothing to do with the LICENSE itself.

Well, people say such things even about watching pirated shows, which, to be truthful, almost everybody does...

Some are just narc types.


> The point is, who gives a damn about (doing an illegal thing) in reality, on their (private property where nobody is likely to see that)?

I'm not sure which part of that you find confusing. Some people will estimate benefit>risk and won't care.


What? I do not find anything confusing. You live in a Marvel world if you think a LICENSE is going to stop people from using a product. But like you said, it is not intended to be for enforcement purposes, but Huawei is trying to save its own ass.

So what is your answer? Mostly companies only? That is a fair answer, but you are the one who said this:

> You'll be both breaking their licence and potentially your local European data laws.

Again, who cares, dude? Companies might, but individuals couldn't give a rat's ass. So why leave that comment?

And just for the record, if you quote someone, quote them verbatim, otherwise it is not a quote.


Breathe.

Been there, done that.

That said, I agree that it is my fault that self-contradicting virtue signaling hypocrites always find a way to irk me.

And I think it is good for the world to know that the LICENSE often means jack shit, unless companies of significant size are involved.

Again, we all agree that they put it there to cover their own asses, not that Europeans cannot download, install, and run their product, right?


Yes we’re all aware of the unlimited rights of Europeans, including subjecting the rest of the world to annoying cookie notifications.

Europeans definitely do not have unlimited rights, and I do not agree with the annoying cookie crap either.

For those who do not remember, this was a real thing in the late '80s and '90s relating to cryptography.

There were serious laws limiting the export of "modern" cryptography software from the USA.

Some of us had to face up to the serious challenge of connecting to an FTP server and downloading PGP, risking violating US law to get a software package.

A few years later we had to decide "Do you want the secure Netscape, or the insecure Netscape?".

I'm sure we all chose the ethical choice.


You should elaborate on this for the unacquainted.


Thank you.

Legalese and licenses aren't there to make sure that no X will download, install, or run something.

It's to make it a matter of legal record that you stated they should abstain.

Copyright warnings on music and DVDs never stopped people pirating them either.


Try selling pirated copies and see what the warnings are really about.

When CDs and DVDs were a thing people wanted, there were people selling pirated copies on every corner, so...

I know. That is why I do not get what the big fuss is about.

Most companies abide by the law. So no self-hosted LLMs for Europeans.

... what? Self-hosted LLMs are precisely for individuals.

A lot of companies and research institutes in the EU would like to be able to use a locally hosted LLM for their employees so they don't have to worry what data they give away.

Also it is not rational for any individual to buy the hardware for running a serious LLM and then let it idle 99.9% of the day.


Why would these companies or research institutes in the EU not be able to run locally hosted LLMs for their employees, though?

What model do you propose that is close enough to ChatGPT or Claude that people will actually use it for their work?

I am not up to date with the models, but I have heard good stories about a couple of open-source models. You should ask Simon Willison. I hope he will be summoned (@simonw).

My point was that currently the best open-weight models are from China, and so not usable in the EU.

But of course the world changes so rapidly that what is now is irrelevant tomorrow.


I wonder if you'd say the same if the license were coming from Microsoft or Apple...

Why would I not? Of course I would.

> A lot of companies and research institutes in the EU would like to be able to use a locally hosted LLM for their employees so they don't have to worry what data they give away.

They will certainly not violate EU laws and also probably not the licence.


It's plausible deniability. Someone at Huawei presumably thinks there's a chance that exporting this to Europe might be a legal problem at some point in the future. So they added a restriction, enough for plausible deniability.

It's not exactly "plausible deniability" in the common sense of the term.

It's not supposed to make them appear as plausibly denying that some European can download and use this.

Its role is to signal that if someone does, it's on him, not them, and he won't have any support, liability claims, etc., as he would if it were a product intended for his use.


Quite a few, actually.

Wow, this is a huge caveat: practically a guarantee that they are using data in ways that do not comply with GDPR.

GDPR is not the issue here; the new AI Act is. Since this is an open-weight release it is not bound by the training-data disclosure rules, but it probably didn't go through the evaluation that is required above a certain number of FLOPs. That's why many recent big-player model releases had a staggered release in the EU.

If you download it to your PC and run it locally, what will happen?

Picture your PC as a cheery little planet in the EU’s cosmic backwater, sipping a digital Pan-Galactic Gargle Blaster. You download Pangu Pro MoE, hit “run,” and expect to chat with an AI wiser than Deep Thought. Instead, you’ve hailed a Vogon Demolition Fleet. Your machine starts moaning like Marvin with a hangover, your screen spews gibberish that could pass for Vogon poetry, and your poor rig might implode faster than Earth making way for a hyperspace bypass.

The fallout? This AI’s sneakier than a two-headed president—it could snitch to its creators quicker than you can say “Don’t Panic.” If they spot your EU coordinates, you’re in for a galactic stink-eye, with your setup potentially bricked or your data hitchhiking to a dodgy server at the edge of the galaxy. Worse, if the code’s got a nasty streak, your PC could end up a smoking crater, reciting bad poetry in binary.


To translate for those not familiar with the writings of Douglas Adams:

nord is suggesting it's possible that the physical computer running this model could be used as a "hub" for potential spyware, or be overloaded with workloads that are not related to the actual task of running the model (and instead may be some form of malware performing other computational tasks). It could potentially perform data exfiltration, or act discriminatorily based on your perceived location (such as if you're located within the EU). At worst, data loss or firmware corruption/infection may be a concern in case of license violation.

I'm not sure I would outright disagree that this is possible, but with some caveats. I would think the license stipulates that usage within the EU is forbidden due to the EU AI Act (here is a resource to read through it: https://artificialintelligenceact.eu/ai-act-explorer/).


how will the "open weights" know that the PC is running within the EU? again, you are not talking about software that actually runs on your PC, but about a file that the software reads and loads into memory for its own use.

No it's actually worse. Approximately three seconds after you install the model in offline mode on your computer, a small detector van will come and park outside your door with an antenna on the roof, and relay your position to a Chinese ICBM for immediate targeting.

> If they spot your EU coordinates

how? can anyone give a technical answer as to how the weights would get to know this fact?


Sorry, sounds like total bullshit. The weights aren't going to do anything. And if you are worried about the code, with current deployment practices of curl | sudo bash there is much lower-hanging fruit out there. That's not even mentioning the possibility of running the model on a PC without internet access (no matter how good the new Chinese AI is, it's still not good enough to convince you to let it out of the box).

you can use existing apps that take random huggingface files. do you expect the weights to somehow coax the software into doing exfiltration?
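
to make the "it's just a data file" point concrete, here's a rough sketch that peeks at a checkpoint, assuming the published safetensors layout (8-byte little-endian header length, JSON header, then raw tensor bytes); the shard name below is hypothetical. there is no executable code in the file itself:

    // Rough sketch, assuming the standard safetensors format:
    // [8-byte little-endian header length][JSON header][raw tensor data].
    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.charset.StandardCharsets;

    public class PeekWeights {
        public static void main(String[] args) throws Exception {
            // hypothetical shard name; pass a real path as the first argument
            String path = args.length > 0 ? args[0] : "model-00001-of-00050.safetensors";
            try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
                byte[] lenBytes = new byte[8];
                in.readFully(lenBytes);
                long headerLen = ByteBuffer.wrap(lenBytes).order(ByteOrder.LITTLE_ENDIAN).getLong();

                // headers are small JSON blobs; the int cast is fine for a sketch
                byte[] headerBytes = new byte[(int) headerLen];
                in.readFully(headerBytes);
                String header = new String(headerBytes, StandardCharsets.UTF_8);

                // just tensor names, dtypes, shapes and offsets -- everything after it is numbers
                System.out.println("header (" + headerLen + " bytes): "
                        + header.substring(0, Math.min(400, header.length())));
            }
        }
    }

any phoning home would have to come from the runner (llama.cpp, vLLM, whatever you feed the file to), which you can firewall or keep offline entirely.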

same. i call bull on this.

remember how they convinced everyone huawei was a public enemy, without evidence, because nokia and others were unable to compete with them?


with mcp, and the right tools, it's effectively already out of the box

Don't give it MCP then (and I struggle to understand why anyone would give a stochastic model such access, even if it is trained on very American NSA-certified hardware approved by Sam Altman himself).

a siren will go off and in 10 secs your computer will explode

The same thing that usually happens when you violate T&Cs …

Probably safe enough on your own computer, but could have consequences if it’s a work computer.


>could have consequences if it’s a work computer.

consequences for the employer, who "might" get a license audit done on their machines.

does it really happen so often that a random employer in the EU would have to be concerned?


consequences for you when you unwittingly open up a back door or expose your organisation to a data breach

You'll be both breaking their licence and potentially your local European data laws.

so what. will they send the cops after you?

breaking the license will do what? what's up with licenses and violations? you and me are random people on the internet


>breaking license will do what?

The same thing breaking any license does. If you do it in your basement, nothing by definition. If you incorporate it in a service or distribute it as part of a project, well then you're on the hook. (and that is what license holders tend to care about)


" potentially your local European data laws."

If run locally, why?


I called him out in another thread. It makes absolutely no sense. He is talking against himself, judging by his comments.

To answer your question, he modified my comment (see the parentheses):

"> The point is, who gives a damn about (doing an illegal thing) in reality, on their (private property where nobody is likely to see that)?"

So... at best what he said is purely theoretical. He admitted it himself: "nobody is likely to see that". I am not sure I agree with that, but then again, in reality, no one gives a fuck, at least not in Europe.


There are likely multiple potential issues here, but one specific example: Processing and storage of PII without consent/authorisation is not allowed, regardless of whether you do it yourself or for others. And you can't guarantee that this model does not contain private information hoovered up by accident.

energy consumption

There is not a single AI model that fully complies with GDPR. How can you inform everyone, even those not named by actual name but otherwise identifiable, that their data is being processed, and give them the ability to object, when the data they train on isn't public?

Literally the same goes for all other open weights; this is just legal ass-covering, which most others don't even bother with.


Shocking. At least they acknowledge it.

It isn't an acknowledgement. It is just legalese to wash their hands of whatever EU restrictions and requirements might otherwise be applicable here.

With the EU's love for regulations, soon everyone will exclude it without even reading those regulations.

Does anyone comply with GDPR & the AI Act? Even for Mistral I'm not sure; the best we can say is "we don't know".

There's something nefarious about this.

I doubt the Chinese ever care about licensing, so I would not care about following their license.


