DocTomoe's comments

There is a delay between 'bad stuff happens' and 'lawyer tries to squeeze some money out of it'. We now see 4o lawsuits because we have gotten to that point in discovery for 4o-related incidents. Give it half a year, and we'll see 5-related lawsuits.

Yes, because we want to entrust governments with the inner worlds of eight billion people. What could possibly go wrong?

They were forced to retain even 'deleted' chatlogs about half a year ago because of a copyright lawsuit involving the NYT.[1] Once more, the copyright-industrial complex makes things weird for everyone.

[1] https://openai.com/index/response-to-nyt-data-demands/


Right, but that's retention for legal defense — they keep everything. The selective hiding is a different layer. They retain it, they just choose when to surface it. So users get "deleted" as UX theater while the data sits in cold storage waiting for subpoenas or PR fires. The irony is the same infrastructure that protects them in copyright suits also lets them curate what investigators see. Retention and visibility are decoupled by design.

I am fairly sure they made a big show back in the day of how they did, in fact, delete chats. But ultimately, no one outside of OpenAI really knows one way or the other.

> A few months ago, OpenAI shared some data about how with 700 million users, 1 million people per week show signs of mental distress in their chats

Considering that the global prevalence of mental health issues is about one in seven[1], while 1 million flagged users out of 700 million works out to roughly one in 700, that would make OpenAI users about 100 times more 'sane' than the general population.

Either ChatGPT miraculously selects for an unusually healthy user base - or "showing signs of mental distress in chat logs" is not the same thing as being mentally ill, let alone harmed by the tool.

[1] https://www.who.int/news-room/fact-sheets/detail/mental-diso...
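
For what it's worth, here is the back-of-the-envelope arithmetic behind that '100 times' figure, taking the 1-million-per-week and 700-million-user numbers quoted above and the WHO's roughly one-in-seven prevalence at face value (a rough sketch, not a like-for-like comparison):

    # Rough comparison of the two rates, using only the figures quoted above.
    weekly_flagged = 1_000_000                      # users/week showing "signs of mental distress"
    total_users = 700_000_000                       # reported weekly user base
    flagged_rate = weekly_flagged / total_users     # ~0.0014, i.e. roughly 1 in 700

    global_prevalence = 1 / 7                       # WHO: about one in seven people

    print(round(global_prevalence / flagged_rate))  # -> 100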


Having a mental health issue is not at all the same thing as "showing signs of mental distress" in any particular "chat". Many forms of mental illness wouldn't normally show up in dialogue at all; and when one would, it doesn't necessarily show up all the time. And then there's the matter of actually detecting it in the transcript.

I don't know the full details, but 700M users and 1 million per week means up to 52M per year, though I imagine a lot of them show up in multiple weeks.

You also don't take into account that the userbase itself is shifting.

That being said: those of us who grew up when the internet was still young remember alt.suicide.holiday, and a time when you could buy books explaining relatively painless methods on Amazon. People are depressed. It's a result of the way we choose to live as a civilization. Some don't make the cut. We should start accepting that. In fact, forcing people to live on in a world that is unsuited for happiness might constitute cruel and unusual punishment.


Just because you do not use a piece of technology or see no use in a particular use-case does not make it useless. If you want your Java code repaired, more power to you, but do not cripple the tool for people like me who use ChatGPT for more introspective work which cannot be expressed in a tweet.

By the way, I would wager that 'long-form'-users are actually the users that pay for the service.


> By the way, I would wager that 'long-form'-users are actually the users that pay for the service.

I think it may be the case that many of the people who commit suicide or do other dangerous things after encouragement from AI are actually using the weaker models available on the free tier. Whatever ability there is in AI to protect the user, it must be lower for the cheaper, freely available models.


Alright, as someone who is currently suffering from burnout (which is classified as a form of depression in my country, and which has me swearing a holy oath to my doctor once a month that I do not think about, intend, or plan to end it): this is probably the worst possible conclusion you could draw.

It will breed paranoia. "If I use the wrong words, will my laptop rat me out, and the police kick in my door to 'save me' and drag me into the psych ward against my will, ruining my life and making my problems just so much more difficult?"

Instead of a depressed person using cheap and, more importantly, available resources to manage their mood, you will push them into a state of helplessness and fear that some computer in Florida will decide to trigger an intervention at 2am. What do you think will happen next? Is such a person more or less likely to make a decision you'd rather not have them make?


Fast-tracking WMD programs, up to and including nuclear ones, for country leaders who are feeling a bit too 'abductable' for comfort.

As someone who is going through a dark, dark time right now: thank you, profusely. This is exactly what I needed.

May all your days be joyful.


glad to hear my words helped, may all your days be joyful too <3 <3 <3

The major difference being that the US crew got medals for 'meritorious service', including a Navy Commendation Medal and a Legion of Merit. Russia is not quite that ballsy over accidentally butchering civilians.


> Russia is not quite that ballsy over accidentally butchering civilians.

I don't know about accidental, but if anyone thinks Russia is not ballsy about butchering civilians, they need a refresher on Russia's wars during the last few decades. Last few years would be enough too. It's a principle of their military affairs.


Switch on the critical-thinking part of your brain and go read about American war crimes; the reality is much dirtier than "we're Good and they're Evil". It's not a competition, so I'm not going to start ranking armies, but they all have their fair share of atrocities.


What makes you think I'm not thinking critically? You're the person here who seems to be thinking in terms of competition, as far as I can tell. And, who's we?

Not sure if you're being subtly apologetic, so I'll elaborate on my point. Russian commanders who led campaigns in Syria got nicknames like Butcher of Aleppo and General Armageddon, not only for using scorched-earth tactics and indiscriminate bombing, but for systematically bombing schools, hospitals, field clinics, and bread lines. The UN High Commissioner for Human Rights called it "crimes of historic proportions." Aid organizations would actually stonewall the UN, because through the UN Russia would find out where the bread lines were and bomb them. These are not accidents or freak, isolated occurrences: it's doctrine. Look at Mariupol. Or Ukraine in general.


Abu Ghraib, Guantanamo, the Bagram collection point? The WikiLeaks scandals? 100k+ civilians dead in Iraq over weapons of mass destruction that never existed. 15%+ of drone-strike victims over the last 20 years being civilians.

These are all accidents too, I assume? I don't know what to expect from people who are currently blowing up random boats in international waters and who just declared fentanyl a weapon of mass destruction, lmao.


Guy, you're the only one here acting like this is a competition. Do you think what Russia does is somehow more acceptable if you can find other criminals? Yeah, just lean into it. Good luck with that.


I was specifically talking about accidental shootdowns of civilian airliners. Leave your politics out of this.

Unless you have tangible evidence that MH17 was deliberately downed. In which case I am sure people would just love to see that.


> I was specifically talking about accidental shootdowns of civilian airliners.

Maybe you should think twice before dropping offhand comments related to mass killings. If you want to talk about what exactly I said that's "politics" as opposed to history, we can do that.


Oh, so it is my responsibility to work around YOUR preferred way of doing things, when I get zero benefit from it?

Maybe I should just get your scraper's IP range and start poisoning it with junk instead?
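
Purely as an illustration, a minimal sketch of what poisoning a known scraper range could look like, assuming a small Flask-served site and an entirely hypothetical CIDR for the scraper (none of these names or addresses come from the thread):

    # Hypothetical sketch: serve garbage to requests from a (made-up) scraper IP range.
    import ipaddress
    import random
    import string

    from flask import Flask, Response, request

    app = Flask(__name__)
    SCRAPER_NET = ipaddress.ip_network("203.0.113.0/24")  # placeholder range (TEST-NET-3), not a real scraper

    def junk(n=4096):
        """Return n characters of plausible-looking filler text."""
        return "".join(random.choices(string.ascii_letters + " .\n", k=n))

    @app.before_request
    def poison_scraper():
        addr = ipaddress.ip_address(request.remote_addr)
        if addr in SCRAPER_NET:
            # Short-circuit the request: the scraper gets junk instead of real pages.
            return Response(junk(), mimetype="text/html")

    @app.route("/")
    def index():
        return "Real content for everyone else."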

