
Obviously the whole point is to make avoiding AI overreach as painful as possible.

Of course, that's also the reason why Lens was deprecated despite being a good, useful app, forcing one to deal with the bloat of Copilot 365.


English is not my native language, and whenever I watch TV or movies, I always try to watch with English subtitles. This is because:

- As I said, English is not my first language, so I'm not that proficient at listening comprehension.

- Actors nowadays speak with worse diction. Sometimes they just mumble something unintelligible, no matter how much you rewind and replay.

- The sound mix makes voices come off as drowned out by the environmental sounds and the soundtrack. This is very different from, e.g., classic movies, where sound and diction were very clear almost all the time.

- Some accents are very hard to parse. Canadian folks, for example, can be very difficult to understand.

So, the subtitles feature is a lifesaver a lot of the time.


It's just native speakers in general, across any language. I swear I can barely understand native speakers in my second language because they all mumble or slur words together into fragments that can be understood only if you've learned the specific mumbles of that accent.

"Welyuno..." "Wellyouno..." "Well youknow..." "Well, you know..."

But if you didn't know "welyuno" in English, tough luck understanding it as a second-language learner.


Once you are used to how natives speak, these stop being a mystery. Basically, you learn to understand real speech instead of just the "learner" version.


I'm replying to this comment as well as to its parent.

You're both right as far as colloquial speech goes. But we are talking about movies. Actors used to study elocution and diction techniques to polish their speech. As I said in my comment, this was apparent in older movies. Now, they don't even seem to care about this.


I doubt it would come to completely replace LaTeX. The momentum behind LaTeX is enormous. Having said that, if it manages to simplify language handling, fonts, and bibliographies, it would be great.

But, especially, it would be good to see whether it improves on LaTeX's handling of tabulars (tables) and floats (figures), something that is kind of an esoteric art right now (a small example of what I mean follows below). I wish Typst the best.
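To illustrate what I mean by "esoteric art": this is roughly the kind of LaTeX incantation a simple floating table needs. A minimal sketch, assuming the common booktabs package for the rules; not tied to any particular document:

    \documentclass{article}
    \usepackage{booktabs}  % optional package for nicer horizontal rules

    \begin{document}

    % A floating table: [htbp] is only a hint about placement
    % (here, top of page, bottom of page, or a separate float page);
    % LaTeX still decides where the float actually ends up.
    \begin{table}[htbp]
      \centering
      \caption{A minimal example table}
      \begin{tabular}{lrr}  % one left-aligned and two right-aligned columns
        \toprule
        Item    & Qty & Price \\
        \midrule
        Apples  & 3   & 1.20  \\
        Oranges & 5   & 2.10  \\
        \bottomrule
      \end{tabular}
    \end{table}

    \end{document}

Even in a toy document like this, the table can end up pages away from where you wrote it; controlling that, plus column widths, multi-page tables, and side-by-side figures, is where the esoteric art really begins.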


> In parallel I don't understand gamers with 15 years old hardware leaving bad reviews or whining when a game chokes above 720p with minimum settings.

I won't leave a bad review or whine about BG3, but my (otherwise very capable) laptop is just 6 years old with an Intel UHD 620 integrated GPU, and BG3 barely reaches 10 fps at 1024x768 with the lowest settings everywhere. So it's not even 720p, and BG3 chokes a lot on this somewhat recent hardware.

I see BG3's graphics and, while they are beautiful, they're nothing out of the ordinary compared to other games. That is, there are good games that run very well on my laptop and still look good.

In sum, I see BG3 as needlessly demanding, pushing out a large sector of machines and potential buyers. I'd love to have an RTX-class GPU (and the cash to afford it), but all I have is a laptop whose GPU cannot be upgraded and which is perfectly capable in all other areas.

Every time I point out this limitation, which I see as silly, I get flamed to death. People in gaming communities seem unable to understand why extremely high minimum requirements are not a good sales strategy.

Games can look good on integrated GPUs. See the Wolfenstein games (id engine). Even more recent games like Generation Zero (Apex open-world engine) run decently at the lowest settings on my hardware. MGS5: The Phantom Pain also runs and looks very good. But no luck with BG3.


To be fair, there is a world of difference between "extremely high minimum requirements" and a 6-year-old laptop, considering that even current-year laptops are never considered super high-end rigs.


If you had said s/minimum/recommended/, I would agree. But no, we are talking about minimum requirements. I submit that a game should be playable even on 10-year-old hardware. Of course, the devs can blow the ceiling with their highest settings. Do they need three RTX 7000-series cards in parallel, cooled with liquid nitrogen and eating 3000 W of electricity, to run at the highest/ultra settings? Yeah, be my guest, more power to them. But we are, I insist, talking about minimum requirements. These should be as broad as possible in order to keep the entry barrier low.

In a game which doesn't even look especially good, I see the very demanding hardware requirements as just a contribution to planned/artificial obsolescence.

(and yeah, I got downvoted as expected. This is getting old...)


I get where you're coming from, and I do think game devs should put more effort into supporting lower-end hardware.

That said, BG3 does run on 10-year-old hardware. The minimum requirements list a GTX 970, released in 2014, which, despite its age, is still 6x faster than your integrated graphics.


Good, but it is still a discrete GPU. My point is that games should be made at least playable on integrated GPUs.


Many Linux distros have Firefox's JavaScript (SpiderMonkey?) runtime independently packaged and available. Can it be used for this?


Yes, SpiderMonkey can be run standalone and would probably be much more secure than Deno because it does not have all the server-related APIs.


There's an aspect that the article managed to imply, but I think it warrants more thought because it directly affects a lot of people, myself included: that people cannot afford to pay for a "gaming PC".

Well, thankfully I might be able to afford it, but how could I justify the cost? I have a laptop which works very well and is my daily workhorse for everything, including gaming. But it has Intel onboard graphics. Spend something north of a thousand dollars in hard currency (even more costly in my country) just to play Baldur's Gate 3?

This means that I can do almost everything on it except play some games. This is because most recent games require quite a good discrete GPU to be even playable at the lowest settings (e.g., achieving something like 25 fps on low settings at 720p). In fact, I think this is quite a stupid move by game companies, imposing such artificial constraints on which machines can run their games and locking out millions of PCs and thus a similar number of potential users.

Looking at those games, I don't see them as being so especially advanced or better looking at their lower settings as to justify such artificial barriers to entry.

To be clear, I have no problem with games being as graphically advanced as they want. They can have Ultra settings with doubly advanced real-time ray tracing on three parallel RTX 6000s cooled by liquid nitrogen, by all means. But don't put up those stupid gates locking out onboard GPUs, and thus millions of potential players from all over the world.


I have been a KDE user since KDE 1.x on Red Hat Linux 6.2, back in 2000, and have used KDE almost exclusively as my Linux desktop since KDE 2.2. Right now I'm using Plasma 6.4.5.

In all that time, I was quite disappointed to see major distro after major distro (and even Sun Microsystems back in the day) choose GNOME over KDE/Plasma as their default desktop. How could they choose GNOME when KDE/Plasma is/was (in my very subjective opinion) way better? Go figure. Even today, with the exception of the Steam Deck, it's disappointing to see that Plasma is not the default/preferred desktop environment in (almost?) all major distros.

So, it's really refreshing to see posts like these. I like it when someone finally "gets it" and realizes the advantages and potential Plasma offers.

In case you can't use Plasma, I'd recommend (in no particular order) LXQt, Cinnamon, MATE, or Xfce as adequate options. But if you haven't already, try Plasma, and customize it to your heart's content. More often than not, you'll end up liking it quite a bit.


I vaguely remember that the shift to GNOME was because of fears around Qt licensing.


"tender love making" and then the specific URL mention photos and corruption... it could look really bad!


You are welcome not to visit it if you do not want to hear from a Rails contributor of over 10 years.


Of course. I have no issue with that, but I acknowledge that it could look funny.


> But in the end I lost my friends, my colleagues, my job, my career and my family.

As a Plasma user, and having been part of a KDE team for 10 years at one point (in the past), reading this really breaks my heart. I hope Jonathan can find peace and healing.


From the title, I thought the lady had switched to Android. It turned out she wrote about ditching smartphones, not just iPhones.

