chpatrick's comments | Hacker News

Modern heat pump dryers also work at a lower temperature because they cool the air to condense out the moisture, so they don't need to be as hot to start with.

I was about to write this. Heat pump dryers take a little longer, but they are so much gentler on clothes.

In Hungary it gets sorted out locally. We also recently implemented a bottle return system that (although it's annoying) produces clean stacks of PET, aluminium and glass, all of which are recyclable.

Even with PET, arguably the most recyclable plastic, most of it doesn't go bottle-to-bottle but rather bottle-to-textile. Most PET "recycling" doesn't close the loop, so it's dubious to even call it recycling. That said, some bottle-to-bottle recycling of PET is done, and this has been getting better.

This has been happening for decades in the UK and the US regardless of the regime.



Do you have a link about the case you're quoting? I can't find any reference to it.


So I guess my next question is, why are all your recent comments saying things that are obviously and unambiguously not true? These things are all trivial to check, and it's not like nobody is calling you out on it. I don't get what's in it for you.

There's a version of this where you make your case (which IMO is, at its core, based on reasonable concerns) without relying on obviously untrue statements. Why not try that?


Clarify which statement is "not true".

Unlike the time the US banned millions of people on the basis of religion, pure socialism.

Yeah keep making stuff up bro.

I think that definition has been pretty obsolete for the last 20 years.

To me "AI" is machine learning, statistical algorithms trained on data. That's not true for Lean.


So basically anything we don’t know how to write an algorithm for? I see where you’re coming from, but at the same time that’s a classic AI meme and it smells of permanently moving goalposts.

Consumer local AI? Maybe.

On the other hand everyone non-technical I know under 40 uses LLMs and my 74 year old dad just started using ChatGPT.

You could use a search engine and hope someone answered a close enough question (and wade through the SEO slop), or just get an AI to actually help you.


I've been buying LPs after concerts just to have a nice souvenir; I can always listen to them on Spotify. I only just got a turntable this Christmas and it's cool to actually listen to them.



I am aware, but Python has that by default. In JavaScript it's opt-in and less ergonomic. E.g. try loading a 64-bit integer from JSON.
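
To make that concrete, a rough sketch (the "id" field and the regex workaround are just illustrative):

    // Hypothetical payload with a 64-bit id (2^63 - 1), too big for a double.
    const payload = '{"id": 9223372036854775807}';

    // JSON.parse maps every number to a 64-bit float, so the id is silently rounded.
    const parsed = JSON.parse(payload);
    console.log(parsed.id); // 9223372036854776000, not the original value

    // The workarounds are opt-in: send the id as a string, or pull the raw
    // digits back out of the text and convert them to a BigInt yourself.
    const asBigInt = BigInt(payload.match(/"id":\s*(\d+)/)![1]);
    console.log(asBigInt); // 9223372036854775807n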


I agree, but bigints are missing from json because the json spec defines all numbers as 64 bit floats. Any other kind of number in JSON is nonstandard.

JavaScript itself supports bigint literals just fine. Just put an ‘n’ after your number literal. Eg 0xffffffffffffffn.

There’s a whole bunch of features I wish we could go in and add to JSON. Like comments, binary blobs, dates and integers / bigints. It would be so much nicer to work with if it had that stuff.
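
For what it's worth, a quick sketch of the mismatch (run under Node, so the exact error message is V8's):

    // BigInt literals are fine in the language itself:
    const big = 0xffffffffffffffn;   // 72057594037927935n, past Number.MAX_SAFE_INTEGER
    console.log(big * 2n);           // 144115188075855870n, exact

    // ...but the built-in JSON layer doesn't know what to do with them:
    try {
      JSON.stringify({ big });
    } catch (e) {
      console.log((e as Error).message); // "Do not know how to serialize a BigInt"
    }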


> the json spec defines all numbers as 64 bit floats

It absolutely doesn't. It doesn't impose any limits on number precision or magnitude.
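
A quick way to see that it's the consuming parser, not the grammar, that imposes the limit:

    // Valid JSON text per the grammar: RFC 8259 puts no bound on a number's
    // precision or magnitude. JavaScript's JSON.parse accepts it, it just maps
    // the number onto the nearest 64-bit float.
    const text = '{"n": 123456789012345678901234567890}';
    console.log(JSON.parse(text).n); // ~1.2345678901234568e+29, no error, just rounding

    // A parser in a language with arbitrary-precision integers can return the
    // exact value from the same bytes.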


Its type system is miles better than Python's, and it has some basic stuff Python doesn't have, like block scope. Functional programming is also intentionally kind of a pain in Python with the limited lambdas.

If TypeScript had the awesome Python stdlib and the NumPy/ML ecosystem I would use it over Python in a heartbeat.
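
As a small sketch of both points (made-up variable names, obviously):

    // Block scope: let/const are confined to the enclosing block, rather than
    // leaking into the whole function the way loop variables do in Python.
    for (let i = 0; i < 3; i++) {
      const squared = i * i;
      console.log(squared);
    }
    // console.log(i); // compile error: `i` only exists inside the loop

    // Multi-statement anonymous functions: a Python lambda is limited to a
    // single expression, an arrow function isn't.
    const labelled = [3, 1, 2].map((n) => {
      const big = n > 1;
      return { n, big };
    });
    console.log(labelled); // [ { n: 3, big: true }, { n: 1, big: false }, { n: 2, big: true } ]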


TypeScript also has significantly better performance. This is largely thanks to the browser wars funnelling an insane amount of engineering effort toward JavaScript engines over the last couple of decades. Node.js runs V8, the JavaScript engine used by Chrome, and Bun uses JSC, written for Safari.

For IO-bound tasks, it also helps that JavaScript has a much simpler threading model. And it ships an event-based IO system out of the box.


You can define a named closure in Python; I do it from time to time, though it does seem to surprise others sometimes. I think maybe it's not too common.


I know, it's just very unergonomic.


I went to a top university and most people's mental health was definitely in the gutter, so I think it's worth talking about.


Because FFmpeg is a Swiss Army knife with a million blades and I don't think any easy interface is really going to do the job well. It's a great LLM use case.


I know everybody uses a subscription for these things, but doesn't it at least feel expensive to use an LLM like this? Like turning on the oven to heat up a single slice of pizza.


No, LLMs are extremely useful for dealing with ffmpeg. Also, I don't think they're sufficient; they get confused too easily and ffmpeg is extremely confusing.


ChatGPT's free tier is just fine for me.


Nice


But you only need to find the correct tool once and mark it in some way, i.e. write a wrapper script or jot down some notes. You are acting like you’re forced to use the CLI each time.


One can do that with an LLM as well. Honestly, I almost always just save the command if I think I am going to use it later. Also, I can just look back at the chat history.

