Hacker News | BoredomIsFun's comments

I think for Unix-likes, a good old TUI-based debugger like Turbo Debugger would be very useful.

Very convenient to use LLMs for that "please add debug fprintf(stderr, {print var x y here})" request, and the follow-up "please comment out the debug fprintfs".
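A minimal sketch of the kind of throwaway tracing I mean (the function and variables are invented for illustration):

    #include <cstdio>

    int compute(int x, int y) {
        fprintf(stderr, "DEBUG compute: x=%d y=%d\n", x, y);    // LLM-added trace
        int result = x * y + 1;                                 // hypothetical logic
        fprintf(stderr, "DEBUG compute: result=%d\n", result);  // LLM-added trace
        return result;
    }

The second prompt then just comments those fprintf lines back out.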

I've always been sceptical of the modern tendency of throwing powerful hardware at every embedded project. In most cases a good old Atmel AVR or even an 8051 would suffice.

I think I used to have that view as well, and in a way still do, but this particular project proved otherwise.

The first version was built pretty much that way, with a tiny microcontroller and extremely optimized code. The problem then became that it was very hard to iterate quickly on it and prototype new features. Every new piece of hardware that was added (or just evaluated) would have to be carefully integrated, and it really added to the mess. Maybe it would have been different if the code had been structured with more care from the get-go, who knows (I joined the project only at version 2).

For version 2, the microcontroller was thrown out, and Raspberry Pi based solutions were brought in. Sure, it felt like carrying a shotgun to fire at a couple of flies, but having a Linux machine with such a vast ecosystem was amazing. On top of that, it was much easier to hire people to work on the project because now they could get by with higher-level languages like Python and JavaScript. And it was much, much, much faster to develop on.

The usage of the Raspberry Pi was, in my view, one of the key details that allowed for what ultimately became an extremely successful product. It was much less energy-efficient, but it was very simple to develop and iterate on. In the span of months we experimented with many hardware add-ons, as product-market fit was still being sought, and the plethora of online resources for everything else was a boon.

I'm pretty sure this was _the_ project that really made me realize that more often than not the right solution is the one that lets the right people make the right decisions. And for that particular team, this was, without a doubt, a remarkably successful decision. Most of the problems that typically come with it (such as bloat and inefficiency) were eventually solved, something which would not have been possible by going slowly at first.


> but it really does seem that trying to be the language for all possible and potential architectures might not be the right play for C++ in 202x.

Portability was always a selling point of C++. I'd personally advise those who find it uncomfortable to choose a different PL, perhaps Rust.


> Portability was always a selling point of C++.

Judging by the lack of modern C++ in these crufty embedded compilers, maybe modern C++ is throwing too much good effort after bad. C++03 isn't going away, and it's not like these compilers always stuck to the standard anyway in terms of runtime type information, exceptions, and full template support.

Besides, I would argue that the selling point of C++ wasn't portability per se, but the fact that it was largely compatible with existing C codebases. It was embrace, extend, extinguish in language form.


> Judging by the lack of modern C++ in these crufty embedded compilers,

Being conservative with features and deliberately not implementing them are two different things. Some embedded compilers go through certification to be allowed for use in producing mission-critical software. Chasing features is prohibitively expensive, for no obvious benefit. I'd bet that in the 2030s most embedded compilers will support C++14 or even C++17. Good enough for me.


> Being conservative with features and deliberately not implementing them are two different things.

There is no version of the C++ standard that lacks features like exceptions, RTTI, and fully functional templates.
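To make that concrete, here's a minimal sketch: every line below is plain standard C++ (any version of the standard), yet a toolchain configured like GCC/Clang with -fno-exceptions and -fno-rtti, which is the typical embedded setup, will reject it:

    #include <stdexcept>
    #include <typeinfo>

    struct Base { virtual ~Base() = default; };
    struct Derived : Base {};

    // RTTI: typeid on a polymorphic type needs runtime type info enabled
    const char* dynamic_name(const Base& b) { return typeid(b).name(); }

    // Exceptions: throw needs unwinding support in the runtime
    int checked_div(int a, int b) {
        if (b == 0) throw std::invalid_argument("division by zero");
        return a / b;
    }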

If the compiler isn't implementing all of a particular standard then it's not standard C++. If an implementation has no interest in standard C++, why give it a seat at the table in the first place? Those implementations can continue on with their C++ fork without mandating requirements for anyone else.


> If the compiler isn't implementing all of a particular standard then it's not standard C++.

C++ has historically been driven by practicalities, and implementations have violated the standard on a regular basis when it was deemed useful.

> Those implementations can continue on with their C++ fork without mandating requirements for anyone else.

Then they will diverge too much, as happened with countless other languages, like Lisp.


> Then they will diverge too much, as happened with countless other languages, like Lisp.

Forgive me if I am unconvinced that the existence of DSP-friendly dialects of C++ will cause the kinds of language fracturing that befell Lisp.

DSP workloads are relatively rare compared to the other kinds of workloads C++ is tasked with, and even in those instances a lot of DSP work is starting to be done on more traditional architectures like ARM Cortex-M.


How about autocomplete with LLMs? Should it be disclosed too? (scratching my balding head).

Nah, this is just rhetorical polemic.

> excellent show "Halt and Catch Fire".

I found it very caricatural, too saturated with romance, which is atypical for a tech environment, much like "The Big Bang Theory".


It's still very good, I'd say. It shows the relation between big oil and tech: it began in Texas (with companies like Texas Instruments) then shifted to SV (btw, the first 3D demo I saw on an SGI, running in real time, was a 3D model of... an oil rig). As it spans many years, it shows the Commodore 64, the BBSes, time-sharing, the PC clone wars, the discovery of the Internet, the nascent VC industry, etc.

Everything is period-correct, the clothes and cars too: it's all very well done.

Is there a bit too much romance? Maybe. But it's still worth a watch.


I never really could get into the Cameron/Joe romance; it felt like it was initially inserted to get sexy people doing sexy things onto the show, and then had to become a star-crossed lovers thing after character tweaks in season 2.

But when they changed the characters into passionate, stubborn people who eventually started to cling to each other as they rode the whirlwind of change together, the show really found its footing for me. And they did so without throwing away the events of season 1, instead having the 'takers' go on redemption arcs.

My only real complaint after re-watching was that it needed maybe another half season. I think the show should have ended with the .com bust, and I didn't like that Joe sort of ran away when it was clear he'd attached himself to the group as his family by the end of the show.


IMO it really came into its own after the first season. S1 felt like Mad Men but with computers, whereas in the latter seasons it focused more on the characters - quite beautiful and sad at times.

I vaguely remember that they tried to reboot it several times. So the same crew invented personal computers, BBSes and the Internet (or something like that), but every time they started from being underfunded unknowns. They really tried to make the series work.

That's not really what happens at all. The characters on the show never make the critical discoveries or are responsible for the major breakthroughs; they're competing in markets that they ultimately cannot win in, because while the show is fictional, it also follows real computing history.

(MILD SPOILERS FOLLOW)

For example, in the first season, the characters we follow are not inventing the PC - that has been done already. They're one of many companies making an IBM clone, and they are modestly successful but not remarkably so. At the end of the season, one of the characters sees the Apple Macintosh and realizes that everything he had done was a waste of time (from his perspective, he wanted to change the history of computers, not just make a bundle of cash); he wasn't actually inventing the future, he just thought he was. They also don't really start from being underfunded unknowns in each season - the characters find themselves in new situations based on their past experiences in ways that feel true to real life.


The BBC made a docudrama, Micro Men, with Alexander Armstrong as Clive Sinclair and Martin Freeman as Chris Curry.

Sophie Wilson cameos when they have a fight.


IMO the first series was excellent; the 2nd took a massive downturn and I stopped watching after that.

Not sure about that - a local 12B-32B LLM consumes a minuscule amount of energy compared to gaming on the same hardware.
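Back-of-the-envelope, with illustrative numbers rather than benchmarks: generating a 500-token answer at 40 t/s keeps the GPU under load for about 12.5 seconds, versus hours of sustained near-peak draw in a typical gaming session.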

> What you're suggesting is as if you could load Linux kernel modules into the FreeBSD kernel.

AFAIK, you partially can.


I'd stay away from ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.

But you can't just switch between installed models like in ollama, can you?


Performance of LLM inference consists of two independent metrics: prompt processing (compute-intensive) and token generation (bandwidth-intensive). For autocomplete with a 1.5B model you can get away with abysmal 10 t/s token generation performance, but you'd want prompt processing to be as fast as possible, which the Pi is incapable of.
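To put numbers on it (illustrative figures, not benchmarks): a 2,000-token editor context processed at 25 t/s means an 80-second wait before the first suggested token appears, while generating a 30-token completion at 10 t/s adds only 3 more seconds. Prompt processing dominates the latency you actually feel.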
