Peteragain's comments | Hacker News

The original idea was there in HTML 1.0. The lesson we needed to learn is that some people have worked out that you can't sell tools for doing things if doing them is simple. https://en.wikipedia.org/wiki/Embrace,_extend,_and_extinguis...

Perhaps md is an opportunity to reinvent the web: a browser for just md AND a search engine with an open algorithm that indexes just what is visible.


The funny thing is that HTML was supposed to be a markup language that could be read and written by hand, while also being machine-to-machine friendly - notably by making some "semantic" features accessible to browsers. One of these, for instance, is the structure of the document; marking section headers was supposed to let browsers automatically generate a table of contents. Additionally, CSS was supposed to let users choose how all this was displayed.

All of this failed - or rather, was undone and cancelled by the "modernization" of the Web, namely the arrival of for-profit companies, be it Facebook or the press like the New York Times.

It was a boon, as they brought valuable content, but they brought it with their own rules. The first of these is the ad-supported model, which is by definition the opposite of free content; an ad-supported website is not free, in a very sneaky way, and it's not just about privacy and manipulative practices (targeted ads, as if ads were not already manipulative enough). Users are actively prevented from consuming the content the way they want.

The situation today is that very few browsers offer an out-of-the-box way to apply a personal CSS, and I think none will generate a ToC from the headers of an HTML page.
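
Just to show how little machinery a ToC actually needs: here is a minimal sketch, my own toy in Haskell, where the Header type and the sample headings are purely illustrative and a real browser would of course extract them from the parsed document.

    -- Toy ToC generator: assumes the headers have already been pulled out
    -- of the page as (level, title) pairs. Names and data are illustrative.
    type Header = (Int, String)

    toc :: [Header] -> String
    toc = unlines . map entry
      where
        entry (level, title) =
          replicate (2 * (level - 1)) ' ' ++ "- " ++ title

    main :: IO ()
    main = putStr (toc [ (1, "Intro")
                       , (2, "History")
                       , (2, "Semantics")
                       , (1, "Conclusion") ])

The point is that the hard part (knowing what the sections are) was already done by the author's markup; rendering the ToC is a one-liner.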

And the "semantic" part - far from specialized and more accurate semantic markup frameworks that were considered - is being completely taken over by LLMs; an insanely expensive brute-force solution IMHO.

The web has already been reinvented mostly the way you suggest - see for instance the Gopher and Gemini protocols - but they'll stay forever "niche" networks. Which might not be so bad, as it is very clear that the Web is full of actors malicious to various degrees. Tranquility by obscurity?


I used Gopher before Mosaic! And yes, the issue is not the tech but the social engineering of a community. Git(hub) has a community; IMHO GitHub users need to put more cool things on there, like blogs.. perhaps..

Did people here see the Cory Doctorow piece a few days back about the DMCA and "Article 6 of the 2001 EU Copyright Directive"? https://pluralistic.net/2026/01/01/39c3/#the-new-coalition Basically, without open source the rest of the world will have to keep paying a tithe to American companies, and hence, in effect, a tax to the USA.

The point is the opportunity created by Trump's tariff policy. If someone says "do what I want or I'll burn your house down" and then burns your house down anyway, you no longer need to do what is demanded. An opportunity has appeared.

My great-great-granddad carted telegraph poles for the construction of the southern half of that! Family oral history.

There are two things happening here: a really small LLM mechanism, which is useful for thinking about how the big ones work, and a reference to the well-known phenomenon, commonly and dismissively referred to as a "trick", in which humans want to believe. We work hard to account for what our conversational partner says. Language in use is a collective cultural construct. On this view, the real question is how and why we humans understand an utterance in a particular way. Eliza, Parry, and the Chomskybot at http://chomskybot.com work on this principle. Just sayin'.
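
For anyone who hasn't seen the mechanism up close, here is a minimal ELIZA-flavoured sketch - my own toy in Haskell, with keyword/response pairs that are illustrative, not Weizenbaum's actual script:

    -- Toy ELIZA: find a keyword in the input, emit a canned reflection.
    -- The point is how little is needed before humans start cooperating.
    import Data.Char (toLower)
    import Data.List (find, isInfixOf)

    rules :: [(String, String)]
    rules =
      [ ("mother",  "Tell me more about your family.")
      , ("always",  "Can you think of a specific example?")
      , ("because", "Is that the real reason?")
      ]

    reply :: String -> String
    reply input =
      case find (\(kw, _) -> kw `isInfixOf` map toLower input) rules of
        Just (_, response) -> response
        Nothing            -> "Please go on."

    main :: IO ()
    main = interact (unlines . map reply . lines)

All the heavy lifting - coherence, relevance, intent - is supplied by the human reading the output.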

MAYBE

Universally correct reply, although honestly a bit vague.

Fair. The background reading is the EMCA stuff: conversation analysis (cf. Sacks et al.) and ethnomethodology (Garfinkel). And Vygotsky, cf. Kozulin. People such as Robert Moore at IBM and Lemon at Heriot-Watt work in this space, but there is no critical mass in the face of LLM mania.

And the Chomskybot analysis is quite enlightening..

I think it will absorb at least as many hours as doom scrolling, and be much better for me.


Programming languages were originally designed by mathematicians, based on the Turing machine. A modern language for FPGAs is a challenge for theoretical computer science, but we should keep merely computer-literate researchers away from it. This is a call-out to hard-core maths heads to think about how we should think about parallelism and about what FPGA hardware can do.


https://clash-lang.org/ - we've already done the research! Circuits are just functional programming (the vast majority of the time).
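
For a taste of what that looks like, here is a minimal free-running 8-bit counter, sketched after the style of the Clash tutorial examples (the naming is mine, so treat it as an illustration rather than canonical code):

    -- 'register' is a clocked flip-flop; the feedback loop is just
    -- ordinary recursion on Signals. Clash compiles this to VHDL/Verilog.
    import Clash.Prelude

    counter :: HiddenClockResetEnable dom => Signal dom (Unsigned 8)
    counter = register 0 (counter + 1)

    topEntity
      :: Clock System -> Reset System -> Enable System
      -> Signal System (Unsigned 8)
    topEntity = exposeClockResetEnable counter

The circuit structure falls straight out of the functional description: no separate "hardware thinking" step was needed for this fragment.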

We just need the toolchains to be opened up.


The real reason it won't end up in a park is not the engineering. The problem is the same as with NPCs in computer games: synthetic characters are, to date, just really transparent and boring. The real research question is why.


Every single non-face character in Disney parks doesn't even talk to guests?


I guess that's why most computer games don't have NPCs... Oh wait, there are entire computer games built entirely around interacting with synthetic NPCs.

There are, of course, limitations to synthetic characters. Even with those limitations, there are plenty of entertaining experiences to be crafted.

The real challenges are around maintaining and safely operating autonomous robots around children in a way that isn't too expensive. These constraints impose far more limits than anything faced by synthetic characters in video games.


Most people aren't paying hundreds or thousands of dollars to interact with NPCs in video games. If they were, they'd probably expect a lot more and get bored of it more quickly.

> The real challenges are around maintaining and safely operating autonomous robots around children in a way that isn't too expensive.

This is one of the challenges, but only one. The one GP outlined is still very much real - see the Defunctland video on Living Characters for some older examples, but for a recent example, there's the DS-09 droid from Galactic Starcruiser.


Exactly! My claim is that "glorified autocomplete" is far more useful than it seems. In GOFAI terms, it does case-based reasoning.. but better.


I call it Clippy's revengeance


Clippy 2: Clippy Goes Nuclear

But more seriously, this is ELIZA with network effects. Credulous multitudes chatting with a system that they believe is sentient.


There is an argument, perhaps no longer PC, that the indigenous population used fire to hunt, and so burnt off regularly. Fires these days are indeed devastating because we try to stop them. Established eucalyptus trees also thrive after a scrub fire; a "devastating" fire kills them.


Cultural burning is pretty much the current accepted understanding of how Australian indigenous people managed the land prior to colonisation.

https://study.unimelb.edu.au/student-life/inside-melbourne/c...

https://en.wikipedia.org/wiki/Fire-stick_farming

Just want to reassure you that it is not at all "no longer PC". If anything, the practice was banned by the colonisers, only for it to be reintroduced more recently.


Thanks for the confirmation!

