Hacker News | scandum's comments

For all practical purposes, it is xterm that has become the standard.

Like with web browsers, it is irrelevant what the standard states. The only thing that matters is that it displays correctly with the most popular / authoritative browser.


There are several different xterms. As in, really different, with totally different F-keys and arrows and colors etc., even while still being largely VT-like, but they all call themselves the same TERM=xterm by default.

xterm is not a standard.


> There are several different xterms.

Based on my understanding of the situation this claim seems to perhaps be... a little lacking in nuance? :)

In a different context perhaps we'd be talking about "genericization of trademarks" but, at least as a term *cough* referring to a specific software application, well, "the" "xterm" docs[0] think there is "only" "one":

* "xterm (pronounced 'eks-term') is a specific program, not a generic item. It is the standard X terminal emulator program."

Then again, even with a narrow definition of "the standard X terminal emulator program", the xterm moniker can apparently apply to two "implementations"[1], variously[1][2] referred to as "xterm-new" (a.k.a "based on XFree86 xterm" a.k.a. "modern" a.k.a "based on the X11R6.3 xterm, with the addition of ANSI color and VT220 controls") and "xterm-old" (a.k.a "the X11R6 xterm" a.k.a. "xterm-r6").

And, indeed, the docs do use the phrase "other versions of xterm"[3] in reference to:

* other programs "based on xterm's source"

However, it also distinguishes between those programs and programs in two further categories:

* "similar programs not based on xterm's source, which are compatible to different degrees"

* "programs which set TERM to 'xterm', in the hope that applications will treat them the same as xterm" (which apparently includes some whose developers think that by "[...] copying from xterm [source code], they are entitled to do this whether or not the program actually matches xterm's terminal description").

Which segues nicely into...

> they all call themselves the same TERM=xterm by default

A habit that the main developer of xterm since 1996 appears to have, um, some additional thoughts about: :D

* https://invisible-island.net/ncurses/ncurses.faq.html#xterm_...

* https://invisible-island.net/xterm/xterm.faq.html#xterm_term...

Hey, maybe we should replace `TERM=` with `USER_AGENT="$TERM (Compatible with xterm)"`, cos that worked out great on the web! :)

Now, some people may respond "No, we should be using feature detection!", to which I might respond, "Hey, yeah, like, we should have a way to find out what features or capabilities the current terminal emulator supports!".

Problem solved!

Guess the only question now is whether we should call it "The Terminal Capabilities System" or "The Terminal Information System"...

...wait, no, those names don't sound... "unixy" enough.

Let's call it "terminfo". Yeah, let's use "terminfo"!

To which, I imagine, at least one person might reply with a sigh, "I wish you would":

* https://invisible-island.net/xterm/terminfo-contents.html#ti...

* https://invisible-island.net/xterm/terminfo-contents.html#ti...

* https://invisible-island.net/xterm/terminfo-contents.html#ti...

* etc

Wait, so, why doesn't this system "work" then? Well, turns out, apparently it's "easier" to just tell someone to set `TERM` to a value which might charitably be called "aspirational" and hope it all works out.

(Spoiler: it doesn't[4].)
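
(For what it's worth, actually querying terminfo isn't exactly rocket science, either. A minimal sketch in C, assuming the ncurses terminfo API and a link against -lncurses; "colors" and "bce" are standard terminfo capability names:)

    /* caps.c - ask terminfo what the terminal can do, instead of
       guessing from the value of $TERM. Build: cc caps.c -lncurses */
    #include <stdio.h>
    #include <curses.h>
    #include <term.h>

    int main(void)
    {
        int err;

        /* load the terminfo entry named by $TERM */
        if (setupterm(NULL, 1, &err) != OK)
        {
            fprintf(stderr, "no terminfo entry for $TERM\n");
            return 1;
        }

        /* query capabilities rather than trusting the name */
        printf("colors: %d\n", tigetnum("colors"));
        printf("back color erase: %s\n", tigetflag("bce") ? "yes" : "no");

        return 0;
    }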

> xterm is not a standard.

If xterm ain't a standard then why everyone always try to bite its style? :) (Or, whatever the kids say these days...)

So, yes, in one sense, "xterm" is not a standard, it's one specific program, with a specific set of behaviours.

In another sense, well, while some people set out to write a program to emulate "DEC VT100" terminal behaviour in a compatible manner, other people very much do set out to write a program to emulate "xterm" terminal behaviour in a compatible manner--which suggests the behaviour of "the xterm" is at least a "de facto" standard.

And the problem comes when those developers claim their implementation is so accurate as to be "indistinguishable" from "the real xterm", so much so that they set `TERM=xterm`, when that's, umm, definitely not the case[5][4].

Heck, even xterm itself isn't so brazen as to set, say, `TERM=vt220`! :)

The good news is that if setting `TERM` to a realistic rather than "aspirational" value is too hard, the author of `xterm` even provides a tool that terminal emulator developers can use to validate & improve some of their compatibility with the "one & only, definitely not a de facto standard, xterm": https://www.invisible-island.net/vttest/

And, personally, I think that when all those terminal users out there ask for "xterm", it's not unreasonable for them to expect to get the Real Thing(TM) and not a knock-off "we've got xterm at home". :)

But, hey, I probably only first used the xterm in 1994 at the earliest so I could easily be missing some relevant historical nuance. :D

---- faq ----

q: Did this comment really need to be this long?

a: No. But, you know, thanks for reading it. :)

---- footnotes ----

[0] https://invisible-island.net/xterm/xterm.faq.html#what_is_it

[1] https://invisible-island.net/ncurses/ncurses.faq.html#no_xte...

[2] https://invisible-island.net/xterm/xterm.faq.html#forward_hi...

[3] https://invisible-island.net/xterm/xterm.faq.html#other_vers...

[4] https://invisible-island.net/ncurses/ncurses.faq.html#xterm_...

[5] https://invisible-island.net/xterm/xterm.faq.html#compare_ve...


xterm, the term erroneously used to refer to the definition of a particular terminal, is used by several different implementations going back decades and spanning many platforms.

Those implementations not only all assert the exact same "TERM=xterm" by default, many of them even name the executable the same "xterm", yet they do not all adhere to any single common standard. The SCO OSR5 xterm is different from old versions of XFree86 xterm, which is different from the current Xorg xterm, etc. There is some overlap, but not nearly enough. The differences are not merely small superset/subset feature differences like 256-color support; they include, as I did say, totally different F-keys and arrows, for example.

That is what "xterm is not a standard" means.

It's a word that refers to a range of different definitions that depend on context, not a single specific definition that always means the same thing in any context, and so is not a meaningful definition, or standard.


With browsers, it used to be irrelevant what the standard stated. But the standards these days are actually useful, and where implementations differ from the standards, one or the other is changed.


Web pages are increasingly bulky. A 3 MB page will take 1 second to load at 25 Mbps, so latency is often not the primary bottleneck.
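
(Back of the envelope: 3 MB × 8 bits/byte = 24 Mbit, and 24 Mbit ÷ 25 Mbps ≈ 0.96 s.)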

Part of the problem may be that companies who own network infrastructure, and get paid for data usage, are also the ones that are the largest content providers.

This also comes with an electricity cost. We regulate efficiency for refrigerators; it might be time to add some sane limits to the largest content providers, which will also improve connectivity for those stuck with 2 Mbps.


Latency is important because that 3 MB isn't one 3 MB transfer; it's a tree of dependencies that can't be completely parallelised.

On high-latency Internet with otherwise OK speed, pages of "only" 3 MB can take several seconds to load.
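
(Rough arithmetic: at a 300 ms round trip, a dependency chain just five requests deep costs 1.5 s in round trips alone, no matter how fat the pipe is.)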


Am I the only person who sees this as a webdev problem and not an ISP problem? I don't want my website to recursively load the entire freaking internet with Russian-nesting-doll dependency loading.

When did we stop caring about optimization and good technical design?



Which commonly used webpage is 3 MB on return visits, excluding the images? Figma has everything, and it's 317 KB transferred for a small design on a return visit. Most of the content is cached.


Why would you exclude images?


Because they don't contribute to latency. I can start reading the content during the second they would take to load.


Even if it's a thing, the idea that a teacher should take on the role of a psychologist and figure out each child's optimal learning style should be preposterous to anyone with common sense.

It might work in combination with a software suite; with e-readers getting cheaper and more capable, and easily lasting through a school day, it's an option.


Ah yes, we can sort kids into humanities and sciences, by age, by ability, into gymnasiums and normal schools, but sorting visual learners from other ones is too hard.

We can only separate kids into groups when it serves administrative or industrial benefits; when it's for the benefit of the child, forget it.

I mean, how can you expect a ridiculously expensive system, where children spend the majority of their waking hours, to actually ask a child if it suits them?


I agree burdening the teachers with individualized instruction is too much. But students learning about themselves seems like a good thing.

My only point is that I believe the situation has not been studied well, and it should be.


I think you pointed out the problem.

I suspect it has the same problem as ncurses: Not easy to learn, get proficient with, or rapidly produce something decent with.

I've been working on a project for a while that allows running any console program within it and using text, keyboard, or mouse triggers to add a VT100 TUI, but it's uncomfortable to learn, very hard to get proficient with, and while it's possible to rapidly produce something decent, you're going to hit a hard wall producing something excellent.


Seems like speculation on top of speculation on top of speculation.

I wonder what the archaeological data looks like: six burned-down houses and a very active imagination?


Why don't you look at the size of the area they're talking about, where these have been observed? And then, perhaps, try the process we like to refer to as "thought".


Isn't the problem the absence of random DNA?

I wouldn't call random data 'complex', but it is easy to sequence when assembling short reads.


My main take away from the article was that we, apparently, need a time traveler to confirm/disprove Einstein's theory.


There's likely a market for office workers since there's a significant reduction of stress on the eyes.

You'd save about 0.5 kWh a day in electricity, more if the AC is running. So I could see them becoming popular once the price comes down. People who run 2 monitors might be interested as well.
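
(Back of the envelope: a typical ~60 W LCD monitor over an 8-hour workday is 60 W × 8 h = 0.48 kWh.)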


> There's likely a market for office workers since there's a significant reduction of stress on the eyes.

There may be, but some studies have shown there's little difference.

https://pubmed.ncbi.nlm.nih.gov/22762257/

Methods: Participants read for several hours on either e-Ink or LCD, and different measures of reading behaviour and visual strain were regularly recorded. These dependent measures included subjective (visual) fatigue, a letter search task, reading speed, oculomotor behaviour and the pupillary light reflex.

Results: Results suggested that reading on the two display types is very similar in terms of both subjective and objective measures.


There is a significant reduction in color depth and refresh rate, plus a need to provide external lighting to make up for the lack of built-in lighting, which, depending on the office environment, may cancel out any power savings.


> lack of built-in lighting

That depends on the environment. Natural light fights against the functioning of light-emitting displays. Reflective displays cooperate with natural light.

You in the dark? Use OLED. Under the sun? Use EPD.


I don't know too many people operating desktop computers under the sun. They are usually situated indoors, often far from any window. Anyway, the point stands that additional lighting discounts any power savings from using e-ink in a desktop environment. I get the use case for e-ink in the field. This product is not for in-field use.


> usually

HN is nth-standard-deviation sensitive. There is always a market for some.

> additional lighting discounts any power savings

Not necessarily. Lighting today can cost fractions of a watt, and on the other hand EPD can be energy costly - it depends on how many cell updates you are causing. So the matter is probably less about energy consumption, and more about getting a better effect based on user and environment.

> This product is not for in-field use

Sure, it does not seem specific. But it can have its places. Be it some production site (maybe a quarry near the tropics), be it personal (maybe you want to do some work in your garden)...


As Kindle devices have shown, though, you can backlight an e-ink display too.


This is technically a frontlight. The panel contains a diffuser and has LEDs at all edges of the display.


> you can backlight an e-ink display too

And you could read OLED at noon in the tropics under an umbrella (I know, I did) - but I was talking precisely about optimal use.

--

A good scientist can file with a saw and saw with a file

~~~ Ben Franklin

A wise guy does not, unless necessary

~~~ We, here and now


MUDs are easily hosted abroad. Problem solved.

This reminds me of the UK's 90% tax rate, which caused many multi-millionaires to leave the country.


Could you please provide evidence of a 90% tax rate? I don't believe you.


It ended 51 years ago in 1972, but there was indeed a top rate of income tax called "super-tax" or "surtax" (only for very high earners) of 90% for a few decades:

https://en.wikipedia.org/wiki/History_of_taxation_in_the_Uni...

"The highest rate of income tax peaked in the Second World War at 99.25%. It was then slightly reduced and was around 90% through the 1950s and 60s.[citation needed]

In 1971 the top rate of income tax on earned income was cut to 75%. A surcharge of 15% kept the top rate on investment income at 90%. In 1974 the cut was partly reversed and the top rate on earned income was raised to 83%. With the investment income surcharge this raised the top rate on investment income to 98%, the highest permanent rate since the war.[14] This applied to incomes over £20,000 (£221,741 as of 2021).

The Government of Margaret Thatcher, who favoured taxation on consumption, reduced personal income tax rates during the 1980s in favour of indirect taxation. In the first budget after her election victory in 1979, the top rate was reduced from 83% to 60% and the basic rate from 33% to 30%."

According to the history linked below, the super-tax/surtax was started in 1909, taxed only the top 0.05% of earners, and caused a constitutional crisis as the budget was rejected by the House of Lords:

https://www.nuff.ox.ac.uk/Economics/History/Paper43/43atkins...


1909 - 1972 UK was a global superpower.

1972 - 2023 UK is powerless and poor.

I don't like 99% tax rates, and correlation is not causation, but the decline is real.


I get the current time in microseconds, and increment by one nanosecond for each call between updates.

Problem solved for the next 200 years.
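
A minimal sketch of that idea in C, assuming POSIX clock_gettime and a single thread:

    /* Return a unique, monotonically increasing "nanosecond" timestamp:
       the real time truncated to microseconds, with the spare nanosecond
       digits used as a counter to break ties between calls. */
    #include <stdint.h>
    #include <time.h>

    int64_t unique_ns(void)
    {
        static int64_t last;
        struct timespec ts;
        int64_t now;

        clock_gettime(CLOCK_REALTIME, &ts);

        /* truncate to microseconds, then scale back up to nanoseconds */
        now = ((int64_t) ts.tv_sec * 1000000 + ts.tv_nsec / 1000) * 1000;

        if (now <= last)
        {
            now = last + 1; /* fake nanoseconds fill the gap */
        }
        last = now;

        return now; /* signed 64-bit nanoseconds last until the year 2262 */
    }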


Next article: Picosecond timestamp collisions are common.


Problem solved until your boss calls you an idiot for storing fake nanosecond precisions.

