Some (the worst) clocks do that.. It's convenient that the hour hand moves continuously, because it means that unless you need to be able to say "it's five seconds past two minutes past four _in the morning_", you simply look at the hour hand: if it's midway between two hours, well, it's half past the smaller.. if it's one fourth past the smaller, it's.. yes, quarter past.. if it's one fourth before the larger, it's quarter to.. And honestly, if you need to read the time more precisely than that and choose to use an analogue clock for it, you've chosen the wrong type of clock; a digital clock with seconds and a 24-hour display is a superior tool for telling the time anyway.
When I was a kid, before kindergarten, I remember my parents beginning to teach me how to read an analogue clock.. But this was the late 80s, maybe 1990, and these things called digital clocks were already around.. And I absolutely refused to learn that old-fashioned shit when I was staring right at the objectively better solution.. My reasoning was that the old clocks would either be replaced by digital clocks within a short time, or would be replaced when they broke (5-year-old me didn't grasp that people would continue buying the obviously inferior product until this very day). Honestly, I'm still a bit perplexed by the fact that one can buy an analogue clock today.. It's objectively inferior in every way.. Most of them don't even do 24 hours, which is the number of hours we have in a day, leading some idiots to refer to 18:00 as "six o'clock", and other idiots (like myself) to have to ask EVERY TIME someone tells me a time that's less than or equal to 12.. fuck that shit.
Yeah, I learned how to read inferior clocks, but.. I don't see the point.
So no, it's not that those students can't read a clock, they just can't read an analogue one, because they probably need to about as often as they need to read an octal clock, or a binary LED clock, or a 24-hour dial clock, or Chinese..
Years ago I once wasted two hours by arriving at 07:00 instead of 19:00. It was the time we were going to leave for a trip, so 07:00 and 19:00 were both plausible. The AM/PM stuff is ridiculous, especially when people don't specify it, and most people don't even bother saying "AM" or "PM", if the distinction even exists in their language or culture. 24-hour time is obviously better.
I think analog clocks are mostly for old people who don't like change, for people nostalgic for the past, and for people who think it makes them seem better, smarter, fancier, or classier somehow, especially with expensive mechanical analog watches.
I've thought a lot about law-as-code, but my conclusion is always that bad actors would gain an advantage by being able to brute-force the code until they find a way to get away with whatever obviously-immoral-harmful stuff they want (imagine giga-corps spending a few million on hardware to brute-force tax law - the ROI is probably even better than tunneling through mountains to grab stonks first..).
In the end it reminds me of a quote by Edmund Burke:
"Bad men obey the law only out of fear of punishment; good men obey it out of conscience - and thus good men are often restrained by it, while bad men find ways around it."
I'm wondering if it might be impossible to write a law that prevents the spirit of what we want it to prevent while not also preventing the spirit of what we don't want to prevent. :)
I recently used it to boot a ~1996 Compaq Presario from CD-ROM to image the hard drive to a USB stick before wiping it for my retro-computing fun :)
It's kind of sad to hear "adult" people claim in all seriousness that it's reasonable for a kernel alone to use more memory than the minimum requirement for running Windows 95: an entire operating system with kernel, drivers, a graphical user interface, and even a few graphical user-space applications.
I got this insight from a previous thread: you can run Linux with a GUI on the same specs as Win95 just fine if your display resolution is 640x480. The framebuffer size is the issue.
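To put rough numbers on that, here's a back-of-the-envelope sketch; the resolutions and colour depths are illustrative assumptions, not measurements:

    #include <stdio.h>

    /* Rough framebuffer sizes: width * height * bytes per pixel. */
    int main(void) {
        unsigned long vga = 640UL * 480 * 1;    /* Win95-era 640x480 at 8 bpp */
        unsigned long fhd = 1920UL * 1080 * 4;  /* modern 1920x1080 at 32 bpp */

        printf("640x480 @ 8bpp   : %4lu KiB\n", vga / 1024);  /* ~300 KiB */
        printf("1920x1080 @ 32bpp: %4lu KiB\n", fhd / 1024);  /* ~8100 KiB */
        /* A single modern framebuffer is roughly twice Win95's
           entire 4 MB minimum memory requirement. */
        return 0;
    }

So before the kernel or any application allocates a byte, a modern desktop has already spent double Win95's whole memory budget just holding the screen contents.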
I mean, why is that a problem? Win95's engineering reflects the hardware of its time, the same way today's software engineering reflects the hardware of our time. There's no ideal here, no "this is correct"; it's all constantly changing.
This is like car guys today bemoaning the loss of the simpler carburetor age, or the car guys before them bemoaning the Model T age of simplicity. It's silly.
There will never be a scenario where you need all this lightweight stuff outside of extreme edge cases, and there's SO MUCH lightweight stuff that it's not even a worry.
Also, it's funny you should mention Win95, because I suspect that reflects your age; a lot of people here are from the DOS / first Mac / Win 2.0 age, and for that crowd Win95 was the horrible resource pig and complexity nightmare. The tech press and nerd culture back then were incredibly anti-95 for 'dumbing it all down' and 'being slow', but now it's seen as the gold standard of 'proper computing.' So it's all relative.
The way I see hardware and tech is that we are forced to ride a train. It makes stops, but it cannot stop for good; it will always go on to the next stop. Wanting to stay at a certain stop doesn't make sense and is in fact counter-productive. I won't go into this, but Linux on the desktop could have been a bigger contender if the Linux crowd and companies had been willing to break a lot of things and 'start over' to be more competitive with Mac or Windows, which at the time did break a lot of things and did 'start over' to a certain degree.
The various implementations of the Linux desktop have always come off as clunky and tied to Unix-culture conventions that don't really fit the desktop model, which wasn't appealing to a lot of people, and a lot of that was based on nostalgia and a sort of idealizing of old interfaces and concepts. I love KDE, but it's definitely not remotely as appealing or easy to use as the Win11 or macOS GUIs.
In other words, when nostalgia isn't pushed back on, we get worse products. I see so much unquestioned nostalgia in tech spaces; I think it's something that hurts open-source projects and even many commercial ones.
I agree with this take. Win95's 4 MB minimum / 8 MB recommended memory requirement and 20 MHz processor are seen as the acceptable place to draw the line, but there were graphical desktops on the market before that, running on systems with 128 K of RAM and 8 MHz processors. Why aren't we considering Win95's requirements ridiculously bloated?
Yep, at the time the Amiga crowd was laughing at the bloat. But now it's suddenly the gold standard of efficiency? I think a lot of people like to be argumentative because they refuse to see that they're engaging in mere nostalgia and not anything factual or logical.
If you can compile the kernel yourself, though, there is no reason that Win95 should be any smaller than your specifically compiled kernel; in fact it should be much bigger.
> There will never be a scenario where you need all this lightweight stuff
I think there are many.
Some examples:
* The fastest code is the code you don't run.
Smaller = faster, and we all want faster. Moore's law is over, Dennard scaling has ended, and smaller feature sizes are getting absurdly difficult, and therefore expensive, to fab. So if we want our computers to keep getting faster, as we've got used to over the last 40-50 years, the only way to keep delivering that will be to start ruthlessly optimising: shrinking, finding more efficient ways to implement what we've got used to.
Smaller systems are better for performance.
* The smaller the code, the less there is to go wrong.
Smaller doesn't just mean faster; it should mean simpler and cleaner too. Less to go wrong, easier to debug. Wrappers and VMs and bytecodes and runtimes are bad: they make life easier, but they are less efficient and make issues harder to troubleshoot. Part of the Unix philosophy is to embody the KISS principle.
So that's performance and troubleshooting. We aren't done.
* The less you run, the smaller the attack surface.
Smaller code and less code mean fewer APIs, fewer interfaces, and fewer points of failure. Look at djb's decades-long policy of offering rewards to people who find holes in qmail or djbdns. Look at OpenBSD. We all need better, more secure code. Smaller, simpler systems built from fewer layers mean more security, less attack surface, and less to audit.
Higher performance, easier troubleshooting, and better security. That's three reasons.
Practical examples...
The Atom editor spawned an entire class of app: Electron apps, Javascript on Node, bundled with Chromium. Slack, Discord, VSCode: these are apps used by tens to hundreds of millions of people now. Look at how vast they are. Balena Etcher is a, what, nearly 100 MB download to write an image to a USB stick? Native apps like Rufus do it in a few megabytes. Smaller ones like USBimager do it in hundreds of kilobytes. A dd command does it in under 100 bytes.
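And the core of all of those tools is just a copy loop. A minimal sketch in C of what an image writer fundamentally does (the paths are hypothetical placeholders; real tools add verification, progress reporting, and proper error handling):

    #include <stdio.h>
    #include <stdlib.h>

    /* The essence of dd/Rufus/USBimager: read the image, write it to the
       device. Everything else the big tools ship is UI and convenience. */
    int main(void) {
        FILE *in  = fopen("image.iso", "rb");  /* hypothetical source image */
        FILE *out = fopen("/dev/sdX", "wb");   /* hypothetical target device */
        if (!in || !out) { perror("open"); return EXIT_FAILURE; }

        static char buf[1 << 20];  /* 1 MiB copy buffer */
        size_t n;
        while ((n = fread(buf, 1, sizeof buf, in)) > 0)
            fwrite(buf, 1, n, out);

        fclose(out);
        fclose(in);
        return 0;
    }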
Now some of the people behind Atom wrote Zed.
It's 10% of the size and 10x the speed, in part because it's a native Rust app.
The COSMIC desktop looks like GNOME, works like GNOME Shell, but it's smaller and faster and more customisable because it's native Rust code.
GNOME Shell is Javascript running on an embedded copy of Mozilla's Javascript runtime.
Just as the dotcoms wanted to disintermediate business, removing middlemen and distributors for faster sales, we could use disintermediation in our software: fewer runtimes, and better, smarter compiled languages, so we can trap more errors and have faster, safer compiled native code.
Smaller, simpler, cleaner, fewer layers, fewer abstractions: these are all good things, and all desirable.
Dennis Ritchie and Ken Thompson knew this. That's why Research Unix evolved into Plan 9, which puts far more through the filesystem in order to remove whole types of API. Everything runs in a container all the time; the filesystem abstracts the network, the GUI, and more. It has under 10% of the syscalls of Linux, the kernel is about 5 MB of source, and yet much of Kubernetes is in there.
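To make "the filesystem abstracts the network" concrete, here's a rough sketch of how Plan 9 dials a TCP connection with nothing but open/read/write on files under /net. The address is a placeholder and I'm glossing over reply parsing and error handling, so treat it as an illustration of the idea rather than exact Plan 9 code:

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    /* No socket(), bind(), or connect() syscalls: networking is files. */
    int main(void) {
        char num[16], path[64];

        /* Opening the clone file allocates a new connection; reading the
           same fd returns that connection's directory number. */
        int ctl = open("/net/tcp/clone", O_RDWR);
        int n = read(ctl, num, sizeof num - 1);
        num[n] = '\0';   /* simplified: ignores padding in the reply */

        /* Dial by writing a plain-text command to the control file. */
        write(ctl, "connect 192.0.2.1!80", 20);

        /* The conversation itself is just another file. */
        snprintf(path, sizeof path, "/net/tcp/%s/data", num);
        int data = open(path, O_RDWR);
        write(data, "GET / HTTP/1.0\r\n\r\n", 18);
        /* ...then read(data, ...) for the response. */
        return 0;
    }

One file protocol replaces a whole family of networking syscalls, and anything that can read and write files, including a shell script, can speak it.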
Then they went further: replaced C too, made a simpler, safer language, embedded its runtime right into the kernel, made binaries CPU-independent, and turned the entire network-aware OS into a runtime to compete with the JVM, so it could run as a browser plugin as well as a bare-metal OS. Now we have ubiquitous virtualisation, so lean into it: separate the domains. If your user-facing OS only runs in a VM, then it doesn't need a filesystem or hardware drivers, because it won't see hardware, only virtualised facilities, so rip all that stuff out. Your container host doesn't need to have a console or manage disks.
This is what we should be doing. This is what we need to do. Hack away at the code complexity. Don't add functionality, remove it. Simplify it. Enforce standards by putting them in the kernel and removing dozens of overlapping implementations. Make codebases that are smaller and readable by humans.
Leave the vast bloated stuff to commercial companies and proprietary software where nobody gets to read it except LLM bots anyway.
I wonder if it would have been possible to go directly to Zed, without going through Atom first (likewise, Plan 9 would never have been the first iteration of a Unix-like OS). "Rewrite it in Rust" makes a lot of sense if you have a working system that you want to rewrite, but maybe there's a reason that "rewrite it in Rust" is a meme and "write it in Rust" isn't. If you just want to move fast, put things up on the screen for people to interact with, and figure out how you want your system to work, dynamic languages with bytecode VMs and GC will get you there faster and will enable more people to contribute. Once the idea has matured, you can replace the inefficient implementation with one that is 10% of the size and 10x the speed. Adding lots of features and then pruning out the ones that turn out to be useless may also be easier than guessing the exact right feature set a priori.
This is true, but it's true of storage media in general.
Even for UV-EPROMs, the retention time can be as low as 25 years if they're kept warm, even with the window sealed correctly.
Magnetic drives are quite a lot better, around 50 years.
CD-RWs vary more widely in their stability: I have ~20-year-old discs that are becoming unreadable because the foil is delaminating from the plastic disc. Meanwhile, I have ~40-year-old DS/DD floppies that are still fully readable, even though their medium is in physical contact with the read/write heads (although here, again, storage conditions and especially the different brands/batches seem to make a difference).