Apparently, 25% of Americans over age 50 are millionaires by net worth, so if you don't expect those figures to go down, your odds of living to be a millionaire are pretty good!
No one disagreed that it happens. You claimed it happens "all the time". Unless I'm missing it, your links don't provide numbers on how many Americans are kidnapped & murdered per year. Further, it'd be useful to compare that to the overall number of American visitors to Mexico.
I'm going to go out on a limb and claim it's a small fraction of a percent who find themselves kidnapped & murdered, which is hardly "all the time". But prove me wrong.
A lot of posts in this thread are conflating two separate but related topics. Statically typing a string as EmailAddress does not imply validating that the string in question is a valid email address. Both operations have their merits and downsides, but they don't need to be tied together.
Having a type wrapper of EmailAddress around a string with no business logic validation still allows me to take a string I believe to be an email address and be sure that I'm only passing it into function parameters that expect an email address. If I misorder my parameters and accidentally pass it to a parameter expecting a type wrapper of UserName, the compiler will flag it.
I would argue it's the other way around. If I take a string I believe to be a phone number and wrap it in a `PhoneNumber` type, and then later I pass it as the wrong argument to a function, say with the order of name & phone number reversed, the compiler will complain. Whereas if both name & phone number are plain strings, it won't complain.
That's what I see as the primary value to this sort of typing. Enforcing the invariants is a separate matter.
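To make that concrete, here's a minimal Rust sketch of the idea; the `send_receipt` function is hypothetical, and the wrappers carry no validation logic at all, yet swapped arguments still fail to compile:

```rust
// Plain newtype wrappers: no validation, just distinct types.
struct UserName(String);
struct PhoneNumber(String);

// Hypothetical function that takes both, in a fixed order.
fn send_receipt(name: UserName, phone: PhoneNumber) {
    println!("receipt for {} sent to {}", name.0, phone.0);
}

fn main() {
    let name = UserName("Ada Lovelace".to_string());
    let phone = PhoneNumber("555-0100".to_string());

    send_receipt(name, phone); // fine

    // send_receipt(phone, name);
    // ^ compile error: expected `UserName`, found `PhoneNumber`
}
```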
> I find that staunch static typing proponents are often middle or junior engineers
I wouldn't go that far, as it depends on where the individual is in their career. The software world bounces between hype cycles for rigorous static typing and full-on dynamic typing. Both options are painful.
I think what's more often the case is that engineers start off experiencing one of these poles, get burned by it, and then run to the other pole and become zealous. But at some point most engineers come to realize that both options have their flaws, find their way to some middle ground between the two, and start tuning out the hype cycles.
I've seen a mix of stringly typed apps and strongly typed apps. The strongly typed apps had an upfront cost but were much better to work with in the long run. Define types for things like names, email addresses, ages, and the like. Convert the strings to the appropriate types on ingest, and then inside your system use only the correct types.
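A minimal sketch of that ingest step in Rust, with a deliberately naive, made-up check standing in for real email validation:

```rust
// A typed value that only exists once ingest has accepted the raw string.
struct EmailAddress(String);

impl EmailAddress {
    // Deliberately naive check, just to illustrate parse-on-ingest;
    // real email validation is far more involved.
    fn parse(raw: &str) -> Result<EmailAddress, String> {
        if raw.contains('@') {
            Ok(EmailAddress(raw.trim().to_string()))
        } else {
            Err(format!("not an email address: {raw}"))
        }
    }
}

// Past the boundary, code only accepts the typed value, never a raw string.
fn subscribe(email: &EmailAddress) {
    println!("subscribed {}", email.0);
}

fn main() {
    match EmailAddress::parse("user@example.com") {
        Ok(email) => subscribe(&email),
        Err(err) => eprintln!("rejected on ingest: {err}"),
    }
}
```

The point is that the check runs exactly once, at the boundary; everything past it can trust the type and never has to re-validate.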
This is a metric I never really understood. How often are people booting? The only time I ever reboot a machine is when I have to. For instance, the laptop I'm on right now has an uptime of just under 100 days.
My Mac? Couldn't tell you, I just close the lid. My work laptop? Probably every day, as it makes up its own mind about what to do when you close the lid. Even the "shut down" button in the Start menu often restarts the machine in Win 11.
My work desktop? Every day, and it takes > 30 seconds to go from off to desktop, and probably another minute or two for things like Docker to decide that they’ve actually started up.
Back in the bad old days of Intel Macs, I had a full system crash just as I was about to get up to give a presentation in class.
It rebooted and got to desktop, restoring all my open windows and app state, before I got to the podium (it was a very small room).
The Mac OS itself seems to be relatively fast to boot, the desktop environment does a good job recovering from failures, and now the underlying hardware is screaming fast.
I should never have to reboot, but in the rare instances when it happens, being fast can be a difference maker.
What I've found is that if I open a picture in iMessage, it tends to trigger the CPU-hungry behavior. I notice it after a while as my laptop starts getting hot and the battery drains much faster than expected. I hard quit iMessage, reopen it, and all is fine.
Sentiments like this make me wonder if perhaps the dream of the 90s was just ahead of its time. Things like UML, 4GLs, and Rational were all being hyped. We were told that the future was a world where people could express the requirements & shape of the system, and the machines would do the rest.
Clearly that didn't happen; agile took over from the more waterfall/spec-based approaches, and the rest is history.
But now we're entering a world where the state of the art is expressing your requirements & the shape of the system. Perhaps this is just part of a broader pendulum swing, or perhaps technology finally caught up with the 1990s hopes & dreams.
Yes and no, I'd say.
It's still the case that only by iterating and testing things with the AI do you get closer to an actually good solution.
So a big up-front spec will also not work so well.
The only exception is maybe when you already have a very clear understanding and existing tests (like what they did with Claude building a C compiler in Rust to compile the Linux kernel).
I think PG said something about sitting down and hacking being how you understand the problem, and he's right. You can write UML after you've got your head round it, but the feedback loop when hacking is essential.
I've seen the movie countless times. It was only last year that I learned it was "butter zone" and not "border zone". And I never understood why Nikon called it "border zone", as it made no sense in context. But then I also had never heard the term "butter zone". So there you go.