In theory, this would be a rich landscape for an entirely different abstraction layer for fingerprinting… However, I am skeptical that the typical fingerprinting tool chains are receiving data that reaches that far down in the stack…
Also TCP timestamps, if the network stack doesn't apply a randomized offset. Linux added randomization as of 4.10; no idea about other OSes.
Getting a bit farther out there, CPU clock skew can be derived from the (randomly offset) TCP timestamps. That varies with temperature and thus load so it can be used to pick out otherwise indistinguishable traffic streams originating on the same physical host.
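For illustration, here's a minimal sketch of the skew idea (my own toy example, not any particular tool's implementation; the tick rate and sample format are assumptions): collect (local receive time, remote TSval) pairs from one peer and fit a line. The slope's deviation from 1.0 is the skew, and a per-connection random offset only shifts the intercept, not the slope.

    import numpy as np

    def estimate_skew_ppm(rx_times_s, tcp_ts_ticks, hz=1000):
        """Estimate a peer's clock skew (ppm) from TCP timestamp samples.
        rx_times_s:   local receive times in seconds
        tcp_ts_ticks: TSval fields from the same packets (remote ticks)
        hz:           assumed remote tick rate (commonly 100-1000 Hz)
        """
        remote_s = np.asarray(tcp_ts_ticks, dtype=float) / hz
        local_s = np.asarray(rx_times_s, dtype=float)
        # Least-squares slope of remote time vs. local time; 1.0 means no skew.
        slope, _ = np.polyfit(local_s - local_s[0], remote_s - remote_s[0], 1)
        return (slope - 1.0) * 1e6

Two streams whose skew estimates match (and drift together under load) are good candidates for originating on the same physical host.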
Back in the realm of commonly employed techniques, higher levels of the networking stack are fingerprinted in the wild. https://blog.cloudflare.com/ja4-signals/
Moving even farther up, the human interaction streams themselves are commonly fingerprinted. I realize that's a bit of a tangent, but OP had suggested that fingerprints have short half-lives, and this is a very strong counterexample that I failed to mention earlier. https://www.whonix.org/wiki/Keystroke_and_Mouse_Deanonymizat...
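As a rough illustration of why keystroke timing works as a fingerprint, here's a toy sketch (my own, not how Whonix or any real system measures it): per-user inter-key timing is fairly stable, so a timing profile can follow you across sessions even when everything else changes.

    from statistics import mean

    def flight_times(events):
        """events: list of (key, press_time_seconds) in typing order.
        Returns the gaps between consecutive key presses."""
        times = [t for _, t in events]
        return [b - a for a, b in zip(times, times[1:])]

    def profile_distance(session_a, session_b):
        """Crude similarity score: difference of mean inter-key gaps.
        Real systems use per-digraph statistics and proper classifiers."""
        return abs(mean(flight_times(session_a)) - mean(flight_times(session_b)))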
You're both right. Being an influencer seems like the worst possible choice: all of the life-deranging detriments of fame without any of the prestige or wealth.
Consciousness is a specific biological adaptation which is primarily focused on the management of social relationships, status, and the prolonged adolescence of children (and the care that requires).
There's no reason to think that consciousness is an important question in the objective sense; it just matters to people (and rightfully so). People wondering about consciousness in the universe might be akin to dogs wondering what the Big Bang smelled like.
What is an important question in the objective sense? Is life, or is it not, because that just matters to life? "Objective question" and "objectively important" seem oxymoronic.
I don't follow GP's sort of solipsist (?) take, but I would say the question of whether the Big Bang took place in a black hole is pointless compared to life/experience and how they arise.
Except for non-negotiables (e.g. bill paying, government websites, etc.), a website that fully breaks when blocking JS is just a worthless site that is not worth my time.
Anubis (https://anubis.techaro.lol) requires JavaScript and is now required to view some otherwise static websites, because AI scrapers are ruining the internet for small sites.
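For context, Anubis fronts pages with a browser-side proof-of-work challenge, which is why JavaScript is mandatory. The snippet below is a generic sketch of that kind of scheme, not Anubis's actual challenge format (the difficulty and encoding are assumptions):

    import hashlib

    def solve_pow(challenge: str, difficulty: int = 4) -> int:
        """Find a nonce so that sha256(challenge + nonce) starts with
        `difficulty` hex zeros. This is the work the visitor's browser does;
        the server verifies the answer with a single hash."""
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
            if digest.startswith("0" * difficulty):
                return nonce
            nonce += 1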
Hard disagree. On the one hand, "we have to hire a minority" is an extremely blunt interpretation of DEI. But the intent is to foster diversity, and that does mean hiring underrepresented groups. Having a diverse team is a net positive on a lot of fronts. There are studies showing it actually improves team performance, but more generally it improves society. It may be marginally "unfair" for a particular company to be the one picking up the slack for society, but given the systemic discrimination that exists against minorities in so many parts of society, giving them any sort of accommodation in hiring can at least offset the injustice elsewhere. The net result is a better society. Why is that the responsibility of a private company? Because companies are made of people, and people live in society.
The people "balancing the scales" are ideologically motivated and are prone to their own error. So, you need to reckon with whether they're doing more harm than good. And to the extent that they're doing harm, what is the solution?
Is your argument "humans are fallible, so we should give up"? The effectiveness (and sincerity) of DEI programs has certainly been mixed, but they should produce measurable outcomes that can be iterated on and improved. A lot of the commentary in this thread, and the rhetoric from the White House, is that DEI is a bad idea and should not be attempted because we shouldn't seek equity.
I'm not being sarcastic or trying to be funny when I ask this. Why isn't this called the Linux subsystem for Windows? It seems like a Linux subsystem running on Windows. If it were the other way around (i.e., a Windows subsystem for Linux), I'd think that Linux would be the primary OS, and something like WINE would be the subsystem.
I think it's supposed to be read as "the Windows subsystem for [running] Linux [applications]". Back in the old days there used to be a thing called Services for UNIX (SFU), which was a bunch of Windows tools and services that provided UNIX-like behavior. Then came the Subsystem for UNIX-based Applications (SUA). And now it's WSL.
Microsoft seems to just be expanding rapidly, and not worrying too much about feature parity, compatibility, or reliability. Why did our logic app fail last night? No reason, just Azure hiccups. Why doesn't the Sentinel data connector work? Whoever maintains it doesn't care. etc.
Until we start optimizing code, websites, etc., this is a meaningless argument. Computers could use much less carbon than they do, but everyone needs them to be really fast, really powerful, use ray tracing, etc. But what most people do on computers could easily be done on a computer from 20+ years ago: email, chatting, watching videos, word processing. In an alternate reality, the computing advances over time would have all been about efficiency, and code, operating systems, and websites would be written lean to work on much slower devices. We'll never live in that world, so Seagate's argument could potentially be true in a technical sense, but ultimately it doesn't matter.
I still have an old 3930K, which is 6 cores at about 3.6 GHz with 16 GB of RAM; it was used as a game server for years, but it's not on now. It consumes about 110 W at idle. There is no GPU of note (a 710 GT or something like that); it's mostly the CPU and the lack of power control. A newer desktop with 32 GB of DDR5 and a 9800X3D, however, will idle at 40 W with a lot more drives, a modern GPU, etc. New machines use considerably less power at idle, and when you go back as far as a Pentium 4, those things used 100 W all the time just for the CPU, whether in use or not.
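Back-of-the-envelope, assuming both boxes sat at idle around the clock (my assumption, just to size the gap):

    HOURS_PER_YEAR = 24 * 365
    for name, watts in [("old 3930K box", 110), ("9800X3D box", 40)]:
        kwh = watts * HOURS_PER_YEAR / 1000
        print(f"{name}: ~{kwh:.0f} kWh/year at idle")
    # old 3930K box: ~964 kWh/year at idle
    # 9800X3D box: ~350 kWh/year at idle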
Anything since about 9th-gen Core behaves fairly well, as do all the modern-era Ryzen processors. There are definitely some CPUs in the middle that had a bunch of issues with power management and with ramping clockspeed up, which was felt on the desktop as latency. For me, a major advance over the past 10 generations of CPUs is that power management behaviour has improved significantly.
Eh, your CPU can't stay in, e.g., the high-power state (past its steady-state thermal envelope) for very long, but you'd still like to know how much power that consumes.
The Kill A Watt is unlikely to be fast enough, especially with the capacitors in your computer's power supply smoothing out the draw.
Are you interested in how much energy a certain instruction uses or are you interested in how much power your computer uses while running a certain program?
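If it's the latter, one practical option on Linux is the RAPL energy counters exposed through powercap, which sidestep the wall-meter sampling problem (though they only cover the CPU package, not the whole machine). A minimal sketch, assuming the RAPL powercap driver is loaded and the usual sysfs path exists (typically needs root):

    import subprocess, sys, time

    RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0; path is an assumption

    def read_uj():
        with open(RAPL) as f:
            return int(f.read())

    def measure(cmd):
        """Run a command and report CPU-package energy over its lifetime.
        Counts everything on the package, and the counter wraps, so this is
        a quick comparison tool, not a benchmark."""
        before, t0 = read_uj(), time.time()
        subprocess.run(cmd)
        after, t1 = read_uj(), time.time()
        joules = (after - before) / 1e6
        print(f"{joules:.1f} J over {t1 - t0:.1f} s (~{joules / (t1 - t0):.1f} W avg)")

    if __name__ == "__main__":
        measure(sys.argv[1:])  # e.g. python rapl_measure.py ./my_program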
That's definitely true, but I guess what I mean is that we keep eating up new efficiency gains with more capacity. It's not uncommon for modern PCs to have much larger PSUs than in the past -- it's just that these PCs are doing far _more_ with their power. We could have moved in both directions, though -- held capability steady while continuing to improve and refine efficiency.
But, to your point, ARM and Apple's M line are really exciting in this regard.
No, Intel didn't get it. AMD certainly did. An i7-14700K can draw 253 watts to do 5.4 GHz, while a 9800X3D can boost to 5.2 GHz at 160 W. That's pretty close to the top end of desktop CPUs (for home use). As you go down the chain, you'll see huge drops in power usage.
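Back-of-the-envelope: 160 W / 253 W ≈ 63% of the power for 5.2 / 5.4 ≈ 96% of the peak clock. Clocks across different architectures aren't directly comparable, but it illustrates how expensive the last few hundred MHz are.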
Intel in particular are guilty of replacing their mid- and low-range CPUs with neutered low-power cores to try to claw back the laptop market.
And I bet that even with AMD you get 85% of the performance with 50% of the power consumption on any current silicon...
And 85% of the current performance is a lot.
I have an AMD box that I temperature-limited from the BIOS (because I was too lazy to look for where the power limits were). It never uses more than 100 W and it's fast enough.
None of these high-end CPUs are bursting to 5+ GHz for a regular website's JavaScript. They're running at their normal 40-60 W draws. The massive burst is when you throw video encoding or compilation tasks at them. You can tell, because you need to open a window when it happens.
HR _loves_ using LLMs. If there's a central authority telling them the appropriate way to speak, and the appropriate way to feel, they can't resist it.
Moderna is, as the article points out, under a lot of financial pressure. Obviously reducing headcount is a way to reduce spending. I wouldn't be surprised if top management looked at HR and determined the department was far too large for what it did.