The plastic used in the black/yellow brand is brittle when it gets cold — it breaks upon simple impact/sliding. Also, you cannot see inside them without opening the lid.
The clear plastic is usually a bit thicker and more rubberized; it'll still break, but not as easily.
As to why they're against them, I don't know their reason, but there used to be only one size of tote. Then there were big and small. And then, for some fxcking reason, they decided to make ones that were roughly as big as the big ones. Just enough that you have to take half a second to re-eyeball-ruler measure them. But in isolation, if you've got one in front of you, you can't tell whether it'll tetris properly with another one until they're side by side and it turns out they don't.
> When you get bullied in American public schools for being a "nerd" and liking science and math, your country doesn't exactly produce a lot of state-of-the-art STEM professionals. You get a small handful of exceptional people who overcame the adversity but that's it.
Is bullying nerds still happening? It was commonplace when I was young in the 1980s. (In fact, it was so common that it was the basis of the 1984 movie Revenge of the Nerds.) But I had thought the social status of nerds and geeks had leveled up a few times since then. Did the level-ups not happen?
Yes and no. Generally, you don't necessarily get bullied, but you lose opportunities to interact with people. Most students in the US don't care about academics more than they need to, and the kind of "nerd" who cares about math and science likely doesn't have much to talk about with these people, or can't even have a meaningful conversation without being told something along the lines of "it's not that deep" or "I'm not reading allat."
My store used to have a big bread oven, desserts made in-house, fresh prepared food made in woks right next to the buffet table, and so on. All gone now; the coffee shop got replaced by robots, they tried to close the seafood counter (with enough negative feedback that they reversed it), etc.
It's all made centrally now, for 3x the price and half the taste. All the kids went and got MBAs and the third generation family business curse hit hard as a result.
I've heard locals say "Bob Wegman loved people, Danny Wegman loves food, and Colleen Wegman loves money".
In Ithaca the coffee has gone downhill lately, that's for sure. On the other hand, my favorite drip coffee anywhere is made by machines that brew it by the cup.
Honestly, it's not even about the coffee. The lady working there would see me, greet me by name, ask after my kids, and start making my drink without me having to tell her my order. That was part of the Wegmans magic for a long, long time.
(Same reason closing the seafood counter got a big backlash. There's a similarly awesome guy working there. For now.)
One of my favourite cafes ... thirty years ago now ... the barista would set up my drink when she saw me walk through the door, by the time I'd reached the counter she was handing it to me with a big smile.
Tipped her generously on her last day there, got a big hug for it.
In general, it seems like the Pareto products don't exist anymore; the midrange has basically dropped out for daily products, and the market has bifurcated. If quality is a scale from 1 to 100, most places sell a 1 or a 10, or you go to an artisanal place for a 90, at exorbitant prices.
But in the past a supermarket or toy store would have sold you an 80 for a reasonable price.
What sucks even more is that, due to the cacao shortage for example, lots of products now contain less cacao for the same price. And package sizes usually drop from 500g/250g to something like 485g/235g. Shrinkflation.
But when cacao becomes cheaper or inflation stabilizes, companies don't think "let's push the quality back up for the same price"; no, they'll pocket the difference. The same will happen if Trump's tariffs get struck down: businesses will get a huge refund, but the customers the costs were passed along to won't see a penny.
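To put numbers on that 500g example: holding the sticker price constant while shrinking the package is really a hidden unit-price increase. A quick sketch, with a made-up price:

    # Hypothetical price; the point is the unit price, not the sticker price.
    old_g, new_g, price = 500, 485, 4.00
    print(price / old_g * 100)        # 0.80 per 100 g before
    print(price / new_g * 100)        # ~0.82 per 100 g after
    print((old_g / new_g - 1) * 100)  # ~3.1 -- a hidden ~3.1% price increase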
I know it's widespread; I just would've thought Wegmans would be one of the last to do it. The premium vibes have long been their thing, and they were part of the secret sauce behind their vastly larger per-square-foot sales.
One thing I'm really envious of as a European is Costco. Costco is the absolute king of finding Pareto stuff (20% of the investment nets 80% of the quality) and offering that. I know their whiskies are good, their tires are good, their medicines are good, their chicken is good. And all for a relatively reasonable price. It really seems like a last bastion, haha.
I love that keyboard, but I wish they'd enlarge the TrackPoint cut-out on the G/H/B keys a bit more. As it is, it's not the easiest to move the cursor to the top-left, top-right, and bottom areas of the screen.
> There's nothing wrong with this choice [to work extra hours to get promoted].
But if there are limited slots for promotion, and that's almost always the case, the resulting competition among deserving engineers makes the extra hours more or less mandatory. Say that Amy is a better engineer than Jim and gets a third more done per hour. If Jim puts in 60 hours instead of the expected 40, then Amy isn't going to beat him for a slot unless she also starts working extra hours.
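To make the arithmetic concrete (a quick sketch, with made-up per-hour output numbers):

    # Made-up per-hour output for the Amy/Jim example above.
    amy_rate, jim_rate = 1.33, 1.00   # Amy gets a third more done per hour
    print(amy_rate * 40)              # Amy, normal week: ~53 units
    print(jim_rate * 60)              # Jim, 60-hour week: 60 units -- he wins the slot
    print(60 / amy_rate)              # Amy needs ~45 hours just to pull even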
In the end, promotion becomes more about grinding than being effective. That's not great for company culture or retention of top talent.
That doesn't make promotion more about grinding, because the company doesn't care how much work you get done per unit of time compared to other employees. The company cares about how much you get done, period.
If the only differentiating factor between Amy and Jim is quantity of work done (this is never the case in real life), most companies will prefer a Jim that works 60 hours to an Amy that works 40 if Jim is producing 5% more.
In software development, sure (maybe). Most jobs aren't software development.
In the vast majority of jobs, your production slows as hours increase, but there isn't a tipping point where you become less productive overall, even after accounting for errors and rework. There's a reason CPAs don't clock out at 37.5 hours during tax season; warehouse workers, service desk staff, and people in any number of jobs other than the specific one most of us here do often work more than 40 hours a week, especially when actively working toward a promotion.
One reason is that using static binaries greatly simplifies the problem of establishing Binary Provenance, upon which security claims and many other important things rely. In environments like Google’s it's important to know that what you have deployed to production is exactly what you think it is.
> One reason is that using static binaries greatly simplifies the problem of establishing Binary Provenance, upon which security claims and many other important things rely.
It depends.
If it is a vulnerability stemming from libc, then every single binary has to be re-linked and redeployed, which can lead to a situation where something has been accidentally left out due to an unaccounted-for artefact.
One solution could be bundling the binary (or a set of related binaries) with the operating system image, but that would incur overhead on multiple dimensions that would be unacceptable for most people, and then we would be talking about «an application binary statically linked into the operating system», so to speak.
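To illustrate the operational difference concretely, here's a minimal sketch (assuming a Linux host with ldd available, and a hypothetical /opt/deployed/bin directory) of auditing deployed binaries. The dynamically linked ones are fixed by patching libc once; every statically linked one this finds has to be rebuilt and redeployed.

    import os, subprocess

    def is_static(path):
        # glibc's ldd prints "not a dynamic executable" (and exits non-zero)
        # when pointed at a statically linked binary.
        result = subprocess.run(["ldd", path], capture_output=True, text=True)
        return "not a dynamic executable" in (result.stdout + result.stderr)

    deploy_dir = "/opt/deployed/bin"  # hypothetical deployment directory
    for name in os.listdir(deploy_dir):
        path = os.path.join(deploy_dir, name)
        if os.path.isfile(path) and os.access(path, os.X_OK) and is_static(path):
            print("needs re-link + redeploy:", path)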
> If it is a vulnerability stemming from libc, then every single binary has to be re-linked and redeployed, which can lead to a situation where something has been accidentally left out due to an unaccounted-for artefact.
The whole point of Binary Provenance is that there are no unaccounted-for artefacts: every build should produce binary provenance describing exactly how a given binary artefact was built (the inputs, the transformation, and the entity that performed the build). So, to use your example, you'll always know which artefacts were linked against that bad version of libc.
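As a sketch of the shape such a record might take (field names are illustrative, loosely in the spirit of SLSA-style provenance, not any actual Google format):

    # Illustrative per-build provenance record: the artefact, its inputs,
    # the transformation, and the entity that performed the build.
    provenance = {
        "artefact": {"name": "frontend_server", "sha256": "ab12..."},
        "inputs": [{"name": "glibc", "version": "2.31-13"},
                   {"name": "frontend_server.cc", "sha256": "ef56..."}],
        "build": {"builder": "ci-builder-7", "command": "bazel build //frontend:server"},
    }

    # "Which artefacts linked against the bad libc?" becomes a query over records:
    def uses(record, name, version):
        return any(i.get("name") == name and i.get("version") == version
                   for i in record["inputs"])

    print(uses(provenance, "glibc", "2.31-13"))  # True -> re-link and redeploy this one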
> […] which artefacts were linked against that bad version of libc.
There is one libc for the entire system (a physical server, a virtual one, etc.), including the application(s) that have been deployed into an operating environment.
In the case of the entire operating environment (the OS + applications) being statically linked against a libc, the entire operating environment has to be re-linked and redeployed as a single concerted effort.
In dynamically linked operating environments, only the libc needs to be updated.
The former is a substantially more laborious and inherently riskier effort unless the organisation has reached a scale where such deployment artefacts are fully disposable and the deployment process is fully automated. Not many organisations practically operate at that level of maturity and scale, FAANG and similar being notable exceptions. It is often cited as an aspiration, yet the road to that level of maturity is winding and fraught with shortcuts in real life, which result in the binary provenance being ignored or rendered irrelevant. The expected aftermath is, of course, a security incident.
I claimed that Binary Provenance was important to organizations such as Google, where it is important to know exactly what has gone into the artefacts that have been deployed into production. You then replied "it depends" but, when pressed, defended your claim by saying, in effect, that binary provenance doesn't work in organizations with immature engineering practices that don't actually enforce it.
But I feel like we already knew that practices don't work unless organizations actually follow them.
My point is that static linking alone and by itself does not meaningfully improve binary provenance and is mostly expensive security theatre from a provenance standpoint due to a statically linked binary being more opaque from a component attribution perspective – unless an inseparable SBOM (which is cryptographically tied to the binary), plus signed build attestations are present.
Static linking actually destroys the boundaries that a provenance consumer would normally want: global code optimisation, (sometimes heavy) inlining, LTO, dead-code elimination and the like erase the dependency identities, rendering them irrecoverable from the binary in a trustworthy way. It is harder to reason about and audit a single opaque blob than a set of separately versioned shared libraries.
Static linking, however, is very good at avoiding «shared/dynamic library dependency hell» which is a reliability and operability win. From a binary provenance standpoint, it is largely orthogonal.
Static linking can improve one narrow provenance-adjacent property: fewer moving parts at deploy and run time.
The «it depends» part of the comment concerned the FAANG-scale level of infrastructure and operational maturity where the organisation can reliably enforce hermetic builds and dependency pinning across teams, produce and retain attestations and SBOMs bound to release artefacts, rebuild the world quickly on demand, and roll out safely with strong observability and rollback. Many organisations choose dynamic linking plus image sealing because it gives them similar provenance and incident-response properties with less rebuild pressure, at a substantially smaller cost.
So static linking mainly changes operational risk and deployment ergonomics, not the evidentiary quality of where the code came from and how it was produced; dynamic linking, on the other hand, may yield better provenance properties when the shared libraries themselves have strong identity and distribution provenance.
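For the «inseparable SBOM» point above, a minimal sketch of what «cryptographically tied to the binary» means: hash the exact binary, embed the digest next to the component list, and sign the pair. All names here are hypothetical, and HMAC stands in for a real asymmetric signature; production systems would use something like in-toto attestations with a proper signing key.

    import hashlib, hmac, json

    def attest(binary_path, sbom, signing_key):
        # Bind the SBOM to this exact binary: neither the component list nor
        # the binary can change without invalidating the signature.
        with open(binary_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        statement = json.dumps({"binary_sha256": digest, "sbom": sbom}, sort_keys=True)
        sig = hmac.new(signing_key, statement.encode(), hashlib.sha256).hexdigest()
        return {"statement": statement, "signature": sig}

    att = attest("/opt/deployed/bin/frontend_server",            # hypothetical path
                 sbom=[{"name": "glibc", "version": "2.31-13"}],
                 signing_key=b"demo-key-not-for-production")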
NB: Please do note that the diatribe is not directed at you in any way; it is an off-hand remark aimed at people who ascribe benefits to static linking because «Google does it», without taking into account the overall context, maturity and scale of the operating environments Google et al. operate in.
> like repeating they decreased drugs price by 600%
The NYT and other media outlets like to point out that this claim is mathematically impossible. However, “cut prices by 600%” is understood perfectly well by most people (but not pedants) to mean “we undid price hikes of 600%.”
I suspect that this phrasing was chosen as a “wedge” to drive home to the MAGA faithful that the news media is biased against them.
Does that logic apply only when the claimed cut is over 100%?
If I advertise that my store "cut prices by 50%" but the prices are actually only 33% lower (which is the same as undoing a 50% price hike), would it be pedantic to call me out on my bullshit?
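Working the numbers shows the asymmetry between a hike and the cut that undoes it:

    base = 100.0
    hiked = base * 1.5                     # a 50% hike: 100 -> 150
    print((hiked - base) / hiked * 100)    # undoing it is only a ~33.3% cut
    hiked6 = base * 7.0                    # a 600% hike: 100 -> 700
    print((hiked6 - base) / hiked6 * 100)  # undoing it is a ~85.7% cut, not "600%"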
> Does that logic apply only when the claimed cut is over 100%?
Yes, I’d say.
It’s the same as the informal usage of “X times smaller” to describe scaling by 1/X. The idiom generally isn’t used unless X > 1. (The exception might be when several values of X are reported together. Then one might say “0.74 times smaller” to maintain parallel form with nearby “4 times smaller” and similar claims.)
No, it does not conform. As I wrote earlier, I have not seen that usage for less than 100%. So 600% conforms; 50% does not.
That is, expressions like "twice as slow/thin/short/..." or "2x as slow/thin/short/..." or "200% as slow/thin/short/..." have a well-established usage that is understood to mean "half as fast/thick/tall/..."
But "50% as slow/thin/short/..." or "half as slow/thin/short/..." have no such established usage.
For some evidence to support my claim, please see this 2008 discussion on Language Log:
Since HN has a tendency to trim URLs and might prevent this link from taking you to the relevant portion of a rather lengthy article, I'll quote the salient bits:
"A further complexity: in addition to the N times more/larger than usage, there is also a N times less/lower than [to mean] '1/Nth as much as' usage"
"[About this usage, the Merriam-Webster Dictionary of English Usage reports that] times has now been used in such constructions for about 300 years, and there is no evidence to suggest that it has ever been misunderstood."
I believe that the history of English language usage is replete with examples such as "X times less than" when X > 1, but similar constructions for X <= 1 do not appear with appreciable frequency.
In any case, I think that continuing our conversation is unlikely to be productive, so this will be my last reply.
I will just say in closing that our conversation is a good example of why the MAGA folks have probably chosen phrasing such as this.
Would you mind sharing more about these tubs and why you are against them?