Unfortunate. I'll wait some days for the response, but it better be a good one.
This behavior from Core may be par for the course, but I can already buy watches from companies whose values exist only for marketing. It's a small niche, and being nice would not cost much.
And they already died once, without having a proper off-ramp for their users - for now I don't trust them to exist in another two years. (I'm not really sure they even are in this for the long term - talk is cheap.)
So, the response is here. Without a closer look I can't say what's really going on - although I lean toward believing that Core is going in the right direction - but there still seem to be some orange flags.
And also, some tools still break when using the non-default umask.
Yes, yes, we all run Postgres in containers, but if you don't, and you upgrade to a new Postgres major version, gladly using the Debian scripts that make it all more comfortable, while running with umask 027, you will enjoy your day. Though I don't remember if those upgrade scripts were from Debian proper or from Postgres.
Since that experience I always wondered what other tools may have such bugs lurking around.
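The failure mode here is easy to reproduce. A minimal sketch (filenames are made up for illustration): any tool that creates files and silently assumes the common 022 default will produce unreadable files for other users or groups when it inherits a stricter umask from the calling shell.

```shell
#!/bin/sh
# Demonstrate how the inherited umask changes the mode of new files.
# A script assuming the usual 022 default gets 644 files; under 027
# the same code path yields 640, which breaks anything that expected
# group-writable or world-readable output.
umask 022
touch default_mask.txt    # created as 644 (rw-r--r--)
umask 027
touch strict_mask.txt     # created as 640 (rw-r-----)
ls -l default_mask.txt strict_mask.txt
```

Postgres itself is picky in the other direction (it wants its data directory private), which is part of why wrapper scripts around `pg_upgrade` are a place where umask assumptions can collide.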
I once had a colleague from Iran, working (legally) in the middle of the EU. He was already blocked from using credit cards, but thanks to the US not having 100% dominance he was still allowed to use local banks. For such local banking he would likely need Play services.
It's not countries that are affected, but people. And people sometimes move.
All these announcements are scenery and promotion. There's a very low chance any of these "corrections" were not planned. For some reason, sama et al. make me feel like a mouse played with by a cat.
Why on earth would they undercut the launch of their new model by "planning" to do a stunt where people demand the old models instead of the new models?
I'm not sure if you are serious or joking. "Six degrees of separation" is the famous claim about how many steps are needed to connect any two people.
Graph databases have a very narrow use case, and it's almost always in relation to people - at least in my experience.
Though the data type isn't really important for the performance question, the amount of data selected is. So a 6-level-deep graph of connections where each entity only connects to 2-3 others would never run into performance issues. You'd be able to go way beyond that too with such a narrow window. (3 connections per entity over a 6-level join comes out to 3^6 ≈ 729 rows.)
If you're modeling something like movies instead, with >10 actors per production you're looking at millions of rows.
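The fan-out arithmetic above can be sketched in a few lines; the `fanout` helper is just an illustrative name for the back-of-the-envelope bound, ignoring deduplication of nodes reached more than once.

```python
def fanout(branching: int, depth: int) -> int:
    """Upper bound on rows touched at the final level of a
    depth-limited graph expansion with a fixed branching factor."""
    return branching ** depth

# Social-style graph: 2-3 connections per entity, 6 levels deep.
print(fanout(3, 6))   # 729 - trivial for any database

# Movie-style graph: >10 actors per production, same depth.
print(fanout(10, 6))  # 1000000 - now the join fan-out hurts
```

The data model didn't change between the two calls, only the branching factor, which is the point: depth times branching, not the entity type, is what decides whether a recursive join is cheap.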
I have signed a two-year non-compete as a part of my contract in the Netherlands. The official government website says basically "yeah so non-competes are binding, but if you go to court, you have a decent chance of voiding it, no promises though, good luck".
Regarding the EU document, there's a very long way between "EU noticed a problem" and "EU fixed a problem".