Two years back the S-Bahn ticket machines at the airport only supported chip+PIN, not contactless. I had to open my banking app to look up my PIN, as I wanted to use my corporate Amex.
It's very confusing to see a new Zigbee standard when I thought Zigbee was being replaced by Thread.
And from the same organization that co-designed and promotes Matter.
Personally, I'm very happy that Zigbee continues to be developed. I am not very enthusiastic about everything being IP addressable (even if it is just ULA), or about the convoluted Thread flow where a bunch of border routers require you to initiate pairing with a phone over BLE and then hand it over to the router. I have an Eve Energy Thread + Matter plug and it took many attempts to pair it correctly. With most Zigbee devices it's just a matter of permitting join on your coordinator and holding/pressing the pairing button on the device.
I wish that Apple and Google would just add a Zigbee coordinator (or just leave home automation to others) and put some effort into supporting a wide variety of devices rather than trying to disrupt a perfectly working standard. They often can't even be bothered to implement the latest Matter spec in a timely manner.
As someone who's run (and continues to run) both Z-Wave and Zigbee networks for over 10 years, I find the direction of Zigbee rather frustrating. They used to be the antithesis of Z-Wave with its frustrating "licensing", and now it seems they're toeing that same line. Most likely because they got in bed with Google, Apple and the garbage that is Matter.
It didn't solve any of the issues it proclaimed it would. And if you look across open platforms (e.g. Home Assistant), it has comparatively low uptake among available integrations, because Matter is a vehicle for proclaimed interop by walled-garden experts (e.g. Apple, Google, etc).
I've got several Matter smart plugs and a couple Matter smart bulbs.
They all were quick and easy to set up with their first Matter controller (an RPi4 running Home Assistant or an iPad with Apple Home), and quick and easy to add to whichever controller I didn't use as the first controller.
They all worked then without requiring me to get their manufacturer's proprietary app or make an account or anything like that.
Some needed a firmware update to support Matter 1.3, and so I had to use the manufacturer's app for that. Some also have proprietary functions and options (for example, one of the bulbs supports some kind of presence detection if you have at least two of those bulbs in the same room), so I might get the manufacturer's app if I decide I want to use those functions.
Adding them to the manufacturer's app does not interfere with their use as Matter devices so if I do decide I want to use some of the proprietary stuff it doesn't break things.
1) If you have to use a manufacturer's app for updates, that already falls under my point. 2) There are plenty of threads out there discussing manufacturers that leverage Matter but force the use of their own controller. A lot of these are bundled in by builders as another revenue stream for them.
Finally... This [0] does a better job of explaining the issues with Matter. But Matter is ultimately a joke. At this point it was promoted as a standard for interoperability by vendors nobody should trust.
How would you approach migrating a ~5 TB OLTP database that fundamentally contains analytical time series data? I'd think e.g. Apache Iceberg could be a better data store, and would make writing much easier (almost just dump a Parquet file in there).
That 5 TB of data will probably be 300-400 GB in Parquet. Try to denormalise the data into a few datasets, or just one dataset if you can.
DuckDB querying the data should be able to return results in milliseconds if the smaller columns are being used, and better still if the row-group stats can be used to answer queries.
You can host those Parquet files on a local disk or S3. A local disk might be cheaper if this is exposed to the outside world as well as giving you a price ceiling on hosting.
If you have a Parquet file with billions of records and row-groups measuring into the thousands then hosting on something like Cloudflare where there is a per-request charge could get a bit expensive if this is a popular dataset. At a minimum, DuckDB will look at the stats for each row-group for any column involved with a query. It might be cheaper just to pay for 400 GB of storage with your hosting provider.
E.g., this battery holds 1.5 V for ~70% of its capacity before gradually dropping to 1.0 V:
https://www.xtar.cc/product/xtar-1-5v-aa-clr-3300-lithium-ba...
edit: And one with USB-C and linearly decreasing voltage curve https://www.xtar.cc/product/xtar-aa-lithium-lr-2000mah-usb-c...