Just wait a bit and there will be some TLS denialism spawning here.
For a lot of stuff on my local network I don’t want the hassle. There are loads of use cases for normal people to just have port 80, no certs, on something like 192.x.x.x, because there is no easy way to set up public certificates for that. And I don’t want everything hosted in the cloud - some stuff I still want to host for myself on my local network.
Corporations or companies should not do that - even internal networks should have proper certs and encryption - but it also is not that easy.
Stuff sent over the internet for others to see should always have TLS, because you don’t know where your packets travel.
> For a lot of stuff on my local network I don’t want the hassle. There are loads of use cases for normal people to just have port 80, no certs, on something like 192.x.x.x, because there is no easy way to set up public certificates for that. And I don’t want everything hosted in the cloud - some stuff I still want to host for myself on my local network.
Tbh I don't see what's hard about this. All you need is an A record pointing to your 192.x.x.x, an ACME-capable DNS host and a modern reverse proxy. You can even use a free DDNS service if you want. I wouldn't bother with this for development, but for anything hosted longer than a few days, absolutely yes. Imo not getting browser warnings is alone worth the few minutes it takes nowadays.
> All you need is an A record pointing to your 192.x.x.x, an ACME-capable DNS host and a modern reverse proxy
And to distribute keys that allow those appliances to update the DNS records, to secure those keys, to have a way to install those keys (and update/rotate them), and to make sure your DNS host is supported by your ACME client.
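For what it's worth, the thing those DNS-update keys exist to publish is small: per RFC 8555, a DNS-01 challenge is a TXT record at _acme-challenge.&lt;name&gt; whose value is the unpadded base64url SHA-256 of the key authorization (challenge token + "." + account key thumbprint). A minimal sketch of that computation - the token and thumbprint below are made-up placeholders, not real ACME values:

```python
import base64
import hashlib

def b64url(data: bytes) -> str:
    # ACME uses unpadded base64url encoding throughout
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def dns01_txt_value(token: str, account_thumbprint: str) -> str:
    # Key authorization = challenge token + "." + account key thumbprint (RFC 8555 §8.1)
    key_auth = f"{token}.{account_thumbprint}"
    # TXT record value = unpadded base64url of SHA-256(key authorization) (§8.4)
    return b64url(hashlib.sha256(key_auth.encode()).digest())

# Made-up placeholder values for illustration only
token = "evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA"
thumbprint = "9jg46WB3rR_AHD-EBXdN7cBkH1WOu0tA3M9fm21mqTI"
print("_acme-challenge TXT =", dns01_txt_value(token, thumbprint))
```

The ACME client computes this and pushes it to your DNS host's API - which is exactly why every appliance needs credentials for that API.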
“All you need is an A record pointing to your 192.x.x.x, an ACME-capable DNS host and a modern reverse proxy.” That’s a LOT more than socket(), listen(), and accept().
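To illustrate the contrast: here is roughly how small the socket()/listen()/accept() version is - a throwaway plain-HTTP responder, a sketch rather than a real server (it handles a single request and ignores what the client asked for):

```python
import socket
import threading

def serve_one_plain_http_request(srv: socket.socket) -> None:
    # accept() one connection, send a fixed plain-HTTP response, close
    conn, _addr = srv.accept()
    with conn:
        conn.recv(4096)  # read (and ignore) the request
        body = b"hello from the LAN\n"
        conn.sendall(
            b"HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\n"
            b"Content-Length: " + str(len(body)).encode() + b"\r\n\r\n" + body
        )

# The whole setup the parent comment is talking about: socket(), bind(), listen()
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
srv.listen(1)
threading.Thread(target=serve_one_plain_http_request, args=(srv,), daemon=True).start()

# Quick client to show it works
port = srv.getsockname()[1]
with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b"GET / HTTP/1.1\r\nHost: x\r\n\r\n")
    reply = b""
    while chunk := c.recv(4096):
        reply += chunk
print(reply.decode().splitlines()[0])  # prints "HTTP/1.1 200 OK"
```

No keys to distribute, no DNS API credentials, no renewal cron - which is the whole appeal (and, on anything but a trusted LAN, the whole problem).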
I think it was any laptop with an i9 - it was just a bad CPU for a laptop.
I had a Dell and swapped it for an i7 as soon as possible.
I think they made those only because they knew there would be enough people thinking “bigger CPU == better laptop” - and yeah, it seems like I am not the only one who got caught by that. But I also trusted that someone there did any testing…
That’s a bit of an evergreen topic. “Stop bloating the web with JS” comes up fairly often, and there are always those people who think they’ve found the solution and that everyone should start using whatever they imagine is “best for everyone”.
In my opinion most of those people struggle with something they encountered in the ecosystem and just want to find a way that fits them - while also trying to make everyone else do the same.
*“You didn’t want to make things perfect. You just hated things the way they are.”*
"Hi, I see you're the owner of this 6000-line mess of a component, could you answer some questions for me?"
"I don't own it, I didn't write it, and I don't understand it even slightly. I just made a one-line bug fix for one function in it a year ago and nobody has touched it since, so my name is on top of the git history."
Makes you wonder if the reason some trivial bug in a closed-source project goes unfixed for years is that all the engineers are afraid to touch the code in some obscure library and instantly become its new 'owner'.
Mostly it is that you don’t go around fixing random stuff.
You might actually get in trouble picking up stuff that is not a priority.
The company I work for is less strict, so we do “fix anything Friday”.
But at some other companies you might get a slap on the wrist for not following the plan; product owners pick what gets fixed and what doesn’t based on the business plan. If big customers are nagging, the bug will be fixed asap.
Yeah at work I’m paid to own some components that I didn’t write and don’t entirely understand, so I figure my job is to help discover answers for the questions that arise.
I would not want to be a public maintainer though. I don’t have the patience or motivation to use my spare time for that.
Yeah, there are startups where the head guys don’t know that and developers jump the gun because they feel like they’re the ones with the best understanding of the issue at hand.
About apps done by software houses: even though we should strive to do a good job, and I agree with the sentiment...
First argument would be - take at least two 0's off your estimate. Most applications will have maybe thousands of users; successful ones will maybe run with tens of thousands. If you're lucky enough to work on an application with hundreds of thousands or millions of users, you work at a FAANG, not a typical "software house".
Second argument is - most users use 10-20 apps in a typical workday; your application is most likely irrelevant to them.
Third argument is - most users would save much more time by learning how to properly use the applications (or the computer) they rely on daily than from someone optimizing some function from 2s to 1s. Of course that's hard, because they have 10-20 daily apps plus god knows how many others used less often. Still, I see people doing super silly stuff in tools like Excel, or not even knowing copy-paste - so we're not even talking about command-line magic.
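To put rough numbers on that third argument - all figures below are made up purely for illustration:

```python
# Back-of-envelope: aggregate value of a 2s -> 1s optimization
users = 5_000          # a "successful" typical software-house app
uses_per_day = 3       # times each user hits the optimized function daily
seconds_saved = 1      # the 2s -> 1s win

hours_saved_daily = users * uses_per_day * seconds_saved / 3600
print(f"optimization: {hours_saved_daily:.1f} user-hours saved per day")  # ~4.2

# vs. one user learning a shortcut that saves them 5 minutes a day
print(f"training one user: {5 / 60:.3f} hours saved per day")
```

With made-up numbers like these, teaching ~50 users one small trick matches the entire aggregate win of the optimization - which is the point: the comparison depends entirely on user count, so plug in your own.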
Explain to me how you self-host a git repo - with no budget, spending no money - that is accessed millions of times a day by CI jobs pulling packages.