
But the maintenance bar has been raised. It's no longer the case that you can set up a web server as a hobby and leave it running untouched for years. Now you won't connect at all in 2 months if you don't set up letsencrypt correctly. And their API and root certificate also change regularly, so you must keep software up to date. If you use some cross-domain stuff to interact with outside services from the browser, that now breaks every year or two. Things like that add up.


> And their API and root certificate also change regularly, so you must keep software up to date.

Their API deprecated one method with a security risk once, and their root certificate is none of your concern if you run a webserver (and it also only changed once, not "regularly"). Their certificate chain is an issue that may concern you, but if your software is working correctly, it should just serve the chain you get with a new cert.
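(Concretely, with nginx and certbot's default paths - a sketch, with example.com as a placeholder - serving the chain means pointing at the fullchain file certbot rewrites on every renewal:)

    # fullchain.pem = leaf certificate + intermediates, regenerated at renewal
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;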


That's a lot of ifs and buts just to keep up with the last decade's implementations, for a simple blog that maybe ten people a month read. Good luck keeping up with the developments of the next one...

Whether it's Let's Encrypt or Google or Apple or Facebook, the internet has largely moved away from a culture of small-time hackers operating on barebones standards to super-complex implementations gatekept by a few huge companies with infinite resources and conflicting values. They want to curate the web and monetize it, not lower the barrier to entry. You are free to use their ecosystems to produce things they can share revenue from, but everything else will only keep getting harder... what even is the web anymore but marketing landing pages redirecting to walled gardens?


That dichotomy is false; there are lots of levels between "self-host everything" (and deal with the pain of maintenance) and "walled gardens". For "just a blog", good old shared hosting works just as well as it did in the 90s/00s.


It's not really true, but essentially it is.

It used to be that a web server was something you could almost auto-deploy. Then it became a series of increasingly complex steps as various 'security' measures were employed. You can do these things yourself, and they aren't that hard, but they were never made easy in a way that didn't require a lot of specific technical know-how. I kept up with it for a while, but eventually everyone has to deal with the real world and its time constraints, and the 'security' of today imposes undeniable barriers compared to the yesteryears of the web.

I'm not convinced this browser change is a good thing - I think the issue is the aforementioned crap on personal networks, not the ability for a browser to go there. If your security is endangered by your shitty dishwasher, either don't connect it, or, since you are doing the connecting, put it on an isolated private network. This move encourages bad security practices while throwing yet another roadblock in the way of legitimate uses of 'home' software.


You do realize that the managed website hosting of the late 90s/early 00s still exists today, right?

You don't have to stand up your own servers in your favorite cloud provider and become a Cloud DevOps expert. You don't have to manage deployments, dependencies, etc. You can still pay $3/month to get shared hosting on DreamHost, upload your HTML file, and it gets served. No fiddling with nginx, no operating system patching, etc.

Even if you don't want to pay $3/month, I'm sure there are still hosts that will give you a few megabytes of storage and a couple gigabytes of traffic for free.


Hmm I'm not sure most people mean html files when they say 'web server' - I used to run a mail server, a couple websites, a blog, a couple wikis, with auth integration, and a couple custom web apps with live two-way messaging capabilities and associated backends...

You don't need anything fancy for a plain page, no, but that's also not really what I'm talking about. I still sometimes use local services on my LAN, with web interfaces, which are NOT routers, dishwashers, etc. - think file or media management.


> Then it became a series of increasingly complex steps

honestly, what series of increasingly complex steps? The main thing today is an expectation of HTTPS, and that is added complexity, but it's also something you can auto-deploy today, with lots of tutorials available. E.g. I'm fairly sure I've spent more of my life on .htaccess and nginx redirect syntax than on HTTPS, despite starting early with Let's Encrypt and not choosing the most-automated solutions - and in other setups "add HTTPS to a domain" is literally a line of config file, with the webserver doing the rest (see the sketch at the end of this comment). But that's beside the point I made:

This is assuming that you actually are deploying something to a server, instead of making use of the myriad of ways of having that be someone else's problem. How are those "essentially" not true options?

"We can trust users and random developers to do the right thing" is understandably not the security position browsers take, so this needs some solution eventually. What the right tradeoff is is a good question. (i.e. IMHO there should be clear ways for devices to opt-in to being accessed)


(FWIW, their servers use their own certificates, so in fact I had to spend some time today updating the root certificates on a web server so that certificate renewal would work again.)
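(On Debian-ish systems that boils down to something like the following sketch - the file name here is made up:)

    # add the new root to the local trust store and rebuild it
    cp internal-root.crt /usr/local/share/ca-certificates/
    update-ca-certificates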


And try running a mail server from home. Receiving is OK, sending is a non-starter. It's hard even from a server.
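The short version of why sending fails: receivers check SPF, DKIM, DMARC, and reverse DNS, and a residential IP flunks the reputation and PTR checks no matter what you publish. The DNS side alone looks roughly like this (schematic zone-file lines; example.com, the IP, and the "mail" selector are placeholders):

    ; SPF: which IPs may send mail for the domain
    example.com.                  TXT "v=spf1 ip4:203.0.113.7 -all"
    ; DKIM: public key receivers use to verify signed mail
    mail._domainkey.example.com.  TXT "v=DKIM1; k=rsa; p=<base64 public key>"
    ; DMARC: what receivers should do when SPF/DKIM fail
    _dmarc.example.com.           TXT "v=DMARC1; p=quarantine"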


> Now you won't connect at all in 2 months if you don't set up letsencrypt correctly

So set it up correctly, or just buy a cert like in the good ole days, or just don't use any encryption like in the good ole days.

All the options from the good ole days are still available to you.


If you don't enable encryption, many browsers reduce functionality available to you.


Functionality that wasn't available in the good old days.


Downloading files? Firefox gives a warning every time you download over http.


What does the warning say? "You might not be downloading the file you think you are"? That just seems like useful, accurate information that you probably want to be aware of.


I have half a dozen websites that have been running without maintenance for the last 5+ years.


You're lucky. Or skilled. I have half a dozen websites that broke, with maintenance, in the same period. Often because of SSL that someone else had configured. Plus my own screw-ups. It's not impossible to do right, but it's definitely not trivial. Even if you configure everything right, something up the chain will probably break, in time...


Why should a webserver need maintenance?

To feed the SSL ponzi pyramid?


How can something that literally costs $0 be a ponzi pyramid?

And why should a web server need maintenance? I mean, just search Google for your favorite web server software and "CVE" and you'll find plenty of reasons.


I use a CDN (namely the one on Amazon AWS) to provide HTTPS for my website. That knocks out two things at once: fast distribution across the globe, and security. Do you wish to abstain from using a CDN?


> Do you wish to abstain from using a CDN?

Yes. You are missing the point entirely.


Let's Encrypt is easily automated with certbot; I've been running my home webserver for over 10 years on Debian and NixOS, without touching it apart from stable OS version upgrades.
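(For anyone wondering what "easily automated" looks like, a sketch with certbot's nginx plugin - substitute your own domain:)

    # one-time issuance; certbot also edits the nginx config for you
    certbot --nginx -d example.com
    # renewal is handled by the packaged cron job / systemd timer,
    # which effectively runs:
    certbot renew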


Let's Encrypt needs internet access, something I prefer not to give various (rather dated) systems on my network. Worse, several things that ran on file:// in the past have been blocked by most browsers, so even having to set up a server that then needs a valid cert is a painful complication over just clicking index.html and letting the browser handle every request locally.
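(The server part at least can be a one-liner, since browsers treat http://localhost as a secure context - e.g. with Python's standard library - though that doesn't remove the annoyance:)

    # serve the current directory at http://localhost:8000
    python3 -m http.server 8000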


For local-only access, you could run your own CA. I found gnoMint quite easy to use for generating and managing certificates; it does everything in an SQLite database. I do this for OpenVPN, but you could do it for web services just the same.
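If you'd rather script it than use a GUI, a rough openssl equivalent looks like this (a bash sketch; the names, hostname, and validity periods are placeholders, and the SAN line is required by modern browsers):

    # create a CA key and a self-signed root valid for ~10 years
    openssl req -x509 -newkey rsa:4096 -nodes -days 3650 \
        -keyout ca.key -out ca.crt -subj "/CN=My LAN CA"
    # key + CSR for a LAN host
    openssl req -newkey rsa:2048 -nodes -keyout host.key -out host.csr \
        -subj "/CN=nas.lan"
    # sign the CSR with the CA, adding the SAN browsers insist on
    openssl x509 -req -in host.csr -CA ca.crt -CAkey ca.key \
        -CAcreateserial -days 825 -out host.crt \
        -extfile <(printf "subjectAltName=DNS:nas.lan")
    # then import ca.crt into every browser/device that should trust it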



