I 100% agree with you. I should be able to run any code on my device (after I flip a bit and it gets wiped). The first thing I've done with every Nexus or Pixel purchase is to wipe and root it.
But that's not what this is about. Apple has been enforcing these rules for years. F.lux tried to get around the App Store by teaching users how to sideload via Xcode. Apple killed it.
The big players should be subject to the same rules. If they want to run their own code, they can't just flagrantly ignore Apple's TOS.
I'm also on board with the Nielsen metaphor, but not when it comes to kids. And both were scummy in targeting kids (though FB was definitely worse, judging from the marketing materials).
> F.lux tried to get around the App Store by teaching users how to sideload via Xcode. Apple killed it.
Specifically, Apple killed it because f.lux chose to distribute their app in a really sketchy manner: rather than having users compile the app from source and install the build product, they essentially pushed an opaque binary blob to the phone.
I don't understand: Do you expect to be able to run any programs you want on the micro-controller on your washing machine?
That's what Apple is doing here: pushing the iPhone as a commodity, not a replacement for your MacBook. This way they get the benefits of controlling the experience as much as they want. (I am not saying it is right or wrong, just that many people are fine with commodity phones and don't mind the loss of configurability.)
I'm not sure where you're getting the "legal responsibility" part from - I'm not advocating legislation, simply stating my personal preferences as a consumer. I do what I can to try and bring others around to my point of view, but I am in no way trying to push this as a legal burden on manufacturers. Please don't bring strawman arguments into this; the topic is complex and nuanced enough as-is.
Regarding security, that very much depends on your threat model and your definition of "secure". Indeed, I see this general trend of decreasing user control over increasingly complex and connected hardware as a massive security threat in itself: I am forced to trust multiple third parties who may arbitrarily disrupt my life any time new "features" or "policies" get pushed out.
It is perfectly possible to securely implement a tamper-evident "I know what I'm doing" switch/fuse that enables advanced control by device owners. However, I'm well aware that I'm in the minority on this topic, so I'm not holding my breath for such features to be implemented.
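To make that concrete, here's a minimal sketch of what I mean, loosely modeled on the bootloader-unlock flow mentioned upthread for Nexus/Pixel devices. All names here are illustrative, not any real vendor's firmware; the point is just that flipping the switch wipes user data and is announced at every subsequent boot, so it can't be toggled silently.

```python
# Conceptual sketch only -- illustrative names, not any real vendor's firmware.
# The owner-override bit is tamper-evident because flipping it wipes user
# data and is advertised at every subsequent boot.

from dataclasses import dataclass


@dataclass
class DeviceState:
    owner_unlocked: bool      # the "I know what I'm doing" fuse/bit
    user_data_present: bool


def flip_owner_switch(state: DeviceState) -> DeviceState:
    # Destructive on purpose: nobody can quietly unlock a device that holds
    # your data, and the change is recorded in persistent state.
    return DeviceState(owner_unlocked=True, user_data_present=False)


def boot(state: DeviceState, image_signature_valid: bool) -> str:
    if state.owner_unlocked:
        # Shown on every boot, so the override stays evident to the owner.
        return "WARNING: owner unlock active; booting unverified image"
    if image_signature_valid:
        return "booting vendor-signed image"
    return "refusing to boot: signature verification failed"


if __name__ == "__main__":
    locked = DeviceState(owner_unlocked=False, user_data_present=True)
    print(boot(locked, image_signature_valid=True))
    print(boot(flip_owner_switch(locked), image_signature_valid=False))
```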
I don't use a washing machine controller as a general purpose computing device. I don't install apps on a washing machine and would never buy one that had that feature. It's not a reasonable comparison.
That said, I doubt washing machine microcontrollers use signed code. It's easier to modify them than your phone, which is completely backwards.
> Do you expect to be able to run any programs you want on the micro-controller on your washing machine?
Yes. It is within my full legal right to install whatever programs I want on my washing machine.
Apple lost a bunch of lawsuits when it tried to sue people for doing this. The courts ruled that yes, you do have a legal right to do whatever you want with hardware that you own.
Sure. And Apple aren’t going to sue you if you manage to. It sounds like you’re confused between “I should be able to” and “Apple should make it easy for me to”.
However, you could get a developer license and load anything you want on your phone. Granted, it's not for everybody, but if you're so inclined to sideload apps on your phone, you can pretend to be a developer (meaning you just need to know enough to use the available tools, not in a demeaning way).
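For the curious, here's a rough sketch of what "using the tools" amounts to from the command line. The project name, scheme, and team ID below are placeholders; it assumes macOS with Xcode installed and an Apple ID added in Xcode's Accounts settings.

```python
# Rough sketch: build an iOS app from source with automatic code signing.
# MyApp.xcodeproj, the "MyApp" scheme, and the team ID are placeholders.
import subprocess

TEAM_ID = "ABCDE12345"  # your (possibly free) personal team from Xcode's Accounts settings

subprocess.run(
    [
        "xcodebuild",
        "-project", "MyApp.xcodeproj",
        "-scheme", "MyApp",
        "-destination", "generic/platform=iOS",
        "-allowProvisioningUpdates",        # let Xcode create/refresh a provisioning profile
        f"DEVELOPMENT_TEAM={TEAM_ID}",
        "CODE_SIGN_STYLE=Automatic",
        "build",
    ],
    check=True,
)
# Installing the resulting .app onto a connected device is then a matter of
# hitting Run in Xcode, or using a third-party tool such as ios-deploy.
```

Note that with a free account the resulting provisioning profile expires after about a week, so personal builds have to be reinstalled periodically; the paid developer program extends that to a year.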