How is local more private? Whether AI runs on my phone or in a data center, I still have to trust third parties to respect my data. That leaves only latency and connectivity as possible reasons to wish for endpoint AI.
If you can run AI in airplane mode, you are not trusting any third party, at least until you reconnect to the Internet. Even if the model were malware, it couldn't exfiltrate any data before you reconnect.
You’re trusting the third party at training time, to build the model. But you’re not trusting it at inference time (or at least, you don’t have to, since you can airgap inference).
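For concreteness, here's a minimal sketch of what airgapped inference can look like, assuming the weights were already downloaded (that's the training-time trust) and using Hugging Face transformers with offline mode forced so generation makes no network calls. The model path and prompt are made up for illustration:

```python
import os

# Refuse all Hugging Face Hub network access; only locally cached files are used.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical path to weights fetched earlier, while still online.
MODEL_DIR = "./models/some-small-llm"

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, local_files_only=True)

# Inference happens entirely on this machine; nothing leaves the device.
inputs = tokenizer("Summarize my notes:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With the radio off (or the offline flags set), even a maliciously trained model has no channel to send your prompts or files anywhere.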