The problem is that the current "solution" is to cache based on the URL, which breaks if the URL is not accessible, as in this instance.
The suggestion solves that issue by keying the cache on hashes of the files instead. It no longer matters whether a script is loaded from a remote/CDN URL or from the same server: once the hash matches, the browser considers it cached and serves it from cache.
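The hash-matching idea is essentially the same content digest browsers already use for Subresource Integrity. A minimal sketch of how such a content-based cache key could be computed (the function name and example script body here are illustrative, not from any proposal text):

```python
import base64
import hashlib

def content_cache_key(body: bytes) -> str:
    """Compute an SRI-style digest of a script body.

    A browser keying its cache on this value, rather than on the URL,
    could serve the same bytes no matter which origin they came from.
    """
    digest = hashlib.sha384(body).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

# Two copies of the same script, hosted at different URLs,
# produce the same key and therefore hit the same cache entry.
script = b"console.log('hello');"
print(content_cache_key(script))
```

The point of the sketch is only that the key depends on the bytes alone, so a copy served locally and a copy served from a blocked CDN are interchangeable once either one is cached.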
Realusername would like "a cross-website cache for public scripts", and that's exactly what this provides: every site loads the version already in your browser's cache instead of downloading it again. The stated problem is that "each of them is downloaded thousands of times", and this fixes that.
But you're presuming that the shared URL is reachable from the browser. The whole point of this story is that the presumption fails for internet users in China, and I'd bet the same is true for users in Iran, North Korea, and any other embargoed nation. Realusername's solution was an attempt to solve the problem for everyone without writing off the billion or so users unfortunate enough to live in repressive countries.
But, you know, you live in the US and your solution works for everyone in the US, so F everyone who doesn't.