In Rust's case, part of that is that LLVM is very C-optimized. It's not great at taking advantage of Rust's stricter rules yet, though it is getting better.
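A tiny sketch of the kind of rule I mean (my own illustration, not anything from the article): Rust guarantees `&mut` references never alias, which the compiler passes down to LLVM as `noalias`, a promise plain C pointers can't make:

```rust
// `a` and `b` are `&mut`, so Rust guarantees they never alias. The
// compiler may fold the return value to the constant 5 without
// re-loading `*a` after the store to `*b`. The equivalent C function
// taking two `int *` must assume they might point at the same int
// (where the answer would be 6), so it has to re-read memory.
fn store_and_sum(a: &mut i32, b: &mut i32) -> i32 {
    *a = 2;
    *b = 3;
    *a + *b
}

fn main() {
    let (mut x, mut y) = (0, 0);
    println!("{}", store_and_sum(&mut x, &mut y)); // prints 5
}
```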
The main problem of WASI, in my opinion, is that it's based on a standard which is 30 years old. And yes, the idea was cool: let you run even C without having to rewrite libc. But the reality is that we can't do this now, and it's still a big question whether we can in the near future. Besides, C is probably the only language that needs such a low-level API, but everyone has to suffer for it, essentially reinventing libc on top of WASI with their own language runtime above that. This is simply a waste of binary space.
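To make "low-level" concrete, here's a rough sketch of calling the raw preview1 `fd_write` import directly from Rust (built with `--target wasm32-wasi`); wasi-libc, and every language runtime stacked on top of it, is essentially wrapping calls like this:

```rust
// The raw WASI preview1 ABI: a libc-shaped syscall surface that every
// guest language ends up re-wrapping. Ciovec mirrors preview1's
// scatter/gather buffer descriptor.
#[repr(C)]
struct Ciovec {
    buf: *const u8,
    buf_len: usize,
}

#[link(wasm_import_module = "wasi_snapshot_preview1")]
extern "C" {
    // fd_write(fd, iovs, iovs_len, nwritten) -> errno
    fn fd_write(fd: u32, iovs: *const Ciovec, iovs_len: usize, nwritten: *mut usize) -> u16;
}

fn main() {
    let msg = b"hello from raw WASI\n";
    let iov = Ciovec { buf: msg.as_ptr(), buf_len: msg.len() };
    let mut written = 0;
    // fd 1 is stdout, exactly as in POSIX.
    unsafe { fd_write(1, &iov, 1, &mut written) };
}
```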
Also, it led to fragmentation in the WebAssembly ecosystem. A program compiled for wasm32-wasi cannot run in a browser, only in wasmtime, wasmer or Node.js. WASI-polyfill.js is very limited, adds a bunch of kilobytes of glue code, and still hasn't even been officially introduced.
As a relative newcomer to WebAssembly, the first thing I noticed was a disconnect between WebAssembly, which is loosely governed by the W3C process, and WASI, a subgroup that never released a draft; the closest thing is the module ID people tend to use, wasi_snapshot_preview1. The next surprise was finding that most implementations usable by end users (e.g. language compilers, not necessarily wasi-libc) employ a small subset of the APIs on a whack-a-mole basis. Then you find that most attempts at a test kit failed, and unlike the core spec, which has a way to say whether you passed or not, there's neither decent documentation nor a test kit for WASI. Then, when issues are raised, some of the copy/paste CloudABI problems you mention are conceded even by the project team. However, there's no usable infrastructure to bridge to whatever comes next (I think it's based on the component model, but supposedly usable without it).
My feeling is that the amount of surprise I had was brought about by the overwhelming marketing that WASI was basically this thing that gives you a capabilities-based model for system calls, when in practice it was a semi-implemented, glorified markdown file that was not part of any standardization process, at least not the W3C draft → REC track.
Now, I look at some of the tensions and they seem to imply this solved the backend, but not the frontend :) Well, I would say that it got backends turning, basically pushing the abstraction cliff a bit farther out than ad-hoc host ABIs times X targets. This was surely helpful, as it lets people get much farther along, especially when everything uses LLVM and libc. However, I hope whatever comes next brings a bit more rigor, as the first round was an example of how to thrash a lot of people. A spectest shouldn't be an optional thing in something called a standard, and detailed mapping documents can help people implement the paper and pass those tests. I think whatever happens next can fix things; it will be a matter of both avoiding a copy/pasta spec and adding far more rigor.
Regardless, I'm personally happy to continue supporting those who have used WASI and its alternatives, including the next WASIs. Progress is important, and I think WASI, albeit stumbling, progressed the backend a lot.
Cap'n Proto is so much better.
Really, a lot of them are, but Cap'n Proto was made by the original protobuf dev, so his description of why he made it is relevant.
I use protobuf every day at work, and honestly the generated code is incredibly awful compared to pretty much any alternative. We're way too bought in to switch now, though, sadly.
Unfortunately, melony is right: I've only really been able to focus on the C++ implementation of Cap'n Proto. Some of the others are solid (e.g. Go, Rust), but many are sort of half-baked weekend projects of random people, which were later abandoned. As a result it's tough for me to recommend Cap'n Proto in polyglot environments. Maybe someday I'll be able to find the time / resources to build out the ecosystem more fully. Great tooling is really important for any of this to work.
I'm also an investor in Buf FWIW. I think what they're doing is pretty cool.
If you find the quality of DDG results lacking (I know I did; they're basically just a Bing wrapper, after all), then give Kagi a shot. It's a subscription search engine, but in my opinion it's more than worth the cost, especially for programming queries.
I responded to a similar comment here earlier; I've reproduced that below, which may give some context:
I don't have much hope for it. I know several people who worked on Amazon Lumberyard (its old name) and quit because that organization is collapsing in on itself, losing engineers far faster than it can hire, and the engine is still buried under a mountain of tech debt inherited from CryEngine (debt the real CryEngine resolved a long time ago).
One quit because the new renderer was so performance-intensive that an RTX 2080 Super was the minimum spec for the lightest of scenes to reach 60fps.
Another because he became the last engineer on his team after all the others quit and moved elsewhere at Amazon.
Even hoping open source saves it is unlikely, as it's open source in name only. The CI and infrastructure needed to meaningfully develop it in the open do not exist.
As a final nail, the install process takes over two hours, which is just not competitive.
It's unfortunate that it's not available for Firefox mobile; for me at least, mobile is where I want translation, for example when traveling and viewing the sites of businesses.
On the plus side, though, I'm not seeing a complete inability to translate in Nightly, since the context menu lets me translate using Google Translate (I realise many won't want to send their text off to G, but this does at least show up one of the limitations claimed in the article, that it simply wasn't possible at all).
And even if it weren't architecture-agnostic, it's pretty easy to translate restricted, well-behaved machine code between architectures. Much easier than dealing with JavaScript itself.
There is none; one of the main objectives of Wasm was to be a machine-agnostic bytecode, similar to JVM bytecode, for example. People have even built Wasm VMs on FPGAs.
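For a concrete picture, here's a minimal sketch (assuming the wasmtime crate) showing that the same module bytes run unchanged on any host the VM supports; only the VM itself is ISA-specific:

```rust
use wasmtime::{Engine, Instance, Module, Store};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // A tiny module in WebAssembly text format; these same bytes run
    // unchanged on x86, ARM, RISC-V... anywhere the host VM runs.
    let wat = r#"
        (module
          (func (export "add") (param i32 i32) (result i32)
            local.get 0
            local.get 1
            i32.add))
    "#;

    let engine = Engine::default();
    let module = Module::new(&engine, wat)?; // compiled for the *host* ISA here
    let mut store = Store::new(&engine, ());
    let instance = Instance::new(&mut store, &module, &[])?;

    let add = instance.get_typed_func::<(i32, i32), i32>(&mut store, "add")?;
    println!("{}", add.call(&mut store, (2, 3))?); // prints 5
    Ok(())
}
```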