I am curious where you get that impression. From what I could see, recent releases have just been "smoothing over pits" / filling in obvious type holes left over from older changes. Changes in rust edition 2021 are tiny and it seems to only exist in order to establish a regular cadence.
Disclaimer:
I've only dabbled in Rust from time to time, but have not written anything serious with it.
Those kinds of comments are not based on looking at Rust's features on a theoretical level.
It's more from the viewpoint of "How hard is it to get something (correctly) done". Those impressions are mostly formed either from using the language and failing (or having severe difficulties) to solve a given problem, or being exposed to blog posts describing those kinds of problems.
Rust sure does not have as many features as C++, but it seems that in practice, the combination of available features (and/or the problems those create) is perceived as just as complex as C++.
Though this is not a fair assessment, the Rust community will have to fight this perception if Rust is not to be labeled far more complex than C++ in the coming years.
I think the problem here is that Rust exposes a lot of additional problems in a domain that other languages sweep under the rug. I mean, those problems have always existed; we're just used to hoping or assuming that the generic solution within a language runtime will save us from them.
I've been developing mostly in Rust for the last five years. C++ for decades before that.
It is always comforting to dismiss something that you haven't made much effort to understand as unnecessarily overcomplex. I think we're all guilty of that at times.
Ah, I see. But the systems crowd (I'm going to lump them all together: microcontroller, kernel, high-performance networking, etc.) is supposed to be the main target audience. For me it is hard to believe that so many of them just brush it off as too complex on first sight.
On the other hand, maybe those folks decided, after giving it a serious try, it's not worth it.
Mind you, that does not mean they are right. But in the end, if the effort for those people is perceived as too high, then maybe, just maybe, in practice it really is.
After more than 25 years as a developer, I learned the hard way that programming languages are, in the end, a matter of taste. You won't get someone to use a language just by citing its feature list. Programming languages are a product made for developers. If they don't like it (for whatever reasons, and yes, these are at times highly subjective), they won't use it.
Sure, but we aren't making decisions in a vacuum with no consequences.
If I really like C and I ship a bunch of C code that gets exploited to build a botnet, that decision had consequences. I should have valued safety higher and the fact I didn't "like" safe alternatives was a distraction.
It's a stupid fad where you can "own" a digital asset (jack's first tweet in this case) using blockchain tech. People are comparing it to owning a physical asset, like an original painting by Leonardo da Vinci, but that's nonsense.
This data is stored on the Matic blockchain. The most important bits are
1) Text contents of the tweet
2) Canonical link to tweet (on twitter)
3) Digital signature by owner (using their ethereum wallet)
Valuables (the service minting tweets) is also responsible for linking a twitter account to an ethereum address. Also, these tweets are purely collectibles, i.e. they don't confer copyright or commercial rights to the buyer.
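To make the three pieces above concrete, here is a minimal sketch of what such a minted record could look like. The field names, hashing scheme, and `MintedTweet` class are my own illustrative assumptions, not the actual Valuables/Matic contract layout.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of the record a service like Valuables might mint.
# Field names and the hashing scheme are assumptions for illustration,
# NOT the real on-chain format.
@dataclass
class MintedTweet:
    tweet_text: str        # 1) text contents of the tweet
    tweet_url: str         # 2) canonical link to the tweet on twitter.com
    owner_signature: str   # 3) signature from the seller's ethereum wallet

    def content_hash(self) -> str:
        """Deterministic hash of the record, e.g. for on-chain storage."""
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

record = MintedTweet(
    tweet_text="just setting up my twttr",
    tweet_url="https://twitter.com/jack/status/20",
    owner_signature="0x...",  # placeholder; in reality an ECDSA signature
)
print(record.content_hash())
```

The key property is that the hash is deterministic: anyone holding the same three fields can recompute it and check it against what was written to the chain.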
Perfect, thanks, that's what I was looking for (and what none of the news coverage seems to mention).
So when you want to sell a tweet, Valuables verifies who you are and that you own the account (presumably), they produce one certificate and promise never to produce another, and then it goes into the blockchain.
So now there is one and only one entry for this tweet in the Valuables Twitter namespace, but there can be more from other organizations and on other blockchains.
It's going to become even more of a collectors' item when Valuables shuts down, or Twitter changes their URL scheme, or we're not using https:// anymore ;)
The types of analysis and programming practices used to send stuff to Mars is beyond what Rust, or D, or any other safer-systems-language tries to do. It's not that simple.
These types of projects effectively need to prove the absence of bugs using formal verification and very extensive testing. Surprise, surprise: C makes this extremely expensive and theoretically difficult, too.
For example: NASA wrote this project https://github.com/NASA-SW-VnV/ikos which uses abstract interpretation and would catch bugs in practically any language.
Do you know what subset of C NASA limits itself to? Or hardware architecture? The rigour of their testing? Should all C developers follow the same restrictions as NASA?
Hardware (according to Wikipedia) is a BAE Systems RAD750 radiation-hardened single board computer based on a ruggedized PowerPC G3 microprocessor (PowerPC 750). The computer contains 128 megabytes of volatile DRAM, and runs at 133 MHz.
Personally I firmly believe that "all C developers" do not need to follow these regulations. It might even be counter-productive to slow down the development process for some clients. For safety-critical systems, these rules make sense. For little startups, they don't.
Developers are smart enough to learn these rules, so HR shouldn't ask for "5 years MISRA experience". It's really a choice of business model, time to market, and risk management. If you're a big company looking to cut costs, be careful about outsourcing firmware development to a little startup who might not follow these rules so strictly. I won't follow these rules for the stuff I throw together in my free time and put on Github, but I will be careful before committing code to master for medical device firmware.