Could you give an example of "clever" (bad) vs "simple" (good)?
In my experience C has a fairly simple grammar, a commonly held, simple (but wrong) execution model, and a lot more complexity lurking underneath, where it can't be so easily seen.
Abstraction is necessary to handle scale. If you have painstakingly arrived at a working solution for a complex problem like, say, locking, you want to be able to package it up and use it throughout your codebase. C lacks mechanisms to do this apart from its incredibly brittle macro facility.
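For illustration, here is a minimal sketch of what packaging a locking discipline behind a C macro tends to look like. The names (WITH_LOCK, the counter, the pthread mutex) are mine, not from the original comment, and the example also shows where the brittleness comes in: nothing stops an early return inside the body from skipping the unlock.

    /* A minimal sketch of wrapping a locking discipline in a C macro.
       WITH_LOCK and the counter are illustrative names, not anyone's real code. */
    #include <pthread.h>

    #define WITH_LOCK(mtx, body)          \
        do {                              \
            pthread_mutex_lock(mtx);      \
            body                          \
            pthread_mutex_unlock(mtx);    \
        } while (0)

    static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;
    static int counter;

    void bump(void)
    {
        WITH_LOCK(&counter_lock, {
            counter++;   /* a `return` or `goto` here would silently skip the unlock */
        });
    }

The macro works, but the compiler cannot enforce the pairing of lock and unlock, which is the kind of fragility being described.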
Ada has built-in constructs for concurrency, with contracts, and there is formal verification in a subset of Ada named SPARK, so Ada / SPARK is pretty good.
> C lacks mechanisms to do this apart from using its incredibly brittle macro facility.
We programmers are the ultimate abstraction mechanism, and refining our techniques in pattern design and implementation in a codebase is our highest form of art. The list of patterns in the Gang of Four's "Design Patterns" is not as interesting as its first 50 pages, which are seminal.
From the organization of files in a project, to the organization of projects, to class structure and use, to function design, to debug output, to variable naming as per scope, to command-line argument specification, to parsing, it's nothing but patterns upon patterns.
You're either doing patterns or you're doing one-offs, and one-offs are more brittle than C macros, are hard to comprehend later, and when you fix a bug in one, you've only fixed one bug, not an entire class of bugs.
Abstraction is the essence of programming, and abstraction is just pattern design and implementation in a codebase: the design of a functional block and how it's consumed over time.
The layering of abstractions is the most fundamental perspective on a codebase. They not only handle scale, they make or break correctness, ease of malleability, bug triage, performance, and comprehensibility; I'm sure I could find more.
The design of the layering of abstractions is the everything of a codebase.
The success of C's ability to let programmers create layers of abstractions is why C is the foundational language of the OS I'm using, as well as the browser I'm typing this message in. I'm guessing you are, too, and, while I could be wrong, it's not likely. And not a segfault in sight. The scale of Unix is unmatched.
> The success of C's ability to let programmers create layers of abstractions is why C is the foundational language of the OS I'm using, as well as the browser I'm typing this message in.
What browser are you using that has any appreciable amount of C in it? They all went C++ ages ago because it has much better abstraction and organization capabilities.
That's a fair point that I hadn't considered. I was developing C+objects as C++ was first being released in the mid-90s, and then using Borland's C++ compiler in the early 2000s, but never really thought about it as anything more than what its name implies: "C with some more abstractions on top of it".
Thank you for the correction, but I consider C++ to be just a set of abstractions built upon C, and, if you think about it, none of those structures are separate from C, but merely overlaid upon it. I mean, it is still just ints, floats, and pointers grouped using fancier abstractions. Yes, they're often nicer and much easier to use than what I had to do to write a GUI on top of extended DOS, but it's all just wrappers around C, IMO.
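To make that "overlaid upon C" view concrete, here is a rough, hand-written C analogue of a tiny class: a struct for the data plus free functions taking an explicit this-style pointer. The names are invented for illustration, and a real C++ compiler's lowering differs in many details (name mangling, exceptions, virtual dispatch, and so on).

    /* A rough C analogue of a small class: data in a struct, "methods" as
       functions taking the object pointer explicitly. Illustrative only. */
    #include <stdlib.h>

    typedef struct Point {
        int x, y;                          /* the data members: just ints */
    } Point;

    Point *Point_new(int x, int y)         /* plays the role of a constructor */
    {
        Point *p = malloc(sizeof *p);
        if (p) { p->x = x; p->y = y; }
        return p;
    }

    int Point_manhattan(const Point *p)    /* a "method" with an explicit this */
    {
        return abs(p->x) + abs(p->y);
    }

    void Point_delete(Point *p)            /* plays the role of the destructor */
    {
        free(p);
    }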
C++ is very definitely not just wrappers around C and it's pretty ridiculous to frame it like that. Or if you want to insist on that, then C doesn't exist, either, as it's just a few small abstractions over assembly.
> The success of C's ability to let programmers create layers of abstractions
You wrote several entirely valid paragraphs about how important abstractions are and then put this at the end, when C has been eclipsed by 40+ years of better abstractions.
Because programmers are creating the abstractions, not the programming language.
And there is no OS I'm aware of that will threaten Unix's dominance any time soon.
I'm not against it, but C's being so close to what microprocessors actually do seems to be the story of its success, now that I think about it.
I personally haven't written C for more than half a decade, preferring Python, but everything I do in Python could be done in C, with enough scaffolding. In fact, Python is written in C, which makes sense, because C++ would introduce too many byproducts for the tightness required of it.
I was programming C using my own object structuring abstractions as C++ was being developed and released. It can be done, and done well (as evidenced by curl), but it just requires more care, which comes down to the abstractions we choose.
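For what that kind of hand-rolled object structuring in plain C might look like, here is my own minimal sketch, with hypothetical names, not code from any of the projects mentioned; the usual idiom is a struct of function pointers standing in for a vtable:

    /* A minimal sketch of object structuring in plain C: an "interface" as a
       struct of function pointers, "inheritance" by embedding the base struct.
       All names are hypothetical. */
    #include <stdio.h>

    typedef struct Shape Shape;

    typedef struct ShapeOps {               /* the hand-rolled vtable */
        double (*area)(const Shape *self);
        const char *name;
    } ShapeOps;

    struct Shape {
        const ShapeOps *ops;                 /* each object carries its ops table */
    };

    typedef struct Circle {
        Shape base;                          /* base struct embedded first */
        double radius;
    } Circle;

    static double circle_area(const Shape *self)
    {
        const Circle *c = (const Circle *)self;   /* safe: base is the first member */
        return 3.14159265358979 * c->radius * c->radius;
    }

    static const ShapeOps circle_ops = { circle_area, "circle" };

    int main(void)
    {
        Circle c = { { &circle_ops }, 2.0 };
        Shape *s = &c.base;
        printf("%s area: %f\n", s->ops->name, s->ops->area(s));
        return 0;
    }

It works, but as said, it takes care: nothing in the language enforces that the ops table is filled in, or that the cast is applied only to the right struct.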
So, I would say "eclipsed" is a bit strong a sentiment, especially given that our new favorite programming languages are running on OSes written in C.
If I had my druthers, I'd like everything to be F# with native compilation (i.e. not running on the .NET JIT), or OCaml with a more C-ish style of variable instantiation and no GC. The impedance mismatch likely makes F# a poor choice for producing the kinds of precise abstractions needed for an OS, but that's just my opinion. Regardless, the code that runs, runs on the microprocessor, so the question really is, "What kinds of programming abstractions produce code that runs well on a microprocessor?"
I've never thought of this before, thanks for the great question.
> And there is no OS I'm aware of that will threaten Unix's dominance any time soon.
Depends on the point of view, and what computing models we are talking about.
While iDevices and Android have a UNIX-like bottom layer, the userspace has nothing to do with UNIX, being developed in a mix of Objective-C, Swift, Java, Kotlin, and C++.
There is no UNIX per se on game consoles, and even on Orbis OS, there is little of it left.
The famous Arduino sketches are written in C++, not C.
Windows, dominant in the games industry to the point that Valve failed to attract developers to write GNU/Linux games and had to come up with Proton instead, is not UNIX. The old-style Win32 C code has been practically frozen since Windows XP, with very few additions; since Windows Vista the OS has become heavily based on C++ and .NET code.
While macOS is UNIX certified, the userspace that Apple (or NeXT before the acquisition) actually cares about has very little to do with UNIX and C; it is rather Objective-C, C++, and Swift.
In the cloud-native space, with managed runtimes on application containers or serverless, the exact nature of the underlying kernel or type 1 hypervisor is mostly irrelevant to application developers.
> I'd like everything to be F# with native compilation
This already works today (even with GUI applications): just define replacements for printfn that don't rely on unbound reflection (2 LOC) and you're good to go: dotnet publish /p:PublishAot=true
To be clear, in .NET both the JIT runtime and ILC (the IL AOT compiler) drive the same back-end. The compiler itself is called RyuJIT, but it really serves all kinds of scenarios today.
> makes F# a poor choice for producing the kinds of precise abstractions needed for an OS
You can do this in F#, since it has access to all the same attributes for fine-grained memory layout and marshalling control that C# does, but the experience of using C# for this is better (it is also, in general, better than using C). There are a couple of areas where F# is less convenient than C#: it lacks C#'s lifetime analysis for refs and ref structs, and its pattern matching does not work on spans and, again, is problematic with ref structs.
> there is no OS I'm aware of that will threaten Unix's dominance any time soon
True, but irrelevant?
> What kinds of programming abstractions produce code that runs well on a microprocessor
... securely. Yes, this can be done in C-with-proofs (seL4), but the cost is rather high.
To a certain extent microprocessors have co-evolved with C because of the need to run the same code that already exists. And existing systems force new work to be done with C linkage. But the ongoing CVE pressure is never going to go away.
I'm not at all against a new model providing a more solid foundation for a new OS, but it's not going to be garbage collected, so the most popular of the newer languages make the pickings slim indeed.
> But the ongoing CVE pressure is never going to go away.
I think there are other ways to deflect or defeat that pressure, but I have no proof or work in that direction, so I really have nothing but admittedly wild ideas.
However, one potentially promising possibility in that direction is the dawn of immutable kernels. Once again, that's just an intuition on my part, and they can likely be defeated eventually, if only through weaknesses in the underlying hardware architecture, although newer techniques such as timing attacks should be easier to detect because they rely on massive brute force.
The question, to me, is "Can whittling away at the inherent weaknesses reduce the vulns to a level of practical invulnerability?" I'm not hopeful that it can, but seeing the amount of work a complete reimplementation would require, it may simply be the best approach from a cost-benefit perspective: far fewer bugs and vulns is more achievable than guaranteed perfection. And, once again, such perfection would require the hardware architecture to be co-developed with the OS and its language to really create a bulletproof system, IMO.
Yes, I agree, that is why I am put off by some supposed C replacements that are trying to be clever with their abstractions or constructs.