And yet, not much has changed in that decade, right? Well, other than the Steam Deck, which is a well-defined set of hardware for a specific purpose, and which is the main driver for Linux game compatibility...
And that's great! But for a random owner of random hardware, the experience is, well... same as it ever was?
The experience on random hardware in 2025 is nowhere close to what it was in 2015. Have you tried it recently? In 2025 I can install pretty much any game from Steam on my Linux desktop with an Nvidia GPU and it just works. The experience is identical to Windows.
The 2015 experience was nothing like this; you'd be lucky to get a game running crash-free after lots of manual setup and tweaking. Getting performance comparable to Windows was just impossible.
> But for a random owner of random hardware, the experience is, well... same as it ever was?
Far from it... the only area where you tend to see much issue with a current Linux distro is a few wifi/bt and ethernet chips that don't have good Linux support. Most hardware works just fine. I've installed Pop on a number of laptops and desktops this past year and only had a couple of issues (wifi/bt and ethernet); in those cases it's either installing a proprietary driver or swapping the card for one that works.
Steam has been pretty great this past year as well, especially since Kernel 6.16, it's just been solid AF. I know people with similar experience with Fedora variants.
I think the Steam Deck's success with Proton and what that means for Linux all around is probably responsible for at least half of those who have tried/converted to Linux the past couple of years. By some metrics as much as 3-5% in some markets, which, while small, is still a massive number of people: 3-5 million regular users of desktop Linux in the US alone. That's massive potential. And with the groundwork that has been laid for Flatpak and Proton, there's definitely some opportunity for early movers in more productivity software groups, not just open source.
Gaming on Linux in 2015 was a giant PITA, and most recent games didn't work properly or didn't work at all through Wine.
In 2025 I just buy games on steam blindly because I know they'll work, except for a handful of multiplayer titles that use unsupported kernel level anticheat.
>And yet, not much has changed in that decade, right?
the performance difference between SteamOS and Windows did
>Well, other than the Steam Deck, which is a well-defined set of hardware for a specific purpose, and which is the main driver for Linux game compatibility...
>And that's great! But for a random owner of random hardware, the experience is, well... same as it ever was?
the 2025 ars technica benchmark was performed on a Legion Go S, not on a steam deck
I think the idea is that the kind of organization that would create the wayback machine in a world prior to the wayback machine is one which will also continue to push boundaries beyond that
I think that argument has a certain stasis to it, and kind of assumes that organisations maintain their energy and people (and those people are not changing!)… but there are realities where the initial push is by some people and then future maintenance is by others.
But I think the IA is a uniquely tough project because of how much the ground is shifting around them constantly. It’s not Wikipedia
And Brewster Kahle's notions about culture and information sharing start well before the Internet Archive. In theory one could pick and choose, but this is Brewster's life-long passion project. The man even outfitted a van with a printer and a binder to distribute physical books for free.
It's very strange to insist that he _not_ push the boundaries of copyright law for the common good. Without that you wouldn't have had the Wayback Machine in the first place.
Why not? This wasn't found by source review. The computer was slow, somebody looked into why. The bug was discovered via analysis of binary artifacts, and only then traced back to the source. Bruce Dawson does this all the time on Windows.
Proprietary software typically does everything within its power to stop you introspecting it.
Also, Windows is just suspicious in general. It's slow, and everything makes network requests. Finding malware in Windows is a needle in a haystack. From some perspectives, it's all malware.
It is difficult to find out why Windows is slow again. My colleagues using Windows complain about it regularly, but not even one has ever started an investigation into whether there might be a backdoor, because that would be hopeless. With open source it is feasible.
> Note that as these machines have not been released for general availability yet, supported and missing features are predictions based on what Apple has changed on a per-SoC and per-machine basis in the past. This page will change rapidly once work begins on support for these machines.
These machines have been available for quite a while.
The same text is on their M3 page, so at this point you have to assume it really means they haven't gotten to a point where the support page needs updating. Although it would be nice if they updated their page to just say that; I guess beggars can't be choosers.
Do you have a source for transglutaminase being used to put the pink slime back together? This is the first I'm hearing of it. I thought they just stirred it into the ground beef.
That article is very confused. They're saying that transglutaminase, street name meat glue, extracted from bacteria is literally the same thing as pink slime. Lean finely textured beef, street name pink slime, comes from cows. However processed, it's still beef. You can't squeeze it out of bacteria.
Reading the Wikipedia articles it's pretty clear these are different things, even if both are added to beef in some way.
Of course it's confused. I barely read it. I just remember, as a meat glue enthusiast (glue two skirt steaks together sometime! amazing!) that TG was at the heart of a "pink slime" controversy, as one of the ways manufacturers made salable products out of mechanically separated meat.
I don't, like, agree that it's a real issue! That's my point. TG is more than enough to make a food product "UPF", but a lot of TG meat products are probably a whole hell of a lot better for you than non-UPF "olive-oil fried potato chips".
"has no parameters" is not the same as "cannot take arguments". Defining `int main()` does not stop the runtime from passing the usual 3 arguments (typically named argc, argv, envp), it only means that no parameters are bound to those arguments. Technically it's no problem to have a C function ignore its arguments by not binding parameters. Way too many programmers seem to not understand the difference between parameter and argument.
Surely part of the problem is having a distinct term and handling for parameters passed to functions. What is the point? It seems confusing with no upside.
Do you find the difference between abstract and concrete confusing? Or the difference between container and contents? Is that a pointless distinction with no upside?
I do agree these are useful concepts to distinguish, but I don't get the connection to the topic at hand. To me, there is just the function signature. I don't see a benefit to referring to passed values as distinct from received values. To my ear "argument" and "parameter" are perfect synonyms.
> referring to passed values as distinct from received values.
That’s not the distinction being made by those terms.
“Parameter” refers to a named variable in a function definition.
“Argument” refers to an actual value that’s passed to a function when it’s called.
It’s exactly the same as the distinction between variables and values (which you probably see the use for), just applied to the special cases of function signatures and function calls.
(As an aside, in the lambda calculus this relationship becomes a perfect equivalence: all variables are parameters and all values are arguments.)
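A minimal illustration of the distinction (my own example; the names are arbitrary):

    #include <stdio.h>

    /* x and y are parameters: named variables in the function definition. */
    int add(int x, int y)
    {
        return x + y;
    }

    int main(void)
    {
        /* 3 and 4 are arguments: the actual values supplied at the call site,
           which the parameters x and y are bound to. */
        printf("%d\n", add(3, 4));
        return 0;
    }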
> "A parameter is a special kind of variable used in a function to refer to one of the pieces of data provided as input to the function. These pieces of data are the values of the arguments with which the function is going to be called/invoked."
> "Parameters refer to the variables listed in a function's declaration, defining the input that the function can accept. Arguments, however, are the actual values passed to the function when it is called, filling the parameters during execution."
While you might be tempted to "do you" and use your own idiosyncratic definitions, I advise against it, since it makes it difficult for you to understand what others are saying, and vice versa.
lol, it's not a "you do you" thing, that's what they're actually named: "parameters" and "arguments" have distinct, objective definitions in this context, and those are them. In this specific case it's you who's using made-up words for concepts that others already have a specific name for.
...by what authority? c'mon, communication is important, and insisting on the correctness of definitions tanks that.
EDIT: however, I will concede there's good evidence for widespread usage of this, and I'll adjust my usage accordingly. Insisting on "correctness" is just asinine, though.
I believe that since C23, foo() is now a nullary function. As this is the last approved standard and it supersedes all previous standards, it is technically correct to say that de jure this is what the (unqualified) C standard mandates.
None, but that is not my point. Before C23, fn() already meant the same thing as fn(void) in function definitions, which is the situation under discussion here.
C23 changed what fn() means outside a function definition.
Oh, yeah, the codegen for the fn() itself would likely be the same, but the prototype of that definition is still a K&R function. https://godbolt.org/z/Psvae55Pr
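For anyone following along, a compact sketch of that declaration/definition difference (my own example, not the code behind the godbolt link):

    /* Pre-C23 vs C23 meaning of an empty parameter list. */

    void fn();          /* pre-C23: non-prototype (K&R-style) declaration, so
                           argument lists at call sites are not checked;
                           C23: a prototype for a function with no parameters */

    void fn() { }       /* the definition binds no parameters under any
                           standard revision */

    int main(void)
    {
        fn();           /* valid everywhere */

        /* fn(1, 2); */
        /* Pre-C23 the call above compiles (there is no prototype to check
           against) but is undefined behaviour, since the definition has no
           parameters; under C23 it is a constraint violation (an error). */
        return 0;
    }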
https://arstechnica.com/gaming/2015/11/ars-benchmarks-show-s...