This is very much a compromise where you decide for yourself that storage capacity, and maybe throughput, matter more than anything else.
The md metadata isn't adequately protected. Btrfs checksums can tell you when a file has gone bad, but since btrfs doesn't own the redundancy it can't self-heal from a good copy. And I'm sure there are caching/perf benefits left on the table by not having btrfs manage all the block storage itself.
> What slice of my mortality pie was radon before and after spending $5000?
You'll never know. The same way people in the exclusion zone will never know if their thyroid cancer was always destined to be or if it really was related to the Chernobyl meltdown.
But spending the money (closer to $1000 in practice) to mitigate some risk from a known threat vector does seem thrifty.
> Why hire an intern or a recent college-grad when they lack both the expertise and experience to do what an AI could probably do?
AI can barely provide the code for a simple linked list without dropping NULL pointer dereferences every other line...
Been interviewing new grads all week. I'd take a high-performing new grad who can be mentored into the next generation of engineers any day.
If you don't want to do constant hand-holding with a "meh" candidate... why would you want to do constant hand-holding with AI?
> I often find myself choosing to just use an AI for work I would have delegated to them, because I need it fast and I need it now.
Not sure what you're working on. I would never prioritize speed over quality - but I do work in a public safety context. I'm actually not even sure of the legality of using AI for design work, but we have a company policy that all design analysis must still be signed off on by a human engineer, in full, as if it were 100% their own work.
I certainly won't be signing my name on a document full of AI slop. Now an analysis done by a real human engineer with the aid of AI - sure, I'd walk through the same verification process I'd walk through for a traditional analysis document before signing my name on the cover sheet. And that is something a jr. can bring to me to verify.
> What an average person wants in their desktop is Windows - not Linux and certainly not some obscure independent distro. And this is still not a problem of that distro or Linux.
The average person doesn't even want Windows. They want to click a button and not be bothered with the implementation details.
That is why mobile/tablet is such a popular form of computing these days. People don't even have to learn the basics of interfacing with a file system most of the time. Want to look at pictures you've taken? You can be oblivious to the fact that your camera app puts picture files in a specific directory and embeds a date code in the file name; the photo viewer app takes care of all that for you.
15 years ago almost everyone would have recognized the name because of the popularity of the Iron Man movies.
10 years ago Jarvis was folded into Vision in Age of Ultron and effectively no longer exists in the MCU. A variety of new AI assistants with new names were introduced in later movies.
None of the new ones became as recognizable, and I guess Jarvis is also falling into obscurity.
Don't read too much into my only vague awareness (or the unawareness of the person I responded to) - I think I've seen one Iron Man film, or maybe only bits of it. I'm just not into all that Marvel/DC/superhero stuff.
(And as a student I worked Saturdays at a cinema, so there's a certain era for which I've seen at the very least many odd scenes, out of order, from essentially all widely released films...)
> GCC Go does not support generics, so it's currently not very useful.
I don't think a single one of the Go programs I use (or have written) uses generics. If generics support is the only sticking point, then that doesn't seem like much of a problem at all.
> You’re also at the mercy of the libraries you use, no?
To a certain extent. No one says you must use the (presumably newer) version of a library that uses generics, or even use libraries at all. Although for any non-trivial program, that's probably not how things are going to shake out for you.
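For example (module path and version below are made up), your go.mod can simply stay pinned to a release of a dependency from before generics landed in Go 1.18:

    module example.com/myapp

    go 1.17

    // Hypothetical dependency, pinned to a release that predates generics.
    require github.com/example/legacylib v1.2.3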
> Which likely makes this an increasingly niche case?
This assumes that dependencies in general will, on average, converge on using generics. If your assertion is that this is the case, I'm going to have to object: there are a great many libraries out there today that were feature-complete before generics existed and are therefore effectively only receiving bug-fix updates, with no retrofit of generics in sight. And there is no rule that dictates all newly written libraries _must_ use generics.
I just used them today to sort a list of browser releases by their publication date. They're not universal hammers, but sometimes you do encounter something nail-shaped that they're great at.
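Roughly what that looks like (the type and data here are made up for illustration, assuming Go 1.21+ for the standard-library slices package):

    package main

    import (
        "fmt"
        "slices"
        "time"
    )

    // Release stands in for whatever struct holds the release info.
    type Release struct {
        Name      string
        Published time.Time
    }

    func main() {
        releases := []Release{
            {"Browser B 2.0", time.Date(2024, 5, 14, 0, 0, 0, 0, time.UTC)},
            {"Browser A 1.0", time.Date(2024, 3, 19, 0, 0, 0, 0, time.UTC)},
        }
        // slices.SortFunc is generic over the element type, so there's no
        // sort.Interface boilerplate and no interface{} type assertions.
        slices.SortFunc(releases, func(a, b Release) int {
            return a.Published.Compare(b.Published)
        })
        for _, r := range releases {
            fmt.Println(r.Published.Format("2006-01-02"), r.Name)
        }
    }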
In general, no. Most of the coal companies went bust, and the rights are now owned by gas and/or fracking companies or have been consolidated under one of the surviving companies.
> Also 60fps is pretty low, certainly isn't "high fps" anyway
Uhhhhhmmmmmm....what are you smoking?
Almost no one is playing competitive shooters and such at 4k. For those games you play at 1080p and turn off lots of eye candy so you can get super high frame rates because that does actually give you an edge.
People playing at 4k are doing immersive story-driven games, and a consistent 60fps is perfectly fine for that; you don't really get a huge benefit going higher.
People that want to split the difference are going 1440p.
Anyone playing games would benefit from a higher frame rate, no matter their use case. Of course it's most critical for competitive gamers, but someone playing a story-driven FPS at 4k would still benefit a lot from frame rates higher than 60.
For me, I'd rather play a story-based shooter at 1440p @ 144Hz than 4k @ 60Hz.
You seem to be assuming that the only two buckets are "story-driven single player" and "PvP multiplayer", but online co-op is also pretty big these days. FWIW I play online co-op shooters at 4K 60fps myself, but I can see why people might prefer higher frame rates.
Games other than esports shooters and slow-paced story games exist, you know. In fact, most games fall into this category you completely ignored for some reason.
Also, nobody is buying a 4090/5090 for a "fine" experience. Yes, 60fps is fine. But better than that is expected/desired at this price point.