"Heck, why isn't there a cornucopia of new apps, even trivial ones?"
There is. We had to basically create a new category for them on /r/golang because there was a quite distinct step change near the beginning of this year where suddenly over half the posts to the subreddit were "I asked my AI to put something together, here's a repo with 4 commits, 3000 lines of code, and an AI-generated README.md. It compiles and I may have even used it once or twice." It toned down a bit but it's still half-a-dozen posts a day like that on average.
Some of them are at least useful in principle. Some of them are the same sorts of things you'd see twice a month, only now we can see them twice a week if not twice a day. The problem wasn't necessarily the utility or the lack thereof, it was simply the flood of them. It completely disturbed the balance of the subreddit.
To the extent that you haven't heard about these, I'd observe that the world already had more apps than you could possibly have ever heard about and the bottleneck was already marketing rather than production. AIs have presumably not successfully done much about helping people market their creations.
This is about minimizing attack surface. Not only could secrets be leaked by hacking the OS process somehow to perform arbitrary reads on the memory space and send keys somewhere, they could also be leaked with root access to the machine running the process, root access to the virtualization layer, via things like rowhammering potentially from an untrusted process in an entirely different virtual context running on the same machine, and at the really high end, attacks where government agents seizing your machine physically freeze your RAM (that is, chill it to very low temperatures) and read it out later. (I don't know if that is still possible with modern RAM, but even if it isn't, I wouldn't care to bet much on the proposition that they don't have some other way to read RAM contents out if they really, really want to.) This isn't even intended as a complete list of the possibilities, just more than enough to justify the idea that in very high security environments there's a variety of threats that come from leaving things in RAM longer than you absolutely need to. You can't avoid having things in RAM to operate on them, but you can ensure they are as transient as possible to minimize the attack window.
If you are concerned about secrets being zeroed out, in almost any language you need some sort of support for it. Compilers for non-GC'd languages are prone to optimizing away the zeroing of memory before deallocation, because under normal circumstances a write to a value just before deallocation that is never observably read can be dropped without visible consequence to the rest of the program. And as compilers get smarter, it gets harder to fool them with code: simply reading the memory afterwards with no further visible effect might have been enough to fool 20th-century compilers, but nowadays I wouldn't count on my compiler being that stupid.
There are also plenty of languages where you may want to use values that are immutable within the context of the language, so there isn't even a way to express "let's zero out this RAM".
Basically, if you don't build this in as a language feature, you have a whole lot of pressures constantly pushing you in the other direction, because why wouldn't you want to avoid the cost of zeroing memory if you can? All kinds of reasons to try to avoid that.
In a nutshell, the sovereign debt crisis. If you don't realize there's a sovereign debt crisis (ongoing across years), or more accurately a wide variety of sovereign debt crises, or more accurately still a wide variety of debt crises of both sovereign and private entities, well, your governments and some of the more government-adjacent private entities have bent a lot of resources toward making sure that's the case, and toward convincing you that it's just peachy when they borrow money, if not outright a boon, without regard to how much they borrow or how much they've already borrowed. They may have convinced you that this is true, but they know better.
Whatever happens and however it resolves, there aren't a lot of options where they retain as much power as they have now for very long. (Even if the top people maintain control they're going to be cutting loose a lot of lower level elites because they'll have to because they won't be able to maintain their upkeep.) The wheel turns and we're in that phase where they're still in power, but have begun to feel their decline. Human psychology fears and feels loss much more keenly than gain and they both fear and feel a lot of loss of power underneath the veneer they maintain.
My theory for your downvotes - even though you are directly over the target - is that folks in Europe would rather blame Russia than address the very real dysfunction in their own societies. In fact, I tend to think that the degree to which an individual blames Russia is directly related to their failure to take responsibility for the crimes and misdeeds of their own state.
I mean, look at Brexit. Almost every single Briton was told that it was a terrible deal for Britons, that it would raise prices and decrease the availability of goods and services in exchange for a smidge more autonomy in the global economy.
But then somebody said "them damn foreigners" and they went for it head first.
Humans don't understand their thought process either.
In general, neural nets do not have insight into what they are doing, because they can't. Can you tell me what neurons fired in the process of reading this text? No. You don't have access to that information. We can recursively model our own network and say something about which regions of the brain are probably involved due to other knowledge, but that's all a higher-level model. We have no access to our own inner workings, because that turns into an infinite regress problem of understanding our understanding of our understanding of ourselves that can't be solved.
The terminology of this next statement is a bit sloppy since this isn't a mathematics or computer science dissertation but rather a comment on HN, but: a finite system cannot understand itself. You can put some decent mathematical meat on those bones if you try, and there may be some degenerate cases where you can construct a system that understands itself for some definition of "understand", but in the absence of such deliberation, and when building systems for normal tasks, you can count on the system not being able to understand itself fully by any reasonably normal definition of "understand".
I've tried to find the link for this before, but I know it was on HN: someone asked an LLM to do some simple arithmetic, like adding some numbers, and asked the LLM to explain how it was doing it. They also dug into the neural net activations themselves and traced which neurons were doing what. While the LLM explanation was a perfectly correct explanation of how to do elementary school arithmetic, what the neural net actually did was something else entirely, based around how neurons actually work; basically it just "felt" its way to the correct answer, having been trained on so many instances already. In much the same way as any human with modest experience in adding two digit numbers doesn't necessarily sit there and do the full elementary school addition algorithm but jumps to the correct answer in fewer steps by virtue of just having a very trained neural net.
In the spirit of science ultimately being really about "these preconditions have this outcome" rather than necessarily about "why", if having a model narrate to itself about how to do a task or "confess" improves performance, then performance is improved and that is simply a brute fact, but that doesn't mean the naive human understanding about why such a thing might be is correct.
> In much the same way as any human with modest experience in adding two digit numbers doesn't necessarily sit there and do the full elementary school addition algorithm but jumps to the correct answer in fewer steps by virtue of just having a very trained neural net.
Right, which means it's strictly worse than humans at reporting how it solves these sorts of problems. Humans can tell you whether they did the elementary school addition algorithm or not. It seems like Claude actually doesn't know, in the same way humans don't really know how they balance on two legs; it's just too baked into the structure of their cognition to introspect. But stuff like "adding two-digit numbers" is usually straightforwardly introspectable for humans, even if it's just "oh, I just vibed it" vs. "I mentally added the digits and carried the one": humans can mostly report which it was.
Makes me wonder if one could train a "neural net surgeon" model which can trace activations in another live model and manipulate it according to plain language instructions.
I remember back in the late 90s that, if you ignored the matter of hardware driver quality (and that is a big "if", no question) that open source software tended to be higher quality in general than a lot of commercial software. Not because of any moral characteristic per se, but just the "many eyes make bugs shallow" sort of thing. Since it was mostly only programmers using open source anyhow, if someone hit an annoyance, statistically speaking, there was a good chance that someone who could fix the problem had hit the same annoyance.
Then maybe in the 2010s commercial software at least caught up.
But it seems to be swinging back around to, if I want my software to effing work I want to be seeking out open source again. Statistically speaking, fewer of the users who may encounter problems can fix any problems they find, as the systems have gotten much larger, but it is still possible, and on the compensating side, no one on the emacs team is figuring out how to stuff AI where it doesn't belong [1] or how to monetize it via ads or any of the other exciting ways to arbitrage long-term software quality against short-term money.
It's an opinion, it is clearly highly path-dependent, and I won't deny this is just my impression... but it is something I've been noticing again lately. Especially as Windows seems to be heading down the catastrophe curve and this time I'm not sure they can stop it.
[1]: I'm not anti-AI at this point... but there are places where it belongs, and there are places it just doesn't, and stuffing it where it does not belong is not a win.
"there were certain historical characters removed from Sora 2 because people kept making racist videos that are hard to censor, and it became increasingly unhinged"
Also Google "Elsagate" to see what sorts of things people would like to do with Disney characters. Or a YouTube search for Elsagate.
The other thing I'd point out is that people kind of seem to forget this, but it isn't a requirement that AI video be generated, then shoveled straight out without modification. Elsagate shows the level of effort that people are willing to put into this (a strange combination of laziness, but extreme effort poured into enabling that laziness). You can use the blessed Disney video generator to generate something, then feed it into another less controlled AI system to modify it into something Disney wouldn't want. Or a video of a Disney character doing something innocent can be easily turned into something else; it's not hard to ask the AI systems to put something "against a green screen", or with a bit more sophistication, something that can be motion tracked with some success and extracted.
"A front camera shot of Cinderella crouching down, repeatedly putting a cucumber in and out of her mouth. She is against a green screen." - wherever that video is going, Disney isn't going to like it. And that's just a particularly obvious example, not the totality of all the possibilities.
Just putting controls on the AI video output itself isn't going to be enough for Disney.
> Also Google "Elsagate" to see what sorts of things people would like to do with Disney characters. Or a YouTube search for Elsagate.... Elsagate shows the level of effort that people are willing to put into this (a strange combination of laziness, but extreme effort poured into enabling that laziness).
I still wonder what motivates the people behind that sort of thing. It'd be easy to understand if it were just porn, but what's been described to me is just... bizarre.
I always figured it was an engagement optimization thing—there were people mass producing content using popular characters and just throwing tons of stuff at the wall, and the ones that veered unsettling/bizarre wound up getting lots of engagement so they kept doubling down on it. That kind of feedback loop is certainly responsible for many other curious traits of online content that is circulated in algorithmically-curated feeds.
The tighter the loop between content creation (e.g. when you can generate unlimited content essentially for free) and the ability to measure its success (engagement), the more social media becomes a sort of genetic algorithm for optimizing content to be the most addictive possible at the expense of any other attribute.
There won't be one single reason. For some it is a dark sense of humour perhaps twisted a little too far off track, that perhaps they should keep in their own head or at least just between very close friends. For some it is simply money without caring that it might upset people: get enough engagement and ad impressions and it is worthwhile if you can ignore the moral aspect. Money might not be the objective at all, there are people who just want the attention, or the appearance of attention, and fake internet points (youtube views and such) sate their need at least temporarily. For some it is simply deliberate griefing, for all the reasons that is a thing generally. Or some mix of the above. None of it healthy IMO, but explainable.
In a few cases it is a dark in-joke between a small set of people that just happened to have used a public host for distribution, that unexpectedly went more viral.
There are people in this world who will do anything for money. They will destroy your children mentally if it makes them a single dollar, they will traumatize them and cause lasting damage. We have created a world in which these people have free access to our children.
Generative AI and getting everyone on the planet online is going to contribute massively to this. You’re already seeing a massive rise in sextortion scams, pig butchering scams, scams of all kinds.
Whatever the reason is (maybe online doesn’t feel “real” to people or something), a person with an internet connection where $100 is a great monthly income will do anything to make that money, even if that means endangering someone else’s children or mentally scarring them. Combined with poor enforcement in places like Nigeria and India, we’re already in the midst of a scam epidemic.
I'd like to give those people the benefit of the doubt, and state that I believe they don't start out intentionally trying to damage children. They're simply trying to maximise their own earnings, and don't give a shit about what collateral damage occurs in response to their actions, as long as earnings go up.
They'll optimise for whatever causes numbers to increase. Children just happen to sometimes be what makes that happen.
They aren't trying to pervert the children. This isn't some cabal.
It's just money.
It's just people trying to get children's eyeballs to collect minuscule ad revenue.
It's the same as the people who abuse their kids for a YouTube channel, or the Russian companies that put out 10 """DIY""" shorts a day which are just fake.
YouTube rewards constantly churning out content, so that's what gets done.
Yes, Spider-Man and Elsa on YouTube is a prime example. It's just slop for kids; they're not even in the same universe. But kids like Spider-Man, and they also like Elsa, so... here we are.
It's probably a lot of kids just being silly. Sure there are plenty of adult trolls, but whenever I see people bewildered at unruly online behavior, I think it's because they cannot see the age/maturity level of the troll. I can absolutely see why people would find this funny.
Well, bizarre is the point. Surely you understand that this is content designed to gather kids' views, because there is a ton of kids on the internet and they can be monetized. I don't know what kind of research they do on their audience, or if they purposefully want to traumatize kids as much as possible, but I suppose all this shit captures kids' attention more than just Disney characters fucking.
In the end, almost everything has a soap opera in it somewhere. People have a hard time processing stories that don't have a soap opera in them somewhere. For some people it's just impossible. There's really only a minority of people who are interested in stories that have no personal relationship stories in them at all.
That's not to say that the parts that aren't soap opera aren't meaningfully different. I disagree with the reductionistic claim that "everything is just a soap opera in the end", and leave it to the reader to determine whether or not the original link is making that mistake.
I would say it's more like salt in cooking for the vast majority of people; they expect a certain proper amount and trying to engage a normal human's taste without it is an uphill battle at best. As a result, across a wide variety of genres and styles, you'll find soap operas.
(I use soap opera as a bit of shorthand for things focusing on human relationships a lot. Soap operas tend to focus on the romantic end more than average, so the embedding is not quite perfect. But I use "soap opera" as the shorthand here because they are one of the more pure embodiments of the idea, because they are basically nothing but human relationships churning and spinning, with generally not much more going on. Yeah, a couple of them have a more exotic framing device, but all that does is move them slightly off the center of the genre, not really change them much.)
Here's what's funny. You know what they used to call a book that foregrounded the soap opera elements you're talking about? A novel. That's why Tolstoy called Anna Karenina his first novel. Now, if you go to Wikipedia, War and Peace is also categorized as a novel. What else could you call it? But it's funny to imagine a time when novel was a genre.
I think you mean romance? A romance used to be a Roman-style long narrative fictional work that described extraordinary deeds, soap opera plots. Novels were more concerned with realistic narratives describing the nitty gritty of everyday life.
It is kind of like how modern art doesn't mean modern today; it means that time period where people called art "modern". Novel meant new, as in "novel science results". It was used to differentiate prose (the new style at the time) from epic poetry back in the 1600s, and it stuck. How that translates to Russian, IDK.
There is no "novel" (as in "new" thing) as a genre in Russian lit. In Russian, the things called "novels" in English are called by a Russian word that is a translation of "romance", and tbh "romance" makes tons more sense than "novel".
But "novella" (a different genre) is a thing in Russian.
I don't speak Russian, but whatever the Russian word is for "book." Or maybe others called it a novel but Tolstoy rejected the label. I'm not sure.
Either way, the word "novel" wasn't necessarily equivalent to how it is used today: any book length work of narrative fiction.
Though watch out, this is a rabbit hole. Just look up novel on wikipedia. You'll see a big orange message at the top which is the first sign there is a problem. And then the article is excessively long. A lot of ink has been spilled trying to define what a "novel" is.
I think a lot of character-centered conflicts boil down to the same set of problems, regardless of the setting. For instance, you often see "keep the status quo and die a slow death" vs "expensive, risky gamble". Sometimes the setting is a small Midwestern town, sometimes it's a spaceship on the way to Beta Virginis. Sometimes the solution is actually unique to the setting but often it's just "find a compromise, prevent the extremists from blowing up the deal". Replace the mayor with a captain and TNT with nuclear bombs and you basically have the same story.
> There's really only a minority of people who are interested in stories that have no personal relationship stories in them at all.
All that to say I wish there were more stories that are more focused on the plot / implications of the setting. What-ifs that aren't derailed by character drama. "What if telekinesis was real? How can we exploit it for energy / propulsion / everyday gadgets?" Like basically thought-experiments in narrative form, or a textbook with characters.
Or at least I wish I knew how to search for these types of stories. Searching for "hard sci-fi" comes close but it requires the science is plausible (no FTL, minimal new physics, etc). I don't think it's reasonable to expect authors to simulate an entire universe / provide plausibility proofs for every bit of engineering / physics. As long as the mechanics of whatever fantasy physics are consistent and developments are plausible, that's good enough for me. I don't even need a satisfying conclusion, if the protagonist rebels fail because the ultra-wealthy corpos are just better equipped, so be it, at least the ride was fun.
The Expanse (both a recently completed book series and a cancelled yet mostly complete TV adaptation) is pretty good at this; it sets up a world with complex political dynamics, and lets things mostly evolve as a result of those dynamics, with the main characters largely just along for the ride. It takes the science parts super seriously too: ships have to worry about acceleration and debris fields during battle, communications have to account for the speed of light, that sort of thing. There's only the occasional injection of new technologies to push things forward about once per book/season.
The Expanse has been called, "The closest thing that we're going to get to a live-action Gundam series," in the past. And it's certainly better in a lot of ways. You do have to thank Gundam (and Alien) for dragging us out of the John Carter Valley (which OG Star Trek certainly fell into quite often).
Honestly, the SCP wiki might scratch this itch for you—it's sci-fi but with a lot of fantasy elements, and I'd put it on the "hard" side of the spectrum. Also, I think Greg Egan's books are pretty out there (the two I've read are Diaspora and Permutation City, whose settings aren't particularly "plausible" IMHO), and they really make you think.
Agreed. Its framing device of reports / scattered documents / etc. also removes a lot of the characterization, or removes characters completely, and focuses just on the "what if?" of the story.
How many very smart people with excellent writing skills and grasp of human relations would spend their time writing fiction?
There’s probably not even 50,000 of those on Earth per annual cohort coming of age. And of the remainder practically no one will turn down the 7 figure cushy hedge fund job or equivalent career path.
> I use soap opera as a bit of shorthand for things focusing on human relationships a lot.
I don't know if that's really fair. I don't think that's what most people think the term soap opera denotes, and if you broaden it to mean any work that has any sort of relational elements, it's almost a tautology that all fiction will meet the standard.
More to the point, I think it's an unfair response to the article, as the author is not claiming that the similarity between these two works is merely that they have relationships in them.
Which can rapidly exceed in size and count the "non-user-facing" objects.
This objection really needs to die. GC does not instantly mean you can't program games. At most it locks you out of the tip-of-the-tippy-top AAAA games, but if you were trying for that you weren't going to use "someone's GitHub project" anyhow. And most of them probably have meaningful GC in them anyhow.
You know I would be happy to offer this service to investors for a mere tens of millions of dollars. I'll send you photos of our weekly money bonfire, built with your money, and when you're tired of pictures of your money on fire, I'll simply... stop.
Heck, in accordance with the several zeitgeists of our age, I'll even do you the solid of fraudulently generating the money-on-fire pictures with AI, so when you get tired of seeing your money on fire I'll even hand, say, 25% of it back to you, as the result of my tireless efforts to bring value to my shareholders. That's a better return than you'll get from most of these investments!
I haven't played the first one but I played Grandia II on the Dreamcast and I think it's still my favorite battle system in a JRPG to date. Not only does it have the obvious details you can see on a YouTube playthrough, but higher-end play with it also requires managing positioning, which is easy to miss as an option at all in the menus, or to think it has no purpose. A low-level challenge run would probably be a lot of fun.
Unfortunately in my casual playthrough I accidentally broke the combat system and by the end of the game nothing was a challenge; as with many other games there are "resistances" and "vulnerabilities" but also as with most non-Shin Megami Tensei games of the era, they aren't really strong enough or frequent enough to matter. I just pumped all my upgrades into Fire upgrades until eventually my routine end-game battle was one character to wipe all the enemies in one move, move to next battle. You could easily pump an elemental bonus enough to overwhelm the resistances the enemies had. More resistances and immunities distributed around would have helped prevent a degenerate strategy.
And of all the battle systems to have a degenerate strategy for, this one hurts the most because it is otherwise so good.
(Sadly, Grandia III was never completed. It was released... but it was never completed. The game as shipped has visible gaping holes in it, which is sad because what is there was quite good.)
I'm probably a fool for posting this, but this thread and these responses warm my heart. It's good to see so many people were affected by this era of gaming from Sega the same way that I was.
The battle system in the later SMT games, especially after 3, is one of the best turn-based systems I've ever played. It was a refinement of FFX's in so many ways. It encouraged you to "break" it, as it were.
Then there's games like Grandia 2... And Shenmue. God, I love Shenmue these days. The first one is brilliant in so many ways I didn't recognize when I played it when it came out. Absolutely wonderful game. The track that plays when Ryo and Guizhang fight the Mad Angels at the dock, "Earth and Sea", takes me back. For me, it's the most perfect Christmas game that Sega ever made
It's very common with RPGs of that era (and all eras really) that the developers don't test every edge case and end up leaving ultra-powered (and just as many or more close to useless) builds in the game. Every feature added increases the possibility of breakage by some quadratic factor. Once your battle system hits a certain level of complexity it's close to inevitable.
Even carefully developed modern games like Baldur's Gate 3 have game breaking build options.
To be fair to e.g. Baldur's Gate, finding game-breaking builds appeals to many people in the core audience of that sort of game, along with classic TTRPG players. Making those builds harder to achieve by accident is a good thing, but doing away with them entirely would probably be detrimental for the intended audience. True brilliance is also having systems that make that sort of build still fun to play; e.g. BG3 has some pretty amusing hidden interactions if you steamroll events you're not supposed to be able to win.
Reminds me of a game-breaking strategy in the (interesting, flawed) hybrid RTS/RPG War in Middle Earth (1988).
The RTS part involved moving armies and heroes around to fight Sauron / Saruman’s armies and defend your citadels. There was a game loss condition if you lost something like three citadels in battle.
But if you abandoned your citadels, their subsequent occupation didn’t trigger the loss. So you could simply aggregate all your forces into one giant army and take Barad Dur and Mt Doom by force.
I'd put Noita in that category. I usually describe it as "broken both ways", because (as a roguelike) you have very little healing and the enemies are punishingly hard. Not only that, but it's a full falling-sand-plus-physics simulation, so certain elements will randomly combine and kill you in the most unexpected and spectacular ways. On the flip side, the wand system is nearly Turing-complete and gets abused in the most crazy ways, to the point that you can do millions of damage per tick. One of the most chaotic and fun games I've played!
That makes games that aren't fun unless you're a wizard with the systems. They have their place, but I'm not a fan.
Personally, I think devs should embrace some stuff being broken. It's a single player game, it doesn't need balance. One of my favorite RPGs is FF8 precisely because you can trivialize the game if you engage with the character building systems. It feels awesome to stomp things with your broken party.
Perhaps "requirement" was a poor phrasing of the idea. "Feature?"
> Personally, I think devs should embrace some stuff being broken. It's a single player game, it doesn't need balance.
Exactly! Bravely Default & Octopath Traveler's job systems are built around the idea that the system should be breakable. BD2 even has a push-your-luck system that adds a multiplier for one-shotting multiple encounters in a row: you can get like a 50 percent rewards boost from random encounters if your team is able to "go infinite" against the current enemy mobs. And there are skills to remove damage caps, so you know they thought about it.
OT did kinda tone that down some to add the timing and break mechanics; if you land enough hits on enemy weaknesses, they lose their current turn, and go into a stun status next turn (but go first the turn after that, so no stun locking!). But you still get end game builds that max out boost points every turn; you just can't usually one shot bosses.
I think Disgaea fits into that class. The solution to every problem is "more levels", but the game is basically built around that. I think it's also fun to craft your own challenges out of the raw materials given to you... to get as many levels as quickly as possible, to win at minimum levels, to win with only X, to ignore Y and Z, etc.
I think besides the mechanics, the other thing that makes the Grandia/Grandia 2 battle system so fun is how snappy all the animations and interactions are. You never really feel like you're waiting for things to happen even though it is semi turn-based.
My first experience with Grandia was Grandia II as well, but on PS2. I ended up getting the PC version too, which at the time was fairly novel: JRPGs on PC. Grandia II is still one of my nostalgic favorites. As you mentioned, the typical turn-based combat with positioning was a fun addition that could change your combat experience each time. It was like an evolution of the Chrono Trigger combat system.
I find myself designing a TRPG (Final Fantasy Tactics, Disgaea) with Grandia time & space mechanics in my head; take position even a bit more seriously than Grandia did, but build on a TRPG balance and skill structure. Basically you end up with a much more dynamic take on the TRPG, which has always been a bit of a static experience. The canceling mechanics coming from Grandia would be a banger in that sort of more dynamic TRPG.
There is, of course, a lot of games all around that space but I don't know of anything that quite matches what I'm laying out here.
(Although the cancellation mechanics would need some careful attention. It allows for a whole bunch of weak characters to keep a single strong character down by always cancelling what they're doing. I suppose just turning it into a skill check itself instead of being 100% as it is in Grandia would do the trick, though.)
Grandia II is the only JRPG whose battle system I've ever really, truly enjoyed. I feel like with a lot of JRPGs I'm just sitting around doing math.
Grandia 2 was largely just timing things such that, if you did it right, you would bonk the enemies' turns back repeatedly and they'd never get to attack. Way more fun.
The very first area at the start of the game actually encouraged this if you experimented even a little: Combo is the default attack, but it doesn't defeat those early enemies, so they'll hit you back. The battles start with just the right timing, though, so if you use Critical first it'll knock them back, and your next Combo will hit and defeat them without you taking any damage.
Grandia II's battle system also breaks when you get Teo. An area-effect Critical? It basically allows you to completely control every battle, even if you haven't invested in a board-wiping uber character.
Grandia II's battle system was really great, but the story and voice acting were so rough, haha. I ended up not caring about any of the characters and skipping all that I could to get to the combat.