
What is a real banana? How much processing is allowed for it to still be real? Considering the selective breeding of bananas, is the banana even still real?

Chemical is just a bad word choice. Artificial, or ultra-processed, gets closer to the issue. They're still vague, with a lot of grey area. If you cook at home, you're also highly processing your food. The fruit in winter is likely also artificial in some sense: grown against the will of god/nature with pesticides, in a tent, in a climate that doesn't naturally feature it, devoid of flavour because it was artificially bred for yield, color, and size, etc.


>What is a real banana? How much processing is allowed for it to still be real? Considering the selective breeding of bananas, is the banana even still real?

This is an arbitrary, subjective qualifier that falls somewhere between the flavoring chemical "isoamyl acetate" and organic wild forest bananas. I would subjectively say that any grown banana is REAL, while isoamyl acetate made by rectification of amyl acetate is not REAL banana.


Is Baking Powder considered a “chemical”? How about sodium bicarbonate and monocalcium phosphate?

Maybe people are simply reacting to chemical-sounding words.


Add some artificial bacon flavouring and starch, and you will get a "beef flavoured product" which most people would call "chemical".

I think it's actually a great example of very, very important non-pedantry. The entire crux of their argument/issue depends on their definition of "chemicals". I would even go so far as to say it's just the nature fallacy in disguise.

With the nature fallacy, the definition (or rather the lack of one) of what is natural is the entire crux of it. In both cases (natural and "non-chemical"), it's the very undefined-ness that reveals the problem: you cannot create a sensible definition.

For nature, what's the definition that puts "rape" and "artificial insulin" on the morally correct side?

For chemical, what's the definition that puts "fortification of flour with iodine, fluoride, or whatever else" and "arsenic" on the right sides?


What the poster meant wasn't that the LLM itself is an entity with a preference, but simply that, because of the training, LLMs are better at doing stuff in a standard Linux environment. If you have to teach it a new environment, it either has to waste time and context looking things up every time, or you need a company to do RL to teach it the new stuff (unlikely).

It would probably help if the sandbox presented a Linux-y looking API and translated that to actual browser commands.
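
Roughly something like this (a TypeScript sketch; the BrowserSandbox interface and its methods are made-up stand-ins for whatever the real sandbox actually exposes, not any existing API):

    // Hypothetical sandbox surface -- placeholder for whatever the
    // browser-based environment really provides.
    interface BrowserSandbox {
      listFiles(dir: string): Promise<string[]>;
      readFile(path: string): Promise<string>;
      writeFile(path: string, contents: string): Promise<void>;
    }

    // Translate a small set of Linux-y commands the model already "knows"
    // into calls against the browser sandbox.
    async function runShellLikeCommand(
      sandbox: BrowserSandbox,
      command: string
    ): Promise<string> {
      const [cmd, ...args] = command.trim().split(/\s+/);
      switch (cmd) {
        case "ls":
          return (await sandbox.listFiles(args[0] ?? ".")).join("\n");
        case "cat":
          return sandbox.readFile(args[0]);
        case "echo":
          return args.join(" ");
        default:
          // Unknown command: answer the way a shell would, so the model
          // gets familiar-looking feedback instead of a stack trace.
          return `${cmd}: command not found`;
      }
    }

The point is just that the model keeps seeing the interface it was trained on, while the translation layer absorbs the weirdness of the actual environment.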


You’re right that a lot of livestock feed is crop residues/byproducts humans don’t eat—but that doesn’t make beef “necessary” or erase the land/opportunity-cost problem. Globally, ~36% of crop calories go to animal feed and only ~12% of those feed calories come back as animal-product calories (Cassidy et al.). Livestock still consume ~1/3 of global cereal production (Mottet et al.). And in full-system LCAs that include grazing + feed land, meat/dairy provide ~18% of calories and ~37% of protein but use ~83% of farmland; cutting them can reduce farmland >75% while still feeding the world (Poore & Nemecek / Oxford). Plus, even if pasture isn’t croppable, it can be restored—land used for animal foods has a big carbon opportunity cost (Hayek et al.). Nutritionally, major dietetic bodies say well-planned vegetarian/vegan diets can be nutritionally adequate, with attention to nutrients like B12.

Just in case you need some recommendations:

Party games: Scale well with more people, easy to explain

- Werewolf

- Werewords

- Codenames (favorite)

Beginner games: Accept a decent number of players, somewhat easy to explain

- Camel Up

- Flip 7

- Dungeon Fighter

- Ticket to Ride

Games that have nothing to do with your problem, but I just wanna mention:

- Everdell: Cute critters prepare for winter

- Root: Cute critters prepare for war

- Azul: Place fancy tiles that look and feel delicious

- Bohnanza: The best part of Catan without the bad parts


I'd like to add a very simple one: Uno.

With the rule variant that you can play out of turn if you put down a card identical to the one on top of the stack, it disrupts the otherwise pretty linear play, and it easily scales up to 10-ish people while still being fun.


But your program HAS to have some invariants. If those don't hold, simply reject the data!

What the hell is really the alternative here? Do you just pretend your process can accept any kind of data, and just never do anything with it??

If you need an integer and you get a string, you just don't work. This has nothing to do with types. There's no solution here; it's just "no thank you", error, panic, 500.


You handle that in the validation layer, like millions of people have done with dynamic languages in the past.
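
For what it's worth, that boundary doesn't need to be fancy. A rough TypeScript sketch, with hand-rolled checks and no particular validation library assumed: untrusted input comes in as unknown, and anything that breaks the invariants is rejected before the rest of the program ever sees it.

    interface Order {
      id: number;
      quantity: number;
    }

    // Validation layer: turn untrusted input into a value that satisfies
    // the program's invariants, or reject it outright.
    function parseOrder(input: unknown): Order {
      if (typeof input !== "object" || input === null) {
        throw new Error("order must be an object");
      }
      const record = input as Record<string, unknown>;
      if (typeof record.id !== "number" || !Number.isInteger(record.id)) {
        // An integer is required; a string here is simply not accepted.
        throw new Error("id must be an integer");
      }
      if (typeof record.quantity !== "number" || record.quantity <= 0) {
        throw new Error("quantity must be a positive number");
      }
      return { id: record.id, quantity: record.quantity };
    }

    // Everything past this point can assume the invariants hold; the
    // caller maps the thrown error to a 400/500 response as appropriate.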


I don't think any serious dev has claimed 10x as a general statement. Obviously, no true Scotsman and all that, so even my statement about the makers of anecdotal statements is anecdotal.

Even as a slight fan, I'd never claim more than 10-20% altogether. I could maybe see 5x for some specific typing-heavy usages, like adding basic CRUD for a basic entity to an already existing Spring app.


Productivity gains in programming have always been incredibly hard to prove, esp. on an individual level. We've had these discussions a million times long before AI. Every time a manager tries to reward some kind of metric for "good" code, it turns out that it doesn't work that way. Every time Rust is mentioned, every C fan finds a million reasons why the improvement doesn't actually have anything to do with using Rust.

AI/LLM discussions are the exact same. How would a person ever measure their own performance? The moment you implement the same feature twice, you're already reusing learnings from the first run.

So, the only thing left is anecdotal evidence. It makes sense that on both sides people might be a little peeved or incredulous about the other side's claims. It doesn't help that both sides (though mostly AI fans) have very rabid supporters that will just make up shit (like AGI, or the water usage).

Imho, the biggest part missing from these anecdotes is exactly what you're using, what you're doing, and what baseline you're comparing it to. For example, using Claude Code in a typical, modern, decently well-architected Spring app to add a bunch of straightforward CRUD operations for a new entity works absolutely flawlessly, compared to a junior or even mid-level dev.

Copy pasting code into an online chat for a novel problem, in an untyped, rare language, with only basic instructions and no way for the chat to run it, will basically never work.


Even better. Job security for current seniors.


This makes no sense. Not even from a cynical and selfish view point.

I consider my job to be actually useful. That I produce useful stuff to society at large.

I definitely hope that I'm replaced with someone/thing better; whatever it is. That's progress.

I surely don't hope for a future where I retire and medics have access to worse tech than they have now.


But what if it isn't just a basic website? Most sites I've worked on required things like content management, auditing, a bunch of database stuff, SAML single sign-on, etc.

Most languages end up being better at some parts of the stack, like Java for overcomplicated enterprise BS backends. It seems bad to "fight" that trend.


A full stack framework like Next.js is, at the end of the day, still a server running on Node.js, so there is nothing that prevents you from doing anything that you could be doing with a regular express.js server. Is there anything that prevents you from implementing content management, auditing stuff, or database stuff in your Next.js project? Nothing comes to my mind.
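
For example, a route handler that writes to a database and records an audit entry looks like any other server-side code. A sketch, where db and auditLog are made-up placeholders for whatever data layer you would otherwise call from an express handler:

    // app/api/articles/route.ts -- a Next.js App Router route handler.
    import { NextResponse } from "next/server";
    import { db, auditLog } from "@/lib/server"; // hypothetical module

    export async function POST(request: Request) {
      const body = await request.json();

      // Ordinary server-side work: persist content, record an audit entry.
      const article = await db.articles.create(body);
      await auditLog.record({ action: "article.created", id: article.id });

      return NextResponse.json(article, { status: 201 });
    }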

