Hacker News: Edman274's comments

Is it completely insane and incoherent to imagine a situation where ice cream has two equilibrium prices, one higher and one lower, and the market just settles on the higher one? Like, imagine a case where Jeni's would start losing money on every pint if they reduced the price by a dollar, but they'd make the same amount of money overall if they reduced it by 3. But they're in a local optimum, the "price reduced by 3" is identical for revenue purposes, and they choose their current local optimum. Then ice cream could still be priced too high and be "appropriately priced". Is this impossible?

> Is it completely insane and incoherent to imagine a situation where ice cream has two equilibrium prices, one higher and one lower, and the market just settles on the higher one?

Is it completely insane? No. But draw a set of supply and demand curves that supports it, and then try to come up with a narrative that explains them. In the static, same-time, all-other-things-being-equal case, that is hard to do.
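A toy numerical check of why this is hard with ordinary curves (my numbers, not the commenter's; assumes a simple linear demand curve and constant unit cost):

```python
# Toy model: linear demand Q = 100 - 10*P, constant unit cost c = 2.
def profit(p, c=2):
    q = max(0, 100 - 10 * p)   # units sold at price p
    return (p - c) * q

# Two different prices can yield identical profit...
assert profit(5) == profit(7) == 150
# ...but with a smooth, concave profit curve, every price between them
# does strictly better, so the higher price isn't a stable second
# optimum -- the firm would just move toward the middle.
assert profit(6) > profit(7)
```

To get two genuinely separate local optima you'd need a kinked or non-monotone demand curve, which is exactly the "draw the curves and explain them" challenge above.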


At what point does a demand for evidence come back around to making the requester seem less like a prudent, rational truth seeker and more like someone with a naive lack of personal, lived experience? Like, not a single soul will say "got evidence for that assertion?" when it's a news story about EA or Oracle or Adobe acquiring a company and people are predicting that the acquired product will be destroyed; isolated demands for rigor would be laughed out of the comment section. Why is that? When does it flip over to "oh, so I guess it's okay to just nakedly assert that food companies will seek profit by reformulating their recipes, even though there isn't a shred of evidence to support that, and therefore we're now allowed to predict anything!"?

The complement of the claim is essentially "food manufacturers will never again attempt to modify their recipes to make them more hyperpalatable, now that GLP-1 exists." Does that need evidence? It's the null hypothesis, but it certainly sounds a lot more unrealistic than the opposite.


Destroying a product is a well understood process, and we've witnessed many big companies do it. That's evidence!

Designing a food to be more appealing is also a relatively well understood process that is already carried out, but Ozempic seems to blunt the effectiveness of it.

Food companies will surely try to make food that appeals to Ozempic users. But it is a massive assumption that they will succeed, given that they're already doing as much as possible to make food appealing to people.

So there is significant uncertainty that the food companies can do what the parent suggested they would do.


It needs evidence that there's a general phenomenon of "hyperpalatable" food companies can search for, not just a latent property of how certain macronutrients balance in food. Otherwise, it's like proposing that public transit is pointless because car companies will somehow defeat it by making up more reasons to drive.

But that's what happened. I mean, it doesn't mean that proposing public transit is pointless, but if someone in 1930 heard about a trolley track being run in town and another person said "it's only a matter of time before the car companies try to sabotage mass transit", they would've been right. That's what actually happened.

Okay, people say this. Could you please, and this is not a rhetorical device, it's a sincere question: how do you keep the browser updated without updating the operating system? Or if you are updating the OS, doesn't that change the user interface? And if the user interface is changing, doesn't that confuse your grandmother? I installed Ubuntu for my mom, and after four years Firefox was out of date, and the banking website she used had checks where logging in was only possible if the user agent was recent enough. One can fake that, but I didn't want to. But updating Firefox meant updating Ubuntu, which means that every single icon and every single menu position changes, and I didn't want to have to teach her where everything was again. How do you avoid this?

I haven't dealt with this for her in a few years, but basically:

Pin all their apps in favorites and they will persist through updates. Updates don't overwrite desktop shortcuts either (although, as on other OSes, a couple might be added that need to be removed). This might be more difficult in GNOME; I wouldn't know, since I am firmly in the KDE camp.

To stay as up to date as possible, use the Mozilla apt repo:

https://support.mozilla.org/en-US/kb/install-firefox-linux#w...
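The setup from that support page looks roughly like this (paths and key URL are from my memory of the docs, so check the linked page for the current versions; the pinning step is what keeps apt preferring Mozilla's build over the distro's):

```shell
# Create the keyring directory and fetch Mozilla's repo signing key
sudo install -d -m 0755 /etc/apt/keyrings
wget -q https://packages.mozilla.org/apt/repo-signing-key.gpg -O- \
  | sudo tee /etc/apt/keyrings/packages.mozilla.org.asc > /dev/null

# Add the Mozilla apt repository
echo "deb [signed-by=/etc/apt/keyrings/packages.mozilla.org.asc] https://packages.mozilla.org/apt mozilla main" \
  | sudo tee /etc/apt/sources.list.d/mozilla.list

# Pin the Mozilla repo so its Firefox wins over the distro package
printf 'Package: *\nPin: origin packages.mozilla.org\nPin-Priority: 1000\n' \
  | sudo tee /etc/apt/preferences.d/mozilla

sudo apt-get update && sudo apt-get install firefox
```

Since the repo targets the system's existing library versions, Firefox keeps updating without a full distro upgrade.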


In my experience, the Mozilla apt repo would still have dependencies on system libraries that can only be installed by updating the operating system to another LTS. Like, the Mozilla Firefox package depends on libssl, which depends on another package, and that other package can only be updated by updating the operating system, which typically drastically changes the look and feel of system menus and things that are not easily gleaned by looking at a screenshot of an empty desktop. Maybe this isn't true of KDE and the interface remains stable across update cycles. Thank you for the suggestion.

There actually is a stopping point, and the definition of ultra-processed food versus processed food is often drawn at the line where you can expect someone in their home kitchen to be able to do the processing. So, the question kind of becomes whether or not you would expect someone to be able to make cheese or wine at home. I think there you would find it natural to conclude that there's a difference between a Cheeto, which can only be created in a factory with a secret extrusion process, versus cottage cheese, which can be created inside of a cottage. And you would probably also note that there is a difference between American cheese, which requires a process that results in a NileRed upload, and cheddar cheese, which could still be done at home over the course of months, like how people make soap at home. You can tell that wine can be made at home because people make it in jails. I have found that a lot of people on Hacker News have a tendency to flatten distinctions into a binary, and then attack the binary as if distinctions don't matter. This is another such example.


There actually is no agreed-upon definition of "ultra-processed foods", and it's much murkier than you make it out to be. Not to mention that "can't be made at home" and "is bad for you" are entirely orthogonal qualities.


Would you consider all food in existence to be "processed", because ultimately all food is chopped up by your teeth or broken down by your saliva and stomach acid? If some descriptor applies to every single member of a set, why use the descriptor at all? It carries no semantic value.


Is it fair to recognize that there is a category difference between the processing that happens by default on every cell phone camera today, and the time and labor intensive processing performed by professionals in the time of film? What's happening today is like if you took your film to a developer and then the negatives came back with someone having airbrushed out the wrinkles and evened out skin tones. I think that photographers back in the day would have made a point of saying "hey, I didn't take my film to a lab where an artist goes in and changes stuff."


The Kent State massacre pole picture is a point of controversy in this area, but may be more relevant than ever.

https://petapixel.com/2012/08/29/the-kent-state-massacre-pho...


It’s fair to recognize. Personally I do not like the aesthetic decisions that Apple makes, so if I’m taking pictures on my phone I use camera apps that give me more control (Halide, Leica Lux). I also have reservations about cloning away power lines or using AI in-painting. But to your example, if you got your film scanned or printed, in all likelihood someone did go in and change some stuff. Color correction, touching up the contrast, etc., are routine at development labs. There is no tenable purist stance because there is no “traditional” amount of processing.

Some things are just so far outside the bounds of normal, and yet are still world-class photography. Just look at someone like Antoine d’Agata who shot an entire book using an iPhone accessory FLIR camera.


I would argue that there's a qualitative difference between processing that aims to get the image closer to how the human eye would have perceived the subject (the stuff described in TFA) vs processing that explicitly tries to make the image further from the in-person experience (removing power lines, people from the background, etc.).


Are there any languages in existence that lack a facility for counting numbers, to your knowledge?


"idiotsecant" is clueless but according to Daniel Everett's studies/research the Pirahã language lacks a proper numeral system - https://en.wikipedia.org/wiki/Pirah%C3%A3_language#Numerals_...


It seems very self-evident to me (given what we now see with legalized gambling) that the harms of broadly legalizing it and creating an industry around it far outweigh any harms associated with black markets, by a wide margin. Also, black markets for gambling still exist, so this kind of just feels additively worse. Just from a measure of utility, even if we went back to only having gambling performed with organized criminals breaking legs when people can't pay, that would still result in significantly fewer ruined lives and significantly better quality of life for the communities that are having wealth sucked out of them and into gambling syndicates. It's entirely unproductive destruction of wealth, robbing from the poor and giving to the rich. If gambling made anything better on net for society, gambling syndicates would never attempt to legalize it.


A few things. One, even if it's not strictly speaking true that cocaine use always leads to addiction every single time, we know better now than in Victorian-era England how often it does, and Doyle, not having been a cocaine user, may have missed some of the elements of how cocaine addiction develops and what it looks like. I hate to say that there is some moral duty to show a protagonist using cocaine as having a problem with its use that needs to be overcome, but I do think it'd also be strange to keep what was effectively this SMBC comic (https://smbc-comics.com/index.php?db=comics&id=191) as Holmes' use of coke.

Secondly, the stories that mention coke use are all written from the perspective of Holmes' best friend, who we'd expect to be biased towards writing about his friend in a positive light. I don't think this is accidental. Watson quotes him effectively saying "I just do coke because life is so mundane and boring, and not stimulating enough for me," which is nearly the exact same justification and thought process used by, like, every addict, and if not word-for-word, then at least very similar to Chris Moltisanti's justification of his own addiction to Tony Soprano.

It may not be an exact rendering of what was in the books, but it is an extremely natural modification to make; otherwise we'd have a flat Marty Stu character who talks in ways that seem very consistent with at least problematic use and yet who's not addicted. "Our own times" have dealt with at least 100 years of coke addiction and 50 years of crack, so maybe we're just not naive enough to believe that a guy who's saying "my friend just takes it when he's bored, but he's bored all the time because his mind is too sharp for this dull world" isn't a problematic user or addict.


The definition of a monopoly basically resolves to "those companies that don't get pressured to meaningfully compete on price or quality"; it's a tautology. If a firm has to compete, it doesn't remain a monopoly. What's the point you're making here?

