Hacker News

The challenge is that AI-generated content will only register as garbage to those who fall 1-2 standard deviations above the mean interest level for whatever content the AI is producing. For everybody else, it will typically be a 'good enough' story, recipe, video, etc. Several months ago, I wanted to cook chicken curry, so I googled and chose a top result. The recipe was fine, but it wasn't what I wanted. I then googled for a new recipe, this time vetting the ingredients more carefully. That recipe was an improvement, but still not what I wanted. Finally, I used NYT Cooking as my search engine, and the recipe was excellent. If I hadn't had a strong preference and known exactly what I wanted, the first recipe would have been perfectly suitable. The danger is that demand for 'advanced sources' erodes to the point that these publications/content creators are either forced to adopt AI and turn to garbage, or go out of business.


Strangely, recipes in particular are often untested, and faulty enough to fail when followed strictly as written.

That might make it a little harder to recognize when an AI is 'pulling your leg'.


Agreed. A person who cooks regularly will just adapt on the fly and might not even realize they're correcting an error in the recipe. A person who cooks infrequently will stick to the recipe, fail, and then blame their own inexperience. This is already the case with human-produced recipes, and, as you say, it'll make it harder to recognize you're dealing with AI-generated nonsense instead of ordinary human mistakes or occasional cluelessness.



