
I don't know, I get the feeling ChatGPT has also read all the quack books where the condition depends on the alignment of the stars when you were born, or on how the chicken entrails land when the shaman "studies" them. Those books are also written confidently, without giving any sign of being completely fabricated.

In the end, why do people believe what they believe? The answer is that it connects with what they already believe. That's it. If you've had a diet of what we call science, you'll have a foothold in a whole bunch of arenas where you can feel your way forward, going from one little truth to another. If you're a blank slate with a bit of quackery seeded onto it, you end up believing that the stars predict your life and that you can communicate with dead people through a Ouija board.

ChatGPT doesn't have an "already believe". It just has a "humans on the panel give me reward" mechanism, and all it's doing is reflecting what it got rewarded for. Sometimes that's the scientific truth, sometimes it's crap. It's confident all the time, because that's what gets rewarded.
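To make that concrete, here's a minimal sketch of the kind of preference-based reward-model update behind RLHF (a Bradley-Terry style loss). The names (reward_model, chosen, rejected) and the toy feature vectors are purely illustrative, not any lab's actual code; the point is just that the training signal encodes "raters preferred A over B", not "A is true".

    import torch
    import torch.nn.functional as F

    # Toy reward model: scores an answer's feature vector. Illustrative only.
    reward_model = torch.nn.Linear(8, 1)
    optimizer = torch.optim.SGD(reward_model.parameters(), lr=0.1)

    chosen = torch.randn(4, 8)    # features of answers the human raters preferred
    rejected = torch.randn(4, 8)  # features of the answers they passed over

    # Bradley-Terry preference loss: push the preferred answer's reward above
    # the rejected one's. Nothing here checks factual truth, so a
    # confident-sounding wrong answer that raters liked raises the reward
    # just the same as a correct one.
    loss = -F.logsigmoid(reward_model(chosen) - reward_model(rejected)).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()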


