Hacker News

One thing ChatGPT (specifically, the GPT-4 version) keeps doing to me is confidently lying, and when I call it out, it apologizes and spits out another response. Sometimes it's the right answer, sometimes another wrong one (after a couple of tries it then says something like "well, I guess I don't have the right answer after all, but here is a general description of the problem").

Part of me laughs out loud (literally, out loud for once) when it does that. But the other part of me is irritated by the overconfidence. It's a potentially useful tool, but keep the real documentation handy because you'll need it.


