What if, instead of asking ChatGPT for facts, we asked it for plausible methods of determining those facts? So instead of answering "The Statue of Liberty is 151 feet tall," it could respond with "Look it up yourself on Wikipedia."
ChatGPT has made up sources when I've asked for them, so I wouldn't 100% trust it to point me to good ones.
Also, if ChatGPT just redirects you to search, doesn't that remove most of the value?
> Also, if ChatGPT just redirects you to search, doesn't that remove most of the value?
That's pretty much the conclusion I've already come to. I have to verify everything ChatGPT tells me, so using it is pointless if I already know where or how to look something up.