> It's trained on generating the most likely completion to some text, it's not at all easy to tell if it's bullshitting you if you're a newbie.
I don't think many people (at least not myself and others I know who use it) treat GPT4 as a source of absolute truth, but more like an "iterate together until solution" partner, taking everything it says with a grain of salt.
I wouldn't make any life-or-death decisions based on just a chat with GPT4, but I can use it to help me look up specific questions and surface information that then gets verified elsewhere.
When it comes to making games (with Rust), it's pretty easy to verify when it's bullshitting as well. If I ask it to write a function, I copy-paste the function and either it compiles or it doesn't. If it compiles, I test it out in the game, and if it works correctly, I write tests to further solidify my own understanding and verify that it works correctly. Once that's done, even if I have no actual idea of what's happening inside the function, I know how to use it and what to expect from it.
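As a concrete sketch of that workflow: suppose GPT4 hands back something like the function below (the name and logic here are hypothetical, not from any real session), the "solidify" step is just pinning down the behavior I expect with tests, whether or not I fully follow the internals.

```rust
// Hypothetical example of a function an LLM might produce when asked
// for damage handling in a Rust game. Names are illustrative only.

/// Subtract `damage` from `health`, clamping at zero so the `u32`
/// never underflows (a classic panic if you use plain `-` instead).
fn apply_damage(health: u32, damage: u32) -> u32 {
    health.saturating_sub(damage)
}

// The verification step: tests that encode what I expect the function
// to do, independent of how it does it.
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn normal_hit_reduces_health() {
        assert_eq!(apply_damage(100, 30), 70);
    }

    #[test]
    fn overkill_clamps_to_zero() {
        assert_eq!(apply_damage(10, 50), 0);
    }
}
```

If a test like `overkill_clamps_to_zero` fails, that's the "it's bullshitting" signal, caught before the code ever ships in the game.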