
When I search for Cr2Gr2Te6, Google Gemini tells me:

"AI Overview Cr2Gr2Te6 is a miswritten, imaginary compound; the correct compound is Cr2Ge2Te6 (Chromium Germanium Telluride), where Cr stands for chromium, Ge for germanium, and Te for tellurium. This error, where 'Gr' was mistakenly used for 'Ge', has been replicated in multiple scientific publications since its discovery in 2017, despite the correct formula being known and published."



It seems like the only reason Gemini knows this is the exact article we're discussing. Seeing as it's the first result when you search the term, Gemini is just summarizing the article rather than synthesizing the information itself.


What leads you to believe that that's a reason, let alone "the only" reason?

If the top search hit is your only indication then you might want to brush up on your understanding of how LLMs work.


Gemini doesn't know anything. All of its outputs are synthesized via pattern matching of the prompt against its training data. No one knows exactly what the sources of any given LLM synthesis are. If one asks for a summary of a specific article, it will do that, but that wasn't the prompt.



