
The bag is already empty

Some don't want to believe it



Don't you think there are a few things we could say on this subject to bring the debate to good-old-HN level?

(1) LLMs' attention mechanisms are clear enough at a conceptual level: "the chicken didn't cross the road because it was too wide"... OK, so LLMs "understand" that the "it" is the road, because QKV (query/key/value) attention is enough to "learn" this.
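If it helps make (1) concrete, here is a minimal sketch of the scaled dot-product attention that the QKV machinery computes. The toy vectors below are hand-picked assumptions for illustration, not trained weights; the point is only to show how a query for "it" can land its attention weight on the key for "road":

    import numpy as np

    def attention(Q, K, V):
        # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
        d_k = K.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)           # query-key similarities
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w = w / w.sum(axis=-1, keepdims=True)     # softmax over the keys
        return w @ V, w

    # Hand-picked toy vectors (assumptions, not learned weights): the query
    # projected from "it" is made to point toward the key for "road".
    tokens = ["chicken", "road", "it"]
    K = np.array([[1.0, 0.0, 0.0, 0.0],    # key for "chicken"
                  [0.0, 1.0, 0.0, 0.0],    # key for "road"
                  [0.0, 0.0, 1.0, 0.0]])   # key for "it"
    V = K.copy()
    q_it = np.array([[0.1, 4.0, 0.1, 0.0]])  # query for "it"

    _, w = attention(q_it, K, V)
    for tok, weight in zip(tokens, w[0]):
        print(f"{tok:>8}: {weight:.2f}")  # "road" gets the largest weight

In a real model the Q/K/V vectors come from learned projections of the token embeddings; the takeaway is just that the softmax over query-key dot products is where the "it" -> "road" resolution happens.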

Some say this is all you need ("Attention Is All You Need")... I beg to differ:

(2) Human brains are complex but better and better studied. You have one, so you should be interested in the hardware. So: IMHO, current LLMs look a lot like the Default Mode Network (DMN) in our brains. If you read the description here: https://en.wikipedia.org/wiki/Default_mode_network I think you will see, as I do, a striking similarity between the behaviour of LLMs and our DMN's ways.

What a synthetic FPN (frontoparietal network) would be, I have no idea, so here goes: the bag is very interesting!


Yes we can!

I would never say that's all we need, but I do say it might be the most important part we need! That is, language is the most distinctive feature of our brains. The DMN and similar shallow "activity scans" don't tell us much. Yes, some animals have some kind of language: they communicate, predict trivial outcomes, or remember some events. But this is meaningless compared to the output of a human brain; the gap is vast.

I don't think there's anything interesting left in the bag.



