Hacker News

I think the "abstract" part here was the concept of "three-ness" or "two-ness" - that the bees could see something that represents 3 and correctly choose the path with three shapes, regardless of what the shapes were.


Based on the picture of the experiment setup in the article, it's hard to infer that the bee understands 'three-ness'.

Based on this reasoning, does the bee already understand 'flowerness' since it knows to get nectar from many different kinds of flowers?

Do dogs understand 'dogness' because they know to sniff the butt of any dog that crosses their path?

I'm fairly sure I could build a simple AI that distinguishes three of something across many different shapes and orientations, within suitable constraints, and associate that capability with another easily distinguishable visual code. Yet I would never say I've made my computer understand 'threeness', since the computer has no abstract reasoning capabilities whatsoever.
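(To make that concrete, here's a minimal sketch of such a "counter", assuming the stimuli arrive as binary pixel grids. It counts connected blobs with a flood fill, shape-invariant by construction, yet the only "number" anywhere is an incremented loop variable. The grid format and function name are my own invention, not from any study.)

```python
from collections import deque

def count_blobs(grid):
    """Count 4-connected groups of 1s in a binary grid."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and (r, c) not in seen:
                blobs += 1                      # found a new shape
                queue = deque([(r, c)])
                seen.add((r, c))
                while queue:                    # flood-fill the whole shape
                    cr, cc = queue.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] == 1
                                and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            queue.append((nr, nc))
    return blobs
```

It returns 3 for any grid with three separate shapes, whatever their form, without anything we'd call abstract reasoning.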

In general, how do we know when something understands '____ness' in the way we understand '____ness'? Per my previous examples, the article's claim is either trivial or unsubstantiated.


How about you just look at the paper?[0]

The positions of symbols were randomized, and they used symbols previously unseen by bees.

The concept of three-ness (as a cardinal number) is just that: being able to identify sets of objects which can be put into 1-to-1 correspondence with the following set of asterisks: * * * .

The bees, according to the experiment, can do that.

Is this our understanding of three-ness? Maybe not; there's also the notion of ordinals (as in 3 is what follows 2, 2 is what follows 1, and 1 is where you start). We have at least two different notions for numbers. But the notion tested in the paper is good enough to do most mathematics with (cardinals and ordinals are kind of the same until you reach infinity).
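(For concreteness, the cardinal definition above can be written out directly: two finite collections have the same cardinality exactly when their elements can be paired off 1-to-1 with none left over. The functions below are my illustration of that definition, not anything from the paper.)

```python
def same_cardinality(xs, ys):
    """True if two finite collections can be put into 1-to-1
    correspondence: pair elements off one at a time."""
    unpaired = list(ys)
    for x in xs:
        if not unpaired:
            return False       # some x has no partner left in ys
        unpaired.pop()         # pair x with one element of ys
    return not unpaired        # every y must also have been paired

def has_three_ness(objects):
    """'Three-ness' in the cardinal sense: same size as {*, *, *}."""
    return same_cardinality(objects, ["*", "*", "*"])
```

Note that nothing here depends on what the objects are, only on whether the pairing succeeds, which is exactly the shape-independence the experiment tested.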

[0]https://royalsocietypublishing.org/doi/pdf/10.1098/rspb.2019...


I won't deny it is a very interesting study and seems to be well put together.

However, the paper is trying to present the bee study as evidence that human cognition lies on a spectrum with animal cognition, rather than being something unique in the animal kingdom, as part of a larger project to determine whether and how human cognition could have evolved from animal cognition:

"Nevertheless, there is no evidence that any species apart from Homo sapiens have ever spontaneously developed symbolic representations of numerosity, which opens the question of which animals are capable of learning symbolic number representations, which are capable of generating such representations, and whether this implies a fundamental difference in the mental processing of Homo sapiens compared with other animals."

So, for the bee study to support an answer to their question, we need to know whether the symbol-to-numerosity association the bees learn is the same sort of thing as the symbolic number representation humans use. The study is certainly a good step in that direction, but it seems to me a more fundamental question is not being asked: are the bees really perceiving 'twoness' and 'threeness' in the cardinality sense you mention, or are they responding to some analog property instead, such as total shape surface area, the overall shade of the picture (two shapes will look lighter than three), or configuration (a triangular arrangement for three vs. a pair of dots for two), rather than exact cardinality?
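(As an illustration of that worry: a "classifier" that never counts anything at all, and only sums dark area, can still pass a 2-vs-3 task whenever the stimuli are not controlled for total area. The threshold and grid format here are hypothetical.)

```python
def classify_by_area(grid, threshold=5):
    """Guess '3' vs '2' from total dark area alone -- no counting.
    Passes a 2-vs-3 task only while area correlates with numerosity."""
    dark = sum(cell for row in grid for cell in row)
    return 3 if dark > threshold else 2
```

With same-sized shapes it looks like a counter: three 2-pixel bars (area 6) yield 3, two 2-pixel bars (area 4) yield 2. But two large 4-pixel shapes (area 8) are misclassified as 3, because the strategy tracks area, not cardinality. Ruling out this kind of cue requires stimuli matched on area, brightness, and layout.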

To take a contrary perspective, why couldn't the bees' symbol-to-number association be the same sort of thing as the AI I mentioned in my previous comment? That AI can recognize symbols for 2 and 3 effectively, but it cannot extrapolate beyond them, nor could we build an arithmetic module out of it through some kind of incremental training. The 2/3 recognizer is a hardcoded piece of functionality that will never expand beyond its core purpose. Perhaps 2/3 recognition is likewise part of the hardcoded bee behavior needed to set landmarks and navigate. This is very different from human arithmetic ability, where we can learn about 2s and 3s through physical examples, then realize there is such a thing as 'number', and eventually realize there are infinitely many numbers.

So, for these two main reasons, despite the study's careful construction, I don't see it as very compelling evidence that human cognitive ability is not unique and is merely part of a continuum on which the dial can be turned up from animal cognition to human cognition.



