Think we need to distinguish between math as "performing pre-defined calculations rapidly", and math as "figuring out solutions to problems, often novel, sometimes new, that have a mathematical underpinning and can be represented in mathematical terms."
In short, computers are good at rapidly performing the calculations of a solved problem, i.e. the implementation of an algorithm. Humans are good at coming up with the algorithm. A good, general-purpose algorithm generator is probably AI-complete, although I could see "evolution" as a plausible answer to such an algorithm generator too, though maybe only in its capacity to generate better generators. Or maybe I'm talking out my excretorial orifice, I'm often not sure.
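The "evolution as generator" idea can be sketched in a few lines. This is a toy illustration only, with all names and parameters invented for the example: a blind search that finds a fit bit-string without ever being given the algorithm for producing one.

```python
import random

# Toy evolutionary search over bit-strings. The search never sees an
# algorithm, only fitness scores, yet it reliably finds good genomes.
def evolve(fitness, genome_len=8, pop_size=30, generations=200):
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]              # keep the fitter half
        children = []
        for parent in survivors:
            child = parent[:]
            child[random.randrange(genome_len)] ^= 1  # one-bit mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# "Fitness" here is just the number of ones; the search usually converges
# to the all-ones genome without ever being told that rule.
best = evolve(fitness=sum)
print(best)
```

The generator-of-generators idea would correspond to mutating `evolve`'s own parameters as well, which is where the sketch stops.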
Interesting point. Before the theory of gravity was formulated, planets were still behaving the same way. Then when Newton said F = Gm1m2/r^2, and we came up with equations to define exactly how they move, does that mean planets started doing arithmetic? Did they start solving differential equations that describe their motion?
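For contrast, here is what *we* have to do to predict the planet's path: a minimal simulation sketch, using rough Earth/Sun figures, that grinds through F = Gm1m2/r^2 one step at a time. The planet just moves; the arithmetic is ours.

```python
import math

# Semi-implicit Euler integration of gravity for a body orbiting a fixed
# central mass (rough Sun/Earth values, SI units).
G, M = 6.674e-11, 1.989e30           # gravitational constant, solar mass
x, y = 1.496e11, 0.0                 # start 1 AU from the sun
vx, vy = 0.0, 29_780.0               # roughly Earth's orbital speed (m/s)
dt = 3600.0                          # one-hour time step

for _ in range(24 * 365):            # simulate about one year
    r = math.hypot(x, y)
    ax, ay = -G * M * x / r**3, -G * M * y / r**3
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt

print(math.hypot(x, y) / 1.496e11)   # distance in AU; stays close to 1
```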
I think ultimately it means that "doing arithmetic / doing math" is defined by the observer rather than the doer. That's true for any abstraction - the brain does exactly what it does; to ask whether it's doing "abstracted task X" is not really meaningful from the brain's perspective, since the abstraction is a mental construct we create in the first place.
The computer doesn't do arithmetic either. Signals just pass through wires and gates according to their nature (or, as we might say, while obeying the laws of physics).
Is it? Is there any evidence that it's converting the signals to symbols and performing arithmetic on the symbols? Or is the signal processing being performed "directly", through the essentially analogue operation of the neurons?
Otherwise you're asserting "doing arithmetic" as a property of any analogue system, and would say that a falling ball needs to understand calculus in order to make an arc.
I see what you're saying, but I think the comparison to a falling ball is not fair.
The brain is doing purposeful data processing. Yes it's just signals doing their thing following the laws of physics but that's also what computers do.
Arithmetic in a CPU is just electrical signals passing through wires. If you were to look at these signals without any awareness of why they are flowing this way and that, you wouldn't be able to tell that they are an encoding of some arithmetic operations.
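A toy illustration of that point: a ripple-carry adder built only from gate-level operations. The function names and the bit encoding are invented for the example; the point is that nothing below "knows" it is adding - reading the bit patterns as numbers is our interpretation.

```python
# One full-adder cell, expressed purely as logic gates.
def full_adder(a, b, cin):
    s = a ^ b ^ cin                      # XOR gates
    cout = (a & b) | (cin & (a ^ b))     # AND/OR gates
    return s, cout

def add(x_bits, y_bits):
    """Add two little-endian bit lists by rippling the carry through."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 6 + 3: the bits are just signals until we decide they encode binary numbers.
print(add([0, 1, 1, 0], [1, 1, 0, 0]))  # [1, 0, 0, 1, 0], i.e. 9
```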
If processing audio and visual signals does not involve doing arithmetic, then what does it involve?
Not symbolic arithmetic though, and the answer is not precise.
If people are going to argue that inanimate objects perform arithmetic, they will need to define arithmetic. Digital or analog calculators do it symbolically, subject to interpretation of the symbols by a human.
If a human is using a calculator or an abacus or blackboard to perform arithmetic, would we also say that the abacus or blackboard performs arithmetic? (Great, now we have to define "performs")
Why stop there? By the same sort of argument, one could say that it not only computes arithmetic, but that it is a quantum computer.
The way I see it, an operational amplifier, or a person catching a ball (to take an example from another post here) is performing a task that can be analyzed mathematically and modeled arithmetically, which is not the same as doing the analysis or setting up the model and performing the calculations.
But does it produce a precise, numerical answer, or just a fuzzy approximation? Or does it do signal manipulation with an end result similar to a computer's, but using a non-arithmetic algorithm? I don't think it's obvious that the human brain literally does something that we'd label "arithmetic".
The brain is actually really good at math - the best example I've heard of this (I think from the book 'Blink') is catching a fly ball - in an instant someone is able to calculate the trajectory, wind speed, and spin, and coordinate their own body to run at the right speed to be in the right place when it lands. Now of course that takes experience, but it's not just pattern matching, since training allows people to adapt to new and unseen conditions. But any time you have a 'gut' feeling, it's usually because of some behind-the-scenes calculations.
Translating maths into a mess of symbols and deconstructing it consciously is of course a laborious process - but AFAIK no computer is good at that either.
My own personal metaphor is the conscious mind as programmer and the unconscious mind as computer. If I correctly program my unconscious mind I can spontaneously realize the right answer far faster than I can understand why.
The result is similar to what one would get by calculating such trajectories, but I am not sure what happens in the mind can be called calculation. How could it be properly tested that the brain indeed calculates trajectories when catching a ball? Would practicing catching a ball improve the relevant mathematical abilities, demonstrable on paper?
What if the brain operated more like a neural network than a calculator? Your second point seems to raise another, separate issue: whether knowledge gained can be applied in an unrelated context, or whether these two calculations would even be performed in the same way by the brain, such that training in one yields greater efficiency in the other. I'd wager that practicing hand-eye coordination would yield greater efficiency in other tasks related to processing visual input, maybe video games.
I don't agree with that description of what the brain does when you catch a ball, or with the principle it's proposing. I don't think the brain does any kind of calculation to figure out how to catch the ball; I think it's effectively muscle memory. If you've never caught anything before, your brain will have no clue what to do. As you practice running to catch things, I think a better description of your subconscious process is something like: "The ball (or whatever) looks like it's growing bigger in my visual field at a certain rate. A previous time when it grew bigger at a rate kind of like that, I applied about x force in the legs, and I didn't get there in time. Another time when it was growing at about this rate, I applied a larger force y, and the ball landed behind me. This time I'll try to apply a little more than x force, but less than y force, and see how that works."
This is a substantial oversimplification of course (there are many more factors involved than how fast the ball is growing in the visual field), but I think the point is clear enough. I doubt there's any trigonometry happening in the brain's circuitry; it seems much more plausible to me that the brain is really good at remembering how it felt in previous circumstances, recognizing how those remembered circumstances relate to the current one, and trying to adjust.
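That adjust-from-remembered-attempts story can be sketched as a toy search. Everything here is hypothetical for illustration: `run_distance` stands in for the world's hidden physics, and the learner only ever compares past attempts, never solves an equation.

```python
# The learner narrows in on the right force by bracketing it between a
# remembered undershoot and a remembered overshoot - no trajectory math.
def run_distance(force):
    return 2.5 * force            # hidden physics the learner never sees

def learn_force(target, trials=8):
    lo, hi = 1.0, 100.0           # two remembered attempts: too short, too far
    for _ in range(trials):
        guess = (lo + hi) / 2     # "a bit more than x, less than y"
        if run_distance(guess) < target:
            lo = guess            # fell short: new lower bound
        else:
            hi = guess            # overshot: new upper bound
    return (lo + hi) / 2

print(learn_force(target=60.0))   # about 24, since 2.5 * 24 = 60
```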
As I understand it, this is actually a significant debate in cognitive science, philosophy of mind, and related fields. One prominent proponent of a view like the one I've expressed here, that the brain doesn't require or use heavy math to do things like catch flying objects but rather acquires the ability over time through experience, is John Searle. He is known for using the example of his dog's ability to catch a ball that's bounced off a wall when discussing and arguing against theories of mind that propose that all unconscious processes must be following algorithms or rules (like running through computations to figure out how to catch a ball). Here's a quote of his from the BBC program Horizons (quote found in "New Technologies in Language Learning and Teaching", issue 532, on page 37 [1]):
If my dog can catch a ball that's bounced off the wall, that may be
just a skill he's acquired. The alternative view (the pro-AI view)
would say: "Look, if the dog can catch the ball it can only be
because he knows the rule: go to the point where the angle of
incidence equals the angle of reflection in a plane where the
flatness of the trajectory is a function of the impact velocity
divided by the coefficient of friction" - or something like that.
Now, it seems to me unreasonable to think that my dog really *knows*
that. It seems to me more reasonable to suppose he just learns how
to look for where the ball is going and jumps *there*. And a lot of
our behavior is like that as well. We've acquired a lot of skills,
but we don't have to suppose that, in order to acquire these skills,
the skills have got to be based on our mastery of some complex
intellectual structure. For an awful lot of things, we just *do* it.
Sure, it might not be trig, but in your description there is still calculation (the ball is growing larger at rate x, so adjust my legs accordingly), just not of the self-aware, conscious variety. There are really two definitions of 'calculation' that are used interchangeably: a) the act of deconstructing into symbols and picking them apart, and b) the execution of an algorithm.
My point is that 'feeling' or 'muscle memory' is the execution of a (super hacky, ad hoc and efficient) algorithm, the big difference between computer calculation and brain calculation being that the brain is far more plastic than the computer. Presumably in an insect's mind the 'muscle memory' is hardcoded (deliberately reductionist here), whereas we have a conscious mind that can train and 'program' our unconscious mind.
In your quote, 'we just do it' ignores the great complexity of the human mind, which can perform complex interconnected tasks with seemingly no effort at all if it has been trained properly. Replace training with programming, and the parallels are there: graphics programming, for instance, is a huge collection of hacks and rules of thumb to get something that looks like perspective and light. Like our brain, it's results-oriented, so it truly doesn't matter if it's not 'correct'.
Now of course the brain and computers are different, but it's incorrect to say the brain doesn't calculate - it just is a lot more unwieldy to program than a computer.
Exactly. I always have to make this clear to people. The conscious mind is incapable of fast arithmetic, but the subconscious mind is amazingly capable. It's doing complex, freebase (relative vs static) calculations constantly. The ability to track and accurately move your limbs from a non-determined base point, with a few hundred nanoseconds to milliseconds of latency (proactive) or 150-250 ms of latency (responsive), is without compare.
Just look at machines that don't do precalculated movements (ASIMO, though even many of its patterns are prepared; Boston Dynamics robots, etc.) versus those that do (industrial assembly-line robots), and the gap is insanely wide. And even then, the dynamic bots are doing maybe one or two things at a time (walk, stop, pick up, walk, drop), nothing complex or integrated.
Is it? Is there any evidence that it's converting the signals to symbols and performing arithmetic on the symbols? Or is the signal processing being performed "directly", through the essentially analogue operation of the neurons?
Otherwise you're asserting "doing arithmetic" as a property of any analogue system, and would say that a falling ball needs to understand calculus in order to make an arc. Downthread someone is saying that crystal formation is "doing arithmetic". While there may be a philosophical sense in saying that all actions of the universe are in some sense arithmetical as they obey physical laws, this is not a useful way to talk.
What's the difference between "crystal formation" and cpu processing? They are both physical processes that don't understand the mathematical concepts that underpin their functioning. Maybe they do computation, but not math. Only humans can do math so far, and maybe some AIs, in a limited sense. Understanding math is harder than computing.
The point I'm trying to make is that computers "do arithmetic" through digital operations where there is a symbolic representation (through assigning analogue state to discrete symbolic values). Not through transistors operating continuously in their linear region.
Inside neuronal systems there doesn't seem to be a direct symbolic representation - and if there is, the neuronal patterns of somebody doing calculus on paper versus e.g. catching a ball are entirely different.
The symbolic representation is just a language though.
We might not be equipped to understand a language different from the formalisms we came up with.
To expand on that point, the brain is not good at simple products of large numbers, but is fantastic at doing things like triangulation. Or calculating how much muscle to contract to keep your balance going into a turn whilst running. Or eyeballing quantities - we notice when we get it wrong, but the vast majority of the time, we're pretty good at getting estimates correct about the human-scale world around us.
What we do regularly (say, moving your hand to reach something, or catching a ball) does require a large number of very precise products though (i.e. inverse kinematics).
It is not very meaningful from an information theoretic pov to debate whether the results come from the multiplication algorithms computers follow or via some trained analog network; the result is the same.
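One way to see "same result, different mechanism": exact digital multiplication versus the slide rule's analog trick of adding lengths on a logarithmic scale. A minimal sketch, with both function names invented for the example:

```python
import math

# Two routes to the same answer: the digital multiplication algorithm, and
# the slide-rule method of adding lengths on a log scale (an analog process).
def multiply_digital(a, b):
    return a * b                                 # exact, symbolic

def multiply_sliderule(a, b):
    return math.exp(math.log(a) + math.log(b))   # approximate, "analog"

print(multiply_digital(37, 54))                  # 1998
print(multiply_sliderule(37, 54))                # about 1998, with float error
```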
Well, yes, but that's not a very useful distinction. You train computers as well by programming them. In meatspace, there was a famous experiment in the '60s where kittens had their heads locked into position so that they only saw vertical lines - their nascent neuroplastic brains then trained that way, and as they matured, they simply couldn't see horizontal lines (eg: would walk into horizontal bars).
In the Nature vs Nurture debate, the purists on either side tend to use tortured, hair-splitting definitions to make their arguments, and it's usually those somewhere in the middle that sound the sanest.
I agree (though probably not in the sense you intended) - in fact, the human brain is the only thing we know of in the universe that can do math.
I do not hold the apparently widespread view that anything performing an action that can be modeled mathematically is doing mathematics, but it is clear by now that the two sides of this issue are not going to reach an agreement.
At least in so far as living macroscopic beings are concerned, they're all doing computation of some sort by processing environmental information and using the results to produce behavior. They're about as good at math as any other computer programmed to reproduce those algorithms.
So long as "processing" information isn't the same as actually producing that behavior, as in the case of crystal growth, they're doing something besides what they're doing.
Though I'd prefer to think that crystals are actually doing math too, but they're so good they don't even need to think about it.
Aren't you overshooting what you mean by linking to that paper? Treating some physical processes as computational processes might lead to some intractable problems, but it also might lead to some insights. Aaronson's section on space seems like an example of this.
Nor is it an issue that some way of thinking might raise more questions than it answers. Actually, if any of those questions is both interesting and solvable, that is a virtue.
I don't believe I am overshooting. I am pursuing this conversation thread in light of the original claim; that the brain is incredibly good at math. I just wanted to poke at this statement a bit, to show that if you accept this claim (that the brain is "incredibly good at math" based on its inherent structure), it opens the door to a whole other discussion around what constitutes computation.
In sum, I was just trying to work out from what angle the OP was framing their point.
Daniel Dennett has a nice phrase for what plants, etc. do: Competence without comprehension. I think it sort of applies here. (And broadly to a lot of human activities, but I would definitely classify "good at math" as requiring some degree of comprehension whereas limb movement... not so much[1].)
[1] You don't really need to understand _how_ you're moving your arm. You just do it -- it's on autopilot.
I'm reading Dennett's From Bacteria to Bach and Back right now.
Highly recommended if you're interested in philosophical discussions about this sort of stuff. I'm finding it highly entertaining and deliciously provocative.
Yeah, think about how complicated it is to play a sport like tennis or frisbee. Running with co-moving objects and players. Integrating the position, velocity, and acceleration of all these things in order to compute a solution. That's some decent undergrad math!
I don't think it is a mathematical computation in that way. Let's say you want to press the power button on your PC with your finger. It's not like the brain takes the 3D position of your finger and the 3D position of the power button and calculates the way it has to move your body/arm etc. to press the button. That would be insanely inefficient.
Assuming computation has ontological existence, as opposed to cultural demarcation, where we arrange certain physical devices to have predictable behavior we can modify, and call that computation. Or before computers, denoting the markings human beings make on paper as computation.
Saying the brain literally computes is making a philosophical claim as to what exists, as opposed to a useful metaphor.
Walking usually involves ending up in distant places so you can, say, get food or something to drink. A top generally falls over within a few feet of where it was spun.
"Bad at Calculating, Good at Everything Else" <- correct title IMHO
It will be a long, long time until computers spit out something like the proof of Fermat's Last Theorem (which requires being good at math, not just at calculating).
If this does not just do verification but generates genuine new proofs, then I might be wrong and will have to remove one 'long'. Even then, I am convinced it will still be a long time until computers can come up with such a complicated proof, involving so many different fields of mathematics, as that of Fermat's Last Theorem.
Furthermore, math is not just about proving things. Mathematicians invent the consistent formalisms in which the process of proving takes place, and spot isomorphisms between apparently different ones. They also have a knack for choosing to follow paths that will be fruitful, rather than lose themselves in a maze of pointless symbol manipulation. AFAIK, no computer has ever taken it upon itself to find a proof of some nontrivial issue in mathematics, let alone do any of the less formal things in mathematics.
The issue is that some (actually, most!) of the logics you can reason about are not decidable.
So, it is hard to have a sense of progress towards a goal.
Natural language provides a huge amount of overhead for doing math. If computers did math in terms of natural language it would be a lot slower for them too.
It'll be interesting to see what changes occur in the human brain as we develop neural-computer interfaces. The human brain appears to be very good at performing instantaneous approximate calculations (catching or dodging a thrown ball).
Interestingly, humans don't appear to be as good at memorising sequences as monkeys (1). It'll be awesome if someday we can see the difference between how the monkey and human brain behaves while carrying out this task.
Just making your eyes converge at a focal point so you can read this involves math, and that's probably some of the most trivial stuff. The inverse kinematics involved in moving your limbs also do involve plenty of math, try to do it in a robot arm and see what I am talking about.
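For a feel of how much trig even a toy arm needs, here is the standard two-link planar inverse-kinematics solution. The link lengths are arbitrary example values; a real robot arm adds more joints and constraints on top of this.

```python
import math

# Inverse kinematics for a two-link planar "arm": given a target (x, y),
# find the shoulder and elbow angles via the law of cosines.
def ik_2link(x, y, l1=1.0, l2=1.0):
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Reaching straight out to (2, 0) with two unit links: arm fully extended.
s, e = ik_2link(2.0, 0.0)
print(s, e)  # both 0.0
```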
The problem is that our way of doing arithmetic has a lot of overhead. The algorithm we learn in school is not that good in terms of efficiency. However, visualizing an abacus (not a 10-bead one) and exploiting muscle memory is probably faster: https://www.youtube.com/watch?v=Px_hvzYS3_Y
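For reference, here is the schoolbook column-addition algorithm written out explicitly. Every column is a separate lookup-and-carry step, which is the overhead being described; the abacus user replaces much of this with practiced motions.

```python
# Schoolbook column addition, one digit at a time with an explicit carry.
def school_add(a, b):
    xs = [int(d) for d in str(a)[::-1]]          # digits, least significant first
    ys = [int(d) for d in str(b)[::-1]]
    carry, digits = 0, []
    for i in range(max(len(xs), len(ys))):
        x = xs[i] if i < len(xs) else 0
        y = ys[i] if i < len(ys) else 0
        carry, d = divmod(x + y + carry, 10)     # one column: add, split off carry
        digits.append(d)
    if carry:
        digits.append(carry)
    return int("".join(str(d) for d in digits[::-1]))

print(school_add(478, 256))  # 734
```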
Since all definitions of mathematics are controversial when subjected to philosophical rigor, and the foundations of mathematics are themselves axioms (which by definition cannot be proven), and these axiomatizations are sometimes either incomplete or inconsistent with each other, I will just say it's an open question.
What we do however know is that mathematical principles so far hold up well and have been of vital importance to understand nature.
Oh, the brain wetware is good at arithmetic; just unfortunately not in a way that is available to conscious thought.
Just like there are computer languages in which there is little or no arithmetic support, even though the machine does nothing but arithmetic when evoking their meaning.
Just like we can apply silly abstraction inversions to bring about arithmetic in a system that doesn't expose it (e.g. Church numerals in a lambda calculus), the brain implements conscious arithmetic in a very inefficient way.
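A minimal sketch of that kind of abstraction inversion: Church numerals, where addition emerges from function composition even though the encoding itself exposes no arithmetic.

```python
# Church numerals in Python: numbers encoded as functions, arithmetic as
# composition. The "machine" (Python's call mechanism) exposes no arithmetic
# here, yet addition falls out of the encoding.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```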
The human mind can grasp platonic/Aristotelian forms, which are at least potentially infinite in ways of being instantiated. Computers can only deal with finite things, such as arithmetic. That's the fundamental difference between the two. Hence Gödel's second theorem that no axiomatic system can know truth, whereas we can know his theorem is true.
Here is a question for all the "brain as computer" proponents. Given the molecular level activity of a bunch of neurons over time can you figure out what "maths" happened there? If not, then please stop this "brain as computer" advertisement and come back when you have the answer to the above question.