I think the point he is trying to make is that there are boundaries to intelligence. I think of it this way: no matter how smart an AI is, it would still take 4.3 years to reach Alpha Centauri travelling at the speed of light. An AI still needs to run experiments, collect evidence, formulate hypotheses, reach consensus, etc. Is this really that much more efficient than what humans do today?
But we humans aren't operating at anything like the speed of light. What if we tweaked our DNA to produce human beings with a working memory capacity of 50 items instead of the normal 7-ish [1]? One such researcher would be able to work faster, tackle more problems at once, and consider more evidence and facts. The next bottleneck for that person, of course, would be input/output capacity (reading, writing, typing, communicating), but even with those limitations, I bet they would be a lot more efficient than the average "normal" human. The question is: would you call such a person more "intelligent"?
Or we could just get more humans, and then it becomes a coordination problem, right? I mean, there is a point to comparing individual vs. collective intelligence. This is a bit like communist systems: they work in theory because you get to plan the economy centrally, but in practice more chaotic (unplanned) systems do better (compare the growth of capitalist countries vs. communist ones).
Sure, there are boundaries. But those boundaries may sit way above what humans are doing. Computers, unlike humans, aren't limited to the domain of the physical. An AI may well be able to meaningfully organize (read: hack) all of the world's computers because it can self-replicate, increase its computing power, communicate very complex information very fast, etc. We're limited by the output of fingers and vocal cords, by the size of our brains, by imprecise and slow memory formation and recall, and by input that comes mostly from eyes and ears; computers aren't.
An AI may well be able to reach consensus on millions of hypotheses per second.
"Is this really that much more efficient than what humans do today?"
A major managerial problem with humans is sorting out our irrational emotional biases and keeping everyone working on something resembling the appointed task. Can you imagine the productivity gain if that problem suddenly went away?