Rationality suggests incorporating new evidence into existing beliefs. So where are these people changing their projections based on the slowing increase in computing power (i.e., the second derivative of computing power)?
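To make "second derivative of computing power" concrete, here's a minimal sketch of how you'd check whether growth is slowing. The compute figures are made-up placeholders, not real benchmark data: difference log compute once to get the growth rate, and again to see whether that rate is falling.

```python
import math

# Hypothetical compute per year (illustrative placeholders only).
compute = [1.0, 2.0, 3.8, 6.9, 12.0]

log_c = [math.log(c) for c in compute]
# First differences of log compute = exponential growth rate per step.
growth = [b - a for a, b in zip(log_c, log_c[1:])]
# Second differences = change in that growth rate; negative means slowing.
accel = [b - a for a, b in zip(growth, growth[1:])]

print("growth rates:", [f"{g:.3f}" for g in growth])
print("acceleration:", [f"{a:.3f}" for a in accel])  # all negative here => slowing
```

With these placeholder numbers the growth rate falls each year, which is the pattern the comment is pointing at.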
The intelligence explosion discussed in the original LessWrong post is a separate issue from exponential growth, only loosely related, although folks like Ray Kurzweil constantly conflate the two.
There are far more issues on that side of the singularity idea than simple questions of computational power. There are fundamental limits on information, a direct result of quantum mechanics, which constrain how accurately you can, say, predict the weather without directly controlling it.
So a superintelligent Jupiter brain simply can't predict* the temperature of every cubic centimeter to 5 decimal places 10 years from now, regardless of what computational power it's given or what measurements it takes. And people's behavior is influenced by the weather, so it can't model human behavior to that nth degree either. That's just one example; you can't predict the stock market accurately over long time frames for the same reasons, and so on.
*Again, ignoring more direct influences.
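The comment grounds the limit in quantum mechanics; the practical mechanism is chaotic dynamics amplifying any initial uncertainty exponentially. Here's a minimal sketch (not from the original discussion) using the logistic map, a standard toy chaotic system: two trajectories that start within measurement error of each other become completely uncorrelated within a few dozen steps, no matter how much compute the forecaster has.

```python
def logistic_map(x, r=4.0):
    """One step of the chaotic logistic map x -> r*x*(1-x)."""
    return r * x * (1.0 - x)

x_a = 0.123456789   # "true" initial state
x_b = x_a + 1e-10   # same state, measured with a tiny error

for step in range(1, 61):
    x_a = logistic_map(x_a)
    x_b = logistic_map(x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={x_a:.6f}  b={x_b:.6f}  |diff|={abs(x_a - x_b):.2e}")
```

By roughly step 40 the two runs share no useful information. Extra precision in the simulation doesn't help; only exact knowledge of the initial state would, and that's what the measurement limits rule out.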
PS: Now, QM might not be correct, but assuming a worldview without evidence is the realm of religion, not reason.
I don't think anybody from the SIAI is disagreeing with you here. The kind of prediction you're talking about isn't part of the intelligence explosion hypothesis. A superintelligence doesn't need literal omniscience, or even anything close to it, in order to be much, much more effective than humans at achieving goals.