I voted for asteroids as the most statistically feared existential risk in one of the LW surveys and was shocked when the results came back:
"Of possible existential risks, the most feared was a bioengineered pandemic, which got 194 votes (17.8%) - a natural pandemic got 89 (8.2%), making pandemics the overwhelming leader. Unfriendly AI followed with 180 votes (16.5%), then nuclear war with 151 (13.9%), ecological collapse with 145 votes (12.3%), economic/political collapse with 134 votes (12.3%), and asteroids and nanotech bringing up the rear with 46 votes each (4.2%)."
I think the reasoning is that an AI singularity is more likely than not within the next 2 centuries. Over that window, people can reasonably differ about how likely nuclear war, nanotech, or bioengineered pandemics are. But the risk of a catastrophic asteroid strike is basically known, and small. (A 1 km asteroid hits the Earth roughly every 500,000 years.) So, depending on your assumptions, asteroid deflection might be the most cost-effective existential risk to mitigate, but it shouldn't be the most statistically feared.
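To make the "known, and small" claim concrete, here is a rough back-of-the-envelope sketch. It assumes 1 km impacts follow a Poisson process at the cited rate of one strike per 500,000 years, and uses a 200-year horizon to match the "next 2 centuries" above; both numbers come from the paragraph, not from any impact model.

```python
import math

# Assumed inputs (taken from the figures quoted above, not from a dataset):
rate_per_year = 1 / 500_000   # ~one 1 km impactor per 500,000 years
horizon_years = 200           # the "next 2 centuries" window

# Under a Poisson assumption, P(at least one impact in the horizon)
# is 1 - exp(-rate * horizon).
p_impact = 1 - math.exp(-rate_per_year * horizon_years)
print(f"P(>=1 such impact in {horizon_years} years) ~ {p_impact:.4%}")  # ~0.04%
```

On these assumptions the chance of a 1 km strike in the next two centuries is on the order of 0.04%, which is why a fixed, well-characterized rate like this ends up low on a "most feared" ranking even if deflection is cheap relative to the stakes.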
"Of possible existential risks, the most feared was a bioengineered pandemic, which got 194 votes (17.8%) - a natural pandemic got 89 (8.2%), making pandemics the overwhelming leader. Unfriendly AI followed with 180 votes (16.5%), then nuclear war with 151 (13.9%), ecological collapse with 145 votes (12.3%), economic/political collapse with 134 votes (12.3%), and asteroids and nanotech bringing up the rear with 46 votes each (4.2%)."