The problem is essentially spam filtering with much higher stakes. A priori, the source of an unusual paper might be one of:
- a genuine innovation from a polymath outside the field or undiscovered talent (Ramanujan)
- someone outside the field applying a well-understood technique from their own field in a new area (used to be common in bioinformatics back when it was done with Perl)
- someone in a non-first-world country much closer to the problem or with a connection to traditional relevant knowledge
Then there are the ones which turn out to be wrong:
- respected but crank-ish behaviour within the field: someone well respected who is extremely enthusiastic about an idea beyond all evidence, such as Linus Pauling's enthusiasm for vitamin C
- respected but ideological behaviour within the field: e.g. the warring schools of economics
- novices who are bad at checking their work: students who believe they've solved a famous conjecture but left out a minus sign on page 65. Most senior academics deal with a lot of these routinely.
- experts from one field getting cranky in another: William Shockley's opinions on biology
- freelance not-for-profit cranks: outsiders who are wrong through honest error rather than malice
- for-profit cranks: this is where it starts getting genuinely dangerous, as these people can be high-output and are aimed at the public. All manner of quacks are in this category, such as the "miracle mineral solution" people who have been trying to get people to drink bleach.
- culture war cranks: the Alex Joneses and David Ickes of the world. Even more dangerous, as they are not afraid to libel people and destroy those who cross them.
It is no more possible to submit every paper you see to rigorous review and replication than it is to do your spam filtering by sending money to every Nigerian prince who asks for it and seeing which ones send it back. They will destroy you because their capacity to waste your time and effort exceeds yours. You have to go Bayesian: look for the red flags that suggest a paper falls into one of the categories above.
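To make the "go Bayesian" point concrete, here is a minimal sketch of flag-based triage as a naive-Bayes-style log-odds update. Every flag name and probability below is invented purely for illustration; the point is only the bookkeeping: each red flag multiplies your prior odds of "crank" by its likelihood ratio.

```python
import math

# Hypothetical flags with invented probabilities:
# (P(flag | crank), P(flag | worth reading)).
FLAGS = {
    "claims_famous_conjecture_solved":        (0.60, 0.02),
    "no_engagement_with_prior_work":          (0.70, 0.05),
    "author_sells_a_related_product":         (0.40, 0.01),
    "applies_standard_method_from_own_field": (0.05, 0.30),
}

def crank_log_odds(prior_odds, observed_flags):
    """Start from prior odds that a paper is crankery, then add the
    log-likelihood ratio of each observed flag."""
    log_odds = math.log(prior_odds)
    for flag in observed_flags:
        p_crank, p_genuine = FLAGS[flag]
        log_odds += math.log(p_crank / p_genuine)
    return log_odds

# Example: start at 20:1 that an unsolicited proof is not worth a deep read,
# then update on what the paper actually contains.
score = crank_log_odds(20.0, ["claims_famous_conjecture_solved",
                              "no_engagement_with_prior_work"])
print(f"posterior log-odds of crankery: {score:.1f}")
```

Nobody computes this explicitly, of course, but it is the mental arithmetic: a couple of strong red flags push the odds so far that rigorous review stops being the rational next step.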