
That's very interesting! I remember this happening a lot with sports / nutrition studies. I'm not surprised to see medicine in there as well. Overall, understanding the brain is hard, understanding our bodies is hard; it's normal that we can only produce a limited amount of consistently reproducible knowledge per year.

Judging from my friends who pursued STEM, I assume this happens because of the pressure to pursue higher education at all costs: people don't find enough qualified jobs, get stuck in academia, and churn out low-quality papers to earn a promotion or secure the next public grant.

If you add to the mix results influenced by the current political climate, what your peers think, and what your sponsors want to prove for financial gain, the situation gets bleak very fast.

Irreproducible papers are the academic equivalent of abandoned OSS projects in private tech, except funded by public money and shaped by perverse incentives: a recipe for disaster.



I don't think that's quite fair. Irreproducible papers have supposedly gone through a fairly rigorous review process by their peers, and are usually published by someone with a PhD (or who will soon have one) at an institution of higher learning. Many papers (which often end up being irreproducible) use language that says "This is the thing I found, and it is/is not true."

OSS is often someone learning, or who has an itch to scratch, and put some code out there that may or may not solve your problem or work right. The "lack of warranty" clause in most open-source licensing is really important here, because it's largely designed to say "hey, I did a thing but make no guarantees so use at your own risk", whereas a published paper says "hey, a number of experts and people who should know all agree that this is a thoroughly-researched and well-thought-out position, and you can probably consider it to be true (or nearly so) and base some of your decisions on it." I think that change in context is really important to consider when making the analogy here.


> Irreproducible papers have supposedly gone through a fairly rigorous review process by their peers

This is a myth. Modern "peer review" simply means checking to see if the claim is interesting and if the proper Word or LaTeX template was used. Peer review meant replication in the distant past before the modern Grants and Impact Factor system was built.


> This is a myth. Modern "peer review" simply means checking to see if the claim is interesting and if the proper Word or LaTeX template was used.

I got peer review from two reviewers on a paper submitted to Frontiers last week, and it consisted of much more than that. Sure, the reviewers couldn't replicate our exact work themselves, but the paper's methodology was heavily scrutinized, and both reviewers asked for modifications to different parts of the paper.

It's an imperfect system, but it's not as if there's a rubber stamp floating around. The goals of the peer review system vs. open source are fundamentally different, and the way the products of both are treated should reflect that.




