The problem with fighting disinformation is that it requires an entity that has the authority to stamp something as true or untrue. This entity will be made of humans and therefore susceptible to disinformation in the same way as the people it is intended to protect and also prone to abuse of its power. If it's between the order of one centralized authority determining truth (China) and the chaos of thousands of competing entities, I'll take the chaos.
Then you have the problem of who gets to decide who gets to be a ministry of truth. The entity with that power is now the one centralized ministry of truth.
Well, that's just one possibility in a space of many.
The core of the point is this, reiterating from the article: obviously a ministry of truth is bad. But isn't the chaos just as bad, since it is susceptible to being dominated by an "implicit" equivalent of the ministry of truth?
To what extent are people really deciding for themselves when they think they're deciding for themselves? If freedom is our highest ideal, how do we empower the maximal amount of people to make decisions for themselves without these influences?
> To what extent are people really deciding for themselves when they think they're deciding for themselves?
To no extent. All decisions are made with some outside influence. This is true now and has always been true. Unless you live as a hermit, your thoughts are influenced by other members of society.
> If freedom is our highest ideal, how do we empower the maximal amount of people to make decisions for themselves without these influences?
We don't. How can you eliminate outside influence if you are under outside influence yourself?
I'd say it's a combination of things. Partly the fact that they actually have humans reviewing material and performing at least a modicum of veracity verification. BUT, this happens over time. So yes, while Wikipedia is a target of mis/dis-information, a lot of it is eventually cleaned up, and where there are back-and-forth changes, this is pointed out, meaning readers are warned to take the info with a grain of salt. Incorrect changes may persist for days, months, or even years on Wikipedia. However, the Wikipedia approach doesn't scale that well for time-sensitive information. In the week before an election, there may not be sufficient time to review and verify all information published. And even though the record may eventually be corrected, the damage is done and the correction becomes almost moot.
Wikipedia is struggling hard against misinformation, POV-pushing and edit wars on almost every controversial topic - the big ones being politics (American, Eastern European, Israel/Palestine) and medicine (GMOs, pseudoscience healing, etc.)
These are places where the fighting is so common that admins are allowed to ban on a hair trigger. This doesn't really lead to "high quality" - instead it leads to massive gamesmanship as people try to smear each other and get the "other side" banned so they can freely edit the topics to their heart's content. Just click through some of the cases to see the issues leading to them (beware, this is an endless black hole of drama).
>The problem with fighting disinformation is that it requires an entity that has the authority to stamp something as true or untrue.
Any real solution to the problem of disinformation has to work through giving individuals better tools, more visibility and more control over the information they are exposed to.