https://twitter.com/reckless/status/1136088138357493761

YouTube tried to explain this decision to us on background, but I made the call to ignore it because they won't go on the record. YouTube has a major harassment problem, but an even bigger problem with transparency and consistency of policy enforcement. It is not our job to paraphrase their explanations. It is their job to own their policy decisions.
A series of articles in mainstream publications does tend to make you recalculate profit implications. Since their ToS and its enforcement are so consistently fuzzy, they can just follow where the wind is blowing.
I imagine there's a boiler room full of guys somewhere tagging videos as "borderline content," creating data which then feeds into a machine learning algorithm that decides which videos get promoted. All videos are scored by their similarity to borderline content, and become increasingly likely to be promoted in proportion to their graph distance from it. YouTube management gets nice plausible deniability that they don't actively suppress political adversaries, because the whole process is stochastic, and who knows how machine learning works anyway?
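The speculated scheme (human labels define a "borderline" cluster, and promotion odds scale with distance from it) can be sketched in a few lines. Everything below, the feature vectors, the cosine metric, the centroid, is a made-up illustration of that speculation, not anything known about YouTube's actual system:

```python
import math

# Hypothetical illustration only: nothing here reflects YouTube's real
# pipeline, features, or thresholds.

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def promotion_weight(video_vec, borderline_centroid):
    """The farther a video sits from the borderline centroid, the more it
    gets promoted; 'distance' here is just 1 - cosine similarity."""
    return 1.0 - cosine_similarity(video_vec, borderline_centroid)

# Made-up feature vectors for two videos and the "borderline" centroid.
borderline = [1.0, 0.0, 0.0]
video_a = [0.9, 0.1, 0.0]   # looks a lot like borderline content
video_b = [0.0, 0.5, 0.9]   # looks nothing like it

# video_b ends up weighted for promotion far more heavily than video_a.
assert promotion_weight(video_b, borderline) > promotion_weight(video_a, borderline)
```

The point of the sketch is the deniability: no single rule demotes any video, just a continuous score, so any individual outcome can be written off as noise.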
I have personal experience working on recommendations at Google, and you're pretty far off. It was based on increasing engagement, user satisfaction, and quality, and on reducing controversial news coverage of Google (this meant removing the right-wing content that could lead to media outrage, no more, no less). In addition, there were a lot of left-leaning Googlers who had a personal agenda to suppress all right-leaning content.
You are right that it was a room full of guys, but what you may not realize is a majority was Chinese or Indian.
Many very popular religions are supremacist in nature.
And pretending otherwise to allow them to skirt the censors just makes the entire censor regime seem like a hollow power play by a company trying to keep from sinking in a turbulent political time.
It's often a shame that network effects are nearly impossible to overcome. It looks like the feature of "less censorship" will never be enough to mount a challenge to YouTube. But, perhaps posts like these from Google will one day succeed in persuading people to start looking elsewhere for videos.
> It looks like the feature of "less censorship" will never be enough to mount a challenge to YouTube.
s/censorship/moderation/. And no, "we host everything" is a fast way to kill a potential competitor to YouTube. The first and fastest adopters of such a platform will be the people rejected from other platforms, which rapidly turns any "we host everything" platform into a cesspool that nobody else wants to use.
YouTube could wind up with cesspool videos? The horror!! It's a good thing they've now got these new guidelines in place to remove all the garbage videos. I'm sure that all the remaining allowed videos won't hurt anyone's feelings. /s
I'm not advocating for zero censorship. I just think this reads like they're heading down a slippery slope.
You might be right. I'm not sufficiently knowledgeable about YouTube to argue whether or not Google's new censorship policies are quickly getting out of hand. But, someone like Tim Pool seems to be quite well informed on that subject, and he makes a compelling case.
I'm sure the Nazis and Sandy Hook Truthers will be happy to know there are so many other options. Look, we live in a world where, for better or worse, we don't force publishers to publish anything they don't want to. YouTube telling Nazis to go jump in a lake is no different than your local paper declining to run an editorial from one. It just seems different because the scope is so much bigger.
Exactly. You have a freedom to speak. You don't have a freedom to use someone else's forum to gain an audience. No more than I have a right to walk into your house and do whatever I want.
Funny, I seem to recall all sorts of instances where the government has forced people and businesses to cater to those that they don't want to... Hell, Title IX is based on the premise of providing equal access by protected classes.
And for reference, political party is a protected class in DC, and that could extend to other areas. Politicians aren't allowed to block people on Twitter or Facebook, it's only a matter of time before the companies themselves aren't allowed to in many conditions.
There's also the concept of a public space in private ownership to consider. There's a big difference between your living room and a restaurant or park.
None of this is relevant. None of these protected classes are applicable here. Nothing has ever been about compelled speech or compelled publication.
If you really believe where your argument is going, I demand you give me your Twitter account to use, so I can tweet whatever I want. If you don’t do that, you’re censoring me, and I guess I should sue you or something.
I’m not hard to find online. I’ll be waiting for your DM.
At one point, protected classes weren't... then they were... The government cannot restrict your access, but they can force companies to expand it to be more equal.
I'll give you my passwords when you give me yours. Also, I never claimed to be both a publisher and a platform.
Facebook and Twitter should not have the protections of both. Also, you never came up with a coherent argument for why political viewpoint shouldn't be a protected class across the US, as it already is in DC. Just some false narrative associating a single account with a set of massive corporations that hold control over what amounts to public discourse.
If you want to change the status quo, you’re going to have to argue why it should be changed.
If you want a reason why politics is not a protected class, I’ll give you three. First, there’s no widespread persecution based on political beliefs. There simply isn’t. Second, elevating political beliefs to a protected class is tantamount to undercutting all political association everywhere. You can’t argue for anything, as you’re “oppressing” the opposite view. Political speech is strongly protected already. What you’re arguing for is forced association and compelled speech. Third, political beliefs are inherently mutable, which makes them completely unsuited to a protected class, and so nebulous they are impossible to adjudicate.
Far from a false narrative, I’m saying there is no difference between a social network and any other private publication. It’s a private enterprise, and they can use it any way they want. Just because something is big doesn’t suddenly make it public. A newspaper also has a huge audience, but they don’t have to publish every rando’s letter. Nor do these platforms. Enforcing a TOS is well within their rights.
If there weren’t a Gab, or a Venemo, or an 8chan, or a Voat, maybe you’d have an argument. If we were talking about something at a very basic layer, you’d definitely have an argument. But none of that is what we’re talking about. It’s “I have to use Twitter/YouTube/Facebook, because they’re popular!” Well... too bad. It’s no different than the old mock headline, “Man upset that unpopular ideas are unpopular.”
It might see a few of the existing flat earthers harden their views, but I suspect they'll gain fewer followers overall when hidden from mainstream content. What's interesting is that fracturing hateful communities does decrease the spread of their message[0]. I have to think that we could see similar trends with fringe theories, like the earth being flat, or bleach being a cure for autism. That would in turn keep users from falling into the rabbit hole of cesspool-like communities, whether hate-based groups or the more fringe theories (again, flat earthers, anti-vax lunacy, etc.)
This is basically what happened when Reddit banned racist subreddits. Everyone cried
"But my free speech!"
"Won't they just polarize even harder?"
"Won't they just operate in the ShAdOwS?"
Except that didn't happen. The people left for Voat, and then Voat died. Banning harmful communities works time and time again, and it would work for YouTube if they actually did anything. For every 'guideline update' or press release they put out about content moderation, they do basically nothing. The one exception was the sketchy videos with children, but their solution has been pretty heavy-handed and reactionary. I think that's the problem: YouTube has never been proactive in moderating its platform, and it shows.
Even more interesting, from that article I linked about exactly that: the people who stayed stopped being as racist / hateful towards fat people. They didn't drag their garbage ideas along to the new places they started posting (on Reddit). It wouldn't surprise me if some of them even evolved a little and abandoned those views.
I'm proof-reading a book that a friend of mine is writing about how normal right-wing/conservative populism becomes corrupted by fascist/hate stuff, and a big part of it is hate groups actively "normalizing" their content and spreading it via mainstream social media like twitter and youtube. That way, they show up alongside more mainstream political thought. If hate speech, even "polite" hate speech, is consigned to its own dedicated world rather than sharing ours, it doesn't spread the same way.
That's a pretty dramatic assertion - in the field of hateful things, you'd have to be pretty extreme to qualify as "some of the most hateful". What do you have in mind?
Yes I'm being sarcastic and should make more effort to actually communicate. Thanks.
I'm just pissed off by how many people embrace this shit with no notion of the principles that whizzed past while they were having a 2 minute hate against Nazis. Sure they weren't well served and this is just a public embrace of the biases that have already been in place. They were important though.
The guy who runs a printing press should not take pride in refusing jobs for people he doesn't like, should he? Is his service, which has social impact beyond his neighborhood, supposed to be available to all or would it be a bad thing if he decides not to print newsletters for nazis, or blacks, or this or that church because their doctrine is wrong... None of these are new issues and we've thrashed this shit out before.