Friday, 9 September 2011

How Flagging Works

Everyone knows that the flagging system on YouTube is broken.  This is in no way news, and I'm fairly certain that no one who has used the site for any significant amount of time believes YouTube when they say "when a video gets flagged as inappropriate, we review the video to determine whether it violates our Terms of Use—flagged videos are not automatically taken down by the system."  I'm sure it's true sometimes, but it's clearly not the standard.  In fact, despite their claim (true or not) that they review flagged videos 24/7, I don't think it's possible for a real, live person to review every flagged video and make a thoughtful decision as to whether a Community Guideline has been violated.  Just the task of consciously reviewing and removing videos with explicit sexual content and nudity would keep a sizeable staff busy 24/7.  How long do you suppose it would take for a staff member to intelligently determine whether a video's content constitutes "bullying"?  I'm sure it rarely happens, if ever.

This reality is frustrating enough on its own, but what's worse is YouTube's refusal to tell you why, exactly, a video was supposedly determined to be "inappropriate."  Not only does this enable the site to remove videos completely arbitrarily, but, assuming some person in some office had a real reason for removing a video, it denies the user information that would help her predict what might lead to an issue in the future.  It's an absurd policy, and I assume it's only in place to prevent people from pestering them with informed and reasonable appeals against their unfair actions.

But at least when videos are removed, you know what was in them.  Unfortunately, your account can receive strikes when text comments are flagged and removed, without your ever being told what you wrote (or when, or where) that was so "inappropriate."  A user can't reasonably be expected to adjust his behaviour based on a punishment for an undefined offence.  The process is stupid and not logically connected to any goal of promoting understanding of, and compliance with, the site's standards - probably because the real goal is simply to cover their own asses.

Of course, a toxic cocktail results when underhanded and vindictive users choose to take advantage of the broken system.  It has been demonstrated time and again that if someone is willing to take the time to pepper a particular person's content with multiple flags, eventually some of them will stick (as long as the target is not a partner with hundreds of thousands of subscribers).  A feature that should function only to keep users safe and alert administrators to porn, gore, and violence or its incitement has become a tool of censorship and malice in the hands of many assholes.

I recently had seven videos and an unknown number of text comments removed all at once due to flagging, undoubtedly by the user Brett Keane.  He all but admitted it in a subsequent video.  But this isn't meant to be a woe-is-me piece.  Innumerable YouTubers have had to deal with this.  However, the way this occurred helps illustrate how fucked up the flagging system really is.  Videos are often removed in clusters, which, as I tried to convey to the YouTube support team, should be a red flag indicating not that they contain Community Guidelines violations, but that a particular user or group of users wants to screw someone over.  This should be especially apparent when some of the videos in question are years old.  If YouTube had any interest in a fair process, they'd take a close look at these instances.  I e-mailed YouTube support to request that they re-evaluate their "decision" to remove all of these videos.  In reply I received a cardboard message stating that they had "reviewed the videos in question and have decided to uphold their removal decision," whatever that means.  Did they take a second look, as per my request, or did they review the videos initially and, by the way, are upholding that decision?  I seriously doubt either would be the whole truth.

In the pink slip I received telling me my text comments had been flagged and removed, the reason given was that they were "identified as harassment."  Harassment is one of those unfortunate concepts that people think can be infinitely moulded and adapted to fit their own definitions.  The term is often applied to any communication that people find distasteful or not nice.  But harassment is a behaviour.  To harass someone on YouTube, you have to follow him to every page he posts on after being told to leave him alone, or create multiple accounts to post on his content or page after being blocked.  Nothing I've done could be considered harassment by anyone who understands what harassment is.  What does the YouTube machine consider harassment?  I don't know, because I don't know which of my comments were slapped with that label and removed.

I sent YouTube a second e-mail asking to see what these comments said and why the videos were removed, but I don't expect a reply.  Trying to communicate with them is usually like talking to a brick wall.  The other action I took was to remove my remaining videos regarding Keane.  I don't plan on making any more.  I've contributed my share, and I'm no longer willing to tussle with someone who has no principles, no ethics, and who will do whatever it takes to shut me up or shut me down.  The fact is, as long as the system remains broken, flagging works.