Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, aiming neither to criticize nor to applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs their decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: YouTube Relocates Video Accused Of Inflated Views (2014)

from the movin'-videos dept

Summary: The internet is how many new musical artists get discovered these days, with perhaps the most famous story being that of Justin Bieber on YouTube. Some of this discovery came from finding unknown musicians with real talent, and some came from finding otherwise unsigned artists who had managed to build large followings on their own.

Of course, this latter situation also opened up the possibility of gaming the system to appear more popular than you actually are. Partly in response to this (and more likely to prevent view-gaming aimed at capturing advertising revenue), YouTube put in place a policy of removing videos that appeared to use automated systems to inflate their view counts.

An independent musician by the name of Darnaa sought to build a following via YouTube, and engaged in a marketing campaign designed to drive traffic and popularity to her videos. In 2012 she uploaded a video that received nearly 1.9 million views according to YouTube's counter. In 2013, another video received over 1.1 million views. In 2014, she uploaded a new video, for a song entitled "Cowgirl," which started receiving views as well. Darnaa claimed that these came from a coordinated marketing campaign that cost her hundreds of thousands of dollars.

YouTube, however, believed that the views on the video were inflated through artificial means, violating its terms of service. Rather than removing the video, or shutting down Darnaa's videos altogether, the service simply moved the video to a new URL, resetting the counter (and breaking earlier links to the video). Darnaa sent an email complaining about this, and convinced her marketing partners to restart the marketing campaign, leading YouTube to relocate the video a second time, which again reset the view counter.
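To make the mechanics of this unusual remedy concrete, here is a minimal sketch of "relocate instead of remove." Everything in it is invented for illustration: the Video type, the in-memory store, and the relocate() function are hypothetical and do not reflect YouTube's actual internals.

```python
# A minimal sketch of "relocate instead of remove" as a moderation remedy.
# The Video type, in-memory store, and relocate() function are invented for
# illustration; nothing here reflects YouTube's actual internals.
import secrets
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    content_ref: str   # pointer to the media itself, unchanged by relocation
    view_count: int = 0

videos: dict[str, Video] = {}

def relocate(video_id: str) -> str:
    """Move a video to a fresh ID: the old URL dies, the counter restarts."""
    old = videos.pop(video_id)         # existing links now lead nowhere
    new_id = secrets.token_urlsafe(8)  # fresh identifier -> new URL
    videos[new_id] = Video(new_id, old.content_ref, view_count=0)
    return new_id

videos["abc123"] = Video("abc123", "media/cowgirl.mp4", view_count=1_100_000)
new_id = relocate("abc123")
print("abc123" in videos, videos[new_id].view_count)  # False 0
```

The design choice is visible in the sketch: the content itself survives, but every accumulated signal tied to the old URL (the view counter, inbound links, embeds) is discarded, making relocation a penalty that falls somewhere between doing nothing and outright removal.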

Darnaa's music label, the conveniently named Darnaa, LLC, then sued YouTube, arguing that relocating the video was both a breach of contract and interference with her business dealings.

Decisions to be made by YouTube:

  • How should a service like YouTube determine which videos are getting legitimate traffic and which are getting traffic generated through artificial means, such as bots?
  • Is it possible to distinguish a heavy marketing campaign that drives real traffic to a video from methods that generate views artificially? (A toy detection heuristic is sketched below.)
  • In which cases should a video that has received artificial views be moved to a different location (cutting off old links and restarting its counter) rather than being removed entirely?
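The first two questions are the crux of the case. As a purely hypothetical sketch of how such a check might work, a naive heuristic could look at how concentrated views are by source and how quickly viewers abandon the video. The ViewEvent structure, the two signals, and the thresholds below are all invented for illustration; they are not YouTube's actual method.

```python
# A toy view-inflation heuristic. Everything here is hypothetical: the
# ViewEvent fields, the two signals, and the thresholds are invented for
# illustration and do not describe YouTube's real detection systems.
from collections import Counter
from dataclasses import dataclass

@dataclass
class ViewEvent:
    ip: str             # source address of the view (assumed to be logged)
    watch_seconds: int  # how long the viewer actually watched

def looks_inflated(events: list[ViewEvent],
                   max_share_per_ip: float = 0.05,
                   min_watch_seconds: int = 10,
                   max_short_watch_share: float = 0.8) -> bool:
    """Flag a video whose view pattern resembles automated inflation.

    Signal 1: one IP accounts for an outsized share of all views.
    Signal 2: the overwhelming majority of views are abandoned almost
    immediately. A real system would weigh far more signals (ASNs, device
    fingerprints, referrers, temporal bursts) than this toy version.
    """
    if not events:
        return False
    total = len(events)
    # Signal 1: concentration of views by source IP.
    _, top_count = Counter(e.ip for e in events).most_common(1)[0]
    if top_count / total > max_share_per_ip:
        return True
    # Signal 2: share of near-instant abandons.
    short = sum(1 for e in events if e.watch_seconds < min_watch_seconds)
    return short / total > max_short_watch_share

# 1,000 "views" from a single address, each abandoned after two seconds.
bot_views = [ViewEvent(ip="203.0.113.7", watch_seconds=2) for _ in range(1000)]
print(looks_inflated(bot_views))  # True
```

Note that the second signal cuts both ways: paid placements from a legitimate campaign can also produce click-throughs that bounce within seconds, which is precisely why a detector along these lines risks misclassifying a marketing push like Darnaa's.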

Questions and policy implications to consider:

  • Will fighting back against artificially inflated views lead to false accusations?
  • Could actions designed to stop artificial view inflation impact a legitimate marketing campaign?
  • Should musicians and labels rely heavily on metrics like "views" to gauge an artist's actual popularity when those numbers can be manipulated?

Resolution: After many twists and turns, the lawsuit Darnaa, LLC filed against Google was dismissed at both the district court and the appeals court, though much of the dismissal was due to the case being filed after the statute of limitations had passed. However, the court also rejected the parts of the case that survived the statute of limitations questions, noting that YouTube was effectively entitled to manage its service as it saw fit, including how it treated Darnaa's videos.

Originally posted to the Trust & Safety Foundation website.

Companies: youtube


Comments on “Content Moderation Case Study: YouTube Relocates Video Accused Of Inflated Views (2014)”

2 Comments
Anonymous Coward says:

Yes, Goog/YT could have done a better job (assuming the claims are accurate).

On the other hand…
If you literally spent hundreds of thousands of dollars on marketing to get your videos noticed (seriously wtaf for 3 songs?), you probably could have afforded to fix the broken links. And if you are remotely popular (it doesn’t take much, even in a small niche), people will bloody well search for your content anyway, particularly since most people don’t know how to enter a URL or save a bookmark.

I mean, she may have a valid complaint, but the fix is stupidly simple. Instead she paid money for a really stupid lawsuit. But maybe that is just more marketing money "well-spent".

