Dangerous Ruling On DMCA Safe Harbors May Backfire On Hollywood

from the be-careful-what-you-wish-for,-mpaa dept

Late last week an important, but disappointing, ruling came down from the 9th Circuit appeals court. The ruling in the case of Mavrix Photographs v. LiveJournal found that volunteer moderators could be deemed agents of a platform, and thus it’s possible that red flag knowledge of infringement by one of those volunteer moderators could lead to a platform losing its safe harbors. There are a lot of caveats there, and the ruling itself covers a lot of ground, so it’s important to dig in.

The case specifically involved a site hosted on LiveJournal called “Oh No They Didn’t” (ONTD), which covers celebrity news. Users submit various celebrity stories, and ONTD has a bunch of volunteer moderators who determine what gets posted and what does not. Some of the images that were posted were taken by a paparazzi outfit named Mavrix. Rather than send DMCA takedowns, Mavrix went straight to court and sued LiveJournal. LiveJournal claimed that, as the service provider, it was protected by the DMCA safe harbors, and the lower court agreed. This ruling sends the case back to the lower court, saying that its analysis of whether or not the volunteer moderators were “agents” of LiveJournal was incomplete, and suggests it try again.

There are a number of “tricky” issues involved in this case, starting with this: because ONTD became massively popular, LiveJournal itself got a bit more involved with ONTD, which may eventually prove to be its undoing. From the court’s decision:

When ONTD was created, like other LiveJournal communities, it was operated exclusively by volunteer moderators. LiveJournal was not involved in the day-to-day operation of the site. ONTD, however, grew in popularity to 52 million page views per month in 2010 and attracted LiveJournal’s attention. By a significant margin, ONTD is LiveJournal’s most popular community and is the only community with a “household name.” In 2010, LiveJournal sought to exercise more control over ONTD so that it could generate advertising revenue from the popular community. LiveJournal hired a then active moderator, Brendan Delzer, to serve as the community’s full time “primary leader.” By hiring Delzer, LiveJournal intended to “take over” ONTD, grow the site, and run ads on it.

As the “primary leader,” Delzer instructs ONTD moderators on the content they should approve and selects and removes moderators on the basis of their performance. Delzer also continues to perform moderator work, reviewing and approving posts alongside the other moderators whom he oversees. While Delzer is paid and expected to work full time, the other moderators are “free to leave and go and volunteer their time in any way they see fit.” In his deposition, Mark Ferrell, the General Manager of LiveJournal’s U.S. office, explained that Delzer “acts in some capacities as a sort of head maintainer” and serves in an “elevated status” to the other moderators. Delzer, on the other hand, testified at his deposition that he does not serve as head moderator and that ONTD has no “primary leader.”

It’s this oversight by a paid employee of LiveJournal that makes things a bit sticky. The question is whether or not this oversight and control went so far that the volunteer moderators could also be seen as “agents” of LiveJournal, rather than independent users of the platform.

Evidence presented by Mavrix shows that LiveJournal maintains significant control over ONTD and its moderators. Delzer gives the moderators substantive supervision and selects and removes moderators on the basis of their performance, thus demonstrating control. Delzer also exercises control over the moderators’ work schedule. For example, he added a moderator from Europe so that there would be a moderator who could work while other moderators slept. Further demonstrating LiveJournal’s control over the moderators, the moderators’ screening criteria derive from rules ratified by LiveJournal.

The court doesn’t fully answer the question, but sends it back to the lower court, saying there is a “genuine issue of material fact” that should be explored to determine whether LiveJournal was responsible, and thus would lose its safe harbors. The specific fact pattern and details here may mean that this ruling doesn’t turn out to be a huge problem for safe harbors in the long run, but at least a few statements in the ruling are… concerning. For example:

… LiveJournal relies on moderators as an integral part of its screening and posting business model.

But… lots of sites rely on independent and volunteer moderators as a part of their business model. That alone shouldn’t determine whether or not a volunteer is truly an agent of the company.

A larger issue may be this: even if a moderator is deemed to be an “agent” of a platform, moderators are rarely experts in copyright, and it would be ridiculous to argue that their failure to stop infringement makes an entire company liable. That would doom many websites that rely on volunteer help. If one volunteer were to mess up and misread the vast nuances of copyright law, the liabilities for the platform could be immense. As Parker Higgins notes, the expectation here is unbalanced in a ridiculous way, especially as this very same court doesn’t seem to think that the sender of a DMCA takedown should take as much responsibility for its actions:

Still, even if the moderator draws a paycheck from the platform, it seems unreasonable to expect them to approach thorny copyright questions with the nuance of a trained professional. That is especially true when you compare this ruling with the Ninth Circuit’s most recent opinion in Lenz v. Universal, the “dancing baby” case, which looks down the other end of the copyright gun at takedown notice senders. Notice senders must consider fair use, but only so far as to form a “subjective good faith belief” about it. If courts don’t require the people sending a takedown notice to form an objectively reasonable interpretation of the law, why should they impose a higher standard on the moderators at platforms handling staggering quantities of user uploads?

But if moderators are a platform’s “agents,” then it runs into trouble if they have actual or “red flag” knowledge of infringements. The Ninth Circuit has instructed the lower court to find out whether the moderators had either. Noting the watermarks on some of the copyrighted images in the case, the court phrased the question of “red flag” knowledge as whether “it would be objectively obvious to a reasonable person that material bearing a generic watermark or a watermark referring to a service provider’s website was infringing.” That’s an important point to watch. Copyright ownership and licensing can be extremely complex, so oversimplifying it to the idea that the presence of a watermark means any use is infringing would have profound negative consequences.

And this is why this ruling may backfire for Hollywood — even as it pushed the court to rule this way. As EFF notes, at the very time that the MPAA is demanding that platforms do more to moderate content, the implications of this ruling may force them to do much less moderation:

The fact that moderators reviewed those submissions shouldn’t change the analysis. The DMCA does not forbid service providers from using moderators. Indeed, as we explained in the amicus brief (PDF) we filed with CCIA and several library associations, many online services have employees (or volunteers) who review content posted on their services, to determine (for example) whether the content violates community guidelines or terms of service. Others lack the technical or human resources to do so. Access to DMCA protections does not and should not turn on this choice.

The irony here is that copyright owners are constantly pressuring service providers to monitor and moderate the content on their services more actively. This decision just gave them a powerful incentive to refuse.

There are a few other issues in this case that are also potentially problematic. As Annemarie Bridy points out over at Stanford’s Center for Internet & Society, the court seems to totally mess up the analysis of the DMCA’s safe harbors by confusing Section 512(a) (which applies to network providers) with Section 512(c) (which applies to online service providers):

According to the court, the section 512(a) safe harbor covers users’ submission of material to providers, and section 512(c) covers the providers’ subsequent posting of that material to their sites. There is no such submission-posting distinction in section 512. On the face of the statute and in the legislative history, it’s quite clear that section 512(a) is meant to cover user-initiated, end-to-end routing of information across a provider’s network. A residential broadband access provider is the paradigmatic section 512(a) provider. Section 512(c) covers hosting providers like LiveJournal that receive, store, and provide public access to stored user-generated content. To characterize LiveJournal as a hybrid 512(a)/512(c) provider misapplies the statute and introduces into the case law a wrongheaded distinction between submitting and posting material.

Putting aside the peculiar submission-posting dyad, the dispositive question concerning LiveJournal’s eligibility for the section 512(c) safe harbor is whether the site’s moderator-curated, user-submitted posts occur “at the direction of users,” taking into consideration the nature of moderators’ review and the fact that only about one-third of user submissions are ultimately posted. That question can be answered entirely within the ambit of section 512(c) and the existing case law interpreting it, including the Ninth Circuit’s own decision in Shelter Capital. There was simply no need for the court to invoke section 512(a) in this case.

The court’s analysis here is… just weird. It’s on page 13 of the ruling, and it really does seem to take a totally uncharted path in arguing that the submission of content is covered by 512(a) while the posting is covered by 512(c). But… that’s wrong:

The district court focused on the users’ submission of infringing photographs to LiveJournal rather than LiveJournal’s screening and public posting of the photographs. A different safe harbor, § 512(a), protects service providers from liability for the passive role they play when users submit infringing material to them…. The § 512(c) safe harbor, however, focuses on the service provider’s role in publicly posting infringing material on its site.

Among the other issues in this case is the question of whether or not the anonymous volunteer moderators should be identified. As we’ve discussed in the past, because the First Amendment also protects anonymity, any move to reveal an anonymous commenter must be carefully weighed against their First Amendment right to anonymity. The court here more or less brushes off this issue, saying that once the lower court determines the level of agency, that will answer the question of preserving anonymity:

Notwithstanding the deferential standard of review and complex issues of law that govern this discovery ruling, we vacate the district court’s order denying the motion and remand for further consideration. Whether the moderators are agents should inform the district court’s analysis of whether Mavrix’s need for discovery outweighs the moderators’ interest in anonymous internet speech. Given the importance of the agency analysis to the ultimate outcome of the case, and the importance of discovering the moderators’ roles to that agency analysis, the district court should also consider alternative means by which Mavrix could formally notify or serve the moderators with process requesting that they appear for their deposition at a date and time certain.

This is yet another important case in determining how online platforms can actually function today — and rulings that undermine safe harbors like the DMCA frequently seem to be what Hollywood wants — but again, this may backfire. Making it harder for these sites to function if they’re actively involved in moderation only means they’ll do much less of it.

Companies: livejournal, mavrix

Comments on “Dangerous Ruling On DMCA Safe Harbors May Backfire On Hollywood”

Anonymous Coward says:

This is why sites are basically forced to adhere to one of two standards: either A) diligently monitor all posted content, or B) take a completely hands-off approach. Anything in between these two extremes is poison, as it will draw the accusation of selective enforcement.

It’s worth remembering that copyright plaintiffs always point out that if the user-upload site they’re suing can instruct its staff to actively and judiciously look out for child porn, and yet at the same time turn a blind eye to “obvious” copyrighted content, then that’s clear proof of willful ignorance — or worse.

Anonymous Coward says:

Re: Re:

Yep. Hence why, when you see crap on the internet, you can blame the MAFIAA.

A lot of bad behavior gets permitted and overlooked online, simply because a website has the resources to monitor its users’ behavior, but nowhere near the resources required to maintain the copyright regime’s iron grip. As this story shows, if a site tries to monitor its users, all of a sudden they are drafted as copyright cops who can be sued for not doing “their” job.

Remember that when dealing with trolls and the like. They are 100% backed and supported by the MAFIAA, who would rather make the internet a hellhole than lose even a single penny to “copyright infringement”.

That One Guy (profile) says:

Moderators? You think we're crazy enough to have those?

This strikes me as a case where it would almost be worth it for Hollywood to ‘win’. If moderation means liability, then sites won’t have moderators; they’ll just let people post at will and keep an entirely hands-off approach, lest they open themselves up to a potential world of hurt (bringing things right back to pre-safe-harbor times, when sites didn’t dare moderate either, for the same reason).

Where before a site might have moderators trying to keep the more obvious cases of infringement down, now none of it will be removed until the site receives a DMCA claim regarding it, leading to vastly more potential infringement being posted and staying up longer than before.

If I thought that Hollywood was capable of long-term planning, I might actually suspect that they want something like this to point to as an example of why they need even harsher copyright law. But as it stands, I’m pretty sure this is just another case of them being so shortsighted and eager to ‘shoot’ those dastardly pirates that they don’t realize their foot is also in the line of fire.

PaulT (profile) says:

“But… lots of sites rely on independent and volunteer moderators as a part of their business model.”

The conspiracy theorist in me thinks this might be part of the plan. Force all media-related sites to act as large businesses rather than the largely fan-run, independent majority that’s always existed. Then, with net neutrality abolished, they can throttle traffic to all but those who pay the required ransom. The **AAs regain complete control of their industry, and they laugh all the way to the bank until stagnation and poor-quality offerings with no competition eventually strangle the industry (which they won’t care about, since that’s long after the current crop have cashed in).

I don’t necessarily think that this is what they’re trying, but it’s something that crosses my mind occasionally.

Anonymous Coward says:

Re: Re: Re:

Pretty much. Having to hire trained staff, “pros” so to speak, is a very noble idea in theory. In practice, with the amount of content involved, it would lead to huge expenses or deadlocks.
Both of those outcomes mean less competition and an internet that’s closer to TV (think “America’s Funniest”).
Which is, incidentally, a wet dream of the **AAs, just like PaulT said.

For every starving artist affected by piracy, there are at least two people being censored by “rightsholders” with the help of copyright.
