Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Dealing With Controversial & Sexual Fan Fiction (May 2007)

from the fan-friction dept

Summary: Sexual content can be challenging for content moderation on a number of different levels — especially when it involves fictional content about taboo, controversial, or even illegal activities. Literary fiction around these topics has been controversial throughout history, including books like Vladimir Nabokov’s Lolita, which focuses on a story told (somewhat unreliably) by a middle-aged male English professor who becomes obsessed with a 12-year-old girl.

But while there have been widespread public debates about whether such written works have artistic merit or are obscene, the debate changes when such content is hosted on social media platforms, raising questions about whether it complies with terms of service.

LiveJournal, a very popular blogging platform in the mid-2000s, faced that question in 2007. A religious group called “Warriors for Innocence,” ostensibly set up to track down child abuse online, launched a public campaign accusing LiveJournal (at the time owned by another blogging company, SixApart) of harboring people promoting child sexual abuse. In response, LiveJournal suspended approximately 500 accounts. Many of the suspended accounts, however, hosted fictional writings, including fan fiction set in the Harry Potter universe, as well as a (Spanish-language) LiveJournal that hosted a discussion of Nabokov’s Lolita.

Many LiveJournal users were upset by this, and argued that even if they were writing about taboo sexual content, fiction about criminal behavior is quite different from supporting or engaging in that behavior. As one user put it:

However given that all the material in question is fiction and artwork it seems preposterous to censor these communities. If works of fiction that address illegal or immoral activities are going to be subject to this treatment surely crime thrillers and murder mysteries should be censored just as heavily as erotic material. Part of the reason I use livejournal is because of the freedom it allowed for writers such as myself who deal with difficult and unconventional subject matter. If this purge continues I will be forced to leave livejournal [and] find another outlet for my writing and I am sure I am not the only lj user who feels this way.

Decisions to be made by SixApart:

  • How do you distinguish fictional stories that describe criminal activities from supporting or encouraging such activities?
  • How responsive should a website be to public campaigns that seek to condemn certain content online?
  • Should the company be judging the literary merit of written works?

Questions and policy implications to consider:

  • Where, if anywhere, is the line to be drawn between fictional works depicting abuse, and policies against abuse?
  • Different people may view these works through very different lenses. How do you handle concerns raised by some, as compared to the creative outlet it provides others?

Resolution: SixApart’s CEO later apologized, saying that they screwed up the removal process.

For reasons we are still trying to figure out what was supposed to be a well planned attempt to clean up a few journals that were violating LiveJournal’s policies that protect minors turned into a total mess. I can only say I’m sorry, explain what we did wrong and what we are doing to correct these problems and explain what we were trying to do but messed up so completely.

Many of the suspended journals were put back online after each was manually reviewed by the company. He admitted that they struggled with some content that “used a thin veneer of fictional or academic interest” to actually promote that activity, and the company sought to shut down those accounts.

Another issue we needed to deal with was journals that used a thin veneer of fictional or academic interest in events and storylines that include child rape, pedophilia, and similar themes in order to actually promote these activities. While there are stories, essays, and discussions that include discussion of these issues in an effort to understand and prevent them, others use a pretext to promote these activities. It’s often very hard to tell the difference.

It is also worth noting that approximately six months after this incident, SixApart sold LiveJournal.

Originally posted on the Trust & Safety Foundation website.



Comments on “Content Moderation Case Study: Dealing With Controversial & Sexual Fan Fiction (May 2007)”

31 Comments
Uriel-238 (profile) says:

Ah yes, the lolicon debate

Lolicon is crazy popular in Japan, with a fandom that has infected much of the world, including the US. Generally it’s recognized, like most porn, as fantasy and wish-fulfillment. We’re not sure why Japan likes lolicon so much, but its popularity there gave the rest of the world a chance to see what it thought.

Turns out there’s fandom everywhere, and according to our moral guardians, that’s terrible.

Also, throughout much of the world there’s a bit of confusion between art that represents sexuality, or even celebrates it, and art that encourages sexual misconduct; as such, here in the US the criminality of lolicon (which often depicts child sex abuse) varies from county to county.

Bluer states will have it decriminalized throughout; red states don’t like porn or hentai featuring adults, let alone kids… or fictional para-human species that are pedomorphic (aliens, elves, demons, and spirits, typically). Some states haven’t sorted out a statewide position, and so it’s decided county by county whether Sandman (Volume 2) #14: Collectors is or is not a felony to have or read.

Australia, at the furthest extreme, not only bans Lisa Simpson porn but even pictures of small-chested women, lest an Australian be excited to sexual mania and lose control of his sensibilities.

It’s only going to get worse as we become able to render realistic human beings of any age engaged in all manner of activities. On the other hand, if realistic CGI child porn can be decriminalized, it could kill the industry that actually exploits kids (or reduce it to a very niche market).

Meanwhile, podcasts are now talking about the two movie productions of Nabokov’s Lolita and their emphasis on romance and the illusion of informed, sound-minded consent, which implies there’s more chronophilic interest in our society than we let on.

And don’t get me started about child beauty pageants.

anonttt says:

Re: Ah yes, the lolicon debate

A debate I expect to have for the rest of my life.

My position: it’s fiction, bugger off. (Not attacking you, Uriel-238, but the Anitas, Brittany Ventis, and other Twitter moral warriors who decide to open their traps about it.)

Making it illegal in my region of the world won’t stop me from enjoying it.


Anonymous Coward says:

Re: Re: Re: Re:

There’s no need to. Give it time. Support for the LGBT community was a long time coming, and the argument that being LGBT is natural centered around the Greeks: one of the most enlightened, celebrated cultures, which had a social system of pairing boys with older men as teachers and lovers. This was natural, accepted, and encouraged.

Now as a society we are more diverse than ever before. The LGBT acronym has expanded to include innumerable definitions of sexuality, just as it should. Give it time. Pederasty will come to be seen as natural as it was millennia ago, and conservative values will lose their stranglehold on social norms.

For what it’s worth, I’m not a pederast. Children are a sexually transmitted disease borne of men who can’t see that the ideal partner is someone who is naturally equipped to reach the prostate gland. My friends call such foolish males “breeders.”

Uriel-238 (profile) says:

Re: Re: Re:2 Um, No.

Support for the LGBT+ community is based on the notion that people should have liberty to do what they want so long as it doesn’t harm anyone. That pederasty was embraced by ancient civilizations (or how those civilizations are regarded today) has no bearing on the modern movement of tolerance and recognition that not all people have the same sexual interests.

To be fair, in the Hellenistic states children didn’t even have rights; the first regulation of prostitution was in Rome, requiring a child to be at least six before he or she could be used as a sex worker. That is to say, toddlers were doing sex work before the regulation, and children seven and up would continue to do so. Progress!

Our miserable modern debacle of a civilization is still working out basic matters of consent, because it’s still too easy to think it’s okay to leverage a colleague’s job or reputation for sex, or to take advantage of someone who is inebriated or unconscious.

Industrialized societies are still struggling to figure out what kids should be allowed to do, and with whom. Children seem to want to experiment with their peers, and doing so seems to be a healthy thing — usually — but kids also are not clear on consent, so for now we punt and leave that to parents.

We also know kids respond with trust and credulity to people with seniority and authority, which is one of the reasons the industrialized world has a youth-ministry problem.

We know tweens should learn the risks of sexting each other, but should not be jailed for distributing child porn when a friend passes around a candid shot they gave a sweetheart. Techdirt is full of articles about that very sort of thing happening.

So your dreams of swimming with Little Fishes as did Tiberius Caesar are not likely to manifest in this society, Anonymous Coward, at least not legally.

Rocky says:

Re: Re: Re:3 Um, No.

It’s fascinating to see people cherry-pick and espouse millennia-old practices that are wrong on so many levels, even in jest. What they don’t understand is that if they pretend to be idiots to get their jollies, they will soon find that the company they keep consists only of real idiots.

Anonymous Coward says:

Re: Re: Re:4 Um, No.

It’s similar to other individuals who espouse views of a "glorious past" which wasn’t anywhere near as sunny as they think it was.

If the best defense is that some people (probably in positions of authority over their slaves) in a barbaric and primitive society practiced it, perhaps you don’t have a very good defense for such practices? Rome had gladiators fight lions in the Colosseum for entertainment. Is this humane? Should we bring it back?

The risk of exploitation is too high. And under a certain age, it can cause severe psychological, and possibly physical, damage. These things cannot be overstated. If you do something to a robot, it’s not a sentient and vulnerable being; I couldn’t care less what you do to it, even if Congress would rather denounce that.

Anonymous Coward says:

Re: Ah yes, the lolicon debate

You’re talking about the decision by the classification board in Australia (the "small boob ban")? I don’t know if that is still in effect, but it is a moot effort, as someone can just access mainstream porn sites, which almost certainly contain such content, and I doubt someone would be arrested for a lot of these things (unless they’re looking at lolicon, particularly lolicon of eight-year-olds).

Uriel-238 (profile) says:

Re: Re: Ah yes, the lolicon debate

Lolicon ranges from infants to grown teens. And then there are those who are in the bodies of child or adolescent humans but are really elves or thousand-year-old spirits.

Here in the States, there are plenty of things we Americans could get arrested for (and incarcerated for as long as we imprison murderers). We usually don’t, thanks to prosecutorial discretion, but if someone wants a guy put away (say, because he embarrassed a VIP, or the sheriff wants his land, or because his skin is too dark), then these matters come up.

So yeah, laws that are enforced only sometimes are notorious for creating chilling effects and making undesirables disappear. At least here in the US, and lately Australia has been aspiring to be as ugly and cruel as the States.

Anonymous Coward says:

Re: Ah yes, the lolicon debate

Realistic CGI child porn (which is indistinguishable from reality) would have to be decriminalized at the federal level, and there is Supreme Court precedent blocking that.

It would make it really hard to find and remove child porn (and arrest anyone for it), as any piece of apparent "child porn" could really be a simulation someone made on their laptop when they were bored, which would almost certainly sink such an effort. I feel the government is more of the persuasion to eradicate child porn no matter what, regardless of the cost.

Uriel-238 (profile) says:

Re: Re: Is it real or CGI

This became an issue kick-started by Ashcroft’s eagerness to curb pornography and indecent material (leading to Eric Idle’s second song about radio censorship†). Famously, Ashcroft required the breast of Iustitia to be covered for press briefings.

For a short time, thanks to many efforts to censor porn, the courts decided the US had no cause to criminalize CGI-rendered porn, even if it would be illegal were it to involve models (child, snuff, rape, revenge, etc.). This created a market of processed porn, in which photographs of children would then be artified — rendered to look drawn or created with graphics software. Rather than creating narrower categories of acceptable and unacceptable media, the courts just returned CGI porn to being regarded like photographed porn.

So yeah, it’s unlikely CGI will be revisited anytime in the near future, though in the 2010s law enforcement started using rendered little girls to lure online child-porn enthusiasts into sting traps.

† Mr. Idle added part two in 2018!


ECA (profile) says:

Who doesn’t love the debate of who is correct about MORALS?

Even the religion we declare we love so much has so many different ideas in it that HOW in Hell do 1-2 groups out of 40+ decide WHO/WHAT is right or wrong?
It’s strange that no one knows what fruit was eaten in Eden, but we can declare it to be anything we wish. At least some can.
But they will never talk about BEFORE that fruit, when there were NO RULES, except one.
And the big debate of what to name the tree it came from.
Tree of Knowledge?
Forbidden Knowledge?
Knowledge of Good and Evil?
I’ve heard 3-4 different names, and there’s always something a bit off about the name.
Before that point there was NO GUILT.

Now, understanding that, think about all the strange things MAN has done to MAN over time. And that’s a long history of "who is RIGHT?" and who do you enslave.

Uriel-238 (profile) says:

Morality and religion

Not long ago the internet was buzzing about trolley problems, and I added my take and came to some conclusions informed partly by how our religious communities (who are all very big on creeds, or deontological ethics) were still able to take a very pragmatic approach to the 2016 and 2020 elections and suspend their judgement of the miscreant they voted for. (White Protestant Evangelicals and Catholics both voted 80% for renowned adulterer and racketeer Donald Trump even though Clinton was significantly less creepy.)

My conclusions were (TL;DR):
~ People don’t adhere to codes of morality, whether they are religious or ethicists or not. They do what they feel.
~ But that isn’t particularly horrible. We tend to have high respect and regard for our immediate community and only transgress out of desperation. Also, we’re freaked out by weirdos and nations of millions. And…
~ This informs why we tend to stand by our cohorts in the Prisoner’s Dilemma even when we don’t trust our buddy, or even sacrifice ourselves so our mates might live. (Incidentally, other social animals do as well.)

Anonymous Coward says:

Re: Morality and religion

People have priorities. Some things are weighed heavier than others.

He pushes policies no other president would be able to push, and he has a certain charisma that jibes with his base. That charisma is his confidence: no matter what he does, he has the utmost confidence behind it, and that is contagious.

So long as he does what they want, the oddities and idiocy can be overlooked. And by playing the fool, he plays the role of someone who can slam through terrible policy but whom no one wants to argue with, because you are literally arguing with a fool.

Anonymous Coward says:

Good question! I believe there are multiple parties at play here.

Pedophilia could be considered a sexual disorder, or even a sexual orientation (this doesn’t mean it is a good thing, or that it should be acted upon), in the sense that it’s an innate trait, it’s stable over time, and it contains elements of romantic attraction rather than merely being a sexual reaction to a particular stimulus.

There are variants of it too, but this is a basic model to think about. The clinical criteria go as far as requiring someone’s attraction to children to overpower their attraction to adults, or for it to have a significant impact on their lives which they otherwise cannot navigate around. Some have absolutely zero attraction to adults, and sometimes even actual revulsion.

I don’t believe many people who consume this content meet these specific criteria, but perhaps there are some who do. And there are some teenagers who may have fantasies about engaging in activities with peers, who most certainly would not be considered pedophilic. Whether they should be uploading or consuming such fantasies is another debate altogether, involving factors like sex education.

I don’t believe people are convinced into committing terrible acts by consuming fictional content. Rather, people have largely been socialized by society to see pedophilic activities as bad. I am sure you too hold this opinion, no?

Most people are so socialized that any depiction of "happy consensual relationships" in a piece of fiction would be seen as the fantasy that it is. As for depictions of rape or clearly non-consensual themes, they wouldn’t, by their mere existence, convince someone that suffering is good and that they should go out and inflict it.

Even in the case of pedophiles: they have an attraction to children, therefore they look at content involving children. It isn’t the content that makes them like that, nor the demand. Pointlessly depriving them of the content may even serve to frustrate them and further deepen their adversarial relationship with society, which could lead to an increased number of clashes.

https://www.sciencedaily.com/releases/2010/11/101130111326.htm
Some research even suggests that certain material (perhaps artificial) might serve to reduce crime, although this sort of research is highly stigmatized, and very few are willing to engage in it.

So, what does convince someone to commit a crime, then? If we apply Maslow’s Hierarchy of Needs, human beings have a certain number of needs which they must meet to reach fulfillment. A pedophile obviously has unmet sexual needs. They may also find rationalizations for their behavior.

Fictional media (if it were sufficient for their needs) would consume time they could otherwise be spending with actual children, burn the energy they would need to do so, and give them a minuscule but certain level of fulfillment in their lives. Producing such content might even give their lives some semblance of purpose and meaning which might otherwise be spent pursuing anti-social goals. This is not an uncommon idea, and a report written in Europe, which I currently cannot locate, has theorized as much.

https://www.theatlantic.com/health/archive/2016/01/can-child-dolls-keep-pedophiles-from-offending/423324/
There are also some reports that other aids, like child sex dolls, can help to prevent pedophiles from committing crimes. Dr. Cantor’s paper on non-offending pedophiles also makes note of this.

Rationalizations that lead to crime could come in the form of NAMBLA-like rhetoric, which tries to downplay the damage that sexual activity does to children and may even try to spin it as a "positive thing." However, this is not a piece of fiction doing the promoting, but a very specific piece of advocacy from a specific group of people, who may employ the power of persuasion and cite largely discredited research. Others of a more anti-social persuasion may be even more utilitarian, and lack the need for a "belief" to make themselves feel better about their actions.

In this way, I believe there is a very big difference between advocacy from an organization and a specific piece of fiction that allegedly promotes a specific activity. I truly hope that interactions with NAMBLA-like individuals do not color your thoughts on this matter.

Frankly speaking, NAMBLA-like ideology could even be seen as a sort of religion for desperate people facing a societal dead-end.

https://www.vice.com/en/article/59nbnq/first-man-sentenced-to-chemical-castration-in-indonesia-said-he-would-rather-die
Of course, there are things which are purported to be solutions, but they involve a great degree of suffering, and due to the coercive nature of society pressing down on them, they may even lead to resentment, which could produce worse conditions like misopedia.

Regardless, it is simply not right to deny people proper due process and freedoms here. Just because someone bears a similarity to an offender (even in a superficial way, for those who aren’t really pedophiles) doesn’t mean they should have to suffer for the acts of someone else.

Sometimes, individuals who do commit such acts also have other mental disorders which could have been treated at a far earlier time, but weren’t, due to flaws in the mental healthcare system.

Many pedophiles, like most people really, probably realize that their actions would have a terrible impact on children and avoid engaging in sexual activity with them. Unfortunately, some of the loudest people also tend to be the least scrupulous, or the most deluded.

Christenson says:

The money quote -- extending Masnick's Impossibility Theorem

SixApart’s CEO:
While there are stories, essays, and discussions that include discussion of these issues in an effort to understand and prevent them, others use a pretext to promote these activities. It’s often very hard to tell the difference.

Not only can a few words of context change the intent of a piece of content, but the audience has a mind of its own and can react exactly opposite to the intent, or in any other direction for that matter.

As an example, we are so tied up in knots about pornography that I don’t think we have solid data on its actual effects on people’s behavior — it’s really unclear which way causation flows, except that the ability to sell or otherwise benefit seems to cause people to make content.

I’m wondering if the ethical answer to child porn, for example, isn’t to flood the market so there’s little external incentive to make it, and those that need it act on the fantasy rather than on real children. Is that what is happening with lolicon in Japan?

Anonymous Coward says:

This looks like a classic response to outside pressure. When someone says you’re not doing enough, you overreact to make it look like you’re "doing something," and smash any account in your line of fire.

Ironically, the accounts most likely to be in the line of fire are the ones least likely to be truly problematic to begin with. This, too, may be an example of "moderation at scale is hard." Tumblr nuked all the porn on its platform to try to get at a few pieces of child porn.

Even if we agree that action was necessary because child porn is an absolute evil (which it is, although we can’t shut down society and free expression over it; it will always exist, and we have to accept that on some level), a simple nudity filter trained on real human beings would have done the same job, and many furry/hentai artists would not have been affected.

But no one ever thinks of the collateral damage from their policies, or whether they even make sense.
