Dumb Question Of The Day: Should Google Try To Prevent Terrorism?

from the well,-of-course dept

I have to admit, I was pretty dumbfounded when I saw the title of this recent Wired blog post:

Should Google Try to Prevent Terrorism?

I mean, who’s going to say no to that? Of course Google should try to prevent terrorism. Everyone should try to prevent terrorism if they can. So I was curious what the article was actually about, and why it would even pose the question… And it’s not about Google preventing terrorism at all. It’s about the misguided notion that Google should block any videos from those claiming to be part of terrorist groups, which is a totally different thing:

Jihadists have flocked to YouTube to spread their propaganda. One of those clips, released last week, appeared to take credit for the Times Square bombing attempt — before Faisal Shahzad tried to ignite his SUV. The video may have been a vital clue for investigators. But does YouTube and its corporate parent, Google, have an obligation to block these videos before they’re seen?

That’s what one long-time monitor of online jihadists is arguing. “If a certain percentage of Islamist sympathizers are radicalized, in part, online, then it stands to reason that more eyeballs that are exposed to violent Islamist propaganda would eventually translate into more would-be terrorists,” writes “Rusty Shackleford,” the pseudonymous patron of The Jawa Report. “Which is why even though YouTube has been a boon in helping law enforcement agents detect, post hoc, would-be terrorists it has been a bane in that far more Muslims today can easily access violent Islamist propaganda.”

Of course, this isn’t even a new issue. Two years ago, Senator Joe Lieberman grandstanded on the issue, and eventually got YouTube/Google to agree to ban such videos if they “advertise” terrorism or “extremist causes.”

The whole thing seemed ridiculous at the time. As the guy above even admits, these videos are helpful to law enforcement. The idea that people are becoming radicalized because they watch a YouTube video seems pretty unlikely in most cases anyway. These videos are preaching to the choir, not converting kids. Besides, blocking these videos only gives the folks behind them more of a martyr feeling about how people are trying to hold them down and don’t want to hear what they’re saying. The idea that blocking these videos is “preventing terrorism” seems quite unlikely. But using the videos to actually monitor terrorists and help law enforcement seems like a much more important and useful task.

Companies: google


Comments on “Dumb Question Of The Day: Should Google Try To Prevent Terrorism?”

26 Comments
Dark Helmet (profile) says:

Re: Re: Re:2 Re:

“But if it were more underground the general public would have a much harder time coming across crack at the cheap price it is now.”

Okay, but which do you think is more affected? The person who wants crack and is determined to find it, or the policeman who kinda sorta wants to stop the crack dealer because it’s his/her job? Very few police have the kind of voracious dedication to finding crack dealers that crack addicts have.

More to the point, those crack dealers you see on the streets? There’s a reason they’re out there. In many cases it’s because they have a relationship with law enforcement that protects them. By this I mean that they snitch on others who commit larger crimes. Police need an “in” into the criminal world. It’s only a guess, but I’d bet that no class of criminal has helped law enforcement solve more crimes than drug dealers….

Chronno S. Trigger (profile) says:

Hell No

“But does YouTube and its corporate parent, Google, have an obligation to block these videos before they’re seen?”

No, for two reasons.

1) “The video may have been a vital clue for investigators.”

2) If they believe that the only way to prevent the spread of this ideology is to hide it, they need to rethink the solution.

Ryan says:

The idea that people are becoming radicalized because they watch a YouTube video seems pretty unlikely in most cases anyway. These videos are preaching to the choir, not converting kids. Besides, blocking these videos only gives the folks behind them more of a martyr feeling about how people are trying to hold them down and don’t want to hear what they’re saying. The idea that blocking these videos is “preventing terrorism” seems quite unlikely.

I presume this is pure speculation that you just pulled out of your ass? I’m inclined to believe you’re right, but that’s a piss-poor way of determining policy.

More significantly…so what if the videos do aid certain aspects of terrorism? YouTube is a tool that can be utilized for any number of purposes, good or bad, including terrorism. Why is it on Google or anyone else to define terrorist activity (one man’s terrorist is another man’s freedom fighter – should Google have relentlessly pulled down the videos of protesters in Iran?), to monitor for expressions of that activity (what expressions are worthy of being pulled? can a philosophical discussion about the merits of various quasi-terrorist acts be construed as potentially convincing stupid people to join Al Qaeda, and thus promoting terrorism?), and then to act as enforcers by removing them? That’s censorship and really no different than Apple arbitrarily rejecting apps from the store. If Google wants to do that, fine. But it shouldn’t be expected of them, just as it’s not on Apple to keep all possible offensive content out of a user’s iPhone experience.

Anonymous Coward says:

There are certainly an awful lot of missing links in the chain if this dude’s skipping straight from posting the video to increasing the risk of casualty-bearing attacks. Even if the video is posted and left alone, people have to watch it, and understand it, and be sympathetic, and be idiots, and want to take action, and have someone to contact about it, and be operationally able to pull something off, and be successful. That’s an awful lot of conditions to satisfy, especially since most plots that have come before have been non-starters or failures, and the ones that have worked didn’t actually create any conditions that were hospitable to any political follow-through on their stated goals. Let ’em post their little videos. Maybe they’ll end up on Tosh.

Bob says:

I’d have to disagree with Mike’s answer to the question: “Should Google try to prevent terrorism?” for several reasons.

1: Google is not an authorized law enforcement organization, nor are they trained or experienced in anti-terrorism tactics. It would be vigilante justice if a private company took it upon itself to fight crime. That being said, if a duly authorized government agency went through proper channels to enlist the aid of Google, I’d say Google absolutely should help.

2: It is simply unfeasible to know which videos are terrorist videos before they go up, short of reviewing every single video before it’s posted. Considering how many videos are posted daily (I can’t remember the exact number or the average length of the videos off the top of my head), I’d say it’s unreasonable to expect it.

3: Philosophically speaking, Google (or anyone else for that matter) doesn’t have a moral obligation to help anyone. Obviously, they have to follow laws but from a moral standpoint, they have no more obligation to help the US government (or terrorists) than you would have to help some random person on the street. Would be nice if you did but morally you are not obligated to.

So basically they have no legal authorization or moral imperative to fight terrorism, and monitoring every single video before it’s put up would be unfeasible. Not to mention there is no conclusive evidence that preventing radical videos from being posted would provide a net benefit in the war on terrorism.

lfroen (profile) says:

Re: Re:

>> Philosophically speaking, Google (or anyone else for that matter) doesn’t have a moral obligation to help anyone
Oh yes they does. Corporations are made of people, know that?

And your saying “they have no more obligation to help the US government (or terrorists)” is ridiculous. Helping terrorists is a _crime_. Helping your country to fight its enemies (terrorists) is your moral obligation. In some places on the planet it’s also a legal obligation (compulsory military service). Now, given that Google’s offices are located around the world, it is very possible that some of them are located in countries where it is your _legal_ obligation to fight terrorism.

Bob says:

“Oh yes they does. Corporations are made of people, know that?”
People have no moral obligation to help each other. If they did, the consequences would be staggering. Every time you paid for something you didn’t need, that would be immoral because that money could’ve gone to a starving or a sick person. It’s the Good Samaritan argument. If people have a moral obligation to help someone else in need, how far are they obligated to go and how much are they obligated to sacrifice to help someone? It’s basically drawing an arbitrary line in the sand, trying to measure how much overall good different acts accomplish.

As for people being morally obligated to fight their country’s enemies, that’s a pretty ridiculous obligation when you think about it. First of all, you’d have to define what an enemy of your country is. I’d argue that starvation, poor medical care, subversion of civil liberties, environmental issues, etc. are all “enemies” of a country. Terrorism is only one problem among many, an important problem, but still not the only one. And if you define “enemies of your country” solely as terrorists, you would still have to define to what level citizens are required to help in the war against terrorism. Would any act that aids the war on terrorism fulfill your obligation (like donating $1 to the government), or is that obligation continuous, and to what extent would you have to sacrifice for it? Saying that there is a moral obligation to fight terrorists forces you to draw a line saying that morally you’d have to do at least this much to fight terrorists or you’ve failed in your moral obligation. Again, a fairly subjective line trying to measure what overall good different actions do in the war on terrorism.

Don’t get me wrong, I’m all for helping people. I think this world would be a much better place if more people helped each other. I just don’t see any reasonable argument that makes it a moral obligation.

And as for the original question, “Should Google try to prevent terrorism?” if the prevention measures mean watching every single video before it’s publicly posted and having civilians make judgement calls as to whether a video constitutes aiding terrorists, I’d still have to say “No”. It’s basically the same argument as to whether YouTube is liable for other people uploading copyright-infringing videos. How are they supposed to know whether some guy they’ve never seen before, who may or may not be hiding his face and probably is speaking in a foreign language, is talking about blowing something up or reporting the weather?
