Service Suicide: YouTube, Soapbox, Daily Motion, et al

What’s the best way to kill your incredibly popular service? Is it:
a) Fight with your users?
b) Eliminate most of your content producers?
c) Increase your legal liability?[I am not a lawyer, none of this is legal advice.]
d) All of the above?

It seems I can’t stop hearing about Audible Magic. First they were trying to filter P2P networks, in what may or may not have been a violation of wiretap laws, and now every video-sharing site that’s ever heard the word “copyright” is hoping this software will keep the “big”:http://www.viacom.com/ “bad”:http://www.universalstudios.com/ “wolves”:http://mpaa.org/ off their back. But if you throw those dogs a bone, they’re only going to keep coming back for “more”:http://consumerist.com/consumer/worst-company-in-america/congressman-who-took-money-from-riaampaa-says-congress-should-cut-funding-to-colleges-266945.php and “more”:http://torrentfreak.com/riaa-and-mpaa-fund-anti-piracy-politicians/. But this post isn’t about bashing old, failing business models. Instead, it’s about keeping those clinging to them from talking new businesses into killing themselves.

Like DRM, content filtering is a great way to start a fight with your users. Sometimes those fights happen because users have faulty assumptions about what they should and shouldn’t be able to do. Legally, users aren’t allowed to take the content they enjoy and put it online to share with others, spreading the word about great shows, without permission from the makers of that content, permission they have no accessible way of obtaining. *If a user wants to do something they shouldn’t, the fewer times you have to be the one to tell them no, the better.*

First of all, if you are a video-sharing site, you weren’t the one who created these rules in the first place. So why would you put yourself in the position of having to enforce them? When a video-sharing site starts saying it will enforce the rules, it will be the one held responsible when the rules get broken anyway. Make no mistake, content filtering will develop into an arms race between filter makers and filter breakers just as easily as DRM has. Unless video-sharing sites can break with fifty years of software history and somehow change what users think they should be able to do, the only ‘victory’ condition will be forcing copyright violators to distort a video’s content so much that the quality is abysmal.

Congratulations, you now have a site full of really crappy videos, assuming most of your users didn’t already migrate to a site that isn’t playing hall monitor for media corporations. Good riddance to them, you say? You have plenty of law-abiding users who create content too? Not many left, actually. About half of them were users who used copyrighted material perfectly legally, in ways consistent with fair use, parody, commentary, and other lawful purposes. They’ve gotten tired of the endless failed uploads and takedowns your software’s false positives have caused. Between the constant bad word of mouth from them and from the users who were uploading Daily Show clips, many of your remaining members and content creators have decided your site is no longer the place where they want to produce content either. And once most of your producers have left, most of your audience is gone too.

That wouldn’t happen to your service? It’s too popular, users love it too much? Then let’s forget users for a moment (you obviously already have) and talk about all the legal trouble you’re going to get yourself (and others) into. Isn’t that what you were trying to avoid in the first place? Unfortunately, accepting responsibility for one type of content is likely to send hosting sites sliding down a slippery slope of legal liability. Court cases and legislation have long upheld the idea that sites hosting user-created content cannot reasonably be expected to monitor and police the actions of the millions of users whose creations they host. So if I hop on MySpace and post The Da Vinci Code on my blog there, chapter by chapter, MySpace can’t be held liable for my copyright infringement.

Saying that a service should be held responsible for a crime because it provided the platform on which the crime was committed is *idiotic*. It’s like saying e-mail services, instant messaging networks, and online games with chat should all be held liable for sexual harassment if one user sends such content to another. And if saying that is idiotic, then what are the services that would put themselves in the position of accepting that kind of liability?

Services that have set themselves down this path need to reverse course quickly, while they still can. They need to explain that it’s not their job to enforce media corporations’ rights, that they can’t be expected to, and that the tools that would supposedly let them do it “don’t work anyway”:http://newteevee.com/2007/06/08/does-digital-fingerprinting-work-an-investigative-report/. They could even talk about some of that “user rights” and “fair use” nonsense if they wanted to be all moral about it. But they would mainly want to do it to avoid legal liability. Who cares about not being evil?