Wikipedia:Hardcore pornography images/Rebuttal


The essay Wikipedia:Hardcore pornography images asserts that "articles about hardcore pornography subjects should not contain images which are, themselves, hardcore pornography". Such a policy would be ill-advised. Existing Wikipedia policy is that graphic images should neither be deleted nor added based on the subjective reactions they evoke – we don't do censorship, and we don't include irrelevant images simply to shock an audience. Images and other media should be used as appropriate to illustrate the topic matter.

Rebutted issues

  • It is true that articles about pornography are seldom of core importance; but the principle is of core importance. For example, during the early days of the Egyptian Revolution of 2011, Image:Khalid-Saeed.jpg was deleted, much to my dismay, on the basis that it was an extremely graphic image. Graphic it is, but it is crucial – the death of Khaled Mohamed Saeed became a major rallying cry during the Egyptian protests, and the appearance of his corpse belied statements by Egyptian police that he had choked to death while trying to swallow drugs. At a time when the world was counting on it to provide vital information, Wikipedia lay down and submitted to censorship – not on minutiae, but in its most central role of allowing the people of the world to make informed decisions about their future. We must thoroughly repudiate such habits throughout the project, in the hope that this will never happen again.
  • It is asserted that pornographic images "degrade Wikipedia's reputation", but it's not so – Wikipedia's reputation is based on its ability to cover the facts, not its adherence to local taboos.
  • It is asserted that the images impose a cost on Wikipedia by driving away readers and potential new editors, but clearly this is false – many a youngster who peers into a library seeking forbidden fruit comes to appreciate the rest of the collection. It is specifically asserted that the images drive away women, but why would they be driven away by things they aren't looking for?
    • Furthermore, no source is given for the claim that pornographic images repel women from Wikipedia. This unverified assertion can be construed as reinforcing the stereotype that all women dislike pornography, or that all women dislike sex in general.
    • By the same logic, one could easily say "Wikipedia hosts images of flaming crosses and lynchings, which drives away black people, so we shouldn't host those images", or, "Wikipedia hosts images of meat, which drives away vegetarians, so we shouldn't host those images".
    • But even if images of X are proven to repel people from demographic group Y, this reason alone is not a good rationale for removing X. Wikipedia's goal is to accurately document encyclopedic knowledge, not to drive up market share or broaden the diversity of our audience or editor body.
  • It is asserted that it is "not a good thing for young people to be viewing these images". But kids looking for pornography on an uncensored internet connection will find it. If they find it here, at least we know they're not being lured onto a server full of viruses and trojans that is intended to "phish" for personal information. We present such information from a neutral point of view, including perhaps criticism from anti-pornography groups – they could find it in a worse context. We should also consider that a few such images salted among Wikipedia's millions of articles provide a "shot across the bow" for parents, who may otherwise be deceived by Wikipedia's educational subject matter into thinking that this is a child-safe site. Parents need to understand that this is an encyclopedia that anyone can edit, and children could be exposed directly to molesters, criminals, and all sorts of other dangers. If a kid posts his personal information, all kinds of people could see it. For that matter, Wikipedia is just one click away from the official Hezbollah web site. A few innocuous images of unsatisfying-looking sexual acts may ensure that parents seriously go over the online facts of life with their kids, and prevent a tragedy from unfolding on Wikipedia.
  • It is asserted that the images are misogynistic and degrading to women. This may be true at times, though it always mystified me just how gay porn was supposed to be included in that. But Wikipedia can and should include many images degrading to groups of people, with the intent to educate the public and in the long term to promote greater respect. For example, images in Holocaust and anti-Semitism are surely even more disturbing to Jews, but Wikipedia recognizes that documenting the pure insanity of history is a step toward trying to control it.
  • The essay draws a line between "hardcore pornography" and "real-life human sexuality where the behavior in question occurs among populations at a notable level". This would, in theory, be a welcome limitation to any policy based on the essay, if people read down that far, but it doesn't make the policy acceptable. Wikipedia should always welcome surveys of the frequency of human sexual behaviors, and make it clear in articles which are more common than others. However, treating people differently based on the commonness of their sexual proclivities, saying that some people should be singled out and told that their habits are too disgusting to depict – this is antithetical to the inclusiveness which Wikipedia should seek to express.
  • It is also stated that Wikipedia's general disclaimer, which warns readers of "for example, graphical depictions of violence, human anatomy, or sexual acts", doesn't go far enough to cover certain "hardcore pornography", but that is just absurd.

In conclusion, there is no valid reason for a special crusade against porn, "hardcore" or not. This is not to say that there aren't times when it is encyclopedically preferable to use a diagram rather than a photo of a genuine sex act. This can be done for the same reasons why sometimes we display a map rather than a satellite photo of a city neighborhood: when graphic detail distracts from the point being made rather than illustrating it. Fortunately, Wikimedia Commons has a long series of well-done color illustrations of various practices, meaning that despite underlying ideological differences editors are often able to come to a consensus or near-consensus in individual disputes. Existing policy and collaboration will continue to provide a good solution much of the time, but adding censorship to the mix will never help.