Wikipedia:Hate is disruptive

A question arises from time to time on the English Wikipedia: Can we sanction an editor simply for expressing hateful views? A common refrain is that there is no policy against expressing such views.

This is incorrect. Expressing hateful views is a form of disruptive editing. So is acting in a hateful manner, including by aligning oneself with a hate movement.

The essays WP:NONAZIS and WP:NORACISTS discuss this, but both make their conclusions sound more radical than they are. There is nothing radical about blocking, topic-banning, site-banning, or removing user rights from disruptive editors. We do it all the time. An editor does not even need to be participating in bad faith to be sanctioned for disruption.

Why is hate disruptive?

There are many reasons, but one stands out:

Consider the statement "[EDITOR] is a [SLUR]". Obviously, this statement is a pretty severe personal attack against [EDITOR]. Now consider a second statement: "black people are [SLUR]s". This is pretty obviously hateful, but also, notice that it's still a personal attack against [EDITOR]! (And every other black person on the project.) Any hateful statement against a group is also a hateful statement against individuals, because groups are made of individuals. We don't allow hateful statements against individuals, so we also don't allow hateful statements against groups.

Of course, not all hatred involves saying slurs, and more subtle forms of hatred can be equally disruptive. A member of a minority group would rightfully feel uncomfortable editing alongside someone who viewed them as less than human, even if that person never called them a slur. An editor calmly defending pseudoscientific theories about race and IQ is still implicitly attacking black people by promoting ideas that have historically been used to enforce white supremacy.

Hate speech

Editors are routinely blocked for engaging in hate speech. Use of slurs directed at individuals or groups of people is treated as severe vandalism. In more complex cases, while no Wikipedia policy explicitly defines "hate speech", in practice a Wikipedian can expect consequences for any of the following, where the target is a group of people distinguished by an inherent attribute:[1]

  • Promoting the group's supremacy over or inferiority to other like groups.[2]
  • Assigning collective guilt to members of the group for any offense, real or imagined.
  • Denying well-documented crimes committed against members of the group.[3]
  • Insulting, harassing, or discriminating against other editors based on their membership in the group.

This essay does not attempt to create any clear definition of hate speech; it merely documents a practice as it already exists. Whether speech is hateful can be assessed by common-sense application of prevailing community norms.[4]

Hateful conduct

It is entirely possible to perpetuate hate without ever once saying a slur or claiming that a genocide was a hoax. Cases of more complex hateful conduct on Wikipedia might include:

  • Tendentiously editing to promote a hateful point of view. (See, generally, arbitration enforcement.)
  • Referring to oneself on-wiki as a member of a hate movement, or outing oneself as supporting such a cause off-wiki.
  • Using the iconography, slogans, or rhetoric of hate movements.

These actions are inherently disruptive. In the rare case that someone does not know the meaning or full context of, say, putting the Nazi flag on their userpage, they presumptively lack the competence to edit Wikipedia.

How does this essay differ from NONAZIS/NORACISTS?

In practice? Barely at all. Almost all editors who are blocked under those essays can just as well be blocked for disruptive editing. The only exception would be a user who never in any way indicates that they are a bigot, is somehow found out to be one without intending it, and has still not engaged in any disruption on-wiki. That is a narrow, possibly entirely hypothetical edge case, and on its own it would not justify writing a separate essay. Rather, the purpose of this essay is to highlight a difference in philosophy: Under this essay, bigoted editors are not sanctioned for their ideologies; they are sanctioned for their behavior. It just so happens that their ideologies correlate nearly 1:1 with a tendency toward disruptive behavior.

This distinction is important. Non-bigoted editors outside the political mainstream, both on the right and the left, may read NONAZIS and reasonably worry that their ideology is next. Others may infer a political or geographical bias from the focus on right-wing extremists in Europe and the core Anglosphere. Focusing on ideology when justifying sanctions raises many difficult-to-answer questions, needlessly complicates things, and leads to drama every time a sanction citing these essays is imposed. The real answer is simple: Hate is disruptive. We sanction people for disruption. We sanction people who say and do and align with hateful things.

This only sounds controversial if you go out of your way to make it sound controversial.

So bigots can edit here?

Sure, if they edit without engaging in any hate speech or hateful conduct (which includes self-identification with hate movements). While this will be impossible for many bigots, presumably some number do manage this—people who write articles about botany without letting on that they think the Holocaust was a hoax, or fix lots of typos and never mention that they think it was a mistake to let women vote. Wikipedia policy does not concern itself with people's private views.[5] The disruption caused by hateful conduct lies in the expression, not the belief.

The flip side of this is true too: If someone uses a bunch of racial slurs because they think it's funny, or posts an edgy statement about gay people on their userpage as a "social experiment", they are engaged in disruptive editing, even if they don't personally harbor hateful views.

Appropriate remedies

On Wikipedia, we try to avoid sanctioning people more harshly than necessary. An editor who edits productively about Roman history, but disrupts in the area of American politics, will usually be topic-banned from the latter rather than site-blocked. The same principle holds true with hate speech, but one should take care to consider exactly what disruption has occurred.

  • For a potentially innocent mistake where the editor plausibly does not know the connotations of what they are saying, a warning or temporary block may be appropriate.
  • If an editor has shown inability to distinguish between hateful and non-hateful sentiments regarding a particular group, while not clearly intending to hurt anyone, a topic ban or partial block may be appropriate.
  • However, in most cases of hate speech, these remedies will not be enough. A temporary block is unlikely to change someone's deeply held views. And a topic ban may help with content disruption, but it will not make editors from the affected group comfortable around the editor in question. (After all, the average person from a targeted group does not only edit articles about that group.) So if someone is engaged in concerted hate speech, the proper remedy will usually be an indefinite block or siteban.
  • Granted user rights are removed as a matter of course with sitebans. With topic bans and non-siteban indefinite blocks, it will often be appropriate to remove particularly trusted rights, such as adminship.

Atonement

People change. People mature. People meet a person from Group X for the first time and learn they're not all evil. People wind up on the receiving end of discrimination for the first time and learn how it feels. People read a great book or have a spiritual epiphany.

Most people who are sanctioned under this interpretation of the disruptive editing policy will have been sanctioned because they expressed hateful views and refused to budge. As such, someone who clearly disavows past hateful views, even really nasty hateful views, should usually be given the benefit of the doubt. They can always be asked to elaborate about what led them away from their past views. In some cases, this may not be enough and further mending of fences will be needed, for instance if they seriously harassed other editors prior to their change of heart. But in most cases, a sincere apology merits a second chance.

Notes

  1. ^ Generally taken to include spiritual and religious views, even if these are not per se inherent, but not to include political or other philosophical views.
  2. ^ Including conflating the group's success at something with moral supremacy.
  3. ^ Including by "just asking questions" in a manner meant to convey denial subtextually.
  4. ^ While some jurisdictions do have legal definitions of hate speech, Wikipedia is not bound by these, and an argument that something is hate speech based solely on an appeal to the law should be viewed with skepticism, as there is ample history of governments labeling things hate speech to suit their purposes.
  5. ^ Even our child protection policy is based on advocacy and self-identification, not what's in people's heads.