
Bullshit and the Foibles of the Human Mind


Abstract


In his paper “Bullshit and the Foibles of the Human Mind, or What the Masters of the Dark Arts Know”,[1] Kenneth Taylor argues that the human mind suffers from certain characteristic weaknesses, or foibles, that allow it to be influenced by manipulators. Purveyors of bullshit, propaganda, spin, and outright lies exploit these foibles, and because the foibles belong to ordinary human cognition, we ourselves bear part of the blame for how widespread such misrepresentation is. Whenever we are cognizing, we provide a pathway for bullshitters to influence our thinking with outright falsehoods or with half-truths that are partly, but not completely, accurate. Taylor approaches his argument through the following three questions:

1. Why is there so much bullshit and other forms of misrepresentation around?

2. Why are we so often taken in by bullshit and other forms of misrepresentation?

3. Why do we find it so hard to distinguish bullshit from its contraries?

These questions focus the discussion on a few of the many foibles that leave the human mind open to bullshit and misrepresentation.


Confirmation Bias


Taylor tells us that although we as a species have made remarkable cognitive progress, we remain subject to foibles of the human mind that make us easy targets for purveyors of bullshit. One of these foibles is confirmation bias, which Taylor defines as “the tendency to notice and seek out things that confirm one’s beliefs, and to ignore, avoid, or undervalue the relevance of things that would disconfirm one’s beliefs”.[1] Confirmation bias explains why established beliefs are so resistant to revision. When we believe something to be true, we also believe we have good reason to believe it: we take our “evidence” to be substantial enough to warrant the belief, which makes the belief essentially impervious to change. As a result, we overestimate our epistemic capabilities and deny or dismiss any evidence that is inconsistent with what we already believe.

Taylor gives an example of confirmation bias in his essay. Someone who believes Bush’s rationale for a war against Iraq will also think it reasonable to believe that rationale; they will not consider themselves unreasonable or susceptible to deceit. Because of this confidence, they will reject any evidence suggesting that Bush’s rationale is faulty, as well as any evidence suggesting that they are irrational or foolish to accept it. Any new evidence that others may try to convey to them will be dismissed with little or no consideration.

Although it may seem that confirmation bias only entrenches existing beliefs, it can also aid the spread of bullshit and the formation of new bullshit beliefs. Consider information cocoons such as far-right and far-left news sources. These cocoons promote a narrow range of views and continually misrepresent or exclude alternative viewpoints. When Americans intentionally seek information from such cocoons, confirmation bias is propagated further; once consumers participate in the system by placing complete trust in the managers of information cocoons, they have made themselves easy pickings for bullshit artists.


Framing Effects


Risk-Averse vs. Risk-Seeking

Taylor continues with another foible of the human mind: framing effects. Unlike confirmation bias, which maintains beliefs already held, framing effects influence how our beliefs are formed in the first place. Taylor presents two differently worded descriptions of an identical scenario involving a flu outbreak to illustrate our risk-averse versus risk-seeking tendencies. In summary: when choosing among outcomes that all involve gains, people tend to be risk-averse, preferring a sure thing to a gamble of equal or greater expected return. With respect to losses, by contrast, people tend to be risk-seeking: they prefer pursuing the chance that no one will die, even if pursuing that chance risks more deaths, to the certainty that fewer will die. Taylor emphasizes that our decisions are shaped by how the options are framed: we tend to overlook factors that are invariant across the options, such as expected return, while remaining highly sensitive to the framing itself. This sensitivity provides powerful leverage for purveyors of bullshit, as the worked example below illustrates.
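The section above summarizes Taylor's flu scenario without giving its figures. The short Python sketch below makes the invariance claim concrete; its numbers are borrowed from the classic disease problem of Tversky and Kahneman, which the scenario resembles, so they are illustrative assumptions rather than Taylor's own.

# Illustrative numbers (assumed, not Taylor's): 600 people at risk.
at_risk = 600

# "Gain" frame: outcomes described as lives saved.
saved_sure = 200                            # Program A: 200 saved for certain
saved_gamble = (1/3) * at_risk + (2/3) * 0  # Program B: 1/3 chance all saved

# "Loss" frame: the same programs, redescribed as lives lost.
lost_sure = 400                             # Program C: 400 die for certain
lost_gamble = (2/3) * at_risk + (1/3) * 0   # Program D: 2/3 chance all die

# Expected lives saved are identical under both frames ...
print(saved_sure, saved_gamble)                    # 200 vs 200.0
print(at_risk - lost_sure, at_risk - lost_gamble)  # 200 vs 200.0
# ... yet most people pick the sure thing in the gain frame (risk-averse)
# and the gamble in the loss frame (risk-seeking).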



Real-World Examples

Wason Selection Task

Taylor explains the Wason selection task: subjects are given four cards and are told that each card has a number on one side and a letter on the other. They are asked to name those cards, and only those cards, that should be turned over in order to determine whether the following rule is true or false of the four cards: if a card has the letter D on one side, it has the number 3 on the other. Applying straightforward propositional logic, the correct cards are the D card and the 7 card. If a D is on the other side of the 7, the rule is falsified; if anything other than a 3 is on the other side of the D card, the rule is likewise falsified. Subjects perform very poorly on this task: fewer than 25% of subjects make the correct choice. Their inability to perform well on this simple test tempted many to conclude that human cognition is irredeemably irrational, but this is not entirely correct. The selection logic itself is simple, as the sketch following this paragraph shows.
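As a minimal illustration of the propositional logic involved, the following Python sketch checks, for each visible face, whether some hidden face could falsify the rule. The text names only the D, 3, and 7 cards; the fourth face, F, is a conventional filler and an assumption here.

# Rule: if a card has a D on one side, it has a 3 on the other.
def falsifies(letter, number):
    # The rule fails only when a D is paired with a non-3.
    return letter == "D" and number != 3

def must_turn(visible):
    # A card must be turned over iff some possible hidden face
    # would falsify the rule given the visible face.
    letters = ["D", "F"]  # possible letter faces (F is assumed)
    numbers = [3, 7]      # possible number faces
    if visible in letters:                              # hidden side is a number
        return any(falsifies(visible, n) for n in numbers)
    return any(falsifies(l, visible) for l in letters)  # hidden side is a letter

for face in ["D", "F", 3, 7]:
    print(face, "->", "turn over" if must_turn(face) else "leave")
# Only D and 7 need turning, matching the propositional analysis above.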


To reframe the test in a way that generates more successful responses, the scenario is changed to the following: you are a bartender, and your task is to see that there is no underage drinking. That is, you must see to it that the following conditional is true: if someone is drinking beer, then she must be older than 21. In this version each card shows what a patron is drinking on one side and how old the patron is on the other. Which cards should you turn over? The correct choice mirrors the abstract version: check the beer drinker’s age and the underage patron’s drink.

Taylor argues that the key to our cognitive success as a species rests on our evolved capacity for culture: when cultural mechanisms function to spread the benefits of one or more individuals’ cognitive innovations to others, it is not necessary for everyone to be a genius. When the experiment is reframed in terms of something like a social contract, performance improves dramatically. From a purely logical point of view, the bartender problem has exactly the same structure as the earlier one; nonetheless, subjects perform significantly better on this second version of the task than on the first. Taylor notes that evolutionary psychologists explain this with a “cheater detection module”: our minds are not general-purpose problem-solving machines but are specially adapted to solve recurring problems that were significant in our ancestral environment. These built-in frames enable certain structures in the mind to recognize, quickly and effortlessly, the kind of reasoning that has to be applied. Taylor concludes this section by recognizing that some framing effects improve the functioning of the human mind: “he who controls the frame may well control all.” The structural identity of the two versions can be made explicit by relabeling the card faces, as in the sketch below.
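To show that the bartender version is logically isomorphic to the letters-and-numbers version, the same checking logic from the earlier sketch can be reused with relabeled faces. The specific faces here (beer, cola, 25, 16) are assumed stand-ins, since the text does not list them.

# Rule: if someone is drinking beer, she must be older than 21.
def violates(drink, age):
    # The rule fails only when beer is paired with an age of 21 or under.
    return drink == "beer" and age <= 21

def must_check(visible):
    # Same structure as must_turn in the abstract version:
    # turn a card iff some hidden face could violate the rule.
    drinks = ["beer", "cola"]  # possible drink faces (assumed)
    ages = [25, 16]            # possible age faces (assumed)
    if visible in drinks:                             # hidden side is an age
        return any(violates(visible, a) for a in ages)
    return any(violates(d, visible) for d in drinks)  # hidden side is a drink

for face in ["beer", "cola", 25, 16]:
    print(face, "->", "check" if must_check(face) else "ignore")
# Only the beer drinker and the 16-year-old need checking: the same
# D-and-7 pattern as before, though subjects find this version far easier.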


Reclaiming the Public Square


Taylor acknowledges that the human brain was not designed to withstand the stresses and strains of the modern world. As our understanding of the human brain has improved, so have the techniques used to attract our attention and mislead us: we are constantly presented with information that is often designed to exploit our natural cognitive processes. Because modern media sources can so easily hijack these processes, we are highly susceptible to misinformation, propaganda, and other manipulative messaging. Taylor argues that we need to level the playing field so that we have the tools to combat today’s media. He suggests, for example, educating children to cast a skeptical eye on easily spread misinformation: if more people were aware of the tactics used to exploit our natural cognitive tendencies, society would not be so susceptible to them.

  1. ^ Hardcastle, Gary L.; Reisch, George A., eds. (2006). Bullshit and Philosophy: Guaranteed to Get Perfect Results Every Time. Chicago, Illinois: Open Court. ISBN 0-8126-9611-5. OCLC 71004110.