WELCOME TO CRISISFORUMS.ORG!

Why Facts Don't Change Human Minds

In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.

Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.

As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

http://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds



Comments

  • Farmer_Sean_DEP_ Member, Moderator
    From later in the article, after describing a different Stanford study:

    The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?
  • tazweiss Member, Permitted to post new threads
    Gullibility usually leads to mindset.

    If the politicians treat people this poorly when they're armed to the teeth,

    just imagine what they'll be willing to do once they've disarmed everyone.

  • Farmer_Sean_DEP_ Member, Moderator
    From even further on in the article, and something that is worthy of thought when considering the behaviour of most people:

    Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

    The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.


    I am absolutely certain that I am often guilty of confirmation bias.  Whenever I read compelling information that challenges something I believe (and, importantly, enjoy believing), I become irritated, dismissive - even downright angry at times.  I generally react with a mental version of the "Talk to the Hand" gesture and move on. 

    I think this is a pretty big fault of mine, but one I have no idea how to overcome.  To be honest, I generally consider myself both more intelligent and wiser than the vast majority of other people, even though I have no rational reason to believe that is the case (especially since I live such a sheltered life).  I'm pretty sure confirmation bias is a symptom of that belief.



  • tazweiss Member, Permitted to post new threads
    I don't have to worry about any of those studies.  My wife says I know 98% of everything.

    If the politicians treat people this poorly when they're armed to the teeth,

    just imagine what they'll be willing to do once they've disarmed everyone.

  • Matt_ADMIN_ Administrator
    Obliterate the self, and there will be no ideological mouth to continue feeding


    -------------------
    "...Say, 'GOD is sufficient for me.' In Him the trusters shall trust." (Quran 39:38)