"It is true" says Reason. "It can't be true" says Pride. At last -Reason yields.

Started by DarklingAlice, August 16, 2010, 09:06:09 AM

DarklingAlice

One of the definitive works on perceived bias in research was put forth by Lord et al. (1979) in "Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence".

Quote
Abstract
People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization. To test these assumptions, 48 undergraduates supporting and opposing capital punishment were exposed to 2 purported studies, one seemingly confirming and one seemingly disconfirming their existing beliefs about the deterrent efficacy of the death penalty. As predicted, both proponents and opponents of capital punishment rated those results and procedures that confirmed their own beliefs to be the more convincing and probative ones, and they reported corresponding shifts in their beliefs as the various results and procedures were presented. The net effect of such evaluations and opinion shifts was the postulated increase in attitude polarization.
Not a surprising result, but hardly a heartening one. However, new research from earlier this year indicates that it might even be a little worse than that. In "The Scientific Impotence Excuse: Discounting Belief-Threatening Scientific Abstracts", Geoffrey D. Munro presents evidence not only that people are likely to believe that abstracts running counter to their previously held opinions are flawed, but also that, after exposure to abstracts they don't agree with, people are more likely to declare that science itself is somehow flawed.

Quote
Abstract
The scientific impotence discounting hypothesis predicts that people resist belief-disconfirming scientific evidence by concluding that the topic of study is not amenable to scientific investigation. In 2 studies, participants read a series of brief abstracts that either confirmed or disconfirmed their existing beliefs about a stereotype associated with homosexuality. Relative to those reading belief-confirming evidence, participants reading belief-disconfirming evidence indicated more belief that the topic could not be studied scientifically and more belief that a series of other unrelated topics could not be studied scientifically. Thus, being presented with belief-disconfirming scientific evidence may lead to an erosion of belief in the efficacy of scientific methods.
When people are presented with rational evidence against their convictions, it seems that their rationality is more likely to yield than their conviction is. Again, not so much a surprising result as a horrifying one, and undoubtedly a strong factor in the current anti-intellectual movements.
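
To make the mechanism concrete, here is a toy sketch in Python of the biased-assimilation story the abstract tells. Every number in it (the priors, the discount factor, the evidence) is my own invention for illustration; it is not a model from Lord et al., just about the simplest thing that exhibits the effect:

```python
# Toy model of "biased assimilation": two observers read the SAME mixed
# body of evidence, but each discounts findings that cut against their
# current belief. All numbers are illustrative assumptions.
import math

def logit(p):
    return math.log(p / (1 - p))

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def update(prior, evidence, scrutiny=0.2):
    """Update a belief (a probability) on a list of +1/-1 findings.

    A finding that agrees with the current belief is taken at face
    value (weight 1.0); a disagreeing one is 'critically evaluated'
    and only partially credited (weight = scrutiny).
    """
    b = logit(prior)
    for e in evidence:
        agrees = (e > 0) == (b > 0)
        weight = 1.0 if agrees else scrutiny
        b += weight * e  # each finding shifts belief by +/-1 in log-odds
    return sigmoid(b)

mixed = [+1, -1] * 5  # ten findings, perfectly balanced

print(f"proponent: 0.70 -> {update(0.70, mixed):.2f}")  # ~0.99
print(f"opponent:  0.30 -> {update(0.30, mixed):.2f}")  # ~0.01
```

Feed both observers identical, perfectly mixed evidence, and they end up further apart than they started. That is the "attitude polarization" of the title.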

For every complex problem there is a solution that is simple, elegant, and wrong.


Host of Seraphim

Last year, while studying social psychology, I came across a term that pretty much sums this up: confirmation bias.

Quote
Confirmation bias refers to a type of selective thinking whereby one tends to notice and to look for what confirms one's beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one's beliefs. For example, if you believe that during a full moon there is an increase in admissions to the emergency room where you work, you will take notice of admissions during a full moon, but be inattentive to the moon when admissions occur during other nights of the month. A tendency to do this over time unjustifiably strengthens your belief in the relationship between the full moon and accidents and other lunar effects.

This tendency to give more attention and weight to data that support our beliefs than we do to contrary data is especially pernicious when our beliefs are little more than prejudices. If our beliefs are firmly established on solid evidence and valid confirmatory experiments, the tendency to give more attention and weight to data that fit with our beliefs should not lead us astray as a rule. Of course, if we become blinded to evidence truly refuting a favored hypothesis, we have crossed the line from reasonableness to closed-mindedness.
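
As a side note, the full-moon example is easy to make concrete. Below is a quick toy simulation in Python (all the probabilities are invented for illustration) in which the moon has no effect on admissions whatsoever, yet an observer who only remembers busy full-moon nights accumulates a pile of vivid confirmations:

```python
# Toy sketch of the full-moon example: busy nights are generated with NO
# lunar effect at all, but only "busy night during a full moon" sticks
# in memory. All probabilities are invented for illustration.
import random

random.seed(1)

nights = 3600
p_full_moon = 1 / 30   # roughly one full moon a month
p_busy = 0.3           # chance of a busy ER night, independent of the moon

remembered_hits = 0                 # busy + full moon: recalled vividly
busy = {True: 0, False: 0}          # busy nights, by moon phase
total = {True: 0, False: 0}         # all nights, by moon phase

for _ in range(nights):
    full_moon = random.random() < p_full_moon
    busy_night = random.random() < p_busy
    total[full_moon] += 1
    busy[full_moon] += busy_night
    if full_moon and busy_night:
        remembered_hits += 1        # the only case that gets noticed

print(f"busy rate on full-moon nights: {busy[True] / total[True]:.2f}")
print(f"busy rate on other nights:     {busy[False] / total[False]:.2f}")
print(f"vivid 'lunar effect' memories: {remembered_hits}")
```

The two busy rates come out statistically identical, but the tally of remembered hits only ever grows, so the belief feels better and better confirmed.
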
Tentatively trying to get back into RPing...

:: O/O :: A/A (updated 3/2 -- please read) ::

Neroon

There is a fairly strong selective pressure for such behaviour, if you think about it.  We evolved from hunter-gatherers and, as such, when gathering, our forebears would use a "search image" to choose what to gather.  The search image hypothesis explains the behaviour of lots of animals that gather food, in that they will ignore food of one type and instead gather food of another.  The reasoning is that a) it can speed up the actual gathering and b) it can prevent harmful foods (poisonous berries and the like) from being picked up.  Even modern humans are susceptible to such behaviour (I know, I was one of the "guinea pigs" in such a study in 1987) and it explains such things as stamp collecting and the most common result when watching this video:

Visual illusion- Attention Experiment

In terms of developing behaviours, the same filter will lead us to use some strategies in preference to others.  After all, we don't want to waste time thinking about how to do a thing every time we do it.  We just follow the same procedure again and again because we know it works.  We can see this in computer usage: lots of people will perform a string of pointless mouse clicks to achieve the same result as a single keyboard shortcut, because that's how they learned it originally.  Indeed, a large proportion of people will get angry if you try to show them the simpler way of doing it.

In terms of survival, out in the wild, it is often better to be slower but right, rather than quick.  If you're building a shelter, you don't want it to fall down in the middle of the night in a howling storm.  So the tried and tested construction method will be chosen over an untried, albeit potentially faster (and maybe even more effective) one.  So it's reasonable to hypothesize that an unwillingness to take on new ideas might well have offered an advantage in our evolutionary past.  Today, of course, that's not the case, but we are still the products of our evolution.  That's why those rare individuals who show truly original thinking are so valued and called geniuses.  As for the rest of us, no matter how rational we might consider ourselves to be, our thought patterns are still shaped by the hand of our biology.
Timeo Danaos et dona ferentes

My yeas and nays     Grovelling Apologies     Wiki
Often confused for some guy

Jude

Similar, though not directly related:  http://researchnews.osu.edu/archive/majorityopinion.htm

To echo what Neroon said somewhat: yes, we're not programmed for reason or logic (though we seem to have the capability to grasp both) but for induction and efficiency.  The same shortcuts that served us well in "survival mode" often lead to errors in judgment on more complicated issues.

I guess where I diverge from Neroon a bit is that I don't believe there's anything biological that makes some people predisposed to surpassing these faults.  I think it's a matter of culture, and that it can be learned.  Scientists tend to be smarter than the population as a whole, which seems plausible, but I don't think their greater impartiality comes from that.  It comes from the culture of thought.  Just as vaccines can ward off illness, being aware of the pitfalls of logical thinking due to our innate human wiring can help you avoid falling victim to them.

I really wish our schools included classes on the pitfalls of logical thinking: concepts like confirmation bias, memory failure, and the imperfection of human perception.  Just about everyone (myself included) could use a little less confidence in their own ideas from time to time.

Xanatos

Just so you know, when I played WoW I was a mouse clicker, and I was just as fast as or faster than the people who used mods to achieve their speed; it shocked people when they found out I didn't use a single mod and yet was as good or better. So no dissing the point and click! -chuckles-

Now, back on topic: I can agree to an extent with the point of this discussion, but I would have to say, from my own experience, that I see more truth/logic shot down by sheer stubbornness and unwillingness to accept the truth, even when an individual knows something is the truth. I cannot tell you how many times I have proven something to be true and people simply refuse to believe it. I know this for a fact, because I fall prey to my own unwillingness to believe at times, simply because I don't want to, even though I know I am wrong. I personally think that is America's (and the world's) problem today: not an inability or a natural tendency, but a conscious effort to hold onto one's perceived truth despite knowing one is wrong.

Because of this reasoning that I see on a near-constant basis, I try my damnedest to no longer "debate" with anyone, since debating basically amounts to arguing nowadays; few, if any, are willing to have a civil debate anymore (at least on the internet).

Maybe that's what you guys are talking about (if so, sorry), but what I got out of reading the posts was that people refuse evidence more out of a lack of understanding, due to fear of messing up and an inability to see the truth.

DarklingAlice

Quite interesting, thank you all.

But how do these ideas lead us into the situation explored in the second study? Why does our confirmation bias on a certain isolated matter increase the tendency to reject the entire field? For example (from the article): if science disagrees with you on whether or not homosexuality is correlated with mental illness, then you are less likely to credit science with being able to address the effect of television violence on violent behavior, the accuracy of astrology, or the health effects of herbal medications. To extend the hunter-gatherer metaphor: suppose a group of people has a long tradition of not eating blue berries (because conventional wisdom says blue berries make you die, everyone knows that). When they encounter a man who does eat blue berries and doesn't die, they are biased against believing his claim that blue berries are safe. This makes sense: there is a longstanding tradition of not eating blue berries, so people will look for an excuse not to believe him. However (unless I am oversimplifying, and I may well be), the second article seems to suggest that if a different fellow comes along later and eats purple berries (a berry that their communal wisdom is completely silent on) and doesn't die, they are unlikely to believe him as well, because the entire methodology of eating things to see whether they kill you has been thrown into doubt by blue berry man.

And to discuss Jude's comment about the scientific establishment for the moment: we do not seem to lack the first quality; just look at people like Peter Duesberg who, despite being brilliant scientists, become drooling idiots on specific subjects. We instead tend to lack the second. No matter how much Duesberg doubts the link between HIV and AIDS, no matter how often he invents methodological flaws because he can't accept the research, he does not suddenly start doubting all scientific research in the way the second article suggests some laymen are prone to.
For every complex problem there is a solution that is simple, elegant, and wrong.


Jude

Imagine the guy who eats blueberries talking with the others; he attempts to be reasonable and win them over to his point of view, so he says that he supposes it's possible that blueberries could poison people if they didn't wash them first.  He explains that he washes his food before eating it, and that's probably why eating blueberries doesn't kill him.  Now you have a method associated with the outcome that they disagree with; I think in that instance they'd feel the method is wrong because they disagree so strongly with the conclusion it led to.

This is probably also why science advocates are trying to change science education so that it's more than memorization of facts.  They feel that if people apply the scientific method to discover something in labs, they will come to understand why it's valid and thus accept the results.

I think right now the problem is that people just don't understand science.  When bold claims are backed by complicated methods, they basically reduce it to voodoo magic in their minds, especially when it comes to stuff like dark matter and the Big Bang, because they can't follow all of the jumps in logic anymore.  When you reach a level of theoretical abstraction that is nearly impossible to follow, everything just sounds absurd.

Neroon

Quote from: Jude on August 17, 2010, 01:04:36 PM
This is probably also why science advocates are trying to change science education so that it's more than memorization of facts.  They feel that if people apply the scientific method to discover something in labs, they will come to understand why it's valid and thus accept the results.

To digress for a moment (sorry, Alice): speaking as a science educator, I can confirm that what Jude describes is only the excuse used for it.   The real reason is that the number of science graduates is falling and, since not all science graduates become teachers, the pool of science teachers is shrinking also.  Were science to continue to be taught in the traditional manner, there would soon be too few teachers to adequately teach the subject.  Moreover, the methodology, while laudable in its aim, is unfortunately flawed in its results: children who are very good at arguing, on a superficial level, while lacking the knowledge to actually understand the issues they are discussing.  They judge the information they are presented with on an emotional level and not with anything close to critical analysis.  In truth, good intentions do not insure one against the harmful effects of one's actions.

Perhaps, to bring this almost back to the point: such curricular changes will increase the tendency for people to refuse to accept evidence which contradicts their preconceptions.  That having been said, this is hardly a new phenomenon; after all, it was noted by Julius Caesar when he said, albeit in Latin, "Men are nearly always willing to believe what they wish".

Note: if you wish to discuss Science education with me, Jude, I would be more than willing to enter into an exchange of PMs with you.
Timeo Danaos et dona ferentes

My yeas and nays     Grovelling Apologies     Wiki
Often confused for some guy

Will

Quote from: DarklingAlice on August 17, 2010, 12:23:41 PM
And to discuss Jude's comment about the scientific establishment for the moment: we do not seem to lack the first quality; just look at people like Peter Duesberg who, despite being brilliant scientists, become drooling idiots on specific subjects. We instead tend to lack the second. No matter how much Duesberg doubts the link between HIV and AIDS, no matter how often he invents methodological flaws because he can't accept the research, he does not suddenly start doubting all scientific research in the way the second article suggests some laymen are prone to.

I think the difference between Duesberg and the average layman is that Duesberg has invested himself in science (for better or for worse).  I have to assume that earning a Ph.D. in a scientific field would make one less likely to toss out science as a whole.  The average person doesn't have that level of investment inhibiting their confirmation bias.

Of course, that's assuming they don't think too deeply about the TV they're watching, the medications they're taking, etc, etc.  We're all invested in science, obviously, but people are very good at ignoring things they don't like.
If you can heal the symptoms, but not affect the cause
It's like trying to heal a gunshot wound with gauze

One day, I will find the right words, and they will be simple.
- Jack Kerouac

Jaybee

I only wish everyone who felt strongly about an issue practiced the 3 Tiny's....

1) There's a TINY bit of humility in me, that I NEVER mind showing, because of the fact that...
2) There's a TINY chance that I, like any single human, possess ALL the facts about a complex issue, and so...
3) There's a TINY chance I'm wrong in my stance.

There is, actually, a 4th 'T'....

TRY. 

Lypiphera

I think it's also worth taking into consideration the role that authority plays in experiments like these.

I remember being taught about an experiment while at college (so forgive me for not having anything to quote!) where men in white coats would ask people, as part of a scientific test, to push a button which inflicted an 'electric shock' on someone in the next room. The people pushing the buttons could not see into the next room, but they were wearing headsets and so could hear the people (actors, in this case) crying aloud, screaming and, if they carried on pushing the button as the men in white coats asked them to, eventually falling silent.

Nearly every single one of the people pushing the buttons continued because the white coats told them to, but when people not wearing white coats told a control group to carry on pushing, I believe they all refused, showing how much of an effect an authority figure can have on our actions.

So, in theory, people could be more willing to dismiss their views if scientists (people in authority, who are considered to be cleverer than we are) tell them to, because we respond to authority.

Oniya

Quote from: Lypiphera on August 31, 2010, 07:04:47 PM
I think it's also worth taking into consideration the role that authority plays in experiments like these.

I remember being taught about an experiment while at college (so forgive me for not having anything to quote!) where men in white coats would ask people, as part of a scientific test, to push a button which inflicted an 'electric shock' on someone in the next room. The people pushing the buttons could not see into the next room, but they were wearing headsets and so could hear the people (actors, in this case) crying aloud, screaming and, if they carried on pushing the button as the men in white coats asked them to, eventually falling silent.

http://en.wikipedia.org/wiki/Milgram_experiment

Very famous experiment.  Some of the variants that have sprung up (in non-scientific application) are very telling, like the guy who calls up fast food places under the guise of a police investigator and instructs the manager to strip-search one of the employees, or the guy who calls hotel rooms under the guise of a gas company official and has the guest smash out windows, pull the fire alarm, and set off the sprinklers.
"Language was invented for one reason, boys - to woo women.~*~*~Don't think it's all been done before
And in that endeavor, laziness will not do." ~*~*~*~*~*~*~*~*~*~*~Don't think we're never gonna win this war
Robin Williams-Dead Poets Society ~*~*~*~*~*~*~*~*~*~*~*~*~*~Don't think your world's gonna fall apart
I do have a cause, though.  It's obscenity.  I'm for it.  - Tom Lehrer~*~All you need is your beautiful heart
O/O's Updated 5/11/21 - A/A's - Current Status! - Writing a novel - all draws for Fool of Fire up!
Requests updated March 17

Lypiphera

Ah thank you!

Yes, I do remember it was very famous - something a lot of college pupils are taught - but I couldn't for the life of me remember what it was called. It's a really fascinating view into human behaviour, so thanks so much for the Wikipedia article; I can go have a good old read now!

DarklingAlice

The interesting thing about this is that it seems to fly in the face of the Milgram experiment.

People are being told things by scientists contrary to their views.

This leads them to reject that scientist.

Moreover this leads them to lose confidence in science itself.

So, unless people are all just inherently sadistic and like shocking people (a possibility I won't discount), this is, in essence, the exact opposite of Milgram.
For every complex problem there is a solution that is simple, elegant, and wrong.


Lypiphera

Ah, I see; I misread the quotes. I thought they were saying that the subjects changed their views when shown scientific evidence, when actually they were rejecting it...

That really is interesting. I wonder if they could set up something that challenges Milgram using this method; it would be fascinating to see how people responded to authority challenging their beliefs, and which way they would go...

Will

I do think there is a small difference between this and Milgram.  If there's a conclusion to be drawn by comparing the two studies, I would say it's that we would sooner shock an innocent person than give up our opinions.
If you can heal the symptoms, but not affect the cause
It's like trying to heal a gunshot wound with gauze

One day, I will find the right words, and they will be simple.
- Jack Kerouac

Lypiphera

Very possibly :) But still it would be an interesting study to know for sure!

dominomask

Quote from: DarklingAlice on August 17, 2010, 12:23:41 PM
how do these ideas lead us into the situation explored in the second study? Why does our confirmation bias on a certain isolated matter increase the tendency to reject the entire field?

Well, I can't think of a way to test it, but I would hypothesize that it's similar to musculoskeletal problems.  An abnormal inflexibility in one joint can cause problems in joints further up or down the kinetic chain.  Since the abnormal joint can't pass through a normal range of motion, other joints must move beyond their normal range of motion to compensate and create normal movement.  Hyperextension and hyperflexion over time create "mysterious" new chronic injuries in the joints surrounding the injured one.  (PSA: so don't put off getting that stiff ankle treated... it's not "nothing". :)

So, by way of analogy: when evidence works to ossify into fact something a person has always considered debatable, the stress gets passed up the logical chain, and the person's faith in science begins to move in abnormal ways.

Neroon

Generally, I would say that Milgram has very little to say about people's faith in science and scientists, and rather more to say about people's willingness to follow an authority figure when placed in unfamiliar surroundings.

When you take someone off the street and put them into a laboratory, you are taking them out of their normal environment.  As a result, their thinking is unlikely to be grounded in its normal real-world context.  The tendency in most people at this point is to revert to an insecure, semi-childish mindset in which one will more readily accept what one is told by an authority figure.  As the established stereotype is that in hospitals and laboratories "the man in the white coat knows best", most people participating in a Milgram-style experiment will not critically evaluate the authority figure's assurance that the supposed subject of the experiment is not being hurt, despite the evidence of their ears that the subject is being hurt very much. Reason says, "I am hurting someone," while insecurity says, "Trust authority; it's OK."  For a lot of people, the point where reason overcomes the insecurity is frighteningly distant.

I don't think our opinions are so fixed that we will hold to them in all cases.  Having thought about this for the past couple of weeks, and looking back over the ways I've seen myself and my friends behave, I would say that our opinions are much more affected by emotion than by reason.  Anger seems to strengthen our hold on our opinions, while anxiety and arousal can weaken them.  I've seen many people profess opinions they had previously dismissed and change their behaviour in the hopes of getting laid, with the changes sticking long after the sexual situation is over.
Timeo Danaos et dona ferentes

My yeas and nays     Grovelling Apologies     Wiki
Often confused for some guy