
The Psychology of Egotistical Belief & Overconfidence


exo_doc

Foolish Earthling
Here are some excerpts from an article in the April 2013 issue of ID Magazine (ID stands for Ideas and Discoveries) entitled "Overconfidence-Ego Trips of the Brain".

I think the following statements have a direct bearing on how thinking works for uber-skeptics and uber-true believers alike:

1. Overconfidence blocks the ability to look critically at our own mistakes.
2. When mistakes happen, the brain tries to "explain them away" in order to hold on to the established view.
3. For example, Kevin Dunbar, professor of neuroscience at the University of Toronto, demonstrated this world-view-guarding reaction using the example of Stanford University biochemists.
Professor Dunbar accompanied them to work and regularly witnessed falsified perceptions generated by overconfidence: when an experiment produced unexpected results, the researchers were quick to blame measurement errors. If the experiment still didn't work, the results were ultimately ignored completely.

So, does this mean that both extremes of belief and non-belief in just about anything result from egotistical overconfidence (at least to some major degree)?
Are we ALL guilty of latching on to whatever seems to confirm our world view and discarding or ignoring whatever seems to disagree with it?

More on this: On Overconfidence § SEEDMAGAZINE.COM
 
It makes sense. If a researcher has been working in the field for a long time and is running a routine check (not everything labeled an "experiment" is a test of something truly novel), and suddenly one experiment fails to replicate something that has been cataloged in great detail, it's reasonable to blame the measuring devices or methods used. "Ignoring them" sounds like a flawed assumption based on a small window of time; more likely the researchers ticketed the item as an anomaly to debug and troubleshoot later. No one likes inconsistencies. Watch the film of the JPL technicians testing the Mars lander parachute and you will see what really happens when something is inconsistent: engineers hate inconsistencies, and they will hunt them down to the ends of the earth.
 

This behavior can and does go beyond simply ignoring the evidence to the contrary. It extends to irrational personal attacks, and let me emphasize the word irrational here. Not all personal attacks are irrational; some are warranted. But that can be a tricky business, and one can find oneself on the other end of the barrel if one isn't careful. Online, the attacks take the form of insults or slights on character. A common one is for the person in denial to claim that whoever has provided the contrary evidence is in some way deficient.

The alleged deficiencies usually revolve around a supposed lack of faith or intelligence on the part of the person who has presented the evidence. Sometimes the attacks also include elements of irrational mockery and ridicule. Again, I stress the word irrational. Some mockery and ridicule is legitimately funny, but these people don't know the difference, and they don't care if it hurts. In fact, they want it to hurt. But perhaps the most perfectly ironic example is those who turn the whole thing around and accuse the person who has presented a valid counterpoint of being the one who is egotistical, arrogant, and pompous.

Having said all that, although I've been the target of many such attacks, I'm not perfect either. Though I do my best to be balanced and not fling back anything that isn't deserved, sometimes I've gone too far in pressing a point, thinking I was justified simply because I knew I was correct. For what it's worth, I pay for that. I don't like hurting people, and if I unintentionally cause someone pain, I feel that pain too. I always wish there were a way to take it back without losing the ground gained in what I perceive to be a noble quest toward truth.

As for being proven wrong, I look forward to it. It's how progress is made. I don't believe I've ever been in a state of denial when presented with solid facts, and I've eaten my fair share of crow. That's one of the main reasons I try so hard to avoid being wrong. I guess the real irony is that being right or wrong doesn't always determine how we're going to feel about the outcome. Is it truly a victory if we do everything right, or succeed in illuminating some truth, but feel hollow or sad as a result? Is that suffering some kind of cross those who engage in this quest must bear? Should we bear it with honor or disgrace? Who decides? I think a most important point in the article you mentioned is:

The article said:
"Political and economic overconfidence are therefore all the more important because they are more likely to be misplaced and yet also to have implications for millions. We may not be able to eliminate this bias in our decision-making, but it is crucial that we understand it and reset our institutions accordingly if we are to shake our long record of self-imposed disasters."

And to close ... One more from PBS ...

PearlsBeforeSwine-2009_01_14.gif
 