Friday, August 5, 2022

Psychological Barriers

 


Are we always aware of our motives?


There are certain ways our brains are wired. 

Hyperbolic discounting: The present feels more important than the future; we favor short-term gains over long-term survival.
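Behavioral economists often formalize this with Mazur's hyperbolic model, V = A / (1 + kD): a reward of amount A delayed by time D is discounted at rate k. A minimal Python sketch of the classic preference reversal follows; the dollar amounts, delays, and the k value are illustrative assumptions, not figures from this talk.

    # Mazur's hyperbolic discounting: V = A / (1 + k*D).
    # Amounts, delays, and k below are illustrative assumptions.

    def present_value(amount: float, delay_days: float, k: float = 0.1) -> float:
        """Subjective value today of a reward received after delay_days."""
        return amount / (1 + k * delay_days)

    # Choice today: $50 tomorrow vs. $100 in 30 days.
    print(present_value(50, 1))     # ~45.5 -> take the quick $50
    print(present_value(100, 30))   # 25.0

    # Same choice pushed a year out: $50 in 366 days vs. $100 in 395 days.
    print(present_value(50, 366))   # ~1.3
    print(present_value(100, 395))  # ~2.5 -> now it's worth waiting for the $100

The gap between the two rewards never changes; only their distance from "now" does. Up close, the sooner reward dominates; far away, the larger one does.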

Bystander Effect: Someone else will deal with the problem. 'Someone will call 911' vs. 'I will call 911' - assuming someone else has control of the situation.

The Bystander Effect (Examples + Experiments) - YouTube
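A toy probability sketch (an illustration, not a model from the talk) shows how 'someone else has it handled' backfires. Suppose a lone witness would call 911 with probability 0.8, but in a crowd of n people each person feels only 1/n of that responsibility:

    # Toy model of diffusion of responsibility (assumed numbers, for illustration).

    def chance_anyone_calls(n: int, solo_prob: float = 0.8) -> float:
        """Probability that at least one of n bystanders calls 911,
        if each acts independently with probability solo_prob / n."""
        per_person = solo_prob / n
        return 1 - (1 - per_person) ** n

    for n in (1, 2, 5, 20):
        print(f"{n:>2} bystanders: {chance_anyone_calls(n):.0%} chance someone calls")
    # 1 bystander: 80%; 2: 64%; 5: 58%; 20: 56% - more witnesses, less help.

Under these assumptions, adding witnesses makes it less likely that anyone acts at all, which is the counterintuitive core of the effect.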

Sunk cost fallacy: An organization invests time, money, and energy into something and doesn't want to let it go.

Normalcy bias: Life is going to continue as we currently know it; tomorrow will be the same as today and yesterday. People hear about emergencies happening in other places, but they don't expect one to happen to them - they expect business as usual forever. It causes people to seize up during emergencies, or it prevents them from physically and mentally preparing for emergencies in the first place.

There are many studied examples of this, for instance in evacuation efforts during wildfires, earthquakes, and hurricanes. One estimate suggests that about 70% of people show normalcy bias in a disaster. When Mount Vesuvius erupted over Pompeii, many people stood and watched. During the 9/11 attacks, one study suggested that it took people an average of 6 minutes to react after feeling the planes hit: people stood around, talked, and discussed what was going on. Even after being told to evacuate because there had been a plane crash, many people refused or sought out other sources of information to confirm what they had heard. People interviewed since the attack reported being told everything was fine, there was no need to panic, they could slowly walk down the stairs, and it would just be a minor inconvenience in their day.

A 2001 study suggested that when people are asked to leave in anticipation of a disaster, most check with 4 or more sources of information before deciding what to do - even when they are told the disaster is imminent and they are in immediate danger.

It has been used to describe why so many Jews refused to leave Germany and Austria under Nazi rule until it was too late and they no longer had the option. It can also be used to describe why so many Ukrainians didn't heed the warnings about an impending Russian attack. Normalcy bias can be further complicated by ideas like 'The Boy Who Cried Wolf': Ukraine had seen this happen before - Russia building up on the border and nothing coming of it - and that further reinforces normalcy bias. The idea is that bad things don't happen to me. Yet there are cycles throughout history.

Another example: the 1977 runway collision at Tenerife, in Spain's Canary Islands. There was heavy fog. One plane began its takeoff without clearance and crashed into another plane that was still taxiing on the runway. Everyone on the plane taking off died instantly in the impact, but on the taxiing plane only one section was severely damaged; the rest of the plane was intact. However, a fire broke out. Roughly half of the people unbuckled their seat belts, stood up, and walked off the plane. The other half stayed in their seats and perished in the fire. Survivors who were interviewed said that the people around them just sat in shocked silence as everything unfolded. All they had to do was unbuckle, get up, and walk off the plane. But normalcy bias took over - the idea that this isn't really happening, it's not happening to me, or that someone else will take care of this, someone will come grab me and tell me what to do.



Confirmation Bias: The tendency to process information by looking for, or interpreting, information that is consistent with one's existing beliefs. There are two paths for this: assimilation and accommodation.

Assimilation: The process of using or transforming the environment so it can be placed into pre-existing cognitive structures. For example: you're driving down the road with a three-year-old, and the child points at a cow and says 'doggie'. You correct the child by pointing at the cow and saying 'cow'. The child again says 'doggie'. The idea is that we don't change our perspective or point of view to accommodate new information; we just adapt the information to fit our existing frame of mind.

Accommodation: Changing cognitive structures in order to accept something from the environment. People form opinions, and once those opinions are established, they have a difficult time processing information in a rational, unbiased way. In other words, I'm really only going to hear or accept information that aligns with the opinions I already have.

Let's look at something in the news....

Why is that?

A: It's efficient. We need to be able to process information quickly to protect ourselves from harm. If we had to put a lot of energy into evaluating every bit of information that came our way, it wouldn't be efficient; instead, it's easier to immediately discount information that doesn't line up with what we believe, or to accept it when it does. It also protects our self-image. We like to believe that we are intelligent and well informed, so it's a blow to our self-esteem when we hear something that proves we are wrong.

Echo chambers: Putting ourselves in places where our own beliefs are echoed back to us. A huge reason for polarization, and a reason we view some people as 'the others'.


What if you considered all tall people smart? Or what about doctors and patients?

Hyper-Normalization: “HyperNormalisation” is a word that was coined by a brilliant Russian historian who was writing about what it was like to live in the last years of the Soviet Union. What he said, which I thought was absolutely fascinating, was that in the 80s everyone from the top to the bottom of Soviet society knew that it wasn’t working, knew that it was corrupt, knew that the bosses were looting the system, knew that the politicians had no alternative vision. And they knew that the bosses knew that they knew that. Everyone knew it was fake, but because no one had any alternative vision for a different kind of society, they just accepted this sense of total fakeness as normal. And this historian, Alexei Yurchak, coined the phrase “HyperNormalisation” to describe that feeling.





Well-Informed Futility Syndrome: The more we learn about a large, overwhelming problem, the more futile any individual action feels - basically, the feeling that nothing can be done.


Parable of the starfish washed up on the shore...



Frog in boiling water...

Emergent properties: Gold, water, pile of sand....


One more: Survivorship bias....


The WWII airplane engineers and the bullet holes...

Cats survive higher falls....

They don't make 'em like they used to....


FOOTNOTES:

"Give it to Me Now! The Power of Hyperbolic Discounting" (disruptiveadvertising.com)

"Bystander Effect" (Psychology Today)

"Well-Informed Futility Syndrome" (Carrie Brown Reilly, cbreilly.com)

"Shades of Green: Well-Informed Futility Syndrome" (The Common Sense Canadian)

"Confirmation bias: Definition, Background, History, & Facts" (Britannica)