By Martha Henriques
We like to think we’d do the right thing in a tough situation. We’d stand up to our boss when necessary, step in if we saw someone being bullied, and say no if we were asked to do something we felt was wrong. It’s tempting to think we have an innate moral compass that guides our actions, even under pressure from others.
In reality, however, most of us are remarkably bad at standing up to authority. New research is revealing why this is, giving us insight into how the brain deals with – or fails to deal with – these difficult situations. Ultimately, the research could show us how we can train ourselves to become stronger-minded and better able to stick to our guns when needed.
In experiments by social neuroscientist Emilie Caspar at the Netherlands Institute for Neuroscience, volunteers gave each other electric shocks. (The research follows in the footsteps of the notorious experiments of Stanley Milgram in the 1960s, but in a more ethically and scientifically rigorous way.)
First, participants were asked to administer shocks for a small sum of money (about 5p each time). Each participant was given 60 chances to shock their partner, and about half of the time they chose not to. Around 5-10% of people chose not to shock their partner on any of the 60 opportunities.
Then Caspar stood over the participants and ordered the person giving the shocks to do it. Now, even the participants who didn’t give any shocks previously started to press the button.
I’ve tested more than 450 participants, and so far only three refused to follow the orders – Emilie Caspar
As soon as Caspar gave orders, the participants' brain activity also changed, electroencephalogram (EEG) scans showed. In particular, the brain became less able to process the consequences of the participants' actions. For the vast majority of volunteers, their sense of agency and responsibility started to melt away.
“I’ve tested more than 450 participants, and so far only three refused to follow the orders,” says Caspar. “How are these people different from the others?”
Studies on patients with localised brain damage are helping to answer part of this question. When people have lesions in the prefrontal cortex – the outermost layer of the front part of the brain – they appear to be much more prone to following orders than the general population.
“They really very readily listen to authorities, and are less able to doubt them,” says Erik Asp, an assistant professor of psychology at Hamline University’s College of Liberal Arts in the US. “That means if an authority figure tells you to hurt someone else, you’re more likely to.”
What is it about this part of the brain that helps us stand up to authority?
The question gets into philosophical topics like the nature – and neurological basis – of belief. While there is no clear scientific consensus, the Spinozan model is a strong contender. It suggests that in order to understand a new idea or fact, our brain must, for a split-second, believe it completely.
“The act of understanding is the act of believing. Whatever those processes are, they are the same,” says Asp.
The act of understanding is the act of believing – Erik Asp
After a split second, you then can doubt or reject this new piece of information. “You can use a separate neuropsychological process to come back and disbelieve that mental representation,” says Asp. “In other words, you come back and doubt it.”
For prefrontal cortex patients, it’s this second part of the process that is impaired, Asp argues. So instead of quite literally thinking twice about what an authority figure says, prefrontal cortex patients are more likely to take what they hear as given.
If the prefrontal cortex is the seat of our ability to doubt and question authority, there may be a way for healthy people to strengthen this ability. The prefrontal cortex has some plasticity. "I think it is modifiable," Asp says. "It's not built in to the brain function that you have – it's not set in stone."
Education is one of the best ways to improve your ability to doubt, says Asp, and therefore your ability to think critically about things you might be told to do.
There is another deciding factor that influences how you behave, too.
When an authority figure asks us to do something, we usually do it because we’re led to believe in the cause behind their request, says Megan Birney, a psychologist and senior lecturer at the University of Chester at University Centre Shrewsbury.
In one experiment, Birney and her colleagues measured how many people dropped out of an experiment where they were told to do something morally objectionable. The participants had to attach negative terms to groups of people in photos. The pictures started off with groups it was easy to dislike, such as Nazis or the Ku Klux Klan. Gradually, the pictures were of more neutral groups and eventually of families or groups of young children.
For those who continued with the experiment, it was a belief that they were contributing to something important that drove them to push through
Assigning negative term after term to harmless groups of people was intended to be emotionally wearing and to make most participants feel uncomfortable. Plenty dropped out as it got more intense. For those who carried on, it was a belief that they were contributing to something important – a rigorous scientific study – that drove them to push through.
“Some of the ones who dropped out would take it upon themselves to get in touch with me and apologise profusely,” says Birney. The participants would tell Birney things like, “I’m so sorry, I hope I haven’t messed up your work, I just felt so uncomfortable, I hope you understand.” In many cases, a strong sense of guilt came with not cooperating.
“When you’re in a conflicting situation, you have competing voices. One is telling you yes, one telling you no. It’s whichever you identify more with, whichever you think is right – that’s what you’ll go with,” says Birney.
This becomes dangerous when people so strongly identify with a cause that they will follow wherever it leads. “You might think, ‘I’ve done so much and I’m doing something that matters’. How far will you take that?”, she says.
Logically, you might expect there to be a tipping point where you realise what you're doing is a terrible idea. But if we believe strongly that we are doing something valuable – that the end justifies the means – this point becomes clouded, or may never appear.
Being able to stand up to authority doesn’t hinge on bravery or courage, confidence or stubbornness. The brain processes and regions essential for rejecting ideas from authority figures are starting to be revealed. And just how invested we are in a cause may prove to be the all-important factor in determining where we are able to draw a line in the sand.
Given this complexity, finding ways to train yourself to get better at resisting authority might seem incredibly challenging. As yet, there is no specific, evidence-based programme of training we can follow to make ourselves do better in these difficult situations. But such a training programme is exactly Caspar's "scientific dream".
“My aim is to make people able to resist,” says Caspar. “Even in the military, soldiers have the legal duty to follow orders, but also to refuse to follow illegal or immoral orders. It’s about how to make people think more about their own responsibility, even if they don’t feel responsible because they’re following orders.
“We need to find out how to train people to be able to do that so that they can feel more responsible in those situations.”