Evidence keeps pouring in that humans have an in-built sense of morality or fairness and that specific regions of the brain are responsible. Over at Neurophilosophy, Mo reports on two new studies — the first involving the use of magnets to impair people’s moral intuitions, and the second involving people with brain damage that impairs moral judgments.
Both studies are receiving a fair amount of attention. This is unsurprising, given that the standard thinking about morality is that people are good or evil and that this is largely a matter of choice. Another default assumption is that religious people are more moral than non-religious people.
In his article “Magnets Can Manipulate Morality,” Eric Bland reports:
Using a powerful magnetic field, scientists from MIT, Harvard University and Beth Israel Deaconess Medical Center are able to scramble the moral center of the brain, making it more difficult for people to separate innocent intentions from harmful outcomes. The research could have big implications for not only neuroscientists, but also for judges and juries.
“It’s one thing to ‘know’ that we’ll find morality in the brain,” said Liane Young, a scientist at MIT and co-author of the article. “It’s another to ‘knock out’ that brain area and change people’s moral judgments.”
The area of the brain that was bombarded (and confused) by magnetism is the right temporoparietal junction, a region that also allows us to attribute mental states to others (i.e., “theory of mind”). Although Bland’s article emphasizes the impact these findings could have on notions of intent and guilt in the legal system, I think the findings have equal relevance to religion. Most religionists assert that morality emanates from the divine.
Ever since the famous and gruesome case of Phineas Gage (who had a tamping rod blown upwards through his left cheek and out of his skull), we have known that damage to specific areas of the frontal lobes can dramatically alter behavior and impair moral judgment. Andy Coghlan at New Scientist reports on a new study that confirms these findings:
To probe emotion’s role in moral decision-making, Liane Young and her colleagues at the Massachusetts Institute of Technology turned to nine people whose emotional responses were impaired due to damage in the ventromedial prefrontal cortex.
Young presented these people with 24 moral dilemmas, each consisting of four different scenarios of varying acceptability. In one, for example, someone kills another by mistakenly adding poison to their coffee instead of sugar. In another scenario, a person tries but fails to kill another by deliberately poisoning their coffee. Participants ranked the moral acceptability of each scenario on a scale of 1 to 7.
The volunteers with brain damage gave failed attempts at intentional harm an average rating of 5, judging them twice as permissible as the other volunteers did, who averaged 2.5. The impaired group also rated accidental harm as less morally acceptable than failed attempts at deliberate harm.
In the end, these studies show that a substantial part of moral decision-making originates in the brain and that humans have evolved cognitive mechanisms for making moral judgments. Although religious training can pattern these judgments, religion is not the source of the judgments.
These findings also contradict the idea that religion is an evolved adaptation which facilitated prosocial or “moral” behaviors and thus provided some groups with a selective advantage over others.