You and your fellow townspeople are hiding in a cellar from marauding soldiers. Your baby starts to cry, which would alert the soldiers to your presence. The only way to save yourself and the others is to smother your baby. What do you do?
Making such tough personal moral judgments involves neither abstract reasoning nor emotion alone, according to a new study by Joshua Greene and his colleagues. Their brain scans of people making such judgments revealed that the judgments activate brain regions involved in both reason and emotion.
The recruitment of both kinds of brain areas could well stem from our evolutionary history, they theorize: moral judgment is driven in part by social and emotional processing inherited from our primate ancestors, working in concert with the sophisticated abstract reasoning that evolved later in humans.
In previous studies, the researchers found that when people consider impersonal moral judgments--ones that do not directly involve them--they use brain areas associated with abstract reasoning. Example: You see a runaway trolley headed for five people. You can throw a switch to deflect the trolley onto a track where it would kill only one bystander. Should you throw the switch? (Most people say yes.)
In contrast, the researchers found, people asked to make personal moral judgments--ones that directly involve them--use brain areas associated with emotional and social cognition. Example: You are standing on a footbridge next to a very large person. You see the same trolley below you. If you push the person off the footbridge, you save the five people, but you personally kill the large person. Should you push the person off the footbridge? (Most people say no.)
A key question for the researchers was what happens in the brain when personal moral dilemmas are difficult, as in the "crying baby" scenario.
In their experiments, Greene and his colleagues presented volunteer subjects with a battery of both easy personal moral dilemmas, such as the "footbridge" dilemma, and difficult ones, such as the "crying baby" dilemma. The difficult dilemmas were designed to bring cognitive and emotional factors into more balanced tension, so the researchers could observe the interaction between cognitive and social/emotional processing.
As the subjects considered the dilemmas, the researchers scanned their brains using functional magnetic resonance imaging (fMRI). In this widely used technique, harmless magnetic fields and radio signals are used to measure blood flow in regions of the brain, with increased flow indicating higher brain activity.
The researchers discovered that people confronting difficult personal moral dilemmas showed increased activity in areas involved in abstract reasoning and cognitive control, compared with those confronting easy personal moral dilemmas. As in the earlier work, the personal moral dilemmas also elicited increased activity in social-emotional areas.
Importantly, the researchers found, the difficult personal moral dilemmas produced increased activity in a brain region called the anterior cingulate cortex, which is believed to be involved in processing conflict.
The researchers concluded that their results provide "strong support for the view that both 'cognitive' and emotional processes play crucial and sometimes mutually competitive roles" in moral judgment. They wrote that their results "suggest a synthetic view of moral judgment that acknowledges the crucial roles played by both emotion and 'cognition'."
Joshua D. Greene, Leigh E. Nystrom, Andrew D. Engell, John M. Darley, and Jonathan D. Cohen: "The Neural Bases of Cognitive Conflict and Control in Moral Judgment"
Source: Eurekalert & others
Last reviewed by John M. Grohol, Psy.D. on 21 Feb 2009
Published on PsychCentral.com. All rights reserved.