Hearts and Minds: How We Think About Moral Dilemmas
Consider the following scenario:
You are on a bridge above a runaway train quickly approaching a fork in the tracks. On the tracks extending to the left is a group of five railway workmen. Standing near you on the bridge is a single stranger. The train is on the point of proceeding to the left, causing the deaths of the five workmen. The only way to avoid the deaths of these workmen is for you to push the stranger off the bridge, which will cause the train to come to a stop, leading to the death of the stranger.
These types of moral dilemmas are the subject of research asking this fundamental psychological question: Are people rational or emotional when making moral judgements?
Some researchers suggest that people have intuitive, emotional reactions to moral issues and provide after-the-fact justifications for their intuitions (Haidt, 2001). For example, a person will intuitively choose not to push the man, feeling disgust at the thought, and subsequently develop a rational justification for this response. Others argue that these intuitive responses can be suppressed through reasoned deliberation (Pizarro, Uhlmann, & Bloom, 2003). For example, a person may override their decision not to push the man when instructed to give a rational response (Pizarro et al., 2003).
A recent study by Eoin Gubbins and Ruth M. J. Byrne (2014) has shown that while certain moral dilemmas lead people to favor either rational or emotional justifications, people can suppress these tendencies and access either process when primed to do so.
To test this theory, participants were given moral dilemmas classified as personal or impersonal. Personal dilemmas involved direct physical contact, or face-to-face interaction, with another person, as in the moral dilemma posed above. Impersonal dilemmas involved indirect contact with another person, such as the following:
You are at the wheel of a runaway train quickly approaching a fork in the tracks. On the tracks extending to the left is a group of five railway workmen. On the tracks extending to the right is a single railway workman. The train is on the point of proceeding to the left, causing the deaths of the five workmen. The only way to avoid the deaths of these workmen is for you to hit a switch on your dashboard that will cause the train to proceed to the right, leading to the death of the single workman.
In the first experiment, participants read a description of the dilemma, and were then asked to complete a sentence in which they had to justify their choice to a friend. In an emotion-primed condition, the sentence was: “I knew I had to make a decision fast. This is what I experienced in those seconds, the feelings and emotions I had…” In a reason-primed condition, the sentence was: “I knew I had to make a decision fast. This is what I experienced in those seconds, the thoughts and reasons I had…” In an unprimed condition, the sentence was: “I knew I had to make a decision fast. This is what I experienced in those seconds…”
Each participant’s justification was then classified as emotive (referring directly or indirectly to the emotions of the participant or the emotions of others) or non-emotive. Emotive responses might include something like, “The choice was horrible” or “The families of the workers would be devastated.” Non-emotive justifications made no reference to emotions, such as “Better one die than five.”
Those in the unprimed condition gave more emotive justifications for personal dilemmas. However, those who were emotion-primed overcame this tendency, and were equally likely to give emotive justifications for both impersonal and personal dilemmas. Similarly, those who were reason-primed suppressed the tendency to give emotive justifications for the personal dilemma, and even gave fewer emotive justifications for personal dilemmas than for impersonal dilemmas. These results suggest that, despite the tendency to use either emotion or reason to evaluate certain moral questions, both reason and emotion are accessible to us. Therefore, emotion can be used to supplement our reason in moral dilemmas, and vice versa.
In the second experiment, participants read either personal or impersonal moral dilemmas on a computer screen written from the perspective of a protagonist who must choose to act or not. After reading the dilemma, participants read a sentence describing the emotions felt by the protagonist followed by a sentence stating the protagonist’s decision. Participants were timed on how quickly they read the two sentences.
Theory suggests that if a sentence fits a participant’s mental representations, or is congruent with their expectations, they will read it faster. Participants read emotion sentences faster for personal dilemmas than for impersonal dilemmas, suggesting that they were primed to expect the protagonist to experience emotions in personal dilemmas.
These findings show that we rely on both reason and emotion in our moral judgements. While there appears to be a tendency to favor one of these processes depending on the content and context of the moral dilemma, subtle primes can influence which of these processes we employ. Debates over moral issues can often reach an impasse when reasoned arguments come up against emotional arguments. By applying this research, we can seek to improve this dialogue by crafting more effective arguments that supplement reason with emotion, and vice versa.
Gubbins, E., & Byrne, R. M. J. (2014). Dual processes of emotion and reason in judgments about moral dilemmas. Thinking & Reasoning, 20(2), 245-268.
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108(4), 814-834.
Pizarro, D. A., Uhlmann, E., & Bloom, P. (2003). Causal deviance and the attribution of moral responsibility. Journal of Experimental Social Psychology, 39(6), 653-660.
Erin Skinner, N. (2017). Hearts and Minds: How We Think About Moral Dilemmas. Psych Central. Retrieved on April 23, 2018, from https://psychcentral.com/lib/hearts-and-minds-how-we-think-about-moral-dilemmas/