Feature

Harvard University philosopher Joshua Greene, PhD, is working to resolve a long-standing philosophical debate: Do people ultimately base their mature moral judgments on their passions, as sentimentalists such as the 18th-century philosopher David Hume argued, or on reason and logic, as rationalists such as Immanuel Kant believed?

To find out, Greene decided to tap a reliable research tool, functional magnetic resonance imaging (fMRI), to look at people’s brains as they made decisions about some classic moral dilemmas.

His work showed that both arguments are correct to some degree: In most situations, people appear to use logic, but there are times when emotions seem to override logic.

Greene is among a growing faction of philosophers who are taking their field back to its empirical roots. To understand human morality, they say, philosophy can no longer ignore the findings coming out of psychology, such as research on automatic processing and cognitive dissonance, which suggest that we humans may have less control over our actions and thoughts than we think. Members of this new generation of philosophers are getting out of their proverbial armchairs, reading the scientific literature and testing the plausibility of their theories with the help of experimental psychology, as well as other areas of modern cognitive science.

For many, that means collaborating with psychologists, neuroscientists, anthropologists and economists who can help them design theoretically interesting, as well as methodologically sound, studies.

Neuroimages of our morality

In his fMRI studies, Greene used, among other moral dilemmas, the classic philosophical paradigm known as the “trolley problem.” In it, he asks people to make decisions about two scenarios that have the same consequence: saving the lives of five people at the cost of one. In the first scenario, a runaway trolley is hurtling down a track, threatening to kill five people in its path. If you flip a switch to divert the trolley, you will save them but kill someone working on the other track. In the second scenario, a trolley is headed for five people and you are standing next to a large man on a footbridge above the tracks. You can save the people below by pushing this man off the footbridge into the trolley’s path.

Most people believe it is morally permissible to flip the switch but not to push the man off the bridge. Greene’s imaging data explain this seeming paradox: When people think through both dilemmas, they take a rational, utilitarian approach that is rooted in the brain’s dorsolateral prefrontal cortex. But when people think about the footbridge dilemma, they also engage a neural system associated with emotional processing, which provides a strong, negative emotional response.

Greene thinks we probably have an automatic, emotional anti-violence response that’s related to physically and directly doing harm. “Turning the trolley onto somebody kills them but doesn’t feel as violent,” says Greene, who works out of Harvard’s psychology department. “What really seems to matter is personal force.”

He believes that his findings support both Kant and Hume: When making a moral judgment, the brain activates logical reasoning, but when the judgment involves personal force, emotions kick in and influence decision-making.

In his work, Yale University philosopher Joshua Knobe, PhD, is testing the role moral judgments play in how people evaluate others’ intentions. Traditional philosophical theories suggest that people use rational thought to determine why others act as they do, so moral judgments should be irrelevant to such evaluations. In a series of behavioral studies, Knobe showed this to be wrong. In one study, participants answered questions about a company CEO who is approached by an employee with an idea. First, the employee tells him a project “would maximize profits, but also harm the environment.” The CEO says he doesn’t care about harming the environment; he just wants to maximize profits. Then, the employee tells the CEO that a project “would maximize profits, but also help the environment,” and the CEO says he doesn’t care about helping the environment; he just wants to maximize profits. In both cases, the CEO approves the project and it is implemented.

In the first scenario, more than 80 percent of people said the CEO intentionally harmed the environment. In the second, virtually identical scenario, most participants said the CEO did not intentionally help the environment. These results indicate that our moral judgments about the consequences of an action influence whether we think the action was intentional.

“Traditional philosophy would not have predicted this,” says Knobe. “This and other studies like it show us that there’s something very different going on in our understanding of the world.”

Further implications

Indeed, psychology has shown over the past 20 years just how much our beliefs and behaviors can be influenced by unconscious thoughts and feelings, and questions about moral behavior need to take that into account, says University of Virginia psychologist Jonathan Haidt, PhD. His work on moral judgments, often in collaboration with philosophers, has inspired and informed the experimental philosophy movement.

One series of experiments shows that people become more judgmental when they feel disgusted. To trigger disgust, Haidt uses priming — exposing participants to words or foul smells associated with disgust. Then he asks participants to judge people they hear about in stories. One story tells of a couple who happen to be distant cousins. The two meet and become romantically involved. Is it incest? Most people say no, but study participants primed to feel disgust are more likely to say yes.

In another study, Haidt and his colleagues used an innocuous story about “Dan,” the student council president, who arranges discussions between students and faculty. Participants not primed to feel disgust see nothing wrong with Dan’s actions. But one-third of participants primed to feel disgust say something is amiss with Dan. When asked to explain, they give statements like, “I think he’s up to something,” and “He’s a popularity-seeking snob.”

“These findings have enormous implications for our political life,” says Haidt. “People found a way to justify moral condemnation that was based on nothing more than an implanted feeling of disgust.”

Such a finding supports the argument Hume made more than 200 years ago that reasoning follows, rather than precedes, our rapid and automatic judgments, says Haidt. That’s the beauty of experiments like the ones he, Knobe and Greene are conducting: They provide a scientific foundation for many philosophical debates.

“Philosophy lost its empirical bent sometime in the early part of the last century,” says Greene. “We’re trying to bring back that true empirical spirit by making discoveries that tie together philosophical questions with what we know about the human mind and how it influences human behavior.”


Beth Azar is a writer in Portland, Ore.