Lesson 3.6

Understanding Moral Foundations

Watch the following video, then proceed to the Key Concepts below:

 

Key Concepts:

The six underlying moral foundations shared by most human beings

 

Research on Moral Foundations explains in large part why good people can disagree so viciously on things. It’s because most people think they’re good people deep down—and we attach our points of view to “good things.” It’s just that some people tend to focus on certain “good things” more than others.

Moral Foundations, in other words, explains why I hear my good-hearted politically conservative, Mormon, and Protestant friends back home in Idaho say the same thing that my good-hearted liberal, atheist, and agnostic friends in New York say about them: “I can’t believe someone could believe in that!”

The good news is that research says that we can develop respect for differing viewpoints if we make the effort to unearth their moral motivations.

Few people actually think of themselves as evil. So, unless you’ve got something wrong in your brain (e.g. you’re a malignant narcissist or a psychopath), you will tend to justify your decisions to help you feel like a “good” person. Under the surface, you’ll create good reasons for what you think.

Moral Foundations theory says humans share at least five innate moral foundations that serve as the universal building blocks of morality. For the most part, evolutionary psychologists and spiritual belief systems agree on these. They are:

  • Care - Being kind and preventing harm.

  • Fairness - Justice, and not cheating people.

  • Loyalty - Patriotism and self-sacrifice for the group, not betraying the group.

  • Authority - Deference to legitimate authority for the good of the group.

  • Sanctity - Striving to be noble, clean, and not contaminated.

And some emerging schools of thought believe there may be one more foundation inherent to all (or at least most) humans:

  • Liberty - Rights, freedom, rejection of constraints and of oppression.

Studies show that we give our in-group the benefit of the doubt because we think “they’re good people.” We understand their underlying morals.

But we don’t afford our out-groups the same benefit of the doubt. They might be “bad people,” so we justify not respecting them. (You can see this any time someone calls someone else a “liberal” or “right-wing” in a derisive way and implies through the label that the person is evil and therefore what they say is suspect.)

So, when we’re dealing with others, it pays to step back and identify the underlying morals they are operating from. Once we can isolate the moral values driving someone to think what they think, we can more easily respect them even if we disagree.

As a hypothetical example:

Let’s say that Bob and Mary disagree on a charged topic—like, what to do about immigration to the US.

Now, Bob might make some common anti-immigration arguments about crime and economic impacts. He may say that people sneaking into the US burden the system, and that breaking the law to get here is wrong.

Mary might make a competing argument, saying it’s wrong to prevent people from living where they want to live. She may say that our immigration laws are unnecessarily cruel. She may point out that her best friend, her husband, and her ex-roommate are all immigrants, and that they make her life and the country better.

Underneath, what each of them is really doing is using post-hoc justifications to back up the moral intuition they value most. And so as the argument continues, they’ll trot out statistics or stories that confirm their feelings. They’ll tune out inconvenient evidence that calls their particular stance into question. Bob might dismiss the fact that Mary’s Brazilian best friend pays lots of taxes and her Guatemalan husband makes everyone around her a better person with an anecdote about a foreign gang member killing someone in his hometown.

They may not even realize it, but they’re not respecting or considering each other’s arguments while they’re so busy defending their own. This conversation probably won’t go anywhere, and it’s likely to leave both of them disliking each other.

But say we asked Bob and Mary to dig out the moral motivations behind their immigration stances. We might end up unearthing this:

Bob values Fairness and Authority above all else. So he thinks it’s not Fair that some people can break the law and get away with it (entering the country illegally). And he thinks it’s not good to disrespect the Authority of a country by breaking its laws. Finally, Bob might be worried about the Sanctity of the country: letting anyone in means we might let some bad guys in, too.

Once we unearth this, Mary can acknowledge that Bob’s motivations are good, even if Mary disagrees with his conclusions. After all, Mary believes in Fairness and (righteous) Authority too. Even though those aren’t her primary moral foundations, she understands that Bob is coming from a place of trying to do the right thing.

In contrast, Mary can help Bob see how she values Care and kindness. If he’s listening, he’ll agree that that’s a good thing, too. Mary can explain how she thinks we should treat people like they’re valuable no matter where they were born. This explains why she thinks restricting immigration the way we do is unkind. And Bob might be surprised to discover that Mary also values Fairness. The way she sees Fairness in the case of immigration is that it’s not fair to tell one human they can live here and another they can’t. We don’t choose where we were born, and she thinks it’s unfair to restrict someone for that.

So Bob and Mary have determined that they both value Fairness, but just apply it in different ways.

Once they unearth these moral foundations, even if Bob and Mary still don’t agree on a conclusion, they have earned respect for each other’s viewpoints. Mary sees Bob as a good person. He has good moral motivations behind his arguments. And he sees the same in Mary.

This means they might be able to have a more productive conversation about what to do. They might just be able to exercise some intellectual humility and get somewhere together—to come up with some new ideas for how to make a more Fair and Kind system.

As Dr. Jonathan Haidt, the founding researcher on Moral Foundations, summed it up in his TED talk, “A lot of the problems we have to solve are problems that require us to change other people. And if you want to change other people, a much better way to do it is to first understand who we are—understand our moral psychology, understand that we all think we’re right—and then step out, even if it’s just for a moment, step out of the moral matrix, just try to see it as a struggle playing out, in which everybody does think they’re right, and everybody, at least, has some reasons—even if you disagree with them—everybody has some reasons for what they’re doing.”

 

Let’s Practice: