Lesson 3.6

Understanding Moral Foundations

Watch the following video, then proceed to the Key Concepts below:

 

Key Concepts:

The six underlying moral foundations shared by most human beings

 

Research on Moral Foundations explains in large part why good people can disagree so viciously on things. It’s because most people think they’re good people deep down—and we attach our points of view to “good things.” It’s just that some people tend to focus on certain “good things” more than others.

Moral Foundations, in other words, explains why I hear my good-hearted politically conservative, Mormon and Protestant friends back home in Idaho say the same thing that my good-hearted liberal, Atheist and Agnostic friends in New York say about them: “I can’t believe someone could believe in that!”

The good news is that research says that we can develop respect for differing viewpoints if we make the effort to unearth their moral motivations.

Few people actually think of themselves as evil. So, unless you’ve got something wrong in your brain (e.g. you’re a malignant narcissist or a psychopath), you will tend to justify your decisions to help you feel like a “good” person. Under the surface, you’ll create good reasons for what you think.

Moral Foundations Theory says humans share at least five innate moral foundations that serve as the universal building blocks of morality. For the most part, evolutionary psychologists and spiritual belief systems agree on these. They are:

  • Care. Being kind and preventing harm.

  • Fairness. Justice, and not cheating people.

  • Loyalty. Patriotism and self-sacrifice for the group, not betraying the group.

  • Authority. Deference to legitimate authority for the good of the group.

  • Sanctity. Striving to be noble, clean, and not contaminated.

And some emerging schools of thought believe there may be one more foundation inherent to all (or at least most) humans:

  • Liberty. Rights, freedom, and the rejection of constraints and oppression.

Studies show that we give our in-group lots of benefit of the doubt because we think “they’re good people.” We understand their underlying morals.

But we don’t afford our out-groups the same benefit of the doubt. They might be “bad people,” so we justify not respecting them. (You can see this any time someone calls someone else a “liberal” or “right-wing” in a derisive way and implies through the label that the person is evil and therefore what they say is suspect.)

So, when we’re dealing with others, it pays to step back and identify the underlying morals they are operating from. Once we can isolate the moral values driving someone to think what they think, we can more easily respect them even if we disagree.

As a hypothetical example:

Let’s say that my buddy back home and I disagree on a charged topic—like, what to do about immigration to the US.

Now, my buddy might make some common anti-immigration arguments about crime and economic impacts. He may say that people sneaking into the US burden the system, and that breaking the law to get here is wrong.

I might make a pro-immigration argument, saying it’s wrong to prevent people from living where they want to live. I may say that our immigration laws are unnecessarily cruel. I may point out that my best friend, my wife, and my ex-roommate are all immigrants, and that they make my life and this country better.

Underneath, what each of us is really doing is using post-hoc justifications to back up a moral intuition that we value most. And so as the argument continues, we’ll trot out statistics or stories that confirm our biases. We’ll tune out inconvenient evidence that calls our particular stance into question. The fact that my Brazilian best friend pays lots of taxes, and my Guatemalan wife makes everyone around her a better person might be dismissed by my buddy’s anecdote about a foreign gang member killing someone in Texas.

We may not even realize it, but we’re not respecting or considering each other’s arguments while we’re so busy defending our own. This conversation likely won’t go anywhere, and is likely to leave us disliking each other.

But say we forced ourselves to dig out the moral motivations behind our immigration stances. We might end up unearthing this:

My buddy values Fairness and Authority above all else. So he thinks it’s not fair that some people can break the law and get away with it (entering the country illegally). Even if the law is a little cruel, breaking the law is a betrayal of society. And he thinks it’s not good to disrespect the Authority of a country by breaking its laws. Finally, my buddy might be worried about the Sanctity of the country. Letting in anyone means we might let some bad guys in, too. It’s good to not risk contaminating the swimming pool, he might say.

Once we unearth this, I can acknowledge that my buddy’s motivations are good, even if I disagree with his conclusions. After all, I believe in Fairness and (righteous) Authority too. Even though those aren’t my primary moral foundations, I understand that my buddy is coming from a place of trying to do the right thing.

In contrast, I can help him see how I value Care above all else. If he’s listening, he’ll agree that that’s a good thing, too. I can explain how I think we should treat people like they’re valuable no matter where they were born. This explains why I think restricting immigration the way we do is unkind. And he might be surprised to discover that I also value Fairness. The way I see Fairness in the case of immigration is that it’s not fair to tell one human they can live here and another they can’t. We don’t choose where we were born, and I think it’s unfair to restrict someone for that.

So we both value Fairness, we just apply it in different ways.

Once we unearth these moral foundations, even if we still don’t agree on a conclusion, we have earned respect for each other’s viewpoints. I see my buddy as a good person. He has good moral motivations behind his arguments. And he sees the same in me.

This means we might be able to have a more productive conversation about what to do. We might just be able to exercise some intellectual humility and get somewhere together.

As Dr. Jonathan Haidt, the premier researcher on Moral Foundations, summed it up in this TED talk, “A lot of the problems we have to solve are problems that require us to change other people. And if you want to change other people, a much better way to do it is to first understand who we are—understand our moral psychology, understand that we all think we’re right—and then step out, even if it’s just for a moment, step out of the moral matrix, just try to see it as a struggle playing out, in which everybody does think they’re right, and everybody, at least, has some reasons—even if you disagree with them—everybody has some reasons for what they’re doing.”

 

Let’s Practice: