It’s not unusual these days to hear talk of different groups of people having “different facts.” See, for example, this piece:
This is language that we should reject since… well, facts are facts. If it’s a fact that there is oil under my home, there is oil under my home. It’s not a fact for me unless it’s a fact for everyone. Facts do not vary from person to person or group to group. Interpretations (or understandings) of facts often do. But interpretations of facts and facts themselves are very different things. After all, even if at some point in time everyone believed the earth was flat, it was not flat. If belief made it so, the earth would have gone from being quasi-spherical to being flat to being quasi-spherical again as the consensus understanding changed over time.
The issue of supposedly different facts—which are really different interpretations—is meant to indicate something about the extent of the polarization that now prevails in our society. I don’t think, though, that it does much work in either causing or reducing polarization. I say that because despite disagreeing with all sorts of people about facts, I can almost always engage in discourse with those people. Typically, that requires understanding the disagreement about the facts, then talking about what follows from their view of the facts and what follows from mine, and moving on from there.
When I engage with people who have different interpretations of facts than my own, I can tentatively take their view as if it were true (counterfactually, I believe). Similarly, they can tentatively take my view as if it were true (counterfactually, they believe). Doing this allows us to go further, perhaps even coming to some sort of agreement. (Even where agreement does not develop, we come to see where we actually “agree to disagree,” which is rarely about the actual facts.)
I admit there are exceptions to the ability to engage in civil discourse. I’ve never figured out how to engage people who are convinced the earth is flat, for example. I’ve also been stumped by people who only trust (what seem to me) obviously bad sources for news and reject information from any and all legacy media (most of which I agree is far from perfect even if I think it often presents real facts). If my interlocutor only accepts what is said by sources all of which I reject, and I only accept what is said by sources they reject, it quickly becomes (near?) impossible to go further—unless, perhaps, one of us has firsthand evidence. (When one’s interlocutor has firsthand evidence, one ought to defer to them if one has any trust in them.)
That the sort of engagement I sketched above is possible suggests that we can figure out where our disagreements about facts come from and, at least possibly, get past those disagreements. I’ve had this happen on occasion—though not as often as I’d like. Each time it’s happened, I came to see how my view of the facts was formed in ways that my interlocutor’s view was not, or how their view of the facts was formed in ways that mine was not. Facts aren’t the problem.
Sometimes, polarization seems to be about values. Some say the reason we disagree about facts, after all, is disagreement about underlying values. I take it the idea is that our values shape how we interpret facts. This is frequently claimed by a variety of theorists, including critics of western science (and western medicine). As More in Common has shown (see More in Common, Defusing the History Wars: Finding Common Ground in Teaching America’s National Story), though, there are substantial gaps between what polarized groups believe their opposition believes and what the opposition actually believes—and significantly more overlap in their values than one would expect given the mutual animosity. So, it’s not values that explain polarization either.
Of course, even if we share many values, we don’t share all, so the conclusion just made is too quick. Nonetheless, I think we do well to look at something other than facts and values to get to the root of polarization. What would that be? Let me suggest it is worldviews. These are neither values nor facts nor interpretations of facts. They are some amalgam of facts and values and the way those are transmitted within a group.
Ryan Muldoon (Social Contract Theory for a Diverse World: Beyond Tolerance. NY: Routledge, 2016) tells us that worldviews (he calls them “perspectives”) “are simply the filters that we use to view the world. This supposes that we … (consciously or unconsciously) choose to group certain features [of the world] together, choose to ignore certain information while focusing on other information … Perspectives are thus a mental schemata.” And, he adds, “Shared perspectives are a source of social cohesion — they provide a framework for mutual understanding and interpretation of shared events” (48). In short, he says they “do two things for us: they provide us with a categorization of the world, and they provide us with a method of navigating the categorization” (4).
We inherit our worldview from those around us. If everyone you know believes that the cause of a storm is Thor’s anger with your group for its failure to properly honor him, you will likely believe that too. The story will be reiterated numerous times: “it stormed here and my crops were destroyed; I should have offered Thor more of a tribute;” “I’m so glad I offered Thor that tribute; the rain was perfect for my crop this year.” What is going on there is simply that a story is told (and likely developed) that is meant to explain the facts. It is repeated by many, and many come to accept it as the best (perhaps only) understanding available.
In the Thor example just discussed, you and I might say “this is an absurd story; we know of cases where people did offer the proper tribute and nonetheless lost their crops to weather, and cases where people did not offer any tribute and yet had great crops.” A Thor believer might respond to such claims by saying “ah, maybe I should rethink my belief,” or may instead develop a new, more complex iteration of the story meant to show how these cases can be explained. Which response is forthcoming will depend on a variety of factors, but it is unlikely to be based on the actual facts alone or values alone. It will likely depend on things like: “Who believes the story? How many people believe it? Who are they to you? Do you benefit from believing the story?”
A story of this sort is a worldview. For one example, see any of the works in the “left behind” literature, such as Arlie Hochschild’s Strangers in Their Own Land. Hochschild describes a group with a clear and detailed “deep story”: its members perceive immigrants and minorities as recipients of the largesse of both government and elites, at a cost to themselves. That story is convincing to them; it’s not at all difficult to understand why they accept it even if it’s based on a distorted view of the facts—and the distorted view of the facts alone would not be sufficient to push those people to the sort of polarization they seem to have been pushed to. The worldview—that coherent story they accept—is. And, of course, those of us who value immigrants and have no problem with them, or with minorities or other groups, have our own stories—worldviews that might also be based on a distorted view of the facts.
This idea—that it is our worldview that matters—helps us make sense of the way registered Democrats and registered Republicans differ in which news sources they trust. Fox News will typically tell a story more congenial to current Republicans, while The New York Times will tell a story more congenial to current Democrats. Much of that will be clear once we understand the relevant worldviews. Some things will be less clear—for example, why Democrats trust The New York Times more than The Economist—but I suspect the answers require understanding more of the relevant stories, in this case the stories that American Democrats tell themselves.
I would suggest that if we want to alleviate polarization, we simply need to work to encourage the rethinking of worldviews. That would require, first, that we make clear what those worldviews are. Subjecting all worldviews—our own included—to discussion and rational criticism seems our best bet. If the worldview one subscribes to makes it rational to believe that immigrants are all criminals—because one is exposed to very few immigrants and repeatedly hears about immigrants who are (supposedly) criminals—it is only through correction via extensive dialogue that one is likely to alter that belief. If one subscribes to a worldview that makes it rational to believe that all Trump voters are irrational—because one can’t conceive of their worldview—it is, again, only through correction via extensive dialogue that one is likely to alter that belief. We can have those dialogues. Those dialogues might help to reduce the use of memes, which bolster support for worldviews rather than challenge them—those who share a meme already have a worldview it supports, and memes are less likely to be seen by those with an opposing worldview. We can improve by truly engaging those who have worldviews opposing our own—and that requires trying to understand theirs.
This is good. It is also an argument for *philosophy*, because you have two choices: you can subject your worldview to rational scrutiny, or you can just have a worldview based on your values, inertia, and wishful thinking. Philosophy involves the interplay of three different impulses, and developing rational worldviews is one of them.