Three ways our minds derail smart, reasonable conversations, and what to do about it.

In the space of a single weekend morning, my husband and I had several conversations that felt more like parallel monologues than dialogue. They weren’t particularly difficult conversations, yet we still managed to talk past one another quite a bit.
It was one of those times when I was grateful no one who knows I’m a conflict resolutionary happened to be nearby, because I was as tangled up as anyone could be. What was going on that made several simple problems way more complicated than they should have been?
I hit the icy hiking trails with one of the dogs to do a little post-hoc musing, knowing this about my husband and me: We’re both smart and confident in our own knowledge. We broadly agree on values, goals, and outlook. We share deep context, of course, having made a life and home together for decades. We think we know each other fairly well, though we both recognize there’s plenty we don’t know.
And it was these very kinds of things that tripped us up, creating the conditions for three particular cognitive biases to complicate our conversations.
Predictive processing
“I’m seeing what I expect to see.”
Predictive processing theorists propose that our brains constantly make educated guesses about what’s happening around us, using past experience to interpret the present.
Humans may have evolved to use predictive processing because it helps our brains be more efficient. By allowing us to anticipate what we’re likely to see, hear, or experience, instead of building every perception from scratch, our brains save energy and speed up response time.
As you might guess, predictive processing can lead us astray when we respond to what we expect to experience without incorporating what we are actually experiencing. Our reaction is real, but the perception driving it isn’t true.
I think I know, for example, what my husband is about to say, so I start responding before he’s done speaking. Even though I’m right some of the time (and he does tend to talk long), I’m wrong at least as often, and it drives him up the wall.
Other ways predictive processing shows up in conversations: “Here we go again.” “I knew you were going to react that way.” “This is just like last time.” Maybe we say these out loud, maybe we just think them.
Naive realism
“I’m seeing the facts. You aren’t.”
Naive realism is a cognitive bias that leads us to assume we see the world objectively—as it “truly” is—while those who disagree with us are uninformed, irrational, or themselves biased.
In disagreements, naive realism tends to show up as an assumption that our interpretation of events is true and accurate, while the other person’s interpretation is distorted.
Naive realism creates problems by reducing our openness to revising our view of the events. Disagreement doesn’t register as a difference in perspective—it registers as error or bad faith.
Naive realism sounds like, “That’s not what happened.” “You’re not being objective about this.” “That’s a fact, not opinion.” “Anyone would see it that way.”
Illusion of explanatory depth
“I believe I understand this better than I do.”
The illusion of explanatory depth (IOED) is the tendency to believe we understand complex systems, policies, or problems in greater detail or nuance than we actually do. We mistake familiarity for expertise.
When we think we understand an issue better than we actually do, IOED short-circuits curiosity. It gives us a premature sense of mastery, reducing the likelihood that we’ll ask clarifying questions, seek disconfirming evidence, or attempt to articulate the mechanisms behind our claims.
IOED causes problems in disagreement by inflating our confidence without increasing our accuracy, leading us to underestimate the legitimacy or complexity of competing viewpoints.
In disagreement, IOED might sound like, “I know how the economy works.” “You don’t need to educate me on this.” “I’m already familiar with the science.” “Please don’t treat me like someone who doesn’t know about this already.”
Ways to counter these cognitive biases
Each of these biases creates its own kind of blind spot, making us respond to what we think is happening rather than what is. We reply intelligently (and maybe a bit haughtily) to a version of the conversation that exists mainly in our own heads.
Here are some ways to reduce the impact of predictive processing, naive realism, and illusion of explanatory depth:
- First, the obvious: Know you have biases. Even if you can’t recall which cognitive bias goes with which type of misperception, simply knowing that biases like these exist can prompt us to examine our own thinking, and the other person’s, in more detail.
- Shift statements of “fact” to statements of perception. Instead of “That’s not what happened,” say, “I had a different experience,” or, “I recall it differently.”
- Actively seek disconfirming details. Ask, “What here doesn’t fit with my perception or conclusion?” or “What else could this be?”
- Practice rugged self-honesty. Ask yourself, “Where am I unclear?” or “What don’t I fully understand about this?” Try explaining the process / policy / system / mechanism to yourself. If you stumble or gloss over details, you may have found a knowledge gap.
- Say out loud what you think to be true / accurate / objective. By illuminating your assumptions, you make it possible to discover evidence that changes your view. It works well to say it as a testable claim: “I think x because y.”
By the time I got home from my hike, I’d identified the culprits from that morning’s messy conversations. Instead of just knowing we were tangled up in something, naming the cognitive biases helped me see the specific ways we were each responding to slightly different versions of reality. Recognizing the specific patterns doesn’t make me immune, but it does help me watch out for them next time.