Truthiness, a term coined by Stephen Colbert back in 2005, is a concept most of us first heard of in connection with Trump's first presidential campaign.
The concept is simple: whether you tell the truth doesn't actually matter that much; what matters is how believable what you say is. People in general don't check facts, especially when those facts confirm their pre-existing biases.
This forces us to face up to something rather uncomfortable: in a debate, telling the truth only matters if your audience is well informed. If the audience is not well informed, it’s often possible to construct a false narrative that sounds more convincing than the actual truth.
We know this; we've seen it in action.
I want to introduce a new concept: the truthiness horizon.
It's the point at which someone is faced with a level of detail on a subject so far removed from their own knowledge that they can't even tell whether it's plausible, let alone true, so they reject it.
Sounds crazy, right? It is, but if the person has a pre-existing bias it can happen really easily.
Nobody Knows How a Smartphone Works
One example from personal experience: a friend of a friend asserted that nobody understood how a smartphone worked. I’ve worked in digital electronics, optoelectronics, radio and software so I do know, reasonably well.
“Go on then”, he asked, “how does the display work?” I started explaining how an AMOLED display worked. He asked some very odd questions. After a while he just folded his arms and said “Nah, you’re making it up, nobody knows this stuff”.
At first I thought he was joking, but he got quite angry and abusive, maintaining the theme that I didn’t really know and I was just making stuff up to sound clever. This is an extreme (and irrational) example, but it demonstrates the point very well.
People get emotionally invested in their opinions, so when you challenge one you're starting on the back foot. People's first reaction is to preserve ego, to look for ways to discount your challenge. We all do this, but we hope that professional people override that initial emotional reaction and consider the wider argument objectively. Not everyone is professional, however.
The person here started with the premise that nobody knew how a smartphone worked and was always looking to maintain that state. The questions he asked weren't intended to get more information; they were intended to find gaps in my knowledge and prove that I didn't know how a smartphone worked.
His problem was that I do.
My problem was that at some point I crossed the truthiness horizon. I sounded too authoritative to be credible, and an alternative narrative popped into his head. He reasoned that the chances of meeting a friend of a friend who genuinely had such in-depth knowledge were vanishingly small, so he could safely conclude that I was making it all up to try to sound impressive.
There was only one teensy-tiny problem with this line of reasoning: it was bollocks.
At the time I didn't really understand it; I just laughed it off and told him that one day not listening to people who knew what they were talking about would get him into trouble.
On reflection, however, I don’t think this is really all that rare.
Bullshit Baffles Brains
There's an old army saying, "bullshit baffles brains". To put it another way: if you need to convince someone of something but you don't have a good argument, just make up a load of complex stuff that you know they won't understand. There's a good chance they won't want to lose face by admitting they don't know what you're talking about, and they're not in a position to argue against it, so their only option is to go along with it.
Data overload is another, related technique. You dump a huge load of correct raw facts on someone you know hasn't the time or the expertise to interpret them, then you tell them it means something it doesn't. They're in full possession of the facts, and they presume that you wouldn't lie to them because they could find out the truth, if they tried.
My proposal of a truthiness horizon is based on the hypothesis that these and similar techniques are now so ingrained in our society that people are basically wary of any situation that appears opaquely complex. Such a stance might well serve them in daily life, keeping them from being sucked in by bogus investment schemes and the like. A side effect, however, is to make them vulnerable to manipulation through truthiness.
There’s a balance to be found here and I don’t believe we’ve found it.
Right now, if you are an expert or significantly knowledgeable in a particular area, it's worth keeping this in the back of your mind: people might disengage not because they don't understand, but because they don't believe you.