This year I have read and re-read Dr Jess Berentson-Shaw’s book, ‘A Matter of Fact: Talking Truth in a Post-Truth World’. She is an incredibly well-informed author with some interesting ideas about how communication can be improved when it comes to disseminating evidence and fact. Berentson-Shaw (2018) states that there are many reasons why those who have a stake in communicating what is true do not get traction in terms of people both believing and acting upon good evidence. Her theses are divided into five key sections: the myth of the knowledge gap, a crisis in critical thinking, misinformation in society, conflicts over beliefs and values, and talking and not listening. My thinking was challenged at various points. I highly recommend the book. It has left me with many reflections and sent me down various rabbit holes. I hope to purge myself of a few of these ruminations via this blogpost.
Have you ever found yourself bemoaning the link between sugar ingestion and hyperactivity? It is a myth. As Berentson-Shaw (2018) outlines in her book, studies have examined the relationship and found that sugar ingestion does not lead to hyperactivity; the perceived hyperactivity is likely caused by other ingredients in food/drink (e.g. caffeine), by the socialisation itself, or both. You may be thinking, “Emina is wrong. I have seen children become hyperactive after too much sugar.” No, you have been told that is what happens, or you have made the association because the two things have occurred together, and it just made sense, or fitted with your existing knowledge, beliefs or values about it. We do this all the time. We accept things we see and hear as truth, because they fit with what we already know or have experienced, and/or we automatically accept that the source of the information is a truthful source.
In education, truth, reason and evidence should matter an enormous amount to all of us, yet we are in the unfortunate position of needing to be perpetual critical consumers of information. Why is this the case? It is because much of what is presented or sold is not worthy of our money, trust or time. We should be able to trust what a lecturer, presenter or consultant tells us, but sadly, it is not always the case that we can.
Misinformation: What is it, who spreads it, and why?
Misinformation is information that is false or inaccurate. The interesting thing about misinformation, and any dictionary definition will elaborate on this, is that it is unconcerned with intent. Information counts as misinformation whether or not the source knew it was false or inaccurate, and whether or not they intended to mislead. Misinformation can be deliberate or accidental, and it can be benign or very serious indeed. Very often people share information without checking to see whether it is in fact true. This of course has been made far easier with the proliferation of information sharing via social media, and there are studies demonstrating that misinformation spreads faster, deeper, and more broadly than truth. There are scholars across the world who study ‘fake news’, ‘misinformation’, ‘bullshit’ and ‘bullshitting’. If you are interested, type those terms into Google Scholar and you will find some incredible studies on how misinformation is spread, who is most likely to do it, and who is most likely to fall for it.
I mentioned before that the spreading of misinformation can be deliberate or accidental. It can also be categorised as lying or bullshitting, which I will elaborate on shortly. To understand who spreads misinformation, why and how, we must consider what motivates people to spread it in the first place.
“At the heart of misinformation is often power and money, followed up by a human appetite for the shocking or controversial. Misinformation is used to subvert democracy, to sell the cultural stories that maintain people’s relative position and power in society, to make money, or because people fear the change that truth brings with it.” (Berentson-Shaw, 2018, p.34)
Berentson-Shaw (2018) describes five key sources of misinformation and obfuscation in her book. These are rumour/fiction, governments/politicians, vested interests/industry/non-governmental organisations, the media, and the internet. Regardless of the source, the way I see it, we have three key players when it comes to the spread of misinformation.
The Liar
This person intends to deceive people with false information. They wilfully circulate information with the sole purpose of deceiving their audience.
The Bullshitter
This person intends to portray a false impression of themselves, especially of their knowledge and skills.
The Bullshitted Bullshitter
This person has been overly accepting of false or weak claims over a period of time. Given they have consumed so much bullshit, they disseminate it with little or no awareness.
Here are a few quotes from Littrell, Risko and Fugelsang’s (2019, p.1-3) paper that explain such human behaviour better than I ever could.
“… the bullshitter may not actually know the truth value of every statement he makes, yet he is often aware of his unawareness, and asserts himself with a sense of certainty that the totality of his statements is true regardless. Given this, rather than being completely ‘unconcerned with the truth', it might be more accurate to say instead that the bullshitter is epistemically insouciant, showing the truth a casual, loose concern, or indifference. Additionally … what makes some statements ‘bullshit' is not necessarily the speaker's casual (dis)regard for the truth, but more in the ‘uses and purposes' for which they employ bullshit. Therefore, ultimately, the veracity of what the bullshitter says does not matter to him nearly as much as his motivations for saying it.”
“ … more prolific persuasive bullshitters were also more likely to overclaim when asked to demonstrate their general knowledge, possibly bullshitting themselves as well as others.”
“Overall, bullshitting can be understood as an instrumental and performative communication strategy employed to either: (1) impress, persuade, or fit in with others by exaggerating one's knowledge, attitudes, skills, or competence (i.e., persuasive bullshitting), and/or; (2) attempts to evade or altogether avoid responding to inquiries where direct answers might result in negative social costs (i.e., evasive bullshitting ).”
Why do we fall for misinformation?
There are two main reasons we fall for misinformation. The first is that we are socialised to believe that people (should) tell the truth. This comes from Grice’s four discourse (cooperative conversation) maxims. These maxims describe the assumptions we should be able to make in cooperative conversations: the unwritten rules we should follow in order to be cooperative conversationalists.
Quality
Try to make your contribution one that is true
Don’t say what you believe to be false and represent it as true
Do not say that for which you lack adequate evidence
Quantity
Attend to the amount that is said
Be as informative as required
Don’t be more informative than is required
Conciseness is valued
Relation
Your contribution to conversation should be appropriate to the immediate needs at each stage of the interaction
Say things that are pertinent to the discussion
Manner
Know how what is to be said should be said
Avoid obscurity of expression
Be brief and orderly
In addition to many of us having “a bias toward accepting statements as true” (Pennycook, Cheyne, Barr, Koehler & Fugelsang, 2015, p.1), the second reason we can fall for misinformation is having “a general tendency to be overly accepting of weak claims”.
“This tendency, which we refer to as reflexive open‐mindedness, may be partly responsible for the prevalence of epistemically suspect beliefs writ large” (Pennycook & Rand, 2020, p.185).
“Reflexive open‐mindedness stands in contrast to reflective open‐mindedness, which is the tendency to deliberate and question one’s intuitions …” (p.186)
What can we do?
Berentson-Shaw (2018, p.32) states, “Research shows that to overcome the default assumption of truthfulness … people would need:
1. The story to sound totally implausible;
2. To have a deep distrust of the communicator; and
3. To engage in a great deal of effort”
This is the battle truth is often up against! As Berentson-Shaw (2018, p.32) states, “accurate information is at a distinct disadvantage.”
The more we know about a topic, the easier it is to detect misinformation about it when we come across it. It does not negate our human biases, but information is power in this regard. We can read and research topics by accessing information from reputable and reliable sources.
We can value truth by approaching new information and previously unexamined beliefs and practices with a healthy amount of distrust and scepticism. We can do this by questioning everything, developing arguments for and against an idea/practice using research evidence whenever possible, and asking people to back up their claims about instructional programs and practices with research evidence.
There is a small but growing evidence base around how to combat fake news, and I think some of it applies to how we can think about stemming the tide of misinformation in education. The two broad intervention categories for combating the acceptance and spread of fake news proposed by Lazer et al. (2018) are:
1. Empowering the individual to evaluate the information that they encounter
2. Preventing exposure to misinformation in the first place
Initial Teacher Education (ITE) possesses incredible, but as yet unrealised, power in both regards.
Evaluating the information that we encounter and working collectively to stem the tide of misinformation in education is everybody’s business. Doing so can go a long way to ensuring best practice, for our own professional and personal fulfilment and for the benefit of the students that we educate.
“A well-informed public, who can identify misinformation and whose interests it serves, who recognise good information, who feel listened to and can engage with the evidence and decision-making based upon it, is critical to a modern idea of participatory democracy.” (Berentson-Shaw, 2018, p.52)