Considerations for Science of Reading conversations: Can I change your mind about changing minds?
We already know quite a lot about what works best in reading (and writing) instruction. We have had access to this body of knowledge for quite some time, and it continues to expand in both theoretical and practical senses. Most of you are in the business of changing hearts and minds about reading instruction. People often ask, “How can I get my school on board?” or “How can I convince someone of the evidence base?” Not everyone has a hard time accepting new information, but many do, especially when it runs contrary to what they have always believed or done. This blogpost examines why there is sometimes resistance to good information, and what we can do to combat it.
In my view, the three primary reasons misinformation persists about reading instruction are:
- pre-service teachers are not equipped with the requisite knowledge and skills during their training (nor are they apprised of the misinformation that they will likely encounter)
- there are vested interests at all tiers of education (people who have skin in the status quo game), which impact professional learning provision, programs, policies and practices
- changing your mind is usually a very hard thing to do
How we make our minds up about new information
To manage the cognitive load imposed by new information we typically take a range of shortcuts to assess its relevance or value to us. There are a number of studies that have examined these processes and decisions that we all engage in to varying degrees.
Mental models and the stories we tell ourselves
The scientist in me twitches to say this, but people sometimes engage with and retain new information (let’s say data or facts) better when it is relayed to them via a narrative. There is a body of research demonstrating how narrative can alter the attitudes and beliefs of a reader or listener who has been resistant to the same information in data form. Aside from the often-complex concepts and language that appear in journal articles or reports, this at least in part explains why people are more likely to enjoy and be swayed by blogs, books or videos. The medium matters. Sharing our stories in oral, written or visual form matters. Interestingly, after I published this post, I saw a new meta-analysis reviewing the power of data versus anecdote in changing minds. The complete reference is:
Freling, T. H., Yang, Z., Saini, R., Itani, O. S., & Abualsamh, R. R. (2020). When poignant stories outweigh cold hard facts: A meta-analysis of the anecdotal bias. Organizational Behavior and Human Decision Processes, 160, 51-67.
It is therefore helpful to tell your personal story of how you came to acquire and improve your knowledge and practice. It is helpful to share stories of schools that have improved their practice. It is helpful to show rather than tell. Show what best practice in assessment, instruction or intervention looks like and how it can be achieved, rather than talking about the concepts. This fits with Guskey’s (2002) model of teacher change, in which he demonstrated that it tends only to be through experiencing successful implementation of new practices that attitudes and beliefs about certain approaches change. Trying in the first instance to change attitudes and beliefs is not necessarily effective; rather, seeing is believing. This showing can be direct or indirect. Video vignettes tend to be powerful. For example, I can present (and have presented, awkwardly enough) a two-hour professional learning session on explicit instruction, detailing the evidence to support it and how it is done, and still have a number of naysayers in the audience. But when I show a number of videos of teachers using explicit instruction and students learning effectively, and then present a change in student data over time, there tend to be very few naysayers. Those who remain tend to be people for whom explicit instruction causes cognitive dissonance, and unpleasantly disrupts the stories they’ve been told or have told to themselves, which leads me to my next point.
“One strategy people use to assess the truth of a new piece of information is its compatibility with a mental model. In these mental models or narratives, people build a causal chain of events. If new information seeks to replace a single link in that chain but no other links, then it causes a failure in the mental model. People no longer have a coherent story. It stops making sense, so they reject it. Once a good story is formed, it is very resistant to change because all elements in a good story fit together. Single alterations to that narrative cause inconsistencies in the fit and are more likely to be rejected. This is why, even when provided with correct information, people will continue to rely on incorrect information” (Berentson-Shaw, 2018, p.60).
“People are incredibly efficient at plugging gaps in their mental models with information that keeps the narrative consistent … even when the counter-evidence is clear” (Berentson-Shaw, 2018, p.61).
By providing a fact or piece of data, all too often we are seeking to replace a single link in a chain, that would seek to disrupt or destroy a mental model that has been in place for a long time. So, rather than providing individual pieces of information to people which contradict their mental models and cause them to automatically reject what we are saying regardless of its merit, we need to provide them with a new and complete narrative. We all need a coherent story to guide our actions and help us make sense of the sphere we exist within.
“One narrative tool that has been demonstrated to lead to identification is having a protagonist who is similar to the reader. In 2014, de Graaf found that similarities between the reader and the protagonist increased narrative persuasion, as the reader was more likely to relate the lessons learned in the story to themselves and their own life through self-referencing” (Payne, 2018, p.3).
“Green and Brock (2000) believe that transportation plays a role in the persuasive effects of textual narratives. Transportation in this instance is a distinct mental process, whereby the reader becomes immersed in the story and narrative world created through an ‘integrative melding of attention, imagery, and feelings’ (Green & Brock 2000, p.701). Transportation leads to a complete focus on the story and causes the reader to lose or turn off access to real-life facts that may contradict the text and the ideas presented within it. Due to this, when the reader emerges from the story their attitudes and beliefs will be changed to reflect what they have learned in the narrative. The notion of transportation has also been influenced by that of identification, suggesting that ‘factors making it easier to identify with or understand a character can encourage transportation’ (Green, 2004, p.261)” (Payne, 2018, p.3)
Stories matter. Your stories matter. Share your stories and the feelings that you experienced along the way. If you talk about the anger, grief, frustration or disbelief you experienced when you came to understand the Science of Reading, or the fulfilment, joy, excitement and challenge you faced in trying to learn and implement new knowledge and practices, that will be far more powerful in affecting your audience than one hundred peer-reviewed articles placed strategically on desks or in pigeonholes. Tell your stories. Frame it in a way that is about you and the journey you have been on, not about them and what they need to do.
How we feel and cognitive dissonance theory
We all, to varying degrees, have a preference for information that is consistent with what we already know, feel or value. This information makes us feel good because we are being affirmed and it is easy for our brains to engage with. This is usually why we choose to read certain newspapers and not others, for example. We are all subject to confirmation bias, again to varying degrees, and avoiding the mental work that is required to acknowledge and engage with new and differing information. We look for, with and without awareness, information that confirms our views and we ignore or dismiss that which doesn’t.
“When information is inconsistent with what is already known, it lacks fluency – it ‘sticks’ and doesn’t feel right. The lack of fluency can elicit negative feelings, leading people to doubt and reject new information that contradicts existing beliefs” (Berentson-Shaw, 2018, p.57).
Cognitive dissonance is really what this quote is referring to. This is when we feel a sense of unease or discomfort because what we are being told is at odds with what we already believe. In this case, we engage in what is called ‘identity protective cognition’: our tendency to selectively accept and dismiss information based on the beliefs that predominate in the group we belong to.
“Cognitive dissonance theory posits that individuals have an innate drive for consistency. When two cognitive elements (attitudes, beliefs, values, etc.) conflict, dissonance is produced. This dissonance is aversive and, as such, individuals should experience a subsequent drive to reduce or avoid it” (Moran et al, 2016, p.153)
If we make people feel bad through the way we share information with them, they will not only reject the information, they will associate us with those feelings, and disregard us as a person or professional too.
Who we trust
Liking and looking for information that confirms our views is dangerous in the age of prolific online information sharing. We start to associate information that makes us feel good with the people who share it, pay even less attention to the quality or truth of that information, and share it simply because people we like or trust have shared it.
To establish credibility or trust in someone, we need them to have similar interests and values to us and we need to believe we are working toward the same outcomes. We also need to perceive them as having expertise (Berentson-Shaw, 2018). Note this does not mean that they are expert in anything, but if we think that they are, they will engender credibility and trust. If people think you are part of their groupthink, or share their values and attitudes, they are more likely to listen to you. If people think you are not a part of their groupthink, and hold different values, they are likely to reject what you say without paying attention to it. If people perceive you as being on a different side to them, they will reject your expertise, deny your credibility and will not afford your arguments any trust or validity. It is actually that simple and that complex.
A belief in science and in scientific consensus is more likely to be found in some groups and not others, so this can be another hurdle to overcome. If scientific consensus is not of value to the person, they are not going to be influenced by it.
The other issue here is that most of us do not like being told what to do, how to do it, or what or how to think. If you have a position of authority or expertise, or belong to a particular institution, you may find that people reject your message just because of that, because you and/or your institution are the messenger. This phenomenon is called ‘reactance’.
“Reactance is an unpleasant motivational arousal (reaction) to offers, persons, rules, or regulations that threaten or eliminate specific behavioral freedoms. Reactance occurs when a person feels that someone or something is taking away their choices or limiting the range of alternatives.” (Wikipedia, 2020)
What can we take away from all of this? In all conversations, we need to seek to establish common values and desired outcomes. If our values or desired outcomes do not match theirs, we are wasting our time and theirs. Reframe the intended conversation or choose another candidate. We must consider ways in which we can authentically establish credibility and trust. We must be mindful of our own blind spots and biases. We must avoid an ‘us versus them’ mentality. We must consider how we are communicating. Are we telling people what to do, or worse, telling people they are wrong, or are we seeking to collaborate and develop a shared vision or outcome? Are we turning people off before we have even gotten started?
This section describes some reasonably well-researched considerations and techniques for overcoming misinformation, with some cautionary tales.
Using the knowledge gap
“In noticing a gap in their knowledge, as opposed to being told, participants then seek to fill it with credible information they have been provided. It is possible that this approach does not act as a direct challenge to peoples’ beliefs, with all the associated negative feelings that go with it” (Berentson-Shaw, 2018, p.76)
Consider how you can set up circumstances within which people can self-identify their knowledge gap, rather than you pointing it out to them. Professional learning sessions are often a good forum for this.
Pre-bunking/forewarning and inoculation theory
“Pre-bunking involves explicitly warning people, before they receive the information, that certain types of information they are going to encounter may be misleading” (Berentson-Shaw, 2018, p.79)
“According to inoculation theory, a communicator advocating a particular position can engender resistance to a counter-position by ‘inoculating’ the audience against the opposing side’s argument. There are several key ways that resistance to a counter-argument can be developed. One way this is accomplished is by providing an audience with a weakened form of the opposing side’s position” (Moran et al, 2016, p.153)
“Inoculation theory posits that individuals who are exposed to a weakened form of an opponent’s argument are better able to form counter-arguments and resist being persuaded by the opponent. This inoculation effect is strengthened when a communicator additionally provides refutation to an opponent’s position.” (Moran et al, 2016, p.153)
The two components to effective pre-bunking are therefore the explicit forewarning and then exposing the fallacy. Initial Teacher Education programs could be engaging in valuable pre-bunking. Unfortunately, often ITE programs engage in a form of this in reverse, with graduates hitting schools armed with naïve and false mantras such as “phonics is bad” and “explicit teaching is oppressive”. When graduates leave university inoculated to some degree against evidence-based approaches, it can be harder to change their minds than if they weren’t. Having said that, my experience of graduates is that they are the most open to change of anyone, because a) they feel grossly underprepared and out of their depth, and thus will embrace any new practical information with glee, and b) they have had far less time to solidify unhelpful beliefs and therefore these beliefs are more malleable. Given all of this information, and given graduates are the future of teaching, they are excellent candidates for conversation about best-practice, and this is often where I start when I cannot get traction with leadership.
Myth-busting, repetition and the backfire effect
There is mixed evidence for myth-busting. There is some evidence that an argument-refutation method can be effective but repetition of false beliefs, even when done in the process of countering them, can be detrimental and result in perpetuating the myth or misconception. It can be a case of people hearing what they want to hear or seeing what they want to see.
Repetition, for good or ill, can create the perception of consensus. Unfortunately, familiarity overrides accuracy or truth. If we hear something enough times, we generally start to believe it whether it is true or not. Why? Because the more we hear a piece of information, the more we think that most people must know and believe it. Frustratingly, this often works very well for misinformation, while good information struggles to get a voice and gain traction. When misinformation is retracted, repeating it with a view to correcting it all too often only serves to further cement it in people’s minds; this is ‘the backfire effect’.
Two other common consequences of the repetition of misinformation, made far easier through social media, are ‘pluralistic ignorance’ and ‘the false consensus effect’. “Pluralistic ignorance is when the frequency and volume of a minority held belief leads the majority of people, who do not share this belief, to mistakenly believe that it is what most people think. As a consequence, they move to accepting the minority belief out of a desire to fit in” (Berentson-Shaw, 2018, p.67). ‘False consensus’ is when those in the minority believe they hold the majority view, and they are more likely to overestimate how many think the way they do and behave accordingly.
“The social consensus that results from the repetition of certain types of information can maintain people’s false beliefs. Consider, for example, that misinformation may be repeated during attempts to correct it, and this may help the misinformation become even smoother or more plausible. There is an important consideration here for communicators and researchers. How much attention should we give to extreme beliefs and attitudes and incorrect information in the context of the evidence that vocal repetition may embed the idea and contribute to beliefs that the idea is more widely held than it is?” (Berentson-Shaw, 2018, p.68-69).
I have worked with what probably now totals thousands of education professionals over the past ten or so years. My experience is that most teachers, when provided with good information and with ways to access more knowledge and improve practice, act upon it. There are exceptions, and those exceptions are fierce in their resistance. As I have previously mentioned, perhaps they are not the best targets, even though they are often also the people who have the most sway in their setting.
For those loud and unpleasant few online who are openly argumentative and wilfully ignorant, can I suggest the mute or block buttons? Amplifying their voices and further cementing their views through engagement is incredibly unhelpful.
"To disregard trolls is ... to cold-shoulder them altogether. It's the form of fightback they are least able to handle. It's what they fear most, irrelevance." (The Age, 2020)
In terms of the power of repetition, let’s stop talking about Whole Language and Balanced Literacy unless we really have to, and if we do, be very careful in how we go about it. Let’s stop talking about all of the misinformation. Let’s share and promote good information as it relates to assessment, instruction, intervention and school-wide solutions. We can change the language and in turn the narrative.
Correction through providing an alternative narrative
I have already discussed the power of narrative. It is important to note that we can correct misinformation by providing an alternative narrative.
“Researchers have found that if communicators can fill the ‘coherence gap’ that misinformation creates in people’s narratives, the accurate information may be more likely to be believed … Studies show that misinformation’s power can be neutralised if an alternative account explains why the information was incorrect” (Berentson-Shaw, 2018, p.82-83).
Other considerations when providing corrections to misinformation (Berentson-Shaw, 2018) are:
- detailing the motivations behind the misinformation (the motivations of the author/source)
- making direct links between the misinformation and the accurate information (making it make sense)
- using very simple terms and avoiding complex and technical terms (complexity tends to increase resistance to new ideas)
- encouraging the audience to generate additional reasons as to why the misinformation is wrong (so the correction comes from them, not from you)
Paying attention to beliefs and values
According to Berentson-Shaw (2018, p.93), “values are universal concepts about what is important, what matters most to us. Beliefs tend to be contextually dependent and uphold our values”. We tend to believe information if we see it as being consistent with our values. Equally, we tend to dismiss information if we see it as being in opposition to our values.
“A growing body of research shows that framing values when communicating good evidence does increase the likelihood of the acceptance of new information” (Berentson-Shaw, 2018, p.95).
“The researchers found that when people had their values affirmed they were more likely to consider the evidence that ran counter to their views and be more critical of advocates confirming their own views” (Berentson-Shaw, 2018, p.96).
We must consider how we can better establish this common ground, given most of us work in education for much the same reasons, and I am yet to find someone who wishes illiteracy on a child, at least actively and deliberately. Spend time fostering relationships that affirm common values and beliefs. Once that common ground has been established, any challenging information that is shared is more likely to be accepted.
Social judgement theory and latitudes of acceptance
Social judgment theory offers insight into how sentiment can be so strongly anchored and resistant to alternative messaging or information, and helps us decide who is more likely to change their mind.
“According to social judgment theory, in addition to valence (positive/negative), attitudes vary on strength. Individuals may have attitudes that are very strongly held or that they may not feel very strongly about. When an individual is exposed to a persuasive message, the position (that is, valence and strength) of their initial attitude informs how they perceive and respond to a persuasive message. When individuals have an attitude that is not strongly held, they typically will be receptive to a wider range of persuasive messages. This range of positions that an individual deems acceptable is known as a latitude of acceptance” (Moran et al, 2016, p.152).
Valence, if you are not familiar with the term, refers to the intrinsic attractiveness/goodness (positive valence) or averseness/badness (negative valence) of something. If you have a weak attitude toward something, that is, you don’t feel strongly about it, you are likely to have a large latitude of acceptance and you will be open to persuasion. If someone feels passionately about something, they tend to have a small latitude of acceptance, and they will resist persuasion, only being persuaded by messaging that is similar to what they already believe.
“Social judgment theory posits that such strong attitudes can be a function of ego-involvement with a particular topic. When an individual is highly ego-involved with a topic – that is, when an attitude is ‘central to [one’s] sense of self’ – attitude strength increases. Thus, the attitude of an individual who is highly ego-involved in a topic should have a small latitude of acceptance, and consequently be very difficult to change. Individuals with strongly held attitudes typically have small latitudes of acceptance around those attitudes, making them difficult to change. Moreover, when a persuasive message falls outside an individual’s latitude of acceptance, that individual may engage in perceptual contrast – that is, she or he may perceive the message to be more extreme than it actually is … It is crucial, then, [that the messaging] falls within [a persons’] latitude of acceptance to maximize its effectiveness” (Moran et al, 2016, p.152)
Rather than seeking out people with whom we share the most opposing views, seek out those with whom we differ least. Seek out those with the least entrenched views about non-evidence-based methods; that is, find those with the largest latitude of acceptance, not the narrowest. People who have their identity (ego) most wrapped up in what they do (for example, a Literacy Leader, a Reading Recovery teacher, an interventionist, someone who has been teaching early years literacy for 20 years) are very likely to be the most resistant to change, and therefore they are not the best candidates for persuasion: not only in terms of your time, resilience and energy, but also because you are very likely to polarise them further.
Personal qualities to embody
Avoid sarcasm or humour that is designed to bait, insult or demean. It tends to create polarisation, cement misinformed views, and/or exacerbate ‘us versus them’ mentalities in both the misinformed and informed groups, which are all things that we want to be actively avoiding. People are more likely to think we are extreme in our views (which creates polarisation) if we engage in such tactics.
We should aim to talk about our emotions (experiences of guilt, anger, betrayal, frustration) without being emotional. Rightly or wrongly, people tend to associate expressing emotions with extreme views, and it is likely they will not actually take in any of what we say, or worse, they will actively rebel against it.
“ … there is little proof that direct challenges to people’s beliefs work to overturn false beliefs. Rather, the evidence currently points to the entrenching of misinformed beliefs as people strive to protect their identities” (Berentson-Shaw, 2018, p.90).
Do not give voice to the naysayers. Do not amplify individuals who cling to misinformation, directly or indirectly. Do not amplify problematic or incorrect conceptions about reading instruction. Do not engage in tit-for-tat debates. Do not waste precious time on people who do not want to know better or do better. Do not demean or malign, and do not make people feel less than. Do not make fun of people.
We should aim to remove blaming language from our conversations. We can talk about where the responsibility lies (for the most part it is with ITE) rather than focusing on individuals, and we can engender a sense of us all being in this together. For example, “we were let down by our training”, “we need to develop more knowledge and better practice”, and so on.
There are so many other persuasive considerations and techniques that exist, with variable effect. Many of them are unethical techniques that are used in politics, propaganda, religion and marketing. I have deliberately left these out.
Sadly, we must accept that some people are unlikely to ever change their minds. Their cognitive dissonance and reactance are just too strong, and their latitude of acceptance is just too narrow, so our time is better spent elsewhere. We must also accept that it will take an awful lot of cognitive effort, struggle, reflection, and sustained pressure for some people to change their minds. And then some people are ready and open to persuasion because they are not particularly wedded to ideas and practices, and their ego is not particularly prominent in what they do; they are simply doing their best in the absence of the best available information. Figuring out where individuals stand is part of the way forward, and key to our own self-preservation.
Conversations take careful planning. I am mindful that I need to be much better at heeding my own advice in this domain. I am trying to do better each day at communicating in ways that are most likely to lead to change, and resisting the less helpful and blatantly unproductive methods. It is a work in progress.
Funnily enough, when I was out walking one of my dogs through the bush this week, thinking about this blogpost and listening to eighties bangers, the Fun Boy Three feat. Bananarama, ‘It Ain’t What You Do (It’s the Way That You Do It)’ song came on. So, let me close with a quote that could not be less yet more appropriate. Forgive me, academic peers, for I know not what I do.
“It ain't what you do, it's the way that you do it, and that's what gets results.”