The school research lead and the evidence-based pep talk

As a school research lead you may often be called upon to give colleagues a ‘pep talk’ about the importance of research and evidence to your school. As such, it seems sensible to look at the research and evidence about motivating colleagues through pep talks. So this post will look at the recent work of McGinn (2017), who draws upon Mayfield, Mayfield and Kopf (1995, 1998) and their research into Motivating Language Theory.

Motivating Language Theory (MLT)

MLT suggests there are three elements of motivating language, which once fully understood can be used to give more ‘motivating’ pep talks.  These elements are:

Uncertainty-reducing language – where ‘leaders provide information about precisely how to do the task at hand, giving easily understandable instructions, good definitions of tasks, and detail on how performance will be evaluated.’

Empathetic language – showing concern for the audience as human beings by including ‘praise, encouragement, gratitude, and acknowledgement of a task’s difficulty.’

Meaning-making language – ‘this explains why a task is important. This involves linking the organisation’s purposes or mission to the listener’s goals. Often meaning-making language includes the use of stories … of how the work has made a real difference in the lives of customers or the community.’ (McGinn, 2017, p. 134)

As McGinn (2017) notes, a good pep talk – whether given to a group or an individual – will contain aspects of all three elements. However, getting the right mix will depend on the context, who is in your audience, and how well you know them.

What are the implications for the school research lead?

First, it is really important that you are in command of both terminology and processes – that you can explain the difference between research, practitioner inquiry and evidence-based practice. In other words, what is it that you are asking colleagues to do? This means that when you talk about research engagement or research involvement, you can provide practical examples of the differences between the two. If colleagues want to be ‘research engaged’, they should be given very clear guidance about how to go about it – which probably involves some very small but clearly achievable task which is directly relevant to their teaching.

Second, understand that for many colleagues ‘research’ is scary stuff. They may not have read any educational research in years – they might not know what is meant by effect size, and may be far more concerned about teaching Y11 Group C on a Friday afternoon. Acknowledge that becoming research engaged will take time and effort, and that to create the time and space for research, specific actions are being undertaken to reduce workload. My own view is that for every new initiative you start with colleagues, you should be looking to get rid of at least two if not three other tasks which colleagues are expected to complete. In other words, subtract tasks before you add tasks. Furthermore, if colleagues have done a great job in, say, developing a journal club – thank them.

Third, connect to why research and evidence are important and central to the work of the school. Research and evidence are essential if we are going to provide the best possible education for our pupils, so that they can have the best possible life-chances. Research and evidence are vital if we are going to make use of our most scarce resource – colleagues’ time. Research and evidence are vital if we are going to make decisions that protect those activities that really matter, especially at a time of financial pressure. Research and evidence need to be part and parcel of our own professional development if we are to learn and progress throughout our careers in education. Research and evidence are a prerequisite if we are going to keep up with the latest developments both in our subjects and in teaching, especially given how quickly knowledge can depreciate and become out of date. And research and evidence help us challenge our own biases and prejudices – and make us just stop, think and reflect that, you know what, I might just be wrong.

References

Mayfield, J., Mayfield, M., & Kopf, J. (1995). Motivating Language: Exploring Theory with Scale Development. The Journal of Business Communication (1973), 32(4), 329-344. doi:10.1177/002194369503200402
Mayfield, J., Mayfield, M., & Kopf, J. (1998). The effects of leader motivating language on subordinate performance and satisfaction. Human Resource Management, 37(3), 235-248.
McGinn, D. (2017). The science of pep talks. Harvard Business Review, 95(4), 133-137.


The School Research Lead and the 'backfire effect' - should you be worried?


One of the challenges faced by school research leads is the need to engage with colleagues who have different views about the role of evidence in bringing about improvement. Indeed, these differences are not likely to be restricted to the role of evidence; they are also likely to include differing views about the research evidence itself. What’s more, in a widely cited article Nyhan and Reifler (2010) show how attempts to correct misconceptions through the use of evidence frequently fail to reduce the misconceptions held by a target group. Indeed, these attempts at correcting misconceptions may inadvertently increase misconceptions in the target group – i.e. the so-called ‘backfire effect’.

Now if there is a ‘backfire effect’, this could have profound implications for both evidence-based school leaders and school research leads as they attempt to engage in dialogue to correct the misconceptions which colleagues may hold about research. This matters because it is necessary to know whether it is possible to engage in constructive dialogue in which misconceptions can be corrected. If it is not, then school research leads will need to give careful consideration to how they go about disseminating scholarly research, as it may lead to major opinion-formers within a school having an even less favourable view of research as a means of bringing about improvement.

However, there may be an even bigger issue – the ‘backfire effect’ may not exist at all, and even if it does, it may well be the exception rather than the norm. In a peer-reviewed paper, Wood and Porter (2016) present results from four experiments involving over 8,000 subjects, and find that on the whole individuals tended to take on board factual information even when it challenged their partisan and ideological commitments.

What are the implications for you, as you attempt to develop a school climate and culture based on evidence use?

First, as Wood and Porter (2016) note, the backfire effect appeared to be a product of question wording, which suggests it is important to think through carefully how information is presented to colleagues and how subsequent questions are phrased.

Second, Wood and Porter note that in general respondents tend to shy away from cognitive effort and will deploy strategies to avoid it, whereas the backfire effect relies on substantial cognitive effort – developing new considerations to offset the cognitive dissonance generated by the new information. However, the research which identified the backfire effect often took place in university settings, where the respondents – be they students or teaching staff – often take great delight in cognitive effort. The school staff room may have a number of similarities with those university settings, so schools may be particularly prone to a disproportionate number of instances of the ‘backfire effect’.

Third, Wood and Porter note that their findings are not without limitations: for example, just because individuals have been presented with information to address their misconceptions does not mean that this information has been retained.


And finally, it’s important to note that even when relatively new ideas and concepts break out from the academy and reach the public domain, that does not mean they should be taken as ‘gospel’; rather, they should be seen as something with more than surface plausibility. That said, even when an explanation is plausible, that does not mean it is the only explanation for what is taking place.

References

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.

Wood, T., & Porter, E. (2016). The elusive backfire effect: Mass attitudes' steadfast factual adherence.

The school research lead, the five whys and appraising school data

As a school research lead, one of your key tasks will be to help colleagues interpret the quantitative data generated by your school. So in this post I am going to suggest we look at a technique called the ‘five whys’, which you can use to analyse data in a way that helps get to the very heart of any underlying issue (Pojasek, 2000). In doing so, we will use a case study where last year’s GCSE results in a particular department have been especially disappointing.

Asking ‘why’ five times

The ‘five whys’ is a simple technique which involves asking the question ‘why’ at least five times, so that you can get to the root cause of a problem. The process tends to come to an end when it is no longer possible to come up with an answer to ‘why’. But first let’s look at what happens when you ask ‘why’ only once and then come up with a fairly ‘lazy’ answer.

Problem: A subject’s examination results are substantially below the previous year’s results – the one ‘why’

Q Why are this department’s examination results below those of the previous year?

A Because both the Head of Department and the teacher who taught this subject are newly qualified and relatively inexperienced, and need support and improvement targets

However, we know from the work of Crawford and Benton (2017) that almost all of the change in a school’s examination results can be explained by year-to-year changes in the pupil cohort. So let’s have a go with the five whys.

Problem: A subject’s examination results are substantially below the previous year’s results – the five whys

Q Why are examination results below the previous year’s results?
A Because this year a weaker cohort of students took the subject

Q Why did a weaker cohort of students take the subject this year?
A Because ‘stronger’ students who would normally take this subject chose other subjects.

Q Why did the stronger students choose other subjects?
A Because in the year before the students chose their ‘options’, they had been taught predominantly by non-specialist teachers who were adequate rather than inspiring 

Q Why did non-specialist teachers deliver this subject?
A Because all teachers had to have a full timetable

Q Why did all teachers have to have a full timetable?
A Due to financial pressures it was not viable to have teachers on ‘light’ timetables

Pojasek (2000) identifies a number of benefits which come from asking ‘why’ five times. First, once you have got the hang of it, it’s a pretty quick and easy technique to use. Second, it helps you think through an issue so that you can drill down to the underlying cause of the problem. Third, it may help you change your perception of the root cause of a problem. That said, there are a couple of clear challenges in using the ‘five whys’. One is the need for strong facilitation skills, as the focus is on getting to the root cause of an issue rather than allocating blame. There is also the possibility of multiple issues being in play, which may make it difficult to isolate the root cause.

And some final words

In these times of acute financial pressure on schools, it needs to be emphasised that decisions often have long-term consequences – and what may be a quick fix for the current year may cause substantive problems in years to come.

Reference

CRAWFORD, C. & BENTON, T. 2017. Volatility happens: Understanding variation in schools’ GCSE results. Cambridge Assessment Research Report. Cambridge, UK: Cambridge Assessment.


POJASEK, R. B. 2000. Asking “Why?” five times. Environmental Quality Management, 10, 79-84.