Supporting teachers to be evidence-based practitioners - what do we know?
If you have any kind of interest in the development of evidence-based/informed practice (EBP) within schools, then this blog post is for you.
Even with the worldwide interest in evidence-based practice (EBP) as a core concept within medicine and healthcare, the evidence on how best to teach EBP is weak. In a recently published systematic review, Albarqouni, Hoffmann, et al. (2018) found that most EBP educational interventions evaluated in controlled studies tended to focus on the critical appraisal of research evidence and did not use high-quality instruments to measure outcomes.
With this in mind, the rest of this post will examine the implications of the review's findings for schools as they attempt to provide support and training to help teachers become better evidence-based practitioners.
What are the implications for evidence-based practice educational interventions within schools and other educational settings?
First, whereas in medicine there is a general understanding of what is meant by evidence-based practice (Sackett, Rosenberg, et al. 1996), this is not the case in education. As Nelson and Campbell (2017) argue, there is little agreement over the precise meaning of the term, in large part because of a lack of consensus on several questions: are ‘research’ and ‘evidence’ one and the same? (Nelson 2014); are ‘evidence-based’ and ‘evidence-informed’ practices fundamentally different? (McFarlane 2015); and, perhaps the most intensely debated, whose evidence counts? That said, as Professor Rob Coe stated at the February 2017 launch event of the Chartered College of Teaching, agreeing a definition of evidence-based/evidence-informed practice should be possible.
Second, in medicine there would appear to be agreement about the five steps associated with being an evidence-based practitioner (Dawes, Summerskill, et al. 2005). These five steps are: translation of uncertainty into an answerable question; systematic retrieval of the best available evidence; critical appraisal of evidence for validity, clinical relevance, and applicability; application of results in practice; and evaluation of performance. In education, on the other hand, there is at most agreement that evidence-informed practice involves multiple sources of evidence and the deployment of professional judgement (Nelson and Campbell 2017).
Third, given the nature of education, advocates of evidence-based practice will face real challenges in demonstrating its impact on pupil outcomes. As such, it makes sense to try to develop validated instruments which can be used to measure teachers' knowledge of, skills in, and attitudes towards EBP. The CREATE framework (Tilson, Kaplan, et al. 2011) provides guidance on both the assessment domains and the types of assessment. This framework could easily be amended for use in an educational context, as illustrated in Table 1 (based on Tilson, Kaplan, et al. 2011).
Fourth, given the time, effort and money being put into EBP educational interventions, not just in IEE/EEF Research Schools but in an increasing number of schools within England and across the world, perhaps attention should be given to developing guidelines on the reporting of EBP educational interventions, just as has been done in medicine with GREET (Phillips, Lewis, et al. 2016). This is especially important as we know relatively little about the effective implementation of EBP educational interventions. If studies under-report the details of the intervention, it will be extremely difficult to bring together what has been learnt, to make the most of successes, and to avoid unnecessary failures.
Fifth, my own experience of EBP educational interventions suggests that a great deal of emphasis is placed on accessing and interpreting research evidence, with insufficient attention given to the challenging process of assessing and aggregating differing sources of evidence, be they practitioner expertise, stakeholder views or school data.
And finally
I’ve always believed that success is a case of doing simple things well – or, as Woody Allen says, ‘eighty percent of success is showing up’. Maybe in education we are not doing the simple things well – in this case, making the most of what has been learnt in other disciplines.
Abstract – Albarqouni, L., Hoffmann, T. and Glasziou, P. (2018). Evidence-Based Practice Educational Intervention Studies: A Systematic Review of What Is Taught and How It Is Measured. BMC Medical Education, 18(1), 177.
Background: Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak. We sought to systematically assess coverage of the five EBP steps, review the outcome domains measured, and assess the properties of the instruments used in studies evaluating EBP educational interventions.
Methods: We conducted a systematic review of controlled studies (i.e. studies with a separate control group) which had investigated the effect of EBP educational interventions. We used a citation analysis technique and tracked the forward and backward citations of the index articles (i.e. the systematic reviews and primary studies included in an overview of the effect of EBP teaching) using Web of Science until May 2017. We extracted information on intervention content (grouped into the five EBP steps) and the outcome domains assessed. We also searched the literature for published reliability and validity data for the EBP instruments used.
Results: Of 1831 records identified, 302 full-text articles were screened, and 85 included. Of these, 46 (54%) studies were randomised trials, 51 (60%) included postgraduate level participants, and 63 (74%) taught medical professionals. EBP Step 3 (critical appraisal) was the most frequently taught step (63 studies; 74%). Only 10 (12%) of the studies taught content which addressed all five EBP steps. Of the 85 studies, 52 (61%) evaluated EBP skills, 39 (46%) knowledge, 35 (41%) attitudes, 19 (22%) behaviours, 15 (18%) self-efficacy, and 7 (8%) measured reactions to EBP teaching delivery. Of the 24 instruments used in the included studies, 6 were high-quality (achieved ≥3 types of established validity evidence) and these were used in 14 (29%) of the 52 studies that measured EBP skills; 14 (41%) of the 39 studies that measured EBP knowledge; and 8 (26%) of the 35 studies that measured EBP attitude.
Conclusions: Most EBP educational interventions which have been evaluated in controlled studies focused on teaching only some of the EBP steps (predominantly critical appraisal of evidence) and did not use high-quality instruments to measure outcomes. Educational packages and instruments which address all EBP steps are needed to improve EBP teaching.
References
Albarqouni, L., Hoffmann, T. and Glasziou, P. (2018). Evidence-Based Practice Educational Intervention Studies: A Systematic Review of What Is Taught and How It Is Measured. BMC Medical Education, 18(1), 177.
Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., Porzsolt, F., Burls, A. and Osborne, J. (2005). Sicily Statement on Evidence-Based Practice. BMC Medical Education, 5(1), 1.
Nelson, J. and Campbell, C. (2017). Evidence-Informed Practice in Education: Meanings and Applications. Educational Research, 59(2), 127-135.
Phillips, A. C., Lewis, L. K., McEvoy, M. P., Galipeau, J., Glasziou, P., Moher, D., Tilson, J. K. and Williams, M. T. (2016). Development and Validation of the Guideline for Reporting Evidence-Based Practice Educational Interventions and Teaching (GREET). BMC Medical Education, 16(1), 237.
Sackett, D., Rosenberg, W., Gray, J., Haynes, R. and Richardson, W. (1996). Evidence Based Medicine: What It Is and What It Isn't. BMJ, 312(7023), 71-72.
Tilson, J. K., Kaplan, S. L., Harris, J. L., Hutchinson, A., Ilic, D., Niederman, R., Potomkova, J. and Zwolsman, S. E. (2011). Sicily Statement on Classification and Development of Evidence-Based Practice Learning Assessment Tools. BMC Medical Education, 11(1), 78.