Meta-analyses can play an important role in educational research. Aggregating results from different but comparable studies to estimate an average effect size can be highly informative. However, the validity of a meta-analysis mean depends on the quality of the studies included.
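For readers unfamiliar with the mechanics, the pooled estimate reported by a meta-analysis is typically a weighted average of the individual study effect sizes. A minimal sketch, assuming the standard fixed-effect, inverse-variance weighting (not necessarily the exact estimator used in the studies discussed below), is:

$$\bar{\theta} = \frac{\sum_{i=1}^{k} w_i \,\hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{\widehat{\operatorname{Var}}(\hat{\theta}_i)},$$

where $\hat{\theta}_i$ is the effect size estimated by study $i$ and $k$ is the number of included studies. Because every included study contributes to this average, trials that are not valid tests of the hypothesis shift the pooled mean directly.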
In my recently published paper (Fullard, 2023), I review 42 randomised controlled trials (RCTs) from a meta-analysis by Fletcher-Wood and Zuccollo (2020) investigating the effect of teacher professional development on pupil outcomes. Of the 42 RCTs reviewed, only 10 are valid tests of the meta-analysis hypothesis. Moreover, when the invalid tests are excluded, the meta-analysis mean falls to zero. This demonstrates that the positive effect reported by Fletcher-Wood and Zuccollo (2020), and the policy conclusions drawn from it, are driven entirely by poor research methods.
One of the conclusions from my paper is that a general improvement in empirical methods in educational research is needed to help researchers (a) design more robust experiments and (b) evaluate the quality of existing experiments.
