Cumulative evidence synthesis and consideration of "research waste" using Bayesian methods: An example updating a previous meta-analysis of self-talk interventions for sport/motor performance
Bayesian cumulative evidence synthesis and identification of questionable research practices in health & movement science
Recommended by Wanja Wolff and Jérémie Gaveau based on reviews by Maik Bieleke and 1 anonymous reviewer

Research is a resource-demanding endeavor that tries to answer questions such as "Is there an effect?" and "How large or small is this effect?" To answer these questions as precisely as possible, meta-analysis is considered the gold standard. However, the value of meta-analytic conclusions depends greatly on the quality, comprehensiveness, and timeliness of the meta-analyzed studies, which means incorporating new evidence without neglecting older research. Using the established sport-psychological intervention strategy of self-talk as an example, Corcoran and Steele demonstrate how Bayesian methods and statistical indicators of questionable research practices can be used to address these questions [1].
Bayesian methods enable cumulative evidence synthesis by updating prior beliefs (i.e., the knowledge summarized in an earlier meta-analysis) with new information (i.e., the studies published on the topic since then) to arrive at a posterior belief: an updated meta-analytic effect size. This approach tells us whether and by how much our understanding of an effect has improved as additional evidence has accumulated, as well as the precision with which we are estimating it. Or, to put it more bluntly: how much smarter are we now about the effect we are interested in?
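For intuition, the sketch below shows the simplest possible version of this update: a normal-normal conjugate combination of a prior meta-analytic estimate with a pooled estimate from newer studies. All numbers are invented for illustration, and real cumulative meta-analyses (Bayesian or otherwise) additionally model between-study heterogeneity, so this is a toy version of the logic, not the model used in the recommended paper.

```python
import math

# Invented numbers for illustration only (not from the recommended paper).
# Prior belief: effect size and standard error from an earlier meta-analysis.
prior_mean, prior_se = 0.48, 0.10

# New information: pooled estimate from studies published since then.
new_mean, new_se = 0.30, 0.12

# Normal-normal conjugate update: precisions (1 / variance) add, and the
# posterior mean is the precision-weighted average of prior and new data.
prior_prec = 1 / prior_se**2
new_prec = 1 / new_se**2
post_prec = prior_prec + new_prec
post_mean = (prior_prec * prior_mean + new_prec * new_mean) / post_prec
post_se = math.sqrt(1 / post_prec)

print(f"Posterior effect: {post_mean:.2f} (SE = {post_se:.2f})")
# The posterior SE is smaller than either input SE: a direct measure of
# "how much smarter we are now" about the effect.
```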
Importantly, the credibility of this updated effect depends not only on the newly included studies but also on the reliability of the prior beliefs, that is, on the credibility of the effects summarized in the earlier meta-analysis. Several frequentist and Bayesian statistical approaches have been introduced to assess this (for a tutorial with worked examples, see [2]). For example, methods such as the multilevel precision-effect test (PET) and precision-effect estimate with standard errors (PEESE) [2] can be used to adjust for publication bias in the meta-analyzed studies, providing a more realistic estimate of the effect size for the topic at hand. This, in turn, helps to gauge the magnitude of the true effect in the absence of any bias favoring the publication of significant results.
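To make the core PET-PEESE logic concrete, here is a minimal single-level sketch in Python using statsmodels; the study-level data are fabricated, and the multilevel and robust Bayesian variants described in [2] are more involved than this illustration.

```python
import numpy as np
import statsmodels.api as sm

# Fabricated study-level data for illustration: observed effect sizes (d)
# and their standard errors (se). Small studies (large se) show larger
# effects here, the classic small-study / publication-bias pattern.
d = np.array([0.70, 0.61, 0.55, 0.48, 0.42, 0.35, 0.20, 0.15])
se = np.array([0.35, 0.30, 0.25, 0.22, 0.20, 0.15, 0.10, 0.08])
w = 1 / se**2  # inverse-variance weights

# PET: regress effects on standard errors. The intercept estimates the
# effect of a hypothetical, infinitely precise study (se = 0), i.e. the
# effect adjusted for small-study bias.
pet = sm.WLS(d, sm.add_constant(se), weights=w).fit()

# PEESE: the same idea with the sampling variance (se**2) as predictor,
# which tends to be less biased when a true nonzero effect exists.
peese = sm.WLS(d, sm.add_constant(se**2), weights=w).fit()

print(f"PET intercept:   {pet.params[0]:.2f} (p = {pet.pvalues[0]:.3f})")
print(f"PEESE intercept: {peese.params[0]:.2f}")
# Conditional PET-PEESE: report the PET intercept unless it differs
# significantly from zero, in which case report the PEESE intercept.
```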
Why does it matter for health & movement science?
The replication crisis and evidence of questionable research practices have cast doubt on findings across disciplines [3–8]. Compared to other disciplines (e.g., psychology [9]), health & movement science has been relatively slow to recognize issues with the replicability of findings in the field [10]. Fortunately, this has started to change [10–14]. Research on factors that might negatively affect replicability in health & movement science has revealed evidence for various questionable research practices, such as publication bias [12,13], lack of statistical power [11,13], and indicators of p-hacking [12]. The presence of such practices in original research undermines not only the trustworthiness of individual studies but also the conclusions drawn from meta-analyses that rely on them.
Open Science practices, such as open materials, open data, pre-registration of analysis plans, and registered reports, are all important steps toward improving science [15–17] and might even lead to a ‘credibility revolution’ [18]. However, it is also crucial to evaluate the extent to which an existing body of literature might be affected by questionable research practices, and how this might affect the conclusions drawn from it. Using self-talk as an example, Corcoran and Steele demonstrate this approach and provide a primer on how it can be implemented effectively [1]. In keeping with Open Science practices, their materials, data, and analyses are openly accessible. We believe this will facilitate the adoption of Bayesian methods for cumulatively updating the available evidence and make it easier for fellow researchers to comprehensively and critically assess the literature they want to meta-analyze.
References
[1] Corcoran, H. & Steele, J. Cumulative evidence synthesis and consideration of "research waste" using Bayesian methods: An example updating a previous meta-analysis of self-talk interventions for sport/motor performance. SportRxiv, ver. 2, peer-reviewed and recommended by PCI Health & Movement Sciences (2024). https://doi.org/10.51224/SRXIV.348
[2] Bartoš, F., Maier, M., Quintana, D. S. & Wagenmakers, E.-J. Adjusting for publication bias in JASP and R: Selection models, PET-PEESE, and robust Bayesian meta-analysis. Adv. Methods Pract. Psychol. Sci. 5, 25152459221109259 (2022). https://doi.org/10.1177/25152459221109259
[3] Yong, E. Replication studies: Bad copy. Nature 485, 298–300 (2012). https://doi.org/10.1038/485298a
[4] Hagger, M. S. et al. A multilab preregistered replication of the ego-depletion effect. Perspect. Psychol. Sci. 11, 546–573 (2016). https://doi.org/10.1177/1745691616652873
[5] Scheel, A. M., Schijen, M. R. M. J. & Lakens, D. An excess of positive results: Comparing the standard psychology literature with registered reports. Adv. Methods Pract. Psychol. Sci. 4, 25152459211007467 (2021). https://doi.org/10.1177/25152459211007467
[6] Perneger, T. V. & Combescure, C. The distribution of P-values in medical research articles suggested selective reporting associated with statistical significance. J. Clin. Epidemiol. 87, 70–77 (2017). https://doi.org/10.1016/j.jclinepi.2017.04.003
[7] Errington, T. M. et al. An open investigation of the reproducibility of cancer biology research. eLife 3, e04333 (2014). https://doi.org/10.7554/eLife.04333
[8] Hoffmann, S. et al. The multiplicity of analysis strategies jeopardizes replicability: lessons learned across disciplines. R. Soc. Open Sci. 8, 201925 (2021). https://doi.org/10.1098/rsos.201925
[9] Open Science Collaboration. Estimating the reproducibility of psychological science. Science 349, aac4716 (2015). https://doi.org/10.1126/science.aac4716
[10] Mesquida, C., Murphy, J., Lakens, D. & Warne, J. Replication concerns in sports and exercise science: A narrative review of selected methodological issues in the field. R. Soc. Open Sci. 9, 220946 (2022). https://doi.org/10.1098/rsos.220946
[11] Abt, G. et al. Power, precision, and sample size estimation in sport and exercise science research. J. Sports Sci. 38, 1933–1935 (2020). https://doi.org/10.1080/02640414.2020.1776002
[12] Borg, D. N., Barnett, A. G., Caldwell, A. R., White, N. M. & Stewart, I. B. The bias for statistical significance in sport and exercise medicine. J. Sci. Med. Sport 26, 164–168 (2023). https://doi.org/10.1016/j.jsams.2023.03.002
[13] Mesquida, C., Murphy, J., Lakens, D. & Warne, J. Publication bias, statistical power and reporting practices in the Journal of Sports Sciences: potential barriers to replicability. J. Sports Sci. 41, 1507–1517 (2023). https://doi.org/10.1080/02640414.2023.2269357
[14] Büttner, F., Toomey, E., McClean, S., Roe, M. & Delahunt, E. Are questionable research practices facilitating new discoveries in sport and exercise medicine? The proportion of supported hypotheses is implausibly high. Br. J. Sports Med. 54, 1365–1371 (2020). https://doi.org/10.1136/bjsports-2019-101863
[15] Chambers, C. D. & Tzavella, L. The past, present and future of Registered Reports. Nat. Hum. Behav. 6, 29–42 (2022). https://doi.org/10.1038/s41562-021-01193-7
[16] Soderberg, C. K. et al. Initial evidence of research quality of registered reports compared with the standard publishing model. Nat. Hum. Behav. 5, 990–997 (2021). https://doi.org/10.1038/s41562-021-01142-4
[17] Wunsch, K., Pixa, N. H. & Utesch, K. Open science in German sport psychology. Z. für Sportpsychol. 30, 156–166 (2023). https://doi.org/10.1026/1612-5010/a000406
[18] Korbmacher, M. et al. The replication crisis has led to positive structural, procedural, and community changes. Commun. Psychol. 1, 1–13 (2023). https://doi.org/10.1038/s44271-023-00003-2