
Latest recommendations

06 Mar 2024

Not fleeting but lasting: Limited influence of aging on implicit adaptative motor learning and its short-term retention

Does aging affect implicit motor adaptation?

Recommended by Rajiv Ranganathan based on reviews by Kevin Trewartha and Marit Ruitenberg
Motor adaptation to environmental perturbations (such as visuomotor rotations and force fields) is thought to be achieved through the interaction of implicit and explicit processes [1]. However, the extent to which these processes are affected by aging remains unclear, partly because of differences between experimental protocols. In this paper, Hermans et al. [2] ask whether the implicit component of learning is affected in older adults.
 
Using a force-field adaptation paradigm, the authors examined implicit adaptation and spontaneous recovery in healthy young and older adults. Overall, they found that neither total adaptation nor implicit adaptation was impaired in older adults. They also found evidence that spontaneous recovery was associated with implicit adaptation and was likewise unaffected by aging.
 
These results are noteworthy because they challenge some prior work in the field [3], but are also consistent with results from other experimental paradigms [4]. A main strength of the current paper is the rigor applied to testing this question. The authors provide robust, converging evidence from multiple analyses and statistical methods, and control for confounds both statistically and experimentally.
 
Readers might want to note that this is a ‘conceptual’ replication of the previous study [3], and there are some potentially important differences in experimental details, which are clearly outlined. The sensitivity of the findings to such experimental parameters needs further testing. More broadly, these results highlight the need for greater understanding of how age differences observed in other motor learning tasks [5] are reflective of deficits in learning mechanisms.
 
References
1. Taylor, J. A., & Ivry, R. B. (2011). Flexible cognitive strategies during motor learning. PLoS Computational Biology, 7(3), e1001096. https://doi.org/10.1371/journal.pcbi.1001096
2. Hermans, P., Vandevoorde, K., & Orban de Xivry, J. J. (2024). Not fleeting but lasting: Limited influence of aging on implicit adaptative motor learning and its short-term retention. bioRxiv, ver. 2, peer-reviewed and recommended by PCI Health & Movement Sciences. https://doi.org/10.1101/2023.08.30.555501
3. Trewartha, K. M., Garcia, A., Wolpert, D. M., & Flanagan, J. R. (2014). Fast but fleeting: Adaptive motor learning processes associated with aging and cognitive decline. The Journal of Neuroscience, 34(40), 13411–13421. https://doi.org/10.1523/JNEUROSCI.1489-14.2014
4. Vandevoorde, K., & Orban de Xivry, J. J. (2019). Internal model recalibration does not deteriorate with age while motor adaptation does. Neurobiology of Aging, 80, 138–153. https://doi.org/10.1016/j.neurobiolaging.2019.03.020
5. Voelcker-Rehage, C. (2008). Motor-skill learning in older adults—a review of studies on age-related differences. European Review of Aging and Physical Activity, 5, 5–16. https://doi.org/10.1007/s11556-008-0030-9
 
Not fleeting but lasting: Limited influence of aging on implicit adaptative motor learning and its short-term retention
Authors: Pauline Hermans, Koen Vandevoorde, Jean-Jacques Orban de Xivry
Abstract: In motor adaptation, learning is thought to rely on a combination of several processes. Two of these are implicit learning (incidental updating of the sensory prediction error) and explicit learning (intentional adjustment to reduce target erro...
Thematic fields: Sensorimotor Control
Recommender: Rajiv Ranganathan · Reviewers: Marit Ruitenberg, Kevin Trewartha
Submission date: 2023-09-02
20 Nov 2024

Cumulative evidence synthesis and consideration of "research waste" using Bayesian methods: An example updating a previous meta-analysis of self-talk interventions for sport/motor performance

Bayesian cumulative evidence synthesis and identification of questionable research practices in health & movement science

Recommended by Wanja Wolff based on reviews by Maik Bieleke and 1 anonymous reviewer

Research is a resource-demanding endeavor that seeks to answer questions such as, “Is there an effect?” and “How large or small is this effect?” To answer these questions as precisely as possible, meta-analysis is considered the gold standard. However, the value of meta-analytic conclusions depends greatly on the quality, comprehensiveness, and timeliness of the meta-analyzed studies, which includes not neglecting older research. Using the established sport-psychological intervention strategy of self-talk as an example, Corcoran and Steele demonstrate how Bayesian methods and statistical indicators of questionable research practices can be used to address these questions [1].

Bayesian methods enable cumulative evidence synthesis by updating prior beliefs (i.e., knowledge from an earlier meta-analysis) with new information (i.e., the studies published on the topic since that meta-analysis appeared) to arrive at a posterior belief – an updated meta-analytic effect size. This approach tells us whether, and by how much, our understanding of an effect has improved as additional evidence has accumulated, as well as the precision with which we are estimating it. Or, to put it more bluntly: how much smarter are we now with respect to the effect we are interested in?
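Under simplifying normal-normal (inverse-variance) assumptions, this updating step can be sketched numerically. All effect sizes and standard errors below are purely illustrative and are not taken from the self-talk literature:

```python
import numpy as np

# Hypothetical prior: effect size and standard error from an earlier meta-analysis.
prior_mean, prior_se = 0.48, 0.15
# Hypothetical new studies published since that meta-analysis.
new_effects = np.array([0.30, 0.55, 0.20])
new_ses = np.array([0.20, 0.25, 0.18])

# Conjugate normal-normal update: posterior precision is the sum of the
# prior precision and each new study's precision (inverse variance).
precisions = np.concatenate([[1 / prior_se**2], 1 / new_ses**2])
estimates = np.concatenate([[prior_mean], new_effects])

post_precision = precisions.sum()
post_mean = (precisions * estimates).sum() / post_precision
post_se = post_precision ** -0.5

print(f"updated effect = {post_mean:.3f} (SE = {post_se:.3f})")
```

The posterior mean is a precision-weighted average of the old summary and the new evidence, and the posterior standard error is necessarily smaller than the prior one — a numerical expression of "how much smarter are we now."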

Importantly, the credibility of this updated effect depends not only on the newly included studies but also on the reliability of the prior beliefs – that is, on the credibility of the effects summarized in the earlier meta-analysis. A set of frequentist and Bayesian statistical approaches has been introduced to assess this (for a tutorial with worked examples, see [2]). For example, methods such as the multilevel precision-effect test (PET) and precision-effect estimate with standard errors (PEESE) [2] can be used to adjust for publication bias in the meta-analyzed studies, yielding a more realistic estimate of the effect size for the topic at hand: the magnitude of the true effect in the absence of any bias favoring the publication of significant results.
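As a rough illustration of the idea behind PET (a simple weighted regression, not the multilevel variant used in the paper), one can regress observed effect sizes on their standard errors with inverse-variance weights; the intercept then estimates the effect of a hypothetical study with zero standard error, i.e., adjusted for small-study/publication bias. All numbers here are invented for the sketch:

```python
import numpy as np

# Hypothetical effect sizes that shrink as study precision grows –
# the classic small-study-effect pattern suggestive of publication bias.
effects = np.array([0.50, 0.42, 0.35, 0.15, 0.10])
ses = np.array([0.30, 0.25, 0.20, 0.10, 0.08])

w = 1 / ses**2                                   # inverse-variance weights
X = np.column_stack([np.ones_like(ses), ses])    # intercept + SE predictor

# Weighted least squares: solve (X'WX) b = X'Wy for [intercept, slope].
b = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * effects))
pet_intercept, pet_slope = b

print(f"bias-adjusted effect (PET intercept) = {pet_intercept:.3f}")
# PEESE variant: use ses**2 instead of ses as the predictor.
```

A positive slope flags small-study effects, and the intercept falls well below the naive average of the observed effects, which is exactly the sense in which PET/PEESE "adjusts" a meta-analytic estimate.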

Why does it matter for health and movement science?
The replication crisis and evidence of questionable research practices have cast doubt on various findings across disciplines [3–8]. Compared to other disciplines (e.g., psychology [9]), health & movement science has been relatively slow to recognize issues with the replicability of findings in the field [10]. Fortunately, this has started to change [10–14]. Research on factors that might negatively affect replicability in health & movement science has revealed evidence for various questionable research practices, such as publication bias [12,13], lack of statistical power [11,13], and indicators of p-hacking [12]. The presence of such practices in original research not only undermines the trustworthiness of individual studies but also the conclusions drawn from meta-analyses that rely on them.

Open Science practices, such as open materials, open data, pre-registration of analysis plans, and registered reports, are all good steps toward improving science [15–17] and might even lead to a ‘credibility revolution’ [18]. However, it is also crucial to evaluate the extent to which an existing body of literature might be affected by questionable research practices and how this might affect the conclusions drawn from it. Using self-talk as an example, Corcoran and Steele demonstrate this approach and provide a primer on how to implement it effectively [1]. Because they adhere to Open Science practices, their materials, data, and analyses are openly accessible. We believe this will facilitate the adoption of Bayesian methods for cumulatively updating available evidence and make it easier for fellow researchers to comprehensively and critically assess the literature they want to meta-analyze.
 
References
[1] Corcoran, H. & Steele, J. Cumulative evidence synthesis and consideration of "research waste" using Bayesian methods: An example updating a previous meta-analysis of self-talk interventions for sport/motor performance. SportRxiv, ver. 2, peer-reviewed and recommended by PCI Health & Movement Sciences (2024). https://doi.org/10.51224/SRXIV.348
[2] Bartoš, F., Maier, M., Quintana, D. S. & Wagenmakers, E.-J. Adjusting for publication bias in JASP and R: Selection models, PET-PEESE, and robust Bayesian meta-analysis. Adv. Methods Pract. Psychol. Sci. 5, 25152459221109259 (2022). https://doi.org/10.1177/25152459221109259
[3] Yong, E. Replication studies: Bad copy. Nature 485, 298–300 (2012). https://doi.org/10.1038/485298a
[4] Hagger, M. S. et al. A multilab preregistered replication of the ego-depletion effect. Perspect. Psychol. Sci. 11, 546–573 (2016). https://doi.org/10.1177/1745691616652873
[5] Scheel, A. M., Schijen, M. R. M. J. & Lakens, D. An excess of positive results: Comparing the standard psychology literature with registered reports. Adv. Methods Pract. Psychol. Sci. 4, 25152459211007467 (2021). https://doi.org/10.1177/25152459211007467
[6] Perneger, T. V. & Combescure, C. The distribution of P-values in medical research articles suggested selective reporting associated with statistical significance. J. Clin. Epidemiol. 87, 70–77 (2017). https://doi.org/10.1016/j.jclinepi.2017.04.003
[7] Errington, T. M. et al. An open investigation of the reproducibility of cancer biology research. eLife 3, e04333 (2014). https://doi.org/10.7554/eLife.04333
[8] Hoffmann, S. et al. The multiplicity of analysis strategies jeopardizes replicability: lessons learned across disciplines. R. Soc. Open Sci. 8, 201925 (2021). https://doi.org/10.1098/rsos.201925
[9] Open Science Collaboration. Estimating the reproducibility of psychological science. Science 349, aac4716 (2015). https://doi.org/10.1126/science.aac4716
[10] Mesquida, C., Murphy, J., Lakens, D. & Warne, J. Replication concerns in sports and exercise science: A narrative review of selected methodological issues in the field. R. Soc. Open Sci. 9, 220946 (2022). https://doi.org/10.1098/rsos.220946
[11] Abt, G. et al. Power, precision, and sample size estimation in sport and exercise science research. J. Sports Sci. 38, 1933–1935 (2020). https://doi.org/10.1080/02640414.2020.1776002
[12] Borg, D. N., Barnett, A. G., Caldwell, A. R., White, N. M. & Stewart, I. B. The bias for statistical significance in sport and exercise medicine. J. Sci. Med. Sport 26, 164–168 (2023). https://doi.org/10.1016/j.jsams.2023.03.002
[13] Mesquida, C., Murphy, J., Lakens, D. & Warne, J. Publication bias, statistical power and reporting practices in the Journal of Sports Sciences: potential barriers to replicability. J. Sports Sci. 41, 1507–1517 (2023). https://doi.org/10.1080/02640414.2023.2269357
[14] Büttner, F., Toomey, E., McClean, S., Roe, M. & Delahunt, E. Are questionable research practices facilitating new discoveries in sport and exercise medicine? The proportion of supported hypotheses is implausibly high. Br. J. Sports Med. 54, 1365–1371 (2020). https://doi.org/10.1136/bjsports-2019-101863
[15] Chambers, C. D. & Tzavella, L. The past, present and future of Registered Reports. Nat. Hum. Behav. 6, 29–42 (2022). https://doi.org/10.1038/s41562-021-01193-7
[16] Soderberg, C. K. et al. Initial evidence of research quality of registered reports compared with the standard publishing model. Nat. Hum. Behav. 5, 990–997 (2021). https://doi.org/10.1038/s41562-021-01142-4
[17] Wunsch, K., Pixa, N. H. & Utesch, K. Open science in German sport psychology. Z. Für Sportpsychol. 30, 156–166 (2023). https://doi.org/10.1026/1612-5010/a000406
[18] Korbmacher, M. et al. The replication crisis has led to positive structural, procedural, and community changes. Commun. Psychol. 1, 1–13 (2023). https://doi.org/10.1038/s44271-023-00003-2

Cumulative evidence synthesis and consideration of "research waste" using Bayesian methods: An example updating a previous meta-analysis of self-talk interventions for sport/motor performance
Authors: Hannah Corcoran, James Steele
Abstract: In the present paper we demonstrate the application of methods for cumulative evidence synthesis including Bayesian meta-analysis, and exploration of questionable research practices such as publication bias or p-hacking, in the sport a...
Thematic fields: Exercise & Sports Psychology, Meta-Science in Health & Movement
Recommender: Wanja Wolff · Reviewers: Maik Bieleke, Anonymous
Submission date: 2023-11-27