1.
The Reliability Paradox: When High Reliability Does Not Signal Reliable Detection of Experimental Effects
Wang, Shuo. 24 October 2019.
No description available.
2.
Foxes who want to be hedgehogs: Is ethical pluralism possible in psychology's replication crisis?
Sullivan, Paul W., Ackroyd, John. 21 March 2022.
In this article, we draw attention to public-private dilemmas among psychologists that help make sense of the debates around the replication crisis, citation practices, and networking practices. We argue that the values of justice and caring map onto the public and private spheres respectively, creating the horns of a dilemma for academics. While bureaucratic justice is a publicly revered value of modernity in psychological research that underpins ethics, validity, reliability, and equality of opportunity, ‘caring’ is a more subtle value of traditionalism that runs in parallel and is detected only through our psychological practices. In particular, we argue that it is detected in practices such as disputes between the replicated and their replicators in replication studies (understood as a dramatic counter-reality) as to who is more ‘careless’ with procedure; citation (including the self-care of self-citation) as thanksgiving and incantation of powerful others in enchantment rituals; and the system of professional indebtedness that accrues in ‘kinship’ networks, where kinship is understood broadly as adoption into a research group and its network. The clashes between these values can lead to a sense of hypocrisy and irony in academic life, as incommensurate values split between private and public expression. From this position, we draw on Isaiah Berlin's work on incommensurate values to suggest that ethical pluralism, involving more public recognition of the equal but different ethical demands of these values, can help overcome these everyday dilemmas in the public sphere.
3.
A registered report survey of open research practices in psychology departments in the UK and Ireland
Silverstein, P., Pennington, C.R., Branney, Peter, O'Connor, D., Lawlor, E., O'Brien, E., Lynott, D. 08 March 2024.
Open research practices seek to enhance the transparency and reproducibility of research. Whilst there is evidence of increased uptake of these practices, such as study preregistration and open data, facilitated by new infrastructure and policies, little research has assessed general uptake of such practices among university psychology researchers. The current study estimates psychologists’ level of engagement in open research practices across universities in the United Kingdom and Ireland, while also assessing possible explanatory factors that may impact their engagement. Data were collected from 602 psychology researchers in the UK and Ireland on the extent to which they have implemented various practices (e.g., use of preprints, preregistration, open data, open materials). Here we present summarised descriptive results, consider differences between various categories of researcher (e.g., career stage, subdiscipline, methodology), and examine the relationship between researchers’ practices and their self-reported capability, opportunity, and motivation (COM-B) to engage in open research practices. Results show that while there is considerable variability in engagement with open research practices, differences across career stage and subdiscipline of psychology are small by comparison. We observed consistent differences according to respondents’ research methodology and the presence of institutional support for open research. COM-B dimensions were collectively significant predictors of engagement in open research, with automatic motivation emerging as a consistently strong predictor. We discuss these findings, outline some of the challenges experienced in this study, and offer suggestions and recommendations for future research. Estimating the prevalence of responsible research practices is important for assessing sustained behaviour change in research reform, tailoring educational training initiatives, and understanding potential factors that might impact engagement. / Research funding: Aston University. Article funding: Open access funding provided by IReL.
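For readers wanting a concrete picture of the COM-B analysis described above, here is a minimal sketch of one plausible form it could take: an ordinary least squares regression of an engagement score on the six COM-B sub-components. The file name, column names, and model form are assumptions for illustration, not the paper's actual code or analysis.

```python
# A minimal sketch, assuming a survey export with one row per respondent:
# an open-research engagement score plus six COM-B subscale means.
# The published analysis may have used a different model (e.g., ordinal).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("openresearch_survey.csv")  # hypothetical survey export

results = smf.ols(
    "engagement ~ physical_capability + psychological_capability"
    " + physical_opportunity + social_opportunity"
    " + reflective_motivation + automatic_motivation",
    data=df,
).fit()

# The overall F-test asks whether the COM-B dimensions are collectively
# significant predictors; the individual coefficient picks out automatic
# motivation, the consistently strong predictor the abstract reports.
print(results.summary())
print(results.params["automatic_motivation"])
```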
4.
A Call for Open Science in Giftedness Research
McBee, Matthew T., Makel, Matthew C., Peters, Scott J., Matthews, Michael S. 01 October 2018.
Current practices in study design and data analysis have led to low reproducibility and replicability of findings in fields such as psychology, medicine, biology, and economics. Because gifted education research relies on the same underlying statistical and sociological paradigms, it is likely that it too suffers from these problems. This article discusses the origins of this poor replicability and introduces a set of open science practices that can increase the rigor and trustworthiness of gifted education’s scientific findings: preregistration, open data and open materials, registered reports, and preprints. Readers are directed to Internet resources for facilitating open science. To model these practices, a pre-peer-review preprint of this article is available at https://psyarxiv.com/nhuv3/.
5.
Evaluating the Effects of Publication Bias in Single-Case Research Design for Evidence-Based Practices in Autism Spectrum Disorder
Dowdy, Arthur G. January 2018.
In single-case research design (SCRD), experimental control is demonstrated when the researcher’s application of an intervention, the independent variable, reliably produces a change in behavior, the dependent variable, and the change is not otherwise explained by confounding or extraneous variables. SCRD studies that fail to demonstrate experimental control may never reach print: researchers may be unwilling to submit papers with null findings, and journals may be unlikely to publish null outcomes (i.e., publication bias). The resulting disproportion of positive studies in the published research literature is known as the “file drawer effect” (Rosenthal, 1979; Ferguson & Heene, 2012). Recently, researchers and policy organizations have identified evidence-based practices (EBPs) for children with autism spectrum disorder (ASD) based on systematic reviews of SCRD studies (Odom, Collet-Klingenberg, Rogers, & Hatton, 2010). However, if SCRD studies that do not demonstrate experimental control (i.e., null studies) are disproportionately unpublished due to the file drawer effect, the published literature may overrepresent positive findings, leading interventions to be deemed evidence-based that actually lack sufficient empirical support (Sham & Smith, 2014; Shadish, Zelinsky, Vevea, & Kratochwill, 2016). Social narratives, exercise, self-management, and response interruption/redirection are interventions for children with ASD that have been named EBPs by the National Autism Center’s National Standards Project (NAC, 2009) and the National Professional Development Center on Autism Spectrum Disorder (NPDC, 2010); however, these interventions have not yet been evaluated for potential publication bias. This study employed and extended methods similar to those of Sham and Smith (2014), comparing the procedures and results of published articles with those of unpublished dissertations and theses for interventions identified as EBPs, in order to assess methodological rigor and evaluate the possibility of publication bias, the file drawer effect, and lack of replication. Specifically, the results of published and unpublished studies were compared to determine whether published studies showed greater treatment effects, which would indicate the file drawer effect. SCRD quality indicators were also applied to evaluate whether published studies tend to be of higher quality, as this would mitigate the publication bias suggested by larger effect sizes (ES) in published studies. Three of the four EBPs (social narratives, antecedent exercise, and response interruption and redirection) yielded different ES for published versus unpublished studies, whereas self-management yielded similar ES for both. For social narratives and antecedent exercise, unpublished studies showed lower estimated ES than published studies; for response interruption and redirection, unpublished studies showed higher estimated ES. Study quality was generally similar across published and unpublished studies for each EBP, with the exception of antecedent exercise, for which differences in study quality were identified through visual and statistical analyses.
Lastly, no differences in treatment outcomes between published and unpublished studies were observed once study quality was considered in the analysis. Implications of the results are discussed with respect to the file drawer effect and publication bias in EBPs, and with respect to the call to increase peer-reviewed publication of negative findings and replication studies, which would help identify and establish boundary criteria for EBPs. / Special Education
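For illustration, here is a minimal sketch of one common SCRD effect size, Non-overlap of All Pairs (NAP), applied to an invented published and unpublished study of the same intervention. The data are invented and NAP is an assumed choice of metric; the dissertation may have used a different ES (e.g., Tau-U).

```python
# NAP: the proportion of (baseline, treatment) observation pairs in which
# the treatment point exceeds the baseline point, with ties counted as half.
from itertools import product

def nap(baseline, treatment):
    """Non-overlap of All Pairs for a behavior expected to increase."""
    pairs = list(product(baseline, treatment))
    wins = sum(1.0 if t > b else 0.5 if t == b else 0.0 for b, t in pairs)
    return wins / len(pairs)

# Invented series for one published and one unpublished study.
published_es = nap(baseline=[2, 3, 2, 4], treatment=[6, 7, 8, 7, 9])    # 1.00
unpublished_es = nap(baseline=[3, 4, 3, 5], treatment=[4, 5, 4, 6, 5])  # 0.80

# Aggregated over many studies, a gap like this between published and
# unpublished ES is the file-drawer signature the study tested for.
print(f"published NAP = {published_es:.2f}, unpublished NAP = {unpublished_es:.2f}")
```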