1

Change Starts With Journal Editors: In Response to Makel (2014)

McBee, Matthew T., Matthews, Michael S. 01 February 2014
The editors of the Journal of Advanced Academics comment on Makel (2014). The replicability crisis in psychology is summarized in terms of three focal issues: the "file drawer" problem, lack of replication studies, and the null hypothesis significance testing paradigm. The authors argue that journal editors are uniquely positioned to address all three of these problems via the adoption of new policies for review and publication.
2

Detecting publication bias in random effects meta-analysis: An empirical comparison of statistical methods

Rendina-Gobioff, Gianna 01 June 2006
Publication bias is one threat to validity that researchers conducting meta-analysis studies confront. Two primary goals of this research were to examine the degree to which publication bias impacts the results of a random effects meta-analysis and to investigate the performance of five statistical methods for detecting publication bias in random effects meta-analysis. Specifically, the difference between the population effect size and the estimated meta-analysis effect size, as well as the difference between the population effect size variance and the meta-analysis effect size variance, provided an indication of the impact of publication bias. In addition, the performance of five statistical methods for detecting publication bias (Begg Rank Correlation with sample size, Begg Rank Correlation with variance, Egger Regression, Funnel Plot Regression, and Trim and Fill) was estimated with Type I error rates and statistical power. The overall findings indicate that publication bias notably impacts the meta-analysis effect size and variance estimates. Poor Type I error control was exhibited in many conditions by most of the statistical methods. Even when Type I error rates were adequate, power remained small, even with larger samples and greater numbers of studies in the meta-analysis.
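For readers scanning the methods named above, a minimal sketch of Egger's regression test follows. It assumes per-study effect sizes and standard errors are on hand; the function name and toy data are illustrative and are not drawn from the dissertation.

    import numpy as np
    from scipy import stats

    def egger_test(effects, ses):
        """Egger's regression test for funnel-plot asymmetry.

        Regresses the standardized effect (effect / SE) on precision
        (1 / SE); an intercept significantly different from zero
        suggests small-study effects consistent with publication bias.
        """
        effects = np.asarray(effects, dtype=float)
        ses = np.asarray(ses, dtype=float)
        z = effects / ses              # standardized effects
        precision = 1.0 / ses          # inverse standard errors
        n = len(z)
        slope, intercept, _, _, _ = stats.linregress(precision, z)
        # Standard error of the intercept, computed by hand so the
        # sketch does not depend on newer SciPy result attributes.
        resid = z - (intercept + slope * precision)
        s2 = resid @ resid / (n - 2)
        sxx = np.sum((precision - precision.mean()) ** 2)
        se_int = np.sqrt(s2 * (1.0 / n + precision.mean() ** 2 / sxx))
        t = intercept / se_int
        p = 2.0 * stats.t.sf(abs(t), df=n - 2)
        return intercept, t, p

    # Toy example: five studies with effect sizes and standard errors.
    d = [0.41, 0.35, 0.62, 0.90, 1.10]
    se = [0.10, 0.15, 0.20, 0.30, 0.40]
    bias, t_stat, p_val = egger_test(d, se)
    print(f"intercept = {bias:.2f}, t = {t_stat:.2f}, p = {p_val:.3f}")

On data simulated without bias the intercept should hover near zero; a significant intercept flags the funnel-plot asymmetry that the dissertation's Type I error and power conditions probe.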
3

EVALUATING THE EFFECTS OF PUBLICATION BIAS IN SINGLE-CASE RESEARCH DESIGN FOR EVIDENCE-BASED PRACTICES IN AUTISM SPECTRUM DISORDER

Dowdy, Arthur G. January 2018
In single-case research design (SCRD), experimental control is demonstrated when the researcher’s application of an intervention, known as the independent variable, reliably produces a change in behavior, known as the dependent variable, and the change is not otherwise explained by confounding or extraneous variables. SCRD studies that fail to demonstrate experimental control may go unpublished because researchers may be unwilling to submit papers with null findings and journals may be unwilling to publish null outcomes (i.e., publication bias). The resulting shortage of null findings, which leaves the published research literature disproportionately populated with positive studies, is known as the “file drawer effect” (Rosenthal, 1979; Ferguson & Heene, 2012). Recently, researchers and policy organizations have identified evidence-based practices (EBPs) for children with autism spectrum disorder (ASD) based on systematic reviews of SCRD studies (Odom, Collet-Klingenberg, Rogers, & Hatton, 2010). However, if SCRD studies that do not demonstrate experimental control (i.e., null studies) are disproportionately unpublished due to the file drawer effect, positive findings may be misrepresented, leading interventions to be deemed evidence-based that actually lack sufficient empirical support (Sham & Smith, 2014; Shadish, Zelinsky, Vevea, & Kratochwill, 2016). Social narratives, exercise, self-management, and response interruption/redirection are interventions for children with ASD that have been named EBPs according to the National Autism Standards (NAC; 2009) and the National Professional Development Center on Autism Spectrum Disorder (NPDC; 2010); however, these interventions have not yet been evaluated for potential publication bias. The study employed and extended methods similar to those of Sham and Smith (2014), comparing the procedures and results of published articles with those of unpublished dissertations and theses for interventions identified as EBPs, in order to evaluate methodological rigor and the possibility of publication bias, the file drawer effect, and a lack of replication. Specifically, the results of published and unpublished studies were compared to determine whether published studies showed a greater treatment effect, which would indicate the file drawer effect. SCRD quality indicators were also applied to evaluate whether published studies tend to be of higher quality, as this would mitigate the publication bias suggested by larger effect sizes (ES) in published studies. Three of the four EBPs (social narratives, antecedent exercise, and response interruption and redirection) yielded different ES when published studies were compared with unpublished studies; in contrast, self-management yielded a similar ES for published and unpublished studies. For social narratives and antecedent exercise, unpublished studies showed lower estimated ES than published studies, whereas for response interruption and redirection, unpublished studies showed a higher estimated ES than published studies. Study quality was generally similar for published and unpublished studies within each EBP, with the exception of antecedent exercise, for which differences in study quality were identified by both visual and statistical analyses.
Lastly, no differences in treatment outcomes were observed between published and unpublished studies once study quality was considered in the analysis. Implications of the results are discussed with respect to the file drawer effect and publication bias in EBPs, and with respect to the call to increase peer-reviewed publication of negative findings and replication studies, which would help identify and establish boundary criteria for EBPs.
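The published-versus-unpublished ES comparison described in this abstract can be made concrete with a simple non-overlap effect size of the kind commonly used in SCRD. The sketch below computes Non-overlap of All Pairs (NAP) on hypothetical baseline/intervention data; it illustrates the general technique and is not the dissertation's actual metric or data.

    import numpy as np

    def nap(baseline, treatment):
        """Non-overlap of All Pairs (NAP): the share of all
        (baseline, treatment) observation pairs in which the
        treatment point exceeds the baseline point, with ties
        counted as 0.5. Ranges 0 to 1; 0.5 is chance-level overlap."""
        b = np.asarray(baseline, dtype=float)
        t = np.asarray(treatment, dtype=float)
        wins = np.sum(t[None, :] > b[:, None])
        ties = np.sum(t[None, :] == b[:, None])
        return (wins + 0.5 * ties) / (b.size * t.size)

    # Hypothetical study-level NAP values for the two pools.
    published = [nap([2, 3, 2], [5, 6, 7, 6]), nap([1, 2, 2], [4, 5, 5])]
    unpublished = [nap([2, 3, 3], [3, 4, 3, 4])]
    print(f"mean NAP, published:   {np.mean(published):.2f}")
    print(f"mean NAP, unpublished: {np.mean(unpublished):.2f}")

A gap between the pooled published and unpublished means, as in this toy output, is the signature of the file drawer effect the study looked for.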
