References
Allison, D. B., & Gorman, B. S. (1994). "Make things as simple as possible, but no simpler": A rejoinder to Scruggs and Mastropieri. Behaviour Research and Therapy, 32(8), 885–890. https://doi.org/10.1016/0005-7967(94)90170-8
Baek, E. K., Petit-Bois, M., Van den Noortgate, W., Beretvas, S. N., & Ferron, J. M. (2016). Using visual analysis to evaluate and refine multilevel models of single-case studies. The Journal of Special Education, 50(1), 18–26.
Baek, E., & Ferron, J. M. (2020). Modeling heterogeneity of the level-1 error covariance matrix in multilevel models for single-case data. Methodology, 16(2), 166–185. https://doi.org/10.5964/meth.2817
Barton, E. E., Pustejovsky, J. E., Maggin, D. M., & Reichow, B. (2017). Technology-aided instruction and intervention for students with ASD: A meta-analysis using novel methods of estimating effect sizes for single-case research. Remedial and Special Education, 38(6), 371–386. https://doi.org/10.1177/0741932517729508
Becker, B. J. (1996). The generalizability of empirical research results. In Intellectual talent: Psychometric and social issues (pp. 362–383). Johns Hopkins University Press.
Beretvas, S. N., & Chung, H. (2008). A review of meta-analyses of single-subject experimental designs: Methodological issues and practice. Evidence-Based Communication Assessment and Intervention, 2(3), 129–141. https://doi.org/10.1080/17489530802446302
Borenstein, M. (2009). Effect sizes for continuous data. In H. M. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed., pp. 221–235). Russell Sage Foundation.
Borenstein, M. (2019). Effect sizes for continuous data. In H. M. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (3rd ed.). Russell Sage Foundation.
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2021). Introduction to Meta-Analysis. John Wiley & Sons, Ltd.
Bowman-Perrott, L., Burke, M. D., Zaini, S., Zhang, N., & Vannest, K. (2016). Promoting positive behavior using the Good Behavior Game: A meta-analysis of single-case research. Journal of Positive Behavior Interventions, 18(3), 180–190. https://doi.org/10.1177/1098300715592355
Brossart, D. F., Vannest, K. J., Davis, J. L., & Patience, M. A. (2014). Incorporating nonoverlap indices with visual analysis for quantifying intervention effectiveness in single-case experimental designs. Neuropsychological Rehabilitation, 24(3–4), 464–491. https://doi.org/10.1080/09602011.2013.868361
Busk, P. L., & Serlin, R. C. (1992). Meta-analysis for single-case research. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case research design and analysis: New directions for psychology and education (pp. 187–212). Lawrence Erlbaum Associates, Inc.
Byiers, B. J., Dimian, A., & Symons, F. J. (2014). Functional communication training in Rett syndrome: A preliminary study. American Journal on Intellectual and Developmental Disabilities, 119(4), 340–350. https://doi.org/10.1352/1944-7558-119.4.340
Case, L. P., Harris, K. R., & Graham, S. (1992). Improving the mathematical problem-solving skills of students with learning disabilities: Self-regulated strategy development. The Journal of Special Education, 26(1), 1–19. https://doi.org/10.1177/002246699202600101
Casey, L. O. (1978). Development of communicative behavior in autistic children: A parent program using manual signs. Journal of Autism and Childhood Schizophrenia, 8(1), 45–59. https://doi.org/10.1007/BF01550277
Center, B. A., Skiba, R. J., & Casey, A. (1985). A methodology for the quantitative synthesis of intra-subject design research. The Journal of Special Education, 19(4), 387.
Chen, L.-T., Chen, Y.-K., Yang, T.-R., Chiang, Y.-S., Hsieh, C.-Y., Cheng, C., Ding, Q.-W., Wu, P.-J., & Peng, C.-Y. J. (2023). Examining the normality assumption of a design-comparable effect size in single-case designs. Behavior Research Methods. https://doi.org/10.3758/s13428-022-02035-8
Chen, M., & Pustejovsky, J. E. (2022). Multilevel meta-analysis of single-case experimental designs using robust variance estimation. Psychological Methods. https://doi.org/10.1037/met0000510
Collier-Meek, M. A., Fallon, L. M., & DeFouw, E. R. (2017). Toward feasible implementation support: E-mailed prompts to promote teachers’ treatment integrity. School Psychology Review, 46(4), 379–394. https://doi.org/10.17105/SPR-2017-0028.V46-4
Cook, B., Buysse, V., Klingner, J., Landrum, T., McWilliam, R., Tankersley, M., & Test, D. (2014). Council for Exceptional Children: Standards for evidence-based practices in special education. Teaching Exceptional Children, 46(6), 206.
Cooper, H. M. (2010). Research Synthesis and Meta-Analysis (4th ed.). SAGE Publications.
Cooper, H., Hedges, L. V., & Valentine, J. C. (Eds.). (2019). The handbook of research synthesis and meta-analysis (3rd ed.). Russell Sage Foundation.
Datchuk, S. M. (2016). Writing simple sentences and descriptive paragraphs: Effects of an intervention on adolescents with writing difficulties. Journal of Behavioral Education, 25(2), 166–188. https://doi.org/10.1007/s10864-015-9236-x
Datchuk, S. M., Wagner, K., & Hier, B. O. (2020). Level and trend of writing sequences: A review and meta-analysis of writing interventions for students with disabilities. Exceptional Children, 86(2), 174–192. https://doi.org/10.1177/0014402919873311
Declercq, L., Cools, W., Beretvas, S. N., Moeyaert, M., Ferron, J. M., & Van den Noortgate, W. (2020). MultiSCED: A tool for (meta-)analyzing single-case experimental data with multilevel modeling. Behavior Research Methods, 52(1), 177–192. https://doi.org/10.3758/s13428-019-01216-2
Declercq, L., Jamshidi, L., Fernández-Castilla, B., Beretvas, S. N., Moeyaert, M., Ferron, J. M., & Van den Noortgate, W. (2019). Analysis of single-case experimental count data using the linear mixed effects model: A simulation study. Behavior Research Methods, 51(6), 2477–2497. https://doi.org/10.3758/s13428-018-1091-y
Delemere, E., & Dounavi, K. (2018). Parent-implemented bedtime fading and positive routines for children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 48(4), 1002–1019. https://doi.org/10.1007/s10803-017-3398-4
Dowdy, A., Hantula, D. A., Travers, J. C., & Tincani, M. (2022). Meta-analytic methods to detect publication bias in behavior science research. Perspectives on Behavior Science, 45(1), 37–52. https://doi.org/10.1007/s40614-021-00303-0
Dowdy, A., Tincani, M., & Schneider, W. J. (2020). Evaluation of publication bias in response interruption and redirection: A meta‐analysis. Journal of Applied Behavior Analysis, 53(4), 2151–2171. https://doi.org/10.1002/jaba.724
Ferron, J. (2002). Reconsidering the use of the general linear model with single-case data. Behavior Research Methods, Instruments, & Computers, 34(3), 324–331. https://doi.org/10.3758/BF03195459
Ferron, J. M., Bell, B. A., Hess, M. R., Rendina-Gobioff, G., & Hibbard, S. T. (2009). Making treatment effect inferences from multiple-baseline data: The utility of multilevel modeling approaches. Behavior Research Methods, 41(2), 372–384. https://doi.org/10.3758/BRM.41.2.372
Ferron, J., Goldstein, H., Olszewski, A., & Rohrer, L. (2020). Indexing effects in single-case experimental designs by estimating the percent of goal obtained. Evidence-Based Communication Assessment and Intervention, 14(1–2), 6–27. https://doi.org/10.1080/17489539.2020.1732024
Gage, N. A., Beahm, L., Kaplan, R., MacSuga-Gage, A. S., & Lee, A. (2020). Using positive behavioral interventions and supports to reduce school suspensions. Beyond Behavior, 29(3), 132–140. https://doi.org/10.1177/1074295620950611
Gage, N. A., Cook, B. G., & Reichow, B. (2017). Publication bias in special education meta-analyses. Exceptional Children, 83(4), 428–445. https://doi.org/10.1177/0014402917691016
Gage, N. A., Grasley-Boy, N. M., & MacSuga-Gage, A. S. (2018). Professional development to increase teacher behavior-specific praise: A single-case design replication. Psychology in the Schools, 55(3), 264–277. https://doi.org/10.1002/pits.22106
Ganz, J. B., Pustejovsky, J. E., Reichle, J., Vannest, K. J., Foster, M., Pierson, L. M., Wattanawongwan, S., Bernal, A. J., Chen, M., Haas, A. N., Liao, C.-Y., Sallese, M. R., Skov, R., & Smith, S. D. (2023). Participant characteristics predicting communication outcomes in AAC implementation for individuals with ASD and IDD: A systematic review and meta-analysis. Augmentative and Alternative Communication, 39, 7–22. https://doi.org/10.1080/07434618.2022.2116355
Gingerich, W. J. (1984). Meta-analysis of applied time-series data. The Journal of Applied Behavioral Science, 20(1), 71–79. https://doi.org/10.1177/002188638402000113
Greenwood, K. M., & Matyas, T. A. (1990). Problems with the application of interrupted time series analysis for brief single-subject data. Behavioral Assessment, 12(3), 355–370.
Gunning, M. J., & Espie, C. A. (2003). Psychological treatment of reported sleep disorder in adults with intellectual disability using a multiple baseline design. Journal of Intellectual Disability Research, 47(3), 191–202. https://doi.org/10.1046/j.1365-2788.2003.00461.x
Hebert, M., Bohaty, J. J., Nelson, J. R., & Roehling, J. V. (2018). Writing informational text using provided information and text structures: An intervention for upper elementary struggling writers. Reading and Writing, 31(9), 2165–2190. https://doi.org/10.1007/s11145-018-9841-x
Hedges, L. V. (1981). Distribution theory for Glass’s estimator of effect size and related estimators. Journal of Educational Statistics, 6(2), 107–128.
Hedges, L. V. (2007). Effect sizes in cluster-randomized designs. Journal of Educational and Behavioral Statistics, 32(4), 341–370. https://doi.org/10.3102/1076998606298043
Hedges, L. V., & Olkin, I. (1985). Statistical Methods for Meta-Analysis. Academic Press.
Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2012). A standardized mean difference effect size for single case designs. Research Synthesis Methods, 3, 224–239. https://doi.org/10.1002/jrsm.1052
Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2013). A standardized mean difference effect size for multiple baseline designs across individuals. Research Synthesis Methods. https://doi.org/10.1002/jrsm.1086
Hedges, L. V., & Vevea, J. L. (1998). Fixed- and random-effects models in meta-analysis. Psychological Methods, 3(4), 486–504. https://doi.org/10.1037/1082-989X.3.4.486
Hembry, I., Bunuan, R., Beretvas, S. N., Ferron, J. M., & Van den Noortgate, W. (2015). Estimation of a nonlinear intervention phase trajectory for multiple-baseline design data. The Journal of Experimental Education, 83(4), 514–546. https://doi.org/10.1080/00220973.2014.907231
Hershberger, S., Wallace, D., Green, S., & Marquis, J. (1999). Meta-analysis of single-case data. In R. H. Hoyle (Ed.), Statistical strategies for small sample research (pp. 107–132). Sage Publications.
Higgins, J. P. T., Thompson, S. G., & Spiegelhalter, D. J. (2009). A re-evaluation of random-effects meta-analysis. Journal of the Royal Statistical Society: Series A (Statistics in Society), 172(1), 137–159. https://doi.org/10.1111/j.1467-985X.2008.00552.x
Hutchinson, N. L. (1993). Effects of cognitive strategy instruction on algebra problem solving of adolescents with learning disabilities. Learning Disability Quarterly, 16(1), 34–63. https://doi.org/10.2307/1511158
Jamshidi, L., Heyvaert, M., Declercq, L., Fernández-Castilla, B., Ferron, J. M., Moeyaert, M., Beretvas, S. N., Onghena, P., & Van den Noortgate, W. (2018). Methodological quality of meta-analyses of single-case experimental studies. Research in Developmental Disabilities, 79, 97–115. https://doi.org/10.1016/j.ridd.2017.12.016
Joo, S.-H., & Ferron, J. M. (2019). Application of the within- and between-series estimators to non-normal multiple-baseline data: Maximum likelihood and Bayesian approaches. Multivariate Behavioral Research, 54(5), 666–689. https://doi.org/10.1080/00273171.2018.1564877
Joo, S.-H., Ferron, J. M., Moeyaert, M., Beretvas, S. N., & Van den Noortgate, W. (2019). Approaches for specifying the level-1 error structure when synthesizing single-case data. The Journal of Experimental Education, 87(1), 55–74. https://doi.org/10.1080/00220973.2017.1409181
Knochel, A. E., Blair, K.-S. C., & Sofarelli, R. (2021). Culturally focused classroom staff training to increase praise for students with autism spectrum disorder in Ghana. Journal of Positive Behavior Interventions, 23(2), 106–117.
Konstantopoulos, S., & Hedges, L. V. (2019). Statistically analyzing effect sizes: Fixed- and random-effects models. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (3rd ed., pp. 246–280). Russell Sage Foundation.
Kratochwill, T. R., Hitchcock, J. H., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34(1), 26–38. https://doi.org/10.1177/0741932512452794
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation, Version 1.0 (pilot). What Works Clearinghouse.
Kratochwill, T. R., Horner, R. H., Levin, J. R., Machalicek, W., Ferron, J., & Johnson, A. (2021). Single-case design standards: An update and proposed upgrades. Journal of School Psychology, 89, 91–105. https://doi.org/10.1016/j.jsp.2021.10.006
Kratochwill, T. R., Levin, J. R., Horner, R. H., & Swoboda, C. M. (2014). Visual analysis of single-case intervention research: Conceptual and methodological issues. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case intervention research: Methodological and statistical advances (pp. 91–125). American Psychological Association. https://doi.org/10.1037/14376-004
Lambert, M. C., Cartledge, G., Heward, W. L., & Lo, Y. (2006). Effects of response cards on disruptive behavior and academic responding during math lessons by fourth-grade urban students. Journal of Positive Behavior Interventions, 8(2), 88.
Lewandowski, S. C. (2011). The effects of a modified version of self-regulated strategy development and self-talk internalization strategy on writing and self-talk for elementary students with attentional difficulties [PhD thesis]. https://d.lib.msu.edu/etd/28
Li, H., Luo, W., Baek, E., Thompson, C. G., & Lam, K. H. (2023). Multilevel modeling in single-case studies with count and proportion data: A demonstration and evaluation. Psychological Methods. https://doi.org/10.1037/met0000607
Maggin, D. M., Barton, E. E., Reichow, B., Lane, K. L., & Shogren, K. A. (2022). Commentary on the What Works Clearinghouse Standards and Procedures Handbook (v. 4.1) for the review of single-case research. Remedial and Special Education, 421–433. https://doi.org/10.1177/07419325211051317
Maggin, D. M., O’Keeffe, B. V., & Johnson, A. H. (2011). A quantitative synthesis of methodology in the meta-analysis of single-subject research for students with disabilities: 1985–2009. Exceptionality, 19(2), 109–135. https://doi.org/10.1080/09362835.2011.565725
Maggin, D. M., Pustejovsky, J. E., & Johnson, A. H. (2017). A meta-analysis of school-based group contingency interventions for students with challenging behavior: An update. Remedial and Special Education, 38(6), 353–370. https://doi.org/10.1177/0741932517716900
Manolov, R., Moeyaert, M., & Fingerhut, J. E. (2022). A priori justification for effect measures in single-case experimental designs. Perspectives on Behavior Science, 45(1), 153–186. https://doi.org/10.1007/s40614-021-00282-2
Manolov, R., Solanas, A., & Sierra, V. (2019). Extrapolating baseline trend in single-case data: Problems and tentative solutions. Behavior Research Methods, 51(6), 2847–2869. https://doi.org/10.3758/s13428-018-1165-x
Mason, R. A., Davis, H. S., Ayres, K. M., Davis, J. L., & Mason, B. A. (2016). Video self-modeling for individuals with disabilities: A best-evidence, single case meta-analysis. Journal of Developmental and Physical Disabilities, 28(4), 623–642. https://doi.org/10.1007/s10882-016-9484-2
Matyas, T. A., & Greenwood, K. M. (1997). Serial dependency in single case time series. In Design and analysis of single-case research (pp. 215–243). Erlbaum.
Moeyaert, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2014). From a single-level analysis to a multilevel analysis of single-case experimental designs. Journal of School Psychology, 52(2), 191–211. https://doi.org/10.1016/j.jsp.2013.11.003
Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2014). Three-level analysis of single-case experimental data: Empirical validation. The Journal of Experimental Education, 82(1), 1–21. https://doi.org/10.1080/00220973.2012.745470
Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2013). The three-level synthesis of standardized single-subject experimental data: A Monte Carlo simulation study. Multivariate Behavioral Research, 48(5), 719–748. https://doi.org/10.1080/00273171.2013.816621
Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2016). The misspecification of the covariance structures in multilevel models for single-case data: A Monte Carlo simulation study. The Journal of Experimental Education, 84(3), 473–509. https://doi.org/10.1080/00220973.2015.1065216
Moeyaert, M., Yang, P., & Xue, Y. (2023). Individual participant data meta-analysis including moderators: Empirical validation. The Journal of Experimental Education, 1–18. https://doi.org/10.1080/00220973.2023.2208062
Montgomery, P. (2004). The relative efficacy of two brief treatments for sleep problems in young learning disabled (mentally retarded) children: A randomised controlled trial. Archives of Disease in Childhood, 89(2), 125–130. https://doi.org/10.1136/adc.2002.017202
Natesan Batley, P., & Hedges, L. V. (2021). Accurate models vs. accurate estimates: A simulation study of Bayesian single-case experimental designs. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01522-0
Odom, S. L., Barton, E. E., Reichow, B., Swaminathan, H., & Pustejovsky, J. E. (2018). Between-case standardized effect size analysis of single case designs: Examination of the two methods. Research in Developmental Disabilities, 79, 88–96. https://doi.org/10.1016/j.ridd.2018.05.009
Ota, K. R., & DuPaul, G. J. (2002). Task engagement and mathematics performance in children with attention-deficit hyperactivity disorder: Effects of supplemental computer instruction. School Psychology Quarterly, 17(3), 242–257. https://doi.org/10.1521/scpq.17.3.242.20881
Owens, C. M., & Ferron, J. M. (2012). Synthesizing single-case studies: A Monte Carlo examination of a three-level meta-analytic model. Behavior Research Methods, 44(3), 795–805. https://doi.org/10.3758/s13428-011-0180-y
Parker, D. C., Dickey, B. N., Burns, M. K., & McMaster, K. L. (2012). An application of brief experimental analysis with early writing. Journal of Behavioral Education, 21(4), 329–349. https://doi.org/10.1007/s10864-012-9151-3
Parker, R. I., & Vannest, K. (2009). An improved effect size for single-case research: Nonoverlap of all pairs. Behavior Therapy, 40(4), 357–367. https://doi.org/10.1016/j.beth.2008.10.006
Parker, R. I., Vannest, K. J., Davis, J. L., & Sauber, S. B. (2011). Combining nonoverlap and trend for single-case research: Tau-U. Behavior Therapy, 42(2), 284–299. https://doi.org/10.1016/j.beth.2010.08.006
Parker, R. I., Vannest, K. J., & Davis, J. L. (2014). Non-overlap analysis for single-case research. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case intervention research: Methodological and statistical advances (pp. 127–151). American Psychological Association. https://doi.org/10.1037/14376-005
Patrona, E., Ferron, J., Olszewski, A., Kelley, E., & Goldstein, H. (2022). Effects of explicit vocabulary interventions for preschoolers: An exploratory application of the percent of goal obtained effect size metric. Journal of Speech, Language & Hearing Research, 65(12), 4821–4836. https://doi.org/10.1044/2022_JSLHR-22-00217
Peltier, C., Sinclair, T. E., Pulos, J. M., & Suk, A. (2020). Effects of schema-based instruction on immediate, generalized, and combined structured word problems. The Journal of Special Education, 54(2), 101–112. https://doi.org/10.1177/0022466919883397
Petit-Bois, M. (2014). A Monte Carlo study: The consequences of the misspecification of the level-1 error structure [PhD thesis]. https://scholarcommons.usf.edu/etd/5341
Petit-Bois, M., Baek, E. K., Van den Noortgate, W., Beretvas, S. N., & Ferron, J. M. (2016). The consequences of modeling autocorrelation when synthesizing single-case studies using a three-level model. Behavior Research Methods, 48, 803–812. https://doi.org/10.3758/s13428-015-0612-1
Pustejovsky, J. E. (2015). Measurement-comparable effect sizes for single-case studies of free-operant behavior. Psychological Methods, 20(3), 342–359. https://doi.org/10.1037/met0000019
Pustejovsky, J. E. (2018). Using response ratios for meta-analyzing single-case designs with behavioral outcomes. Journal of School Psychology, 68, 99–112. https://doi.org/10.1016/j.jsp.2018.02.003
Pustejovsky, J. E. (2019). Procedural sensitivities of effect sizes for single-case designs with directly observed behavioral outcome measures. Psychological Methods, 24(2), 217–235. https://doi.org/10.1037/met0000179
Pustejovsky, J. E., Chen, M., & Hamilton, B. (2021). scdhlm: A web-based calculator for between-case standardized mean differences.
Pustejovsky, J. E., Chen, M., & Swan, D. M. (2023). SingleCaseES: A calculator for single-case effect sizes.
Pustejovsky, J. E., & Ferron, J. (2017). Research synthesis and meta-analysis of single-case designs. In Handbook of special education (2nd ed., p. 63). Routledge.
Pustejovsky, J. E., Hedges, L. V., & Shadish, W. R. (2014). Design-comparable effect sizes in multiple baseline designs: A general modeling framework. Journal of Educational and Behavioral Statistics, 39(5), 368–393. https://doi.org/10.3102/1076998614547577
Pustejovsky, J. E., & Tipton, E. (2021). Meta-analysis with robust variance estimation: Expanding the range of working models. Prevention Science. https://doi.org/10.1007/s11121-021-01246-3
R Core Team. (2023). R: A language and environment for statistical computing. R Foundation for Statistical Computing. https://www.R-project.org/
Reichow, B., Barton, E. E., & Maggin, D. M. (2018). Development and applications of the single-case design risk of bias tool for evaluating single-case design research study reports. Research in Developmental Disabilities, 79, 53–64. https://doi.org/10.1016/j.ridd.2018.05.008
Rice, K., Higgins, J. P. T., & Lumley, T. (2018). A re-evaluation of fixed effect(s) meta-analysis. Journal of the Royal Statistical Society Series A: Statistics in Society, 181(1), 205–227. https://doi.org/10.1111/rssa.12275
Rindskopf, D. (2014). Nonlinear Bayesian analysis for single case designs. Journal of School Psychology, 52(2), 179–189. https://doi.org/10.1016/j.jsp.2013.12.003
Rodgers, D. B., Datchuk, S. M., & Rila, A. L. (2021). Effects of a text-writing fluency intervention for postsecondary students with intellectual and developmental disabilities. Exceptionality, 29(4), 310–325. https://doi.org/10.1080/09362835.2020.1850451
Rohatgi, A. (2015). WebPlotDigitizer. Zenodo. https://doi.org/10.5281/zenodo.32375
Rothstein, H. R., Sutton, A. J., & Borenstein, M. (Eds.). (2005). Publication bias in meta-analysis: Prevention, assessment, and adjustments. John Wiley & Sons.
Sallese, M. R., & Vannest, K. J. (2022). Effects of a manualized teacher-led coaching intervention on paraprofessional use of behavior-specific praise. Remedial and Special Education, 43(1), 27–39. https://doi.org/10.1177/07419325211017298
Scotti, J. R., Evans, I. M., Meyer, L. H., & Walker, P. (1991). A meta-analysis of intervention research with problem behavior: Treatment validity and standards of practice. American Journal on Mental Retardation, 96(3), 233–256.
Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987). The quantitative synthesis of single-subject research: Methodology and validation. Remedial and Special Education, 8(2), 24–33. https://doi.org/10.1177/074193258700800206
Shadish, W. R., Hedges, L. V., Horner, R. H., & Odom, S. L. (2015). The role of between-case effect size in conducting, interpreting, and summarizing single-case research (NCER 2015-002). National Center for Education Research, Institute of Education Sciences, U.S. Department of Education.
Shadish, W. R., Hedges, L. V., & Pustejovsky, J. E. (2014). Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: A primer and applications. Journal of School Psychology, 52(2), 123–147. https://doi.org/10.1016/j.jsp.2013.11.005
Shadish, W. R., Kyse, E. N., & Rindskopf, D. M. (2013). Analyzing data from single-case designs using multilevel models: New applications and some agenda items for future research. Psychological Methods, 18(3), 385–405. https://doi.org/10.1037/a0032964
Shadish, W. R., & Lecy, J. D. (2015). The meta-analytic big bang. Research Synthesis Methods, 6(3), 246–264. https://doi.org/10.1002/jrsm.1132
Shadish, W. R., & Rindskopf, D. M. (2007). Methods for evidence-based practice: Quantitative synthesis of single-subject designs. New Directions for Evaluation, 113, 95–109. https://doi.org/10.1002/ev.217
Shadish, W. R., & Sullivan, K. J. (2011). Characteristics of single-case designs used to assess intervention effects in 2008. Behavior Research Methods, 43(4), 971–980. https://doi.org/10.3758/s13428-011-0111-y
Shadish, W. R., Zelinsky, N. A. M., Vevea, J. L., & Kratochwill, T. R. (2016). A survey of publication practices of single-case design researchers when treatments have small or large effects. Journal of Applied Behavior Analysis, 49(3), 656–673. https://doi.org/10.1002/jaba.308
Sham, E., & Smith, T. (2014). Publication bias in studies of an applied behavior-analytic intervention: An initial analysis. Journal of Applied Behavior Analysis, 47(3), 663–678. https://doi.org/10.1002/jaba.146
Stotz, K. E., Itoi, M., Konrad, M., & Alber-Morgan, S. R. (2008). Effects of self-graphing on written expression of fourth grade students with high-incidence disabilities. Journal of Behavioral Education, 17(2), 172–186. https://doi.org/10.1007/s10864-007-9055-9
Strasberger, S. K., & Ferreri, S. J. (2014). The effects of peer assisted communication application training on the communicative and social behaviors of children with autism. Journal of Developmental and Physical Disabilities, 26(5), 513–526. https://doi.org/10.1007/s10882-013-9358-9
Swaminathan, H., Rogers, H. J., & Horner, R. H. (2014). An effect size measure and Bayesian analysis of single-case designs. Journal of School Psychology, 52(2), 213–230. https://doi.org/10.1016/j.jsp.2013.12.002
Swan, D. M., & Pustejovsky, J. E. (2018). A gradual effects model for single-case designs. Multivariate Behavioral Research, 53(4), 574–593. https://doi.org/10.1080/00273171.2018.1466681
Tarlow, K. R. (2017). An improved rank correlation effect size statistic for single-case designs: Baseline corrected tau. Behavior Modification, 41(4), 427–467. https://doi.org/10.1177/0145445516676750
Tate, R. L., Perdices, M., Rosenkoetter, U., Togher, L., McDonald, S., Shadish, W., Horner, R., Kratochwill, T., Barlow, D. H., Kazdin, A., Sampson, M., Shamseer, L., & Vohra, S. (2016). The Single-Case Reporting Guideline In BEhavioural Interventions (SCRIBE) 2016: Explanation and Elaboration. 22.
Taylor, J. A., Pigott, T., & Williams, R. (2022). Promoting knowledge accumulation about intervention effects: Exploring strategies for standardizing statistical approaches and effect size reporting. Educational Researcher, 51(1), 72–80. https://doi.org/10.3102/0013189X211051319
Tipton, E. (2015). Small sample adjustments for robust variance estimation with meta-regression. Psychological Methods, 20(3), 375–393. https://doi.org/10.1037/met0000011
Tipton, E., & Pustejovsky, J. E. (2015). Small-sample adjustments for tests of moderators and model fit using robust variance estimation in meta-regression. Journal of Educational and Behavioral Statistics, 40(6), 604–634. https://doi.org/10.3102/1076998615606099
Van den Noortgate, W., López-López, J. A., Marín-Martínez, F., & Sánchez-Meca, J. (2013). Three-level meta-analysis of dependent effect sizes. Behavior Research Methods, 45, 576–594. https://doi.org/10.3758/s13428-012-0261-6
Van den Noortgate, W., & Onghena, P. (2003). Combining single-case experimental data using hierarchical linear models. School Psychology Quarterly, 18(3), 325. https://doi.org/10.1521/scpq.18.3.325.22577
Van den Noortgate, W., & Onghena, P. (2003). Hierarchical linear models for the quantitative integration of effect sizes in single-case research. Behavior Research Methods, Instruments, & Computers, 35(1), 1–10. https://doi.org/10.3758/BF03195492
Van den Noortgate, W., & Onghena, P. (2008). A multilevel meta-analysis of single-subject experimental design studies. Evidence-Based Communication Assessment and Intervention, 2(3), 142–151. https://doi.org/10.1080/17489530802505362
Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36, 1–48. https://doi.org/10.18637/jss.v036.i03
Walker, B. D., Shippen, M. E., Houchins, D. E., & Cihak, D. F. (2007). Improving the writing skills of high school students with learning disabilities using the Expressive Writing program. International Journal of Special Education, 22(2), 66–76.
Walker, B., Shippen, M. E., Alberto, P., Houchins, D. E., & Cihak, D. F. (2005). Using the expressive writing program to improve the writing skills of high school students with learning disabilities. Learning Disabilities Research & Practice, 20(3), 175–183. https://doi.org/10.1111/j.1540-5826.2005.00131.x
What Works Clearinghouse. (2020a). What Works Clearinghouse Procedures and Standards Handbook (Version 5.0). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
What Works Clearinghouse. (2020b). What Works Clearinghouse Standards Handbook (Version 4.1). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
White, O. R. (1987). Some comments concerning "The quantitative synthesis of single-subject research". Remedial and Special Education, 8(2), 34–39. https://doi.org/10.1177/074193258700800207
Wood, S. G., Moxley, J. H., Tighe, E. L., & Wagner, R. K. (2018). Does use of text-to-speech and related read-aloud tools improve reading comprehension for students with reading disabilities? A meta-analysis. Journal of Learning Disabilities, 51(1), 73–84. https://doi.org/10.1177/0022219416688170
Zimmerman, K. N., Ledford, J. R., Severini, K. E., Pustejovsky, J. E., Barton, E. E., & Lloyd, B. P. (2018). Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor. Research in Developmental Disabilities, 79, 19–32. https://doi.org/10.1016/j.ridd.2018.02.003