Elizabeth Tipton

Associate Professor of Statistics

PhD, Statistics, Certificate in Education Science, Northwestern University, 2011

Statistician Elizabeth Tipton’s research focuses on the design and analysis of field experiments, particularly on their external validity and on how to make causal generalizations from them. She is developing methods and tools to improve the generalizability of large randomized trials, especially in education and psychology. Her research in meta-analysis—the combination of results across many smaller trials—examines modeling and adjusting for dependence between effect sizes.

Tipton’s research has been published in the Journal of Educational and Behavioral Statistics, Statistics in Medicine, Psychological Methods, Evaluation Review, and the Journal of Research on Educational Effectiveness, among others. Her work has been supported by the National Science Foundation, the Institute of Education Sciences, the Spencer Foundation, and the Raikes Foundation.

Current Research

The Generalizer. Under a previous grant from the Spencer Foundation, Tipton developed The Generalizer, a free web tool for developing sampling and recruitment plans for randomized trials in K-12 education and for assessing how well a study's results generalize to different policy-relevant populations. In a recent study funded by the Institute of Education Sciences (IES), Tipton and Jessaca Spybrook of Western Michigan University are updating The Generalizer to include modules on power analysis, creating a one-stop shop for designing randomized trials in education. In another IES-funded study, Tipton and Michael Weiss of MDRC are extending The Generalizer to include data on postsecondary schools. Tipton and postdoctoral fellow Katie Coburn are also developing an R package ('generalize') that replicates The Generalizer with user-specified data, extending its use beyond education research.
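
The statistical idea behind these tools can be illustrated briefly. Below is a minimal sketch, in Python rather than the package's own R, of one common formulation of Tipton's generalizability index: fit a sampling propensity score distinguishing the trial sample from the target population, then measure the overlap (Bhattacharyya coefficient) of the two propensity-score distributions. The data, model, and variable names here are hypothetical illustrations, not The Generalizer's or the 'generalize' package's actual interface.

```python
# Hedged sketch of a generalizability (B) index: propensity-score overlap
# between a trial sample and a target population. All data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical school-level covariates: target population vs. recruited sample
pop = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(5000, 2))
sample = rng.normal(loc=[0.4, 0.2], scale=0.9, size=(60, 2))

X = np.vstack([pop, sample])
z = np.r_[np.zeros(len(pop)), np.ones(len(sample))]  # 1 = in the trial sample

# Sampling propensity score: P(in sample | covariates)
ps = LogisticRegression().fit(X, z).predict_proba(X)[:, 1]

# Bhattacharyya overlap of the two propensity distributions: 1 = the sample
# looks just like the population on these covariates, 0 = no overlap.
grid = np.linspace(ps.min(), ps.max(), 512)
f_pop = gaussian_kde(ps[z == 0])(grid)
f_smp = gaussian_kde(ps[z == 1])(grid)
b_index = np.trapz(np.sqrt(f_pop * f_smp), grid)
print(f"Generalizability (B) index: {b_index:.2f}")
```

A low index signals that reweighting or targeted recruitment is needed before results can credibly generalize to that population, which is the decision the web tool supports.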

Meta-Analysis. With former IPR graduate research assistant James Pustejovsky (now at the University of Wisconsin–Madison), Tipton is continuing to develop small-sample methods for improved hypothesis testing with cluster-robust variance estimation (CRVE). CRVE is used when a meta-analysis includes multiple effect sizes per study, and recent innovations include new working models for its implementation. Tipton also co-teaches a one-week workshop on advanced meta-analysis methods (MATI), funded by IES, as well as a one-week workshop on introductory meta-analysis methods (MMARI), supported by the National Science Foundation.
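
For readers unfamiliar with CRVE, the sketch below shows the basic ("CR0") sandwich estimator for a meta-regression with several effect sizes per study; the simulated data and all names are hypothetical. The small-sample methods Pustejovsky and Tipton develop refine this estimator with adjustment matrices and Satterthwaite degrees of freedom, which are not shown here.

```python
# Hedged sketch of basic cluster-robust variance estimation (CR0) for a
# meta-regression in which each study contributes several effect sizes.
import numpy as np

rng = np.random.default_rng(1)

n_studies, es_per_study = 20, 3
study = np.repeat(np.arange(n_studies), es_per_study)
x = rng.normal(size=study.size)                 # one moderator
u = rng.normal(size=n_studies)                  # shared study-level shock
T = 0.3 + 0.2 * x + u[study] + rng.normal(scale=0.5, size=study.size)

X = np.column_stack([np.ones_like(x), x])       # design matrix
W = np.eye(study.size)                          # weights (identity, for simplicity)

bread = np.linalg.inv(X.T @ W @ X)
beta = bread @ X.T @ W @ T
resid = T - X @ beta

# "Meat": sum over studies of X_j' e_j e_j' X_j, which lets effect sizes
# from the same study be dependent in arbitrary, unmodeled ways.
meat = np.zeros((2, 2))
for j in range(n_studies):
    idx = study == j
    Xj, ej = X[idx], resid[idx]
    meat += Xj.T @ np.outer(ej, ej) @ Xj

V_cr = bread @ meat @ bread                     # CR0 sandwich estimator
se = np.sqrt(np.diag(V_cr))
print("beta:", beta.round(3), " robust SE:", se.round(3))
```

With only a handful of studies, CR0 standard errors are biased downward, which is precisely why the small-sample corrections this research program develops matter in practice.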

Generalization and Heterogeneity. Over the past 10 years, Tipton has pursued a line of research on research designs and statistical methods for improving the generalizability of results from randomized trials in education. In a recent paper, she, Western Michigan University's Jessaca Spybrook, and IPR graduate research assistant Katie Fitzgerald evaluate the generalizability of findings from 37 efficacy and effectiveness grants funded by IES. These analyses indicate that, overall, small school districts and rural schools are underrepresented in evaluations. More recently, Tipton has begun extending this line of research to methods for understanding how the effects of interventions vary across populations and subgroups, with a focus on improving research designs and sampling procedures. This latter work grew out of a collaboration with the National Study of Learning Mindsets, an evaluation of a growth mindset intervention conducted in a probability sample of U.S. high schools.

Selected Publications

Tipton, E. Forthcoming. Beyond the ATE: Designing randomized trials to understand treatment effect heterogeneity. Journal of the Royal Statistical Society: Series A.

Tipton, E. Forthcoming. Sample selection in randomized trials with multiple target populations. American Journal of Evaluation.

Tipton, E., J. Spybrook, K. Fitzgerald, Q. Wang, and C. Davidson. Forthcoming. Towards a system of evidence for all: Current practices and future opportunities in 37 randomized trials. Educational Researcher.

Tipton, E., D. Yeager, B. Schneider, and R. Iachan. 2019. Designing probability samples to identify sources of treatment effect heterogeneity. In Experimental Methods in Survey Research: Techniques that Combine Random Sampling with Random Assignment, ed. P.J. Lavrakas, 435–56. New York: Wiley.

Yeager, D., P. Hanselman, G. Walton, J. Murray, R. Crosnoe, C. Muller, E. Tipton, ... and C. Dweck. 2019. A national experiment reveals where a growth mindset improves achievement. Nature 573: 364–69.

Tipton, E., and R. Olsen. 2018. A review of statistical methods for generalizing from evaluations of educational interventions. Educational Researcher 47(8): 516–24.

Pustejovsky, J., and E. Tipton. 2018. Small sample methods for cluster-robust variance estimation and hypothesis testing in fixed effects models. Journal of Business and Economic Statistics 36(4): 672–83.

Tipton, E., and L. Peck. 2017. A design-based approach to improve external validity in welfare policy evaluations. Evaluation Review (Special Issue: External Validity 1) 41(4): 326–56.

Tipton, E., and J. Shuster. 2017. A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach. Statistics in Medicine 36(23): 3621–35.

Tipton, E., and J. Pustejovsky. 2015. Small-sample adjustments to multivariate hypothesis tests in robust variance estimation in meta-regression. Journal of Educational and Behavioral Statistics 40(6): 604–34.

Tipton, E. 2015. Small sample adjustments for robust variance estimation with meta-regression. Psychological Methods 20(3): 375–93.