Elizabeth Tipton

Professor of Statistics

PhD, Statistics, Certificate in Education Science, Northwestern University, 2011

Statistician Elizabeth Tipton’s research focuses on the design and analysis of field experiments, particularly on their external validity and on how to make causal generalizations from them. She is developing methods and tools to improve the generalizability of large randomized trials, especially in education and psychology. Her research in meta-analysis—the combination of results across many smaller trials—examines modeling and adjusting for dependence between effect sizes.

Tipton’s research has been published in the Journal of Educational and Behavioral Statistics, Statistics in Medicine, Psychological Methods, Evaluation Review, and the Journal of Research on Educational Effectiveness, among others. Her work has been supported by the National Science Foundation, the Institute of Education Sciences, the Spencer Foundation, and the Raikes Foundation.

Current Research

The Generalizer. Under a previous grant from the Spencer Foundation, Tipton developed The Generalizer, a free web tool for developing sampling and recruitment plans for randomized trials in K-12 education and for assessing the generalizability of a study's results to different policy-relevant populations. In a recent research study funded by the Institute of Education Sciences (IES), Tipton and Jessaca Spybrook of Western Michigan University are updating The Generalizer to include modules on power analysis, creating a one-stop shop for designing randomized trials in education. At the same time, in another IES-funded study, Tipton and Michael Weiss of MDRC are extending The Generalizer to include data on postsecondary schools. Tipton is also developing an R package ('generalize') that replicates The Generalizer with user-specified data, extending its use beyond education research.
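
The logic behind such generalization tools can be sketched briefly. The example below is a minimal Python illustration, not The Generalizer's actual code or the 'generalize' package's API: it models each population unit's probability of appearing in a trial sample from covariates, then uses inverse-odds weights to reweight sample results toward the target population. All data, covariates, and coefficients here are simulated.

```python
# Minimal sketch of propensity-based generalization weighting on simulated
# data; this is not The Generalizer's implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical target population of 5,000 schools with three standardized
# covariates (e.g., enrollment, % free/reduced-price lunch, urbanicity).
pop = rng.normal(size=(5000, 3))

# Selection into the trial depends on the first covariate, so the realized
# sample is not representative of the population.
in_sample = rng.random(5000) < 1 / (1 + np.exp(-1.5 * pop[:, 0]))

# Estimate P(sampled | covariates) on the full population frame.
ps = LogisticRegression().fit(pop, in_sample).predict_proba(pop)[:, 1]
w = (1 - ps[in_sample]) / ps[in_sample]  # inverse-odds weights for sampled units
w /= w.mean()

# Simulated site-level effects that vary with the same covariate.
effects = 0.30 + 0.20 * pop[in_sample, 0] + rng.normal(0, 0.05, in_sample.sum())
print(f"unweighted (sample) average effect: {effects.mean():.3f}")
print(f"reweighted (population) estimate:   {np.average(effects, weights=w):.3f}")
```

Because selection depended on a covariate that also moderates the effect, the unweighted sample average is biased for the population; the weighted estimate pulls it back toward the population value.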

Meta-Analysis. With former IPR graduate research assistant James Pustejovsky (now at the University of Wisconsin–Madison), Tipton continues to develop small-sample methods for improved hypothesis testing with cluster-robust variance estimation (CRVE). CRVE is used when a meta-analysis includes multiple effect sizes per study, and recent innovations include new working models for its implementation. Additionally, Tipton co-teaches a one-week workshop on advanced meta-analysis methods (MATI), funded by IES, as well as a one-week workshop on introductory meta-analysis methods (MMARI), supported by the National Science Foundation.
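
The mechanics of CRVE can be illustrated with a short sketch. The code below fits a meta-regression by ordinary least squares on simulated effect sizes and computes a sandwich variance that is robust to dependence among effect sizes from the same study. For simplicity it applies a basic CR1-type small-cluster multiplier, not the CR2 small-sample correction developed in this line of work.

```python
# Minimal sketch of cluster-robust variance estimation for a meta-regression
# with several effect sizes per study; simulated data, CR1-type correction.
import numpy as np

rng = np.random.default_rng(1)
m = 20                          # number of studies
k = rng.integers(2, 6, size=m)  # effect sizes per study
study = np.repeat(np.arange(m), k)
n = study.size

# Design: intercept plus one moderator; study-level shocks induce dependence.
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([0.30, 0.10]) + rng.normal(0, 0.10, m)[study] + rng.normal(0, 0.10, n)

# Working-model point estimates (unweighted OLS for simplicity).
bread = np.linalg.inv(X.T @ X)
beta = bread @ X.T @ y
e = y - X @ beta

# Sandwich "meat": sum over studies of X_j' e_j e_j' X_j.
meat = np.zeros((2, 2))
for j in range(m):
    Xj, ej = X[study == j], e[study == j]
    meat += Xj.T @ np.outer(ej, ej) @ Xj

V = (m / (m - 1)) * (bread @ meat @ bread)  # CR1-type multiplier
print("estimates:", beta)
print("cluster-robust SEs:", np.sqrt(np.diag(V)))
```

The key design choice is that residuals are aggregated study by study, so within-study correlation enters the variance estimate even though the working model ignores it.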

Generalization and Heterogeneity. Over the past 10 years, Tipton has pursued a line of research developing research designs and statistical methods for improving the generalizability of results from randomized trials in education. In a recent paper, she, Western Michigan University's Jessaca Spybrook, and former IPR graduate research assistant Katie Fitzgerald, now at Azusa Pacific University, evaluate the generalizability of findings from 37 efficacy and effectiveness grants funded by IES. These analyses indicate that, overall, small school districts and rural schools are underrepresented in evaluations. More recently, Tipton has begun to extend this line of research to improving methods for understanding how the effects of interventions vary across populations and subgroups, with a focus on improving research designs and sampling procedures. This latter work grew out of a collaboration with the National Study of Learning Mindsets, an evaluation of a growth mindset intervention conducted in a probability sample of U.S. high schools.
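
As a stylized illustration of what such designs aim to recover, the sketch below simulates a randomized trial and estimates a treatment-by-moderator interaction; the moderator, sample, and coefficients are hypothetical. Detecting this kind of heterogeneity reliably is what motivates deliberate sampling across the moderator's range in the target population.

```python
# Illustrative sketch: estimating treatment effect heterogeneity with a
# treatment-by-moderator interaction. All data are simulated.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
z = rng.integers(0, 2, size=n).astype(float)  # randomized treatment indicator
x = rng.normal(size=n)                        # moderator (e.g., prior achievement)

# True model: average effect 0.20 at x = 0, growing by 0.15 per SD of x.
y = 0.20 * z + 0.15 * z * x + 0.50 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), z, x, z * x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"average effect at x = 0:        {beta[1]:.3f}")
print(f"heterogeneity (effect per SD):  {beta[3]:.3f}")
```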

Selected Publications

Pustejovsky, J., and E. Tipton. 2022. Meta-analysis with robust variance estimation: Expanding the range of working models. Prevention Science 23: 425–38.

Tipton, E. 2022. Sample selection in randomized trials with multiple target populations. American Journal of Evaluation 43(1): 70–89.

Bryan, C., E. Tipton, and D. Yeager. 2021. Behavioral science is unlikely to change the world without a heterogeneity revolution. Nature Human Behaviour 5: 980–89.

Tipton, E. 2021. Beyond the ATE: Designing randomized trials to understand treatment effect heterogeneity. Journal of the Royal Statistical Society: Series A 184(2): 504–21.

Tipton, E., J. Spybrook, K. Fitzgerald, Q. Wang, and C. Davidson. 2021. Towards a system of evidence for all: Current practices and future opportunities in 37 randomized trials. Educational Researcher 50(3): 145–56.

Tipton, E., D. Yeager, B. Schneider, and R. Iachan. 2019. Designing probability samples to identify sources of treatment effect heterogeneity. In Experimental Methods in Survey Research: Techniques that Combine Random Sampling with Random Assignment, ed. P.J. Lavrakas, 435–56. New York: Wiley.

Yeager, D., P. Hanselman, G. Walton, J. Murray, R. Crosnoe, C. Muller, E. Tipton, ... and C. Dweck. 2019. A national experiment reveals where a growth mindset improves achievement. Nature 573: 364–69.

Tipton, E., and R. Olsen. 2018. A review of statistical methods for generalizing from evaluations of educational interventions. Educational Researcher 47(8): 516–24.

Pustejovsky, J., and E. Tipton. 2018. Small-sample methods for cluster-robust variance estimation and hypothesis testing in fixed effects models. Journal of Business and Economic Statistics 36(4): 672–83.