Openness, transparency, and reproducibility in research are critical to scientific progress. Yet, according to a 2016 internet-based survey of 1,576 researchers, 90% of respondents felt there was a slight or significant reproducibility crisis.1 Moreover, there are longstanding concerns that scientific journals favor positive or significant results—often referred to as publication bias—which may produce bodies of evidence that are incomplete or misleading, because studies that report null findings never appear in the literature.2 Publication bias can also arise when researchers choose not to disseminate their findings after obtaining null results. For example, Franco and colleagues3 reviewed 221 sociological studies conducted from 2000 to 2012 and recorded by Time-sharing Experiments for the Social Sciences, and found that nearly 65% of the studies reporting null findings were never written up. Disseminating null findings is important because they help shape the knowledge base in a given research area while informing other researchers of alternative hypotheses to examine. Fortunately, there are several recent and ongoing efforts to mitigate these concerns.

One approach to improve openness, transparency, and reproducibility in the social sciences is to publish a pre-analysis plan.4,5 Akin to pre-registering a clinical trial in the biomedical sciences, social scientists can use a pre-analysis plan to specify and archive their research objectives, hypotheses, data sources, and methodological approaches prior to conducting analyses. Pre-analysis plans serve both the researcher and consumer of research by raising the credibility and reliability of results obtained.

There are several advantages to using pre-analysis plans. For one, pre-analysis plans serve as protection against p-hacking, or significance searching, the practice of combing through data for patterns that can be presented as statistically significant. This safeguard is especially important in settings with a large number of potential outcomes.6 Pre-analysis plans are also a useful tool for carefully thinking through and designing a research study, ensuring that researchers define clear, testable hypotheses for each research question of interest.
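To see why the many-outcomes setting is where this safeguard matters most, consider a quick simulation (a hypothetical illustration, not drawn from any of the cited studies). Under a true null hypothesis, p-values are uniformly distributed, so a researcher who tests 20 outcomes and reports whichever clears p < 0.05 will find at least one "significant" result in roughly 1 − 0.95^20 ≈ 64% of studies:

```python
import numpy as np

rng = np.random.default_rng(0)

n_studies = 10_000   # simulated studies with no true effects
n_outcomes = 20      # outcomes tested per study
alpha = 0.05

# Under the null, each outcome's p-value is uniform on [0, 1].
pvals = rng.uniform(size=(n_studies, n_outcomes))

# Share of null studies where at least one outcome looks "significant".
share = (pvals < alpha).any(axis=1).mean()
print(round(share, 2))  # theory: 1 - 0.95**20, about 0.64
```

A pre-analysis plan blunts this by committing the researcher to a fixed set of outcomes and tests before the data are examined, so readers can distinguish confirmatory results from after-the-fact pattern finding.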

It should also be noted that some researchers have raised concerns over the potential pitfalls of using pre-analysis plans.6-8 For example, some scholars argue that pre-analysis plans inhibit exploratory work.9 Researchers have also proposed alternative strategies to improve openness, transparency, and reproducibility in research that do not necessarily require a pre-analysis plan.10,11 Nevertheless, a pre-analysis plan can serve as a framework for carefully planning your research study while improving the quality and transparency of your research.
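One such alternative, the split-sample approach discussed in the work cited above,10,11 can be sketched as follows. This is a simplified, hypothetical illustration (simulated data, standardized outcomes with known unit variance, a z-test rather than the exact procedures those papers develop): explore freely on one half of the data, then run a single pre-committed confirmatory test on the held-out half.

```python
import numpy as np

rng = np.random.default_rng(1)

n, k = 400, 10                 # subjects per arm, candidate outcomes
effects = np.zeros(k)
effects[3] = 0.5               # one genuine effect; the other nine are null

treat = rng.normal(effects, 1.0, size=(n, k))
control = rng.normal(0.0, 1.0, size=(n, k))

# Split each arm: explore on the first half, confirm on the held-out half.
half = n // 2
explore_t, confirm_t = treat[:half], treat[half:]
explore_c, confirm_c = control[:half], control[half:]

# Exploration half: look at everything, pick the most promising outcome.
se = np.sqrt(2.0 / half)       # std. error of a mean difference (sd = 1)
z_explore = (explore_t.mean(0) - explore_c.mean(0)) / se
chosen = int(np.abs(z_explore).argmax())

# Confirmation half: one pre-committed test on fresh data, so the
# exploratory search does not inflate the false-positive rate here.
z_confirm = (confirm_t[:, chosen].mean() - confirm_c[:, chosen].mean()) / se
print(f"chosen outcome: {chosen}, confirmatory z: {z_confirm:.2f}")
```

The price of this flexibility is statistical power: each stage sees only half the data, which is part of the trade-off those authors analyze.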

If you are interested in learning more about pre-analysis plans, there are several resources available. For example, the Berkeley Initiative for Transparency in the Social Sciences features a wealth of information on pre-analysis plans on its website. You can also find templates and checklists for pre-analysis plans through the Center for Open Science, which also offers resources for archiving pre-analysis plans and an expansive list of frequently asked questions.


1.        Baker M. 1,500 scientists lift the lid on reproducibility. Nature News. 2016;533(7604):452.

2.        Begg CB. Publication bias. The handbook of research synthesis. 1994;25:299-409.

3.        Franco A, Malhotra N, Simonovits G. Publication bias in the social sciences: Unlocking the file drawer. Science. 2014;345(6203):1502-1505.

4.        Miguel E, Camerer C, Casey K, et al. Promoting transparency in social science research. Science. 2014;343(6166):30-31.

5.        Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proceedings of the National Academy of Sciences. 2018;115(11):2600-2606.

6.        Casey K, Glennerster R, Miguel E. Reshaping institutions: Evidence on aid impacts using a preanalysis plan. The Quarterly Journal of Economics. 2012;127(4):1755-1812.

7.        Olken BA. Promises and perils of pre-analysis plans. Journal of Economic Perspectives. 2015;29(3):61-80.

8.        Coffman LC, Niederle M. Pre-analysis plans have limited upside, especially where replications are feasible. Journal of Economic Perspectives. 2015;29(3):81-98.

9.        Gelman A. Preregistration of studies and mock reports. Political Analysis. 2013;21(1):40-41.

10.     Fafchamps M, Labonne J. Using split samples to improve inference on causal effects. Political Analysis. 2017;25(4):465-482.

11.     Anderson ML, Magruder J. Split-sample strategies for avoiding false discoveries. National Bureau of Economic Research; 2017.


Jordan Weiss

Jordan Weiss is a demographer who studies population health and inequality and a D-Lab Data Science Fellow. A central theme of his work concerns the integration of theories across multiple disciplines with advances in statistical and computational science to inform research design and translate findings into actionable, policy-relevant information. Jordan earned his PhD in Demography and Sociology from the University of Pennsylvania, where he also earned an MA in Statistics.