Chris Fariss is an Assistant Professor in the Department of Political Science at the University of Michigan. Prior to beginning this appointment, he was the Jeffrey L. Hyde and Sharon D. Hyde and Political Science Board of Visitors Early Career Professor in Political Science in the Department of Political Science at Penn State University. In June 2013, he graduated with a Ph.D. in political science from the University of California, San Diego. He also studied at the University of North Texas, where he graduated with an M.S. in political science (2007), a B.F.A. in drawing and painting (2005), and a B.A. in political science (2005). His core research focuses on the politics and measurement of human rights, discrimination, violence, and repression. Chris uses computational methods to understand why governments around the world torture, maim, and kill individuals within their jurisdiction, and to understand the processes monitors use to observe and document these abuses. Other projects cover a broad array of themes but share a focus on computationally intensive methods and research design. These methodological tools, essential for analyzing data at massive scale, open up new insights into the micro-foundations of state repression and the politics of measurement. Below you will find links to his publications, working papers, teaching material, a Dataverse archive where you can access replication data, and links to human rights data generated from several measurement projects.
Course Content

This class will provide graduate students with an introduction to the scientific method and an overview of how to apply it to the study of politics. Students will learn the fundamentals of the scientific method and, through research design, how to improve both causal inference and the measurement of political phenomena.
Course Objectives

Students will learn how to recognize and engage with the research design tools used to generate valid causal inferences.
Course Prerequisites

I assume no prior exposure to the research design concepts covered in this class.
Course Details

• I will begin each class day with a lecture on the class material (approximately 90 minutes).
• After each lecture, students will discuss two or three articles as they relate to the research design topics of that lecture (approximately 30-45 minutes for each article).
• Finally, we will practice implementing the design using an applied example in the R programming language.
Course Books

1. Trochim, William and James P. Donnelly. 2007. The Research Methods Knowledge Base, 3rd Edition. Cincinnati, OH: Atomic Dog Publishing. http://www.socialresearchmethods.net/kb/
2. Additional articles and chapters are listed below. Copies of these readings will be provided by the instructor.
Types of Course Readings
Lecture Readings provide the core content for each course topic. You should read each of these readings closely prior to the class day.
Discussion Readings illustrate the use of the research design tools discussed for each course topic. You should read each of these readings closely prior to the class day.
Supplementary Readings provide additional details about the research design tools discussed in the course. You are not required to read these before class, but I recommend reviewing them once the course is completed. These readings may also help you when choosing from the more advanced topics covered in the summer school.
Related Essex Summer School Courses lists other summer school courses that cover the topics in this course in much greater depth. Note that not every summer school course is included in these lists. In the reading list below, author names in bold are current or former Essex Summer School instructors.
Day 1: Designing Validity for Experimental and Observational Studies
Trochim and Donnelly. Ch 6: “Design.”
Trochim and Donnelly. Ch 7: “Experimental Design.”
Rubin, Donald B. 2008. “For Objective Causal Inference, Design Trumps Analysis.” Annals of Applied Statistics 2(3):808-840. https://doi.org/10.1214/08-AOAS187
Flynn, D.J. and Yanna Krupnikov. 2018. “Motivations and Misinformation: Why People Retain Some Errors but Quickly Dismiss Others.” Journal of Experimental Political Science 6(1):5-15. https://doi.org/10.1017/XPS.2018.12
Slough, Tara and Christopher J. Fariss. Forthcoming. “Misgovernance and Human Rights: Experimental Evidence of Illegal Detention without Intent.”
Abadie, Alberto and Matias D. Cattaneo. 2018. “Econometric Methods for Program Evaluation.” Annual Review of Economics 10:465-503.
Imai, Kosuke, Luke J. Keele, Dustin Tingley, and Teppei Yamamoto. 2011. “Unpacking the Black Box of Causality: Learning about Causal Mechanisms from Experimental and Observational Studies.” American Political Science Review 105(4):765-789. https://doi.org/10.1017/S0003055411000414
Sinclair, Betsy, Margaret McConnell, and Donald P. Green. 2012. “Detecting Spillover Effects: Design and Analysis of Multilevel Experiments.” American Journal of Political Science 56(4):1055-1069. https://doi.org/10.1111/j.1540-5907.2012.00592.x
Shadish, William R. 2010. “Campbell and Rubin: A Primer and Comparison of Their Approaches to Causal Inference in Field Settings.” Psychological Methods 15(1):3-17. https://doi.org/10.1037/a0015916
Abadie, Alberto, Alexis Diamond and Jens Hainmueller. 2014. “Comparative Politics and the Synthetic Control Method.” American Journal of Political Science 59(2):495-510. https://doi.org/10.1111/ajps.12116
Lijphart, Arend. 1971. “Comparative Politics and the Comparative Method.” American Political Science Review 65(3):682-693.
Lustik, Ian S. 1996. “History, Historiography, and Political Science: Multiple Historical Records and the Problem of Selection Bias.” American Political Science Review 90(3):605-618. https://doi.org/10.2307/2082612
Mosley, Layna. 2013. “‘Just Talk to People’? Interviews in Contemporary Political Science.” In Interview Research in Political Science, edited by Layna Mosley. Ithaca, NY: Cornell University Press.
Nielsen, Richard A. 2016. “Case Selection via Matching.” Sociological Methods and Research 45(3):569-597.
Day 3: Quasi-Experimental Designs: Non-Equivalent Group Designs
Trochim and Donnelly. Ch 9: “Quasi-Experimental Design.”
Trochim and Donnelly. Ch 10: “Advanced Design Topics.”
Card, David, and Alan B. Krueger. 1994. “Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania.” American Economic Review 84(4):772-793. https://doi.org/10.3386/w4509
Campbell, Donald T. and H. Laurence Ross. 1968. “Analysis of Data on the Connecticut Speeding Crackdown as a Time-Series Quasi-Experiment.” Law and Society Review 3(1):55-76. https://doi.org/10.2307/3052794
LaLonde, Robert J. 1986. “Evaluating the Econometric Evaluations of Training Programs with Experimental Data.” American Economic Review 76(4):604-620.
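As with every topic in the course, the Day 3 designs will be practiced with a short applied example in R. The sketch below is a preview of that exercise: it simulates data in the spirit of Card and Krueger's (1994) minimum-wage study and recovers a difference-in-differences estimate. The data-generating process, sample sizes, and the true effect of +2 are all invented for illustration, not taken from the original survey.

```r
# Difference-in-differences sketch with simulated data
# (illustrative only -- these are not the Card & Krueger survey data).
set.seed(42)
n      <- 200                          # simulated restaurants per state
state  <- rep(c(0, 1), each = 2 * n)   # 0 = PA (control), 1 = NJ (treated)
period <- rep(c(0, 1), times = 2 * n)  # 0 = before, 1 = after the wage increase
# Simulate full-time-equivalent employment with a true treatment effect of +2
fte <- 20 + 1.5 * state - 1.0 * period + 2.0 * state * period + rnorm(4 * n)
did <- lm(fte ~ state * period)
coef(did)["state:period"]              # the difference-in-differences estimate
```

The interaction coefficient is the difference-in-differences: the before/after change in the treated state minus the before/after change in the control state. In class we would replace the simulated outcome with real data.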
Day 4: Quasi-Experimental Designs: Regression Discontinuity and Instrumental Variables
Imbens, Guido and Thomas Lemieux. 2008. “Regression Discontinuity Designs: A Guide to Practice.” Journal of Econometrics 142:615-635. https://doi.org/10.3386/w13039
Conrad, Courtenay R., and Emily Hencken Ritter. 2016. “Preventing and Responding to Dissent: The Observational Challenges of Explaining Strategic Repression.” American Political Science Review 110(1):85-99. https://doi.org/10.1017/S0003055415000623
Snyder, James M., Olle Folke, and Shigeo Hirano. 2015. “Partisan Imbalance in Regression Discontinuity Studies Based on Electoral Thresholds.” Political Science Research and Methods 3(2):169-186. https://doi.org/10.1017/psrm.2014.31
Lee, David S. and Thomas Lemieux. 2010. “Regression Discontinuity Designs in Economics.” Journal of Economic Literature 48(2):281-355.
Keele, Luke J. and Rocio Titiunik. 2015. “Geographic Boundaries as Regression Discontinuities.” Political Analysis 23(1):127-155.
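As a preview of the applied R exercise for this day, the sketch below simulates a sharp regression discontinuity: units at or above a cutoff of zero on the running variable receive treatment, and a local linear model with separate slopes on each side estimates the jump at the threshold. The data-generating process and the true jump of +3 are invented for illustration.

```r
# Sharp regression discontinuity sketch with simulated data.
set.seed(1)
x     <- runif(1000, -1, 1)   # running variable; cutoff at 0
treat <- as.numeric(x >= 0)   # deterministic assignment at the threshold
y     <- 1 + 2 * x + 3 * treat + rnorm(1000, sd = 0.5)  # true jump of +3
# Allow separate intercepts and slopes on each side of the cutoff
rd <- lm(y ~ treat * x)
coef(rd)["treat"]             # estimated discontinuity at the cutoff
```

In applied work the bandwidth around the cutoff and the choice of local polynomial matter a great deal; the readings above (especially Imbens and Lemieux 2008 and Lee and Lemieux 2010) cover those choices.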
Baerg, Nicole and Will Lowe. Forthcoming. “A textual Taylor rule: estimating central bank preferences combining topic and scaling methods” Political Science Research and Methods. https://doi.org/10.1017/psrm.2018.31
Lo, James, Sven-Oliver Proksch and Jonathan B. Slapin. 2016. “Ideological Clarity in Multiparty Competition: A New Measure and Test Using Election Manifestos” British Journal of Political Science 46(3):591-610. https://doi.org/10.1017/S0007123414000192
Adcock, Robert, and David Collier. 2001. “Measurement Validity: A Shared Standard for Qual- itative and Quantitative Research.” American Political Science Review 95(3):529-546. https://doi.org/10.1017/S0003055401003100
Borsboom, Denny. 2005. Measuring the Mind. Cambridge: Cambridge University Press. Ch 1: “Introduction,” Ch 3: “Latent Variables,” and Ch 6: “The Concept of Validity.”
Fariss, Christopher J. 2019. “Yes, Human Rights Practices Are Improving Over Time.” American Political Science Review 113(3):868-881.
Hand, D. J. 1996. “Statistics and the Theory of Measurement.” Journal of the Royal Statistical Society. Series A (Statistics in Society) 159(3):445-492. https://doi.org/10.2307/2983326
Grimmer, Justin and Brandon M. Stewart. 2013. “Text as Data: The Promise and Pitfalls of Automatic Content Analysis Methods for Political Texts.” Political Analysis 21(3):267-297. https://doi.org/10.1093/pan/mps028
Jackman, Simon. 2008. “Measurement.” In The Oxford Handbook of Political Methodology, edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier. Oxford University Press.
Poole, Keith T. and Howard Rosenthal. 1991. “Patterns of Congressional Voting.” American Journal of Political Science 35(1):228-278.
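The measurement readings above all turn on the same problem: recovering an unobserved quantity from several noisy indicators. A minimal R sketch of that logic, using simulated indicators of a single latent trait and base R's factanal(); the loadings, noise levels, and sample size are invented for illustration.

```r
# Latent-variable measurement sketch with simulated data.
set.seed(7)
n     <- 500
theta <- rnorm(n)   # the unobserved latent trait
# Four noisy indicators that each load on the same latent trait
ind   <- sapply(1:4, function(j) 0.8 * theta + rnorm(n, sd = 0.6))
fa    <- factanal(ind, factors = 1, scores = "regression")
cor(fa$scores[, 1], theta)  # scores should track the trait (up to sign)
```

The sign of the recovered factor is arbitrary, which is why applied measurement models (including the latent human rights models in the Fariss reading) must fix the scale and direction of the latent variable before the estimates can be interpreted.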