Please note: This course will be taught in hybrid mode. Hybrid delivery of courses will include synchronous live sessions during which on-campus and online students will be taught simultaneously.

Susumu Shikano is Professor of Political Methodology. His research focuses on spatial models of politics and diverse topics in political behaviour. Before moving to Konstanz, he was professor ad interim at the University of Potsdam (2008) and assistant professor at the University of Mannheim (2001-2008). He received a Dr. phil. (2001) and the venia legendi (2007) in political science from the University of Mannheim.

Empirical researchers working with statistical models owe a great deal to probability theory, often without noticing it. Being informed about probability theory has at least two advantages: 1) the use of statistical software and the interpretation of its output become better grounded; 2) it becomes easier to learn more advanced techniques such as machine learning. In this course, we first discuss the basics of probability theory, including: set theory; the three axioms of probability theory; rules of probability calculation; discrete and continuous random variables; joint, marginal and conditional densities; the Gaussian distribution; limit theorems; and random processes. After discussing these topics, the course continues with how probability theory contributes to different types of social science research. More specifically, maximum likelihood estimation, Bayesian inference, MCMC and Bayesian networks will be discussed. All these abstract topics will be accompanied by concrete examples for better understanding, and simple R code will be used for some calculations and/or visualizations.

Course Objectives
Students will gain a solid understanding of probability theory in the first week by learning the basic concepts and rules of probability theory. Building on that, students will learn in the second week how probability theory is used to make inferences in different contexts. More specifically, the course will introduce maximum likelihood, Bayesian inference, Markov chain Monte Carlo techniques and Bayesian networks in their most basic forms. More extensive discussion is provided in the corresponding advanced courses of the ESS program.

Course Prerequisites
A solid understanding of descriptive statistics and a basic understanding of regression analysis are of great advantage. This course is not an introduction to R but assumes that students are familiar with basic R programming.

Required text
We shall be using the following textbook: Joe Blitzstein and Jessica Hwang, Introduction to Probability, 2nd Edition. A free version is available online.

Background knowledge required:
Calculus = Moderate
Linear Regression = Moderate
OLS = Moderate

Computing Background:
R = Moderate

Course details and schedule:

Classes will meet for ten sessions. In each session, about 2/3 of the time will be devoted to lectures and the rest will be “lab” sessions in which the lecture content will be “experienced” by running the corresponding R code.

Students are expected to read the corresponding literature before the lecture. The lecture will impart the same substance from somewhat different angles by using various examples (but in principle in the same notation).

In the lab sessions, we will use R. In order to see the whole process without black boxes, we will not use additional packages with a few exceptions. That is, the participants only need to install the most recent version of R on their computer.

After each session (except for Day 10), students will receive assignments, which can be solved using the R code provided during the lab sessions. Solutions must be submitted before the subsequent session.

Day 1: Introduction; Set theory

Blitzstein and Hwang, Chapter 1
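To give a flavour of the lab-session style, here is a minimal base-R sketch (illustrative only, not part of the assigned materials) of the set operations that Day 1 formalizes:

```r
# Basic set operations on finite sets, using base R's set functions.
A <- c(1, 2, 3, 4)
B <- c(3, 4, 5, 6)

union(A, B)      # A union B     -> 1 2 3 4 5 6
intersect(A, B)  # A intersect B -> 3 4
setdiff(A, B)    # A minus B     -> 1 2
```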

Day 2: The three axioms of probability theory; some rules of probability calculation; conditional probability and independence

Blitzstein and Hwang, Chapters 1-3
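A small illustrative sketch (not from the course materials) of Day 2's topic: estimating a conditional probability by Monte Carlo simulation, conditioning on an event by subsetting.

```r
# Monte Carlo estimate of a conditional probability:
# P(sum of two dice equals 7 | first die shows 4) = 1/6.
set.seed(1)
n  <- 1e5
d1 <- sample(1:6, n, replace = TRUE)
d2 <- sample(1:6, n, replace = TRUE)

# Condition on the event {d1 == 4}, then take the relative frequency.
p_hat <- mean((d1 + d2 == 7)[d1 == 4])
p_hat  # close to 1/6
```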

Day 3: Discrete and continuous random variables; The Gaussian distribution; Expectation and Moments

Blitzstein and Hwang, Chapters 3-6
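As an illustrative sketch of Day 3's material (not part of the assigned readings), simulated draws from a Gaussian distribution can be checked against the theoretical expectation and moments:

```r
# Expectation and variance of a Gaussian via simulation vs. theory.
set.seed(2)
x <- rnorm(1e5, mean = 2, sd = 3)

mean(x)   # close to E[X] = 2
var(x)    # close to Var(X) = 9
mean(x^2) # second moment: E[X^2] = Var(X) + E[X]^2 = 13
```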

Day 4: Joint, marginal and conditional densities; Limit theorems

Blitzstein and Hwang, Chapters 7-10
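A minimal base-R sketch (illustrative only) of the central limit theorem covered on Day 4: means of skewed Exponential(1) draws become approximately normal as the sample size grows.

```r
# Central limit theorem: means of n Exponential(1) draws are
# approximately Normal(1, 1/n), despite the skewed parent distribution.
set.seed(3)
n     <- 50
means <- replicate(1e4, mean(rexp(n, rate = 1)))

mean(means)  # close to 1
sd(means)    # close to 1/sqrt(50), approx. 0.141
hist(means)  # roughly bell-shaped
```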

Day 5: Random processes; Markov chains

Blitzstein and Hwang, Chapters 11 and 13
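An illustrative base-R sketch (not from the course materials) of Day 5's topic: simulating a two-state Markov chain and comparing the long-run share of time in state 1 with the stationary distribution.

```r
# Simulate a two-state Markov chain from its transition matrix.
set.seed(4)
P <- matrix(c(0.9, 0.1,    # transition probabilities from state 1
              0.2, 0.8),   # transition probabilities from state 2
            nrow = 2, byrow = TRUE)

n        <- 1e5
state    <- integer(n)
state[1] <- 1
for (t in 2:n) {
  state[t] <- sample(1:2, 1, prob = P[state[t - 1], ])
}

# Long-run frequency of state 1 vs. stationary probability 0.2/(0.1 + 0.2) = 2/3.
mean(state == 1)
```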

Day 6: Parameter estimation based on the maximum likelihood principle

Elff, Martin. 2015. Estimation techniques: ordinary least squares and maximum likelihood. In: Henning Best and Christoph Wolf, eds. The Sage Handbook of Regression Analysis and Causal Inference. Sage, pp. 7-30.

Gary King. 1998. Unifying Political Methodology: The Likelihood Theory of Statistical Inference. Ann Arbor: University of Michigan Press. Chapters 2-4.
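To illustrate the general maximum-likelihood recipe of Day 6 (a sketch, not part of the assigned materials): write down the log-likelihood and maximize it numerically, here for a Bernoulli success probability, where the analytical MLE is simply the sample proportion.

```r
# Maximum likelihood estimation of a Bernoulli success probability,
# done numerically with optim() to mimic the general recipe.
set.seed(5)
y <- rbinom(100, size = 1, prob = 0.3)

# Negative log-likelihood (optim() minimizes by default).
neg_loglik <- function(p) -sum(dbinom(y, size = 1, prob = p, log = TRUE))

fit <- optim(par = 0.5, fn = neg_loglik, method = "Brent",
             lower = 0.001, upper = 0.999)

fit$par  # numerical MLE
mean(y)  # analytical MLE: the sample proportion
```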

Day 7: Obtaining posterior in Bayesian inference

Shikano, Susumu. 2015. Bayesian estimation of regression models. In: Henning Best and Christoph Wolf, eds. The Sage Handbook of Regression Analysis and Causal Inference. Sage, pp. 31-54.

Ben Lambert. 2018. A Student’s Guide to Bayesian Statistics. Sage. Chapters 4-7.
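A small illustrative sketch of Day 7's topic (not from the course materials): obtaining a posterior by grid approximation, using a flat prior for a binomial proportion so the result can be checked against the conjugate Beta posterior.

```r
# Posterior for a binomial proportion by grid approximation,
# checked against the conjugate Beta(1 + k, 1 + n - k) result.
k <- 7; n <- 20                # 7 successes in 20 trials
grid  <- seq(0.001, 0.999, length.out = 1000)
prior <- rep(1, length(grid))  # flat prior on [0, 1]
lik   <- dbinom(k, size = n, prob = grid)

post <- prior * lik
post <- post / sum(post)       # normalize to a probability vector

sum(grid * post)               # posterior mean, close to (k + 1) / (n + 2)
```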

Day 8: Markov Chain Monte Carlo

Blitzstein and Hwang, Chapter 12

Ben Lambert. 2018. A Student’s Guide to Bayesian Statistics. Sage. Chapters 12-15.
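As an illustrative base-R sketch of Day 8's topic (not part of the assigned materials), a random-walk Metropolis sampler for the mean of a normal model with known sd and a flat prior; the posterior mean is then simply the sample mean.

```r
# A minimal random-walk Metropolis sampler for the mean of a
# Normal(mu, 1) model with a flat prior on mu.
set.seed(6)
y <- rnorm(50, mean = 1, sd = 1)

# Log-posterior up to a constant (flat prior => log-likelihood).
log_post <- function(mu) sum(dnorm(y, mean = mu, sd = 1, log = TRUE))

n_iter   <- 1e4
draws    <- numeric(n_iter)
draws[1] <- 0
for (i in 2:n_iter) {
  prop <- draws[i - 1] + rnorm(1, 0, 0.5)  # random-walk proposal
  if (log(runif(1)) < log_post(prop) - log_post(draws[i - 1])) {
    draws[i] <- prop                       # accept the proposal
  } else {
    draws[i] <- draws[i - 1]               # reject: keep the current value
  }
}

mean(draws[-(1:1000)])  # after burn-in, close to the posterior mean, mean(y)
```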

Day 9: Bayesian networks

Stuart Russell and Peter Norvig. 2010. Artificial Intelligence: A Modern Approach. 3rd edition. Prentice Hall. Chapters 13 and 14.
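An illustrative sketch of Day 9's topic (with made-up probabilities, not from the course materials): inference in the simplest possible Bayesian network, a single Rain -> WetGrass edge, where the query reduces to Bayes' rule.

```r
# Inference in a tiny two-node Bayesian network: Rain -> WetGrass.
# Hypothetical conditional probability tables:
p_rain             <- 0.2
p_wet_given_rain   <- 0.9
p_wet_given_norain <- 0.1

# Marginalize out Rain to get P(WetGrass), then apply Bayes' rule.
p_wet            <- p_wet_given_rain * p_rain +
                    p_wet_given_norain * (1 - p_rain)
p_rain_given_wet <- p_wet_given_rain * p_rain / p_wet

p_rain_given_wet  # 0.18 / 0.26, approx. 0.692
```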

Day 10: Probability in applied social science research; Wrap up

Students give a short presentation on how probability theory contributes to a social science study of their choice. 

No reading