Course Code:                 MATH 3278

Course Title:                  Probability Theory II

Course Type:                 Core

Level:                             3

Semester:                       1

No. of Credits:                 3


Pre-requisites:                MATH 2270, MATH 2274


Course Rationale

This is a second course in Probability Theory that approaches Probability Theory from two perspectives:

(i)  Probability Theory is a branch of mathematics. As such, we will focus on the fundamental assumptions of Probability Theory and on how the main properties of probability measures follow from these assumptions. Throughout the course, therefore, students will be expected to derive the main results that they use. Very little will be assumed without proof.

(ii) Probability Theory is primarily concerned with modelling phenomena with uncertain outcomes. The course emphasizes this. It is most definitely not a course in Pure Mathematics. A knowledge of calculus of one and several variables (including a good understanding of limits, continuity and differentiability) and some knowledge of elementary analysis is assumed (hence the need for MATH 2120).

Probability Theory is now an indispensable tool in many applications of Mathematics to the Natural Sciences, Computing, Finance, Insurance and, of course, Statistics. It is hard to imagine someone graduating with a degree in Mathematics that does not include a course in Probability Theory. This course continues the study of Probability begun in MATH 2140. It provides those majoring in Mathematics, Statistics, Actuarial Science, Economics and other areas with substantial Probability and Statistics content with a sound knowledge of distribution theory and an introduction to the main ideas of stochastic processes.

The course provides the mathematical and probabilistic foundation for more advanced courses in programmes offered by the Department of Mathematics. It also develops students' ability to solve problems, creating the opportunity for our graduates to become critical and creative thinkers and thereby fulfilling one of the key attributes of our graduates.


Course Description

The course begins with a discussion of the axioms of probability. We point out that not all subsets of an arbitrary sample space can be events and introduce the idea of a sigma field. There is a careful discussion of distribution functions in general (including the continuous, absolutely continuous and discrete cases). The rest of the section on distribution theory focuses on the distribution theory of several random variables. Joint density functions, transformations, joint mgfs, order statistics and convolution are discussed. We then define conditional expectation and give its main properties. The section on distribution theory closes with a discussion of multivariate distributions, including the multinomial and multivariate normal. We prove that the sample mean and sample variance in a sample from the normal distribution are independent and obtain the distribution of the sample variance.
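For example, the convolution mentioned above gives the density of the sum of two independent continuous random variables X and Y as

```latex
f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx ,
```

with the integral replaced by a sum in the discrete case.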

The second half of the course focuses on stochastic processes. Markov Chains in discrete time and with discrete state space are discussed. Details are as follows:

Definition of a stochastic process and a Markov Chain; Chapman-Kolmogorov Equations; Classification of states; Ergodic theorem; The Poisson process; Generating functions with Applications to Branching Processes.
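For reference, the Chapman-Kolmogorov equations say that, for a time-homogeneous chain with n-step transition probabilities p_{ij}^{(n)} and state space S,

```latex
p_{ij}^{(n+m)} = \sum_{k \in S} p_{ik}^{(n)} \, p_{kj}^{(m)} ,
\qquad \text{equivalently} \qquad
P^{(n+m)} = P^{(n)} P^{(m)} ,
```

so that the n-step transition matrix is the n-th power of the one-step transition matrix.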

All lectures, assignments, handouts and review materials are available online to all students through myeLearning. Blended learning techniques will be employed. Lectures will be supplemented with laboratory work and group discussions.

Assessment is designed to encourage students to work continuously with the course materials. Active learning will be achieved through weekly assignments and problem sheets, allowing continuous feedback and guidance on problem-solving techniques in tutorials and lectures. Assessment will be based on the weekly assignments and in-course tests, followed by a final examination on the whole course.

Learning Outcomes

Upon successful completion of the course, students will be able to:

  • Derive basic properties of probability measures and distribution functions, including joint distribution functions.
  • Generate random numbers and distributions using R and Matlab.
  • Given a joint pdf of several random variables, compute the pdf of a transformation of these variables.
  • Derive the distribution of the rth order statistic of a sample and the joint distribution of several order statistics.
  • Find the distribution of the sum of two random variables using convolution.
  • Apply the properties of conditional expectation to the behavior of random sums of random variables.
  • Find the distribution of a linear transformation of a multivariate normal random vector.
  • Derive the joint distribution of the sample mean and sample variance of a sample from the normal distribution.
  • Define the terms stochastic process, state space, stationary distribution, Markov process.
  • Given the initial distribution and transition matrix of a stationary Markov Chain, find the distribution of the process at time t.
  • Classify the states of a discrete Markov Chain.
  • Compute the limiting distribution of an Ergodic Markov process.
  • Define the notions of convergence in distribution and convergence in probability.
  • Prove the Central Limit Theorem and the Poisson approximation to the Binomial using mgfs.
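Several of these outcomes lend themselves to computation. The course itself uses R and Matlab; the following Python sketch (the two-state transition matrix is a made-up example, not taken from the course) illustrates finding the distribution of a stationary Markov Chain at time t from the initial distribution and transition matrix:

```python
import numpy as np

# Hypothetical two-state chain; each row of the transition matrix sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi0 = np.array([1.0, 0.0])  # start in state 0 with certainty

def distribution_at_time(pi0, P, t):
    """Distribution at time t of a stationary (time-homogeneous) chain:
    pi_t = pi_0 @ P^t, by repeated use of the Chapman-Kolmogorov equations."""
    return pi0 @ np.linalg.matrix_power(P, t)

pi3 = distribution_at_time(pi0, P, 3)    # a probability vector (sums to 1)

# For this ergodic chain, pi_t converges to the stationary distribution
# (5/6, 1/6), the unique solution of pi = pi @ P with entries summing to 1.
pi200 = distribution_at_time(pi0, P, 200)
```

Computing pi_0 P^t directly in this way is exactly the calculation the Chapman-Kolmogorov equations justify.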


Course Content

  • Foundations of Probability: Definition of statistical experiment, sample space. Sigma algebras of events. Probability measures. Distribution functions of one and several random variables. Absolutely continuous distributions.
  • Random and pseudo-random numbers. Generating random numbers and distributions using R and Matlab.
  • The distribution of a transformation of one and several random variables.
  • The distribution of the rth order statistic of a random sample of size n from a distribution F and the joint distribution of two or more order statistics.
  • Joint mgfs and their applications.
  • The distribution of a sum of two independent random variables using convolution.
  • Conditional expectation and its application to random sums of random variables.
  • Definition of a stochastic process, state space, stationary distribution, Markov process, Markov Chain, transition matrix.
  • The Chapman-Kolmogorov equation. Finding the distribution of a Markov Chain at time n from a knowledge of the initial distribution and the transition matrix.
  • Communication classes of a Markov Chain. Irreducible Markov Chains. Recurrent and transient states of a Markov Chain. Period of a Markov Chain. Basic properties of periodicity.
  • The Ergodic Theorem.
  • The Poisson Process in detail.
  • Simple branching processes.
  • Convergence in distribution, convergence in probability and convergence almost surely.
  • Proof of the Central Limit Theorem using mgfs.
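As a computational illustration of the random-number-generation topic above: R and Matlab are the tools named in the course, but the inverse-transform method is language-independent. A minimal Python sketch for the Exponential(lam) distribution follows (the function name and the choice of distribution are illustrative, not prescribed by the course):

```python
import math
import random

def exponential_sample(lam):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    F^{-1}(U) = -ln(1 - U)/lam has the Exponential(lam) distribution."""
    u = random.random()
    return -math.log(1.0 - u) / lam

random.seed(0)                      # reproducible illustration
samples = [exponential_sample(2.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)  # should be near 1/lam = 0.5
```

The same idea works for any distribution whose quantile function F^{-1} can be evaluated.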

Teaching Methodology

Lectures, tutorials, assignments and problem papers.

Lectures: Three lectures/tutorials (as needed) per week.

Assignments: One assignment (marked) per week.

Additional problems will be given during lectures and tutorials but will not be marked. However, students will need to do some of these, as well as the assignments, in order to learn the material properly and to adequately prepare for examinations and quizzes. Problem papers will be discussed during the tutorial sessions.



Assessment

Coursework Mark: 50% (based on three in-term examinations and/or quizzes).

Final Examination: 50% (one 2-hour written paper).


Course Calendar






Week 1
Introduction/Course Overview. Basic Ideas of Probability: axioms of probability; sigma algebras. Distribution functions. Using R and Matlab to generate random numbers and distributions.
Assignment 1 handed out.

Week 2
Transformations of several random variables. Joint mgfs.
Assignment 2 handed out; Assignment 1 returned.

Week 3
Distribution of the rth order statistic and joint distribution of several order statistics. Examples and problems.
Assignment 3 handed out; Assignment 2 returned.

Week 4
The distribution of the sum of two random variables using convolution. Conditional expectation and application to random sums of random variables.
Assignment 4 handed out; Assignment 3 returned.

Week 5
Multivariate distributions. Mean and covariance matrix. The multinomial and multivariate normal distributions. Linear transformations of multivariate distributions (including the multivariate normal).
Assignment 5 handed out; Assignment 4 returned.
Test 1 on material in weeks 1 to 3.

Week 6
Convergence in distribution. Proof of the Central Limit Theorem. Convergence in probability and almost sure convergence.
Assignment 6 handed out; Assignment 5 returned.

Week 7
Definition of stochastic process, state space, stationary distribution, Markov Chain, transition matrix. Chapman-Kolmogorov equation. Distribution of a Markov Chain at time n.
Assignment 7 handed out; Assignment 6 returned.

Week 8
Accessibility. Equivalence classes of accessible states. Recurrent and transient states. Examples.
Assignment 8 handed out; Assignment 7 returned.
Test 2 on material in weeks 4, 5 and 6.

Week 9
Periodicity. The Ergodic Theorem.
Assignment 9 handed out.

Week 10
The simple random walk. Random walks in two and three dimensions. The Gambler's Ruin Problem.
Assignment 10 handed out; Assignment 9 returned.

Week 11
The Poisson process.
Assignment 11 handed out.

Week 12
Simple Branching Processes.
Assignment 12 handed out.

Week 13
Test 3 on material in weeks 7 to 12.


Required Reading

Essential Texts:

Introduction to Probability Models, Sheldon Ross. Academic Press.

Probability: An Introduction, Grimmett and Welsh. Oxford University Press.

Probability and Statistics, 3rd edition, Morris DeGroot.


Other Recommended Texts:

Probability and Random Processes, Grimmett and Stirzaker, 3rd edition (2001). Oxford University Press.

An Introduction to Stochastic Processes with Applications to Biology, Linda Allen, 2nd edition. Pearson.

Statistical Inference, Casella and Berger, 2nd edition. Duxbury.