Computer Science 351 — Probability for Computer Science


Probability for Computer Science

Overview

The course ends by continuing the introduction to discrete probability theory that was started in Computer Science 251. Results that are useful for the average-case analysis of algorithms, and for the analysis of randomized algorithms, will be emphasized, and examples from computer science will be given.

Lecture #18: Probability Distributions

Why This is Included

Once again, you have to start somewhere! This lecture introduces sample spaces and probability distributions, which will be used to model and analyze experiments for the rest of this course.

Almost everything in this lecture should be a review of material that was introduced in a prerequisite for this course, but an example from computer science, along with a small number of technical results, will likely be material that students have not seen before.
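
As a small illustration of these definitions (the following sketch is not part of the course materials, and the die-rolling example is chosen only for concreteness), a finite sample space and a probability distribution on it can be modelled directly, with the probability of an event obtained by summing the probabilities of its outcomes:

    # Sample space for a single roll of a fair six-sided die.
    from fractions import Fraction

    sample_space = {1, 2, 3, 4, 5, 6}

    # A (uniform) probability distribution: outcome -> probability.
    # The probabilities are nonnegative and sum to one.
    distribution = {outcome: Fraction(1, 6) for outcome in sample_space}

    def probability(event):
        """The probability of an event, that is, a subset of the sample space."""
        return sum(distribution[outcome] for outcome in event)

    print(probability({2, 4, 6}))  # the roll is even: 1/2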

Preparatory Reading

Lecture Presentation

Finishing Up

Tutorial #15: Probability Distributions

Lecture #19: Conditional Probability and Independence

Why This is Included

We are often interested in multiple events and the relationships between them. It is also sometimes possible to compute the probability of a complicated event by considering other events, using them to split the computation into simpler cases that can be handled instead. This lecture introduces notation and some results that are needed for this.

Once again, quite a bit of the material in this lecture should be material that most students in the class have seen before, but new material may appear near the end.
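
A typical result of this kind is the law of total probability: if events B1, B2, …, Bk partition the sample space, then Pr[A] = Pr[A | B1] · Pr[B1] + … + Pr[A | Bk] · Pr[Bk]. The sketch below (not part of the course materials; the two-urn setting is an invented example) computes a probability by splitting it into two simpler cases this way:

    from fractions import Fraction

    # Invented setting: an urn is chosen at random, and then a ball
    # is drawn from the chosen urn.
    pr_case = {"urn1": Fraction(1, 3), "urn2": Fraction(2, 3)}

    # The conditional probability of drawing a white ball, given each case.
    pr_white_given = {"urn1": Fraction(3, 4), "urn2": Fraction(1, 2)}

    # Law of total probability: sum over the simpler cases.
    pr_white = sum(pr_white_given[b] * pr_case[b] for b in pr_case)
    print(pr_white)  # 7/12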

Preparatory Reading

Lecture Presentation

Finishing Up

Tutorial #16: Conditional Probability and Independence

Lecture #20: Random Variables and Expectation

Why This is Included

There is often a numerical quantity, associated with an experiment, whose value we wish to know. Indeed, computing (or estimating, or bounding) this value may be the main reason to consider the experiment at all. This lecture introduces notation and some results that are needed for this.

While many students will have seen quite a bit of this material already, some have reported, in past terms, that they do not remember seeing any of it at all. In any case it is likely, once again, that there will be some new material near the end.
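
Concretely, a random variable can be viewed as a function from outcomes to numbers, and its expected value is the sum of its values weighted by their probabilities. A brief sketch (not part of the course materials; the die-rolling example is invented for illustration):

    from fractions import Fraction

    # The uniform distribution on the rolls of a fair six-sided die.
    distribution = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

    # A random variable: the square of the number rolled.
    def X(outcome):
        return outcome ** 2

    # E[X] is the sum of X(outcome) * Pr[outcome] over all outcomes.
    expectation = sum(X(o) * p for o, p in distribution.items())
    print(expectation)  # 91/6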

Preparatory Reading

Lecture Presentation

Tutorial #17: Random Variables and Expectation

Lecture #21: Tail Bounds

Why This is Included

As noted above, there is often a numerical quantity, associated with an experiment, whose value we wish to know. This can be modelled using a random variable whose expected value can be computed or bounded.

We might also be interested in the probability that this random variable’s value is far away from its expected value, or is above (or below) a given threshold. This lecture includes notation and results that can often be used to compute or bound this kind of probability.
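
Markov’s inequality is the simplest bound of this kind: if X is a nonnegative random variable and a > 0, then Pr[X ≥ a] ≤ E[X]/a. The sketch below (not part of the course materials; the distribution is invented) checks this for one small distribution:

    from fractions import Fraction

    # An invented distribution for a nonnegative random variable X.
    distribution = {0: Fraction(1, 2), 2: Fraction(1, 4), 8: Fraction(1, 4)}

    expectation = sum(v * p for v, p in distribution.items())  # E[X] = 5/2
    tail = sum(p for v, p in distribution.items() if v >= 4)   # Pr[X >= 4]

    print(tail, expectation / 4)  # 1/4, which is at most the bound 5/8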

Preparatory Reading

Lecture Presentation

Finishing Up

Tutorial #18: Tail Bounds

Lecture #22: Application — The Analysis of Algorithms

Why This is Included

While the worst-case analysis of an algorithm is often important, it might not describe how the algorithm behaves “most of the time”, and this behaviour might be what someone is most interested in when choosing between algorithms.

It is sometimes possible to produce algorithms that are simpler, or (at least slightly) faster, than would otherwise be the case, if these algorithms are allowed to make random choices.

This lecture describes how probability theory can be applied to perform an average-case analysis of an algorithm, to understand how it behaves in practice. It also introduces several kinds of randomized algorithms that have been studied and used.
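
A standard example of an algorithm that makes random choices is quicksort with a randomly chosen pivot: it always produces correctly sorted output, and only its running time (which is in O(n log n) in expectation) depends on the random choices. A sketch (not part of the course materials):

    import random

    def randomized_quicksort(items):
        """Sort a list, choosing the pivot at each step at random."""
        if len(items) <= 1:
            return list(items)
        pivot = random.choice(items)  # the random choice
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        larger = [x for x in items if x > pivot]
        return randomized_quicksort(smaller) + equal + randomized_quicksort(larger)

    print(randomized_quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]

Algorithms like this one, which are always correct but whose running times depend on random choices, are often called “Las Vegas” algorithms.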

Preparatory Reading

Lecture Presentation

Finishing Up

Tutorial #19: Application — The Analysis of Algorithms

Lecture #23: Classical Probability Distributions

Why This is Included

“Classical” probability distributions arise, over and over again, when probability theory is used to solve problems. However, the language used in an application area might not match the language used in a textbook on probability and statistics, so these distributions might appear in a somewhat “disguised” form. If you are already familiar with these classical probability distributions then you will, ideally, recognize them, so that you can look for and make use of helpful material about them in the “probability theory” literature.
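
As one invented illustration of a distribution in disguise: if each probe into a hash table independently finds an empty slot with probability p, then the number of probes needed to find an empty slot has a geometric distribution, so its expected value is 1/p. A simulation sketch (not part of the course materials):

    import random

    def probes_until_empty(p, rng):
        """Probe slots until an empty one is found; each probe succeeds
        independently with probability p. Return the number of probes."""
        count = 1
        while rng.random() >= p:  # this probe hit an occupied slot
            count += 1
        return count

    rng = random.Random(42)
    trials = [probes_until_empty(0.25, rng) for _ in range(100_000)]
    print(sum(trials) / len(trials))  # close to 1 / 0.25 = 4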

Note: Apart from its inclusion in “participation” questions, students will not be examined on this material in CPSC 351.

Preparatory Reading

Lecture Presentation

Lecture #24: Randomly Constructed Binary Search Trees

Why This is Included

Binary search trees are often used to implement “dictionaries” because their performance in practice is so good, even though, in the worst case, operations can use time that is linear in the size of the dictionary being represented.

The analysis in this lecture provides additional evidence that these data structures will work well when assumptions about their creation (and distribution) hold. It also provides additional evidence that the material about discrete probability, introduced in this course, is useful for solving problems in computer science.
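
A small simulation (not part of the course materials; the parameters are invented) suggests what the analysis establishes: when distinct keys are inserted in a uniformly random order, the height of the resulting (unbalanced) binary search tree grows only logarithmically with the number of keys:

    import random

    def bst_height(keys):
        """Insert the keys, in order, into a simple BST; return its height."""
        root, height = None, 0  # each node is a list: [key, left, right]
        for key in keys:
            depth, node, parent, side = 1, root, None, None
            while node is not None:
                parent = node
                side = 1 if key < node[0] else 2  # 1 = left, 2 = right
                node = node[side]
                depth += 1
            new_node = [key, None, None]
            if parent is None:
                root = new_node
            else:
                parent[side] = new_node
            height = max(height, depth)
        return height

    keys = list(range(1000))
    random.shuffle(keys)     # a uniformly random insertion order
    print(bst_height(keys))  # typically near 30, far below the worst case of 1000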

Preparatory Reading

Lecture Presentation

Assignment #3: Probability for Computer Science

