Computer Science 511 — Other Directions

This course will end with brief introductions to a variety of other areas of research in computational complexity theory. This may include a considerable amount of material that has changed since the last time this course was offered, and this material might be less formal, and less complete, than the material that precedes it.

Lecture #27: Approximation Algorithms, the PCP Theorem, and the Hardness of Approximation

Why This is Included

As previous lectures may suggest, one strategy for “coping with NP-completeness” has been to use randomized algorithms: either algorithms with good “worst-case expected performance” or algorithms that can fail (possibly returning incorrect output) with small probability.

Many NP-complete problems are optimization problems, which are either minimization problems or maximization problems. Instances define a set of structures (called “candidates”, or “feasible solutions”), and a “measure function” assigns a value to each of these structures, representing a “cost” for a minimization problem and a “reward” for a maximization problem. If the problem is a minimization problem then a structure, corresponding to an instance of the problem, is a correct output if and only if it minimizes the value of this measure function. If the problem is a maximization problem then such a structure is a correct output if and only if it maximizes the value of the measure function, instead.

Sometimes an output will be considered acceptable (or “good enough”) if the value it gives, for the measure function, is close to optimal in some sense. Approximation algorithms are algorithms for optimization problems that return this kind of output.

This lecture introduces approximation algorithms somewhat more formally, including a way to measure the quality of their output. It also introduces results concerning the hardness of approximation — that is, results establishing that some optimization problems do not have very useful approximation algorithms, if P ≠ NP.
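
To make the notion of an approximation algorithm concrete, the following sketch (in Python, and not taken from the lecture notes) shows one standard example: a 2-approximation algorithm for the Minimum Vertex Cover problem, which repeatedly chooses an uncovered edge and adds both of its endpoints to the cover. The chosen edges form a matching, and any vertex cover must include at least one endpoint of each matched edge, so the cover returned is at most twice the size of an optimal one.

    # A minimal sketch of the classical 2-approximation for Minimum Vertex Cover.
    # The edges chosen below form a matching; an optimal cover contains at least
    # one endpoint of each of them, so |cover| <= 2 * OPT.

    def approx_vertex_cover(edges):
        """Return a vertex cover whose size is at most twice the optimal size."""
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:   # this edge is still uncovered
                cover.add(u)
                cover.add(v)
        return cover

    if __name__ == "__main__":
        # A small example graph: a path 1-2-3-4 together with the edge (2, 4).
        edges = [(1, 2), (2, 3), (3, 4), (2, 4)]
        print(approx_vertex_cover(edges))   # {1, 2, 3, 4}; an optimal cover is {2, 3}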

Preparatory Reading

Lecture Presentation

Finishing Up

Lecture #28: Parallel Computation and Complexity Classes

Why This is Included

One might first think that parallel computation would allow us to solve problems that are not known to be in P. However, a parallel computation can be simulated using a sequential algorithm, in sequential time that is not significantly larger than the product of the parallel time and the number of processors used. Thus the speedup for a parallel computation is bounded by the number of processors that are available, and there are, generally, too few of these for this approach to be successful.

On the other hand, computations that are highly parallelizable — so that they can be carried out using polylogarithmic time, when a polynomial number of processors are available — have been studied and used.

This lecture introduces a model of computation that can be used to study these parallel algorithms, along with associated complexity classes and their properties.
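
As a small illustration of the kind of computation involved (this is not the formal model introduced in the lecture), the following Python sketch sums n values using rounds of pairwise additions. A machine with roughly n/2 processors could carry out all of the additions in a round simultaneously, so only about log2(n) rounds of parallel time are needed; here the rounds are simulated sequentially.

    # A minimal sketch of a highly parallelizable computation: summing n values
    # in ceil(log2(n)) "parallel rounds" of pairwise additions.  Each round is
    # simulated sequentially below, but all additions within a round are
    # independent and could be performed at the same time.

    def parallel_sum(values):
        values = list(values)
        rounds = 0
        while len(values) > 1:
            # One parallel round: combine the elements in disjoint pairs.
            combined = [values[i] + values[i + 1] for i in range(0, len(values) - 1, 2)]
            if len(values) % 2 == 1:        # carry an unpaired element forward
                combined.append(values[-1])
            values = combined
            rounds += 1
        return values[0], rounds

    if __name__ == "__main__":
        total, rounds = parallel_sum(range(1, 17))   # the 16 values 1, 2, ..., 16
        print(total, rounds)                         # 136 4   (4 = log2(16) rounds)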

Preparatory Reading

Lecture Presentation

Finishing Up

Lecture #29: Interactive Proofs

Why This is Included

The exchange of information frequently involves repeated interactions between two or more parties. Indeed, when one person wishes to convince another person that something is true, there will often be several questions that are asked and answered, and there might also be exercises that are requested and carried out. It is certainly plausible that one might be able to prove something, using a multi-round protocol, more easily or efficiently than when only one exchange of information (from a “prover” to a “verifier”) is allowed.

This lecture introduces a formal model that can be used to describe, and to measure the reliability and cost of, interactive proofs. It also includes some nontrivial, and significant, connections between the classes of languages that have such proofs and various complexity classes that have already been studied; this is helpful when one wishes to understand “the power of interaction”.
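
A classical example of such a protocol (which may or may not be the one presented in the lecture) is the interactive proof for Graph Non-Isomorphism: in each round the verifier secretly flips a coin, sends a randomly relabelled copy of one of the two input graphs, and accepts only if the prover correctly identifies which graph was relabelled. If the graphs are not isomorphic, an honest prover can always answer correctly; if they are isomorphic, any prover is caught with probability 1/2 in each round. The following Python sketch simulates one round, using a brute-force prover that is only feasible for very small graphs.

    # A minimal, simplified simulation of one round of the Graph Non-Isomorphism
    # protocol.  The "prover" is simulated by brute force over all relabellings.

    import random
    from itertools import permutations

    def relabel(edges, perm):
        """Apply a vertex relabelling to an edge list; return a canonical edge set."""
        return frozenset(frozenset((perm[u], perm[v])) for u, v in edges)

    def isomorphic_to(edges, target, n):
        """Brute-force test: is some relabelling of 'edges' equal to 'target'?"""
        return any(relabel(edges, dict(zip(range(n), p))) == target
                   for p in permutations(range(n)))

    def one_round(G0, G1, n):
        b = random.randrange(2)                              # verifier's secret coin
        perm = dict(zip(range(n), random.sample(range(n), n)))
        H = relabel((G0, G1)[b], perm)                       # challenge sent to the prover
        answer = 0 if isomorphic_to(G0, H, n) else 1         # honest prover's reply
        return answer == b                                   # verifier accepts iff the reply is b

    if __name__ == "__main__":
        n = 4
        path = [(0, 1), (1, 2), (2, 3)]      # a path on 4 vertices
        star = [(0, 1), (0, 2), (0, 3)]      # a star on 4 vertices; not isomorphic to the path
        print(all(one_round(path, star, n) for _ in range(20)))   # True: every round accepts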

Preparatory Reading

Lecture Presentation

Finishing Up

Lecture #30: Lower Bounds for Structured Models

Why This is Included

Software development often includes both the use of library software (which involves “programming by contract”) and generic programming. In these settings one is solving a problem using formally defined “structured” operations without having direct access to representations of the information being accessed and modified, or without even knowing what representation is being used (or whether it might change).

This lecture introduces a simple framework that can be used to model (some) structured computations, along with two techniques that are often used to establish lower bounds for the resources needed to solve problems in these settings.
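
One standard example of such a lower-bound argument (whether or not it is among the techniques presented in the lecture) is the information-theoretic lower bound for comparison-based sorting. An algorithm that accesses its input only through comparisons corresponds to a binary decision tree, and each of the n! possible orderings of the input must lead to a distinct leaf, so the depth of the tree, and hence the worst-case number of comparisons, satisfies the following (shown in LaTeX notation):

    \[
      \mathrm{depth} \;\ge\; \log_2(n!) \;\ge\; \log_2\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right)
      \;=\; \tfrac{n}{2}\log_2\tfrac{n}{2} \;\in\; \Omega(n \log n).
    \]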

Preparatory Reading

Lecture Presentation

Finishing Up

Lecture #31: Boolean Decision Trees

Why This is Included

Boolean decision trees provide an extremely simple model of computation that can be used for the computation of Boolean functions. This lecture introduces this model, along with various properties of Boolean functions that have been studied and related to a function’s “decision tree complexity”.
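
As a small illustration (not necessarily drawn from the lecture), the deterministic decision tree complexity D(f) of a Boolean function f is the worst-case number of input bits that must be queried in order to determine the value of f. The Python sketch below evaluates the n-bit OR function by querying bits one at a time and counting the queries; on the all-zero input every bit must be examined, reflecting the standard fact that D(OR_n) = n.

    # A minimal sketch: evaluating OR by querying one input bit at a time and
    # counting the queries.  The worst case over all inputs is the function's
    # deterministic decision tree complexity; for OR on n bits it is n.

    def eval_or_counting_queries(x):
        """Evaluate OR(x) by probing bits left to right; return (value, queries)."""
        queries = 0
        for bit in x:
            queries += 1             # one query to the input
            if bit == 1:             # a 1 determines the answer, so stop early
                return 1, queries
        return 0, queries            # all n bits had to be examined

    if __name__ == "__main__":
        print(eval_or_counting_queries([0, 0, 1, 0]))   # (1, 3): stopped at the first 1
        print(eval_or_counting_queries([0, 0, 0, 0]))   # (0, 4): every bit was queried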

Preparatory Reading

Lecture Presentation

Finishing Up

Lecture #32: Communication Complexity

Why This is Included

In many situations, multiple people each have different information and wish to compute something that depends on the information of each of them. Furthermore, the cost of communicating information from one party to another can exceed the cost of each party’s internal computations.

Communication complexity concerns the amount of information that must be exchanged in order to compute a function correctly. This lecture introduces this subject and provides a brief introduction to some of the techniques that have been developed to prove lower bounds on the amount of information that must be exchanged to solve problems.
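
A well-known example of the gap between deterministic and randomized communication (which may or may not be the one used in the lecture) is the Equality function: Alice and Bob each hold an n-bit string and must decide whether the strings are equal. Deterministically, roughly n bits must be exchanged, but the randomized fingerprinting protocol sketched below in Python exchanges only O(log n) bits. Alice sends a random prime p below n^2, together with her string’s residue modulo p, and Bob accepts if his own residue matches; when the strings differ, their difference has at most n prime divisors, so the protocol errs with only small probability.

    # A minimal sketch of the randomized fingerprinting protocol for Equality.
    # Alice's single message is the pair (p, x mod p), which takes O(log n) bits.

    import random

    def is_prime(m):
        if m < 2:
            return False
        d = 2
        while d * d <= m:
            if m % d == 0:
                return False
            d += 1
        return True

    def random_prime_below(limit):
        # Rejection sampling; adequate for the small limits used here.
        while True:
            p = random.randrange(2, limit)
            if is_prime(p):
                return p

    def equality_protocol(x_bits, y_bits):
        n = len(x_bits)
        x = int(x_bits, 2)                        # Alice's input, read as an integer
        y = int(y_bits, 2)                        # Bob's input
        p = random_prime_below(max(n * n, 5))     # Alice's random prime
        return x % p == y % p                     # Bob compares the residues

    if __name__ == "__main__":
        x = "1011001110100101"
        y = "1011001110000101"                    # differs from x in a single bit
        print(equality_protocol(x, x))                              # True: equal strings always accepted
        print(sum(equality_protocol(x, y) for _ in range(1000)))    # small: unequal strings rarely accepted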

Preparatory Reading

Lecture Presentation

Finishing Up

Lecture #33: Going Further

Why This is Included

This course provides an introduction to a huge area of research that is still active. This lecture describes additional material, related to what has been covered in the course, that might also have been presented if there had been more time.

Note: Material introduced in this lecture will not be assessed on the final examination.

Preparatory Reading

Lecture Presentation

