Introduction to communication and information complexity
MW 10:05-11:25am at MC 320.
This course is intended for graduate students in theoretical computer science or mathematics. It is a topics course in communication complexity and information complexity. The setting is that of two players with unlimited computational power who, given a function f(x,y), agree in advance on a communication protocol for the following task: each player receives an n-bit input, say x and y; neither knows the other's input, and they wish to use the protocol to collaboratively compute f(x,y). Communication complexity is concerned with minimizing the number of bits communicated by the players for the worst-case choice of inputs x and y. Information complexity is concerned with the amount of information that the communicated bits reveal about the inputs x and y, either to the other player or to an external observer. Communication and information complexity are currently very active areas of research in theoretical computer science, and they draw on a surprisingly wide range of techniques and tools from other areas of mathematics.
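To make the model concrete, here is a small Python sketch (an illustration only, not part of the course materials; the function names and interface are invented): a trivial deterministic protocol in which Alice sends her entire n-bit input, and a public-coin randomized protocol for the Equality function that communicates far fewer bits at the price of a small error probability.

```python
import random

def trivial_protocol(x, y, f):
    """Trivial deterministic protocol: Alice sends her whole n-bit
    input, Bob computes f(x, y) locally. Cost: n bits."""
    transcript = x            # all n bits are communicated
    return f(transcript, y), len(transcript)

def randomized_equality(x, y, repetitions=2):
    """Public-coin randomized protocol for Equality: both players see
    a shared random string r; Alice sends the inner product <x, r>
    mod 2 (one bit per repetition), and Bob compares it with <y, r>.
    Errs with probability 2^-repetitions when x != y; never errs
    when x == y."""
    n = len(x)
    bits_sent = 0
    for _ in range(repetitions):
        r = [random.randint(0, 1) for _ in range(n)]  # shared randomness
        a = sum(xi * ri for xi, ri in zip(x, r)) % 2  # Alice's bit
        b = sum(yi * ri for yi, ri in zip(y, r)) % 2  # Bob's bit
        bits_sent += 1
        if a != b:
            return False, bits_sent  # certainly unequal
    return True, bits_sent           # probably equal
```

When x differs from y, each repetition detects the difference with probability 1/2, so a constant number of bits suffices for constant error, whereas (as shown in Lecture 1) any deterministic protocol for Equality must communicate n bits in the worst case.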
The main prerequisites are mathematical maturity and familiarity with basic linear algebra, mathematical analysis, and probability theory.
There is no exam in this course. The final grade will be based on assignments and a final presentation.
Lectures (will be updated as we proceed)
- Lecture 1: Deterministic model of two-party communication. Protocols for the equality and parity problems. Formal definition of protocols and their tree representation. Combinatorial rectangles, and their relation to deterministic protocols. Tight lower bounds for the equality function. (Reading: 1.1, 1.2, 1.3 of Kushilevitz-Nisan).
- Lecture 2: Lower bounds on communication via matrix rank. The log-rank conjecture. Rectangle covers, and upper-bounding deterministic communication complexity in terms of the sizes of the smallest covers (Theorem 2.11 of Kushilevitz-Nisan). Introduction to private-coin randomized communication complexity. A private-coin randomized protocol for Equality. (Reading: 1.4, 2.1, 2.3, 3.1 of Kushilevitz-Nisan).
- Lecture 3: Public coin vs. private coin. A randomized protocol for Equality using public-coin randomness. Newman's theorem (Theorem 3.14 of Kushilevitz-Nisan) on the equivalence of the public-coin and private-coin models. (Reading: 3.3 of Kushilevitz-Nisan).
- Lecture 4: Distributional communication complexity, and Yao's theorem that the maximum distributional communication complexity equals the public-coin randomized communication complexity. (Reading: 3.4 of Kushilevitz-Nisan).
- Lecture 5: Discrepancy, lower-bounding the randomized CC in terms of discrepancy, the randomized CC of the inner product function. (Reading: Section 2 of Toni's note, and/or 3.5 of Kushilevitz-Nisan).
- Lecture 6: Limitations of the discrepancy method, limitations of the fooling set technique, approximate rank and sign rank, Krause's theorem (randomized CC is at least log of the approximate rank), the unbounded-error model.
- Lecture 7: Paturi-Simon theorem (unbounded randomized CC equals log of the sign rank). Non-determinism, Communication complexity classes: P, NP, CoNP, PSPACE, Polynomial hierarchy, BPP, UPP, PP.
- Lecture 8: Reductions and completeness, DISJ is CoNP-complete, Matrix norms, Forster's theorem.
- Lecture 9: Fourier analysis of Boolean functions, degree, approximate degree, sign-degree, threshold weight.
- Lecture 10: Pattern Matrices and their spectral properties, Generalized discrepancy method. (Reading: 3.3, 4.1, 4.2 of Sherstov's thesis).
- Lecture 11: Randomized CC and the discrepancy of pattern matrices, and their relation to approximate degree and sign degree. (Reading: 4.3, 4.4, 4.5, 4.6 of Sherstov's thesis).
- Lecture 12: Introduction to information theory and Entropy.
- Lecture 13: Mutual information and divergence.
- Lecture 14: Introduction to information complexity.
- Lecture 15: Information complexity of Equality.
- Lecture 16: Direct product/sum theorems, additivity of information complexity, information equals amortized communication.
- Lecture 17: A direct sum theorem using compression (from Lecture 16), Braverman-Rao's correlated sampling.
- Lecture 18: Compression using Braverman-Rao's correlated sampling (from Lecture 17). The [BBCR] compression.
- Lecture 19: Braverman's exponential compression. The external information complexity of the AND function.
- Lecture 20: The internal information complexity of the AND function.
- Lecture 21: Exact asymptotics of the randomized communication complexity of Disjointness.
- Lecture 22: Karchmer-Wigderson theorem (see Section 1.1 in Toni's note), super-logarithmic depth lower bounds via direct sum in communication complexity (see Theorem 7 in Karchmer-Raz-Wigderson). The multiparty number-on-the-forehead model (9.1, 9.2, 9.3 of Sherstov's thesis).
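As a tiny warm-up for the information-theory material of Lectures 12-13, the following Python sketch computes Shannon entropy and mutual information from a joint distribution table (an illustration only; the names and interface are invented for this sketch, not taken from the course notes).

```python
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits of a distribution given as a list."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint table
    given as a dict mapping (x, y) -> probability."""
    xs = sorted({x for x, _ in joint})
    ys = sorted({y for _, y in joint})
    px = [sum(p for (x, _), p in joint.items() if x == xv) for xv in xs]
    py = [sum(p for (_, y), p in joint.items() if y == yv) for yv in ys]
    return entropy(px) + entropy(py) - entropy(list(joint.values()))

# Two perfectly correlated uniform bits: X reveals everything about Y.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
```

For the perfectly correlated pair above, I(X;Y) = H(X) + H(Y) - H(X,Y) = 1 + 1 - 1 = 1 bit, while for two independent uniform bits the same formula gives 1 + 1 - 2 = 0 bits.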
- Theorem 3.5 and 3.6 from "Mark Braverman, Interactive information complexity". (Claimed by Ying Jie)
- Shachar Lovett. Communication is bounded by root of rank, and Thomas Rothvoss, A direct proof for Lovett's bound on the communication complexity of low rank matrices. (Claimed by Ira)
- Mark Braverman, Ankit Garg, Denis Pankratov, Omri Weinstein, Information lower bounds via self-reducibility. (Claimed by Alan Do-Omri)
- Mika Göös, Shachar Lovett, Raghu Meka, Thomas Watson, David Zuckerman. Rectangles Are Nonnegative Juntas. (Claimed by Florence)
- Mark Braverman, Anup Rao, Omri Weinstein, Amir Yehudayoff, Direct products in communication complexity. (Claimed by Harrison)
- Anat Ganor, Gillat Kol, Ran Raz, Exponential Separation of Communication and External Information. (Claimed by Tricia)
- A. Sherstov, Communication lower bounds using directional derivatives.
- Mark Braverman, Omri Weinstein, An interactive information odometer with applications.