308-362B Fall 2005


September 1 Lecture: Course Overview and An Introduction to Linear Programming (Chapter 1 of Chvatal, handed out in class).

September 8th Lecture: Linear Programming Duality (Chapter 5 of Chvatal, handed out in class: pp. 54-57, the statement of the Duality Theorem on page 58, and page 60). Note that we did not prove the strong duality theorem, so you need not read that part of the chapter.

September 13th Lecture: Stable Marriages: Section 1.1 of the text.

September 15th Lecture: Five Representative Problems and solutions to two of them: Sections 1.2, 4.1, 6.1, and 6.2 of the text.

September 20th Lecture: Lightest Paths 1: We can regard the interval scheduling problem of the last lecture as the problem of finding the longest path from s to t in an acyclic digraph with vertices {s,v_1,...,v_n,t}, edges from s to every other vertex, edges to t from every other vertex, and an edge from v_i to v_j precisely when b_i < a_j (i.e., interval i ends before interval j begins). To solve the maximum weight interval scheduling problem, we find the lightest s-t path in this graph under the weighting w(s)=w(t)=0, w(v_i)=-val(i).

In this lecture we considered the Lightest Path Problem (LPP) in general directed graphs. We developed a dynamic programming algorithm for the Lightest Path Problem in acyclic directed graphs. We also developed a general technique for certifying a solution to the LPP in graphs with no negative weight cycles, and used it to certify the solution our algorithm gave for the LPP in DAGs.
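A minimal sketch of the DAG algorithm together with the certificate check; the graph representation and the function name are mine, not the lecture's. The certificate is the distance labels themselves: they are correct precisely when d(v) <= d(u) + w(u,v) holds for every edge (u,v), with equality along some edge into each reachable v.

```python
from collections import defaultdict

def lightest_paths_dag(n, edges, s):
    """Lightest-path distances from s in a DAG.
    edges: list of (u, v, w) triples; vertices are 0..n-1."""
    adj = defaultdict(list)
    indeg = [0] * n
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1
    # Topological order via Kahn's algorithm.
    order, stack = [], [u for u in range(n) if indeg[u] == 0]
    while stack:
        u = stack.pop()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    INF = float('inf')
    d = [INF] * n
    d[s] = 0
    for u in order:              # relax each edge once, in topological order
        if d[u] < INF:
            for v, w in adj[u]:
                d[v] = min(d[v], d[u] + w)
    # Certificate check: d is feasible for every edge.
    assert all(d[v] <= d[u] + w for u, v, w in edges if d[u] < INF)
    return d

# Example: the path 0 -> 1 -> 3 (weight 3) beats the direct edge 0 -> 3.
print(lightest_paths_dag(4, [(0,1,2),(0,2,5),(1,3,1),(2,3,1),(0,3,9)], 0))
# -> [0, 2, 5, 3]
```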

September 22nd Lecture: Lightest Paths 2. We saw algorithms for directed graphs weighted with nonnegative weights and for the more general problem where we just forbid negative weight cycles. The first algorithm is discussed in Section 4.4 of the text. The second is discussed in Section 6.8 (once we flip the direction of every arc).
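For the non-negative-weight case, Section 4.4's algorithm is Dijkstra's; here is a minimal heap-based sketch (representation and naming are illustrative):

```python
import heapq

def dijkstra(n, adj, s):
    """Lightest-path distances from s when all edge weights are
    non-negative; adj[u] is a list of (v, w) pairs."""
    INF = float('inf')
    dist = [INF] * n
    dist[s] = 0
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                 # stale queue entry; skip it
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# Example: 0 -> 1 -> 2 (weight 3) beats the direct edge 0 -> 2 (weight 4).
print(dijkstra(4, [[(1,1),(2,4)], [(2,2),(3,6)], [(3,3)], []], 0))
# -> [0, 1, 3, 6]
```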

September 27th Lecture: Lightest Paths 3: We saw Johnson's algorithm for determining if a graph has a negative weight cycle. This is discussed in Section 6.10 of the text. We saw how to use the certificate obtained by this algorithm to transform the All Pairs Lightest Path Problem in graphs without negative weight cycles to the special case in which no edge weights are negative.
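The transformation uses the certificate, a vertex potential p with p(v) <= p(u) + w(u,v) for every edge, to replace each weight w(u,v) by w(u,v) + p(u) - p(v) >= 0; every s-t path's weight changes by the same amount p(s) - p(t), so lightest paths are preserved. A sketch, computing the potentials by Bellman-Ford from an implicit super-source (initializing all labels to 0 is equivalent to adding a source with zero-weight edges to every vertex):

```python
def johnson_reweight(n, edges):
    """Return edges reweighted to be non-negative while preserving
    lightest paths, or None if a negative-weight cycle exists.
    edges: list of (u, v, w) triples; vertices are 0..n-1."""
    p = [0] * n                       # implicit super-source: all labels 0
    for _ in range(n - 1):            # Bellman-Ford relaxation rounds
        for u, v, w in edges:
            if p[u] + w < p[v]:
                p[v] = p[u] + w
    # If any edge can still be relaxed, there is a negative cycle.
    if any(p[u] + w < p[v] for u, v, w in edges):
        return None
    # Reweight: each new weight is non-negative by feasibility of p.
    return [(u, v, w + p[u] - p[v]) for u, v, w in edges]
```

With the reweighted graph in hand, one run of Dijkstra per source solves the all-pairs problem.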

September 29th Lecture: Max Flow I: We described an algorithm for this problem and proved that it terminates. This is Sections 7.1 and 7.2 in the book.
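One concrete instance of the augmenting-path method of Sections 7.1-7.2, sketched below; the choice of breadth-first search to select augmenting paths (the Edmonds-Karp rule) is an implementation detail not fixed by the summary above.

```python
from collections import deque

def max_flow(capacity, s, t):
    """Maximum s-t flow value; capacity is an n x n matrix (augmenting
    paths found by BFS in the residual graph)."""
    n = len(capacity)
    residual = [row[:] for row in capacity]
    total = 0
    while True:
        # BFS for an augmenting s-t path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if residual[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:          # no augmenting path: flow is maximum
            return total
        # Find the bottleneck capacity along the path.
        b, v = float('inf'), t
        while v != s:
            b = min(b, residual[parent[v]][v])
            v = parent[v]
        # Augment: push b units forward, and credit b of residual backward.
        v = t
        while v != s:
            u = parent[v]
            residual[u][v] -= b
            residual[v][u] += b
            v = u
        total += b

# Example network: the max flow (and min cut) value is 5.
print(max_flow([[0,3,2,0], [0,0,1,2], [0,0,0,3], [0,0,0,0]], 0, 3))
# -> 5
```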

October 4th Lecture: Max Flow II: We saw applications of the maximum flow algorithm to Bipartite Matching and Image Segmentation. This material can be found in Sections 7.5 and 7.10 of the text. We also talked about Hall's Theorem, the Konig-Egervary Theorem, and Dilworth's Theorem. There will be a handout on these topics.

October 6th Lecture: Max Flow III: We completed our discussion of Dilworth's Theorem, and noted that we could actually solve the Airline Scheduling problem of Section 7.9 using this approach rather than that taken in the book. We showed how to decompose s-t flows into a set of weighted s-t paths and cycles. That is, we proved that for any flow $x$ in a capacitated network with $m$ arcs, for some $k \le l \le m$ there is a set $C_1, \ldots, C_k$ of cycles and a set $P_{k+1}, \ldots, P_l$ of s-t paths, with associated non-negative weights $w_1, \ldots, w_l$, such that for each edge $e$, $x(e)$ is the sum of the weights of the cycles and paths containing $e$, and the volume of $x$ is the sum of the weights of the paths. We also showed that if the flow is integer valued then we can choose integer weights. This allowed us to prove Menger's Theorem from Section 7.6 in the book.
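For integer flows the decomposition proof is constructive: repeatedly walk along positive-flow edges from s (conservation guarantees the walk never gets stuck before t), peel off a path or cycle at its bottleneck weight, and finish by peeling cycles out of whatever circulation remains. A sketch, with a hypothetical representation of the flow as an edge-to-value dictionary:

```python
from collections import defaultdict

def decompose_flow(flow, s, t):
    """Decompose an integer-valued s-t flow into weighted s-t paths and
    cycles; flow maps each edge (u, v) to its flow value.
    Returns (paths, cycles), each a list of (vertex_list, weight)."""
    f = {e: x for e, x in flow.items() if x > 0}
    out = defaultdict(list)
    for u, v in f:
        out[u].append(v)

    def walk(start, terminal):
        # Follow positive-flow edges until we reach `terminal` or repeat
        # a vertex (which closes a cycle).
        seen, trail, u = {start: 0}, [start], start
        while u != terminal:
            v = next(v for v in out[u] if f.get((u, v), 0) > 0)
            if v in seen:
                return trail[seen[v]:] + [v], True
            seen[v] = len(trail)
            trail.append(v)
            u = v
        return trail, False

    def peel(trail):
        # Subtract the bottleneck weight along the trail's edges.
        edges = list(zip(trail, trail[1:]))
        w = min(f[e] for e in edges)
        for e in edges:
            f[e] -= w
            if f[e] == 0:
                del f[e]
        return w

    paths, cycles = [], []
    while any(f.get((s, v), 0) > 0 for v in out[s]):
        trail, is_cycle = walk(s, t)
        (cycles if is_cycle else paths).append((trail, peel(trail)))
    while f:                          # remaining flow is a circulation
        u = next(iter(f))[0]
        trail, _ = walk(u, None)
        cycles.append((trail, peel(trail)))
    return paths, cycles
```

Each peel zeroes at least one edge, so at most $m$ paths and cycles are produced, matching the $l \le m$ bound above.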

October 11th Lecture: Max Flow IV: We showed that if $x_1$ is a flow and $x_2$ is a maximum flow, then there is a flow of volume $vol(x_2)-vol(x_1)$ in the auxiliary graph for $x_1$. Applying the previous result, we thereby obtained that there is a path of capacity at least $(vol(x_2)-vol(x_1))/m$ in the auxiliary network. This implied that if we always augment along a maximum capacity path in the auxiliary graph, we perform only $O(m \log(Cm))$ iterations, where $C$ is the largest edge capacity. We saw that we could find such a path in $O(m \log m)$ time. This yields an implementation of our algorithm running in $O(m^2 \log m \log(Cm))$ time. A similar but different approach is taken in Section 7.3 of the book.
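One way to find a maximum capacity path (a sketch, assuming an adjacency-list representation; the lecture's method may differ) is a Dijkstra-style search that maximizes the bottleneck along a path instead of minimizing the sum, which with a binary heap runs in the stated $O(m \log m)$ time:

```python
import heapq

def widest_path(n, adj, s, t):
    """Bottleneck capacity of a maximum capacity s-t path; adj[u] is a
    list of (v, capacity) pairs.  Returns 0 if t is unreachable."""
    best = [0] * n                    # best bottleneck found so far
    best[s] = float('inf')
    pq = [(-best[s], s)]              # max-heap via negated keys
    while pq:
        b, u = heapq.heappop(pq)
        b = -b
        if b < best[u]:
            continue                  # stale entry
        for v, c in adj[u]:
            nb = min(b, c)            # bottleneck through u then (u,v)
            if nb > best[v]:
                best[v] = nb
                heapq.heappush(pq, (-nb, v))
    return best[t]

# Example: the direct edge 0 -> 2 has capacity 2, but 0 -> 1 -> 2 has
# bottleneck min(4, 3) = 3.
print(widest_path(3, [[(1,4),(2,2)], [(2,3)], []], 0, 2))
# -> 3
```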

We briefly discussed the Minimum Cost Maximum Flow Problem, which is not examined material.


October 13th Lecture: A Randomized Polytime Algorithm for testing primality. A second reference was a chapter by Raghavan and Motwani handed out in class.
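The randomized test is presumably Miller-Rabin, which appears in the Raghavan and Motwani chapter; a sketch (a "composite" answer is always correct, while a "prime" answer is wrong with probability at most $4^{-rounds}$):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin randomized primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):    # dispose of small factors first
        if n % p == 0:
            return n == p
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)              # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # a witnesses that n is composite
    return True
```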

October 18th Lecture: A Deterministic Polytime Algorithm for testing primality. Not examined material.

October 20th Lecture: Yao's minmax principle. Reference handed out in class.

October 25th Lecture: Knapsacks, Cutting Stock, and Column Generation for LPs. The latter two topics are not examined material. You should know the algorithm for solving the fractional knapsack problem, the branch and cut algorithm for solving knapsack problems, and the 2-approximation algorithm for bin packing.
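The fractional knapsack algorithm is the greedy one: sort items by value per unit weight and take items greedily, splitting the last item that does not fit. A sketch (interface hypothetical):

```python
def fractional_knapsack(items, capacity):
    """Maximum total value of the fractional knapsack; items is a list
    of (value, weight) pairs.  Greedy by value density is optimal here."""
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                reverse=True):
        take = min(weight, capacity)      # take all of it, or what fits
        total += value * take / weight
        capacity -= take
        if capacity == 0:
            break
    return total

# Example: take items of density 6 and 5 whole, then 2/3 of the last.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))
# -> 240.0
```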

October 27th Lecture: More on the Cutting Stock Problem.

October 28th: Exam 10:45-12:15. Covers material up to but not including the October 20th lecture. Will be held in McConnell 320.

November 1: Lecture Cancelled (held on October 28th).

November 3: Vision. Not examined material.

November 8: Algorithmic Game Theory. Not examined material.

November 10: NP-completeness I: P and NP. Sections 7.1-7.3 of Sipser. This is largely review material.

November 15: NP-completeness II: Turing reducibility, some sample reductions, and the definition of NP-complete. The reference for our reductions is Chapter 8 of the text. We essentially covered Sections 8.1 through 8.9 of the text in five lectures, except that (A) we did not discuss circuit complexity to show SAT is NP-complete; rather, we proved the Cook-Levin Theorem, and (B) our introduction to NP-completeness was a little more formal, hence I used Sipser as a reference (there is overlap with the text).

November 17: NP-completeness IV: More examples of NP-complete problems. Also, the 1.5-approx algorithm for bin packing.
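The 1.5-approximation for bin packing mentioned here may well be First Fit Decreasing (my assumption; the lecture's algorithm could differ), which is known to use at most 3/2 times the optimal number of bins. A sketch:

```python
def first_fit_decreasing(sizes, bin_capacity=1.0):
    """First Fit Decreasing bin packing (assumed here to be the 1.5-approx
    algorithm from lecture): sort items by size descending, place each in
    the first open bin with room, opening a new bin only when none fits."""
    rooms = []      # remaining capacity of each open bin
    packing = []    # items placed in each bin
    for s in sorted(sizes, reverse=True):
        for i, room in enumerate(rooms):
            if s <= room + 1e-12:     # epsilon guards float round-off
                rooms[i] -= s
                packing[i].append(s)
                break
        else:
            rooms.append(bin_capacity - s)
            packing.append([s])
    return packing

# Example: items sum to 3.0, and FFD packs them into exactly 3 unit bins.
print(len(first_fit_decreasing([0.5, 0.7, 0.5, 0.3, 0.2, 0.8])))
# -> 3
```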

November 22: NP-completeness III: the Cook-Levin Theorem. The reference is a handout in class from Garey and Johnson. It uses the same notation as Sipser, except for yes and no instead of accept and reject, and +1,-1 instead of L,R.

November 24: NP-completeness V. Yet more examples.