Past and future lectures in CS252--Winter 2005


The schedule may be updated from time to time as we progress. CLRS refers to Cormen, Leiserson, Rivest, and Stein (Second Edition).

Lecture 1 (Jan 6) General introduction. Review of relevant material from 308-250. Definition of algorithm, abstract data types, data structures. Introduction to simple computer models: the Turing model, the RAM model, pointer-based machines. Worst-case, best-case, and average-case complexity. Chapters 1, 2, and 3 of CLRS (this is review of material from 250); web pages from 1997.
Lecture 2 (Jan 11) We review the notion of recursion. Euclid's algorithm, binary search, mergesort, Fibonacci numbers. The review material is Chapter 4 of CLRS, except for the proof of the Master Theorem. Euclid's algorithm is on pages 856-859 of CLRS. The results on Fibonacci numbers are an exercise in Chapter 4 of CLRS.
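As an illustration only (my own sketch, not code from the course), two of these recursions in Python:

    # Euclid's algorithm: gcd(a, b) = gcd(b, a mod b), with gcd(a, 0) = a.
    def gcd(a, b):
        return a if b == 0 else gcd(b, a % b)

    # Recursive binary search in a sorted list A; returns an index of x, or -1.
    def binary_search(A, x, lo=0, hi=None):
        if hi is None:
            hi = len(A) - 1
        if lo > hi:
            return -1
        mid = (lo + hi) // 2
        if A[mid] == x:
            return mid
        if A[mid] < x:
            return binary_search(A, x, mid + 1, hi)
        return binary_search(A, x, lo, mid - 1)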
Lecture 3 (Jan 13) Quicksort and Simplesort. Probabilistic analysis of algorithms. Chapter 7 of CLRS.
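A minimal randomized quicksort sketch (again my own illustration; the Lomuto-style partition here is in the spirit of Chapter 7):

    import random

    # Randomized quicksort: expected O(n lg n) comparisons.
    def quicksort(A, lo=0, hi=None):
        if hi is None:
            hi = len(A) - 1
        if lo < hi:
            p = partition(A, lo, hi)
            quicksort(A, lo, p - 1)
            quicksort(A, p + 1, hi)

    # Partition A[lo..hi] around a randomly chosen pivot; returns the pivot's final index.
    def partition(A, lo, hi):
        r = random.randint(lo, hi)
        A[r], A[hi] = A[hi], A[r]
        pivot, i = A[hi], lo - 1
        for j in range(lo, hi):
            if A[j] <= pivot:
                i += 1
                A[i], A[j] = A[j], A[i]
        A[i + 1], A[hi] = A[hi], A[i + 1]
        return i + 1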
Lecture 4 (Jan 18) Selection: Chapter 9 of CLRS. We cover the linear worst-case algorithm in its entirety and gloss over Hoare's linear expected-time algorithm.
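For reference, a sketch of the expected-linear-time approach (Hoare-style quickselect with a random pivot; the worst-case linear median-of-medians algorithm is the one done in full in Chapter 9):

    import random

    # Quickselect: returns the k-th smallest element of A (k counted from 1,
    # assumed to satisfy 1 <= k <= len(A)), in expected linear time.
    def select(A, k):
        A = list(A)
        lo, hi = 0, len(A) - 1
        while True:
            if lo == hi:
                return A[lo]
            r = random.randint(lo, hi)
            A[r], A[hi] = A[hi], A[r]
            pivot, i = A[hi], lo - 1
            for j in range(lo, hi):
                if A[j] <= pivot:
                    i += 1
                    A[i], A[j] = A[j], A[i]
            A[i + 1], A[hi] = A[hi], A[i + 1]
            rank = i + 2 - lo            # rank of the pivot within A[lo..hi]
            if k == rank:
                return A[i + 1]
            if k < rank:
                hi = i
            else:
                lo, k = i + 2, k - rank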
Lecture 5 (Jan 20) Lower Bounds I: the decision tree model, the information theory lower bounds for sorting and for searching an ordered array, the comparison DAG, and the information theory lower bound for the kth smallest element. These slides describe the decision tree model and the lower bound for sorting; they also discuss the bound for the median given in the next lecture, using a treatment similar to, but slightly different from, the one I gave in class. Section 8.1 of the text covers the information theory lower bound for sorting.
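The counting argument for sorting, written out (this is the standard calculation matching Section 8.1): a decision tree that sorts n keys must have at least n! leaves, so its height h satisfies

    2^h \ge n! \quad\Longrightarrow\quad h \ge \lg(n!) \ge \lg\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right) = \tfrac{n}{2}\lg\tfrac{n}{2} = \Omega(n \lg n).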
Lecture 6 (Jan 25) Lower Bounds II: adversarial arguments. The adversarial argument for median finding, finding the maximum and the minimum, and finding the two largest elements. For a discussion of the adversarial arguments for these three problems, see some useful notes. (Twice in these notes there is a boldface Trivial Lower Bound which should read Trivial Upper Bound.) The reduction approach and an example: finding convex hulls.
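As a worked example of the adversary accounting (the standard argument, not taken verbatim from the notes): to identify both the maximum and the minimum of n elements, an algorithm must eliminate n - 1 elements as maximum candidates and n - 1 as minimum candidates, i.e. gather 2n - 2 units of information. A comparison between two elements that have never been compared yields 2 units, any other comparison at most 1, and at most \lfloor n/2 \rfloor comparisons can be of the first kind, so the number of comparisons is at least

    \left\lfloor \tfrac{n}{2} \right\rfloor + \left(2n - 2 - 2\left\lfloor \tfrac{n}{2} \right\rfloor\right) = \left\lceil \tfrac{3n}{2} \right\rceil - 2.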
Lecture 7 (Jan 27) Priority queues: review of heaps and heapsort; binomial heaps. Chapters 6 (review) and 19 of CLRS.
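For the review half, a minimal array-based binary min-heap (my own sketch; the new material, binomial heaps, is Chapter 19):

    # Array-based binary min-heap; the children of index i live at 2i+1 and 2i+2.
    class MinHeap:
        def __init__(self):
            self.a = []

        def insert(self, x):
            self.a.append(x)
            i = len(self.a) - 1
            while i > 0 and self.a[(i - 1) // 2] > self.a[i]:
                self.a[i], self.a[(i - 1) // 2] = self.a[(i - 1) // 2], self.a[i]
                i = (i - 1) // 2

        def extract_min(self):
            # assumes the heap is non-empty
            a = self.a
            a[0], a[-1] = a[-1], a[0]
            smallest = a.pop()
            i = 0
            while True:
                l, r, m = 2 * i + 1, 2 * i + 2, i
                if l < len(a) and a[l] < a[m]:
                    m = l
                if r < len(a) and a[r] < a[m]:
                    m = r
                if m == i:
                    return smallest
                a[i], a[m] = a[m], a[i]
                i = m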
Lecture 8 (Feb 1) Linear time algorithms for sorting, Sections 8.2-8.4 of the text. Review of basic graph definitions; the combined adjacency list-adjacency matrix data structure, Section 22.1 of the text. Pre- and post-order traversal of trees and other basic tree definitions; see Blanchette's web page for cs250. An algorithm for computing the height of a tree using a post-order traversal.
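The height computation mentioned above, sketched in Python (the node representation, a .children list, is just an assumption for illustration):

    # Height of a rooted tree via a post-order traversal: each node's height
    # is 1 + the maximum height of its children; a leaf has height 0.
    def height(node):
        if not node.children:
            return 0
        return 1 + max(height(c) for c in node.children)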
Lecture 9 (Feb 3) DFS, including a proof of correctness and its use in topological sort and finding strongly connected components. Sections 22.3, 22.4, and 22.5 of CLRS.
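A sketch of DFS used for topological sorting (illustration only; the graph is assumed to be a dict mapping each vertex of a DAG to its adjacency list):

    # Vertices are appended to the output when they finish; reversing the
    # list of finish order gives a topological order.
    def topological_sort(graph):
        visited, order = set(), []

        def dfs(u):
            visited.add(u)
            for v in graph[u]:
                if v not in visited:
                    dfs(v)
            order.append(u)  # u finishes: all its descendants are already listed

        for u in graph:
            if u not in visited:
                dfs(u)
        order.reverse()
        return order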
Lecture 10 (Feb 8) SAT and 2SAT.
Lecture 11 (Feb 10) An introduction to linear programming, and lower bounds on building heaps (the latter is an unexamined topic). See the first part of Chapter 29 of the text for linear programming, or the first chapter of Chvatal's text Linear Programming, which is available at Copi-EUS on the ground floor of McConnell.
Lecture 12 (Feb 15) Computing the blocks (biconnected components) of a graph using depth-first search.
Midterm Exam (Feb 17) Note: closed book, 90 minutes; no calculators, cell phones, or other electronic devices. The material from Lecture 1 on models of computation will not be examined, nor will the material on linear programming or finding blocks, although you are responsible for the latter with respect to the final exam.
Lecture 13 (Mar 1) Breadth-First Search. Minimum weight spanning trees and Prim's algorithm. Section 22.2 and chapter 23 of CLRS.
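A sketch of Prim's algorithm using a binary heap (illustration only; here graph maps each vertex to a list of (weight, neighbour) pairs):

    import heapq

    # Prim's algorithm from an arbitrary start vertex; returns the edges of
    # a minimum weight spanning tree of a connected graph.
    def prim(graph):
        start = next(iter(graph))
        in_tree, tree_edges = {start}, []
        frontier = [(w, start, v) for (w, v) in graph[start]]
        heapq.heapify(frontier)
        while frontier and len(in_tree) < len(graph):
            w, u, v = heapq.heappop(frontier)
            if v in in_tree:
                continue                     # stale entry: v was added earlier
            in_tree.add(v)
            tree_edges.append((u, v, w))
            for (w2, x) in graph[v]:
                if x not in in_tree:
                    heapq.heappush(frontier, (w2, v, x))
        return tree_edges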
Lecture 14 (Mar 3) More on minimum weight spanning trees: Kruskal's algorithm and data structures for disjoint sets. Sections 21.1 and 21.2 of CLRS. Also, an introduction to amortized analysis. (What we saw was an example of aggregate analysis; for another example, see Section 17.1 of CLRS.)
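A sketch of Kruskal's algorithm on top of a disjoint-set forest (illustration only; vertices are assumed to be 0..n-1 and edges a list of (weight, u, v) triples):

    # Disjoint-set forest with union by rank and path compression,
    # used to test whether an edge joins two different components.
    def kruskal(n, edges):
        parent, rank = list(range(n)), [0] * n

        def find(x):
            if parent[x] != x:
                parent[x] = find(parent[x])      # path compression
            return parent[x]

        def union(x, y):
            x, y = find(x), find(y)
            if x == y:
                return False
            if rank[x] < rank[y]:
                x, y = y, x
            parent[y] = x
            if rank[x] == rank[y]:
                rank[x] += 1
            return True

        tree = []
        for w, u, v in sorted(edges):            # consider edges by increasing weight
            if union(u, v):
                tree.append((u, v, w))
        return tree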
Lecture 15 (Mar 8) Hash tables. Direct-address tables, hashing with chaining, universal hashing. Open addressing: linear probing and double hashing. Sections 11.1 to 11.4 of CLRS.
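A minimal sketch of hashing with chaining (illustration only; Python's built-in hash stands in for the randomly chosen functions of a universal family):

    # Each of the m slots holds a list of (key, value) pairs.
    class ChainedHashTable:
        def __init__(self, m=101):
            self.m = m
            self.slots = [[] for _ in range(m)]

        def _h(self, key):
            return hash(key) % self.m

        def insert(self, key, value):
            chain = self.slots[self._h(key)]
            for i, (k, _) in enumerate(chain):
                if k == key:
                    chain[i] = (key, value)      # overwrite an existing key
                    return
            chain.append((key, value))

        def search(self, key):
            for k, v in self.slots[self._h(key)]:
                if k == key:
                    return v
            return None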
Lecture 16 (Mar 10) Greedy algorithms: making change, building a Huffman tree. Coding and compression: prefix codes, Huffman codes. Sections 16.2 and 16.3 of CLRS.
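A sketch of the greedy Huffman construction using a binary heap (illustration only; freqs maps symbols to frequencies, and the result is a nested (left, right) tuple with symbols at the leaves):

    import heapq

    # Repeatedly merge the two trees of smallest total frequency.
    def huffman(freqs):
        heap = [(f, i, sym) for i, (sym, f) in enumerate(freqs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            f1, _, t1 = heapq.heappop(heap)
            f2, _, t2 = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
            count += 1        # tie-breaking counter keeps heap entries comparable
        return heap[0][2]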
Lecture 17 (Mar 15) Dynamic programming. Computing the binomial coefficient using Pascal's triangle; making change (using coins of arbitrary values); matrix-chain multiplication. The notions of optimal substructure and of overlapping subproblems. Sections 15.2 and 15.3 of CLRS.
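A sketch of the bottom-up dynamic program for making change with coins of arbitrary values (illustration only):

    # best[a] = fewest coins summing to a; returns None if amount is unreachable.
    def min_coins(coins, amount):
        INF = float("inf")
        best = [0] + [INF] * amount
        for a in range(1, amount + 1):
            for c in coins:
                if c <= a and best[a - c] + 1 < best[a]:
                    best[a] = best[a - c] + 1    # reuse the optimal sub-solution
        return best[amount] if best[amount] < INF else None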
Lecture 18 (Mar 17) Memoization in dynamic programming. Divide and conquer: the skyline problem; finding the closest pair of points. Section 33.4 of CLRS.
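To illustrate memoization, the Pascal's-triangle recurrence from the previous lecture written top-down with a cache (modern Python's lru_cache is used for brevity; a hand-built table works the same way):

    from functools import lru_cache

    # Each (n, k) pair is computed once and then looked up.
    @lru_cache(maxsize=None)
    def binom(n, k):
        if k == 0 or k == n:
            return 1
        return binom(n - 1, k - 1) + binom(n - 1, k)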
Lecture 19 (Mar 22) The master theorem: proof of a simplified version. Red-black trees. Chapter 13 of CLRS.
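One common simplified form of the statement (the full version is Theorem 4.1 of CLRS): for constants a >= 1, b > 1, d >= 0, if

    T(n) = a\,T(n/b) + \Theta(n^d)

then

    T(n) = \begin{cases} \Theta(n^d) & \text{if } a < b^d,\\ \Theta(n^d \lg n) & \text{if } a = b^d,\\ \Theta(n^{\log_b a}) & \text{if } a > b^d. \end{cases}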
Lecture 20 (Mar 24) Red-black trees continued: insertion and deletion in a red-black tree. Augmenting data structures: order-statistic trees. Section 14.1 of CLRS.
Lecture 21 (Mar 29) Augmenting data structures, continued: interval trees. Plane sweep and interval trees used to solve the rectangle intersections problem. Sections 14.2 and 14.3 of CLRS.
Lecture 22 (Mar 31) Using plane sweep to compute the intersection of a set of half-planes in R^2. Berg et al. section 4.2.
Lecture 23 (Apr 5) Incremental algorithm for 2D linear programming. Berg et al. section 4.3.
Lecture 24 (Apr 7) Probabilistic analysis of the randomized version of incremental 2DLP. Approximation algorithm for the travelling salesman problem with triangle inequality. Berg et al. section 4.4; CLRS section 35.2.
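The approximation ratio argument for the triangle-inequality case, in one line (this is the standard analysis from Section 35.2): let H* be an optimal tour and T a minimum spanning tree. Deleting any edge of H* gives a spanning tree, so c(T) <= c(H*); and the tour H obtained by shortcutting a preorder walk of T, which traverses each tree edge twice, satisfies

    c(H) \le 2\,c(T) \le 2\,c(H^*).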
Lecture 25 (Apr 12) The general travelling salesman problem cannot be approximated in polynomial time with a constant approximation ratio, unless P=NP. We show this by reducing the Hamiltonian cycle problem to approximating TSP. Again, CLRS section 35.2.
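Sketch of the reduction (the standard construction from Section 35.2): given a graph G = (V, E) and a constant \rho \ge 1, form the complete graph on V with edge costs

    c(u, v) = \begin{cases} 1 & \text{if } (u,v) \in E,\\ \rho|V| + 1 & \text{otherwise.} \end{cases}

If G has a Hamiltonian cycle, there is a tour of cost |V|; otherwise every tour uses at least one non-edge and costs more than \rho|V|. Hence a polynomial-time \rho-approximation algorithm for TSP would decide the Hamiltonian cycle problem.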