Home/Announcements | Course Information | Lectures | Assignments | Exams | Tutorials |
Below you will find summaries of lectures, handouts made available during the term, and reading materials. A set of notes is also available online. Please see the slides for the first lecture for suggestions on how to make use of these notes.
Please check this page regularly for updates.
Lecture 1 (Tuesday, September 4): Introduction; RAM, Algorithm Analysis and Order Notation
Today I first handed out the course information sheet. I then gave an introduction to this course. Afterwards, I started to discuss algorithm analysis and introduced the random access machine (RAM) model of computation. I will talk more about this model in the next lecture.
Handout: course information sheet; introduction to CSCI 3110; Lecture 1 notes
Reading: CLRS 2.1
Lecture 2 (Thursday, September 6): RAM, Algorithm Analysis and Order Notation
Today I talked more about the RAM model and showed how to analyze the running time of insertion sort under it.
Handout: Lecture 2 notes
Reading: CLRS 2.2
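For reference, the insertion sort analyzed in lecture can be rendered in Python roughly as follows (a zero-indexed sketch of the CLRS 2.1 pseudocode, not the exact code from class):

```python
def insertion_sort(a):
    """Sort the list a in place and return it.
    Under the RAM model this takes Theta(n^2) time in the worst case."""
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift elements larger than key one position to the right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```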
Lecture 3 (Tuesday, September 11): Algorithm Analysis and Order Notation
I first introduced Θ-, O-, Ω-, o- and ω-notation and gave some properties of order notation. Then I concentrated on tricks that can help us compare functions.
Handout: Lecture 3 notes
Reading: CLRS 3.1
Lecture 4 (Thursday, September 13): Algorithm Analysis and Order Notation; Algorithm Design and Maximum Subrange Sum
Today I first talked a little more about the tricks for order notation. Then I showed four solutions to the maximum subrange sum problem, and started to talk about the fifth solution.
Handout: Lecture 4 notes
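For those reading ahead, the well-known linear-time solution to maximum subrange sum can be sketched as follows (this may or may not match the numbering of the solutions shown in class; it assumes the empty subrange, with sum 0, is allowed):

```python
def max_subrange_sum(a):
    """Maximum sum over all contiguous subranges of a, in O(n) time.
    The empty subrange (sum 0) is allowed."""
    best = current = 0
    for x in a:
        # Either extend the best subrange ending at the previous position,
        # or restart from the empty subrange.
        current = max(0, current + x)
        best = max(best, current)
    return best
```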
Lecture 5 (Tuesday, September 18): Algorithm Design and Maximum Subrange Sum; Reduce to Known Problem
I first talked more about the maximum subrange sum problem and started to talk about the algorithm design paradigm of reducing to a known problem. Then we discussed the element distinctness problem, set intersection and collinear points.
Handout: Lecture 5 notes
Lecture 6 (Thursday, September 20): Reduce to Known Problem
First I defined the convex hull problem, introduced polar angles, and showed how to compute angles between line segments. I then described a naive approach for convex hull, Jarvis's march and Graham's scan. I will finish some details of Graham's scan in the next lecture.
Handout: Lecture 6 notes
Reading: CLRS 33.3
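Both Jarvis's march and Graham's scan can compare polar angles without computing any actual angles, using only the sign of a cross product. A minimal sketch (the helper name is mine, not from the notes):

```python
def cross(o, a, b):
    """Cross product of the vectors o->a and o->b (2D points as (x, y) pairs).
    Positive: the turn o->a->b is counterclockwise (b lies left of o->a);
    negative: clockwise; zero: the three points are collinear."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
```

Graham's scan, for instance, pops its stack whenever this sign indicates a non-left turn.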
Lecture 7 (Tuesday, September 25): Reduce to Known Problem; Recursion
Today I showed some remaining details of Graham's scan. Then I proved a lower bound for convex hull by reducing sorting to convex hull. Next, I rigorously analyzed the running time of the recursive solution to Fibonacci numbers, using the substitution method. The fire alarm in the Tupper Building interrupted the lecture for some time.
Handout: Lecture 7 notes
Lecture 8 (Thursday, September 27): Recursion; Divide-and-Conquer
I first showed how to use memoization to make the recursive solution for computing the n-th Fibonacci number efficient. I then introduced a recursive algorithm for printing all permutations and analyzed its running time. Finally, I gave the pseudocode of mergesort, which we will analyze in the next lecture.
Handout: Lecture 8 notes
Reading: CLRS 2.3.1
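As a small illustration of the memoization idea (using the standard-library cache decorator rather than a hand-rolled table, so the details may differ from class), the exponential-time recursive Fibonacci computation drops to O(n) recursive calls:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """n-th Fibonacci number (fib(0) = 0, fib(1) = 1).
    The cache ensures each subproblem is solved only once."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```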
Lecture 9 (Tuesday, October 2): Divide-and-Conquer
Today I first showed how to solve the recurrence for mergesort. Then I analyzed a divide-and-conquer algorithm for multiplying two n-bit integers and introduced Karatsuba's algorithm. Next we learned the master theorem; I will give examples of how to use it in the next lecture.
Handout: Lecture 9 notes
Reading: CLRS 4.5
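A compact sketch of Karatsuba's algorithm on Python integers, splitting by bits rather than decimal digits (the base-case threshold here is arbitrary):

```python
def karatsuba(x, y):
    """Multiply nonnegative integers with three recursive multiplications,
    giving T(n) = 3T(n/2) + O(n), i.e. O(n^(log2 3)) bit operations."""
    if x < 10 or y < 10:  # small base case
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> m, x & ((1 << m) - 1)
    hi_y, lo_y = y >> m, y & ((1 << m) - 1)
    a = karatsuba(hi_x, hi_y)
    b = karatsuba(lo_x, lo_y)
    # One product of the sums replaces the two cross products.
    c = karatsuba(hi_x + lo_x, hi_y + lo_y) - a - b
    return (a << (2 * m)) + (c << m) + b
```

The single combined product of (hi_x + lo_x) and (hi_y + lo_y) is what reduces four recursive multiplications to three.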
Lecture 10 (Thursday, October 4): Divide-and-Conquer
Today I first gave some suggestions regarding the midterm exam. I then went through some examples of using the master theorem. Next I introduced recursion trees, and started to discuss the problem of finding the closest pair of points.
Handout: Lecture 10 notes
Reading: CLRS 4.4
Optional reading: CLRS 4.6; Slides for Midterm Preparation
Lecture 11 (Tuesday, October 9): Divide-and-Conquer; Invent (or Augment) a Data Structure
Today I first finished the solution to the problem of finding the closest pair of points. Then we started to discuss how to invent or augment a data structure. I showed most of the solutions to the problem of abelian square detection.
Handout: Lecture 11 notes
Reading: CLRS 33.4
Lecture 12 (Thursday, October 11): Invent (or Augment) a Data Structure; Greedy Algorithms
Today I first finished talking about the problem of abelian square detection. Then I introduced greedy algorithms. Next I described the activity selection problem and showed some greedy strategies that do not work. Finally I gave a greedy algorithm for the activity selection problem that always yields an optimal solution. I will finish the correctness proof in the next lecture.
Handout: Lecture 12 notes
Reading: CLRS 16.1, 16.2
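The earliest-finish-time greedy algorithm for activity selection is short enough to sketch in full (activities as (start, finish) pairs; the function name is mine, and the exchange-argument correctness proof is in the notes):

```python
def select_activities(activities):
    """Greedy activity selection: repeatedly take the compatible activity
    that finishes earliest. Returns a maximum-size set of non-overlapping
    (start, finish) intervals; O(n log n) including the sort."""
    result = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:  # compatible with everything chosen so far
            result.append((start, finish))
            last_finish = finish
    return result
```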
Lecture 13 (Tuesday, October 16): Greedy Algorithms
Today I first finished the correctness proof of the greedy algorithm for the activity selection problem. Then I discussed the knapsack and fractional knapsack problems, and introduced Egyptian fractions.
Handout: Lecture 13 notes
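A sketch of the greedy solution to the fractional knapsack problem, which, unlike the 0-1 version, is solved optimally by taking items in decreasing value-to-weight order (the item encoding and names are mine):

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items in decreasing value/weight
    order, splitting the last item if needed. items is a list of
    (value, weight) pairs; returns the maximum total value."""
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)   # take the whole item, or a fraction
        total += value * take / weight
        capacity -= take
    return total
```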
Lecture 14 (Tuesday, October 23): Greedy Algorithms; Dynamic Programming
Today I first finished discussing Egyptian fractions. I then started to introduce dynamic programming, using matrix-chain multiplication as an example. I analyzed the brute-force approach for matrix-chain multiplication, and then introduced a solution to matrix-chain multiplication based on dynamic programming.
Handout: Lecture 14 notes
Reading: CLRS 15.2
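The dynamic programming solution fills a table m, where m[i][j] is the cheapest cost of computing the product Ai...Aj, via the recurrence m[i][j] = min over i <= k < j of m[i][k] + m[k+1][j] + p[i-1]*p[k]*p[j]. A sketch of the cost computation (the traceback for the optimal parenthesization is omitted):

```python
def matrix_chain_cost(p):
    """Minimum number of scalar multiplications to compute A1...An,
    where matrix Ai has dimensions p[i-1] x p[i]. Classic O(n^3) DP."""
    n = len(p) - 1
    # m[i][j] = cheapest cost of multiplying Ai..Aj (1-indexed; m[i][i] = 0).
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):          # chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]
```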
Lecture 15 (Thursday, October 25): Dynamic Programming
I first walked through an example of using dynamic programming to solve matrix-chain multiplication. Then I showed how to multiply the matrix chain optimally using the results computed by the dynamic programming algorithm. I also showed another way of writing down the pseudocode of dynamic programming algorithms: memoization. After that, we started to discuss the longest common subsequence problem.
Handout: Lecture 15 notes
Reading: CLRS 15.3
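A sketch of the longest-common-subsequence DP, including the traceback that recovers one actual LCS from the table (details here may differ from the presentation in class):

```python
def lcs(x, y):
    """Length, and one witness, of a longest common subsequence of the
    strings x and y; O(|x| * |y|) time and space."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    # Trace back through the table to recover one LCS.
    out = []
    i, j = m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return c[m][n], "".join(reversed(out))
```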
Lecture 16 (Tuesday, October 30): Dynamic Programming
Today Dr. Norbert Zeh covered my lecture. He first finished discussing the problem of computing the longest common subsequence. He then discussed the problem of making change.
Handout: Lecture 16 notes
Reading: CLRS 15.4
Lecture 17 (Thursday, November 1): Exploiting the Problem's Structure; Probabilistic & Randomized Techniques
Today I first presented Euclid's algorithm and analyzed it. After that, I introduced the binary algorithm for computing the greatest common divisor and analyzed it as well. I then used the problem of finding the majority element as an example of randomized algorithms; I will finish it in the next lecture.
Handout: Lecture 17 notes
Reading: CLRS 31.2
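Euclid's algorithm itself fits in a few lines of Python:

```python
def gcd(a, b):
    """Euclid's algorithm: gcd(a, b) = gcd(b, a mod b).
    The second argument at least halves every two iterations,
    so the number of divisions is O(log min(a, b))."""
    while b:
        a, b = b, a % b
    return a
```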
Lecture 18 (Tuesday, November 6): Probabilistic & Randomized Techniques
Today we used the majority and selection problems to discuss randomized algorithms.
Handout: Lecture 18 notes
Reading: CLRS 9.2, 9.3 (optional)
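One simple randomized approach to the majority problem, which may differ in details from the algorithm shown in class: repeatedly sample a random candidate and verify it by counting. If a majority element exists, each trial succeeds with probability greater than 1/2, so 50 independent trials all fail with probability below 2^-50:

```python
import random

def find_majority(a, trials=50):
    """Randomized majority finding: sample a candidate, verify by counting.
    Returns the majority element of a (an element occurring more than
    len(a)//2 times), or None if no trial verifies one."""
    n = len(a)
    for _ in range(trials):
        candidate = random.choice(a)
        # Verification is deterministic: count the candidate's occurrences.
        if sum(1 for x in a if x == candidate) > n // 2:
            return candidate
    return None
```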
Lecture 19 (Thursday, November 8): Graph Algorithms
First I defined terminology on graphs and reviewed graph representations. Next I introduced Kruskal's algorithm and proved its correctness.
Handout: Lecture 19 notes; Graph Theory Fundamentals and Graph Representations
Reading: CLRS 22.1, 23.1
Lecture 20 (Tuesday, November 20): Graph Algorithms
Today I first showed how to implement Kruskal's algorithm. Then I introduced Prim's algorithm.
Handout: Lecture 20 notes
Reading: CLRS 23.2
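A sketch of Kruskal's algorithm with a simple union-find structure (path halving but no union by rank, so this is not the asymptotically best variant; the edge encoding is mine):

```python
def kruskal(n, edges):
    """Kruskal's MST: scan edges by increasing weight, keeping an edge iff
    its endpoints lie in different components. edges is a list of
    (weight, u, v) triples over vertices 0..n-1; O(E log E) overall.
    Returns (total weight, list of chosen edges)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    total, tree = 0, []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                       # u and v are in different trees
            parent[ru] = rv
            total += w
            tree.append((u, v))
    return total, tree
```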
Lecture 21 (Thursday, November 22): Graph Algorithms
Today I first talked about the final exam and then introduced the single-source shortest path problem and the Bellman-Ford algorithm.
Handout: Lecture 21 notes; Suggestions on Final Exam Preparation
Reading: CLRS 24.1
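A sketch of the Bellman-Ford algorithm, including the standard extra pass that detects reachable negative-weight cycles (the error-handling choice is mine):

```python
def bellman_ford(n, edges, source):
    """Bellman-Ford single-source shortest paths: relax every edge n-1
    times; any edge that still relaxes afterwards signals a reachable
    negative-weight cycle. edges is a list of (u, v, weight) triples
    over vertices 0..n-1; O(VE) time. Returns the distance list."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # Detection pass.
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            raise ValueError("graph contains a reachable negative-weight cycle")
    return dist
```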
Lecture 22 (Tuesday, November 27): Graph Algorithms; NP-completeness
The first 15 minutes of the lecture were used for student ratings of instruction; thank you for participating. After that, I introduced Dijkstra's algorithm. Finally, I introduced the complexity class P.
Handout: Lecture 22 notes
Reading: CLRS 24.3
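A sketch of Dijkstra's algorithm using Python's binary heap; stale heap entries are skipped on pop rather than decreased in place, a common implementation shortcut:

```python
import heapq

def dijkstra(adj, source):
    """Dijkstra's algorithm with a binary heap: O((V + E) log V).
    adj[u] is a list of (v, weight) pairs; weights must be nonnegative.
    Returns a dict mapping each reachable vertex to its distance."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale entry: u was already settled at a smaller distance
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```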
Lecture 23 (Thursday, November 29): NP-completeness
Today we discussed the complexity classes P and NP. However, the university was closed afterwards due to a power outage. Therefore, as mentioned in my email to the class, I am posting today's notes as a "bonus"; today's lecture will not be covered on the final exam.
Handout: Lecture 23 notes
Reading: CLRS 34.1, 34.2, 34.3
Acknowledgement: Some content and examples are drawn from, and kept generally consistent in notation with, the required textbook Introduction to Algorithms, though my way of presenting them is often different. I would also like to thank Professor Jeffrey Shallit of the University of Waterloo, who taught two parallel sections of an algorithms course when I taught the third; that experience and his notes greatly influenced the notes I wrote for my offering of CSCI 3110 at Dalhousie.