#import "template.typ": *

// Take a look at the file `template.typ` in the file panel
// to customize this template and discover how it works.
#show: project.with(
  title: "CS3230",
  authors: (
    "Yadunand Prem",
  ),
)

= Lecture 1

== Fibonacci

```python
def fib(n):
    if n == 0: return 0
    elif n == 1: return 1
    else: return fib(n-1) + fib(n-2)
```

- `T(0) = 2` (if, return)
- `T(1) = 3` (if, elif, return)
- `T(n) = T(n-1) + T(n-2) + 7` (if, elif, else, +, fib, fib, return)
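
A quick sketch (mine, not from the lecture): evaluating this step-count recurrence directly shows it grows like the Fibonacci numbers themselves, i.e. exponentially in $n$.

```python
# Illustration: evaluate T(0) = 2, T(1) = 3, T(n) = T(n-1) + T(n-2) + 7 directly
# and compare with fib(n); both grow exponentially (ratio per step tends to ~1.618).
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n == 0: return 2
    if n == 1: return 3
    return T(n - 1) + T(n - 2) + 7

@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

for n in (10, 20, 30):
    print(n, fib(n), T(n), T(n) / T(n - 1))
```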

== Notations

=== $O$-notation

- _Upper bound_: the function grows no faster than $c g(n)$
- $f in O(g)$ if there exist $c > 0$ and $n_0 > 0$ such that $forall n >= n_0: 0 <= f(n) <= c g(n)$
- The intuition is that for large enough $n$, there is a #link(<orders>)[function] $g$ and a constant $c$ such that $f(n)$ is always at most $c g(n)$.

=== $Omega$-notation

- _Lower bound_: the function grows at least as fast as $c g(n)$
- $f in Omega(g)$ if there exist $c > 0$ and $n_0 > 0$ such that $forall n >= n_0: 0 <= c g(n) <= f(n)$

=== $Theta$-notation

- Both upper and lower bounded by constant multiples of $g(n)$
- $f in Theta(g)$ if there exist $c_1, c_2 > 0$ and $n_0 > 0$ such that $forall n >= n_0: 0 <= c_1 g(n) <= f(n) <= c_2 g(n)$

=== $o$-notation

- Strict upper bound: for _every_ $c > 0$, $0 <= f(n) < c g(n)$ for all sufficiently large $n$

=== $omega$-notation

- Strict lower bound: for _every_ $c > 0$, $0 <= c g(n) < f(n)$ for all sufficiently large $n$
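
A quick numeric illustration of the $O$ definition (my own example, not from the lecture): for $f(n) = 2n^2 + 3n$ and $g(n) = n^2$, the witnesses $c = 3$ and $n_0 = 3$ work.

```python
# Illustration: check the witnesses c = 3, n0 = 3 for f(n) = 2n^2 + 3n and g(n) = n^2.
def f(n): return 2 * n * n + 3 * n
def g(n): return n * n

c, n0 = 3, 3
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
```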

#[
#set par(justify: true, leading: 0.5em)

=== Orders of Common Functions <orders>

- $O(1)$
- $O(log log n)$
- $O(log n)$
- $O((log n)^c)$
- $O(n^c), 0 < c < 1$
- $O(n)$
- $O(n log^* n)$
- $O(n log n) = O(log n!)$
- $O(n^2)$
- $O(n^c)$
- $O(c^n), c > 1$
- $O(n!)$
]

=== Limits

- $limits(lim)_(n arrow.r infinity)(f(n)/g(n)) = 0 arrow.r.double f(n) = o(g(n))$
  - By the definition of limits, $limits(lim)_(n arrow.r infinity)(f(n)/g(n)) = 0$ means:
  - $forall epsilon > 0, exists n_0 > 0$, s.t. $forall n >= n_0, f(n)/g(n) < epsilon$
  - Hence $f(n) < c g(n)$ for every constant $c > 0$ (take $epsilon = c$)
- $limits(lim)_(n arrow.r infinity)(f(n)/g(n)) < infinity arrow.r.double f(n) = O(g(n))$
- $0 < limits(lim)_(n arrow.r infinity)(f(n)/g(n)) < infinity arrow.r.double f(n) = Theta(g(n))$
- $limits(lim)_(n arrow.r infinity)(f(n)/g(n)) > 0 arrow.r.double f(n) = Omega(g(n))$
- $limits(lim)_(n arrow.r infinity)(f(n)/g(n)) = infinity arrow.r.double f(n) = omega(g(n))$
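
An informal numeric illustration of the first rule (my own example): the ratio $log(n) / sqrt(n)$ tends to $0$, consistent with $log n = o(sqrt(n))$.

```python
# Illustration: f(n)/g(n) for f = log n, g = sqrt(n) shrinks towards 0,
# consistent with log n = o(sqrt(n)).
import math

for n in [10, 10**3, 10**6, 10**9, 10**12]:
    print(n, math.log(n) / math.sqrt(n))
```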

= Lecture 2

== Merge Sort

- MERGE-SORT $A[1..n]$
  1. If $n = 1$, done
  2. Recursively sort $A[1..ceil(n/2)]$ and $A[ceil(n/2)+1..n]$
  3. Merge the 2 sorted lists
- $T(n) =$
  - $Theta(1)$ if $n = 1$
  - $2T(n/2) + Theta(n)$ if $n > 1$

== Solving Recurrences

=== Telescoping Method

- For a sequence, $sum^(n-1)_(k=0) (a_k - a_(k+1)) = (a_0 - a_1) + (a_1 - a_2) + ... + (a_(n-1) - a_n) = a_0 - a_n$
- E.g. $T(n) = 2T(n/2) + n$
  - $T(n)/n = T(n/2)/(n/2) + 1$
  - $T(n)/n = T(1)/1 + log n$
  - $T(n) = n log n$
- General solution
  - $T(n) = a T(n/b) + f(n)$
  - Pick $g(n)$ so that $T(n)/g(n) = T(n/b)/g(n/b) + h(n)$
  - Then sum up the occurrences of $h(n)$.
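
A small numeric check of the worked example (my own sketch): solving $T(n) = 2T(n/2) + n$ with base case $T(1) = 1$ on powers of two reproduces $n log_2 n + n$ exactly, i.e. $Theta(n log n)$.

```python
# Illustration: T(n) = 2*T(n/2) + n with T(1) = 1, on powers of two,
# matches the telescoped closed form n*log2(n) + n exactly.
import math

def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in range(1, 15):
    n = 2 ** k
    assert T(n) == n * int(math.log2(n)) + n
```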

=== Recursion Tree

- Draw the tree, where each node is the $f(n)$ value of that subproblem.
- Figure out the height of the tree and the number of leaves, then sum the work level by level.

=== Master Theorem

- Put the recurrence in the form $ T(n) = a T(n/b) + f(n) $
- $a >= 1, b > 1, f$ is asymptotically positive
- Compare $f(n)$ and $n^(log_b a)$

Cases

+ $f(n) = O(n^(log_b a - epsilon)), epsilon > 0$ (if $epsilon = 0$, see case 2)
  - $f(n)$ grows polynomially slower than $n^(log_b a)$
  - $therefore T(n) = Theta(n^(log_b a))$
+ $f(n) = Theta(n^(log_b a) log^k n), k >= 0$
  - $f(n)$ and $n^(log_b a)$ grow at similar rates
  - $therefore T(n) = Theta(n^(log_b a) log^(k+1) n)$
+ $f(n) = Omega(n^(log_b a + epsilon)), epsilon > 0$
  - $f(n)$ must also satisfy the regularity condition $a f(n/b) <= c f(n)$ for some $c < 1$
  - The regularity condition ensures the total work of the subproblems is at most a constant fraction of $f(n)$
  - $f(n)$ grows polynomially faster than $n^(log_b a)$
  - $therefore T(n) = Theta(f(n))$
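
The sketch below (mine, not from the lecture; the function name and interface are made up) applies the three cases for the common special case $f(n) = n^d log^k n$.

```python
# Illustration: classify T(n) = a*T(n/b) + f(n) for f(n) = n^d * (log n)^k
# by comparing d with log_b(a).
import math

def master(a, b, d, k=0):
    crit = math.log(a, b)                # exponent of the comparison function n^(log_b a)
    if math.isclose(d, crit):            # case 2: f matches n^crit up to log factors
        return f"Theta(n^{crit:.3g} * log^{k + 1} n)"
    if d < crit:                         # case 1: f polynomially smaller
        return f"Theta(n^{crit:.3g})"
    # case 3: f polynomially larger (regularity holds for such polynomial-log f)
    return f"Theta(n^{d} * log^{k} n)" if k else f"Theta(n^{d})"

print(master(2, 2, 1))   # merge sort, T(n) = 2T(n/2) + n   -> Theta(n^1 * log^1 n)
print(master(4, 2, 1))   # T(n) = 4T(n/2) + n               -> Theta(n^2)
print(master(1, 2, 1))   # T(n) = T(n/2) + n                -> Theta(n^1)
```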

=== Substitution

- For an upper bound, guess $T(n) <= c f(n)$ and verify by induction
- For a tight bound, it can help to subtract a lower-order term, e.g. guess $T(n) <= c_2 n^2 - c_1 n$

Solve $T(n) = 4T(n/2) + n$, with $T(1) = q$:

- Guess $T(n) = O(n^3)$
- Find a constant $c$ s.t. $T(n) <= c n^3$ for all $n >= n_0$
- By induction
  - $c = max{2, q}, n_0 = 1$
  - Base case ($n = 1$): $T(1) = q <= c (1)^3$
  - Recursive case ($n > 1$):
    - Strong induction: assume $T(k) <= c k^3$ for all $1 <= k < n$
    - $T(n) = 4T(n/2) + n <= 4c(n/2)^3 + n = (c/2)n^3 + n <= c n^3$ (since $(c/2)n^3 >= n$ for $c >= 2, n >= 1$)
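
A quick numeric sanity check of the guess (my own sketch), taking $q = T(1) = 1$ so that $c = max{2, q} = 2$:

```python
# Illustration: T(n) = 4*T(n/2) + n with T(1) = 1 satisfies T(n) <= 2*n^3 on powers of two.
# (The true growth rate is Theta(n^2); n^3 is just a valid, loose upper bound.)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

c = 2
assert all(T(2 ** k) <= c * (2 ** k) ** 3 for k in range(15))
```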

= Tutorial 2

== Telescoping

- $T(n) = 4T(n/4) + n/(log n)$
  - $T(n)/n - T(n/4)/(n/4) = 1/(log n)$
  - Let $a_i = T(4^i)/4^i$ with $i = log_4 n$; then $a_i - a_(i-1) = 1/(2i)$ (since $log 4^i = 2i$)
  - $sum^(i-1)_(m=0)(a_(m+1) - a_m) = 1/(2i) + 1/(2(i-1)) + ... + 1/2 = 1/2 H_i$ ($H_i$ is the $i$-th harmonic number)
  - $a_i - a_0 = O(log i)$
  - $T(4^i) = O(4^i log i)$, so $T(n) = O(n log log n)$
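
A rough numeric check (my own sketch, base case $T(1) = 1$ assumed): the ratio $T(n)/(n log log n)$ stays bounded as $n$ runs through powers of 4.

```python
# Illustration: T(n) = 4*T(n/4) + n/log2(n) with T(1) = 1, on powers of 4;
# the bounded ratio T(n) / (n * log2(log2(n))) is consistent with O(n log log n).
import math

def T(n):
    return 1 if n == 1 else 4 * T(n // 4) + n / math.log2(n)

for i in [2, 4, 6, 8, 10]:
    n = 4 ** i
    print(n, T(n) / (n * math.log2(math.log2(n))))
```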

= Lecture 3

== Correctness of Iterative Algos using Invariants

- Initialization: the invariant is true before the first iteration of the loop
- Maintenance: if the invariant is true at the start of a loop iteration, it is true at the start of the _next iteration_ (induction)
- Termination: when the loop terminates, the invariant gives the correct answer

=== Insertion Sort

- Invariant 1: $A[1..i-1]$ holds the values of $B[1..i-1]$ in sorted order
- Invariant 2:

== Recursive Algorithms

- Usually use mathematical induction on the _size of the problem_

=== Binary Search

- Induction on $"length" n = "ub" - "lb" + 1$
- Base case
  - ```python if n <= 0: return False```
- Induction step: if $n > 0$, then `ub >= lb`
  - Assume the algorithm works for all lengths `ub-lb+1 < n`
  - `x == A[mid]: return True`, answer returned correctly
  - `x > A[mid]`: then `x` is in the array iff it is in `A[mid+1..ub]`, as $A$ is sorted. Thus by induction the answer must be `BinarySearch(A, mid+1, ub, x)`
  - `x < A[mid]`: then `x` is in the array iff it is in `A[lb..mid-1]`, as $A$ is sorted. Thus by induction the answer must be `BinarySearch(A, lb, mid-1, x)`
  - Thus, the answer returned is correct
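
A runnable rendering of the recursive procedure argued above (a sketch; names and the inclusive-bounds convention are my choices, not the lecture's exact pseudocode):

```python
# Sketch: recursive binary search over the inclusive range A[lb..ub];
# returns True iff x occurs in the sorted slice, matching the case analysis above.
def binary_search(A, lb, ub, x):
    if ub - lb + 1 <= 0:                 # base case: empty range
        return False
    mid = (lb + ub) // 2
    if x == A[mid]:                      # found
        return True
    elif x > A[mid]:                     # x can only be in the right half
        return binary_search(A, mid + 1, ub, x)
    else:                                # x can only be in the left half
        return binary_search(A, lb, mid - 1, x)

A = [1, 3, 4, 7, 9, 11]
assert binary_search(A, 0, len(A) - 1, 7) and not binary_search(A, 0, len(A) - 1, 8)
```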

== Divide and Conquer

+ Divide the problem into smaller subproblems
+ Solve the subproblems recursively (conquer)
+ Combine / use the subproblem solutions to get the solution to the full problem

- $T(n) = a T(n/b) + f(n)$
  - $a$ subproblems
  - Each subproblem has size at most $n/b$
  - $f(n)$ is the time needed to _divide_ the problem into subproblems plus the time to get the solution from the subproblems _(combine)_

=== Merge Sort

+ `MergeSort(A[lb..ub])`
+ If `ub = lb`, return
+ If `ub > lb`, $"mid" = floor(("ub"+"lb")/2)$ ($O(1)$)
+ `MergeSort(A[lb..mid])`, `MergeSort(A[mid+1..ub])` (2 subproblems, each of size at most $ceil(n/2)$)
+ Merge the 2 sorted lists (combining: $Theta(n)$)

$T(n) = 2T(n/2) + O(n)$

Complexity: $Theta(n log n)$
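
A runnable sketch of the algorithm (my own Python rendering of the pseudocode above; the `merge` helper and half-open slicing are my choices):

```python
# Sketch: merge sort as described above; merge() combines two sorted lists in Theta(n).
def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):       # repeatedly take the smaller head element
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]             # append whatever remains

def merge_sort(A):
    if len(A) <= 1:                               # base case: already sorted
        return A
    mid = len(A) // 2
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))   # T(n) = 2T(n/2) + Theta(n)

assert merge_sort([5, 2, 9, 1, 5, 6]) == [1, 2, 5, 5, 6, 9]
```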

=== Powering

- $F(a, n) = F(a, floor(n/2))^2$ if $n$ is even
- $F(a, n) = F(a, floor(n/2))^2 * F(a, 1)$ if $n$ is odd

$T(n) = T(n/2) + O(1)$, so $T(n) = O(log n)$
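
A runnable sketch of the powering scheme above (the base case $F(a, 0) = 1$ is assumed):

```python
# Sketch: compute a^n with O(log n) multiplications by squaring the half power.
def power(a, n):
    if n == 0:                       # assumed base case: a^0 = 1
        return 1
    half = power(a, n // 2)          # F(a, floor(n/2))
    return half * half if n % 2 == 0 else half * half * a

assert power(3, 10) == 3 ** 10 and power(2, 0) == 1
```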

=== Fibonacci

= Tutorial 3

= Lecture 4

== Lower bound for sorting

- Any comparison-based sorting algorithm takes $Omega(n log n)$ time in the worst case
- The decision tree must contain at least $n!$ leaves, one for every possible permutation
- Height of the binary tree $>= log(n!) = Omega(n log n)$ (since $n! >= (n/2)^(n/2)$)

= Tutorial 4

= Lecture 5

- Las Vegas algorithms
  - Output is always correct; the running time is a random variable
- Monte Carlo algorithms
  - Answer may be incorrect with small probability
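
A toy illustration of the distinction (my own example, not from the lecture): find an index holding a 1 in an array that is half ones. The Las Vegas version loops until it succeeds (always correct, random running time); the Monte Carlo version does a fixed number of trials and may give up with small probability.

```python
# Toy illustration: locate an index holding 1 in an array that is half 1s.
import random

def find_one_las_vegas(A):
    while True:                       # always correct; number of iterations is random
        i = random.randrange(len(A))
        if A[i] == 1:
            return i

def find_one_monte_carlo(A, trials=20):
    for _ in range(trials):           # fixed work; fails with probability 2^(-trials)
        i = random.randrange(len(A))
        if A[i] == 1:
            return i
    return None                       # may (rarely) give up incorrectly

A = [0, 1] * 50
assert A[find_one_las_vegas(A)] == 1
print(find_one_monte_carlo(A))
```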

= Tutorial 5

= Lecture 6

== LCS

=== Brute Force

- Check every possible subsequence of $A$, test whether it is a subsequence of $B$, and output the longest one
- $2^n$ possible subsequences, so the total time is $O(m 2^n)$

=== Recursive

Base case ($i$ is an index into $A$, $j$ is an index into $B$):

- $"LCS"(i, 0) = emptyset$
- $"LCS"(0, j) = emptyset$

If $a_n = b_m$, then $"LCS"(n, m) = "LCS"(n-1, m-1) :: a_n$

Proof by contradiction

- If the last symbol of $S = "LCS"(n, m)$ is not $a_n$, then the last symbol must be part of $a_1, ... a_(n-1)$ and $b_1, ... b_(m-1)$.
- So $S$ is a common subsequence of $a_1, ... a_(n-1)$ and $b_1, ... b_(m-1)$.
- Append $a_n$ to $S$, i.e. $S :: a_n$, to get a common subsequence that is longer by 1 (since $a_n = b_m$)
- Thus $S$ cannot be the longest common subsequence (contradiction)
- So far, we have only argued that $a_n$ must be the last symbol in $"LCS"(n, m)$
- It is fine to match $a_n$ with $b_m$ (since $a_n$ is the last symbol)
- Therefore $"LCS"(n, m) = "LCS"(n-1, m-1) :: a_n$

If $a_n != b_m$, then $"LCS"(n, m) = max("LCS"(n-1, m), "LCS"(n, m-1))$
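
A runnable sketch of this recurrence (my own Python; memoisation is added so the overlapping subproblems don't blow up, and it returns the length rather than the sequence):

```python
# Sketch: length of the LCS via the recurrence above, memoised over (i, j).
from functools import lru_cache

def lcs_length(A, B):
    @lru_cache(maxsize=None)
    def L(i, j):
        if i == 0 or j == 0:                  # base case: empty prefix
            return 0
        if A[i - 1] == B[j - 1]:              # a_i = b_j: match it
            return L(i - 1, j - 1) + 1
        return max(L(i - 1, j), L(i, j - 1))  # a_i != b_j: drop one symbol
    return L(len(A), len(B))

assert lcs_length("ABCBDAB", "BDCABA") == 4   # e.g. "BCBA"
```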

= Tutorial 6

Greedy vs DP

= Lecture 7 Greedy

== DP Recap

- Express solutions recursively
- Small (polynomial) number of subproblems
- Huge overlap among subproblems, so the naive recursion may take exponential time
- Compute the recurrence iteratively in a bottom-up fashion

== Greedy

Recast the problem so that only 1 subproblem needs to be solved at each step.

= Lecture 8 Amortized Analysis