Discrete Mathematics

Dynamic Programming

  • Optimal substructure. The optimal solution to a problem consists of optimal solutions to subproblems.
  • Overlapping subproblems. Few distinct subproblems in total, but many recurring instances of each (see the memoization sketch below).
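
A minimal sketch of both properties, using Fibonacci numbers as a stand-in example (not from the notes): memoization stores each of the few distinct subproblems once, even though the naive recursion revisits them many times.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    """Optimal substructure: fib(n) is built from fib(n-1) and fib(n-2).
    Overlapping subproblems: the cache keeps each distinct call, so the
    exponential recursion tree collapses to O(n) distinct subproblems."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025
```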

Bellman-Ford

Single-source shortest paths; negative edge weights are allowed as long as no negative cycle is reachable.

$\text{OPT}[v,k] = \min\Bigl(\text{OPT}[v, k-1],\ \min_{(u,v) \in E}\bigl(\text{OPT}[u, k-1] + w(u,v)\bigr)\Bigr)$

where $\text{OPT}[v,k]$ is the length of a shortest path from the source to $v$ using at most $k$ edges.
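
A minimal sketch of the recurrence, assuming vertices are numbered 0..n-1 and the graph is an edge list (the name `bellman_ford` is illustrative); a single rolling distance table plays the role of $\text{OPT}[\cdot, k]$.

```python
from math import inf

def bellman_ford(n, edges, s):
    """edges: list of (u, v, w) with possibly negative weights w.
    Returns shortest distances from s, or None if a reachable
    negative cycle makes them undefined."""
    dist = [inf] * n
    dist[s] = 0
    for _ in range(n - 1):                  # k = 1 .. n-1 rounds of the recurrence
        changed = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:       # OPT[v,k] = min(OPT[v,k-1], OPT[u,k-1] + w)
                dist[v] = dist[u] + w
                changed = True
        if not changed:
            break
    for u, v, w in edges:                   # one extra pass detects a negative cycle
        if dist[u] + w < dist[v]:
            return None
    return dist

print(bellman_ford(4, [(0, 1, 4), (0, 2, 5), (1, 2, -3), (2, 3, 2)], 0))  # [0, 4, 1, 3]
```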

Chain Matrix Multiplication

$\text{OPT}[i, j] = \min_{i \leq k < j}\bigl(\text{OPT}[i, k] + \text{OPT}[k+1, j] + p_{i-1}\,p_k\,p_j\bigr)$

where matrix $A_i$ is $p_{i-1} \times p_i$ and the last term is the cost of combining the two halves.
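
A minimal sketch of the recurrence, assuming the chain is described by the dimension list $p$ where matrix $A_i$ is $p_{i-1} \times p_i$ (the name `matrix_chain_order` is illustrative).

```python
def matrix_chain_order(p):
    """Minimum number of scalar multiplications to compute A_1 A_2 ... A_n,
    where A_i has shape p[i-1] x p[i]."""
    n = len(p) - 1
    OPT = [[0] * (n + 1) for _ in range(n + 1)]   # OPT[i][j], 1-indexed
    for length in range(2, n + 1):                # length of the subchain
        for i in range(1, n - length + 2):
            j = i + length - 1
            OPT[i][j] = min(
                OPT[i][k] + OPT[k + 1][j] + p[i - 1] * p[k] * p[j]  # split at k
                for k in range(i, j)
            )
    return OPT[1][n]

print(matrix_chain_order([10, 30, 5, 60]))  # 4500, via (A1 A2) A3
```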

Min Cut

The set of vertices reachable from the source $s$ in the residual graph is one side of the partition. Cut capacity: $\text{cap}(A, B) = \sum_{e \text{ out of } A} c(e)$.

$|f| = \sum_{e \text{ out of } A} f(e) - \sum_{e \text{ into } A} f(e)$
  • Max-flow min-cut: $\max_f |f| = \min_{(A,B)} \text{cap}(A, B)$ (see the cut-extraction sketch below).
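
A minimal sketch of recovering the cut once a maximum flow is known, assuming the residual graph is given as a dict of dicts of residual capacities (a representation chosen here for illustration): the vertices reachable from $s$ form $A$, and every original edge leaving $A$ is saturated.

```python
from collections import deque

def min_cut_side(residual, s):
    """residual[u][v] = remaining capacity of (u, v) after a max flow.
    Returns A, the set of vertices reachable from s along edges with
    positive residual capacity; (A, V \\ A) is a minimum cut."""
    A, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        for v, cap in residual.get(u, {}).items():
            if cap > 0 and v not in A:
                A.add(v)
                queue.append(v)
    return A
```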

Ford-Fulkerson

Given $(G, s, t, c)$ with integer capacities $c : E \to \mathbb{N}$, start with $f(u,v) = 0$ and $G_f = G$. While there is an augmenting path in $G_f$: find its bottleneck capacity, augment the flow along the path, and update the residual graph $G_f$. Each iteration costs $\mathcal{O}(V+E)$ and raises the flow value by at least 1, so the total time is $\mathcal{O}(|f|(V+E))$, where $|f|$ is the maximum flow value.
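
A minimal Ford-Fulkerson sketch, assuming integer capacities stored as a dict of dicts (`max_flow` and `find_path` are illustrative names, not from the notes): repeatedly find an augmenting path by DFS, push the bottleneck amount, and maintain reverse residual edges.

```python
def find_path(residual, s, t, visited):
    """DFS for any s -> t path with positive residual capacity."""
    if s == t:
        return [t]
    visited.add(s)
    for v, cap in residual.get(s, {}).items():
        if cap > 0 and v not in visited:
            rest = find_path(residual, v, t, visited)
            if rest is not None:
                return [s] + rest
    return None

def max_flow(capacity, s, t):
    """Ford-Fulkerson on capacity[u][v]; returns the max-flow value."""
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():          # add reverse edges with 0 capacity
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        path = find_path(residual, s, t, set())
        if path is None:
            return flow
        bottleneck = min(residual[u][v] for u, v in zip(path, path[1:]))
        for u, v in zip(path, path[1:]):      # augment along the path
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

cap = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}}
print(max_flow(cap, 's', 't'))  # 5
```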

Edmonds-Karp

Ford-Fulkerson, but choose a shortest (fewest-edges) augmenting path, found by BFS; this bounds the number of augmentations by $\mathcal{O}(VE)$ and the total time by $\mathcal{O}(VE^2)$.
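
A minimal sketch of the only change relative to the Ford-Fulkerson sketch above (same illustrative dict-of-dicts residual representation): find the augmenting path by BFS so it has the fewest edges.

```python
from collections import deque

def shortest_augmenting_path(residual, s, t):
    """BFS over edges with positive residual capacity; returns a
    fewest-edges s -> t path as a list of vertices, or None."""
    parent = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:                      # walk the parent pointers back to s
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v, cap in residual.get(u, {}).items():
            if cap > 0 and v not in parent:
                parent[v] = u
                queue.append(v)
    return None
```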

For any flow $f$ and any cut $(A,B)$: $|f| \leq \text{cap}(A,B)$ (weak duality). For any flow $f$ and any cut $(A,B)$: $|f| = \sum_{v} f(s, v) = \sum_{u \in A,\ v \in B} f(u, v) - \sum_{u \in A,\ v \in B} f(v, u)$ (flow value lemma).

Solving by Reduction

To reduce a problem $Y$ to a problem $X$ ($Y \leq_{p} X$), we want a function $f$ that maps instances of $Y$ to instances of $X$ such that $f$ is polynomial-time computable and, for every instance $y$ of $Y$, $y$ is solvable if and only if $f(y)$ is solvable.

Reduction to Network Flow

Describe how to construct a flow network. Claim "This is feasible if and only if the max flow is …". Prove both directions.

Bipartite Matching to Max Flow

  • Max Matching ⟹ Max Flow. If $k$ edges are matched, then there is a flow of value $k$.
  • Max Matching ⟸ Max Flow. If there is a flow $f$ of value $k$, there is a matching with $k$ edges.
  • Ford-Fulkerson on the constructed network runs in $\mathcal{O}(|f| (E' + V'))$.
  • $|f| \leq V$ (one unit of flow per matched edge).
  • $V' = 2V + 2$ (both sides plus $s$ and $t$).
  • $E' = E + 2V$ (the original edges plus the edges out of $s$ and into $t$).
  • $\mathcal{O}(V (E + 2V + 2V + 2)) = \mathcal{O}(V E + V^2) = \mathcal{O}(V E)$ for $E \geq V$; see the construction sketch below.
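
A minimal sketch of the construction, assuming the bipartite graph is given as left/right vertex lists plus an edge list (`matching_network`, `'S'`, and `'T'` are illustrative names): all capacities are 1, and running any max-flow routine such as the Ford-Fulkerson sketch above on the result gives the maximum matching size.

```python
def matching_network(left, right, edges):
    """Unit-capacity flow network for bipartite matching, in the same
    dict-of-dicts format as the max-flow sketch above."""
    capacity = {'S': {u: 1 for u in left}}      # source -> every left vertex
    for u in left:
        capacity[u] = {}
    for u, v in edges:                          # original edges, left -> right
        capacity[u][v] = 1
    for v in right:
        capacity[v] = {'T': 1}                  # every right vertex -> sink
    return capacity, 'S', 'T'

cap, s, t = matching_network(['a', 'b'], ['x', 'y'], [('a', 'x'), ('b', 'x'), ('b', 'y')])
# max_flow(cap, s, t) == 2, matching {a-x, b-y}
```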

Circulation

  • $d(v) > 0$ if $v$ is a demand node.
  • $d(v) < 0$ if $v$ is a supply node.
  • Capacity constraint: $0 \leq f(e) \leq c(e)$.
  • Conservation constraint: $f^{\text{in}} (v) - f^{\text{out}} (v) = d(v)$.
  • Every feasible circulation satisfies $\sum_{v \in V} d(v) = 0$. There is a feasible circulation with demands $d(v)$ in $G$ if and only if the maximum $s$-$t$ flow in $G'$ has value $D = \sum_{d(v)>0} d(v)$, where $G'$ adds a source $s$ with an edge of capacity $-d(v)$ to each supply node and a sink $t$ with an edge of capacity $d(v)$ from each demand node (see the sketch below).
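
A minimal sketch of building $G'$ from $G$ and the demands, in the same illustrative dict-of-dicts format as the earlier flow sketches (`'S*'` and `'T*'` are made-up labels for the added source and sink); feasibility then amounts to checking that the max flow equals $D$.

```python
def circulation_to_flow(capacity, demand):
    """capacity[u][v] = c(u, v); demand[v] = d(v), positive for demand
    nodes and negative for supply nodes. Returns the equivalent
    max-flow instance (capacity', source, sink, D)."""
    cap = {u: dict(nbrs) for u, nbrs in capacity.items()}
    cap['S*'] = {}
    D = 0
    for v, d in demand.items():
        if d < 0:                                # supply: edge S* -> v, capacity -d(v)
            cap['S*'][v] = -d
        elif d > 0:                              # demand: edge v -> T*, capacity d(v)
            cap.setdefault(v, {})['T*'] = d
            D += d
    return cap, 'S*', 'T*', D

# feasible circulation exists  iff  max_flow(cap, 'S*', 'T*') == D
```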

Circulation with Demands and Lower Bounds

  • Capacity constraint: $l(e) \leq f(e) \leq c(e)$.
  • Conservation constraint: $f^{\text{in}} (v) - f^{\text{out}} (v) = d(v)$.
  • Given $G$ with lower bounds, subtract the lower bound $l(e)$ from the capacity of each edge.
  • Subtract $L(v) = f_0^{\text{in}}(v) - f_0^{\text{out}}(v)$ from the demand of each node, where $f_0(e) = l(e)$ is the flow that just saturates the lower bounds.
  • Solve the circulation problem (without lower bounds) on this new graph to get a flow $f$.
  • Add $l(e)$ to every $f(e)$ to get a flow for the original graph (see the transformation sketch below).
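
A minimal sketch of the transformation, under the same representation assumptions as above, with `lower[u][v]` holding $l(u,v)$ (defaulting to 0); the output feeds directly into the circulation-with-demands reduction.

```python
def strip_lower_bounds(capacity, lower, demand):
    """Reduce circulation with lower bounds to plain circulation with
    demands: c'(e) = c(e) - l(e) and d'(v) = d(v) - L(v)."""
    new_cap = {u: dict(nbrs) for u, nbrs in capacity.items()}
    new_dem = dict(demand)
    for u, nbrs in capacity.items():
        for v in nbrs:
            l = lower.get(u, {}).get(v, 0)
            new_cap[u][v] -= l                   # shrink the capacity by l(e)
            # the saturating flow f0(e) = l(e) adds l to f0_in(v) and to f0_out(u)
            new_dem[v] = new_dem.get(v, 0) - l
            new_dem[u] = new_dem.get(u, 0) + l
    return new_cap, new_dem
```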

Existence of a Linear Programming Solution

  • Given a linear programming problem with feasible set $S$ and objective function $P$:
  • If $S$ is empty, the linear program has no solution.
  • If $S$ is unbounded, the linear program may or may not have an optimal solution.
  • If $S$ is non-empty and bounded, the linear program has one or more optimal solutions.

Standard Form Linear Programming

  • maxโก(c1x1+โ‹ฏ+cnxn)\max (c_1x_1 + \cdots + c_nx_n)
  • subject to
  • a11x1+โ‹ฏ+a1nxnโ‰คb1a_{11}x_1 + \cdots + a_{1n}x_n โ‰ค b_1
  • all the way to
  • am1x1+โ‹ฏ+amnxnโ‰คbma_{m1}x_1 + \cdots + a_{mn}x_n โ‰ค b_m.
  • Also, x1โ‰ฅ0,โ‹ฏโ€‰,xnโ‰ฅ0x_1 \geq 0, \cdots, x_n \geq 0.
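
A small concrete instance in standard form (the numbers are illustrative, not from the notes); its optimum is $12$, attained at $(x_1, x_2) = (4, 0)$, and its dual appears after the Dual Program section below.

$$
\begin{aligned}
\max \quad & 3x_1 + 2x_2 \\
\text{s.t.} \quad & x_1 + x_2 \leq 4 \\
& x_1 + 3x_2 \leq 6 \\
& x_1, x_2 \geq 0
\end{aligned}
$$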

Matrix Form Linear Programming

  • maxโก(cTx)\max(c^T x)
  • subject to
  • Axโ‰คbAx \leq b
  • xโ‰ฅ0x \geq 0
  • xโˆˆRx \in \mathbb{R}
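
A minimal sketch of solving the concrete instance above in this matrix form with `scipy.optimize.linprog` (assuming SciPy is available); since `linprog` minimizes, the objective vector is negated.

```python
import numpy as np
from scipy.optimize import linprog

# max 3x1 + 2x2  s.t.  x1 + x2 <= 4,  x1 + 3x2 <= 6,  x >= 0
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0], [1.0, 3.0]])
b = np.array([4.0, 6.0])

res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)
print(res.x, -res.fun)  # approximately [4. 0.] and 12.0
```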

Integer Linear Program

  • All variables are required to be integers ($x \in \mathbb{N}$).
  • Solving it is NP-Hard.

Dual Program

  • minโก(bTy)\min(b^T y)
  • subject to
  • ATyโ‰ฅcA^T y \geq c
  • yโ‰ฅ0y \geq 0
  • yโˆˆRy \in \mathbb{R}
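
For the concrete primal instance sketched earlier, the dual works out as follows; its optimum is also $12$, at $(y_1, y_2) = (3, 0)$, which is what weak and strong duality predict.

$$
\begin{aligned}
\min \quad & 4y_1 + 6y_2 \\
\text{s.t.} \quad & y_1 + y_2 \geq 3 \\
& y_1 + 3y_2 \geq 2 \\
& y_1, y_2 \geq 0
\end{aligned}
$$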

Weak Duality

  • The optimum of the dual is an upper bound on the optimum of the primal.
  • $\text{OPT}(\text{primal}) \leq \text{OPT}(\text{dual})$.

Strong Duality

  • Let P and D be the primal and dual linear programs, respectively. If both P and D are feasible, then their optimal values coincide: $c^T x^* = b^T y^*$.

If a standard-form problem and its dual are both feasible, then both are feasibly bounded. If one of them has an unbounded solution, then the other is infeasible.

Possibilities of Feasibility

| P \ D | Feasibly Bounded | Feasibly Unbounded | Infeasible |
| --- | --- | --- | --- |
| Feasibly Bounded | Possible | Impossible | Impossible |
| Feasibly Unbounded | Impossible | Impossible | Possible |
| Infeasible | Impossible | Possible | Possible |

P vs. NP

  • P = set of poly-time solvable problems.
  • NP = set of poly-time verifiable problems.
  • Decision Knapsack is NP-complete, whereas undecidable problems like the Halting Problem are NP-Hard but not in NP.
  • X is NP-Hard if $\forall Y \in \text{NP}$, $Y \leq_p X$.
  • X is NP-complete if X is NP-Hard and $X \in \text{NP}$.
  • If P = NP, then P = NP-complete.
  • NP problems can be solved in poly-time by a nondeterministic Turing machine (NDTM).
  • NP problems can be verified in poly-time by a deterministic Turing machine (DTM).

Polynomial Reduction

To reduce a decision problem Y to a decision problem X ($Y \leq_p X$), find a function $f$ that maps instances of Y to instances of X such that $f$ is poly-time computable and, for every instance $y$ of Y, $y$ is a YES instance if and only if $f(y)$ is a YES instance of X.

Proving NP-Complete

Show $X \in \text{NP}$, pick a known NP-complete problem $Y$, and show $Y \leq_p X$.

Conjunctive Normal Form SAT

  • Conjunction of clauses.
  • Literal: Variable or its negation.
  • Clause: Disjunction of literals.
  • 3-SAT. Each clause has at most three literals.
  • $(X_1 \lor \neg X_3) \land (X_1 \lor X_2 \lor X_4) \land \cdots$

Independent Set

  • Independent Set is a set of vertices in a graph, no two of which are adjacent.
  • The max independent set asks for the size of the largest independent set in a given graph.

Clique

  • Complete graph = every pair of vertices is connected by an edge.
  • The max clique problem asks for the number of vertices in the largest complete subgraph of a given graph.

Vertex Cover

  • A vertex cover of a graph is a set of vertices that includes at least one endpoint of every edge.
  • The min vertex cover problem asks for the size of the smallest vertex cover in a graph (a sketch relating these three problems follows).
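
A minimal sketch, using an illustrative edge-list representation, of the standard relationships behind reductions among these problems: $S$ is an independent set in $G$ iff $S$ is a clique in the complement of $G$, iff $V \setminus S$ is a vertex cover of $G$.

```python
from itertools import combinations

def is_independent_set(edges, S):
    """No edge has both endpoints inside S."""
    return all(not (u in S and v in S) for u, v in edges)

def is_clique(edges, S):
    """Every pair inside S is an edge."""
    E = {frozenset(e) for e in edges}
    return all(frozenset((u, v)) in E for u, v in combinations(S, 2))

def is_vertex_cover(edges, S):
    """Every edge has at least one endpoint in S."""
    return all(u in S or v in S for u, v in edges)

V = {1, 2, 3, 4}
E = [(1, 2), (2, 3), (3, 4)]
comp_E = [(u, v) for u, v in combinations(sorted(V), 2)
          if (u, v) not in E and (v, u) not in E]

S = {1, 3}
print(is_independent_set(E, S))       # True
print(is_clique(comp_E, S))           # True: S is a clique in the complement
print(is_vertex_cover(E, V - S))      # True: V \ S covers every edge
```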

Hamiltonian Cycle

  • A cycle that visits each vertex exactly once.
  • Hamiltonian Path visits each vertex exactly once and doesn't need to return to its starting point.

Graph Coloring

  • Given $G$, can you color the nodes with $< k$ colors such that the endpoints of every edge are colored differently?