online advertising

Saturday, July 30, 2016

Coursera Approximation Algorithms Part II - Peer Assignment 2



(1) There is a constraint for each cut in the primal program, so there is a variable $ y_c $ for each cut $ c $ in the dual program.

The primal program is a minimization, so the dual program is a maximization; its objective is

$ \max \sum\limits_{c \in S}{y_c} $

There is a variable for each edge in the primal program, so there is a constraint for each edge, as follows:

$ \forall e \in E : \sum\limits_{c \in S : e \in \delta(c)}{y_c} \le w(e) $

Of course, we also have the non-negativity constraints

$ y_c \ge 0 $.

So that is the dual program.
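Collecting the pieces (in the document's notation, where $ S $ denotes the collection of cuts), the dual program reads:

```latex
\begin{aligned}
\max \quad & \sum_{c \in S} y_c \\
\text{subject to} \quad & \sum_{c \in S : e \in \delta(c)} y_c \le w(e) && \forall e \in E \\
& y_c \ge 0 && \forall c \in S
\end{aligned}
```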

(2) Whenever the dual variable for a cut is increased, we add one edge to $ F $, and we never remove an edge from $ F $, so we never consider the same cut again. Therefore, each dual variable is increased at most once over the whole run of the algorithm.

(3) In (2) we said that we add a new edge to $ F $ in each iteration. There are only $ |E| $ edges, so the algorithm must terminate after at most $ |E| $ iterations.

The base case is easy to prove - a graph with only one edge has exactly two nodes and one edge, and is therefore a tree.

The induction case is simple too - if we start from a tree (say $ k $ nodes and $ k - 1 $ edges) and add one edge connecting a node outside $ F $ to a node in $ F $, the new connected component has $ k + 1 $ nodes and $ k $ edges, which is still a tree.

Therefore we conclude $ F $ is always a tree.

(4) The primal program is the linear programming relaxation, and it is a minimization, so we have

$ val(x^*) \le OPT $.

By strong duality, we have

$ val(y^*) = val(x^*) \le OPT $.

(5) At the beginning of the algorithm, $ y = 0 $, which is feasible. At each iteration, the algorithm maintains dual feasibility by stopping the increase of the dual variable before any constraint is violated. Therefore, at the end of the loop, the dual solution is still feasible. Pruning does not change the dual variable values. Therefore, we conclude that at the end of the algorithm, the dual solution is feasible.
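The loop described in (1)-(5) can be sketched in code. This is a minimal sketch, not the assignment's reference implementation: the adjacency-list graph format, the function name, and the bookkeeping are my own assumptions. It raises the dual variable of the current cut until one constraint becomes tight, then adds the newly tight edge to $ F $.

```python
# Minimal primal-dual sketch for s-t shortest path (assumed graph format:
# node -> list of (neighbor, weight) pairs; assumes t is reachable from s).
def primal_dual_shortest_path(graph, s, t):
    # slack[e] = w(e) minus the sum of y_c over cuts c with e in delta(c)
    slack = {(u, v): w for u in graph for v, w in graph[u]}
    S = {s}        # nodes reached by tight edges
    F = []         # tight edges picked so far; always a tree (question 3)
    duals = []     # (cut, y_c) pairs with y_c > 0
    while t not in S:
        crossing = [e for e in slack if e[0] in S and e[1] not in S]
        eps = min(slack[e] for e in crossing)   # stop before any constraint is violated
        duals.append((frozenset(S), eps))       # raise y for the cut delta(S) by eps
        for e in crossing:
            slack[e] -= eps
        u, v = next(e for e in crossing if slack[e] == 0)  # newly tight edge
        F.append((u, v))
        S.add(v)
    return F, duals    # pruning would then keep only the s-t path inside F

# Toy example (made-up weights): shortest s-t path is s -> a -> b -> t.
graph = {'s': [('a', 1), ('b', 4)], 'a': [('b', 2), ('t', 6)], 'b': [('t', 3)], 't': []}
F, duals = primal_dual_shortest_path(graph, 's', 't')
```

On this toy instance the dual values raised sum to the length of the shortest path, matching the duality-gap argument later in the post.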

(6) The dual program is a maximization, so every feasible solution has value at most the optimal value

$ val(y) \le val(y^*) $

Therefore we can combine it with (5) to get

$ val(y) \le val(y^*) = val(x^*) \le OPT $.

(7) The edge $ e \in P $ must have been added to $ F $ at some iteration of the algorithm. At that point in time, we have:

$ \sum\limits_{c \in S : e \in \delta(c)}{y_c} = w(e) $

Because we maintain dual feasibility and never decrease a dual variable, this equality is preserved throughout the rest of the algorithm, so it still holds at the end.

(8) For this question, we simply take a sum over all edges in $ P $

$ \sum\limits_{e \in P}{\sum\limits_{c \in S : e \in \delta(c)}{y_c}} = \sum\limits_{e \in P}{w(e)} $

(9) We wish to swap the order of summation. However, the inner index set depends on $ e $, so the sums cannot be swapped directly. To break that dependency, we introduce an indicator function:

$ I(c, e) = 1 $ if $ e \in \delta (c) $, and $ I(c, e) = 0 $ otherwise.

$ \sum\limits_{e \in P}{\sum\limits_{c \in S : e \in \delta(c)}{y_c}} = \sum\limits_{e \in P}{\sum\limits_{c \in S}{I(c, e) y_c}} $

Now the index sets are independent of each other, so we can freely swap the sums

$ \sum\limits_{e \in P}{\sum\limits_{c \in S}{I(c, e) y_c}}
= \sum\limits_{c \in S}{\sum\limits_{e \in P}{I(c, e) y_c}}  $

We can now factor $ y_c $ out of the inner sum, since it does not depend on $ e $.

$ \sum\limits_{c \in S}{\sum\limits_{e \in P}{I(c, e) y_c}} = \sum\limits_{c \in S}{y_c \sum\limits_{e \in P}{I(c, e)}} $

The inner sum is simply counting the number of edges in both $ P $ and $ \delta(c) $, therefore it is simply $ |P \cap \delta(c)| $.

Therefore, the final sum is

$ \sum\limits_{e \in P}{\sum\limits_{c \in S : e \in \delta(c)}{y_c}} = \sum\limits_{c \in S}{y_c |P \cap \delta(c)|} $
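A quick numeric sanity check of the swap on a toy instance (the cuts, dual values, and path below are made up for illustration; $ \delta(c) $ is taken to be the set of edges leaving $ c $):

```python
# Toy instance (made up for illustration): dual values on two cuts and a path P.
y = {frozenset({'s'}): 1.0, frozenset({'s', 'a'}): 2.0}
P = [('s', 'a'), ('a', 't')]

def indicator(c, e):
    u, v = e
    return 1 if (u in c and v not in c) else 0   # I(c, e): does e leave the cut c?

# Before the swap: sum over edges, then over cuts crossing each edge.
lhs = sum(sum(y_c for c, y_c in y.items() if indicator(c, e)) for e in P)
# After the swap: sum over cuts of y_c * |P intersect delta(c)|.
rhs = sum(y_c * sum(indicator(c, e) for e in P) for c, y_c in y.items())
print(lhs, rhs)
```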

(10) At any time in the algorithm, $ F $ is a tree. If $ y_c > 0 $, then we increased $ y_c $ at some step of the algorithm, and at that step the path crossed the cut once. If the path crossed the cut a second time, then $ F $ would contain a cycle, which is impossible since $ F $ is a tree. Therefore $ |P \cap \delta(c)| \ge 2 $ is impossible.

(11) Question 6 gives us this

$ val(y) \le val(y^*) = val(x^*) \le OPT $.

Question 8 gives us this:

$ \sum\limits_{e \in P}{\sum\limits_{c \in S : e \in \delta(c)}{y_c}} = \sum\limits_{e \in P}{w(e)} $

Question 9 gives us this:

$ \sum\limits_{e \in P}{\sum\limits_{c \in S : e \in \delta(c)}{y_c}} = \sum\limits_{c \in S}{y_c |P \cap \delta(c)|} $

Question 10 gives us this:

$ |P \cap \delta(c)| \le 1 $ if $ y_c > 0 $.

Chaining these together, we get

$ \sum\limits_{c \in S}{y_c} \ge \sum\limits_{c \in S}{y_c |P \cap \delta(c)|} =  \sum\limits_{e \in P}{\sum\limits_{c \in S : e \in \delta(c)}{y_c}} = \sum\limits_{e \in P}{w(e)} $

On the left-hand side we have the dual value $ val(y) $; on the right-hand side we have the weight of the path $ P $, a feasible (integral) primal solution, so $ OPT \le \sum_{e \in P} w(e) $. Combining this with (6) gives $ val(y) \le OPT \le \sum_{e \in P} w(e) \le val(y) $, forcing equality throughout. The duality gap is 0, which means the path we found is optimal!

(12) The pruning is important because we needed the bound $ |P \cap \delta(c)| \le 1 $ in question 10, which relies on $ P $ being a simple path inside the tree $ F $.

(13) The first edge added to $ F $ by the primal-dual algorithm is the shortest edge out of $ s $, because it is the first edge whose dual constraint becomes tight.

The first edge added to $ D $ by Dijkstra's algorithm is the shortest edge out of $ s $, by definition.

Therefore, the first edge added to $ F $ is the same first edge added to $ D $.

(14) The node $ j $ with minimal value of $ l(j) $.

(15) The edge $ (i, j) $ such that $ i \in S $, $ j \notin S $, and $ l(i) + w((i, j)) $ is minimal - that is, the edge whose dual constraint becomes tight first.

(16) Suppose we have added $ (i, j) $ to $ F $. For every node $ k $ adjacent to $ j $, $ l(k) $ should be updated if $ l(j) + w(j, k) < l(k) $; in that case, set $ l(k) = l(j) + w(j, k) $.

(17) The relaxation rule and the selection rule above are identical to what Dijkstra's algorithm does, so the order of adding edges to $ F $ and $ D $ is identical.

(18) Using a Fibonacci heap, Dijkstra's algorithm can be implemented in $ O(|E| + |V|\log|V|) $ time. 
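Python's standard library has no Fibonacci heap, but the same algorithm with a binary heap (`heapq`) runs in $ O((|V| + |E|) \log |V|) $ time, which is usually fast enough in practice. A minimal sketch, with the adjacency-list graph format as an assumption:

```python
import heapq

# Dijkstra with a binary heap (assumed graph format: node -> list of
# (neighbor, weight) pairs). Stale heap entries are skipped lazily.
def dijkstra(graph, s):
    dist = {s: 0}
    heap = [(0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale entry, already improved
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd              # relaxation: improve tentative distance
                heapq.heappush(heap, (nd, v))
    return dist

# A made-up toy graph: shortest s-t distance is 6 via s -> a -> b -> t.
graph = {'s': [('a', 1), ('b', 4)], 'a': [('b', 2), ('t', 6)], 'b': [('t', 3)], 't': []}
dist = dijkstra(graph, 's')
```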
