[SOUND] As a warm-up to our study of the maxcut problem, let us start with a very simple algorithm. This algorithm is called RANDOM. You can probably already guess what this algorithm is, but let me spell it out for you. In the first step, the algorithm constructs a partition of the graph's vertices into two sets. For each vertex it decides whether to put it in S or not: each vertex goes into S with probability 1/2, and into the complement of S with probability 1/2, and these decisions are made independently for every vertex. In the second step, you do nothing. You're done; you just output the resulting cut. Really, the algorithm cannot be much simpler than this. And the theorem is: it is a 2-approximation. You are within a factor of 2 of the optimal cut. Let us see the proof, also a simple proof, in two parts: upper bounding OPT, and lower bounding the output of the algorithm. Upper bounding OPT: what is the best you can hope for? The best you can hope for is to get every edge of the graph into the cut, and in that case the value of the resulting cut is the sum, over every edge of the graph, of the edge weight. That is our upper bound on OPT. Simple. Next, the lower bound on the cut output by the algorithm. Since the output is random, let us lower bound the expected value of the output cut. The value of the output cut is the sum, over every edge of the graph, of the weight of that edge times either 1 or 0: 1 if the edge crosses the cut, 0 if it does not. What is the expected value of the output? By linearity of expectation, it is the sum, over every edge of the graph, of the weight of the edge times the probability that the edge crosses the cut. How do we compute the probability that a given edge {i,j} crosses the cut? We use independence. As always, if we want to avoid making mistakes with a randomized algorithm, we decompose the probability space into pieces, into atomic events. So let's look at all the possibilities for {i,j}.
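The first step above is easy to sketch in code. Here is a minimal Python version of the RANDOM algorithm, assuming (for illustration only; the lecture fixes no representation) that the graph is given as a list of (i, j, weight) triples:

```python
import random

def random_cut(n, edges):
    """The RANDOM algorithm: put each of the n vertices in S with
    probability 1/2, independently, and output the resulting cut.

    edges: list of (i, j, weight) triples -- a representation assumed
    for this sketch. Returns the side of each vertex and the cut value.
    """
    # Step 1: each vertex goes into S independently with probability 1/2.
    in_S = [random.random() < 0.5 for _ in range(n)]
    # Step 2: do nothing. Just report the weight of the edges that cross.
    value = sum(w for i, j, w in edges if in_S[i] != in_S[j])
    return in_S, value
```

On a triangle with unit weights, for instance, the total edge weight is 3, so the expected cut value should be 3/2, matching the analysis that follows.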
And let's see, for each possibility, whether the edge {i,j} crosses the cut and what the probability of that event is. Either i is in S or not, with probability 1/2 each; either j is in S or not, with probability 1/2 each. Two times two possibilities equals four possible events. In the first case, i in S and j in S, the edge does not cross: it is entirely on one side. In the fourth case, i and j both not in S, the edge does not cross: it is entirely on the other side. In the other two cases, the edge crosses the cut. And what are the probabilities of those two cases? By independence, each is the product 1/2 times 1/2 = 1/4. So the probability that {i,j} crosses the cut is 1/4 + 1/4 = 1/2. Therefore, plugging that in, the expected value of the output is 1/2 times the sum of the edge weights. Let's put all this together. Easy, two lines: OPT is at most the sum of the edge weights, and the output is, in expectation, half of the sum of the edge weights. Therefore, the expected value of the output is at least half of OPT. That proves the factor of 2, and that proves the approximation guarantee. Can we do better than this? Let's try greedy. Greedy: you take the vertices one by one, and for each of them you decide whether to put it on the left or on the right, depending on the neighbors that have already been placed left or right: which side looks better, which side gives you the most weight? Again, a 2-approximation. Okay, let's try something else: local search. We start from some partition, and then we try to move vertices between sides: we move a vertex to the other side whenever that gives a cut with better value, until we reach a local optimum. Again, a 2-approximation. So all the simple algorithms give you a 2-approximation, and we need to be smarter to do better. In the next part, we will see a particular linear programming relaxation for maxcut.