In a noisy broadcast channel, processors communicate as follows: in each time step, one processor broadcasts a bit. Each of the other processors receives a bit, but the received bit is incorrect with some known probability p; reception errors are independent. In such an environment, a group of n broadcasters, each possessing a bit, wish to transmit all n bits to a receiver so that, with probability close to 1, the receiver learns all of the bits correctly. This can be done easily with O(n log n) broadcasts: each processor broadcasts its input bit C log n times (for a sufficiently large constant C depending on p), and the receiver recovers each bit by majority vote. This naive algorithm was improved by Gallager, who in 1988 gave a surprising algorithm that uses only O(n log log n) broadcasts. I'll talk about my proof, with Navin Goyal and Guy Kindler, that Gallager's algorithm is optimal up to a constant factor.
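As a concrete illustration of the naive O(n log n) scheme described in the abstract, here is a minimal Python sketch: each processor's bit is "broadcast" C log n times, each reception is flipped independently with probability p, and the receiver decodes by majority vote. The specific values of n, p, and C below are illustrative choices, not parameters from the talk.

```python
import math
import random

def noisy_receive(bit, p, rng):
    # The receiver gets the broadcast bit, flipped independently
    # with probability p (the channel's known error rate).
    return bit ^ (rng.random() < p)

def naive_protocol(bits, p, C, rng):
    # Each of the n processors broadcasts its input bit C*log(n) times;
    # the receiver decodes each bit by majority vote over its noisy copies.
    # Total broadcasts: n * ceil(C*log(n)) = O(n log n).
    n = len(bits)
    reps = max(1, math.ceil(C * math.log(n)))
    decoded = []
    for b in bits:
        ones = sum(noisy_receive(b, p, rng) for _ in range(reps))
        decoded.append(1 if ones > reps / 2 else 0)
    return decoded

rng = random.Random(0)
n, p, C = 100, 0.2, 30  # illustrative parameters; C must grow as p nears 1/2
bits = [rng.randint(0, 1) for _ in range(n)]
# With C log n repetitions per bit, a Chernoff bound makes each majority
# vote fail with probability at most n^{-Omega(C)}, so a union bound over
# the n bits leaves the receiver correct with probability close to 1.
print(naive_protocol(bits, p, C, rng) == bits)
```

Note that majority decoding only makes sense for p < 1/2; Gallager's O(n log log n) protocol beats this scheme by spending most of its broadcasts more cleverly than independent repetition.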
Michael Saks is a professor in the Mathematics Department at Rutgers University. His interests range over a number of areas of the theory of computing and discrete mathematics. Most recently he's been thinking about lower bounds in concrete computational models (decision trees, communication complexity), polynomial identity testing, exponential algorithms for NP-hard problems, and various algorithmic problems related to monotonicity.