### Abstract

We prove the first non-trivial (superlinear) lower bound in the noisy broadcast model of distributed computation. In this model, there are n + 1 processors P_0, P_1, …, P_n. Each P_i, for i ≥ 1, initially has a private bit x_i, and the goal is for P_0 to learn f(x_1, …, x_n) for some specified function f. At each time step, a designated processor broadcasts some function of its private bit and the bits it has heard so far. Each broadcast is received by the other processors, but each reception may be corrupted by noise. In this model, Gallager [16] gave a noise-resistant protocol that allows P_0 to learn the entire input in O(n log log n) broadcasts. We prove that Gallager's protocol is optimal up to a constant factor. Our lower bound follows from a lower bound in a new model, the generalized noisy decision tree model, which may be of independent interest.
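The model described above can be illustrated with a small simulation. The sketch below implements the naive repetition protocol (each P_i broadcasts its bit Θ(log n) times and P_0 takes a majority vote), which uses O(n log n) broadcasts — the baseline that Gallager's O(n log log n) protocol improves on. All function names and the noise rate `eps` are illustrative choices, not from the paper:

```python
import random

def noisy_reception(bit, n_receivers, eps, rng):
    """One broadcast: each receiver independently hears the bit
    flipped with probability eps (the noise model from the abstract)."""
    return [bit ^ (rng.random() < eps) for _ in range(n_receivers)]

def repetition_protocol(x, k, eps, rng):
    """Naive protocol: each P_i broadcasts x_i a total of k times;
    P_0 decodes each bit by majority vote over its k noisy receptions.
    Total cost: len(x) * k broadcasts; k = Theta(log n) suffices for
    P_0 to recover the whole input with constant error probability."""
    recovered = []
    for bit in x:
        # Only P_0's reception matters for decoding, so keep 1 receiver.
        heard = [noisy_reception(bit, 1, eps, rng)[0] for _ in range(k)]
        recovered.append(1 if sum(heard) > k // 2 else 0)
    return recovered

if __name__ == "__main__":
    rng = random.Random(0)
    x = [rng.randint(0, 1) for _ in range(100)]
    # For small eps and modest k, P_0 typically recovers every bit.
    print(repetition_protocol(x, 15, 0.1, rng) == x)
```

With noiseless channels (eps = 0) a single round already suffices; the point of the model is that noise forces redundancy, and the paper shows Θ(n log log n) broadcasts are necessary and sufficient to learn the full input.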

| Original language | English (US) |
|---|---|
| Title of host publication | Proceedings - 46th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2005 |
| Pages | 40-49 |
| Number of pages | 10 |
| DOIs | 10.1109/SFCS.2005.48 |
| State | Published - Dec 1 2005 |
| Event | 46th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2005 - Pittsburgh, PA, United States. Duration: Oct 23 2005 → Oct 25 2005 |

### Publication series

| Name | Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS |
|---|---|
| Volume | 2005 |
| ISSN (Print) | 0272-5428 |

### Other

| Other | 46th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2005 |
|---|---|
| Country | United States |
| City | Pittsburgh, PA |
| Period | 10/23/05 → 10/25/05 |

### All Science Journal Classification (ASJC) codes

- Engineering (all)

### Cite this

Goyal, N., Kindler, G., & Saks, M. (2005). Lower bounds for the noisy broadcast problem. In *Proceedings - 46th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2005* (pp. 40-49). [1530700] (Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS; Vol. 2005). https://doi.org/10.1109/SFCS.2005.48

**Lower bounds for the noisy broadcast problem.** / Goyal, Navin; Kindler, Guy; Saks, Michael.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Lower bounds for the noisy broadcast problem

AU - Goyal, Navin

AU - Kindler, Guy

AU - Saks, Michael

PY - 2005/12/1

Y1 - 2005/12/1

N2 - We prove the first non-trivial (superlinear) lower bound in the noisy broadcast model of distributed computation. In this model, there are n + 1 processors P_0, P_1, …, P_n. Each P_i, for i ≥ 1, initially has a private bit x_i, and the goal is for P_0 to learn f(x_1, …, x_n) for some specified function f. At each time step, a designated processor broadcasts some function of its private bit and the bits it has heard so far. Each broadcast is received by the other processors, but each reception may be corrupted by noise. In this model, Gallager [16] gave a noise-resistant protocol that allows P_0 to learn the entire input in O(n log log n) broadcasts. We prove that Gallager's protocol is optimal up to a constant factor. Our lower bound follows from a lower bound in a new model, the generalized noisy decision tree model, which may be of independent interest.

AB - We prove the first non-trivial (superlinear) lower bound in the noisy broadcast model of distributed computation. In this model, there are n + 1 processors P_0, P_1, …, P_n. Each P_i, for i ≥ 1, initially has a private bit x_i, and the goal is for P_0 to learn f(x_1, …, x_n) for some specified function f. At each time step, a designated processor broadcasts some function of its private bit and the bits it has heard so far. Each broadcast is received by the other processors, but each reception may be corrupted by noise. In this model, Gallager [16] gave a noise-resistant protocol that allows P_0 to learn the entire input in O(n log log n) broadcasts. We prove that Gallager's protocol is optimal up to a constant factor. Our lower bound follows from a lower bound in a new model, the generalized noisy decision tree model, which may be of independent interest.

UR - http://www.scopus.com/inward/record.url?scp=33748629917&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33748629917&partnerID=8YFLogxK

U2 - 10.1109/SFCS.2005.48

DO - 10.1109/SFCS.2005.48

M3 - Conference contribution

AN - SCOPUS:33748629917

SN - 0769524680

SN - 9780769524689

T3 - Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS

SP - 40

EP - 49

BT - Proceedings - 46th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2005

ER -