This paper considers the problem of distributed hypothesis testing and social learning. Individual nodes in a network receive noisy private observations whose distribution is parameterized by a discrete parameter (the hypothesis). The distributions are known locally at the nodes, but the true parameter/hypothesis is not. The paper analyzes an update rule in which agents first perform a Bayesian update of their belief (a distribution estimate over the parameter) based on their local observation, communicate the updated beliefs to their neighbors, and then perform a 'non-Bayesian' linear consensus step on the log-beliefs of their neighbors. The main result is that, under mild assumptions, every agent's belief in any incorrect hypothesis converges to zero exponentially fast, and the exponential rate of learning is characterized by the network structure and the divergences between the observation distributions under the different hypotheses.
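As a sketch of the two-step rule described above (in assumed, generic notation, since the paper's own symbols are not reproduced here): let $b_{i,t}(\theta)$ denote agent $i$'s belief in hypothesis $\theta$ at time $t$, let $\ell_i(\cdot \mid \theta)$ be its local observation likelihood, and let $W$ be a stochastic weight matrix supported on the network's edges. A common form of such a log-linear update is

\[
\tilde{b}_{i,t}(\theta) \;=\; \frac{\ell_i(x_{i,t} \mid \theta)\, b_{i,t-1}(\theta)}{\sum_{\theta'} \ell_i(x_{i,t} \mid \theta')\, b_{i,t-1}(\theta')},
\qquad
b_{i,t}(\theta) \;=\; \frac{\exp\!\big(\textstyle\sum_{j} W_{ij} \log \tilde{b}_{j,t}(\theta)\big)}{\sum_{\theta'} \exp\!\big(\textstyle\sum_{j} W_{ij} \log \tilde{b}_{j,t}(\theta')\big)},
\]

that is, a local Bayesian update followed by a weighted geometric average of neighbors' updated beliefs (linear consensus in the log domain). In analyses of rules of this form, the exponential decay rate of the belief in a wrong hypothesis $\theta$ typically couples network influence with observation informativeness, e.g. a rate of the form $\sum_i v_i\, D_{\mathrm{KL}}\big(\ell_i(\cdot \mid \theta^\star) \,\big\|\, \ell_i(\cdot \mid \theta)\big)$, where $\theta^\star$ is the true hypothesis and $v$ is the left Perron eigenvector of $W$; this is consistent with the abstract's statement that the rate is characterized jointly by the network structure and the divergences.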