True Randomness from Big Data

Periklis A. Papakonstantinou, David P. Woodruff, Guang Yang

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Generating random bits is a difficult task that is important for the simulation of physical systems, cryptography, and many other applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as those from astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as a sampling process from a big source, a random variable of size at least a few gigabytes. This view initiates the study of big sources in the randomness extraction literature. Previous approaches to big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and validate it extensively on real data sets. The experimental findings indicate that our method is efficient enough to handle big sources in practice, whereas previous extractor constructions are too inefficient to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical empirical extractors, as measured by standardized tests.
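The paper's provable construction is not reproduced in this abstract; as a minimal sketch of the general idea of seeded randomness extraction, a keyed cryptographic hash can condense a weakly random "big source" sample into a short, nearly uniform output, assuming the sample carries enough min-entropy. The function name, parameters, and the use of BLAKE2b here are illustrative assumptions, not the authors' method.

```python
import hashlib
import secrets

def extract_bits(sample: bytes, seed: bytes, n_out: int = 32) -> bytes:
    """Illustrative seeded extractor (NOT the paper's construction).

    Hashes the weak-source sample under a uniformly random key (the
    seed), returning n_out bytes. If the sample has sufficient
    min-entropy, the output is close to uniform under standard
    assumptions about the keyed hash.
    """
    # BLAKE2b accepts a key of up to 64 bytes and a digest of up to 64 bytes.
    h = hashlib.blake2b(sample, key=seed[:64], digest_size=min(n_out, 64))
    return h.digest()

# Toy usage: a weak source (e.g., a log file read as bytes) plus a
# short uniform seed yields the extracted output.
weak_sample = b"search-log: cats, weather, cats, news, ..." * 1000
seed = secrets.token_bytes(32)
out = extract_bits(weak_sample, seed, n_out=32)
```

The design point this illustrates is that the extractor itself is a small, fast function; the entropy comes from the multi-gigabyte sample, and only a short uniform seed is needed.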

Original language: English (US)
Article number: 33740
Journal: Scientific Reports
Volume: 6
DOIs
State: Published - Sep 26 2016

All Science Journal Classification (ASJC) codes

  • General

