Improving corpus reproducibility through modular text transformations and connected data set

Jonathan Pulliza, Chirag Shah

Research output: Contribution to journal › Article › peer-review


The Enron Email Corpus is one of the most widely used document collections in Natural Language Processing, Machine Learning, and Network Analysis. Different groups of researchers have transformed the corpus, changing its content and format to meet their needs. The many distinct versions can all claim to be the Enron Email Corpus, though they are as different from the original publicly available collection as they are from each other. Researchers must therefore judge the usefulness of a particular version against the many others available, as well as ascertain what has been done to the collection and how that would affect their specific research goal. This is especially important for reproducing a research method on a different corpus, as transposing a method requires a deep understanding of the original data in the experiment. This project models the transformations performed on different versions of the collection as a network of connected datasets, highlighting the most important nodes as well as the most common transformations. Traversing paths between nodes offers the community a way to model and reproduce the data work performed on one collection so that it can be transposed onto other collections.
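The idea of a network of connected datasets can be sketched as a directed graph whose nodes are corpus versions and whose edges are labeled with the transformation that produced one version from another. The sketch below is illustrative only: the version names and transformations are hypothetical placeholders, not the paper's actual data or model.

```python
# Hypothetical sketch: corpus versions as nodes in a directed graph,
# edges labeled with the transformation applied. Traversing a path
# between two versions recovers the chain of transformations needed
# to reproduce one version from the other.
from collections import deque

# (source version, derived version, transformation applied) -- illustrative names
EDGES = [
    ("enron_raw", "enron_dedup", "deduplicate messages"),
    ("enron_dedup", "enron_noheaders", "strip email headers"),
    ("enron_dedup", "enron_network", "extract sender/recipient graph"),
    ("enron_noheaders", "enron_tokenized", "tokenize body text"),
]

def build_graph(edges):
    """Adjacency list: version -> list of (derived version, transformation)."""
    graph = {}
    for src, dst, label in edges:
        graph.setdefault(src, []).append((dst, label))
    return graph

def transformation_path(graph, start, goal):
    """Breadth-first search for the sequence of transformations turning
    `start` into `goal`; returns None if `goal` is unreachable."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        node, path = queue.popleft()
        if node == goal:
            return path
        for nxt, label in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [label]))
    return None

graph = build_graph(EDGES)
print(transformation_path(graph, "enron_raw", "enron_tokenized"))
# ['deduplicate messages', 'strip email headers', 'tokenize body text']
```

A path found this way documents the data work as an ordered, replayable recipe; the same edge labels could then be applied to a different source collection.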

Original language: English (US)
Pages (from-to): 883-884
Number of pages: 2
Journal: Proceedings of the Association for Information Science and Technology
Issue number: 1
State: Published - Jan 1 2018

All Science Journal Classification (ASJC) codes

  • Computer Science (all)
  • Library and Information Sciences


Keywords

  • Data Citation
  • Data Reuse
  • Email
  • Enron e-mail dataset
  • Network Analysis
  • Reproducible Research


