McRank: Learning to rank using multiple classification and gradient boosting

Ping Li, Christopher J.C. Burges, Qiang Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

106 Scopus citations

Abstract

We cast the ranking problem as (1) multiple classification ("Mc") and (2) multiple ordinal classification, both of which lead to computationally tractable learning algorithms for relevance ranking in Web search. We consider the DCG criterion (discounted cumulative gain), a standard quality measure in information retrieval. Our approach is motivated by the fact that perfect classifications result in perfect DCG scores, and the DCG errors are bounded by classification errors. We propose using the Expected Relevance to convert class probabilities into ranking scores. The class probabilities are learned using a gradient boosting tree algorithm. Evaluations on large-scale datasets show that our approach can improve upon LambdaRank [5] and the regression-based ranker [6] in terms of the (normalized) DCG scores. An efficient implementation of the boosting tree algorithm is also presented.
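The abstract's two key quantities can be sketched in a few lines: the DCG criterion used to evaluate a ranking, and the Expected Relevance used to turn per-document class probabilities into a single ranking score. This is a minimal illustration, not the paper's implementation; the exact gain/discount form (here the common 2^rel − 1 gain with a log2 position discount) and the example probabilities are assumptions.

```python
import math

def dcg(relevances):
    """Discounted cumulative gain for a ranked list of relevance grades.
    Assumes the common gain 2**rel - 1 with a log2(position + 1) discount."""
    return sum((2 ** rel - 1) / math.log2(i + 2)
               for i, rel in enumerate(relevances))

def expected_relevance(class_probs):
    """Convert one document's class probabilities P(rel = k), k = 0..K-1,
    into a scalar ranking score: the expectation sum_k k * P(k)."""
    return sum(k * p for k, p in enumerate(class_probs))

# Hypothetical example: three documents, probabilities over 5 grades (0-4)
probs = [
    [0.05, 0.10, 0.20, 0.40, 0.25],  # likely highly relevant
    [0.50, 0.30, 0.10, 0.07, 0.03],  # likely irrelevant
    [0.10, 0.20, 0.40, 0.20, 0.10],  # middling
]
scores = [expected_relevance(p) for p in probs]
order = sorted(range(len(probs)), key=lambda i: -scores[i])  # rank by score
```

In the paper's setting the class probabilities would come from the learned boosting-tree classifier; sorting documents by their expected relevance then yields the ranking whose quality DCG measures.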

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 20 - Proceedings of the 2007 Conference
State: Published - 2009
Externally published: Yes
Event: 21st Annual Conference on Neural Information Processing Systems, NIPS 2007 - Vancouver, BC, Canada
Duration: Dec 3 2007 - Dec 6 2007

Publication series

Name: Advances in Neural Information Processing Systems 20 - Proceedings of the 2007 Conference

Other

Other: 21st Annual Conference on Neural Information Processing Systems, NIPS 2007
Country/Territory: Canada
City: Vancouver, BC
Period: 12/3/07 - 12/6/07

All Science Journal Classification (ASJC) codes

  • Information Systems
