Perturbation based Large Margin Approach for Ranking (2012)
Eunho Yang, Ambuj Tewari and Pradeep Ravikumar
The use of the standard hinge loss for structured outputs for the learning-to-rank problem faces two main caveats: (a) the label space, the set of all possible permutations of the items to be ranked, is too large and less amenable to the usual dynamic-programming-based techniques used for structured outputs, and (b) the supervision or training data consists of instances with multiple labels per input, instead of just a single label. The most natural way to deal with such multiple labels unfortunately leads to a non-convex surrogate. In this paper, we propose a general class of perturbation-based surrogates that leverage the large-margin approach and are convex. We show that the standard hinge surrogate for classification actually falls within this class. We also find a surrogate within this class, for the ranking problem, that does not suffer from the caveats mentioned above. Indeed, our experiments demonstrate that it performs better than other candidate large-margin proposals on both synthetic and real-world ranking datasets.
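For context, a common way to write the structured-output hinge surrogate mentioned above is (standard notation from the structured-prediction literature, not reproduced from the paper itself):

\ell(w; x, y) \;=\; \max_{y' \in \mathcal{Y}} \Big[ \Delta(y, y') + \langle w, \phi(x, y') \rangle \Big] \;-\; \langle w, \phi(x, y) \rangle,

where \Delta is the task loss and \phi a joint feature map; this is convex in w as a maximum of affine functions. When each input x comes with a set Y_x of acceptable labels rather than a single label, the most natural multi-label extension, \min_{y \in Y_x} \ell(w; x, y), takes a minimum over convex functions and is therefore generally non-convex, which is the difficulty the perturbation-based surrogates proposed in the paper are designed to avoid.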
View: PDF
Citation: In International Conference on Artificial Intelligence and Statistics (AISTATS), 2012.
Pradeep Ravikumar, Formerly affiliated Faculty, pradeepr [at] cs.utexas.edu
Eunho Yang, Ph.D. Alumni, eunho [at] cs.utexas.edu