eCommons

 

Representation Learning For Sequence And Comparison Data

dc.contributor.author: Chen, Shuo
dc.contributor.chair: Joachims, Thorsten
dc.contributor.committeeMember: Van Loan, Charles Francis
dc.contributor.committeeMember: Bindel, David S.
dc.date.accessioned: 2016-04-04T18:06:05Z
dc.date.available: 2016-04-04T18:06:05Z
dc.date.issued: 2016-02-01
dc.description.abstract: The core idea of representation learning is to learn semantically meaningful features (usually a vector or vectors per data point) from a dataset, so that they carry more discriminative information and make the given prediction task easier. This often yields better generalization performance and data visualization. In this thesis, we improve the foundation and practice of representation learning methods for two types of data, namely sequences and comparisons: 1. Using music playlist data as an example, we propose the Logistic Markov Embedding method, which learns from sequences of songs and yields vectorized representations of songs. We demonstrate its better generalization performance in predicting the next song to play in a coherent playlist, as well as its capability to produce meaningful visualizations of songs. We also propose an accompanying scalable training method that can be easily parallelized for learning representations on sequences. 2. Motivated by modeling intransitivity (the rock-paper-scissors relation) in competitive matchup data (two-player games or sports), we propose the blade-chest model for learning vectorized representations of players. It is then extended to a general framework that predicts the outcome of pairwise comparisons, making use of both object and context features. We show its successful application in matchup and preference prediction. The two lines of work share the same underlying theme: the object we study is first represented by a parameter vector or vectors, which are used to explain the interactions in the proposed models. These parameter vectors are learned by training on datasets that contain interactions, and the learned vectors can then predict any future interaction by simply plugging them back into the proposed models. Also, when the dimensionality of the vectors is small (e.g. 2), plotting them gives interesting insight into the data.
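The two models described in the abstract can be illustrated with a minimal sketch. The details below are assumptions for illustration only (the toy embeddings, the function names, and the exact parameterizations are not taken from the thesis): a Logistic-Markov-Embedding-style model scores the next song by squared Euclidean distance between song vectors, and a blade-chest-style model gives each player two vectors so that intransitive matchups become representable.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Sequence side: a Logistic-Markov-Embedding-style next-song model ---
# Hypothetical setup: 5 songs embedded in 2-D; the transition probability
# decays with the squared Euclidean distance between embeddings, normalized
# by a softmax over all songs.
songs = rng.normal(size=(5, 2))          # one 2-D vector per song (toy values)

def next_song_probs(current: int, X: np.ndarray) -> np.ndarray:
    """P(next | current) proportional to exp(-||X[next] - X[current]||^2)."""
    d2 = np.sum((X - X[current]) ** 2, axis=1)
    logits = -d2
    p = np.exp(logits - logits.max())    # numerically stabilized softmax
    return p / p.sum()

# --- Comparison side: a blade-chest-style matchup score ---
# Hypothetical setup: each player i has a "blade" vector and a "chest"
# vector; a's advantage over b is how well a's blade matches b's chest,
# minus the reverse. Because a player is not reduced to a single scalar
# rating, rock-paper-scissors cycles can be represented.
blades = rng.normal(size=(3, 2))
chests = rng.normal(size=(3, 2))

def p_a_beats_b(a: int, b: int) -> float:
    score = blades[a] @ chests[b] - blades[b] @ chests[a]
    return 1.0 / (1.0 + np.exp(-score))  # logistic link

# Antisymmetry: P(a beats b) + P(b beats a) = 1 by construction.
```

Both halves follow the theme stated in the abstract: learned parameter vectors are plugged back into a simple probabilistic model to predict any future interaction.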
dc.identifier.other: bibid: 9597212
dc.identifier.uri: https://hdl.handle.net/1813/43697
dc.language.iso: en_US
dc.subject: Representation learning
dc.subject: Sequence
dc.subject: Pairwise comparison
dc.title: Representation Learning For Sequence And Comparison Data
dc.type: dissertation or thesis
thesis.degree.discipline: Computer Science
thesis.degree.grantor: Cornell University
thesis.degree.level: Doctor of Philosophy
thesis.degree.name: Ph. D., Computer Science

Files

Original bundle
Name: sc2247.pdf
Size: 1.64 MB
Format: Adobe Portable Document Format