
dc.contributor.author: Irsoy, Ozan
dc.date.accessioned: 2017-04-04T20:28:31Z
dc.date.available: 2017-04-04T20:28:31Z
dc.date.issued: 2017-01-30
dc.identifier.other: Irsoy_cornellgrad_0058F_10131
dc.identifier.other: http://dissertations.umi.com/cornellgrad:10131
dc.identifier.other: bibid: 9906139
dc.identifier.uri: https://hdl.handle.net/1813/47892
dc.description.abstract: Recent advances in deep learning have provided fruitful applications for natural language processing (NLP) tasks. One key advance was the invention of word vectors, which represent every word in a dense, low-dimensional vector space. Even though word vectors provide very strong results for word-level NLP tasks, producing appropriate representations for phrases and sentences is still an open research problem. In this dissertation, we focus on compositional approaches to representation learning. In particular, we employ notions of compositionality in which sequence or structure information is utilized via recurrent or recursive neural networks. We investigate the effectiveness of such approaches for specific natural language understanding tasks, including opinion mining and sentiment analysis, and extend some of the approaches to provide better representation hierarchies. In particular, we propose two novel variants: bidirectional recursive neural networks, which are capable of producing context-dependent structural representations, and deep recursive neural networks, which provide representation hierarchies in the structural setting. Additionally, we qualitatively investigate such models and describe how they relate to alternative compositional approaches. Finally, we discuss challenges in the interpretation and understanding of compositional neural models, propose simple tools for visualization, and perform exploratory analyses over the features learned by such a model.
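The recursive neural networks named in the abstract build phrase and sentence vectors bottom-up over a parse tree by repeatedly composing child vectors into a parent vector. As an illustrative sketch only (the dimensionality, random initialization, toy sentence, and single-matrix tanh composition are assumptions for demonstration, not the dissertation's exact models):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimensionality (assumed for illustration)

# Hypothetical word vectors for a toy sentence "not very good".
words = {w: rng.standard_normal(d) for w in ["not", "very", "good"]}

# Composition parameters: one matrix applied to concatenated children.
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)

def compose(left, right):
    """Combine two child vectors into a parent vector of the same size."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Compose along the binary tree (not (very good)): inner phrase first,
# then the full sentence, each node yielding a d-dimensional vector.
phrase = compose(words["very"], words["good"])
sentence = compose(words["not"], phrase)
```

Because each node's output has the same dimensionality as its inputs, the same `compose` step can be applied at every internal node of an arbitrary binary parse tree, which is what makes the structural setting amenable to the bidirectional and deep extensions described above.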
dc.language.iso: en_US
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: neural networks
dc.subject: Computer science
dc.subject: machine learning
dc.subject: natural language processing
dc.subject: Deep Learning
dc.title: Deep Sequential and Structural Neural Models of Compositionality
dc.type: dissertation or thesis
thesis.degree.discipline: Computer Science
thesis.degree.grantor: Cornell University
thesis.degree.level: Doctor of Philosophy
thesis.degree.name: Ph.D., Computer Science
dc.contributor.chair: Cardie, Claire T
dc.contributor.committeeMember: Woodard, Dawn B.
dc.contributor.committeeMember: Kleinberg, Robert David
dcterms.license: https://hdl.handle.net/1813/59810
dc.identifier.doi: https://doi.org/10.7298/X46971JG




