
dc.contributor.author Athiwaratkun, Praphruetpong (Ben)
dc.identifier.other bibid: 11050402
dc.description.abstract We demonstrate the benefits of probabilistic representations: their expressiveness allows for flexible representations, they naturally capture uncertainty, and their interpretable geometric structures are well suited to modeling hierarchical data. We show that multimodal densities can effectively represent words in natural text, capturing multiple possible meanings and their nuances. Probability densities also have natural geometric structures that can represent hierarchies among entities through the concept of encapsulation; that is, dispersed distributions are generic entities that encompass more specific ones. We present an effective approach to training such density embeddings by penalizing order violations, which are defined through asymmetric divergences between probability densities.
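The abstract's encapsulation idea can be sketched numerically: with Gaussian density embeddings, an asymmetric divergence such as KL is small when a concentrated (specific) density sits inside a dispersed (generic) one, and large in the reverse direction, so a hinge on that divergence penalizes order violations. The function names, example words, and all numeric values below are illustrative assumptions, not the dissertation's actual model or parameters.

```python
import numpy as np

def gauss_kl(mu0, var0, mu1, var1):
    """KL(N0 || N1) for diagonal Gaussians; asymmetric by construction."""
    return 0.5 * np.sum(var0 / var1 + (mu1 - mu0) ** 2 / var1
                        - 1.0 + np.log(var1 / var0))

def order_violation(specific, generic, margin=2.0):
    """Hinge penalty: zero when the dispersed (generic) density
    plausibly encapsulates the concentrated (specific) one."""
    (mu_s, var_s), (mu_g, var_g) = specific, generic
    return max(0.0, gauss_kl(mu_s, var_s, mu_g, var_g) - margin)

# Hypothetical 2-D density embeddings (values are made up):
# "animal" is dispersed (generic), "dog" is concentrated (specific).
dog    = (np.array([0.2, 0.1]), np.array([0.1, 0.1]))
animal = (np.array([0.0, 0.0]), np.array([1.0, 1.0]))

print(order_violation(dog, animal))   # 0.0   -- dog fits inside animal
print(order_violation(animal, dog))   # ~4.95 -- animal does not fit inside dog
```

Training would minimize this penalty over pairs known to be in a hierarchy (e.g. from a lexical taxonomy), pushing generic entities toward broader distributions that encompass their specific descendants.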
dc.rights Attribution 4.0 International
dc.subject Artificial intelligence
dc.subject Probabilistic embeddings
dc.subject Word embeddings
dc.title Density Representations for Words and Hierarchical Data
dc.type dissertation or thesis (Doctor of Philosophy, Statistics)
dc.contributor.chair Wilson, Andrew Gordon
dc.contributor.committeeMember Cardie, Claire T.
dc.contributor.committeeMember Mimno, David

