Exploring and Exploiting Structure and Self-supervision in Sequence Learning
Abstract
Sequence data, which consists of values organized in a certain order, is one of the most common data types in everyday life. For instance, the daily temperature and precipitation measurements throughout a year form a sequence of weather conditions, and the crop yields of each year over the past several decades depict a trend in agricultural production. These are also known as time series data. Time-indexed data is not the only kind of sequence data: linguistic data such as speech and text are sequential in nature, DNA sequences are indexed by the physical order of the bases, and a material's density of states is indexed by energy level. In fact, any reasonably ordered data can be viewed as a sequence. Sequence data has been a long-standing area of interest in artificial intelligence (AI), and this class of problems is often called sequence learning. Various sequence learning tasks have been defined, such as predicting the general properties of a sequence, tagging the sequence with a label at each index, or generating a new sequence from the input.

Different types of sequence data have unique structures, and it is often challenging to develop a model that encodes or decodes the data while taking into account the inherent sequential relationships, so each sub-field has historically relied on separate sequence learning tools and frameworks. However, recent advances in machine learning (ML) and deep neural networks (DNN) have provided the capacity to handle arbitrarily long sequences and store historical states in a more unified fashion, regardless of the modality of the data. Deep models such as recurrent neural networks (RNN), long short-term memory (LSTM), gated recurrent units (GRU), and transformers have become the foundation of most modern sequence learning and feature extraction methods. A new challenge is to efficiently and effectively use these deep models to capture intrinsic features of the input sequences.

In this thesis, I will study both supervised and self-supervised sequence learning using deep models. Conventional methods for supervised sequence learning are typically designed for sequences of scalars or vectors, and are not suitable for structured data such as graphs. I will illustrate three novel yet challenging scenarios involving graphs and sequences: dynamic node property prediction on a fixed graph, sequence prediction from a graph, and multi-label prediction of sequential inputs. Structured input data can be modeled with a framework that combines graph neural networks (GNN) with sequence models (e.g., GRU and transformer); this framework is validated on several tasks, including crop yield prediction and density of states prediction.

Self-supervised learning is another active direction in the sequence learning field. Self-supervision obtains supervisory signals from the data itself by leveraging the data's underlying structure; it has the potential to improve sample efficiency on downstream tasks and contribute to better model interpretability. In recent years, self-supervised sequence learning has been successfully applied to language and acoustic model pretraining. In my thesis, I will demonstrate that self-supervision can enforce latent structure, disentangle static and dynamic factors, and supplement supervised signals in model training, by applying it to speech recognition, video understanding, and sequence generation.
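As a purely illustrative example of such a self-supervised pretext task, the sketch below shows masked-step reconstruction in PyTorch: a fraction of the time steps is hidden, a GRU encodes the corrupted sequence, and the loss is computed only on the hidden steps, so the labels come from the data itself. The module and its names are assumptions for illustration, not the thesis's actual method.

```python
# Minimal sketch (assumed pretext task, not the thesis code):
# masked-step reconstruction as a source of self-supervision.
import torch
import torch.nn as nn

class MaskedSeqModel(nn.Module):
    def __init__(self, dim, hidden):
        super().__init__()
        self.enc = nn.GRU(dim, hidden, batch_first=True)
        self.dec = nn.Linear(hidden, dim)

    def forward(self, x, mask):
        # x: (B, T, dim); mask: (B, T) bool, True where a step is hidden
        x_in = x.masked_fill(mask.unsqueeze(-1), 0.0)  # corrupt masked steps
        h, _ = self.enc(x_in)
        return self.dec(h)

# Training step: the loss is taken only on masked positions,
# so the supervisory signal comes from the sequence itself.
model = MaskedSeqModel(dim=3, hidden=32)
x = torch.randn(4, 20, 3)
mask = torch.rand(4, 20) < 0.15            # hide ~15% of time steps
loss = ((model(x, mask) - x)[mask]).pow(2).mean()
loss.backward()
```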
In general, I will show in this thesis different methods to capture and exploit structure in sequence data, along with diverse explorations of self-supervision for sequence learning.
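To make the graph-plus-sequence framework described above concrete, here is a minimal sketch, assuming a plain-PyTorch setup rather than the thesis's actual implementation: a single graph-convolution layer (normalized adjacency times node features times a weight matrix) is applied at every time step on a fixed graph, and a GRU then models each node's embedding sequence for dynamic node property prediction. All names (GraphGRU, etc.) are hypothetical.

```python
# Minimal sketch (assumed architecture, not the thesis code): a graph
# encoder feeding a GRU for dynamic node property prediction on a
# fixed graph, given as a normalized adjacency matrix.
import torch
import torch.nn as nn

class GraphGRU(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.gnn = nn.Linear(in_dim, hidden_dim)  # one graph-conv layer: A @ X @ W
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, adj, x_seq):
        # adj: (N, N) normalized adjacency; x_seq: (T, N, in_dim) features per step
        h_seq = torch.relu(adj @ self.gnn(x_seq))  # message passing at every step
        h_seq = h_seq.transpose(0, 1)              # (N, T, hidden): one sequence per node
        out, _ = self.gru(h_seq)
        return self.head(out[:, -1])               # predict each node's next property

# Toy usage: 5 nodes, 8 time steps, 3 input features, scalar target.
adj = torch.eye(5)
x_seq = torch.randn(8, 5, 3)
print(GraphGRU(3, 16, 1)(adj, x_seq).shape)        # torch.Size([5, 1])
```

Applying the same graph convolution at every step and leaving the temporal modeling to the GRU is one simple way to decouple the structural and sequential parts of the problem; a transformer could replace the GRU in the same role.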
Description
236 pages
Date Issued
2022-08
Keywords
Computational Sustainability; Machine learning; Self-supervised Learning; Sequence Learning
Committee Chair
Gomes, Carla P.
Committee Member
Selman, Bart
Benson, Austin Reilley
Degree Discipline
Computer Science
Degree Name
Ph.D., Computer Science
Degree Level
Doctor of Philosophy
Types
dissertation or thesis