Algorithmic Advances in Reactive Chemical Systems
Abstract
Due to the inherent need for atomic-scale resolution, reaction mechanisms have long been difficult to study both experimentally and computationally. Ingenuity in experimental design, combined with experience and chemical intuition, is often required for these endeavours. The true cost of trial-and-error experimentation to optimize chemical reactions, however, remains high. For decades, experimental studies of reaction mechanisms have relied on computational approaches as a guide. Here too the cost is high, and the choice of approach essentially comes down to small, accurate ab initio calculations or larger-scale semi-empirical ones, with only limited access to reactive force fields to describe the interactions between species. Computational approaches remain difficult in their own right; the expense is simply transformed from a dollar amount into computational time, which is itself not without cost. Obtaining quantum-level accuracy in calculations can take days to weeks, or even months, of dedicated supercomputer time. Further, these approaches are restrictive in system size, with compute time scaling roughly as the cube of the number of electrons, making large systems a continuing challenge. This dissertation outlines improvements developed and implemented at several scales of computation. At the high-accuracy, high-cost end of computational research using Density Functional Theory, Procrustes analysis was implemented in combination with the limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) optimization algorithm to improve upon the more commonly used Nudged Elastic Band method for identifying reaction barriers. At the less expensive, and presumed lower-accuracy, end of the scale, and to allow scaling to larger systems, a reactive force field for Molecular Dynamics studies was developed and packaged into a user-friendly codebase for ease of use within the general computational community.
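The dissertation's own Procrustes/L-BFGS implementation is not reproduced here; as a minimal sketch of the core idea, the following illustrates the orthogonal Procrustes (Kabsch) solution for rigidly aligning one molecular geometry onto another, the alignment step that removes spurious rotation and translation between images before optimization. All names and the toy coordinates are illustrative.

```python
import numpy as np

def procrustes_align(P, Q):
    """Rigidly align geometry Q (N x 3) onto geometry P (N x 3):
    remove the centroids, then find the optimal proper rotation
    via the orthogonal Procrustes (Kabsch) solution."""
    # Center both geometries on their centroids
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    # SVD of the cross-covariance matrix gives the optimal rotation
    U, _, Vt = np.linalg.svd(Q.T @ P)
    # Correct for a possible reflection so the result is a proper rotation
    d = np.sign(np.linalg.det(U @ Vt))
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    return Q @ R

# Example: a rotated copy of three atoms aligns back onto the original
P = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
Q = P @ R.T                     # rotate every atom by theta about z
aligned = procrustes_align(P, Q)
print(np.allclose(aligned, P - P.mean(axis=0)))  # True
```

In a band-optimization context, each image along the path would be aligned this way against its neighbor before the L-BFGS step, so the optimizer works only on internal (shape) degrees of freedom.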
This method was tested and validated through studies of SiO2 nucleation, PbS quantum dot growth, and CsPbI3 nucleation in solvent. These improvements tackle challenges at both small and large scales; however, the question of which system should be studied next, essentially the design of experiments, still remains. Using Gaussian Process Bayesian optimization, the Physical Analytics pipeLine was developed to automate optimization over the compositional space of Hybrid Organic-Inorganic Perovskite materials for solar cells, given a user-supplied objective function. This approach was shown to improve greatly on pre-existing methods; it is the first of its kind to be applied to this materials application, and one of only a handful of studies applying machine learning to materials discovery and optimization. Given the success of this new approach, the concept was extended further with the implementation of a multi-information source optimization method, hybridized with a co-regionalization approach. This new method is compared to various related models with a view to minimizing cost during objective function optimization. The final task was to make these new algorithms and associated Python scripts available in a readily accessible open-source manner. As a result, the Squid codebase was introduced on GitHub. Squid was developed in parallel with these other projects to simplify the effort needed to implement these calculations and to consolidate the work, making it accessible to future computational studies.
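The Physical Analytics pipeLine itself is not shown here; as a minimal, self-contained sketch of the underlying technique, the following implements Gaussian Process Bayesian optimization with an expected-improvement acquisition function on a toy one-dimensional objective standing in for a measured material property. The kernel, length scale, objective, and candidate grid are all illustrative assumptions, not the dissertation's settings.

```python
import math
import numpy as np

def rbf_kernel(A, B, length=0.3):
    """Squared-exponential kernel between two sets of 1-D points."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at query points Xs given data (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v ** 2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """Expected improvement acquisition, for maximization."""
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (mu - best) * cdf + sigma * pdf

# Toy objective standing in for an expensive measurement
f = lambda x: np.sin(6.0 * x) * x
X = np.array([0.1, 0.5, 0.9])       # initial "experiments"
y = f(X)
Xs = np.linspace(0.0, 1.0, 201)     # candidate compositions

for _ in range(10):                 # Bayesian optimization loop
    mu, sigma = gp_posterior(X, y, Xs)
    x_next = Xs[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

print(X[np.argmax(y)])              # best sampled x
```

The multi-information source extension described above would replace the single GP with a model that correlates cheap and expensive objective evaluations (e.g. via co-regionalization), so that the acquisition step can weigh information gain against the cost of each source.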
Committee Member: Thompson, Michael Olgar