Cornell University Library
eCommons

INEXACT REFLECTIVE NEWTON METHODS FOR LARGE-SCALE OPTIMIZATION SUBJECT TO BOUND CONSTRAINTS

File(s)
95-1543.ps (1.17 MB)
95-1543.pdf (1.38 MB)
Permanent Link(s)
https://hdl.handle.net/1813/7200
Collections
Computer Science Technical Reports
Author
Branch, Mary Ann
Abstract

This thesis addresses the problem of minimizing a large-scale nonlinear function subject to simple bound constraints. The most popular methods for handling bound-constrained problems, active-set methods, introduce a combinatorial aspect to the problem: the number of steps to converge may be related to the number of constraints. For large problems, this behavior is particularly detrimental. Reflective Newton methods avoid this problem by staying strictly within the constrained region. As a result, these methods have strong theoretical properties. Moreover, they behave experimentally like an unconstrained method: the number of steps to a solution is not strongly correlated with problem size. In this thesis, we discuss the reflective Newton approach and how it can be combined with inexact Newton techniques, within a subspace trust-region method, to efficiently solve large problems. Two algorithms are presented. The first uses a line search as its globalizing strategy. The second uses a strictly trust-region approach to converge globally to a local minimizer. Global convergence and rate-of-convergence results are established for both methods. We present computational evidence that using inexact Newton steps preserves the properties of the reflective Newton methods: the iteration counts are as low as when "exact" Newton steps are used. Also, both the inexact and exact methods are robust when the starting point is varied. Furthermore, the inexact reflective Newton methods have fast convergence when negative curvature is encountered, a trait not always shared by similar active-set type methods. The role of negative curvature is further explored by comparing the subspace trust-region approach to other common approximations to the full-space trust-region problem. On problems where only positive curvature is found, these trust-region methods differ little in the number of iterations to converge. However, for problems with negative curvature, the subspace method is more effective in capturing the negative curvature information, resulting in faster convergence. Finally, a parallel implementation on the IBM SP2 is described and evaluated; the scalability and efficiency of this implementation are shown to be as good as the matrix-vector multiply routine it depends on.
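The core idea named in the abstract, keeping iterates strictly inside the feasible box by "reflecting" a trial point off any violated bound, can be illustrated with a minimal sketch. This is an assumption-labeled illustration of the reflection operation only; the thesis's actual algorithms combine it with inexact Newton steps, line searches, and subspace trust regions, none of which appear here, and the function name `reflect` is hypothetical.

```python
import numpy as np

def reflect(y, lower, upper):
    """Fold a trial point back into the box [lower, upper] component-wise.

    Illustrative sketch of the reflective idea from the abstract: a
    coordinate that crosses a bound is mirrored back across that bound,
    repeatedly if necessary, so the returned point stays feasible.
    """
    y = np.asarray(y, dtype=float)
    width = upper - lower
    # Shift so the box is [0, width], then fold using period 2*width:
    # values in [width, 2*width) map back as 2*width - t (a reflection).
    t = np.mod(y - lower, 2.0 * width)
    return lower + np.where(t > width, 2.0 * width - t, t)
```

For example, with bounds [0, 1] in each coordinate, the trial point (1.5, -0.3) reflects to (0.5, 0.3): each violating coordinate is mirrored back across the bound it crossed.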

Date Issued
1995-10
Publisher
Cornell University
Keywords
computer science; technical report
Previously Published as
http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR95-1543
Type
technical report
