Superlinear Convergence of a Minimax Method
To solve minimax problems, Han [1977b] suggested using quadratic programs to find search directions. If the matrices in the quadratic programs are positive definite, the method can be shown to be globally convergent. In this paper we show that, for efficiency, the matrices should also be good approximations to a certain convex combination of Hessians on some subspace. We therefore suggest Powell's scheme [Powell 1977] for updating these matrices. This avoids computing Hessians while keeping the matrices positive definite, so Han's global convergence theorems still apply. Moreover, we show that the resulting method is indeed superlinearly convergent.
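The following is a minimal sketch of the quadratic-program step described above, on a hypothetical one-dimensional toy problem of my own choosing (minimizing max(x², (x−2)²)); the grid-search QP solver, the fixed scalar matrix B, and the unit step are illustrative simplifications, not the paper's implementation:

```python
# Sketch of the QP search direction for minimize_x max_i f_i(x):
# the subproblem  min_{d,t}  t + 0.5*B*d^2  s.t.  f_i(x) + f_i'(x)*d <= t
# reduces, after eliminating t, to minimizing
#   phi(d) = max_i (f_i(x) + f_i'(x)*d) + 0.5*B*d^2.
# Here B is a positive scalar standing in for the positive definite matrix.

def minimax_step(x, fs, gs, B, d_grid):
    """Search direction from the QP subproblem, by crude grid search
    over d (a real implementation would use a proper QP solver)."""
    def phi(d):
        return max(f(x) + g(x) * d for f, g in zip(fs, gs)) + 0.5 * B * d * d
    return min(d_grid, key=phi)

# Toy problem: minimize max(x^2, (x-2)^2); the minimizer is x = 1.
fs = [lambda x: x**2, lambda x: (x - 2.0)**2]
gs = [lambda x: 2.0 * x, lambda x: 2.0 * (x - 2.0)]

x, B = 3.0, 2.0                              # B fixed here; the paper updates it
d_grid = [-2.0 + k * 1e-4 for k in range(40001)]
for _ in range(10):
    d = minimax_step(x, fs, gs, B, d_grid)
    x += d                                   # unit step, as in the local analysis
print(round(x, 4))                           # prints 1.0
```

In the method itself, B would be updated by Powell's scheme so that it stays positive definite while approximating the relevant convex combination of Hessians, which is what yields the superlinear rate.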
Keywords: computer science; technical report