Now showing items 47-64 of 64

    • Predicting Fill for Sparse Orthogonal Factorization 

      Coleman, Thomas F.; Edenbrandt, Anders; Gilbert, John R. (Cornell University, 1983-10)
      In solving large sparse linear least squares problems $Ax \cong b$, several different numeric methods involve computing the same upper triangular factor $R$ of $A$. It is of interest to be able to compute the nonzero ...
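
      For orientation, the shared factor $R$ satisfies the identity below, so (barring numerical cancellation) the nonzero pattern of $R$ is contained in that of the Cholesky factor of $A^{T}A$; predicting that pattern from the structure of $A$ alone is the subject of the report. This is standard background restated for context, not the report's theorem.

          $$ A^{T}A = (QR)^{T}(QR) = R^{T}Q^{T}QR = R^{T}R $$
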
    • A Quadratically-Convergent Algorithm for the Linear Programming Problem with Lower and Upper Bounds 

      Coleman, Thomas F.; Li, Yuying (Cornell University, 1990-04)
      We present a new algorithm to solve linear programming problems with finite lower and upper bounds. This algorithm generates an infinite sequence of points guaranteed to converge to the solution; the ultimate convergence ...
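
      The problem class here is a linear program with finite lower and upper bounds on every variable. The sketch below only sets up such a problem for SciPy's generic linprog solver to make the formulation concrete; it is not the interior-style algorithm of the report, and the data are an assumed toy instance.

          import numpy as np
          from scipy.optimize import linprog

          # minimize c^T x  subject to  A_eq x = b_eq  and  l <= x <= u
          c = np.array([1.0, -2.0, 3.0])
          A_eq = np.array([[1.0, 1.0, 1.0]])
          b_eq = np.array([1.0])
          bounds = [(0.0, 0.6), (-0.5, 0.5), (0.0, 1.0)]   # finite lower and upper bounds

          res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
          print(res.x, res.fun)
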
    • A Quasi-Newton $L_{2}$-Penalty Method for Minimization Subject to Nonlinear Equality Constraints

      Coleman, Thomas F.; Yuan, Wei (Cornell University, 1995-03)
      We present a modified $L_{2}$ penalty function method for equality constrained optimization problems. The pivotal feature of our algorithm is that at every iterate we invoke a special change of variables to improve the ...
    • A Quasi-Newton L2-Penalty Method for Minimization Subject to Nonlinear Constraints 

      Coleman, Thomas F.; Yuan, Wei (Cornell University, 1995-02)
      We present a modified L2 penalty function method for equality constrained optimization problems. The pivotal feature of our algorithm is that at every iterate we invoke a special change of variables to improve the ability ...
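
      Both $L_{2}$-penalty entries above treat minimizing $f(x)$ subject to equality constraints $c(x)=0$ through a quadratic penalty. Below is a minimal sketch of the plain penalty idea on an assumed toy problem, solved with an off-the-shelf quasi-Newton (BFGS) routine; the reports' special change of variables is not shown.

          import numpy as np
          from scipy.optimize import minimize

          # Toy equality-constrained problem: minimize f(x) subject to c(x) = 0.
          f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
          c = lambda x: x[0] + x[1] - 2.0                    # single equality constraint

          def penalty(x, mu):
              # classical L2 (quadratic) penalty: f(x) + (mu/2) * ||c(x)||^2
              return f(x) + 0.5 * mu * c(x) ** 2

          x = np.zeros(2)
          for mu in [1.0, 10.0, 100.0, 1000.0]:
              # unconstrained BFGS solve of the penalized subproblem,
              # warm-started from the previous solution as mu grows
              x = minimize(penalty, x, args=(mu,), method="BFGS").x
          print(x)      # approaches the constrained minimizer as mu increases
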
    • Reconstructing the Unknown Local Volatility Function 

      Coleman, Thomas F.; Li, Yuying; Verma, Arun (Cornell University, 2003-01-23)
      Using market European option prices, a method for computing a smooth local volatility function in a 1-factor continuous diffusion model is proposed. Smoothness is introduced to facilitate accurate approximation of the ...
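
      For background, a 1-factor continuous diffusion model with local volatility takes the form

          $$ dS_{t} = \mu S_{t}\,dt + \sigma(S_{t}, t)\,S_{t}\,dW_{t}, $$

      where the local volatility function $\sigma(\cdot,\cdot)$ is the quantity reconstructed from market European option prices; this is generic notation rather than the report's own.
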
    • A Reflective Newton Method for Minimizing a Quadratic Function Subject to Bounds on Some of the Variables 

      Coleman, Thomas F.; Li, Yuying (Cornell University, 1992-11)
      We propose a new algorithm, a reflective Newton method, for the minimization of a quadratic function of many variables subject to upper and lower bounds on some of the variables. This method applies to a general ...
    • A Reflective Newton Method for Minimizing a Quadratic Function Subject to Bounds on Some of the Variables

      Coleman, Thomas F.; Li, Yuying (Cornell University, 1992-11)
      We propose a new algorithm, a reflective Newton method, for the minimization of a quadratic function of many variables subject to upper and lower bounds on some of the variables. The method applies to a general (indefinite) ...
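
      The two entries above address the same problem class: minimize $q(x) = \frac{1}{2}x^{T}Hx + g^{T}x$, where $H$ may be indefinite, with bounds on only some components of $x$. The snippet below merely sets up such a problem for a generic bounded solver (L-BFGS-B) on assumed toy data; it is not the reflective Newton method.

          import numpy as np
          from scipy.optimize import minimize

          # q(x) = 0.5 x^T H x + g^T x with an indefinite H; bounds on the
          # first two variables only, the third left free.
          H = np.array([[ 2.0,  0.0, 0.0],
                        [ 0.0, -1.0, 0.0],
                        [ 0.0,  0.0, 3.0]])
          g = np.array([-1.0, 0.5, 2.0])

          q      = lambda x: 0.5 * x @ H @ x + g @ x
          grad_q = lambda x: H @ x + g

          bounds = [(0.0, 2.0), (-1.0, 1.0), (None, None)]   # (None, None) = unbounded
          res = minimize(q, np.zeros(3), jac=grad_q, method="L-BFGS-B", bounds=bounds)
          print(res.x, res.fun)
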
    • Segmentation of Pulmonary Nodule Images Using Total Variation Minimization 

      Coleman, Thomas F.; Li, Yuying; Mariano, Adriano (Cornell University, 2003-01-22)
      Total variation minimization has edge preserving and enhancing properties which make it suitable for image segmentation. We present Image Simplification, a new formulation and algorithm for image segmentation. We illustrate ...
    • Segmentation of Pulmonary Nodule Images Using Total Variation Minimization

      Coleman, Thomas F.; Li, Yuying; Mariano, Adrian (Cornell University, 1998-09)
      Total variation minimization has edge preserving and enhancing properties which make it suitable for image segmentation. We present Image Simplification, a new formulation and algorithm for image segmentation. We ...
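
      To see the edge-preserving behaviour both entries rely on, the fragment below runs an off-the-shelf total-variation denoiser (Chambolle's algorithm from scikit-image) followed by a simple threshold; this is only a stand-in for the Image Simplification formulation of the reports, and the sample image is an assumption.

          import numpy as np
          from skimage import data
          from skimage.restoration import denoise_tv_chambolle

          image = data.camera() / 255.0                       # any grayscale image in [0, 1]
          smoothed = denoise_tv_chambolle(image, weight=0.2)  # TV minimization: flattens
                                                              # regions, keeps edges
          segmented = smoothed > smoothed.mean()              # crude two-phase segmentation
          print(segmented.shape, segmented.dtype)
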
    • Software For Estimating Sparse Hessian Matrices 

      Coleman, Thomas F.; Garbow, Burton S.; Moré, Jorge J. (Cornell University, 1985-01)
      The solution of a nonlinear optimization problem often requires an estimate of the Hessian matrix for a function $f$. In large scale problems the Hessian matrix is usually sparse, and then estimation by differences of ...
    • Software for Estimating Sparse Jacobian Matrices 

      Coleman, Thomas F.; Moré, Jorge J. (Cornell University, 1982-06)
      In many nonlinear problems it is necessary to estimate the Jacobian matrix of a nonlinear mapping $F$. In large scale problems the Jacobian of $F$ is usually sparse, and then estimation by differences is attractive because ...
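
      Both software entries above (Hessian and Jacobian estimation) rest on the same observation: structurally orthogonal columns can be recovered from a single difference. A minimal sketch for a tridiagonal Jacobian, using an assumed toy map F, where three function differences recover all columns:

          import numpy as np

          def F(x):
              # toy map whose Jacobian is tridiagonal: F_i depends on x_{i-1}, x_i, x_{i+1}
              y = x ** 2
              y[:-1] += x[1:]
              y[1:]  -= x[:-1]
              return y

          n, h = 9, 1e-6
          x0 = np.random.rand(n)
          F0 = F(x0)

          # Columns j, j+3, j+6, ... of a tridiagonal Jacobian share no row, so one
          # forward difference along their sum recovers each column in the group.
          J = np.zeros((n, n))
          for group in range(3):
              cols = np.arange(group, n, 3)
              d = np.zeros(n); d[cols] = 1.0
              diff = (F(x0 + h * d) - F0) / h        # approximates J @ d
              for j in cols:
                  for i in range(max(0, j - 1), min(n, j + 2)):
                      J[i, j] = diff[i]

          print(np.round(J, 3))                      # tridiagonal, from 3 extra F-evaluations
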
    • Solution of Nonlinear Least-Square Problems on a Multiprocessor 

      Coleman, Thomas F.; Plassmann, Paul (Cornell University, 1988-06)
      In this paper we describe algorithms for solving nonlinear least-squares problems on a message-passing multiprocessor. We demonstrate new parallel algorithms, including an efficient parallel algorithm for determining the ...
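
      The underlying serial problem is nonlinear least squares, $\min_{x} \frac{1}{2}\|r(x)\|^{2}$. The snippet below just solves a small assumed curve-fitting instance with SciPy's generic solver to fix notation; the report's contribution, the parallel message-passing algorithms, is not reflected here.

          import numpy as np
          from scipy.optimize import least_squares

          # residuals of a small curve-fitting problem: r_i(x) = x0 * exp(x1 * t_i) - y_i
          t = np.linspace(0.0, 1.0, 20)
          y = 2.0 * np.exp(-1.3 * t) + 0.01 * np.random.randn(t.size)

          residual = lambda x: x[0] * np.exp(x[1] * t) - y
          sol = least_squares(residual, x0=np.array([1.0, 0.0]))
          print(sol.x)              # close to (2.0, -1.3)
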
    • Solving Systems of Nonlinear Equations on a Message-Passing Multiprocessor 

      Coleman, Thomas F.; Li, Guangye (Cornell University, 1987-11)
      We develop parallel algorithms for the solution of dense systems of nonlinear equations on a message-passing multiprocessor computer. Specifically, we propose a distributed finite-difference Newton method, a multiple ...
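
      As a point of reference for the distributed finite-difference Newton method mentioned here, a serial sketch of finite-difference Newton for $F(x)=0$ is given below; it is a generic textbook building block on an assumed test system, not the report's parallel algorithm.

          import numpy as np

          def fd_newton(F, x, h=1e-7, tol=1e-10, max_iter=50):
              # Newton's method with the Jacobian replaced by forward differences.
              for _ in range(max_iter):
                  Fx = F(x)
                  if np.linalg.norm(Fx) < tol:
                      break
                  n = x.size
                  J = np.empty((n, n))
                  for j in range(n):                  # one extra F-evaluation per column
                      e = np.zeros(n); e[j] = h
                      J[:, j] = (F(x + e) - Fx) / h
                  x = x + np.linalg.solve(J, -Fx)     # Newton step: J s = -F(x)
              return x

          # small dense test system: x0^2 + x1^2 = 4, x0 * x1 = 1
          F = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0])
          print(fd_newton(F, np.array([2.0, 0.0])))
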
    • The Sparse Null Space Basis Problem 

      Coleman, Thomas F.; Pothen, Alex (Cornell University, 1984-07)
      The sparse null space basis problem is the following: a $t \times n$ matrix $A$ ($t < n$) is given. Find a matrix $N$, with the fewest nonzero entries in it, whose columns span the null space of $A$. This problem ...
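
      For comparison with the combinatorial problem posed above, the standard numerical route yields an orthonormal, generally dense basis via the SVD; finding the sparsest $N$ is the hard problem the report studies. A small illustration with an assumed matrix:

          import numpy as np
          from scipy.linalg import null_space

          A = np.array([[1.0, 2.0, 0.0, 0.0],
                        [0.0, 1.0, 1.0, 0.0]])    # t x n with t < n
          N = null_space(A)                        # columns span null(A), so A @ N ~ 0
          print(N.shape)                           # (4, 2): n - rank(A) basis vectors
          print(np.allclose(A @ N, 0.0))           # True
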
    • Structure and Efficient Hessian Calculation 

      Coleman, Thomas F.; Verma, Arun (Cornell University, 1996-08)
      Modern methods for numerical optimization calculate (or approximate) the matrix of second derivatives, the Hessian matrix, at each iteration. The recent arrival of robust software for automatic differentiation allows for ...
    • Structure and Efficient Jacobian Calculation 

      Coleman, Thomas F.; Verma, Arun (Cornell University, 1996-03)
      Many computational tasks require the determination of the Jacobian matrix, at a given argument, for a large nonlinear system of equations. Calculation or approximation of a Newton step is a related task. The development ...
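
      The automatic differentiation that both of these entries build on can be illustrated with a toy forward-mode implementation: propagate a value together with its vector of partial derivatives through the arithmetic. This is only a didactic sketch with an assumed test map; the reports are about exploiting problem structure on top of such tools, which is not shown here.

          import numpy as np

          class Dual:
              """Forward-mode AD value: a number plus its vector of partial derivatives."""
              def __init__(self, val, grad):
                  self.val, self.grad = val, np.asarray(grad, dtype=float)
              def _lift(self, other):
                  return other if isinstance(other, Dual) else Dual(other, np.zeros_like(self.grad))
              def __add__(self, other):
                  o = self._lift(other); return Dual(self.val + o.val, self.grad + o.grad)
              def __sub__(self, other):
                  o = self._lift(other); return Dual(self.val - o.val, self.grad - o.grad)
              def __mul__(self, other):
                  o = self._lift(other)
                  return Dual(self.val * o.val, self.val * o.grad + o.val * self.grad)  # product rule
              __radd__, __rmul__ = __add__, __mul__

          def jacobian(F, x):
              n = len(x)
              seeds = [Dual(x[j], np.eye(n)[j]) for j in range(n)]   # seed dx_j/dx_j = 1
              return np.array([row.grad for row in F(seeds)])

          # F(x, y) = (x*y, x - y)  =>  Jacobian [[y, x], [1, -1]]
          F = lambda v: [v[0] * v[1], v[0] - v[1]]
          print(jacobian(F, [2.0, 3.0]))
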
    • A Subspace, Interior, and Conjugate Gradient Method for Large-scale Bound-constrained Minimization Problems 

      Branch, Mary Ann; Coleman, Thomas F.; Li, Yuying (Cornell University, 1995-07)
      A subspace adaptation of the Coleman-Li trust region and interior method is proposed for solving large-scale bound-constrained minimization problems. This method can be implemented with either sparse Cholesky factorization ...
    • A Subspace, Interior, and Conjugate Gradient Method for Large-Scale Bound-Constrained Minimization Problems

      Branch, Mary Ann; Coleman, Thomas F.; Li, Yuying (Cornell University, 1995-07)
      A subspace adaptation of the Coleman-Li trust region and interior method is proposed for solving large-scale bound-constrained minimization problems. This method can be implemented with either sparse Cholesky factorization ...
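
      The problem class in both entries is large-scale minimization of a smooth $f(x)$ subject to $l \le x \le u$. The fragment below only demonstrates that setup with a generic bound-constrained truncated-Newton solver (SciPy's "TNC") on an assumed test function; the subspace/interior/conjugate-gradient strategy of the reports is not what runs underneath.

          import numpy as np
          from scipy.optimize import minimize

          # chained Rosenbrock function with simple bounds l <= x <= u on every variable
          def f(x):
              return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

          def grad(x):
              g = np.zeros_like(x)
              g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
              g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
              return g

          n = 50
          bounds = [(-2.0, 0.8)] * n       # upper bound 0.8 makes the bounds active
          res = minimize(f, np.full(n, -1.0), jac=grad, method="TNC", bounds=bounds)
          print(res.x[:5], f(res.x))
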