Typical optimization problems aim to select a single solution of maximum or minimum value from a large space of feasible solutions. For many such problems, feasible solutions are subsets of a global set of elements that satisfy certain constraints and achieve certain goals. Recent trends in the optimization literature show a shift from classic optimization, which deals with static problems, to dynamic paradigms such as online optimization. We introduce a general framework for dynamic optimization, called incremental optimization, in which problem constraints develop monotonically in discrete time steps. Solutions to such problems are sequences of feasible solutions, one for each time step, such that each later solution builds incrementally on the earlier ones. Our abstract model for dynamic optimization can be applied with minimal effort to any static packing or covering problem. In this dissertation we give a formal description of this incremental model and three relevant evaluation metrics. We also demonstrate how this model can be used to incrementalize a variety of static optimization problems. We present results for incremental versions of several canonical optimization problems, including maximum flow, minimum cut, and maximum-weight matching, and we give general-purpose approximation schemes for incremental packing and covering problems. Our findings reveal that incremental considerations expose a new and diverse level of complexity in conventional optimization problems.