klampt.math.optimize module

Classes to help set up and solve nonlinear, constrained optimization problems.

Supports local and global optimization. Wraps around scipy, pyOpt, or DIRECT (for now).

Works well with the klampt.math.symbolic module.

Classes:

OptimizationProblem()

A holder for optimization problem data.

LocalOptimizer([method])

A wrapper around different local optimization libraries.

GlobalOptimizer([method])

A wrapper around different global optimization libraries.

OptimizerParams([numIters, tol, ...])

OptimizationObjective(expr, type[, weight])

Describes an optimization cost function or constraint.

OptimizationProblemBuilder([context])

Defines a generalized optimization problem that can be saved/loaded from a JSON string.

Functions:

sample_range(a, b)

Samples x in the range [a,b].

class klampt.math.optimize.OptimizationProblem[source]

Bases: object

A holder for optimization problem data. All attributes are optional, and some solvers can’t handle certain types of constraints and costs.

The objective function must return a float. All equality and inequality functions are required to return a list of floats.

objective

an objective function f(x)

Type:

function

objectiveGrad

a function df/dx(x) giving the gradient of f.

Type:

function

bounds

a pair (l,u) giving lower and upper bounds on the search space.

Type:

tuple

equalities

functions \(g(x)=0\) required of a feasible solution. In practice, \(|g(x)| \leq tol\) is required, where tol is a tolerance parameter for the solver.

Type:

list of functions

equalityGrads

gradient/Jacobian functions \(\frac{\partial g}{\partial x}(x)\) of the equality functions.

Type:

list of functions

inequalities

inequality functions requiring \(h(x) \leq 0\) for a feasible solution.

Type:

list of functions

inequalityGrads

a list of gradient/Jacobian functions \(\frac{\partial h}{\partial x}(x)\) of each inequality function.

Type:

list of functions

feasibilityTests

boolean black-box predicates that must be true of the solution

Type:

list of functions

Suitable for use with the symbolic module. Once a Context is created and the appropriate Variables, Functions, and Expressions are declared, the setSymbolicObjective() and addSymbolicConstraint() methods automatically convert symbolic expressions into the standard Python function forms, i.e., via context.makeFlatFunction(f,varorder), where varorder=None uses the default variable ordering.

The OptimizationProblemBuilder class is more closely tied with the symbolic module and is more convenient to use. It performs automatic simplification and differentiation, and can be saved / loaded to disk.
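For example, a problem can be assembled directly from plain Python functions and solved with a LocalOptimizer. This sketch uses only the documented API; the specific objective, constraint, and seed are illustrative:

from klampt.math.optimize import OptimizationProblem, LocalOptimizer

problem = OptimizationProblem()
problem.setObjective(lambda x: (x[0]-1.0)**2 + x[1]**2)   # must return a float
problem.addInequality(lambda x: [x[0]+x[1]-1.0])          # h(x) <= 0, returns a list of floats
problem.setBounds([-5.0,-5.0],[5.0,5.0])

opt = LocalOptimizer(method='auto')
opt.setSeed([0.0,0.0])
success,result = opt.solve(problem,numIters=100,tol=1e-6)
if success:
    print("Solution:",result)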

Methods:

setObjective(func[, funcGrad])

addEquality(func[, funcGrad])

addInequality(func[, funcGrad])

setBounds(xmin, xmax)

setFeasibilityTest(test)

addFeasibilityTest(test)

setSymbolicObjective(func, context[, varorder])

Sets an objective function from a symbolic Function or Expression (see symbolic module).

addSymbolicConstraint(func, context[, ...])

Adds a constraint from a symbolic Function or symbolic.Expression (see symbolic module).

objectiveValue(x)

Returns the objective function value f(x).

feasible(x[, equalityTol])

Returns true if x is a feasible point.

equalityResidual(x)

Returns the stacked vector g(x) where g(x)=0 is the equality constraint.

inequalityResidual(x)

Returns the stacked vector h(x) where h(x)<=0 is the inequality constraint.

makeUnconstrained(objective_scale[, keep_bounds])

If this problem is constrained, returns a new problem in which the objective function is a scoring function that sums all of the equality / inequality errors at x plus objective_scale times the original objective f(x).

setObjective(func, funcGrad=None)[source]
addEquality(func, funcGrad=None)[source]
addInequality(func, funcGrad=None)[source]
setBounds(xmin, xmax)[source]
setFeasibilityTest(test)[source]
addFeasibilityTest(test)[source]
setSymbolicObjective(func, context, varorder=None)[source]

Sets an objective function from a symbolic Function or Expression (see symbolic module).

Note

The optimization parameters will be a flattened version of each Variable appearing in func.

addSymbolicConstraint(func, context, varorder=None, blackbox=False)[source]

Adds a constraint from a symbolic Function or symbolic.Expression (see symbolic module). This will be “smart” in that ‘and’ Expressions (conjunctions) will be converted to multiple constraints, inequalities will be converted to inequality constraints, and bounds will be converted to bound constraints. All other constraints will be treated as black-box feasibility constraints.

objectiveValue(x)[source]

Returns the objective function value f(x).

feasible(x, equalityTol=1e-06)[source]

Returns true if x is a feasible point.

equalityResidual(x)[source]

Returns the stacked vector g(x) where g(x)=0 is the equality constraint.

inequalityResidual(x)[source]

Returns the stacked vector h(x) where h(x)<=0 is the inequality constraint.

makeUnconstrained(objective_scale, keep_bounds=True)[source]

If this problem is constrained, returns a new problem in which the objective function is a scoring function that sums all of the equality / inequality errors at x plus objective_scale times the original objective f(x). If objective_scale is small, the scoring function is approximately minimized at a feasible minimum.

If the problem is unconstrained, this just returns self.

If keep_bounds = True, the bound constraints are kept as bounds rather than being added to the inequality errors.
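Continuing the example above (the scale value here is illustrative):

penalty = problem.makeUnconstrained(objective_scale=0.01)   # keep_bounds=True by default
score = penalty.objectiveValue([0.0,0.0])                   # constraint errors + 0.01*f(x)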

class klampt.math.optimize.LocalOptimizer(method='auto')[source]

Bases: object

A wrapper around different local optimization libraries. Only minimization is supported; the available backends are scipy and pyOpt.

The method is specified using the method string, which can be:

  • ‘auto’: picks between scipy and pyOpt, whatever is available.

  • ‘scipy’: uses scipy.optimize.minimize with default settings.

  • ‘scipy.[METHOD]’: uses scipy.optimize.minimize with the argument method=[METHOD].

  • ‘pyOpt’: uses pyOpt with SLSQP.

  • ‘pyOpt.[METHOD]’: uses pyOpt with the given method.

Methods:

methodsAvailable()

Returns a list of methods that are available on this system

methodsAppropriate(problem)

Returns a list of available methods that are appropriate to use for the given problem

setSeed(x)

solve(problem[, numIters, tol])

Returns a tuple (success,result)

static methodsAvailable()[source]

Returns a list of methods that are available on this system

static methodsAppropriate(problem)[source]

Returns a list of available methods that are appropriate to use for the given problem

setSeed(x)[source]
solve(problem, numIters=100, tol=1e-06)[source]

Returns a tuple (success,result)
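For example, to inspect the installed backends before committing to a method (here problem is an OptimizationProblem instance, e.g. from the example above, and the printed values are illustrative):

from klampt.math.optimize import LocalOptimizer

print(LocalOptimizer.methodsAvailable())           # e.g. ['scipy','scipy.SLSQP',...]
print(LocalOptimizer.methodsAppropriate(problem))  # filtered for this problem's constraints
opt = LocalOptimizer(method='scipy.SLSQP')         # method string as described above
success,result = opt.solve(problem)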

klampt.math.optimize.sample_range(a, b)[source]

Samples x in the range [a,b].

  • If the range is bounded, the uniform distribution x~U(a,b) is used.

  • If the range is unbounded, then this uses the log transform to sample a distribution.

Specifically, if a=-inf and b is finite, then \(x \sim b + \log(y)\) where \(y \sim U(0,1)\). Symmetrically, if a is finite and \(b=\infty\), then \(x \sim a - \log(y)\).

If a=-inf and b=inf, then \(x \sim s*\log(y)\), where \(y \sim U(0,1)\) and the sign s takes on either of {-1,1} each with probability 0.5.
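The scheme above can be written directly; a minimal sketch (not the library's implementation):

import math, random

def sample_range_sketch(a,b):
    if not math.isinf(a) and not math.isinf(b):
        return random.uniform(a,b)            # bounded: x ~ U(a,b)
    y = 1.0 - random.random()                 # y in (0,1], avoids log(0)
    if math.isinf(a) and not math.isinf(b):
        return b + math.log(y)                # x <= b, since log(y) <= 0
    if not math.isinf(a) and math.isinf(b):
        return a - math.log(y)                # x >= a
    s = 1.0 if random.random() < 0.5 else -1.0
    return s*math.log(y)                      # unbounded on both sides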

class klampt.math.optimize.GlobalOptimizer(method='auto')[source]

Bases: object

A wrapper around different global optimization libraries. Only minimization is supported; the available backends are DIRECT, scipy, and pyOpt.

The optimization technique is specified using the method string, which can be:

  • ‘auto’: picks between DIRECT and random-restart

  • ‘random-restart.METHOD’: random restarts using the local optimizer METHOD.

  • ‘DIRECT’: the DIRECT global optimizer

  • ‘scipy’: uses scipy.optimize.minimize with default settings.

  • ‘scipy.METHOD’: uses scipy.optimize.minimize with the argument method=METHOD.

  • ‘pyOpt’: uses pyOpt with SLSQP.

  • ‘pyOpt.METHOD’: uses pyOpt with the given method.

The method attribute can also be a list, which specifies a cascading sequence of solvers in which each solver's solution point is used as the seed for the next.

Examples:

  • ‘DIRECT’: Run the DIRECT method

  • ‘scipy.differential_evolution’: Runs the scipy differential evolution technique

  • ‘random-restart.scipy’: Runs random restarts using scipy’s default local optimizer

  • ‘random-restart.pyOpt.SLSQP’: Runs random restarts using pyOpt as a local optimizer

  • [‘DIRECT’, ‘auto’]: Run the DIRECT method, then clean it up with the default local optimizer

Random restart picks each component x of the seed state randomly using sample_range(a,b), where [a,b] is the range of x given by problem.bounds.

DIRECT and scipy.differential_evolution require a bounded state space.
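For example (the problem instance is assumed to have bounds set, as DIRECT requires):

from klampt.math.optimize import GlobalOptimizer

opt = GlobalOptimizer(method='random-restart.scipy')
solved,x = opt.solve(problem,numIters=100,tol=1e-6)

# cascade: coarse global search with DIRECT, then a local clean-up pass
cascade = GlobalOptimizer(method=['DIRECT','auto'])
solved,x = cascade.solve(problem)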

Methods:

setSeed(x)

solve(problem[, numIters, tol])

Returns a pair (solved,x) where solved is True if the solver found a valid solution, and x is the solution vector.

setSeed(x)[source]
solve(problem, numIters=100, tol=1e-06)[source]

Returns a pair (solved,x) where solved is True if the solver found a valid solution, and x is the solution vector.

class klampt.math.optimize.OptimizerParams(numIters=50, tol=0.001, startRandom=False, numRestarts=1, timeout=10, globalMethod=None, localMethod=None)[source]

Bases: object

Methods:

toJson()

fromJson(obj)

solve(optProblem[, seed])

Globally or locally solves an OptimizationProblem instance with the given parameters.

toJson()[source]
fromJson(obj)[source]
solve(optProblem, seed=None)[source]

Globally or locally solves an OptimizationProblem instance with the given parameters. Optionally takes a seed as well.

Basically, this is a thin wrapper around GlobalOptimizer that converts the OptimizerParams to the appropriate format.

Returns:

(success,x) where success is True or False and x is the solution.

Return type:

tuple
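For example (the parameter values here are illustrative):

from klampt.math.optimize import OptimizerParams

params = OptimizerParams(numIters=200,tol=1e-4,numRestarts=5,localMethod='scipy')
success,x = params.solve(problem,seed=[0.0,0.0])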

class klampt.math.optimize.OptimizationObjective(expr, type, weight=None)[source]

Bases: object

Describes an optimization cost function or constraint.

expr

the expression f(x) defining the cost or constraint

Type:

symbolic.Expression

type

string describing what the objective does:

  • ‘cost’: added to the cost. Must be scalar.

  • ‘eq’: an equality f(x)=0 that must be met exactly (up to a given equality tolerance)

  • ‘ineq’: an inequality constraint f(x)<=0

  • ‘feas’: a black-box boolean feasibility test f(x) = True

Type:

str

soft

if True, this objective is penalized as part of the cost function. Specifically, \(w \|f(x)\|^2\) is the penalty for ‘eq’ types, and \(w I[f(x)\neq \text{True}]\) for ‘feas’ types.

Type:

bool

weight

a weight, used only for cost or soft objectives

Type:

float, optional

name

a name for this objective.

Type:

str, optional
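For example, assuming f_expr, g_expr, and h_expr are symbolic.Expression objects over the optimization variables (the names are illustrative):

from klampt.math.optimize import OptimizationObjective

cost = OptimizationObjective(f_expr,'cost',weight=1.0)  # scalar term added to the cost
eq   = OptimizationObjective(g_expr,'eq')               # hard equality g(x)=0
ineq = OptimizationObjective(h_expr,'ineq')             # h(x) <= 0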

class klampt.math.optimize.OptimizationProblemBuilder(context=None)[source]

Bases: object

Defines a generalized optimization problem that can be saved/loaded from a JSON string. Allows custom lists of objectives, feasibility tests, and cost functions. Multiple variables can be optimized at once.

context

a context that stores the optimization variables and any user data.

Type:

symbolic.Context

objectives

all objectives or constraints used in the optimization.

Type:

list of OptimizationObjective

optimizationVariables

A list of Variables used for optimization. If not set, this will try to find the variable ‘x’. If not found, this will use all unbound variables in the objectives.

Type:

list of Variable

Note that objectives must be created from symbolic.Function objects, so that they are savable/loadable. See the documentation of the symbolic module for more detail.
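A minimal usage sketch, assuming the symbolic module's Context.addVar API and operator overloading on Variables (see the symbolic module documentation; the variable and expressions here are illustrative):

from klampt.math import symbolic
from klampt.math.optimize import OptimizationProblemBuilder

ctx = symbolic.Context()
x = ctx.addVar('x','V',2)                # a 2-D optimization variable

builder = OptimizationProblemBuilder(ctx)
builder.addCost(x[0]**2 + x[1]**2)
builder.addInequality(x[0] + x[1] - 1)   # x[0]+x[1] <= 1
builder.setBounds(x,[-5,-5],[5,5])
builder.solve()                          # result is stored in the bound variables
print(builder.getVarVector())            # flattened solution vector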

Methods:

addEquality(f[, weight])

If f is a symbolic.Function it's a function f(x) that evaluates to 0 for a feasible solution.

addInequality(f[, weight])

Adds an inequality f(x) <= 0.

addCost(f[, weight])

Adds a cost function f(x).

addFeasibilityTest(f[, weight])

Adds an additional feasibility test.

setBounds(var[, xmin, xmax])

Bounds the optimization variable var

bind(**kwargs)

Binds the variables specified by the keyword arguments

unbind(**kwargs)

Unbinds the variables specified by the keyword arguments

bindVars(*args)

unbindVars()

getVarValues()

Saves the bindings for optimization variables in the current context into a list.

setVarValues(s)

Converts a state into bindings for the optimization variables in the current context.

getVarVector()

Flattens the bindings for optimization variables in the current context into a vector x.

setVarVector(x)

Turns a vector x into bindings for the optimization variables in the current context.

randomVarBinding()

Samples values for all optimization variables, sampling uniformly according to their bounds

cost()

Evaluates the cost function with the variables already bound.

equalityResidual([soft])

Evaluates the equality + IK functions at the currently bound state x, stacking the results into a single vector.

satisfiesEqualities([tol])

Returns True if every entry of the (hard) equality + IK residual equals 0 (to the tolerance tol).

inequalityResidual([soft])

Evaluates the inequality functions at the currently bound state x, stacking the results into a single vector.

satisfiesInequalities([margin])

Returns True if, for the currently bound state x, every entry of the (hard) inequality residuals is <= -margin (default 0).

feasibilityTestsPass([soft])

Returns True if the currently bound state passes all black-box feasibility tests.

inBounds()

Returns True if all bounded variables are within their ranges at the currently bound state x

isFeasible([eqTol])

Returns True if the currently bound state passes all equality, inequality, joint limit, and black-box feasibility tests.

costSymbolic()

Returns a symbolic.Expression, over variables in self.context, that evaluates to the cost

equalityResidualSymbolic([soft])

Returns a symbolic.Expression, over variables in self.context, that evaluates to the equality residual

inequalityResidualSymbolic([soft])

Returns a symbolic.Expression, over variables in self.context, that evaluates to the inequality residual

equalitySatisfiedSymbolic([tol, soft])

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the equality constraint is met with tolerance tol

inequalitySatisfiedSymbolic([soft])

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the inequality constraint is met

feasibilityTestsPassSymbolic([soft])

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the black-box feasibility constraints are met

inBoundsSymbolic()

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the configuration meets bound constraints

isFeasibleSymbolic([eqTol])

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the currently bound state meets all feasibility tests

score([eqWeight, ineqWeight, infeasWeight])

Returns an error score that is equal to the optimum at a feasible solution.

pprint([indent])

toJson([saveContextFunctions, prettyPrintExprs])

Returns a JSON object representing this optimization problem.

fromJson(object[, context])

Sets this optimization problem from a JSON object representing it.

preprocess([steps])

Preprocesses the problem to make solving more efficient

getBounds()

Returns optimization variable bounds as a list of (xmin,xmax) pairs.

getProblem()

Returns an OptimizationProblem instance over the optimization variables.

solve([params, preprocess, cache])

Solves the optimization problem.

addEquality(f, weight=None)[source]

If f is a symbolic.Function, it’s a function f(x) that evaluates to 0 for a feasible solution. If it is a symbolic.Expression, it’s an expression over the optimization variables.

If weight = None, this is a hard equality constraint. Otherwise, the term weight*||f(x)||^2 is added to the objective.
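For example, with g_expr an illustrative symbolic expression over the optimization variables:

builder.addEquality(g_expr)              # hard constraint: g(x) = 0
builder.addEquality(g_expr,weight=10.0)  # soft: adds 10*||g(x)||^2 to the objective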

addInequality(f, weight=None)[source]

Adds an inequality f(x) <= 0.

addCost(f, weight=1)[source]

Adds a cost function f(x).

addFeasibilityTest(f, weight=None)[source]

Adds an additional feasibility test.

setBounds(var, xmin=None, xmax=None)[source]

Bounds the optimization variable var

bind(**kwargs)[source]

Binds the variables specified by the keyword arguments

unbind(**kwargs)[source]

Unbinds the variables specified by the keyword arguments

bindVars(*args)[source]
unbindVars()[source]
getVarValues()[source]

Saves the bindings for optimization variables in the current context into a list.

setVarValues(s)[source]

Converts a state into bindings for the optimization variables in the current context.

getVarVector()[source]

Flattens the bindings for optimization variables in the current context into a vector x.

setVarVector(x)[source]

Turns a vector x into bindings for the optimization variables in the current context.

randomVarBinding()[source]

Samples values for all optimization variables, sampling uniformly according to their bounds

cost()[source]

Evaluates the cost function with the variables already bound.

equalityResidual(soft=True)[source]

Evaluates the equality + IK functions at the currently bound state x, stacking the results into a single vector. The residual should equal 0 (to a small tolerance) at a feasible solution.

If soft=True, also stacks the soft equalities.

satisfiesEqualities(tol=0.001)[source]

Returns True if every entry of the (hard) equality + IK residual equals 0 (to the tolerance tol).

inequalityResidual(soft=False)[source]

Evaluates the inequality functions at the currently bound state x, stacking the results into a single vector. The residual should be <= 0 at a feasible solution.

If soft=True then this includes the soft inequality residuals.

satisfiesInequalities(margin=0)[source]

Returns True if, for the currently bound state x, every entry of the (hard) inequality residuals is <= -margin (default 0).

feasibilityTestsPass(soft=False)[source]

Returns True if the currently bound state passes all black-box feasibility tests.

inBounds()[source]

Returns True if all bounded variables are within their ranges at the currently bound state x

isFeasible(eqTol=0.001)[source]

Returns True if the currently bound state passes all equality, inequality, joint limit, and black-box feasibility tests. Equality and IK constraints must be met to within the equality tolerance eqTol.

costSymbolic()[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to the cost

equalityResidualSymbolic(soft=False)[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to the equality residual

inequalityResidualSymbolic(soft=False)[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to the inequality residual

equalitySatisfiedSymbolic(tol=0.001, soft=False)[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the equality constraint is met with tolerance tol

inequalitySatisfiedSymbolic(soft=False)[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the inequality constraint is met

feasibilityTestsPassSymbolic(soft=False)[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the black-box feasibility constraints are met

inBoundsSymbolic()[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the configuration meets bound constraints

isFeasibleSymbolic(eqTol=0.001)[source]

Returns a symbolic.Expression, over variables in self.context, that evaluates to True if the currently bound state meets all feasibility tests

score(eqWeight=1.0, ineqWeight=1.0, infeasWeight=1.0)[source]

Returns an error score that is equal to the optimum at a feasible solution. Evaluated at the currently bound state x.

pprint(indent=0)[source]
toJson(saveContextFunctions=False, prettyPrintExprs=False)[source]

Returns a JSON object representing this optimization problem.

Parameters:
  • saveContextFunctions (bool, optional) – if True, saves all custom functions in self.context. If they are saved, then the current context is required to be the same context in which the problem is loaded.

  • prettyPrintExprs (bool, optional) – if True, prints expressions more nicely as more human-readable strings rather than JSON objects. These strings are parsed on load, which is a little slower than pure JSON.

fromJson(object, context=None)[source]

Sets this optimization problem from a JSON object representing it. A ValueError is raised if the object is not of the correct type.

preprocess(steps='all')[source]

Preprocesses the problem to make solving more efficient

Returns:

(opt,optToSelf,selfToOpt) giving:

  • opt: a simplified version of this optimization problem. If no simplification can be performed, opt = self

  • optToSelf: a map of opt’s variables to self’s variables. If no simplification can be performed, optToSelf = None

  • selfToOpt: a map of self’s variables to opt’s variables. If no simplification can be performed, selfToOpt = None

Return type:

tuple

Specific steps include:

  1. delete any objectives with 0 weight

  2. delete any optimization variables not appearing in expressions

  3. fixed-bound (x in [a,b], with a=b) variables are replaced with fixed values.

  4. simplify objectives

  5. TODO: replace equalities of the form var = expr by matching var to expr?

If optToSelf is not None, then it is a list of Expressions that, when eval’ed, produce the values of the corresponding optimizationVariables in the original optimization problem. selfToOpt performs the converse mapping. In other words, if opt has bound values to all of its optimizationVariables, the code:

for var,expr in zip(self.optimizationVariables,optToSelf):
    var.bind(expr.eval(opt.context))

binds all optimization variables in self appropriately.
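By symmetry, selfToOpt can seed the simplified problem from values currently bound in the original; a sketch (solving steps elided):

opt,optToSelf,selfToOpt = builder.preprocess()
if selfToOpt is not None:
    # seed opt's variables from values bound in the original problem's context
    for var,expr in zip(opt.optimizationVariables,selfToOpt):
        var.bind(expr.eval(builder.context))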

getBounds()[source]

Returns optimization variable bounds as a list of (xmin,xmax) pairs. None is returned if the problem is unconstrained.

getProblem()[source]

Returns an OptimizationProblem instance over the optimization variables.

solve(params=<klampt.math.optimize.OptimizerParams object>, preprocess=True, cache=False)[source]

Solves the optimization problem. The result is stored in the bound optimizationVariables.

If you will be solving the problem several times without modification (except for user data and initial values of optimizationVariables), you may set cache=True to eliminate some overhead. Note that caching does not work properly if you change constraints or non-optimization variables.
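A sketch of repeated solving with caching enabled (the seeds list is illustrative):

params = OptimizerParams(numIters=100,localMethod='scipy')
for seed in seeds:                     # illustrative list of start points
    builder.setVarVector(seed)         # set initial values of the optimization variables
    builder.solve(params,cache=True)   # cached setup is reused after the first call
    print(builder.getVarVector())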