secml.optim.optimizers

COptimizer

class secml.optim.optimizers.c_optimizer.COptimizer(fun, constr=None, bounds=None)[source]

Bases: secml.core.c_creator.CCreator

Interface for optimizers.

Implements:

minimize f(x)
s.t. g_i(x) <= 0, i = 1, ..., m  (inequality constraints)
     h_j(x) = 0,  j = 1, ..., n  (equality constraints)

Parameters
fun : CFunction

The objective function to be optimized, along with 1st-order (Jacobian) and 2nd-order (Hessian) derivatives (if available).

constr : CConstraintL1 or CConstraintL2 or None, optional

A distance constraint. Default None.

bounds : CConstraintBox or None, optional

A box constraint. Default None.

Attributes
bounds

Optimization bounds.

class_type

Defines class type.

constr

Optimization constraint.

f

The objective function.

f_eval
f_opt
f_seq
grad_eval
logger

Logger for current object.

n_dim
verbose

Verbosity level of logger output.

x_opt
x_seq

Methods

copy(self)

Returns a shallow copy of current class.

create([class_item])

This method creates an instance of a class with given type.

deepcopy(self)

Returns a deep copy of current class.

get_class_from_type(class_type)

Return the class associated with input type.

get_params(self)

Returns the dictionary of class hyperparameters.

get_state(self, **kwargs)

Returns the object state dictionary.

get_subclasses()

Get all the subclasses of the calling class.

list_class_types()

This method lists all types of available subclasses of calling one.

load(path)

Loads object from file.

load_state(self, path)

Sets the object state from file.

maximize(self, x_init[, args])

Interface for maximizers.

minimize(self, x_init[, args])

Interface for minimizers.

save(self, path)

Save class object to file.

save_state(self, path, **kwargs)

Store the object state to file.

set(self, param_name, param_value[, copy])

Set a parameter of the class.

set_params(self, params_dict[, copy])

Set all parameters passed as a dictionary {key: value}.

set_state(self, state_dict[, copy])

Sets the object state using input dictionary.

timed([msg])

Timer decorator.

property bounds

Optimization bounds.

property constr

Optimization constraint.

property f

The objective function.

property f_eval
property f_opt
property f_seq
property grad_eval
maximize(self, x_init, args=(), **kwargs)[source]

Interface for maximizers.

Implementing:

max fun(x) s.t. constraint

This is implemented by inverting the sign of fun and gradient and running the COptimizer.minimize().

Parameters
x_init : CArray

The initial input point.

args : tuple, optional

Extra arguments passed to the objective function and its gradient.

kwargs

Additional parameters of the minimization method.
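The sign-inversion trick described above can be sketched in plain Python, independently of the secml API. The helper names below are illustrative stand-ins, not part of the library:

```python
def minimize(fun, grad, x0, eta=0.1, max_iter=100):
    # Bare-bones unconstrained gradient descent on a scalar variable,
    # standing in for COptimizer.minimize() in this sketch.
    x = x0
    for _ in range(max_iter):
        x = x - eta * grad(x)
    return x

def maximize(fun, grad, x0, **kwargs):
    # Maximize fun by minimizing -fun with the negated gradient,
    # mirroring how COptimizer.maximize() delegates to minimize().
    return minimize(lambda x: -fun(x), lambda x: -grad(x), x0, **kwargs)

# Maximize f(x) = -(x - 3)^2, whose maximum is at x = 3.
f = lambda x: -(x - 3.0) ** 2
g = lambda x: -2.0 * (x - 3.0)
x_best = maximize(f, g, x0=0.0)  # converges toward 3.0
```

Any keyword accepted by the underlying minimizer (here, eta and max_iter) passes through unchanged, which is exactly the role of kwargs in the real interface.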

abstract minimize(self, x_init, args=(), **kwargs)[source]

Interface for minimizers.

Implementing:

min fun(x) s.t. constraint

Parameters
x_init : CArray

The initial input point.

args : tuple, optional

Extra arguments passed to the objective function and its gradient.

kwargs

Additional parameters of the minimization method.

property n_dim
property x_opt
property x_seq

COptimizerPGD

class secml.optim.optimizers.c_optimizer_pgd.COptimizerPGD(fun, constr=None, bounds=None, eta=0.001, eps=0.0001, max_iter=200)[source]

Bases: secml.optim.optimizers.c_optimizer.COptimizer

Solves the following problem:

min f(x)
s.t. d(x, x0) <= dmax
     x_lb <= x <= x_ub

f(x) is the objective function (either linear or nonlinear), d(x,x0) <= dmax is a distance constraint in feature space (l1 or l2), and x_lb <= x <= x_ub is a box constraint on x.

The solution algorithm is based on the classic gradient descent algorithm.

Parameters
fun : CFunction

The objective function to be optimized, along with 1st-order (Jacobian) and 2nd-order (Hessian) derivatives (if available).

constr : CConstraintL1 or CConstraintL2 or None, optional

A distance constraint. Default None.

bounds : CConstraintBox or None, optional

A box constraint. Default None.

eta : scalar, optional

Step of the Projected Gradient Descent. Default 1e-3.

eps : scalar, optional

Tolerance of the stop criterion. Default 1e-4.

max_iter : int, optional

Maximum number of iterations. Default 200.

Attributes
class_type : 'pgd'

Defines class type.

Methods

copy(self)

Returns a shallow copy of current class.

create([class_item])

This method creates an instance of a class with given type.

deepcopy(self)

Returns a deep copy of current class.

get_class_from_type(class_type)

Return the class associated with input type.

get_params(self)

Returns the dictionary of class hyperparameters.

get_state(self, **kwargs)

Returns the object state dictionary.

get_subclasses()

Get all the subclasses of the calling class.

list_class_types()

This method lists all types of available subclasses of calling one.

load(path)

Loads object from file.

load_state(self, path)

Sets the object state from file.

maximize(self, x_init[, args])

Interface for maximizers.

minimize(self, x_init[, args])

Interface to minimizers.

save(self, path)

Save class object to file.

save_state(self, path, **kwargs)

Store the object state to file.

set(self, param_name, param_value[, copy])

Set a parameter of the class.

set_params(self, params_dict[, copy])

Set all parameters passed as a dictionary {key: value}.

set_state(self, state_dict[, copy])

Sets the object state using input dictionary.

timed([msg])

Timer decorator.

property eps

Return the tolerance value for the stop criterion.

property eta

Return the gradient descent step.

property max_iter

Return the maximum number of gradient descent iterations.

minimize(self, x_init, args=(), **kwargs)[source]

Interface to minimizers.

Implements:

min fun(x) s.t. constraint

Parameters
x_init : CArray

The initial input point.

args : tuple, optional

Extra arguments passed to the objective function and its gradient.

Returns
f_seq : CArray

Array containing values of f during optimization.

x_seq : CArray

Array containing values of x during optimization.
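To make the projection scheme concrete, here is a rough pure-Python sketch of the same idea: a gradient step followed by projection onto an l2 ball and a box. This illustrates the algorithm only and is not the secml implementation, which operates on CArray objects and also supports l1 constraints:

```python
def pgd(grad, x0, center, dmax, lb, ub, eta=0.001, eps=1e-4, max_iter=200):
    # Projected gradient descent sketch: gradient step, then projection onto
    # the l2 ball ||x - center|| <= dmax and the box lb <= x_i <= ub.
    x = list(x0)
    for _ in range(max_iter):
        x_new = [xi - eta * gi for xi, gi in zip(x, grad(x))]
        # Project onto the l2 distance constraint d(x, center) <= dmax.
        diff = [xi - ci for xi, ci in zip(x_new, center)]
        norm = sum(d * d for d in diff) ** 0.5
        if norm > dmax:
            x_new = [ci + di * dmax / norm for ci, di in zip(center, diff)]
        # Project onto the box constraint.
        x_new = [min(max(xi, lb), ub) for xi in x_new]
        # Stop when the update falls below the tolerance eps.
        if sum((a - b) ** 2 for a, b in zip(x_new, x)) ** 0.5 < eps:
            return x_new
        x = x_new
    return x

# Minimize f(x) = ||x - (2, 2)||^2 subject to ||x|| <= 1 and 0 <= x <= 1.
grad = lambda x: [2.0 * (xi - 2.0) for xi in x]
x_opt = pgd(grad, [0.0, 0.0], center=[0.0, 0.0], dmax=1.0,
            lb=0.0, ub=1.0, eta=0.05)
# The constrained optimum is x = (1/sqrt(2), 1/sqrt(2)).
```

The defaults for eta, eps and max_iter mirror those of COptimizerPGD; the unconstrained minimum (2, 2) lies outside the feasible set, so the iterate settles on the boundary of the unit l2 ball.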

COptimizerPGDLS

class secml.optim.optimizers.c_optimizer_pgd_ls.COptimizerPGDLS(fun, constr=None, bounds=None, eta=0.001, eta_min=None, eta_max=None, max_iter=1000, eps=0.0001)[source]

Bases: secml.optim.optimizers.c_optimizer.COptimizer

Solves the following problem:

min f(x)
s.t. d(x, x0) <= dmax
     x_lb <= x <= x_ub

f(x) is the objective function (either linear or nonlinear), d(x,x0) <= dmax is a distance constraint in feature space (l1 or l2), and x_lb <= x <= x_ub is a box constraint on x.

The solution algorithm is based on a line-search exploring one feature (i.e., dimension) at a time (for l1-constrained problems), or all features (for l2-constrained problems). This solver also works for discrete problems where x and the grid discretization (eta) are integer valued.

Differently from standard line searches, it explores a subset of n_dimensions at a time. In this sense, it is an extension of the classical line-search approach.

Parameters
fun : CFunction

The objective function to be optimized, along with 1st-order (Jacobian) and 2nd-order (Hessian) derivatives (if available).

constr : CConstraintL1 or CConstraintL2 or None, optional

A distance constraint. Default None.

bounds : CConstraintBox or None, optional

A box constraint. Default None.

eta : scalar, optional

Minimum resolution of the line-search grid. Default 1e-3.

eta_min : scalar or None, optional

Initial step of the line search. Gets multiplied or divided by 2 at each step until convergence. If None, will be set equal to eta. Default None.

eta_max : scalar or None, optional

Maximum step of the line search. Default None.

max_iter : int, optional

Maximum number of iterations. Default 1000.

eps : scalar, optional

Tolerance of the stop criterion. Default 1e-4.

Attributes
class_type : 'pgd-ls'

Defines class type.

Methods

copy(self)

Returns a shallow copy of current class.

create([class_item])

This method creates an instance of a class with given type.

deepcopy(self)

Returns a deep copy of current class.

get_class_from_type(class_type)

Return the class associated with input type.

get_params(self)

Returns the dictionary of class hyperparameters.

get_state(self, **kwargs)

Returns the object state dictionary.

get_subclasses()

Get all the subclasses of the calling class.

list_class_types()

This method lists all types of available subclasses of calling one.

load(path)

Loads object from file.

load_state(self, path)

Sets the object state from file.

maximize(self, x_init[, args])

Interface for maximizers.

minimize(self, x_init[, args])

Interface to minimizers.

save(self, path)

Save class object to file.

save_state(self, path, **kwargs)

Store the object state to file.

set(self, param_name, param_value[, copy])

Set a parameter of the class.

set_params(self, params_dict[, copy])

Set all parameters passed as a dictionary {key: value}.

set_state(self, state_dict[, copy])

Sets the object state using input dictionary.

timed([msg])

Timer decorator.

property eps

Return the tolerance value for the stop criterion.

property eta
property eta_max
property eta_min
property max_iter

Return the maximum number of descent iterations.

minimize(self, x_init, args=(), **kwargs)[source]

Interface to minimizers, implementing:

min fun(x) s.t. constraint

Parameters
x_init : CArray

The initial input point.

args : tuple, optional

Extra arguments passed to the objective function and its gradient.

Returns
f_seq : CArray

Array containing values of f during optimization.

x_seq : CArray

Array containing values of x during optimization.
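The doubling/halving rule described for eta_min can be illustrated with a minimal line-search sketch. The helper below is hypothetical and only captures the step-size growth; the real solver additionally handles constraints, per-feature exploration for l1 problems, and integer-valued grids:

```python
def exp_line_search(fun, x, direction, eta=1e-3, eta_max=1.0):
    # Line search along `direction`: starting from step eta, keep doubling
    # the step while the objective keeps decreasing (capped at eta_max),
    # then return the best step found.
    best_t, best_f = 0.0, fun(x)
    t = eta
    while t <= eta_max:
        f_t = fun([xi + t * di for xi, di in zip(x, direction)])
        if f_t >= best_f:
            break  # objective stopped improving; keep the best step so far
        best_t, best_f = t, f_t
        t *= 2.0
    return best_t

# Search along the negative gradient of f(x) = x^2 starting from x = 1;
# the gradient at 1 is 2, so the descent direction is -2.
f = lambda x: x[0] ** 2
t = exp_line_search(f, [1.0], direction=[-2.0])
x_new = 1.0 + t * -2.0  # lands close to the unconstrained minimum at 0
```

Growing the step geometrically lets the search cover several orders of magnitude of step sizes with only a handful of function evaluations, which is why eta acts as the minimum grid resolution rather than a fixed step.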

COptimizerPGDExp

class secml.optim.optimizers.c_optimizer_pgd_exp.COptimizerPGDExp(fun, constr=None, bounds=None, eta=0.001, eta_min=None, eta_max=None, max_iter=1000, eps=0.0001)[source]

Bases: secml.optim.optimizers.c_optimizer_pgd_ls.COptimizerPGDLS

Solves the following problem:

min f(x)
s.t. d(x, x0) <= dmax
     x_lb <= x <= x_ub

f(x) is the objective function (either linear or nonlinear), d(x,x0) <= dmax is a distance constraint in feature space (l1 or l2), and x_lb <= x <= x_ub is a box constraint on x.

The solution algorithm is based on a line-search exploring one feature (i.e., dimension) at a time (for l1-constrained problems), or all features (for l2-constrained problems). This solver also works for discrete problems where x and the grid discretization (eta) are integer valued.

Parameters
fun : CFunction

The objective function to be optimized, along with 1st-order (Jacobian) and 2nd-order (Hessian) derivatives (if available).

constr : CConstraintL1 or CConstraintL2 or None, optional

A distance constraint. Default None.

bounds : CConstraintBox or None, optional

A box constraint. Default None.

eta : scalar, optional

Minimum resolution of the line-search grid. Default 1e-3.

eta_min : scalar or None, optional

Initial step of the line search. Gets multiplied or divided by 2 at each step until convergence. If None, will be set equal to eta. Default None.

eta_max : scalar or None, optional

Maximum step of the line search. Default None.

max_iter : int, optional

Maximum number of iterations. Default 1000.

eps : scalar, optional

Tolerance of the stop criterion. Default 1e-4.

Attributes
class_type : 'pgd-exp'

Defines class type.

Methods

copy(self)

Returns a shallow copy of current class.

create([class_item])

This method creates an instance of a class with given type.

deepcopy(self)

Returns a deep copy of current class.

get_class_from_type(class_type)

Return the class associated with input type.

get_params(self)

Returns the dictionary of class hyperparameters.

get_state(self, **kwargs)

Returns the object state dictionary.

get_subclasses()

Get all the subclasses of the calling class.

list_class_types()

This method lists all types of available subclasses of calling one.

load(path)

Loads object from file.

load_state(self, path)

Sets the object state from file.

maximize(self, x_init[, args])

Interface for maximizers.

minimize(self, x_init[, args])

Interface to minimizers.

save(self, path)

Save class object to file.

save_state(self, path, **kwargs)

Store the object state to file.

set(self, param_name, param_value[, copy])

Set a parameter of the class.

set_params(self, params_dict[, copy])

Set all parameters passed as a dictionary {key: value}.

set_state(self, state_dict[, copy])

Sets the object state using input dictionary.

timed([msg])

Timer decorator.

minimize(self, x_init, args=(), **kwargs)[source]

Interface to minimizers, implementing:

min fun(x) s.t. constraint

Parameters
x_init : CArray

The initial input point.

args : tuple, optional

Extra arguments passed to the objective function and its gradient.

Returns
f_seq : CArray

Array containing values of f during optimization.

x_seq : CArray

Array containing values of x during optimization.

COptimizerScipy

class secml.optim.optimizers.c_optimizer_scipy.COptimizerScipy(fun, constr=None, bounds=None)[source]

Bases: secml.optim.optimizers.c_optimizer.COptimizer

Implements optimizers from scipy.

Attributes
class_type : 'scipy-opt'

Defines class type.

Methods

copy(self)

Returns a shallow copy of current class.

create([class_item])

This method creates an instance of a class with given type.

deepcopy(self)

Returns a deep copy of current class.

get_class_from_type(class_type)

Return the class associated with input type.

get_params(self)

Returns the dictionary of class hyperparameters.

get_state(self, **kwargs)

Returns the object state dictionary.

get_subclasses()

Get all the subclasses of the calling class.

list_class_types()

This method lists all types of available subclasses of calling one.

load(path)

Loads object from file.

load_state(self, path)

Sets the object state from file.

maximize(self, x_init[, args])

Interface for maximizers.

minimize(self, x_init[, args])

Minimize function.

save(self, path)

Save class object to file.

save_state(self, path, **kwargs)

Store the object state to file.

set(self, param_name, param_value[, copy])

Set a parameter of the class.

set_params(self, params_dict[, copy])

Set all parameters passed as a dictionary {key: value}.

set_state(self, state_dict[, copy])

Sets the object state using input dictionary.

timed([msg])

Timer decorator.

minimize(self, x_init, args=(), **kwargs)[source]

Minimize function.

Wrapper of scipy.optimize.minimize.

Parameters
x_init : CArray

Init point. Dense flat array of real elements of size 'n', where 'n' is the number of independent variables.

args : tuple, optional

Extra arguments passed to the objective function and its derivatives (fun, jac and hess functions).

The following can be passed as optional keyword arguments:

method : str or callable, optional

Type of solver. If not given, it is chosen to be one of 'BFGS' or 'L-BFGS-B', depending on whether the problem has constraints or bounds. See c_optimizer_scipy.SUPPORTED_METHODS for the full list of supported solvers.

jac : {'2-point', '3-point', 'cs', bool}, optional

Method for computing the gradient vector. The function in self.fun.gradient will be used (if defined). Alternatively, the keywords {'2-point', '3-point', 'cs'} select a finite-difference scheme for numerical estimation of the gradient. Options '3-point' and 'cs' are available only to 'trust-constr'. If jac is a boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated using '2-point' finite-difference estimation.

bounds : scipy.optimize.Bounds, optional

A bound constraint in scipy.optimize format. If defined, the bounds of COptimizerScipy will be ignored.

tol : float, optional

Tolerance for termination. For detailed control, use solver-specific options.

options : dict, optional

A dictionary of solver options. All methods accept the following generic options:

  • maxiter : int. Maximum number of iterations to perform.

  • disp : bool. Set to True to print convergence messages. Equivalent to setting COptimizerScipy.verbose = 2.

For method-specific options, see show_options.

Returns
x : CArray

The solution of the optimization.

Examples

>>> from secml.array import CArray
>>> from secml.optim.optimizers import COptimizerScipy
>>> from secml.optim.function import CFunctionRosenbrock
>>> x_init = CArray([1.3, 0.7])
>>> opt = COptimizerScipy(CFunctionRosenbrock())
>>> x_opt = opt.minimize(
... x_init, method='BFGS', options={'gtol': 1e-6, 'disp': True})
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 32
         Function evaluations: 39
         Gradient evaluations: 39
>>> print(x_opt)
CArray([1. 1.])
>>> print(opt.f_opt)
9.294383981640425e-19