secml.ml.classifiers.secure

CClassifierSecSVM

class secml.ml.classifiers.secure.c_classifier_sec_svm.CClassifierSecSVM(ub=inf, idx_ub=None, lb=-inf, idx_lb=None, eta=0.5, max_it=10000.0, eps=0.0001, *args, **kwargs)[source]

Bases: secml.ml.classifiers.sklearn.c_classifier_svm.CClassifierSVM

Secure Support Vector Machine (Sec-SVM) classifier.

This implements the secure classifier from:

Demontis et al. “Yes, machine learning can be more secure! a case study on android malware detection.” IEEE TDSC 2017. https://arxiv.org/abs/1704.08996

Parameters
ub : scalar or None, optional

Upper bound of the weights. If None (default), no bound is applied.

idx_ub : CArray or None, optional

If CArray, the upper bound is only applied to the weights indexed by idx_ub. If None (default), the bound is applied to all weights.

lb : scalar or None, optional

Lower bound of the weights. If None (default), no bound is applied.

idx_lb : CArray or None, optional

If CArray, the lower bound is only applied to the weights indexed by idx_lb. If None (default), the bound is applied to all weights.

eta : scalar, optional

Step size of the gradient descent. Default 0.5.

max_it : int, optional

Maximum number of iterations of the gradient descent. Default 1e4.

eps : scalar, optional

Tolerance of the stopping criterion of the gradient descent. Default 1e-4.

*args, **kwargs

Other parameters from CClassifierSVM.

Attributes
class_type : ‘sec-svm’

Defines class type.
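A minimal usage sketch (an illustration, not part of the library docs): it assumes a two-class CDataset ds with features ds.X and labels ds.Y, and the bound values shown are purely illustrative.

from secml.ml.classifiers.secure import CClassifierSecSVM

# Bound each weight to [-0.5, 0.5] so that no single feature can
# dominate the decision function (the core Sec-SVM idea).
clf = CClassifierSecSVM(lb=-0.5, ub=0.5, eta=0.5, max_it=10000, eps=1e-4)
clf.fit(ds.X, ds.Y)  # ds: an assumed two-class CDataset
y_pred, scores = clf.predict(ds.X, return_decision_function=True)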

Methods

C_hinge_loss(self, x, y)

Compute the loss term for each point in dataset multiplied by C.

backward(self[, w])

Returns the preprocessor gradient wrt data.

copy(self)

Returns a shallow copy of current class.

create([class_item])

This method creates an instance of a class with given type.

create_chain(class_items, kwargs_list)

Creates a chain of preprocessors.

decision_function(self, x[, y])

Computes the decision function for each pattern in x.

deepcopy(self)

Returns a deep copy of current class.

estimate_parameters(self, dataset, …[, …])

Estimate the parameters that give better results with respect to a chosen metric.

fit(self, x, y)

Trains the classifier.

fit_forward(self, x[, y, caching])

Fit estimator using data and then execute forward on the data.

forward(self, x[, caching])

Forward pass on input x.

get_class_from_type(class_type)

Return the class associated with input type.

get_params(self)

Returns the dictionary of class hyperparameters.

get_state(self, **kwargs)

Returns the object state dictionary.

get_subclasses()

Get all the subclasses of the calling class.

grad_f_params(self, x[, y])

Derivative of the decision function w.r.t. the classifier parameters.

grad_f_x(self, x, y)

Computes the gradient of the classifier’s decision function wrt x.

grad_loss_params(self, x, y[, loss])

Derivative of the loss w.r.t. the classifier parameters.

grad_tr_params(self, x, y)

Derivative of the classifier training objective w.r.t. the classifier parameters.

gradient(self, x[, w])

Compute gradient at x by doing a backward pass.

gradient_w_b(self, x, y)

Compute the gradient dloss/dw, where loss is sum_i max(0, 1-y_i*f(x_i))

hessian_tr_params(self[, x, y])

Hessian of the training objective w.r.t. the classifier parameters.

hinge_loss(self, x, y)

Compute the loss term for each point in dataset.

is_fitted(self)

Return True if the classifier is trained (fitted).

list_class_types()

This method lists all types of available subclasses of calling one.

load(path)

Loads object from file.

load_state(self, path)

Sets the object state from file.

objective(self, x, y)

Objective function.

predict(self, x[, return_decision_function])

Perform classification of each pattern in x.

save(self, path)

Save class object to file.

save_state(self, path, **kwargs)

Store the object state to file.

set(self, param_name, param_value[, copy])

Set a parameter of the class.

set_params(self, params_dict[, copy])

Set all parameters passed as a dictionary {key: value}.

set_state(self, state_dict[, copy])

Sets the object state using input dictionary.

timed([msg])

Timer decorator.

C_hinge_loss(self, x, y)[source]

Compute the loss term for each point in dataset multiplied by C.

If class_weight == ‘balanced’, C is multiplied by the inverse of the class probabilities.
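As a rough sketch of what this term computes per sample (a NumPy illustration under the assumption of labels in {-1, +1}; not the library implementation, which operates on CArray data):

import numpy as np

def c_hinge_loss_sketch(w, b, X, y, C=1.0):
    # Per-sample term: C * max(0, 1 - y_i * f(x_i)), with f(x) = x.w + b.
    scores = X @ w + b
    return C * np.maximum(0.0, 1.0 - y * scores)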

property b

Return the SVM bias (the b term in the decision function).

property eps

Precision of the stop condition for training.

property eta

Step size (eta) of the gradient descent used for training.

gradient_w_b(self, x, y)[source]

Compute the gradient dloss/dw, where loss is sum_i max(0, 1-y_i*f(x_i))
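A schematic NumPy version of this gradient (illustrative only, assuming labels in {-1, +1}; only margin-violating samples, i.e. those with y_i*f(x_i) < 1, contribute):

import numpy as np

def gradient_w_b_sketch(w, b, X, y):
    # loss = sum_i max(0, 1 - y_i * (x_i.w + b))
    margins = y * (X @ w + b)
    viol = margins < 1                                    # margin-violating samples
    grad_w = -(y[viol][:, None] * X[viol]).sum(axis=0)    # dloss/dw
    grad_b = -y[viol].sum()                               # dloss/db
    return grad_w, grad_b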

hinge_loss(self, x, y)[source]

Compute the loss term for each point in dataset.

property lb

Return the value of the weight lower bound.

property max_it

Maximum number of iterations for the training.

objective(self, x, y)[source]

Objective function.
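In the Demontis et al. (2017) formulation, this is the standard SVM primal objective, which Sec-SVM minimizes by gradient descent while clipping the weights into [lb, ub]. A sketch of the objective value (illustrative only, labels assumed in {-1, +1}):

import numpy as np

def objective_sketch(w, b, X, y, C=1.0):
    # 0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (x_i.w + b))
    margins = y * (X @ w + b)
    return 0.5 * float(w @ w) + C * np.maximum(0.0, 1.0 - margins).sum()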

property ub

Return the value of the weight upper bound.

property w

Return the vector of feature weights (only if kernel is None).