tltorch._tensor_lasso.CPL1Regularizer

class tltorch._tensor_lasso.CPL1Regularizer(penalty=0.01, clamp_weights=True, threshold=1e-06, normalize_loss=True)[source]
Decomposition Hook for Tensor Lasso on CP tensors
Parameters:
- penalty : float, default is 0.01
  scaling factor for the loss
- clamp_weights : bool, default is True
  if True, the lasso weights are clamped between -1 and 1
- threshold : float, default is 1e-6
  if a lasso weight is lower than the set threshold, it is set to 0
- normalize_loss : bool, default is True
  if True, the loss will be between 0 and 1; otherwise, the raw sum of absolute weights is returned
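As an illustration of these options, a construction with non-default values might look as follows; the specific numbers are arbitrary and only meant to show the signature:
>>> sparser_reg = CPL1Regularizer(
...     penalty=0.1,           # weight the lasso term more heavily in the total loss
...     clamp_weights=True,    # keep the lasso weights in [-1, 1]
...     threshold=1e-3,        # zero out lasso weights smaller than 1e-3
...     normalize_loss=False,  # return the raw sum of absolute weights
... )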
Examples
First you need to create an instance of the regularizer:
>>> regularizer = CPL1Regularizer(penalty=0.01)
You can apply the regularizer to one or several layers:
>>> trl = CPTRL((5, 5), (5, 5), rank='same')
>>> trl2 = CPTRL((5, 5), (2, ), rank='same')
>>> regularizer.apply(trl)
>>> regularizer.apply(trl2)
The lasso is automatically applied:
>>> x = trl(x)
>>> pred = trl2(x)
>>> loss = your_loss_function(pred)
Add the Lasso loss:
>>> loss = loss + regularizer.loss
You can now backpropagate through your loss as usual:
>>> loss.backward()
After you finish updating the weights, don’t forget to reset the regularizer, otherwise it will keep accumulating values!
>>> regularizer.reset()
You can also remove the regularizer with regularizer.remove(trl).
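Putting the steps above together, here is a minimal end-to-end sketch. It assumes `regularizer`, `trl`, and `trl2` are defined as in the snippets above and that both layers expose standard `torch.nn.Module` parameters; the batch size, learning rate, and MSE loss are arbitrary choices for illustration:
>>> import torch
>>> params = list(trl.parameters()) + list(trl2.parameters())
>>> optimizer = torch.optim.Adam(params, lr=1e-3)
>>> x = torch.randn(8, 5, 5)       # batch of 8 inputs matching the (5, 5) input shape
>>> target = torch.randn(8, 2)     # targets matching trl2's (2,) output shape
>>> for _ in range(10):
...     optimizer.zero_grad()
...     pred = trl2(trl(x))        # forward passes accumulate the lasso loss via the hooks
...     loss = torch.nn.functional.mse_loss(pred, target) + regularizer.loss
...     loss.backward()
...     optimizer.step()
...     regularizer.reset()        # reset the accumulated lasso loss each iteration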
Attributes:
- loss
  Returns the current Lasso (l1) loss for the layers that have been called so far.
Methods

- __call__(module, cp_tensor)
  CP already includes weights, we’ll just take their l1 norm.
- apply(module)
  Apply an instance of the L1Regularizer to a tensor module.
- remove(module)
  Remove the Regularization from a module.
- reset()
  Reset the loss, should be called at the end of each iteration.
reset()[source]
Reset the loss, should be called at the end of each iteration.
property loss
Returns the current Lasso (l1) loss for the layers that have been called so far.
Returns:
- float
  l1 regularization on the tensor layers the regularization has been applied to.
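As a small hedged sketch of how this accumulation behaves, the value can also be inspected directly between iterations; it assumes the `regularizer` and `trl` from the Examples section and an arbitrary input shape:
>>> _ = trl(torch.randn(4, 5, 5))   # a forward pass triggers the hook
>>> current = regularizer.loss      # l1 penalty accumulated for the layers called so far
>>> regularizer.reset()             # clears the accumulator for the next iteration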
apply(module)[source]
Apply an instance of the L1Regularizer to a tensor module.
Parameters:
- module : TensorModule
  module on which to add the regularization
Returns:
- TensorModule (with Regularization hook)
remove(module)[source]
Remove the Regularization from a module.
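For instance, a brief sketch of detaching the regularizer from the layers registered in the Examples above, once it is no longer needed:
>>> regularizer.remove(trl)    # trl no longer contributes to regularizer.loss
>>> regularizer.remove(trl2)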