API reference
tltorch: Tensorized Deep Neural Networks
Factorized Tensors
TensorLy-Torch builds on top of TensorLy and provides out-of-the-box PyTorch layers for tensor-based operations. At its core is the concept of factorized tensors: layers are parametrized by tensors stored in factorized (low-rank) form rather than by regular, dense PyTorch tensors.
You can create any factorized tensor through the main class, or directly create a specific subclass:
- Tensor Factorization
- CP Factorization
- Tucker Factorization
- Tensor-Train (Matrix-Product-State) Factorization
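To make the idea concrete, here is a minimal sketch in plain NumPy (not the tltorch API) of what a CP factorization stores and how the dense tensor is recovered from it:

```python
import numpy as np

# Conceptual sketch: a CP factorization stores an order-3 tensor of shape
# (4, 5, 6) as three factor matrices, one per mode, each with `rank` columns.
rank = 3
shape = (4, 5, 6)
factors = [np.random.randn(dim, rank) for dim in shape]

# Reconstruction: the tensor is a sum of `rank` outer products,
# written here as a single einsum over the shared rank axis.
full = np.einsum('ir,jr,kr->ijk', *factors)

dense_params = np.prod(shape)                  # 120 values stored densely
cp_params = sum(dim * rank for dim in shape)   # 45 values in CP form
print(full.shape, dense_params, cp_params)
```

Tucker and Tensor-Train follow the same pattern with a different parametrization (a small core tensor, or a chain of third-order cores, respectively).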
Tensorized Matrices
In TensorLy-Torch, you can also represent matrices in tensorized form, as low-rank tensors.
- Matrix in Tensorized Format
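The tensorization itself is just a structured reshape; the benefit comes from then storing the higher-order view in low-rank form. A minimal sketch in plain NumPy (the shapes here are illustrative, not tltorch's API):

```python
import numpy as np

# Conceptual sketch: a 16 x 81 matrix is viewed as a tensor of shape
# (4, 4, 9, 9) -- each matrix dimension is split into two smaller factors --
# and that higher-order tensor can then be stored in any low-rank form
# (CP, Tucker, TT, ...), which is where the compression comes from.
matrix = np.random.randn(16, 81)
tensorized = matrix.reshape(4, 4, 9, 9)

# The reshape is lossless: flattening recovers the original matrix exactly.
recovered = tensorized.reshape(16, 81)
print(np.array_equal(matrix, recovered))
```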
Initialization
Module for initializing tensor decompositions
- Directly initializes the parameters of a factorized tensor so that the reconstruction has the specified standard deviation and zero mean
- Directly initializes the weights and factors of a CP decomposition so that the reconstruction has the specified standard deviation and zero mean
- Directly initializes the weights and factors of a Tucker decomposition so that the reconstruction has the specified standard deviation and zero mean
- Directly initializes the weights and factors of a TT decomposition so that the reconstruction has the specified standard deviation and zero mean
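The point of initializing the factors (rather than the reconstruction) is that the factor statistics must be chosen so the *reconstruction* comes out with the target statistics. A sketch of the underlying reasoning in plain NumPy, for the CP case (this illustrates the idea, not tltorch's exact implementation):

```python
import numpy as np

# For a CP factorization with R rank-1 terms and N factor matrices whose
# entries are i.i.d. zero-mean with std s, each entry of the reconstruction
# has variance R * s**(2*N).  Choosing
#     s = (target_std**2 / R) ** (1 / (2*N))
# therefore gives the reconstruction the target std (and zero mean).
rng = np.random.default_rng(0)
target_std, rank, order = 0.02, 8, 3
shape = (30, 30, 30)

s = (target_std**2 / rank) ** (1 / (2 * order))
factors = [rng.normal(0.0, s, size=(dim, rank)) for dim in shape]
full = np.einsum('ir,jr,kr->ijk', *factors)

print(full.std())  # empirically close to the 0.02 target
```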
Tensor Regression Layers
- Tensor Regression Layers
Tensor Contraction Layers
- Tensor Contraction Layer [1]
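A tensor contraction layer projects each mode of the activation tensor through a small matrix, shrinking it while preserving its structure. A conceptual sketch in plain NumPy (dimensions are illustrative assumptions):

```python
import numpy as np

# Conceptual sketch of a tensor contraction layer: each non-batch mode of the
# input is contracted with its own small projection matrix, so an
# (batch, 8, 8, 8) activation becomes (batch, 4, 4, 4) without flattening.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8, 8, 8))           # (batch, d1, d2, d3)
projections = [rng.standard_normal((8, 4)) for _ in range(3)]

y = np.einsum('bijk,ip,jq,kr->bpqr', x, *projections)
print(y.shape)  # contracted activations
```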
Factorized Linear Layers
- Tensorized Fully-Connected Layers
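Tensorized fully-connected layers combine the two previous ideas: the weight matrix is viewed as a higher-order tensor and stored in factorized form. A conceptual sketch in plain NumPy, using a Tucker-like parametrization as an illustrative assumption:

```python
import numpy as np

# Conceptual sketch: the 256 x 64 weight of a dense layer is viewed as a
# (16, 16, 8, 8) tensor and stored as a small core plus one factor per mode.
rng = np.random.default_rng(0)
in_shape, out_shape, rank = (16, 16), (8, 8), 4

core = rng.standard_normal((rank,) * 4)
factors = [rng.standard_normal((dim, rank)) for dim in (*in_shape, *out_shape)]

# Reconstruct the (16, 16, 8, 8) weight tensor, then matricize it.
w = np.einsum('abcd,ia,jb,kc,ld->ijkl', core, *factors)
weight = w.reshape(256, 64)

x = rng.standard_normal((2, 256))
y = x @ weight
print(y.shape)
```

Here the factorized form holds 4**4 + (16 + 16 + 8 + 8) * 4 = 448 parameters versus 16384 for the dense weight; in practice the full weight need not even be formed explicitly.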
Factorized Convolutions
General N-Dimensional convolutions in Factorized forms
- Create a factorized convolution of arbitrary order
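The same compression applies to convolution kernels. A conceptual sketch in plain NumPy of a CP-factorized 2-D kernel (kernel sizes and ranks here are illustrative assumptions):

```python
import numpy as np

# Conceptual sketch: a CP-factorized 2-D convolution stores the
# (out_channels, in_channels, k, k) kernel as four small factor matrices.
# The forward pass can then run as a chain of cheap convolutions (pointwise,
# then two 1-D spatial convolutions, then pointwise) instead of one dense one.
out_c, in_c, k, rank = 64, 32, 3, 8
factors = [np.random.randn(dim, rank) for dim in (out_c, in_c, k, k)]

# The dense kernel is recovered as a sum of `rank` rank-1 terms.
kernel = np.einsum('or,ir,hr,wr->oihw', *factors)

dense_params = out_c * in_c * k * k                       # 18432
cp_params = sum(d * rank for d in (out_c, in_c, k, k))    # 816
print(kernel.shape, dense_params, cp_params)
```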
Factorized Embeddings
A drop-in replacement for PyTorch’s embeddings but using an efficient tensor parametrization that never reconstructs the full table.
- Tensorized Embedding Layers for Efficient Model Compression
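The key property is that a single embedding row can be assembled directly from the factors, so the full table never has to be materialized. A conceptual sketch in plain NumPy (vocabulary and embedding sizes, and the CP parametrization, are illustrative assumptions):

```python
import numpy as np

# Conceptual sketch: an 81 x 16 embedding table is viewed as a (9, 9, 4, 4)
# tensor in CP form.  A lookup unravels the token index into (i1, i2) and
# combines only the selected factor rows -- no full table is ever built.
rng = np.random.default_rng(0)
vocab_factors, dim_factors, rank = (9, 9), (4, 4), 5
f1, f2, g1, g2 = [rng.standard_normal((d, rank))
                  for d in (*vocab_factors, *dim_factors)]

token = 42
i1, i2 = np.unravel_index(token, vocab_factors)

# Row `token` of the implicit table, shape (16,).
row = np.einsum('r,r,jr,kr->jk', f1[i1], f2[i2], g1, g2).reshape(-1)
print(row.shape)
```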
Tensor Dropout
These functions allow you to easily add or remove tensor dropout from tensor layers.
- Tensor Dropout
- Removes the tensor dropout from a TensorModule
You can also use the class API below, but unless you have a particular use for the classes, you should use the convenience functions instead.
- Decomposition Hook for Tensor Dropout on a FactorizedTensor
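The idea behind tensor dropout is to drop entire rank-1 components of the factorization rather than individual activations. A conceptual sketch in plain NumPy (the rescaling convention here mirrors inverted dropout and is an illustrative assumption):

```python
import numpy as np

# Conceptual sketch of tensor dropout on a CP factorization: at training
# time, whole rank-1 components are randomly zeroed out, regularizing the
# low-rank structure itself rather than individual entries.
rng = np.random.default_rng(0)
rank, p = 8, 0.5
factors = [rng.standard_normal((d, rank)) for d in (4, 5, 6)]

keep = rng.random(rank) > p        # Bernoulli mask over the rank components
masked = factors[0] * keep         # zeroing a component in one factor kills it

# Rescale by the keep probability, as in inverted dropout.
full = np.einsum('ir,jr,kr->ijk', masked, factors[1], factors[2]) / (1 - p)
print(full.shape)
```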
L1 Regularization
L1 Regularization on tensor modules.
- Generalized tensor lasso on factorized tensors
- Removes the tensor lasso from a TensorModule
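The idea behind a tensor lasso is to attach a scalar weight to each rank-1 component and penalize those weights with an L1 term, so that unneeded components are driven to zero and the effective rank shrinks during training. A conceptual sketch in plain NumPy (illustrative, not the library's implementation):

```python
import numpy as np

# Conceptual sketch of a tensor lasso on a CP factorization: each rank-1
# component gets a scalar weight, and the L1 norm of those weights is added
# to the training loss as a sparsity-inducing penalty on the rank.
rng = np.random.default_rng(0)
rank = 8
lasso_weights = rng.standard_normal(rank)
factors = [rng.standard_normal((d, rank)) for d in (4, 5, 6)]

full = np.einsum('r,ir,jr,kr->ijk', lasso_weights, *factors)
penalty = np.abs(lasso_weights).sum()   # add this term to the loss
print(full.shape, float(penalty))
```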