API reference¶
tltorch: Tensorized Deep Neural Networks
Factorized Tensors¶
TensorLy-Torch builds on top of TensorLy and provides out-of-the-box PyTorch layers for tensor-based operations. The core of this is the concept of factorized tensors, which parametrize our layers in factorized form instead of with regular, dense PyTorch tensors.
You can create any factorized tensor through the main class using:
FactorizedTensor | Tensor in Factorized form
You can create a tensor of any form using FactorizedTensor.new(shape, rank, factorization), where factorization can be Dense, CP, Tucker or TT. Note that if you use factorization='dense', you are just creating a regular, unfactorized tensor. This lets you manipulate any tensor, factorized or not, through a simple, unified interface.
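For instance, a minimal sketch (the shape and rank below are arbitrary choices):

    import tltorch

    # Create a third-order tensor in Tucker form
    tensor = tltorch.FactorizedTensor.new((5, 6, 7), rank='same', factorization='Tucker')
    tensor.normal_(0, 0.02)       # initialize the factors in place
    dense = tensor.to_tensor()    # reconstruct the full, dense tensor
    print(dense.shape)            # torch.Size([5, 6, 7])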
Alternatively, you can directly create a specific subclass:
DenseTensor | Dense tensor
CPTensor | CP Factorization
TuckerTensor | Tucker Factorization
TTTensor | Tensor-Train (Matrix-Product-State) Factorization
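For example, to directly create a CP tensor (a sketch, using the same new interface as above):

    import tltorch

    # Equivalent to FactorizedTensor.new(..., factorization='CP')
    cp_tensor = tltorch.CPTensor.new((5, 6, 7), rank=16)
    cp_tensor.normal_(0, 0.02)
    print(cp_tensor.to_tensor().shape)  # torch.Size([5, 6, 7])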
Tensorized Matrices¶
In TensorLy-Torch, you can also represent matrices in tensorized form, as low-rank tensors. Just as for factorized tensors, you can create a tensorized matrix through the main class using:
TensorizedTensor | Matrix in Tensorized Format
You can create a tensorized matrix of any form using TensorizedTensor.new(tensorized_shape, rank, factorization), where factorization can be Dense, CP, Tucker or BlockTT.
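For instance, a sketch of a 16x16 matrix tensorized as a (4, 4) x (4, 4) tensor in Block-TT form (shapes and rank are arbitrary here):

    import tltorch

    # A 16x16 matrix stored in tensorized, Block-TT form
    matrix = tltorch.TensorizedTensor.new(((4, 4), (4, 4)), rank=4, factorization='BlockTT')
    matrix.normal_(0, 0.02)
    print(matrix.to_matrix().shape)  # torch.Size([16, 16]) -- the reconstructed matrix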
You can also explicitly create the type of tensor you want using the following classes:
DenseTensorized | Matrix in dense Tensorized Format
CPTensorized | Matrix in CP Tensorized Format
TuckerTensorized | Matrix in Tucker Tensorized Format
BlockTT | Matrix in Block-TT Tensorized Format
Complex Tensors¶
In theory, you can simply specify dtype=torch.cfloat when creating any of the tensors or tensorized matrices above to automatically get a complex-valued tensor. In practice, however, complex support in PyTorch still has many gaps; Distributed Data Parallelism, in particular, is not supported. In TensorLy-Torch, we propose a convenient and transparent way around this: simply use ComplexTensor instead. This stores the factors of the decomposition in real form (by explicitly storing the real and imaginary parts) but transparently returns a complex-valued tensor upon reconstruction.
ComplexDenseTensor | Complex Dense Factorization
ComplexCPTensor | Complex CP Factorization
ComplexTuckerTensor | Complex Tucker Factorization
ComplexTTTensor | Complex TT Factorization
ComplexDenseTensorized | Complex DenseTensorized Factorization
ComplexTuckerTensorized | Complex TuckerTensorized Factorization
ComplexCPTensorized | Complex Tensorized CP Factorization
ComplexBlockTT | Complex BlockTT Factorization
You can also transparently instantiate any of these directly through the main classes, TensorizedTensor or FactorizedTensor, by specifying factorization="ComplexCP" or, in general, "ComplexFactorization", where Factorization is any of the supported decompositions.
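For instance (a sketch; shape and rank are arbitrary):

    import tltorch

    # Factors are stored as real tensors (real and imaginary parts),
    # but the reconstruction is complex-valued
    tensor = tltorch.FactorizedTensor.new((4, 5, 6), rank='same', factorization='ComplexCP')
    print(tensor.to_tensor().dtype)  # a complex dtype, e.g. torch.complex64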
Initialization¶
Initialization is particularly important in the context of deep learning. We provide convenient functions to directly initialize factorized tensors (i.e. their factors) such that their reconstruction approximately follows a centered Gaussian distribution.
tensor_init | Initializes directly the parameters of a factorized tensor so the reconstruction has the specified standard deviation and 0 mean
cp_init | Initializes directly the weights and factors of a CP decomposition so the reconstruction has the specified std and 0 mean
tucker_init | Initializes directly the weights and factors of a Tucker decomposition so the reconstruction has the specified std and 0 mean
tt_init | Initializes directly the weights and factors of a TT decomposition so the reconstruction has the specified std and 0 mean
block_tt_init | Initializes directly the weights and factors of a BlockTT decomposition so the reconstruction has the specified std and 0 mean
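A sketch of their use, assuming tensor_init takes a factorized tensor and a target standard deviation:

    import tltorch

    tensor = tltorch.FactorizedTensor.new((5, 6, 7), rank='same', factorization='CP')
    # Initialize the factors so the *reconstruction* is approximately N(0, 0.02^2)
    tltorch.tensor_init(tensor, std=0.02)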
Tensor Regression Layers¶
TRL | Tensor Regression Layers
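A sketch of a TRL regressing a tensor-shaped activation onto a low-dimensional output (the input_shape, output_shape and rank arguments shown are the assumed interface):

    import torch
    import tltorch

    # Map a (4, 5, 6)-shaped input to a 2-dimensional output, per sample
    trl = tltorch.TRL(input_shape=(4, 5, 6), output_shape=(2,), rank='same')
    x = torch.randn(8, 4, 5, 6)   # batch of 8 samples
    y = trl(x)                    # expected shape: (8, 2)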
Tensor Contraction Layers¶
TCL | Tensor Contraction Layer
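A sketch, assuming the layer contracts each input mode down to the corresponding entry of rank:

    import torch
    import tltorch

    # Contract a (4, 5, 6) input down to (2, 3, 4), mode by mode
    tcl = tltorch.TCL(input_shape=(4, 5, 6), rank=(2, 3, 4))
    x = torch.randn(8, 4, 5, 6)
    y = tcl(x)                    # expected shape: (8, 2, 3, 4)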
Factorized Linear Layers¶
FactorizedLinear | Tensorized Fully-Connected Layers
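A sketch of a 16 -> 16 factorized linear layer, with both feature dimensions tensorized as (4, 4) (the keyword names shown are the assumed interface):

    import torch
    import tltorch

    # Weights are stored as a tensorized matrix, never as a dense 16x16 table
    fc = tltorch.FactorizedLinear(in_tensorized_features=(4, 4),
                                  out_tensorized_features=(4, 4),
                                  factorization='blocktt', rank='same')
    x = torch.randn(8, 16)
    y = fc(x)                     # expected shape: (8, 16)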
Factorized Convolutions¶
General N-dimensional convolutions in factorized form
FactorizedConv | Create a factorized convolution of arbitrary order
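A sketch of a 2D factorized convolution (order=2), assuming the constructor otherwise mirrors torch.nn.Conv2d:

    import torch
    import tltorch

    # A 2D convolution whose kernel is stored in CP form
    conv = tltorch.FactorizedConv(in_channels=3, out_channels=16, kernel_size=3,
                                  order=2, padding=1, factorization='cp', rank='same')
    x = torch.randn(8, 3, 32, 32)
    y = conv(x)                   # expected shape: (8, 16, 32, 32)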
Factorized Embeddings¶
A drop-in replacement for PyTorch’s embeddings but using an efficient tensor parametrization that never reconstructs the full table.
FactorizedEmbedding | Tensorized embedding layer for efficient model compression: a drop-in replacement for torch.nn.Embedding
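A sketch, assuming the constructor takes the usual num_embeddings and embedding_dim and tensorizes them automatically:

    import torch
    import tltorch

    # Drop-in replacement for torch.nn.Embedding(10000, 256)
    emb = tltorch.FactorizedEmbedding(num_embeddings=10000, embedding_dim=256,
                                      factorization='blocktt', rank=8)
    idx = torch.randint(0, 10000, (8, 12))
    out = emb(idx)                # expected shape: (8, 12, 256)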
Tensor Dropout¶
These functions allow you to easily add or remove tensor dropout from tensor layers.
tensor_dropout | Tensor Dropout
remove_tensor_dropout | Removes the tensor dropout from a TensorModule
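A sketch of attaching and removing tensor dropout on a factorized tensor (p is the assumed name of the dropout-probability argument):

    import tltorch

    tensor = tltorch.FactorizedTensor.new((5, 6, 7), rank='same', factorization='CP')
    tensor = tltorch.tensor_dropout(tensor, p=0.5)   # attach tensor dropout
    tensor = tltorch.remove_tensor_dropout(tensor)   # detach it again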
You can also use the class API below, but unless you have a particular use for the classes, you should use the convenience functions provided above instead.
TensorDropout | Decomposition Hook for Tensor Dropout on FactorizedTensor
L1 Regularization¶
L1 Regularization on tensor modules.
tensor_lasso | Generalized Tensor Lasso on a factorized tensor
remove_tensor_lasso | Removes the tensor lasso from a TensorModule
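A hedged sketch of the intended flow; the penalty keyword and the exact way the accumulated loss term is retrieved are assumptions here:

    import tltorch

    tensor = tltorch.FactorizedTensor.new((5, 6, 7), rank='same', factorization='CP')
    # Attach an L1 (lasso) penalty on the factors; add its accumulated
    # term to the training loss, then detach when no longer needed
    tensor = tltorch.tensor_lasso(tensor, penalty=0.01)  # `penalty` is assumed
    tensor = tltorch.remove_tensor_lasso(tensor)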
Utilities¶
Utility functions
get_tensorized_shape | Factorizes in_features and out_features such that each is factorized into the same number (order) of integers, with every factor at least min_dim
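A sketch, assuming the utility lives in tltorch.utils and returns the tensorized input and output shapes:

    import tltorch

    # Factorize 16 and 64 into order=2 integer factors each, every factor >= 2
    in_shape, out_shape = tltorch.utils.get_tensorized_shape(
        in_features=16, out_features=64, order=2, min_dim=2)
    print(in_shape, out_shape)   # e.g. (4, 4) (8, 8)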