tensorly.decomposition.non_negative_parafac

non_negative_parafac(tensor, rank, n_iter_max=100, init='svd', svd='truncated_svd', tol=1e-06, random_state=None, verbose=0, normalize_factors=False, return_errors=False, mask=None, cvg_criterion='abs_rec_error', fixed_modes=None)

Non-negative CP decomposition

Uses multiplicative updates; see [2].
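
The multiplicative update rule keeps every entry of every factor non-negative by rescaling it with a ratio of non-negative terms. The sketch below shows one sweep of such an update using public TensorLy primitives (tl.unfold and tensorly.tenalg.khatri_rao), assuming the NumPy backend so that @ and .T are available; the helper name multiplicative_update_sweep and the eps constant are illustrative assumptions, and this is not the library's internal implementation.

    import tensorly as tl
    from tensorly.tenalg import khatri_rao

    def multiplicative_update_sweep(tensor, factors, eps=1e-12):
        """Update each factor once, keeping all entries non-negative."""
        for mode in range(tl.ndim(tensor)):
            # Khatri-Rao product of all factors except the current mode
            kr = khatri_rao(factors, skip_matrix=mode)
            numerator = tl.unfold(tensor, mode) @ kr
            denominator = factors[mode] @ (kr.T @ kr)
            # elementwise multiplicative update; eps avoids division by zero
            factors[mode] = factors[mode] * numerator / (denominator + eps)
        return factors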

Parameters:

tensor : ndarray

rank : int
    number of components

n_iter_max : int
    maximum number of iterations

init : {‘svd’, ‘random’}, optional

svd : str, default is ‘truncated_svd’
    function to use to compute the SVD, acceptable values in tensorly.SVD_FUNS

tol : float, optional
    tolerance: the algorithm stops when the variation in the reconstruction error is less than the tolerance

random_state : {None, int, np.random.RandomState}

verbose : int, optional
    level of verbosity

normalize_factors : bool, optional
    if True, aggregate the weights of each factor in a 1D-tensor of shape (rank, ), which will contain the norms of the factors

fixed_modes : list, default is None
    A list of modes for which the initial value is not modified. The last mode cannot be fixed due to error computation.

Returns:

factors : ndarray list
    list of positive factors of the CP decomposition; element i is of shape (tensor.shape[i], rank)

References

[2] Amnon Shashua and Tamir Hazan, “Non-negative tensor factorization with applications to statistics and computer vision”, in Proceedings of the International Conference on Machine Learning (ICML), pp. 792-799, 2005.
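
Examples

A minimal usage sketch on a small random tensor, assuming the NumPy backend. Unpacking the result as (weights, factors) and the tl.cp_to_tensor helper reflect recent TensorLy releases and are assumptions here; older versions return the factor list directly, as documented above, and expose tl.kruskal_to_tensor instead.

    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import non_negative_parafac

    rng = np.random.RandomState(0)
    tensor = tl.tensor(rng.rand(10, 11, 12))  # entries are already non-negative

    # rank-3 non-negative CP decomposition
    cp = non_negative_parafac(tensor, rank=3, n_iter_max=200, tol=1e-7, random_state=0)
    weights, factors = cp  # factors[i] has shape (tensor.shape[i], 3)

    # every factor entry is non-negative
    assert all(np.all(tl.to_numpy(f) >= 0) for f in factors)

    # relative reconstruction error of the low-rank approximation
    reconstruction = tl.cp_to_tensor(cp)
    print(tl.norm(tensor - reconstruction) / tl.norm(tensor))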