tensorly.contrib.sparse.decomposition.non_negative_tucker

non_negative_tucker(tensor, rank, n_iter_max=10, init='svd', tol=0.0001, random_state=None, verbose=False, return_errors=False, normalize_factors=False)

Non-negative Tucker decomposition

Iterative multiplicative update, see [2]
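A minimal dense-backend sketch of these multiplicative updates is given below, using tensorly and tensorly.tenalg.multi_mode_dot. It only illustrates the update rules of [2] on a dense non-negative array; it is not the library's sparse implementation, and the function name nn_tucker_sketch is made up for this example.

import numpy as np
import tensorly as tl
from tensorly.tenalg import multi_mode_dot

def nn_tucker_sketch(X, ranks, n_iter=100, eps=1e-12):
    # Positive random initialisation of the core and the factor matrices.
    rng = np.random.default_rng(0)
    factors = [tl.tensor(rng.random((dim, r))) for dim, r in zip(X.shape, ranks)]
    core = tl.tensor(rng.random(ranks))
    for _ in range(n_iter):
        for mode in range(tl.ndim(X)):
            # X_(mode) is approximated by factors[mode] @ B, where B is the
            # mode unfolding of the core contracted with every other factor.
            B = tl.unfold(multi_mode_dot(core, factors, skip=mode), mode)
            num = tl.dot(tl.unfold(X, mode), tl.transpose(B))
            den = tl.dot(factors[mode], tl.dot(B, tl.transpose(B))) + eps
            factors[mode] = factors[mode] * num / den
        # Core update: the numerator contracts X with the factor transposes,
        # the denominator contracts the current core with each A^T A.
        num = multi_mode_dot(X, factors, transpose=True)
        den = multi_mode_dot(core, [tl.dot(tl.transpose(A), A) for A in factors]) + eps
        core = core * num / den
    return core, factors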

Parameters:
tensor : ndarray
rank : None, int or int list
    size of the core tensor, (len(ranks) == tensor.ndim);
    if int, the same rank is used for all modes
n_iter_max : int
    maximum number of iterations
init : {'svd', 'random'}
random_state : {None, int, np.random.RandomState}
verbose : int, optional
    level of verbosity
return_errors : boolean
    Indicates whether the algorithm should return the reconstruction error
    and computation time of each iteration. Default: False
normalize_factors : bool
    if True, aggregates the norms of the factors in the core.
Returns:
core : ndarray
    positive core of the Tucker decomposition; has shape ranks
factors : ndarray list
    list of factors of the Tucker decomposition; element i is of shape (tensor.shape[i], rank)
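A hedged usage sketch follows, assuming the PyData sparse package is installed and that a sparse COO array can be passed directly; the shape, density and ranks below are purely illustrative.

import sparse
from tensorly.contrib.sparse.decomposition import non_negative_tucker

# Small random sparse tensor; sparse.random draws non-negative values in [0, 1).
X = sparse.random((20, 20, 20), density=0.02, random_state=0)

core, factors = non_negative_tucker(X, rank=[5, 5, 5], n_iter_max=200,
                                    init='random', random_state=0)

print(core.shape)                    # (5, 5, 5)
print([f.shape for f in factors])    # [(20, 5), (20, 5), (20, 5)]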

References

[2]

Yong-Deok Kim and Seungjin Choi, “Non-negative Tucker decomposition”, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1-8, 2007.