tensorly.plugins.use_opt_einsum
use_opt_einsum(optimize='auto-hq')
Plugin to use opt-einsum [1] to precompute (and cache) a better contraction path.
References
[1] Daniel G. A. Smith and Johnnie Gray, opt_einsum: A Python package for optimizing contraction order for einsum-like expressions. Journal of Open Source Software, 2018, 3(26), 753.
Examples
>>> import tensorly as tl
Use your favourite backend, here PyTorch:
>>> tl.set_backend('pytorch')
Use the convenient backend system to automatically dispatch all tenalg operations to einsum:
>>> from tensorly import tenalg
>>> tenalg.set_backend('einsum')
Now you can transparently cache the optimal contraction path:
>>> from tensorly import plugins
>>> plugins.use_opt_einsum()
That’s it! You can revert to the original einsum just as easily:
>>> plugins.use_default_einsum()
Revert to the original tensor algebra backend:
>>> tenalg.set_backend('core')
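The benefit of the plugin is that the contraction order is computed once and then reused. A minimal sketch of the same idea using only NumPy's `einsum_path` (the plugin itself relies on opt_einsum and tensorly's backend system, neither of which is used here):

```python
import numpy as np

# A chained contraction: evaluating left-to-right is wasteful when the
# intermediate shapes differ; an optimized path picks a cheaper order.
a = np.random.rand(10, 20)
b = np.random.rand(20, 30)
c = np.random.rand(30, 5)

# Precompute an optimized contraction path once...
path, info = np.einsum_path('ij,jk,kl->il', a, b, c, optimize='optimal')

# ...then reuse it for every subsequent evaluation, skipping the
# path search. This is what the plugin caches transparently for you.
result = np.einsum('ij,jk,kl->il', a, b, c, optimize=path)
```

Here `result` equals the plain matrix product `a @ b @ c`; only the order in which the pairwise contractions are performed changes.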