A Python implementation of the Tucker toolbox. The package allows users to manipulate tensors in the Tucker and SF-Tucker [1] formats. It also provides tools for implementing first-order Riemannian optimization methods on the manifolds of tensors of fixed Tucker rank or fixed SF-Tucker rank. For instance, the package implements a method for efficiently computing the Riemannian gradient of any smooth function via automatic differentiation.
The library is compatible with several computation frameworks, such as PyTorch and JAX, and can be easily integrated with others.
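As background, a tensor in Tucker format is stored as a small core tensor plus one factor matrix per mode, which is far more compact than the dense tensor. The following is a minimal NumPy sketch of the format itself, not of this package's API:

```python
import numpy as np

# A rank-(2, 2, 2) Tucker representation of a 4 x 5 x 6 tensor:
# a small core tensor plus one factor matrix per mode.
rng = np.random.default_rng(0)
core = rng.standard_normal((2, 2, 2))
factors = [rng.standard_normal((n, 2)) for n in (4, 5, 6)]

# Reconstruct the full tensor by contracting the core with every factor.
full = np.einsum("abc,ia,jb,kc->ijk", core, *factors)
assert full.shape == (4, 5, 6)

# Storage: 2*2*2 + 4*2 + 5*2 + 6*2 = 38 numbers instead of 4*5*6 = 120.
```

Riemannian optimization methods keep iterates on the manifold of tensors with such a fixed Tucker rank instead of working with the dense tensor directly.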
## Installation
NumPy, SciPy and [opt-einsum](https://pypi.org/project/opt-einsum/) are required for installation. Additionally, you need to install the computation framework of your choice: PyTorch or JAX.
The package can be installed with the extra matching your chosen computation framework:

```shell
pip install tucker_riemopt[torch]  # or tucker_riemopt[jax]
```
TBD
## Quick start
See the following repositories for reference usage of the package:
- [This repository](johanDDC/R-TuckER) uses the package for a knowledge graph completion task;
- [This repository](https://bitbucket.org/johan_ddc/bert_imdb/src/master/) uses the library for BERT compression.
The default computation framework is PyTorch. To use JAX instead, you should:

1. Install JAX;
2. Enable the JAX backend.
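The exact backend-switching call is not shown above; check the package documentation for it. As a rough illustration of the global-backend pattern that multi-framework libraries commonly use, here is a self-contained sketch in which every name (`set_backend`, `get_backend`) is hypothetical and not taken from the package's API:

```python
# Hypothetical sketch of a global backend switch; the real function name
# and location in tucker_riemopt may differ -- consult its documentation.
_BACKENDS = {"torch", "jax"}
_current = "torch"  # PyTorch is the default backend

def set_backend(name: str) -> None:
    """Select which computation framework the library dispatches to."""
    if name not in _BACKENDS:
        raise ValueError(f"unknown backend: {name!r}")
    global _current
    _current = name

def get_backend() -> str:
    """Return the name of the currently active backend."""
    return _current

set_backend("jax")  # switch from the PyTorch default to JAX
```

Under this pattern, all subsequent tensor operations are routed to the active framework, so the switch must happen before any tensors are created.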