Tensor Decompositions
A tensor of order N is a data structure that generalizes vectors (order one) and matrices (order two). A color image is an example of a tensor of order three, and a video sequence is a tensor of order four. In neural networks, in particular in convolutional networks, tensors of even higher orders appear. The amount of data in such tensors is enormous, and they can easily become too large to work with. In such situations, tensor decompositions can help: by approximating a high-order tensor with a "product" (contraction) of lower-order tensors, one can dramatically reduce the memory and computational requirements at the cost of a small and controllable loss of precision. In dynamic programming, which is used in optimization and in optimal control of technological processes, tensor decompositions serve to model complicated functions of a large number of variables.
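As an illustrative sketch (not the group's own code), the compression idea can be demonstrated with the standard TT-SVD procedure: an order-4 tensor is split into small three-way "cores" by repeated truncated SVDs, and the cores require far fewer parameters than the full array.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose `tensor` into tensor-train cores G_k of shape (r_{k-1}, n_k, r_k)
    by sequential SVDs truncated to rank `max_rank` (the standard TT-SVD scheme)."""
    dims = tensor.shape
    cores = []
    r_prev = 1
    mat = tensor.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        mat = mat.reshape(r_prev * dims[k], -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, s.size)
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        mat = s[:r, None] * vt[:r]     # carry the remainder to the next step
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the cores back into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])

# Build a 6 x 6 x 6 x 6 tensor that has TT-ranks at most 3 by construction,
# then decompose it and compare storage costs.
rng = np.random.default_rng(0)
cores_true = [rng.standard_normal((1, 6, 3)), rng.standard_normal((3, 6, 3)),
              rng.standard_normal((3, 6, 3)), rng.standard_normal((3, 6, 1))]
T = tt_reconstruct(cores_true)

cores = tt_svd(T, max_rank=3)
T_hat = tt_reconstruct(cores)
err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)   # near machine precision here
tt_params = sum(c.size for c in cores)                # 144 numbers vs. 6**4 = 1296
```

For a tensor whose TT-ranks genuinely exceed `max_rank`, the truncation introduces the small, controllable approximation error mentioned above instead of being exact.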

Popular tensor decompositions include the canonical (CP) decomposition, the Tucker decomposition, and tensor trains. Other types are the block-term decomposition (a generalization of the canonical decomposition), the structured Tucker decomposition, tensor chains, tensor trees, and tensor networks in general.
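A minimal sketch of the canonical format, with illustrative sizes chosen here for concreteness: a rank-R order-3 tensor is stored as three factor matrices rather than as a dense cube, so the parameter count grows linearly instead of cubically in the dimension.

```python
import numpy as np

rng = np.random.default_rng(1)
n, R = 20, 4   # hypothetical dimension and canonical rank

# CP format: three n x R factor matrices.
A, B, C = (rng.standard_normal((n, R)) for _ in range(3))

# Full tensor from the factors: T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
T = np.einsum('ir,jr,kr->ijk', A, B, C)

full_params = T.size                   # dense storage: n**3 = 8000 entries
cp_params = A.size + B.size + C.size   # CP storage: 3 * n * R = 240 entries
```

Every mode-1 unfolding of such a tensor has matrix rank at most R, which is what makes the compressed representation possible.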
In our work, we have mainly focused on finding optimal tensor decompositions with limited sensitivity, which are suitable for compressing convolution kernels in neural networks, and on tensor trains for applications in dynamic programming and quantum chemistry.
Related publications:
- TICHAVSKÝ, Petr; PHAN, Anh-Huy; CICHOCKI, Andrzej. Krylov-Levenberg-Marquardt algorithm for structured Tucker tensor decompositions. IEEE Journal of Selected Topics in Signal Processing, 2021, 15.3: 550-559.
- TICHAVSKÝ, Petr; PHAN, Anh-Huy; CICHOCKI, Andrzej. Sensitivity in tensor decomposition. IEEE Signal Processing Letters, 2019, 26.11: 1653-1657.
- TICHAVSKÝ, Petr; PHAN, Anh-Huy; CICHOCKI, Andrzej. Non-orthogonal tensor diagonalization. Signal Processing, 2017, 138: 313-320.
Contact person