Welcome to Tensorlab’s documentation!

Tensorlab provides various tools for tensor computations, (coupled) tensor decompositions and complex optimization. In Tensorlab, datasets are stored as (possibly incomplete, sparse or structured) vectors, matrices and higher-order tensors, possibly obtained after tensorizing lower-order data. By mixing different types of tensor decompositions and factor transformations, a vast number of factorizations can be computed. Users can choose from a library of preimplemented transformations and structures, including nonnegativity, orthogonality, and Toeplitz and Vandermonde structure, to name a few, or even define their own factor transformations.
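As a minimal illustration of how such a factorization is computed (a sketch assuming Tensorlab is on the MATLAB path; the functions used here are treated in detail in Chapter 3), a rank-2 canonical polyadic decomposition of a synthetic third-order tensor can be obtained as follows:

```matlab
% Generate a synthetic 10x11x12 rank-2 tensor from random factor matrices.
U = {randn(10,2), randn(11,2), randn(12,2)};
T = cpdgen(U);

% Compute a rank-2 canonical polyadic decomposition (CPD) of T.
Uhat = cpd(T, 2);

% The relative error of the approximation should be close to machine precision.
relerr = frob(cpdres(T, Uhat))/frob(T);
```

Structure can be imposed on the factors, and multiple datasets can be factorized jointly, using the structured data fusion framework of Chapters 8 to 12.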

Chapter 2 covers the representation of dense, incomplete, sparse and structured datasets in Tensorlab and the basic operations on such tensors. Chapters 3 to 6 discuss tensor decompositions of both real and complex tensors: the canonical polyadic decomposition (CPD), the decomposition in multilinear rank-\((L_r,L_r,1)\) terms, the low multilinear rank approximation (LMLRA) and multilinear singular value decomposition (MLSVD), and the block term decomposition (BTD). Chapter 7 discusses the topic and framework of tensorization, in which lower-order datasets can be transformed into higher-order data. A basic introduction to structured data fusion (SDF) is given in Chapter 8, with which multiple datasets can be jointly factorized while imposing structure on the factors using factor transformations. Examples of SDF are given in Chapter 9. Chapter 10 discusses the more advanced concepts within SDF, while Chapter 11 explains the implementation of a transformation. Chapter 12 gives a full specification of the domain-specific language used by SDF. Many of the algorithms that accomplish these decomposition tasks are based on complex optimization, that is, optimization of functions in complex variables. Chapter 13 introduces the necessary concepts and shows how to solve different types of complex optimization problems. Finally, Chapter 14 treats global optimization of bivariate (and polyanalytic) polynomials and rational functions, which appears as a subproblem in tensor optimization.

The first version of Tensorlab was released in February 2013, comprising a wide range of algorithms for tensor decompositions. Tensorlab 2.0 was released in January 2014, introducing the structured data fusion framework. The current version of Tensorlab, version 3.0, was released in March 2016. Tensorlab 3.0 introduces a tensorization framework, offers more support for large-scale and structured datasets, and enhances the performance and user-friendliness of the SDF framework.

More specifically, Tensorlab 3.0 introduces different tensorization and detensorization techniques. Efficient representations of structured tensors are supported, which improve the speed of many operations and decompositions. A number of algorithms dealing with large-scale datasets are provided, as well as a family of algorithms for the decomposition in multilinear rank-\((L_r,L_r,1)\) terms. The SDF framework has been redesigned to allow greater flexibility in the model definition and to improve performance. Further, a tool has been made available to verify and correct an SDF model, and a method is provided for visual exploration of high-dimensional data of arbitrary order. More details can be found in the release notes.

To illustrate the power of the toolbox in an accessible manner, a number of demos accompanying this manual have been developed. These demos discuss case studies that use tensor decompositions, such as multidimensional harmonic retrieval, independent component analysis (ICA) and the prediction of user involvement based on a GPS dataset.

A PDF version of the user guide can be found here. This toolbox can be cited as in [24].