Intro
Technische Universität Ilmenau
Tensor-Based Signal Processing
Univ.-Prof. Dr.-Ing. Martin Haardt
Welcome!
Outline
Fundamental Concepts of Tensor Algebra*
- n-mode vectors, n-mode unfoldings
- multilinearity and n-mode products
- n-ranks
- Higher-Order SVD (HOSVD)
- PARAFAC / CANDECOMP (CP)
Selected Signal Processing*
- Applications
- Approximate CP and Alternating Least Squares (ALS)
- Semi-Algebraic CP decomposition via Simultaneous Matrix Diagonalization
- HOSVD-based subspace estimation to improve the parameter estimation accuracy in multi-dimensional harmonic retrieval problems
* not in DEMO
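As a preview of the n-mode unfoldings and n-mode products listed above, here is a minimal numpy sketch (helper names are hypothetical; note that the column ordering of an unfolding differs between conventions, and this uses one common choice):

```python
import numpy as np

def unfold(X, n):
    """n-mode unfolding: the n-mode vectors of X become the columns of a matrix."""
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def mode_n_product(X, U, n):
    """n-mode product X x_n U: multiply every n-mode vector of X by the matrix U."""
    rest = [s for i, s in enumerate(X.shape) if i != n]
    Y = (U @ unfold(X, n)).reshape([U.shape[0]] + rest)
    return np.moveaxis(Y, 0, n)

X = np.arange(24.0).reshape(2, 3, 4)   # a 3-way array
U = np.ones((5, 3))                    # maps mode 1 from size 3 to size 5
print(unfold(X, 1).shape)              # (3, 8)
print(mode_n_product(X, U, 1).shape)   # (2, 5, 4)
```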
Motivation
Why tensors?
1846: W. R. Hamilton
In 1846, William Rowan Hamilton used the term “tensor” to define the norm operation on the Clifford algebra.
1899: W. Voigt
In 1899, Woldemar Voigt used tensors in their current meaning in the context of crystal physics.
1900: Gregorio Ricci-Curbastro
The “inventor” of tensor calculus: “The Absolute Differential Calculus”, with Tullio Levi-Civita
1915: M. Grossmann, A. Einstein
„Entwurf einer verallgemeinerten Relativitätstheorie und einer Theorie der Gravitation“ (“Outline of a Generalized Theory of Relativity and of a Theory of Gravitation”):
I. Physical part by Albert Einstein;
II. Mathematical part by Marcel Grossmann
1927: F. L. Hitchcock
Polyadic decomposition: Foundation for modern multilinear algebra
What is a tensor?
Strictly speaking: an element of a tensor space, which is the outer (tensor) product of R linear spaces.
- just as a matrix is an element of the outer product of two linear spaces
- engineers typically work with coordinate representations obtained by fixing the bases of all spaces (an array of numbers)
- for simplicity, we identify tensors with their coordinate representations
Notation
R-way arrays
tensors are denoted by bold-faced, calligraphic letters
Why tensors?
Initial motivation
- provides new insights / solutions
More than two dimensions:
- even more compact representations
- more structure is preserved
Test
Prof. Dr. Martin Haardt: Testing knowledge
Graduate Alla Manina: Master of Science in Communications and Signal Processing
Identifiability
Fundamental advantages over matrix-based counterparts
Let's look at the matrix case...
Matrix case
Conclusion
- the tensor rank can greatly exceed the tensor's dimensions
- more sources than sensors can be identified
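This can be sketched in a few lines of numpy (all sizes are hypothetical): a 4 x 4 x 4 tensor built from R = 5 rank-one terms, i.e., five sources observed by only four sensors per mode. Generically each factor matrix then has k-rank 4, so 4 + 4 + 4 = 12 >= 2R + 2 and, by Kruskal's condition, the CP decomposition is essentially unique:

```python
import numpy as np

rng = np.random.default_rng(0)
I, R = 4, 5                          # 4 sensors per mode, 5 sources
A, B, C = (rng.standard_normal((I, R)) for _ in range(3))

# CP model: X is the sum of R = 5 rank-one (outer-product) terms
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# every matrix unfolding has rank at most 4, yet the tensor rank can be 5
print(X.shape, np.linalg.matrix_rank(X.reshape(I, -1)))  # (4, 4, 4) 4
```

A matrix factorization of any unfolding can recover at most 4 components, while the tensor structure still identifies all 5 sources.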
Uniqueness
Fundamental advantages over matrix-based counterparts
In general, matrix decompositions are not unique.
Why don't we have uniqueness in the matrix case? Let's take a look ...
Matrix case
Conclusion
- the columns of the mixing matrix can be identified individually
- blind source separation (BSS)
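The rotational ambiguity behind the matrix case can be verified in a few lines of numpy (sizes are hypothetical): any invertible matrix T turns one factorization of the data into another, so the individual columns of the mixing matrix cannot be recovered from the data matrix alone:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 2))    # mixing matrix: 6 sensors, 2 sources
S = rng.standard_normal((2, 100))  # source signals
X = A @ S                          # observed data matrix

T = np.array([[1.0, 2.0],          # any invertible 2 x 2 matrix
              [0.5, 3.0]])
A2, S2 = A @ T, np.linalg.inv(T) @ S

print(np.allclose(X, A2 @ S2))     # True: same data, different factors
```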
Multilinear rank reduction
Fundamental advantages over matrix-based counterparts
Matrix case
Conclusion
- more efficient denoising: the multilinear structure is exploited, so more noise is suppressed
- many applications, e.g., chemometrics, psychometrics, computer vision, watermarking, data mining, array processing, independent component analysis (ICA), image and video processing, …
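The denoising gain can be sketched with a truncated HOSVD in plain numpy (sizes, ranks, and noise level are hypothetical choices): the noisy tensor is projected onto the leading n-mode singular vectors of each unfolding, which removes the noise lying outside the small multilinear-rank signal subspaces:

```python
import numpy as np

def unfold(X, n):
    """n-mode unfolding: the n-mode vectors of X become the columns of a matrix."""
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

rng = np.random.default_rng(2)
dims, ranks = (8, 8, 8), (2, 2, 2)

# low multilinear-rank signal: core G multiplied by orthonormal factors
U = [np.linalg.qr(rng.standard_normal((d, r)))[0] for d, r in zip(dims, ranks)]
G = np.zeros(ranks)
G[0, 0, 0], G[1, 1, 1] = 10.0, 7.0
X_clean = np.einsum('abc,ia,jb,kc->ijk', G, *U)
X_noisy = X_clean + 0.1 * rng.standard_normal(dims)

# truncated HOSVD: leading n-mode singular vectors of each unfolding
Uh = [np.linalg.svd(unfold(X_noisy, n))[0][:, :ranks[n]] for n in range(3)]
core = np.einsum('ijk,ia,jb,kc->abc', X_noisy, *Uh)
X_den = np.einsum('abc,ia,jb,kc->ijk', core, *Uh)

def err(Y):
    return np.linalg.norm(Y - X_clean)

print(err(X_noisy), err(X_den))    # the truncation suppresses most of the noise
```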
Improved subspace estimate
Fundamental advantages over matrix-based counterparts
Matrix case
Conclusion
- multidimensional subspace-based parameter estimation schemes can be improved by multilinear rank reduction
- this yields an improved subspace estimate and therefore a higher estimation accuracy
- many applications, e.g., channel modeling, surveillance RADAR, microwave imaging, positioning, blind channel estimation, …
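The improvement itself can be sketched in numpy (again with hypothetical sizes and noise level): the column space of the 1-mode unfolding is estimated once directly from the noisy tensor (the matrix-based estimate) and once after reducing the multilinear rank in the other modes, and the quality of each estimate is measured by its largest principal angle to the true subspace:

```python
import numpy as np

def unfold(X, n):
    """n-mode unfolding: the n-mode vectors of X become the columns of a matrix."""
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def subspace_angle(U, V):
    """largest principal angle between the column spaces of U and V"""
    s = np.linalg.svd(U.T @ V, compute_uv=False)
    return np.arccos(np.clip(s.min(), -1.0, 1.0))

rng = np.random.default_rng(3)
dims, ranks = (8, 8, 8), (2, 2, 2)
U = [np.linalg.qr(rng.standard_normal((d, r)))[0] for d, r in zip(dims, ranks)]
G = np.zeros(ranks)
G[0, 0, 0], G[1, 1, 1] = 10.0, 7.0
X = np.einsum('abc,ia,jb,kc->ijk', G, *U) + 0.1 * rng.standard_normal(dims)

# matrix-based estimate: SVD of the 1-mode unfolding of the noisy tensor
U1_mat = np.linalg.svd(unfold(X, 0))[0][:, :2]

# tensor-based estimate: first reduce the multilinear rank in modes 2 and 3,
# then estimate the 1-mode subspace from the cleaned tensor
U2 = np.linalg.svd(unfold(X, 1))[0][:, :2]
U3 = np.linalg.svd(unfold(X, 2))[0][:, :2]
Xp = np.einsum('ijk,jb,kc,lb,mc->ilm', X, U2, U3, U2, U3)
U1_ten = np.linalg.svd(unfold(Xp, 0))[0][:, :2]

# angles to the true 1-mode subspace (smaller is better)
print(subspace_angle(U[0], U1_mat), subspace_angle(U[0], U1_ten))
```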