
GRIAT-CSP

https://tu-ilmenau.pageflow.io/griat-csp

Intro

Introduction and Motivation

Fundamental Concepts of Tensor Algebra*
  • n-mode vectors, n-mode unfoldings
  • multilinearity and n-mode products
  • n-ranks
Elementary Tensor Decompositions*
  • Higher-Order SVD (HOSVD)
  • PARAFAC / CANDECOMP (CP)

Selected Signal Processing*
  • Applications
  • Approximate CP and Alternating Least Squares (ALS)
  • Semi-Algebraic CP decomposition via Simultaneous Matrix Diagonalization
  • HOSVD-based subspace estimation to improve the parameter estimation accuracy in multi-dimensional harmonic retrieval problems
Conclusions*

* not in DEMO

Motivation

Tensor-based signal processing techniques offer fundamental advantages over their matrix-based counterparts:

1846: W. R. Hamilton

In 1846, William Rowan Hamilton used tensors to define the norm operation on the Clifford algebra.

1899: W. Voigt

In 1899, W. Voigt used tensors in their current meaning in crystal physics.

1900: Gregorio Ricci-Curbastro

The “inventor” of tensor calculus: “The Absolute Differential Calculus”, developed with Tullio Levi-Civita

1915: M. Grossmann, A. Einstein

„Entwurf einer verallgemeinerten Relativitätstheorie und einer Theorie der Gravitation“ (“Outline of a Generalized Theory of Relativity and of a Theory of Gravitation”):

        I. Physical part by Albert Einstein;
        II. Mathematical part by Marcel Grossmann

1927: F. L. Hitchcock

Polyadic decomposition: Foundation for modern multilinear algebra


What is a tensor?

  • like a matrix is an element of the outer product of two linear spaces, a tensor is an element of the outer product of several linear spaces
  • engineers typically work with coordinate representations obtained by fixing a basis for each space (an array of numbers)
  • for simplicity, we identify tensors with their coordinate representations
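As a small illustration (NumPy is an assumed environment, and the vectors are chosen arbitrarily): the outer product of vectors from three spaces gives an order-3 tensor, and fixing the standard bases yields its coordinate array.

```python
import numpy as np

# Three vectors from spaces of dimension 2, 3, and 4 (values are arbitrary).
a = np.array([1.0, 2.0])
b = np.array([1.0, 0.0, -1.0])
c = np.arange(4.0)

# Outer product a ∘ b ∘ c: an order-3 tensor of size 2 x 3 x 4.
T = np.einsum('i,j,k->ijk', a, b, c)

# After fixing the standard bases, the tensor is just an array of numbers.
assert T.shape == (2, 3, 4)
assert T[1, 2, 3] == a[1] * b[2] * c[3]  # = 2 * (-1) * 3 = -6
```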


tensors = bold-faced, calligraphic letters

On the slides, a tensor is written as a bold calligraphic letter. A three-dimensional array has order three, and we can draw a picture of it, just as for a scalar, a vector, and a matrix. For an order-four tensor we can no longer draw such a picture, but we can still treat it mathematically, for example in MATLAB as a tensor of size I₁ × I₂ × I₃ × I₄. It is not as visual anymore, but mathematically it is just as easy to use.
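For orders above three there is no picture, but the object is handled numerically exactly like the order-3 case; a minimal sketch (sizes chosen here for illustration):

```python
import numpy as np

# An order-4 tensor of size I1 x I2 x I3 x I4: no picture exists for it,
# but it is just an array with four indices.
I1, I2, I3, I4 = 2, 3, 4, 5
T = np.zeros((I1, I2, I3, I4))

assert T.ndim == 4             # the order of the tensor
assert T.shape == (2, 3, 4, 5)
```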


Why tensors?

Well, why even matrices?
  • matrix equations are usually more compact ⇒ they provide new insights
If you like, take a look at the "DFT" example.
Example: DFT
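The embedded video is unavailable, so here is an assumed sketch of the idea: the length-N DFT, usually written as N scalar sums, collapses into the single compact matrix equation X = F x, with F the DFT matrix.

```python
import numpy as np

N = 8
n = np.arange(N)
# DFT matrix: F[k, n] = exp(-2*pi*i*k*n / N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N)

x = np.random.default_rng(1).standard_normal(N)
X = F @ x  # the entire transform as one matrix-vector product

assert np.allclose(X, np.fft.fft(x))  # agrees with the FFT
```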

Not a different data model, but more compact
  • provides new insights / solutions

More than two dimensions:
  • even more compact,
  • more structure is preserved



Identifiability

The first advantage is that the rank of a tensor can greatly exceed its dimensions.

Let's look at the matrix case...


  • the rank of a tensor can greatly exceed its dimensions
  • more sources than sensors can be identified
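A minimal numerical sketch of the matrix-side limitation (the tensor-rank claim itself is stated in the text without proof): the rank of an I × J matrix can never exceed min(I, J), whereas, for example, a generic real 2 × 2 × 2 tensor has rank 3.

```python
import numpy as np

rng = np.random.default_rng(0)

# Matrix case: the rank is capped by the smaller dimension.
M = rng.standard_normal((3, 5))
assert np.linalg.matrix_rank(M) <= min(M.shape)

# Tensor case (claim from the text): a generic real 2 x 2 x 2 tensor has
# rank 3, i.e., it needs 3 rank-one terms a ∘ b ∘ c even though every
# dimension is only 2, so the rank exceeds all of its dimensions.
```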


Uniqueness

Uniqueness is also a big advantage: a bilinear (matrix) decomposition requires additional constraints to be unique. Without such constraints, there is no uniqueness in matrix decompositions.

Why not? Let's take a look at the matrix case ...
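The matrix-case argument can be sketched numerically (an assumed illustration, not from the original slides): any factorization X = A B can be rewritten as X = (A M)(M⁻¹ B) for an arbitrary invertible M, so the individual factors cannot be identified without extra constraints.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 2))
B = rng.standard_normal((2, 5))
X = A @ B

# Any invertible M (a random M is invertible with probability 1) yields an
# equally valid factorization of X with different factors.
M = rng.standard_normal((2, 2))
A2 = A @ M
B2 = np.linalg.inv(M) @ B

assert np.allclose(A2 @ B2, X)  # same product ...
assert not np.allclose(A2, A)   # ... but different factors
```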

trilinear/multilinear (tensor) decomposition: essentially unique up to permutation and scaling
  • columns of mixing matrix can be identified individually
  • blind source separation (BSS)


Multilinear rank reduction

  • more efficient denoising: the multilinear structure is exploited, so more noise is suppressed
  • many applications, e.g., chemometrics, psychometrics, computer vision, watermarking, data mining, array processing, independent component analysis (ICA), image and video processing, …
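A minimal sketch of what multilinear rank reduction does (NumPy-based; all names and sizes are chosen here for illustration): each mode of the tensor is projected onto the dominant singular subspace of its unfolding, in the style of a truncated HOSVD.

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: index `mode` becomes the row index."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def multilinear_rank_truncation(T, ranks):
    """Project T onto the dominant rank-r subspace of every mode
    (a truncated-HOSVD-style projection)."""
    S = T
    for k, r in enumerate(ranks):
        # leading r left singular vectors of the mode-k unfolding
        Uk = np.linalg.svd(unfold(S, k), full_matrices=False)[0][:, :r]
        # project mode k onto span(Uk): S <- S x_k (Uk Uk^T)
        S = np.moveaxis(np.tensordot(Uk @ Uk.T, S, axes=([1], [k])), 0, k)
    return S

# A noise-free tensor of multilinear rank (2, 2, 2) is reproduced exactly.
rng = np.random.default_rng(3)
G = rng.standard_normal((2, 2, 2))
A, B, C = (rng.standard_normal((d, 2)) for d in (5, 6, 7))
T = np.einsum('abc,ia,jb,kc->ijk', G, A, B, C)
assert np.allclose(multilinear_rank_truncation(T, (2, 2, 2)), T)
```

On noisy data, the same projection keeps only the components that fit the low multilinear rank structure, which is why more noise is suppressed than with a single matrix SVD.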


Improved subspace estimate

We have already seen subspace estimation in the matrix case ...

  • multidimensional subspace-based parameter estimation schemes can be improved by multilinear rank reduction
  • this yields an improved subspace estimate and therefore higher accuracy
  • many applications, e.g., channel modeling, surveillance RADAR, microwave imaging, positioning, blind channel estimation, …
