UTCS Colloquium / FACULTY CANDIDATE: Lek-Heng Lim, Stanford University. Ten Ways to Decompose a Tensor. ACES 2.302, Thursday, April 5, 2007, 11:00 a.m.
There is a signup schedule for this event.
Speaker Name: Lek-Heng Lim - FACULTY CANDIDATE
Speaker Affiliation: Stanford University
Date: April 5, 2007
Start Time: 11:00 a.m.
Location: ACES 2.302
Host: Inderjit Dhillon
Talk Title: Ten Ways to Decompose a Tensor
Talk Abstract:
In scientific and statistical computing, one often reduces the problem at hand -- be it a problem involving differential equations, nonlinear optimization, or parameter estimation -- to a simpler problem (or a sequence of these) that requires nothing more than linear algebra. In fact, the solution of linear systems alone accounts for more than 70% of all supercomputing time in the world.
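As a minimal illustration of this reduction (not part of the talk; the system and data here are invented), the workhorse operation behind most of that supercomputing time is a direct linear solve:

```python
import numpy as np

# A small linear system Ax = b, the basic unit of work in
# scientific computing that the abstract refers to.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# Direct solve; NumPy dispatches to an LU-based LAPACK routine.
x = np.linalg.solve(A, b)
print(x)  # -> [2. 3.]
```

In practice the matrix A arises from discretizing a differential equation or linearizing an optimization problem, and it is this step that the talk argues cannot always absorb genuinely multilinear problems.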
However, with the use of increasingly sophisticated sensor devices, experimental methodologies, and mathematical models, we now see a new generation of problems in scientific and statistical computing that cannot be reduced to standard problems in numerical linear algebra. It is thus pertinent to enlarge the arsenal of computational tools at our disposal. Among the various plausible extensions to numerical linear algebra, one will find the capability of dealing with multilinearity to be among the most natural, desirable, and powerful -- if we could do tensor computations (numerical multilinear algebra) as effectively as matrix computations (numerical linear algebra), then we would be able to address many of the new problems arising in modern scientific and statistical computing.
It is not coincidental that the decompositional approach to matrix computations has been named one of the Top 10 Algorithms of the 20th Century. If numerical linear algebra is the foundation of scientific computing, then matrix decompositions may be considered the foundation of numerical linear algebra. So the development of numerical multilinear algebra ought to begin with a few basic tensor decompositions.
In this talk we will present ten decompositions of tensors -- the speaker has developed four of these and contributed to studies of the other six. Our list will include the tensorial generalizations of LU/LDU decomposition, QR/complete orthogonal factorization, eigenvalue decomposition (EVD), singular value decomposition (SVD), nonnegative matrix factorization (NMF), Kronecker product decomposition, and more. We will discuss the similarities and differences of these decompositions with their matrix counterparts, as well as the various challenges in numerical multilinear algebra of which these tensor decompositions form a cornerstone.
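To make one such tensorial generalization concrete (an illustrative sketch with invented data, not the speaker's formulation): the higher-order SVD computes an orthogonal factor matrix for each mode of a tensor from the SVD of that mode's unfolding, together with a core tensor playing the role of the singular values.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: arrange the mode-n fibers of T as columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Mode-n product T x_n M: apply matrix M along axis `mode` of T."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T):
    """Higher-order SVD: orthogonal factors U_n from each unfolding,
    and core G with T = G x_1 U_1 x_2 U_2 ... x_d U_d."""
    Us = [np.linalg.svd(unfold(T, n))[0] for n in range(T.ndim)]
    G = T
    for n, U in enumerate(Us):
        G = mode_multiply(G, U.T, n)
    return G, Us

T = np.arange(24, dtype=float).reshape(2, 3, 4)  # toy 3-way tensor
G, Us = hosvd(T)

# Reconstruct by multiplying the core back along each mode.
R = G
for n, U in enumerate(Us):
    R = mode_multiply(R, U, n)
print(np.allclose(R, T))  # -> True
```

Because each factor matrix is square and orthogonal, the reconstruction is exact; truncating the factors instead yields the approximation problems mentioned below.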
To every tensor decomposition there is an associated approximation problem. We will see how these may be applied to multilinear statistical models that generalize vector space models, independent component analysis, graphical models/Bayesian networks, and model reduction. We will illustrate these with selected applications in bioinformatics, computer vision, signal processing, spectroscopy, and sensor location.
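In the matrix case, the associated approximation problem has a clean solution: by the Eckart-Young theorem, truncating the SVD gives the best low-rank approximation in the Frobenius norm. A minimal sketch (with invented data; the tensor analogues discussed in the talk are substantially harder):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5))

def best_rank_k(A, k):
    """Best rank-k approximation of A in the Frobenius norm:
    keep the k largest singular triplets (Eckart-Young)."""
    U, s, Vh = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vh[:k, :]

A2 = best_rank_k(A, 2)
print(np.linalg.matrix_rank(A2))  # -> 2
```

The approximation error is the norm of the discarded singular values, and it can only decrease as k grows; for tensors, even the existence of a best low-rank approximation is a delicate question.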