at the
American Institute of Mathematics, San Jose, California
organized by
Rasmus Bro, Michael Friedlander, Tamara G. Kolda, and Stephen Wright
Further advances in tensor decompositions depend critically on advanced optimization algorithms. Computational tools have not changed significantly in the past four decades and are often based on a simple alternating least squares (ALS) approach, which can be slow and comes with no guarantee of convergence to a useful solution. Despite these drawbacks, ALS remains the method of choice because it is quite general and allows for useful modifications (for example, to problems with missing data). Recent work has shown that optimization methods other than ALS can provide superior solutions in specific situations. Our workshop seeks to further this line of research by bringing together leading experts in numerical optimization and tensor decompositions, with the purpose of developing optimization-based tensor decomposition methods that are robust, accurate, numerically stable, and scalable. Furthermore, these methods should allow constraints such as nonnegativity to be imposed on the parameters; they should handle missing data efficiently and accurately; they should support sparse data and sparse solutions; and they should accommodate formulations that involve alternative loss functions such as (generalized) weighted least squares.
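For readers unfamiliar with the ALS approach discussed above, the following is a minimal sketch of the basic CP-ALS iteration for a three-way tensor, written in Python with NumPy. It is an illustration under simplifying assumptions (dense data, fixed rank, a fixed iteration count), not the method of any particular participant or library; the helper names unfold, khatri_rao, and cp_als are invented for this sketch.

    import numpy as np

    def unfold(X, mode):
        """Mode-n unfolding of a tensor (Kolda-Bader convention)."""
        return np.reshape(np.moveaxis(X, mode, 0), (X.shape[mode], -1), order='F')

    def khatri_rao(U, V):
        """Column-wise Kronecker product of U (m x R) and V (n x R) -> (m*n x R)."""
        R = U.shape[1]
        return (U[:, None, :] * V[None, :, :]).reshape(-1, R)

    def cp_als(X, rank, n_iter=100, seed=0):
        """Rank-`rank` CP decomposition of a 3-way tensor by alternating least squares."""
        assert X.ndim == 3
        rng = np.random.default_rng(seed)
        A, B, C = (rng.standard_normal((dim, rank)) for dim in X.shape)
        for _ in range(n_iter):
            # Update each factor in turn with the other two held fixed;
            # each subproblem is an ordinary linear least-squares problem.
            A = unfold(X, 0) @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
            B = unfold(X, 1) @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
            C = unfold(X, 2) @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
        return A, B, C

Each factor update is solved exactly as a linear least-squares problem, which is what makes ALS simple and general. The drawbacks noted above arise because the overall objective is nonconvex: the iteration only monotonically decreases the fit error and need not converge to a useful solution.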
The goal of this workshop is to foster a new scientific community to facilitate the development of new decomposition methods and to provide fundamentally new insights into both tensor decompositions and numerical optimization. This community is expected to have an impact in many diverse areas (including those listed above) in the years to come.
The workshop schedule.
A report on the workshop activities.
Papers arising from the workshop: