SLAM: Structured Linear Algebra for Sustainable Machine Learning

General Information
Brief description: This project develops new numerical methods based on structured linear algebra, with applications to efficient algorithms for signal processing, machine learning, and artificial intelligence. The objective is to lower the costs associated with modern machine learning methods, which is essential to ensure their widespread adoption and their long-term sustainability. Because machine learning methods rely heavily on numerical linear algebra (mostly matrix calculations), in this project we propose new numerical methods based on new structured matrices to reduce the storage and computational complexity of training and running machine learning models.
Team

Description

This project develops new numerical methods based on structured linear algebra, with applications to efficient algorithms for signal processing, machine learning, and artificial intelligence. State-of-the-art machine learning techniques are revolutionizing many fields of science (computer science, biology, economics, and business, to name a few) and society at large by taking over increasingly complex tasks from human operators. While these technological breakthroughs have immense benefits and promise to transform the global economy, they also exhibit at least one major, potentially crippling, drawback: the high cost of operating state-of-the-art machine learning methods, which require specialized hardware infrastructure and large amounts of energy (direct monetary costs and indirect costs in the form of damage to the environment). We are quickly reaching a situation where only a few large companies or well-funded research groups can afford to run large-scale state-of-the-art machine learning methods, and even in those cases the environmental damage due to CO2 emissions is unacceptable. Therefore, lowering the costs associated with modern machine learning methods is essential to ensure their widespread adoption and their long-term sustainability. Because machine learning methods rely heavily on numerical linear algebra (mostly matrix calculations), in this project we propose new numerical methods based on new structured matrices to reduce the storage and computational complexity of training and running machine learning models. We expect our work to have a direct impact on the latest machine learning techniques, such as optimization techniques, dictionary learning for sparse representations, kernel methods, and neural networks.
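To make the promised savings concrete, here is a minimal sketch, illustrative only and not one of the project's own methods, using a classical structured family: circulant matrices. The full n x n matrix is determined by its first column (O(n) storage instead of O(n^2)), and its matrix-vector product can be computed in O(n log n) via the FFT instead of O(n^2).

```python
# Illustrative sketch: a circulant matrix is fully described by its first
# column, and multiplying it with a vector is a circular convolution,
# which the FFT computes in O(n log n) without forming the full matrix.
import numpy as np
from scipy.linalg import circulant

rng = np.random.default_rng(0)
n = 1024
c = rng.standard_normal(n)  # first column defines the whole matrix
x = rng.standard_normal(n)

# Dense path: materialize the full n x n matrix (O(n^2) storage and time).
C = circulant(c)
y_dense = C @ x

# Structured path: diagonalize with the FFT (O(n) storage, O(n log n) time).
y_fft = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

assert np.allclose(y_dense, y_fft)
```

Replacing a dense operator with a structured one that admits a fast multiplication is exactly the kind of saving that makes structured matrices attractive for large-scale learning.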
 
 
Objectives

The project has two overarching objectives:

1. Develop new structured linear algebraic methods, with applications to:
   - new preconditioners for linear systems (a generic preconditioning sketch follows this list),
   - efficient factorizations relevant to eigenvalue and singular value decompositions,
   - non-negative matrix factorizations.
2. Extend structured matrices to optimization techniques:
   - structured gradient descent and quasi-Newton methods,
   - structured matrices for kernel methods,
   - structured matrices for neural networks.
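As referenced in the first objective, the sketch below illustrates why preconditioners matter, using the textbook Jacobi (diagonal) preconditioner rather than any of the new preconditioners this project develops: a cheap approximation of A^{-1} can sharply cut the iteration count of an iterative solver for A x = b.

```python
# Generic illustration (textbook Jacobi preconditioning): conjugate gradient
# on an ill-conditioned symmetric positive definite system A x = b, with and
# without a diagonal preconditioner.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

n = 2000
# SPD test matrix: tridiagonal and strictly diagonally dominant, with a
# widely varying diagonal that makes the system ill-conditioned.
d = np.linspace(3.0, 1e4, n)
A = diags([-np.ones(n - 1), d, -np.ones(n - 1)], [-1, 0, 1], format="csr")
b = np.ones(n)

iters = {"plain": 0, "jacobi": 0}
def counter(key):
    def cb(xk):
        iters[key] += 1
    return cb

# Plain conjugate gradient.
x_plain, _ = cg(A, b, callback=counter("plain"))

# Jacobi preconditioner: M approximates A^{-1} by 1 / diag(A).
M = LinearOperator((n, n), matvec=lambda v: v / d)
x_prec, _ = cg(A, b, M=M, callback=counter("jacobi"))

print(iters)  # the preconditioned run typically needs far fewer iterations
```

The project's papers listed below pursue this goal with richer structured approximations that remain cheap to apply.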
Papers

The following papers summarize the results of this research project. Each paper includes source code freely available on GitHub.
 
 
- C. Rusu and L. Rosasco, "Constructing fast approximate eigenspaces with application to the fast graph Fourier transforms", IEEE Transactions on Signal Processing, vol. 69, pp. 5037-5050.
- C. Rusu, "An iterative coordinate descent algorithm to compute sparse low-rank approximations", IEEE Signal Processing Letters, vol. 29.
- C. Rusu, "An iterative Jacobi-like algorithm to compute a few sparse eigenvalue-eigenvector pairs" (available online).
- C. Rusu and P. Irofti, "Efficient and parallel separable dictionary learning", IEEE 27th International Conference on Parallel and Distributed Systems (ICPADS).
- C. Rusu and L. Rosasco, "Fast approximation of orthogonal matrices and application to PCA", Signal Processing, vol. 194.
- C. Rusu, "Learning multiplication-free linear transformations", Digital Signal Processing, vol. 126.
- P. Irofti, C. Rusu and A. Pătraşcu, "Dictionary Learning with Uniform Sparse Representations for Anomaly Detection", IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
- D. Ilie-Ablachim and C. Rusu, "Real-time preconditioners for solving systems of linear equations" (under development, 2022).
- D. Ilie-Ablachim, I. Culic, and C. Rusu, "Online preconditioner design for kernel machines" (under development, 2022).
- D. Ilie-Ablachim and C. Rusu, "New deterministic algorithms for online PCA" (under development, 2022).
- C. Rusu and N. Gonzalez-Prelcic, "A novel approach for unit-modulus least-squares optimization problems" (under development, 2022).
- N. Gonzalez-Prelcic, E. Dominguez-Jimenez and C. Rusu, "Multicoset-based deterministic measurement matrices for compressed sensing of sparse multiband signals" (under development, 2022).
- J. Palacios, N. Gonzalez-Prelcic and C. Rusu, "Multidimensional orthogonal matching pursuit: theory and application to high accuracy joint localization and communication at mmWave" (under development, 2022).