
Scalable Learning in Reproducing Kernel Kreĭn Spaces


Dino Oglic



We provide the first mathematically complete derivation of the Nyström method for low-rank approximation of indefinite kernels and propose an efficient method for finding an approximate eigendecomposition of such kernel matrices. Building on this result, we devise highly scalable methods for learning in reproducing kernel Kreĭn spaces. The devised approaches provide a principled and theoretically well-founded means to tackle large-scale learning problems with indefinite kernels. The main motivation for our work comes from problems with structured representations (e.g., graphs, strings, time series), where it is relatively easy to devise a pairwise (dis)similarity function based on intuition and/or knowledge of domain experts. Such functions are typically not positive definite, and it is often well beyond the expertise of practitioners to verify this condition. The effectiveness of the devised approaches is evaluated empirically using indefinite kernels defined on structured and vectorial data representations.
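The core idea the abstract describes can be illustrated with a minimal sketch of a Nyström-style approximate eigendecomposition that tolerates an indefinite kernel. This is not the authors' exact algorithm; it is an illustrative implementation of the general scheme, assuming `K_nm` holds the kernel evaluations between all n points and m landmarks, `K_mm` the landmark block. The only change relative to the positive-definite case is that eigenvalues may be negative, so components are selected by magnitude rather than value:

```python
import numpy as np

def nystroem_indefinite(K_nm, K_mm, rank):
    """Approximate eigendecomposition K ~ V diag(w) V^T for a possibly
    indefinite symmetric kernel matrix, via m landmark points.

    K_nm : (n, m) kernel block between all points and landmarks
    K_mm : (m, m) kernel block among landmarks
    """
    # Eigendecompose the landmark block; for an indefinite kernel
    # some eigenvalues may be negative.
    w, U = np.linalg.eigh(K_mm)
    # Keep the `rank` eigenvalues of largest *magnitude*, not largest
    # value -- large negative components matter in a Krein space.
    idx = np.argsort(-np.abs(w))[:rank]
    w, U = w[idx], U[:, idx]
    # Nystrom extension of the landmark eigenvectors to all n points.
    V = K_nm @ U / w
    return V, w  # K is approximated by (V * w) @ V.T
```

With all points used as landmarks and full rank, the approximation reproduces the kernel matrix exactly, negative eigenvalues included; the scalability comes from taking m and the rank much smaller than n.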


Oglic, D., & Gärtner, T. (2019). Scalable Learning in Reproducing Kernel Kreĭn Spaces.

Conference Name 36th International Conference on Machine Learning (ICML 2019)
Start Date Jun 9, 2019
End Date Jun 15, 2019
Acceptance Date Apr 22, 2019
Online Publication Date Jun 13, 2019
Publication Date Jun 13, 2019
Deposit Date Jun 24, 2019
Publicly Available Date Jun 24, 2019
Volume 97
Pages 4912-4921
Series Title Proceedings of Machine Learning Research
Series ISSN 2640-3498

