Alex Lambert

Researcher
KU Leuven
E-mail: alex.lambert [@] kuleuven [DOT] be

About me

I'm currently on leave from academia, working as a data engineer at Dataminded.

Before that, I was a researcher at KU Leuven, where I worked with Johan Suykens on kernel methods and duality.

I hold a PhD in Machine Learning from Télécom Paris, prepared under the supervision of Florence d'Alché-Buc and Zoltan Szabo.
My thesis focused on efficiently predicting functional outputs with kernels; the slides and manuscript are available.

Before that, I graduated from Télécom Paris and obtained the M2 Data Science degree from Institut Polytechnique de Paris.
More information about my background can be found on my resume.

Research Interests

I'm broadly interested in Machine Learning, and currently focus my attention on

  • Operator-Valued Kernels, Integral Operators, Random Features for Large Scale Learning

  • Convex Optimization, Shape Constraints, Infinite-dimensional Lagrange Multipliers

  • Multi-Task Learning, Quantile Regression, Regularization

  • Kernel PCA, Kernel SVD

My list of publications is available here.

News

  • New ICML paper accepted! “Learning in Feature Spaces via Coupled Covariances: Asymmetric Kernel SVD and Nyström method” (link).

  • I was part of the organizing team of the DEEPK workshop on deep learning and kernel methods in Leuven.

  • I gave a talk at the MIND/SODA team seminar on “Robustness and sparsity through Moreau envelopes in kernel-based settings” (slides).

  • Our paper “Extending Kernel PCA through Dualization: Sparsity, Robustness, and Fast Algorithms” has been accepted at ICML!