A Theory of Feature Learning in Kernel Models

Speaker: Feng Ruan (Northwestern University)

Abstract: We study feature learning in a compositional variant of kernel ridge regression in which the predictor is applied to a learnable linear transformation of the input. When the response depends on the input only through a low-dimensional predictive subspace, we show that all global minimizers of the population objective for the linear transformation annihilate directions orthogonal to this subspace, and in certain regimes, exactly identify the subspace. Moreover, we show that global minimizers of the finite-sample objective inherit the exact same low-dimensional structure with high probability, even without any explicit penalization on the linear transformation.
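The model in the abstract can be illustrated with a small numerical sketch. The code below (an illustration, not the speaker's method: the kernel, bandwidth, toy data, and ridge penalty are all assumptions) fits kernel ridge regression to inputs transformed by a linear map W, and compares the finite-sample objective when W spans the true one-dimensional predictive subspace versus an orthogonal direction. The objective is visibly smaller in the former case, consistent with the claim that minimizers annihilate directions orthogonal to the predictive subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the response depends on x only through its first coordinate,
# so the predictive subspace is span(e_1).
n, d = 60, 5
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0])

lam = 1e-2  # ridge penalty (illustrative choice)

def krr_objective(W):
    """Finite-sample kernel ridge objective on transformed inputs Z = X W^T.

    Uses a Gaussian kernel with unit bandwidth. The ridge solution is
    alpha = (K + n*lam*I)^{-1} y, and the objective is
    (1/n) * ||y - K alpha||^2 + lam * alpha^T K alpha.
    """
    Z = X @ W.T
    sq_dists = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-0.5 * sq_dists)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    resid = y - K @ alpha
    return (resid @ resid) / n + lam * (alpha @ K @ alpha)

# W aligned with the predictive subspace vs. orthogonal to it.
W_good = np.zeros((1, d)); W_good[0, 0] = 1.0  # spans span(e_1)
W_bad = np.zeros((1, d)); W_bad[0, 1] = 1.0    # orthogonal direction

print("aligned W:   ", krr_objective(W_good))
print("orthogonal W:", krr_objective(W_bad))
```

The aligned map yields a much smaller objective because the kernel predictor can fit y = sin(x_1) from x_1 alone, while the orthogonal map discards all predictive information; a full treatment would optimize W jointly, which the talk analyzes at the population and finite-sample levels.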

Time: 15:00–16:00, Mar. 23, 2026

Location: R1510, SIMIS


About the Speaker: Feng Ruan is an Assistant Professor in the Department of Statistics and Data Science at Northwestern University. His research lies at the intersection of machine learning, statistics, and optimization. He works broadly on two themes: representation learning, particularly how models discover low-dimensional predictive structure in data; and the variational and algorithmic foundations of nonsmooth and nonconvex optimization problems arising in statistical learning.
