In Hainmueller and Hazlett (2014), the use of Kernel Regularized Least Squares (KRLS) was proposed for addressing modeling and inference problems in social science. KRLS leverages machine learning techniques designed for regression and classification tasks, avoiding reliance on linearity or additivity assumptions. The method constructs a flexible hypothesis space using kernels as radial basis functions and identifies the best-fitting surface by minimizing a complexity-penalized least squares problem.
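To make the idea concrete, here is a minimal NumPy sketch of this kind of kernel ridge estimator with a Gaussian (RBF) kernel. This is an illustration only, not the authors' KRLS (R) or krls (Stata) implementation; the bandwidth `sigma2` and penalty `lam` are illustrative choices (the paper discusses principled defaults such as setting the bandwidth to the number of covariates and choosing the penalty by leave-one-out cross-validation):

```python
import numpy as np

def gaussian_kernel(X, Z, sigma2):
    """Gaussian kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / sigma2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / sigma2)

def krls_fit(X, y, lam, sigma2):
    """Solve the penalized least-squares problem in closed form:
    alpha = (K + lam * I)^{-1} y."""
    K = gaussian_kernel(X, X, sigma2)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def krls_predict(X_train, X_test, alpha, sigma2):
    """Fitted surface at new points: sum of kernel weights times alpha."""
    return gaussian_kernel(X_test, X_train, sigma2) @ alpha
```

Because the fitted surface is a weighted sum of kernels centered on the observed points, pointwise derivatives of the surface can be computed analytically, which is what makes the approach interpretable in the way the paper emphasizes.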
We argue that KRLS is particularly well-suited for social science applications because it avoids strong parametric assumptions while remaining interpretable, similar to generalized linear models. Additionally, it allows for the exploration of nonlinearities, interactions, and heterogeneous effects. To support other researchers, we developed an R package and a Stata routine that make these methods accessible (Ferwerda, Hainmueller, and Hazlett 2017).
@article{hainmueller2014kernel,
  title     = {Kernel regularized least squares: Reducing misspecification bias with a flexible and interpretable machine learning approach},
  author    = {Hainmueller, Jens and Hazlett, Chad},
  journal   = {Political Analysis},
  volume    = {22},
  number    = {2},
  pages     = {143--168},
  year      = {2014},
  publisher = {Cambridge University Press},
  doi       = {10.1093/pan/mpt019},
  url       = {https://doi.org/10.1093/pan/mpt019},
}
@article{ferwerda2017kernel,
  title   = {Kernel-based regularized least squares in R (KRLS) and Stata (krls)},
  author  = {Ferwerda, Jeremy and Hainmueller, Jens and Hazlett, Chad J.},
  journal = {Journal of Statistical Software},
  volume  = {79},
  number  = {3},
  pages   = {1--26},
  year    = {2017},
  doi     = {10.18637/jss.v079.i03},
  url     = {https://doi.org/10.18637/jss.v079.i03},
}