Dimitrije Marković
DySCO meeting 16.02.2022
Simulated data
Piironen, Juho, and Aki Vehtari. "Sparsity information and regularization in the horseshoe and other shrinkage priors." Electronic Journal of Statistics 11.2 (2017): 5018-5051.
Half-Cauchy distribution
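For reference, the half-Cauchy density with scale \( \tau_0 \), used in the horseshoe construction of Piironen and Vehtari as the prior for the global and local scale parameters, is

\[
p(x \mid \tau_0) = \frac{2}{\pi \tau_0 \left[ 1 + (x/\tau_0)^2 \right]}, \qquad x \ge 0 .
\]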
Effective number of non-zero coefficients
Expected effective number of non-zero coefficients
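A reconstruction of the key quantities from the cited paper (assuming standardized predictors): with \( n \) observations, noise level \( \sigma \), global scale \( \tau \), and local scales \( \lambda_j \), the shrinkage factor of coefficient \( j \) and the effective number of non-zero coefficients are

\[
\kappa_j = \frac{1}{1 + n \sigma^{-2} \tau^2 \lambda_j^2},
\qquad
m_{\mathrm{eff}} = \sum_{j=1}^{D} (1 - \kappa_j),
\]

and the prior expectation of \( m_{\mathrm{eff}} \) given \( \tau \) and \( \sigma \) is

\[
\mathbb{E}\left[ m_{\mathrm{eff}} \mid \tau, \sigma \right]
= \frac{\tau \sigma^{-1} \sqrt{n}}{1 + \tau \sigma^{-1} \sqrt{n}} \, D .
\]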
Prior constraint on the number of non-zero coefficients
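Following the cited paper, a prior guess \( p_0 \) for the number of non-zero coefficients translates into a choice of the global scale by solving \( \mathbb{E}[m_{\mathrm{eff}} \mid \tau, \sigma] = p_0 \) for \( \tau \):

\[
\tau_0 = \frac{p_0}{D - p_0} \frac{\sigma}{\sqrt{n}} .
\]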
Agrawal, Raj, et al. "The kernel interaction trick: Fast Bayesian discovery of pairwise interactions in high dimensions." International Conference on Machine Learning. PMLR, 2019.
linear regression?
Agrawal et al. demonstrate how the linear regression problem with pairwise interactions can be expressed as a Gaussian process model with a specific kernel structure (sketched below).
The Gaussian process formulation scales better with the number of features D, but further approximations are needed for large numbers of observations N.
To me it is unclear whether the approach can be applied to nonlinear GLMs.
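As a rough numerical illustration of the idea behind the kernel trick (a sketch under simplifying assumptions, not the full method of Agrawal et al.): with independent zero-mean Gaussian priors on the linear and pairwise coefficients, here with common standard deviations tau1 and tau2 (made-up names), the implied covariance over function values equals a kernel that can be evaluated without enumerating the D(D-1)/2 interaction features.

import numpy as np

rng = np.random.default_rng(0)
N, D = 5, 4            # small example: 5 inputs, 4 features
X = rng.normal(size=(N, D))
tau1, tau2 = 0.7, 0.3  # prior std of linear and pairwise coefficients (assumed equal within each group)

# Explicit construction: f(x) = sum_j b_j x_j + sum_{j<k} b_jk x_j x_k,
# with b_j ~ N(0, tau1^2) and b_jk ~ N(0, tau2^2).
# The implied covariance of f over the N inputs is Phi @ diag(prior_var) @ Phi.T
pairs = [(j, k) for j in range(D) for k in range(j + 1, D)]
Phi = np.hstack([X, np.column_stack([X[:, j] * X[:, k] for j, k in pairs])])
prior_var = np.concatenate([np.full(D, tau1**2), np.full(len(pairs), tau2**2)])
K_explicit = Phi @ np.diag(prior_var) @ Phi.T

# Kernel form: the same covariance without building the interaction columns.
lin = X @ X.T                                   # sum_j x_j x'_j
sq = (X**2) @ (X**2).T                          # sum_j x_j^2 x'_j^2
K_kernel = tau1**2 * lin + tau2**2 * 0.5 * (lin**2 - sq)

print(np.allclose(K_explicit, K_kernel))        # True: both give the same GP covariance

The equivalence holds because \( \sum_{j<k} x_j x_k x'_j x'_k = \tfrac{1}{2}\big[ (\sum_j x_j x'_j)^2 - \sum_j x_j^2 x'^2_j \big] \), which is what turns an O(D^2) feature expansion into an O(D) kernel evaluation.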
Hierarchical shrinkage priors are essential for separating signal from noise in the presence of correlations.
Structural shrinkage priors help separate linear and pairwise contributions to the "responses".
https://github.com/dimarkov/pybefit
Generalises to logistic regression, Poisson regression, etc.
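For completeness, a minimal NumPyro sketch of linear regression with a horseshoe prior (an illustrative model written from scratch, not the pybefit implementation; names such as horseshoe_regression are made up). Replacing the Normal likelihood with a Bernoulli or Poisson one, together with the corresponding link function, gives the logistic and Poisson variants mentioned above.

import jax.numpy as jnp
import jax.random as random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def horseshoe_regression(X, y=None):
    """Linear regression with a horseshoe prior on the coefficients (illustrative sketch)."""
    N, D = X.shape
    sigma = numpyro.sample("sigma", dist.HalfNormal(1.0))      # observation noise
    tau = numpyro.sample("tau", dist.HalfCauchy(1.0))          # global shrinkage scale
    with numpyro.plate("coefs", D):
        lam = numpyro.sample("lam", dist.HalfCauchy(1.0))      # local shrinkage scales
        beta = numpyro.sample("beta", dist.Normal(0.0, tau * lam))
    mu = X @ beta
    numpyro.sample("obs", dist.Normal(mu, sigma), obs=y)

# Example usage on toy data with only a few truly non-zero coefficients.
key = random.PRNGKey(0)
X = random.normal(key, (100, 20))
true_beta = jnp.zeros(20).at[:3].set(jnp.array([2.0, -1.5, 1.0]))
y = X @ true_beta + 0.5 * random.normal(random.PRNGKey(1), (100,))

mcmc = MCMC(NUTS(horseshoe_regression), num_warmup=500, num_samples=500)
mcmc.run(random.PRNGKey(2), X, y)
mcmc.print_summary()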