Balanced Bayesian LASSO for heavy tails

Daniel F. Linder, Viral Panchal, Hani Samawi, Duchwan Ryu

Research output: Contribution to journal › Article › peer-review

Abstract

Regression procedures are not only hindered by large p and small n, but can also suffer when outliers are present or the data-generating mechanism is heavy tailed. Since penalized estimators such as the least absolute shrinkage and selection operator (LASSO) are equipped to deal with the large p, small n setting by encouraging sparsity, we combine a LASSO-type penalty with the absolute deviation loss function, instead of the standard least squares loss, to handle the presence of outliers and heavy tails. The model is cast in a Bayesian setting and a Gibbs sampler is derived to efficiently sample from the posterior distribution. We compare our method to existing methods in a simulation study as well as on a prostate cancer data set and a base deficit data set from trauma patients.
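
As context for the abstract, a minimal sketch of the kind of criterion obtained by pairing an absolute deviation loss with a LASSO-type penalty is the least absolute deviation (LAD) LASSO objective below; the symbols y_i, x_i, beta, and lambda are generic notation introduced here for illustration, not taken from the paper, whose Bayesian hierarchical formulation may differ in detail.

  \hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p}} \sum_{i=1}^{n} \left| y_i - x_i^{\top}\beta \right| + \lambda \sum_{j=1}^{p} |\beta_j|

In a Bayesian reading, the absolute deviation loss corresponds to a Laplace likelihood and the L1 penalty to independent Laplace priors on the coefficients; both can be expressed as scale mixtures of normals, which is the standard route to a Gibbs sampler for models of this type.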

Original language: English
Pages (from-to): 1115-1132
Number of pages: 18
Journal: Journal of Statistical Computation and Simulation
Volume: 86
Issue number: 6
DOIs
State: Published - Apr 12 2016

Keywords

  • Lasso
  • heavy tail
  • loss function
  • outlier
  • regression
  • sparsity
