A lasso for hierarchical interactions
Jacob Bien, Jonathan Taylor, Robert Tibshirani
(Submitted on 22 May 2012 (v1), last revised 19 Jun 2013 (this version, v3))
We add a set of convex constraints to the lasso to produce sparse interaction models that honor the hierarchy restriction that an interaction only be included in a model if one or both variables are marginally important. We give a precise characterization of the effect of this hierarchy constraint, prove that hierarchy holds with probability one and derive an unbiased estimate for the degrees of freedom of our estimator. A bound on this estimate reveals the amount of fitting "saved" by the hierarchy constraint. We distinguish between parameter sparsity (the number of nonzero coefficients) and practical sparsity (the number of raw variables one must measure to make a new prediction). Hierarchy focuses on the latter, which is more closely tied to important data collection concerns such as cost, time and effort. We develop an algorithm, available in the R package hierNet, and perform an empirical study of our method.
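The hierarchy restriction and the parameter- versus practical-sparsity distinction from the abstract can be made concrete with a short sketch. This is an illustrative Python example, not the hierNet implementation: the coefficient layout (a vector `beta` of main effects and a matrix `theta` of pairwise interactions) is an assumed representation chosen only to show what the constraint and the two sparsity counts mean.

```python
import numpy as np

# Hedged sketch (assumed representation, not hierNet's internals):
# beta[j]     = main-effect coefficient of variable j
# theta[j, k] = coefficient of the interaction between variables j and k

def satisfies_weak_hierarchy(beta, theta, tol=1e-10):
    """Weak hierarchy: theta[j, k] may be nonzero only if beta[j] or
    beta[k] is nonzero (one or both variables marginally important)."""
    p = len(beta)
    for j in range(p):
        for k in range(p):
            if abs(theta[j, k]) > tol and abs(beta[j]) <= tol and abs(beta[k]) <= tol:
                return False
    return True

def parameter_sparsity(beta, theta, tol=1e-10):
    """Number of nonzero coefficients (main effects plus interactions)."""
    return int(np.sum(np.abs(beta) > tol) + np.sum(np.abs(theta) > tol))

def practical_sparsity(beta, theta, tol=1e-10):
    """Number of raw variables one must measure to predict: any variable
    with a nonzero main effect or appearing in a nonzero interaction."""
    used = np.abs(beta) > tol
    used |= np.any(np.abs(theta) > tol, axis=0)  # variable k of theta[j, k]
    used |= np.any(np.abs(theta) > tol, axis=1)  # variable j of theta[j, k]
    return int(np.sum(used))

# Example: 4 variables; the interaction (0, 1) enters alongside beta[0],
# so the fit respects hierarchy.
beta = np.array([1.5, 0.0, 0.0, 0.0])
theta = np.zeros((4, 4))
theta[0, 1] = 0.7

print(satisfies_weak_hierarchy(beta, theta))  # True
print(parameter_sparsity(beta, theta))        # 2 nonzero coefficients
print(practical_sparsity(beta, theta))        # 2 variables to measure
```

Note how hierarchy controls practical sparsity: adding the interaction (0, 1) to a model that already uses variable 0 requires measuring only one new variable, whereas an unconstrained interaction between two variables with zero main effects would add two.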
Comments: Published in the Annals of Statistics by the Institute of Mathematical Statistics
Subjects: Methodology (stat.ME); Statistics Theory (math.ST); Machine Learning (stat.ML)
Journal reference: Annals of Statistics 2013, Vol. 41, No. 3, 1111-1141
DOI: 10.1214/13-AOS1096
Report number: IMS-AOS-AOS1096
Cite as: arXiv:1205.5050 [stat.ME] (or arXiv:1205.5050v3 [stat.ME] for this version)