Pathwise coordinate optimization

The Annals of Applied Statistics, Vol. 1, No. 2, 2007
Jerome H. Friedman, Trevor Hastie, Holger Höfling and Robert Tibshirani
Stanford University

References

Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. <i>J. Roy. Statist. Soc. Ser. B</i> <b>58</b> 267–288.

Rudin, L. I., Osher, S. and Fatemi, E. (1992). Nonlinear total variation based noise removal algorithms. <i>Phys. D</i> <b>60</b> 259–268.

Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. <i>J. Roy. Statist. Soc. Ser. B</i> <b>68</b> 49–67.

Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression (with discussion). <i>Ann. Statist.</i> <b>32</b> 407–499.

Bertsekas, D. (1999). <i>Nonlinear Programming</i>. Athena Scientific.

Breiman, L. (1995). Better subset regression using the nonnegative garrote. <i>Technometrics</i> <b>37</b> 373–384.

Chen, S. S., Donoho, D. L. and Saunders, M. A. (1998). Atomic decomposition by basis pursuit. <i>SIAM J. Sci. Comput.</i> <b>20</b> 33–61.

Daubechies, I., Defrise, M. and De Mol, C. (2004). An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. <i>Comm. Pure Appl. Math.</i> <b>57</b> 1413–1457.

Donoho, D. and Johnstone, I. (1995). Adapting to unknown smoothness via wavelet shrinkage. <i>J. Amer. Statist. Assoc.</i> <b>90</b> 1200–1224.

Friedlander, M. and Saunders, M. (2007). Discussion of “Dantzig selector” by E. Candes and T. Tao. <i>Ann. Statist.</i> <b>35</b> 2385–2391.

Fu, W. J. (1998). Penalized regressions: The bridge versus the lasso. <i>J. Comput. Graph. Statist.</i> <b>7</b> 397–416.

Gill, P., Murray, W. and Saunders, M. (1999). User's guide for SQOPT 5.3: A Fortran package for large-scale linear and quadratic programming. Technical report, Stanford Univ.

Li, Y. and Arce, G. (2004). A maximum likelihood approach to least absolute deviation regression. <i>EURASIP J. Appl. Signal Processing</i> <b>2004</b> 1762–1769.

Osborne, M., Presnell, B. and Turlach, B. (2000). A new approach to variable selection in least squares problems. <i>IMA J. Numer. Anal.</i> <b>20</b> 389–404.

Owen, A. (2006). A robust hybrid of lasso and ridge regression. Technical report, Stanford Univ.

Schlegel, P. (1970). The explicit inverse of a tridiagonal matrix. <i>Math. Comput.</i> <b>24</b> 665.

Tibshirani, R., Saunders, M., Rosset, S., Zhu, J. and Knight, K. (2005). Sparsity and smoothness via the fused lasso. <i>J. Roy. Statist. Soc. Ser. B</i> <b>67</b> 91–108.

Tibshirani, R. and Wang, P. (2007). Spatial smoothing and hot spot detection for CGH data using the fused lasso. <i>Biostatistics</i>. Advance Access published May 18, 2007.

Tseng, P. (2001). Convergence of a block coordinate descent method for nondifferentiable minimization. <i>J. Optim. Theory Appl.</i> <b>109</b> 475–494.

Van der Kooij, A. (2007). Prediction accuracy and stability of regression with optimal scaling transformations. Technical report, Dept. Data Theory, Leiden Univ.

Wang, H., Li, G. and Jiang, G. (2006). Robust regression shrinkage and consistent variable selection via the LAD-lasso. <i>J. Business Econom. Statist.</i> <b>11</b> 1–6.

Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. <i>J. Roy. Statist. Soc. Ser. B</i> <b>67</b> 301–320.

Tseng, P. (1988). Coordinate ascent for maximizing nondifferentiable concave functions. Technical Report LIDS-P-1840, Laboratory for Information and Decision Systems, Massachusetts Institute of Technology.