The HPQUANTSELECT Procedure

Quantile Regression

This section describes the basic concepts and notation for quantile regression and quantile regression model selection.

Let $\{(y_i, \mathbf{x}_i) : i = 1, \ldots, n\}$ denote a data set of observations, where the $y_i$ are responses and the $\mathbf{x}_i$ are regressors. Koenker and Bassett (1978) define the regression quantile at quantile level $\tau \in (0, 1)$ as any solution to the minimization problem

\[
\min_{\boldsymbol{\beta} \in \mathbf{R}^p} \; \sum_{i=1}^{n} \rho_{\tau}\left(y_i - \mathbf{x}_i' \boldsymbol{\beta}\right)
\]

where $\rho_{\tau}(r) = \tau r^{+} + (1 - \tau)\, r^{-}$ is the check loss function, in which $r^{+} = \max(r, 0)$ and $r^{-} = \max(-r, 0)$.
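For example, evaluating the check loss at two quantile levels shows how the asymmetry works:

\[
\rho_{0.9}(r) =
\begin{cases}
0.9\, r, & r \ge 0, \\
0.1\, (-r), & r < 0,
\end{cases}
\]

so at $\tau = 0.9$ positive residuals (underprediction) are penalized nine times as heavily as negative residuals, whereas at $\tau = 0.5$ the loss reduces to $\rho_{0.5}(r) = |r|/2$ and the fit becomes a median (least absolute deviation) regression.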

If you specify weights $w_i$, $i = 1, \ldots, n$, in the WEIGHT statement, then weighted quantile regression is carried out by solving

\[
\min_{\boldsymbol{\beta} \in \mathbf{R}^p} \; \sum_{i=1}^{n} \rho_{\tau}\!\left(w_i \left(y_i - \mathbf{x}_i' \boldsymbol{\beta}\right)\right)
\]
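As an illustrative sketch, a weighted fit at $\tau = 0.75$ might be requested as follows. The data set and variable names (mydata, y, x1-x3, w) are hypothetical, and the QUANTILE= option on the MODEL statement is assumed here to set the quantile level; only the WEIGHT statement is confirmed by the text above.

   /* Hypothetical example: weighted quantile regression at tau = 0.75.    */
   /* Data set, variables, and the QUANTILE= option are illustrative only. */
   proc hpquantselect data=mydata;
      model y = x1 x2 x3 / quantile=0.75;  /* assumed option for the quantile level */
      weight w;                            /* case weights w_i, as described above  */
   run;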

The HPQUANTSELECT procedure fits a quantile regression model by using a predictor-corrector interior point algorithm, which was originally designed for fitting support vector machine classifiers to large data sets (Gertz and Griffin 2005, 2010).
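For background, interior point methods apply here because the quantile regression problem can be recast as a linear program by splitting each residual into its positive and negative parts; in the unweighted case,

\[
\min_{\boldsymbol{\beta},\, \mathbf{u},\, \mathbf{v}} \; \tau\, \mathbf{1}'\mathbf{u} + (1 - \tau)\, \mathbf{1}'\mathbf{v}
\quad \text{subject to} \quad
\mathbf{y} - \mathbf{X}\boldsymbol{\beta} = \mathbf{u} - \mathbf{v}, \quad \mathbf{u} \ge \mathbf{0}, \; \mathbf{v} \ge \mathbf{0},
\]

where, at the optimum, $u_i = (y_i - \mathbf{x}_i'\boldsymbol{\beta})^{+}$ and $v_i = (y_i - \mathbf{x}_i'\boldsymbol{\beta})^{-}$.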
