Huber Regression

Introduction

Huber regression (Huber 1964) is a regression technique that is robust to outliers. The idea is to replace the usual least-squares loss with one that penalizes large residuals less severely; we solve

\[\begin{array}{ll} \underset{\beta}{\mbox{minimize}} & \sum_{i=1}^m \phi(y_i - x_i^T\beta) \end{array}\]

for variable \(\beta \in {\mathbf R}^n\), where the loss \(\phi\) is the Huber function with threshold \(M > 0\), \[ \phi(u) = \begin{cases} u^2 & \mbox{if } |u| \leq M \\ 2M|u| - M^2 & \mbox{if } |u| > M. \end{cases} \]

This function is identical to the least-squares penalty for small residuals, but for residuals larger than \(M\) in magnitude its penalty grows linearly rather than quadratically. It is thus more forgiving of outliers.
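
For intuition, here is a plain R sketch of this loss; the helper huber_loss below is purely illustrative, and the example that follows uses the CVXR atom huber instead.

huber_loss <- function(u, M = 1) {
    ## Quadratic for |u| <= M, linear growth beyond the threshold M
    ifelse(abs(u) <= M, u^2, 2 * M * abs(u) - M^2)
}
huber_loss(c(0.5, 1, 3))    ## 0.25, 1, 5: the residual of 3 costs 2*1*3 - 1 = 5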

Example

We generate some problem data.

n <- 1      ## Number of predictors
m <- 450    ## Number of observations
M <- 1      ## Huber threshold
p <- 0.1    ## Fraction of responses with sign flipped

## Generate problem data
set.seed(1289)
beta_true <- 5 * matrix(stats::rnorm(n), nrow = n)
X <- matrix(stats::rnorm(m * n), nrow = m, ncol = n)
y_true <- X %*% beta_true
eps <- matrix(stats::rnorm(m), nrow = m)

To illustrate the robustness of Huber regression, we randomly flip the sign of a fraction p of the responses, creating outliers.

factor <- 2*stats::rbinom(m, size = 1, prob = 1-p) - 1
y <- factor * y_true + eps
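
As a quick sanity check, we can count how many responses actually had their sign flipped; with m = 450 and p = 0.1, we expect roughly 45.

sum(factor == -1)    ## number of flipped responses, about m * p on average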

We fit the model with both ordinary least squares (OLS) and Huber regression so that we can compare the two.

library(CVXR)

beta <- Variable(n)
## Relative estimation error, written in terms of the CVXR variable beta
rel_err <- norm(beta - beta_true, "F") / norm(beta_true, "F")

## OLS
obj <- sum((y - X %*% beta)^2)
prob <- Problem(Minimize(obj))
result <- solve(prob)
beta_ols <- result$getValue(beta)
err_ols <- result$getValue(rel_err)

## Solve Huber regression problem
obj <- sum(CVXR::huber(y - X %*% beta, M))
prob <- Problem(Minimize(obj))
result <- solve(prob)
beta_hub <- result$getValue(beta)
err_hub <- result$getValue(rel_err)

Finally, as a benchmark, we solve the least-squares problem assuming we know exactly which signs were flipped; we call this the prescient estimate.

## Solve ordinary least squares assuming sign flips known
obj <- sum((y - factor*(X %*% beta))^2)
prob <- Problem(Minimize(obj))
result <- solve(prob)
beta_prs <- result$getValue(beta)
err_prs <- result$getValue(rel_err)

We can now plot the three fitted lines against the measured responses.

library(ggplot2)

d1 <- data.frame(X = X, y = y, sign = as.factor(factor))
d2 <- data.frame(X = rbind(X, X, X),
                 yHat = rbind(X %*% beta_ols,
                              X %*% beta_hub,
                              X %*% beta_prs),
                 Estimate = c(rep("OLS", m),
                              rep("Huber", m),
                              rep("Prescient", m)))
ggplot() +
    geom_point(data = d1, mapping = aes(x = X, y = y, color = sign)) +
    geom_line(data = d2, mapping = aes(x = X, y = yHat, color = Estimate))

As the plot shows, the Huber fit lies much closer to the prescient line than the OLS fit, which is pulled toward the sign-flipped responses.
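
The relative estimation errors computed above tell the same story; the Huber and prescient errors should be far smaller than the OLS error.

## Relative errors ||beta_hat - beta_true|| / ||beta_true|| for each estimate
cat("OLS:      ", err_ols, "\n")
cat("Huber:    ", err_hub, "\n")
cat("Prescient:", err_prs, "\n")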

Session Info

sessionInfo()
## R version 4.2.1 (2022-06-23)
## Platform: x86_64-apple-darwin21.6.0 (64-bit)
## Running under: macOS Ventura 13.0
## 
## Matrix products: default
## BLAS:   /usr/local/Cellar/openblas/0.3.21/lib/libopenblasp-r0.3.21.dylib
## LAPACK: /usr/local/Cellar/r/4.2.1_4/lib/R/lib/libRlapack.dylib
## 
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
## 
## attached base packages:
## [1] stats     graphics  grDevices datasets  utils     methods   base     
## 
## other attached packages:
## [1] ggplot2_3.3.6 CVXR_1.0-11  
## 
## loaded via a namespace (and not attached):
##  [1] tidyselect_1.2.0 xfun_0.34        bslib_0.4.0      slam_0.1-50     
##  [5] lattice_0.20-45  Rmosek_10.0.25   colorspace_2.0-3 vctrs_0.5.0     
##  [9] generics_0.1.3   htmltools_0.5.3  yaml_2.3.6       gmp_0.6-6       
## [13] utf8_1.2.2       rlang_1.0.6      jquerylib_0.1.4  pillar_1.8.1    
## [17] glue_1.6.2       Rmpfr_0.8-9      withr_2.5.0      DBI_1.1.3       
## [21] Rcplex_0.3-5     bit64_4.0.5      lifecycle_1.0.3  stringr_1.4.1   
## [25] munsell_0.5.0    blogdown_1.13    gtable_0.3.1     gurobi_9.5-2    
## [29] codetools_0.2-18 evaluate_0.17    labeling_0.4.2   knitr_1.40      
## [33] fastmap_1.1.0    fansi_1.0.3      cccp_0.2-9       highr_0.9       
## [37] Rcpp_1.0.9       scales_1.2.1     cachem_1.0.6     osqp_0.6.0.5    
## [41] jsonlite_1.8.3   farver_2.1.1     bit_4.0.4        digest_0.6.30   
## [45] stringi_1.7.8    bookdown_0.29    dplyr_1.0.10     Rglpk_0.6-4     
## [49] grid_4.2.1       cli_3.4.1        tools_4.2.1      magrittr_2.0.3  
## [53] sass_0.4.2       tibble_3.1.8     pkgconfig_2.0.3  rcbc_0.1.0.9001 
## [57] Matrix_1.5-1     assertthat_0.2.1 rmarkdown_2.17   R6_2.5.1        
## [61] compiler_4.2.1


References

Huber, P. J. 1964. “Robust Estimation of a Location Parameter.” Annals of Mathematical Statistics 35 (1): 73–101.