Logistic Regression

Introduction

In classification problems, the goal is to predict class membership based on predictors. Often there are two classes, and one of the most popular methods for binary classification is logistic regression (Cox 1958; Freedman 2009).

Suppose now that \(y_i \in \{0,1\}\) is a binary class indicator. The conditional response is modeled as \(y|x \sim \mbox{Bernoulli}(g_{\beta}(x))\), where \(g_{\beta}(x) = \frac{1}{1 + e^{-x^T\beta}}\) is the logistic function. We estimate \(\beta\) by maximizing the log-likelihood, which yields the optimization problem

\[ \begin{array}{ll} \underset{\beta}{\mbox{maximize}} & \sum_{i=1}^m \{ y_i\log(g_{\beta}(x_i)) + (1-y_i)\log(1 - g_{\beta}(x_i)) \}. \end{array} \]

CVXR provides the logistic atom as a shortcut for \(f(z) = \log(1 + e^z)\), which can be used to express this optimization problem. One may be tempted to use log(1 + exp(X %*% beta)) as in conventional R syntax. However, this representation of \(f(z)\) violates the DCP composition rule, so the CVXR parser will reject the problem even though the objective is convex. Users who wish to employ a function that is convex but not DCP compliant should check the documentation for a suitable atom or consider a different formulation.
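In terms of this atom, the two log terms in the likelihood become

\[ \log(g_{\beta}(x_i)) = -\log(1 + e^{-x_i^T\beta}) = -f(-x_i^T\beta), \qquad \log(1 - g_{\beta}(x_i)) = -f(x_i^T\beta), \]

which is the form used in the objective below. As a quick illustration of the DCP issue, the following sketch (using CVXR's is_dcp function) shows that the atom is accepted while the naive expression is not:

library(CVXR)
z <- Variable(1)
is_dcp(logistic(z))      ## TRUE: logistic is a recognized convex atom
is_dcp(log(1 + exp(z)))  ## FALSE: log of a convex expression violates DCP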

Example

The formulation is very similar to OLS, except for the specification of the objective.

In the example below, we demonstrate a key feature of CVXR: the ability to evaluate various functions of the variables that solve the optimization problem. For instance, the log-odds \(X\hat{\beta}\), where \(\hat{\beta}\) is the logistic regression estimate, is simply specified as X %*% beta below, and the getValue function of the result computes its value. (Any other function of the estimate can be computed similarly.)

library(CVXR)

n <- 20          ## number of predictors
m <- 1000        ## number of observations
offset <- 0      ## intercept in the generating model
sigma <- 45      ## noise standard deviation
DENSITY <- 0.2   ## fraction of nonzero entries in beta_true

set.seed(183991)
beta_true <- stats::rnorm(n)
idxs <- sample(n, size = floor((1 - DENSITY) * n), replace = FALSE)
beta_true[idxs] <- 0   ## zero out all but DENSITY * n coefficients
X <- matrix(stats::rnorm(m * n, 0, 5), nrow = m, ncol = n)
y <- sign(X %*% beta_true + offset + stats::rnorm(m, 0, sigma))  ## labels in {-1, 1}

beta <- Variable(n)
## Bernoulli log-likelihood written with the logistic atom,
## splitting the observations by class
obj <- -sum(logistic(-X[y <= 0, ] %*% beta)) - sum(logistic(X[y == 1, ] %*% beta))
prob <- Problem(Maximize(obj))
result <- solve(prob)
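The result object also carries the solver status and the optimal objective value, which are worth checking before evaluating the solution:

result$status   ## solver status, e.g. "optimal"
result$value    ## the maximized log-likelihood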

log_odds <- result$getValue(X %*% beta)    ## fitted log-odds X beta_hat
beta_res <- result$getValue(beta)          ## coefficient estimates
y_probs <- 1/(1 + exp(-X %*% beta_res))    ## fitted probabilities
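Equivalently, the fitted probabilities can be computed with base R's logistic distribution function, since \(\mbox{plogis}(q) = 1/(1 + e^{-q})\):

y_probs2 <- stats::plogis(X %*% beta_res)  ## elementwise; same values as y_probs above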

We can compare with the standard stats::glm estimate.

library(kableExtra)

d <- data.frame(y = as.numeric(y > 0), X = X)   ## recode labels to {0, 1} for glm
glm_fit <- stats::glm(formula = y ~ 0 + X, family = "binomial", data = d)
est.table <- data.frame("CVXR.est" = beta_res, "GLM.est" = coef(glm_fit))
rownames(est.table) <- paste0("$\\beta_{", 1:n, "}$")
knitr::kable(est.table, format = "html") %>%
    kable_styling("striped") %>%
    column_spec(1:3, background = "#ececec")
|                | CVXR.est   | GLM.est    |
|----------------|------------|------------|
| \(\beta_{1}\)  | -0.0305494 | 0.0305494  |
| \(\beta_{2}\)  | 0.0023528  | -0.0023528 |
| \(\beta_{3}\)  | -0.0110080 | 0.0110080  |
| \(\beta_{4}\)  | 0.0163919  | -0.0163919 |
| \(\beta_{5}\)  | 0.0157186  | -0.0157186 |
| \(\beta_{6}\)  | 0.0006251  | -0.0006251 |
| \(\beta_{7}\)  | -0.0157914 | 0.0157914  |
| \(\beta_{8}\)  | -0.0092228 | 0.0092228  |
| \(\beta_{9}\)  | 0.0173823  | -0.0173823 |
| \(\beta_{10}\) | 0.0019102  | -0.0019102 |
| \(\beta_{11}\) | -0.0100746 | 0.0100746  |
| \(\beta_{12}\) | -0.0269883 | 0.0269883  |
| \(\beta_{13}\) | 0.0233625  | -0.0233625 |
| \(\beta_{14}\) | 0.0009529  | -0.0009529 |
| \(\beta_{15}\) | -0.0016264 | 0.0016264  |
| \(\beta_{16}\) | 0.0312156  | -0.0312156 |
| \(\beta_{17}\) | 0.0038949  | -0.0038949 |
| \(\beta_{18}\) | -0.0121105 | 0.0121105  |
| \(\beta_{19}\) | 0.0246811  | -0.0246811 |
| \(\beta_{20}\) | -0.0007025 | 0.0007025  |

The sign difference is due to the coding of \(y\) as \((-1, 1)\) for CVXR rather than \((0, 1)\) for stats::glm: the objective above treats the \(y = -1\) observations as the successes, so the CVXR estimate is the negative of the glm estimate, which codes success as \(y = 1\).

So, for completeness, if we were to code \(y\) as \((0, 1)\), the objective would have to be modified as below.
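To see why the modified form works, note that with \(y_i \in \{0,1\}\) the log-likelihood simplifies to

\[ \ell(\beta) = \sum_{i=1}^m \{ y_i x_i^T\beta - \log(1 + e^{x_i^T\beta}) \} = \sum_{i: y_i = 1} x_i^T\beta - \sum_{i=1}^m f(x_i^T\beta). \]

Since \(-f(-x_i^T\beta) = x_i^T\beta - f(x_i^T\beta)\), the expression -sum(X[y <= 0, ] %*% beta) - sum(logistic(-X %*% beta)) below equals \(\ell(\beta)\) exactly.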

## Objective under the (0, 1) coding of y
obj <- -sum(X[y <= 0, ] %*% beta) - sum(logistic(-X %*% beta))
prob <- Problem(Maximize(obj))
result <- solve(prob)
beta_log <- result$getValue(beta)
est.table <- data.frame("CVXR.est" = beta_log, "GLM.est" = coef(glm_fit))
rownames(est.table) <- paste0("$\\beta_{", 1:n, "}$")
knitr::kable(est.table, format = "html") %>%
    kable_styling("striped") %>%
    column_spec(1:3, background = "#ececec")
|                | CVXR.est   | GLM.est    |
|----------------|------------|------------|
| \(\beta_{1}\)  | 0.0305494  | 0.0305494  |
| \(\beta_{2}\)  | -0.0023528 | -0.0023528 |
| \(\beta_{3}\)  | 0.0110080  | 0.0110080  |
| \(\beta_{4}\)  | -0.0163919 | -0.0163919 |
| \(\beta_{5}\)  | -0.0157186 | -0.0157186 |
| \(\beta_{6}\)  | -0.0006251 | -0.0006251 |
| \(\beta_{7}\)  | 0.0157914  | 0.0157914  |
| \(\beta_{8}\)  | 0.0092228  | 0.0092228  |
| \(\beta_{9}\)  | -0.0173823 | -0.0173823 |
| \(\beta_{10}\) | -0.0019102 | -0.0019102 |
| \(\beta_{11}\) | 0.0100746  | 0.0100746  |
| \(\beta_{12}\) | 0.0269883  | 0.0269883  |
| \(\beta_{13}\) | -0.0233625 | -0.0233625 |
| \(\beta_{14}\) | -0.0009529 | -0.0009529 |
| \(\beta_{15}\) | 0.0016264  | 0.0016264  |
| \(\beta_{16}\) | -0.0312156 | -0.0312156 |
| \(\beta_{17}\) | -0.0038949 | -0.0038949 |
| \(\beta_{18}\) | 0.0121105  | 0.0121105  |
| \(\beta_{19}\) | -0.0246811 | -0.0246811 |
| \(\beta_{20}\) | 0.0007025  | 0.0007025  |

Now, the results match perfectly.
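As a quick numerical check of the agreement (the exact discrepancy depends on solver tolerance):

max(abs(beta_log - coef(glm_fit)))  ## near zero, up to solver tolerance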

Session Info

sessionInfo()
## R version 4.2.1 (2022-06-23)
## Platform: x86_64-apple-darwin21.6.0 (64-bit)
## Running under: macOS Ventura 13.0
## 
## Matrix products: default
## BLAS:   /usr/local/Cellar/openblas/0.3.21/lib/libopenblasp-r0.3.21.dylib
## LAPACK: /usr/local/Cellar/r/4.2.1_4/lib/R/lib/libRlapack.dylib
## 
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
## 
## attached base packages:
## [1] stats     graphics  grDevices datasets  utils     methods   base     
## 
## other attached packages:
## [1] kableExtra_1.3.4 CVXR_1.0-11     
## 
## loaded via a namespace (and not attached):
##  [1] xfun_0.34         bslib_0.4.0       slam_0.1-50       lattice_0.20-45  
##  [5] Rmosek_10.0.25    colorspace_2.0-3  htmltools_0.5.3   viridisLite_0.4.1
##  [9] yaml_2.3.6        gmp_0.6-6         rlang_1.0.6       jquerylib_0.1.4  
## [13] glue_1.6.2        Rmpfr_0.8-9       Rcplex_0.3-5      bit64_4.0.5      
## [17] lifecycle_1.0.3   stringr_1.4.1     munsell_0.5.0     blogdown_1.13    
## [21] rvest_1.0.3       gurobi_9.5-2      codetools_0.2-18  evaluate_0.17    
## [25] knitr_1.40        fastmap_1.1.0     cccp_0.2-9        highr_0.9        
## [29] Rcpp_1.0.9        scales_1.2.1      cachem_1.0.6      webshot_0.5.4    
## [33] jsonlite_1.8.3    systemfonts_1.0.4 bit_4.0.4         digest_0.6.30    
## [37] stringi_1.7.8     bookdown_0.29     grid_4.2.1        Rglpk_0.6-4      
## [41] ECOSolveR_0.5.4   cli_3.4.1         tools_4.2.1       magrittr_2.0.3   
## [45] sass_0.4.2        rcbc_0.1.0.9001   Matrix_1.5-1      xml2_1.3.3       
## [49] assertthat_0.2.1  rmarkdown_2.17    svglite_2.1.0     httr_1.4.4       
## [53] rstudioapi_0.14   R6_2.5.1          compiler_4.2.1

Source

R Markdown

References

Cox, D. R. 1958. “The Regression Analysis of Binary Sequences.” Journal of the Royal Statistical Society. Series B (Methodological) 20 (2): 215–42.
Freedman, D. A. 2009. Statistical Models: Theory and Practice. Cambridge University Press.