Logistic Regression

Introduction

In classification problems, the goal is to predict class membership from a set of predictors. Often there are two classes, and one of the most popular methods for binary classification is logistic regression (Cox 1958; Freedman 2009).

Suppose now that \(y_i \in \{0,1\}\) is a binary class indicator. The conditional response is modeled as \(y|x \sim \mbox{Bernoulli}(g_{\beta}(x))\), where \(g_{\beta}(x) = \frac{1}{1 + e^{-x^T\beta}}\) is the logistic function. We estimate \(\beta\) by maximizing the log-likelihood, yielding the optimization problem

\[ \begin{array}{ll} \underset{\beta}{\mbox{maximize}} & \sum_{i=1}^m \{ y_i\log(g_{\beta}(x_i)) + (1-y_i)\log(1 - g_{\beta}(x_i)) \}. \end{array} \]

CVXR provides the logistic atom for \(f(z) = \log(1 + e^z)\) so that this problem can be expressed directly. One may be tempted to write log(1 + exp(X %*% beta)) as in conventional R syntax. However, that representation of \(f(z)\) violates the DCP composition rules (it applies the concave log to a convex argument), so the CVXR parser will reject the problem even though the objective is convex. Users who need a function that is convex but not DCP compliant as written should check the documentation for a suitable atom or consider an equivalent reformulation.
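The difference can be verified directly with CVXR's is_dcp() function. Below is a small sketch using a toy two-dimensional variable (the variable and data here are illustrative, not part of the example that follows):

```r
suppressMessages(library(CVXR))
beta <- Variable(2)
x <- matrix(c(1, 2), nrow = 1)

## The logistic atom is a recognized convex atom, so the expression is DCP.
is_dcp(sum(logistic(x %*% beta)))       # TRUE

## The "natural" spelling applies log (concave) to a convex argument,
## which the DCP rules cannot certify.
is_dcp(sum(log(1 + exp(x %*% beta))))   # FALSE
```

Both expressions describe the same convex function; only the first is written in a form the DCP verifier can certify.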

Example

The formulation is very similar to that of OLS, except for the specification of the objective.

In the example below, we demonstrate a key feature of CVXR: the ability to evaluate arbitrary functions of the variables that solve the optimization problem. For instance, the log-odds \(X\hat{\beta}\), where \(\hat{\beta}\) is the logistic regression estimate, are specified simply as X %*% beta below, and the getValue function of the result computes their value. (Any other function of the estimate can be computed similarly.)

suppressMessages(suppressWarnings(library(CVXR)))
n <- 20
m <- 1000
offset <- 0
sigma <- 45
DENSITY <- 0.2

set.seed(183991)
beta_true <- stats::rnorm(n)
idxs <- sample(n, size = floor((1-DENSITY)*n), replace = FALSE)
beta_true[idxs] <- 0
X <- matrix(stats::rnorm(m*n, 0, 5), nrow = m, ncol = n)
y <- sign(X %*% beta_true + offset + stats::rnorm(m, 0, sigma))

beta <- Variable(n)
## Objective with y coded as (-1, 1): one logistic term per class
obj <- -sum(logistic(-X[y <= 0, ] %*% beta)) - sum(logistic(X[y == 1, ] %*% beta))
prob <- Problem(Maximize(obj))
result <- solve(prob)

log_odds <- result$getValue(X %*% beta)
beta_res <- result$getValue(beta)
y_probs <- 1/(1 + exp(-X %*% beta_res))

We can compare with the standard stats::glm estimate.

d <- data.frame(y = as.numeric(y > 0), X = X)
glm <- stats::glm(formula = y ~ 0 + X, family = "binomial", data = d)
est.table <- data.frame("CVXR.est" = beta_res, "GLM.est" = coef(glm))
rownames(est.table) <- paste0("$\\beta_{", 1:n, "}$")
library(kableExtra)
knitr::kable(est.table, format = "html") %>%
    kable_styling("striped") %>%
    column_spec(1:3, background = "#ececec")
Coefficient         CVXR.est      GLM.est
\(\beta_{1}\)      0.0003345   -0.0003345
\(\beta_{2}\)      0.0149849   -0.0149849
\(\beta_{3}\)     -0.0144564    0.0144564
\(\beta_{4}\)     -0.0172645    0.0172645
\(\beta_{5}\)     -0.0206029    0.0206029
\(\beta_{6}\)      0.0200824   -0.0200824
\(\beta_{7}\)      0.0137066   -0.0137066
\(\beta_{8}\)      0.0009806   -0.0009806
\(\beta_{9}\)      0.0246409   -0.0246409
\(\beta_{10}\)    -0.0101345    0.0101345
\(\beta_{11}\)     0.0113936   -0.0113936
\(\beta_{12}\)    -0.0187613    0.0187613
\(\beta_{13}\)    -0.0051938    0.0051938
\(\beta_{14}\)    -0.0015573    0.0015573
\(\beta_{15}\)    -0.0202801    0.0202801
\(\beta_{16}\)     0.0176528   -0.0176528
\(\beta_{17}\)    -0.0191530    0.0191530
\(\beta_{18}\)    -0.0131130    0.0131130
\(\beta_{19}\)     0.0265186   -0.0265186
\(\beta_{20}\)    -0.0039912    0.0039912

The sign difference comes from the coding of \(y\): the objective above uses the \((-1, 1)\) coding and, as written, models \(P(y = -1)\) rather than \(P(y = 1)\), so the CVXR coefficients are exactly the negatives of the stats::glm estimates, which use the \((0, 1)\) coding.
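This is the symmetry of the logistic function at work: negating \(\beta\) simply swaps the class probabilities. A quick base R check with an arbitrary vector of linear predictors:

```r
xb <- c(-1.5, 0.3, 2.0)            # arbitrary linear predictors x^T beta
p_beta    <- 1 / (1 + exp(-xb))    # P(y = 1) under beta
p_negbeta <- 1 / (1 + exp(xb))     # P(y = 1) under -beta
stopifnot(all.equal(p_beta, 1 - p_negbeta))  # -beta models the other class
```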

For completeness, if we instead code \(y\) as \((0, 1)\), the objective must be modified as shown below.

## With y coded as (0, 1), the log-likelihood is
## sum_{y=1} x_i' beta - sum_i log(1 + e^{x_i' beta}), written here via the logistic atom
obj <- -sum(X[y <= 0, ] %*% beta) - sum(logistic(-X %*% beta))
prob <- Problem(Maximize(obj))
result <- solve(prob)
beta_log <- result$getValue(beta)
est.table <- data.frame("CVXR.est" = beta_log, "GLM.est" = coef(glm))
rownames(est.table) <- paste0("$\\beta_{", 1:n, "}$")
knitr::kable(est.table, format = "html") %>%
    kable_styling("striped") %>%
    column_spec(1:3, background = "#ececec")
Coefficient         CVXR.est      GLM.est
\(\beta_{1}\)     -0.0003345   -0.0003345
\(\beta_{2}\)     -0.0149849   -0.0149849
\(\beta_{3}\)      0.0144564    0.0144564
\(\beta_{4}\)      0.0172645    0.0172645
\(\beta_{5}\)      0.0206029    0.0206029
\(\beta_{6}\)     -0.0200824   -0.0200824
\(\beta_{7}\)     -0.0137066   -0.0137066
\(\beta_{8}\)     -0.0009806   -0.0009806
\(\beta_{9}\)     -0.0246409   -0.0246409
\(\beta_{10}\)     0.0101345    0.0101345
\(\beta_{11}\)    -0.0113936   -0.0113936
\(\beta_{12}\)     0.0187613    0.0187613
\(\beta_{13}\)     0.0051938    0.0051938
\(\beta_{14}\)     0.0015573    0.0015573
\(\beta_{15}\)     0.0202801    0.0202801
\(\beta_{16}\)    -0.0176528   -0.0176528
\(\beta_{17}\)     0.0191530    0.0191530
\(\beta_{18}\)     0.0131130    0.0131130
\(\beta_{19}\)    -0.0265186   -0.0265186
\(\beta_{20}\)     0.0039912    0.0039912

Now, the results match perfectly.
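As a sanity check, the \((0, 1)\) objective used here is algebraically identical to the Bernoulli log-likelihood \(\sum_{i} \{ y_i x_i^T\beta - \log(1 + e^{x_i^T\beta}) \}\). The identity is easy to confirm numerically in base R on small simulated values:

```r
set.seed(1)
Xs <- matrix(rnorm(12), nrow = 4)   # small simulated design matrix
b  <- rnorm(3)                      # arbitrary coefficient vector
ys <- c(0, 1, 1, 0)                 # y coded as (0, 1)
s  <- drop(Xs %*% b)                # linear predictors

## objective as coded above: -sum_{y=0} x_i' b - sum_i log(1 + e^{-x_i' b})
lhs <- -sum(s[ys == 0]) - sum(log1p(exp(-s)))
## textbook log-likelihood: sum_i [ y_i x_i' b - log(1 + e^{x_i' b}) ]
rhs <- sum(ys * s - log1p(exp(s)))
stopifnot(all.equal(lhs, rhs))      # the two forms agree
```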

Session Info

sessionInfo()
## R version 3.5.1 (2018-07-02)
## Platform: x86_64-apple-darwin17.7.0 (64-bit)
## Running under: macOS  10.14.1
## 
## Matrix products: default
## BLAS/LAPACK: /usr/local/Cellar/openblas/0.3.3/lib/libopenblasp-r0.3.3.dylib
## 
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
## 
## attached base packages:
## [1] stats     graphics  grDevices datasets  utils     methods   base     
## 
## other attached packages:
## [1] kableExtra_0.9.0 CVXR_0.99-2     
## 
## loaded via a namespace (and not attached):
##  [1] gmp_0.5-13.2      Rcpp_0.12.19      highr_0.7        
##  [4] compiler_3.5.1    pillar_1.3.0      R.methodsS3_1.7.1
##  [7] R.utils_2.7.0     tools_3.5.1       digest_0.6.18    
## [10] bit_1.1-14        viridisLite_0.3.0 evaluate_0.12    
## [13] tibble_1.4.2      lattice_0.20-35   pkgconfig_2.0.2  
## [16] rlang_0.3.0.1     Matrix_1.2-15     rstudioapi_0.8   
## [19] yaml_2.2.0        blogdown_0.9.2    xfun_0.4         
## [22] xml2_1.2.0        httr_1.3.1        Rmpfr_0.7-1      
## [25] ECOSolveR_0.4     stringr_1.3.1     knitr_1.20       
## [28] hms_0.4.2         rprojroot_1.3-2   bit64_0.9-7      
## [31] grid_3.5.1        R6_2.3.0          rmarkdown_1.10   
## [34] bookdown_0.7      readr_1.1.1       magrittr_1.5     
## [37] scales_1.0.0      backports_1.1.2   htmltools_0.3.6  
## [40] scs_1.1-1         rvest_0.3.2       colorspace_1.3-2 
## [43] stringi_1.2.4     munsell_0.5.0     crayon_1.3.4     
## [46] R.oo_1.22.0

Source

R Markdown

References

Cox, D. R. 1958. “The Regression Analysis of Binary Sequences.” Journal of the Royal Statistical Society. Series B (Methodological) 20 (2): 215–42.

Freedman, D. A. 2009. Statistical Models: Theory and Practice. Cambridge University Press.