Introduction
Consider a simple linear regression problem in which we wish to estimate a set of parameters using a least squares criterion.
We generate some synthetic data where we know the model completely, that is
\[ Y = X\beta + \epsilon \]
where \(Y\) is a \(100\times 1\) vector, \(X\) is a \(100\times 10\) matrix, \(\beta = [-4,\ldots ,-1, 0, 1, \ldots, 5]\) is a \(10\times 1\) vector, and \(\epsilon\) is a \(100\times 1\) vector of independent \(N(0, 1)\) errors.
set.seed(123)
n <- 100
p <- 10
beta <- -4:5 # beta is just -4 through 5.
X <- matrix(rnorm(n * p), nrow=n)
colnames(X) <- paste0("beta_", beta)
Y <- X %*% beta + rnorm(n)
Given the data \(X\) and \(Y\), we can estimate the \(\beta\) vector using the lm function in R, which fits a standard regression model.
ls.model <- lm(Y ~ 0 + X) # There is no intercept in our model above
m <- matrix(coef(ls.model), ncol = 1)
rownames(m) <- paste0("$\\beta_{", 1:p, "}$")
library(kableExtra)
knitr::kable(m, format = "html") %>%
kable_styling("striped") %>%
column_spec(1:2, background = "#ececec")
\(\beta_{1}\) | -3.9196886 |
\(\beta_{2}\) | -3.0117048 |
\(\beta_{3}\) | -2.1248242 |
\(\beta_{4}\) | -0.8666048 |
\(\beta_{5}\) | 0.0914658 |
\(\beta_{6}\) | 0.9490454 |
\(\beta_{7}\) | 2.0764700 |
\(\beta_{8}\) | 3.1272275 |
\(\beta_{9}\) | 3.9609565 |
\(\beta_{10}\) | 5.1348845 |
These are the least-squares estimates and can be seen to be reasonably close to the original \(\beta\) values -4 through 5.
The CVXR formulation
The CVXR formulation states the above as an optimization problem:
\[
\begin{array}{ll}
\underset{\beta}{\mbox{minimize}} & \|y - X\beta\|_2^2,
\end{array}
\]
which directly translates into a problem that CVXR can solve, as shown in the steps below.
- Step 0. Load the CVXR library
suppressWarnings(library(CVXR, warn.conflicts=FALSE))
- Step 1. Define the variable to be estimated
betaHat <- Variable(p)
- Step 2. Define the objective to be optimized
objective <- Minimize(sum((Y - X %*% betaHat)^2))
Notice how the objective is specified using functions such as sum, %*%, and ^ that are familiar to R users, despite the fact that betaHat is not an ordinary R expression but a CVXR expression.
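We can verify this directly; a small check (not in the original post, and assuming CVXR's S4 class hierarchy in which Variable inherits from Expression):
class(betaHat)                      # an S4 "Variable", not a numeric vector
methods::is(betaHat, "Expression")  # TRUE under the assumed hierarchy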
- Step 3. Create a problem to solve
problem <- Problem(objective)
- Step 4. Solve it!
result <- solve(problem)
- Step 5. Extract solution and objective value
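A minimal sketch of this step (the accessors result$getValue() and result$value are CVXR's; the name solution is ours):
solution <- result$getValue(betaHat)                 # estimated coefficients
cat(sprintf("Objective value: %f\n", result$value))  # optimal objective value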
## Objective value: 97.847586
We can indeed satisfy ourselves that the results we get match those from lm.
m <- cbind(result$getValue(betaHat), coef(ls.model))
colnames(m) <- c("CVXR est.", "lm est.")
rownames(m) <- paste0("$\\beta_{", 1:p, "}$")
knitr::kable(m, format = "html") %>%
kable_styling("striped") %>%
column_spec(1:3, background = "#ececec")
 | CVXR est. | lm est. |
---|---|---|
\(\beta_{1}\) | -3.9196886 | -3.9196886 |
\(\beta_{2}\) | -3.0117048 | -3.0117048 |
\(\beta_{3}\) | -2.1248242 | -2.1248242 |
\(\beta_{4}\) | -0.8666048 | -0.8666048 |
\(\beta_{5}\) | 0.0914658 | 0.0914658 |
\(\beta_{6}\) | 0.9490454 | 0.9490454 |
\(\beta_{7}\) | 2.0764700 | 2.0764700 |
\(\beta_{8}\) | 3.1272275 | 3.1272275 |
\(\beta_{9}\) | 3.9609565 | 3.9609565 |
\(\beta_{10}\) | 5.1348845 | 5.1348845 |
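As a quick numerical check (not part of the original code), the two sets of estimates should agree up to solver tolerance:
all.equal(as.vector(result$getValue(betaHat)), unname(coef(ls.model)), tolerance = 1e-6)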
Wait a minute! What have we gained?
On the surface, it appears that we have replaced one call to lm with at least five or six lines of new R code. On top of that, the code actually runs slower, and so it is not clear what was really achieved.
So suppose we know that the \(\beta\)s are nonnegative and we wish to take this fact into account. This is nonnegative least squares regression, and lm would no longer do the job. In CVXR, the modified problem merely requires the addition of a constraint to the problem definition.
problem <- Problem(objective, constraints = list(betaHat >= 0))
result <- solve(problem)
m <- matrix(result$getValue(betaHat), ncol = 1)
rownames(m) <- paste0("$\\beta_{", 1:p, "}$")
knitr::kable(m, format = "html") %>%
kable_styling("striped") %>%
column_spec(1:2, background = "#ececec")
\(\beta_{1}\) | 0.0000000 |
\(\beta_{2}\) | 0.0000000 |
\(\beta_{3}\) | 0.0000000 |
\(\beta_{4}\) | 0.0000000 |
\(\beta_{5}\) | 1.2374488 |
\(\beta_{6}\) | 0.6234665 |
\(\beta_{7}\) | 2.1230663 |
\(\beta_{8}\) | 2.8035640 |
\(\beta_{9}\) | 4.4448016 |
\(\beta_{10}\) | 5.2073521 |
We can verify once again that these values are comparable to those obtained from another R package, say nnls.
library(nnls)
nnls.fit <- nnls(X, Y)$x
m <- cbind(result$getValue(betaHat), nnls.fit)
colnames(m) <- c("CVXR est.", "nnls est.")
rownames(m) <- paste0("$\\beta_{", 1:p, "}$")
knitr::kable(m, format = "html") %>%
kable_styling("striped") %>%
column_spec(1:3, background = "#ececec")
 | CVXR est. | nnls est. |
---|---|---|
\(\beta_{1}\) | 0.0000000 | 0.0000000 |
\(\beta_{2}\) | 0.0000000 | 0.0000000 |
\(\beta_{3}\) | 0.0000000 | 0.0000000 |
\(\beta_{4}\) | 0.0000000 | 0.0000000 |
\(\beta_{5}\) | 1.2374488 | 1.2374488 |
\(\beta_{6}\) | 0.6234665 | 0.6234665 |
\(\beta_{7}\) | 2.1230663 | 2.1230663 |
\(\beta_{8}\) | 2.8035640 | 2.8035640 |
\(\beta_{9}\) | 4.4448016 | 4.4448016 |
\(\beta_{10}\) | 5.2073521 | 5.2073521 |
Okay that was cool, but…
As you no doubt noticed, we have done nothing that other R packages could not do. So now suppose further, for some extraneous reason, that the sum of \(\beta_2\) and \(\beta_3\) is known to be nonpositive and all other \(\beta\)s are nonnegative. This problem would not fit into any standard package, but in CVXR it is easily handled by adding a few constraints.
To express the fact that \(\beta_2 + \beta_3\) is nonpositive, we construct a row matrix \(A\) with ones in positions 2 and 3 (for \(\beta_2\) and \(\beta_3\), respectively) and zeros everywhere else.
A <- matrix(c(0, 1, 1, rep(0, 7)), nrow = 1)
colnames(A) <- paste0("$\\beta_{", 1:p, "}$")
knitr::kable(A, format = "html") %>%
kable_styling("striped") %>%
column_spec(1:10, background = "#ececec")
\(\beta_{1}\) | \(\beta_{2}\) | \(\beta_{3}\) | \(\beta_{4}\) | \(\beta_{5}\) | \(\beta_{6}\) | \(\beta_{7}\) | \(\beta_{8}\) | \(\beta_{9}\) | \(\beta_{10}\) |
---|---|---|---|---|---|---|---|---|---|
0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
The sum constraint is nothing but \[ A\beta \leq 0, \]
which we express in R as
constraint1 <- A %*% betaHat <= 0
NOTE: The above constraint can also be expressed simply as
constraint1 <- betaHat[2] + betaHat[3] <= 0
but it is generally easier to work with matrices in CVXR.
For the nonnegativity of the rest of the variables, we construct a \(10\times 10\) diagonal matrix \(B\) with ones along the diagonal except in rows 2 and 3, and zeros everywhere else.
B <- diag(c(1, 0, 0, rep(1, 7)))
colnames(B) <- rownames(B) <- paste0("$\\beta_{", 1:p, "}$")
knitr::kable(B, format = "html") %>%
kable_styling("striped") %>%
column_spec(1:11, background = "#ececec")
 | \(\beta_{1}\) | \(\beta_{2}\) | \(\beta_{3}\) | \(\beta_{4}\) | \(\beta_{5}\) | \(\beta_{6}\) | \(\beta_{7}\) | \(\beta_{8}\) | \(\beta_{9}\) | \(\beta_{10}\) |
---|---|---|---|---|---|---|---|---|---|---|
\(\beta_{1}\) | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
\(\beta_{2}\) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
\(\beta_{3}\) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
\(\beta_{4}\) | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
\(\beta_{5}\) | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
\(\beta_{6}\) | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 |
\(\beta_{7}\) | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 |
\(\beta_{8}\) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
\(\beta_{9}\) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
\(\beta_{10}\) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
The constraint for nonnegativity is \[ B\beta \geq 0, \]
which we express in R as
constraint2 <- B %*% betaHat >= 0
Now we are ready to solve the problem just as before.
problem <- Problem(objective, constraints = list(constraint1, constraint2))
result <- solve(problem)
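Before extracting the estimates, it is worth confirming that the solver converged; a small check we add here, using CVXR's status field:
result$status  # should be "optimal"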
And we can get the estimates of \(\beta\).
m <- matrix(result$getValue(betaHat), ncol = 1)
rownames(m) <- paste0("$\\beta_{", 1:p, "}$")
knitr::kable(m, format = "html") %>%
kable_styling("striped") %>%
column_spec(1:2, background = "#ececec")
\(\beta_{1}\) | 0.0000000 |
\(\beta_{2}\) | -2.8446952 |
\(\beta_{3}\) | -1.7109771 |
\(\beta_{4}\) | 0.0000000 |
\(\beta_{5}\) | 0.6641308 |
\(\beta_{6}\) | 1.1781109 |
\(\beta_{7}\) | 2.3286139 |
\(\beta_{8}\) | 2.4144893 |
\(\beta_{9}\) | 4.2119052 |
\(\beta_{10}\) | 4.9483245 |
This demonstrates the chief advantage of CVXR: flexibility. Users can quickly modify and re-solve a problem, making our package ideal for prototyping new statistical methods. Its syntax is simple and mathematically intuitive. Furthermore, CVXR combines seamlessly with native R code as well as several popular packages, allowing it to be incorporated easily into a larger analytical framework. The user is free to construct statistical estimators that are solutions to a convex optimization problem where there may not be a closed-form solution or even an existing implementation. Such solutions can then be combined with resampling techniques like the bootstrap to estimate variability.
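As an illustration of that last point, here is a minimal sketch (not from the original post; boot_once and boot_est are hypothetical names) that bootstraps the nonnegative least squares estimator by resampling rows of the data and re-solving the CVXR problem:
# Resample rows with replacement, re-solve, and return the estimated beta.
boot_once <- function() {
  idx <- sample(n, n, replace = TRUE)
  obj <- Minimize(sum((Y[idx] - X[idx, ] %*% betaHat)^2))
  prob <- Problem(obj, constraints = list(betaHat >= 0))
  as.vector(solve(prob)$getValue(betaHat))
}
boot_est <- replicate(100, boot_once())  # p x 100 matrix of bootstrap replicates
boot_se <- apply(boot_est, 1, sd)        # bootstrap standard error for each beta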
Further Reading
We hope we have whetted your appetite. You may wish to read a longer introduction with more examples.
We also have a number of tutorial examples available to study and mimic.
Session Info
sessionInfo()
## R version 4.4.2 (2024-10-31)
## Platform: x86_64-apple-darwin20
## Running under: macOS Sequoia 15.1
##
## Matrix products: default
## BLAS: /Library/Frameworks/R.framework/Versions/4.4-x86_64/Resources/lib/libRblas.0.dylib
## LAPACK: /Library/Frameworks/R.framework/Versions/4.4-x86_64/Resources/lib/libRlapack.dylib; LAPACK version 3.12.0
##
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
##
## time zone: America/Los_Angeles
## tzcode source: internal
##
## attached base packages:
## [1] stats graphics grDevices datasets utils methods base
##
## other attached packages:
## [1] nnls_1.6 CVXR_1.0-15 kableExtra_1.4.0 testthat_3.2.1.1
## [5] here_1.0.1
##
## loaded via a namespace (and not attached):
## [1] gmp_0.7-5 utf8_1.2.4 clarabel_0.9.0.1 sass_0.4.9
## [5] xml2_1.3.6 slam_0.1-54 blogdown_1.19 stringi_1.8.4
## [9] lattice_0.22-6 digest_0.6.37 magrittr_2.0.3 evaluate_1.0.1
## [13] grid_4.4.2 bookdown_0.41 pkgload_1.4.0 fastmap_1.2.0
## [17] rprojroot_2.0.4 jsonlite_1.8.9 Matrix_1.7-1 brio_1.1.5
## [21] fansi_1.0.6 Rmosek_10.2.0 viridisLite_0.4.2 scales_1.3.0
## [25] codetools_0.2-20 jquerylib_0.1.4 cli_3.6.3 Rmpfr_0.9-5
## [29] rlang_1.1.4 Rglpk_0.6-5.1 bit64_4.5.2 munsell_0.5.1
## [33] cachem_1.1.0 yaml_2.3.10 tools_4.4.2 osqp_0.6.3.3
## [37] Rcplex_0.3-6 rcbc_0.1.0.9001 colorspace_2.1-1 gurobi_11.0-0
## [41] assertthat_0.2.1 vctrs_0.6.5 R6_2.5.1 lifecycle_1.0.4
## [45] stringr_1.5.1 bit_4.5.0 desc_1.4.3 cccp_0.3-1
## [49] pillar_1.9.0 bslib_0.8.0 glue_1.8.0 Rcpp_1.0.13-1
## [53] systemfonts_1.1.0 xfun_0.49 highr_0.11 rstudioapi_0.17.1
## [57] knitr_1.48 htmltools_0.5.8.1 rmarkdown_2.29 svglite_2.1.3
## [61] compiler_4.4.2