Could the convex problem be tackled by CVX?

An answer to this question on the Scientific Computing Stack Exchange.

Question

I want to solve the following convex optimization problem: \begin{align} \underset{X_1,X_2}{\min} &\ -\frac{1}{N}\sum_{i=1}^N\log\det\left(I+H_i^HX_2H_i\right)-\log\left[1+h^H(X_1+X_2)h\right]+\text{tr}(AX_1)+\text{tr}(BX_2)\\ \text{s.t.} &\ \text{tr}(X_1+X_2) \leq 1 \\ &\ X_1 \in S^n_+,\ X_2 \in S^n_+ \end{align} where $H_i$, $h$, $A$, and $B$ are given, $A$ and $B$ are both Hermitian matrices, and the superscript $H$ stands for the conjugate transpose.

I have used CVX to solve this problem, but the results seem unreliable: two CVX formulations that should be equivalent by Sylvester's determinant identity produce completely different results, which is unreasonable.

If the problem cannot be solved by CVX efficiently, could another toolbox such as SDPT3 or SeDuMi be used to tackle it? Thank you.
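(For reference, Sylvester's determinant identity states $\det(I_m+AB)=\det(I_n+BA)$ for $A\in\mathbb{C}^{m\times n}$, $B\in\mathbb{C}^{n\times m}$; a quick numerical sanity check with the same $4\times 2$ shape used in the code below:)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
B = rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))

# Sylvester's determinant identity: det(I_4 + A B) == det(I_2 + B A)
lhs = np.linalg.det(np.eye(4) + A @ B)
rhs = np.linalg.det(np.eye(2) + B @ A)
print(np.allclose(lhs, rhs))  # True
```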

The MATLAB code is as follows:

Sample = 10;
H = randn(4,2,Sample)+1i*randn(4,2,Sample);
h = randn(4,1)+1i*randn(4,1);
A = randn(4,4)+1i*randn(4,4); A = A*A';
B = randn(4,4)+1i*randn(4,4); B = B*B';
        
cvx_begin quiet
    variable X1(4,4) semidefinite hermitian
    variable X2(4,4) semidefinite hermitian
    t = -log(1+quad_form(h,X1+X2))+trace(A*X1)+trace(B*X2);
    for i = 1:Sample
        t =  t - log_det(eye(2) + H(:,:,i)'*X2 *H(:,:,i))/Sample; % direct formulation
    end
    minimize(t)
    subject to
        trace(X1+X2)<=1;
cvx_end
X1
X2
t
cvx_begin quiet
    variable X1(4,4) semidefinite hermitian
    variable X2(4,4) semidefinite hermitian
    t = -log(1+quad_form(h,X1+X2))+trace(A*X1)+trace(B*X2);
    for i = 1:Sample
        t =  t - log_det(eye(4) + X2 *H(:,:,i)*H(:,:,i)')/Sample; % Sylvester's determinant identity is used
    end
    minimize(t)
    subject to
        trace(X1+X2)<=1;
cvx_end
X1
X2
t

Answer

You should use a modeling language so that your code is independent of the underlying solver.

cvxpy is a good choice.

When I rewrite your model in cvxpy:

#!/usr/bin/env python3
import cvxpy as cp
import numpy as np
from numpy.random import normal as randn

Sample = 10
H = randn(size=(4,2,Sample)) + 1j*randn(size=(4,2,Sample))
h = randn(size=(4,1)) + 1j*randn(size=(4,1))
A = randn(size=(4,4)) + 1j*randn(size=(4,4))
A = A @ A.conj().T   # Hermitian positive semidefinite
B = randn(size=(4,4)) + 1j*randn(size=(4,4))
B = B @ B.conj().T   # Hermitian positive semidefinite

X1 = cp.Variable((4,4), hermitian=True)
X2 = cp.Variable((4,4), hermitian=True)

t = -cp.log(cp.real(1 + cp.quad_form(h, X1 + X2))) + cp.trace(A @ X1) + cp.trace(B @ X2)
for i in range(Sample):
    t -= cp.log_det(np.eye(2) + H[:,:,i].conj().T @ X2 @ H[:,:,i])/Sample  # direct formulation
constraints = [cp.real(cp.trace(X1 + X2)) <= 1, X1 >> 0, X2 >> 0]
problem1 = cp.Problem(cp.Minimize(cp.real(t)), constraints)
optval1 = problem1.solve()
print("Optimal value", optval1)
print("X1.value", X1.value)
print("X2.value", X2.value)

t = -cp.log(cp.real(1 + cp.quad_form(h, X1 + X2))) + cp.trace(A @ X1) + cp.trace(B @ X2)
for i in range(Sample):
    t -= cp.log_det(np.eye(4) + X2 @ H[:,:,i] @ H[:,:,i].conj().T)/Sample  # Sylvester's determinant identity is used
problem2 = cp.Problem(cp.Minimize(cp.real(t)), constraints)
optval2 = problem2.solve()
print("Optimal value", optval2)
print("X1.value", X1.value)
print("X2.value", X2.value)

I find that each problem takes about 2 seconds to solve and that the two results agree to a few decimal places (-2.1221305764766405 vs. -2.122080730649998).

If you need better performance, you can have cvxpy use any one of many solvers.

If you need even better performance, rewrite the model in JuMP, which offers the flexibility of many solvers along with the speed of Julia (your model will be converted very quickly once the first run-through of the code has given the JIT compiler a chance to work).