Optimizing linear inequalities over iid random variables and mixed integer programming

Tobias Fritz and I wrote a preprint on optimizing linear inequalities over iid random variables. A non-trivial instance discussed in the paper, found by Tobias, is the following: maximize, with respect to a probability measure \(\mu\) on \([0,+\infty)\), the probability \[ P( X_1 + X_2 + X_3 < 2X_4 ) \] where \(X_1,X_2,X_3,X_4\) are iid random variables with distribution \(\mu\). More precisely, the goal is to find the exact value of, or failing that lower and upper bounds as sharp as possible on, the quantity \[ C^* = \sup_{\mu} P_{X_1,...,X_4\sim^{iid}\mu}( X_1 + X_2 + X_3 < 2X_4 ). \]

Lower bounds on \(C^*\) are obtained by constructing explicit probability measures. Some attempts are documented at https://mathoverflow.net/questions/474916/how-large-can-mathbfpx-1-x-2-x-3-2-x-4-get/474927?noredirect=1#comment1253583_474927 and in Bellec and Fritz (2024), where we show that this probability can be made at least as large as 0.400695.

This page focuses on upper bounds on \(C^*\): we would like to show that no probability measure on \([0,+\infty)\) can make this probability larger than some threshold. A first naive observation, assuming for simplicity that the distribution is continuous so that ties have probability zero, is that \(C^*\le 0.5\), thanks to the following argument: if \(X_1+X_2+X_3 < 2X_4\) holds then the rank of \(X_4\) among \(X_1,X_2,X_3,X_4\) is 3 or 4, which happens with probability 0.5 by symmetry. Here, symmetry refers to the fact that, since the \(X_i\) are iid, any permutation of the indices \((1,2,3,4)\) leads to the same joint distribution, so the rank of \(X_4\) is uniformly distributed on \(\{1,2,3,4\}\).
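As a quick sanity check of the 0.5 bound, \(p_\mu = P(X_1+X_2+X_3 < 2X_4)\) can be estimated by Monte Carlo for any candidate distribution. The sketch below uses the exponential distribution as an arbitrary illustration (not a distribution from the paper); in this case the exact value can be computed in closed form as \(\mathbb{E}[e^{-S/2}] = (2/3)^3 \approx 0.296\) with \(S \sim \mathrm{Gamma}(3,1)\).

```python
import random

def estimate_p(sample, n=200_000, seed=0):
    """Monte Carlo estimate of P(X1 + X2 + X3 < 2*X4) for iid draws from `sample`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x1, x2, x3, x4 = (sample(rng) for _ in range(4))
        if x1 + x2 + x3 < 2 * x4:
            hits += 1
    return hits / n

# Illustration: X_i ~ Exp(1); the exact probability is (2/3)^3 ≈ 0.296,
# comfortably below the rank bound of 0.5
p_exp = estimate_p(lambda rng: rng.expovariate(1.0))
print(p_exp)
```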

General upper bounds by exchangeability

Sample \(X_1,...,X_m\) according to a given distribution \(\mu\) and let \(p_\mu = P(X_{1} + X_{2} + X_{3} < 2X_{4})\). Then since all variables are distributed according to \(\mu\), \[ p_\mu = P(X_{i} + X_{j} + X_{k} < 2X_{l} ) \] as long as \(i,j,k,l\in \{1,...,m\}\) are distinct indices. Thus for any injective function \(h: \{1,2,3,4\} \to \{1,...,m\}\), \[p_\mu = P(X_{h(1)} + X_{h(2)} + X_{h(3)} < 2X_{h(4)}) =\mathbb{E}[I\{X_{h(1)} + X_{h(2)} + X_{h(3)} < 2X_{h(4)}\} ] \] where \(\mathbb{E}[\cdot]\) denotes expectation and \(I\{\cdot\}\) is the indicator function with values in \(\{0,1\}\). The equality thus holds for the average over all such injective functions: \[ p_\mu = \frac{1}{N} \mathbb{E}\Big[ \sum_{\substack{h:\{1,2,3,4\}\to\{1,...,m\}\\ \text{injective}}} I\{X_{h(1)} + X_{h(2)} + X_{h(3)} < 2X_{h(4)}\} \Big] \] where \(N=m(m-1)(m-2)(m-3)\) is the number of injective functions from a set of size 4 to a set of size \(m\). We can now bound from above the random variable inside the expectation by its supremum over deterministic non-negative numbers \(x_1,...,x_m\): \[ p_\mu \le \sup_{x_1,...,x_m\ge 0} \frac{1}{N}\Big[ \sum_{\substack{h:\{1,2,3,4\}\to\{1,...,m\}\\ \text{injective}}} I\{x_{h(1)} + x_{h(2)} + x_{h(3)} < 2x_{h(4)}\} \Big] \] At this point the inequality holds between two deterministic quantities. The optimization on the right-hand side is over non-negative reals \(x_1,...,x_m\ge 0\) and requires satisfying as many inequalities \(x_{h(1)} + x_{h(2)} + x_{h(3)} < 2x_{h(4)}\) as possible, where \(h:\{1,2,3,4\}\to\{1,...,m\}\) ranges over injective functions. Without loss of generality, we assume from now on that \[ 0 \le x_1 \le x_2 \le \dots \le x_m \] since the average over all injective functions is invariant under permutations of the indices \(\{1,...,m\}\).

Some injective functions \(h:\{1,2,3,4\}\to\{1,...,m\}\) can be grouped together as they lead to the same inequality. Indeed \[ x_{h(1)} + x_{h(2)} + x_{h(3)} < 2x_{h(4)} \] depends on \(h(1),h(2),h(3)\) only through the set \(\{h(1),h(2),h(3)\}\), not on a particular ordering of these three indices among all 6 possible orderings. Grouping together all such injections with \(\{i,j,k\} = \{h(1),h(2),h(3)\}\) and \(l=h(4)\), we get \[ p_\mu \le \sup_{x_1,...,x_m\ge 0} \frac{6}{N}\Big[ \sum_{1\le i<j<k \le m} \sum_{\substack{l=1: l\notin\{i,j,k\}}}^m I\{x_i + x_j + x_k < 2 x_l\} \Big]. \] Half the terms in the sum can be discarded right away: if \(l < j\) then \(x_l \le x_j \le x_k\), so \(x_i + x_j + x_k \ge 2x_l\) and the inequality \(x_i + x_j + x_k < 2 x_l\) cannot hold. Exactly half of the terms are discarded this way because \(l < j\) if and only if the rank of \(l\) among \(\{i,j,k,l\}\) is 1 or 2, and the rank of \(l\) is uniformly distributed among \(\{1,2,3,4\}\) by symmetry.
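The grouping and the discarding step can be checked numerically. The sketch below (an illustration; the function names are mine) counts satisfied inequalities over all injective \(h\), over the grouped representatives \(i<j<k\) with \(l\notin\{i,j,k\}\), and over the reduced set with \(j<l\), and verifies that the three counts agree up to the factor 6 on sorted inputs.

```python
import random
from itertools import permutations, combinations

def count_injective(x):
    """Count injective h with x[h1] + x[h2] + x[h3] < 2*x[h4] (0-based indices)."""
    m = len(x)
    return sum(x[h1] + x[h2] + x[h3] < 2 * x[h4]
               for h1, h2, h3, h4 in permutations(range(m), 4))

def count_grouped(x, reduced=False):
    """Count triples i<j<k with l outside the triple; with reduced=True,
    keep only l > j (the terms with l < j vanish when x is sorted)."""
    m = len(x)
    total = 0
    for i, j, k in combinations(range(m), 3):
        for l in range(m):
            if l in (i, j, k) or (reduced and l <= j):
                continue
            total += x[i] + x[j] + x[k] < 2 * x[l]
    return total

rng = random.Random(42)
x = sorted(rng.random() for _ in range(8))
assert count_injective(x) == 6 * count_grouped(x) == 6 * count_grouped(x, reduced=True)
```

On a hand-checkable example such as \(x = (0,0,0,1)\), only the triple \(\{1,2,3\}\) with \(l=4\) satisfies the inequality, so the grouped count is 1 and the injective count is 6.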

Upper bounds by mixed integer linear programming

We now maximize the right-hand side above over non-negative reals \(x_1,...,x_m\ge 0\), with \(m=12\) below, to illustrate how such maximization problems can be solved using mixed integer linear programming (MILP). We use the Gurobi solver via its Python interface.

Start with initializing the MILP model and defining the variables \(x_1,x_2,...,x_m\).

import gurobipy as gp
from gurobipy import GRB

m = 12
model = gp.Model("MILP_violation_indicators")

xs = model.addVars(range(1,m+1), vtype=GRB.CONTINUOUS, lb=0, ub=1, name="x")

# Generate the order constraints x_1 <= x_2 <= ... <= x_m = 1
for i in range(1,m):
    model.addConstr(xs[i] <= xs[i + 1], f"order_{i}")

# Fix x_m to 1 without loss of generality
model.addConstr(xs[m] == 1, "x_m_is_1")       

Next, for each term \(x_i + x_j + x_k < 2 x_l\) with \(1 \le i < j < k \le m\) and \(l \notin \{i,j,k\}\) and \(j < l\), we define a binary variable \(y_{i,j,k,l}\) which is equal to 1 only if the inequality is satisfied. Because numerical solvers cannot handle strict inequalities, we introduce a small threshold and require \(x_i + x_j + x_k \le 2 x_l - \text{threshold}\) whenever \(y_{i,j,k,l}=1\).

# Define the y variables: one variable y_t for each tuple t = (i,j,k,l) in T, with corresponding inequality X_i + X_j + X_k < 2 X_l 
T = [
        (i,j,k,l)
        for i in range(1,m+1)
        for j in range(1,m+1)
        for k in range(1,m+1)
        for l in range(1,m+1)
        if i < j and j < k and l not in [i,j,k] and j < l
    ]
ys = model.addVars(T, vtype=GRB.BINARY, name=[f"X{t[0]} + X{t[1]} + X{t[2]} < 2*X{t[3]}" for t in T])

threshold = 2e-5
for i,j,k,l in T:
    # https://docs.gurobi.com/projects/optimizer/en/12.0/reference/cpp/model.html#_CPPv4N8GRBModel21addGenConstrIndicatorE6GRBVariRK13GRBTempConstr6string
    model.addGenConstrIndicator(
        ys[(i,j,k,l)],     # Indicator variable
        1,                 # If y[i, j, k, l] == 1, then the following inequality must hold:
        xs[i] + xs[j] + xs[k] - 2 * xs[l], GRB.LESS_EQUAL, -threshold,
        name=f"indicator_{i}_{j}_{k}_{l}"
    )

It remains to define the objective function, namely the number of satisfied inequalities (the sum of the binary variables \(y_{i,j,k,l}\)), and to solve the MILP.

model.setObjective(gp.quicksum(ys.values()), GRB.MAXIMIZE)
model.optimize()
Gurobi Optimizer version 13.0.0 build v13.0.0rc1 (linux64 - "Ubuntu 24.04.1 LTS")

CPU model: Intel(R) Core(TM) i5-8350U CPU @ 1.70GHz, instruction set [SSE2|AVX|AVX2]
Thread count: 4 physical cores, 8 logical processors, using up to 8 threads

Optimize a model with 12 rows, 1002 columns and 23 nonzeros (Max)
Model fingerprint: 0xe17a5c59
Model has 990 linear objective coefficients
Model has 990 simple general constraints
  990 INDICATOR
Variable types: 12 continuous, 990 integer (990 binary)
Coefficient statistics:
  Matrix range     [1e+00, 1e+00]
  Objective range  [1e+00, 1e+00]
  Bounds range     [1e+00, 1e+00]
  RHS range        [1e+00, 1e+00]
  GenCon rhs range [2e-05, 2e-05]
  GenCon coe range [1e+00, 2e+00]
Presolve added 722 rows and 0 columns
Presolve removed 0 rows and 267 columns
Presolve time: 0.02s
Presolved: 734 rows, 735 columns, 3025 nonzeros
Variable types: 9 continuous, 726 integer (726 binary)
Found heuristic solution: objective 39.0000000
Found heuristic solution: objective 40.0000000
Found heuristic solution: objective 41.0000000

Root relaxation: objective 9.444815e+02, 437 iterations, 0.02 seconds (0.02 work units)

    Nodes    |    Current Node    |     Objective Bounds      |     Work
 Expl Unexpl |  Obj  Depth IntInf | Incumbent    BestBd   Gap | It/Node Time

     0     0  944.48154    0  250   41.00000  944.48154  2204%     -    0s
H    0     0                     690.9999767  944.48154  36.7%     -    0s
H    0     0                     725.9999767  944.48154  30.1%     -    0s
H    0     0                     726.0000000  944.48154  30.1%     -    0s
H    0     0                     745.0000000  944.48154  26.8%     -    0s
H    0     0                     746.0000000  944.48154  26.6%     -    0s
H    0     0                     756.0000000  935.26282  23.7%     -    0s
H    0     0                     757.0000000  935.26282  23.5%     -    0s
H    0     0                     760.0000000  935.26282  23.1%     -    0s
H    0     0                     769.0000000  935.26282  21.6%     -    0s
H    0     0                     772.0000000  935.26282  21.1%     -    0s
H    0     0                     779.0000000  935.26282  20.1%     -    0s
     0     0  935.26282    0  287  779.00000  935.26282  20.1%     -    0s
H    0     0                     791.0000000  935.22199  18.2%     -    0s
H    0     0                     793.0000000  935.22199  17.9%     -    0s
H    0     0                     794.0000000  935.22199  17.8%     -    0s
H    0     0                     795.0000000  935.22199  17.6%     -    0s
     0     0  934.81232    0  288  795.00000  934.81232  17.6%     -    0s
     0     0  934.81209    0  288  795.00000  934.81209  17.6%     -    0s
     0     0  927.95525    0  272  795.00000  927.95525  16.7%     -    0s
     0     0  927.26185    0  287  795.00000  927.26185  16.6%     -    0s
     0     0  927.12068    0  285  795.00000  927.12068  16.6%     -    0s
     0     0  927.08358    0  290  795.00000  927.08358  16.6%     -    0s
     0     0  927.06427    0  290  795.00000  927.06427  16.6%     -    0s
     0     0  927.06420    0  290  795.00000  927.06420  16.6%     -    0s
     0     0  914.16848    0  302  795.00000  914.16848  15.0%     -    0s
     0     0  912.65071    0  310  795.00000  912.65071  14.8%     -    0s
     0     0  912.48200    0  308  795.00000  912.48200  14.8%     -    0s
     0     0  912.34683    0  309  795.00000  912.34683  14.8%     -    0s
     0     0  912.33744    0  309  795.00000  912.33744  14.8%     -    0s
     0     0  903.33111    0  309  795.00000  903.33111  13.6%     -    0s
     0     0  901.85346    0  313  795.00000  901.85346  13.4%     -    0s
     0     0  901.53675    0  320  795.00000  901.53675  13.4%     -    0s
     0     0  901.49276    0  319  795.00000  901.49276  13.4%     -    0s
     0     0  901.44650    0  320  795.00000  901.44650  13.4%     -    0s
     0     0  901.43253    0  322  795.00000  901.43253  13.4%     -    0s
     0     0  899.83618    0  324  795.00000  899.83618  13.2%     -    0s
     0     0  899.28345    0  332  795.00000  899.28345  13.1%     -    0s
     0     0  899.14568    0  327  795.00000  899.14568  13.1%     -    0s
     0     0  899.07079    0  329  795.00000  899.07079  13.1%     -    0s
     0     0  899.05426    0  336  795.00000  899.05426  13.1%     -    0s
     0     0  897.20000    0  319  795.00000  897.20000  12.9%     -    0s
     0     0  896.74151    0  326  795.00000  896.74151  12.8%     -    0s
     0     0  896.57463    0  328  795.00000  896.57463  12.8%     -    0s
     0     0  896.52273    0  329  795.00000  896.52273  12.8%     -    0s
     0     0  896.48463    0  325  795.00000  896.48463  12.8%     -    0s
H    0     0                     806.0000000  896.48463  11.2%     -    0s
H    0     0                     809.0000000  894.78878  10.6%     -    0s
     0     0  894.78878    0  319  809.00000  894.78878  10.6%     -    0s
H    0     0                     828.0000000  894.29309  8.01%     -    0s
H    0     0                     836.0000000  894.29309  6.97%     -    0s
     0     0  894.29309    0  327  836.00000  894.29309  6.97%     -    0s
     0     0  894.18556    0  329  836.00000  894.18556  6.96%     -    0s
     0     0  894.16106    0  331  836.00000  894.16106  6.96%     -    0s
     0     0  893.96973    0  331  836.00000  893.96973  6.93%     -    0s
     0     0  893.76723    0  330  836.00000  893.76723  6.91%     -    1s
     0     0  893.73836    0  338  836.00000  893.73836  6.91%     -    1s
     0     0  893.24237    0  331  836.00000  893.24237  6.85%     -    1s
H    0     0                     838.0000000  893.24140  6.59%     -    1s
     0     0  893.04315    0  335  838.00000  893.04315  6.57%     -    1s
     0     0  893.01232    0  338  838.00000  893.01232  6.56%     -    1s
     0     0  891.65107    0  329  838.00000  891.65107  6.40%     -    1s
     0     0  891.34608    0  329  838.00000  891.34608  6.37%     -    1s
     0     0  891.30139    0  334  838.00000  891.30139  6.36%     -    1s
     0     0  890.62943    0  335  838.00000  890.62943  6.28%     -    1s
     0     0  890.39889    0  338  838.00000  890.39889  6.25%     -    1s
     0     0  890.32285    0  347  838.00000  890.32285  6.24%     -    1s
     0     0  890.27446    0  343  838.00000  890.27446  6.24%     -    1s
     0     0  889.48815    0  337  838.00000  889.48815  6.14%     -    1s
     0     0  889.36662    0  345  838.00000  889.36662  6.13%     -    1s
     0     0  889.34421    0  346  838.00000  889.34421  6.13%     -    1s
     0     0  888.28729    0  335  838.00000  888.28729  6.00%     -    1s
     0     0  888.11055    0  339  838.00000  888.11055  5.98%     -    1s
     0     0  888.08203    0  347  838.00000  888.08203  5.98%     -    1s
     0     0  887.71292    0  341  838.00000  887.71292  5.93%     -    1s
     0     0  887.41992    0  354  838.00000  887.41992  5.90%     -    1s
     0     0  887.33421    0  348  838.00000  887.33421  5.89%     -    1s
     0     0  887.31694    0  352  838.00000  887.31694  5.89%     -    1s
     0     0  886.32956    0  337  838.00000  886.32956  5.77%     -    1s
     0     0  886.17394    0  341  838.00000  886.17394  5.75%     -    1s
     0     0  886.11461    0  343  838.00000  886.11461  5.74%     -    1s
     0     0  886.07997    0  344  838.00000  886.07997  5.74%     -    1s
     0     0  885.72732    0  332  838.00000  885.72732  5.70%     -    1s
     0     0  885.31182    0  335  838.00000  885.31182  5.65%     -    1s
     0     0  885.25887    0  337  838.00000  885.25887  5.64%     -    1s
     0     0  885.21536    0  343  838.00000  885.21536  5.63%     -    1s
     0     0  885.20755    0  343  838.00000  885.20755  5.63%     -    1s
     0     0  885.20755    0  338  838.00000  885.20755  5.63%     -    1s
     0     2  885.20755    0  336  838.00000  885.20755  5.63%     -    1s
  1289   906  872.38463   16  338  838.00000  878.04417  4.78%  90.3    5s
  1399   971  865.11450   19  288  838.00000  870.94325  3.93%   112   10s
H 1533   978                     850.0000000  869.16283  2.25%   119   10s
  2169   988  856.71672   25  223  850.00000  865.76619  1.85%   130   15s
  2989  1022  861.89880   23  298  850.00000  864.46281  1.70%   135   20s
  4125  1024  858.05053   27  282  850.00000  862.36791  1.46%   137   25s
  5157  1281  854.29979   27  257  850.00000  860.82396  1.27%   139   30s
  5783  1271     cutoff   34       850.00000  859.71948  1.14%   146   35s
  6465  1234  852.41915   25  292  850.00000  858.63214  1.02%   153   40s
  7065  1059     cutoff   29       850.00000  857.23008  0.85%   160   45s
  7579   781     cutoff   38       850.00000  856.03101  0.71%   166   51s
  8161   366     cutoff   26       850.00000  854.87238  0.57%   171   56s

Cutting planes:
  Learned: 6
  Gomory: 26
  Cover: 194
  Clique: 15
  MIR: 350
  StrongCG: 4
  Flow cover: 448
  Inf proof: 10
  RLT: 14
  Relax-and-lift: 13

Explored 8822 nodes (1524529 simplex iterations) in 59.59 seconds (50.68 work units)
Thread count was 8 (of 8 available processors)

Solution count 10: 850 838 836 ... 791

Optimal solution found (tolerance 1.00e-04)
Best objective 8.500000000000e+02, best bound 8.500000000000e+02, gap 0.0000%

We can further analyze the results: print the values of the \(x_i\) variables found as well as the list of satisfied and violated inequalities.

if model.status == GRB.OPTIMAL:
    print(f'Optimal Objective Value: {model.objVal}\n')

    # values for the x variables which achieve a maximum feasible subsystem
    for i in range(1,m+1):
        print(f'x_{i} = {xs[i].x}')


    # Summarize: the total number of grouped inequalities (before discarding those with l < j) is 2 * len(T)
    print("\nIn total, the fraction of satisfied inequalities \nis up to", int(sum(y.x for y in ys.values())), "/", 2 * len(T),
          f'≈ {sum(y.x for y in ys.values())/(2*len(T)):.3f} at m = {m}')
Optimal Objective Value: 850.0

x_1 = 0.0
x_2 = 0.0
x_3 = 4.0000000000110856e-05
x_4 = 6.0000000000201704e-05
x_5 = 6.000000000029256e-05
x_6 = 0.5000700000000002
x_7 = 0.7500750000000002
x_8 = 0.8750775000000002
x_9 = 0.9375787500000002
x_10 = 0.9688293750000001
x_11 = 0.9844546875000003
x_12 = 1.0

In total, the fraction of satisfied inequalities 
is up to 850 / 1980 ≈ 0.429 at m = 12
if model.status == GRB.OPTIMAL:
    # List the satisfied and violated inequalities separately
    print(f'\nSatisfied inequalities (modulo threshold {threshold}):')
    for y in ys.values():
        if y.x > 0.5:
            print(f'{y.varName}')

    print(f'\nViolated inequalities (modulo threshold {threshold}):')
    for y in ys.values():
        if y.x <= 0.5:
            print(f'{y.varName}')

(Output not shown here for brevity.)

Redundant constraints

A common technique in MILP is to add redundant constraints. Redundant constraints do not change the feasible set of solutions, but can help the solver to prune branches of the search tree more efficiently. One interpretation is that such redundant constraints can inform the solver about the structure of the problem.

Let us define the same MILP as above:

import gurobipy as gp
from gurobipy import GRB

m = 12
model = gp.Model("MILP_violation_indicators")

xs = model.addVars(range(1,m+1), vtype=GRB.CONTINUOUS, lb=0, ub=1, name="x")

# Generate the order constraints x_1 <= x_2 <= ... <= x_m = 1
for i in range(1,m):
    model.addConstr(xs[i] <= xs[i + 1], f"order_{i}")

# Fix x_m to 1 without loss of generality
model.addConstr(xs[m] == 1, "x_m_is_1")       

T = [
        (i,j,k,l)
        for i in range(1,m+1)
        for j in range(1,m+1)
        for k in range(1,m+1)
        for l in range(1,m+1)
        if i < j and j < k and l not in [i,j,k] and j < l
    ]
ys = model.addVars(T, vtype=GRB.BINARY, name=[f"X{t[0]} + X{t[1]} + X{t[2]} < 2*X{t[3]}" for t in T])

threshold = 2e-5
for i,j,k,l in T:
    model.addGenConstrIndicator(
        ys[(i,j,k,l)],     # Indicator variable
        1,                 # If y[i, j, k, l] == 1, then the following inequality must hold:
        xs[i] + xs[j] + xs[k] - 2 * xs[l], GRB.LESS_EQUAL, -threshold,
        name=f"indicator_{i}_{j}_{k}_{l}"
    )

A first family of redundant constraints is obtained by forming pairs of tuples \((i,j,k,l)\) and \((a,b,c,d)\) in \(T\) such that the implication \(x_a + x_b + x_c < 2x_d \Rightarrow x_i + x_j + x_k < 2x_l\) holds merely by the monotonicity \(0 \le x_1 \le ... \le x_m = 1\): increasing one of the first three indices, or decreasing the last one, can only make the inequality harder to satisfy.

Tset = set(T)

def children(t):
    # Tuples obtained from t by making the inequality harder to satisfy:
    # increase one of i, j, k, or decrease l
    i,j,k,l = t
    possible_children = [
        (i+1, j, k, l),
        (i, j+1, k, l),
        (i, j, k+1, l),
        (i, j, k, l-1),
    ]
    return [tt for tt in possible_children if tt in Tset]

for t in T:
    y = ys[t]
    for tt in children(t):
        yy = ys[tt]
        # if the inequality of t=(i,j,k,l) does not hold (y=0), then the
        # harder inequality of tt cannot hold either, hence yy <= y
        model.addConstr(yy <= y)

Another family of redundant constraints is obtained by noting that for any six indices \(i_0<i_1<\dots<i_5\), the inequalities \(x_{i_0} + x_{i_1} + x_{i_4} < 2 x_{i_2}\) and \(x_{i_2} + x_{i_3} + x_{i_5} < 2 x_{i_4}\) are not simultaneously satisfiable: summing them gives \(x_{i_0} + x_{i_1} + x_{i_3} + x_{i_5} < x_{i_2} + x_{i_4}\), which contradicts \(x_{i_2} \le x_{i_3}\) and \(x_{i_4} \le x_{i_5}\). The sum of the two corresponding indicator variables is therefore at most 1.

from itertools import combinations

# combinations() already yields sorted tuples i0 < i1 < ... < i5
for i0, i1, i2, i3, i4, i5 in combinations(range(1, m+1), 6):
    # the pair of inequalities indexed by (i0, i1, i4, i2) and (i2, i3, i5, i4)
    # is not jointly satisfiable
    y = ys[(i0, i1, i4, i2)]
    yy = ys[(i2, i3, i5, i4)]
    model.addConstr( y + yy <= 1 )
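The incompatibility of this pair of inequalities can also be checked numerically, independently of Gurobi; a small sketch on random sorted tuples:

```python
import random

# For sorted x0 <= x1 <= ... <= x5, the inequalities x0+x1+x4 < 2*x2 and
# x2+x3+x5 < 2*x4 can never hold simultaneously: check on random inputs
rng = random.Random(0)
for _ in range(10_000):
    x0, x1, x2, x3, x4, x5 = sorted(rng.random() for _ in range(6))
    first = x0 + x1 + x4 < 2 * x2
    second = x2 + x3 + x5 < 2 * x4
    assert not (first and second), "the pair should never be jointly satisfiable"
print("never jointly satisfied on 10000 random sorted tuples")
```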

We now set the objective function and solve the MILP as before to compare the speedup obtained by adding these redundant constraints.

model.setObjective(gp.quicksum(ys.values()), GRB.MAXIMIZE)
model.optimize()
Gurobi Optimizer version 13.0.0 build v13.0.0rc1 (linux64 - "Ubuntu 24.04.1 LTS")

CPU model: Intel(R) Core(TM) i5-8350U CPU @ 1.70GHz, instruction set [SSE2|AVX|AVX2]
Thread count: 4 physical cores, 8 logical processors, using up to 8 threads

Optimize a model with 3576 rows, 1002 columns and 7151 nonzeros (Max)
Model fingerprint: 0x5ae333ef
Model has 990 linear objective coefficients
Model has 990 simple general constraints
  990 INDICATOR
Variable types: 12 continuous, 990 integer (990 binary)
Coefficient statistics:
  Matrix range     [1e+00, 1e+00]
  Objective range  [1e+00, 1e+00]
  Bounds range     [1e+00, 1e+00]
  RHS range        [1e+00, 1e+00]
  GenCon rhs range [2e-05, 2e-05]
  GenCon coe range [1e+00, 2e+00]
Presolve removed 73 rows and 42 columns
Presolve time: 0.06s
Presolved: 3503 rows, 960 columns, 8956 nonzeros
Variable types: 9 continuous, 951 integer (951 binary)
Found heuristic solution: objective 91.0000000
Found heuristic solution: objective 92.0000000

Root relaxation: objective 8.780906e+02, 455 iterations, 0.02 seconds (0.02 work units)

    Nodes    |    Current Node    |     Objective Bounds      |     Work
 Expl Unexpl |  Obj  Depth IntInf | Incumbent    BestBd   Gap | It/Node Time

     0     0  878.09057    0  193   92.00000  878.09057   854%     -    0s
H    0     0                     743.0000000  878.09057  18.2%     -    0s
H    0     0                     771.0000000  878.09057  13.9%     -    0s
H    0     0                     772.0000000  878.09057  13.7%     -    0s
H    0     0                     844.0000000  878.09057  4.04%     -    0s
H    0     0                     850.0000000  878.09057  3.30%     -    0s
     0     0  870.21956    0  249  850.00000  870.21956  2.38%     -    0s
     0     0  868.82157    0  253  850.00000  868.82157  2.21%     -    0s
     0     0  868.82157    0  248  850.00000  868.82157  2.21%     -    0s
     0     0  865.63924    0  262  850.00000  865.63924  1.84%     -    0s
     0     0  865.42915    0  254  850.00000  865.42915  1.82%     -    0s
     0     0  865.40275    0  258  850.00000  865.40275  1.81%     -    0s
     0     0  865.36190    0  259  850.00000  865.36190  1.81%     -    0s
     0     0  862.08394    0  260  850.00000  862.08394  1.42%     -    0s
     0     0  861.62875    0  292  850.00000  861.62875  1.37%     -    0s
     0     0  861.54036    0  279  850.00000  861.54036  1.36%     -    0s
     0     0  861.53397    0  291  850.00000  861.53397  1.36%     -    0s
     0     0  861.00449    0  280  850.00000  861.00449  1.29%     -    0s
     0     0  860.89229    0  286  850.00000  860.89229  1.28%     -    0s
     0     0  860.88538    0  281  850.00000  860.88538  1.28%     -    0s
     0     0  860.88161    0  281  850.00000  860.88161  1.28%     -    0s
     0     0  860.87921    0  281  850.00000  860.87921  1.28%     -    0s
     0     0  860.18362    0  306  850.00000  860.18362  1.20%     -    0s
     0     0  860.02831    0  306  850.00000  860.02831  1.18%     -    0s
     0     0  860.01231    0  304  850.00000  860.01231  1.18%     -    0s
     0     0  860.01221    0  304  850.00000  860.01221  1.18%     -    0s
     0     0  859.99583    0  296  850.00000  859.99583  1.18%     -    0s
     0     0  859.97597    0  306  850.00000  859.97597  1.17%     -    0s
     0     0  859.97597    0  306  850.00000  859.97597  1.17%     -    0s
     0     0  859.97597    0  303  850.00000  859.97597  1.17%     -    1s
     0     0  859.97597    0  303  850.00000  859.97597  1.17%     -    1s
     0     0  859.97217    0  305  850.00000  859.97217  1.17%     -    1s
     0     0  859.96833    0  306  850.00000  859.96833  1.17%     -    1s
     0     0  859.96833    0  305  850.00000  859.96833  1.17%     -    1s
     0     0  859.96299    0  305  850.00000  859.96299  1.17%     -    1s
     0     0  859.91726    0  304  850.00000  859.91726  1.17%     -    1s
     0     0  859.89530    0  305  850.00000  859.89530  1.16%     -    1s
     0     0  859.88266    0  305  850.00000  859.88266  1.16%     -    1s
     0     0  859.88199    0  305  850.00000  859.88199  1.16%     -    1s
     0     0  859.65718    0  302  850.00000  859.65718  1.14%     -    1s
     0     0  859.61102    0  316  850.00000  859.61102  1.13%     -    1s
     0     0  859.60611    0  316  850.00000  859.60611  1.13%     -    1s
     0     0  859.60059    0  316  850.00000  859.60059  1.13%     -    1s
     0     0  859.60052    0  316  850.00000  859.60052  1.13%     -    1s
     0     0  859.57658    0  303  850.00000  859.57658  1.13%     -    1s
     0     0  859.53524    0  304  850.00000  859.53524  1.12%     -    1s
     0     0  859.51377    0  305  850.00000  859.51377  1.12%     -    1s
     0     0  859.50824    0  315  850.00000  859.50824  1.12%     -    1s
     0     0  859.49972    0  312  850.00000  859.49972  1.12%     -    1s
     0     0  859.49892    0  315  850.00000  859.49892  1.12%     -    1s
     0     0  859.45136    0  305  850.00000  859.45136  1.11%     -    1s
     0     0  859.42855    0  305  850.00000  859.42855  1.11%     -    1s
     0     0  859.39911    0  305  850.00000  859.39911  1.11%     -    1s
     0     0  859.39003    0  306  850.00000  859.39003  1.10%     -    1s
     0     0  859.38555    0  306  850.00000  859.38555  1.10%     -    1s
     0     0  859.38536    0  305  850.00000  859.38536  1.10%     -    1s
     0     0  859.37714    0  315  850.00000  859.37714  1.10%     -    1s
     0     0  859.37255    0  319  850.00000  859.37255  1.10%     -    1s
     0     0  859.37255    0  319  850.00000  859.37255  1.10%     -    1s
     0     0  859.36308    0  319  850.00000  859.36308  1.10%     -    1s
     0     0  859.36308    0  319  850.00000  859.36308  1.10%     -    1s
     0     2  859.35509    0  319  850.00000  859.35509  1.10%     -    1s

Cutting planes:
  Learned: 1
  Gomory: 2
  MIR: 104
  StrongCG: 1
  RLT: 2
  Relax-and-lift: 1

Explored 31 nodes (5423 simplex iterations) in 2.09 seconds (1.15 work units)
Thread count was 8 (of 8 available processors)

Solution count 7: 850 844 772 ... 91

Optimal solution found (tolerance 1.00e-04)
Best objective 8.500000000000e+02, best bound 8.500000000000e+02, gap 0.0000%

We can further analyze the results: print the values of the \(x_i\) variables found.

if model.status == GRB.OPTIMAL:
    print(f'Optimal Objective Value: {model.objVal}\n')
    for i in range(1,m+1):
        print(f'x_{i} = {xs[i].x}')

    print("\nIn total, the fraction of satisfied inequalities \nis up to", int(sum(y.x for y in ys.values())), "/", 2 * len(T),
          f'≈ {sum(y.x for y in ys.values())/(2*len(T)):.3f} at m = {m}')
Optimal Objective Value: 850.0

x_1 = 0.0
x_2 = 0.0
x_3 = 3.999999999990898e-05
x_4 = 5.999999999981796e-05
x_5 = 5.999999999981796e-05
x_6 = 0.5132116699219463
x_7 = 0.7632166699219464
x_8 = 0.8882191699219466
x_9 = 0.9507204199219468
x_10 = 0.9819710449219468
x_11 = 1.0
x_12 = 1.0

In total, the fraction of satisfied inequalities 
is up to 850 / 1980 ≈ 0.429 at m = 12

Going further

The preprint Bellec and Fritz (2024) contains more details on this approach, including how to scale the resolution of such MILPs to \(m\approx 25\), how to obtain verifiable upper bounds from numerical solutions such as the one above, and the latest lower and upper bounds known to us for this problem. The final version of the MILP above, used for larger values of \(m\), is available in the ancillary files of the arXiv preprint https://arxiv.org/abs/2412.15179.

References

Bellec, Pierre C, and Tobias Fritz. 2024. “Optimizing over Iid Distributions and the Beat the Average Game.” arXiv Preprint arXiv:2412.15179.