
Commit 7dcb979

transpose vectors

1 parent: b38e2fe

File tree

3 files changed (+6 -6 lines changed)


python/benchmark/benchmark_FairSVM/README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -13,8 +13,8 @@ $$\sum_{i=1}^n z_{ij} = 0,$$
 
 such as gender and/or race. The constraints limit the correlation between the $d_0$-length sensitive features $\mathbf{z}_i \in \mathbb{R}^{d_0}$ and the decision function $\mathbf{\beta}^\intercal \mathbf{x}$, and the constants $\mathbf{\rho} \in \mathbb{R}_+^{d_0}$ trade off predictive accuracy and fairness. Note that the FairSVM can be rewritten as a ReHLine optimization with
 ```math
-\mathbf{U} \leftarrow -C \mathbf{y}/n, \quad
-\mathbf{V} \leftarrow C \mathbf{1}_n/n, \quad
+\mathbf{U} \leftarrow -C \mathbf{y}^\intercal/n, \quad
+\mathbf{V} \leftarrow C \mathbf{1}^\intercal_n/n, \quad
 \mathbf{A} \leftarrow
 \begin{pmatrix}
 \mathbf{Z}^\intercal \mathbf{X} / n \\
````
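The point of this change is that `U` and `V` become $1 \times n$ row vectors (matrices with one loss term per sample) rather than length-$n$ columns. A minimal NumPy sketch of the shapes, with random placeholder data; note the diff's hunk truncates the $\mathbf{A}$ matrix after its first block row, so only that row is reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, d0, C = 50, 5, 2, 2.0
X = rng.standard_normal((n, d))        # feature matrix, n x d
y = rng.choice([-1.0, 1.0], size=n)    # binary labels in {-1, +1}
Z = rng.standard_normal((n, d0))       # sensitive features (e.g. gender, race)

# After the transpose fix, U and V are 1 x n row vectors, not 1-D arrays
U = -C * y[None, :] / n
V = C * np.ones((1, n)) / n

# First block row of the constraint matrix A shown in the diff: Z^T X / n
# (the remaining pmatrix rows are cut off by the hunk and omitted here)
A_top = Z.T @ X / n
```

The transpose matters because ReHLine's loss parameters are indexed as an $L \times n$ matrix (one row per ReLU component, one column per sample); for a single hinge component, that is a $1 \times n$ row.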

python/benchmark/benchmark_SVM/README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -6,8 +6,8 @@ SVMs solve the following optimization problem:
 ```
 where $\mathbf{x}_i \in \mathbb{R}^d$ is a feature vector, and $y_i \in \{-1, 1\}$ is a binary label. Note that the SVM can be rewritten as a ReHLine optimization with
 ```math
-\mathbf{U} \leftarrow -C \mathbf{y}/n, \quad
-\mathbf{V} \leftarrow C \mathbf{1}_n/n, \quad
+\mathbf{U} \leftarrow -C \mathbf{y}^\intercal/n, \quad
+\mathbf{V} \leftarrow C \mathbf{1}^\intercal_n/n,
 ```
 where $\mathbf{1}_n = (1, \cdots, 1)^\intercal$ is the $n$-length one vector, $\mathbf{X} \in \mathbb{R}^{n \times d}$ is the feature matrix, and $\mathbf{y} = (y_1, \cdots, y_n)^\intercal$ is the response vector.
 ### Benchmarking solvers
````
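As a sanity check on the SVM mapping, the ReLU composite $\sum_i \max(u_i\,\mathbf{x}_i^\intercal\boldsymbol{\beta} + v_i,\, 0)$ with the row vectors $\mathbf{U} = -C\mathbf{y}^\intercal/n$ and $\mathbf{V} = C\mathbf{1}_n^\intercal/n$ reproduces the scaled hinge loss $\frac{C}{n}\sum_i \max(1 - y_i\mathbf{x}_i^\intercal\boldsymbol{\beta}, 0)$. A minimal NumPy sketch (not part of the commit; the data is random and `beta` is an arbitrary point, not a fitted solution):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, C = 50, 5, 2.0
X = rng.standard_normal((n, d))        # feature matrix, n x d
y = rng.choice([-1.0, 1.0], size=n)    # binary labels in {-1, +1}
beta = rng.standard_normal(d)          # an arbitrary coefficient vector

# ReHLine parameters for the SVM, as 1 x n row vectors (hence the transpose)
U = -C * y[None, :] / n
V = C * np.ones((1, n)) / n

# ReHLine ReLU component: sum_i max(u_i * x_i^T beta + v_i, 0)
rehline_loss = np.maximum(U * (X @ beta)[None, :] + V, 0.0).sum()

# Plain scaled hinge loss: (C/n) * sum_i max(1 - y_i * x_i^T beta, 0)
hinge_loss = C / n * np.maximum(1.0 - y * (X @ beta), 0.0).sum()

assert np.isclose(rehline_loss, hinge_loss)
```

The identity holds term by term: $u_i\,\mathbf{x}_i^\intercal\boldsymbol{\beta} + v_i = \frac{C}{n}(1 - y_i\mathbf{x}_i^\intercal\boldsymbol{\beta})$.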

python/benchmark/benchmark_sSVM/README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -6,8 +6,8 @@ Smoothed SVMs solve the following optimization problem:
 ```
 where $V(\cdot)$ is the smoothed hinge loss, $\mathbf{x}_i \in \mathbb{R}^d$ is a feature vector, and $y_i \in \{-1, 1\}$ is a binary label. Smoothed SVM can be rewritten as a ReHLine optimization with
 ```math
-\mathbf{S} \leftarrow -\sqrt{C} \mathbf{y}/n, \quad
-\mathbf{T} \leftarrow \sqrt{C} \mathbf{1}_n/n, \quad
+\mathbf{S} \leftarrow -\sqrt{C} \mathbf{y}^\intercal/n, \quad
+\mathbf{T} \leftarrow \sqrt{C} \mathbf{1}^\intercal_n/n, \quad
 \mathbf{\tau} \leftarrow \sqrt{C},
 ```
 where $\mathbf{1}_n = (1, \cdots, 1)^\intercal$ is the $n$-length one vector, $\mathbf{X} \in \mathbb{R}^{n \times d}$ is the feature matrix, and $\mathbf{y} = (y_1, \cdots, y_n)^\intercal$ is the response vector.
````
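Here too the fix turns $\mathbf{S}$ and $\mathbf{T}$ into $1 \times n$ row vectors, with a scalar cutoff $\tau = \sqrt{C}$ for the quadratic piece. A minimal NumPy sketch of the shapes; the `rehu` helper below assumes the standard rectified-Huber form from the ReHLine formulation (zero below 0, quadratic up to `tau`, linear beyond), which this diff does not itself spell out:

```python
import numpy as np

rng = np.random.default_rng(0)
n, C = 50, 2.0
y = rng.choice([-1.0, 1.0], size=n)    # binary labels in {-1, +1}

# Smoothed-SVM ReHU parameters: after the transpose, S and T are 1 x n rows
S = -np.sqrt(C) * y[None, :] / n
T = np.sqrt(C) * np.ones((1, n)) / n
tau = np.sqrt(C)                       # ReHU cutoff, a scalar here

def rehu(z, tau):
    """Rectified Huber: 0 for z <= 0, z^2/2 for 0 < z <= tau,
    tau*(z - tau/2) above (assumed ReHU form, not stated in this diff)."""
    z = np.asarray(z, dtype=float)
    return np.where(z <= 0, 0.0,
                    np.where(z <= tau, z**2 / 2, tau * (z - tau / 2)))
```

The two branches of `rehu` agree at `z = tau` (both give `tau**2 / 2`), so the loss is continuously differentiable, which is the point of smoothing the hinge.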
