Commit 33fe669
Handle dimension-reducing attributes on Parameters (cvxpy#3146)
* Handle dimension-reducing attributes on Parameters
Parameters with dimension-reducing attributes (sparsity, diag, symmetric,
PSD, NSD) were stored but ignored during canonicalization. This extends
CvxAttr2Constr to lower these parameters into reduced-size representations,
using shared infrastructure moved from Variable to Leaf.
Key changes:
- Move provenance tracking from Variable to Leaf base class
- Add _has_dim_reducing_attr, reduced_size, and
_build_dim_reduced_expression() to Leaf for shared dim-reduction logic
- Extend CvxAttr2Constr.apply() to create reduced parameters
- Update cone_matrix_stuffing to extract values from lowered parameters
- Extend lower_value() to handle sparsity attribute
- Copy dim-reducing attrs in DGP parameter canonicalization
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
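The reduced-size computation described above can be sketched as follows. This is an illustrative stand-alone function, not the actual `Leaf._reduced_size` implementation; the attribute names mirror CVXPY's constructor keywords, and `sparsity` is assumed to be a tuple of index arrays:

```python
import numpy as np

def reduced_size(shape, attributes):
    """Sketch: number of free entries for a leaf with a
    dimension-reducing attribute (illustrative, not CVXPY's code)."""
    n = shape[0]
    if attributes.get("diag"):
        # Only the diagonal is free.
        return n
    if attributes.get("symmetric") or attributes.get("PSD") or attributes.get("NSD"):
        # The upper triangle determines the whole matrix.
        return n * (n + 1) // 2
    if attributes.get("sparsity") is not None:
        # One free entry per index in the sparsity pattern.
        return len(attributes["sparsity"][0])
    return int(np.prod(shape))

assert reduced_size((4, 4), {"symmetric": True}) == 10
assert reduced_size((5, 5), {"diag": True}) == 5
assert reduced_size((3, 3), {"sparsity": ([0, 1], [1, 2])}) == 2
```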
* Always store sparse leaf _value as ndarray of nonzero values
Simplify save_value to store just val[self.sparse_idx] instead of
wrapping in a coo_array that was immediately unwrapped on read.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
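The simplification amounts to the following sketch (the `sparse_idx` pattern and the free-function form are illustrative; in CVXPY, `save_value` is a method on the leaf):

```python
import numpy as np

# Assumed sparsity pattern: row indices and column indices of the nonzeros.
sparse_idx = (np.array([0, 1, 2]), np.array([2, 0, 1]))

def save_value(val):
    # Store just the 1-D array of values at the sparsity pattern,
    # instead of wrapping it in a coo_array that is unwrapped on read.
    return np.asarray(val)[sparse_idx]

full = np.zeros((3, 3))
full[sparse_idx] = [7.0, 8.0, 9.0]
assert np.array_equal(save_value(full), [7.0, 8.0, 9.0])
```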
* Copy symmetric attr to log-space variables/parameters in DGP
PSD/NSD imply symmetry but the semidefiniteness constraint doesn't
transfer to log-space, so only copy symmetric. Sparsity and diag
have structural zeros and log(0) is undefined.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
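A quick numerical check of why symmetry (unlike semidefiniteness or structural zeros) survives the transformation: the entrywise log of a symmetric matrix with positive entries is itself symmetric.

```python
import numpy as np

# Symmetric with strictly positive entries, so entrywise log is defined.
P = np.array([[1.0, 2.0],
              [2.0, 4.0]])
L = np.log(P)
# log(P).T == log(P.T) == log(P): symmetry transfers to log-space.
assert np.allclose(L, L.T)
```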
* Use update_parameters for dim-reducing parameter value propagation
Move parameter value propagation from cone_matrix_stuffing provenance
lookups to CvxAttr2Constr.update_parameters, matching the pattern used
by Dgp2Dcp and Complex2Real. Also add param_backward and param_forward
for differentiation support.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Fix hermitian param DPP by iterating _parameters directly
CvxAttr2Constr.update_parameters was iterating problem.parameters()
which only contains user-facing parameters. Intermediate parameters
created by earlier reductions (e.g. Complex2Real's symmetric real_param
for hermitian parameters) were missed. Now iterates self._parameters
directly to update all reduced parameters.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
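The resulting pattern can be sketched with toy objects (class structure, `Param`, and the `(original, reduced)` pairing are illustrative assumptions; `update_parameters`, `_parameters`, and `lower_value` are the names from the commits above):

```python
import numpy as np

class Param:
    """Toy stand-in for a CVXPY Parameter."""
    def __init__(self, value=None):
        self.value = value

def lower_value(full):
    # Sketch for the symmetric case: keep only the upper triangle.
    n = full.shape[0]
    return full[np.triu_indices(n)]

class CvxAttr2ConstrSketch:
    def __init__(self, pairs):
        # (original, reduced) pairs, assumed to be recorded in apply().
        self._parameters = pairs

    def update_parameters(self):
        # Iterate self._parameters directly, so intermediate parameters
        # created by earlier reductions (e.g. Complex2Real's real_param
        # for hermitian parameters) are updated too.
        for orig, reduced in self._parameters:
            reduced.value = lower_value(orig.value)

orig = Param(np.array([[1.0, 2.0],
                       [2.0, 3.0]]))
reduced = Param()
CvxAttr2ConstrSketch([(orig, reduced)]).update_parameters()
assert np.array_equal(reduced.value, [1.0, 2.0, 3.0])
```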
* Make reduced_size private
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Revert cosmetic param_value change; error on unsupported DGP attrs
Revert the unnecessary refactor of param_value in cone_matrix_stuffing.
Add explicit errors in DGP variable_canon and parameter_canon for
sparsity, diag, PSD, and NSD attributes, which are incompatible with
the log-space transformation (structural zeros or semidefiniteness).
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Add explicit variable/dual recovery maps to contributing wishlist
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Improve param dim-reducing tests; move wishlist item to large scope
Consolidate parameter dim-reducing attribute tests to be denser: each
test covers solve, compiled param size check, and DPP re-solve. Size
assertions fail on master where parameters aren't reduced.
Move explicit variable/dual recovery maps to large scope projects in
contributing docs.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Restore shared param ids; fix gradient double-counting differently
Reduced parameters now share the original's id again (needed by
CVXPYgen). The gradient double-counting issue is fixed by changing
the backward/forward loops in problem.py: the direct dparams lookup
is only used as a fallback when no reduction handles the parameter.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Restore assert in set_leaf_of_provenance
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Address PR review comments: refactor DGP attr validation, clarify lower_value, improve tests
- Factor out common attribute validation between variable_canon and
parameter_canon into _validate_dgp_attrs helper
- Add docstring to lower_value explaining None vs explicit value paths;
replace shape comparison with direct full_size flag for sparse branch
- Add _has_dim_reducing_attr, _reduced_size, and is_dpp assertions to
parameter dim-reducing tests
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Add sparse variable derivative test
Exercises the split_adjoint -> lower_value path for sparse variables.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* Rename recover_value_for_variable to recover_value_for_leaf
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
---------
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

1 parent dcaa9bd · commit 33fe669
File tree

11 files changed: +287 -67 lines

- cvxpy
  - cvxcore/python
  - expressions
  - problems
  - reductions
    - dcp2cone
    - dgp2dcp/canonicalizers
    - solvers/conic_solvers
  - tests
- doc/source/contributing