Preallocate on LineModel #166
Conversation
0cea9fa to 76131b7
Codecov Report
@@            Coverage Diff             @@
##             main     #166      +/-   ##
==========================================
+ Coverage   90.64%   91.11%   +0.46%
==========================================
  Files           8        8
  Lines         171      180       +9
==========================================
+ Hits          155      164       +9
  Misses         16       16
Continue to review full report at Codecov.
@dpo, I've updated with allocation tests, but there are a few issues. First, the allocations of the LineModel API and of Armijo differ on Julia 1.3, so the tests fail there. Second, the allocation count of Armijo is not really well defined; I initially thought it was. Any suggestions?
  NLPModels.increment!(f, :neval_grad)
- return dot(grad(f.nlp, f.x + t * f.d), f.d)
+ @. f.xt = f.x + t * f.d
+ return dot(grad(f.nlp, f.xt), f.d)
This method should call grad!().
I would need to store g inside LineModel as well, is that what you mean?
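If I read the suggestion correctly, it would mean something like the sketch below: keep a preallocated gradient buffer inside LineModel next to xt and call grad! into it. The field name gt and the exact signature are assumptions for illustration, not code from this PR.

```julia
using LinearAlgebra
using SolverTools: LineModel
using NLPModels: grad!, increment!
import NLPModels: grad   # extend the existing generic for the scalar argument

# Sketch: assumes LineModel carries a hypothetical gradient buffer `gt` next to `xt`.
function grad(f::LineModel, t::AbstractFloat)
  increment!(f, :neval_grad)
  @. f.xt = f.x + t * f.d        # reuse the xt buffer introduced in this PR
  grad!(f.nlp, f.xt, f.gt)       # in-place gradient of the underlying model
  return dot(f.gt, f.d)          # directional derivative along d
end
```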
  NLPModels.increment!(f, :neval_hess)
- return dot(f.d, hprod(f.nlp, f.x + t * f.d, f.d))
+ @. f.xt = f.x + t * f.d
+ return dot(f.d, hprod(f.nlp, f.xt, f.d))
Shouldn’t this implement the NLPModels API properly, i.e., with hess_coord, etc.? It should be easy; the Hessian is just a scalar here.
It can't implement the API properly because the arguments are scalars, so even if we implement the missing functions, they can't be tested or used as NLPs. Alternatively, we could switch to a proper NLPModel, similar to what I tried to do for the merit function in #145: essentially remove this LineModel and use MeritModels instead, as in #146. If we decide to go that way, I can try to make a simplified version of #145 and #146.
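For reference, a rough sketch of what the coordinate-Hessian route could look like for a 1-variable line model; the vector argument t, the reuse of the xt buffer, and these method bodies are illustrative assumptions rather than code from this PR or from #145/#146.

```julia
using LinearAlgebra
using SolverTools: LineModel
using NLPModels: hprod, increment!
import NLPModels: hess_structure!, hess_coord!

# The Hessian of the line model is 1-by-1: its only entry is d' ∇²f(x + t d) d.
function hess_structure!(f::LineModel, rows::AbstractVector{<:Integer},
                         cols::AbstractVector{<:Integer})
  rows[1] = 1
  cols[1] = 1
  return rows, cols
end

function hess_coord!(f::LineModel, t::AbstractVector, vals::AbstractVector;
                     obj_weight::Real = one(eltype(vals)))
  increment!(f, :neval_hess)
  @. f.xt = f.x + t[1] * f.d                                 # reuse the xt buffer
  vals[1] = obj_weight * dot(f.d, hprod(f.nlp, f.xt, f.d))   # the single Hessian entry
  return vals
end
```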
test/test_linesearch.jl (outdated diff)
for bk_max = 0:8
  (t, gg, ht, nbk, nbW) = armijo_wolfe(lm, h₀, slope, g, bk_max=bk_max)
  al = @allocated armijo_wolfe(lm, h₀, slope, g, bk_max=bk_max)
  @test al == min(12.5, 7 + 1.5nbk) * 32 + 208
I’m trying to understand where armijo_wolfe allocates. Do you have a sense of that?
I count:
- 6 scalar keyword arguments;
- creation of 9 variables: nbk, nbW, ht, slope_t, h_goal, fact, ϵ, Armijo, good_grad;
- one obj call per nbk (which should be 16 allocations);
- one objgrad! call (should be 64 allocations).
There are 4 iterations and some scalar operations. I imagined most of these were actually no-ops allocation-wise, so I can't explain the allocations yet, especially the extra 208 bytes that only show up inside the for loop.
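One way to narrow that down (a debugging sketch, not part of the PR; the test problem and setup below are assumed to mirror test_linesearch.jl) is to measure each building block separately and compare against the total:

```julia
using LinearAlgebra, NLPModels, SolverTools

nlp = ADNLPModel(x -> sum(x .^ 2), ones(5))       # throwaway problem, assumed available
g = similar(nlp.meta.x0)
lm = LineModel(nlp, nlp.meta.x0, -grad(nlp, nlp.meta.x0))
h₀ = obj(lm, 0.0)
slope = grad(lm, 0.0)

# warm everything up so compilation is not counted, then measure
obj(lm, 1.0); grad(lm, 1.0); armijo_wolfe(lm, h₀, slope, g)
@show @allocated obj(lm, 1.0)                     # per-nbk cost inside the backtracking loop
@show @allocated grad(lm, 1.0)                    # gradient evaluation on the line
@show @allocated armijo_wolfe(lm, h₀, slope, g)   # the total we are trying to explain
```

If the pieces add up to less than the total, the remainder (e.g. the extra 208 bytes) would be coming from armijo_wolfe itself rather than from the LineModel calls.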
Does the
76131b7 to 98cb346
2af7496 to 9312d39
9312d39 to 1b16f3b
Maybe restrict allocation tests to Julia > 1.3?
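That could look roughly like the sketch below, reusing the test above and gating it on the Julia version; the exact v"1.4" bound is an assumption based on the comment.

```julia
# Only run the exact-allocation checks on versions where the counts are stable.
if VERSION ≥ v"1.4"
  for bk_max = 0:8
    (t, gg, ht, nbk, nbW) = armijo_wolfe(lm, h₀, slope, g, bk_max=bk_max)
    al = @allocated armijo_wolfe(lm, h₀, slope, g, bk_max=bk_max)
    @test al == min(12.5, 7 + 1.5nbk) * 32 + 208
  end
end
```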
1b16f3b to 7e35eaf
7e35eaf to 8c2015c
Stores xt inside LineModel to prevent allocations. WIP because I can't predict the allocations of armijo_wolfe yet. Testing manually with this script gives a different result.
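For illustration, the same pattern the diffs above apply to the gradient and Hessian methods would read like this for the objective (a sketch; the actual method in the PR may differ):

```julia
using SolverTools: LineModel
using NLPModels: increment!
import NLPModels: obj

function obj(f::LineModel, t::AbstractFloat)
  increment!(f, :neval_obj)
  @. f.xt = f.x + t * f.d      # in-place broadcast into the preallocated buffer
  return obj(f.nlp, f.xt)      # evaluate the underlying model at x + t*d
end
```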