Conversation

@abelsiqueira
Member

Stores xt inside LineModel to prevent allocations.

WIP because I can't predict the allocations of armijo_wolfe yet.
Testing manually with this script gives a different result.
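The idea can be sketched with a self-contained toy (the struct and function names here are illustrative, not the actual LineModel code): storing the trial point as a field lets an in-place broadcast reuse the same buffer on every evaluation.

```julia
# Toy illustration of the preallocation pattern (hypothetical names,
# not the actual LineModel code): xt is allocated once and reused.
struct ToyLineModel
    x::Vector{Float64}
    d::Vector{Float64}
    xt::Vector{Float64}   # preallocated trial point x + t * d
end
ToyLineModel(x, d) = ToyLineModel(x, d, similar(x))

# In-place broadcast: no new vector is created per call.
trial_point!(f::ToyLineModel, t) = (@. f.xt = f.x + t * f.d; f.xt)

# Measure inside a function so non-constant globals don't pollute the count.
function measure(f)
    trial_point!(f, 1.0)               # warm up (compilation allocates)
    @allocated trial_point!(f, 2.0)
end

f = ToyLineModel(ones(100), fill(0.5, 100))
al = measure(f)                        # expected to be 0 after warm-up
```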

@abelsiqueira abelsiqueira force-pushed the improve-alloc-line-model branch from 0cea9fa to 76131b7 on May 31, 2021 23:19
@codecov

codecov bot commented May 31, 2021

Codecov Report

Merging #166 (8c2015c) into main (9f316a3) will increase coverage by 0.46%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##             main     #166      +/-   ##
==========================================
+ Coverage   90.64%   91.11%   +0.46%     
==========================================
  Files           8        8              
  Lines         171      180       +9     
==========================================
+ Hits          155      164       +9     
  Misses         16       16              
Impacted Files Coverage Δ
src/linesearch/line_model.jl 100.00% <100.00%> (ø)

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 9f316a3...8c2015c.

@abelsiqueira
Member Author

@dpo, I've updated with allocations tests, but there are a few issues.

First, the allocations of the LineModel API and of Armijo differ on Julia 1.3, so the tests fail there.

The allocation of Armijo is not really well defined. I initially thought it was (8 + nbk) * 32 + 16, but that only holds for bk_max >= 4, which yields nbk = 4. If instead bk_max < 4, the count becomes (7 + 1.5nbk) * 32, which in turn does not hold for nbk = 4. So I used min(12.5, 7 + 1.5nbk) * 32, where 12.5 * 32 = 400 matches (8 + 4) * 32 + 16. Furthermore, if I run the measurement in a loop, it allocates an additional 208 bytes.

Any suggestions?

  NLPModels.increment!(f, :neval_grad)
- return dot(grad(f.nlp, f.x + t * f.d), f.d)
+ @. f.xt = f.x + t * f.d
+ return dot(grad(f.nlp, f.xt), f.d)
Member

This method should call grad!().

Member Author

I would need to store g inside LineModel as well, is that what you mean?
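For illustration, a hedged sketch of what the in-place version could look like once a gradient buffer is stored in the model; toy_grad! is a stand-in for NLPModels' grad!, and all names here are hypothetical:

```julia
using LinearAlgebra

# Toy in-place gradient for f(x) = 0.5*norm(x)^2, i.e. grad f(x) = x;
# a stand-in for grad! in this sketch.
toy_grad!(g, x) = (g .= x; g)

struct ToyLine
    x::Vector{Float64}
    d::Vector{Float64}
    xt::Vector{Float64}   # trial-point buffer
    g::Vector{Float64}    # gradient buffer stored in the model
end

# h'(t) = grad f(x + t d)' d, computed without allocating temporaries.
function derivative!(f::ToyLine, t)
    @. f.xt = f.x + t * f.d
    toy_grad!(f.g, f.xt)              # in-place, unlike grad(f.nlp, f.xt)
    return dot(f.g, f.d)
end

f = ToyLine(ones(3), fill(2.0, 3), zeros(3), zeros(3))
dval = derivative!(f, 1.0)            # xt = [3,3,3], so dval = 18.0
```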

  NLPModels.increment!(f, :neval_hess)
- return dot(f.d, hprod(f.nlp, f.x + t * f.d, f.d))
+ @. f.xt = f.x + t * f.d
+ return dot(f.d, hprod(f.nlp, f.xt, f.d))
Member

Shouldn’t this implement the NLPModels API properly, i.e., with hess_coord, etc? It should be easy; the Hessian is just a scalar here.

Member Author

It can't implement the API properly because the arguments are scalars, so even if we implemented the missing functions, they couldn't be tested or used as NLPs. Alternatively, we could switch to a proper NLPModel, similar to what I tried for the merit function in #145. We would essentially remove this LineModel and use MeritModels instead, as in #146. If we decide to go that way, I can try to make a simplified version of #145 and #146.

for bk_max = 0:8
    (t, gg, ht, nbk, nbW) = armijo_wolfe(lm, h₀, slope, g, bk_max=bk_max)
    al = @allocated armijo_wolfe(lm, h₀, slope, g, bk_max=bk_max)
    @test al == min(12.5, 7 + 1.5nbk) * 32 + 208
end
Member

I’m trying to understand where armijo_wolfe allocates. Do you have a sense of that?

Member Author

I count:

  • 6 scalar keyword arguments;
  • the creation of 9 variables: nbk, nbW, ht, slope_t, h_goal, fact, ϵ, Armijo, good_grad;
  • one obj call per backtracking iteration nbk (which should be 16 allocations each);
  • one objgrad! call (which should be 64 allocations).

There are 4 iterations, and there are some scalar operations. I imagined most of these were actually no-ops, so I can't explain the allocations yet, especially the extra 208 bytes that only appear inside the for loop.
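One plausible (unconfirmed) source of the extra 208 bytes is the measurement site itself: at top level, @allocated can pick up boxing and dispatch overhead from non-constant globals. A minimal sketch of the usual workaround, with a toy work function standing in for the armijo_wolfe call:

```julia
# Allocation-free once compiled.
work(v) = sum(v)

# Wrapping the measurement in a function gives concrete argument types,
# so the count reflects only what work itself allocates.
function measure(v)
    work(v)                 # warm up so compilation is not counted
    @allocated work(v)
end

al = measure(rand(10))
```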

@dpo
Member

dpo commented Jun 1, 2021

Does the armijo_wolfe in #146 allocate?

@abelsiqueira
Member Author

Does the armijo_wolfe in #146 allocate?

Probably; I made no effort to remove allocations in #146.

@abelsiqueira abelsiqueira force-pushed the improve-alloc-line-model branch from 76131b7 to 98cb346 on July 1, 2021 17:55
@github-actions
Contributor

github-actions bot commented Jul 1, 2021

Package name latest stable
JSOSolvers.jl
Percival.jl


@abelsiqueira abelsiqueira force-pushed the improve-alloc-line-model branch from 2af7496 to 9312d39 on July 1, 2021 18:58

@abelsiqueira abelsiqueira force-pushed the improve-alloc-line-model branch from 9312d39 to 1b16f3b on July 1, 2021 23:59

@dpo
Member

dpo commented Jul 2, 2021

Maybe restrict allocation tests to Julia > 1.3?
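A minimal sketch of such a gate, assuming the allocation assertions can simply be skipped on older Julia versions (toy functions, not the real test body):

```julia
using Test

plus_one(x) = x + 1.0                  # toy allocation-free operation
count_allocs(x) = @allocated plus_one(x)

# Run allocation-sensitive checks only on Julia versions where the
# counts are stable (here: 1.4 and newer).
if VERSION >= v"1.4"
    count_allocs(0.0)                  # warm up
    @test count_allocs(0.0) == 0
end
```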

@abelsiqueira abelsiqueira force-pushed the improve-alloc-line-model branch from 1b16f3b to 7e35eaf on July 2, 2021 17:57

@abelsiqueira abelsiqueira changed the title [WIP] Preallocate on LineModel Preallocate on LineModel Jul 2, 2021
@abelsiqueira abelsiqueira requested a review from dpo July 2, 2021 22:11
@abelsiqueira abelsiqueira force-pushed the improve-alloc-line-model branch from 7e35eaf to 8c2015c on July 11, 2021 02:31

@abelsiqueira abelsiqueira merged commit 453e4b1 into main Jul 11, 2021
@abelsiqueira abelsiqueira deleted the improve-alloc-line-model branch July 11, 2021 02:43