Force-pushed from b5cf68c to 3c4b31b.
Michael, I finished what I wanted.
@amontoison probably this is why the
A more realistic example, batched QuadraticModel where we vary only the RHS: JuliaSmoothOptimizers/QuadraticModels.jl@main...klamike:QuadraticModels.jl:mk/rhsbatch |
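A hedged sketch of why varying only the RHS pays off (plain Julia with the LinearAlgebra stdlib, not the QuadraticModels.jl API; all names here are illustrative): when every batch member shares the same matrix and only the right-hand side changes, one factorization can be reused for the whole batch.

```julia
using LinearAlgebra

# Illustrative sketch, not the QuadraticModels.jl API: a batch of linear
# systems that share the same positive-definite matrix H and differ only
# in the right-hand side can reuse a single factorization.
H = [4.0 1.0; 1.0 3.0]           # shared matrix across the batch
B = [1.0 2.0 3.0;                # each column is one batch member's RHS
     1.0 0.0 1.0]
F = cholesky(Symmetric(H))       # factor once
X = F \ B                        # backsolve all right-hand sides at once
```

Factoring once and backsolving against a matrix of right-hand sides is the same economy a batched QuadraticModel with a varying RHS would exploit.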
Amazing, Michael!
Should we hardcode
I think so, yes, to be consistent with the regular API. Both can be updated at the same time later (maybe only in 0.22).
We probably want to define some more meta functions like
Force-pushed from 65dcf55 to a20cc30.
@amontoison What do you think about having an API for updating the nbatch? And maybe some optional
@klamike I don't understand what you mean by updating the
Yes, I meant updating the
In the ExaModels case, of course it depends on how you do the batching. When I added parameters to ExaModels, I specifically made the lower-level functions all take the parameter vector as an input, to make it possible to implement the batching the way I did in BatchNLPKernels. It is based on a single ExaModel and does the same number of kernel launches for the batch evaluation as ExaModels would for a regular model, just with 2D grids. Since the base parametric ExaModel is built "unbatched", it is trivial to implement the
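A minimal sketch of the pattern described above (illustrative names only, not the actual ExaModels/BatchNLPKernels API): because the unbatched evaluator takes the parameter vector explicitly, the batched version can reuse it unchanged, mapping over the columns of a parameter matrix much like launching the same kernel over a 2D grid.

```julia
# Illustrative only: `obj` stands in for an unbatched, parameter-aware
# evaluator; `obj_batch` reuses it as-is for every parameter scenario.
obj(x, θ) = sum(abs2, x .- θ)                  # single-model evaluation
obj_batch(x, Θ) = [obj(x, view(Θ, :, j)) for j in 1:size(Θ, 2)]

x = [1.0, 2.0]                 # one decision vector
Θ = [0.0 1.0;                  # parameter matrix: one scenario per column
     0.0 2.0]
vals = obj_batch(x, Θ)         # → [5.0, 0.0]
```

The key design point is that the batched path adds no new model-building logic: the base model is built once, and only the parameter data gains a batch dimension.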
Actually the
I do still think the
@klamike Feel free to add what you need in the API on
cc @klamike