Improve caching decorator and move it to simulations #73

Merged
santisoler merged 5 commits into main from improve-cache on Feb 10, 2026

Conversation


@santisoler (Member) commented Feb 6, 2026

Improve the decorator so it can be used on multiple methods of the same object. Change how the cache objects are stored: each `_cache_{hash(func)}` attribute is now a tuple with two elements: the hash of the model and the cached return of the `func`. This way, two different methods can keep their own caches, even for different models.

Remove the `cache_on_model` decorator from `DataMisfit`: using the decorator on `DataMisfit` methods would force us to also validate other relevant attributes, like the data, their uncertainties and the weights.

Use the decorator on the `__call__` and `jacobian` methods of `WrappedSimulation`. The caching of the J matrix should happen in `getJ`, and there's no gain in caching the `LinearOperator`.
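The caching scheme described above can be sketched as follows. This is a minimal, hypothetical reimplementation of a `cache_on_model`-style decorator, not the actual code from this PR: the per-method `_cache_{hash(func)}` attribute and the `(model_hash, result)` tuple follow the description, while the choice of SHA-256 over the model's bytes as the invalidation key is an assumption.

```python
import functools
import hashlib

import numpy as np


def cache_on_model(func):
    """Cache a method's return value, keyed on the current model.

    The result is stored on the instance in a per-method attribute named
    ``_cache_{hash(func)}`` as a ``(model_hash, result)`` tuple, so several
    decorated methods on the same object keep independent caches, each
    possibly tied to a different model.
    """
    # One attribute per decorated function: two methods never collide.
    cache_attr = f"_cache_{hash(func)}"

    @functools.wraps(func)
    def wrapper(self, model, *args, **kwargs):
        # Hashing the raw bytes of the model array is an assumption here;
        # any stable fingerprint of the model would work.
        model_hash = hashlib.sha256(np.ascontiguousarray(model).tobytes()).hexdigest()
        cached = getattr(self, cache_attr, None)
        if cached is not None and cached[0] == model_hash:
            return cached[1]  # cache hit: same model as the previous call
        result = func(self, model, *args, **kwargs)
        setattr(self, cache_attr, (model_hash, result))  # overwrite stale entry
        return result

    return wrapper
```

Under this sketch, decorating both `__call__` and `jacobian` on a simulation wrapper gives each method its own `(model_hash, result)` slot, so calling one never evicts the other's cache:

```python
class WrappedSimulation:
    @cache_on_model
    def __call__(self, model):
        ...  # expensive forward computation

    @cache_on_model
    def jacobian(self, model):
        ...  # expensive Jacobian computation
```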
@santisoler merged commit b4c6abf into main on Feb 10, 2026
2 checks passed
@santisoler deleted the improve-cache branch on February 10, 2026 at 22:49
