data.json file is getting big and causing opa latency #420
-
Hey @itayhac, 1MB isn't a very large dataset, so my guess is that the policy is using some inefficient logic. Have you tried reviewing the recommendations in https://www.openpolicyagent.org/docs/latest/policy-performance/ ?
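For illustration (the names here are hypothetical, not from your policy): a rule that scans the whole dataset does work proportional to its size, while a direct key lookup does not. A minimal sketch in classic Rego syntax, assuming a `data.users` object keyed by ID:

```rego
package example

# Slow: walks every entry in data.users on each evaluation, so the
# rule's cost grows linearly with the size of the dataset.
allow_slow {
    some id
    data.users[id].name == input.name
    data.users[id].role == "admin"
}

# Fast: a direct key reference. If the dataset is keyed by a value
# already present in the input, no scan is needed.
allow_fast {
    data.users[input.id].role == "admin"
}
```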
-
I'm not sure I follow here, and your screenshot is cut off before any metrics are shown. Are you using …
-
I have a bundle with the following structure:

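(Approximately; the layout below is inferred from the eval command further down, so the exact names may differ:)

```
.
├── logic.rego   # package logic, defines the access rule
├── data.json    # the dataset that recently grew to ~1MB
└── input.json   # sample input passed with -i
```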
The data.json is usually ~2-5KB and total evaluation time is <10ms. Recently, though, data.json grew to ~1MB, which makes OPA noticeably slower: total evaluation time is now ~300ms. Running opa eval with the profiler shows that the import of data.json into logic.rego accounts for most of that time. Note that data.json does not change within a bundle revision.

I'm using the following command for debugging:
```shell
opa eval -d . -i input.json "data.logic.access" --profile --format=pretty --count=1
```
Using the profiler, I can see that almost all of the time is consumed by this line of code:
```rego
constraints = object.get(index, "constraints", {})
```
This means the dynamic lookup into the imported dataset is what slows us down. So my question is: how can I improve the latency of the OPA component in our system?

The first thing that came to mind is to "hard-code" the contents of data.json in our Rego file instead of importing it, roughly as in the sketch below.
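For concreteness, a minimal sketch of what I mean (the keys and values here are made up; the real dataset is ~1MB):

```rego
package logic

# Hypothetical inlined dataset standing in for the imported data.json.
dataset := {
    "tenant-a": {"constraints": {"max_requests": 100}},
    "tenant-b": {"constraints": {"max_requests": 50}},
}

access {
    # Same lookup as before, but against the inlined object rather
    # than the data.json loaded at evaluation time.
    index := dataset[input.tenant]
    constraints := object.get(index, "constraints", {})
    constraints.max_requests > 0
}
```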