
Conversation

DegenFans

The outcome for perUnitFlowRate is wrong when there are several actions in the same tx (which will happen more and more with macros). That's why the pool should be read via loadInBlock first, so that the in-memory variant is always used when it exists, with a fallback to load otherwise.
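For illustration, a minimal sketch of the load pattern being suggested, assuming a generated `Pool` entity class from codegen (the actual entity and handler names in the subgraph may differ):

```typescript
// Hypothetical sketch: prefer the in-block entity, fall back to the store.
// `Pool` stands in for the subgraph's generated entity class.
import { Pool } from "../generated/schema";

function getPool(poolId: string): Pool | null {
  // loadInBlock returns entities created or updated earlier in the same
  // block, so a pool created by a prior action in the same tx is visible.
  let pool = Pool.loadInBlock(poolId);
  if (pool == null) {
    // Fallback: entity state from before the current block.
    pool = Pool.load(poolId);
  }
  return pool;
}
```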

@kasparkallas
Contributor

Hey

I don't see how this will help with changing the value of perUnitFlowRate. I haven't used loadInBlock before, but per the docs it seems to be meant as a performance optimization: https://thegraph.com/docs/en/subgraphs/developing/creating/graph-ts/api/#looking-up-entities-created-withing-a-block

Please describe your issue in as much detail as possible. Examples are appreciated.


codecov bot commented Jun 30, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅


@DegenFans
Author

The problem is that the perUnitFlowRate for the streme token is always zero even though there is a flow. Only one distributeFlow happens in the whole process, plus an updateMembership with each stake change. Only in distributeFlow is perUnitFlowRate calculated "fresh", and that happens just once, in the first tx after the first updateMembership, but at that point totalUnits appears to be 0. The pool is loaded from the store, i.e. from the state before the current block, where it doesn't exist yet. If I understand loadInBlock correctly, it also returns the newly generated pool entity from the same block, which is what we need to calculate perUnitFlowRate correctly.
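To make the failure mode concrete, here is a hedged sketch (function and field names are assumed for illustration, not taken from the actual mapping) of how a stale totalUnits of 0 forces the computed rate to 0:

```typescript
import { BigInt } from "@graphprotocol/graph-ts";

// Hypothetical illustration: if the pool is re-loaded from the store, a pool
// created earlier in the same block is missing, so totalUnits reads as 0.
function computePerUnitFlowRate(flowRate: BigInt, totalUnits: BigInt): BigInt {
  if (totalUnits.isZero()) {
    // Division guard: with stale totalUnits == 0 this branch is taken,
    // and perUnitFlowRate stays 0 even though a flow exists.
    return BigInt.zero();
  }
  return flowRate.div(totalUnits);
}
```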

Here is an example subgraph query on Base which shows a wrong perUnitFlowRate:

query MyQuery {
  poolMembers(
    where: {pool: "0x42e357ce4427d908ef50cff1d6a082ec358d1f1c"}
    block: {number: 30665705}
  ) {
    units
    pool {
      perUnitFlowRate
      createdAtTimestamp
      flowRate
      totalUnits
      perUnitSettledValue
    }
    isConnected
  }
}

