[ENG-1734] feat(comlink): add GET /v4/tradeHistory endpoint with cumulative PnL tracking#3323
davidli1997 merged 19 commits into `main` from `davidli/trade-history`
Conversation
> **Note:** Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review.
📝 Walkthrough

Adds a trade history feature: a new in-memory compute and pagination library, Express v4 controller endpoints for subaccount and parent-subaccount trade history, types and Swagger docs, a clobPairId→market helper, fills-controller integration changes, deterministic test helpers, and comprehensive unit + integration tests.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Controller as TradeHistoryController
    participant Validator as Validation Layer
    participant DB as FillTable/Database
    participant OrderType as OrderType Lookup
    participant MarketMap as ClobPairId→Market
    participant Logic as TradeHistory Logic
    participant Response
    Client->>Controller: GET /v4/tradeHistory?address=A&subaccountNumber=S&limit=10&page=1
    Controller->>Validator: validate params (address, subaccountNumber, market/marketType)
    Validator-->>Controller: OK
    Controller->>DB: query fills for subaccount (ordered by time desc)
    DB-->>Controller: [fills...]
    Controller->>OrderType: batch fetch order types for fill.orderId[]
    OrderType-->>Controller: orderTypeMap
    Controller->>MarketMap: buildClobPairIdToMarket()
    MarketMap-->>Controller: clobPairIdToMarket
    Controller->>Logic: computeTradeHistory(fills, orderTypeMap, clobPairIdToMarket, subaccountMap)
    Note over Logic: group by market/order → compute rows<br/>handle cross-zero, liquidation, weighted prices, PnL
    Logic-->>Controller: [tradeHistoryRows...]
    Controller->>Controller: paginateTradeHistory(rows, limit, page)
    Controller->>Response: 200 TradeHistoryResponse (pageSize, offset, totalResults, tradeHistory)
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 10
🤖 Fix all issues with AI agents
In `@indexer/services/comlink/__tests__/lib/trade-history.test.ts`:
- Around line 158-162: Tests disagree on the sign convention for additionalSize:
stick to the documented "signed: positive = increasing, negative = reducing"
contract and make both OPEN short cases produce a negative additionalSize;
update the failing test expecting '5' to expect '-5' (or if the implementation
is wrong, change the code that computes additionalSize so that when
TradeHistoryType.OPEN and PositionSide.SHORT the produced additionalSize is
negative). Ensure the symbol additionalSize (used alongside
TradeHistoryType.OPEN and PositionSide.SHORT) is consistently signed across both
simple OPEN short and cross-zero OPEN short paths.
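Under the documented contract the comment cites (signed: positive = increasing, negative = reducing), the expected sign can be sketched as follows. `signedAdditionalSize` and `Side` are hypothetical illustration names, not the PR's actual symbols:

```typescript
// Hypothetical helper illustrating the signed-size convention the review
// asks both OPEN-short paths to follow; names are illustrative only.
type Side = 'LONG' | 'SHORT';

function signedAdditionalSize(side: Side, size: number): number {
  // Opening or extending a short moves exposure in the negative direction,
  // so the convention yields a negative value; longs stay positive.
  return side === 'SHORT' ? -size : size;
}

// An OPEN short of 5 should therefore report additionalSize = -5, never 5.
```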
In `@indexer/services/comlink/public/api-documentation.md`:
- Around line 3667-3669: There are blank lines inside blockquotes in
api-documentation.md (the block that contains the lines starting with "> Example
responses" and "> 200 Response" and similarly at the other occurrence), which
violates MD028; remove the empty lines between the quote markers so each
blockquote contains only contiguous quoted lines (i.e., delete any empty lines
between consecutive ">" lines around the "Example responses" and "200 Response"
blocks and the similar block at the other location).
- Around line 3677-3691: The API docs show the trade history JSON with the field
"market" but the implementation and types use "marketId"; update the
documentation and swagger schema so the response field is named "marketId"
everywhere: replace "market" with "marketId" in the JSON examples in
api-documentation.md (the trade history response example) and update the
corresponding swagger.json response schema entries to use "marketId": string;
ensure the examples and schema keys match the TypeScript response type
(marketId) and run a quick search for any other occurrences of "market" used as
the trade history field to correct them.
In `@indexer/services/comlink/public/swagger.json`:
- Around line 1461-1526: The Swagger schema TradeHistoryResponseObject must be
updated to match the TypeScript interface: rename the existing "market" property
to "marketId" (preserve type string), and add the missing properties "id" (type
string) and "positionSide" (use the same enum/ref used elsewhere for position
side, e.g., the PositionSide schema or a string if that is how it's modelled);
ensure these fields are included in the "properties" section and in the
"required" array if the TypeScript interface marks them required, and keep
"additionalProperties": false.
In `@indexer/services/comlink/src/controllers/api/v4/trade-history-controller.ts`:
- Around line 125-134: The FillTable.findAll calls (search invoked via
FillTable.findAll) fetch all matching fills because no limit is passed; to
prevent unbounded memory use, add a server-side cap by supplying a capped limit
parameter (e.g., use a MAX_FILL_FETCH constant) when calling FillTable.findAll
and/or validate the incoming pagination limit so it cannot exceed that cap, and
if neither a limit nor a time-bounded filter (createdBeforeOrAt or
createdBeforeOrAtHeight) is provided, reject the request with a 400 or require a
time window; update both occurrences of FillTable.findAll to use the capped
limit/validation logic.
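A sketch of the server-side cap this comment proposes, assuming a constant like the suggested `MAX_FILL_FETCH` — both the constant and `cappedLimit` are hypothetical names for illustration, not code from the PR:

```typescript
const MAX_FILL_FETCH = 10_000; // hypothetical server-side cap

// Clamp a client-supplied pagination limit so FillTable.findAll is never
// called unbounded; absent a limit, fall back to the cap itself.
function cappedLimit(requested?: number): number {
  if (requested === undefined) {
    return MAX_FILL_FETCH;
  }
  return Math.min(Math.max(requested, 1), MAX_FILL_FETCH);
}
```

The capped value would then be passed as `limit` in both `FillTable.findAll` calls, so the database stops at the threshold regardless of what the client requests.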
- Line 151: The TSOA route decorator (`@Get`('/parentSubaccount')) does not match
the actual Express route (/parentSubaccountNumber), causing generated docs to
point to the wrong endpoint; update the TSOA decorator in the
TradeHistoryController (replace `@Get`('/parentSubaccount') with
`@Get`('/parentSubaccountNumber')) so the swagger path
/tradeHistory/parentSubaccountNumber matches the runtime route, and apply the
same fix in the FillsController where the same mismatch exists.
In `@indexer/services/comlink/src/lib/trade-history.ts`:
- Around line 236-309: handleCrossZero returns closeRow then openRow with
identical group.lastCreatedAt, causing V8 stable DESC sort to keep close before
open; fix by giving the open row a synthetic later timestamp (or secondary
tiebreaker) so open sorts after close: in handleCrossZero set openRow.time to a
slightly later value than group.lastCreatedAt (e.g., add 1ms or increment a
numeric timestamp) or alternatively include a deterministic suffix/tiebreaker in
makeRowId and adjust the final sort to consider that suffix as a secondary key;
update handling to preserve original time type and ensure tests/consumers accept
the adjusted time/tiebreaker.
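One way to realize the suggested 1 ms synthetic tiebreaker — a minimal sketch assuming ISO-8601 string timestamps; `tiebreakOpenTime` is a hypothetical helper, not the PR's code:

```typescript
// Give the synthetic OPEN row of a cross-zero fill a timestamp 1 ms after
// the CLOSE row, so a stable DESC sort by time puts OPEN first (most recent)
// instead of relying on insertion order.
function tiebreakOpenTime(closeTime: string): string {
  return new Date(new Date(closeTime).getTime() + 1).toISOString();
}
```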
- Around line 84-111: The paginateTradeHistory usage loads all rows into memory;
to prevent unbounded work modify computeTradeHistory and paginateTradeHistory to
accept a lower-time filter (e.g., createdAfter) or enforce a hard cap when
fetching/processing (e.g., MAX_TRADE_ROWS) so you never materialize more than
the cap; update function signatures (computeTradeHistory, paginateTradeHistory)
to accept createdAfter or apply the MAX_TRADE_ROWS cap early (before full
mapping) and ensure the createdBeforeOrAt logic is combined with the new lower
bound or cap so only a bounded window of fills is computed and sliced for
pagination.
- Around line 65-71: The code is using a non-null assertion on
marketInfo.perpetualMarketType when calling processMarketFills, which can pass
undefined as marginMode; update the loop to guard/per-check
marketInfo.perpetualMarketType (the symbol perpetualMarketType on marketInfo)
before calling processMarketFills and either continue the loop or provide a safe
default marginMode; ensure the change is applied where fillsByMarket is iterated
and processMarketFills is invoked (referencing clobPairIdToMarket, marketInfo,
processMarketFills, and orderTypeMap) so you never call processMarketFills with
an undefined perpetualMarketType.
- Around line 55-79: computeTradeHistory currently groups fills only by
clobPairId which causes fills from different subaccounts to be merged for the
same market; change grouping so fills are partitioned by both clobPairId and
subaccountId (or iterate per subaccount) before calling processMarketFills,
e.g., build keys using clobPairId + subaccountId and pass each partition’s fills
along with marketInfo.market and marketInfo.perpetualMarketType to
processMarketFills so each child subaccount’s lifecycle is computed
independently.
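The composite-key partitioning this comment describes can be sketched as below; the `Fill` shape is reduced to the two fields that matter here, and `groupByMarketAndSubaccount` is an illustrative name:

```typescript
interface Fill {
  clobPairId: string;
  subaccountId: string;
  // ...other fill fields elided
}

// Partition fills by (clobPairId, subaccountId) so each child subaccount's
// position lifecycle in a market is computed independently rather than merged.
function groupByMarketAndSubaccount(fills: Fill[]): Map<string, Fill[]> {
  const groups = new Map<string, Fill[]>();
  for (const fill of fills) {
    const key = `${fill.clobPairId}:${fill.subaccountId}`;
    const bucket = groups.get(key);
    if (bucket) {
      bucket.push(fill);
    } else {
      groups.set(key, [fill]);
    }
  }
  return groups;
}
```

Each partition's fills would then be passed to `processMarketFills` along with that market's info, as the comment suggests.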
🧹 Nitpick comments (2)
indexer/services/comlink/__tests__/controllers/api/v4/trade-history-controller.test.ts (1)
326-345: Consider using exact assertions instead of `toBeGreaterThanOrEqual`.
`toBeGreaterThanOrEqual(2)` can mask regressions where extra/duplicate rows are returned. If exactly 2 fills are seeded, assert `toHaveLength(2)` and `toBe(2)`. Tighten assertions:

```diff
- expect(response.body.tradeHistory.length).toBeGreaterThanOrEqual(2);
- expect(response.body.totalResults).toBeGreaterThanOrEqual(2);
+ expect(response.body.tradeHistory).toHaveLength(2);
+ expect(response.body.totalResults).toBe(2);
```

indexer/services/comlink/src/controllers/api/v4/trade-history-controller.ts (1)
104-149: Consider extracting shared logic to reduce duplication.
`getTradeHistory` and `getTradeHistoryForParentSubaccount` differ only in the `FillTable.findAll` filter. The rest (clobPairId resolution, orderTypeMap build, compute, paginate) is identical. A shared private method accepting the fill-query filter would cut ~40 lines of duplication. Also applies to: 152-197
…ccountId for parent subaccount
Actionable comments posted: 1
🧹 Nitpick comments (1)
indexer/services/comlink/__tests__/lib/trade-history.test.ts (1)
529-575: Consider adding edge-case pagination tests. The pagination tests cover the main scenarios well. Consider adding tests for boundary conditions like `limit` exceeding total rows, `page = 0`, or negative/undefined page values to ensure the implementation handles them gracefully.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@indexer/services/comlink/__tests__/lib/trade-history.test.ts` around lines 529 - 575, Add unit tests in the paginateTradeHistory suite to cover edge cases: (1) limit larger than total rows (e.g., limit=20) should return all rows and correct totalResults/offset/pageSize; (2) page = 0 should be treated as page 1 (offset 0) or match current implementation—assert expected offset and first item; (3) negative page values (e.g., page = -1) and undefined page should be handled predictably—add assertions that they either default to page 1 or throw/return empty per the implementation. Reference the paginateTradeHistory function and existing tests in trade-history.test.ts to place these new it(...) cases alongside the current cases.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/indexer-build-and-push-dev-staging.yml:
- Line 9: The workflow contains a personal branch trigger
"davidli/trade-history" under the workflow's on: push/branches list which must
be removed; edit the workflow file and delete the "davidli/trade-history" entry
(or replace it with the intended shared branch names like main/staging if
needed) so that CI no longer triggers on your personal feature branch.
---
Duplicate comments:
In `@indexer/services/comlink/__tests__/lib/trade-history.test.ts`:
- Around line 219-227: Update the test expectations to reflect the corrected
sign convention for cross-zero OPEN short trades: in the assertion block
referencing openRow (variables and properties openRow.action, openRow.prevSize,
openRow.additionalSize, openRow.positionSide, openRow.netRealizedPnl,
openRow.netFee), ensure additionalSize is asserted as '-5' (negative for short
direction), positionSide is PositionSide.SHORT, netRealizedPnl is '0' after
lifecycle reset, and netFee matches the computed '0.5' value so the test
consistently follows the "negative = short direction" convention.
---
Nitpick comments:
In `@indexer/services/comlink/__tests__/lib/trade-history.test.ts`:
- Around line 529-575: Add unit tests in the paginateTradeHistory suite to cover
edge cases: (1) limit larger than total rows (e.g., limit=20) should return all
rows and correct totalResults/offset/pageSize; (2) page = 0 should be treated as
page 1 (offset 0) or match current implementation—assert expected offset and
first item; (3) negative page values (e.g., page = -1) and undefined page should
be handled predictably—add assertions that they either default to page 1 or
throw/return empty per the implementation. Reference the paginateTradeHistory
function and existing tests in trade-history.test.ts to place these new it(...)
cases alongside the current cases.
```yaml
      - main
      - 'release/indexer/v[0-9]+.[0-9]+.x' # e.g. release/indexer/v0.1.x
      - 'release/indexer/v[0-9]+.x'        # e.g. release/indexer/v1.x
      - davidli/trade-history
```
Remove personal feature branch from CI trigger before merging.
This davidli/trade-history branch trigger is useful during development for staging validation, but it should not be merged into main. It will cause unnecessary CI builds to dev/staging environments if the branch is ever pushed to again, and clutters the workflow config.
📝 Committable suggestion:

```diff
-      - davidli/trade-history
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.github/workflows/indexer-build-and-push-dev-staging.yml at line 9, The
workflow contains a personal branch trigger "davidli/trade-history" under the
workflow's on: push/branches list which must be removed; edit the workflow file
and delete the "davidli/trade-history" entry (or replace it with the intended
shared branch names like main/staging if needed) so that CI no longer triggers
on your personal feature branch.
Actionable comments posted: 3
🧹 Nitpick comments (1)
indexer/services/comlink/src/controllers/api/v4/trade-history-controller.ts (1)
105-215: Extract shared logic into a private helper to eliminate duplication.
`getTradeHistory` and `getTradeHistoryForParentSubaccount` share identical logic for: clobPairId resolution, fill-count guard, `buildOrderTypeMap`, `buildClobPairIdToMarket`, `computeTradeHistory`, `paginateTradeHistory`, and response assembly. Only fill retrieval and `subaccountIdToNumber` construction differ. Consider extracting the common tail into a private method:

♻️ Proposed refactor

```diff
+  private async buildTradeHistoryResponse(
+    fills: FillFromDatabase[],
+    subaccountIdToNumber: Record<string, number>,
+    limit: number | undefined,
+    page: number | undefined,
+  ): Promise<TradeHistoryResponse> {
+    if (fills.length > TRADE_HISTORY_MAX_FILLS) {
+      throw new BadRequestError(
+        `Too many fills (${fills.length}) to compute trade history. Please filter by market.`,
+      );
+    }
+    const orderTypeMap = await buildOrderTypeMap(fills);
+    const clobPairIdToMarket = buildClobPairIdToMarket();
+    const allRows = computeTradeHistory(fills, orderTypeMap, clobPairIdToMarket, subaccountIdToNumber);
+    const effectiveLimit = limit ?? config.API_LIMIT_V4;
+    const paginated = paginateTradeHistory(allRows, effectiveLimit, page);
+    return {
+      tradeHistory: paginated.tradeHistory,
+      pageSize: paginated.pageSize,
+      totalResults: paginated.totalResults,
+      offset: paginated.offset,
+    };
+  }
```

Then simplify both public methods to call `this.buildTradeHistoryResponse(fills, subaccountIdToNumber, limit, page)`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@indexer/services/comlink/src/controllers/api/v4/trade-history-controller.ts` around lines 105 - 215, Both controllers duplicate the tail logic (clobPairId resolution aside): after retrieving fills they run the same guard, buildOrderTypeMap, buildClobPairIdToMarket, computeTradeHistory, paginateTradeHistory and assemble the response; extract that into a private helper (e.g. private async buildTradeHistoryResponse(fills: Fill[], subaccountIdToNumber: Record<string,number>, limit?: number, page?: number): Promise<TradeHistoryResponse>) and have getTradeHistory and getTradeHistoryForParentSubaccount call it. Move the fills.length > TRADE_HISTORY_MAX_FILLS guard, calls to buildOrderTypeMap, buildClobPairIdToMarket, computeTradeHistory, paginateTradeHistory and the response construction into that helper so only FillTable retrieval and subaccountId→number mapping remain in each public method.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@indexer/services/comlink/public/swagger.json`:
- Around line 1466-1469: The schema property "subaccountNumber" in this swagger
fragment is declared as "type": "number", "format": "double" but must match the
other occurrences (e.g., PerpetualPositionResponseObject, FillResponseObject,
SubaccountResponseObject) which use "type": "integer", "format": "int32"; change
this property's declaration to "type": "integer" and "format": "int32" so SDK
generators produce an integer field and keep the spec consistent.
In `@indexer/services/comlink/src/controllers/api/v4/trade-history-controller.ts`:
- Around line 133-137: Replace the plain Error thrown when fills exceed
TRADE_HISTORY_MAX_FILLS with a client-error type (e.g., BadRequestError) so
handleControllerError maps it to HTTP 400; change the throw in the trade history
check that references fills and TRADE_HISTORY_MAX_FILLS to throw new
BadRequestError(`Too many fills (${fills.length}) to compute trade history.
Please filter by market.`) and add/import BadRequestError from your error
utilities (or introduce a BadRequestError class consistent with project errors).
Apply the identical replacement for the second occurrence mentioned around the
other check (line ~186) so both user-correctable conditions return 400 instead
of 500.
In `@indexer/services/comlink/src/lib/trade-history.ts`:
- Line 75: The code silently maps missing subaccount IDs to subaccount 0 via
"const subaccountNumber = subaccountIdToNumber[subaccountId] ?? 0", which can
misattribute rows; instead detect when subaccountIdToNumber[subaccountId] is
undefined and either throw an error or skip the group. Replace the
nullish-coalescing fallback with an explicit check: if
subaccountIdToNumber[subaccountId] is undefined, either throw a descriptive
error (e.g., including subaccountId) or continue to skip processing that group;
keep references to the same symbols (subaccountIdToNumber and subaccountNumber)
so the intent is clear.
---
Duplicate comments:
In `@indexer/services/comlink/public/api-documentation.md`:
- Around line 3665-3668: Remove the stray blank line inside the blockquote that
triggers MD028 by editing the block containing the lines "Example responses" and
"200 Response" so there are no empty lines between the blockquote markers and
its content; locate the block by searching for the exact strings "Example
responses" and "200 Response" and collapse/remove the inner blank line so the
blockquote is contiguous.
- Around line 3775-3778: There is a blank line inside a blockquote near the
"Example responses" / "200 Response" block causing MD028; edit the blockquote so
there are no empty lines between consecutive '>' lines (remove the stray blank
line(s) following the blockquote marker) so the blockquote content is continuous
and the "Example responses" and "200 Response" lines are directly adjacent to
other quoted lines.
In `@indexer/services/comlink/src/controllers/api/v4/trade-history-controller.ts`:
- Around line 124-131: The DB query does not cap results, so add a server-side
limit to the FillTable.findAll calls: include limit: TRADE_HISTORY_MAX_FILLS + 1
in the options object passed to FillTable.findAll (the same options object that
currently contains orderBy: fillOrderBy) so the database stops at the threshold;
apply the identical change to both occurrences of FillTable.findAll referenced
in this diff.
In `@indexer/services/comlink/src/lib/trade-history.ts`:
- Around line 99-126: The function paginateTradeHistory currently paginates over
the full in-memory rows array; to avoid unbounded memory/CPU use, enforce a hard
cap: define a MAX_FETCH_ROWS constant (e.g. 10_000), clamp limit to at most
MAX_FETCH_ROWS, create effectiveRows = rows.slice(0, MAX_FETCH_ROWS) and use
effectiveRows for slicing and page math (tradeHistory, offset, pageSize,
totalResults = effectiveRows.length); update paginateTradeHistory to reference
effectiveRows instead of rows and ensure page is clamped to >=1.
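The capped pagination described above can be sketched as follows — `MAX_FETCH_ROWS`, the clamping behavior, and `paginateCapped` are the reviewer's suggestion rendered as hypothetical code, not the PR's current implementation:

```typescript
const MAX_FETCH_ROWS = 10_000; // hard cap suggested by the review

interface Page<T> {
  tradeHistory: T[];
  pageSize: number;
  offset: number;
  totalResults: number;
}

function paginateCapped<T>(rows: T[], limit: number, page: number = 1): Page<T> {
  // Never materialize more than the cap; clamp limit and page to sane values.
  const effectiveRows = rows.slice(0, MAX_FETCH_ROWS);
  const pageSize = Math.min(Math.max(Math.floor(limit), 1), MAX_FETCH_ROWS);
  const safePage = Math.max(Math.floor(page) || 1, 1); // page 0/NaN -> 1
  const offset = (safePage - 1) * pageSize;
  return {
    tradeHistory: effectiveRows.slice(offset, offset + pageSize),
    pageSize,
    offset,
    totalResults: effectiveRows.length,
  };
}
```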
---
Nitpick comments:
In `@indexer/services/comlink/src/controllers/api/v4/trade-history-controller.ts`:
- Around line 105-215: Both controllers duplicate the tail logic (clobPairId
resolution aside): after retrieving fills they run the same guard,
buildOrderTypeMap, buildClobPairIdToMarket, computeTradeHistory,
paginateTradeHistory and assemble the response; extract that into a private
helper (e.g. private async buildTradeHistoryResponse(fills: Fill[],
subaccountIdToNumber: Record<string,number>, limit?: number, page?: number):
Promise<TradeHistoryResponse>) and have getTradeHistory and
getTradeHistoryForParentSubaccount call it. Move the fills.length >
TRADE_HISTORY_MAX_FILLS guard, calls to buildOrderTypeMap,
buildClobPairIdToMarket, computeTradeHistory, paginateTradeHistory and the
response construction into that helper so only FillTable retrieval and
subaccountId→number mapping remain in each public method.
… in internal mainnet
cc @dwjanus

https://github.com/Mergifyio backport release/indexer/v9.x
✅ Backports have been created.

Cherry-pick of 59cd776 has failed. To fix up this pull request, you can check it out locally. See documentation: https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/reviewing-changes-in-pull-requests/checking-out-pull-requests-locally
Changelist
- Moved the `buildClobPairIdToMarket()` helper from fills-controller into `lib/helpers.ts` to avoid duplication
- Fixed `randomInt()` usage that caused clientId collisions in integration tests
- Added a `GET /v4/tradeHistory` endpoint with two routes:
  - `GET /` — trade history for a single subaccount
  - `GET /parentSubaccountNumber` — trade history for a parent subaccount (includes child/isolated subaccounts)
- `lib/trade-history.ts` computes trade history rows on-the-fly from fills:
  - cross-zero fills are split into `CLOSE` + `OPEN` rows
  - action types: `OPEN, EXTEND, PARTIAL_CLOSE, CLOSE, LIQUIDATION_PARTIAL_CLOSE, LIQUIDATION_CLOSE`

Example: User Trading Activity on BTC-USD
Trade History Cards Produced (most recent first):
How the numbers work:
Row 1: Open long 10 @ $100. entryPrice = $100.
Row 2: Extend by 10 @ $120. entryPrice = weighted avg = (100 * 10 + 120 * 10) / 20 = $110.
Row 3: Partial close 10 @ $130. PnL = (130 - 110) * 10 = $200. Cost basis = 110 * 10 = $1,100. Percent = 200/1100 = 18.18%.
Row 4: Close remaining 10 @ $90. PnL = (90 - 110) * 10 = -$200. Cumulative = 200 + (-200) = $0. Cum cost basis = 1100 + 1100 = $2,200. Percent = 0/2200 = 0%. Lifecycle resets.
Row 5: Open short 5 @ $85. Fresh lifecycle -- netRealizedPnl = 0, netRealizedPnlPercent = null.
Row 6: Close short 5 @ $80. PnL = (85 - 80) * 5 = $25. Cost basis = 85 * 5 = $425. Percent = 25/425 = 5.88%.
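The arithmetic behind these rows can be sketched with two small helpers. These are hypothetical illustration names; the PR's actual logic lives in `lib/trade-history.ts`:

```typescript
// Weighted-average entry price when extending a position (as in Row 2).
function weightedEntry(
  prevEntry: number, prevSize: number, fillPrice: number, fillSize: number,
): number {
  return (prevEntry * prevSize + fillPrice * fillSize) / (prevSize + fillSize);
}

// Realized PnL on a (partial) close; shorts profit when price falls.
function realizedPnl(
  entry: number, exitPrice: number, size: number, side: 'LONG' | 'SHORT',
): number {
  return side === 'LONG' ? (exitPrice - entry) * size : (entry - exitPrice) * size;
}
```

For the example above: `weightedEntry(100, 10, 120, 10)` gives 110, `realizedPnl(110, 130, 10, 'LONG')` gives 200, and `realizedPnl(85, 80, 5, 'SHORT')` gives 25, matching Rows 2, 3, and 6.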
Key behaviors:
- `entryPrice` updates on OPEN (set) and EXTEND (weighted average), stays fixed on partial/full close
- `netRealizedPnl` and `netRealizedPnlPercent` are cumulative within a position lifecycle, reset to 0/null when the position goes flat
- `netRealizedPnlPercent` = cumPnl / cumCostBasis, where cumCostBasis = sum(entryPrice * closingAmount) across all closes
- `netRealizedPnlPercent` is null on OPEN/EXTEND rows that have no prior closes in the lifecycle

Test Plan
Author/Reviewer Checklist
- `state-breaking` label.
- `indexer-postgres-breaking` label.
- If the change affects `PrepareProposal` or `ProcessProposal`, manually add the label `proposal-breaking`.
- `feature:[feature-name]` label.
- `backport/[branch-name]` label.
- `refactor`, `chore`, or `bug` label.
New Features
Documentation
Tests