apps/dashboard/src/app/(app)/(dashboard)/(chain)/[chain_id]/[contractAddress]/analytics/utils/contract-event-breakdown.ts
@@ -27,8 +27,8 @@ export async function getContractEventBreakdown(params: {
   endDate?: Date;
 }): Promise<EventBreakdownEntry[]> {
   const queryParams = [
-    `chain=${params.chainId}`,
-    "group_by=time",
+    `chain_id=${params.chainId}`,
+    "group_by=block_timestamp",
     "group_by=topic_0 as event_signature",
     "aggregate=toStartOfDay(toDate(block_timestamp)) as time",
     "aggregate=count(*) as count",

Comment on lines +30 to 34
⚠️ Potential issue

Same grouping issue: per-day breakdown should group by the day expression, not raw block_timestamp.

Group by the day bucket (aliased time) alongside event_signature, and remove the separate aggregate for time.

-    `chain_id=${params.chainId}`,
-    "group_by=block_timestamp",
-    "group_by=topic_0 as event_signature",
-    "aggregate=toStartOfDay(toDate(block_timestamp)) as time",
+    `chain_id=${params.chainId}`,
+    "group_by=toStartOfDay(toDate(block_timestamp)) as time",
+    "group_by=topic_0 as event_signature",

Optionally normalize count:

-    "aggregate=count(*) as count",
+    "aggregate=count() as count",
📝 Committable suggestion


Suggested change
-    `chain_id=${params.chainId}`,
-    "group_by=block_timestamp",
-    "group_by=topic_0 as event_signature",
-    "aggregate=toStartOfDay(toDate(block_timestamp)) as time",
-    "aggregate=count(*) as count",
+    `chain_id=${params.chainId}`,
+    "group_by=toStartOfDay(toDate(block_timestamp)) as time",
+    "group_by=topic_0 as event_signature",
+    "aggregate=count() as count",
🤖 Prompt for AI Agents
In apps/dashboard/src/app/(app)/(dashboard)/(chain)/[chain_id]/[contractAddress]/analytics/utils/contract-event-breakdown.ts around lines 30 to 34, the query groups by raw block_timestamp instead of the day bucket; change grouping to use the day expression (toStartOfDay(toDate(block_timestamp)) as time) and group_by that alias plus topic_0 as event_signature, remove the separate aggregate entry that defines time, and keep aggregate=count(*) as count (or optionally normalize count afterwards).
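
For orientation, here is a minimal sketch (not the repository's actual code) of the breakdown query with the committable suggestion applied. The `.join("&")` assembly is assumed from the getTotalContractEvents hunk further down; also note that the verification thread on the analytics file below questions whether group_by accepts inline expressions, so the alias-based form may be needed here as well.

```ts
// Sketch only: query params for getContractEventBreakdown with the suggestion
// above applied. The join("&") assembly is assumed from the other hunks; if the
// Insight API rejects inline expressions in group_by (see the verification on
// the analytics file below), group by the `time` alias instead.
function buildEventBreakdownQuery(params: { chainId: number }): string {
  return [
    `chain_id=${params.chainId}`,
    "group_by=toStartOfDay(toDate(block_timestamp)) as time", // day bucket
    "group_by=topic_0 as event_signature",
    "aggregate=count() as count",
  ].join("&");
}
```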

apps/dashboard/src/app/(app)/(dashboard)/(chain)/[chain_id]/[contractAddress]/analytics/utils/contract-events.ts
@@ -31,8 +31,8 @@ export async function getContractEventAnalytics(params: {
   endDate?: Date;
 }): Promise<AnalyticsEntry[]> {
   const queryParams = [
-    `chain=${params.chainId}`,
-    "group_by=time",
+    `chain_id=${params.chainId}`,
+    "group_by=block_timestamp",
     "aggregate=toStartOfDay(toDate(block_timestamp)) as time",

Comment on lines +35 to 36
The change from group_by=time to group_by=block_timestamp introduces a potential field naming inconsistency. While the grouping parameter now uses block_timestamp, the aggregation still creates a time alias with toStartOfDay(toDate(block_timestamp)) as time.

This mismatch between the grouping field and aggregation alias might cause issues if the backend expects consistent naming between these clauses. Consider either:

  1. Updating the aggregation to use a matching alias:

    "aggregate=toStartOfDay(toDate(block_timestamp)) as block_timestamp"
    
  2. Or documenting why this apparent inconsistency is intentional and safe in this context

This would ensure the query parameters maintain logical consistency and reduce the risk of unexpected behavior.

Suggested change
-    "group_by=block_timestamp",
-    "aggregate=toStartOfDay(toDate(block_timestamp)) as time",
+    "group_by=block_timestamp",
+    "aggregate=toStartOfDay(toDate(block_timestamp)) as block_timestamp",

Spotted by Diamond


Comment on lines +34 to 36

💡 Verification agent

🧩 Analysis chain

Daily buckets will fragment: grouping by raw block_timestamp yields per-block groups.

If the intent is daily aggregation, group_by=block_timestamp creates one group per block, not per day. Group by the day expression (aliased as time) and drop the redundant aggregate for time.

-    `chain_id=${params.chainId}`,
-    "group_by=block_timestamp",
-    "aggregate=toStartOfDay(toDate(block_timestamp)) as time",
+    `chain_id=${params.chainId}`,
+    "group_by=toStartOfDay(toDate(block_timestamp)) as time",

Optionally normalize count:

-    "aggregate=count(block_timestamp) as count",
+    "aggregate=count() as count",

🌐 Web query:

Does the Insight API support grouping by an aliased expression in the `group_by` param (e.g., `group_by=toStartOfDay(toDate(block_timestamp)) as time`), and should time filters use epoch seconds?

💡 Result:

Short answer: No — the group_by parameter must reference an existing field/alias (or the API's documented filter_alias), not an arbitrary SQL expression with an inline alias. And time filters depend on the provider: some Insight APIs (e.g. Dune) expect date/time strings (use timestamp/date strings in filters), while others (e.g. AWS X‑Ray Insights) require Unix epoch seconds — check the specific API docs. [1][2]

Sources:

  • Dune API — filtering / result parameters (dates used as strings; group/filter operate on columns). [1]
  • AWS X‑Ray GetInsightImpactGraph — start/end times in Unix time seconds. [2]

Group by existing alias, not inline SQL
Insight API only accepts columns or documented aliases in group_by. To bucket by day, keep aggregate=toStartOfDay(toDate(block_timestamp)) as time and change "group_by=block_timestamp" to "group_by=time" (or use the provider’s filter_alias for daily grouping). Optionally normalize counts: use aggregate=count() as count.
apps/dashboard/src/app/(app)/(dashboard)/(chain)/[chain_id]/[contractAddress]/analytics/utils/contract-events.ts:34-36
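
To make this concrete, here is a minimal sketch (not the repository's actual code) of the analytics query built the way the resolution describes, with group_by referencing the aggregate's time alias. The date-filter parameter name and epoch-seconds format are assumptions; as the web-query result notes, the expected time format is provider-specific and should be checked against the Insight API docs.

```ts
// Sketch under the assumption that the Insight API resolves `group_by=time`
// against the alias defined by the aggregate clause, as suggested above.
function buildEventAnalyticsQuery(params: {
  chainId: number;
  startDate?: Date;
}): string {
  const queryParams = [
    `chain_id=${params.chainId}`,
    "group_by=time", // references the alias produced by the aggregate below
    "aggregate=toStartOfDay(toDate(block_timestamp)) as time",
    "aggregate=count() as count", // normalized count, per the optional suggestion
  ];
  if (params.startDate) {
    // Hypothetical filter name and epoch-seconds value; verify the actual
    // parameter name and time format against the Insight API documentation.
    queryParams.push(
      `filter_block_timestamp_gte=${Math.floor(params.startDate.getTime() / 1000)}`,
    );
  }
  return queryParams.join("&");
}
```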

🤖 Prompt for AI Agents
In apps/dashboard/src/app/(app)/(dashboard)/(chain)/[chain_id]/[contractAddress]/analytics/utils/contract-events.ts around lines 34-36, the query uses an inline expression in group_by which the Insight API rejects; keep the aggregate alias aggregate=toStartOfDay(toDate(block_timestamp)) as time and replace "group_by=block_timestamp" with "group_by=time" (or the provider’s daily filter_alias), and optionally change/augment aggregates to include a normalized count like aggregate=count() as count if you need per-day counts.

"aggregate=count(block_timestamp) as count",
params.startDate
@@ -20,7 +20,7 @@ export async function getTotalContractEvents(params: {
   chainId: number;
 }): Promise<{ count: number }> {
   const queryParams = [
-    `chain=${params.chainId}`,
+    `chain_id=${params.chainId}`,
     "aggregate=count(block_number) as total",
   ].join("&");
