diff --git a/CLAUDE.md b/CLAUDE.md index 9523981..ed0c470 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -11,6 +11,8 @@ When the user makes a request, identify which agent should handle it, read its S | If the user asks to... | Activate | Skill file | |---|---|---| | Create a dashboard, build a dashboard, set up analytics for a feature | **Dashboard Builder** | `agents/dashboard-builder/SKILL.md` | +| Quick enterprise usage check (SQL-only, no Hex trends or Gong needed) — enterprise usage, account health, how enterprise accounts are doing | **Enterprise Reporter** | `agents/enterprise-reporter/SKILL.md` | +| Full weekly enterprise report with Hex dashboard trends, comprehensive enterprise analysis with Gong + Hex + BigQuery, enterprise weekly report with 4-week trends | **Enterprise Weekly Report** | `agents/enterprise-weekly-report/SKILL.md` | If the request doesn't clearly match an agent, ask the user which they need. @@ -42,10 +44,10 @@ Every agent reads from these files. They are the single source of truth — do n 12. **Validate column names against actual schema** before running queries. Documentation may be outdated. 13. **Check data quality during query execution.** Inspect results for NULLs, zero/low row counts, date gaps, invalid values (negative counts, retention increases, funnel violations), and schema mismatches. Report issues to users with clear warnings. 14. **Always use EXACT segmentation CTEs from `shared/bq-schema.md`** without modification. Never simplify or skip steps: - - **Enterprise Users:** Must use the two-step pattern (inner CTE + outer SELECT with " Pilot" suffix logic) + - **Enterprise Users:** Must use the two-step pattern (`ent_agg` CTE from `ltxstudio_users` + `enterprise` CTE joining `ltxstudio_enterprise_users`). Includes `user_type` (SSO/Code). See bq-schema.md Enterprise Users section. - **Heavy Users:** Must include all filters (4+ weeks active, token consumption, etc.) 
- **Full Segmentation:** Must follow hierarchy (Enterprise → Heavy → Paying → Free) - - Copy the entire CTE structure from `bq-schema.md` lines 441-516, do not improvise or simplify + - Copy the entire CTE structure from `bq-schema.md`, do not improvise or simplify ## MCP Connections diff --git a/agents/enterprise-weekly-report/SKILL.md b/agents/enterprise-weekly-report/SKILL.md new file mode 100644 index 0000000..7a0be3f --- /dev/null +++ b/agents/enterprise-weekly-report/SKILL.md @@ -0,0 +1,469 @@ +--- +name: enterprise-weekly-report +description: Generate a comprehensive weekly enterprise usage analysis report for LTX Studio enterprise accounts. Combines BigQuery metrics with Hex dashboard trends and Gong call intelligence. Use when asked to produce weekly enterprise reports, analyze enterprise user engagement, or monitor enterprise account health. +compatibility: + - BigQuery MCP tool connected + - Hex MCP tool connected + - Access to ltx-dwh-prod-processed BigQuery project +--- + +# Enterprise Weekly Report + +## When to Use + +- "Run the weekly enterprise report" +- "How are enterprise users doing this week?" +- "Generate enterprise usage analysis" +- "Enterprise account health check" +- "Weekly enterprise update" + +## When NOT to Use + +- Feature-specific dashboard creation (use `dashboard-builder` agent) +- Ad-hoc SQL queries (use BigQuery MCP directly) +- Self-serve/free user analysis (this agent focuses on enterprise accounts) + +## What It Produces + +A structured weekly report covering all enterprise accounts (annual + pilot) with: + +1. **Executive Summary** — Key takeaways and alerts per org +2. **Cross-Account KPI Table** — WAU, generations, tokens, downloads, WoW change, health signal +3. **Active Users** — Weekly active users per org with WoW trends +4. **Generation Activity** — Image/video volumes, press counts, per-user averages +5. **Model Distribution** — Which AI models each org uses +6. 
**Token Consumption** — Total tokens per org, top consumers +7. **Download & Export Activity** — Output utilization rates +8. **Feature Adoption** — Elements, Script-to-Storyboard, Gen Space, Retake usage +9. **User Health Signals** — New, returning, reactivated, and churned users +10. **Daily Active Users** — Day-by-day engagement pattern within the week +11. **Enterprise Call Intelligence** — Gong call sentiment, feature interest, pain points, competitors +12. **Top Users** — Highest-engagement users per org (with email and user type) +13. **Risk Alerts** — Declining usage, inactive orgs, churned power users +14. **Feedback History Context** — Historical qualitative insights correlated with current trends +15. **4-Week Trends** — Extended trend context from Hex dashboard + +## Flow + +``` +PHASE 1A: SETUP → Determine date range, verify data freshness, read context files + ↓ +PHASE 1B: SLACK CONTEXT → Read ltx-studio-enterprise + ltx-studio-operations (last 7 days) + ↓ +PHASE 2: QUERY → Run SQL queries via BigQuery MCP + Hex dashboard queries via Hex MCP + ↓ +PHASE 3: ANALYZE → Interpret results, detect anomalies, cross-reference qualitative data + ↓ +PHASE 4: REPORT → Generate 4 output formats (Full / Product / CSM / Slack Update) + ↓ +PHASE 5: DELIVER → Present all 4 formats to the user +``` + +--- + +## Phase 1A: Setup + +1. **Determine the reporting week:** + - Default: Last complete ISO week (Monday–Sunday, excluding today) + - Calculate: `report_end_date = last Sunday`, `report_start_date = report_end_date - 6 days` + - Previous week for WoW comparison: `prev_start = report_start_date - 7`, `prev_end = report_start_date - 1` + +2. **Confirm date range with user:** + > "Running enterprise report for **{Mon YYYY-MM-DD} – {Sun YYYY-MM-DD}** (vs prior week {Mon} – {Sun}). Proceed?" + +3. 
**Verify data freshness:** + Run a quick freshness check using the BigQuery MCP tool: + ```sql + SELECT MAX(DATE(action_ts)) AS latest_data + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` + WHERE action_ts >= TIMESTAMP(DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY)) + ``` + If `latest_data` < yesterday, warn: "Data may be incomplete — latest data is from {date}." + +4. **Read context files:** + - `shared/bq-schema.md` — BigQuery tables, columns, segmentation CTEs (use exact enterprise CTE) + - `shared/metric-standards.md` — How every metric is calculated (use exact SQL patterns) + - `shared/event-registry.yaml` — Known events per feature (never invent event names) + - `shared/product-context.md` — LTX products, user types, business model + - [references/enterprise-context.md](references/enterprise-context.md) — Org list, plan types, Hex dashboard details, benchmarks + - [templates/weekly-report.md](templates/weekly-report.md) — Output format + +--- + +## Phase 1B: Slack Context + +Read the last 7 days of messages from both channels using the Slack MCP tool. + +| Channel | ID | Purpose | What to extract | +|---------|-----|---------|-----------------| +| `#ltx-studio-enterprise` | `C08LS9AEGQK` | CSM updates, deal news, org-level signals | Org mentions, sentiment, blockers, expansions, churn signals, CSM notes | +| `#ltx-studio-operations` | `C08EGU3HRM4` | Automated GitHub/platform updates | Deployments, incidents, feature releases, infra changes that may affect usage | + +**How to use this data:** +- `ltx-studio-enterprise` → real-time qualitative context per org. Cross-reference with usage trends in Phase 3. +- `ltx-studio-operations` → note any deployments or incidents during the report week. Flag in the report if a technical event correlates with a usage anomaly (e.g., a drop during a known incident). + +**If Slack MCP is unavailable:** skip this phase and note "Slack context unavailable" in the report. 
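The week-window arithmetic in Phase 1A step 1 can be sketched as follows (a minimal Python sketch; `reporting_window` is an illustrative name, not part of the toolchain):

```python
from datetime import date, timedelta

def reporting_window(today: date):
    """Last complete Mon-Sun week strictly before `today`, plus the prior week for WoW.

    Illustrative helper only; the agent normally computes these dates inline.
    """
    # date.weekday(): Monday=0 ... Sunday=6.
    # Most recent Sunday strictly before today (today itself is excluded).
    days_back = (today.weekday() + 1) % 7 or 7
    report_end = today - timedelta(days=days_back)    # last Sunday
    report_start = report_end - timedelta(days=6)     # the preceding Monday
    prev_end = report_start - timedelta(days=1)       # Sunday of the prior week
    prev_start = report_start - timedelta(days=7)     # Monday of the prior week
    return report_start, report_end, prev_start, prev_end
```

Note that when `today` is a Sunday the current week is still incomplete (today is excluded), so the window falls back to the previous full Monday-Sunday week.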
+ +--- + +## Phase 2: Query + +### 2A: BigQuery Queries (SQL files) + +Run each query from the `sql/` directory via the BigQuery MCP tool (`run_query`). All queries can run in parallel. + +| # | Query File | Section | Priority | +|---|-----------|---------|----------| +| 1 | [sql/01_active_users.sql](sql/01_active_users.sql) | Active Users by Org (WoW) | HIGH | +| 2 | [sql/02_generation_activity.sql](sql/02_generation_activity.sql) | Image/Video Generation Counts | HIGH | +| 3 | [sql/03_model_distribution.sql](sql/03_model_distribution.sql) | Model Usage Distribution | HIGH | +| 4 | [sql/04_token_consumption.sql](sql/04_token_consumption.sql) | Token Consumption per Org | HIGH | +| 5 | [sql/05_top_users.sql](sql/05_top_users.sql) | Top Users per Org | HIGH | +| 6 | [sql/06_downloads_exports.sql](sql/06_downloads_exports.sql) | Download/Export Activity | MID | +| 7 | [sql/07_feature_adoption.sql](sql/07_feature_adoption.sql) | Feature Adoption Metrics | MID | +| 8 | [sql/08_user_health.sql](sql/08_user_health.sql) | New/Returning/Churned Users | MID | +| 9 | [sql/09_daily_active_users.sql](sql/09_daily_active_users.sql) | DAU per Org (Mon–Sun) | MID | +| 10 | [sql/10_enterprise_calls.sql](sql/10_enterprise_calls.sql) | Gong Call Details per Org | HIGH | +| 11 | [sql/11_call_sentiment_summary.sql](sql/11_call_sentiment_summary.sql) | Call Sentiment & Feature Aggregation | MID | + +**BigQuery execution rules:** +- Replace `@report_start_date`, `@report_end_date`, `@prev_start_date`, `@prev_end_date` with actual dates before executing +- Queries 01–09 use the enterprise users CTE — see `shared/bq-schema.md` for the canonical version +- Queries 10–11 query `ltxstudio_enterprise_calls` directly (UNNEST `organization_names`). Org names in this table are raw (e.g., "Monster"), not suffixed (e.g., "Monster Pilot") — map accordingly. 
+- If a query fails, log the error and continue with the remaining queries (HIGH priority failures still block Phase 3; see the rules below)
+
+**🛑 BigQuery is required — do not proceed without it:**
+- Before running queries, verify BigQuery access by running a quick test: `SELECT 1` via the BQ MCP or Python client (`google.cloud.bigquery` with `project='ltx-dwh-explore'` as billing project).
+- If BigQuery is unavailable (access denied, MCP not connected, etc.) — **STOP**. Do not fall back to Hex-only data. Report the blocker to the user and ask them to resolve access before continuing.
+- All HIGH priority queries (01, 02, 03, 04, 05, 10) must return results before moving to Phase 3. If any HIGH priority query returns zero rows (not an error — zero rows), flag it explicitly and ask the user whether to continue.
+- MID priority queries (06, 07, 08, 09, 11) may be skipped if they error, but the gap must be noted in the report.
+
+**Data quality checks after each query:**
+- Zero rows → warn, do not block report
+- Orgs missing from current week but present in prior week → flag as "silent this week"
+- Any org with NULL WAU → flag as no activity
+- Query 10: `organization_names` from Gong may not exactly match enterprise CTE org names — reconcile by fuzzy-matching when writing the report
+
+**How to use Gong call data in the report:**
+- Surface `call_spotlight_brief` as the call summary per org
+- Flag any `customer_pain_point`, `primary_objection`, or `missing_features` as CSM action items
+- Use `customer_sentiment` to inform the health signal narrative
+- Note any `competitors_mentioned` — relevant for the account team
+- Use `next_steps` directly in the Recommendations section
+- Feature mention flags (`mentioned_elements_feature`, etc.)
link call topics to usage data + +### 2B: Hex Dashboard Queries (via Hex MCP) + +After running BigQuery queries, use the Hex MCP to pull supplementary data from the [LTX Studio Enterprise Dashboard](https://app.hex.tech/lightricks_prod/app/LTX-Studio---Enterprise-Dashboard-031766Suw1qd6T5K733wDu/latest). + +**Available tools:** + +| Tool | Purpose | Usage | +|------|---------|-------| +| `search_projects` | Find the Enterprise Dashboard project | `{"query": "LTX Studio Enterprise Dashboard"}` | +| `create_thread` | Ask a data question — Hex agent queries the warehouse and returns analysis | `{"prompt": "your question"}` | +| `get_thread` | Poll for thread results (threads take 2-5 min) | `{"id": "thread-id"}` | +| `continue_thread` | Ask follow-up on existing thread | `{"id": "thread-id", "prompt": "follow-up"}` | + +**Dashboard project ID:** `01997093-d0d2-700f-9dcc-e624403815fe` + +**Hex query workflow:** +1. Create a thread with a targeted question about enterprise data +2. Poll `get_thread` every 20-30 seconds until the thread reaches IDLE status (check at least 10 times) +3. Use `continue_thread` for follow-up analysis on the same data +4. 
The Hex agent has direct access to BigQuery and can produce charts, tables, and insights + +**Recommended Hex queries to supplement BigQuery data:** + +| # | Thread Prompt | Purpose | When to Use | +|---|--------------|---------|-------------| +| H1 | "Show weekly active users, image/video generations, and token consumption for Annual Enterprise orgs for the last 4 weeks with WoW changes" | 4-week trend context | Always — provides trend depth beyond 2-week BigQuery queries | +| H2 | "Show weekly active users, image/video generations, tokens, and downloads for all Pilot orgs for the last 4 weeks" | Pilot trend context | Always — surfaces new/inactive pilot orgs | +| H3 | "Show model distribution by org — which AI models are used and their share of total generations" | Model mix detail | When model trends need deeper analysis | +| H4 | "Show user health: new, returning, reactivated, churned users per org with retention rates" | Retention analysis | When user health signals need investigation | +| H5 | "Show per-user generation detail: user, org, model, image/video counts, tokens consumed" | User-level drill-down | When investigating specific user behavior | + +**Hex query rules:** +- Threads take 2-5 minutes — create threads for H1 and H2 in parallel, then poll both +- Use `continue_thread` for follow-ups instead of creating new threads +- Hex agent uses `griffin_enterprise_name_at_action` for org resolution (may differ from CTE — see [references/enterprise-context.md](references/enterprise-context.md)) +- Hex agent has PII restrictions — user emails may not be available; it uses `lt_id` + `email_domain` instead +- Compare Hex results with BigQuery query results to validate consistency + +--- + +## Phase 3: Analyze + +### Cross-Account Totals +Compute aggregate totals across all enterprise orgs: +- Total enterprise WAU (unique users, all orgs combined) +- Total generation presses (video + image) +- Total token consumption +- Total downloads +- WoW change % for each 
(use `SAFE_DIVIDE`) + +### Per-Account Analysis +For each org, compute: +- **WAU** (current) and **WoW change** (`(current - prior) / prior * 100`) +- **Generation presses** (video vs image split) +- **Tokens consumed** and WoW trend +- **Downloads** (video vs image split) +- **Feature breadth** (how many distinct features used) +- **Health signal**: assign based on WAU WoW change + +Health signal rules: + +| Signal | Condition | +|--------|-----------| +| 🟢 Trending Up | WAU WoW > +10% | +| 🟡 Stable | WAU WoW between -10% and +10% | +| 🔴 Trending Down | WAU WoW < -10% | +| ⚫ Silent | 0 WAU this week (had activity in prior week) | +| 🆕 New | 0 WAU in prior week, activity this week | + +### Anomaly Detection +Flag and highlight the following: +1. **Large WAU drop:** Any org with WoW WAU change < -25% +2. **Large WAU spike:** Any org with WoW WAU change > +50% +3. **Silent accounts:** Orgs with 0 active users this week that had users last week +4. **New accounts:** Orgs with 0 users last week but active this week +5. **Generation collapse:** Video/image generation down >50% WoW for an org +6. **Token spike:** Org tokens up >100% WoW (check for batch jobs or anomalies) +7. **Pilot with <3 active users** (WARNING) +8. **Power user churned** — top user last week inactive this week (NOTE) + +### Qualitative Cross-References +1. **Gong calls:** + - Negative sentiment calls — correlate with usage drops + - Competitor mentions — potential churn risk + - Missing feature requests — connect to product roadmap + - Positive/Excited sentiment — reinforce expansion opportunity +2. 
**Hex dashboard data:** + - Compare BigQuery results with Hex thread results for consistency + - Use 4-week trends from Hex to contextualize the current week's WoW changes + - Hex provides additional depth on model distribution and user-level patterns + - Flag any discrepancies between BigQuery and Hex data (may indicate data freshness differences) + +--- + +## Phase 4: Report + +Generate **4 output formats** from the same underlying data. + +**Output:** All 4 formats presented inline in chat — no files saved. Each format is clearly labeled and separated. + +--- + +### Format 1 — Full Analyst Report + +**Audience:** Analytics team +**Data sources:** All — BQ queries 01–11, Hex H1–H5, both Slack channels + +``` +# Enterprise Weekly Report — YYYY-MM-DD + +## Executive Summary +(2-4 bullet points — most important findings across all orgs) + +## Cross-Account KPI Table +(WAU, image gens, video gens, tokens, downloads — current + WoW % — health signal per org) + +## Notable Changes & Anomalies +(Flagged anomalies with context — spike/drop/silent/new orgs) + +## Per-Org Sections +(One section per org, ordered: Annual first, then Pilots by activity) +For each org: + - KPI row: WAU, gens, tokens, downloads, WoW changes, health signal + - Generation breakdown: image vs video, press count vs output count + - Gong call summary: sentiment, objections, missing features, next steps + - Slack signals: any mentions in #ltx-studio-enterprise this week + - Narrative: 2-3 sentences — what changed, why it matters + +## Feature Adoption Snapshot +(Elements, Script-to-Storyboard, Gen Space, Retake — per org, WoW) + +## Risk Alerts +(Silent orgs, declining pilots, churned power users, competitor mentions) + +## Recommendations +(One bullet per org that needs action) + +## Appendix A: Top Users +(Per org — lt_id, email_domain, user_type, generation count) + +## Appendix B: Qualitative Context +(#ltx-studio-enterprise Slack summary — notable mentions, CSM signals, deal updates) + +## 
Appendix C: 4-Week Trends +(WAU trends from Hex H1+H2, model distribution from H3) +``` + +--- + +### Format 2 — Product Report + +**Audience:** Product team +**Data sources:** BQ query 07 (features) + BQ query 03 (models) + BQ query 11 (Gong aggregated) + Hex H3 + `#ltx-studio-operations` +**Focus:** Feature adoption and user feedback — no revenue/commercial framing + +``` +# Enterprise Product Insights — YYYY-MM-DD + +## Feature Adoption This Week +(Elements, Script-to-Storyboard, Gen Space, Retake — per org, WoW change) + +## Model Distribution Trends +(Which models gaining/losing share, any notable shifts — from BQ query 03 + Hex H3) + +## Feature Requests from Gong +(Aggregated missing_features + mentioned_*_feature fields — what users are asking for) + +## Notable User Behaviors +(e.g., power users switching models, new feature adoption spikes, unusual patterns) + +## Technical Context +(Deployments, incidents, feature releases from #ltx-studio-operations this week — +flag if any correlate with usage changes) +``` + +--- + +### Format 3 — CSM / Sales Report + +**Audience:** Sales and Customer Success Managers +**Data sources:** BQ queries 01, 02, 04, 08 (usage/health) + BQ query 10 (Gong per call) + `#ltx-studio-enterprise` +**Focus:** Account health and actionable signals — plain language, no raw SQL + +``` +# Enterprise Account Health — YYYY-MM-DD + +## Account Health Overview +(KPI table: org | WAU | WoW | tokens | health signal 🟢🟡🔴⚫🆕) + +## Per-Org Narrative +(One paragraph per org — what happened, what it means, what to do next. +Cite Gong sentiment + objections + next steps. Include Slack signals where relevant.) 
+ +## Risk Accounts +(Orgs with ⚫ Silent / 🔴 Trending Down / <3 active users — prioritized) + +## Expansion Opportunities +(Orgs with 🟢 Trending Up / positive Gong sentiment / high feature breadth) + +## Recommended CSM Actions +(One bullet per at-risk or high-opportunity org) +``` + +--- + +### Format 4 — Slack Update (mrkdwn, in chat) + +**Audience:** Entire company +**Data sources:** Executive summary from Format 1 +**Format:** Slack mrkdwn — no markdown tables, use `*bold*` not `**bold**` + +``` +📊 *Enterprise Weekly Update — Week of {Mon DD MMM}* + +*TL;DR:* {1 sentence — the single biggest takeaway this week} + +*Highlights:* +• *{Org}:* {what happened — 1 line} +• *{Org}:* {what happened — 1 line} +• *{Org}:* {what happened — 1 line} +(max 5 highlights — only the most notable) + +*Watch list:* {1-2 orgs to keep an eye on and why} + +_Full report available on request_ +``` + +--- + +**Shared formatting rules (all formats):** +- WoW arrows: ↑ increase, ↓ decrease, → stable (±5%) +- Percentages: 1 decimal place with `%` suffix +- Token counts: K/M suffixes (e.g., 1.2M tokens) +- Health signals: 🟢 Trending Up | 🟡 Stable | 🔴 Trending Down | ⚫ Silent | 🆕 New +- Annual orgs first, then Pilots ordered by WAU descending +- DO NOT invent interpretations not supported by the data + +--- + +## Phase 5: Deliver + +Output all 4 formats inline in the chat, one after another, each with a clear header: + +``` +--- +## 📊 Full Analyst Report +[full report content] + +--- +## 🛠️ Product Report +[product report content] + +--- +## 🤝 CSM / Sales Report +[CSM report content] + +--- +## 💬 Slack Update (copy-paste ready) +[Slack mrkdwn content] +``` + +Offer to run a deep-dive for a specific org if the user asks. + +--- + +## Rules + +1. **Always use the exact enterprise user CTE** from `shared/bq-schema.md`. Never simplify or approximate it. The CTE joins `ltxstudio_users` (for attributes) with `ltxstudio_enterprise_users` (for membership). +2. 
**Never invent org names.** Only use orgs from the enterprise CTE output. +3. **Always exclude** Lightricks, Popular Pays, and None from enterprise users. +4. **Always apply the McCann split** — McCann_NY and McCann_Paris are distinct orgs. +5. **Annual vs Pilot** — Annual orgs are: Indegene, HearWell_BeWell, Novig, Cylndr Studios, Miroma, Deriv, McCann_Paris. All others get " Pilot" suffix. +6. **Use `action_name_detailed`** for filtering generation events, never `action_name`. +7. **Report both press counts and output counts** for generations. +8. **Split image and video metrics** — never combine them. +9. **Exclude the current incomplete day** from all queries. +10. **Use `lt_id`** as the user identifier. Never use `anonymous_id`. +11. **`ltxstudio_user_all_actions` already excludes LT team** — no extra filter needed. +12. **Include TIMESTAMP partition pruning filters** on `action_ts` for performance (never filter on `DATE(action_ts)` alone). +13. **Always use `SAFE_DIVIDE(x, y) * 100`** for percentages and WoW changes. +14. **Never use staging tables** (no table with `stg` in the name). +15. **UNNEST `organization_names`** when querying `ltxstudio_enterprise_calls` — it's a REPEATED field. +16. **Org name mapping for Gong calls** — `ltxstudio_enterprise_calls.organization_names` uses raw org names (e.g., "Monster"), while the enterprise users CTE adds a " Pilot" suffix (e.g., "Monster Pilot"). Map raw names to CTE equivalents. Fuzzy-match when needed (e.g., "Cylndr Studios" ↔ "Cylndr"). +17. **Blend quantitative and qualitative** — every org narrative should cite Gong call insights when available, and cross-reference feedback history for known issues. +18. **Use Hex dashboard for trend depth** — BigQuery queries cover current + previous week; Hex threads provide 4-week trends for context. Always run at least H1 and H2 threads. +19. 
**Hex org name differences** — The Hex agent may use `griffin_enterprise_name_at_action` or raw `organization_name`, which can differ from the CTE's org classification (e.g., "Cylndr Studios" may show as "None" in `griffin_enterprise_name_at_action`). Reconcile when comparing. +20. **Dashboard link in report** — Always include the dashboard link in the report header: `https://app.hex.tech/lightricks_prod/app/LTX-Studio---Enterprise-Dashboard-031766Suw1qd6T5K733wDu/latest` +21. **Confirm the date range** before running queries. +22. **Flag data quality issues** — do not silently skip orgs with missing data. +23. **Validate column names against actual schema** (`shared/bq-schema.md`) before running queries. +24. **BigQuery is a hard dependency.** Never generate a report from Hex-only data. If BQ is unavailable, stop and surface the blocker to the user. + +## Data Sources + +| Source | Type | Purpose | +|--------|------|---------| +| `ltx-dwh-prod-processed.web.ltxstudio_users` | BigQuery | User attributes, enterprise name resolution | +| `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` | BigQuery | Enterprise user membership filter | +| `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` | BigQuery | Event-level actions (LT team pre-excluded) | +| `ltx-dwh-prod-processed.web.ltxstudio_enterprise_calls` | BigQuery | Gong call data with AI-analyzed insights | +| [LTX Studio Enterprise Dashboard](https://app.hex.tech/lightricks_prod/app/LTX-Studio---Enterprise-Dashboard-031766Suw1qd6T5K733wDu/latest) | Hex Dashboard | Interactive dashboard with 5 tabs: Annual Enterprise Data, Pilot Enterprise Data, Image Generations, Video Generations, Downloads/Exports. Provides 4-week trends, model mix charts, per-user drill-downs. 
| + +## Reference Files + +| File | Read when | +|------|-----------| +| `shared/bq-schema.md` | Phase 1 — for enterprise segmentation CTEs and table schemas | +| `shared/metric-standards.md` | Phase 1 — for metric definitions | +| `shared/event-registry.yaml` | Phase 1 — for event name validation | +| `shared/product-context.md` | Phase 1 — for LTX product context | +| [references/enterprise-context.md](references/enterprise-context.md) | Phase 1 — for org definitions, Hex dashboard details, and benchmarks | +| [references/enterprise-dashboard-spec.md](references/enterprise-dashboard-spec.md) | Phase 2B — for Hex dashboard structure, tabs, metrics, and query templates | +| [templates/weekly-report.md](templates/weekly-report.md) | Phase 4 — for output format | +| [sql/*.sql](sql/) | Phase 2A — SQL for each metric | diff --git a/agents/enterprise-weekly-report/references/enterprise-context.md b/agents/enterprise-weekly-report/references/enterprise-context.md new file mode 100644 index 0000000..1824dbd --- /dev/null +++ b/agents/enterprise-weekly-report/references/enterprise-context.md @@ -0,0 +1,162 @@ +# Enterprise Context + +## Enterprise Users CTE + +The canonical enterprise users CTE lives in `shared/bq-schema.md` (Enterprise Users section). **Always use that exact CTE verbatim** — do not simplify or rewrite it. + +The CTE produces: `lt_id`, `org` (with " Pilot" suffix for non-annual orgs), `user_type` (SSO/Code), `email`, `first_active_ts`, `first_active_ts_tokens`. 
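For reference, the suffix logic the CTE applies reduces to a small lookup. A hedged Python sketch, using the annual and excluded org lists documented in this file (`display_org` and the set names are illustrative, not part of any tool):

```python
# Org lists copied from the classification documented in this file.
ANNUAL_ORGS = {"Indegene", "HearWell_BeWell", "Novig", "Cylndr Studios",
               "Miroma", "Deriv", "McCann_Paris"}
EXCLUDED_ORGS = {"Lightricks", "Popular Pays", "None"}

def display_org(raw_name: str):
    """Map a raw org name to its report display name; None means drop it."""
    if raw_name in EXCLUDED_ORGS:
        return None
    if raw_name in ANNUAL_ORGS:
        return raw_name                 # annual orgs keep their name as-is
    return f"{raw_name} Pilot"          # everyone else gets the " Pilot" suffix
```

This is exact-match only: fuzzy cases such as "Cylndr" vs "Cylndr Studios" still require the manual reconciliation described elsewhere in this file.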
+ +### How Org Classification Works + +- **Annual orgs** (hardcoded list): `Indegene`, `HearWell_BeWell`, `Novig`, `Cylndr Studios`, `Miroma`, `Deriv`, `McCann_Paris` — org name used as-is +- **Pilot orgs** (all others): org name gets " Pilot" suffix (e.g., "Monster Pilot", "GM Pilot") +- **McCann split**: `McCann_NY` and `McCann_Paris` are distinct orgs (COALESCE logic) +- **Excluded**: Lightricks, Popular Pays, None +- **User type**: SSO (has `organization_name`) vs Code (redeemed code) + +### Tables Used + +| Table | Purpose | +|-------|---------| +| `ltx-dwh-prod-processed.web.ltxstudio_users` | User attributes, enterprise name resolution | +| `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` | Enterprise user membership (JOIN filter) | +| `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` | Event-level actions (LT team pre-excluded) | +| `ltx-dwh-prod-processed.web.ltxstudio_enterprise_calls` | Gong call data with AI-analyzed insights per org | + +### Enterprise Calls Table (`ltxstudio_enterprise_calls`) + +Contains Gong call recordings matched to enterprise orgs, with AI-analyzed fields: + +**Identifiers & Metadata:** +- `conversation_key` — Natural key for the call +- `call_started_at` — When the call started +- `organization_names` — REPEATED array of org names on the call (use UNNEST to flatten) +- `enterprise_participant_emails` / `enterprise_participant_names` — REPEATED arrays +- `title` — Call title/subject +- `call_spotlight_brief` — Gong AI summary of the call +- `clean_text_transcript` — Full transcript (plain text) + +**AI-Analyzed Fields (all STRING, values include "Parse Error" for failed parsing):** +- `customer_sentiment` — "Positive/Excited", "Neutral/Professional", "Negative/Frustrated" +- `competitors_mentioned` — Comma-separated list or "None" (tracked: Runway, Freepik, Krea, Higgsfield, Weavy, Luma, Kling, Minimax) +- `primary_objection` — "Price", "Missing Feature", "Timing", "Authority", "Security", "None" +- 
`customer_pain_point` — Max 10 words describing the problem +- `missing_features` — Features the customer wishes existed +- `mentioned_*_feature` — "Yes"/"No" for: pitch_deck, collaboration, elements, retake +- `budget_discussion` — Amount/range or "Not Discussed" +- `is_decision_maker` — "Yes", "No", "Unclear" +- `next_steps` — Action items summary + +**Query pattern:** UNNEST `organization_names` to join with org-level data: +```sql +SELECT org_name, ... +FROM `ltx-dwh-prod-processed.web.ltxstudio_enterprise_calls` c, +UNNEST(c.organization_names) AS org_name +WHERE c.call_started_at >= TIMESTAMP(@report_start_date) +``` + +**Important — Org Name Mapping:** +The `organization_names` field contains raw org names (e.g., "Monster", "NBC", "GM") without the " Pilot" suffix that the CTE adds. When correlating Gong call data with usage metrics, map raw names to their CTE equivalents: +- Raw "Monster" = CTE "Monster Pilot" +- Raw "Deriv" = CTE "Deriv" (annual, no suffix) +- Raw "McCann" → depends on McCann_NY / McCann_Paris split + +## Hex Enterprise Dashboard + +**URL:** https://app.hex.tech/lightricks_prod/app/LTX-Studio---Enterprise-Dashboard-031766Suw1qd6T5K733wDu/latest +**Project ID:** `01997093-d0d2-700f-9dcc-e624403815fe` +**MCP Server:** Hex MCP + +The dashboard has 5 tabs providing interactive views of enterprise data: + +| Tab | Scope | Metrics | +|-----|-------|---------| +| Annual Enterprise Data | Annual orgs | WAU, image/video gens, tokens, WoW trends, model mix, feature adoption | +| Pilot Enterprise Data | Pilot orgs | WAU, image/video gens, tokens, downloads, WoW trends | +| Image Generations Data | User-level | Per-user image gen counts, model used, tokens consumed | +| Video Generations Data | User-level | Per-user video gen counts, model used, tokens consumed | +| Download/Exports Data | User-level | Image/video downloads, project exports, download rates | + +**Hex vs BigQuery differences:** +- Hex uses `griffin_enterprise_name_at_action` for org 
resolution — this field is NULL for Cylndr Studios; Hex joins through `ltxstudio_enterprise_users` as fallback +- Hex has PII restrictions — `email` and `full_name` are protected; users identified by `lt_id` + `email_domain` +- Hex provides 4-week trend windows vs BigQuery queries' 2-week window +- Hex org names use raw `organization_name` (e.g., "McCann", "General Motors"), not the CTE's suffixed names (e.g., "McCann Pilot", "General Motors Pilot") + +## Org Reference + +| Org | Type | Notes | +|-----|------|-------| +| McCann_Paris | Annual | Image-heavy (~88% Nano Banana 2), creative agency, 34-42 WAU, dominates token consumption (~73% of annual total) | +| Deriv | Annual | Financial services, 10-18 WAU, declining tokens (818K→410K), needs portrait video support | +| HearWell_BeWell | Annual | Single power user, 23% Veo 3 usage (highest of any org), volatile activity | +| Novig | Annual | Low activity (4 WAU), 86% Nano Banana 2, minimal video | +| Cylndr Studios | Annual | Small team (2-7 WAU), volatile usage, `griffin_enterprise_name_at_action` is NULL | +| Indegene | Annual | Healthcare, most balanced feature adoption (Elements 63%, Storyboard 38%, Retake 38%), ramping image generation | +| Miroma | Annual | Video surge (638% WoW spike), high LTX-2 adoption (14%), Script-to-Storyboard adopter (32%) | +| McCann Pilot | Pilot | Image-heavy (10K+ images), 6-11 WAU, highest download rate, separate from McCann_Paris | +| McCann_NY Pilot | Pilot | Low activity (1 WAU), sporadic usage | +| General Motors Pilot | Pilot | Video-heavy (~6.4M tokens total), VEO-3 dominant, enterprise security key differentiator | +| Meta Pilot | Pilot | High-volume evaluation (~6M tokens), 9-16 WAU, declining trend, split between VEO-3 and LTX-2 | +| Fanatics Pilot | Pilot | Sports collectibles, image-focused (6K images), 1-6 WAU, growing user base | +| Jazz Side Pilot | Pilot | New org, rapid ramp-up (600 video gens last week), high growth trajectory | +| Plarium Pilot | Pilot | Gaming, 
massive onboarding spike (1→30 WAU), recently activated | +| Disney Pilot | Pilot | Broadest pilot user base (7-13 WAU), moderate volumes, exploratory usage | +| EōS Fitness Pilot | Pilot | Print + digital imagery, 3-5 WAU, declining trend | +| Bent Image Lab Pilot | Pilot | Animation studio, 2 WAU, low volume but consistent | +| Bosch Pilot | Pilot | Minimal activity, intermittent usage | +| Comcast Advertising Pilot | Pilot | In training phase, near-zero generation activity | +| Telemundo Pilot | Pilot | Broadcasting, near-zero activity | +| Monster Pilot | Pilot | Energy drinks brand, pilot ended — zero recent activity | +| NBC Universal Pilot | Pilot | Broadcasting, pilot ended — zero recent activity | +| Craft Pilot | Pilot | New, single user, no generations yet | +| Eset Pilot | Pilot | Cybersecurity, pilot stopped | + +## Key Business Context + +- LTX shifted strategy in ~October 2025 to prioritize Enterprise over self-serve +- "Active" for enterprise = generated content (not just page views) +- McCann_Paris is distinctively image-focused (~70 images/user/week vs 5-10 videos) +- Deriv showed strong LTX-2 adoption but needs portrait video support +- Token costs vary significantly by model (Nano Banana 2 >> Flux >> LTX-2) +- Pilot orgs have time-limited evaluations — declining usage is a churn risk +- Gong call data provides qualitative context — sentiment, feature requests, pain points +- Real-time qualitative context comes from `#ltx-studio-enterprise` Slack channel (read last 7 days each report run) + +## Model Distribution Benchmarks (from Hex Dashboard) + +Based on dashboard data, these are the typical model usage patterns: + +| Model Group | Type | Typical Share | Notes | +|-------------|------|---------------|-------| +| Nano Banana 2 | Image | 47-88% of gens | Dominant image model across all orgs | +| Flux (Flux 2 Pro) | Image | 5-36% of gens | #2 image model, used for quality-critical work | +| Z-Image | Image | 1-6% of gens | Newer model, lighter 
usage | +| LTX-2 (Pro/Fast) | Video | 1-16% of gens | In-house video model, strongest at Miroma/Indegene | +| Veo 3 / Veo 3.1 | Video | 1-23% of gens | External video model, highest at HearWell_BeWell | +| Veo 2 | Video | <1% of gens | Legacy, declining | +| LTXV (13b) | Video | <1% of gens | Legacy, declining | + +## Feature Adoption Benchmarks (from Hex Dashboard) + +| Feature | Adoption Range | Benchmark | +|---------|---------------|-----------| +| Gen Space | 63-100% | Core workflow — nearly universal | +| Elements | 20-63% | Moderate — highest at Indegene (63%), lowest at Novig (20%) | +| Script-to-Storyboard | 0-38% | Growing — zero at McCann_Paris/Cylndr, highest at Indegene (38%) | +| Retake | 7-38% | Lowest adoption — highest at Indegene/Cylndr (38%), lowest at Deriv (7%) | + +## Qualitative Data Sources + +1. **Gong calls** (`ltxstudio_enterprise_calls`) — Real-time, queried per reporting week. Shows customer sentiment, competitor mentions, feature interest, objections, and next steps. +2. **Hex Enterprise Dashboard** — Interactive dashboard accessible via Hex MCP. Provides 4-week trend windows, per-user drill-downs, model distribution charts, and feature adoption visuals. Use for trend depth beyond the 2-week BigQuery window and for cross-validation of metrics. 
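On top of these sources, the report boils each org down to a week-over-week change and a coarse health signal, computed from two adjacent date windows. A minimal sketch of that arithmetic (the 7-day window convention and the signal thresholds are illustrative assumptions, not part of this spec):

```python
from datetime import date, timedelta

def report_windows(report_end: date) -> dict:
    """Return the four date parameters the sql/ templates expect, given the
    last (inclusive) day of the reporting week. Assumes 7-day windows."""
    report_start = report_end - timedelta(days=6)
    prev_end = report_start - timedelta(days=1)
    prev_start = prev_end - timedelta(days=6)
    return {
        "report_start_date": report_start.isoformat(),
        "report_end_date": report_end.isoformat(),
        "prev_start_date": prev_start.isoformat(),
        "prev_end_date": prev_end.isoformat(),
    }

def wow_change_pct(current, previous):
    """Week-over-week percentage change; None when there is no baseline,
    mirroring BigQuery's SAFE_DIVIDE."""
    if not previous:
        return None
    return (current - previous) / previous * 100

def health_signal(wow_pct, at_risk=-30.0, surging=30.0):
    """Map a WoW change to a coarse signal. Thresholds are illustrative."""
    if wow_pct is None:
        return "no baseline"
    if wow_pct <= at_risk:
        return "at risk"
    if wow_pct >= surging:
        return "surging"
    return "steady"

# Example: annual-tab WAU from the Hex snapshot, week ending Feb 22, 2026.
params = report_windows(date(2026, 2, 22))      # Feb 16-22 vs Feb 09-15
signal = health_signal(wow_change_pct(75, 78))  # 75 WAU vs 78 WAU -> "steady"
```

Substituting `params` into the `@`-placeholders in the `sql/` files keeps the current and previous windows adjacent and non-overlapping.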
+ +## Important Column Mappings + +- `action_name_detailed` — Use for filtering event types (NOT `action_name`) +- `source_interaction_id` — Video press count (distinct count) +- `native_action_id` — Image press count (distinct count) +- `tokens_charged` — Tokens consumed per action +- `model_name` — Model user intended to use +- `fetch_result` — Success/failure of generation (NOT `result`) +- `page_workspace_name` — Workspace context (gen_space, storyboard) diff --git a/agents/enterprise-weekly-report/references/enterprise-dashboard-spec.md b/agents/enterprise-weekly-report/references/enterprise-dashboard-spec.md new file mode 100644 index 0000000..18f116b --- /dev/null +++ b/agents/enterprise-weekly-report/references/enterprise-dashboard-spec.md @@ -0,0 +1,196 @@ +# LTX Studio Enterprise Dashboard — Reference Specification + +**Dashboard URL:** https://app.hex.tech/lightricks_prod/app/LTX-Studio---Enterprise-Dashboard-031766Suw1qd6T5K733wDu/latest +**Hex Project ID:** `01997093-d0d2-700f-9dcc-e624403815fe` +**MCP Server:** Hex MCP (server name depends on workspace configuration — look for "hex" in available MCP servers) +**Last published:** Feb 23, 2026 | 32 views/week, 26 distinct viewers/month + +--- + +## How to Query the Dashboard via Hex MCP + +The Hex MCP provides 4 tools for interacting with the dashboard data: + +``` +# 1. Search for the dashboard project +CallMcpTool(server="", toolName="search_projects", + arguments={"query": "LTX Studio Enterprise Dashboard"}) + +# 2. Create a thread to query data (takes 2-5 minutes) +CallMcpTool(server="", toolName="create_thread", + arguments={"prompt": "Show weekly active users for all enterprise orgs for the last 4 weeks"}) + +# 3. Poll for results (repeat until IDLE status) +CallMcpTool(server="", toolName="get_thread", + arguments={"id": "thread-id-from-step-2"}) + +# 4. 
Ask follow-up questions on the same thread +CallMcpTool(server="", toolName="continue_thread", + arguments={"id": "thread-id", "prompt": "Now break down by model distribution"}) +``` + +> **Note:** The `server` argument is left empty above; fill it in with the actual Hex MCP server name from your workspace. Common names: `hex`, `project-*-hex`. Use the Hex MCP tools listed in your available MCP connections. + +--- + +## Dashboard Tabs + +### Tab 1: Annual Enterprise Data + +**Scope:** 7 annual orgs (McCann_Paris, Deriv, HearWell_BeWell, Indegene, Miroma, Novig, Cylndr Studios) + +**Metrics shown:** + +| Metric | Granularity | Description | +|--------|-------------|-------------| +| Weekly Active Users (WAU) | Org × Week | Users who generated content | +| WAU WoW Change % | Org × Week | Week-over-week percentage change | +| Image Generations | Org × Week | Count of successful image outputs | +| Video Generations | Org × Week | Count of successful video outputs | +| Total Tokens Consumed | Org × Week | Sum of `tokens_charged` | +| Token WoW Change % | Org × Week | Week-over-week token change | +| Model Distribution | Org × Model | Share of each AI model (Nano Banana 2, Flux, LTX-2, Veo 3, etc.)
| +| Feature Adoption | Org | % of users using Elements, Script-to-Storyboard, Gen Space, Retake | +| Top Users | Org × User | Top consumers by tokens, with lt_id and email_domain | + +**Recent data snapshot (4 weeks ending Feb 22, 2026):** + +| Week | Total WAU | Image Gens | Video Gens | Tokens | +|------|-----------|------------|------------|--------| +| Feb 16-22 | 75 | 15,075 | 745 | 5.6M | +| Feb 09-15 | 78 | 14,416 | 772 | 4.7M | +| Feb 02-08 | 75 | 18,926 | 299 | 6.2M | +| Jan 26-Feb 01 | 66 | 14,570 | 460 | 4.7M | + +### Tab 2: Pilot Enterprise Data + +**Scope:** All pilot orgs (16+ orgs as of Feb 2026) + +**Metrics shown:** Same structure as Annual tab, plus: + +| Additional Metric | Description | +|-------------------|-------------| +| Image Downloads | Count of `download_image` actions | +| Video Downloads | Count of `download_video` actions | + +**Active pilot orgs (last 4 weeks):** +- **High activity:** Meta (9-16 WAU, 6M tokens), General Motors (3-4 WAU, 6.4M tokens), McCann (6-11 WAU), Fanatics (1-6 WAU) +- **Ramping:** Jazz Side (new, rapid growth), Plarium (1→30 WAU spike), EōS Fitness (3-5 WAU) +- **Moderate:** Disney (7-13 WAU), Bent Image Lab (2 WAU), McCann_NY (1 WAU) +- **Inactive/ended:** Monster (0 gens), NBC Universal (0 gens), Telemundo (near-zero), Craft (new, 0 gens), Bosch (intermittent) + +### Tab 3: Image Generations Data + +**Scope:** Per-user image generation detail + +| Column | Description | +|--------|-------------| +| User (lt_id) | User identifier | +| Email Domain | Domain from user email (PII-protected) | +| Org | Organization name | +| Seat Type | Enterprise / None | +| Model Name | AI model used (nano-banana-2, flux-2-pro, z-image, etc.)
| +| Image Generations | Count of image outputs | +| Tokens Consumed | Tokens charged for image generations | + +### Tab 4: Video Generations Data + +**Scope:** Per-user video generation detail + +| Column | Description | +|--------|-------------| +| User (lt_id) | User identifier | +| Email Domain | Domain from user email | +| Org | Organization name | +| Model Name | AI model used (ltx-2-pro, ltx-2-fast, veo-3, veo-3-fast, veo-2, veo-3.1, ltxv-13b) | +| Video Generations | Count of video outputs | +| Tokens Consumed | Tokens charged for video generations | + +### Tab 5: Download/Exports Data + +**Scope:** Per-user download and export activity + +| Column | Description | +|--------|-------------| +| User (lt_id) | User identifier | +| Org | Organization name | +| Image Downloads | Count of `download_image` actions | +| Video Downloads | Count of `download_video` actions | +| Project Exports | Count of `generate_export` actions | +| Download Rate | Downloads / Generations percentage | + +--- + +## Data Sources (underlying BigQuery tables) + +| Table | How Hex Resolves Orgs | +|-------|----------------------| +| `ltxstudio_users` | Primary source; uses `griffin_enterprise_name_at_action` when available | +| `ltxstudio_enterprise_users` | JOIN fallback for orgs where `griffin_enterprise_name_at_action` is NULL (e.g., Cylndr Studios) | +| `ltxstudio_user_all_actions` | Event-level actions; `is_lt_team` users excluded; `action_category = 'generations'` + `fetch_result = 'success'` for gen counts | +| `ltxstudio_enterprise_calls` | Gong call data (not currently visualized in dashboard but available for threads) | + +--- + +## Key Event / Action Mappings (used by dashboard) + +> Full event definitions and action mappings are in `shared/event-registry.yaml`. +> The values below are the specific filters used by the Hex Enterprise Dashboard. 
+ +| Filter | Value | Purpose | +|--------|-------|---------| +| `action_category` | `generations` | Generation events | +| `fetch_result` | `success` | Successful generations only | +| `model_gen_type` | `t2i`, `i2i` | Image generation types | +| `model_gen_type` | `t2v`, `i2v`, `v2v` | Video generation types | +| `action_name_detailed` | `download_image` | Image downloads | +| `action_name_detailed` | `download_video` | Video downloads | +| `action_name_detailed` | `generate_export` | Project exports | +| `did_use_elements` | `true` | Elements feature usage | +| `page_workspace_name` | `storyboard` | Script-to-Storyboard usage | +| `page_workspace_name` | `gen_space` | Gen Space usage | + +--- + +## Recommended Thread Prompts + +These prompts are optimized based on how the Hex agent queries the underlying data: + +**Annual Enterprise Analysis:** +``` +Show weekly active users, image generations, video generations, and token consumption +for Annual Enterprise orgs (Indegene, HearWell_BeWell, Novig, Cylndr Studios, Miroma, +Deriv, McCann_Paris) for the last 4 weeks with WoW changes. Break down by org. +Use ltxstudio_users, ltxstudio_enterprise_users, and ltxstudio_user_all_actions +tables in ltx-dwh-prod-processed.web dataset. +``` + +**Pilot Enterprise Analysis:** +``` +Query enterprise data for Pilot organizations. Using ltx-dwh-prod-processed.web.ltxstudio_users +joined with ltx-dwh-prod-processed.web.ltxstudio_enterprise_users, find all pilot orgs +(those NOT in Indegene, HearWell_BeWell, Novig, Cylndr Studios, Miroma, Deriv, McCann_Paris, +Lightricks, Popular Pays, None). For each pilot org, show for the last 4 weeks: weekly active +users, image/video generation counts, total tokens consumed, image/video downloads. +``` + +**Model Distribution:** +``` +Show model distribution by org for enterprise users — which AI models are being used +and their share of total generations. Group models into: Nano Banana 2, Flux, LTX-2, +Veo 3, Veo 3.1, Veo 2, Z-Image, LTXV. 
+``` + +**Feature Adoption:** +``` +Show feature adoption by org for enterprise users: how many users used Elements, +Script-to-Storyboard, Gen Space, and Retake. Show both counts and percentage of +active users. +``` + +**User Health & Retention:** +``` +Show user health metrics per org: new users (first seen this week), returning (active both weeks), +reactivated (inactive last week, active this week), churned (active last week, not this week). +Calculate retention rate as returning / (returning + churned). +``` diff --git a/agents/enterprise-weekly-report/sql/01_active_users.sql b/agents/enterprise-weekly-report/sql/01_active_users.sql new file mode 100644 index 0000000..3d1e79b --- /dev/null +++ b/agents/enterprise-weekly-report/sql/01_active_users.sql @@ -0,0 +1,83 @@ +-- Active Users by Org: Current week vs previous week +-- Replace @report_start_date, @report_end_date, @prev_start_date, @prev_end_date before executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = 
e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +current_week AS ( + SELECT + e.org, + COUNT(DISTINCT a.lt_id) AS active_users, + COUNT(DISTINCT DATE(a.action_ts)) AS active_days_total + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + GROUP BY 1 +), + +prev_week AS ( + SELECT + e.org, + COUNT(DISTINCT a.lt_id) AS active_users + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@prev_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@prev_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + GROUP BY 1 +), + +total_users AS ( + SELECT + e.org, + COUNT(DISTINCT e.lt_id) AS total_registered_users + FROM ent_users e + GROUP BY 1 +) + +SELECT + c.org, + t.total_registered_users, + c.active_users AS current_week_active, + p.active_users AS prev_week_active, + SAFE_DIVIDE(c.active_users - p.active_users, p.active_users) * 100 AS wow_change_pct, + SAFE_DIVIDE(c.active_users, t.total_registered_users) * 100 AS activation_rate_pct +FROM current_week c +LEFT JOIN prev_week p ON c.org = p.org +LEFT JOIN total_users t ON c.org = t.org +ORDER BY + CASE WHEN c.org NOT LIKE '% Pilot' THEN 0 ELSE 1 END, + c.active_users DESC diff --git a/agents/enterprise-weekly-report/sql/02_generation_activity.sql b/agents/enterprise-weekly-report/sql/02_generation_activity.sql new file mode 100644 index 0000000..6c04672 --- /dev/null +++ b/agents/enterprise-weekly-report/sql/02_generation_activity.sql @@ -0,0 +1,103 @@ +-- Generation Activity by Org: Image/Video counts, press counts, per-user averages +-- Replace @report_start_date, @report_end_date, @prev_start_date, @prev_end_date before 
executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +current_week AS ( + SELECT + e.org, + COUNT(DISTINCT a.lt_id) AS generating_users, + + -- Image metrics + COUNTIF(a.action_name_detailed = 'generate_image') AS image_outputs, + COUNT(DISTINCT CASE + WHEN a.action_name = 'generate_genspace_image_batch' + THEN a.native_action_id + END) AS image_presses, + + -- Video metrics + COUNTIF(a.action_name_detailed = 'generate_video') AS video_outputs, + COUNT(DISTINCT CASE + WHEN a.action_category = 'generations' AND a.action_name = 'generate_video' + THEN a.source_interaction_id + END) AS video_presses, + + -- Per-user averages + SAFE_DIVIDE( + COUNTIF(a.action_name_detailed = 'generate_image'), + COUNT(DISTINCT CASE WHEN a.action_name_detailed = 'generate_image' THEN a.lt_id END) + ) AS avg_image_outputs_per_user, + SAFE_DIVIDE( + COUNTIF(a.action_name_detailed = 'generate_video'), + COUNT(DISTINCT CASE 
WHEN a.action_name_detailed = 'generate_video' THEN a.lt_id END) + ) AS avg_video_outputs_per_user + + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + GROUP BY 1 +), + +prev_week AS ( + SELECT + e.org, + COUNTIF(a.action_name_detailed = 'generate_image') AS image_outputs, + COUNTIF(a.action_name_detailed = 'generate_video') AS video_outputs + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@prev_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@prev_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + GROUP BY 1 +) + +SELECT + c.org, + c.generating_users, + c.image_outputs, + c.image_presses, + c.video_outputs, + c.video_presses, + ROUND(c.avg_image_outputs_per_user, 1) AS avg_images_per_user, + ROUND(c.avg_video_outputs_per_user, 1) AS avg_videos_per_user, + SAFE_DIVIDE(c.image_outputs - p.image_outputs, p.image_outputs) * 100 AS image_wow_pct, + SAFE_DIVIDE(c.video_outputs - p.video_outputs, p.video_outputs) * 100 AS video_wow_pct +FROM current_week c +LEFT JOIN prev_week p ON c.org = p.org +ORDER BY + CASE WHEN c.org NOT LIKE '% Pilot' THEN 0 ELSE 1 END, + c.image_outputs + c.video_outputs DESC diff --git a/agents/enterprise-weekly-report/sql/03_model_distribution.sql b/agents/enterprise-weekly-report/sql/03_model_distribution.sql new file mode 100644 index 0000000..1fcdae5 --- /dev/null +++ b/agents/enterprise-weekly-report/sql/03_model_distribution.sql @@ -0,0 +1,72 @@ +-- Model Distribution by Org: Which AI models each org uses +-- Replace @report_start_date, @report_end_date before executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + 
current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +model_usage AS ( + SELECT + e.org, + a.model_name, + a.action_name_detailed AS gen_type, + COUNT(*) AS output_count, + COUNT(DISTINCT a.lt_id) AS unique_users, + SUM(a.tokens_charged) AS tokens_consumed + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + AND a.model_name IS NOT NULL + GROUP BY 1, 2, 3 +), + +org_totals AS ( + SELECT org, SUM(output_count) AS total_outputs + FROM model_usage + GROUP BY 1 +) + +SELECT + m.org, + m.model_name, + m.gen_type, + m.output_count, + m.unique_users, + m.tokens_consumed, + SAFE_DIVIDE(m.output_count, t.total_outputs) * 100 AS pct_of_org_outputs +FROM model_usage m +JOIN org_totals t ON m.org = t.org +WHERE SAFE_DIVIDE(m.output_count, 
t.total_outputs) >= 0.02 +ORDER BY m.org, m.output_count DESC diff --git a/agents/enterprise-weekly-report/sql/04_token_consumption.sql b/agents/enterprise-weekly-report/sql/04_token_consumption.sql new file mode 100644 index 0000000..1a9a574 --- /dev/null +++ b/agents/enterprise-weekly-report/sql/04_token_consumption.sql @@ -0,0 +1,81 @@ +-- Token Consumption per Org: Total tokens, WoW change, by content type +-- Replace @report_start_date, @report_end_date, @prev_start_date, @prev_end_date before executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +current_week AS ( + SELECT + e.org, + SUM(a.tokens_charged) AS total_tokens, + SUM(CASE WHEN a.action_name_detailed = 'generate_image' THEN a.tokens_charged ELSE 0 END) AS image_tokens, + SUM(CASE WHEN a.action_name_detailed = 'generate_video' THEN a.tokens_charged ELSE 0 END) AS video_tokens, + COUNT(DISTINCT a.lt_id) AS consuming_users, + 
SAFE_DIVIDE(SUM(a.tokens_charged), COUNT(DISTINCT a.lt_id)) AS avg_tokens_per_user + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + AND a.tokens_charged > 0 + GROUP BY 1 +), + +prev_week AS ( + SELECT + e.org, + SUM(a.tokens_charged) AS total_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@prev_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@prev_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + AND a.tokens_charged > 0 + GROUP BY 1 +) + +SELECT + c.org, + c.total_tokens, + c.image_tokens, + c.video_tokens, + c.consuming_users, + ROUND(c.avg_tokens_per_user, 0) AS avg_tokens_per_user, + p.total_tokens AS prev_week_tokens, + SAFE_DIVIDE(c.total_tokens - p.total_tokens, p.total_tokens) * 100 AS wow_change_pct +FROM current_week c +LEFT JOIN prev_week p ON c.org = p.org +ORDER BY + CASE WHEN c.org NOT LIKE '% Pilot' THEN 0 ELSE 1 END, + c.total_tokens DESC diff --git a/agents/enterprise-weekly-report/sql/05_top_users.sql b/agents/enterprise-weekly-report/sql/05_top_users.sql new file mode 100644 index 0000000..e7096cd --- /dev/null +++ b/agents/enterprise-weekly-report/sql/05_top_users.sql @@ -0,0 +1,75 @@ +-- Top Users per Org: Highest token consumers and most active users +-- Replace @report_start_date, @report_end_date before executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) 
LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +user_activity AS ( + SELECT + e.org, + e.user_type, + a.lt_id, + e.email, + SUM(a.tokens_charged) AS tokens_consumed, + COUNTIF(a.action_name_detailed = 'generate_image') AS image_outputs, + COUNTIF(a.action_name_detailed = 'generate_video') AS video_outputs, + COUNT(DISTINCT DATE(a.action_ts)) AS active_days + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + GROUP BY 1, 2, 3, 4 +), + +ranked AS ( + SELECT + *, + ROW_NUMBER() OVER (PARTITION BY org ORDER BY tokens_consumed DESC) AS rank_in_org + FROM user_activity +) + +SELECT + org, + lt_id, + email, + user_type, + tokens_consumed, + image_outputs, + video_outputs, + active_days, + rank_in_org +FROM ranked +WHERE rank_in_org <= 5 +ORDER BY org, rank_in_org diff --git a/agents/enterprise-weekly-report/sql/06_downloads_exports.sql b/agents/enterprise-weekly-report/sql/06_downloads_exports.sql new file mode 100644 index 0000000..531233d --- /dev/null +++ 
b/agents/enterprise-weekly-report/sql/06_downloads_exports.sql @@ -0,0 +1,93 @@ +-- Download & Export Activity by Org +-- Replace @report_start_date, @report_end_date, @prev_start_date, @prev_end_date before executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +downloads_current AS ( + SELECT + e.org, + COUNTIF(a.action_name_detailed = 'download_image') AS image_downloads, + COUNTIF(a.action_name_detailed = 'download_video') AS video_downloads, + COUNTIF(a.action_name_detailed = 'generate_export') AS project_exports, + COUNT(DISTINCT a.lt_id) AS downloading_users + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('download_image', 'download_video', 'generate_export') + GROUP BY 1 +), + 
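-- Illustrative note (an addition, not part of the original query): the rate
-- columns in the final SELECT compute downloads / generations * 100 per org
-- over the same reporting week. SAFE_DIVIDE returns NULL on a zero
-- denominator, so an org with downloads but no generations in the window
-- shows a NULL rate rather than an error.
-- Example: 150 image downloads against 1,000 image outputs -> 15.0%.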
+generations_current AS ( + SELECT + e.org, + COUNTIF(a.action_name_detailed = 'generate_image') AS image_outputs, + COUNTIF(a.action_name_detailed = 'generate_video') AS video_outputs + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + GROUP BY 1 +), + +downloads_prev AS ( + SELECT + e.org, + COUNTIF(a.action_name_detailed = 'download_image') + COUNTIF(a.action_name_detailed = 'download_video') AS total_downloads + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@prev_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@prev_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('download_image', 'download_video') + GROUP BY 1 +) + +SELECT + d.org, + d.image_downloads, + d.video_downloads, + d.project_exports, + d.downloading_users, + SAFE_DIVIDE(d.image_downloads, g.image_outputs) * 100 AS image_download_rate_pct, + SAFE_DIVIDE(d.video_downloads, g.video_outputs) * 100 AS video_download_rate_pct, + SAFE_DIVIDE( + d.image_downloads + d.video_downloads - p.total_downloads, + p.total_downloads + ) * 100 AS downloads_wow_pct +FROM downloads_current d +LEFT JOIN generations_current g ON d.org = g.org +LEFT JOIN downloads_prev p ON d.org = p.org +ORDER BY d.image_downloads + d.video_downloads DESC diff --git a/agents/enterprise-weekly-report/sql/07_feature_adoption.sql b/agents/enterprise-weekly-report/sql/07_feature_adoption.sql new file mode 100644 index 0000000..886c3ea --- /dev/null +++ b/agents/enterprise-weekly-report/sql/07_feature_adoption.sql @@ -0,0 +1,108 @@ +-- Feature Adoption by Org: Elements, Storyboard, Gen Space, Retake usage +-- Replace @report_start_date, @report_end_date before executing + +WITH agg AS ( + SELECT 
DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +active_users AS ( + SELECT + e.org, + COUNT(DISTINCT a.lt_id) AS total_active_users + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + GROUP BY 1 +), + +elements_usage AS ( + SELECT + e.org, + COUNT(DISTINCT a.lt_id) AS users_with_elements, + COUNTIF(a.did_use_elements = TRUE) AS generations_with_elements + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') + AND a.did_use_elements = TRUE + GROUP BY 1 +), + +workspace_usage AS ( + SELECT + e.org, + 
COUNT(DISTINCT CASE WHEN a.page_workspace_name = 'storyboard' THEN a.lt_id END) AS storyboard_users, + COUNT(DISTINCT CASE WHEN a.page_workspace_name = 'gen_space' THEN a.lt_id END) AS genspace_users, + COUNTIF(a.page_workspace_name = 'storyboard') AS storyboard_actions, + COUNTIF(a.page_workspace_name = 'gen_space') AS genspace_actions + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.page_workspace_name IN ('storyboard', 'gen_space') + GROUP BY 1 +), + +retake_usage AS ( + SELECT + e.org, + COUNT(DISTINCT a.lt_id) AS retake_users, + COUNTIF(a.model_gen_type = 'v2v') AS retake_count + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed = 'generate_video' + AND a.model_gen_type = 'v2v' + GROUP BY 1 +) + +SELECT + au.org, + au.total_active_users, + COALESCE(el.users_with_elements, 0) AS elements_users, + SAFE_DIVIDE(el.users_with_elements, au.total_active_users) * 100 AS elements_adoption_pct, + COALESCE(el.generations_with_elements, 0) AS elements_gen_count, + COALESCE(ws.storyboard_users, 0) AS storyboard_users, + SAFE_DIVIDE(ws.storyboard_users, au.total_active_users) * 100 AS storyboard_adoption_pct, + COALESCE(ws.genspace_users, 0) AS genspace_users, + SAFE_DIVIDE(ws.genspace_users, au.total_active_users) * 100 AS genspace_adoption_pct, + COALESCE(rt.retake_users, 0) AS retake_users, + COALESCE(rt.retake_count, 0) AS retake_count +FROM active_users au +LEFT JOIN elements_usage el ON au.org = el.org +LEFT JOIN workspace_usage ws ON au.org = ws.org +LEFT JOIN retake_usage rt ON au.org = rt.org +ORDER BY au.total_active_users DESC diff --git 
a/agents/enterprise-weekly-report/sql/08_user_health.sql b/agents/enterprise-weekly-report/sql/08_user_health.sql new file mode 100644 index 0000000..4dac68e --- /dev/null +++ b/agents/enterprise-weekly-report/sql/08_user_health.sql @@ -0,0 +1,114 @@ +-- User Health Signals: New, returning, reactivated, and churned users per org +-- Replace @report_start_date, @report_end_date, @prev_start_date, @prev_end_date before executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type, + email, + first_active_ts, + first_active_ts_tokens + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +), + +first_generation AS ( + SELECT + a.lt_id, + MIN(DATE(a.action_ts)) AS first_gen_date + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_name_detailed IN ('generate_image', 'generate_video') + GROUP BY 1 +), + +current_week_users AS ( + SELECT DISTINCT + e.org, + a.lt_id + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = 
e.lt_id + WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') +), + +prev_week_users AS ( + SELECT DISTINCT + e.org, + a.lt_id + FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a + JOIN ent_users e ON a.lt_id = e.lt_id + WHERE a.action_ts >= TIMESTAMP(@prev_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@prev_end_date, INTERVAL 1 DAY)) + AND a.action_name_detailed IN ('generate_image', 'generate_video') +) + +SELECT + e.org, + + -- New users: first generation this week + COUNT(DISTINCT CASE + WHEN c.lt_id IS NOT NULL + AND fg.first_gen_date BETWEEN @report_start_date AND @report_end_date + THEN c.lt_id + END) AS new_users, + + -- Returning users: active this week AND last week + COUNT(DISTINCT CASE + WHEN c.lt_id IS NOT NULL AND p.lt_id IS NOT NULL + THEN c.lt_id + END) AS returning_users, + + -- Reactivated users: active this week, NOT last week, but generated before + COUNT(DISTINCT CASE + WHEN c.lt_id IS NOT NULL + AND p.lt_id IS NULL + AND fg.first_gen_date < @report_start_date + THEN c.lt_id + END) AS reactivated_users, + + -- Churned users: active last week but NOT this week + COUNT(DISTINCT CASE + WHEN p.lt_id IS NOT NULL AND c.lt_id IS NULL + THEN p.lt_id + END) AS churned_users, + + -- Week-over-week retention rate + SAFE_DIVIDE( + COUNT(DISTINCT CASE WHEN c.lt_id IS NOT NULL AND p.lt_id IS NOT NULL THEN c.lt_id END), + COUNT(DISTINCT p.lt_id) + ) * 100 AS wow_retention_pct + +FROM ent_users e +LEFT JOIN current_week_users c ON e.lt_id = c.lt_id AND e.org = c.org +LEFT JOIN prev_week_users p ON e.lt_id = p.lt_id AND e.org = p.org +LEFT JOIN first_generation fg ON e.lt_id = fg.lt_id +GROUP BY 1 +HAVING COUNT(DISTINCT c.lt_id) > 0 OR COUNT(DISTINCT p.lt_id) > 0 +ORDER BY + CASE WHEN e.org NOT LIKE '% Pilot' THEN 0 ELSE 1 END, + wow_retention_pct DESC diff --git 
a/agents/enterprise-weekly-report/sql/09_daily_active_users.sql b/agents/enterprise-weekly-report/sql/09_daily_active_users.sql new file mode 100644 index 0000000..c0bee03 --- /dev/null +++ b/agents/enterprise-weekly-report/sql/09_daily_active_users.sql @@ -0,0 +1,44 @@ +-- Daily Active Users per Org: Day-by-day engagement pattern within the report week +-- Replace @report_start_date, @report_end_date before executing + +WITH agg AS ( + SELECT DISTINCT + lt_id, + enterprise_name_at_purchase, + current_enterprise_name, + organization_name, + CASE + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) = 'McCann_NY' THEN 'McCann_NY' + WHEN COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) LIKE '%McCann%' THEN 'McCann_Paris' + ELSE COALESCE(enterprise_name_at_purchase, current_enterprise_name, organization_name) + END AS org, + current_customer_plan_type, + first_active_ts, + first_active_ts_tokens + FROM `ltx-dwh-prod-processed.web.ltxstudio_users` +), + +ent_users AS ( + SELECT DISTINCT + agg.lt_id, + CASE + WHEN org IN ('Indegene', 'HearWell_BeWell', 'Novig', 'Cylndr Studios', 'Miroma', 'Deriv', 'McCann_Paris') + THEN org + ELSE CONCAT(org, ' Pilot') + END AS org, + CASE WHEN agg.organization_name IS NOT NULL THEN 'SSO' ELSE 'Code' END AS user_type + FROM agg + JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e ON agg.lt_id = e.lt_id + WHERE org NOT IN ('Lightricks', 'Popular Pays', 'None') +) + +SELECT + DATE(a.action_ts) AS activity_date, + e.org, + COUNT(DISTINCT a.lt_id) AS dau +FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a +JOIN ent_users e ON a.lt_id = e.lt_id +WHERE a.action_ts >= TIMESTAMP(@report_start_date) + AND a.action_ts < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) +GROUP BY 1, 2 +ORDER BY 1, 2 diff --git a/agents/enterprise-weekly-report/sql/10_enterprise_calls.sql b/agents/enterprise-weekly-report/sql/10_enterprise_calls.sql new file mode 100644 
index 0000000..b6b4331 --- /dev/null +++ b/agents/enterprise-weekly-report/sql/10_enterprise_calls.sql @@ -0,0 +1,27 @@ +-- Enterprise Gong Call Insights: Recent calls per org with sentiment, feature mentions, and pain points +-- Replace @report_start_date, @report_end_date before executing + +SELECT + org_name AS org, + c.conversation_key, + c.call_started_at, + c.title, + c.call_spotlight_brief, + c.enterprise_participant_count, + c.customer_sentiment, + c.competitors_mentioned, + c.primary_objection, + c.customer_pain_point, + c.missing_features, + c.next_steps, + c.mentioned_pitch_deck_feature, + c.mentioned_collaboration_feature, + c.mentioned_elements_feature, + c.mentioned_retake_feature, + c.budget_discussion, + c.is_decision_maker +FROM `ltx-dwh-prod-processed.web.ltxstudio_enterprise_calls` c, +UNNEST(c.organization_names) AS org_name +WHERE c.call_started_at >= TIMESTAMP(@report_start_date) + AND c.call_started_at < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) +ORDER BY org_name, c.call_started_at DESC diff --git a/agents/enterprise-weekly-report/sql/11_call_sentiment_summary.sql b/agents/enterprise-weekly-report/sql/11_call_sentiment_summary.sql new file mode 100644 index 0000000..716cd58 --- /dev/null +++ b/agents/enterprise-weekly-report/sql/11_call_sentiment_summary.sql @@ -0,0 +1,54 @@ +-- Enterprise Call Sentiment & Feature Interest Summary: Aggregated per org over reporting period +-- Replace @report_start_date, @report_end_date before executing + +WITH calls_unnested AS ( + SELECT + org_name AS org, + c.conversation_key, + c.call_started_at, + c.customer_sentiment, + c.competitors_mentioned, + c.primary_objection, + c.customer_pain_point, + c.missing_features, + c.mentioned_pitch_deck_feature, + c.mentioned_collaboration_feature, + c.mentioned_elements_feature, + c.mentioned_retake_feature + FROM `ltx-dwh-prod-processed.web.ltxstudio_enterprise_calls` c, + UNNEST(c.organization_names) AS org_name + WHERE c.call_started_at >= 
TIMESTAMP(@report_start_date) + AND c.call_started_at < TIMESTAMP(DATE_ADD(@report_end_date, INTERVAL 1 DAY)) +) + +SELECT + org, + COUNT(DISTINCT conversation_key) AS total_calls, + + -- Sentiment breakdown + COUNTIF(customer_sentiment = 'Positive/Excited') AS positive_calls, + COUNTIF(customer_sentiment = 'Neutral/Professional') AS neutral_calls, + COUNTIF(customer_sentiment = 'Negative/Frustrated') AS negative_calls, + + -- Feature interest + COUNTIF(mentioned_elements_feature = 'Yes') AS calls_mentioning_elements, + COUNTIF(mentioned_retake_feature = 'Yes') AS calls_mentioning_retake, + COUNTIF(mentioned_collaboration_feature = 'Yes') AS calls_mentioning_collaboration, + COUNTIF(mentioned_pitch_deck_feature = 'Yes') AS calls_mentioning_pitch_deck, + + -- Top objections + COUNTIF(primary_objection = 'Missing Feature') AS objection_missing_feature, + COUNTIF(primary_objection = 'Price') AS objection_price, + COUNTIF(primary_objection = 'Security') AS objection_security, + + -- Competitors ('Parse Error' excluded; NOT IN is also NULL-safe under COUNTIF) + COUNTIF(competitors_mentioned NOT IN ('None', 'Parse Error')) AS calls_mentioning_competitors, + + -- Collect pain points and missing features for narrative ('None'/'Parse Error' filtered out) + ARRAY_AGG(DISTINCT IF(customer_pain_point IN ('None', 'Parse Error'), NULL, customer_pain_point) IGNORE NULLS) AS pain_points, + ARRAY_AGG(DISTINCT IF(missing_features IN ('None', 'Parse Error'), NULL, missing_features) IGNORE NULLS) AS missing_features_list, + ARRAY_AGG(DISTINCT IF(competitors_mentioned IN ('None', 'Parse Error'), NULL, competitors_mentioned) IGNORE NULLS) AS competitors_list + +FROM calls_unnested +GROUP BY 1 +ORDER BY total_calls DESC diff --git a/agents/enterprise-weekly-report/templates/weekly-report.md b/agents/enterprise-weekly-report/templates/weekly-report.md new file mode 100644 index 0000000..8d22d67 --- /dev/null +++ b/agents/enterprise-weekly-report/templates/weekly-report.md @@ -0,0 +1,209 @@ +# LTX Studio — Enterprise Weekly Report +**Week:** {report_start_date} to {report_end_date} +**Prior Week:** {prev_start_date} to {prev_end_date} +**Generated:** {current_timestamp} +**Dashboard:** [LTX Studio Enterprise
Dashboard](https://app.hex.tech/lightricks_prod/app/LTX-Studio---Enterprise-Dashboard-031766Suw1qd6T5K733wDu/latest) + +--- + +## Executive Summary + +> 3–5 bullet points. Lead with what changed. Flag anomalies first. + +- {Key insight 1 — e.g., "Enterprise WAU up +12% WoW, driven by [Org] (+45 users)"} +- {Key insight 2 — e.g., "[Org] went silent this week after [N] active users last week — follow up"} +- {Key insight 3 — e.g., "Video generation up [X]% across enterprise, led by [Org]"} +- {Risk alert — e.g., "Eset had 0 active users this week (pilot at risk)"} +- {Positive signal — e.g., "NBC pilot ramping up with 12 active users, strong LTX-2 adoption"} + +--- + +## Cross-Account KPIs + +| Org | WAU | WAU WoW | Gen Presses | Token Usage | Downloads | Health | +|-----|----:|--------:|------------:|------------:|----------:|--------| +| {Org 1} | {N} | {↑/↓/→ X%} | {N} | {NK/M} | {N} | 🟢/🟡/🔴/⚫/🆕 | +| {Org 2} | {N} | {↑/↓/→ X%} | {N} | {NK/M} | {N} | 🟢/🟡/🔴/⚫/🆕 | +| ... | | | | | | | +| **TOTAL** | **{N}** | **{↑/↓/→ X%}** | **{N}** | **{NK/M}** | **{N}** | | + +**Health legend:** 🟢 Trending Up (>+10% WAU) · 🟡 Stable (±10%) · 🔴 Trending Down (<-10%) · ⚫ Silent · 🆕 New this week + +--- + +## Notable Changes & Anomalies + +> Only include if anomalies exist. Remove section if none. + +### ⚠️ {Anomaly type — e.g., "Silent Account"} +**{Org}** had {N} active users last week but 0 this week. +- Prior week: {N} WAU, {N} generation presses +- Action: Recommend CSM outreach + +### ⚠️ {Anomaly type — e.g., "Token Spike"} +**{Org}** token consumption up {X}% WoW ({N} → {N} tokens). +- Could indicate: batch job, new user onboarding, or workflow change +- Monitor next week; flag if spike continues + +### 🆕 {Anomaly type — e.g., "New Account Active"} +**{Org}** had first activity this week: {N} active users, {N} generation presses. 
+ +--- + +## Annual Enterprise Accounts + +### {Org Name} — Annual {health signal emoji} + +| Metric | This Week | Last Week | WoW | +|--------|-----------|-----------|-----| +| Active Users | {n} | {n} | {↑/↓/→ x%} | +| Image Outputs | {n} | {n} | {↑/↓/→ x%} | +| Image Presses | {n} | — | — | +| Video Outputs | {n} | {n} | {↑/↓/→ x%} | +| Video Presses | {n} | — | — | +| Tokens Consumed | {nK/M} | {nK/M} | {↑/↓/→ x%} | +| Image Downloads | {n} | — | — | +| Video Downloads | {n} | — | — | +| Download Rate (Image) | {x%} | — | — | +| Download Rate (Video) | {x%} | — | — | + +**Daily activity (Mon–Sun):** {Mon: N · Tue: N · Wed: N · Thu: N · Fri: N · Sat: N · Sun: N} + +**Model Mix:** {model1} ({x%}), {model2} ({x%}), ... +**Feature Adoption:** Elements ({x%} of users), Storyboard ({x%}), Gen Space ({x%}), Retake ({x%}) +**User Health:** {n} returning, {n} new, {n} reactivated, {n} churned | WoW Retention: {x%} +**Top Users:** {user1} ({nK tokens}), {user2} ({nK tokens}), {user3} ({nK tokens}) + +**Gong Call Insights:** {n} calls this week | Sentiment: {Positive/Neutral/Negative breakdown} +- {Key pain point or feature request from calls, if any} +- {Competitor mentions, if any} +- {Next steps or action items from calls, if any} + +{2-3 sentence narrative insight combining quantitative and qualitative signals: +- What's the story for this org this week? +- Connect usage trends to Gong call sentiment and feedback history +- e.g., "McCann Paris continues image-heavy usage with Nano Banana 2 driving token consumption. + 3 new users joined this week. Gong call on Tuesday showed excitement about custom storyboard + styles. Historical feedback notes frustration with element consistency — watch for adoption trends."} + +--- + +{Repeat for each annual org} + +--- + +## Pilot Enterprise Accounts + +{Same format as above for each pilot org} + +--- + +## Feature Adoption Snapshot + +> Which features each org used this week.
+ +| Feature | {Org 1} | {Org 2} | {Org 3} | ... | +|---------|:-------:|:-------:|:-------:|-----| +| Gen Space | ✅ {N}u | ✅ {N}u | ✅ {N}u | | +| Elements | ✅ {N}u | — | ✅ {N}u | | +| Script-to-Storyboard | — | ✅ {N}u | — | | +| Retake | — | — | ✅ {N}u | | + +> ✅ = used · {N}u = number of unique users · — = not used this week + +--- + +## Risk Alerts + +| Severity | Org | Signal | Detail | +|----------|-----|--------|--------| +| CRITICAL | {org} | No active users | Last activity: {date} | +| WARNING | {org} | Active users down >30% | {n} → {n} ({-x%}) | +| WARNING | {org} | Power user churned | {user} was top user last week, inactive this week | +| NOTE | {org} | Pilot with <3 users | Only {n} active users in evaluation | + +--- + +## Enterprise Call Intelligence + +### Weekly Call Summary + +| Org | Calls | Positive | Neutral | Negative | Competitors | Top Objection | +|-----|-------|----------|---------|----------|-------------|---------------| +| {org} | {n} | {n} | {n} | {n} | {list} | {objection} | +| ... | ... | ... | ... | ... | ... | ... | + +### Notable Call Details + +{For each org with calls this week, list key highlights:} + +**{Org}** — {call title} ({date}) +- Sentiment: {sentiment} +- Pain point: {customer_pain_point} +- Missing features: {missing_features} +- Next steps: {next_steps} + +### Feedback & Known Issues (from shared/enterprise-feedback.txt) + +> Relevant verbatims and CSM observations per account. Quote directly where possible. + +- **{Org}:** {1–2 sentence summary of relevant feedback or concern.} +- **{Org}:** {e.g., "Raised portrait video as a blocker for social media use case."} + +--- + +## Recommendations + +> Action items for the CSM / account team. Only include if there is something specific to act on. 
+ +| Priority | Org | Recommended Action | Reason | +|----------|-----|-------------------|--------| +| 🔴 High | {Org} | {Action — e.g., "Schedule check-in call"} | {Reason — e.g., "Silent for 1 week"} | +| 🟡 Med | {Org} | {Action — e.g., "Share feature guide for Camera Control"} | {Reason} | +| 🟢 Low | {Org} | {Action — e.g., "Celebrate milestone with account team"} | {Reason} | + +--- + +## Appendix A: Top Users Across All Orgs + +| Org | User | Email | Type | Tokens | Images | Videos | Active Days | +|-----|------|-------|------|--------|--------|--------|-------------| +| {org} | {lt_id} | {email} | {SSO/Code} | {nK} | {n} | {n} | {n} | +| ... | ... | ... | ... | ... | ... | ... | ... | + +## Appendix B: Feedback History Highlights + +{Reference specific entries from shared/enterprise-feedback.txt that are relevant to this week's trends. +Only include if a quantitative signal connects to a known qualitative issue.} + +- {Org}: {Brief context from feedback history that explains or contextualizes this week's data} + +## Appendix C: 4-Week Trends (from Hex Dashboard) + +{Include 4-week trend data from Hex threads to provide context beyond the 2-week BigQuery window.} + +### Annual Orgs — 4-Week WAU Trend +| Org | Week -3 | Week -2 | Week -1 | Current Week | Trend | +|-----|---------|---------|---------|--------------|-------| +| {org} | {n} | {n} | {n} | {n} | {↑/↓/→} | + +### Pilot Orgs — 4-Week WAU Trend +| Org | Week -3 | Week -2 | Week -1 | Current Week | Trend | +|-----|---------|---------|---------|--------------|-------| +| {org} | {n} | {n} | {n} | {n} | {↑/↓/→} | + +### Model Distribution Summary +| Org | Top Model (% share) | #2 Model (%) | Video Model (%) | +|-----|---------------------|--------------|-----------------| +| {org} | {model} ({x%}) | {model} ({x%}) | {model} ({x%}) | + +--- + +## Data Notes + +- Report period: {start} – {end} (complete ISO week) +- Source: `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` + `ltxstudio_users` +- 
Enterprise segmentation: exact CTE from `shared/bq-schema.md` (two-step pattern with " Pilot" suffix) +- LT team excluded at source (`ltxstudio_user_all_actions` pre-filters) +- Hex dashboard data: 4-week trends queried via Hex MCP threads +- {Any data quality warnings from SQL execution, if applicable} diff --git a/shared/bq-schema.md b/shared/bq-schema.md index 80f8f95..50f827b 100644 --- a/shared/bq-schema.md +++ b/shared/bq-schema.md @@ -320,6 +320,78 @@ Clustered by: to_date, griffin_tier_name --- +### ltxstudio_enterprise_users + +**Enterprise user membership.** One row per enterprise user. Used as JOIN filter to identify enterprise users. + +``` +Table: `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` +``` + +| Column | Type | Description | +|--------|------|-------------| +| lt_id | STRING | User identifier | +| email | STRING | User email | +| organization_name | STRING | SSO organization name (NULL for code-redeemed users) | +| enterprise_name_at_purchase | STRING | Enterprise name at time of purchase | +| current_enterprise_name | STRING | Current enterprise name | + +> Use this table as a JOIN filter to restrict queries to enterprise users. See Enterprise Users CTE in User Segmentation Queries section. 
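+ +A minimal sketch of the join-filter pattern (illustrative only; the authoritative segmentation, including org mapping and the " Pilot" suffix, is the Enterprise Users CTE referenced above, and the 7-day window here is an assumed example): + +```sql +-- Sketch: count distinct enterprise users with any action in the last 7 days. +-- Uses only the columns documented above; no org-level mapping applied. +SELECT + COUNT(DISTINCT a.lt_id) AS enterprise_active_users +FROM `ltx-dwh-prod-processed.web.ltxstudio_user_all_actions` a +JOIN `ltx-dwh-prod-processed.web.ltxstudio_enterprise_users` e + ON a.lt_id = e.lt_id +WHERE a.action_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY) +```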
+ +--- + +### ltxstudio_enterprise_calls + +**Gong call recordings matched to enterprise orgs, with AI-analyzed fields.** + +``` +Table: `ltx-dwh-prod-processed.web.ltxstudio_enterprise_calls` +Partitioned by: call_started_at (TIMESTAMP) +``` + +**Identifiers & Metadata:** + +| Column | Type | Description | +|--------|------|-------------| +| conversation_key | STRING | Natural key for the call | +| call_started_at | TIMESTAMP | When the call started (partition key) | +| organization_names | REPEATED STRING | Org names on the call — use UNNEST to flatten | +| enterprise_participant_emails | REPEATED STRING | Participant emails | +| enterprise_participant_names | REPEATED STRING | Participant names | +| title | STRING | Call title/subject | +| call_spotlight_brief | STRING | Gong AI summary of the call | +| clean_text_transcript | STRING | Full transcript (plain text) | + +**AI-Analyzed Fields (all STRING; "Parse Error" = failed parsing):** + +| Column | Type | Description | +|--------|------|-------------| +| customer_sentiment | STRING | "Positive/Excited", "Neutral/Professional", "Negative/Frustrated" | +| competitors_mentioned | STRING | Comma-separated list or "None" (Runway, Freepik, Krea, Higgsfield, Weavy, Luma, Kling, Minimax) | +| primary_objection | STRING | "Price", "Missing Feature", "Timing", "Authority", "Security", "None" | +| customer_pain_point | STRING | Max 10 words describing the problem | +| missing_features | STRING | Features the customer wishes existed | +| mentioned_pitch_deck_feature | STRING | "Yes" / "No" | +| mentioned_collaboration_feature | STRING | "Yes" / "No" | +| mentioned_elements_feature | STRING | "Yes" / "No" | +| mentioned_retake_feature | STRING | "Yes" / "No" | +| budget_discussion | STRING | Amount/range or "Not Discussed" | +| is_decision_maker | STRING | "Yes", "No", "Unclear" | +| next_steps | STRING | Action items summary | + +**Query pattern — UNNEST organization_names to join with org-level data:** + +```sql +SELECT 
org_name, c.customer_sentiment, c.call_spotlight_brief +FROM `ltx-dwh-prod-processed.web.ltxstudio_enterprise_calls` c, +UNNEST(c.organization_names) AS org_name +WHERE c.call_started_at >= TIMESTAMP(@report_start_date) +``` + +> **Org name mapping:** `organization_names` contains raw names (e.g., "Monster", "NBC") without the " Pilot" suffix added by the enterprise CTE. Map manually when correlating with usage data (e.g., raw "Monster" = CTE "Monster Pilot", raw "Deriv" = CTE "Deriv"). + +--- + ### Other Marts | Table | Grain | Description | diff --git a/shared/enterprise-orgs.md b/shared/enterprise-orgs.md new file mode 100644 index 0000000..3f6799d --- /dev/null +++ b/shared/enterprise-orgs.md @@ -0,0 +1,109 @@ +# Enterprise Orgs Reference + +> Shared reference for all agents working with enterprise data. +> Source of truth for org classification, Hex dashboard access, and org-level notes. + +--- + +## Key Business Context + +- **Strategy:** LTX shifted in ~October 2025 to prioritize Enterprise over self-serve (also in `shared/product-context.md`) +- **"Active" definition:** For enterprise = generated content (not just page views) +- **Pilot risk:** Pilot orgs have time-limited evaluations — declining usage is a churn signal +- **Token costs:** Vary significantly by model — Nano Banana 2 >> Flux >> LTX-2 +- **Gong data:** `ltxstudio_enterprise_calls` provides qualitative context per week — sentiment, objections, feature requests +- **Feedback history:** `shared/enterprise-feedback.txt` — accumulated CSM/Slack notes per org + +--- + +## Org Classification + +- **Annual orgs** (hardcoded list): `Indegene`, `HearWell_BeWell`, `Novig`, `Cylndr Studios`, `Miroma`, `Deriv`, `McCann_Paris` — org name used as-is +- **Pilot orgs** (all others): org name gets " Pilot" suffix (e.g., "Monster Pilot", "GM Pilot") +- **McCann split**: `McCann_NY` and `McCann_Paris` are distinct orgs (COALESCE logic) +- **Excluded**: Lightricks, Popular Pays, None +- **User type**: SSO (has 
`organization_name`) vs Code (redeemed code) + +> See `shared/bq-schema.md` — Enterprise Users CTE for the exact SQL implementation. + +--- + +## Org Reference + +| Org | Type | Notes | +|-----|------|-------| +| McCann_Paris | Annual | Image-heavy (~88% Nano Banana 2), creative agency, 34-42 WAU, dominates token consumption (~73% of annual total) | +| Deriv | Annual | Financial services, 10-18 WAU, declining tokens (818K→410K), needs portrait video support | +| HearWell_BeWell | Annual | Single power user, 23% Veo 3 usage (highest of any org), volatile activity | +| Novig | Annual | Low activity (4 WAU), 86% Nano Banana 2, minimal video | +| Cylndr Studios | Annual | Small team (2-7 WAU), volatile usage, `griffin_enterprise_name_at_action` is NULL | +| Indegene | Annual | Healthcare, most balanced feature adoption (Elements 63%, Storyboard 38%, Retake 38%), ramping image generation | +| Miroma | Annual | Video surge (638% WoW spike), high LTX-2 adoption (14%), Script-to-Storyboard adopter (32%) | +| McCann Pilot | Pilot | Image-heavy (10K+ images), 6-11 WAU, highest download rate, separate from McCann_Paris | +| McCann_NY Pilot | Pilot | Low activity (1 WAU), sporadic usage | +| General Motors Pilot | Pilot | Video-heavy (~6.4M tokens total), VEO-3 dominant, enterprise security key differentiator | +| Meta Pilot | Pilot | High-volume evaluation (~6M tokens), 9-16 WAU, declining trend, split between VEO-3 and LTX-2 | +| Fanatics Pilot | Pilot | Sports collectibles, image-focused (6K images), 1-6 WAU, growing user base | +| Jazz Side Pilot | Pilot | New org, rapid ramp-up (600 video gens last week), high growth trajectory | +| Plarium Pilot | Pilot | Gaming, massive onboarding spike (1→30 WAU), recently activated | +| Disney Pilot | Pilot | Broadest pilot user base (7-13 WAU), moderate volumes, exploratory usage | +| EōS Fitness Pilot | Pilot | Print + digital imagery, 3-5 WAU, declining trend | +| Bent Image Lab Pilot | Pilot | Animation studio, 2 WAU, low volume 
but consistent | +| Bosch Pilot | Pilot | Minimal activity, intermittent usage | +| Comcast Advertising Pilot | Pilot | In training phase, near-zero generation activity | +| Telemundo Pilot | Pilot | Broadcasting, near-zero activity | +| Monster Pilot | Pilot | Energy drinks brand, pilot ended — zero recent activity | +| NBC Universal Pilot | Pilot | Broadcasting, pilot ended — zero recent activity | +| Craft Pilot | Pilot | New, single user, no generations yet | +| Eset Pilot | Pilot | Cybersecurity, pilot stopped | + +--- + +## Hex Enterprise Dashboard + +**URL:** https://app.hex.tech/lightricks_prod/app/LTX-Studio---Enterprise-Dashboard-031766Suw1qd6T5K733wDu/latest +**Project ID:** `01997093-d0d2-700f-9dcc-e624403815fe` +**MCP Server:** Hex MCP (use `create_thread` / `continue_thread` / `get_thread`) + +| Tab | Scope | Metrics | +|-----|-------|---------| +| Annual Enterprise Data | Annual orgs | WAU, image/video gens, tokens, WoW trends, model mix, feature adoption | +| Pilot Enterprise Data | Pilot orgs | WAU, image/video gens, tokens, downloads, WoW trends | +| Image Generations Data | User-level | Per-user image gen counts, model used, tokens consumed | +| Video Generations Data | User-level | Per-user video gen counts, model used, tokens consumed | +| Download/Exports Data | User-level | Image/video downloads, project exports, download rates | + +**Hex vs BigQuery differences:** +- Hex uses `griffin_enterprise_name_at_action` for org resolution — this field is NULL for Cylndr Studios; Hex joins through `ltxstudio_enterprise_users` as fallback +- Hex has PII restrictions — `email` and `full_name` are protected; users identified by `lt_id` + `email_domain` +- Hex provides 4-week trend windows vs BigQuery queries' 2-week window +- Hex org names use raw `organization_name` (e.g., "McCann", "General Motors"), not the CTE's suffixed names (e.g., "McCann Pilot", "General Motors Pilot") + +--- + +## Model Distribution Benchmarks + +> As of March 2026. 
Update when dashboard data changes significantly. + +| Model Group | Type | Typical Share | Notes | +|-------------|------|---------------|-------| +| Nano Banana 2 | Image | 47-88% of gens | Dominant image model across all orgs | +| Flux (Flux 2 Pro) | Image | 5-36% of gens | #2 image model, used for quality-critical work | +| Z-Image | Image | 1-6% of gens | Newer model, lighter usage | +| LTX-2 (Pro/Fast) | Video | 1-16% of gens | In-house video model, strongest at Miroma/Indegene | +| Veo 3 / Veo 3.1 | Video | 1-23% of gens | External video model, highest at HearWell_BeWell | +| Veo 2 | Video | <1% of gens | Legacy, declining | +| LTXV (13b) | Video | <1% of gens | Legacy, declining | + +--- + +## Feature Adoption Benchmarks + +> As of March 2026. Update when dashboard data changes significantly. + +| Feature | Adoption Range | Benchmark | +|---------|---------------|-----------| +| Gen Space | 63-100% | Core workflow — nearly universal | +| Elements | 20-63% | Moderate — highest at Indegene (63%), lowest at Novig (20%) | +| Script-to-Storyboard | 0-38% | Growing — zero at McCann_Paris/Cylndr, highest at Indegene (38%) | +| Retake | 7-38% | Lowest adoption — highest at Indegene/Cylndr (38%), lowest at Deriv (7%) | diff --git a/shared/product-features.md b/shared/product-features.md new file mode 100644 index 0000000..f5b12f5 --- /dev/null +++ b/shared/product-features.md @@ -0,0 +1,161 @@ +# LTX Studio — Product Features Reference + +> Shared knowledge base for all LTX agents. +> Update this file when new features ship so agents can accurately classify +> feature requests vs. existing features and understand the competitive landscape. 
+ +--- + +## Features that EXIST on the platform + +### Image & Video Generation +- **Text-to-Image** — Generate images from text prompts +- **Text-to-Video** — Generate video clips from text prompts +- **Image-to-Video** — Upload an image and animate it into a video +- **Audio-to-Video** — Upload audio, generate matching video +- **Script-to-Storyboard** — Paste a full script, get a complete storyboard in ~10 minutes (major improvement shipped late 2025) +- **Script-to-Video** — End-to-end from script to final video +- **Image & Video Upscaling** — Enhance resolution of generated assets + +### AI Models available inside the platform + +**LTX Video & Audio Models (Lightricks proprietary):** +- **LTX-2 Pro** — High quality, balancing speed and fidelity +- **LTX-2 Fast** — High quality, optimized for speed + +**Other Video Models:** +- **Veo 3.1** — Google's leading video & audio model (long wait, extreme cost) +- **Veo 3.1 Fast** — Google's video & audio model, optimized for speed (high cost) +- **Veo 2** — Google's legacy video model +- **Kling 2.6 Pro** — Realistic video generations with audio (Standard tier+) +- **Kling 3.0 Pro** — Advanced cinematic generations (coming soon) + +**Image Models:** +- **Nano Banana 2** — High-quality, fast and cost-effective, by Google +- **Nano Banana Pro (Gemini 3)** — Studio-grade quality with multi-image fusion, by Google +- **FLUX.2 Pro** — Enhanced realism and consistency, by Black Forest Labs (Standard tier+) +- **Z-Image** — Fast, high-quality visuals by Alibaba (Free tier) + +### Storyboard Tools +- **AI Storyboard Generator** — Input text, prompts, or sketch → generates storyboard frames +- **Blank Storyboards** — Manual storyboard creation without AI generation +- **Shot Breakdown** — AI breaks script into scenes, shots, and camera details +- **Multi-level Customization** — Control at project, board, and frame levels +- **Aspect Ratio Selection** — Cinematic 2.35:1, standard 16:9, vertical 9:16 +- **Genre & Mood 
Styling** — Noir, sci-fi, comedy, horror, and mood options +- **Animated Storyboards** — Export storyboard frames as animated video + +### Character & Object Consistency (Elements) +- **Elements Feature** — Define a character or object once; system maintains appearance across all shots +- **Character Profiles** — Store age, ethnicity, hairstyle, wardrobe, and facial details +- **Auto-Extraction** — Characters and objects automatically extracted from script +- **Cross-shot Consistency** — Same character/object looks identical across all frames + +### Camera Controls +- **Advanced Camera Controls** — Control camera angle, framing, and composition +- **Camera Motion Control** — Define movement (pan, tilt, zoom, dolly) +- **Scene Composition Suggestions** — AI suggests framing for each shot + +### Editing +- **Timeline Editor** — Simplified video editor (like Final Cut) to assemble shots into a video +- **Editing Package** — Included in Lite tier and above + +### Collaboration +- **Real-time Multi-user Collaboration** — Multiple users edit the same project simultaneously (like Figma) +- **Comments on Shots** — Leave feedback on specific frames +- **Structured Review/Approval Workflow** — Approval flows between team members +- **Up to 3 Collaborators** — Pro tier +- **Unlimited Collaborators** — Enterprise tier only + +### Export & Output +- **MP4 Video Export** — Export final video +- **PDF Pitch Deck Export** — Auto-generates a full pitch deck with synopsis, character profiles, storyboard, and style references +- **Commercial Use License** — Standard tier and above + +### Enterprise Features +- **SSO (Single Sign-On)** — Enterprise only +- **Custom LTX-2 Model Training** — Fine-tune the model on brand/style (Enterprise only) +- **Multi-Brand Kit** — Manage multiple brand identities with org-level roles (Enterprise only) +- **GDPR, ISO, SOC 2 Compliance** — Enterprise only +- **Enhanced Data Privacy** — No training on customer inputs or outputs (Enterprise) +- 
**Dedicated Account Manager** — Enterprise only +- **Volume Pricing & Centralized Invoicing** — Enterprise only +- **Custom Credit Allocation** — Enterprise only + +### Platform +- **Web-based** — No download required, desktop browser +- **Credits System** — AI usage measured in credits/computing seconds +- **Purchase Additional Credits** — Standard tier and above + +--- + +## Features that do NOT exist on the platform + +> If a prospect mentions wanting these, classify as a genuine feature request. + +- **Managed / white-glove service** — Users must operate the tool themselves; no done-for-you production +- **Mobile app** — Desktop web only; no iOS or Android app +- **Adobe Premiere / After Effects integration** — No direct plugin or export to NLEs +- **Final Cut Pro integration** — No direct integration +- **DaVinci Resolve integration** — No direct integration +- **Video-to-video editing / inpainting** — Cannot edit or modify existing video footage; only generate new video +- **Automatic scene transitions** — Not automated +- **Built-in voiceover / TTS** — No text-to-speech; Audio-to-Video is different (needs audio input) +- **Multi-language dubbing / lip sync** — Not available (Veo 3.1 has AI audio but not translation dubbing) +- **Music / soundtrack generation** — No built-in music creation +- **Green screen / chroma key / compositing** — No background removal or compositing tools +- **Background removal tool** — Not a dedicated feature +- **API access for enterprise customers** — LTX Model API is a separate product (not part of LTX Studio) +- **Batch processing / bulk generation** — No automated bulk workflows +- **Version history / branching** — No project versioning +- **Offline / desktop app** — Web only, requires internet connection +- **3D asset generation** — No 3D model or scene creation +- **Voice cloning** — No custom voice replication feature +- **Custom music / brand soundtrack** — No music generation tied to brand guidelines + +--- + +## Pricing 
Tiers (for price objection context) + +| Tier | Price | Key inclusions | +|------|-------|----------------| +| Free | $0 | 800 credits one-time, personal use only, no Storyboards, no Elements | +| Lite | $15/mo | 8,640 credits, editing package, personal use only | +| Standard | $35/mo | 28,000 credits, FLUX models, Elements, AI Storyboards, Pitch Decks, commercial license | +| Pro | — | More credits, Veo 3.1, 3 collaborators, commercial license | +| Enterprise | Custom | Everything + SSO, custom model, compliance, unlimited collaborators, Brand Kit | + +--- + +## Common Competitors Mentioned in Sales Calls + +Always use the **Canonical Name** column exactly when reporting competitors. + +| Canonical Name | Also known as | Category | +|---------------|---------------|----------| +| Midjourney | Mid Journey | Image generation | +| Runway | Runway ML, Runway Gen-3 | Video generation | +| Kling | Kling AI | Video generation (also available inside LTX Studio) | +| Google Veo | Veo, Veo 2, Veo 3, Google Vertex AI Video | Video generation (also available inside LTX Studio) | +| Gemini | Google Gemini | Google multimodal AI | +| Sora | OpenAI Sora, OpenAI Video | Video generation | +| Adobe | Adobe Premiere, After Effects, Adobe Firefly, Adobe Express | Creative suite / editing | +| HiggsField | Higgs Field, HiggsField AI, Higsfield | AI video (popular in India/Southeast Asia) | +| HeyGen | Hey Gen | Avatar / talking head video | +| Synthesia | — | Avatar / talking head video | +| Pika | Pika Labs | Video generation | +| Luma AI | Luma, Dream Machine | Video generation | +| ComfyUI | Comfy UI, ComfyUI workflows | Open-source AI image/video pipeline | +| Stable Video | Stable Video Diffusion, SVD, Stability AI Video | Open-source video generation | +| D-ID | — | Talking head / avatar video | +| ElevenLabs | Eleven Labs | AI voice / audio | +| Topaz | Topaz Labs | Video upscaling | +| CapCut | Cap Cut | Short-form video editing (mobile-first) | +| DaVinci Resolve | DaVinci, Da Vinci | Professional video editing | +| Final Cut Pro | FCP, Final Cut | Traditional video editing (macOS) | +| ChatGPT | GPT-4, OpenAI | General AI / script writing | +| Descript | — | AI-powered podcast / video editing | +| Canva | Canva Video | Design / simple video | +| MiniMax | Hailuo, Hailuo AI | Video generation (Chinese) | +| Wan | Wan Video, Wan AI | Video generation (Chinese) | +| Manual storyboard artists | Traditional workflow | Outsourced / human production |
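The "Also known as" column above lends itself to a normalization step before a report is written, so output always uses the Canonical Name column. A minimal sketch — the alias map is a sample drawn from the table, not exhaustive:

```python
# Sketch: normalize competitor mentions to their canonical names before
# reporting. ALIASES is a sample of the "Also known as" column above,
# not the full table.
ALIASES = {
    "mid journey": "Midjourney",
    "runway ml": "Runway",
    "runway gen-3": "Runway",
    "kling ai": "Kling",
    "hey gen": "HeyGen",
    "eleven labs": "ElevenLabs",
    "davinci": "DaVinci Resolve",
    "da vinci": "DaVinci Resolve",
    "hailuo": "MiniMax",
}

def canonical_competitor(name: str) -> str:
    """Return the canonical competitor name, or the input unchanged if unknown."""
    key = " ".join(name.lower().split())  # lowercase, collapse whitespace
    return ALIASES.get(key, name.strip())

print(canonical_competitor("Hey Gen"))    # HeyGen
print(canonical_competitor("Runway ML"))  # Runway
```

An agent would build the full map from this table at read time; unrecognized names pass through unchanged so nothing is silently dropped from a report.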