Commit f2dcb6f

Merge remote-tracking branch 'upstream/main' into fix/stream-restore-on-thread-navigation
2 parents cbca760 + a9087fd commit f2dcb6f

21 files changed: +1776, -1771 lines

.cursor/rules/suna-project.mdc

Lines changed: 0 additions & 1 deletion

@@ -70,7 +70,6 @@ project/
 - **LLM Integration**: LiteLLM for multi-provider support, structured prompts
 - **Tool System**: Dual schema decorators (OpenAPI + XML), consistent ToolResult
 - **Real-time**: Supabase subscriptions for live updates
-- **Background Jobs**: Dramatiq for async processing, QStash for scheduling

 ## Key Technologies

CONTRIBUTING.md

Lines changed: 0 additions & 1 deletion

@@ -41,7 +41,6 @@ Before contributing, ensure you have access to:
 - Daytona account (for agent execution)
 - Tavily API key (for search)
 - Firecrawl API key (for web scraping)
-- QStash account (for background jobs)

 **Optional:**
README.md

Lines changed: 1 addition & 3 deletions

@@ -99,7 +99,6 @@ The setup process includes:
 - Setting up Daytona for secure agent execution
 - Integrating with LLM providers (Anthropic, OpenAI, OpenRouter, etc.)
 - Configuring web search and scraping capabilities (Tavily, Firecrawl)
-- Setting up QStash for background job processing and workflows
 - Configuring webhook handling for automated tasks
 - Optional integrations (RapidAPI for data providers)

@@ -147,14 +146,13 @@ We welcome contributions from the community! Please see our [Contributing Guide]
 ### Technologies

 - [Daytona](https://daytona.io/) - Secure agent execution environment
-- [Supabase](https://supabase.com/) - Database and authentication
+- [Supabase](https://supabase.com/) - Database, Cron, and Authentication
 - [Playwright](https://playwright.dev/) - Browser automation
 - [OpenAI](https://openai.com/) - LLM provider
 - [Anthropic](https://www.anthropic.com/) - LLM provider
 - [Morph](https://morphllm.com/) - For AI-powered code editing
 - [Tavily](https://tavily.com/) - Search capabilities
 - [Firecrawl](https://firecrawl.dev/) - Web scraping capabilities
-- [QStash](https://upstash.com/qstash) - Background job processing and workflows
 - [RapidAPI](https://rapidapi.com/) - API services
 - Custom MCP servers - Extend functionality with custom tools

backend/.env.example

Lines changed: 0 additions & 5 deletions

@@ -54,11 +54,6 @@ SMITHERY_API_KEY=

 MCP_CREDENTIAL_ENCRYPTION_KEY=

-QSTASH_URL="https://qstash.upstash.io"
-QSTASH_TOKEN=""
-QSTASH_CURRENT_SIGNING_KEY=""
-QSTASH_NEXT_SIGNING_KEY=""
-
 WEBHOOK_BASE_URL=""

 # Optional

backend/README.md

Lines changed: 0 additions & 5 deletions

@@ -96,11 +96,6 @@ DAYTONA_API_KEY=your-daytona-key
 DAYTONA_SERVER_URL=https://app.daytona.io/api
 DAYTONA_TARGET=us

-# Background Job Processing (Required)
-QSTASH_URL=https://qstash.upstash.io
-QSTASH_TOKEN=your-qstash-token
-QSTASH_CURRENT_SIGNING_KEY=your-current-signing-key
-QSTASH_NEXT_SIGNING_KEY=your-next-signing-key
 WEBHOOK_BASE_URL=https://yourdomain.com

 # MCP Configuration

backend/agent/prompt.py

Lines changed: 8 additions & 18 deletions

@@ -533,15 +533,13 @@

 **CRITICAL EXECUTION ORDER RULES:**
 1. **SEQUENTIAL EXECUTION ONLY:** You MUST execute tasks in the exact order they appear in the Task List
-2. **ONE TASK AT A TIME:** Never execute multiple tasks simultaneously or in bulk
+2. **ONE TASK AT A TIME:** Never execute multiple tasks simultaneously or in bulk, but you can update multiple tasks in a single call
 3. **COMPLETE BEFORE MOVING:** Finish the current task completely before starting the next one
 4. **NO SKIPPING:** Do not skip tasks or jump ahead - follow the list strictly in order
 5. **NO BULK OPERATIONS:** Never do multiple web searches, file operations, or tool calls at once
 6. **ASK WHEN UNCLEAR:** If you encounter ambiguous results or unclear information during task execution, stop and ask for clarification before proceeding
 7. **DON'T ASSUME:** When tool results are unclear or don't match expectations, ask the user for guidance rather than making assumptions
-8. **MANDATORY TASK COMPLETION:** After completing each task, IMMEDIATELY update it to "completed" status before proceeding to the next task
-9. **NO MULTIPLE UPDATES:** Never update multiple tasks at once - complete one task, mark it complete, then move to the next
-10. **VERIFICATION REQUIRED:** Only mark a task as complete when you have concrete evidence of completion
+8. **VERIFICATION REQUIRED:** Only mark a task as complete when you have concrete evidence of completion

 **🔴 CRITICAL WORKFLOW EXECUTION RULES - NO INTERRUPTIONS 🔴**
 **WORKFLOWS MUST RUN TO COMPLETION WITHOUT STOPPING!**

@@ -596,12 +594,11 @@
 **MANDATORY EXECUTION CYCLE:**
 1. **IDENTIFY NEXT TASK:** Use view_tasks to see which task is next in sequence
 2. **EXECUTE SINGLE TASK:** Work on exactly one task until it's fully complete
-3. **UPDATE TO COMPLETED:** Immediately mark the completed task as "completed" using update_tasks
-4. **MOVE TO NEXT:** Only after marking the current task complete, move to the next task
-5. **REPEAT:** Continue this cycle until all tasks are complete
-6. **SIGNAL COMPLETION:** Use 'complete' or 'ask' when all tasks are finished
-
-**CRITICAL: NEVER execute multiple tasks simultaneously or update multiple tasks at once. Always complete one task fully, mark it complete, then move to the next.**
+3. **THINK ABOUT BATCHING:** Before updating, consider if you have completed multiple tasks that can be batched into a single update call
+4. **UPDATE TO COMPLETED:** Update the status of completed task(s) to 'completed'. EFFICIENT APPROACH: Batch multiple completed tasks into one update call rather than making multiple consecutive calls
+5. **MOVE TO NEXT:** Only after marking the current task complete, move to the next task
+6. **REPEAT:** Continue this cycle until all tasks are complete
+7. **SIGNAL COMPLETION:** Use 'complete' or 'ask' when all tasks are finished

 **HANDLING AMBIGUOUS RESULTS DURING TASK EXECUTION:**
 1. **WORKFLOW CONTEXT MATTERS:**

@@ -672,7 +669,7 @@
 4. **EXECUTION:** Wait for tool execution and observe results
 5. **TASK COMPLETION:** Verify the current task is fully completed before moving to the next
 6. **NARRATIVE UPDATE:** Provide **Markdown-formatted** narrative updates explaining what was accomplished and what's next
-7. **PROGRESS TRACKING:** Mark current task complete, update Task List with any new tasks needed
+7. **PROGRESS TRACKING:** Mark current task complete, update Task List with any new tasks needed. EFFICIENT APPROACH: Consider batching multiple completed tasks into a single update call
 8. **NEXT TASK:** Move to the next task in sequence - NEVER skip ahead or do multiple tasks at once
 9. **METHODICAL ITERATION:** Repeat this cycle for each task in order until all tasks are complete
 10. **COMPLETION:** IMMEDIATELY use 'complete' or 'ask' when ALL tasks are finished

@@ -717,13 +714,6 @@
 - Technical documentation or guides
 - Any content that would be better as an editable artifact

-**CRITICAL FILE CREATION RULES:**
-- **ONE FILE PER REQUEST:** For a single user request, create ONE file and edit it throughout the entire process
-- **EDIT LIKE AN ARTIFACT:** Treat the file as a living document that you continuously update and improve
-- **APPEND AND UPDATE:** Add new sections, update existing content, and refine the file as you work
-- **NO MULTIPLE FILES:** Never create separate files for different parts of the same request
-- **COMPREHENSIVE DOCUMENT:** Build one comprehensive file that contains all related content
-
 **CRITICAL FILE CREATION RULES:**
 - **ONE FILE PER REQUEST:** For a single user request, create ONE file and edit it throughout the entire process
 - **EDIT LIKE AN ARTIFACT:** Treat the file as a living document that you continuously update and improve

backend/agent/tools/task_list_tool.py

Lines changed: 5 additions & 5 deletions

@@ -345,7 +345,7 @@ async def create_tasks(self, sections: Optional[List[Dict[str, Any]]] = None,
         "type": "function",
         "function": {
             "name": "update_tasks",
-            "description": "Update one or more tasks. Can update content, status, or move tasks between sections. IMPORTANT: Follow the one-task-at-a-time execution principle. After completing each individual task, immediately update it to 'completed' status before proceeding to the next task. This ensures proper progress tracking and prevents bulk operations that violate the sequential execution workflow.",
+            "description": "Update one or more tasks. EFFICIENT BATCHING: Before calling this tool, think about what tasks you have completed and batch them into a single update call. This is more efficient than making multiple consecutive update calls. Always execute tasks in the exact sequence they appear, but batch your updates when possible. Update task status to 'completed' after finishing each task, and consider batching multiple completed tasks into one call rather than updating them individually.",
             "parameters": {
                 "type": "object",
                 "properties": {

@@ -354,7 +354,7 @@ async def create_tasks(self, sections: Optional[List[Dict[str, Any]]] = None,
                         {"type": "string"},
                         {"type": "array", "items": {"type": "string"}, "minItems": 1}
                     ],
-                    "description": "Task ID (string) or array of task IDs to update. For optimal workflow, prefer updating single tasks to 'completed' status immediately after completion rather than bulk updates."
+                    "description": "Task ID (string) or array of task IDs to update. EFFICIENT APPROACH: Batch multiple completed tasks into a single call rather than making multiple consecutive update calls. Always maintain sequential execution order."
                 },
                 "content": {
                     "type": "string",

@@ -363,7 +363,7 @@ async def create_tasks(self, sections: Optional[List[Dict[str, Any]]] = None,
                 "status": {
                     "type": "string",
                     "enum": ["pending", "completed", "cancelled"],
-                    "description": "New status for the task(s) (optional). Use 'completed' immediately after finishing each individual task to maintain proper execution flow."
+                    "description": "New status for the task(s) (optional). Set to 'completed' for finished tasks. Batch multiple completed tasks when possible."
                 },
                 "section_id": {
                     "type": "string",

@@ -376,15 +376,15 @@ async def create_tasks(self, sections: Optional[List[Dict[str, Any]]] = None,
     })
     @usage_example(
     '''
-    # Update single task:
+    # Update single task (when only one task is completed):
     <function_calls>
     <invoke name="update_tasks">
     <parameter name="task_ids">task-uuid-here</parameter>
     <parameter name="status">completed</parameter>
     </invoke>
     </function_calls>

-    # Update multiple tasks:
+    # Update multiple tasks (EFFICIENT: batch multiple completed tasks):
     <function_calls>
     <invoke name="update_tasks">
     <parameter name="task_ids">["task-id-1", "task-id-2", "task-id-3"]</parameter>
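The `task_ids` parameter above accepts either a single ID string or a non-empty array of IDs (the `anyOf` schema). For intuition, here is a hypothetical Python helper, not part of the tool itself, that normalizes both accepted shapes into a list the way the schema describes:

```python
def coerce_task_ids(task_ids):
    """Normalize the anyOf(string, array-of-strings) parameter shape
    into a list of task IDs, rejecting anything else."""
    if isinstance(task_ids, str):
        # Single task ID passed as a bare string
        return [task_ids]
    if (isinstance(task_ids, list) and task_ids
            and all(isinstance(t, str) for t in task_ids)):
        # Batched update: several completed tasks in one call
        return list(task_ids)
    raise ValueError("task_ids must be a string or a non-empty list of strings")

print(coerce_task_ids("task-id-1"))                 # ['task-id-1']
print(coerce_task_ids(["task-id-1", "task-id-2"]))  # ['task-id-1', 'task-id-2']
```

Batching several completed tasks into one array is exactly the pattern the revised description encourages.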

backend/pyproject.toml

Lines changed: 0 additions & 1 deletion

@@ -58,7 +58,6 @@ dependencies = [
     "cryptography>=41.0.0",
     "apscheduler>=3.10.0",
     "croniter>=1.4.0",
-    "qstash>=2.0.0",
     "structlog==25.4.0",
     "PyPDF2==3.0.1",
     "python-docx==1.1.0",
(new file)

Lines changed: 96 additions & 0 deletions

@@ -0,0 +1,96 @@
+-- Enable Supabase Cron and HTTP extensions and provide helper RPCs
+-- This migration replaces QStash-based scheduling with Supabase Cron
+
+BEGIN;
+
+-- Enable required extensions if not already enabled
+CREATE EXTENSION IF NOT EXISTS pg_cron;
+CREATE EXTENSION IF NOT EXISTS pg_net;
+
+-- Helper function to schedule an HTTP POST via Supabase Cron
+-- Overwrites existing job with the same name
+CREATE OR REPLACE FUNCTION public.schedule_trigger_http(
+    job_name text,
+    schedule text,
+    url text,
+    headers jsonb DEFAULT '{}'::jsonb,
+    body jsonb DEFAULT '{}'::jsonb,
+    timeout_ms integer DEFAULT 8000
+) RETURNS bigint
+LANGUAGE plpgsql
+SECURITY DEFINER
+AS $$
+DECLARE
+    job_id bigint;
+    sql_text text;
+    headers_fixed jsonb;
+    body_fixed jsonb;
+BEGIN
+    -- Unschedule any existing jobs with the same name
+    PERFORM cron.unschedule(j.jobid)
+    FROM cron.job j
+    WHERE j.jobname = job_name;
+
+    -- Normalize headers/body in case callers pass JSON strings instead of objects
+    headers_fixed := COALESCE(
+        CASE
+            WHEN headers IS NULL THEN '{}'::jsonb
+            WHEN jsonb_typeof(headers) = 'object' THEN headers
+            WHEN jsonb_typeof(headers) = 'string' THEN (
+                -- Remove surrounding quotes then unescape to get raw JSON text, finally cast to jsonb
+                replace(replace(trim(both '"' from headers::text), '\\"', '"'), '\\\\', '\\')
+            )::jsonb
+            ELSE '{}'::jsonb
+        END,
+        '{}'::jsonb
+    );
+
+    body_fixed := COALESCE(
+        CASE
+            WHEN body IS NULL THEN '{}'::jsonb
+            WHEN jsonb_typeof(body) = 'object' THEN body
+            WHEN jsonb_typeof(body) = 'string' THEN (
+                replace(replace(trim(both '"' from body::text), '\\"', '"'), '\\\\', '\\')
+            )::jsonb
+            ELSE body
+        END,
+        '{}'::jsonb
+    );
+
+    -- Build the SQL snippet to be executed by pg_cron
+    sql_text := format(
+        $sql$select net.http_post(
+            url := %L,
+            headers := %L::jsonb,
+            body := %L::jsonb,
+            timeout_milliseconds := %s
+        );$sql$,
+        url,
+        headers_fixed::text,
+        body_fixed::text,
+        timeout_ms
+    );
+
+    job_id := cron.schedule(job_name, schedule, sql_text);
+    RETURN job_id;
+END;
+$$;
+
+-- Helper to unschedule by job name
+CREATE OR REPLACE FUNCTION public.unschedule_job_by_name(job_name text)
+RETURNS void
+LANGUAGE plpgsql
+SECURITY DEFINER
+AS $$
+BEGIN
+    PERFORM cron.unschedule(j.jobid)
+    FROM cron.job j
+    WHERE j.jobname = job_name;
+END;
+$$;
+
+-- Grant execute to service role (backend uses service role key)
+GRANT EXECUTE ON FUNCTION public.schedule_trigger_http(text, text, text, jsonb, jsonb, integer) TO service_role;
+GRANT EXECUTE ON FUNCTION public.unschedule_job_by_name(text) TO service_role;
+
+COMMIT;
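The string-normalization branch in `schedule_trigger_http` exists because callers sometimes pass `headers`/`body` as JSON-encoded (or doubly-encoded) strings rather than objects. A rough Python sketch of the same idea, illustrative only; the migration implements it differently, with `trim`/`replace` and a `::jsonb` cast:

```python
import json

def normalize_jsonb(value) -> dict:
    """Accept a dict, a JSON-encoded string, or a doubly-encoded
    JSON string, and return a plain dict (empty dict as fallback)."""
    if value is None:
        return {}
    if isinstance(value, dict):
        return value
    if isinstance(value, str):
        decoded = json.loads(value)
        # A doubly-encoded payload decodes to a string first;
        # decode once more to reach the underlying object
        if isinstance(decoded, str):
            decoded = json.loads(decoded)
        return decoded if isinstance(decoded, dict) else {}
    return {}

print(normalize_jsonb({"a": 1}))          # {'a': 1}
print(normalize_jsonb('{"a": 1}'))        # {'a': 1}
print(normalize_jsonb('"{\\"a\\": 1}"'))  # {'a': 1}
```

Defensive normalization like this keeps the scheduled `net.http_post` call from receiving a quoted string where pg_cron expects a jsonb object.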

backend/triggers/api.py

Lines changed: 13 additions & 0 deletions

@@ -6,6 +6,7 @@
 import uuid
 from datetime import datetime, timezone
 import json
+import hmac

 from services.supabase import DBConnection
 from utils.auth_utils import get_current_user_id_from_jwt

@@ -452,6 +453,18 @@ async def trigger_webhook(
         raise HTTPException(status_code=403, detail="Agent triggers are not enabled")

     try:
+        # Simple header-based auth using a shared secret
+        # Configure the secret via environment variable: TRIGGER_WEBHOOK_SECRET
+        secret = os.getenv("TRIGGER_WEBHOOK_SECRET")
+        if not secret:
+            logger.error("TRIGGER_WEBHOOK_SECRET is not configured")
+            raise HTTPException(status_code=500, detail="Webhook secret not configured")
+
+        incoming_secret = request.headers.get("x-trigger-secret", "")
+        if not hmac.compare_digest(incoming_secret, secret):
+            logger.warning(f"Invalid webhook secret for trigger {trigger_id}")
+            raise HTTPException(status_code=401, detail="Unauthorized")
+
         # Get raw data from request
         raw_data = {}
         try:
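The shared-secret check added above uses `hmac.compare_digest` for a constant-time comparison, which avoids leaking the secret's contents through timing differences. A minimal standalone sketch mirroring the diff's logic (`TRIGGER_WEBHOOK_SECRET` and the `x-trigger-secret` header come from the diff; the secret value below is illustrative):

```python
import hmac
import os

def verify_trigger_secret(headers: dict) -> bool:
    """Constant-time check of the x-trigger-secret header against
    the TRIGGER_WEBHOOK_SECRET environment variable."""
    secret = os.getenv("TRIGGER_WEBHOOK_SECRET")
    if not secret:
        raise RuntimeError("TRIGGER_WEBHOOK_SECRET is not configured")
    incoming = headers.get("x-trigger-secret", "")
    return hmac.compare_digest(incoming, secret)

os.environ["TRIGGER_WEBHOOK_SECRET"] = "s3cret"  # illustrative value
print(verify_trigger_secret({"x-trigger-secret": "s3cret"}))  # True
print(verify_trigger_secret({"x-trigger-secret": "wrong"}))   # False
```

A Supabase Cron job created by `schedule_trigger_http` would then include this header in its `headers` jsonb so that pg_net's POST passes the check.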

0 commit comments
