
Commit 50bbe61: Merge pull request #2162 from mito-ds/abacus ("Abacus")

2 parents: 31e0e02 + bdff443

22 files changed: +826 −271 lines
.github/workflows/test-abacus-llm-providers.yml (new file; the workflow lists its own path in its `paths` triggers)

Lines changed: 88 additions & 0 deletions

```yaml
name: Test - Mito AI Frontend Playwright with Abacus AI

on:
  push:
    branches: [ dev ]
    paths:
      - 'mito-ai/**'
      - 'tests/llm_providers_tests/abacus_llm_providers.spec.ts'
      - '.github/workflows/test-abacus-llm-providers.yml'
  pull_request:
    paths:
      - 'mito-ai/**'
      - 'tests/llm_providers_tests/abacus_llm_providers.spec.ts'
      - '.github/workflows/test-abacus-llm-providers.yml'
  workflow_dispatch:

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

jobs:
  test-mitoai-frontend-jupyterlab-abacus:
    runs-on: ubuntu-24.04
    timeout-minutes: 60
    strategy:
      matrix:
        python-version: ['3.10', '3.12']
      fail-fast: false

    steps:
      - uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
          cache: pip
          cache-dependency-path: |
            mito-ai/setup.py
            tests/requirements.txt
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: 'npm'
          cache-dependency-path: mito-ai/package-lock.json
      - name: Upgrade pip
        run: |
          python -m pip install --upgrade pip
      - name: Install dependencies
        run: |
          cd tests
          bash mac-setup.sh
      - name: Install mitosheet-helper-enterprise
        run: |
          cd tests
          source venv/bin/activate
          pip install mitosheet-helper-enterprise
      - name: Install JupyterLab
        run: |
          python -m pip install jupyterlab
      - name: Install Node.js dependencies
        run: |
          cd mito-ai
          jlpm install
      - name: Setup JupyterLab
        run: |
          cd tests
          source venv/bin/activate
          pip install setuptools==68.0.0
          cd ../mito-ai
          jupyter labextension develop . --overwrite
          jupyter server extension enable --py mito_ai
      - name: Start a server and run Abacus AI provider tests
        run: |
          cd tests
          source venv/bin/activate
          jupyter lab --config jupyter_server_test_config.py &
          jlpm run test:abacus-llm-providers
        env:
          ABACUS_BASE_URL: ${{ secrets.ABACUS_BASE_URL }}
          ABACUS_MODELS: ${{ secrets.ABACUS_MODELS }}
          ABACUS_API_KEY: ${{ secrets.ABACUS_API_KEY }}
      - name: Upload test-results
        uses: actions/upload-artifact@v4
        if: failure()
        with:
          name: mitoai-jupyterlab-playwright-abacus-report-${{ matrix.python-version }}-${{ github.run_id }}
          path: tests/playwright-report/
          retention-days: 14
```

agent.md

Lines changed: 12 additions & 2 deletions

```diff
@@ -1,14 +1,24 @@
+
 ## Virtual Environment
 - ALWAYS activate the virtual environment before running any Python commands: `cd mito-ai && source venv/bin/activate`
 - For other components: check for venv directories and activate appropriately

 ## Engineering practices

+### Startup mentality
 We're a startup. You're probably used to writing enterprise code -- code that tries to handle every possible edge case and has fallbacks for everything. That's not how we do things around here: our number one rule is to keep things simple. We handle ONLY the most important cases.

 We try to only add new functionality that is small (that is, simple and few lines of code) or absolutely necessary. If a change is not small or absolutely necessary, don't make it.

-**Backwards-compatibility**: Since our app runs locally on a user's machine, we think about backwards compatibility very differently than enterprise code.
+### Backwards Compatibility
+Since our app runs locally on a user's machine, we think about backwards compatibility very differently than enterprise code.
 - For things like updating the environment variables structure, we don't need to worry about backwards compatibility because if the enterprise opts in to upgrading, they can update the environment variables at the same time.
 - For things like adding new tool options, we don't need to worry about backwards compatibility because only users on the newest version of the tool are going to have access to the new options.
-- For things like changing how we read in old chat histories, we DO need to worry about backwards compatibility because the chat histories live on the user's machine, so we don't have a way to migrate them to the new format without adding that migration step into the tool.
+- For things like changing how we read in old chat histories, we DO need to worry about backwards compatibility because the chat histories live on the user's machine, so we don't have a way to migrate them to the new format without adding that migration step into the tool.
+
+### Use Parameterized Tests When Appropriate
+Use `@pytest.mark.parametrize` when you have multiple test cases that:
+- Test the same function/method with different inputs
+- Have the same expected output structure
+- Follow the same test logic pattern
+- Would otherwise require repetitive test code
```
mito-ai/docs/litellm-deployment.md

Lines changed: 0 additions & 87 deletions
This file was deleted.

mito-ai/mito_ai/__init__.py

Lines changed: 5 additions & 2 deletions
```diff
@@ -9,6 +9,7 @@
 from mito_ai.app_deploy.handlers import AppDeployHandler
 from mito_ai.log.urls import get_log_urls
 from mito_ai.utils.litellm_utils import is_litellm_configured
+from mito_ai.enterprise.utils import is_abacus_configured
 from mito_ai.version_check import VersionCheckHandler
 from mito_ai.db.urls import get_db_urls
 from mito_ai.settings.urls import get_settings_urls
@@ -101,10 +102,12 @@ def _load_jupyter_server_extension(server_app) -> None:  # type: ignore

     web_app.add_handlers(host_pattern, handlers)

-    # Log enterprise mode status and LiteLLM configuration
+    # Log enterprise mode status and router configuration
     if is_enterprise():
         server_app.log.info("Enterprise mode enabled")
-        if is_litellm_configured():
+        if is_abacus_configured():
+            server_app.log.info(f"Abacus AI configured: endpoint={constants.ABACUS_BASE_URL}, models={constants.ABACUS_MODELS}")
+        elif is_litellm_configured():
             server_app.log.info(f"LiteLLM configured: endpoint={constants.LITELLM_BASE_URL}, models={constants.LITELLM_MODELS}")

     server_app.log.info("Loaded the mito_ai server extension")
```
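The new `if`/`elif` in the logging code checks Abacus before LiteLLM, so when both are configured the startup log reports Abacus. A minimal sketch of that precedence, with plain booleans standing in for the real `is_abacus_configured()` / `is_litellm_configured()` calls (the hunk above only shows logging; whether request routing elsewhere follows the same order is not visible in this diff):

```python
# Hypothetical stand-in for the if/elif chain in _load_jupyter_server_extension:
# Abacus is checked first, so it wins when both routers are configured.
def pick_router(abacus_configured: bool, litellm_configured: bool) -> str:
    if abacus_configured:
        return "abacus"
    elif litellm_configured:
        return "litellm"
    return "none"

print(pick_router(True, True))    # abacus takes precedence
print(pick_router(False, True))   # litellm
print(pick_router(False, False))  # none
```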

mito-ai/mito_ai/_version.py

Lines changed: 0 additions & 3 deletions
```diff
@@ -1,6 +1,3 @@
-# Copyright (c) Saga Inc.
-# Distributed under the terms of the GNU Affero General Public License v3.0 License.
-
 # This file is auto-generated by Hatchling. As such, do not:
 # - modify
 # - track in version control e.g. be sure to add to .gitignore
```

mito-ai/mito_ai/constants.py

Lines changed: 25 additions & 3 deletions
```diff
@@ -2,7 +2,7 @@
 # Distributed under the terms of the GNU Affero General Public License v3.0 License.

 import os
-from typing import Union
+from typing import Union, List

 # Claude
 ANTHROPIC_API_KEY = os.environ.get("ANTHROPIC_API_KEY")
@@ -23,12 +23,34 @@
 AZURE_OPENAI_ENDPOINT = os.environ.get("AZURE_OPENAI_ENDPOINT")
 AZURE_OPENAI_MODEL = os.environ.get("AZURE_OPENAI_MODEL")

+def parse_comma_separated_models(models_str: str) -> List[str]:
+    """
+    Parse a comma-separated string of model names into a list.
+    Handles quoted and unquoted values, stripping whitespace and quotes.
+
+    Args:
+        models_str: Comma-separated string of model names (e.g., "model1,model2" or '"model1","model2"')
+
+    Returns:
+        List of model names with whitespace and quotes stripped
+    """
+    if not models_str:
+        return []
+    return [model.strip().strip('"\'') for model in models_str.split(",") if model.strip()]
+
 # LiteLLM Config (Enterprise mode only)
 LITELLM_BASE_URL = os.environ.get("LITELLM_BASE_URL")
 LITELLM_API_KEY = os.environ.get("LITELLM_API_KEY")
 LITELLM_MODELS_STR = os.environ.get("LITELLM_MODELS", "")
-# Parse comma-separated string into list, strip whitespace
-LITELLM_MODELS = [model.strip() for model in LITELLM_MODELS_STR.split(",") if model.strip()] if LITELLM_MODELS_STR else []
+# Parse comma-separated string into list, strip whitespace and quotes
+LITELLM_MODELS = parse_comma_separated_models(LITELLM_MODELS_STR)
+
+# Abacus AI Config (Enterprise mode only)
+ABACUS_BASE_URL = os.environ.get("ABACUS_BASE_URL")
+ABACUS_API_KEY = os.environ.get("ABACUS_API_KEY")
+ABACUS_MODELS_STR = os.environ.get("ABACUS_MODELS", "")
+# Parse comma-separated string into list, strip whitespace and quotes
+ABACUS_MODELS = parse_comma_separated_models(ABACUS_MODELS_STR)

 # Mito AI Base URLs and Endpoint Paths
 MITO_PROD_BASE_URL = "https://7eax4i53f5odkshhlry4gw23by0yvnuv.lambda-url.us-east-1.on.aws/v2"
```
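The new helper normalizes the various ways an operator might write the `LITELLM_MODELS`/`ABACUS_MODELS` env vars. Reproducing it from the diff above, the behavior on unquoted, quoted, and empty inputs looks like this:

```python
from typing import List

# The helper added to mito_ai/constants.py in this commit, copied verbatim
# from the diff so it can be exercised standalone.
def parse_comma_separated_models(models_str: str) -> List[str]:
    """Parse a comma-separated string of model names, stripping whitespace
    and surrounding single/double quotes; empty entries are dropped."""
    if not models_str:
        return []
    return [model.strip().strip('"\'') for model in models_str.split(",") if model.strip()]

# Unquoted, quoted, and mixed inputs all normalize to the same list:
print(parse_comma_separated_models("gpt-4o, claude-sonnet"))        # ['gpt-4o', 'claude-sonnet']
print(parse_comma_separated_models('"gpt-4o", \'claude-sonnet\''))  # ['gpt-4o', 'claude-sonnet']
print(parse_comma_separated_models(""))                             # []
```

The quote-stripping is the point of the change: previously `LITELLM_MODELS='"model1","model2"'` would have kept the quote characters inside each model name.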

mito-ai/mito_ai/enterprise/litellm_client.py

Lines changed: 12 additions & 5 deletions
```diff
@@ -11,6 +11,7 @@
     CompletionItem,
 )
 from mito_ai.utils.litellm_utils import get_litellm_completion_function_params
+from mito_ai.utils.model_utils import strip_router_prefix
 import litellm

 class LiteLLMClient:
@@ -28,7 +29,7 @@ def __init__(self, api_key: Optional[str], base_url: str, timeout: int = 30, max
     async def request_completions(
         self,
         messages: List[ChatCompletionMessageParam],
-        model: str, # Should include provider prefix (e.g., "openai/gpt-4o")
+        model: str, # Should include provider prefix (e.g., "LiteLLM/openai/gpt-4o")
         response_format_info: Optional[ResponseFormatInfo] = None,
         message_type: MessageType = MessageType.CHAT
     ) -> str:
@@ -37,16 +38,19 @@

         Args:
             messages: List of chat messages
-            model: Model name with provider prefix (e.g., "openai/gpt-4o")
+            model: Model name with router and provider prefix (e.g., "LiteLLM/openai/gpt-4o")
             response_format_info: Optional response format specification
             message_type: Type of message (chat, agent execution, etc.)

         Returns:
             The completion text response
         """
+        # Strip router prefix if present (LiteLLM/ prefix)
+        model_for_litellm = strip_router_prefix(model)
+
         # Prepare parameters for LiteLLM
         params = get_litellm_completion_function_params(
-            model=model,
+            model=model_for_litellm,
             messages=messages,
             api_key=self.api_key,
             api_base=self.base_url,
@@ -82,7 +86,7 @@

         Args:
             messages: List of chat messages
-            model: Model name with provider prefix (e.g., "openai/gpt-4o")
+            model: Model name with router and provider prefix (e.g., "LiteLLM/openai/gpt-4o")
             message_type: Type of message (chat, agent execution, etc.)
             message_id: ID of the message being processed
             reply_fn: Function to call with each chunk for streaming replies
@@ -93,9 +97,12 @@
         """
         accumulated_response = ""

+        # Strip router prefix if present (LiteLLM/ prefix)
+        model_for_litellm = strip_router_prefix(model)
+
         # Prepare parameters for LiteLLM
         params = get_litellm_completion_function_params(
-            model=model,
+            model=model_for_litellm,
             messages=messages,
             api_key=self.api_key,
             api_base=self.base_url,
```
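The diff imports `strip_router_prefix` from `mito_ai.utils.model_utils`, but that helper's body is not part of this commit. Based on the comments above ("Strip router prefix if present (LiteLLM/ prefix)"), a minimal equivalent would be, hypothetically:

```python
# Guess at a minimal strip_router_prefix: drop a leading "LiteLLM/" router
# prefix, leaving the provider-prefixed model name litellm itself expects.
# The real mito_ai.utils.model_utils implementation may differ.
def strip_router_prefix(model: str) -> str:
    prefix = "LiteLLM/"
    if model.startswith(prefix):
        return model[len(prefix):]
    return model

print(strip_router_prefix("LiteLLM/openai/gpt-4o"))  # openai/gpt-4o
print(strip_router_prefix("openai/gpt-4o"))          # unchanged
```

This explains the model-naming change in the docstrings: the frontend now sends router-qualified names like `"LiteLLM/openai/gpt-4o"`, and the client strips the router segment before handing the rest to `litellm`.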

mito-ai/mito_ai/enterprise/utils.py

Lines changed: 16 additions & 2 deletions
```diff
@@ -5,11 +5,25 @@
 # Distributed under the terms of the The Mito Enterprise license.

 from mito_ai.utils.version_utils import is_enterprise, is_mitosheet_private
-from mito_ai.constants import AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_VERSION, AZURE_OPENAI_MODEL
+from mito_ai.constants import (
+    AZURE_OPENAI_API_KEY,
+    AZURE_OPENAI_ENDPOINT,
+    AZURE_OPENAI_API_VERSION,
+    AZURE_OPENAI_MODEL,
+    ABACUS_BASE_URL,
+    ABACUS_MODELS
+)

 def is_azure_openai_configured() -> bool:
     """
     Azure OpenAI is only supported for Mito Enterprise users
     """
     is_allowed_to_use_azure = is_enterprise() or is_mitosheet_private()
-    return all([is_allowed_to_use_azure, AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_VERSION, AZURE_OPENAI_MODEL])
+    return all([is_allowed_to_use_azure, AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_VERSION, AZURE_OPENAI_MODEL])
+
+def is_abacus_configured() -> bool:
+    """
+    Abacus AI is only supported for Mito Enterprise users.
+    Checks if Abacus AI is configured with base URL and models.
+    """
+    return all([is_enterprise(), ABACUS_BASE_URL, ABACUS_MODELS])
```
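`is_abacus_configured` relies on truthiness: `ABACUS_BASE_URL` is `None` when the env var is unset, and `ABACUS_MODELS` is an empty list when `ABACUS_MODELS` is unset or blank, so either condition disables the Abacus router. A sketch with the three values passed in explicitly (the function name and example URL here are ours, not the repo's):

```python
# Stand-in for is_abacus_configured with its inputs made explicit; all()
# treats None and [] as falsy, so a missing URL or empty model list fails.
def is_abacus_configured_sketch(enterprise, base_url, models):
    return all([enterprise, base_url, models])

print(is_abacus_configured_sketch(True, "https://example.invalid", ["m1"]))  # True
print(is_abacus_configured_sketch(True, None, ["m1"]))                       # False: no base URL
print(is_abacus_configured_sketch(True, "https://example.invalid", []))      # False: no models
print(is_abacus_configured_sketch(False, "https://example.invalid", ["m1"])) # False: not enterprise
```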

0 comments