Commit 19fef62

SeaTable AI Docs (#273)
* Docs for deploying SeaTable AI
* update descriptions of using custom llm
* update seatable-ai standalone deployment
* update seatable-ai deployment
* update seatable-ai deployment
* add DISABLE_CONTEXT
* support custom temperature
* update seatable-ai docs
* remove no-merged content
* Improve SeaTable AI docs
* Split SeaTable AI docs into two pages
* Move AI token pricing section to a dedicated page
* Improve section on AI credits

Co-authored-by: Simon Hammes <[email protected]>
1 parent e4bab94 commit 19fef62

File tree

5 files changed: +238 −0 lines changed

docs/configuration/roles-and-permissions.md

Lines changed: 1 addition & 0 deletions
@@ -45,6 +45,7 @@ The following quotas are supported in user roles:

 | scripts_running_limit | 2.3 | Total number of _Python_ scripts run within a month: 100 means 100 script runs per month; -1 means unlimited script runs | The script run counter is reset at the beginning of every month. |
 | snapshot_days | 2.1 | Retention period for snapshots in days: 180 means a storage period of 180 days; no value means an unlimited retention period | Snapshots older than the retention period are automatically removed. |
 | share_limit | | Max number of users a base can be shared with: 100 means a base can be shared with 100 users | |
+| ai_credit_per_user | 6.0 | Maximum AI credit quota per user per month, i.e. the maximum number of tokens that can be consumed in a single month, converted into a credit amount. In team mode, the total quota is shared within the team. `-1` means unlimited quota. | |

### Standard User Roles
docs/installation/advanced/seatable-ai-standalone.md

Lines changed: 86 additions & 0 deletions

@@ -0,0 +1,86 @@
# Standalone Deployment of SeaTable AI

This guide describes the standalone deployment of `seatable-ai` on a dedicated server or virtual machine.

## Prerequisites

- You have successfully installed [Docker and Docker-Compose](../basic-setup.md#install-docker-and-docker-compose-plugin)
- You have [downloaded the latest `.yml` files](../basic-setup.md#1-create-basic-structure) from the `seatable-release` GitHub repository
- The hosts destined to run `seatable-ai` and the other SeaTable components are attached to the same private network

## SeaTable AI Configuration

The following section outlines an `.env` file with the settings needed to run `seatable-ai`.
These changes should be made inside `/opt/seatable-compose/.env`:

```ini
COMPOSE_FILE='seatable-ai-standalone.yml'
COMPOSE_PATH_SEPARATOR=','

# system settings
TIME_ZONE='Europe/Berlin'

# database
MARIADB_HOST=
MARIADB_PORT=3306
MARIADB_PASSWORD=

# redis
REDIS_HOST=
REDIS_PORT=6379
REDIS_PASSWORD=

# This private key must have the same value as the JWT_PRIVATE_KEY variable on the other SeaTable nodes
JWT_PRIVATE_KEY=

# Public URL of your SeaTable server
SEATABLE_SERVER_URL=https://seatable.your-domain.com

# Cluster-internal URL of dtable-server
INNER_DTABLE_SERVER_URL=http://dtable-server:5000

# Cluster-internal URL of dtable-db
INNER_DTABLE_DB_URL=http://dtable-db:7777

# LLM
SEATABLE_AI_LLM_TYPE=openai
SEATABLE_AI_LLM_URL=
SEATABLE_AI_LLM_KEY=...
SEATABLE_AI_LLM_MODEL=gpt-4o-mini # recommended
```
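The `JWT_PRIVATE_KEY` must be identical across all SeaTable nodes. If you still need to generate one, any sufficiently random string works; the `openssl` invocation below is one possible way to produce such a value (a suggestion, not a mandated method):

```shell
# Generate a 40-character random hex secret suitable for JWT_PRIVATE_KEY
openssl rand -hex 20
```

Copy the generated value into the `.env` file on every node.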

!!! warning

    - If you are not using password authentication for Redis, do not specify a value after the equals sign (`=`) of the `REDIS_PASSWORD` variable.
      Specifying an empty string (e.g. `REDIS_PASSWORD=""`) will cause problems.

    - By default, the ports of `dtable-server` (5000) and `dtable-db` (7777) are not exposed to the host. Exposing them requires a manual change inside the `.yml` file.

### LLM Provider Configuration

Please refer to the documentation on [configuring your LLM provider of choice](../components/seatable-ai.md#llm-provider-configuration).
These configuration details do not change depending on the deployment topology of `seatable-server` and `seatable-ai`.

### Start SeaTable AI

You can now start SeaTable AI by running the following commands inside your terminal:

```bash
cd /opt/seatable-compose
docker compose up -d
```

## Configuration of SeaTable Server

Since `seatable-ai` is now running on a separate host or virtual machine, the following configuration changes must be made inside the `.env` file on the host running the `seatable-server` container:

```ini
ENABLE_SEATABLE_AI=true
SEATABLE_AI_SERVER_URL='http://seatable-ai.example.com:8888'
```

Restart the `seatable-server` service and test your SeaTable AI setup:

```bash
cd /opt/seatable-compose
docker compose up -d
```
docs/installation/advanced/seatable-ai-token-pricing.md

Lines changed: 32 additions & 0 deletions

@@ -0,0 +1,32 @@
# AI Token Pricing

## AI Credits

AI credits serve as an internal unit of currency for measuring AI-related usage within SeaTable.
They are directly linked to the number of tokens consumed by AI-based features, according to the [configured price](#pricing-configuration) of each AI model.

SeaTable supports role-based AI credit limits via the `ai_credit_per_user` option on a user role.
Please refer to the documentation on [user quotas](../../configuration/roles-and-permissions.md#user-quotas) for more details.

!!! note "`ai_credit_per_user` for organization users"

    AI credits are shared across all users inside a SeaTable organization. The total number of credits can be calculated by multiplying the value of `ai_credit_per_user` by the number of team users.

    **Example:** Setting `ai_credit_per_user` to `2` gives a team with 10 members 20 AI credits in total.
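The team calculation above can be sketched in a few lines of Python (a purely illustrative helper, not SeaTable's actual quota code):

```python
def total_team_credits(ai_credit_per_user: float, team_size: int) -> float:
    """Total monthly AI credits shared by an organization."""
    # -1 means unlimited, so pass it through unchanged
    if ai_credit_per_user == -1:
        return -1
    return ai_credit_per_user * team_size

print(total_team_credits(2, 10))  # → 20
```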

## Pricing Configuration

In order to accurately track the number of AI credits used by users and organizations, you must configure token pricing inside `/opt/seatable-server/seatable/conf/dtable_web_settings.py`.
This is achieved by configuring the `AI_PRICES` variable, a dictionary that maps model identifiers (e.g. `gpt-4o-mini`) to token prices **per thousand tokens**:

```py
AI_PRICES = {
    "gpt-4o-mini": {
        "input_tokens_1k": 0.01827,  # price / 1000 tokens
        "output_tokens_1k": 0.07309  # price / 1000 tokens
    },
}
```
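To illustrate how such a price table translates token usage into credits, here is a small sketch (the `credits_used` helper and the conversion are illustrative assumptions, not SeaTable's internal accounting code):

```python
AI_PRICES = {
    "gpt-4o-mini": {
        "input_tokens_1k": 0.01827,   # price per 1000 input tokens
        "output_tokens_1k": 0.07309,  # price per 1000 output tokens
    },
}

def credits_used(model: str, input_tokens: int, output_tokens: int) -> float:
    """Hypothetical helper: convert token counts into AI credits."""
    prices = AI_PRICES[model]
    return (input_tokens / 1000) * prices["input_tokens_1k"] + (
        output_tokens / 1000
    ) * prices["output_tokens_1k"]

# A request consuming 2000 input and 500 output tokens costs
# 2 * 0.01827 + 0.5 * 0.07309 = 0.073085 credits under this price table.
print(credits_used("gpt-4o-mini", 2000, 500))
```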

!!! warning "Model Identifiers"

    The dictionary key must match **the exact value** of the chosen AI model, which is configured through the `SEATABLE_AI_LLM_MODEL` variable inside your `.env` file.
    In case of a mismatch, AI usage will not count towards any configured credit limits!
docs/installation/components/seatable-ai.md

Lines changed: 115 additions & 0 deletions

@@ -0,0 +1,115 @@
# SeaTable AI Integration

<!-- md:version 6.0 -->

SeaTable AI is a SeaTable extension that integrates AI functionality into SeaTable.
Deploying SeaTable AI allows users to execute AI-based automation steps within SeaTable.

At the time of writing, the following types of automation steps are supported:

- **Summarize**
- **Classify**
- **OCR** (optical character recognition)
- **Extract**
- **Custom** for individual use cases

## Deployment

!!! note "SeaTable AI requires SeaTable 6.0"

The easiest way to deploy SeaTable AI is to run it on the same host as SeaTable Server. A standalone deployment (on a separate host or virtual machine) is explained [here](../advanced/seatable-ai-standalone.md).

### Amend the .env file

To install SeaTable AI, include `seatable-ai.yml` in the `COMPOSE_FILE` variable within your `.env` file. This instructs Docker-Compose to include the `seatable-ai` service.
Simply copy and paste (:material-content-copy:) the following code into your command line:
27+
28+
```bash
29+
sed -i "s/COMPOSE_FILE='\(.*\)'/COMPOSE_FILE='\1,seatable-ai.yml'/" /opt/seatable-compose/.env
30+
```
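If you want to see what the substitution does before touching your real configuration, you can try it on a throwaway file (the sample `COMPOSE_FILE` value below is only an example):

```shell
# Demonstrate the substitution on a temporary file instead of the real .env
printf "COMPOSE_FILE='caddy.yml,seatable-server.yml'\n" > /tmp/env-demo
sed -i "s/COMPOSE_FILE='\(.*\)'/COMPOSE_FILE='\1,seatable-ai.yml'/" /tmp/env-demo
cat /tmp/env-demo
# → COMPOSE_FILE='caddy.yml,seatable-server.yml,seatable-ai.yml'
```

The backreference `\1` preserves whatever `.yml` files are already listed and appends `seatable-ai.yml` to the end.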

Then add the SeaTable AI server configuration in `.env`:

```ini
ENABLE_SEATABLE_AI=true
SEATABLE_AI_SERVER_URL=http://seatable-ai:8888
```

#### LLM Provider Configuration

SeaTable AI uses AI functions in conjunction with a Large Language Model (LLM) service.

!!! note "Supported LLM Providers"

    SeaTable AI supports a wide variety of LLM providers through [LiteLLM](https://docs.litellm.ai/docs) as well as any LLM service with OpenAI-compatible endpoints. Please refer to [LiteLLM's documentation](https://docs.litellm.ai/docs/providers) in case you run into issues while trying to use a specific provider.

!!! note "Model Selection"

    In order to ensure the efficient use of SeaTable AI features, you need to select a **large, multimodal model**.
    The chosen model must support image input and recognition (e.g. for running OCR as part of automations).

The following section showcases the required configuration settings for the most popular hosted LLM services.
These must be configured inside your `.env` file:

<a id="llm-configuration"></a>

=== "OpenAI"
    ```ini
    SEATABLE_AI_LLM_TYPE=openai
    SEATABLE_AI_LLM_KEY=<your openai LLM access key>
    SEATABLE_AI_LLM_MODEL=gpt-4o-mini # recommended
    ```
=== "DeepSeek"
    ```ini
    SEATABLE_AI_LLM_TYPE=deepseek
    SEATABLE_AI_LLM_KEY=<your LLM access key>
    SEATABLE_AI_LLM_MODEL=deepseek-chat # recommended
    ```
=== "Azure OpenAI"
    ```ini
    SEATABLE_AI_LLM_TYPE=azure
    SEATABLE_AI_LLM_URL= # your deployment URL, leave blank to use the default endpoint
    SEATABLE_AI_LLM_KEY=<your API key>
    SEATABLE_AI_LLM_MODEL=<your deployment name>
    ```
=== "Ollama"
    ```ini
    SEATABLE_AI_LLM_TYPE=ollama_chat
    SEATABLE_AI_LLM_URL=<your LLM endpoint>
    SEATABLE_AI_LLM_KEY=<your LLM access key>
    SEATABLE_AI_LLM_MODEL=<your model-id>
    ```
=== "HuggingFace"
    ```ini
    SEATABLE_AI_LLM_TYPE=huggingface
    SEATABLE_AI_LLM_URL=<your huggingface API endpoint>
    SEATABLE_AI_LLM_KEY=<your huggingface API key>
    SEATABLE_AI_LLM_MODEL=<model provider>/<model-id>
    ```
=== "Self-Hosted Proxy Server"
    ```ini
    SEATABLE_AI_LLM_TYPE=proxy
    SEATABLE_AI_LLM_URL=<your proxy url>
    SEATABLE_AI_LLM_KEY=<your proxy virtual key> # optional
    SEATABLE_AI_LLM_MODEL=<model-id>
    ```
=== "Other"
    If you are using an LLM service with ***OpenAI-compatible endpoints***, you should set `SEATABLE_AI_LLM_TYPE` to `other` or `openai`, and set the other LLM configuration settings as necessary:

    ```ini
    SEATABLE_AI_LLM_TYPE=...
    SEATABLE_AI_LLM_URL=...
    SEATABLE_AI_LLM_KEY=...
    SEATABLE_AI_LLM_MODEL=...
    ```
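As a sketch of what "OpenAI-compatible" means in practice, the request body such a service expects follows the OpenAI chat-completions shape (the helper below and its values are illustrative, not SeaTable's internal code):

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body of an OpenAI-style /v1/chat/completions request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Any endpoint that accepts this payload at POST /v1/chat/completions
# should work with SEATABLE_AI_LLM_TYPE set to 'other' or 'openai'.
body = json.dumps(build_chat_request("gpt-4o-mini", "Summarize this text"))
print(body)
```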
### Download SeaTable AI image and restart

One final step is necessary to download the SeaTable AI image and restart the SeaTable service:

```bash
cd /opt/seatable-compose
docker compose up -d
```

SeaTable AI is now ready to use.

mkdocs.yml

Lines changed: 4 additions & 0 deletions
@@ -151,6 +151,7 @@ nav:
       - Our deployment approach: installation/deployment-approach.md
       - Single-Node Deployment:
           - SeaTable Server: installation/basic-setup.md
+          - SeaTable AI: installation/components/seatable-ai.md
           - Python Pipeline: installation/components/python-pipeline.md
           - Whiteboard: installation/components/whiteboard.md
           - n8n: installation/components/n8n.md

@@ -178,6 +179,9 @@ nav:
       - Webserver Security: installation/advanced/webserver-security.md
       - Maintenance Mode: installation/advanced/maintenance-mode.md
       - Advanced Settings for Caddy: installation/advanced/settings-caddy.md
+      - SeaTable AI:
+          - SeaTable AI (standalone): installation/advanced/seatable-ai-standalone.md
+          - AI Token Pricing: installation/advanced/seatable-ai-token-pricing.md
       - S3 Object Storage:
           - Configuration: installation/advanced/s3.md
           - Migration: installation/advanced/s3-migration.md
