Commit 4582d7b

Author: longbingljw

Fix: duplicate cache keys and a circular dependency (#7)

* Revert "feat: Add support for OBKV-Redis (#3)" (#6). This reverts commit 36db8b6.
* fix
* cache-test
* Revert "cache-test". This reverts commit 28e6754.
* Revert "fix". This reverts commit 09682ee.
* Reapply "fix". This reverts commit dace252.
* Reapply "cache-test". This reverts commit 0505658.
* fix the cache clean-up thread and change the celery env
* fix the circular dependency issue in creating cache tables
* append content to the setup-mysql-env.sh script
* image test
* image test
* image publish test
* config fix
* change mirror source from thu to aliyun
* use public pypi source
* fix image test
* use dockerhub images
* fix duplicate cache keys
* feat: users can now choose whether to use Redis; when Redis is in use, OCEANBASE_CLUSTER_NAME becomes difyai-redis
* fix missing vec_memory.sql
* fix README_CN
1 parent b34da4e commit 4582d7b

File tree

12 files changed: +2786 / -105 lines changed
Lines changed: 138 additions & 0 deletions
@@ -0,0 +1,138 @@
name: Build and Push API to Docker Hub

on:
  push:
    branches:
      - "main"
      - "deploy/dev"
      - "deploy/enterprise"
      - "dify-for-mysql"
    tags:
      - "*"
  release:
    types: [published]

concurrency:
  group: build-push-api-${{ github.head_ref || github.run_id }}
  cancel-in-progress: true

env:
  DIFY_API_IMAGE_NAME: oceanbase/dify-api

jobs:
  build:
    runs-on: ${{ matrix.platform == 'linux/arm64' && 'ubuntu-24.04-arm' || 'ubuntu-latest' }}
    strategy:
      matrix:
        include:
          - service_name: "build-api-amd64"
            image_name: "oceanbase/dify-api"
            context: "api"
            platform: linux/amd64
          - service_name: "build-api-arm64"
            image_name: "oceanbase/dify-api"
            context: "api"
            platform: linux/arm64

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Prepare
        run: |
          platform=${{ matrix.platform }}
          echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          registry: docker.io
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Extract metadata for Docker
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ matrix.image_name }}

      - name: Build Docker image
        id: build
        uses: docker/build-push-action@v6
        with:
          context: "{{defaultContext}}:${{ matrix.context }}"
          platforms: ${{ matrix.platform }}
          build-args: COMMIT_SHA=${{ fromJSON(steps.meta.outputs.json).labels['org.opencontainers.image.revision'] }}
          labels: ${{ steps.meta.outputs.labels }}
          outputs: type=image,name=${{ matrix.image_name }},push-by-digest=true,name-canonical=true,push=true
          cache-from: type=gha,scope=${{ matrix.service_name }}
          cache-to: type=gha,mode=max,scope=${{ matrix.service_name }}

      - name: Export digest
        env:
          DIGEST: ${{ steps.build.outputs.digest }}
        run: |
          mkdir -p /tmp/digests
          sanitized_digest=${DIGEST#sha256:}
          touch "/tmp/digests/${sanitized_digest}"

      - name: Upload digest
        uses: actions/upload-artifact@v4
        with:
          name: digests-${{ matrix.context }}-${{ env.PLATFORM_PAIR }}
          path: /tmp/digests/*
          if-no-files-found: error
          retention-days: 1

  create-manifest:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Download digests
        uses: actions/download-artifact@v4
        with:
          path: /tmp/digests
          pattern: digests-api-*
          merge-multiple: true

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          registry: docker.io
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Extract metadata for Docker
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: oceanbase/dify-api
          tags: |
            type=raw,value=latest,enable=${{ startsWith(github.ref, 'refs/tags/') && !contains(github.ref, '-') }}
            type=ref,event=branch
            type=sha,enable=true,priority=100,prefix=,suffix=,format=long
            type=raw,value=${{ github.ref_name }},enable=${{ startsWith(github.ref, 'refs/tags/') }}

      - name: Create manifest list and push
        working-directory: /tmp/digests
        env:
          IMAGE_NAME: oceanbase/dify-api
        run: |
          docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
            $(printf "$IMAGE_NAME@sha256:%s " *)

      - name: Inspect image
        env:
          IMAGE_NAME: oceanbase/dify-api
          IMAGE_VERSION: ${{ steps.meta.outputs.version }}
        run: |
          docker buildx imagetools inspect "$IMAGE_NAME:$IMAGE_VERSION"
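The "Prepare" and "Export digest" steps above rely on two bash parameter expansions; a minimal sketch of what they compute (sample values are illustrative):

```shell
#!/usr/bin/env bash
# Replace every "/" with "-" so the platform string is safe to use
# in an artifact name, as the Prepare step writes into PLATFORM_PAIR.
platform="linux/arm64"
echo "PLATFORM_PAIR=${platform//\//-}"      # prints PLATFORM_PAIR=linux-arm64

# Strip the "sha256:" prefix from an image digest, as the
# Export digest step does before creating the digest file.
DIGEST="sha256:0123abcd"
echo "sanitized_digest=${DIGEST#sha256:}"   # prints sanitized_digest=0123abcd
```

The digest files named this way are later globbed in `create-manifest`, where `printf "$IMAGE_NAME@sha256:%s " *` re-attaches the prefix to build the manifest arguments.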

README_CN.md

Lines changed: 2 additions & 2 deletions
@@ -2,7 +2,7 @@

 [English](README.md) | Simplified Chinese

-This is a fork of [https://github.com/langgenius/dify](https://github.com/langgenius/dify); we have made some changes to the original Dify project so that it can use MySQL as the underlying database.
+This is a fork of [https://github.com/langgenius/dify](https://github.com/langgenius/dify); we have made some changes to the original Dify project so that it can use MySQL as the underlying database and, optionally, a MySQL-backed cache.

 This branch is based on the historical version [https://github.com/oceanbase-devhub/dify](https://github.com/oceanbase-devhub/dify), has been updated since Dify 1.1.0, and will see periodic releases until the official community adds MySQL support.

@@ -40,4 +40,4 @@ docker compose up -d

 ## License

-This repository is licensed under the [Dify Open Source License](LICENSE), which is essentially Apache 2.0 with some additional restrictions.
+This repository is licensed under the [Dify Open Source License](LICENSE), which is essentially Apache 2.0 with some additional restrictions.

api/Dockerfile

Lines changed: 5 additions & 2 deletions
@@ -6,20 +6,23 @@ WORKDIR /app/api
 # Install uv
 ENV UV_VERSION=0.6.14

+
+# RUN pip install --no-cache-dir -i https://pypi.tuna.tsinghua.edu.cn/simple uv==${UV_VERSION}
 RUN pip install --no-cache-dir uv==${UV_VERSION}


 FROM base AS packages

 # if you are located in China, you can use the aliyun mirror to speed things up
-# RUN sed -i 's@deb.debian.org@mirrors.aliyun.com@g' /etc/apt/sources.list.d/debian.sources
+RUN sed -i 's@deb.debian.org@mirrors.aliyun.com@g' /etc/apt/sources.list.d/debian.sources

 RUN apt-get update \
     && apt-get install -y --no-install-recommends gcc g++ libc-dev libffi-dev libgmp-dev libmpfr-dev libmpc-dev

 # Install Python dependencies
 COPY pyproject.toml uv.lock ./
-RUN uv sync --locked
+# RUN uv sync --index-url https://pypi.tuna.tsinghua.edu.cn/simple
+RUN uv sync

 # production stage
 FROM base AS production

api/commands.py

Lines changed: 39 additions & 0 deletions
@@ -31,6 +31,7 @@
 from services.clear_free_plan_tenant_expired_logs import ClearFreePlanTenantExpiredLogs
 from services.plugin.data_migration import PluginDataMigration
 from services.plugin.plugin_migration import PluginMigration
+from tenacity import retry, stop_after_attempt, wait_exponential


 @click.command("reset-password", help="Reset the account password.")
@@ -665,10 +666,48 @@ def create_tenant(email: str, language: Optional[str] = None, name: Optional[str
     )


+@retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=4, max=10))
+def ensure_caches_table_exists():
+    """Atomically ensure the caches table exists, for the MySQL cache mode."""
+    try:
+        # CREATE TABLE IF NOT EXISTS is an atomic operation
+        create_caches_table_sql = """
+        CREATE TABLE IF NOT EXISTS caches (
+            id BIGINT AUTO_INCREMENT PRIMARY KEY,
+            cache_key VARCHAR(255) NOT NULL UNIQUE,
+            cache_value LONGBLOB NOT NULL,
+            expire_time DATETIME NULL,
+            created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+            INDEX caches_cache_key_idx (cache_key),
+            INDEX caches_expire_time_idx (expire_time)
+        ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+        """
+
+        with db.engine.begin() as conn:
+            conn.execute(db.text(create_caches_table_sql))
+        click.echo(click.style("Caches table ensured for MySQL cache mode.", fg="green"))
+
+    except Exception as e:
+        click.echo(click.style(f"Error: Could not ensure caches table after 3 attempts: {e}", fg="red"))
+        # Re-raise to stop the migration
+        raise Exception(f"Failed to create caches table after 3 attempts: {e}")
+
+
 @click.command("upgrade-db", help="Upgrade the database")
 @click.option("--directory", prompt=False, help="The target migration script directory.")
 def upgrade_db(directory: Optional[str] = None):
     click.echo("Preparing database migration...")
+
+    # 1. First make sure the caches table exists (atomic operation),
+    #    so that the distributed lock works in MySQL cache mode.
+    try:
+        ensure_caches_table_exists()
+    except Exception as e:
+        click.echo(click.style(f"Error: Failed to ensure caches table: {e}", fg="red"))
+        click.echo(click.style("Migration stopped due to caches table creation failure.", fg="red"))
+        raise Exception(f"Migration failed: {e}")
+
+    # 2. Then take the distributed lock (now safe to use)
     lock = redis_client.lock(name="db_upgrade_lock", timeout=60)
     if lock.acquire(blocking=False):
         try:
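The `@retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=4, max=10))` decorator above comes from the tenacity library. As a stdlib-only sketch of the same behavior (three attempts, exponentially growing wait clamped between a minimum and maximum; all names and the backoff formula here are illustrative, not tenacity's implementation):

```python
import functools
import time


def retry(stop_attempts=3, multiplier=1.0, wait_min=4.0, wait_max=10.0, sleep=time.sleep):
    """Minimal stand-in for tenacity's stop_after_attempt + wait_exponential."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, stop_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == stop_attempts:
                        # Out of attempts: propagate, as upgrade_db expects.
                        raise
                    # Exponential backoff, clamped to [wait_min, wait_max].
                    wait = min(wait_max, max(wait_min, multiplier * (2 ** attempt)))
                    sleep(wait)
        return wrapper
    return decorator
```

With the defaults this waits at most twice (between the three attempts) and re-raises the last exception, which is what lets `upgrade_db` stop the migration when the caches table cannot be created.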
