Compare commits

...

43 Commits

Author SHA1 Message Date
CJACK.
01f33c409f Update VERSION 2026-03-21 18:04:39 +08:00
CJACK.
40594a44db Fix env-backed Vercel sync override and config refresh behavior 2026-03-21 17:53:44 +08:00
CJACK.
67787d9c99 Merge pull request #132 from CJackHwang/codex/toolcallhistory-6t7271
Preserve code fences around standalone tool JSON and add marker-output guards
2026-03-21 17:44:05 +08:00
CJACK.
7061094964 Fix fence-strip regression for closed code blocks before tool JSON 2026-03-21 17:39:08 +08:00
CJACK.
492c603300 Merge pull request #129 from CJackHwang/codex/optimize-vercel-deployment-sync-mechanism
Vercel sync: support env-backed config drafts, hash diffing and UI indicators
2026-03-21 17:21:42 +08:00
CJACK.
7e473dffc9 Fix Vercel sync override to avoid redacted config payloads 2026-03-21 17:19:32 +08:00
CJACK.
43a6e6712f Show UI drift marker for env draft vs Vercel config 2026-03-21 17:08:43 +08:00
CJACK.
1e7e0b2ae3 Merge pull request #125 from CJackHwang/codex/align-documentation-with-configuration-updates
Docs: add `auto_delete.sessions`, rename `claude_model_mapping` to `claude_mapping`, and clarify config token handling
2026-03-21 15:34:35 +08:00
CJACK.
fd158e5ae2 Merge pull request #124 from CJackHwang/codex/fix-codex-review-issues-in-pr-#123
Preserve file-backed account tokens on startup and add regression test
2026-03-21 15:34:01 +08:00
CJACK.
95c96f7744 docs: clarify configured account token is ignored on load 2026-03-21 15:32:09 +08:00
CJACK.
e7f59fac80 Update VERSION 2026-03-21 15:22:09 +08:00
CJACK.
1bf059396f Fix file-backed token reuse at startup 2026-03-21 15:19:41 +08:00
CJACK.
f4db2732b0 Merge pull request #122 from CJackHwang/codex/refactor-configuration-to-remove-token-support
Treat account tokens as runtime-only; remove token-only account support and always refresh tokens on admin actions
2026-03-21 15:07:19 +08:00
CJACK.
ee88a74dcf Drop legacy token-only accounts when loading config 2026-03-21 15:01:16 +08:00
CJACK.
ca08bb66b9 Add HTTP token-runtime coverage and fix gate tests for tokenless config 2026-03-21 14:27:12 +08:00
CJACK.
708fcb5beb Merge pull request #121 from jacob-sheng/fix/zeabur-build-version-fallback-zh
fix: resolve Docker build failure when BUILD_VERSION is missing
2026-03-21 11:17:58 +08:00
jacob-sheng
7a65d1eaa2 fix: allow Docker builds without BUILD_VERSION 2026-03-21 09:55:53 +08:00
CJACK.
6de2457743 Merge pull request #119 from CJackHwang/dev
Merge pull request #118 from CJackHwang/codex/analyze-and-fix-build-failure-for-pr-117

fix: decouple runtime-from-dist image from go-builder stage
2026-03-21 02:00:35 +08:00
CJACK.
ce44e260bf Merge pull request #118 from CJackHwang/codex/analyze-and-fix-build-failure-for-pr-117
fix: decouple runtime-from-dist image from go-builder stage
2026-03-21 01:59:52 +08:00
CJACK.
09f6537ffc fix: decouple runtime-from-dist image from go-builder stage 2026-03-21 01:32:09 +08:00
CJACK.
ab8f494fdb Merge pull request #117 from CJackHwang/dev
Merge pull request #115 from CJackHwang/codex/fix-version-detection-for-ds2api

Expose version endpoint, add version package, and inject build version into artifacts/Docker images
2026-03-21 00:51:36 +08:00
CJACK.
b56a211da9 Merge pull request #115 from CJackHwang/codex/fix-version-detection-for-ds2api
Expose version endpoint, add version package, and inject build version into artifacts/Docker images
2026-03-21 00:47:57 +08:00
CJACK.
fcce5308cb Merge pull request #116 from CJackHwang/codex/align-vercel-deployment-with-go-version-semantics
Align Vercel JS toolcall detection/format behavior with Go semantics
2026-03-21 00:43:50 +08:00
CJACK.
d27b19cc53 fix: show vercel preview commit version instead of dev 2026-03-21 00:43:09 +08:00
CJACK.
b8ff678f24 Align Vercel JS toolcall filtering with Go semantics 2026-03-21 00:23:22 +08:00
CJACK.
b24ef1282d fix: route /admin/version to api on vercel 2026-03-21 00:18:55 +08:00
CJACK.
65e0de3c82 Merge pull request #112 from CJackHwang/codex/fix-token-expiration-handling
Attempt token refresh for biz_code failures; report config writability and handle token write errors
2026-03-20 23:56:40 +08:00
CJACK.
0c2743a48c fix: align build version source with tags and VERSION fallback 2026-03-20 23:55:10 +08:00
CJACK.
dc73e8a6da Gate biz_code refresh attempts to auth-indicative failures 2026-03-20 23:54:13 +08:00
CJACK.
b8495eeeb3 surface account test config writeability and save failures 2026-03-20 23:34:29 +08:00
CJACK.
b3eae22cef Merge pull request #111 from CJackHwang/dev
Merge pull request #110 from CJackHwang/codex/align-js-runtime-with-go-runtime-logic

Align Vercel JS stream tool-call delta handling with Go runtime
2026-03-20 10:05:25 +08:00
CJACK.
7af0098d1b Merge pull request #110 from CJackHwang/codex/align-js-runtime-with-go-runtime-logic
Align Vercel JS stream tool-call delta handling with Go runtime
2026-03-20 09:49:08 +08:00
CJACK.
17405be300 shrink vercel stream module under line gate limit 2026-03-20 09:47:22 +08:00
CJACK.
5bc03e5de6 align vercel js stream toolcall delta behavior with go runtime 2026-03-20 09:36:45 +08:00
CJACK.
5a5f93148d Merge pull request #109 from CJackHwang/dev
Merge pull request #108 from CJackHwang/codex/clean-up-unused-files-and-update-documentation-uiip50

docs: refresh deployment/testing guides and remove stale investigation report
2026-03-20 03:12:25 +08:00
CJACK.
32dc5b6099 Merge pull request #108 from CJackHwang/codex/clean-up-unused-files-and-update-documentation-uiip50
docs: refresh deployment/testing guides and remove stale investigation report
2026-03-20 03:08:09 +08:00
CJACK.
7936d4675f Merge pull request #107 from CJackHwang/codex/clean-up-unused-files-and-update-documentation
docs: prune stale files and refresh docs, add .env.example, align READMEs/DEPLOY/CONTRIBUTING
2026-03-20 03:07:21 +08:00
CJACK.
808eafa7c6 docs: refresh deployment/testing guides and prune stale report 2026-03-20 03:05:36 +08:00
CJACK.
bcb8ed6df2 docs: prune stale docs and refresh project documentation 2026-03-20 03:05:22 +08:00
CJACK.
8ec5dcc0cc Merge pull request #106 from CJackHwang/dev
Merge pull request #105 from CJackHwang/codex/fix-issues-found-in-review

Merge pull request #104 from CJackHwang/codex/revert-to-commit-efb484b

Restore tool-call parsing and repair logic; remove accidental split files
2026-03-20 02:53:30 +08:00
CJACK.
88a79f212d Fix path control-char repair on JSON fallback parses 2026-03-20 02:52:27 +08:00
CJACK.
41c0f7ce28 Merge pull request #102 from CJackHwang/dev
Merge pull request #99 from CJackHwang/codex/refactor-toolcalls_parse.go-for-line-limits

Codex-generated pull request
2026-03-20 01:18:05 +08:00
CJACK.
43cbc4aac0 Merge pull request #97 from CJackHwang/dev
Merge pull request #96 from CJackHwang/codex/update-ci-line-count-limits-cihke3

ci: ignore test files in line gate and raise frontend limit to 500
2026-03-18 00:15:03 +08:00
64 changed files with 1218 additions and 643 deletions

View File

@@ -1,93 +1,15 @@
# DS2API environment template (Go runtime)
# Copy this file to .env and adjust values.
# Updated: 2026-02
# ---------------------------------------------------------------
# Runtime
# ---------------------------------------------------------------
# HTTP listen port (default: 5001)
# DS2API runtime
PORT=5001
# Log level: DEBUG | INFO | WARN | ERROR
LOG_LEVEL=INFO
# Max concurrent inflight requests per account in managed-key mode.
# Default: 2
# Recommended client concurrency is calculated dynamically as:
# account_count * DS2API_ACCOUNT_MAX_INFLIGHT
# So by default it is account_count * 2.
# Requests beyond inflight slots enter a waiting queue first.
# Default queue size equals recommended concurrency, so 429 starts after:
# account_count * DS2API_ACCOUNT_MAX_INFLIGHT * 2
# Alias: DS2API_ACCOUNT_CONCURRENCY
# DS2API_ACCOUNT_MAX_INFLIGHT=2
# Admin authentication
DS2API_ADMIN_KEY=change-me
# Optional waiting queue size override for managed-key mode.
# Default: recommended_concurrency (same as account_count * inflight_limit)
# Alias: DS2API_ACCOUNT_QUEUE_SIZE
# DS2API_ACCOUNT_MAX_QUEUE=10
# Config loading (choose one)
# 1) file-based config
DS2API_CONFIG_PATH=/app/config.json
# 2) inline JSON or Base64 JSON
# DS2API_CONFIG_JSON=
# ---------------------------------------------------------------
# Admin auth
# ---------------------------------------------------------------
# Admin key for /admin login and protected admin APIs.
# Default is "admin" when unset, but setting it explicitly is recommended.
DS2API_ADMIN_KEY=admin
# Optional JWT signing secret for admin token.
# Defaults to DS2API_ADMIN_KEY when unset.
# DS2API_JWT_SECRET=change-me
# Optional admin JWT validity in hours (default: 24)
# DS2API_JWT_EXPIRE_HOURS=24
# ---------------------------------------------------------------
# Config source (choose one)
# ---------------------------------------------------------------
# Option A: config file path (local/dev recommended)
# DS2API_CONFIG_PATH=config.json
# Option B: JSON string
# DS2API_CONFIG_JSON={"keys":["your-api-key"],"accounts":[{"email":"user@example.com","password":"xxx","token":""}]}
# Option C: Base64 encoded JSON (recommended for Vercel env var)
# DS2API_CONFIG_JSON=eyJrZXlzIjpbInlvdXItYXBpLWtleSJdLCJhY2NvdW50cyI6W3siZW1haWwiOiJ1c2VyQGV4YW1wbGUuY29tIiwicGFzc3dvcmQiOiJ4eHgiLCJ0b2tlbiI6IiJ9XX0=
#
# Generate from local config.json:
# DS2API_CONFIG_JSON="$(base64 < config.json | tr -d '\n')"
# ---------------------------------------------------------------
# Paths (optional)
# ---------------------------------------------------------------
# WASM file used for PoW solving
# DS2API_WASM_PATH=sha3_wasm_bg.7b9ca65ddd.wasm
# Built admin static assets directory
# DS2API_STATIC_ADMIN_DIR=static/admin
# Auto-build WebUI on startup when static/admin is missing.
# Default: enabled on local/Docker, disabled on Vercel.
# DS2API_AUTO_BUILD_WEBUI=true
# Internal auth secret used by the Vercel hybrid streaming path
# (Go prepare endpoint <-> Node stream function).
# Optional: falls back to DS2API_ADMIN_KEY when unset.
# DS2API_VERCEL_INTERNAL_SECRET=change-me
# Stream lease TTL seconds for Vercel hybrid streaming.
# During this window, the managed account stays occupied until Node calls release.
# Default: 900 (15 minutes)
# DS2API_VERCEL_STREAM_LEASE_TTL_SECONDS=900
# ---------------------------------------------------------------
# Vercel sync integration (optional)
# ---------------------------------------------------------------
# VERCEL_TOKEN=your-vercel-token
# VERCEL_PROJECT_ID=prj_xxxxxxxxxxxx
# VERCEL_TEAM_ID=team_xxxxxxxxxxxx
# Optional: Vercel deployment protection bypass secret.
# If deployment protection is enabled, DS2API will use this value as
# x-vercel-protection-bypass for internal Node->Go calls on Vercel.
# You can also use VERCEL_AUTOMATION_BYPASS_SECRET directly.
# DS2API_VERCEL_PROTECTION_BYPASS=your-bypass-secret
# Optional: static admin assets path
# DS2API_STATIC_ADMIN_DIR=/app/static/admin
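The inflight/queue comments in the template above describe a simple calculation. A minimal Go sketch of that math (function names are illustrative, not part of DS2API):

```go
package main

import "fmt"

// recommendedConcurrency mirrors the template comment:
// recommended client concurrency = account_count * DS2API_ACCOUNT_MAX_INFLIGHT.
func recommendedConcurrency(accountCount, maxInflight int) int {
	return accountCount * maxInflight
}

// rejectThreshold is where 429s start with the default queue size
// (queue size == recommended concurrency), i.e. inflight slots + queue.
func rejectThreshold(accountCount, maxInflight int) int {
	return recommendedConcurrency(accountCount, maxInflight) * 2
}

func main() {
	// Example: 3 accounts with the default DS2API_ACCOUNT_MAX_INFLIGHT=2.
	fmt.Println(recommendedConcurrency(3, 2)) // 6
	fmt.Println(rejectThreshold(3, 2))        // 12
}
```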

View File

@@ -51,6 +51,10 @@ jobs:
run: |
set -euo pipefail
TAG="${RELEASE_TAG}"
BUILD_VERSION="${TAG}"
if [ -z "${BUILD_VERSION}" ] && [ -f VERSION ]; then
BUILD_VERSION="$(cat VERSION | tr -d '[:space:]')"
fi
mkdir -p dist
targets=(
@@ -73,7 +77,7 @@ jobs:
mkdir -p "${STAGE}/static"
CGO_ENABLED=0 GOOS="${GOOS}" GOARCH="${GOARCH}" \
go build -trimpath -ldflags="-s -w" -o "${STAGE}/${BIN}" ./cmd/ds2api
go build -trimpath -ldflags="-s -w -X ds2api/internal/version.BuildVersion=${BUILD_VERSION}" -o "${STAGE}/${BIN}" ./cmd/ds2api
cp config.example.json .env.example sha3_wasm_bg.7b9ca65ddd.wasm LICENSE README.MD README.en.md "${STAGE}/"
cp -R static/admin "${STAGE}/static/admin"

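The `-X ds2api/internal/version.BuildVersion=${BUILD_VERSION}` ldflag above works by overwriting a package-level string variable at link time. A sketch of what such a version package typically looks like (the real package's contents are not shown in this diff; this uses `main` instead of `ds2api/internal/version` so it is self-contained):

```go
package main

import "fmt"

// BuildVersion is overwritten at link time, e.g.
//   go build -ldflags "-X main.BuildVersion=v1.2.3"
// In DS2API the variable lives in ds2api/internal/version instead of main.
var BuildVersion = ""

// Resolved returns the injected version, falling back to "dev"
// when no -X flag was supplied at build time.
func Resolved() string {
	if BuildVersion == "" {
		return "dev"
	}
	return BuildVersion
}

func main() {
	fmt.Println(Resolved())
}
```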
View File

@@ -123,5 +123,7 @@ jobs:
labels: |
org.opencontainers.image.version=${{ steps.next_version.outputs.new_version }}
org.opencontainers.image.revision=${{ github.sha }}
build-args: |
BUILD_VERSION=${{ steps.next_version.outputs.new_tag }}
cache-from: type=gha
cache-to: type=gha,mode=max

View File

@@ -1,128 +1,130 @@
(previous revision of the workflow — identical to the revision below except that it lacked the `build-args:` block passing `BUILD_VERSION`)
name: Release to Aliyun CR
on:
workflow_dispatch:
inputs:
version_type:
description: '版本类型'
required: true
default: 'patch'
type: choice
options:
- patch
- minor
- major
permissions:
contents: write
jobs:
release:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v5
with:
fetch-depth: 0
token: ${{ secrets.GITHUB_TOKEN }}
- name: Get current version
id: get_version
run: |
LATEST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "v0.0.0")
TAG_VERSION=${LATEST_TAG#v}
if [ -f VERSION ]; then
FILE_VERSION=$(cat VERSION | tr -d '[:space:]')
else
FILE_VERSION="0.0.0"
fi
function version_gt() { test "$(printf '%s\n' "$@" | sort -V | head -n 1)" != "$1"; }
if version_gt "$FILE_VERSION" "$TAG_VERSION"; then
VERSION="$FILE_VERSION"
else
VERSION="$TAG_VERSION"
fi
echo "Current version: $VERSION"
echo "current_version=$VERSION" >> $GITHUB_OUTPUT
- name: Calculate next version
id: next_version
env:
VERSION_TYPE: ${{ github.event.inputs.version_type }}
run: |
VERSION="${{ steps.get_version.outputs.current_version }}"
BASE_VERSION=$(echo "$VERSION" | sed 's/-.*$//')
IFS='.' read -r -a version_parts <<< "$BASE_VERSION"
MAJOR="${version_parts[0]:-0}"
MINOR="${version_parts[1]:-0}"
PATCH="${version_parts[2]:-0}"
case "$VERSION_TYPE" in
major)
NEW_VERSION="$((MAJOR + 1)).0.0"
;;
minor)
NEW_VERSION="${MAJOR}.$((MINOR + 1)).0"
;;
*)
NEW_VERSION="${MAJOR}.${MINOR}.$((PATCH + 1))"
;;
esac
echo "New version: $NEW_VERSION"
echo "new_version=$NEW_VERSION" >> $GITHUB_OUTPUT
echo "new_tag=v$NEW_VERSION" >> $GITHUB_OUTPUT
- name: Update VERSION file
run: |
echo "${{ steps.next_version.outputs.new_version }}" > VERSION
- name: Commit VERSION and create tag
run: |
git config user.name "github-actions[bot]"
git config user.email "github-actions[bot]@users.noreply.github.com"
git add VERSION
if ! git diff --cached --quiet; then
git commit -m "chore: bump version to ${{ steps.next_version.outputs.new_tag }} [skip ci]"
fi
NEW_TAG="${{ steps.next_version.outputs.new_tag }}"
git tag -a "$NEW_TAG" -m "Release $NEW_TAG"
git push origin HEAD:main "$NEW_TAG"
# Build and push the Docker image to Aliyun CR
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to Aliyun Container Registry
uses: docker/login-action@v3
with:
registry: ${{ secrets.ALIYUN_REGISTRY }}
username: ${{ secrets.ALIYUN_REGISTRY_USER }}
password: ${{ secrets.ALIYUN_REGISTRY_PASSWORD }}
- name: Build and push Docker image
uses: docker/build-push-action@v6
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
push: true
tags: |
${{ secrets.ALIYUN_REGISTRY }}/${{ secrets.ALIYUN_REGISTRY_NAMESPACE }}/ds2api:${{ steps.next_version.outputs.new_tag }}
${{ secrets.ALIYUN_REGISTRY }}/${{ secrets.ALIYUN_REGISTRY_NAMESPACE }}/ds2api:${{ steps.next_version.outputs.new_version }}
${{ secrets.ALIYUN_REGISTRY }}/${{ secrets.ALIYUN_REGISTRY_NAMESPACE }}/ds2api:latest
labels: |
org.opencontainers.image.version=${{ steps.next_version.outputs.new_version }}
org.opencontainers.image.revision=${{ github.sha }}
build-args: |
BUILD_VERSION=${{ steps.next_version.outputs.new_tag }}
cache-from: type=gha
cache-to: type=gha,mode=max
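The workflow's `version_gt` helper and its `case`-based bump logic can be mirrored in Go. The `sort -V` style comparison matters because plain string comparison would rank `1.2.10` below `1.2.9`. A sketch (not part of the repo; assumes plain `x.y.z` versions):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parse splits "1.2.3" into numeric parts (missing parts default to 0).
func parse(v string) [3]int {
	var out [3]int
	for i, p := range strings.SplitN(v, ".", 3) {
		n, _ := strconv.Atoi(p)
		out[i] = n
	}
	return out
}

// versionGT mirrors the workflow's version_gt (sort -V semantics for
// plain x.y.z versions): true when a is strictly newer than b.
func versionGT(a, b string) bool {
	pa, pb := parse(a), parse(b)
	for i := 0; i < 3; i++ {
		if pa[i] != pb[i] {
			return pa[i] > pb[i]
		}
	}
	return false
}

// bump mirrors the workflow's case statement for major/minor/patch.
func bump(v, kind string) string {
	p := parse(v)
	switch kind {
	case "major":
		return fmt.Sprintf("%d.0.0", p[0]+1)
	case "minor":
		return fmt.Sprintf("%d.%d.0", p[0], p[1]+1)
	default: // patch
		return fmt.Sprintf("%d.%d.%d", p[0], p[1], p[2]+1)
	}
}

func main() {
	fmt.Println(versionGT("1.2.10", "1.2.9")) // true
	fmt.Println(bump("1.2.3", "minor"))       // 1.3.0
}
```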

View File

@@ -623,6 +623,7 @@ Reads runtime settings and status, including:
- `admin` (JWT expiry, default-password warning, etc.)
- `runtime` (`account_max_inflight`, `account_max_queue`, `global_max_inflight`)
- `toolcall` / `responses` / `embeddings`
- `auto_delete` (`sessions`)
- `claude_mapping` / `model_aliases`
- `env_backed`, `needs_vercel_sync`
@@ -635,6 +636,7 @@ Hot-updates runtime settings. Supported fields:
- `toolcall.mode` / `toolcall.early_emit_confidence`
- `responses.store_ttl_seconds`
- `embeddings.provider`
- `auto_delete.sessions`
- `claude_mapping`
- `model_aliases`
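Taken together, a hot-update request body touching the fields listed above might look like this (a sketch only: the field names come from the list, but the concrete values and the surrounding endpoint/auth details are illustrative assumptions, not shown in this diff):

```json
{
  "toolcall": { "early_emit_confidence": 0.9 },
  "responses": { "store_ttl_seconds": 600 },
  "embeddings": { "provider": "deterministic" },
  "auto_delete": { "sessions": true },
  "claude_mapping": { "fast": "deepseek-chat", "slow": "deepseek-reasoner" },
  "model_aliases": { "gpt-4o": "deepseek-chat" }
}
```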

API.md
View File

@@ -628,6 +628,7 @@ data: {"type":"message_stop"}
- `admin` (JWT expiry, default-password warning, etc.)
- `runtime` (`account_max_inflight`, `account_max_queue`, `global_max_inflight`)
- `toolcall` / `responses` / `embeddings`
- `auto_delete` (`sessions`)
- `claude_mapping` / `model_aliases`
- `env_backed` / `needs_vercel_sync`
@@ -640,6 +641,7 @@ data: {"type":"message_stop"}
- `toolcall.mode` / `toolcall.early_emit_confidence`
- `responses.store_ttl_seconds`
- `embeddings.provider`
- `auto_delete.sessions`
- `claude_mapping`
- `model_aliases`

View File

@@ -99,7 +99,7 @@ ds2api/
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── helpers/ # Node.js helper modules
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
@@ -112,6 +112,7 @@ ds2api/
│ ├── compat/ # Compatibility helpers
│ ├── config/ # Config loading and hot-reload
│ ├── deepseek/ # DeepSeek client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt building
@@ -123,7 +124,9 @@ ds2api/
│ └── webui/ # WebUI static hosting
├── webui/ # React WebUI source
│ └── src/
│ ├── components/ # Components
│ ├── app/ # Routing, auth, config state
│ ├── features/ # Feature modules
│ ├── components/ # Shared components
│ └── locales/ # Language packs
├── scripts/ # Build and test scripts
├── static/admin/ # WebUI build output (not committed)

View File

@@ -99,7 +99,7 @@ ds2api/
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── helpers/ # Node.js helper modules
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
@@ -112,6 +112,7 @@ ds2api/
│ ├── compat/ # Compatibility helpers
│ ├── config/ # Config loading and hot-reload
│ ├── deepseek/ # DeepSeek client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt building
@@ -123,7 +124,9 @@ ds2api/
│ └── webui/ # WebUI static hosting
├── webui/ # React WebUI source
│ └── src/
│ ├── components/ # Components
│ ├── app/ # Routing, auth, config state
│ ├── features/ # Feature modules
│ ├── components/ # Shared components
│ └── locales/ # Language packs
├── scripts/ # Build and test scripts
├── static/admin/ # WebUI build output (not committed)

View File

@@ -113,12 +113,8 @@ go build -o ds2api ./cmd/ds2api
# Copy env template
cp .env.example .env
# Generate single-line Base64 from config.json
DS2API_CONFIG_JSON="$(base64 < config.json | tr -d '\n')"
# Edit .env and set:
# Edit .env and set at least:
# DS2API_ADMIN_KEY=your-admin-key
# DS2API_CONFIG_JSON=${DS2API_CONFIG_JSON}
# Start
docker-compose up -d
@@ -185,6 +181,7 @@ Notes:
- **Port**: DS2API listens on `5001` by default; the template sets `PORT=5001`.
- **Persistent config**: the template mounts `/data` and sets `DS2API_CONFIG_PATH=/data/config.json`. After importing config in Admin UI, it will be written and persisted to this path.
- **Build version**: Zeabur / regular `docker build` does not require `BUILD_VERSION` by default. The image prefers that build arg when provided, and automatically falls back to the repo-root `VERSION` file when it is absent.
- **First login**: after deployment, open `/admin` and log in with the `DS2API_ADMIN_KEY` shown in the Zeabur env/template instructions (recommended: rotate to a strong secret after first login).
---
@@ -366,7 +363,7 @@ Each archive includes:
- `ds2api` executable (`ds2api.exe` on Windows)
- `static/admin/` (built WebUI assets)
- `sha3_wasm_bg.7b9ca65ddd.wasm`
- `sha3_wasm_bg.7b9ca65ddd.wasm` (optional; binary has embedded fallback)
- `config.example.json`, `.env.example`
- `README.MD`, `README.en.md`, `LICENSE`
@@ -455,7 +452,9 @@ server {
```bash
# Copy compiled binary and related files to target directory
sudo mkdir -p /opt/ds2api
sudo cp ds2api config.json sha3_wasm_bg.7b9ca65ddd.wasm /opt/ds2api/
sudo cp ds2api config.json /opt/ds2api/
# Optional: if you want to use an external WASM file (override embedded one)
# sudo cp sha3_wasm_bg.7b9ca65ddd.wasm /opt/ds2api/
sudo cp -r static/admin /opt/ds2api/static/admin
```

View File

@@ -113,12 +113,8 @@ go build -o ds2api ./cmd/ds2api
# Copy env template
cp .env.example .env
# Generate single-line Base64 from config.json
DS2API_CONFIG_JSON="$(base64 < config.json | tr -d '\n')"
# Edit .env (use a strong password) and set:
# Edit .env (use a strong password) and set at least:
# DS2API_ADMIN_KEY=your-admin-key
# DS2API_CONFIG_JSON=${DS2API_CONFIG_JSON}
# Start
docker-compose up -d
@@ -185,6 +181,7 @@ healthcheck:
- **Port**: the service listens on `5001` by default; the template pins `PORT=5001`.
- **Persistent config**: the template mounts a `/data` volume and sets `DS2API_CONFIG_PATH=/data/config.json`; config imported in the Admin UI is written and persisted to this path.
- **Build version**: Zeabur / plain `docker build` does not require `BUILD_VERSION` by default; the image prefers the build arg when provided and automatically falls back to the repo-root `VERSION` file when it is absent.
- **First login**: after deployment, open `/admin` and log in with the `DS2API_ADMIN_KEY` from the Zeabur env/template instructions (rotate it to a strong secret after first login).
---
@@ -366,7 +363,7 @@ No Output Directory named "public" found after the Build completed.
- `ds2api` executable (`ds2api.exe` on Windows)
- `static/admin/` (built WebUI assets)
- `sha3_wasm_bg.7b9ca65ddd.wasm`
- `sha3_wasm_bg.7b9ca65ddd.wasm` (optional; the binary has an embedded fallback)
- `config.example.json`, `.env.example`
- `README.MD`, `README.en.md`, `LICENSE`
@@ -455,7 +452,9 @@ server {
```bash
# Copy the compiled binary and related files to the target directory
sudo mkdir -p /opt/ds2api
sudo cp ds2api config.json sha3_wasm_bg.7b9ca65ddd.wasm /opt/ds2api/
sudo cp ds2api config.json /opt/ds2api/
# Optional: use an external WASM file (overrides the embedded one)
# sudo cp sha3_wasm_bg.7b9ca65ddd.wasm /opt/ds2api/
sudo cp -r static/admin /opt/ds2api/static/admin
```

View File

@@ -10,19 +10,24 @@ FROM golang:1.24 AS go-builder
WORKDIR /app
ARG TARGETOS
ARG TARGETARCH
ARG BUILD_VERSION
COPY go.mod go.sum* ./
RUN go mod download
COPY . .
RUN set -eux; \
GOOS="${TARGETOS:-$(go env GOOS)}"; \
GOARCH="${TARGETARCH:-$(go env GOARCH)}"; \
CGO_ENABLED=0 GOOS="${GOOS}" GOARCH="${GOARCH}" go build -o /out/ds2api ./cmd/ds2api
BUILD_VERSION_RESOLVED="${BUILD_VERSION:-}"; \
if [ -z "${BUILD_VERSION_RESOLVED}" ] && [ -f VERSION ]; then BUILD_VERSION_RESOLVED="$(cat VERSION | tr -d "[:space:]")"; fi; \
CGO_ENABLED=0 GOOS="${GOOS}" GOARCH="${GOARCH}" go build -ldflags="-s -w -X ds2api/internal/version.BuildVersion=${BUILD_VERSION_RESOLVED}" -o /out/ds2api ./cmd/ds2api
FROM busybox:1.36.1-musl AS busybox-tools
FROM debian:bookworm-slim AS runtime-base
WORKDIR /app
COPY --from=go-builder /etc/ssl/certs/ca-certificates.crt /etc/ssl/certs/ca-certificates.crt
RUN apt-get update \
&& apt-get install -y --no-install-recommends ca-certificates \
&& rm -rf /var/lib/apt/lists/*
COPY --from=busybox-tools /bin/busybox /usr/local/bin/busybox
EXPOSE 5001
CMD ["/usr/local/bin/ds2api"]
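The fallback chain in the Dockerfile above (prefer the `BUILD_VERSION` build arg, else read the repo-root `VERSION` file and strip whitespace, the `tr -d "[:space:]"` step) can be sketched in Go (function name is illustrative, not part of the repo):

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// resolveBuildVersion mirrors the Dockerfile logic: prefer the
// BUILD_VERSION build arg, else read the VERSION file and strip
// all whitespace.
func resolveBuildVersion(buildArg, versionFile string) string {
	if buildArg != "" {
		return buildArg
	}
	data, err := os.ReadFile(versionFile)
	if err != nil {
		return "" // no arg and no file: version stays empty
	}
	// Fields+Join drops every whitespace run, like `tr -d "[:space:]"`.
	return strings.Join(strings.Fields(string(data)), "")
}

func main() {
	fmt.Println(resolveBuildVersion("v1.2.3", "VERSION"))
}
```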

View File

@@ -160,17 +160,13 @@ go run ./cmd/ds2api
# 1. Prepare env file
cp .env.example .env
# 2. Generate DS2API_CONFIG_JSON from config.json (single-line Base64)
DS2API_CONFIG_JSON="$(base64 < config.json | tr -d '\n')"
# 3. Edit .env and set:
# 2. Edit .env (at least set DS2API_ADMIN_KEY)
# DS2API_ADMIN_KEY=replace-with-a-strong-secret
# DS2API_CONFIG_JSON=${DS2API_CONFIG_JSON}
# 4. Start
# 3. Start
docker-compose up -d
# 5. View logs
# 4. View logs
docker-compose logs -f
```
@@ -182,6 +178,8 @@ docker-compose logs -f
2. After deployment, open `/admin` and log in with the `DS2API_ADMIN_KEY` from the Zeabur env/template instructions.
3. Import/edit config in the Admin UI (it is written and persisted to `/data/config.json`).
Note: when Zeabur builds directly from the repo `Dockerfile`, there is no need to pass `BUILD_VERSION`; the image prefers the build arg when provided and automatically falls back to the repo-root `VERSION` file when it is absent.
### Option 3: Vercel
1. Fork the repo to your own GitHub account
@@ -246,13 +244,11 @@ cp opencode.json.example opencode.json
"accounts": [
{
"email": "user@example.com",
"password": "your-password",
"token": ""
"password": "your-password"
},
{
"mobile": "12345678901",
"password": "your-password",
"token": ""
"password": "your-password"
}
],
"model_aliases": {
@@ -273,7 +269,7 @@ cp opencode.json.example opencode.json
"embeddings": {
"provider": "deterministic"
},
"claude_model_mapping": {
"claude_mapping": {
"fast": "deepseek-chat",
"slow": "deepseek-reasoner"
},
@@ -284,21 +280,25 @@ cp opencode.json.example opencode.json
"account_max_inflight": 2,
"account_max_queue": 0,
"global_max_inflight": 0
},
"auto_delete": {
"sessions": false
}
}
```
- `keys`: API access keys; clients authenticate via `Authorization: Bearer <key>`
- `accounts`: DeepSeek account list; supports `email` or `mobile` login
- `token`: leave empty to auto-login on first request, or pre-fill an existing token
- `token`: even if set in the config file, it is cleared on load (tokens are never read from `config.json`); actual tokens are kept only in runtime memory and refreshed automatically
- `model_aliases`: maps common model names (GPT/Codex/Claude) to DeepSeek models
- `compat.wide_input_strict_output`: keep `true` (the current default is lenient input, strict output)
- `toolcall`: fixed to feature matching + high-confidence early emit
- `responses.store_ttl_seconds`: in-memory cache TTL for `/v1/responses/{id}`
- `embeddings.provider`: embeddings provider (built-in `deterministic/mock/builtin`)
- `claude_model_mapping`: maps the `fast`/`slow` suffixes to the corresponding DeepSeek models
- `claude_mapping`: maps the `fast`/`slow` suffixes to the corresponding DeepSeek models (reads `claude_model_mapping` for compatibility)
- `admin`: admin panel settings (JWT expiry, password hash, etc.), hot-reloadable via the Admin Settings API
- `runtime`: runtime parameters (concurrency limits, queue sizes), hot-reloadable via the Admin Settings API
- `runtime`: runtime parameters (concurrency limits, queue sizes), hot-reloadable via the Admin Settings API; `account_max_queue=0`/`global_max_inflight=0` means auto-calculate from recommended values
- `auto_delete.sessions`: whether to auto-delete DeepSeek sessions after a request completes (default `false`, hot-reloadable via Settings)
### Environment Variables
@@ -397,7 +397,7 @@ ds2api/
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── helpers/ # Node.js helper modules
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
@@ -410,6 +410,7 @@ ds2api/
│ ├── compat/ # Compatibility helpers
│ ├── config/ # Config loading and hot-reload
│ ├── deepseek/ # DeepSeek API client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture module
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt construction
@@ -420,7 +421,9 @@ ds2api/
│ └── webui/ # WebUI static file serving and auto-build
├── webui/ # React WebUI source (Vite + Tailwind)
│ └── src/
│ ├── components/ # AccountManager / ApiTester / BatchImport / VercelSync / Login / LandingPage
│ ├── app/ # Routing, auth, config state
│ ├── features/ # Feature modules (account/settings/vercel/apiTester)
│ ├── components/ # Shared components (login/landing page, etc.)
│ └── locales/ # Language packs (zh.json / en.json)
├── scripts/
│ └── build-webui.sh # Manual WebUI build script
@@ -500,7 +503,7 @@ go test -v -run 'TestParseToolCalls|TestRepair' ./internal/util/
- **Trigger**: only on GitHub Release `published` (normal pushes do not trigger builds)
- **Outputs**: multi-platform binary archives (`linux/amd64`, `linux/arm64`, `darwin/amd64`, `darwin/arm64`, `windows/amd64`) + `sha256sums.txt`
- **Container publishing**: GHCR only (`ghcr.io/cjackhwang/ds2api`)
- **Each archive includes**: `ds2api` executable, `static/admin`, WASM file, config examples, README, LICENSE
- **Each archive includes**: `ds2api` executable, `static/admin`, WASM file (with embedded fallback), config examples, README, LICENSE
## Disclaimer

View File

@@ -160,17 +160,13 @@ Default URL: `http://localhost:5001`
# 1. Prepare env file
cp .env.example .env
# 2. Generate DS2API_CONFIG_JSON from config.json (single-line Base64)
DS2API_CONFIG_JSON="$(base64 < config.json | tr -d '\n')"
# 3. Edit .env and set:
# 2. Edit .env (at least set DS2API_ADMIN_KEY)
# DS2API_ADMIN_KEY=replace-with-a-strong-secret
# DS2API_CONFIG_JSON=${DS2API_CONFIG_JSON}
# 4. Start
# 3. Start
docker-compose up -d
# 5. View logs
# 4. View logs
docker-compose logs -f
```
@@ -182,6 +178,8 @@ Rebuild after updates: `docker-compose up -d --build`
2. After deployment, open `/admin` and log in with the `DS2API_ADMIN_KEY` shown in the Zeabur env/template instructions.
3. Import / edit config in Admin UI (it will be written and persisted to `/data/config.json`).
Note: when Zeabur builds directly from the repo `Dockerfile`, you do not need to pass `BUILD_VERSION`. The image prefers that build arg when provided, and automatically falls back to the repo-root `VERSION` file when it is absent.
### Option 3: Vercel
1. Fork this repo to your GitHub account
@@ -246,13 +244,11 @@ cp opencode.json.example opencode.json
"accounts": [
{
"email": "user@example.com",
"password": "your-password",
"token": ""
"password": "your-password"
},
{
"mobile": "12345678901",
"password": "your-password",
"token": ""
"password": "your-password"
}
],
"model_aliases": {
@@ -273,7 +269,7 @@ cp opencode.json.example opencode.json
"embeddings": {
"provider": "deterministic"
},
"claude_model_mapping": {
"claude_mapping": {
"fast": "deepseek-chat",
"slow": "deepseek-reasoner"
},
@@ -284,21 +280,25 @@ cp opencode.json.example opencode.json
"account_max_inflight": 2,
"account_max_queue": 0,
"global_max_inflight": 0
},
"auto_delete": {
"sessions": false
}
}
```
- `keys`: API access keys; clients authenticate via `Authorization: Bearer <key>`
- `accounts`: DeepSeek account list, supports `email` or `mobile` login
- `token`: Leave empty for auto-login on first request; or pre-fill an existing token
- `token`: Even if set in `config.json`, it is cleared during load (DS2API does not read persisted tokens from config); runtime tokens are maintained/refreshed in memory only
- `model_aliases`: Map common model names (GPT/Codex/Claude) to DeepSeek models
- `compat.wide_input_strict_output`: Keep `true` (current default policy)
- `toolcall`: Fixed to feature matching + high-confidence early emit
- `responses.store_ttl_seconds`: In-memory TTL for `/v1/responses/{id}`
- `embeddings.provider`: Embeddings provider (`deterministic/mock/builtin` built-in)
- `claude_model_mapping`: Maps `fast`/`slow` suffixes to corresponding DeepSeek models
- `claude_mapping`: Maps `fast`/`slow` suffixes to corresponding DeepSeek models (still compatible with `claude_model_mapping`)
- `admin`: Admin panel settings (JWT expiry, password hash, etc.), hot-reloadable via Admin Settings API
- `runtime`: Runtime parameters (concurrency limits, queue sizes), hot-reloadable via Admin Settings API
- `runtime`: Runtime parameters (concurrency limits, queue sizes), hot-reloadable via Admin Settings API; `account_max_queue=0`/`global_max_inflight=0` means auto-calculate from recommended values
- `auto_delete.sessions`: Whether to auto-delete DeepSeek sessions after request completion (default `false`, hot-reloadable via Settings)
### Environment Variables
@@ -398,7 +398,7 @@ ds2api/
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── helpers/ # Node.js helper modules
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
@@ -411,6 +411,7 @@ ds2api/
│ ├── compat/ # Compatibility helpers
│ ├── config/ # Config loading and hot-reload
│ ├── deepseek/ # DeepSeek API client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture module
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt construction
@@ -421,7 +422,9 @@ ds2api/
│ └── webui/ # WebUI static file serving and auto-build
├── webui/ # React WebUI source (Vite + Tailwind)
│ └── src/
│ ├── components/ # AccountManager / ApiTester / BatchImport / VercelSync / Login / LandingPage
│ ├── app/ # Routing, auth, config state
│ ├── features/ # Feature modules (account/settings/vercel/apiTester)
│ ├── components/ # Shared UI pieces (login/landing, etc.)
│ └── locales/ # Language packs (zh.json / en.json)
├── scripts/
│ └── build-webui.sh # Manual WebUI build script
@@ -484,7 +487,7 @@ Workflow: `.github/workflows/release-artifacts.yml`
- **Trigger**: only on GitHub Release `published` (normal pushes do not trigger builds)
- **Outputs**: multi-platform archives (`linux/amd64`, `linux/arm64`, `darwin/amd64`, `darwin/arm64`, `windows/amd64`) + `sha256sums.txt`
- **Container publishing**: GHCR only (`ghcr.io/cjackhwang/ds2api`)
- **Each archive includes**: `ds2api` executable, `static/admin`, WASM file, config template, README, LICENSE
- **Each archive includes**: `ds2api` executable, `static/admin`, WASM file (with embedded fallback support), config template, README, LICENSE
## Disclaimer


@@ -51,7 +51,7 @@ DS2API provides two tiers of tests:
1. **Preflight checks**
- `go test ./... -count=1` (unit tests)
- `./tests/scripts/check-node-split-syntax.sh` (syntax gate for the split Node modules)
- `node --test` (run only when Node unit-test files exist in the repo; the current default is Go tests plus the Node syntax gate)
- `node --test tests/node/stream-tool-sieve.test.js tests/node/chat-stream.test.js tests/node/js_compat_test.js`
- `npm run build --prefix webui` (WebUI build check)
2. **Isolated startup**: copy `config.json` to a temporary directory and start a standalone service process


@@ -1 +1 @@
0.1.0
2.3.8


@@ -9,20 +9,17 @@
{
"_comment": "Email login",
"email": "example1@example.com",
"password": "your-password-1",
"token": ""
"password": "your-password-1"
},
{
"_comment": "Email login - account 2",
"email": "example2@example.com",
"password": "your-password-2",
"token": ""
"password": "your-password-2"
},
{
"_comment": "Mobile login (mainland China)",
"mobile": "12345678901",
"password": "your-password-3",
"token": ""
"password": "your-password-3"
}
],
"model_aliases": {
@@ -43,8 +40,19 @@
"embeddings": {
"provider": "deterministic"
},
"claude_model_mapping": {
"claude_mapping": {
"fast": "deepseek-chat",
"slow": "deepseek-reasoner"
},
"admin": {
"jwt_expire_hours": 24
},
"runtime": {
"account_max_inflight": 2,
"account_max_queue": 0,
"global_max_inflight": 0
},
"auto_delete": {
"sessions": false
}
}


@@ -194,7 +194,7 @@ func TestPoolAccountConcurrencyAliasEnv(t *testing.T) {
}
}
func TestPoolSupportsTokenOnlyAccount(t *testing.T) {
func TestPoolDropsLegacyTokenOnlyAccountOnLoad(t *testing.T) {
t.Setenv("DS2API_ACCOUNT_MAX_INFLIGHT", "1")
t.Setenv("DS2API_CONFIG_JSON", `{
"keys":["k1"],
@@ -203,19 +203,15 @@ func TestPoolSupportsTokenOnlyAccount(t *testing.T) {
pool := NewPool(config.LoadStore())
status := pool.Status()
if got, ok := status["total"].(int); !ok || got != 1 {
if got, ok := status["total"].(int); !ok || got != 0 {
t.Fatalf("unexpected total in pool status: %#v", status["total"])
}
if got, ok := status["available"].(int); !ok || got != 1 {
if got, ok := status["available"].(int); !ok || got != 0 {
t.Fatalf("unexpected available in pool status: %#v", status["available"])
}
acc, ok := pool.Acquire("", nil)
if !ok {
t.Fatalf("expected acquire success for token-only account")
}
if acc.Token != "token-only-account" {
t.Fatalf("unexpected token on acquired account: %q", acc.Token)
if _, ok := pool.Acquire("", nil); ok {
t.Fatalf("expected acquire to fail for token-only account")
}
}


@@ -128,6 +128,9 @@ func TestBuildClaudeToolPromptSingleTool(t *testing.T) {
if !containsStr(prompt, "tool_use") {
t.Fatalf("expected tool_use instruction in prompt")
}
if !containsStr(prompt, "Never output [TOOL_CALL_HISTORY] or [TOOL_RESULT_HISTORY] markers yourself") {
t.Fatalf("expected marker guard instruction in prompt")
}
if containsStr(prompt, "tool_calls") {
t.Fatalf("expected prompt to avoid tool_calls JSON instruction")
}


@@ -54,6 +54,7 @@ func buildClaudeToolPrompt(tools []any) string {
"When you need a tool, respond with Claude-native tool use (tool_use) using the provided tool schema. Do not print tool-call JSON in text.",
"History markers in conversation: [TOOL_CALL_HISTORY]...[/TOOL_CALL_HISTORY] are your previous tool calls; [TOOL_RESULT_HISTORY]...[/TOOL_RESULT_HISTORY] are runtime tool outputs, not user input.",
"After a valid [TOOL_RESULT_HISTORY], continue with final answer instead of repeating the same call unless required fields are still missing.",
"Never output [TOOL_CALL_HISTORY] or [TOOL_RESULT_HISTORY] markers yourself; they are system-side context only.",
)
return strings.Join(parts, "\n\n")
}


@@ -53,7 +53,7 @@ func injectToolPrompt(messages []map[string]any, tools []any, policy util.ToolCh
if len(toolSchemas) == 0 {
return messages, names
}
toolPrompt := "You have access to these tools:\n\n" + strings.Join(toolSchemas, "\n\n") + "\n\nWhen you need to use tools, output ONLY a JSON code block like this:\n```json\n{\"tool_calls\": [{\"name\": \"tool_name\", \"input\": {\"param\": \"value\"}}]}\n```\n\n【EXAMPLE】\nUser: Please check the weather in Beijing and Shanghai, and update my todo list.\nAssistant:\n```json\n{\"tool_calls\": [\n {\"name\": \"get_weather\", \"input\": {\"city\": \"Beijing\"}},\n {\"name\": \"get_weather\", \"input\": {\"city\": \"Shanghai\"}},\n {\"name\": \"update_todo\", \"input\": {\"todos\": [{\"content\": \"Buy milk\"}, {\"content\": \"Write report\"}]}}\n]}\n```\n\nHistory markers in conversation:\n- [TOOL_CALL_HISTORY]...[/TOOL_CALL_HISTORY] means a tool call you already made earlier.\n- [TOOL_RESULT_HISTORY]...[/TOOL_RESULT_HISTORY] means the runtime returned a tool result (not user input).\n\nIMPORTANT:\n1) If calling tools, output ONLY the JSON code block. The response must start with ```json and end with ```.\n2) After receiving a tool result, you MUST use it to produce the final answer.\n3) Only call another tool when the previous result is missing required data or returned an error.\n4) Do not repeat a tool call that is already satisfied by an existing [TOOL_RESULT_HISTORY] block.\n5) JSON SYNTAX STRICTLY REQUIRED: All property names MUST be enclosed in double quotes (e.g., \"name\", not name).\n6) ARRAY FORMAT: If providing a list of items, you MUST enclose them in square brackets `[]` (e.g., \"todos\": [{\"item\": \"a\"}, {\"item\": \"b\"}]). DO NOT output comma-separated objects without brackets."
toolPrompt := "You have access to these tools:\n\n" + strings.Join(toolSchemas, "\n\n") + "\n\nWhen you need to use tools, output ONLY a JSON code block like this:\n```json\n{\"tool_calls\": [{\"name\": \"tool_name\", \"input\": {\"param\": \"value\"}}]}\n```\n\n【EXAMPLE】\nUser: Please check the weather in Beijing and Shanghai, and update my todo list.\nAssistant:\n```json\n{\"tool_calls\": [\n {\"name\": \"get_weather\", \"input\": {\"city\": \"Beijing\"}},\n {\"name\": \"get_weather\", \"input\": {\"city\": \"Shanghai\"}},\n {\"name\": \"update_todo\", \"input\": {\"todos\": [{\"content\": \"Buy milk\"}, {\"content\": \"Write report\"}]}}\n]}\n```\n\nHistory markers in conversation:\n- [TOOL_CALL_HISTORY]...[/TOOL_CALL_HISTORY] means a tool call you already made earlier.\n- [TOOL_RESULT_HISTORY]...[/TOOL_RESULT_HISTORY] means the runtime returned a tool result (not user input).\n\nIMPORTANT:\n1) If calling tools, output ONLY the JSON code block. The response must start with ```json and end with ```.\n2) After receiving a tool result, you MUST use it to produce the final answer.\n3) Only call another tool when the previous result is missing required data or returned an error.\n4) Do not repeat a tool call that is already satisfied by an existing [TOOL_RESULT_HISTORY] block.\n5) Never output [TOOL_CALL_HISTORY] or [TOOL_RESULT_HISTORY] markers in your answer; these markers are system-side context only.\n6) JSON SYNTAX STRICTLY REQUIRED: All property names MUST be enclosed in double quotes (e.g., \"name\", not name).\n7) ARRAY FORMAT: If providing a list of items, you MUST enclose them in square brackets `[]` (e.g., \"todos\": [{\"item\": \"a\"}, {\"item\": \"b\"}]). DO NOT output comma-separated objects without brackets."
if policy.Mode == util.ToolChoiceRequired {
toolPrompt += "\n8) For this response, you MUST call at least one tool from the allowed list."
}


@@ -651,6 +651,48 @@ func TestHandleStreamFencedToolCallSnippetPromotesToolCall(t *testing.T) {
if strings.Contains(strings.ToLower(got), "tool_calls") {
t.Fatalf("expected raw fenced tool_calls snippet stripped from content, got=%q", got)
}
if strings.Contains(strings.ToLower(got), "```json") || strings.Contains(got, "\n```\n") {
t.Fatalf("expected consumed fenced tool payload to not leave empty code fence, got=%q", got)
}
if streamFinishReason(frames) != "tool_calls" {
t.Fatalf("expected finish_reason=tool_calls, body=%s", rec.Body.String())
}
}
func TestHandleStreamStandaloneToolCallAfterClosedFenceKeepsFence(t *testing.T) {
h := &Handler{}
resp := makeSSEHTTPResponse(
fmt.Sprintf(`data: {"p":"response/content","v":%q}`, "先给一个代码示例:\n```text\nhello\n```\n"),
fmt.Sprintf(`data: {"p":"response/content","v":%q}`, "{\"tool_calls\":[{\"name\":\"search\",\"input\":{\"q\":\"go\"}}]}"),
`data: [DONE]`,
)
rec := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodPost, "/v1/chat/completions", nil)
h.handleStream(rec, req, resp, "cid7g", "deepseek-chat", "prompt", false, false, []string{"search"})
frames, done := parseSSEDataFrames(t, rec.Body.String())
if !done {
t.Fatalf("expected [DONE], body=%s", rec.Body.String())
}
if !streamHasToolCallsDelta(frames) {
t.Fatalf("expected tool_calls delta for standalone payload, body=%s", rec.Body.String())
}
content := strings.Builder{}
for _, frame := range frames {
choices, _ := frame["choices"].([]any)
for _, item := range choices {
choice, _ := item.(map[string]any)
delta, _ := choice["delta"].(map[string]any)
if c, ok := delta["content"].(string); ok {
content.WriteString(c)
}
}
}
got := content.String()
if !strings.Contains(got, "```") {
t.Fatalf("expected closed fence before standalone tool json to be preserved, got=%q", got)
}
if streamFinishReason(frames) != "tool_calls" {
t.Fatalf("expected finish_reason=tool_calls, body=%s", rec.Body.String())
}


@@ -80,4 +80,7 @@ func TestBuildOpenAIFinalPrompt_VercelPreparePathKeepsFinalAnswerInstruction(t *
if !strings.Contains(finalPrompt, "[TOOL_RESULT_HISTORY]") {
t.Fatalf("vercel prepare finalPrompt missing history marker instruction: %q", finalPrompt)
}
if !strings.Contains(finalPrompt, "Never output [TOOL_CALL_HISTORY] or [TOOL_RESULT_HISTORY] markers in your answer") {
t.Fatalf("vercel prepare finalPrompt missing marker-output guard instruction: %q", finalPrompt)
}
}


@@ -182,6 +182,9 @@ func findToolSegmentStart(s string) int {
if start < 0 {
start = bestKeyIdx
}
if fenceStart, ok := openFenceStartBefore(s, start); ok {
return fenceStart
}
return start
}
@@ -191,7 +194,7 @@ func consumeToolCapture(state *toolStreamSieveState, toolNames []string) (prefix
return "", nil, "", false
}
lower := strings.ToLower(captured)
keyIdx := -1
keywords := []string{"tool_calls", "function.name:", "[tool_call_history]"}
for _, kw := range keywords {
@@ -200,7 +203,7 @@ func consumeToolCapture(state *toolStreamSieveState, toolNames []string) (prefix
keyIdx = idx
}
}
if keyIdx < 0 {
return "", nil, "", false
}
@@ -226,5 +229,45 @@ func consumeToolCapture(state *toolStreamSieveState, toolNames []string) (prefix
// For now, keep the original logic but rely on loose JSON repair.
return captured, nil, "", true
}
prefixPart, suffixPart = trimWrappingJSONFence(prefixPart, suffixPart)
return prefixPart, parsed.Calls, suffixPart, true
}
func trimWrappingJSONFence(prefix, suffix string) (string, string) {
trimmedPrefix := strings.TrimRight(prefix, " \t\r\n")
fenceIdx := strings.LastIndex(trimmedPrefix, "```")
if fenceIdx < 0 {
return prefix, suffix
}
// Only strip when the trailing fence in prefix behaves like an opening fence.
// A legitimate closing fence before a standalone tool JSON must be preserved.
if strings.Count(trimmedPrefix[:fenceIdx+3], "```")%2 == 0 {
return prefix, suffix
}
fenceHeader := strings.TrimSpace(trimmedPrefix[fenceIdx+3:])
if fenceHeader != "" && !strings.EqualFold(fenceHeader, "json") {
return prefix, suffix
}
trimmedSuffix := strings.TrimLeft(suffix, " \t\r\n")
if !strings.HasPrefix(trimmedSuffix, "```") {
return prefix, suffix
}
consumedLeading := len(suffix) - len(trimmedSuffix)
return trimmedPrefix[:fenceIdx], suffix[consumedLeading+3:]
}
func openFenceStartBefore(s string, pos int) (int, bool) {
if pos <= 0 || pos > len(s) {
return -1, false
}
segment := s[:pos]
lastFence := strings.LastIndex(segment, "```")
if lastFence < 0 {
return -1, false
}
if strings.Count(segment, "```")%2 == 1 {
return lastFence, true
}
return -1, false
}


@@ -36,8 +36,10 @@ func RegisterRoutes(r chi.Router, h *Handler) {
pr.Post("/test", h.testAPI)
pr.Post("/vercel/sync", h.syncVercel)
pr.Get("/vercel/status", h.vercelStatus)
pr.Post("/vercel/status", h.vercelStatus)
pr.Get("/export", h.exportConfig)
pr.Get("/dev/captures", h.getDevCaptures)
pr.Delete("/dev/captures", h.clearDevCaptures)
pr.Get("/version", h.getVersion)
})
}


@@ -6,7 +6,6 @@ import (
"net/http"
"net/http/httptest"
"net/url"
"strings"
"testing"
"github.com/go-chi/chi/v5"
@@ -26,9 +25,9 @@ func newAdminTestHandler(t *testing.T, raw string) *Handler {
}
}
func TestListAccountsIncludesTokenOnlyIdentifier(t *testing.T) {
func TestListAccountsUsesEmailIdentifier(t *testing.T) {
h := newAdminTestHandler(t, `{
"accounts":[{"token":"token-only-account"}]
"accounts":[{"email":"u@example.com","password":"pwd"}]
}`)
req := httptest.NewRequest(http.MethodGet, "/admin/accounts?page=1&page_size=10", nil)
@@ -49,38 +48,8 @@ func TestListAccountsIncludesTokenOnlyIdentifier(t *testing.T) {
}
first, _ := items[0].(map[string]any)
identifier, _ := first["identifier"].(string)
if identifier == "" {
t.Fatalf("expected non-empty identifier: %#v", first)
}
if !strings.HasPrefix(identifier, "token:") {
t.Fatalf("expected token synthetic identifier, got %q", identifier)
}
}
func TestDeleteAccountSupportsTokenOnlyIdentifier(t *testing.T) {
h := newAdminTestHandler(t, `{
"accounts":[{"token":"token-only-account"}]
}`)
accounts := h.Store.Accounts()
if len(accounts) != 1 {
t.Fatalf("expected 1 account, got %d", len(accounts))
}
id := accounts[0].Identifier()
if id == "" {
t.Fatal("expected token-only synthetic identifier")
}
r := chi.NewRouter()
r.Delete("/admin/accounts/{identifier}", h.deleteAccount)
req := httptest.NewRequest(http.MethodDelete, "/admin/accounts/"+url.PathEscape(id), nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("unexpected status: %d body=%s", rec.Code, rec.Body.String())
}
if got := len(h.Store.Accounts()); got != 0 {
t.Fatalf("expected account removed, remaining=%d", got)
if identifier != "u@example.com" {
t.Fatalf("expected email identifier, got %q", identifier)
}
}
@@ -142,11 +111,10 @@ func TestAddAccountRejectsCanonicalMobileDuplicate(t *testing.T) {
}
}
func TestFindAccountByIdentifierSupportsMobileAndTokenOnly(t *testing.T) {
func TestFindAccountByIdentifierSupportsMobile(t *testing.T) {
h := newAdminTestHandler(t, `{
"accounts":[
{"email":"u@example.com","mobile":"13800138000","password":"pwd"},
{"token":"token-only-account"}
{"email":"u@example.com","mobile":"13800138000","password":"pwd"}
]
}`)
@@ -165,21 +133,4 @@ func TestFindAccountByIdentifierSupportsMobileAndTokenOnly(t *testing.T) {
t.Fatalf("unexpected account by +86 mobile: %#v", accByMobileWithCountryCode)
}
tokenOnlyID := ""
for _, acc := range h.Store.Accounts() {
if strings.TrimSpace(acc.Email) == "" && strings.TrimSpace(acc.Mobile) == "" {
tokenOnlyID = acc.Identifier()
break
}
}
if tokenOnlyID == "" {
t.Fatal("expected token-only account identifier")
}
accByTokenOnly, ok := findAccountByIdentifier(h.Store, tokenOnlyID)
if !ok {
t.Fatalf("expected find by token-only id=%q", tokenOnlyID)
}
if accByTokenOnly.Token != "token-only-account" {
t.Fatalf("unexpected token-only account: %#v", accByTokenOnly)
}
}


@@ -89,7 +89,15 @@ func runAccountTestsConcurrently(accounts []config.Account, maxConcurrency int,
func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, message string) map[string]any {
start := time.Now()
identifier := acc.Identifier()
result := map[string]any{"account": identifier, "success": false, "response_time": 0, "message": "", "model": model, "session_count": 0}
result := map[string]any{
"account": identifier,
"success": false,
"response_time": 0,
"message": "",
"model": model,
"session_count": 0,
"config_writable": !h.Store.IsEnvBacked(),
}
defer func() {
status := "failed"
if ok, _ := result["success"].(bool); ok {
@@ -97,15 +105,14 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
}
_ = h.Store.UpdateAccountTestStatus(identifier, status)
}()
token := strings.TrimSpace(acc.Token)
if token == "" {
newToken, err := h.DS.Login(ctx, acc)
if err != nil {
result["message"] = "登录失败: " + err.Error()
return result
}
token = newToken
_ = h.Store.UpdateAccountToken(acc.Identifier(), token)
token, err := h.DS.Login(ctx, acc)
if err != nil {
result["message"] = "登录失败: " + err.Error()
return result
}
if err := h.Store.UpdateAccountToken(acc.Identifier(), token); err != nil {
result["message"] = "登录成功但写入运行时 token 失败: " + err.Error()
return result
}
authCtx := &authn.RequestAuth{UseConfigToken: false, DeepSeekToken: token}
sessionID, err := h.DS.CreateSession(ctx, authCtx, 1)
@@ -117,7 +124,10 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
}
token = newToken
authCtx.DeepSeekToken = token
_ = h.Store.UpdateAccountToken(acc.Identifier(), token)
if err := h.Store.UpdateAccountToken(acc.Identifier(), token); err != nil {
result["message"] = "刷新 token 成功但写入运行时 token 失败: " + err.Error()
return result
}
sessionID, err = h.DS.CreateSession(ctx, authCtx, 1)
if err != nil {
result["message"] = "创建会话失败: " + err.Error()
@@ -133,7 +143,7 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
if strings.TrimSpace(message) == "" {
result["success"] = true
result["message"] = "API 测试成功(仅会话创建)"
result["message"] = "Token 刷新成功(登录与会话创建成功)"
result["response_time"] = int(time.Since(start).Milliseconds())
return result
}
@@ -232,20 +242,16 @@ func (h *Handler) deleteAllSessions(w http.ResponseWriter, r *http.Request) {
return
}
// fetch the token
token := strings.TrimSpace(acc.Token)
if token == "" {
newToken, err := h.DS.Login(r.Context(), acc)
if err != nil {
writeJSON(w, http.StatusOK, map[string]any{"success": false, "message": "登录失败: " + err.Error()})
return
}
token = newToken
_ = h.Store.UpdateAccountToken(acc.Identifier(), token)
// log in to refresh the token on every call, so a stale token is never used
token, err := h.DS.Login(r.Context(), acc)
if err != nil {
writeJSON(w, http.StatusOK, map[string]any{"success": false, "message": "登录失败: " + err.Error()})
return
}
_ = h.Store.UpdateAccountToken(acc.Identifier(), token)
// delete all sessions
err := h.DS.DeleteAllSessionsForToken(r.Context(), token)
err = h.DS.DeleteAllSessionsForToken(r.Context(), token)
if err != nil {
// the token may be expired; re-login and retry once
newToken, loginErr := h.DS.Login(r.Context(), acc)


@@ -77,7 +77,7 @@ func TestTestAccount_BatchModeOnlyCreatesSession(t *testing.T) {
t.Fatalf("expected success=true, got %#v", result)
}
msg, _ := result["message"].(string)
if !strings.Contains(msg, "仅会话创建") {
if !strings.Contains(msg, "Token 刷新成功") {
t.Fatalf("expected session-only success message, got %q", msg)
}
if ds.loginCalls != 1 || ds.createSessionCalls != 1 {
@@ -118,8 +118,8 @@ func TestDeleteAllSessions_RetryWithReloginOnDeleteFailure(t *testing.T) {
if ok, _ := resp["success"].(bool); !ok {
t.Fatalf("expected success response, got %#v", resp)
}
if ds.loginCalls != 1 {
t.Fatalf("expected relogin once, got %d", ds.loginCalls)
if ds.loginCalls != 2 {
t.Fatalf("expected initial login plus relogin, got %d", ds.loginCalls)
}
if ds.deleteAllSessionsCalls != 2 {
t.Fatalf("expected delete called twice, got %d", ds.deleteAllSessionsCalls)


@@ -43,6 +43,7 @@ func (h *Handler) configImport(w http.ResponseWriter, r *http.Request) {
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": err.Error()})
return
}
incoming.ClearAccountTokens()
importedKeys, importedAccounts := 0, 0
err = h.Store.Update(func(c *config.Config) error {
@@ -180,6 +181,7 @@ func (h *Handler) configImport(w http.ResponseWriter, r *http.Request) {
func (h *Handler) computeSyncHash() string {
snap := h.Store.Snapshot().Clone()
snap.ClearAccountTokens()
snap.VercelSyncHash = ""
snap.VercelSyncTime = 0
b, _ := json.Marshal(snap)


@@ -8,8 +8,9 @@ import (
func (h *Handler) getConfig(w http.ResponseWriter, _ *http.Request) {
snap := h.Store.Snapshot()
safe := map[string]any{
"keys": snap.Keys,
"accounts": []map[string]any{},
"keys": snap.Keys,
"accounts": []map[string]any{},
"env_backed": h.Store.IsEnvBacked(),
"claude_mapping": func() map[string]string {
if len(snap.ClaudeMapping) > 0 {
return snap.ClaudeMapping


@@ -50,9 +50,6 @@ func (h *Handler) updateConfig(w http.ResponseWriter, r *http.Request) {
if strings.TrimSpace(acc.Password) == "" {
acc.Password = prev.Password
}
if strings.TrimSpace(acc.Token) == "" {
acc.Token = prev.Token
}
}
seen[key] = struct{}{}
accounts = append(accounts, acc)


@@ -3,6 +3,8 @@ package admin
import (
"bytes"
"context"
"crypto/md5"
"encoding/base64"
"encoding/json"
"fmt"
"io"
@@ -11,6 +13,8 @@ import (
"os"
"strings"
"time"
"ds2api/internal/config"
)
func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
@@ -25,7 +29,7 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
return
}
validated, failed := h.validateAccountsForVercelSync(r.Context(), opts.AutoValidate)
_, cfgB64, err := h.Store.ExportJSONAndBase64()
cfgJSON, cfgB64, err := h.exportSyncConfig(req)
if err != nil {
writeJSON(w, http.StatusInternalServerError, map[string]any{"detail": err.Error()})
return
@@ -47,7 +51,7 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
}
savedCreds := h.saveVercelProjectCredentials(r.Context(), client, opts, params, headers, envs)
manual, deployURL := triggerVercelDeployment(r.Context(), client, opts.ProjectID, params, headers)
_ = h.Store.SetVercelSync(h.computeSyncHash(), time.Now().Unix())
_ = h.Store.SetVercelSync(syncHashForJSON(cfgJSON), time.Now().Unix())
result := map[string]any{"success": true, "validated_accounts": validated}
if manual {
result["message"] = "配置已同步到 Vercel请手动触发重新部署"
@@ -209,11 +213,71 @@ func triggerVercelDeployment(ctx context.Context, client *http.Client, projectID
return false, deployURL
}
func (h *Handler) vercelStatus(w http.ResponseWriter, _ *http.Request) {
func (h *Handler) vercelStatus(w http.ResponseWriter, r *http.Request) {
snap := h.Store.Snapshot()
current := h.computeSyncHash()
synced := snap.VercelSyncHash != "" && snap.VercelSyncHash == current
writeJSON(w, http.StatusOK, map[string]any{"synced": synced, "last_sync_time": nilIfZero(snap.VercelSyncTime), "has_synced_before": snap.VercelSyncHash != ""})
draftHash := ""
draftDiffers := false
if r != nil && r.Method == http.MethodPost && r.Body != nil {
var req map[string]any
if err := json.NewDecoder(r.Body).Decode(&req); err == nil {
if cfgJSON, _, err := h.exportSyncConfig(req); err == nil {
draftHash = syncHashForJSON(cfgJSON)
draftDiffers = draftHash != "" && draftHash != current
}
}
}
writeJSON(w, http.StatusOK, map[string]any{
"synced": synced,
"last_sync_time": nilIfZero(snap.VercelSyncTime),
"has_synced_before": snap.VercelSyncHash != "",
"env_backed": h.Store.IsEnvBacked(),
"config_hash": current,
"last_synced_hash": snap.VercelSyncHash,
"draft_hash": draftHash,
"draft_differs": draftDiffers,
})
}
func (h *Handler) exportSyncConfig(req map[string]any) (string, string, error) {
override, ok := req["config_override"]
if !ok || override == nil {
return h.Store.ExportJSONAndBase64()
}
raw, err := json.Marshal(override)
if err != nil {
return "", "", err
}
var cfg config.Config
if err := json.Unmarshal(raw, &cfg); err != nil {
return "", "", err
}
cfg.DropInvalidAccounts()
cfg.ClearAccountTokens()
cfg.VercelSyncHash = ""
cfg.VercelSyncTime = 0
b, err := json.Marshal(cfg)
if err != nil {
return "", "", err
}
return string(b), base64.StdEncoding.EncodeToString(b), nil
}
func syncHashForJSON(s string) string {
var cfg config.Config
if err := json.Unmarshal([]byte(s), &cfg); err != nil {
return ""
}
cfg.VercelSyncHash = ""
cfg.VercelSyncTime = 0
cfg.ClearAccountTokens()
b, err := json.Marshal(cfg)
if err != nil {
return ""
}
sum := md5.Sum(b)
return fmt.Sprintf("%x", sum)
}
func vercelRequest(ctx context.Context, client *http.Client, method, endpoint string, params url.Values, headers map[string]string, body any) (map[string]any, int, error) {


@@ -0,0 +1,75 @@
package admin
import (
"encoding/json"
"net/http"
"strings"
"time"
"ds2api/internal/version"
)
const latestReleaseAPI = "https://api.github.com/repos/CJackHwang/ds2api/releases/latest"
type latestReleasePayload struct {
TagName string `json:"tag_name"`
HTMLURL string `json:"html_url"`
PublishedAt string `json:"published_at"`
}
func (h *Handler) getVersion(w http.ResponseWriter, _ *http.Request) {
current, source := version.Current()
resp := map[string]any{
"success": true,
"current_version": current,
"current_tag": version.Tag(current),
"source": source,
"checked_at": time.Now().UTC().Format(time.RFC3339),
}
req, err := http.NewRequest(http.MethodGet, latestReleaseAPI, nil)
if err != nil {
resp["check_error"] = err.Error()
writeJSON(w, http.StatusOK, resp)
return
}
req.Header.Set("Accept", "application/vnd.github+json")
req.Header.Set("User-Agent", "ds2api-version-check")
client := &http.Client{Timeout: 4 * time.Second}
r, err := client.Do(req)
if err != nil {
resp["check_error"] = err.Error()
writeJSON(w, http.StatusOK, resp)
return
}
defer r.Body.Close()
if r.StatusCode < 200 || r.StatusCode >= 300 {
resp["check_error"] = "github api status: " + r.Status
writeJSON(w, http.StatusOK, resp)
return
}
var data latestReleasePayload
if err := json.NewDecoder(r.Body).Decode(&data); err != nil {
resp["check_error"] = err.Error()
writeJSON(w, http.StatusOK, resp)
return
}
latest := strings.TrimSpace(data.TagName)
if latest == "" {
resp["check_error"] = "missing latest tag"
writeJSON(w, http.StatusOK, resp)
return
}
latestVersion := strings.TrimPrefix(latest, "v")
resp["latest_tag"] = latest
resp["latest_version"] = latestVersion
resp["release_url"] = data.HTMLURL
resp["published_at"] = data.PublishedAt
resp["has_update"] = version.Compare(current, latestVersion) < 0
writeJSON(w, http.StatusOK, resp)
}


@@ -65,7 +65,6 @@ func toAccount(m map[string]any) config.Account {
Email: email,
Mobile: mobile,
Password: fieldString(m, "password"),
Token: fieldString(m, "token"),
}
}


@@ -188,8 +188,8 @@ func TestToAccountAllFields(t *testing.T) {
if acc.Password != "secret" {
t.Fatalf("unexpected password: %q", acc.Password)
}
if acc.Token != "tok123" {
t.Fatalf("unexpected token: %q", acc.Token)
if acc.Token != "" {
t.Fatalf("expected token to be ignored, got %q", acc.Token)
}
}


@@ -0,0 +1,109 @@
package admin
import (
"bytes"
"encoding/json"
"net/http"
"net/http/httptest"
"strings"
"testing"
"github.com/go-chi/chi/v5"
"ds2api/internal/account"
"ds2api/internal/config"
)
func newHTTPAdminHarness(t *testing.T, rawConfig string, ds DeepSeekCaller) http.Handler {
t.Helper()
t.Setenv("DS2API_CONFIG_JSON", rawConfig)
t.Setenv("CONFIG_JSON", "")
store := config.LoadStore()
h := &Handler{
Store: store,
Pool: account.NewPool(store),
DS: ds,
}
r := chi.NewRouter()
RegisterRoutes(r, h)
return r
}
func adminReq(method, path string, body []byte) *http.Request {
req := httptest.NewRequest(method, path, bytes.NewReader(body))
req.Header.Set("Authorization", "Bearer admin")
req.Header.Set("Content-Type", "application/json")
return req
}
func TestConfigImportIgnoresTokenFieldInPayload(t *testing.T) {
ds := &testingDSMock{}
router := newHTTPAdminHarness(t, `{"accounts":[]}`, ds)
payload := []byte(`{
"mode":"replace",
"config":{
"accounts":[{"email":"u@example.com","password":"pwd","token":"expired-token"}]
}
}`)
rec := httptest.NewRecorder()
router.ServeHTTP(rec, adminReq(http.MethodPost, "/config/import", payload))
if rec.Code != http.StatusOK {
t.Fatalf("import status=%d body=%s", rec.Code, rec.Body.String())
}
readRec := httptest.NewRecorder()
router.ServeHTTP(readRec, adminReq(http.MethodGet, "/config", nil))
if readRec.Code != http.StatusOK {
t.Fatalf("get config status=%d body=%s", readRec.Code, readRec.Body.String())
}
var data map[string]any
if err := json.Unmarshal(readRec.Body.Bytes(), &data); err != nil {
t.Fatalf("decode config response: %v", err)
}
accounts, _ := data["accounts"].([]any)
if len(accounts) != 1 {
t.Fatalf("expected one account, got %d", len(accounts))
}
accountMap, _ := accounts[0].(map[string]any)
if hasToken, _ := accountMap["has_token"].(bool); hasToken {
t.Fatalf("expected imported token to be ignored, account=%#v", accountMap)
}
}
func TestAccountTestRefreshesRuntimeTokenButExportOmitsToken(t *testing.T) {
ds := &testingDSMock{}
router := newHTTPAdminHarness(t, `{
"accounts":[{"email":"batch@example.com","password":"pwd","token":"stale-token"}]
}`, ds)
rec := httptest.NewRecorder()
router.ServeHTTP(rec, adminReq(http.MethodPost, "/accounts/test", []byte(`{"identifier":"batch@example.com"}`)))
if rec.Code != http.StatusOK {
t.Fatalf("test account status=%d body=%s", rec.Code, rec.Body.String())
}
var testResp map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &testResp); err != nil {
t.Fatalf("decode test response: %v", err)
}
if ok, _ := testResp["success"].(bool); !ok {
t.Fatalf("expected test success, got %#v", testResp)
}
if ds.loginCalls < 1 {
t.Fatalf("expected login to be called at least once, got %d", ds.loginCalls)
}
exportRec := httptest.NewRecorder()
router.ServeHTTP(exportRec, adminReq(http.MethodGet, "/config/export", nil))
if exportRec.Code != http.StatusOK {
t.Fatalf("export status=%d body=%s", exportRec.Code, exportRec.Body.String())
}
var exportResp map[string]any
if err := json.Unmarshal(exportRec.Body.Bytes(), &exportResp); err != nil {
t.Fatalf("decode export response: %v", err)
}
exportJSON, _ := exportResp["json"].(string)
if strings.Contains(exportJSON, `"token"`) {
t.Fatalf("expected export json to omit tokens, got %s", exportJSON)
}
}


@@ -58,7 +58,7 @@ func TestDetermineWithXAPIKeyManagedKeyAcquiresAccount(t *testing.T) {
if auth.AccountID != "acc@example.com" {
t.Fatalf("unexpected account id: %q", auth.AccountID)
}
if auth.DeepSeekToken != "account-token" {
if auth.DeepSeekToken != "fresh-token" {
t.Fatalf("unexpected account token: %q", auth.DeepSeekToken)
}
if auth.CallerID == "" {

View File

@@ -1,10 +1,6 @@
package config
import (
"crypto/sha256"
"encoding/hex"
"strings"
)
import "strings"
func (a Account) Identifier() string {
if strings.TrimSpace(a.Email) != "" {
@@ -13,12 +9,5 @@ func (a Account) Identifier() string {
if mobile := NormalizeMobileForStorage(a.Mobile); mobile != "" {
return mobile
}
// Backward compatibility: old configs may contain token-only accounts.
// Use a stable non-sensitive synthetic id so they can still join the pool.
token := strings.TrimSpace(a.Token)
if token == "" {
return ""
}
sum := sha256.Sum256([]byte(token))
return "token:" + hex.EncodeToString(sum[:8])
return ""
}

View File

@@ -12,8 +12,8 @@ type Config struct {
Toolcall ToolcallConfig `json:"toolcall,omitempty"`
Responses ResponsesConfig `json:"responses,omitempty"`
Embeddings EmbeddingsConfig `json:"embeddings,omitempty"`
AutoDelete AutoDeleteConfig `json:"auto_delete"`
VercelSyncHash string `json:"_vercel_sync_hash,omitempty"`
AutoDelete AutoDeleteConfig `json:"auto_delete"`
VercelSyncHash string `json:"_vercel_sync_hash,omitempty"`
VercelSyncTime int64 `json:"_vercel_sync_time,omitempty"`
AdditionalFields map[string]any `json:"-"`
}
@@ -26,6 +26,32 @@ type Account struct {
TestStatus string `json:"test_status,omitempty"`
}
func (c *Config) ClearAccountTokens() {
if c == nil {
return
}
for i := range c.Accounts {
c.Accounts[i].Token = ""
}
}
// DropInvalidAccounts removes accounts that cannot be addressed by admin APIs
// (no email and no normalizable mobile). This prevents legacy token-only
// records from becoming orphaned empty entries after token stripping.
func (c *Config) DropInvalidAccounts() {
if c == nil || len(c.Accounts) == 0 {
return
}
kept := make([]Account, 0, len(c.Accounts))
for _, acc := range c.Accounts {
if acc.Identifier() == "" {
continue
}
kept = append(kept, acc)
}
c.Accounts = kept
}
type CompatConfig struct {
WideInputStrictOutput *bool `json:"wide_input_strict_output,omitempty"`
}

View File

@@ -2,25 +2,22 @@ package config
import (
"encoding/base64"
"strings"
"os"
"testing"
)
func TestAccountIdentifierFallsBackToTokenHash(t *testing.T) {
func TestAccountIdentifierRequiresEmailOrMobile(t *testing.T) {
acc := Account{Token: "example-token-value"}
id := acc.Identifier()
if !strings.HasPrefix(id, "token:") {
t.Fatalf("expected token-prefixed identifier, got %q", id)
}
if len(id) != len("token:")+16 {
t.Fatalf("unexpected identifier length: %d (%q)", len(id), id)
if id != "" {
t.Fatalf("expected empty identifier when only token is present, got %q", id)
}
}
func TestStoreFindAccountWithTokenOnlyIdentifier(t *testing.T) {
func TestLoadStoreClearsTokensFromConfigInput(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{
"keys":["k1"],
"accounts":[{"token":"token-only-account"}]
"accounts":[{"email":"u@example.com","password":"p","token":"token-only-account"}]
}`)
store := LoadStore()
@@ -28,22 +25,62 @@ func TestStoreFindAccountWithTokenOnlyIdentifier(t *testing.T) {
if len(accounts) != 1 {
t.Fatalf("expected 1 account, got %d", len(accounts))
}
id := accounts[0].Identifier()
if id == "" {
t.Fatalf("expected synthetic identifier for token-only account")
}
found, ok := store.FindAccount(id)
if !ok {
t.Fatalf("expected FindAccount to locate token-only account by synthetic id")
}
if found.Token != "token-only-account" {
t.Fatalf("unexpected token value: %q", found.Token)
if accounts[0].Token != "" {
t.Fatalf("expected token to be cleared after loading, got %q", accounts[0].Token)
}
}
func TestStoreUpdateAccountTokenKeepsOldAndNewIdentifierResolvable(t *testing.T) {
func TestLoadStoreDropsLegacyTokenOnlyAccounts(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{
"accounts":[{"token":"old-token"}]
"accounts":[
{"token":"legacy-token-only"},
{"email":"u@example.com","password":"p","token":"runtime-token"}
]
}`)
store := LoadStore()
accounts := store.Accounts()
if len(accounts) != 1 {
t.Fatalf("expected token-only account to be dropped, got %d accounts", len(accounts))
}
if accounts[0].Identifier() != "u@example.com" {
t.Fatalf("unexpected remaining account: %#v", accounts[0])
}
if accounts[0].Token != "" {
t.Fatalf("expected persisted token to be cleared, got %q", accounts[0].Token)
}
}
func TestLoadStorePreservesFileBackedTokensForRuntime(t *testing.T) {
tmp, err := os.CreateTemp(t.TempDir(), "config-*.json")
if err != nil {
t.Fatalf("create temp config: %v", err)
}
defer tmp.Close()
if _, err := tmp.WriteString(`{
"accounts":[{"email":"u@example.com","password":"p","token":"persisted-token"}]
}`); err != nil {
t.Fatalf("write temp config: %v", err)
}
t.Setenv("DS2API_CONFIG_JSON", "")
t.Setenv("CONFIG_JSON", "")
t.Setenv("DS2API_CONFIG_PATH", tmp.Name())
store := LoadStore()
accounts := store.Accounts()
if len(accounts) != 1 {
t.Fatalf("expected 1 account, got %d", len(accounts))
}
if accounts[0].Token != "persisted-token" {
t.Fatalf("expected file-backed token preserved for runtime use, got %q", accounts[0].Token)
}
}
func TestStoreUpdateAccountTokenKeepsIdentifierResolvable(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{
"accounts":[{"email":"user@example.com","password":"p"}]
}`)
store := LoadStore()
@@ -52,23 +89,12 @@ func TestStoreUpdateAccountTokenKeepsOldAndNewIdentifierResolvable(t *testing.T)
t.Fatalf("expected 1 account, got %d", len(before))
}
oldID := before[0].Identifier()
if oldID == "" {
t.Fatal("expected old identifier")
}
if err := store.UpdateAccountToken(oldID, "new-token"); err != nil {
t.Fatalf("update token failed: %v", err)
}
after := store.Accounts()
newID := after[0].Identifier()
if newID == "" || newID == oldID {
t.Fatalf("expected changed identifier, old=%q new=%q", oldID, newID)
}
if got, ok := store.FindAccount(newID); !ok || got.Token != "new-token" {
t.Fatalf("expected find by new identifier")
}
if got, ok := store.FindAccount(oldID); !ok || got.Token != "new-token" {
t.Fatalf("expected find by old identifier alias")
t.Fatalf("expected find by stable account identifier")
}
}

View File

@@ -39,6 +39,8 @@ func loadConfig() (Config, bool, error) {
}
if rawCfg != "" {
cfg, err := parseConfigString(rawCfg)
cfg.ClearAccountTokens()
cfg.DropInvalidAccounts()
return cfg, true, err
}
@@ -55,6 +57,7 @@ func loadConfig() (Config, bool, error) {
if err := json.Unmarshal(content, &cfg); err != nil {
return Config{}, false, err
}
cfg.DropInvalidAccounts()
if IsVercel() {
// Vercel filesystem is ephemeral/read-only for runtime writes; avoid save errors.
return cfg, true, nil
@@ -161,7 +164,9 @@ func (s *Store) Save() error {
Logger.Info("[save_config] source from env, skip write")
return nil
}
b, err := json.MarshalIndent(s.cfg, "", " ")
persistCfg := s.cfg.Clone()
persistCfg.ClearAccountTokens()
b, err := json.MarshalIndent(persistCfg, "", " ")
if err != nil {
return err
}
@@ -173,7 +178,9 @@ func (s *Store) saveLocked() error {
Logger.Info("[save_config] source from env, skip write")
return nil
}
b, err := json.MarshalIndent(s.cfg, "", " ")
persistCfg := s.cfg.Clone()
persistCfg.ClearAccountTokens()
b, err := json.MarshalIndent(persistCfg, "", " ")
if err != nil {
return err
}
@@ -197,7 +204,9 @@ func (s *Store) SetVercelSync(hash string, ts int64) error {
func (s *Store) ExportJSONAndBase64() (string, string, error) {
s.mu.RLock()
defer s.mu.RUnlock()
b, err := json.Marshal(s.cfg)
exportCfg := s.cfg.Clone()
exportCfg.ClearAccountTokens()
b, err := json.Marshal(exportCfg)
if err != nil {
return "", "", err
}

View File

@@ -73,7 +73,7 @@ func (c *Client) CreateSession(ctx context.Context, a *auth.RequestAuth, maxAtte
}
config.Logger.Warn("[create_session] failed", "status", status, "code", code, "biz_code", bizCode, "msg", msg, "biz_msg", bizMsg, "use_config_token", a.UseConfigToken, "account", a.AccountID)
if a.UseConfigToken {
if isTokenInvalid(status, code, bizCode, msg, bizMsg) && !refreshed {
if !refreshed && shouldAttemptRefresh(status, code, bizCode, msg, bizMsg) {
if c.Auth.RefreshToken(ctx, a) {
refreshed = true
continue
@@ -118,7 +118,7 @@ func (c *Client) GetPow(ctx context.Context, a *auth.RequestAuth, maxAttempts in
}
config.Logger.Warn("[get_pow] failed", "status", status, "code", code, "biz_code", bizCode, "msg", msg, "biz_msg", bizMsg, "use_config_token", a.UseConfigToken, "account", a.AccountID)
if a.UseConfigToken {
if isTokenInvalid(status, code, bizCode, msg, bizMsg) && !refreshed {
if !refreshed && shouldAttemptRefresh(status, code, bizCode, msg, bizMsg) {
if c.Auth.RefreshToken(ctx, a) {
refreshed = true
continue
@@ -160,6 +160,47 @@ func isTokenInvalid(status int, code int, bizCode int, msg string, bizMsg string
strings.Contains(msg, "invalid jwt")
}
func shouldAttemptRefresh(status int, code int, bizCode int, msg string, bizMsg string) bool {
if isTokenInvalid(status, code, bizCode, msg, bizMsg) {
return true
}
// Some DeepSeek failures come back as HTTP 200/code=0 but with non-zero biz_code.
// Only attempt refresh when these biz failures still look auth-related.
return status == http.StatusOK &&
code == 0 &&
bizCode != 0 &&
isAuthIndicativeBizFailure(msg, bizMsg)
}
func isAuthIndicativeBizFailure(msg string, bizMsg string) bool {
combined := strings.ToLower(strings.TrimSpace(msg) + " " + strings.TrimSpace(bizMsg))
authKeywords := []string{
"auth",
"authorization",
"credential",
"expired",
"invalid jwt",
"jwt",
"login",
"not login",
"session expired",
"token",
"unauthorized",
"登录",
"未登录",
"认证",
"凭证",
"会话过期",
"令牌",
}
for _, keyword := range authKeywords {
if strings.Contains(combined, keyword) {
return true
}
}
return false
}
func extractResponseStatus(resp map[string]any) (code int, bizCode int, msg string, bizMsg string) {
code = intFrom(resp["code"])
msg, _ = resp["msg"].(string)

View File

@@ -0,0 +1,27 @@
package deepseek
import "testing"
func TestShouldAttemptRefreshOnTokenInvalidSignal(t *testing.T) {
if !shouldAttemptRefresh(401, 0, 0, "unauthorized", "") {
t.Fatal("expected refresh when response indicates invalid token")
}
}
func TestShouldAttemptRefreshOnAuthIndicativeBizCodeFailure(t *testing.T) {
if !shouldAttemptRefresh(200, 0, 400123, "", "login expired, token invalid") {
t.Fatal("expected refresh on auth-indicative biz_code failure")
}
}
func TestShouldAttemptRefreshFalseOnNonAuthBizCodeFailure(t *testing.T) {
if shouldAttemptRefresh(200, 0, 400123, "", "session create failed: quota reached") {
t.Fatal("did not expect refresh on non-auth biz_code failure")
}
}
func TestShouldAttemptRefreshFalseOnGenericServerError(t *testing.T) {
if shouldAttemptRefresh(500, 500, 0, "internal error", "") {
t.Fatal("did not expect refresh on generic server error")
}
}

View File

@@ -60,6 +60,9 @@ function formatIncrementalToolCallDeltas(deltas, idStore) {
if (typeof d.arguments === 'string' && d.arguments !== '') {
fn.arguments = d.arguments;
}
if (Object.keys(fn).length === 0) {
continue;
}
if (Object.keys(fn).length > 0) {
item.function = fn;
}

View File

@@ -1,33 +1,22 @@
'use strict';
const {
extractToolNames,
createToolSieveState,
processToolSieveChunk,
flushToolSieve,
parseStandaloneToolCalls,
formatOpenAIStreamToolCalls,
} = require('../helpers/stream-tool-sieve');
const {
BASE_HEADERS,
} = require('../shared/deepseek-constants');
const {
writeOpenAIError,
} = require('./error_shape');
const {
parseChunkForContent,
isCitation,
} = require('./sse_parse');
const {
buildUsage,
} = require('./token_usage');
const { BASE_HEADERS } = require('../shared/deepseek-constants');
const { writeOpenAIError } = require('./error_shape');
const { parseChunkForContent, isCitation } = require('./sse_parse');
const { buildUsage } = require('./token_usage');
const {
resolveToolcallPolicy,
formatIncrementalToolCallDeltas,
filterIncrementalToolCallDeltasByAllowed,
} = require('./toolcall_policy');
const {
createChatCompletionEmitter,
} = require('./stream_emitter');
const { createChatCompletionEmitter } = require('./stream_emitter');
const {
asString,
isAbortError,
@@ -57,6 +46,7 @@ async function handleVercelStream(req, res, rawBody, payload) {
const searchEnabled = toBool(prep.body.search_enabled);
const toolPolicy = resolveToolcallPolicy(prep.body, payload.tools);
const toolNames = toolPolicy.toolNames;
const emitEarlyToolDeltas = toolPolicy.emitEarlyToolDeltas;
if (!model || !leaseID || !deepseekToken || !powHeader || !completionPayload) {
writeOpenAIError(res, 500, 'invalid vercel prepare response');
@@ -132,6 +122,7 @@ async function handleVercelStream(req, res, rawBody, payload) {
const toolSieveState = createToolSieveState();
let toolCallsEmitted = false;
const streamToolCallIDs = new Map();
const streamToolNames = new Map();
const decoder = new TextDecoder();
reader = completionRes.body.getReader();
let buffered = '';
@@ -255,6 +246,18 @@ async function handleVercelStream(req, res, rawBody, payload) {
}
const events = processToolSieveChunk(toolSieveState, p.text, toolNames);
for (const evt of events) {
if (evt.type === 'tool_call_deltas') {
if (!emitEarlyToolDeltas) {
continue;
}
const filtered = filterIncrementalToolCallDeltasByAllowed(evt.deltas, toolNames, streamToolNames);
const formatted = formatIncrementalToolCallDeltas(filtered, streamToolCallIDs);
if (formatted.length > 0) {
toolCallsEmitted = true;
sendDeltaFrame({ tool_calls: formatted });
}
continue;
}
if (evt.type === 'tool_calls') {
toolCallsEmitted = true;
sendDeltaFrame({ tool_calls: formatOpenAIStreamToolCalls(evt.calls, streamToolCallIDs) });

View File

@@ -17,15 +17,18 @@ function extractToolNames(tools) {
return [];
}
const out = [];
const seen = new Set();
for (const t of tools) {
if (!t || typeof t !== 'object') {
continue;
}
const fn = t.function && typeof t.function === 'object' ? t.function : t;
const name = toStringSafe(fn.name);
// Keep parity with Go injectToolPrompt: object tools without name still
// enter tool mode via fallback name "unknown".
out.push(name || 'unknown');
if (!name || seen.has(name)) {
continue;
}
seen.add(name);
out.push(name);
}
return out;
}

View File

@@ -256,11 +256,40 @@ function consumeToolCapture(state, toolNames) {
};
}
const trimmedFence = trimWrappingJSONFence(prefixPart, suffixPart);
return {
ready: true,
prefix: prefixPart,
prefix: trimmedFence.prefix,
calls: parsed.calls,
suffix: suffixPart,
suffix: trimmedFence.suffix,
};
}
function trimWrappingJSONFence(prefix, suffix) {
const rightTrimmedPrefix = (prefix || '').replace(/[ \t\r\n]+$/g, '');
const fenceIdx = rightTrimmedPrefix.lastIndexOf('```');
if (fenceIdx < 0) {
return { prefix, suffix };
}
// Only strip when this behaves like an opening fence.
// If it's a legitimate closing fence before standalone tool JSON, keep it.
const fenceCount = (rightTrimmedPrefix.slice(0, fenceIdx + 3).match(/```/g) || []).length;
if (fenceCount % 2 === 0) {
return { prefix, suffix };
}
const header = rightTrimmedPrefix.slice(fenceIdx + 3).trim().toLowerCase();
if (header && header !== 'json') {
return { prefix, suffix };
}
const leftTrimmedSuffix = (suffix || '').replace(/^[ \t\r\n]+/g, '');
if (!leftTrimmedSuffix.startsWith('```')) {
return { prefix, suffix };
}
const consumed = (suffix || '').length - leftTrimmedSuffix.length;
return {
prefix: rightTrimmedPrefix.slice(0, fenceIdx),
suffix: (suffix || '').slice(consumed + 3),
};
}

View File

@@ -26,6 +26,7 @@ func parseToolCallInput(v any) map[string]any {
repaired := repairInvalidJSONBackslashes(raw)
if repaired != raw {
if err := json.Unmarshal([]byte(repaired), &parsed); err == nil && parsed != nil {
repairPathLikeControlChars(parsed)
return parsed
}
}
@@ -33,6 +34,7 @@ func parseToolCallInput(v any) map[string]any {
repairedLoose := RepairLooseJSON(raw)
if repairedLoose != raw {
if err := json.Unmarshal([]byte(repairedLoose), &parsed); err == nil && parsed != nil {
repairPathLikeControlChars(parsed)
return parsed
}
}

internal/version/version.go (new file, 185 lines)
View File

@@ -0,0 +1,185 @@
package version
import (
"os"
"path/filepath"
"runtime"
"strconv"
"strings"
"sync"
)
// BuildVersion can be injected at build time via -ldflags.
// In release builds it should come from Git tag (e.g. v2.3.5).
var BuildVersion = ""
var (
currentOnce sync.Once
currentVal string
sourceVal string
)
func Current() (value string, source string) {
currentOnce.Do(func() {
if build := strings.TrimSpace(BuildVersion); build != "" {
currentVal = normalize(build)
sourceVal = "build-ldflags"
return
}
if fv := readVersionFile(); fv != "" {
currentVal = normalize(fv)
sourceVal = "file:VERSION"
return
}
if vv := versionFromVercelEnv(); vv != "" {
currentVal = vv
sourceVal = "env:vercel"
return
}
currentVal = "dev"
sourceVal = "default"
})
return currentVal, sourceVal
}
func readVersionFile() string {
candidates := []string{"VERSION"}
if wd, err := os.Getwd(); err == nil {
candidates = append(candidates, filepath.Join(wd, "VERSION"))
}
if _, file, _, ok := runtime.Caller(0); ok {
repoRoot := filepath.Clean(filepath.Join(filepath.Dir(file), "../.."))
candidates = append(candidates, filepath.Join(repoRoot, "VERSION"))
}
seen := map[string]struct{}{}
for _, c := range candidates {
c = filepath.Clean(strings.TrimSpace(c))
if c == "" {
continue
}
if _, ok := seen[c]; ok {
continue
}
seen[c] = struct{}{}
b, err := os.ReadFile(c)
if err != nil {
continue
}
if v := strings.TrimSpace(string(b)); v != "" {
return v
}
}
return ""
}
func normalize(v string) string {
v = strings.TrimSpace(v)
if v == "" {
return ""
}
return strings.TrimPrefix(v, "v")
}
func Tag(v string) string {
v = normalize(v)
if v == "" || v == "dev" {
return v
}
if v[0] < '0' || v[0] > '9' {
return v
}
return "v" + v
}
func versionFromVercelEnv() string {
if tag := normalize(strings.TrimSpace(os.Getenv("VERCEL_GIT_COMMIT_TAG"))); tag != "" {
return tag
}
ref := strings.TrimSpace(os.Getenv("VERCEL_GIT_COMMIT_REF"))
sha := strings.TrimSpace(os.Getenv("VERCEL_GIT_COMMIT_SHA"))
if len(sha) > 7 {
sha = sha[:7]
}
ref = sanitizeVersionLabel(ref)
sha = sanitizeVersionLabel(sha)
if ref == "" && sha == "" {
return ""
}
if ref != "" && sha != "" {
return "preview-" + ref + "." + sha
}
if ref != "" {
return "preview-" + ref
}
return "preview-" + sha
}
func sanitizeVersionLabel(in string) string {
in = strings.TrimSpace(strings.ToLower(in))
if in == "" {
return ""
}
var b strings.Builder
b.Grow(len(in))
prevDash := false
for i := 0; i < len(in); i++ {
c := in[i]
if (c >= 'a' && c <= 'z') || (c >= '0' && c <= '9') {
b.WriteByte(c)
prevDash = false
continue
}
if !prevDash {
b.WriteByte('-')
prevDash = true
}
}
out := strings.Trim(b.String(), "-")
return out
}
func Compare(a, b string) int {
pa := parse(normalize(a))
pb := parse(normalize(b))
for i := 0; i < 3; i++ {
if pa[i] < pb[i] {
return -1
}
if pa[i] > pb[i] {
return 1
}
}
return 0
}
func parse(v string) [3]int {
var out [3]int
parts := strings.SplitN(v, ".", 4)
for i := 0; i < 3 && i < len(parts); i++ {
n := readLeadingInt(parts[i])
out[i] = n
}
return out
}
func readLeadingInt(s string) int {
s = strings.TrimSpace(s)
if s == "" {
return 0
}
i := 0
for ; i < len(s); i++ {
if s[i] < '0' || s[i] > '9' {
break
}
}
if i == 0 {
return 0
}
n, err := strconv.Atoi(s[:i])
if err != nil {
return 0
}
return n
}

View File

@@ -0,0 +1,39 @@
package version
import "testing"
func TestNormalizeAndTag(t *testing.T) {
if got := normalize("v2.3.5"); got != "2.3.5" {
t.Fatalf("normalize failed: %q", got)
}
if got := Tag("2.3.5"); got != "v2.3.5" {
t.Fatalf("tag failed: %q", got)
}
}
func TestCompare(t *testing.T) {
if Compare("2.3.5", "2.3.5") != 0 {
t.Fatal("expected equal")
}
if Compare("2.3.5", "2.3.6") >= 0 {
t.Fatal("expected less")
}
if Compare("v2.10.0", "2.3.9") <= 0 {
t.Fatal("expected greater")
}
}
func TestTagKeepsPreviewStyle(t *testing.T) {
if got := Tag("preview-dev.abcd123"); got != "preview-dev.abcd123" {
t.Fatalf("expected preview tag unchanged, got %q", got)
}
}
func TestVersionFromVercelEnv(t *testing.T) {
t.Setenv("VERCEL_GIT_COMMIT_TAG", "")
t.Setenv("VERCEL_GIT_COMMIT_REF", "dev")
t.Setenv("VERCEL_GIT_COMMIT_SHA", "abcdef123456")
if got := versionFromVercelEnv(); got != "preview-dev.abcdef1" {
t.Fatalf("unexpected vercel preview version: %q", got)
}
}

View File

@@ -1,101 +0,0 @@
# DeepSeek Function Calling Defect Analysis and ds2api's Enhanced Repair Strategy
> **Related PRs**: #74 (core implementation) and #75 (merge to dev)
> **Background**: some models, DeepSeek included, emit function calling (tool call) output that is not strictly well-formed, which causes tool invocations to fail.
## I. Architectural Comparison: Where Do Function Calling Defects Come From?
Before examining the defects, we need to understand the structural difference between two function calling designs:
### 1. OpenAI's Native Structured Responses (API-Level Separation)
In the OpenAI specification, **chat text and tool calls are hard-separated in the underlying JSON structure**:
* Conversational filler lives in `response.choices[0].message.content`.
* Tool requests live in a separate array, `response.choices[0].message.tool_calls`.
**Advantage:** this design is extremely client-friendly. The client only needs to check whether `tool_calls` is empty to decide between executing code and rendering text. It supports multiple concurrent tool requests, and the underlying generation engine is tightly trained and constrained, so it rarely emits syntactically invalid JSON.
### 2. The "Single Text Stream" Mechanism of DeepSeek-Style Models
By contrast, models without deep task-specific fine-tuning (or certain adaptation layers in front of them) still tend to pack everything into one plain text stream. That is why their output often contains not only the JSON that belongs in the `tool_calls` structure, but also prose that belongs in `content`, mixed together.
---
## II. DeepSeek's Specific Function Calling Defects
Compared with OpenAI's strict adherence to the API contract, DeepSeek and similar open-source reasoning models frequently exhibit three typical "non-conforming" output behaviors during tool calls:
### 1. Mixed Prose Streams: Prose Text Interleaved with Tool JSON
When the application expects the model to return a tool request directly, DeepSeek sometimes **"cannot resist chatting with the user"**.
It often prepends an explanatory remark, embeds the tool call JSON in the middle, and appends a closing sentence:
```text
Sure, let me read the contents of README.md for you:
{"tool_calls":[{"name":"read_file","input":{"path":"README.md"}}]}
One moment, I'll have it for you right away.
```
**Legacy pain point:**
the old code enforced a **strict mode** check:
```go
// If any non-empty text surrounds the parsed JSON block, give up treating it as a tool call!
if strings.TrimSpace(state.recentTextTail) != "" || strings.TrimSpace(prefixPart) != "" ... {
	return captured, nil, "", true
}
```
As a result, the structure above was classified by the gateway as "ordinary chat" and returned to the user verbatim, which broke the downstream automatic tool execution flow.
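The splitting idea can be illustrated with a small standalone sketch. This is a hypothetical simplification (the helper name `splitMixedToolOutput` and the naive brace walk are assumptions for illustration only; the real implementation in `tool_sieve_core.go` operates on streamed chunks and is more careful about edge cases):

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// splitMixedToolOutput is a hypothetical sketch: locate a standalone
// {"tool_calls":...} object inside mixed prose and separate the two.
// It naively walks braces and does not handle braces inside JSON strings.
func splitMixedToolOutput(text string) (string, []map[string]any) {
	start := strings.Index(text, `{"tool_calls"`)
	if start < 0 {
		return text, nil
	}
	depth, end := 0, -1
	for i := start; i < len(text) && end < 0; i++ {
		switch text[i] {
		case '{':
			depth++
		case '}':
			depth--
			if depth == 0 {
				end = i + 1
			}
		}
	}
	if end < 0 {
		return text, nil
	}
	var payload struct {
		ToolCalls []map[string]any `json:"tool_calls"`
	}
	if err := json.Unmarshal([]byte(text[start:end]), &payload); err != nil {
		return text, nil
	}
	// Prose on both sides stays as regular content; the JSON becomes tool calls.
	return text[:start] + text[end:], payload.ToolCalls
}

func main() {
	prose, calls := splitMixedToolOutput(
		"Sure, reading it now:\n{\"tool_calls\":[{\"name\":\"read_file\",\"input\":{\"path\":\"README.md\"}}]}\nOne moment.")
	fmt.Printf("prose=%q calls=%d\n", prose, len(calls))
}
```

A real sieve must also tolerate partial chunks arriving over time and braces inside string values, which is exactly what the streaming state machine in the PR handles.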
### 2. Tool Name Hallucination: Unsolicited Renaming or Prefixing
Because DeepSeek's pretraining data contains large amounts of code from many different platforms, it often fails to faithfully reproduce the exact tool name supplied in the system prompt (i.e. `name: "read_file"`), instead adding prefixes or mutating the spelling, for example:
* `{"name": "mcp.search_web"}` (self-invented namespace)
* `{"name": "tools.read_file"}`
* `{"name": "search-web"}` (underscore turned into a hyphen)
**Legacy pain point:**
the old system matched tool names almost exclusively by exact dictionary-style comparison; a single differing character or an added prefix meant no legal tool was found, and the call failed outright.
### 3. Non-Standard Role Values
In some tool-call response streams, the returned `role` was not normalized; it could carry unexpected attributes or conflict with strict downstream comparisons.
---
## III. The Enhanced Fix in PR #74
To cope with these non-conforming model behaviors, PR #74 wires an **extremely tolerant error-correction engine** into the gateway's middle layer. Instead of forcing the model to "behave", it adds three layers of hardening:
### 1. Separating Mixed Content from the Stream (Removing Strict Mode)
`internal/adapter/openai/tool_sieve_core.go` was modified to
drop the surrounding-text rejection logic. When the system detects a complete `{"tool_calls":...}` structure in the stream, it routes the prose and the JSON into separate event streams:
```go
if prefix != "" {
	// Peel off the leading "sure, let me read that file" prose as regular text output
	state.noteText(prefix)
	events = append(events, toolStreamEvent{Content: prefix})
}
// Capture and intercept the tool request in the middle for background execution
state.pendingToolCalls = calls
```
**Effect:** the user's screen shows only normal text, while the backend tool is dispatched immediately.
### 2. Multi-Level Tolerant Matching Engine (Resolve Allowed Tool Name)
`internal/util/toolcalls_parse.go` gains `resolveAllowedToolName`, a funnel that degrades from strict to loose matching:
1. **Exact match**: as before, `read_file` == `read_file`.
2. **Case-insensitive match**: `Read_File` is accepted.
3. **Namespace stripping**: the prefix before the last `.` is removed, restoring `mcp.search_web` to the real `search_web`.
4. **Final regex cleanup**:
introducing `var toolNameLoosePattern = regexp.MustCompile(`[^a-z0-9]+`)`,
which strips every symbol, space, and formatting character from the string.
An incoming `read-file` is scrubbed to `readfile` and compared against the equally scrubbed versions of all legal tool names; as long as the core letters agree, the match succeeds.
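The four funnel stages can be condensed into a rough standalone sketch. `resolveAllowedToolNameSketch` is a hypothetical distillation of the idea, not the actual `resolveAllowedToolName` in `internal/util/toolcalls_parse.go`:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

var toolNameLoosePattern = regexp.MustCompile(`[^a-z0-9]+`)

// resolveAllowedToolNameSketch walks the funnel from strict to loose:
// exact match, case-insensitive match, namespace stripping, regex cleanup.
func resolveAllowedToolNameSketch(name string, allowed []string) (string, bool) {
	for _, a := range allowed {
		if name == a {
			return a, true // 1. exact
		}
	}
	lower := strings.ToLower(strings.TrimSpace(name))
	for _, a := range allowed {
		if lower == strings.ToLower(a) {
			return a, true // 2. case-insensitive
		}
	}
	if idx := strings.LastIndex(lower, "."); idx >= 0 {
		// 3. strip the namespace prefix and retry the whole funnel
		return resolveAllowedToolNameSketch(lower[idx+1:], allowed)
	}
	loose := toolNameLoosePattern.ReplaceAllString(lower, "")
	for _, a := range allowed {
		if loose == toolNameLoosePattern.ReplaceAllString(strings.ToLower(a), "") {
			return a, true // 4. symbols scrubbed, core letters match
		}
	}
	return "", false
}

func main() {
	allowed := []string{"read_file", "search_web"}
	for _, raw := range []string{"read_file", "mcp.search_web", "Read-File", "unknown_tool"} {
		name, ok := resolveAllowedToolNameSketch(raw, allowed)
		fmt.Printf("%-16s -> %q %v\n", raw, name, ok)
	}
}
```

Each stage only runs if the previous one failed, so well-behaved output still takes the fast exact-match path.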
### 3. Role Normalization (normalizeOpenAIRoleForPrompt)
`internal/adapter/openai/responses_input_items.go` and related files introduce a dedicated `normalizeOpenAIRoleForPrompt(role)` cleanup, guaranteeing that the role enum accepted on input and forwarded upstream stays within a controlled set, eliminating crashes caused by unexpected identity fields.
---
## Summary: What tool_sieve Actually Does
PR #74 / #75 does not operate on the model itself; it follows the design philosophy that **the gateway must be robust enough**.
**The entire enhancement is, in essence, a middle-layer gateway called `tool_sieve` (a "tool sieve").**
Faced with DeepSeek's non-standard stream, a lump of chat text kneaded together with tool JSON, `tool_sieve` acts as a diligent, high-precision sieve:
1. it sorts the prose out and puts it back into the standard `content` field for display;
2. it peels out and cleans the flawed JSON blocks, then carefully places them into the `tool_calls` structure in OpenAI's standard format, ready for execution.
This means that even when the model is configured with odd reply settings, bolds its emphasis, or makes small punctuation or spelling slips, **as long as it emits a JSON core that can be assembled into a tool instruction, the relay layer can rescue it and deliver the correct tool result to the model and the user**. This not only fixes the defect but greatly strengthens the generality and robustness of the tool gateway.

View File

@@ -1,32 +0,0 @@
# DS2API Refactor Baseline (Historical Snapshot)
- Snapshot time: `2026-02-22T08:53:54Z`
- Snapshot branch: `dev`
- Snapshot HEAD: `5d3989a`
- Scope: backend + node api + webui large-file decoupling (no behavior change)
## Gate Commands
1. `./tests/scripts/run-unit-all.sh`
- Result: PASS
- Includes:
- `go test ./...`
- `node --test api/helpers/stream-tool-sieve.test.js api/chat-stream.test.js api/compat/js_compat_test.js`
2. `npm --prefix webui run build`
- Result: PASS
3. `./tests/scripts/check-refactor-line-gate.sh`
- Result: PASS (`checked=131 missing=0 over_limit=0`)
4. Stage gates (1-5) replay:
- `go test ./internal/config ./internal/admin ./internal/account ./internal/deepseek ./internal/format/openai` -> PASS
- `go test ./internal/adapter/openai ./internal/util ./internal/sse ./internal/compat` -> PASS
- `go test ./internal/adapter/claude ./internal/adapter/gemini ./internal/config` -> PASS
- `go test ./internal/testsuite ./cmd/ds2api-tests` -> PASS
- `node --test api/helpers/stream-tool-sieve.test.js api/chat-stream.test.js api/compat/js_compat_test.js` -> PASS
5. Final full regression:
- `go test ./... -count=1` -> PASS
## Notes
- This file records a historical baseline for refactor process tracking.
- It is not intended to represent the current repository HEAD.
- Frontend manual smoke for phase 6 still requires human execution and sign-off.

View File

@@ -1,22 +0,0 @@
# Refactor Line Gate
## Rules
1. Backend production files upper bound: `<= 300` lines.
2. Frontend (`webui/`) production files upper bound: `<= 500` lines.
3. Entry/facade files upper bound: `<= 120` lines.
4. Scope is limited to target files in `plans/refactor-line-gate-targets.txt`.
5. Test files are out of scope for this gate.
## Command
```bash
./tests/scripts/check-refactor-line-gate.sh
```
## Naming Note
- Original split plan used `internal/admin/handler_accounts_test.go` for account probing logic.
- In Go, `*_test.go` files are test-only compilation units and cannot host production handlers.
- The production file is implemented as `internal/admin/handler_accounts_testing.go`.

View File

@@ -98,6 +98,12 @@ test('incremental and final tool formatting share stable id via idStore', () =>
assert.equal(incremental[0].id, finalCalls[0].id);
});
test('formatIncrementalToolCallDeltas drops empty deltas (Go parity)', () => {
const idStore = new Map();
const formatted = formatIncrementalToolCallDeltas([{ index: 0 }], idStore);
assert.deepEqual(formatted, []);
});
test('parseChunkForContent keeps split response/content fragments inside response array', () => {
const chunk = {
p: 'response',

View File

@@ -31,13 +31,14 @@ function collectText(events) {
.join('');
}
test('extractToolNames keeps tool mode enabled with unknown fallback', () => {
test('extractToolNames keeps only declared tool names (Go parity)', () => {
const names = extractToolNames([
{ function: { description: 'no name tool' } },
{ function: { name: ' read_file ' } },
{ function: { name: 'read_file' } },
{},
]);
assert.deepEqual(names, ['unknown', 'read_file', 'unknown']);
assert.deepEqual(names, ['read_file']);
});
test('parseToolCalls keeps non-object argument strings as _raw (Go parity)', () => {
@@ -285,6 +286,18 @@ test('sieve emits tool_calls and keeps trailing prose when payload and prose sha
assert.equal(leakedText.toLowerCase().includes('tool_calls'), false);
});
test('sieve preserves closed fence before standalone tool payload', () => {
const events = runSieve(
['先给一个代码示例:\n```text\nhello\n```\n{"tool_calls":[{"name":"read_file","input":{"path":"README.MD"}}]}'],
['read_file'],
);
const hasTool = events.some((evt) => evt.type === 'tool_calls' && evt.calls?.length > 0);
const leakedText = collectText(events);
assert.equal(hasTool, true);
assert.equal(leakedText.includes('```'), true);
assert.equal(leakedText.toLowerCase().includes('tool_calls'), false);
});
test('formatOpenAIStreamToolCalls reuses ids with the same idStore', () => {
const idStore = new Map();
const calls = [{ name: 'read_file', input: { path: 'README.MD' } }];

View File

@@ -78,6 +78,10 @@
"source": "/admin/export",
"destination": "/api/index"
},
{
"source": "/admin/version",
"destination": "/api/index"
},
{
"source": "/admin",
"destination": "/admin/index.html"

View File

@@ -1,5 +1,7 @@
import { useCallback, useEffect, useState } from 'react'
const ENV_DRAFT_KEY = 'ds2api_env_config_draft_v1'
export function useAdminConfig({ token, showMessage, t }) {
const [config, setConfig] = useState({ keys: [], accounts: [] })
@@ -11,6 +13,11 @@ export function useAdminConfig({ token, showMessage, t }) {
})
if (res.ok) {
const data = await res.json()
if (data?.env_backed) {
localStorage.setItem(ENV_DRAFT_KEY, JSON.stringify(data))
} else {
localStorage.removeItem(ENV_DRAFT_KEY)
}
setConfig(data)
}
} catch (e) {
@@ -21,6 +28,17 @@ export function useAdminConfig({ token, showMessage, t }) {
useEffect(() => {
if (token) {
const rawDraft = localStorage.getItem(ENV_DRAFT_KEY)
if (rawDraft) {
try {
const draft = JSON.parse(rawDraft)
if (draft?.env_backed) {
setConfig(draft)
}
} catch (_e) {
localStorage.removeItem(ENV_DRAFT_KEY)
}
}
fetchConfig()
}
}, [fetchConfig, token])

View File

@@ -101,6 +101,7 @@ export default function AccountManagerContainer({ config, onRefresh, onMessage,
onPageSizeChange={changePageSize}
searchQuery={searchQuery}
onSearchChange={handleSearchChange}
envBacked={Boolean(config?.env_backed)}
/>
<AddKeyModal

View File

@@ -26,6 +26,7 @@ export default function AccountsTable({
onPageSizeChange,
searchQuery,
onSearchChange,
envBacked = false,
}) {
const [copiedId, setCopiedId] = useState(null)
@@ -101,14 +102,16 @@ export default function AccountsTable({
) : accounts.length > 0 ? (
accounts.map((acc, i) => {
const id = resolveAccountIdentifier(acc)
const runtimeUnknown = envBacked && !acc.test_status
const isActive = acc.test_status === 'ok' || acc.has_token
return (
<div key={i} className="p-4 flex flex-col md:flex-row md:items-center justify-between gap-4 hover:bg-muted/50 transition-colors">
<div className="flex items-center gap-3 min-w-0">
<div className={clsx(
"w-2 h-2 rounded-full shrink-0",
acc.test_status === 'failed' ? "bg-red-500 shadow-[0_0_8px_rgba(239,68,68,0.5)]" :
(acc.test_status === 'ok' || acc.has_token) ? "bg-emerald-500 shadow-[0_0_8px_rgba(16,185,129,0.5)]" :
"bg-amber-500"
isActive ? "bg-emerald-500 shadow-[0_0_8px_rgba(16,185,129,0.5)]" :
runtimeUnknown ? "bg-blue-500 shadow-[0_0_8px_rgba(59,130,246,0.5)]" : "bg-amber-500"
)} />
<div className="min-w-0">
<div
@@ -122,7 +125,7 @@ export default function AccountsTable({
}
</div>
<div className="flex items-center gap-2 text-xs text-muted-foreground mt-0.5">
<span>{acc.test_status === 'failed' ? t('accountManager.testStatusFailed') : (acc.test_status === 'ok' || acc.has_token) ? t('accountManager.sessionActive') : t('accountManager.reauthRequired')}</span>
<span>{acc.test_status === 'failed' ? t('accountManager.testStatusFailed') : isActive ? t('accountManager.sessionActive') : runtimeUnknown ? t('accountManager.runtimeStatusUnknown') : t('accountManager.reauthRequired')}</span>
{acc.token_preview && (
<span className="font-mono bg-muted px-1.5 py-0.5 rounded text-[10px]">
{acc.token_preview}

View File

@@ -4,7 +4,7 @@ import VercelSyncForm from './VercelSyncForm'
import VercelSyncStatus from './VercelSyncStatus'
import VercelGuide from './VercelGuide'
export default function VercelSyncContainer({ onMessage, authFetch, isVercel = false }) {
export default function VercelSyncContainer({ onMessage, authFetch, isVercel = false, config = null }) {
const { t } = useI18n()
const apiFetch = authFetch || fetch
@@ -28,6 +28,7 @@ export default function VercelSyncContainer({ onMessage, authFetch, isVercel = f
onMessage,
t,
isVercel,
config,
})
return (

View File

@@ -69,6 +69,11 @@ export default function VercelSyncForm({
{t('vercel.lastSyncTime', { time: new Date(syncStatus.last_sync_time * 1000).toLocaleString() })}
</p>
)}
{syncStatus?.draft_differs && (
<p className="text-xs text-amber-500 mt-2">
{t('vercel.draftDiffers')}
</p>
)}
</div>
<div className="space-y-4">

View File

@@ -1,4 +1,4 @@
import { useCallback, useState } from 'react'
import { useCallback, useEffect, useState } from 'react'
import {
LayoutDashboard,
Upload,
@@ -47,6 +47,29 @@ export default function DashboardShell({ token, onLogout, config, fetchConfig, s
 return res
 }, [onLogout, t, token])
 const [versionInfo, setVersionInfo] = useState(null)
+useEffect(() => {
+let disposed = false
+async function loadVersion() {
+try {
+const res = await authFetch('/admin/version')
+const data = await res.json()
+if (!disposed) {
+setVersionInfo(data)
+}
+} catch (_err) {
+if (!disposed) {
+setVersionInfo(null)
+}
+}
+}
+loadVersion()
+return () => {
+disposed = true
+}
+}, [authFetch])
 const renderTab = () => {
 switch (activeTab) {
 case 'accounts':
@@ -56,7 +79,7 @@ export default function DashboardShell({ token, onLogout, config, fetchConfig, s
 case 'import':
 return <BatchImport onRefresh={fetchConfig} onMessage={showMessage} authFetch={authFetch} />
 case 'vercel':
-return <VercelSync onMessage={showMessage} authFetch={authFetch} isVercel={isVercel} />
+return <VercelSync onMessage={showMessage} authFetch={authFetch} isVercel={isVercel} config={config} />
 case 'settings':
 return <Settings onRefresh={fetchConfig} onMessage={showMessage} authFetch={authFetch} onForceLogout={onForceLogout} isVercel={isVercel} />
 default:
default:
@@ -135,6 +158,20 @@ export default function DashboardShell({ token, onLogout, config, fetchConfig, s
 <div className="text-lg font-bold text-foreground">{config.keys?.length || 0}</div>
 </div>
 </div>
+<div className="bg-background rounded-lg p-3 border border-border shadow-sm">
+<div className="text-[9px] text-muted-foreground font-bold uppercase tracking-wider mb-1 opacity-70">{t('sidebar.version')}</div>
+<div className="text-xs font-semibold text-foreground">{versionInfo?.current_tag || '-'}</div>
+{versionInfo?.has_update && (
+<a
+className="inline-flex mt-1 text-[10px] text-amber-500 hover:text-amber-400"
+href={versionInfo?.release_url || 'https://github.com/CJackHwang/ds2api/releases/latest'}
+target="_blank"
+rel="noreferrer"
+>
+{t('sidebar.updateAvailable', { latest: versionInfo.latest_tag || '' })}
+</a>
+)}
+</div>
 <button
 onClick={onLogout}
 className="w-full h-10 flex items-center justify-center gap-2 rounded-lg border border-border text-xs font-medium text-muted-foreground hover:bg-destructive/10 hover:text-destructive hover:border-destructive/20 transition-all"

View File

@@ -32,7 +32,9 @@
 "statusOnline": "Online",
 "accounts": "Accounts",
 "keys": "Keys",
-"signOut": "Sign out"
+"signOut": "Sign out",
+"version": "Version",
+"updateAvailable": "Update available: {latest}"
 },
 "auth": {
 "expired": "Authentication expired. Please sign in again.",
@@ -47,8 +49,8 @@
 "delete": "Delete",
 "copy": "Copy",
 "generate": "Generate",
-"test": "Test",
-"testing": "Testing...",
+"test": "Refresh token",
+"testing": "Refreshing...",
 "loading": "Loading..."
 },
 "messages": {
@@ -91,8 +93,8 @@
 "deleteKeyConfirm": "Are you sure you want to delete this API key?",
 "deleteAccountConfirm": "Are you sure you want to delete this account?",
 "invalidIdentifier": "Invalid account identifier. Operation aborted.",
-"testAllConfirm": "Test API connectivity for all accounts?",
-"testAllCompleted": "Completed: {success}/{total} available",
+"testAllConfirm": "Refresh all account tokens and verify login?",
+"testAllCompleted": "Completed: {success}/{total} refreshed",
 "testFailed": "Test failed: {error}",
 "available": "Available",
 "inUse": "In use",
@@ -108,11 +110,12 @@
 "noApiKeys": "No API keys found.",
 "accountsTitle": "DeepSeek Accounts",
 "accountsDesc": "Manage the DeepSeek account pool",
-"testAll": "Test all",
+"testAll": "Refresh all tokens",
 "addAccount": "Add account",
-"testingAllAccounts": "Testing all accounts...",
+"testingAllAccounts": "Refreshing tokens for all accounts...",
 "sessionActive": "Session active",
 "reauthRequired": "Re-auth required",
+"runtimeStatusUnknown": "Will be determined after sync",
 "testStatusFailed": "Last test failed",
 "noAccounts": "No accounts found.",
 "modalAddKeyTitle": "Add API key",
@@ -148,7 +151,7 @@
 "missingApiKey": "Please provide an API key.",
 "requestFailed": "Request failed.",
 "networkError": "Network error: {error}",
-"testSuccess": "{account}: Test successful ({time}ms)",
+"testSuccess": "{account}: Token refresh successful ({time}ms)",
 "config": "Configuration",
 "modelLabel": "Model",
 "streamMode": "Streaming",
@@ -292,6 +295,7 @@
 "statusNotSynced": "Not synced",
 "statusNeverSynced": "Never synced",
 "lastSyncTime": "Last sync: {time}",
+"draftDiffers": "Frontend draft differs from env config. Click Sync & redeploy.",
 "pollPaused": "Status polling paused after {count} failures.",
 "manualRefresh": "Refresh manually",
 "howItWorks": "How it works",

View File

@@ -32,7 +32,9 @@
 "statusOnline": "在线",
 "accounts": "账号",
 "keys": "密钥",
-"signOut": "退出登录"
+"signOut": "退出登录",
+"version": "版本",
+"updateAvailable": "发现新版本 {latest}"
 },
 "auth": {
 "expired": "认证已过期,请重新登录",
@@ -47,8 +49,8 @@
 "delete": "删除",
 "copy": "复制",
 "generate": "生成",
-"test": "测试",
-"testing": "正在测试...",
+"test": "刷新 Token",
+"testing": "正在刷新...",
 "loading": "加载中..."
 },
 "messages": {
@@ -91,8 +93,8 @@
 "deleteKeyConfirm": "确定要删除此 API 密钥吗?",
 "deleteAccountConfirm": "确定要删除此账号吗?",
 "invalidIdentifier": "账号标识无效,无法执行操作",
-"testAllConfirm": "测试所有账号的 API 连通性",
-"testAllCompleted": "完成:{success}/{total} 可用",
+"testAllConfirm": "刷新所有账号 Token 并验证登录",
+"testAllCompleted": "完成:{success}/{total} 刷新成功",
 "testFailed": "测试失败: {error}",
 "available": "可用",
 "inUse": "正在使用",
@@ -108,11 +110,12 @@
 "noApiKeys": "未找到 API 密钥",
 "accountsTitle": "DeepSeek 账号",
 "accountsDesc": "管理 DeepSeek 账号池",
-"testAll": "测试全部",
+"testAll": "刷新全部 Token",
 "addAccount": "添加账号",
-"testingAllAccounts": "正在测试所有账号...",
+"testingAllAccounts": "正在刷新所有账号 Token...",
 "sessionActive": "已建立会话",
 "reauthRequired": "需重新登录",
+"runtimeStatusUnknown": "状态以同步后为准",
 "testStatusFailed": "上次测试失败",
 "noAccounts": "未找到任何账号",
 "modalAddKeyTitle": "添加 API 密钥",
@@ -148,7 +151,7 @@
 "missingApiKey": "请提供 API 密钥",
 "requestFailed": "请求失败",
 "networkError": "网络错误: {error}",
-"testSuccess": "{account}: 测试成功 ({time}ms)",
+"testSuccess": "{account}: Token 刷新成功 ({time}ms)",
 "config": "配置",
 "modelLabel": "模型",
 "streamMode": "流式模式",
@@ -292,6 +295,7 @@
 "statusNotSynced": "未同步",
 "statusNeverSynced": "从未同步",
 "lastSyncTime": "上次同步: {time}",
+"draftDiffers": "检测到前端草稿与环境变量配置不一致,请点击“同步并重新部署”。",
 "pollPaused": "状态轮询已暂停:连续失败 {count} 次。",
 "manualRefresh": "手动刷新",
 "howItWorks": "工作原理",

View File

@@ -16,6 +16,7 @@ spec:
 - Admin panel: `/admin`
 - Health check: `/healthz`
 - Config is persisted at `/data/config.json` (mounted volume)
+- `BUILD_VERSION` is optional; when omitted, the Docker build automatically falls back to the repo `VERSION` file
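The `BUILD_VERSION` fallback described above can be sketched as a small shell snippet. This is a hypothetical illustration, not the project's actual build script: only the `BUILD_VERSION` build arg and the repo `VERSION` file come from the doc; the temp-directory setup exists just to make the sketch self-contained.

```shell
# Hypothetical sketch: prefer an explicit BUILD_VERSION,
# otherwise read the repo VERSION file (mimicked with a temp copy).
set -eu
tmp="$(mktemp -d)"
printf '1.2.3\n' > "$tmp/VERSION"          # stand-in for the repo VERSION file
unset BUILD_VERSION || true                # simulate the "omitted" case
BUILD_VERSION="${BUILD_VERSION:-$(cat "$tmp/VERSION")}"
echo "BUILD_VERSION=$BUILD_VERSION"
# docker build --build-arg BUILD_VERSION="$BUILD_VERSION" -t ds2api .
rm -rf "$tmp"
```

The `${VAR:-default}` expansion is what makes the build arg optional: an exported `BUILD_VERSION` wins, and an empty or missing one falls through to the file.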
## First-time setup
1. Open your service URL, then visit `/admin`