Compare commits


24 Commits

Author SHA1 Message Date
CJACK.
67501cf4d2 Merge pull request #256 from CJackHwang/dev
DeepSeek attachment upload for all models and all channels
Compatibility of all endpoints still to be tested
2026-04-13 04:00:49 +08:00
CJACK
25234af301 feat: enforce request body size limits and restrict inline file count to prevent resource exhaustion 2026-04-13 03:55:14 +08:00
CJACK
2aee80d0d3 fix: update URL decoding method and refine file ID extraction logic to exclude text-based inputs 2026-04-13 03:49:06 +08:00
CJACK
ab9f3cc417 refactor: remove unused leakedDanglingThinkOpenPattern regex from output sanitizer 2026-04-13 03:40:20 +08:00
CJACK
c92ed8d3c3 refactor: rename apiTester testSuccess key to requestSuccess and update localization files 2026-04-13 03:24:39 +08:00
CJACK
d78789a66e feat: implement error handling for empty upstream responses in chat streams and update UI to display stream-level errors 2026-04-13 03:22:38 +08:00
CJACK
acb110865f feat: implement cross-account validation and improved error handling for file attachments in API tester 2026-04-13 03:15:12 +08:00
CJACK
ffca8be597 feat: implement file readiness polling and add IsImage field to upload results 2026-04-13 02:55:45 +08:00
CJACK
7ef6a7d11f feat: update to v3.4.0 and redesign model selection UI with a dropdown and descriptive panel 2026-04-13 02:27:12 +08:00
CJACK
d53a2ea7d2 refactor: remove unused purpose parameter from upload and upstream empty output handlers 2026-04-13 01:59:51 +08:00
CJACK
daa636e040 refactor: handle upstream thinking-only responses as errors and sanitize dangling think tags in output 2026-04-13 01:55:14 +08:00
CJACK
aa41bae044 feat: add file attachment support to chat interface and API requests 2026-04-13 00:04:38 +08:00
CJACK
2027c7cd77 fix: add JSON headers to DeepSeek requests and prevent string content from being parsed as file IDs in OpenAI adapter 2026-04-12 23:49:56 +08:00
CJACK
0591128601 refactor: fix file handling error suppression, optimize hash calculation, and update API documentation with additional models 2026-04-12 23:35:57 +08:00
CJACK
caafdedb00 feat: implement OpenAI-compatible file upload and reference handling for DeepSeek API 2026-04-12 23:30:22 +08:00
CJACK
0a23c77ff7 feat: add sanitization for think tags and BOS markers in leaked output and update golang.org/x/net dependency 2026-04-12 17:43:57 +08:00
CJACK.
d759804c33 Merge pull request #255 from CJackHwang/codex/refactor-prompt-concatenation-using-tokenizer
feat(prompt): tokenizer-style prompt stitching with thinking-prefix support
2026-04-12 17:14:48 +08:00
CJACK.
433a3a877d feat(prompt): align DeepSeek prompt assembly with tokenizer-style turns 2026-04-12 13:59:42 +08:00
CJACK.
792e295512 Merge pull request #254 from CJackHwang/main
Update VERSION
2026-04-08 20:24:03 +08:00
CJACK.
d053d9ad04 Update VERSION 2026-04-08 20:22:55 +08:00
CJACK.
04e025c5e1 Update README.MD 2026-04-08 18:21:09 +08:00
CJACK.
184cbed3cb Merge pull request #252 from CJackHwang/dev
Merge pull request #249 from shuaihaoV/feat/deepseek-model-type-families

Add default, expert, and vision DeepSeek model families
2026-04-08 18:06:07 +08:00
CJACK.
378f99be4a Merge pull request #249 from shuaihaoV/feat/deepseek-model-type-families
Add default, expert, and vision DeepSeek model families
2026-04-08 17:53:02 +08:00
Shuaihao
ba76a2163b Add default, expert, and vision DeepSeek model families 2026-04-08 14:37:22 +08:00
71 changed files with 3348 additions and 456 deletions


@@ -173,7 +173,7 @@ Gemini-compatible clients can also send `x-goog-api-key`, `?key=`, or `?api_key=
### `GET /v1/models`
No auth required. Returns supported models.
No auth required. Returns the currently supported DeepSeek native model list.
**Response**:
@@ -184,11 +184,21 @@ No auth required. Returns supported models.
{"id": "deepseek-chat", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-reasoner", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-chat-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []}
{"id": "deepseek-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-chat", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-reasoner", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-chat-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-chat", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-reasoner", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-chat-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []}
]
}
```
> Note: `/v1/models` returns normalized DeepSeek native model IDs. Common aliases are accepted only as request input and are not expanded as separate items in this endpoint.
### Model Alias Resolution
For `chat` / `responses` / `embeddings`, DS2API follows a wide-input/strict-output policy:
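The wide-input/strict-output policy above can be sketched as a simple lookup. The alias table below is illustrative only, not DS2API's actual mapping:

```go
package main

import (
	"fmt"
	"strings"
)

// Illustrative alias table; DS2API's real mapping is larger and
// lives in its config/adapters.
var aliasTable = map[string]string{
	"gpt-4o":            "deepseek-chat",
	"o3":                "deepseek-reasoner",
	"claude-sonnet-4-5": "deepseek-chat",
	"gemini-2.5-pro":    "deepseek-chat",
}

// resolveModel is "wide input": it normalizes case/whitespace and
// accepts aliases, but always returns a native-style ID.
func resolveModel(requested string) string {
	m := strings.ToLower(strings.TrimSpace(requested))
	if native, ok := aliasTable[m]; ok {
		return native
	}
	return m // assume it is already a native DeepSeek ID
}

func main() {
	fmt.Println(resolveModel("gpt-4o"))                   // deepseek-chat
	fmt.Println(resolveModel("deepseek-reasoner-search")) // passes through unchanged
}
```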
@@ -211,7 +221,7 @@ Content-Type: application/json
| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| `model` | string | ✅ | DeepSeek native models + common aliases (`gpt-4o`, `gpt-5-codex`, `o3`, `claude-sonnet-4-5`, etc.) |
| `model` | string | ✅ | DeepSeek native models + common aliases (`gpt-4o`, `gpt-5-codex`, `o3`, `claude-sonnet-4-5`, `gemini-2.5-pro`, etc.) |
| `messages` | array | ✅ | OpenAI-style messages |
| `stream` | boolean | ❌ | Default `false` |
| `tools` | array | ❌ | Function calling schema |
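As a minimal sketch, a request body matching the field table above can be assembled like this (the Go struct names are ours; only the JSON field names come from the documented schema):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// JSON field names follow the documented schema.
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
	Stream   bool          `json:"stream,omitempty"` // defaults to false, so it can be omitted
}

// buildRequest assembles the minimal required body: model + messages.
func buildRequest(model, prompt string) ([]byte, error) {
	return json.Marshal(chatRequest{
		Model:    model,
		Messages: []chatMessage{{Role: "user", Content: prompt}},
	})
}

func main() {
	b, err := buildRequest("deepseek-chat", "hello")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}
```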
@@ -408,7 +418,7 @@ No auth required.
}
```
> Note: the example is partial; the real response includes historical Claude 1.x/2.x/3.x/4.x IDs and common aliases.
> Note: the example is partial; besides the current primary aliases, the real response also includes Claude 4.x snapshots plus historical 3.x / 2.x / 1.x IDs and common aliases.
### `POST /anthropic/v1/messages`

API.md

@@ -173,7 +173,7 @@ Gemini-compatible clients can also use `x-goog-api-key`, `?key=`, or `?api_key=`
### `GET /v1/models`
No auth required. Returns the currently supported model list.
No auth required. Returns the currently supported DeepSeek native model list.
**Sample response**:
@@ -184,11 +184,21 @@ Gemini-compatible clients can also use `x-goog-api-key`, `?key=`, or `?api_key=`
{"id": "deepseek-chat", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-reasoner", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-chat-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []}
{"id": "deepseek-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-chat", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-reasoner", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-chat-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-expert-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-chat", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-reasoner", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-chat-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []},
{"id": "deepseek-vision-reasoner-search", "object": "model", "created": 1677610602, "owned_by": "deepseek", "permission": []}
]
}
```
> Note: `/v1/models` returns normalized DeepSeek native model IDs; common aliases are accepted only as request input and are not expanded as separate items in this endpoint.
### Model Alias Resolution
For `chat` / `responses` / `embeddings`, the `model` field follows a wide-input/strict-output policy:
@@ -211,7 +221,7 @@ Content-Type: application/json
| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| `model` | string | ✅ | DeepSeek native models + common aliases (`gpt-4o`, `gpt-5-codex`, `o3`, `claude-sonnet-4-5`) |
| `model` | string | ✅ | DeepSeek native models + common aliases (`gpt-4o`, `gpt-5-codex`, `o3`, `claude-sonnet-4-5`, `gemini-2.5-pro`) |
| `messages` | array | ✅ | OpenAI-style message array |
| `stream` | boolean | ❌ | Default `false` |
| `tools` | array | ❌ | Function Calling definitions |
@@ -414,7 +424,7 @@ data: [DONE]
}
```
> Note: the example shows only some models; the real response includes historical Claude 1.x/2.x/3.x/4.x model IDs and common aliases.
> Note: the example shows only some models; besides the current primary aliases, the real response also includes Claude 4.x snapshots plus historical 3.x / 2.x / 1.x model IDs and common aliases.
### `POST /anthropic/v1/messages`

README.MD

@@ -18,6 +18,8 @@
Docs entry: [Docs index](docs/README.md) / [Architecture](docs/ARCHITECTURE.md) / [API reference](API.md)
[Thanks to the developers in the Linux.do and GitHub communities for their support and contributions to this project]
> **Important disclaimer**
>
> This repository is for learning, research, personal experimentation, and internal validation only. It does not provide any form of commercial license, fitness guarantee, or guarantee of results.
@@ -118,26 +120,35 @@ flowchart LR
## Model Support
### OpenAI Endpoint
### OpenAI Endpoint (`GET /v1/models`)
| Model | thinking | search |
| --- | --- | --- |
| `deepseek-chat` | ❌ | ❌ |
| `deepseek-reasoner` | ✅ | ❌ |
| `deepseek-chat-search` | ❌ | ✅ |
| `deepseek-reasoner-search` | ✅ | ✅ |
| Family | Model ID | thinking | search |
| --- | --- | --- | --- |
| default | `deepseek-chat` | ❌ | ❌ |
| default | `deepseek-reasoner` | ✅ | ❌ |
| default | `deepseek-chat-search` | ❌ | ✅ |
| default | `deepseek-reasoner-search` | ✅ | ✅ |
| expert | `deepseek-expert-chat` | ❌ | ❌ |
| expert | `deepseek-expert-reasoner` | ✅ | ❌ |
| expert | `deepseek-expert-chat-search` | ❌ | ✅ |
| expert | `deepseek-expert-reasoner-search` | ✅ | ✅ |
| vision | `deepseek-vision-chat` | ❌ | ❌ |
| vision | `deepseek-vision-reasoner` | ✅ | ❌ |
| vision | `deepseek-vision-chat-search` | ❌ | ✅ |
| vision | `deepseek-vision-reasoner-search` | ✅ | ✅ |
### Claude Endpoint
Besides native models, common aliases are also accepted as input (for example `gpt-4o`, `gpt-5-codex`, `o3`, `claude-sonnet-4-5`, `gemini-2.5-pro`), but `/v1/models` returns normalized DeepSeek native model IDs.
| Model | Default Mapping |
### Claude Endpoint (`GET /anthropic/v1/models`)
| Current common model | Default Mapping |
| --- | --- |
| `claude-sonnet-4-5` | `deepseek-chat` |
| `claude-haiku-4-5` (compatible with `claude-3-5-haiku-latest`) | `deepseek-chat` |
| `claude-opus-4-6` | `deepseek-reasoner` |
Override the mapping via `claude_mapping` or `claude_model_mapping` in the config.
In addition, `/anthropic/v1/models` now includes historical Claude 1.x/2.x/3.x/4.x model IDs and common aliases so legacy clients work directly.
Besides the current primary aliases above, `/anthropic/v1/models` also returns Claude 4.x snapshots plus historical 3.x / 2.x / 1.x model IDs and common aliases for legacy client compatibility.
#### Claude Code integration pitfalls (validated)
@@ -152,6 +163,15 @@ The Gemini adapter maps model names via `model_aliases` or built-in rules to Deep
## Quick Start
### Recommended deployment priority
Recommended order when choosing a deployment method:
1. **Download and run release binaries**: the easiest path for most users because the artifacts are already built.
2. **Docker / GHCR image deployment**: suitable for containerized, orchestrated, or cloud environments.
3. **Vercel deployment**: suitable if you already use Vercel and accept its platform constraints.
4. **Run from source / build locally**: suitable for development, debugging, or when you need to modify the code yourself.
### Universal First Step (all deployment modes)
Use `config.json` as the single source of truth (recommended):
@@ -165,29 +185,19 @@ cp config.example.json config.json
- Local run: read `config.json` directly
- Docker / Vercel: generate `DS2API_CONFIG_JSON` (Base64) from `config.json` and inject it as an environment variable; raw JSON also works
### Option 1: Local Run
### Option 1: Download Release Binaries
**Prerequisites**: Go 1.26+; Node.js `20.19+` or `22.12+` (only if building the WebUI)
GitHub Actions automatically builds multi-platform archives on each Release:
```bash
# 1. Clone
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 2. Configure
# After downloading the archive for your platform
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
cp config.example.json config.json
# Edit config.json with your DeepSeek account info and API keys
# 3. Start
go run ./cmd/ds2api
# Edit config.json
./ds2api
```
Default local URL: `http://127.0.0.1:5001`
The server actually binds to `0.0.0.0:5001`, so devices on the same LAN can usually reach it through your private IP as well.
> **WebUI auto-build**: On first local startup, if `static/admin` is missing, DS2API will auto-run `npm ci` (only when dependencies are missing) and `npm run build -- --outDir static/admin --emptyOutDir` (requires Node.js). You can also build manually: `./scripts/build-webui.sh`
### Option 2: Docker
```bash
@@ -241,35 +251,28 @@ base64 < config.json | tr -d '\n'
For detailed deployment instructions, see the [Deployment Guide](docs/DEPLOY.md).
### Option 4: Download Release Binaries
### Option 4: Local Run from Source
GitHub Actions automatically builds multi-platform archives on each Release:
**Prerequisites**: Go 1.26+; Node.js `20.19+` or `22.12+` (only if building the WebUI)
```bash
# After downloading the archive for your platform
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
# 1. Clone
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 2. Configure
cp config.example.json config.json
# Edit config.json
./ds2api
# Edit config.json with your DeepSeek account info and API keys
# 3. Start
go run ./cmd/ds2api
```
### Option 5: OpenCode CLI
Default local URL: `http://127.0.0.1:5001`
1. Copy the example config:
The server actually binds to `0.0.0.0:5001`, so devices on the same LAN can usually reach it through your private IP as well.
```bash
cp opencode.json.example opencode.json
```
2. Edit `opencode.json`:
- Set `baseURL` to your DS2API endpoint (for example `https://your-domain.com/v1`)
- Set `apiKey` to your DS2API key (from `config.keys`)
3. Start OpenCode CLI in the project directory (run `opencode` using your installed method).
> Recommended: prefer the OpenAI-compatible path (`/v1/*`), i.e. the `@ai-sdk/openai-compatible` provider in the example.
> If your client supports `wire_api`, test both `responses` and `chat`; DS2API supports both paths.
> **WebUI auto-build**: On first local startup, if `static/admin` is missing, DS2API will auto-run `npm ci` (only when dependencies are missing) and `npm run build -- --outDir static/admin --emptyOutDir` (requires Node.js). You can also build manually: `./scripts/build-webui.sh`
## Configuration


@@ -118,26 +118,35 @@ For the full module-by-module architecture and directory responsibilities, see [
## Model Support
### OpenAI Endpoint
### OpenAI Endpoint (`GET /v1/models`)
| Model | thinking | search |
| --- | --- | --- |
| `deepseek-chat` | ❌ | ❌ |
| `deepseek-reasoner` | ✅ | ❌ |
| `deepseek-chat-search` | ❌ | ✅ |
| `deepseek-reasoner-search` | ✅ | ✅ |
| Family | Model ID | thinking | search |
| --- | --- | --- | --- |
| default | `deepseek-chat` | ❌ | ❌ |
| default | `deepseek-reasoner` | ✅ | ❌ |
| default | `deepseek-chat-search` | ❌ | ✅ |
| default | `deepseek-reasoner-search` | ✅ | ✅ |
| expert | `deepseek-expert-chat` | ❌ | ❌ |
| expert | `deepseek-expert-reasoner` | ✅ | ❌ |
| expert | `deepseek-expert-chat-search` | ❌ | ✅ |
| expert | `deepseek-expert-reasoner-search` | ✅ | ✅ |
| vision | `deepseek-vision-chat` | ❌ | ❌ |
| vision | `deepseek-vision-reasoner` | ✅ | ❌ |
| vision | `deepseek-vision-chat-search` | ❌ | ✅ |
| vision | `deepseek-vision-reasoner-search` | ✅ | ✅ |
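The family/thinking/search naming pattern in the table above is regular enough to compose mechanically. The sketch below is a reading of the published naming scheme, not DS2API code:

```go
package main

import "fmt"

// modelID composes an ID from the table's pattern:
// deepseek[-<family>]-<chat|reasoner>[-search], where the
// "default" family contributes no infix.
func modelID(family, base string, search bool) string {
	id := "deepseek-"
	if family != "default" {
		id += family + "-"
	}
	id += base
	if search {
		id += "-search"
	}
	return id
}

func main() {
	fmt.Println(modelID("expert", "reasoner", true)) // deepseek-expert-reasoner-search
	fmt.Println(modelID("default", "chat", false))   // deepseek-chat
}
```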
### Claude Endpoint
Besides native IDs, DS2API also accepts common aliases as input (for example `gpt-4o`, `gpt-5-codex`, `o3`, `claude-sonnet-4-5`, `gemini-2.5-pro`), but `/v1/models` returns normalized DeepSeek native model IDs.
| Model | Default Mapping |
### Claude Endpoint (`GET /anthropic/v1/models`)
| Current common model | Default Mapping |
| --- | --- |
| `claude-sonnet-4-5` | `deepseek-chat` |
| `claude-haiku-4-5` (compatible with `claude-3-5-haiku-latest`) | `deepseek-chat` |
| `claude-opus-4-6` | `deepseek-reasoner` |
Override mapping via `claude_mapping` or `claude_model_mapping` in config.
In addition, `/anthropic/v1/models` now includes historical Claude 1.x/2.x/3.x/4.x IDs and common aliases for legacy client compatibility.
Besides the current primary aliases above, `/anthropic/v1/models` also returns Claude 4.x snapshots plus historical 3.x / 2.x / 1.x IDs and common aliases for legacy client compatibility.
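A minimal sketch of the override-then-default lookup this implies. The defaults mirror the mapping table above; the override map stands in for `claude_mapping` / `claude_model_mapping`, and the function name and unknown-model fallback are hypothetical:

```go
package main

import "fmt"

// claudeTarget resolves a Claude model ID to a DeepSeek native model,
// letting a config-supplied override win over the built-in defaults.
func claudeTarget(model string, override map[string]string) string {
	if t, ok := override[model]; ok {
		return t
	}
	defaults := map[string]string{
		"claude-sonnet-4-5": "deepseek-chat",
		"claude-haiku-4-5":  "deepseek-chat",
		"claude-opus-4-6":   "deepseek-reasoner",
	}
	if t, ok := defaults[model]; ok {
		return t
	}
	return "deepseek-chat" // illustrative fallback for unknown IDs
}

func main() {
	fmt.Println(claudeTarget("claude-opus-4-6", nil))
	fmt.Println(claudeTarget("claude-opus-4-6", map[string]string{"claude-opus-4-6": "deepseek-chat"}))
}
```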
#### Claude Code integration pitfalls (validated)
@@ -152,6 +161,15 @@ The Gemini adapter maps model names to DeepSeek native models via `model_aliases
## Quick Start
### Recommended deployment priority
Recommended order when choosing a deployment method:
1. **Download and run release binaries**: the easiest path for most users because the artifacts are already built.
2. **Docker / GHCR image deployment**: suitable for containerized, orchestrated, or cloud environments.
3. **Vercel deployment**: suitable if you already use Vercel and accept its platform constraints.
4. **Run from source / build locally**: suitable for development, debugging, or when you need to modify the code yourself.
### Universal First Step (all deployment modes)
Use `config.json` as the single source of truth (recommended):
@@ -165,47 +183,37 @@ Recommended per deployment mode:
- Local run: read `config.json` directly
- Docker / Vercel: generate Base64 from `config.json` and inject as `DS2API_CONFIG_JSON`, or paste raw JSON directly
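A minimal sketch of how a consumer of `DS2API_CONFIG_JSON` might accept either form — Base64 or raw JSON. This helper is hypothetical; DS2API's actual decoding logic may differ:

```go
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
)

// decodeConfig accepts either a Base64-encoded config or raw JSON,
// preferring the Base64 reading when both would parse (a rare edge:
// a raw JSON string that also happens to be valid Base64).
func decodeConfig(v string) ([]byte, error) {
	if b, err := base64.StdEncoding.DecodeString(v); err == nil && json.Valid(b) {
		return b, nil
	}
	if json.Valid([]byte(v)) {
		return []byte(v), nil
	}
	return nil, fmt.Errorf("value is neither Base64-encoded JSON nor raw JSON")
}

func main() {
	b, err := decodeConfig("eyJrZXlzIjpbImsiXX0=") // Base64 of {"keys":["k"]}
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}
```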
### Option 1: Local Run
### Option 1: Download Release Binaries
**Prerequisites**: Go 1.26+, Node.js `20.19+` or `22.12+` (only if building WebUI locally)
GitHub Actions automatically builds multi-platform archives on each Release:
```bash
# 1. Clone
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 2. Configure
# After downloading the archive for your platform
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
cp config.example.json config.json
# Edit config.json with your DeepSeek account info and API keys
# 3. Start
go run ./cmd/ds2api
# Edit config.json
./ds2api
```
Default local URL: `http://127.0.0.1:5001`
The server actually binds to `0.0.0.0:5001`, so devices on the same LAN can usually reach it through your private IP as well.
> **WebUI auto-build**: On first local startup, if `static/admin` is missing, DS2API will auto-run `npm ci` (only when dependencies are missing) and `npm run build -- --outDir static/admin --emptyOutDir` (requires Node.js). You can also build manually: `./scripts/build-webui.sh`
### Option 2: Docker
### Option 2: Docker / GHCR
```bash
# 1. Prepare env file and config file
# Pull prebuilt image
docker pull ghcr.io/cjackhwang/ds2api:latest
# Or run a pinned version
# docker pull ghcr.io/cjackhwang/ds2api:v3.0.0
# Prepare env file and config file
cp .env.example .env
cp config.example.json config.json
# 2. Edit .env (at least set DS2API_ADMIN_KEY; optionally set DS2API_HOST_PORT to change the host port)
# DS2API_ADMIN_KEY=replace-with-a-strong-secret
# 3. Start
# Start with compose
docker-compose up -d
# 4. View logs
docker-compose logs -f
```
The default `docker-compose.yml` maps host port `6011` to container port `5001`. If you want `5001` exposed directly, set `DS2API_HOST_PORT=5001` (or adjust the `ports` mapping).
The default `docker-compose.yml` uses `ghcr.io/cjackhwang/ds2api:latest` and maps host port `6011` to container port `5001`. If you want `5001` exposed directly, set `DS2API_HOST_PORT=5001` (or adjust the `ports` mapping).
Rebuild after updates: `docker-compose up -d --build`
@@ -241,35 +249,28 @@ base64 < config.json | tr -d '\n'
For detailed deployment instructions, see the [Deployment Guide](docs/DEPLOY.en.md).
### Option 4: Download Release Binaries
### Option 4: Local Run
GitHub Actions automatically builds multi-platform archives on each Release:
**Prerequisites**: Go 1.26+, Node.js `20.19+` or `22.12+` (only if building WebUI locally)
```bash
# After downloading the archive for your platform
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
# 1. Clone
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 2. Configure
cp config.example.json config.json
# Edit config.json
./ds2api
# Edit config.json with your DeepSeek account info and API keys
# 3. Start
go run ./cmd/ds2api
```
### Option 5: OpenCode CLI
Default local URL: `http://127.0.0.1:5001`
1. Copy the example config:
The server actually binds to `0.0.0.0:5001`, so devices on the same LAN can usually reach it through your private IP as well.
```bash
cp opencode.json.example opencode.json
```
2. Edit `opencode.json`:
- Set `baseURL` to your DS2API endpoint (for example, `https://your-domain.com/v1`)
- Set `apiKey` to your DS2API key (from `config.keys`)
3. Start OpenCode CLI in the project directory (run `opencode` using your installed method).
> Recommended: use the OpenAI-compatible path (`/v1/*`) via `@ai-sdk/openai-compatible` as shown in the example.
> If your client supports `wire_api`, test both `responses` and `chat`; DS2API supports both paths.
> **WebUI auto-build**: On first local startup, if `static/admin` is missing, DS2API will auto-run `npm ci` (only when dependencies are missing) and `npm run build -- --outDir static/admin --emptyOutDir` (requires Node.js). You can also build manually: `./scripts/build-webui.sh`
## Configuration


@@ -1 +1 @@
3.2.0
3.4.0


@@ -10,11 +10,12 @@ Doc map: [Index](./README.md) | [Architecture](./ARCHITECTURE.en.md) | [API](../
## Table of Contents
- [Recommended deployment priority](#recommended-deployment-priority)
- [Prerequisites](#0-prerequisites)
- [1. Local Run](#1-local-run)
- [2. Docker Deployment](#2-docker-deployment)
- [1. Download Release Binaries](#1-download-release-binaries)
- [2. Docker / GHCR Deployment](#2-docker--ghcr-deployment)
- [3. Vercel Deployment](#3-vercel-deployment)
- [4. Download Release Binaries](#4-download-release-binaries)
- [4. Local Run from Source](#4-local-run-from-source)
- [5. Reverse Proxy (Nginx)](#5-reverse-proxy-nginx)
- [6. Linux systemd Service](#6-linux-systemd-service)
- [7. Post-Deploy Checks](#7-post-deploy-checks)
@@ -22,6 +23,17 @@ Doc map: [Index](./README.md) | [Architecture](./ARCHITECTURE.en.md) | [API](../
---
## Recommended deployment priority
Recommended order when choosing a deployment method:
1. **Download and run release binaries**: the easiest path for most users because the artifacts are already built.
2. **Docker / GHCR image deployment**: suitable for containerized, orchestrated, or cloud environments.
3. **Vercel deployment**: suitable if you already use Vercel and accept its platform constraints.
4. **Run from source / build locally**: suitable for development, debugging, or when you need to modify the code yourself.
---
## 0. Prerequisites
| Dependency | Minimum Version | Notes |
@@ -48,70 +60,59 @@ Use `config.json` as the single source of truth:
---
## 1. Local Run
## 1. Download Release Binaries
### 1.1 Basic Steps
Built-in GitHub Actions workflow: `.github/workflows/release-artifacts.yml`
- **Trigger**: only on Release `published` (no build on normal push)
- **Outputs**: multi-platform binary archives + `sha256sums.txt`
- **Container publishing**: GHCR only (`ghcr.io/cjackhwang/ds2api`)
| Platform | Architecture | Format |
| --- | --- | --- |
| Linux | amd64, arm64 | `.tar.gz` |
| macOS | amd64, arm64 | `.tar.gz` |
| Windows | amd64 | `.zip` |
Each archive includes:
- `ds2api` executable (`ds2api.exe` on Windows)
- `static/admin/` (built WebUI assets)
- `config.example.json`, `.env.example`
- `README.MD`, `README.en.md`, `LICENSE`
### Usage
```bash
# Clone
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 1. Download the archive for your platform
# 2. Extract
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
# Copy and edit config
# 3. Configure
cp config.example.json config.json
# Open config.json and fill in:
# - keys: your API access keys
# - accounts: DeepSeek accounts (email or mobile + password)
# Edit config.json
# Start
go run ./cmd/ds2api
```
Default local access URL: `http://127.0.0.1:5001`; the server actually binds to `0.0.0.0:5001` (override with `PORT`).
### 1.2 WebUI Build
On first local startup, if `static/admin/` is missing, DS2API will automatically attempt to build the WebUI (requires Node.js/npm; when dependencies are missing it runs `npm ci` first, then `npm run build -- --outDir static/admin --emptyOutDir`).
Manual build:
```bash
./scripts/build-webui.sh
```
Or step by step:
```bash
cd webui
npm install
npm run build
# Output goes to static/admin/
```
Control auto-build via environment variable:
```bash
# Disable auto-build
DS2API_AUTO_BUILD_WEBUI=false go run ./cmd/ds2api
# Force enable auto-build
DS2API_AUTO_BUILD_WEBUI=true go run ./cmd/ds2api
```
### 1.3 Compile to Binary
```bash
go build -o ds2api ./cmd/ds2api
# 4. Start
./ds2api
```
### Maintainer Release Flow
1. Create and publish a GitHub Release (with tag, for example `vX.Y.Z`)
2. Wait for the `Release Artifacts` workflow to complete
3. Download the matching archive from Release Assets
---
## 2. Docker Deployment
## 2. Docker / GHCR Deployment
### 2.1 Basic Steps
```bash
# Pull prebuilt image
docker pull ghcr.io/cjackhwang/ds2api:latest
# Copy env template and config file
cp .env.example .env
cp config.example.json config.json
@@ -128,7 +129,13 @@ docker-compose up -d
docker-compose logs -f
```
The default `docker-compose.yml` maps host port `6011` to container port `5001`. If you want `5001` exposed directly, set `DS2API_HOST_PORT=5001` (or adjust the `ports` mapping).
The default `docker-compose.yml` directly uses `ghcr.io/cjackhwang/ds2api:latest` and maps host port `6011` to container port `5001`. If you want `5001` exposed directly, set `DS2API_HOST_PORT=5001` (or adjust the `ports` mapping).
If you want a pinned version instead of `latest`, you can also pull a specific tag directly:
```bash
docker pull ghcr.io/cjackhwang/ds2api:v3.0.0
```
### 2.2 Update
@@ -350,57 +357,61 @@ If API responses return Vercel HTML `Authentication Required`:
---
## 4. Download Release Binaries
## 4. Local Run from Source
Built-in GitHub Actions workflow: `.github/workflows/release-artifacts.yml`
- **Trigger**: only on Release `published` (no build on normal push)
- **Outputs**: multi-platform binary archives + `sha256sums.txt`
- **Container publishing**: GHCR only (`ghcr.io/cjackhwang/ds2api`)
| Platform | Architecture | Format |
| --- | --- | --- |
| Linux | amd64, arm64 | `.tar.gz` |
| macOS | amd64, arm64 | `.tar.gz` |
| Windows | amd64 | `.zip` |
Each archive includes:
- `ds2api` executable (`ds2api.exe` on Windows)
- `static/admin/` (built WebUI assets)
- `config.example.json`, `.env.example`
- `README.MD`, `README.en.md`, `LICENSE`
### Usage
### 4.1 Basic Steps
```bash
# 1. Download the archive for your platform
# 2. Extract
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
# Clone
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 3. Configure
# Copy and edit config
cp config.example.json config.json
# Edit config.json
# Open config.json and fill in:
# - keys: your API access keys
# - accounts: DeepSeek accounts (email or mobile + password)
# 4. Start
./ds2api
# Start
go run ./cmd/ds2api
```
### Maintainer Release Flow
Default local access URL: `http://127.0.0.1:5001`; the server actually binds to `0.0.0.0:5001` (override with `PORT`).
1. Create and publish a GitHub Release (with tag, for example `vX.Y.Z`)
2. Wait for the `Release Artifacts` workflow to complete
3. Download the matching archive from Release Assets
### 4.2 WebUI Build
### Pull from GHCR (Optional)
On first local startup, if `static/admin/` is missing, DS2API will automatically attempt to build the WebUI (requires Node.js/npm; when dependencies are missing it runs `npm ci` first, then `npm run build -- --outDir static/admin --emptyOutDir`).
Manual build:
```bash
# latest
docker pull ghcr.io/cjackhwang/ds2api:latest
./scripts/build-webui.sh
```
# specific version (example)
docker pull ghcr.io/cjackhwang/ds2api:v3.0.0
Or step by step:
```bash
cd webui
npm install
npm run build
# Output goes to static/admin/
```
Control auto-build via environment variable:
```bash
# Disable auto-build
DS2API_AUTO_BUILD_WEBUI=false go run ./cmd/ds2api
# Force enable auto-build
DS2API_AUTO_BUILD_WEBUI=true go run ./cmd/ds2api
```
### 4.3 Compile to Binary
```bash
go build -o ds2api ./cmd/ds2api
./ds2api
```
---


@@ -10,11 +10,12 @@
## Table of Contents
- [Recommended deployment priority](#部署方式优先级建议)
- [Prerequisites](#0-前置要求)
- [1. Local Run](#一本地运行)
- [2. Docker Deployment](#二docker-部署)
- [1. Download Release Binaries](#一下载-release-构建包)
- [2. Docker / GHCR Deployment](#二docker--ghcr-部署)
- [3. Vercel Deployment](#三vercel-部署)
- [4. Download Release Binaries](#四下载-release-构建包)
- [4. Local Run from Source](#四本地源码运行)
- [5. Reverse Proxy (Nginx)](#五反向代理nginx)
- [6. Linux systemd Service](#六linux-systemd-服务化)
- [7. Post-Deploy Checks](#七部署后检查)
@@ -22,6 +23,17 @@
---
## Recommended deployment priority
Recommended order when choosing a deployment method:
1. **Download and run release binaries**: the easiest path for most users because the artifacts are already built.
2. **Docker / GHCR image deployment**: suitable for containerized, orchestrated, or cloud environments.
3. **Vercel deployment**: suitable if you already use Vercel and accept its platform constraints.
4. **Run from source / build locally**: suitable for development, debugging, or when you need to modify the code yourself.
---
## 0. Prerequisites
| Dependency | Minimum Version | Notes |
@@ -48,70 +60,59 @@ cp config.example.json config.json
---
## 1. Local Run
## 1. Download Release Binaries
### 1.1 Basic Steps
Built-in GitHub Actions workflow: `.github/workflows/release-artifacts.yml`
- **Trigger**: only on Release `published` (no build on normal push)
- **Outputs**: multi-platform binary archives + `sha256sums.txt`
- **Container publishing**: GHCR only (`ghcr.io/cjackhwang/ds2api`)
| Platform | Architecture | Format |
| --- | --- | --- |
| Linux | amd64, arm64 | `.tar.gz` |
| macOS | amd64, arm64 | `.tar.gz` |
| Windows | amd64 | `.zip` |
Each archive includes:
- `ds2api` executable (`ds2api.exe` on Windows)
- `static/admin/` (built WebUI assets)
- `config.example.json`, `.env.example`
- `README.MD`, `README.en.md`, `LICENSE`
### Usage
```bash
# Clone
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 1. Download the archive for your platform
# 2. Extract
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
# Copy and edit config
# 3. Configure
cp config.example.json config.json
# Open config.json and fill in:
# - keys: your API access keys
# - accounts: DeepSeek accounts (email or mobile + password)
# Edit config.json
# Start
go run ./cmd/ds2api
```
Default local access URL: `http://127.0.0.1:5001`; the server actually binds to `0.0.0.0:5001` (override with `PORT`).
### 1.2 WebUI Build
On first local startup, if `static/admin/` is missing, the service will automatically attempt to build the WebUI (requires Node.js/npm; when dependencies are missing it runs `npm ci` first, then `npm run build -- --outDir static/admin --emptyOutDir`).
Manual build:
```bash
./scripts/build-webui.sh
```
Or step by step:
```bash
cd webui
npm install
npm run build
# Output goes to static/admin/
```
Control auto-build via environment variable:
```bash
# Force-disable auto-build
DS2API_AUTO_BUILD_WEBUI=false go run ./cmd/ds2api
# Force-enable auto-build
DS2API_AUTO_BUILD_WEBUI=true go run ./cmd/ds2api
```
### 1.3 Compile to Binary
```bash
go build -o ds2api ./cmd/ds2api
# 4. Start
./ds2api
```
### Maintainer Release Flow
1. Create and publish a GitHub Release (with tag, for example `vX.Y.Z`)
2. Wait for the `Release Artifacts` workflow to complete
3. Download the matching archive from Release Assets
---
## 2. Docker Deployment
## 2. Docker / GHCR Deployment
### 2.1 Basic Steps
```bash
# Pull prebuilt image
docker pull ghcr.io/cjackhwang/ds2api:latest
# Copy env template and config file
cp .env.example .env
cp config.example.json config.json
@@ -128,7 +129,13 @@ docker-compose up -d
docker-compose logs -f
```
The default `docker-compose.yml` maps host port `6011` to container port `5001`. If you want `5001` exposed directly, set `DS2API_HOST_PORT=5001` (or adjust the `ports` mapping).
The default `docker-compose.yml` directly uses `ghcr.io/cjackhwang/ds2api:latest` and maps host port `6011` to container port `5001`. If you want `5001` exposed directly, set `DS2API_HOST_PORT=5001` (or adjust the `ports` mapping).
If you want a pinned version instead of `latest`, you can also pull a specific tag directly:
```bash
docker pull ghcr.io/cjackhwang/ds2api:v3.0.0
```
### 2.2 Update
@@ -350,57 +357,61 @@ No Output Directory named "public" found after the Build completed.
---
## 四、下载 Release 构建包
## 四、本地源码运行
仓库内置 GitHub Actions 工作流:`.github/workflows/release-artifacts.yml`
- **触发条件**:仅在 Release `published` 时触发(普通 push 不会构建)
- **构建产物**:多平台二进制压缩包 + `sha256sums.txt`
- **容器镜像发布**:仅发布到 GHCR`ghcr.io/cjackhwang/ds2api`
| 平台 | 架构 | 文件格式 |
| --- | --- | --- |
| Linux | amd64, arm64 | `.tar.gz` |
| macOS | amd64, arm64 | `.tar.gz` |
| Windows | amd64 | `.zip` |
每个压缩包包含:
- `ds2api` 可执行文件Windows 为 `ds2api.exe`
- `static/admin/`WebUI 构建产物)
- `config.example.json``.env.example`
- `README.MD``README.en.md``LICENSE`
### 使用步骤
### 4.1 基本步骤
```bash
# 1. 下载对应平台的压缩包
# 2. 解压
tar -xzf ds2api_<tag>_linux_amd64.tar.gz
cd ds2api_<tag>_linux_amd64
# 克隆仓库
git clone https://github.com/CJackHwang/ds2api.git
cd ds2api
# 3. 配置
# 复制并编辑配置
cp config.example.json config.json
# 编辑 config.json
# 使用你喜欢的编辑器打开 config.json,填入:
# - keys: 你的 API 访问密钥
# - accounts: DeepSeek 账号email 或 mobile + password
# 4. 启动
./ds2api
# 启动服务
go run ./cmd/ds2api
```
### 维护者发布步骤
默认本地访问地址是 `http://127.0.0.1:5001`;服务实际绑定 `0.0.0.0:5001`,可通过 `PORT` 环境变量覆盖。
1. 在 GitHub 创建并发布 Release带 tag`vX.Y.Z`
2. 等待 Actions 工作流 `Release Artifacts` 完成
3. 在 Release 的 Assets 下载对应平台压缩包
### 4.2 WebUI 构建
### 拉取 GHCR 镜像(可选)
本地首次启动时,若 `static/admin/` 不存在,服务会自动尝试构建 WebUI需要 Node.js/npm缺依赖时会先执行 `npm ci`,再执行 `npm run build -- --outDir static/admin --emptyOutDir`)。
你也可以手动构建:
```bash
# latest
docker pull ghcr.io/cjackhwang/ds2api:latest
./scripts/build-webui.sh
```
# 指定版本(示例)
docker pull ghcr.io/cjackhwang/ds2api:v3.0.0
或手动执行:
```bash
cd webui
npm install
npm run build
# 产物输出到 static/admin/
```
Control the auto-build behavior via an environment variable:

```bash
# Force-disable the auto build
DS2API_AUTO_BUILD_WEBUI=false go run ./cmd/ds2api

# Force-enable the auto build
DS2API_AUTO_BUILD_WEBUI=true go run ./cmd/ds2api
```
### 4.3 Building a Binary
```bash
go build -o ds2api ./cmd/ds2api
./ds2api
```
---

go.mod

@@ -18,7 +18,7 @@ require (
github.com/tidwall/pretty v1.2.1 // indirect
github.com/tidwall/sjson v1.2.5 // indirect
golang.org/x/crypto v0.49.0 // indirect
golang.org/x/net v0.52.0 // indirect
golang.org/x/net v0.52.0
golang.org/x/sys v0.42.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)


@@ -34,11 +34,13 @@ func (s openAIProxyStub) ChatCompletions(w http.ResponseWriter, _ *http.Request)
type openAIProxyCaptureStub struct {
seenModel string
seenReq map[string]any
}
func (s *openAIProxyCaptureStub) ChatCompletions(w http.ResponseWriter, r *http.Request) {
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
s.seenReq = req
if m, ok := req["model"].(string); ok {
s.seenModel = m
}
@@ -84,3 +86,33 @@ func TestClaudeProxyViaOpenAIPreservesClaudeMapping(t *testing.T) {
t.Fatalf("expected mapped proxy model deepseek-reasoner, got %q", got)
}
}
func TestClaudeProxyTranslatesInlineImageToOpenAIDataURL(t *testing.T) {
openAI := &openAIProxyCaptureStub{}
h := &Handler{OpenAI: openAI}
req := httptest.NewRequest(http.MethodPost, "/anthropic/v1/messages", strings.NewReader(`{"model":"claude-sonnet-4-5","messages":[{"role":"user","content":[{"type":"text","text":"hello"},{"type":"image","source":{"type":"base64","media_type":"image/png","data":"QUJDRA=="}}]}],"stream":false}`))
rec := httptest.NewRecorder()
h.Messages(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("unexpected status: %d body=%s", rec.Code, rec.Body.String())
}
messages, _ := openAI.seenReq["messages"].([]any)
if len(messages) != 1 {
t.Fatalf("expected one translated message, got %#v", openAI.seenReq)
}
msg, _ := messages[0].(map[string]any)
content, _ := msg["content"].([]any)
if len(content) != 2 {
t.Fatalf("expected translated content blocks, got %#v", msg)
}
imageBlock, _ := content[1].(map[string]any)
if strings.TrimSpace(asString(imageBlock["type"])) != "image_url" {
t.Fatalf("expected image_url block, got %#v", imageBlock)
}
imageURL, _ := imageBlock["image_url"].(map[string]any)
if !strings.HasPrefix(strings.TrimSpace(asString(imageURL["url"])), "data:image/png;base64,") {
t.Fatalf("expected translated data url, got %#v", imageBlock)
}
}
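The translation asserted above — a Claude `base64` image source becoming an OpenAI `image_url` data URL — boils down to simple string assembly; a standalone sketch (names are illustrative, not the adapter's actual helpers):

```go
package main

import "fmt"

// toDataURL assembles a data URL of the shape the test expects:
// "data:<media_type>;base64,<payload>".
func toDataURL(mediaType, b64 string) string {
	return "data:" + mediaType + ";base64," + b64
}

func main() {
	fmt.Println(toDataURL("image/png", "QUJDRA=="))
}
```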


@@ -36,7 +36,7 @@ func normalizeClaudeRequest(store ConfigReader, req map[string]any) (claudeNorma
thinkingEnabled = false
searchEnabled = false
}
finalPrompt := deepseek.MessagesPrepare(toMessageMaps(dsPayload["messages"]))
finalPrompt := deepseek.MessagesPrepareWithThinking(toMessageMaps(dsPayload["messages"]), thinkingEnabled)
toolNames := extractClaudeToolNames(toolsRequested)
if len(toolNames) == 0 && len(toolsRequested) > 0 {
toolNames = []string{"__any_tool__"}


@@ -28,7 +28,7 @@ func normalizeGeminiRequest(store ConfigReader, routeModel string, req map[strin
}
toolsRaw := convertGeminiTools(req["tools"])
finalPrompt, toolNames := openai.BuildPromptForAdapter(messagesRaw, toolsRaw, "")
finalPrompt, toolNames := openai.BuildPromptForAdapter(messagesRaw, toolsRaw, "", thinkingEnabled)
passThrough := collectGeminiPassThrough(req)
return util.StandardRequest{


@@ -82,11 +82,17 @@ func (s geminiOpenAIErrorStub) ChatCompletions(w http.ResponseWriter, _ *http.Re
}
type geminiOpenAISuccessStub struct {
stream bool
body string
stream bool
body string
seenReq map[string]any
}
func (s geminiOpenAISuccessStub) ChatCompletions(w http.ResponseWriter, _ *http.Request) {
func (s *geminiOpenAISuccessStub) ChatCompletions(w http.ResponseWriter, r *http.Request) {
if r != nil {
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
s.seenReq = req
}
if s.stream {
w.Header().Set("Content-Type", "text/event-stream")
w.WriteHeader(http.StatusOK)
@@ -144,7 +150,7 @@ func TestGeminiRoutesRegistered(t *testing.T) {
func TestGenerateContentReturnsFunctionCallParts(t *testing.T) {
h := &Handler{
Store: testGeminiConfig{},
OpenAI: geminiOpenAISuccessStub{
OpenAI: &geminiOpenAISuccessStub{
body: `{"id":"chatcmpl-1","object":"chat.completion","choices":[{"index":0,"message":{"role":"assistant","tool_calls":[{"id":"call_1","type":"function","function":{"name":"eval_javascript","arguments":"{\"code\":\"1+1\"}"}}]},"finish_reason":"tool_calls"}]}`,
},
}
@@ -184,7 +190,7 @@ func TestGenerateContentReturnsFunctionCallParts(t *testing.T) {
}
func TestGenerateContentMixedToolSnippetAlsoTriggersFunctionCall(t *testing.T) {
h := &Handler{Store: testGeminiConfig{}, OpenAI: geminiOpenAISuccessStub{}}
h := &Handler{Store: testGeminiConfig{}, OpenAI: &geminiOpenAISuccessStub{}}
r := chi.NewRouter()
RegisterRoutes(r, h)
@@ -217,7 +223,7 @@ func TestGenerateContentMixedToolSnippetAlsoTriggersFunctionCall(t *testing.T) {
func TestStreamGenerateContentEmitsSSE(t *testing.T) {
h := &Handler{
Store: testGeminiConfig{},
OpenAI: geminiOpenAISuccessStub{stream: true},
OpenAI: &geminiOpenAISuccessStub{stream: true},
}
r := chi.NewRouter()
RegisterRoutes(r, h)
@@ -251,6 +257,39 @@ func TestStreamGenerateContentEmitsSSE(t *testing.T) {
}
}
func TestGeminiProxyTranslatesInlineImageToOpenAIDataURL(t *testing.T) {
openAI := &geminiOpenAISuccessStub{}
h := &Handler{Store: testGeminiConfig{}, OpenAI: openAI}
r := chi.NewRouter()
RegisterRoutes(r, h)
body := `{"contents":[{"role":"user","parts":[{"text":"hello"},{"inlineData":{"mimeType":"image/png","data":"QUJDRA=="}}]}]}`
req := httptest.NewRequest(http.MethodPost, "/v1beta/models/gemini-2.5-pro:generateContent", strings.NewReader(body))
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
messages, _ := openAI.seenReq["messages"].([]any)
if len(messages) != 1 {
t.Fatalf("expected one translated message, got %#v", openAI.seenReq)
}
msg, _ := messages[0].(map[string]any)
content, _ := msg["content"].([]any)
if len(content) != 2 {
t.Fatalf("expected translated content blocks, got %#v", msg)
}
imageBlock, _ := content[1].(map[string]any)
if strings.TrimSpace(asString(imageBlock["type"])) != "image_url" {
t.Fatalf("expected image_url block, got %#v", imageBlock)
}
imageURL, _ := imageBlock["image_url"].(map[string]any)
if !strings.HasPrefix(strings.TrimSpace(asString(imageURL["url"])), "data:image/png;base64,") {
t.Fatalf("expected translated data url, got %#v", imageBlock)
}
}
func TestGenerateContentOpenAIProxyErrorUsesGeminiEnvelope(t *testing.T) {
h := &Handler{
Store: testGeminiConfig{},


@@ -98,6 +98,19 @@ func (s *chatStreamRuntime) sendDone() {
}
}
func (s *chatStreamRuntime) sendFailedChunk(status int, message, code string) {
s.sendChunk(map[string]any{
"status_code": status,
"error": map[string]any{
"message": message,
"type": openAIErrorType(status),
"code": code,
"param": nil,
},
})
s.sendDone()
}
func (s *chatStreamRuntime) finalize(finishReason string) {
finalThinking := s.thinking.String()
finalText := cleanVisibleOutput(s.text.String(), s.stripReferenceMarkers)
@@ -168,6 +181,21 @@ func (s *chatStreamRuntime) finalize(finishReason string) {
if len(detected.Calls) > 0 || s.toolCallsEmitted {
finishReason = "tool_calls"
}
if len(detected.Calls) == 0 && !s.toolCallsEmitted && strings.TrimSpace(finalText) == "" {
status := http.StatusTooManyRequests
message := "Upstream model returned empty output."
code := "upstream_empty_output"
if strings.TrimSpace(finalThinking) != "" {
message = "Upstream model returned reasoning without visible output."
}
if finishReason == "content_filter" {
status = http.StatusBadRequest
message = "Upstream content filtered the response and returned no output."
code = "content_filter"
}
s.sendFailedChunk(status, message, code)
return
}
usage := openaifmt.BuildChatUsage(s.finalPrompt, finalThinking, finalText)
s.sendChunk(openaifmt.BuildChatStreamChunk(
s.completionID,
@@ -184,6 +212,9 @@ func (s *chatStreamRuntime) onParsed(parsed sse.LineResult) streamengine.ParsedD
return streamengine.ParsedDecision{}
}
if parsed.ContentFilter {
if strings.TrimSpace(s.text.String()) == "" {
return streamengine.ParsedDecision{Stop: true, StopReason: streamengine.StopReason("content_filter")}
}
return streamengine.ParsedDecision{Stop: true, StopReason: streamengine.StopReasonHandlerRequested}
}
if parsed.ErrorMessage != "" {


@@ -18,6 +18,7 @@ type AuthResolver interface {
type DeepSeekCaller interface {
CreateSession(ctx context.Context, a *auth.RequestAuth, maxAttempts int) (string, error)
GetPow(ctx context.Context, a *auth.RequestAuth, maxAttempts int) (string, error)
UploadFile(ctx context.Context, a *auth.RequestAuth, req deepseek.UploadFileRequest, maxAttempts int) (*deepseek.UploadFileResult, error)
CallCompletion(ctx context.Context, a *auth.RequestAuth, payload map[string]any, powResp string, maxAttempts int) (*http.Response, error)
DeleteSessionForToken(ctx context.Context, token string, sessionID string) (*deepseek.DeleteSessionResult, error)
DeleteAllSessionsForToken(ctx context.Context, token string) error


@@ -26,8 +26,13 @@ func (h *Handler) Embeddings(w http.ResponseWriter, r *http.Request) {
}
defer h.Auth.Release(a)
r.Body = http.MaxBytesReader(w, r.Body, openAIGeneralMaxSize)
var req map[string]any
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
if strings.Contains(strings.ToLower(err.Error()), "too large") {
writeOpenAIError(w, http.StatusRequestEntityTooLarge, "request body too large")
return
}
writeOpenAIError(w, http.StatusBadRequest, "invalid json")
return
}


@@ -0,0 +1,382 @@
package openai
import (
"context"
"crypto/sha256"
"encoding/base64"
"fmt"
"mime"
"net/http"
"net/url"
"path/filepath"
"strings"
"ds2api/internal/auth"
"ds2api/internal/deepseek"
)
const maxInlineFilesPerRequest = 50
type inlineFileUploadError struct {
status int
message string
err error
}
func (e *inlineFileUploadError) Error() string {
if e == nil {
return ""
}
if strings.TrimSpace(e.message) != "" {
return e.message
}
if e.err != nil {
return e.err.Error()
}
return "inline file processing failed"
}
type inlineUploadState struct {
ctx context.Context
handler *Handler
auth *auth.RequestAuth
uploadedByID map[string]string
uploadCount int
}
type inlineDecodedFile struct {
Data []byte
ContentType string
Filename string
ReplacementType string
}
func (h *Handler) preprocessInlineFileInputs(ctx context.Context, a *auth.RequestAuth, req map[string]any) error {
if h == nil || h.DS == nil || len(req) == 0 {
return nil
}
state := &inlineUploadState{
ctx: ctx,
handler: h,
auth: a,
uploadedByID: map[string]string{},
}
for _, key := range []string{"messages", "input", "attachments"} {
if raw, ok := req[key]; ok {
updated, err := state.walk(raw)
if err != nil {
return err
}
req[key] = updated
}
}
if refIDs := collectOpenAIRefFileIDs(req); len(refIDs) > 0 {
req["ref_file_ids"] = stringsToAnySlice(refIDs)
}
return nil
}
func writeOpenAIInlineFileError(w http.ResponseWriter, err error) {
inlineErr, ok := err.(*inlineFileUploadError)
if !ok || inlineErr == nil {
writeOpenAIError(w, http.StatusInternalServerError, "Failed to process file input.")
return
}
status := inlineErr.status
if status == 0 {
status = http.StatusInternalServerError
}
message := strings.TrimSpace(inlineErr.message)
if message == "" {
message = "Failed to process file input."
}
writeOpenAIError(w, status, message)
}
func (s *inlineUploadState) walk(raw any) (any, error) {
switch x := raw.(type) {
case []any:
out := make([]any, len(x))
for i, item := range x {
updated, err := s.walk(item)
if err != nil {
return nil, err
}
out[i] = updated
}
return out, nil
case map[string]any:
if replacement, replaced, err := s.tryUploadBlock(x); replaced || err != nil {
return replacement, err
}
for _, key := range []string{"messages", "input", "attachments", "content", "files", "items", "data", "source", "file", "image_url"} {
if nested, ok := x[key]; ok {
updated, err := s.walk(nested)
if err != nil {
return nil, err
}
x[key] = updated
}
}
return x, nil
default:
return raw, nil
}
}
func (s *inlineUploadState) tryUploadBlock(block map[string]any) (map[string]any, bool, error) {
decoded, ok, err := decodeOpenAIInlineFileBlock(block)
if err != nil {
return nil, true, &inlineFileUploadError{status: http.StatusBadRequest, message: err.Error(), err: err}
}
if !ok {
return nil, false, nil
}
if s.uploadCount >= maxInlineFilesPerRequest {
message := fmt.Sprintf("exceeded maximum of %d inline files per request", maxInlineFilesPerRequest)
return nil, true, &inlineFileUploadError{status: http.StatusBadRequest, message: message}
}
fileID, err := s.uploadInlineFile(decoded)
if err != nil {
return nil, true, &inlineFileUploadError{status: http.StatusInternalServerError, message: "Failed to upload inline file.", err: err}
}
s.uploadCount++
replacement := map[string]any{
"type": decoded.ReplacementType,
"file_id": fileID,
}
if decoded.Filename != "" {
replacement["filename"] = decoded.Filename
}
if decoded.ContentType != "" {
replacement["mime_type"] = decoded.ContentType
}
return replacement, true, nil
}
func (s *inlineUploadState) uploadInlineFile(file inlineDecodedFile) (string, error) {
sum := sha256.Sum256(append([]byte(file.ContentType+"\x00"+file.Filename+"\x00"), file.Data...))
cacheKey := fmt.Sprintf("%x", sum[:])
if fileID, ok := s.uploadedByID[cacheKey]; ok && strings.TrimSpace(fileID) != "" {
return fileID, nil
}
contentType := strings.TrimSpace(file.ContentType)
if contentType == "" {
contentType = http.DetectContentType(file.Data)
}
result, err := s.handler.DS.UploadFile(s.ctx, s.auth, deepseek.UploadFileRequest{
Filename: file.Filename,
ContentType: contentType,
Data: file.Data,
}, 3)
if err != nil {
return "", err
}
fileID := strings.TrimSpace(result.ID)
if fileID == "" {
return "", fmt.Errorf("upload succeeded without file id")
}
s.uploadedByID[cacheKey] = fileID
return fileID, nil
}
func decodeOpenAIInlineFileBlock(block map[string]any) (inlineDecodedFile, bool, error) {
if block == nil {
return inlineDecodedFile{}, false, nil
}
if strings.TrimSpace(asString(block["file_id"])) != "" {
return inlineDecodedFile{}, false, nil
}
if nested, ok := block["file"].(map[string]any); ok {
decoded, matched, err := decodeOpenAIInlineFileBlock(nested)
if err != nil || !matched {
return decoded, matched, err
}
if decoded.Filename == "" {
decoded.Filename = pickInlineFilename(block, decoded.ContentType, defaultInlinePrefix(decoded.ReplacementType))
}
return decoded, true, nil
}
blockType := strings.ToLower(strings.TrimSpace(asString(block["type"])))
if raw, matched := extractInlineImageDataURL(block); matched {
data, contentType, err := decodeInlinePayload(raw, contentTypeFromMap(block))
if err != nil {
return inlineDecodedFile{}, true, fmt.Errorf("invalid image input")
}
return inlineDecodedFile{
Data: data,
ContentType: contentType,
Filename: pickInlineFilename(block, contentType, "image"),
ReplacementType: "input_image",
}, true, nil
}
if raw, matched := extractInlineFilePayload(block, blockType); matched {
data, contentType, err := decodeInlinePayload(raw, contentTypeFromMap(block))
if err != nil {
return inlineDecodedFile{}, true, fmt.Errorf("invalid file input")
}
return inlineDecodedFile{
Data: data,
ContentType: contentType,
Filename: pickInlineFilename(block, contentType, defaultInlinePrefix(blockType)),
ReplacementType: "input_file",
}, true, nil
}
return inlineDecodedFile{}, false, nil
}
func extractInlineImageDataURL(block map[string]any) (string, bool) {
imageURL := block["image_url"]
switch x := imageURL.(type) {
case string:
if isDataURL(x) {
return strings.TrimSpace(x), true
}
case map[string]any:
if raw := strings.TrimSpace(asString(x["url"])); isDataURL(raw) {
return raw, true
}
}
if raw := strings.TrimSpace(asString(block["url"])); isDataURL(raw) {
return raw, true
}
return "", false
}
func extractInlineFilePayload(block map[string]any, blockType string) (string, bool) {
for _, value := range []any{block["file_data"], block["base64"], block["data"]} {
if raw := strings.TrimSpace(asString(value)); raw != "" {
if strings.Contains(blockType, "file") || block["file_data"] != nil || block["filename"] != nil || block["file_name"] != nil || block["name"] != nil {
return raw, true
}
}
}
return "", false
}
func decodeInlinePayload(raw string, explicitContentType string) ([]byte, string, error) {
raw = strings.TrimSpace(raw)
if raw == "" {
return nil, "", fmt.Errorf("empty payload")
}
if isDataURL(raw) {
return decodeDataURL(raw, explicitContentType)
}
decoded, err := decodeBase64Flexible(raw)
if err != nil {
return nil, "", err
}
contentType := strings.TrimSpace(explicitContentType)
if contentType == "" && len(decoded) > 0 {
contentType = http.DetectContentType(decoded)
}
return decoded, contentType, nil
}
func decodeDataURL(raw string, explicitContentType string) ([]byte, string, error) {
raw = strings.TrimSpace(raw)
if !isDataURL(raw) {
return nil, "", fmt.Errorf("unsupported data url")
}
header, payload, ok := strings.Cut(raw, ",")
if !ok {
return nil, "", fmt.Errorf("invalid data url")
}
meta := strings.TrimSpace(strings.TrimPrefix(header, "data:"))
contentType := strings.TrimSpace(explicitContentType)
if contentType == "" {
contentType = "application/octet-stream"
if meta != "" {
parts := strings.Split(meta, ";")
if len(parts) > 0 && strings.TrimSpace(parts[0]) != "" {
contentType = strings.TrimSpace(parts[0])
}
}
}
if strings.Contains(strings.ToLower(meta), ";base64") {
decoded, err := decodeBase64Flexible(payload)
if err != nil {
return nil, "", err
}
return decoded, contentType, nil
}
decoded, err := url.PathUnescape(payload)
if err != nil {
return nil, "", err
}
return []byte(decoded), contentType, nil
}
func decodeBase64Flexible(raw string) ([]byte, error) {
raw = strings.TrimSpace(raw)
for _, enc := range []*base64.Encoding{base64.StdEncoding, base64.RawStdEncoding, base64.URLEncoding, base64.RawURLEncoding} {
decoded, err := enc.DecodeString(raw)
if err == nil {
return decoded, nil
}
}
return nil, fmt.Errorf("invalid base64 payload")
}
func contentTypeFromMap(block map[string]any) string {
for _, value := range []any{block["mime_type"], block["mimeType"], block["content_type"], block["contentType"], block["media_type"], block["mediaType"]} {
if contentType := strings.TrimSpace(asString(value)); contentType != "" {
return contentType
}
}
if imageURL, ok := block["image_url"].(map[string]any); ok {
for _, value := range []any{imageURL["mime_type"], imageURL["mimeType"], imageURL["content_type"], imageURL["contentType"]} {
if contentType := strings.TrimSpace(asString(value)); contentType != "" {
return contentType
}
}
}
return ""
}
func pickInlineFilename(block map[string]any, contentType string, prefix string) string {
for _, value := range []any{block["filename"], block["file_name"], block["name"]} {
if name := strings.TrimSpace(asString(value)); name != "" {
return filepath.Base(name)
}
}
if prefix == "" {
prefix = "upload"
}
ext := ".bin"
if parsedType := strings.TrimSpace(contentType); parsedType != "" {
if comma := strings.Index(parsedType, ";"); comma >= 0 {
parsedType = strings.TrimSpace(parsedType[:comma])
}
if exts, err := mime.ExtensionsByType(parsedType); err == nil && len(exts) > 0 && strings.TrimSpace(exts[0]) != "" {
ext = exts[0]
}
}
return prefix + ext
}
func defaultInlinePrefix(blockType string) string {
blockType = strings.ToLower(strings.TrimSpace(blockType))
if strings.Contains(blockType, "image") {
return "image"
}
return "upload"
}
func isDataURL(raw string) bool {
return strings.HasPrefix(strings.ToLower(strings.TrimSpace(raw)), "data:")
}
func stringsToAnySlice(items []string) []any {
out := make([]any, 0, len(items))
for _, item := range items {
trimmed := strings.TrimSpace(item)
if trimmed == "" {
continue
}
out = append(out, trimmed)
}
if len(out) == 0 {
return nil
}
return out
}
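The `decodeBase64Flexible` helper above tries several alphabets so padded, unpadded, and URL-safe payloads all decode; a standalone sketch of the same idea (illustrative, outside the adapter package):

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strings"
)

// decodeFlexible attempts the four common base64 alphabets in turn,
// returning the first successful decode.
func decodeFlexible(raw string) ([]byte, error) {
	raw = strings.TrimSpace(raw)
	for _, enc := range []*base64.Encoding{
		base64.StdEncoding, base64.RawStdEncoding,
		base64.URLEncoding, base64.RawURLEncoding,
	} {
		if decoded, err := enc.DecodeString(raw); err == nil {
			return decoded, nil
		}
	}
	return nil, fmt.Errorf("invalid base64 payload")
}

func main() {
	b, _ := decodeFlexible("QUJDRA") // unpadded input still decodes
	fmt.Println(string(b))
}
```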


@@ -0,0 +1,274 @@
package openai
import (
"context"
"encoding/json"
"errors"
"net/http"
"net/http/httptest"
"strings"
"testing"
"github.com/go-chi/chi/v5"
"ds2api/internal/auth"
"ds2api/internal/deepseek"
)
type inlineUploadDSStub struct {
uploadCalls []deepseek.UploadFileRequest
lastCtx context.Context
completionReq map[string]any
createSession string
uploadErr error
completionResp *http.Response
}
func (m *inlineUploadDSStub) CreateSession(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
if strings.TrimSpace(m.createSession) == "" {
return "session-id", nil
}
return m.createSession, nil
}
func (m *inlineUploadDSStub) GetPow(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
return "pow", nil
}
func (m *inlineUploadDSStub) UploadFile(ctx context.Context, _ *auth.RequestAuth, req deepseek.UploadFileRequest, _ int) (*deepseek.UploadFileResult, error) {
m.lastCtx = ctx
m.uploadCalls = append(m.uploadCalls, req)
if m.uploadErr != nil {
return nil, m.uploadErr
}
return &deepseek.UploadFileResult{
ID: "file-inline-1",
Filename: req.Filename,
Bytes: int64(len(req.Data)),
Status: "uploaded",
Purpose: req.Purpose,
}, nil
}
func (m *inlineUploadDSStub) CallCompletion(_ context.Context, _ *auth.RequestAuth, payload map[string]any, _ string, _ int) (*http.Response, error) {
m.completionReq = payload
if m.completionResp != nil {
return m.completionResp, nil
}
return makeOpenAISSEHTTPResponse(
`data: {"p":"response/content","v":"ok"}`,
`data: [DONE]`,
), nil
}
func (m *inlineUploadDSStub) DeleteSessionForToken(_ context.Context, _ string, _ string) (*deepseek.DeleteSessionResult, error) {
return &deepseek.DeleteSessionResult{Success: true}, nil
}
func (m *inlineUploadDSStub) DeleteAllSessionsForToken(_ context.Context, _ string) error {
return nil
}
func TestPreprocessInlineFileInputsReplacesDataURLAndCollectsRefFileIDs(t *testing.T) {
ds := &inlineUploadDSStub{}
h := &Handler{DS: ds}
req := map[string]any{
"messages": []any{
map[string]any{
"role": "user",
"content": []any{
map[string]any{
"type": "image_url",
"image_url": map[string]any{"url": "data:image/png;base64,QUJDRA=="},
},
},
},
},
}
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
if err := h.preprocessInlineFileInputs(ctx, &auth.RequestAuth{DeepSeekToken: "token"}, req); err != nil {
t.Fatalf("preprocess failed: %v", err)
}
if len(ds.uploadCalls) != 1 {
t.Fatalf("expected 1 upload, got %d", len(ds.uploadCalls))
}
if ds.lastCtx != ctx {
t.Fatalf("expected upload to use request context")
}
if ds.uploadCalls[0].ContentType != "image/png" {
t.Fatalf("expected image/png, got %q", ds.uploadCalls[0].ContentType)
}
if ds.uploadCalls[0].Filename != "image.png" {
t.Fatalf("expected inferred filename image.png, got %q", ds.uploadCalls[0].Filename)
}
messages, _ := req["messages"].([]any)
first, _ := messages[0].(map[string]any)
content, _ := first["content"].([]any)
block, _ := content[0].(map[string]any)
if block["type"] != "input_image" {
t.Fatalf("expected input_image replacement, got %#v", block)
}
if block["file_id"] != "file-inline-1" {
t.Fatalf("expected file-inline-1 replacement id, got %#v", block)
}
refIDs, _ := req["ref_file_ids"].([]any)
if len(refIDs) != 1 || refIDs[0] != "file-inline-1" {
t.Fatalf("unexpected ref_file_ids: %#v", req["ref_file_ids"])
}
}
func TestPreprocessInlineFileInputsDeduplicatesIdenticalPayloads(t *testing.T) {
ds := &inlineUploadDSStub{}
h := &Handler{DS: ds}
req := map[string]any{
"messages": []any{
map[string]any{
"role": "user",
"content": []any{
map[string]any{"type": "image_url", "image_url": map[string]any{"url": "data:image/png;base64,QUJDRA=="}},
map[string]any{"type": "image_url", "image_url": map[string]any{"url": "data:image/png;base64,QUJDRA=="}},
},
},
},
}
if err := h.preprocessInlineFileInputs(context.Background(), &auth.RequestAuth{DeepSeekToken: "token"}, req); err != nil {
t.Fatalf("preprocess failed: %v", err)
}
if len(ds.uploadCalls) != 1 {
t.Fatalf("expected deduplicated single upload, got %d", len(ds.uploadCalls))
}
refIDs, _ := req["ref_file_ids"].([]any)
if len(refIDs) != 1 || refIDs[0] != "file-inline-1" {
t.Fatalf("unexpected ref_file_ids after dedupe: %#v", req["ref_file_ids"])
}
}
func TestChatCompletionsUploadsInlineFilesBeforeCompletion(t *testing.T) {
ds := &inlineUploadDSStub{}
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: ds}
reqBody := `{"model":"deepseek-chat","messages":[{"role":"user","content":[{"type":"input_text","text":"hi"},{"type":"image_url","image_url":{"url":"data:image/png;base64,QUJDRA=="}}]}],"stream":false}`
req := httptest.NewRequest(http.MethodPost, "/v1/chat/completions", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
h.ChatCompletions(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
if len(ds.uploadCalls) != 1 {
t.Fatalf("expected 1 upload call, got %d", len(ds.uploadCalls))
}
if ds.completionReq == nil {
t.Fatal("expected completion payload to be captured")
}
refIDs, _ := ds.completionReq["ref_file_ids"].([]any)
if len(refIDs) != 1 || refIDs[0] != "file-inline-1" {
t.Fatalf("unexpected completion ref_file_ids: %#v", ds.completionReq["ref_file_ids"])
}
}
func TestResponsesUploadsInlineFilesBeforeCompletion(t *testing.T) {
ds := &inlineUploadDSStub{}
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: ds}
r := chi.NewRouter()
RegisterRoutes(r, h)
reqBody := `{"model":"deepseek-chat","input":[{"role":"user","content":[{"type":"input_text","text":"hi"},{"type":"input_image","image_url":{"url":"data:image/png;base64,QUJDRA=="}}]}],"stream":false}`
req := httptest.NewRequest(http.MethodPost, "/v1/responses", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
if len(ds.uploadCalls) != 1 {
t.Fatalf("expected 1 upload call, got %d", len(ds.uploadCalls))
}
refIDs, _ := ds.completionReq["ref_file_ids"].([]any)
if len(refIDs) != 1 || refIDs[0] != "file-inline-1" {
t.Fatalf("unexpected completion ref_file_ids: %#v", ds.completionReq["ref_file_ids"])
}
}
func TestChatCompletionsInlineUploadFailureReturnsBadRequest(t *testing.T) {
ds := &inlineUploadDSStub{}
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: ds}
reqBody := `{"model":"deepseek-chat","messages":[{"role":"user","content":[{"type":"image_url","image_url":{"url":"data:image/png;base64,%%%"}}]}],"stream":false}`
req := httptest.NewRequest(http.MethodPost, "/v1/chat/completions", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
h.ChatCompletions(rec, req)
if rec.Code != http.StatusBadRequest {
t.Fatalf("expected 400, got %d body=%s", rec.Code, rec.Body.String())
}
if ds.completionReq != nil {
t.Fatalf("did not expect completion call on upload decode error")
}
}
func TestResponsesInlineUploadFailureReturnsInternalServerError(t *testing.T) {
ds := &inlineUploadDSStub{uploadErr: errors.New("boom")}
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: ds}
r := chi.NewRouter()
RegisterRoutes(r, h)
reqBody := `{"model":"deepseek-chat","input":[{"role":"user","content":[{"type":"image_url","image_url":{"url":"data:image/png;base64,QUJDRA=="}}]}],"stream":false}`
req := httptest.NewRequest(http.MethodPost, "/v1/responses", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusInternalServerError {
t.Fatalf("expected 500, got %d body=%s", rec.Code, rec.Body.String())
}
if ds.completionReq != nil {
t.Fatalf("did not expect completion call after upload failure")
}
}
func TestVercelPrepareUploadsInlineFilesBeforeLeasePayload(t *testing.T) {
t.Setenv("VERCEL", "1")
t.Setenv("DS2API_VERCEL_INTERNAL_SECRET", "stream-secret")
ds := &inlineUploadDSStub{}
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: ds}
r := chi.NewRouter()
RegisterRoutes(r, h)
reqBody := `{"model":"deepseek-chat","messages":[{"role":"user","content":[{"type":"input_text","text":"hi"},{"type":"image_url","image_url":{"url":"data:image/png;base64,QUJDRA=="}}]}],"stream":true}`
req := httptest.NewRequest(http.MethodPost, "/v1/chat/completions?__stream_prepare=1", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("X-Ds2-Internal-Token", "stream-secret")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
if len(ds.uploadCalls) != 1 {
t.Fatalf("expected 1 upload call, got %d", len(ds.uploadCalls))
}
var out map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &out); err != nil {
t.Fatalf("decode response failed: %v body=%s", err, rec.Body.String())
}
payload, _ := out["payload"].(map[string]any)
if payload == nil {
t.Fatalf("expected payload in prepare response, got %#v", out)
}
refIDs, _ := payload["ref_file_ids"].([]any)
if len(refIDs) != 1 || refIDs[0] != "file-inline-1" {
t.Fatalf("unexpected payload ref_file_ids: %#v", payload["ref_file_ids"])
}
}


@@ -0,0 +1,94 @@
package openai
import "strings"
func collectOpenAIRefFileIDs(req map[string]any) []string {
if len(req) == 0 {
return nil
}
out := make([]string, 0, 4)
seen := map[string]struct{}{}
for _, key := range []string{
"ref_file_ids",
"file_ids",
"attachments",
"messages",
"input",
} {
raw := req[key]
if raw == nil {
continue
}
// Skip top-level strings for 'messages' and 'input' as they are likely plain text content,
// not file IDs. String file IDs are expected in 'ref_file_ids' or 'file_ids'.
if key == "messages" || key == "input" {
if _, ok := raw.(string); ok {
continue
}
}
appendOpenAIRefFileIDs(&out, seen, raw)
}
if len(out) == 0 {
return nil
}
return out
}
func appendOpenAIRefFileIDs(out *[]string, seen map[string]struct{}, raw any) {
switch x := raw.(type) {
case string:
addOpenAIRefFileID(out, seen, x)
case []string:
for _, item := range x {
addOpenAIRefFileID(out, seen, item)
}
case []any:
for _, item := range x {
appendOpenAIRefFileIDs(out, seen, item)
}
case map[string]any:
if fileID := strings.TrimSpace(asString(x["file_id"])); fileID != "" {
addOpenAIRefFileID(out, seen, fileID)
}
if strings.Contains(strings.ToLower(strings.TrimSpace(asString(x["type"]))), "file") {
if fileID := strings.TrimSpace(asString(x["id"])); fileID != "" {
addOpenAIRefFileID(out, seen, fileID)
}
}
if fileMap, ok := x["file"].(map[string]any); ok {
if fileID := strings.TrimSpace(asString(fileMap["file_id"])); fileID != "" {
addOpenAIRefFileID(out, seen, fileID)
}
if fileID := strings.TrimSpace(asString(fileMap["id"])); fileID != "" {
addOpenAIRefFileID(out, seen, fileID)
}
}
// Recurse into known container keys only. A plain-string 'content' or
// 'input' is message text, not a file ID, so skip it here; the string
// case of the outer switch would otherwise record it as an ID.
for _, key := range []string{"ref_file_ids", "file_ids", "attachments", "messages", "input", "content", "files", "items", "data", "source"} {
if nested, ok := x[key]; ok {
// If it's a message content that is a string, we must NOT treat it as an ID.
if key == "content" || key == "input" {
if _, ok := nested.(string); ok {
continue
}
}
appendOpenAIRefFileIDs(out, seen, nested)
}
}
}
}
func addOpenAIRefFileID(out *[]string, seen map[string]struct{}, fileID string) {
fileID = strings.TrimSpace(fileID)
if fileID == "" {
return
}
if _, ok := seen[fileID]; ok {
return
}
seen[fileID] = struct{}{}
*out = append(*out, fileID)
}


@@ -0,0 +1,202 @@
package openai
import (
"bytes"
"context"
"encoding/json"
"errors"
"mime/multipart"
"net/http"
"net/http/httptest"
"testing"
"github.com/go-chi/chi/v5"
"ds2api/internal/auth"
"ds2api/internal/deepseek"
)
type managedFilesAuthStub struct{}
func (managedFilesAuthStub) Determine(_ *http.Request) (*auth.RequestAuth, error) {
return &auth.RequestAuth{
UseConfigToken: true,
DeepSeekToken: "managed-token",
CallerID: "caller:test",
AccountID: "acct-123",
TriedAccounts: map[string]bool{},
}, nil
}
func (managedFilesAuthStub) DetermineCaller(_ *http.Request) (*auth.RequestAuth, error) {
return &auth.RequestAuth{
UseConfigToken: true,
DeepSeekToken: "managed-token",
CallerID: "caller:test",
AccountID: "acct-123",
TriedAccounts: map[string]bool{},
}, nil
}
func (managedFilesAuthStub) Release(_ *auth.RequestAuth) {}
type filesRouteDSStub struct {
lastReq deepseek.UploadFileRequest
upload *deepseek.UploadFileResult
err error
}
func (m *filesRouteDSStub) CreateSession(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
return "", nil
}
func (m *filesRouteDSStub) GetPow(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
return "", nil
}
func (m *filesRouteDSStub) UploadFile(_ context.Context, _ *auth.RequestAuth, req deepseek.UploadFileRequest, _ int) (*deepseek.UploadFileResult, error) {
m.lastReq = req
if m.err != nil {
return nil, m.err
}
if m.upload != nil {
return m.upload, nil
}
return &deepseek.UploadFileResult{ID: "file-123", Filename: req.Filename, Bytes: int64(len(req.Data)), Purpose: req.Purpose, Status: "uploaded"}, nil
}
func (m *filesRouteDSStub) CallCompletion(_ context.Context, _ *auth.RequestAuth, _ map[string]any, _ string, _ int) (*http.Response, error) {
return nil, errors.New("not implemented")
}
func (m *filesRouteDSStub) DeleteSessionForToken(_ context.Context, _ string, _ string) (*deepseek.DeleteSessionResult, error) {
return &deepseek.DeleteSessionResult{Success: true}, nil
}
func (m *filesRouteDSStub) DeleteAllSessionsForToken(_ context.Context, _ string) error {
return nil
}
func newMultipartUploadRequest(t *testing.T, purpose string, filename string, data []byte) *http.Request {
t.Helper()
var body bytes.Buffer
writer := multipart.NewWriter(&body)
if purpose != "" {
if err := writer.WriteField("purpose", purpose); err != nil {
t.Fatalf("write purpose failed: %v", err)
}
}
part, err := writer.CreateFormFile("file", filename)
if err != nil {
t.Fatalf("create form file failed: %v", err)
}
if _, err := part.Write(data); err != nil {
t.Fatalf("write file failed: %v", err)
}
if err := writer.Close(); err != nil {
t.Fatalf("close writer failed: %v", err)
}
req := httptest.NewRequest(http.MethodPost, "/v1/files", &body)
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", writer.FormDataContentType())
return req
}
func TestFilesRouteUploadSuccess(t *testing.T) {
ds := &filesRouteDSStub{}
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: ds}
r := chi.NewRouter()
RegisterRoutes(r, h)
req := newMultipartUploadRequest(t, "assistants", "notes.txt", []byte("hello world"))
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
if ds.lastReq.Filename != "notes.txt" {
t.Fatalf("expected filename notes.txt, got %q", ds.lastReq.Filename)
}
if ds.lastReq.Purpose != "assistants" {
t.Fatalf("expected purpose assistants, got %q", ds.lastReq.Purpose)
}
if string(ds.lastReq.Data) != "hello world" {
t.Fatalf("unexpected uploaded data: %q", string(ds.lastReq.Data))
}
var out map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &out); err != nil {
t.Fatalf("decode response failed: %v body=%s", err, rec.Body.String())
}
if out["object"] != "file" {
t.Fatalf("expected file object, got %#v", out)
}
if out["id"] != "file-123" {
t.Fatalf("expected file id file-123, got %#v", out["id"])
}
if out["filename"] != "notes.txt" {
t.Fatalf("expected filename notes.txt, got %#v", out["filename"])
}
}
func TestFilesRouteUploadIncludesAccountIDForManagedAccount(t *testing.T) {
ds := &filesRouteDSStub{}
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: managedFilesAuthStub{}, DS: ds}
r := chi.NewRouter()
RegisterRoutes(r, h)
req := newMultipartUploadRequest(t, "assistants", "notes.txt", []byte("hello world"))
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
var out map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &out); err != nil {
t.Fatalf("decode response failed: %v body=%s", err, rec.Body.String())
}
if out["account_id"] != "acct-123" {
t.Fatalf("expected account_id acct-123, got %#v", out["account_id"])
}
}
func TestFilesRouteRejectsNonMultipart(t *testing.T) {
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: &filesRouteDSStub{}}
r := chi.NewRouter()
RegisterRoutes(r, h)
req := httptest.NewRequest(http.MethodPost, "/v1/files", bytes.NewBufferString(`{"purpose":"assistants"}`))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusBadRequest {
t.Fatalf("expected 400, got %d body=%s", rec.Code, rec.Body.String())
}
}
func TestFilesRouteRequiresFileField(t *testing.T) {
h := &Handler{Store: mockOpenAIConfig{wideInput: true}, Auth: streamStatusAuthStub{}, DS: &filesRouteDSStub{}}
r := chi.NewRouter()
RegisterRoutes(r, h)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
if err := writer.WriteField("purpose", "assistants"); err != nil {
t.Fatalf("write field failed: %v", err)
}
if err := writer.Close(); err != nil {
t.Fatalf("close writer failed: %v", err)
}
req := httptest.NewRequest(http.MethodPost, "/v1/files", &body)
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", writer.FormDataContentType())
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusBadRequest {
t.Fatalf("expected 400, got %d body=%s", rec.Code, rec.Body.String())
}
}

View File

@@ -5,6 +5,7 @@ import (
"encoding/json"
"io"
"net/http"
"strings"
"time"
"ds2api/internal/auth"
@@ -43,11 +44,20 @@ func (h *Handler) ChatCompletions(w http.ResponseWriter, r *http.Request) {
r = r.WithContext(auth.WithAuth(r.Context(), a))
r.Body = http.MaxBytesReader(w, r.Body, openAIGeneralMaxSize)
var req map[string]any
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
if strings.Contains(strings.ToLower(err.Error()), "too large") {
writeOpenAIError(w, http.StatusRequestEntityTooLarge, "request body too large")
return
}
writeOpenAIError(w, http.StatusBadRequest, "invalid json")
return
}
if err := h.preprocessInlineFileInputs(r.Context(), a, req); err != nil {
writeOpenAIInlineFileError(w, err)
return
}
stdReq, err := normalizeOpenAIChatRequest(h.Store, req, requestTraceID(r))
if err != nil {
writeOpenAIError(w, http.StatusBadRequest, err.Error())
@@ -127,7 +137,7 @@ func (h *Handler) handleNonStream(w http.ResponseWriter, ctx context.Context, re
stripReferenceMarkers := h.compatStripReferenceMarkers()
finalThinking := cleanVisibleOutput(result.Thinking, stripReferenceMarkers)
finalText := cleanVisibleOutput(result.Text, stripReferenceMarkers)
if writeUpstreamEmptyOutputError(w, finalThinking, finalText, result.ContentFilter) {
if writeUpstreamEmptyOutputError(w, finalText, result.ContentFilter) {
return
}
respBody := openaifmt.BuildChatCompletion(completionID, model, finalPrompt, finalThinking, finalText, toolNames)

View File

@@ -27,6 +27,10 @@ func (m *autoDeleteModeDSStub) GetPow(_ context.Context, _ *auth.RequestAuth, _
return "pow", nil
}
func (m *autoDeleteModeDSStub) UploadFile(_ context.Context, _ *auth.RequestAuth, _ deepseek.UploadFileRequest, _ int) (*deepseek.UploadFileResult, error) {
return &deepseek.UploadFileResult{ID: "file-id", Filename: "file.txt", Bytes: 1, Status: "uploaded"}, nil
}
func (m *autoDeleteModeDSStub) CallCompletion(_ context.Context, _ *auth.RequestAuth, _ map[string]any, _ string, _ int) (*http.Response, error) {
return m.resp, nil
}

View File

@@ -0,0 +1,104 @@
package openai
import (
"io"
"net/http"
"strings"
"time"
"ds2api/internal/auth"
"ds2api/internal/deepseek"
)
const openAIUploadMaxMemory = 32 << 20
func (h *Handler) UploadFile(w http.ResponseWriter, r *http.Request) {
a, err := h.Auth.Determine(r)
if err != nil {
status := http.StatusUnauthorized
detail := err.Error()
if err == auth.ErrNoAccount {
status = http.StatusTooManyRequests
}
writeOpenAIError(w, status, detail)
return
}
defer h.Auth.Release(a)
if !strings.HasPrefix(strings.ToLower(strings.TrimSpace(r.Header.Get("Content-Type"))), "multipart/form-data") {
writeOpenAIError(w, http.StatusBadRequest, "content-type must be multipart/form-data")
return
}
// Enforce a hard cap on the total request body size to prevent OOM
r.Body = http.MaxBytesReader(w, r.Body, openAIUploadMaxSize)
if err := r.ParseMultipartForm(openAIUploadMaxMemory); err != nil {
if strings.Contains(strings.ToLower(err.Error()), "too large") {
writeOpenAIError(w, http.StatusRequestEntityTooLarge, "file size exceeds limit")
return
}
writeOpenAIError(w, http.StatusBadRequest, "invalid multipart form")
return
}
if r.MultipartForm != nil {
defer func() { _ = r.MultipartForm.RemoveAll() }()
}
r = r.WithContext(auth.WithAuth(r.Context(), a))
file, header, err := r.FormFile("file")
if err != nil {
writeOpenAIError(w, http.StatusBadRequest, "file is required")
return
}
defer func() { _ = file.Close() }()
data, err := io.ReadAll(file)
if err != nil {
writeOpenAIError(w, http.StatusBadRequest, "failed to read uploaded file")
return
}
contentType := strings.TrimSpace(header.Header.Get("Content-Type"))
if contentType == "" && len(data) > 0 {
contentType = http.DetectContentType(data)
}
result, err := h.DS.UploadFile(r.Context(), a, deepseek.UploadFileRequest{
Filename: header.Filename,
ContentType: contentType,
Purpose: strings.TrimSpace(r.FormValue("purpose")),
Data: data,
}, 3)
if err != nil {
writeOpenAIError(w, http.StatusInternalServerError, "Failed to upload file.")
return
}
if result != nil && result.AccountID == "" {
result.AccountID = a.AccountID
}
writeJSON(w, http.StatusOK, buildOpenAIFileObject(result))
}
func buildOpenAIFileObject(result *deepseek.UploadFileResult) map[string]any {
if result == nil {
obj := map[string]any{
"id": "",
"object": "file",
"bytes": 0,
"created_at": time.Now().Unix(),
"filename": "",
"purpose": "",
"status": "uploaded",
"status_details": nil,
}
return obj
}
obj := map[string]any{
"id": result.ID,
"object": "file",
"bytes": result.Bytes,
"created_at": time.Now().Unix(),
"filename": result.Filename,
"purpose": result.Purpose,
"status": result.Status,
"status_details": nil,
}
if result.AccountID != "" {
obj["account_id"] = result.AccountID
}
return obj
}

View File

@@ -13,6 +13,13 @@ import (
"ds2api/internal/util"
)
const (
// openAIUploadMaxSize limits total multipart request body size (100 MiB).
openAIUploadMaxSize = 100 << 20
// openAIGeneralMaxSize limits total JSON request body size (100 MiB).
openAIGeneralMaxSize = 100 << 20
)
// writeJSON is a package-internal alias kept to avoid mass-renaming across
// every call-site in this package.
var writeJSON = util.WriteJSON
@@ -46,6 +53,7 @@ func RegisterRoutes(r chi.Router, h *Handler) {
r.Post("/v1/chat/completions", h.ChatCompletions)
r.Post("/v1/responses", h.Responses)
r.Get("/v1/responses/{response_id}", h.GetResponseByID)
r.Post("/v1/files", h.UploadFile)
r.Post("/v1/embeddings", h.Embeddings)
}

View File

@@ -313,6 +313,25 @@ func TestHandleNonStreamReturnsContentFilterErrorWhenUpstreamFilteredWithoutOutp
}
}
func TestHandleNonStreamReturns429WhenUpstreamHasOnlyThinking(t *testing.T) {
h := &Handler{}
resp := makeSSEHTTPResponse(
`data: {"p":"response/thinking_content","v":"Only thinking"}`,
`data: [DONE]`,
)
rec := httptest.NewRecorder()
h.handleNonStream(rec, context.Background(), resp, "cid-thinking-only", "deepseek-reasoner", "prompt", true, nil)
if rec.Code != http.StatusTooManyRequests {
t.Fatalf("expected status 429 for thinking-only upstream output, got %d body=%s", rec.Code, rec.Body.String())
}
out := decodeJSONBody(t, rec.Body.String())
errObj, _ := out["error"].(map[string]any)
if asString(errObj["code"]) != "upstream_empty_output" {
t.Fatalf("expected code=upstream_empty_output, got %#v", out)
}
}
func TestHandleStreamToolCallInterceptsWithoutRawContentLeak(t *testing.T) {
h := &Handler{}
resp := makeSSEHTTPResponse(

View File

@@ -2,13 +2,21 @@ package openai
import (
"regexp"
"strings"
)
var emptyJSONFencePattern = regexp.MustCompile("(?is)```json\\s*```")
var leakedToolCallArrayPattern = regexp.MustCompile(`(?is)\[\{\s*"function"\s*:\s*\{[\s\S]*?\}\s*,\s*"id"\s*:\s*"call[^"]*"\s*,\s*"type"\s*:\s*"function"\s*}\]`)
var leakedToolResultBlobPattern = regexp.MustCompile(`(?is)<\s*\|\s*tool\s*\|\s*>\s*\{[\s\S]*?"tool_call_id"\s*:\s*"call[^"]*"\s*}`)
// leakedThinkTagPattern matches leaked <think> / </think> tags, allowing
// optional whitespace inside the tag.
var leakedThinkTagPattern = regexp.MustCompile(`(?is)</?\s*think\s*>`)
// leakedBOSMarkerPattern matches DeepSeek BOS markers in BOTH separator forms,
// with or without the <|...|> wrapper:
// - ASCII underscore: <begin_of_sentence>, <|begin_of_sentence|>
// - U+2581 variant: <begin▁of▁sentence>, <|begin▁of▁sentence|>
var leakedBOSMarkerPattern = regexp.MustCompile(`(?i)<[\|]?\s*begin[_▁]of[_▁]sentence\s*[\|]?>`)
// leakedMetaMarkerPattern matches the remaining DeepSeek special tokens,
// wrapped in <|...|>, in BOTH separator forms:
// - ASCII underscore: <|end_of_sentence|>, <|end_of_toolresults|>, <|end_of_instructions|>
// - U+2581 variant: <|end▁of▁sentence|>, <|end▁of▁toolresults|>, <|end▁of▁instructions|>
var leakedMetaMarkerPattern = regexp.MustCompile(`(?i)<[\|]\s*(?:assistant|tool|end[_▁]of[_▁]sentence|end[_▁]of[_▁]thinking|end[_▁]of[_▁]toolresults|end[_▁]of[_▁]instructions)\s*[\|]>`)
@@ -35,11 +43,48 @@ func sanitizeLeakedOutput(text string) string {
out := emptyJSONFencePattern.ReplaceAllString(text, "")
out = leakedToolCallArrayPattern.ReplaceAllString(out, "")
out = leakedToolResultBlobPattern.ReplaceAllString(out, "")
out = stripDanglingThinkSuffix(out)
out = leakedThinkTagPattern.ReplaceAllString(out, "")
out = leakedBOSMarkerPattern.ReplaceAllString(out, "")
out = leakedMetaMarkerPattern.ReplaceAllString(out, "")
out = sanitizeLeakedAgentXMLBlocks(out)
return out
}
func stripDanglingThinkSuffix(text string) string {
matches := leakedThinkTagPattern.FindAllStringIndex(text, -1)
if len(matches) == 0 {
return text
}
depth := 0
lastOpen := -1
for _, loc := range matches {
tag := strings.ToLower(text[loc[0]:loc[1]])
compact := strings.ReplaceAll(strings.ReplaceAll(strings.TrimSpace(tag), " ", ""), "\t", "")
if strings.HasPrefix(compact, "</") {
if depth > 0 {
depth--
if depth == 0 {
lastOpen = -1
}
}
continue
}
if depth == 0 {
lastOpen = loc[0]
}
depth++
}
if depth == 0 || lastOpen < 0 {
return text
}
prefix := text[:lastOpen]
if strings.TrimSpace(prefix) == "" {
return ""
}
return prefix
}
func sanitizeLeakedAgentXMLBlocks(text string) string {
out := text
for _, pattern := range leakedAgentXMLBlockPatterns {

View File

@@ -26,6 +26,22 @@ func TestSanitizeLeakedOutputRemovesStandaloneMetaMarkers(t *testing.T) {
}
}
func TestSanitizeLeakedOutputRemovesThinkAndBosMarkers(t *testing.T) {
raw := "A<think>B</think>C<begin▁of▁sentence>D<| begin_of_sentence |>E<begin_of_sentence>F"
got := sanitizeLeakedOutput(raw)
if got != "ABCDEF" {
t.Fatalf("unexpected sanitize result for think/BOS markers: %q", got)
}
}
func TestSanitizeLeakedOutputRemovesDanglingThinkBlock(t *testing.T) {
raw := "Answer prefix<think>internal reasoning that never closes"
got := sanitizeLeakedOutput(raw)
if got != "Answer prefix" {
t.Fatalf("unexpected sanitize result for dangling think block: %q", got)
}
}
func TestSanitizeLeakedOutputRemovesAgentXMLLeaks(t *testing.T) {
raw := "Done.<attempt_completion><result>Some final answer</result></attempt_completion>"
got := sanitizeLeakedOutput(raw)

View File

@@ -22,6 +22,24 @@ func TestGetModelRouteDirectAndAlias(t *testing.T) {
}
})
t.Run("direct_expert", func(t *testing.T) {
req := httptest.NewRequest(http.MethodGet, "/v1/models/deepseek-expert-chat", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("direct_vision", func(t *testing.T) {
req := httptest.NewRequest(http.MethodGet, "/v1/models/deepseek-vision-chat", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("alias", func(t *testing.T) {
req := httptest.NewRequest(http.MethodGet, "/v1/models/gpt-4.1", nil)
rec := httptest.NewRecorder()

View File

@@ -5,22 +5,22 @@ import (
"ds2api/internal/util"
)
func buildOpenAIFinalPrompt(messagesRaw []any, toolsRaw any, traceID string) (string, []string) {
return buildOpenAIFinalPromptWithPolicy(messagesRaw, toolsRaw, traceID, util.DefaultToolChoicePolicy())
func buildOpenAIFinalPrompt(messagesRaw []any, toolsRaw any, traceID string, thinkingEnabled bool) (string, []string) {
return buildOpenAIFinalPromptWithPolicy(messagesRaw, toolsRaw, traceID, util.DefaultToolChoicePolicy(), thinkingEnabled)
}
func buildOpenAIFinalPromptWithPolicy(messagesRaw []any, toolsRaw any, traceID string, toolPolicy util.ToolChoicePolicy) (string, []string) {
func buildOpenAIFinalPromptWithPolicy(messagesRaw []any, toolsRaw any, traceID string, toolPolicy util.ToolChoicePolicy, thinkingEnabled bool) (string, []string) {
messages := normalizeOpenAIMessagesForPrompt(messagesRaw, traceID)
toolNames := []string{}
if tools, ok := toolsRaw.([]any); ok && len(tools) > 0 {
messages, toolNames = injectToolPrompt(messages, tools, toolPolicy)
}
return deepseek.MessagesPrepare(messages), toolNames
return deepseek.MessagesPrepareWithThinking(messages, thinkingEnabled), toolNames
}
// BuildPromptForAdapter exposes the OpenAI-compatible prompt building flow so
// other protocol adapters (for example Gemini) can reuse the same tool/history
// normalization logic and remain behavior-compatible with chat/completions.
func BuildPromptForAdapter(messagesRaw []any, toolsRaw any, traceID string) (string, []string) {
return buildOpenAIFinalPrompt(messagesRaw, toolsRaw, traceID)
func BuildPromptForAdapter(messagesRaw []any, toolsRaw any, traceID string, thinkingEnabled bool) (string, []string) {
return buildOpenAIFinalPrompt(messagesRaw, toolsRaw, traceID, thinkingEnabled)
}

View File

@@ -40,7 +40,7 @@ func TestBuildOpenAIFinalPrompt_HandlerPathIncludesToolRoundtripSemantics(t *tes
},
}
finalPrompt, toolNames := buildOpenAIFinalPrompt(messages, tools, "")
finalPrompt, toolNames := buildOpenAIFinalPrompt(messages, tools, "", false)
if len(toolNames) != 1 || toolNames[0] != "get_weather" {
t.Fatalf("unexpected tool names: %#v", toolNames)
}
@@ -73,7 +73,7 @@ func TestBuildOpenAIFinalPrompt_VercelPreparePathKeepsFinalAnswerInstruction(t *
},
}
finalPrompt, _ := buildOpenAIFinalPrompt(messages, tools, "")
finalPrompt, _ := buildOpenAIFinalPrompt(messages, tools, "", false)
if !strings.Contains(finalPrompt, "Remember: Output ONLY the <tool_calls>...</tool_calls> XML block when calling tools.") {
t.Fatalf("vercel prepare finalPrompt missing final tool-call anchor instruction: %q", finalPrompt)
}

View File

@@ -156,6 +156,33 @@ func TestNormalizeResponsesInputAsMessagesFunctionCallItemPreservesConcatenatedA
}
}
func TestCollectOpenAIRefFileIDs(t *testing.T) {
got := collectOpenAIRefFileIDs(map[string]any{
"ref_file_ids": []any{"file-top", "file-dup"},
"attachments": []any{
map[string]any{"file_id": "file-attachment"},
},
"input": []any{
map[string]any{
"type": "message",
"content": []any{
map[string]any{"type": "input_file", "file_id": "file-input"},
map[string]any{"type": "input_file", "id": "file-dup"},
},
},
},
})
want := []string{"file-top", "file-dup", "file-attachment", "file-input"}
if len(got) != len(want) {
t.Fatalf("expected %d file ids, got %#v", len(want), got)
}
for i, id := range want {
if got[i] != id {
t.Fatalf("unexpected file ids at %d: got=%#v want=%#v", i, got, want)
}
}
}
func TestExtractEmbeddingInputs(t *testing.T) {
got := extractEmbeddingInputs([]any{"a", "b"})
if len(got) != 2 || got[0] != "a" || got[1] != "b" {

View File

@@ -65,11 +65,20 @@ func (h *Handler) Responses(w http.ResponseWriter, r *http.Request) {
return
}
r.Body = http.MaxBytesReader(w, r.Body, openAIGeneralMaxSize)
var req map[string]any
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
if strings.Contains(strings.ToLower(err.Error()), "too large") {
writeOpenAIError(w, http.StatusRequestEntityTooLarge, "request body too large")
return
}
writeOpenAIError(w, http.StatusBadRequest, "invalid json")
return
}
if err := h.preprocessInlineFileInputs(r.Context(), a, req); err != nil {
writeOpenAIInlineFileError(w, err)
return
}
traceID := requestTraceID(r)
stdReq, err := normalizeOpenAIResponsesRequest(h.Store, req, traceID)
if err != nil {
@@ -117,7 +126,7 @@ func (h *Handler) handleResponsesNonStream(w http.ResponseWriter, resp *http.Res
stripReferenceMarkers := h.compatStripReferenceMarkers()
sanitizedThinking := cleanVisibleOutput(result.Thinking, stripReferenceMarkers)
sanitizedText := cleanVisibleOutput(result.Text, stripReferenceMarkers)
if writeUpstreamEmptyOutputError(w, sanitizedThinking, sanitizedText, result.ContentFilter) {
if writeUpstreamEmptyOutputError(w, sanitizedText, result.ContentFilter) {
return
}
textParsed := toolcall.ParseStandaloneToolCallsDetailed(sanitizedText, toolNames)

View File

@@ -99,6 +99,30 @@ func newResponsesStreamRuntime(
}
}
func (s *responsesStreamRuntime) failResponse(message, code string) {
s.failed = true
failedResp := map[string]any{
"id": s.responseID,
"type": "response",
"object": "response",
"model": s.model,
"status": "failed",
"output": []any{},
"output_text": "",
"error": map[string]any{
"message": message,
"type": "invalid_request_error",
"code": code,
"param": nil,
},
}
if s.persistResponse != nil {
s.persistResponse(failedResp)
}
s.sendEvent("response.failed", openaifmt.BuildResponsesFailedPayload(s.responseID, s.model, message, code))
s.sendDone()
}
func (s *responsesStreamRuntime) finalize() {
finalThinking := s.thinking.String()
finalText := cleanVisibleOutput(s.text.String(), s.stripReferenceMarkers)
@@ -121,28 +145,16 @@ func (s *responsesStreamRuntime) finalize() {
s.closeMessageItem()
if s.toolChoice.IsRequired() && len(detected) == 0 {
s.failed = true
message := "tool_choice requires at least one valid tool call."
failedResp := map[string]any{
"id": s.responseID,
"type": "response",
"object": "response",
"model": s.model,
"status": "failed",
"output": []any{},
"output_text": "",
"error": map[string]any{
"message": message,
"type": "invalid_request_error",
"code": "tool_choice_violation",
"param": nil,
},
s.failResponse("tool_choice requires at least one valid tool call.", "tool_choice_violation")
return
}
if len(detected) == 0 && strings.TrimSpace(finalText) == "" {
code := "upstream_empty_output"
message := "Upstream model returned empty output."
if finalThinking != "" {
message = "Upstream model returned reasoning without visible output."
}
if s.persistResponse != nil {
s.persistResponse(failedResp)
}
s.sendEvent("response.failed", openaifmt.BuildResponsesFailedPayload(s.responseID, s.model, message, "tool_choice_violation"))
s.sendDone()
s.failResponse(message, code)
return
}
s.closeIncompleteFunctionItems()

View File

@@ -518,6 +518,44 @@ func TestHandleResponsesStreamRequiredMalformedToolPayloadFails(t *testing.T) {
}
}
func TestHandleResponsesStreamFailsWhenUpstreamHasOnlyThinking(t *testing.T) {
h := &Handler{}
req := httptest.NewRequest(http.MethodPost, "/v1/responses", nil)
rec := httptest.NewRecorder()
sseLine := func(path, value string) string {
b, _ := json.Marshal(map[string]any{
"p": path,
"v": value,
})
return "data: " + string(b) + "\n"
}
streamBody := sseLine("response/thinking_content", "Only thinking") + "data: [DONE]\n"
resp := &http.Response{
StatusCode: http.StatusOK,
Body: io.NopCloser(strings.NewReader(streamBody)),
}
h.handleResponsesStream(rec, req, resp, "owner-a", "resp_test", "deepseek-reasoner", "prompt", true, false, nil, util.DefaultToolChoicePolicy(), "")
body := rec.Body.String()
if !strings.Contains(body, "event: response.failed") {
t.Fatalf("expected response.failed event, body=%s", body)
}
if strings.Contains(body, "event: response.completed") {
t.Fatalf("did not expect response.completed, body=%s", body)
}
payload, ok := extractSSEEventPayload(body, "response.failed")
if !ok {
t.Fatalf("expected response.failed payload, body=%s", body)
}
errObj, _ := payload["error"].(map[string]any)
if asString(errObj["code"]) != "upstream_empty_output" {
t.Fatalf("expected code=upstream_empty_output, got %#v", payload)
}
}
func TestHandleResponsesStreamAllowsUnknownToolName(t *testing.T) {
h := &Handler{}
req := httptest.NewRequest(http.MethodPost, "/v1/responses", nil)
@@ -671,6 +709,28 @@ func TestHandleResponsesNonStreamReturnsContentFilterErrorWhenUpstreamFilteredWi
}
}
func TestHandleResponsesNonStreamReturns429WhenUpstreamHasOnlyThinking(t *testing.T) {
h := &Handler{}
rec := httptest.NewRecorder()
resp := &http.Response{
StatusCode: http.StatusOK,
Body: io.NopCloser(strings.NewReader(
`data: {"p":"response/thinking_content","v":"Only thinking"}` + "\n" +
`data: [DONE]` + "\n",
)),
}
h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-reasoner", "prompt", true, nil, util.DefaultToolChoicePolicy(), "")
if rec.Code != http.StatusTooManyRequests {
t.Fatalf("expected 429 for thinking-only upstream output, got %d body=%s", rec.Code, rec.Body.String())
}
out := decodeJSONBody(t, rec.Body.String())
errObj, _ := out["error"].(map[string]any)
if asString(errObj["code"]) != "upstream_empty_output" {
t.Fatalf("expected code=upstream_empty_output, got %#v", out)
}
}
func extractSSEEventPayload(body, targetEvent string) (map[string]any, bool) {
scanner := bufio.NewScanner(strings.NewReader(body))
matched := false

View File

@@ -24,9 +24,10 @@ func normalizeOpenAIChatRequest(store ConfigReader, req map[string]any, traceID
responseModel = resolvedModel
}
toolPolicy := util.DefaultToolChoicePolicy()
finalPrompt, toolNames := buildOpenAIFinalPromptWithPolicy(messagesRaw, req["tools"], traceID, toolPolicy)
finalPrompt, toolNames := buildOpenAIFinalPromptWithPolicy(messagesRaw, req["tools"], traceID, toolPolicy, thinkingEnabled)
toolNames = ensureToolDetectionEnabled(toolNames, req["tools"])
passThrough := collectOpenAIChatPassThrough(req)
refFileIDs := collectOpenAIRefFileIDs(req)
return util.StandardRequest{
Surface: "openai_chat",
@@ -40,6 +41,7 @@ func normalizeOpenAIChatRequest(store ConfigReader, req map[string]any, traceID
Stream: util.ToBool(req["stream"]),
Thinking: thinkingEnabled,
Search: searchEnabled,
RefFileIDs: refFileIDs,
PassThrough: passThrough,
}, nil
}
@@ -74,12 +76,13 @@ func normalizeOpenAIResponsesRequest(store ConfigReader, req map[string]any, tra
if err != nil {
return util.StandardRequest{}, err
}
finalPrompt, toolNames := buildOpenAIFinalPromptWithPolicy(messagesRaw, req["tools"], traceID, toolPolicy)
finalPrompt, toolNames := buildOpenAIFinalPromptWithPolicy(messagesRaw, req["tools"], traceID, toolPolicy, thinkingEnabled)
toolNames = ensureToolDetectionEnabled(toolNames, req["tools"])
if !toolPolicy.IsNone() {
toolPolicy.Allowed = namesToSet(toolNames)
}
passThrough := collectOpenAIChatPassThrough(req)
refFileIDs := collectOpenAIRefFileIDs(req)
return util.StandardRequest{
Surface: "openai_responses",
@@ -93,6 +96,7 @@ func normalizeOpenAIResponsesRequest(store ConfigReader, req map[string]any, tra
Stream: util.ToBool(req["stream"]),
Thinking: thinkingEnabled,
Search: searchEnabled,
RefFileIDs: refFileIDs,
PassThrough: passThrough,
}, nil
}

View File

@@ -41,6 +41,36 @@ func TestNormalizeOpenAIChatRequest(t *testing.T) {
}
}
func TestNormalizeOpenAIChatRequestCollectsRefFileIDs(t *testing.T) {
store := newEmptyStoreForNormalizeTest(t)
req := map[string]any{
"model": "gpt-5-codex",
"messages": []any{
map[string]any{
"role": "user",
"content": []any{
map[string]any{"type": "input_text", "text": "hello"},
map[string]any{"type": "input_file", "file_id": "file-msg"},
},
},
},
"attachments": []any{
map[string]any{"file_id": "file-attachment"},
},
"ref_file_ids": []any{"file-top", "file-attachment"},
}
n, err := normalizeOpenAIChatRequest(store, req, "")
if err != nil {
t.Fatalf("normalize failed: %v", err)
}
if len(n.RefFileIDs) != 3 {
t.Fatalf("expected 3 distinct file ids, got %#v", n.RefFileIDs)
}
if n.RefFileIDs[0] != "file-top" || n.RefFileIDs[1] != "file-attachment" || n.RefFileIDs[2] != "file-msg" {
t.Fatalf("unexpected file ids: %#v", n.RefFileIDs)
}
}
func TestNormalizeOpenAIResponsesRequestInput(t *testing.T) {
store := newEmptyStoreForNormalizeTest(t)
req := map[string]any{

View File

@@ -50,6 +50,10 @@ func (m streamStatusDSStub) GetPow(_ context.Context, _ *auth.RequestAuth, _ int
return "pow", nil
}
func (m streamStatusDSStub) UploadFile(_ context.Context, _ *auth.RequestAuth, _ deepseek.UploadFileRequest, _ int) (*deepseek.UploadFileResult, error) {
return &deepseek.UploadFileResult{ID: "file-id", Filename: "file.txt", Bytes: 1, Status: "uploaded"}, nil
}
func (m streamStatusDSStub) CallCompletion(_ context.Context, _ *auth.RequestAuth, _ map[string]any, _ string, _ int) (*http.Response, error) {
return m.resp, nil
}
@@ -239,6 +243,49 @@ func TestChatCompletionsStreamContentFilterStopsNormallyWithoutLeak(t *testing.T
}
}
func TestChatCompletionsStreamEmitsFailureFrameWhenUpstreamOutputEmpty(t *testing.T) {
statuses := make([]int, 0, 1)
h := &Handler{
Store: mockOpenAIConfig{wideInput: true},
Auth: streamStatusAuthStub{},
DS: streamStatusDSStub{resp: makeOpenAISSEHTTPResponse("data: [DONE]")},
}
r := chi.NewRouter()
r.Use(captureStatusMiddleware(&statuses))
RegisterRoutes(r, h)
reqBody := `{"model":"deepseek-chat","messages":[{"role":"user","content":"hi"}],"stream":true}`
req := httptest.NewRequest(http.MethodPost, "/v1/chat/completions", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
if len(statuses) != 1 || statuses[0] != http.StatusOK {
t.Fatalf("expected captured status 200, got %#v", statuses)
}
frames, done := parseSSEDataFrames(t, rec.Body.String())
if !done {
t.Fatalf("expected [DONE], body=%s", rec.Body.String())
}
if len(frames) != 1 {
t.Fatalf("expected one failure frame, got %#v body=%s", frames, rec.Body.String())
}
last := frames[0]
statusCode, ok := last["status_code"].(float64)
if !ok || int(statusCode) != http.StatusTooManyRequests {
t.Fatalf("expected status_code=429, got %#v body=%s", last["status_code"], rec.Body.String())
}
errObj, _ := last["error"].(map[string]any)
if asString(errObj["code"]) != "upstream_empty_output" {
t.Fatalf("expected code=upstream_empty_output, got %#v", last)
}
}
func TestResponsesStreamUsageIgnoresBatchAccumulatedTokenUsage(t *testing.T) {
statuses := make([]int, 0, 1)
h := &Handler{

View File

@@ -2,8 +2,8 @@ package openai
import "net/http"
func writeUpstreamEmptyOutputError(w http.ResponseWriter, thinking, text string, contentFilter bool) bool {
if thinking != "" || text != "" {
func writeUpstreamEmptyOutputError(w http.ResponseWriter, text string, contentFilter bool) bool {
if text != "" {
return false
}
if contentFilter {

View File

@@ -52,6 +52,10 @@ func (h *Handler) handleVercelStreamPrepare(w http.ResponseWriter, r *http.Reque
writeOpenAIError(w, http.StatusBadRequest, "invalid json")
return
}
if err := h.preprocessInlineFileInputs(r.Context(), a, req); err != nil {
writeOpenAIInlineFileError(w, err)
return
}
if !util.ToBool(req["stream"]) {
writeOpenAIError(w, http.StatusBadRequest, "stream must be true")
return

View File

@@ -15,8 +15,17 @@ import (
"ds2api/internal/config"
"ds2api/internal/deepseek"
"ds2api/internal/sse"
"ds2api/internal/util"
)
type modelAliasSnapshotReader struct {
aliases map[string]string
}
func (m modelAliasSnapshotReader) ModelAliases() map[string]string {
return m.aliases
}
func (h *Handler) testSingleAccount(w http.ResponseWriter, r *http.Request) {
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
@@ -150,16 +159,27 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
return result
}
thinking, search, ok := config.GetModelConfig(model)
resolvedModel, resolved := config.ResolveModel(modelAliasSnapshotReader{
aliases: h.Store.Snapshot().ModelAliases,
}, model)
if resolved {
model = resolvedModel
thinking, search, ok = config.GetModelConfig(model)
}
if !ok {
thinking, search = false, false
}
_ = search
pow, err := h.DS.GetPow(proxyCtx, authCtx, 1)
if err != nil {
result["message"] = "获取 PoW 失败: " + err.Error()
return result
}
-payload := map[string]any{"chat_session_id": sessionID, "prompt": deepseek.MessagesPrepare([]map[string]any{{"role": "user", "content": message}}), "ref_file_ids": []any{}, "thinking_enabled": thinking, "search_enabled": search}
+payload := util.StandardRequest{
+ResolvedModel: model,
+FinalPrompt: deepseek.MessagesPrepare([]map[string]any{{"role": "user", "content": message}}),
+Thinking: thinking,
+Search: search,
+}.CompletionPayload(sessionID)
resp, err := h.DS.CallCompletion(proxyCtx, authCtx, payload, pow, 1)
if err != nil {
result["message"] = "请求失败: " + err.Error()

View File

@@ -5,6 +5,7 @@ import (
"context"
"encoding/json"
"errors"
"io"
"net/http"
"net/http/httptest"
"strings"
@@ -133,3 +134,78 @@ func TestDeleteAllSessions_RetryWithReloginOnDeleteFailure(t *testing.T) {
t.Fatalf("expected refreshed token persisted, got %q", updated.Token)
}
}
type completionPayloadDSMock struct {
payload map[string]any
}
func (m *completionPayloadDSMock) Login(_ context.Context, _ config.Account) (string, error) {
return "new-token", nil
}
func (m *completionPayloadDSMock) CreateSession(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
return "session-id", nil
}
func (m *completionPayloadDSMock) GetPow(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
return "pow-ok", nil
}
func (m *completionPayloadDSMock) CallCompletion(_ context.Context, _ *auth.RequestAuth, payload map[string]any, _ string, _ int) (*http.Response, error) {
m.payload = payload
return &http.Response{
StatusCode: http.StatusOK,
Body: io.NopCloser(strings.NewReader("data: {\"v\":\"ok\"}\n\ndata: [DONE]\n\n")),
}, nil
}
func (m *completionPayloadDSMock) DeleteAllSessionsForToken(_ context.Context, _ string) error {
return nil
}
func (m *completionPayloadDSMock) GetSessionCountForToken(_ context.Context, _ string) (*deepseek.SessionStats, error) {
return &deepseek.SessionStats{Success: true}, nil
}
func TestTestAccount_MessageModeUsesExpertModelTypeForExpertModel(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"accounts":[{"email":"batch@example.com","password":"pwd","token":"seed-token"}]}`)
store := config.LoadStore()
ds := &completionPayloadDSMock{}
h := &Handler{Store: store, DS: ds}
acc, ok := store.FindAccount("batch@example.com")
if !ok {
t.Fatal("expected test account")
}
result := h.testAccount(context.Background(), acc, "deepseek-expert-chat", "hello")
if ok, _ := result["success"].(bool); !ok {
t.Fatalf("expected success=true, got %#v", result)
}
if got := ds.payload["model_type"]; got != "expert" {
t.Fatalf("expected model_type expert, got %#v", got)
}
if got := ds.payload["chat_session_id"]; got != "session-id" {
t.Fatalf("unexpected chat_session_id: %#v", got)
}
}
func TestTestAccount_MessageModeUsesVisionModelTypeForVisionModel(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"accounts":[{"email":"batch@example.com","password":"pwd","token":"seed-token"}]}`)
store := config.LoadStore()
ds := &completionPayloadDSMock{}
h := &Handler{Store: store, DS: ds}
acc, ok := store.FindAccount("batch@example.com")
if !ok {
t.Fatal("expected test account")
}
result := h.testAccount(context.Background(), acc, "deepseek-vision-chat", "hello")
if ok, _ := result["success"].(bool); !ok {
t.Fatalf("expected success=true, got %#v", result)
}
if got := ds.payload["model_type"]; got != "vision" {
t.Fatalf("expected model_type vision, got %#v", got)
}
}

View File

@@ -49,6 +49,51 @@ func TestGetModelConfigDeepSeekReasonerSearch(t *testing.T) {
}
}
func TestGetModelConfigDeepSeekExpertChat(t *testing.T) {
thinking, search, ok := GetModelConfig("deepseek-expert-chat")
if !ok {
t.Fatal("expected ok for deepseek-expert-chat")
}
if thinking || search {
t.Fatalf("expected no thinking/search for deepseek-expert-chat, got thinking=%v search=%v", thinking, search)
}
}
func TestGetModelConfigDeepSeekExpertReasonerSearch(t *testing.T) {
thinking, search, ok := GetModelConfig("deepseek-expert-reasoner-search")
if !ok {
t.Fatal("expected ok for deepseek-expert-reasoner-search")
}
if !thinking || !search {
t.Fatalf("expected both true, got thinking=%v search=%v", thinking, search)
}
}
func TestGetModelConfigDeepSeekVisionReasonerSearch(t *testing.T) {
thinking, search, ok := GetModelConfig("deepseek-vision-reasoner-search")
if !ok {
t.Fatal("expected ok for deepseek-vision-reasoner-search")
}
if !thinking || !search {
t.Fatalf("expected both true, got thinking=%v search=%v", thinking, search)
}
}
func TestGetModelTypeDefaultExpertAndVision(t *testing.T) {
defaultType, ok := GetModelType("deepseek-chat")
if !ok || defaultType != "default" {
t.Fatalf("expected default model_type, got ok=%v model_type=%q", ok, defaultType)
}
expertType, ok := GetModelType("deepseek-expert-chat")
if !ok || expertType != "expert" {
t.Fatalf("expected expert model_type, got ok=%v model_type=%q", ok, expertType)
}
visionType, ok := GetModelType("deepseek-vision-chat")
if !ok || visionType != "vision" {
t.Fatalf("expected vision model_type, got ok=%v model_type=%q", ok, visionType)
}
}
func TestGetModelConfigCaseInsensitive(t *testing.T) {
thinking, search, ok := GetModelConfig("DeepSeek-Chat")
if !ok {
@@ -551,6 +596,30 @@ func TestOpenAIModelsResponse(t *testing.T) {
if len(data) == 0 {
t.Fatal("expected non-empty models list")
}
expected := map[string]bool{
"deepseek-chat": false,
"deepseek-reasoner": false,
"deepseek-chat-search": false,
"deepseek-reasoner-search": false,
"deepseek-expert-chat": false,
"deepseek-expert-reasoner": false,
"deepseek-expert-chat-search": false,
"deepseek-expert-reasoner-search": false,
"deepseek-vision-chat": false,
"deepseek-vision-reasoner": false,
"deepseek-vision-chat-search": false,
"deepseek-vision-reasoner-search": false,
}
for _, model := range data {
if _, ok := expected[model.ID]; ok {
expected[model.ID] = true
}
}
for id, seen := range expected {
if !seen {
t.Fatalf("expected OpenAI model list to include %s", id)
}
}
}
func TestClaudeModelsResponse(t *testing.T) {

View File

@@ -2,6 +2,10 @@ package config
import "testing"
type mockModelAliasReader map[string]string
func (m mockModelAliasReader) ModelAliases() map[string]string { return m }
func TestResolveModelDirectDeepSeek(t *testing.T) {
got, ok := ResolveModel(nil, "deepseek-chat")
if !ok || got != "deepseek-chat" {
@@ -30,6 +34,31 @@ func TestResolveModelUnknown(t *testing.T) {
}
}
func TestResolveModelDirectDeepSeekExpert(t *testing.T) {
got, ok := ResolveModel(nil, "deepseek-expert-chat")
if !ok || got != "deepseek-expert-chat" {
t.Fatalf("expected deepseek-expert-chat, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelCustomAliasToExpert(t *testing.T) {
got, ok := ResolveModel(mockModelAliasReader{
"my-expert-model": "deepseek-expert-reasoner-search",
}, "my-expert-model")
if !ok || got != "deepseek-expert-reasoner-search" {
t.Fatalf("expected alias -> deepseek-expert-reasoner-search, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelCustomAliasToVision(t *testing.T) {
got, ok := ResolveModel(mockModelAliasReader{
"my-vision-model": "deepseek-vision-chat-search",
}, "my-vision-model")
if !ok || got != "deepseek-vision-chat-search" {
t.Fatalf("expected alias -> deepseek-vision-chat-search, got ok=%v model=%q", ok, got)
}
}
func TestClaudeModelsResponsePaginationFields(t *testing.T) {
resp := ClaudeModelsResponse()
if _, ok := resp["first_id"]; !ok {

View File

@@ -19,6 +19,14 @@ var DeepSeekModels = []ModelInfo{
{ID: "deepseek-reasoner", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-chat-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-reasoner-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-chat", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-reasoner", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-chat-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-reasoner-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-chat", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-reasoner", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-chat-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-reasoner-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
}
var ClaudeModels = []ModelInfo{
@@ -72,11 +80,40 @@ func GetModelConfig(model string) (thinking bool, search bool, ok bool) {
return false, true, true
case "deepseek-reasoner-search":
return true, true, true
case "deepseek-expert-chat":
return false, false, true
case "deepseek-expert-reasoner":
return true, false, true
case "deepseek-expert-chat-search":
return false, true, true
case "deepseek-expert-reasoner-search":
return true, true, true
case "deepseek-vision-chat":
return false, false, true
case "deepseek-vision-reasoner":
return true, false, true
case "deepseek-vision-chat-search":
return false, true, true
case "deepseek-vision-reasoner-search":
return true, true, true
default:
return false, false, false
}
}
func GetModelType(model string) (modelType string, ok bool) {
switch lower(model) {
case "deepseek-chat", "deepseek-reasoner", "deepseek-chat-search", "deepseek-reasoner-search":
return "default", true
case "deepseek-expert-chat", "deepseek-expert-reasoner", "deepseek-expert-chat-search", "deepseek-expert-reasoner-search":
return "expert", true
case "deepseek-vision-chat", "deepseek-vision-reasoner", "deepseek-vision-chat-search", "deepseek-vision-reasoner-search":
return "vision", true
default:
return "", false
}
}
func IsSupportedDeepSeekModel(model string) bool {
_, _, ok := GetModelConfig(model)
return ok

View File

@@ -66,9 +66,7 @@ func (c *Client) CreateSession(ctx context.Context, a *auth.RequestAuth, maxAtte
}
code, bizCode, msg, bizMsg := extractResponseStatus(resp)
if status == http.StatusOK && code == 0 && bizCode == 0 {
-data, _ := resp["data"].(map[string]any)
-bizData, _ := data["biz_data"].(map[string]any)
-sessionID, _ := bizData["id"].(string)
+sessionID := extractCreateSessionID(resp)
if sessionID != "" {
return sessionID, nil
}
@@ -93,17 +91,25 @@ func (c *Client) CreateSession(ctx context.Context, a *auth.RequestAuth, maxAtte
}
func (c *Client) GetPow(ctx context.Context, a *auth.RequestAuth, maxAttempts int) (string, error) {
return c.GetPowForTarget(ctx, a, DeepSeekCompletionTargetPath, maxAttempts)
}
func (c *Client) GetPowForTarget(ctx context.Context, a *auth.RequestAuth, targetPath string, maxAttempts int) (string, error) {
if maxAttempts <= 0 {
maxAttempts = c.maxRetries
}
targetPath = strings.TrimSpace(targetPath)
if targetPath == "" {
targetPath = DeepSeekCompletionTargetPath
}
clients := c.requestClientsForAuth(ctx, a)
attempts := 0
refreshed := false
for attempts < maxAttempts {
headers := c.authHeaders(a.DeepSeekToken)
-resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekCreatePowURL, headers, map[string]any{"target_path": "/api/v0/chat/completion"})
+resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekCreatePowURL, headers, map[string]any{"target_path": targetPath})
if err != nil {
-config.Logger.Warn("[get_pow] request error", "error", err, "account", a.AccountID)
+config.Logger.Warn("[get_pow] request error", "error", err, "account", a.AccountID, "target_path", targetPath)
attempts++
continue
}
@@ -119,7 +125,7 @@ func (c *Client) GetPow(ctx context.Context, a *auth.RequestAuth, maxAttempts in
}
return BuildPowHeader(challenge, answer)
}
-config.Logger.Warn("[get_pow] failed", "status", status, "code", code, "biz_code", bizCode, "msg", msg, "biz_msg", bizMsg, "use_config_token", a.UseConfigToken, "account", a.AccountID)
+config.Logger.Warn("[get_pow] failed", "status", status, "code", code, "biz_code", bizCode, "msg", msg, "biz_msg", bizMsg, "use_config_token", a.UseConfigToken, "account", a.AccountID, "target_path", targetPath)
if a.UseConfigToken {
if !refreshed && shouldAttemptRefresh(status, code, bizCode, msg, bizMsg) {
if c.Auth.RefreshToken(ctx, a) {
@@ -204,6 +210,22 @@ func isAuthIndicativeBizFailure(msg string, bizMsg string) bool {
return false
}
// DeepSeek has returned create-session ids in both biz_data.id and
// biz_data.chat_session.id across observed response variants; accept either.
func extractCreateSessionID(resp map[string]any) string {
data, _ := resp["data"].(map[string]any)
bizData, _ := data["biz_data"].(map[string]any)
if sessionID, _ := bizData["id"].(string); strings.TrimSpace(sessionID) != "" {
return strings.TrimSpace(sessionID)
}
if chatSession, ok := bizData["chat_session"].(map[string]any); ok {
if sessionID, _ := chatSession["id"].(string); strings.TrimSpace(sessionID) != "" {
return strings.TrimSpace(sessionID)
}
}
return ""
}
func extractResponseStatus(resp map[string]any) (code int, bizCode int, msg string, bizMsg string) {
code = intFrom(resp["code"])
msg, _ = resp["msg"].(string)

View File

@@ -0,0 +1,34 @@
package deepseek
import "testing"
func TestExtractCreateSessionIDSupportsLegacyShape(t *testing.T) {
resp := map[string]any{
"data": map[string]any{
"biz_data": map[string]any{
"id": "legacy-session-id",
},
},
}
if got := extractCreateSessionID(resp); got != "legacy-session-id" {
t.Fatalf("expected legacy session id, got %q", got)
}
}
func TestExtractCreateSessionIDSupportsNestedChatSessionShape(t *testing.T) {
resp := map[string]any{
"data": map[string]any{
"biz_data": map[string]any{
"chat_session": map[string]any{
"id": "nested-session-id",
"model_type": "default",
},
},
},
}
if got := extractCreateSessionID(resp); got != "nested-session-id" {
t.Fatalf("expected nested session id, got %q", got)
}
}

View File

@@ -51,6 +51,7 @@ func (c *Client) streamPost(ctx context.Context, doer trans.Doer, url string, he
if err != nil {
return nil, err
}
headers = c.jsonHeaders(headers)
clients := c.requestClientsFromContext(ctx)
req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(b))
if err != nil {

View File

@@ -0,0 +1,188 @@
package deepseek
import (
"context"
"errors"
"fmt"
"net/http"
"net/url"
"strings"
"time"
"ds2api/internal/auth"
"ds2api/internal/config"
)
const (
fileReadyPollAttempts = 60
fileReadyPollInterval = time.Second
fileReadyPollTimeout = 65 * time.Second
)
var fileReadySleep = time.Sleep
func (c *Client) waitForUploadedFile(ctx context.Context, a *auth.RequestAuth, result *UploadFileResult) error {
if result == nil || strings.TrimSpace(result.ID) == "" {
return nil
}
if isReadyUploadFileStatus(result.Status) {
return nil
}
pollCtx, cancel := context.WithTimeout(ctx, fileReadyPollTimeout)
defer cancel()
var lastErr error
for attempt := 0; attempt < fileReadyPollAttempts; attempt++ {
if err := pollCtx.Err(); err != nil {
if lastErr != nil {
return fmt.Errorf("waiting for file %s to become ready: %w", result.ID, lastErr)
}
return fmt.Errorf("waiting for file %s to become ready: %w", result.ID, err)
}
fetched, err := c.fetchUploadedFile(pollCtx, a, result.ID)
if err == nil && fetched != nil {
mergeUploadFileResults(result, fetched)
if isReadyUploadFileStatus(result.Status) {
return nil
}
lastErr = fmt.Errorf("status=%s", strings.TrimSpace(result.Status))
} else if err != nil {
lastErr = err
config.Logger.Debug("[upload_file] waiting for file readiness", "file_id", result.ID, "attempt", attempt+1, "error", err)
}
if attempt < fileReadyPollAttempts-1 {
fileReadySleep(fileReadyPollInterval)
}
}
if lastErr == nil {
lastErr = fmt.Errorf("status=%s", strings.TrimSpace(result.Status))
}
return fmt.Errorf("file %s did not become ready: %w", result.ID, lastErr)
}
func (c *Client) fetchUploadedFile(ctx context.Context, a *auth.RequestAuth, fileID string) (*UploadFileResult, error) {
fileID = strings.TrimSpace(fileID)
if fileID == "" {
return nil, errors.New("file id is required")
}
clients := c.requestClientsForAuth(ctx, a)
reqURL := DeepSeekFetchFilesURL + "?file_ids=" + url.QueryEscape(fileID)
headers := c.authHeaders(a.DeepSeekToken)
resp, status, err := c.getJSONWithStatus(ctx, clients.regular, reqURL, headers)
if err != nil {
return nil, err
}
code, bizCode, msg, bizMsg := extractResponseStatus(resp)
if status != http.StatusOK || code != 0 || bizCode != 0 {
if strings.TrimSpace(bizMsg) != "" {
msg = bizMsg
}
if msg == "" {
msg = http.StatusText(status)
}
return nil, fmt.Errorf("request failed: status=%d, code=%d, msg=%s", status, code, msg)
}
result := extractFetchedUploadFileResult(resp, fileID)
if result == nil || strings.TrimSpace(result.ID) == "" {
return nil, errors.New("fetch files succeeded without matching file data")
}
result.Raw = resp
return result, nil
}
func extractFetchedUploadFileResult(resp map[string]any, targetID string) *UploadFileResult {
targetID = strings.TrimSpace(targetID)
if resp == nil || targetID == "" {
return nil
}
var walk func(any) *UploadFileResult
walk = func(v any) *UploadFileResult {
switch x := v.(type) {
case map[string]any:
if result := buildUploadFileResultFromMap(x, targetID); result != nil {
return result
}
for _, nested := range x {
if result := walk(nested); result != nil {
return result
}
}
case []any:
for _, item := range x {
if result := walk(item); result != nil {
return result
}
}
}
return nil
}
if result := walk(resp); result != nil {
return result
}
return nil
}
func buildUploadFileResultFromMap(m map[string]any, targetID string) *UploadFileResult {
fileID := strings.TrimSpace(firstNonEmptyString(m, "id", "file_id"))
if fileID == "" || !strings.EqualFold(fileID, targetID) {
return nil
}
result := &UploadFileResult{
ID: fileID,
Filename: firstNonEmptyString(m, "name", "filename", "file_name"),
Status: firstNonEmptyString(m, "status", "file_status"),
Purpose: firstNonEmptyString(m, "purpose"),
IsImage: firstBool(m, "is_image", "isImage"),
Bytes: firstPositiveInt64(m, "bytes", "size", "file_size"),
}
if result.Status == "" {
result.Status = "uploaded"
}
return result
}
func mergeUploadFileResults(dst, src *UploadFileResult) {
if dst == nil || src == nil {
return
}
if strings.TrimSpace(src.ID) != "" {
dst.ID = strings.TrimSpace(src.ID)
}
if strings.TrimSpace(src.Filename) != "" {
dst.Filename = strings.TrimSpace(src.Filename)
}
if src.Bytes > 0 {
dst.Bytes = src.Bytes
}
if strings.TrimSpace(src.Status) != "" {
dst.Status = strings.TrimSpace(src.Status)
}
if strings.TrimSpace(src.Purpose) != "" {
dst.Purpose = strings.TrimSpace(src.Purpose)
}
dst.IsImage = src.IsImage
if len(src.Raw) > 0 {
dst.Raw = src.Raw
}
if src.RawHeaders != nil {
dst.RawHeaders = src.RawHeaders.Clone()
}
}
func isReadyUploadFileStatus(status string) bool {
switch strings.ToLower(strings.TrimSpace(status)) {
case "processed", "ready", "done", "available", "success", "completed", "finished":
return true
default:
return false
}
}

View File

@@ -35,6 +35,12 @@ func preview(b []byte) string {
return s
}
func (c *Client) jsonHeaders(headers map[string]string) map[string]string {
out := cloneStringMap(headers)
out["Content-Type"] = "application/json"
return out
}
func ScanSSELines(resp *http.Response, onLine func([]byte) bool) error {
scanner := bufio.NewScanner(resp.Body)
buf := make([]byte, 0, 64*1024)

View File

@@ -27,6 +27,7 @@ func (c *Client) postJSONWithStatus(ctx context.Context, doer trans.Doer, fallba
if err != nil {
return nil, 0, err
}
headers = c.jsonHeaders(headers)
req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(b))
if err != nil {
return nil, 0, err

View File

@@ -0,0 +1,282 @@
package deepseek
import (
"bytes"
"context"
"encoding/json"
"errors"
"fmt"
"mime/multipart"
"net/http"
"net/textproto"
"path/filepath"
"strconv"
"strings"
"ds2api/internal/auth"
"ds2api/internal/config"
trans "ds2api/internal/deepseek/transport"
)
type UploadFileRequest struct {
Filename string
ContentType string
Purpose string
Data []byte
}
type UploadFileResult struct {
ID string
Filename string
Bytes int64
Status string
Purpose string
AccountID string
IsImage bool
Raw map[string]any
RawHeaders http.Header
}
func (c *Client) UploadFile(ctx context.Context, a *auth.RequestAuth, req UploadFileRequest, maxAttempts int) (*UploadFileResult, error) {
if maxAttempts <= 0 {
maxAttempts = c.maxRetries
}
if len(req.Data) == 0 {
return nil, errors.New("file is required")
}
filename := strings.TrimSpace(req.Filename)
if filename == "" {
filename = "upload.bin"
}
contentType := strings.TrimSpace(req.ContentType)
if contentType == "" {
contentType = "application/octet-stream"
}
purpose := strings.TrimSpace(req.Purpose)
body, contentTypeHeader, err := buildUploadMultipartBody(filename, contentType, req.Data)
if err != nil {
return nil, err
}
capturePayload := map[string]any{
"filename": filename,
"content_type": contentType,
"purpose": purpose,
"bytes": len(req.Data),
}
captureSession := c.capture.Start("deepseek_upload_file", DeepSeekUploadFileURL, a.AccountID, capturePayload)
attempts := 0
refreshed := false
powHeader := ""
for attempts < maxAttempts {
clients := c.requestClientsForAuth(ctx, a)
if strings.TrimSpace(powHeader) == "" {
powHeader, err = c.GetPowForTarget(ctx, a, DeepSeekUploadTargetPath, maxAttempts)
if err != nil {
return nil, err
}
clients = c.requestClientsForAuth(ctx, a)
}
headers := c.authHeaders(a.DeepSeekToken)
headers["Content-Type"] = contentTypeHeader
headers["x-ds-pow-response"] = powHeader
headers["x-file-size"] = strconv.Itoa(len(req.Data))
headers["x-thinking-enabled"] = "1"
resp, err := c.doUpload(ctx, clients.regular, clients.fallback, DeepSeekUploadFileURL, headers, body)
if err != nil {
config.Logger.Warn("[upload_file] request error", "error", err, "account", a.AccountID, "filename", filename)
powHeader = ""
attempts++
continue
}
if captureSession != nil {
resp.Body = captureSession.WrapBody(resp.Body, resp.StatusCode)
}
payloadBytes, readErr := readResponseBody(resp)
_ = resp.Body.Close()
if readErr != nil {
powHeader = ""
attempts++
continue
}
parsed := map[string]any{}
if len(payloadBytes) > 0 {
if err := json.Unmarshal(payloadBytes, &parsed); err != nil {
config.Logger.Warn("[upload_file] json parse failed", "status", resp.StatusCode, "preview", preview(payloadBytes))
}
}
code, bizCode, msg, bizMsg := extractResponseStatus(parsed)
if resp.StatusCode == http.StatusOK && code == 0 && bizCode == 0 {
result := extractUploadFileResult(parsed)
result.Raw = parsed
result.RawHeaders = resp.Header.Clone()
if result.Filename == "" {
result.Filename = filename
}
if result.Bytes == 0 {
result.Bytes = int64(len(req.Data))
}
if result.Purpose == "" {
result.Purpose = purpose
}
if result.AccountID == "" {
result.AccountID = a.AccountID
}
if result.ID == "" {
return nil, errors.New("upload file succeeded without file id")
}
if err := c.waitForUploadedFile(ctx, a, result); err != nil {
return nil, err
}
return result, nil
}
config.Logger.Warn("[upload_file] failed", "status", resp.StatusCode, "code", code, "biz_code", bizCode, "msg", msg, "biz_msg", bizMsg, "account", a.AccountID, "filename", filename)
powHeader = ""
if a.UseConfigToken {
if !refreshed && shouldAttemptRefresh(resp.StatusCode, code, bizCode, msg, bizMsg) {
if c.Auth.RefreshToken(ctx, a) {
refreshed = true
attempts++
continue
}
}
if c.Auth.SwitchAccount(ctx, a) {
refreshed = false
attempts++
continue
}
}
attempts++
}
return nil, errors.New("upload file failed")
}
func buildUploadMultipartBody(filename, contentType string, data []byte) ([]byte, string, error) {
var buf bytes.Buffer
writer := multipart.NewWriter(&buf)
partHeader := textproto.MIMEHeader{}
partHeader.Set("Content-Disposition", fmt.Sprintf(`form-data; name="file"; filename=%q`, escapeMultipartFilename(filename)))
partHeader.Set("Content-Type", contentType)
part, err := writer.CreatePart(partHeader)
if err != nil {
return nil, "", err
}
if _, err := part.Write(data); err != nil {
return nil, "", err
}
if err := writer.Close(); err != nil {
return nil, "", err
}
return buf.Bytes(), writer.FormDataContentType(), nil
}
func escapeMultipartFilename(filename string) string {
filename = filepath.Base(strings.TrimSpace(filename))
filename = strings.ReplaceAll(filename, `\`, "_")
filename = strings.ReplaceAll(filename, `"`, "_")
if filename == "." || filename == "" {
return "upload.bin"
}
return filename
}
func (c *Client) doUpload(ctx context.Context, doer trans.Doer, fallback trans.Doer, url string, headers map[string]string, body []byte) (*http.Response, error) {
req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(body))
if err != nil {
return nil, err
}
for k, v := range headers {
req.Header.Set(k, v)
}
resp, err := doer.Do(req)
if err == nil {
return resp, nil
}
config.Logger.Warn("[deepseek] fingerprint upload request failed, fallback to std transport", "url", url, "error", err)
req2, reqErr := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(body))
if reqErr != nil {
return nil, reqErr
}
for k, v := range headers {
req2.Header.Set(k, v)
}
return fallback.Do(req2)
}
func extractUploadFileResult(resp map[string]any) *UploadFileResult {
result := &UploadFileResult{Status: "uploaded"}
data, _ := resp["data"].(map[string]any)
bizData, _ := data["biz_data"].(map[string]any)
searchMaps := []map[string]any{resp, data, bizData}
for _, parent := range []map[string]any{resp, data, bizData} {
if parent == nil {
continue
}
for _, key := range []string{"file", "biz_data", "data"} {
if nested, ok := parent[key].(map[string]any); ok {
searchMaps = append(searchMaps, nested)
}
}
}
for _, m := range searchMaps {
if m == nil {
continue
}
if result.ID == "" {
result.ID = firstNonEmptyString(m, "id", "file_id")
}
if result.Filename == "" {
result.Filename = firstNonEmptyString(m, "name", "filename", "file_name")
}
if result.Status == "uploaded" {
if status := firstNonEmptyString(m, "status", "file_status"); status != "" {
result.Status = status
}
}
if !result.IsImage {
result.IsImage = firstBool(m, "is_image", "isImage")
}
if result.Purpose == "" {
result.Purpose = firstNonEmptyString(m, "purpose")
}
if result.AccountID == "" {
result.AccountID = firstNonEmptyString(m, "account_id", "accountId", "owner_account_id", "ownerAccountId")
}
if result.Bytes == 0 {
result.Bytes = firstPositiveInt64(m, "bytes", "size", "file_size")
}
}
return result
}
func firstBool(m map[string]any, keys ...string) bool {
for _, key := range keys {
switch v := m[key].(type) {
case bool:
return v
case string:
switch strings.ToLower(strings.TrimSpace(v)) {
case "true", "1", "yes", "y":
return true
}
}
}
return false
}
func firstNonEmptyString(m map[string]any, keys ...string) string {
for _, key := range keys {
if v, _ := m[key].(string); strings.TrimSpace(v) != "" {
return strings.TrimSpace(v)
}
}
return ""
}
func firstPositiveInt64(m map[string]any, keys ...string) int64 {
for _, key := range keys {
if v := toInt64(m[key], 0); v > 0 {
return v
}
}
return 0
}

View File

@@ -0,0 +1,216 @@
package deepseek
import (
"context"
"encoding/base64"
"encoding/hex"
"encoding/json"
"io"
"net/http"
"strings"
"testing"
"time"
"ds2api/internal/auth"
powpkg "ds2api/pow"
)
func TestBuildUploadMultipartBodyOmitsPurposeAndIncludesFilePart(t *testing.T) {
body, contentType, err := buildUploadMultipartBody(`../demo.txt`, "text/plain", []byte("hello"))
if err != nil {
t.Fatalf("buildUploadMultipartBody error: %v", err)
}
if !strings.HasPrefix(contentType, "multipart/form-data; boundary=") {
t.Fatalf("unexpected content type: %q", contentType)
}
payload := string(body)
if strings.Contains(payload, `name="purpose"`) || strings.Contains(payload, "assistants") {
t.Fatalf("expected purpose to be omitted from payload: %q", payload)
}
if !strings.Contains(payload, `name="file"; filename="demo.txt"`) {
t.Fatalf("expected sanitized filename in payload: %q", payload)
}
if !strings.Contains(payload, "Content-Type: text/plain") {
t.Fatalf("expected file content type in payload: %q", payload)
}
if !strings.Contains(payload, "hello") {
t.Fatalf("expected file content in payload: %q", payload)
}
}
func TestExtractUploadFileResultSupportsNestedShapes(t *testing.T) {
got := extractUploadFileResult(map[string]any{
"data": map[string]any{
"biz_data": map[string]any{
"file": map[string]any{
"file_id": "file_123",
"file_name": "report.pdf",
"file_size": 99,
"status": "processed",
"purpose": "assistants",
"is_image": true,
},
},
},
})
if got.ID != "file_123" {
t.Fatalf("expected id file_123, got %#v", got)
}
if got.Filename != "report.pdf" {
t.Fatalf("expected filename report.pdf, got %#v", got)
}
if got.Bytes != 99 {
t.Fatalf("expected bytes 99, got %#v", got)
}
if got.Status != "processed" {
t.Fatalf("expected status processed, got %#v", got)
}
if got.Purpose != "assistants" {
t.Fatalf("expected purpose assistants, got %#v", got)
}
if !got.IsImage {
t.Fatalf("expected image flag true, got %#v", got)
}
}
func TestUploadFileUsesUploadTargetPowAndMultipartHeaders(t *testing.T) {
challengeHash := powpkg.DeepSeekHashV1([]byte(powpkg.BuildPrefix("salt", 1712345678) + "42"))
powResponse := `{"code":0,"msg":"ok","data":{"biz_code":0,"biz_data":{"challenge":{"algorithm":"DeepSeekHashV1","challenge":"` + hex.EncodeToString(challengeHash[:]) + `","salt":"salt","expire_at":1712345678,"difficulty":1000,"signature":"sig","target_path":"` + DeepSeekUploadTargetPath + `"}}}}`
uploadResponse := `{"code":0,"msg":"ok","data":{"biz_code":0,"biz_data":{"file":{"file_id":"file_789","filename":"demo.txt","bytes":5,"status":"processed","purpose":"assistants","is_image":false}}}}`
var seenPow string
var seenTargetPath string
var seenContentType string
var seenFileSize string
var seenBody string
call := 0
client := &Client{
regular: doerFunc(func(req *http.Request) (*http.Response, error) {
call++
bodyBytes, _ := io.ReadAll(req.Body)
switch call {
case 1:
seenTargetPath = string(bodyBytes)
return &http.Response{StatusCode: http.StatusOK, Header: make(http.Header), Body: io.NopCloser(strings.NewReader(powResponse)), Request: req}, nil
case 2:
seenPow = req.Header.Get("x-ds-pow-response")
seenContentType = req.Header.Get("Content-Type")
seenFileSize = req.Header.Get("x-file-size")
seenBody = string(bodyBytes)
return &http.Response{StatusCode: http.StatusOK, Header: make(http.Header), Body: io.NopCloser(strings.NewReader(uploadResponse)), Request: req}, nil
default:
t.Fatalf("unexpected request count %d", call)
return nil, nil
}
}),
fallback: &http.Client{Transport: roundTripperFunc(func(req *http.Request) (*http.Response, error) {
return nil, nil
})},
maxRetries: 1,
}
result, err := client.UploadFile(context.Background(), &auth.RequestAuth{DeepSeekToken: "token", TriedAccounts: map[string]bool{}}, UploadFileRequest{
Filename: "demo.txt",
ContentType: "text/plain",
Purpose: "assistants",
Data: []byte("hello"),
}, 1)
if err != nil {
t.Fatalf("UploadFile error: %v", err)
}
if result.ID != "file_789" {
t.Fatalf("expected uploaded file id file_789, got %#v", result)
}
if !strings.Contains(seenTargetPath, `"target_path":"`+DeepSeekUploadTargetPath+`"`) {
t.Fatalf("expected upload target_path in pow request, got %q", seenTargetPath)
}
if strings.TrimSpace(seenPow) == "" {
t.Fatal("expected x-ds-pow-response header")
}
rawPow, err := base64.StdEncoding.DecodeString(seenPow)
if err != nil {
t.Fatalf("decode pow header failed: %v", err)
}
var powHeader map[string]any
if err := json.Unmarshal(rawPow, &powHeader); err != nil {
t.Fatalf("unmarshal pow header failed: %v", err)
}
if powHeader["target_path"] != DeepSeekUploadTargetPath {
t.Fatalf("expected pow target_path %q, got %#v", DeepSeekUploadTargetPath, powHeader["target_path"])
}
if seenFileSize != "5" {
t.Fatalf("expected x-file-size=5, got %q", seenFileSize)
}
if !strings.HasPrefix(seenContentType, "multipart/form-data; boundary=") {
t.Fatalf("expected multipart content type, got %q", seenContentType)
}
if !strings.Contains(seenBody, `name="file"; filename="demo.txt"`) {
t.Fatalf("expected file part in upload body: %q", seenBody)
}
}
func TestUploadFileWaitsForProcessedFetchFiles(t *testing.T) {
oldSleep := fileReadySleep
fileReadySleep = func(time.Duration) {}
defer func() { fileReadySleep = oldSleep }()
challengeHash := powpkg.DeepSeekHashV1([]byte(powpkg.BuildPrefix("salt", 1712345678) + "42"))
powResponse := `{"code":0,"msg":"ok","data":{"biz_code":0,"biz_data":{"challenge":{"algorithm":"DeepSeekHashV1","challenge":"` + hex.EncodeToString(challengeHash[:]) + `","salt":"salt","expire_at":1712345678,"difficulty":1000,"signature":"sig","target_path":"` + DeepSeekUploadTargetPath + `"}}}}`
uploadResponse := `{"code":0,"msg":"ok","data":{"biz_code":0,"biz_data":{"file":{"file_id":"file_789","filename":"demo.txt","bytes":5,"status":"PENDING","purpose":"assistants","is_image":false}}}}`
pendingFetchResponse := `{"code":0,"msg":"ok","data":{"biz_code":0,"biz_data":{"files":[{"file_id":"file_789","filename":"demo.txt","bytes":5,"status":"PENDING","purpose":"assistants","is_image":false}]}}}`
processedFetchResponse := `{"code":0,"msg":"ok","data":{"biz_code":0,"biz_data":{"files":[{"file_id":"file_789","filename":"demo.txt","bytes":5,"status":"processed","purpose":"assistants","is_image":true}]}}}`
var call int
client := &Client{
regular: doerFunc(func(req *http.Request) (*http.Response, error) {
call++
switch call {
case 1:
bodyBytes, _ := io.ReadAll(req.Body)
if !strings.Contains(string(bodyBytes), `"target_path":"`+DeepSeekUploadTargetPath+`"`) {
t.Fatalf("expected pow target path request, got %s", string(bodyBytes))
}
return &http.Response{StatusCode: http.StatusOK, Header: make(http.Header), Body: io.NopCloser(strings.NewReader(powResponse)), Request: req}, nil
case 2:
return &http.Response{StatusCode: http.StatusOK, Header: make(http.Header), Body: io.NopCloser(strings.NewReader(uploadResponse)), Request: req}, nil
case 3, 4:
if req.Method != http.MethodGet {
t.Fatalf("expected GET fetch request, got %s", req.Method)
}
if req.URL.Path != "/api/v0/file/fetch_files" {
t.Fatalf("expected fetch files path /api/v0/file/fetch_files, got %q", req.URL.Path)
}
if got := req.URL.Query().Get("file_ids"); got != "file_789" {
t.Fatalf("expected file_ids=file_789, got %q", got)
}
respBody := pendingFetchResponse
if call == 4 {
respBody = processedFetchResponse
}
return &http.Response{StatusCode: http.StatusOK, Header: make(http.Header), Body: io.NopCloser(strings.NewReader(respBody)), Request: req}, nil
default:
t.Fatalf("unexpected request count %d", call)
return nil, nil
}
}),
fallback: &http.Client{Transport: roundTripperFunc(func(req *http.Request) (*http.Response, error) { return nil, nil })},
maxRetries: 1,
}
result, err := client.UploadFile(context.Background(), &auth.RequestAuth{DeepSeekToken: "token", TriedAccounts: map[string]bool{}}, UploadFileRequest{
Filename: "demo.txt",
ContentType: "text/plain",
Purpose: "assistants",
Data: []byte("hello"),
}, 1)
if err != nil {
t.Fatalf("UploadFile error: %v", err)
}
if result.ID != "file_789" {
t.Fatalf("expected uploaded file id file_789, got %#v", result)
}
if result.Status != "processed" {
t.Fatalf("expected final status processed, got %#v", result.Status)
}
if call != 4 {
t.Fatalf("expected 4 requests, got %d", call)
}
}

View File

@@ -12,18 +12,22 @@ const (
DeepSeekCreatePowURL = "https://chat.deepseek.com/api/v0/chat/create_pow_challenge"
DeepSeekCompletionURL = "https://chat.deepseek.com/api/v0/chat/completion"
DeepSeekContinueURL = "https://chat.deepseek.com/api/v0/chat/continue"
DeepSeekUploadFileURL = "https://chat.deepseek.com/api/v0/file/upload_file"
DeepSeekFetchFilesURL = "https://chat.deepseek.com/api/v0/file/fetch_files"
DeepSeekFetchSessionURL = "https://chat.deepseek.com/api/v0/chat_session/fetch_page"
DeepSeekDeleteSessionURL = "https://chat.deepseek.com/api/v0/chat_session/delete"
DeepSeekDeleteAllSessionsURL = "https://chat.deepseek.com/api/v0/chat_session/delete_all"
DeepSeekCompletionTargetPath = "/api/v0/chat/completion"
DeepSeekUploadTargetPath = "/api/v0/file/upload_file"
)
var defaultBaseHeaders = map[string]string{
"Host": "chat.deepseek.com",
-"User-Agent": "DeepSeek/1.6.11 Android/35",
+"User-Agent": "DeepSeek/1.8.0 Android/35",
"Accept": "application/json",
"Content-Type": "application/json",
"x-client-platform": "android",
-"x-client-version": "1.6.11",
+"x-client-version": "1.8.0",
"x-client-locale": "zh_CN",
"accept-charset": "UTF-8",
}

View File

@@ -1,11 +1,10 @@
{
"base_headers": {
"Host": "chat.deepseek.com",
-"User-Agent": "DeepSeek/1.6.11 Android/35",
+"User-Agent": "DeepSeek/1.8.0 Android/35",
"Accept": "application/json",
"Content-Type": "application/json",
"x-client-platform": "android",
-"x-client-version": "1.6.11",
+"x-client-version": "1.8.0",
"x-client-locale": "zh_CN",
"accept-charset": "UTF-8"
},

View File

@@ -5,3 +5,7 @@ import "ds2api/internal/prompt"
func MessagesPrepare(messages []map[string]any) string {
return prompt.MessagesPrepare(messages)
}
func MessagesPrepareWithThinking(messages []map[string]any, thinkingEnabled bool) string {
return prompt.MessagesPrepareWithThinking(messages, thinkingEnabled)
}

View File

@@ -5,11 +5,11 @@ const path = require('path');
const DEFAULT_BASE_HEADERS = Object.freeze({
Host: 'chat.deepseek.com',
-'User-Agent': 'DeepSeek/1.6.11 Android/35',
+'User-Agent': 'DeepSeek/1.8.0 Android/35',
Accept: 'application/json',
'Content-Type': 'application/json',
'x-client-platform': 'android',
-'x-client-version': '1.6.11',
+'x-client-version': '1.8.0',
'x-client-locale': 'zh_CN',
'accept-charset': 'UTF-8',
});

View File

@@ -10,6 +10,7 @@ import (
var markdownImagePattern = regexp.MustCompile(`!\[(.*?)\]\((.*?)\)`)
const (
beginSentenceMarker = "<begin▁of▁sentence>"
systemMarker = "<System>"
userMarker = "<User>"
assistantMarker = "<Assistant>"
@@ -17,9 +18,15 @@ const (
endSentenceMarker = "<end▁of▁sentence>"
endToolResultsMarker = "<end▁of▁toolresults>"
endInstructionsMarker = "<end▁of▁instructions>"
openThinkMarker = "<think>"
closeThinkMarker = "</think>"
)
func MessagesPrepare(messages []map[string]any) string {
return MessagesPrepareWithThinking(messages, false)
}
func MessagesPrepareWithThinking(messages []map[string]any, thinkingEnabled bool) string {
type block struct {
Role string
Text string
@@ -41,11 +48,14 @@ func MessagesPrepare(messages []map[string]any) string {
}
merged = append(merged, msg)
}
-parts := make([]string, 0, len(merged))
+parts := make([]string, 0, len(merged)+2)
parts = append(parts, beginSentenceMarker)
lastRole := ""
for _, m := range merged {
lastRole = m.Role
switch m.Role {
case "assistant":
-parts = append(parts, formatRoleBlock(assistantMarker, m.Text, endSentenceMarker))
+parts = append(parts, formatRoleBlock(assistantMarker, closeThinkMarker+m.Text, endSentenceMarker))
case "tool":
if strings.TrimSpace(m.Text) != "" {
parts = append(parts, formatRoleBlock(toolMarker, m.Text, endToolResultsMarker))
@@ -62,6 +72,13 @@ func MessagesPrepare(messages []map[string]any) string {
}
}
}
if lastRole != "assistant" {
thinkPrefix := closeThinkMarker
if thinkingEnabled {
thinkPrefix = openThinkMarker
}
parts = append(parts, assistantMarker+thinkPrefix)
}
out := strings.Join(parts, "\n\n")
return markdownImagePattern.ReplaceAllString(out, `[${1}](${2})`)
}

View File

@@ -32,13 +32,16 @@ func TestMessagesPrepareUsesTurnSuffixes(t *testing.T) {
{"role": "assistant", "content": "Answer"},
}
got := MessagesPrepare(messages)
if !strings.HasPrefix(got, "<begin▁of▁sentence>") {
t.Fatalf("expected begin-of-sentence marker, got %q", got)
}
if !strings.Contains(got, "<System>\nSystem rule<end▁of▁instructions>") {
t.Fatalf("expected system instructions suffix, got %q", got)
}
if !strings.Contains(got, "<User>\nQuestion<end▁of▁sentence>") {
t.Fatalf("expected user sentence suffix, got %q", got)
}
-if !strings.Contains(got, "<Assistant>\nAnswer<end▁of▁sentence>") {
+if !strings.Contains(got, "<Assistant>\n</think>Answer<end▁of▁sentence>") {
t.Fatalf("expected assistant sentence suffix, got %q", got)
}
}
@@ -51,3 +54,11 @@ func TestNormalizeContentArrayFallsBackToContentWhenTextEmpty(t *testing.T) {
t.Fatalf("expected fallback to content when text is empty, got %q", got)
}
}
func TestMessagesPrepareWithThinkingEndsWithOpenThink(t *testing.T) {
messages := []map[string]any{{"role": "user", "content": "Question"}}
got := MessagesPrepareWithThinking(messages, true)
if !strings.HasSuffix(got, "<Assistant><think>") {
t.Fatalf("expected thinking suffix, got %q", got)
}
}

View File

@@ -53,6 +53,10 @@ func (r *Runner) caseModelsOpenAI(ctx context.Context, cc *caseContext) error {
ids := extractModelIDs(resp.Body)
cc.assert("has_deepseek_chat", contains(ids, "deepseek-chat"), strings.Join(ids, ","))
cc.assert("has_deepseek_reasoner", contains(ids, "deepseek-reasoner"), strings.Join(ids, ","))
cc.assert("has_deepseek_expert_chat", contains(ids, "deepseek-expert-chat"), strings.Join(ids, ","))
cc.assert("has_deepseek_expert_reasoner", contains(ids, "deepseek-expert-reasoner"), strings.Join(ids, ","))
cc.assert("has_deepseek_vision_chat", contains(ids, "deepseek-vision-chat"), strings.Join(ids, ","))
cc.assert("has_deepseek_vision_reasoner", contains(ids, "deepseek-vision-reasoner"), strings.Join(ids, ","))
return nil
}

View File

@@ -12,7 +12,7 @@ func TestMessagesPrepareBasic(t *testing.T) {
if got == "" {
t.Fatal("expected non-empty prompt")
}
-if got != "<User>\nHello<end▁of▁sentence>" {
+if got != "<begin▁of▁sentence>\n\n<User>\nHello<end▁of▁sentence>\n\n<Assistant></think>" {
t.Fatalf("unexpected prompt: %q", got)
}
}
@@ -29,10 +29,13 @@ func TestMessagesPrepareRoles(t *testing.T) {
if !contains(got, "<System>\nYou are helper<end▁of▁instructions>\n\n<User>\nHi<end▁of▁sentence>") {
t.Fatalf("expected system/user separation in %q", got)
}
-if !contains(got, "<User>\nHi<end▁of▁sentence>\n\n<Assistant>\nHello<end▁of▁sentence>") {
+if !contains(got, "<begin▁of▁sentence>") {
+t.Fatalf("expected begin marker in %q", got)
+}
+if !contains(got, "<User>\nHi<end▁of▁sentence>\n\n<Assistant>\n</think>Hello<end▁of▁sentence>") {
t.Fatalf("expected user/assistant separation in %q", got)
}
-if !contains(got, "<Assistant>\nHello<end▁of▁sentence>\n\n<Tool>\nSearch results<end▁of▁toolresults>") {
+if !contains(got, "<Assistant>\n</think>Hello<end▁of▁sentence>\n\n<Tool>\nSearch results<end▁of▁toolresults>") {
t.Fatalf("expected assistant/tool separation in %q", got)
}
if !contains(got, "<Tool>\nSearch results<end▁of▁toolresults>\n\n<User>\nHow are you<end▁of▁sentence>") {
@@ -74,7 +77,7 @@ func TestMessagesPrepareArrayTextVariants(t *testing.T) {
},
}
got := MessagesPrepare(messages)
-if got != "<User>\nline1\nline2<end▁of▁sentence>" {
+if got != "<begin▁of▁sentence>\n\n<User>\nline1\nline2<end▁of▁sentence>\n\n<Assistant></think>" {
t.Fatalf("unexpected content from text variants: %q", got)
}
}

View File

@@ -1,5 +1,7 @@
package util
import "ds2api/internal/config"
type StandardRequest struct {
Surface string
RequestedModel string
@@ -12,6 +14,7 @@ type StandardRequest struct {
Stream bool
Thinking bool
Search bool
RefFileIDs []string
PassThrough map[string]any
}
@@ -51,11 +54,27 @@ func (p ToolChoicePolicy) Allows(name string) bool {
}
func (r StandardRequest) CompletionPayload(sessionID string) map[string]any {
modelID := r.ResolvedModel
if modelID == "" {
modelID = r.RequestedModel
}
modelType := "default"
if resolvedType, ok := config.GetModelType(modelID); ok {
modelType = resolvedType
}
refFileIDs := make([]any, 0, len(r.RefFileIDs))
for _, fileID := range r.RefFileIDs {
if fileID == "" {
continue
}
refFileIDs = append(refFileIDs, fileID)
}
payload := map[string]any{
"chat_session_id": sessionID,
"model_type": modelType,
"parent_message_id": nil,
"prompt": r.FinalPrompt,
-"ref_file_ids": []any{},
+"ref_file_ids": refFileIDs,
"thinking_enabled": r.Thinking,
"search_enabled": r.Search,
}

View File

@@ -0,0 +1,57 @@
package util
import "testing"
func TestStandardRequestCompletionPayloadSetsModelTypeFromResolvedModel(t *testing.T) {
tests := []struct {
name string
model string
thinking bool
search bool
modelType string
}{
{name: "default", model: "deepseek-chat", thinking: false, search: false, modelType: "default"},
{name: "expert", model: "deepseek-expert-reasoner", thinking: true, search: false, modelType: "expert"},
{name: "vision", model: "deepseek-vision-chat-search", thinking: false, search: true, modelType: "vision"},
}
for _, tc := range tests {
t.Run(tc.name, func(t *testing.T) {
req := StandardRequest{
ResolvedModel: tc.model,
FinalPrompt: "hello",
Thinking: tc.thinking,
Search: tc.search,
RefFileIDs: []string{"file-a", "file-b"},
PassThrough: map[string]any{
"temperature": 0.3,
},
}
payload := req.CompletionPayload("session-123")
if got := payload["model_type"]; got != tc.modelType {
t.Fatalf("expected model_type %s, got %#v", tc.modelType, got)
}
if got := payload["chat_session_id"]; got != "session-123" {
t.Fatalf("unexpected chat_session_id: %#v", got)
}
if got := payload["thinking_enabled"]; got != tc.thinking {
t.Fatalf("unexpected thinking_enabled: %#v", got)
}
if got := payload["search_enabled"]; got != tc.search {
t.Fatalf("unexpected search_enabled: %#v", got)
}
if got := payload["temperature"]; got != 0.3 {
t.Fatalf("expected passthrough temperature, got %#v", got)
}
refFileIDs, ok := payload["ref_file_ids"].([]any)
if !ok {
t.Fatalf("expected ref_file_ids slice, got %#v", payload["ref_file_ids"])
}
if len(refFileIDs) != 2 || refFileIDs[0] != "file-a" || refFileIDs[1] != "file-b" {
t.Fatalf("unexpected ref_file_ids: %#v", refFileIDs)
}
})
}
}

View File

@@ -162,7 +162,7 @@ func TestMessagesPrepareMergesConsecutiveSameRole(t *testing.T) {
{"role": "user", "content": "World"},
}
got := MessagesPrepare(messages)
-if !strings.HasPrefix(got, "<User>") {
+if !strings.HasPrefix(got, "<begin▁of▁sentence>") {
t.Fatalf("expected user marker at the start, got %q", got)
}
if !strings.Contains(got, "Hello") || !strings.Contains(got, "World") {
@@ -193,7 +193,7 @@ func TestMessagesPrepareAssistantMarkers(t *testing.T) {
if strings.Count(got, "<end▁of▁sentence>") != 2 {
t.Fatalf("expected both turns to be terminated, got %q", got)
}
-if !strings.Contains(got, "<Assistant>\nHello!<end▁of▁sentence>") {
+if !strings.Contains(got, "<Assistant>\n</think>Hello!<end▁of▁sentence>") {
t.Fatalf("expected assistant EOS suffix, got %q", got)
}
if strings.Contains(got, "<system_instructions>") {

View File

@@ -1,28 +0,0 @@
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"ds2api": {
"npm": "@ai-sdk/openai-compatible",
"name": "DS2API",
"options": {
"baseURL": "http://localhost:5001/v1",
"apiKey": "your-api-key"
},
"models": {
"gpt-4o": {
"name": "GPT-4o (aliased to deepseek-chat)"
},
"gpt-5-codex": {
"name": "GPT-5 Codex (aliased to deepseek-reasoner)"
},
"deepseek-chat": {
"name": "DeepSeek Chat (DS2API)"
},
"deepseek-reasoner": {
"name": "DeepSeek Reasoner (DS2API)"
}
}
}
},
"model": "ds2api/gpt-5-codex"
}

View File

@@ -14,6 +14,8 @@ export default function ApiTesterContainer({ config, onMessage, authFetch }) {
setModel,
message,
setMessage,
attachedFiles,
setAttachedFiles,
apiKey,
setApiKey,
selectedAccount,
@@ -52,6 +54,14 @@ export default function ApiTesterContainer({ config, onMessage, authFetch }) {
{ id: 'deepseek-reasoner', name: 'deepseek-reasoner', icon: 'Cpu', desc: t('apiTester.models.reasoner'), color: 'text-amber-600' },
{ id: 'deepseek-chat-search', name: 'deepseek-chat-search', icon: 'SearchIcon', desc: t('apiTester.models.chatSearch'), color: 'text-cyan-500' },
{ id: 'deepseek-reasoner-search', name: 'deepseek-reasoner-search', icon: 'SearchIcon', desc: t('apiTester.models.reasonerSearch'), color: 'text-cyan-600' },
{ id: 'deepseek-expert-chat', name: 'deepseek-expert-chat', icon: 'MessageSquare', desc: t('apiTester.models.expertChat'), color: 'text-emerald-500' },
{ id: 'deepseek-expert-reasoner', name: 'deepseek-expert-reasoner', icon: 'Cpu', desc: t('apiTester.models.expertReasoner'), color: 'text-emerald-600' },
{ id: 'deepseek-expert-chat-search', name: 'deepseek-expert-chat-search', icon: 'SearchIcon', desc: t('apiTester.models.expertChatSearch'), color: 'text-teal-500' },
{ id: 'deepseek-expert-reasoner-search', name: 'deepseek-expert-reasoner-search', icon: 'SearchIcon', desc: t('apiTester.models.expertReasonerSearch'), color: 'text-teal-600' },
{ id: 'deepseek-vision-chat', name: 'deepseek-vision-chat', icon: 'MessageSquare', desc: t('apiTester.models.visionChat'), color: 'text-violet-500' },
{ id: 'deepseek-vision-reasoner', name: 'deepseek-vision-reasoner', icon: 'Cpu', desc: t('apiTester.models.visionReasoner'), color: 'text-violet-600' },
{ id: 'deepseek-vision-chat-search', name: 'deepseek-vision-chat-search', icon: 'SearchIcon', desc: t('apiTester.models.visionChatSearch'), color: 'text-fuchsia-500' },
{ id: 'deepseek-vision-reasoner-search', name: 'deepseek-vision-reasoner-search', icon: 'SearchIcon', desc: t('apiTester.models.visionReasonerSearch'), color: 'text-fuchsia-600' },
]
const { runTest, stopGeneration } = useChatStreamClient({
@@ -62,6 +72,7 @@ export default function ApiTesterContainer({ config, onMessage, authFetch }) {
effectiveKey,
selectedAccount,
streamingMode,
attachedFiles,
abortControllerRef,
setLoading,
setIsStreaming,
@@ -71,7 +82,7 @@ export default function ApiTesterContainer({ config, onMessage, authFetch }) {
})
return (
-<div className={clsx('flex flex-col lg:grid lg:grid-cols-12 gap-6 h-[calc(100vh-140px)]')}>
+<div className={clsx('flex flex-col lg:grid lg:grid-cols-12 gap-6 h-[calc(100vh-140px)] min-h-0')}>
<ConfigPanel
t={t}
configExpanded={configExpanded}
@@ -96,6 +107,12 @@ export default function ApiTesterContainer({ config, onMessage, authFetch }) {
t={t}
message={message}
setMessage={setMessage}
attachedFiles={attachedFiles}
setAttachedFiles={setAttachedFiles}
setSelectedAccount={setSelectedAccount}
effectiveKey={effectiveKey}
selectedAccount={selectedAccount}
onMessage={onMessage}
response={response}
isStreaming={isStreaming}
loading={loading}

View File

@@ -1,10 +1,19 @@
-import { Bot, Loader2, Send, Square, User, Zap } from 'lucide-react'
+import { Bot, Loader2, Send, Square, User, Zap, Paperclip, X, FileIcon } from 'lucide-react'
import clsx from 'clsx'
import { useRef, useState } from 'react'
import { getAttachedFileAccountIds } from './fileAccountBinding'
export default function ChatPanel({
t,
message,
setMessage,
attachedFiles = [],
setAttachedFiles,
setSelectedAccount,
effectiveKey,
selectedAccount,
onMessage,
response,
isStreaming,
loading,
@@ -13,6 +22,69 @@ export default function ChatPanel({
onRunTest,
onStopGeneration,
}) {
const fileInputRef = useRef(null)
const [uploadingFiles, setUploadingFiles] = useState(false)
const handleFileSelect = async (e) => {
const files = Array.from(e.target.files)
if (files.length === 0) return
if (!effectiveKey) {
onMessage('error', t('apiTester.missingApiKey') || 'Missing API Key')
return
}
setUploadingFiles(true)
const initialSelectedAccount = String(selectedAccount || '').trim()
let boundAccount = initialSelectedAccount
for (const file of files) {
const formData = new FormData()
formData.append('file', file)
formData.append('purpose', 'assistants')
const headers = {
'Authorization': `Bearer ${effectiveKey}`,
}
if (boundAccount) {
headers['X-Ds2-Target-Account'] = boundAccount
}
try {
const res = await fetch('/v1/files', {
method: 'POST',
headers,
body: formData
})
if (!res.ok) {
const err = await res.text()
onMessage('error', err || 'File upload failed')
continue
}
const data = await res.json()
setAttachedFiles(prev => [...prev, data])
const uploadedAccount = String(data?.account_id || '').trim()
if (!boundAccount && uploadedAccount) {
boundAccount = uploadedAccount
}
} catch (error) {
onMessage('error', error.message || 'Network error during upload')
}
}
setUploadingFiles(false)
if (!initialSelectedAccount && boundAccount && setSelectedAccount) {
setSelectedAccount(boundAccount)
}
if (fileInputRef.current) {
fileInputRef.current.value = ''
}
}
const removeFile = (id) => {
setAttachedFiles(prev => prev.filter(f => f.id !== id))
}
const attachmentAccountIds = getAttachedFileAccountIds(attachedFiles)
const attachmentAccountId = attachmentAccountIds.length === 1 ? attachmentAccountIds[0] : ''
return (
<div className="lg:col-span-9 flex flex-col bg-card border border-border rounded-xl shadow-sm overflow-hidden min-h-0 flex-1 relative">
<div className="flex-1 overflow-y-auto p-4 lg:p-6 space-y-8 custom-scrollbar scroll-smooth">
@@ -61,7 +133,9 @@ export default function ChatPanel({
)}
<div className="text-sm leading-7 text-foreground whitespace-pre-wrap">
-{streamingContent || response?.choices?.[0]?.message?.content || (response?.error && <span className="text-destructive font-medium">{response.error}</span>) || (loading && <span className="text-muted-foreground italic">{t('apiTester.generating')}</span>)}
+{response?.success === false
+? <span className="text-destructive font-medium">{response.error || t('apiTester.requestFailed')}</span>
+: (streamingContent || response?.choices?.[0]?.message?.content || (loading && <span className="text-muted-foreground italic">{t('apiTester.generating')}</span>))}
{isStreaming && <span className="inline-block w-1.5 h-4 bg-primary ml-1 align-middle animate-pulse" />}
</div>
</div>
@@ -70,9 +144,52 @@ export default function ChatPanel({
</div>
<div className="p-4 lg:p-6 border-t border-border bg-card">
{attachedFiles.length > 0 && (
<div className="max-w-4xl mx-auto flex flex-wrap gap-2 mb-3">
{attachedFiles.map(file => (
<div key={file.id} className="flex items-center gap-2 bg-secondary/50 border border-border rounded-md px-2 py-1 text-xs text-secondary-foreground">
<FileIcon className="w-3 h-3 text-muted-foreground" />
<span className="truncate max-w-[150px]">{file.filename || file.id}</span>
<button
onClick={() => removeFile(file.id)}
className="text-muted-foreground hover:text-destructive transition-colors ml-1"
>
<X className="w-3 h-3" />
</button>
</div>
))}
</div>
)}
{attachmentAccountIds.length > 1 && (
<div className="max-w-4xl mx-auto mb-3 text-[11px] text-amber-600">
{t('apiTester.fileAccountConflict')}
</div>
)}
{attachmentAccountId && (
<div className="max-w-4xl mx-auto mb-3 text-[11px] text-muted-foreground">
{t('apiTester.attachmentAccountHint', { account: attachmentAccountId })}
</div>
)}
<div className="max-w-4xl mx-auto relative group">
<input
type="file"
className="hidden"
ref={fileInputRef}
multiple
onChange={handleFileSelect}
/>
<div className="absolute left-2 bottom-2 z-10">
<button
onClick={() => fileInputRef.current?.click()}
disabled={uploadingFiles || isStreaming}
className="p-2 text-muted-foreground hover:text-primary transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
title="Attach files"
>
{uploadingFiles ? <Loader2 className="w-4 h-4 animate-spin" /> : <Paperclip className="w-4 h-4" />}
</button>
</div>
<textarea
-className="w-full bg-[#09090b] border border-border rounded-xl pl-4 pr-12 py-3 text-sm focus:ring-2 focus:ring-primary/20 focus:border-primary transition-all resize-none custom-scrollbar placeholder:text-muted-foreground/50 text-foreground shadow-inner"
+className="w-full bg-[#09090b] border border-border rounded-xl pl-12 pr-12 py-3 text-sm focus:ring-2 focus:ring-primary/20 focus:border-primary transition-all resize-none custom-scrollbar placeholder:text-muted-foreground/50 text-foreground shadow-inner"
placeholder={t('apiTester.enterMessage')}
rows={1}
style={{ minHeight: '52px' }}
@@ -81,11 +198,13 @@ export default function ChatPanel({
onKeyDown={e => {
if (e.key === 'Enter' && !e.shiftKey) {
e.preventDefault()
-onRunTest()
+if (!loading && !uploadingFiles && (message.trim() || attachedFiles.length > 0)) {
+onRunTest()
+}
}
}}
/>
-<div className="absolute right-2 bottom-2">
+<div className="absolute right-2 bottom-2 z-10">
{loading && isStreaming ? (
<button onClick={onStopGeneration} className="p-2 text-muted-foreground hover:text-destructive transition-colors">
<Square className="w-4 h-4 fill-current" />
@@ -93,7 +212,7 @@ export default function ChatPanel({
) : (
<button
onClick={onRunTest}
-disabled={loading || !message.trim()}
+disabled={loading || uploadingFiles || (!message.trim() && attachedFiles.length === 0)}
className="p-2 text-primary hover:text-primary/80 transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
>
{loading ? <Loader2 className="w-4 h-4 animate-spin" /> : <Send className="w-4 h-4" />}

View File

@@ -38,13 +38,15 @@ export default function ConfigPanel({
ToggleLeft,
ToggleRight,
}
const selectedModel = models.find(m => m.id === model) || models[0]
const SelectedModelIcon = selectedModel ? (iconMap[selectedModel.icon] || MessageSquare) : MessageSquare
return (
<div className={clsx(
-"lg:col-span-3 flex flex-col transition-all duration-300 ease-in-out z-20",
+"lg:col-span-3 flex flex-col transition-all duration-300 ease-in-out z-20 min-h-0",
configExpanded ? "h-auto" : "h-14 lg:h-full"
)}>
-<div className="bg-card border border-border rounded-xl flex flex-col h-full shadow-sm">
+<div className="bg-card border border-border rounded-xl flex flex-col h-full shadow-sm min-h-0 overflow-hidden">
<button
onClick={() => setConfigExpanded(!configExpanded)}
className="lg:hidden flex items-center justify-between p-4 w-full bg-muted/20 hover:bg-muted/30 transition-colors"
@@ -61,49 +63,51 @@ export default function ConfigPanel({
</button>
<div className={clsx(
-"p-4 space-y-6 overflow-y-auto custom-scrollbar flex-1",
-!configExpanded && "hidden lg:block"
+"p-4 flex flex-col gap-5",
+!configExpanded && "hidden lg:flex"
)}>
-<div className="space-y-3">
+<div className="space-y-2 shrink-0">
<label className="text-[11px] font-semibold text-muted-foreground uppercase tracking-wider ml-0.5">{t('apiTester.modelLabel')}</label>
<div className="grid grid-cols-1 gap-2">
{models.map(m => {
const Icon = iconMap[m.icon] || MessageSquare
return (
<button
key={m.id}
onClick={() => setModel(m.id)}
className={clsx(
"group relative flex items-start gap-3 p-3 rounded-lg border text-left transition-all duration-200",
model === m.id
? "bg-secondary border-primary/50 shadow-sm"
: "bg-transparent border-transparent hover:bg-muted"
)}
>
<div className={clsx(
"p-1.5 rounded-md shrink-0 transition-colors",
model === m.id ? m.color : "text-muted-foreground group-hover:text-foreground"
)}>
<Icon className="w-4 h-4" />
</div>
<div className="min-w-0 flex-1">
<div className={clsx("font-medium text-sm", model === m.id ? "text-foreground" : "text-foreground/80") }>
{m.name}
</div>
<div className="text-[11px] text-muted-foreground mt-0.5">{m.desc}</div>
</div>
{model === m.id && (
<div className={clsx("absolute top-3 right-3", m.color)}>
<div className="w-1.5 h-1.5 rounded-full bg-current" />
</div>
)}
</button>
)
})}
<div className="relative">
<select
className="w-full h-11 pl-3 pr-9 bg-secondary border border-border rounded-lg text-sm appearance-none focus:outline-none focus:ring-1 focus:ring-ring focus:border-ring transition-all cursor-pointer hover:bg-muted/70 text-foreground"
value={model}
onChange={e => setModel(e.target.value)}
>
{models.map(m => (
<option key={m.id} value={m.id} className="bg-popover text-popover-foreground">
{m.name}
</option>
))}
</select>
<ChevronDown className="absolute right-2.5 top-3.5 w-4 h-4 text-muted-foreground pointer-events-none" />
</div>
{selectedModel && (
<div className="mt-3 rounded-lg border border-border bg-muted/20 p-3">
<div className="flex items-start gap-3">
<div className={clsx(
"p-2 rounded-md shrink-0 border border-border bg-background/80",
selectedModel.color
)}>
<SelectedModelIcon className="w-4 h-4" />
</div>
<div className="min-w-0 flex-1">
<div className="font-medium text-sm text-foreground truncate">
{selectedModel.name}
</div>
<div className="text-[11px] text-muted-foreground mt-1 leading-relaxed">
{selectedModel.desc}
</div>
</div>
</div>
<p className="text-[11px] text-muted-foreground/70 mt-2">
{t('apiTester.modelPickerHint')}
</p>
</div>
)}
</div>
-<div className="space-y-2">
+<div className="space-y-2 shrink-0">
<label className="text-[11px] font-semibold text-muted-foreground uppercase tracking-wider ml-0.5">{t('apiTester.streamMode')}</label>
<button
onClick={() => setStreamingMode(!streamingMode)}
@@ -124,7 +128,7 @@ export default function ConfigPanel({
</button>
</div>
-<div className="space-y-2">
+<div className="space-y-2 shrink-0">
<label className="text-[11px] font-semibold text-muted-foreground uppercase tracking-wider ml-0.5">{t('apiTester.accountSelector')}</label>
<div className="relative">
<select
@@ -147,7 +151,7 @@ export default function ConfigPanel({
</div>
</div>
-<div className="space-y-2">
+<div className="space-y-2 shrink-0">
<label className="text-[11px] font-semibold text-muted-foreground uppercase tracking-wider ml-0.5">{t('apiTester.apiKeyOptional')}</label>
<input
type="text"

View File

@@ -0,0 +1,19 @@
export function getAttachedFileAccountIds(attachedFiles = []) {
const ids = []
const seen = new Set()
for (const file of attachedFiles || []) {
const raw = file?.account_id ?? file?.accountId ?? file?.owner_account_id ?? file?.ownerAccountId ?? ''
const id = String(raw).trim()
if (!id || seen.has(id)) continue
seen.add(id)
ids.push(id)
}
return ids
}
export function getAttachedFileAccountId(attachedFiles = []) {
const ids = getAttachedFileAccountIds(attachedFiles)
return ids.length > 0 ? ids[0] : ''
}

View File

@@ -12,6 +12,7 @@ export function useApiTesterState({ t }) {
const [streamingThinking, setStreamingThinking] = useState('')
const [isStreaming, setIsStreaming] = useState(false)
const [streamingMode, setStreamingMode] = useState(true)
const [attachedFiles, setAttachedFiles] = useState([])
const [configExpanded, setConfigExpanded] = useState(false)
const abortControllerRef = useRef(null)
@@ -27,6 +28,8 @@ export function useApiTesterState({ t }) {
setModel,
message,
setMessage,
attachedFiles,
setAttachedFiles,
apiKey,
setApiKey,
selectedAccount,

View File

@@ -1,5 +1,7 @@
import { useCallback } from 'react'
import { getAttachedFileAccountIds } from './fileAccountBinding'
export function useChatStreamClient({
t,
onMessage,
@@ -8,6 +10,7 @@ export function useChatStreamClient({
effectiveKey,
selectedAccount,
streamingMode,
attachedFiles,
abortControllerRef,
setLoading,
setIsStreaming,
@@ -46,6 +49,42 @@ export function useChatStreamClient({
}
}, [t])
const resolveAttachmentAccount = useCallback(() => {
const ids = getAttachedFileAccountIds(attachedFiles)
if (ids.length > 1) {
return {
accountId: '',
error: t('apiTester.fileAccountConflict'),
}
}
return {
accountId: ids[0] || '',
error: '',
}
}, [attachedFiles, t])
const extractStreamError = useCallback((json) => {
const error = json?.error
if (!error || typeof error !== 'object') {
return null
}
const message = typeof error.message === 'string' && error.message.trim()
? error.message.trim()
: t('apiTester.requestFailed')
const rawStatus = Number(json?.status_code ?? error.status_code ?? error.http_status)
const statusCode = Number.isFinite(rawStatus) && rawStatus > 0
? rawStatus
: (error.code === 'content_filter' ? 400 : 429)
return {
message,
statusCode,
code: typeof error.code === 'string' ? error.code : '',
type: typeof error.type === 'string' ? error.type : '',
}
}, [t])
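The status-code fallback in `extractStreamError` can be illustrated standalone. The real hook resolves the fallback message via `t('apiTester.requestFailed')`; a plain string stands in for it in this sketch.

```javascript
// Standalone sketch of extractStreamError's status-code fallback. A literal
// string replaces the t('apiTester.requestFailed') lookup used in the hook.
function extractStreamError(json) {
  const error = json?.error
  if (!error || typeof error !== 'object') return null
  const message = typeof error.message === 'string' && error.message.trim()
    ? error.message.trim()
    : 'Request failed.'
  // Prefer an explicit status from any of the three fields the upstream may
  // set; otherwise map content_filter to 400 and everything else to 429.
  const rawStatus = Number(json?.status_code ?? error.status_code ?? error.http_status)
  const statusCode = Number.isFinite(rawStatus) && rawStatus > 0
    ? rawStatus
    : (error.code === 'content_filter' ? 400 : 429)
  return {
    message,
    statusCode,
    code: typeof error.code === 'string' ? error.code : '',
    type: typeof error.type === 'string' ? error.type : '',
  }
}

console.log(extractStreamError({ error: { message: 'blocked', code: 'content_filter' } }))
// → statusCode 400, since no explicit status field is present
console.log(extractStreamError({ choices: [] })) // → null for error-free chunks
```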
const runTest = useCallback(async () => {
if (!effectiveKey) {
onMessage('error', t('apiTester.missingApiKey'))
@@ -62,23 +101,48 @@ export function useChatStreamClient({
abortControllerRef.current = new AbortController()
try {
const selectedAccountId = String(selectedAccount || '').trim()
const attachmentBinding = resolveAttachmentAccount()
if (attachmentBinding.error) {
setResponse({ success: false, error: attachmentBinding.error })
onMessage('error', attachmentBinding.error)
setLoading(false)
setIsStreaming(false)
return
}
if (attachmentBinding.accountId && selectedAccountId && selectedAccountId !== attachmentBinding.accountId) {
const errorMsg = t('apiTester.fileAccountMismatch', { account: attachmentBinding.accountId })
setResponse({ success: false, error: errorMsg })
onMessage('error', errorMsg)
setLoading(false)
setIsStreaming(false)
return
}
const requestAccount = selectedAccountId || attachmentBinding.accountId
const headers = {
'Content-Type': 'application/json',
'Authorization': `Bearer ${effectiveKey}`,
}
- if (selectedAccount) {
-   headers['X-Ds2-Target-Account'] = selectedAccount
+ if (requestAccount) {
+   headers['X-Ds2-Target-Account'] = requestAccount
}
const body = {
model,
messages: [{ role: 'user', content: message }],
stream: streamingMode,
}
if (attachedFiles && attachedFiles.length > 0) {
body.file_ids = attachedFiles.map(f => f.id)
}
const endpoint = streamingMode ? '/v1/chat/completions' : '/v1/chat/completions?__go=1'
const res = await fetch(endpoint, {
method: 'POST',
headers,
- body: JSON.stringify({
-   model,
-   messages: [{ role: 'user', content: message }],
-   stream: streamingMode,
- }),
+ body: JSON.stringify(body),
signal: abortControllerRef.current.signal,
})
@@ -97,7 +161,11 @@ export function useChatStreamClient({
const reader = res.body.getReader()
const decoder = new TextDecoder()
let buffer = ''
let accumulatedThinking = ''
let accumulatedContent = ''
let streamError = null
streamLoop:
while (true) {
const { done, value } = await reader.read()
if (done) break
@@ -115,13 +183,20 @@ export function useChatStreamClient({
try {
const json = JSON.parse(dataStr)
const errorPayload = extractStreamError(json)
if (errorPayload) {
streamError = errorPayload
break streamLoop
}
const choice = json.choices?.[0]
if (choice?.delta) {
const delta = choice.delta
if (delta.reasoning_content) {
accumulatedThinking += delta.reasoning_content
setStreamingThinking(prev => prev + delta.reasoning_content)
}
if (delta.content) {
accumulatedContent += delta.content
setStreamingContent(prev => prev + delta.content)
}
}
@@ -130,11 +205,43 @@ export function useChatStreamClient({
}
}
}
if (streamError) {
await reader.cancel().catch(() => {})
setStreamingContent('')
setStreamingThinking('')
setResponse({
success: false,
status_code: streamError.statusCode,
error: streamError.message,
code: streamError.code,
type: streamError.type,
})
onMessage('error', streamError.message)
setLoading(false)
setIsStreaming(false)
return
}
setResponse({
success: true,
status_code: res.status,
choices: [{
finish_reason: 'stop',
index: 0,
message: {
role: 'assistant',
content: accumulatedContent,
reasoning_content: accumulatedThinking,
},
}],
})
onMessage('success', t('apiTester.requestSuccess', { account: requestAccount || selectedAccountId || 'Auto', time: Math.max(0, Date.now() - startedAt) }))
} else {
const data = await res.json()
setResponse({ success: true, status_code: res.status, ...data })
const elapsed = Math.max(0, Date.now() - startedAt)
- onMessage('success', t('apiTester.testSuccess', { account: selectedAccount || 'Auto', time: elapsed }))
+ onMessage('success', t('apiTester.requestSuccess', { account: requestAccount || 'Auto', time: elapsed }))
}
} catch (e) {
if (e.name === 'AbortError') {
@@ -150,11 +257,14 @@ export function useChatStreamClient({
}
}, [
abortControllerRef,
attachedFiles,
effectiveKey,
extractErrorMessage,
extractStreamError,
message,
model,
onMessage,
resolveAttachmentAccount,
selectedAccount,
setIsStreaming,
setLoading,

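The stream loop above decodes chunks into a buffer, peels off complete SSE lines, and accumulates `delta.content` and `delta.reasoning_content`. The hunk elides the exact line-splitting code; this sketch assumes standard OpenAI-style `data: {...}` framing with a `[DONE]` terminator.

```javascript
// Minimal sketch of the SSE accumulation the stream loop performs, assuming
// OpenAI-style `data: {...}` lines. Chunks append to a buffer, complete lines
// are peeled off, and delta content/reasoning is accumulated.
function feedSseChunk(state, chunkText) {
  state.buffer += chunkText
  const lines = state.buffer.split('\n')
  state.buffer = lines.pop() // keep the trailing partial line for the next chunk
  for (const line of lines) {
    if (!line.startsWith('data: ')) continue
    const dataStr = line.slice(6).trim()
    if (dataStr === '[DONE]') { state.done = true; continue }
    try {
      const json = JSON.parse(dataStr)
      const delta = json.choices?.[0]?.delta
      if (delta?.reasoning_content) state.thinking += delta.reasoning_content
      if (delta?.content) state.content += delta.content
    } catch {
      // Ignore malformed frames, as the hook's try/catch does.
    }
  }
  return state
}

const state = { buffer: '', content: '', thinking: '', done: false }
feedSseChunk(state, 'data: {"choices":[{"delta":{"reasoning_content":"hm"}}]}\n')
feedSseChunk(state, 'data: {"choices":[{"delta":{"content":"Hi"}}]}\ndata: [DONE]\n')
console.log(state.content, state.thinking, state.done) // 'Hi' 'hm' true
```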
View File

@@ -199,14 +199,24 @@
"chat": "Non-reasoning model",
"reasoner": "Reasoning model",
"chatSearch": "Non-reasoning model (with search)",
"reasonerSearch": "Reasoning model (with search)"
"reasonerSearch": "Reasoning model (with search)",
"expertChat": "Non-reasoning expert mode",
"expertReasoner": "Reasoning expert mode",
"expertChatSearch": "Non-reasoning expert mode (with search)",
"expertReasonerSearch": "Reasoning expert mode (with search)",
"visionChat": "Non-reasoning vision mode",
"visionReasoner": "Reasoning vision mode",
"visionChatSearch": "Non-reasoning vision mode (with search)",
"visionReasonerSearch": "Reasoning vision mode (with search)"
},
"missingApiKey": "Please provide an API key.",
"requestFailed": "Request failed.",
"networkError": "Network error: {error}",
"requestSuccess": "{account}: Request successful ({time}ms)",
"testSuccess": "{account}: Token refresh successful ({time}ms)",
"config": "Configuration",
"modelLabel": "Model",
"modelPickerHint": "Use the dropdown to pick a model. The list scrolls automatically.",
"streamMode": "Streaming",
"accountSelector": "Account",
"autoRandom": "🤖 Auto / Random",
@@ -215,6 +225,9 @@
"apiKeyPlaceholder": "Enter a custom key",
"modeManaged": "Managed key mode (uses account pool).",
"modeDirect": "Direct token mode (requires a valid DeepSeek token).",
"attachmentAccountHint": "Attached files are bound to account {account}. Sending will reuse the same account.",
"fileAccountConflict": "Attached files came from different accounts. Clear them and upload again under one account.",
"fileAccountMismatch": "The selected account does not match the attachment account. Switch to the bound account or clear the attachments and try again.",
"statusError": "Error",
"reasoningTrace": "Reasoning Trace",
"generating": "Generating response...",

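The locale strings above use `{account}` and `{time}` placeholders filled at call sites like `t('apiTester.requestSuccess', { account, time })`. The app's actual `t()` helper is not shown in this diff; the following is a minimal stand-in illustrating how such placeholders could be interpolated.

```javascript
// Hypothetical stand-in for the app's t() interpolation step: replaces
// {name} placeholders from a params object, leaving unknown keys intact.
function formatMessage(template, params = {}) {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in params ? String(params[key]) : match)
}

const template = '{account}: Request successful ({time}ms)'
console.log(formatMessage(template, { account: 'acc-1', time: 842 }))
// → 'acc-1: Request successful (842ms)'
```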
View File

@@ -199,14 +199,24 @@
"chat": "非思考模型",
"reasoner": "思考模型",
"chatSearch": "非思考模型 (带搜索)",
"reasonerSearch": "思考模型 (带搜索)"
"reasonerSearch": "思考模型 (带搜索)",
"expertChat": "非思考专家模式",
"expertReasoner": "思考专家模式",
"expertChatSearch": "非思考专家模式 (带搜索)",
"expertReasonerSearch": "思考专家模式 (带搜索)",
"visionChat": "非思考视觉模式",
"visionReasoner": "思考视觉模式",
"visionChatSearch": "非思考视觉模式 (带搜索)",
"visionReasonerSearch": "思考视觉模式 (带搜索)"
},
"missingApiKey": "请提供 API 密钥",
"requestFailed": "请求失败",
"networkError": "网络错误: {error}",
"requestSuccess": "{account}: 请求成功 ({time}ms)",
"testSuccess": "{account}: Token 刷新成功 ({time}ms)",
"config": "配置",
"modelLabel": "模型",
"modelPickerHint": "使用下拉列表选择模型,长列表会自动滚动。",
"streamMode": "流式模式",
"accountSelector": "选择账号",
"autoRandom": "🤖 自动 / 随机",
@@ -215,6 +225,9 @@
"apiKeyPlaceholder": "输入自定义密钥",
"modeManaged": "当前使用托管 key 模式(会走账号池)。",
"modeDirect": "当前使用直通 token 模式(需填写有效 DeepSeek token。",
"attachmentAccountHint": "附件已绑定账号:{account},发送时会自动沿用同一账号。",
"fileAccountConflict": "附件来自不同账号,请先清空后重新上传。",
"fileAccountMismatch": "当前选择的账号与附件绑定账号不一致,请切换到绑定账号或清空附件后重试。",
"statusError": "错误",
"reasoningTrace": "思维链过程",
"generating": "正在生成响应...",