Compare commits

...

40 Commits

Author SHA1 Message Date
CJACK.
184cbed3cb Merge pull request #252 from CJackHwang/dev
Merge pull request #249 from shuaihaoV/feat/deepseek-model-type-families

Add default, expert, and vision DeepSeek model families
2026-04-08 18:06:07 +08:00
CJACK.
378f99be4a Merge pull request #249 from shuaihaoV/feat/deepseek-model-type-families
Add default, expert, and vision DeepSeek model families
2026-04-08 17:53:02 +08:00
Shuaihao
ba76a2163b Add default, expert, and vision DeepSeek model families 2026-04-08 14:37:22 +08:00
CJACK.
af9c51f3a7 Merge pull request #245 from CJackHwang/dev
Merge pull request #244 from CJackHwang/codex/temporarily-switch-to-internal-usage-count

Temporarily ignore DeepSeek upstream usage fields and prefer internal token estimation
2026-04-07 21:27:32 +08:00
CJACK.
92bb25265e Merge pull request #246 from CJackHwang/codex/fix-review-comments-before-merging
Fix proxy-bound fallback behavior and redact proxy password responses
2026-04-07 21:26:13 +08:00
CJACK.
84050d87e4 fix proxy fallback binding and redact proxy password responses 2026-04-07 21:22:28 +08:00
CJACK.
c6a6f1cf4e Merge pull request #244 from CJackHwang/codex/temporarily-switch-to-internal-usage-count
Temporarily ignore DeepSeek upstream usage fields and prefer internal token estimation
2026-04-07 20:39:36 +08:00
CJACK.
f4ed10d38d disable token-mismatch gate by default in raw stream simulator 2026-04-07 20:38:29 +08:00
CJACK.
d9e65c9710 remove upstream token-usage plumbing and always estimate from content 2026-04-07 20:12:18 +08:00
CJACK.
a14e5b0847 temporarily ignore upstream token usage fields globally 2026-04-07 19:40:47 +08:00
CJACK.
b59e991ad5 Merge pull request #241 from tanaer/feat/proxy-ip-management-dev
feat: add SOCKS5/SOCKS5H proxy management and account proxy routing
2026-04-07 17:14:48 +08:00
Jason.li
c84347b625 docs: align agent rules with quality gate lint 2026-04-07 14:19:40 +08:00
Jason.li
8ae2ea10c8 feat(proxy): add proxy IP management and account routing
Add admin CRUD and connectivity checks for SOCKS5/SOCKS5H proxy nodes.

Allow accounts to bind to a proxy, route DeepSeek requests through the selected node, and expose proxy management in the admin UI.
2026-04-07 14:16:13 +08:00
CJACK.
d32765bc84 Merge pull request #240 from CJackHwang/dev
Merge pull request #239 from CJackHwang/codex/fix-escaping-issues-and-token-counting

Fix HTML-escaped tool-call args and preserve upstream token usage (stream & non-stream)
2026-04-07 13:16:49 +08:00
CJACK.
08b1344f81 Merge pull request #242 from CJackHwang/codex/fix-issues-in-pull-request-#240
fix: avoid double-decoding XML entity text in markup tool-call parsing
2026-04-07 13:16:01 +08:00
CJACK.
8b0da7b6f8 fix: avoid double XML entity decoding in toolcall parser 2026-04-07 13:14:30 +08:00
CJACK.
1c95942e5d Merge pull request #239 from CJackHwang/codex/fix-escaping-issues-and-token-counting
Fix HTML-escaped tool-call args and preserve upstream token usage (stream & non-stream)
2026-04-07 12:56:02 +08:00
CJACK.
da7c46b278 Limit HTML unescape to markup tool-call parsing 2026-04-07 12:55:06 +08:00
CJACK.
cfcca69396 Update VERSION 2026-04-07 12:46:15 +08:00
CJACK.
4475bfe92f Merge pull request #238 from CJackHwang/codex/remove-project-structure-section-from-main-document
docs: remove duplicated project structure sections from READMEs
2026-04-07 12:36:30 +08:00
CJACK.
77a401fb19 Fix tool-call HTML escaping and stabilize usage token mapping 2026-04-07 12:35:50 +08:00
CJACK.
a935f61f74 docs: remove duplicated project structure sections from READMEs 2026-04-07 12:32:52 +08:00
CJACK.
80b88b37ff Merge pull request #236 from CJackHwang/codex/review-and-reorganize-all-md-documents
docs: add architecture docs and centralize documentation index; update READMEs and API links
2026-04-07 11:55:11 +08:00
CJACK.
475c9086d2 docs: annotate folder purposes in the expanded directory tree 2026-04-07 11:51:14 +08:00
CJACK.
8cfba9c650 Merge pull request #232 from CJackHwang/dev
refactor: improve XML tool parsing robustness, update system prompt constraints, and simplify tool filtering logic
2026-04-07 11:13:44 +08:00
CJACK.
98131881ed Merge pull request #234 from CJackHwang/codex/fix-documentation-and-accumulated_token_usage
Propagate DeepSeek SSE token usage to /v1/responses and remove stale POW env docs
2026-04-07 11:02:44 +08:00
CJACK.
86ecbc89bd Preserve SSE frame delimiters when injecting Gemini usage 2026-04-07 10:59:27 +08:00
CJACK.
668b9c26bd Unify token usage pass-through on OpenAI translate pipeline 2026-04-07 10:16:23 +08:00
CJACK.
5bcea3d727 Propagate upstream token usage across Gemini usage metadata 2026-04-07 10:16:00 +08:00
CJACK.
96b8587c5b Fix token usage propagation and remove stale env docs 2026-04-07 08:27:03 +08:00
CJACK.
d09260d06f Merge pull request #233 from CJackHwang/main
Dependency upgrades
2026-04-07 07:12:40 +08:00
CJACK.
554b95d232 Merge pull request #231 from CJackHwang/dependabot/npm_and_yarn/webui/npm_and_yarn-7c6ac41456
chore(deps-dev): bump vite from 8.0.3 to 8.0.5 in /webui in the npm_and_yarn group across 1 directory
2026-04-07 07:02:53 +08:00
dependabot[bot]
b54ee05d12 chore(deps-dev): bump vite
Bumps the npm_and_yarn group with 1 update in the /webui directory: [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite).


Updates `vite` from 8.0.3 to 8.0.5
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v8.0.5/packages/vite)

---
updated-dependencies:
- dependency-name: vite
  dependency-version: 8.0.5
  dependency-type: direct:development
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-06 18:44:30 +00:00
CJACK
9968221633 refactor: improve XML tool parsing robustness, update system prompt constraints, and simplify tool filtering logic 2026-04-07 02:10:45 +08:00
CJACK
b79a13efd5 feat: support explicit prompt token tracking in SSE parsing and stream handlers 2026-04-07 01:39:27 +08:00
CJACK
da778a18fb refactor: replace WASM-based PoW with a high-performance native Go implementation and add context support for cancellation. 2026-04-07 01:20:01 +08:00
CJACK.
10921e0f84 Merge pull request #229 from CJackHwang/dev
refactor: replace WASM-based PoW solver with a native Go implementation in the pow package
2026-04-07 00:57:33 +08:00
CJACK
e7d561694a refactor: replace WASM-based PoW solver with a native Go implementation in the pow package 2026-04-07 00:10:36 +08:00
CJACK.
13687ce787 Merge pull request #227 from CJackHwang/codex/change-empty-responses-to-429
fix(openai): return 429 for empty upstream output
2026-04-06 17:11:07 +08:00
CJACK.
26aa02d4b5 fix(openai): return 429 for empty upstream output 2026-04-06 16:56:17 +08:00
115 changed files with 3547 additions and 1122 deletions

View File

@@ -79,7 +79,7 @@ jobs:
CGO_ENABLED=0 GOOS="${GOOS}" GOARCH="${GOARCH}" \
go build -trimpath -ldflags="-s -w -X ds2api/internal/version.BuildVersion=${BUILD_VERSION}" -o "${STAGE}/${BIN}" ./cmd/ds2api
cp config.example.json .env.example internal/deepseek/assets/sha3_wasm_bg.7b9ca65ddd.wasm LICENSE README.MD README.en.md "${STAGE}/"
cp config.example.json .env.example LICENSE README.MD README.en.md "${STAGE}/"
cp -R static/admin "${STAGE}/static/admin"
if [ "${GOOS}" = "windows" ]; then

23
AGENTS.md Normal file
View File

@@ -0,0 +1,23 @@
# AGENTS.md
These rules apply to all agent-made changes in this repository.
## PR Gate
- Before opening or updating a PR, run the same local gates as `.github/workflows/quality-gates.yml`.
- Required commands:
- `./scripts/lint.sh`
- `./tests/scripts/check-refactor-line-gate.sh`
- `./tests/scripts/run-unit-all.sh`
- `npm run build --prefix webui`
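The four gate commands above can be wrapped in a small runner so they always execute in the same order and stop at the first failure. The sketch below is illustrative only: the script paths come from this file, but `run_gates` and the `DRY_RUN` toggle are inventions of this example, not part of the repo.

```shell
#!/usr/bin/env sh
# Run each required PR gate in order; stop at the first failure.
# With DRY_RUN=1 the commands are only printed, which is handy for
# checking the list outside the repository.
set -u

run_gates() {
  for cmd in \
    "./scripts/lint.sh" \
    "./tests/scripts/check-refactor-line-gate.sh" \
    "./tests/scripts/run-unit-all.sh" \
    "npm run build --prefix webui"
  do
    if [ "${DRY_RUN:-0}" = "1" ]; then
      echo "would run: $cmd"
    else
      echo "running: $cmd"
      $cmd || { echo "gate failed: $cmd" >&2; return 1; }
    fi
  done
}

DRY_RUN=1
run_gates
```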
## Go Lint Rules
- Run `gofmt -w` on every changed Go file before commit or push.
- Do not ignore error returns from I/O-style cleanup calls such as `Close`, `Flush`, `Sync`, or similar methods.
- If a cleanup error cannot be returned, log it explicitly.
## Change Scope
- Keep changes additive and tightly scoped to the requested feature or bugfix.
- Do not mix unrelated refactors into feature PRs unless they are required to make the change pass gates.

View File

@@ -4,6 +4,8 @@ Language: [中文](API.md) | [English](API.en.md)
This document describes the actual behavior of the current Go codebase.
Docs: [Overview](README.en.md) / [Architecture](docs/ARCHITECTURE.en.md) / [Deployment](docs/DEPLOY.en.md) / [Testing](docs/TESTING.md)
---
## Table of Contents
@@ -267,6 +269,7 @@ data: [DONE]
- `deepseek-reasoner` / `deepseek-reasoner-search` models emit `delta.reasoning_content`
- Text emits `delta.content`
- Last chunk includes `finish_reason` and `usage`
- Token counting prefers pass-through from upstream DeepSeek SSE (`accumulated_token_usage` / `token_usage`), and only falls back to local estimation when upstream usage is absent
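The preference order above can be sketched as a small resolver. The types and function here are hypothetical stand-ins, not the repo's actual implementation, and the ~4-characters-per-token estimate is a common rough heuristic rather than necessarily the one DS2API uses.

```go
package main

import "fmt"

// Usage is an illustrative shape for token counts.
type Usage struct {
	PromptTokens     int
	CompletionTokens int
}

// resolveUsage prefers usage passed through from upstream DeepSeek SSE,
// and falls back to a rough local estimate only when upstream usage is
// absent.
func resolveUsage(upstream *Usage, prompt, completion string) Usage {
	if upstream != nil {
		return *upstream
	}
	est := func(s string) int {
		n := len([]rune(s)) / 4 // crude ~4 chars/token heuristic
		if n == 0 && len(s) > 0 {
			n = 1
		}
		return n
	}
	return Usage{PromptTokens: est(prompt), CompletionTokens: est(completion)}
}

func main() {
	fmt.Println(resolveUsage(&Usage{PromptTokens: 12, CompletionTokens: 34}, "", ""))
	fmt.Println(resolveUsage(nil, "what is the weather", "sunny today"))
}
```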
#### Tool Calls
@@ -383,6 +386,7 @@ Business auth required. Returns OpenAI-compatible embeddings shape.
## Claude-Compatible API
Besides `/anthropic/v1/*`, DS2API also supports shortcut paths: `/v1/messages`, `/messages`, `/v1/messages/count_tokens`, `/messages/count_tokens`.
Implementation-wise this path is unified on the OpenAI Chat Completions parse-and-translate pipeline to avoid maintaining divergent parsing chains.
### `GET /anthropic/v1/models`
@@ -517,6 +521,7 @@ Supported paths:
- `/v1/models/{model}:streamGenerateContent` (compat path)
Authentication is the same as other business routes (`Authorization: Bearer <token>` or `x-api-key`).
Implementation-wise this path is unified on the OpenAI Chat Completions parse-and-translate pipeline to avoid maintaining divergent parsing chains.
### `POST /v1beta/models/{model}:generateContent`
@@ -535,6 +540,7 @@ Returns SSE (`text/event-stream`), each chunk as `data: <json>`:
- regular text: incremental text chunks
- `tools` mode: buffered and emitted as `functionCall` at finalize phase
- final chunk: includes `finishReason: "STOP"` and `usageMetadata`
- Token counting prefers pass-through from upstream DeepSeek SSE (`accumulated_token_usage` / `token_usage`), and only falls back to local estimation when upstream usage is absent
---

6
API.md
View File

@@ -4,6 +4,8 @@
This document describes the actual API behavior of the current Go codebase.
Docs: [Overview](README.MD) / [Architecture](docs/ARCHITECTURE.md) / [Deployment](docs/DEPLOY.md) / [Testing](docs/TESTING.md)
---
## Table of Contents
@@ -267,6 +269,7 @@ data: [DONE]
- `deepseek-reasoner` / `deepseek-reasoner-search` models emit `delta.reasoning_content`
- Regular text emits `delta.content`
- The final chunk includes `finish_reason` and `usage`
- Token counting prefers pass-through from upstream DeepSeek SSE (`accumulated_token_usage` / `token_usage`), falling back to local estimation only when upstream usage is absent
#### Tool Calls
@@ -389,6 +392,7 @@ data: [DONE]
## Claude-Compatible API
Besides the standard `/anthropic/v1/*` paths, shortcut paths are also supported: `/v1/messages`, `/messages`, `/v1/messages/count_tokens`, `/messages/count_tokens`.
Implementation-wise this path is unified on the OpenAI Chat Completions parse-and-translate pipeline to avoid maintaining divergent parsing chains.
### `GET /anthropic/v1/models`
@@ -523,6 +527,7 @@ data: {"type":"message_stop"}
- `/v1/models/{model}:streamGenerateContent` (compat path)
Authentication is the same as other business routes (`Authorization: Bearer <token>` or `x-api-key`).
Implementation-wise this path is unified on the OpenAI Chat Completions parse-and-translate pipeline to avoid maintaining divergent parsing chains.
### `POST /v1beta/models/{model}:generateContent`
@@ -541,6 +546,7 @@ data: {"type":"message_stop"}
- Regular text: incremental text chunks
- `tools` mode: buffered and emitted as a `functionCall` structure at finalize
- Final chunk: includes `finishReason: "STOP"` and `usageMetadata`
- Token counting prefers pass-through from upstream DeepSeek SSE (`accumulated_token_usage` / `token_usage`), falling back to local estimation only when upstream usage is absent
---

View File

@@ -34,7 +34,7 @@ CMD ["/usr/local/bin/ds2api"]
FROM runtime-base AS runtime-from-source
COPY --from=go-builder /out/ds2api /usr/local/bin/ds2api
COPY --from=go-builder /app/internal/deepseek/assets/sha3_wasm_bg.7b9ca65ddd.wasm /app/sha3_wasm_bg.7b9ca65ddd.wasm
COPY --from=go-builder /app/config.example.json /app/config.example.json
COPY --from=webui-builder /app/static/admin /app/static/admin
@@ -53,13 +53,13 @@ RUN set -eux; \
test -n "${PKG_DIR}"; \
mkdir -p /out/static; \
cp "${PKG_DIR}/ds2api" /out/ds2api; \
cp "${PKG_DIR}/sha3_wasm_bg.7b9ca65ddd.wasm" /out/sha3_wasm_bg.7b9ca65ddd.wasm; \
cp "${PKG_DIR}/config.example.json" /out/config.example.json; \
cp -R "${PKG_DIR}/static/admin" /out/static/admin
FROM runtime-base AS runtime-from-dist
COPY --from=dist-extract /out/ds2api /usr/local/bin/ds2api
COPY --from=dist-extract /out/sha3_wasm_bg.7b9ca65ddd.wasm /app/sha3_wasm_bg.7b9ca65ddd.wasm
COPY --from=dist-extract /out/config.example.json /app/config.example.json
COPY --from=dist-extract /out/static/admin /app/static/admin

View File

@@ -16,6 +16,8 @@
DS2API converts DeepSeek Web chat capability into OpenAI-, Claude-, and Gemini-compatible APIs. The backend is a **pure Go implementation**, with a React WebUI admin panel (source in `webui/`, auto-built to `static/admin` on deployment).
Documentation entry: [Docs Index](docs/README.md) / [Architecture](docs/ARCHITECTURE.md) / [API Reference](API.md)
> **Important Disclaimer**
>
> This repository is provided for learning, research, personal experimentation, and internal validation only. It grants no commercial authorization and no warranty of fitness, stability, or results.
@@ -24,7 +26,7 @@
>
> Do not use this project in ways that violate service terms, agreements, laws, or platform rules. Before any commercial use, review the `LICENSE`, the relevant terms, and confirm that you have the author's written permission.
## Architecture Overview
## Architecture Overview (Summary)
```mermaid
flowchart LR
@@ -48,7 +50,7 @@ flowchart LR
Auth["Auth Resolver\n(API key / bearer / x-goog-api-key)"]
Pool["Account Pool + Queue\n(in-flight slots + wait queue)"]
DSClient["DeepSeek Client\n(Session / Auth / HTTP)"]
Pow["PoW WASM\n(wazero preload)"]
Pow["PoW Solver\n(pure Go, ms-level)"]
Tool["Tool Sieve\n(Go/Node semantic parity)"]
end
end
@@ -72,6 +74,8 @@ flowchart LR
Bridge --> Client
```
For the detailed architecture breakdown and directory responsibilities, see [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md).
- **Backend**: Go (`cmd/ds2api/`, `api/`, `internal/`), no Python runtime dependency
- **Frontend**: React admin panel (`webui/`), static build served at runtime
- **Deployment**: local run, Docker, Vercel Serverless, Linux systemd
@@ -95,7 +99,7 @@ flowchart LR
| Gemini compatible | `POST /v1beta/models/{model}:generateContent`, `POST /v1beta/models/{model}:streamGenerateContent` (plus `/v1/models/{model}:*` paths) |
| Multi-account rotation | Auto token refresh, email/mobile dual login |
| Concurrency control | Per-account in-flight limit + waiting queue, dynamically computed recommended concurrency |
| DeepSeek PoW | WASM solving (`wazero`), no external Node.js dependency |
| DeepSeek PoW | Pure Go high-performance solver (DeepSeekHashV1), ms-level response |
| Tool Calling | Anti-leak handling: high-confidence non-code-block feature detection, early `delta.tool_calls`, structured incremental output |
| Admin API | Config management, runtime settings hot-reload, account test / batch test, session cleanup, import/export, Vercel sync, version check |
| WebUI Admin Panel | SPA at `/admin` (bilingual Chinese/English, dark mode) |
@@ -344,7 +348,6 @@ cp opencode.json.example opencode.json
| `DS2API_CONFIG_PATH` | Config file path | `config.json` |
| `DS2API_CONFIG_JSON` | Inline config (JSON or Base64) | — |
| `DS2API_ENV_WRITEBACK` | In env mode, auto-write config back to file and switch to file mode (`1/true/yes/on`) | Disabled |
| `DS2API_WASM_PATH` | PoW WASM file path | Auto-detect |
| `DS2API_STATIC_ADMIN_DIR` | Admin static assets dir | `static/admin` |
| `DS2API_AUTO_BUILD_WEBUI` | Auto-build WebUI on startup | Enabled locally, disabled on Vercel |
| `DS2API_DEV_PACKET_CAPTURE` | Local dev packet capture toggle (records recent session request/response bodies) | Enabled by default locally (non-Vercel) |
@@ -432,72 +435,6 @@ go run ./cmd/ds2api
{"query":"Guangzhou weather","sample_id":"gz-weather-from-memory"}
```
## Project Structure
```text
ds2api/
├── app/ # Unified HTTP handler assembly (shared by local and serverless)
├── cmd/
│ ├── ds2api/ # Local / container entrypoint
│ └── ds2api-tests/ # End-to-end testsuite entrypoint
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
│ │ ├── openai/ # OpenAI adapter (incl. Tool Call parsing, Vercel stream prepare/release)
│ │ ├── claude/ # Claude adapter
│ │ └── gemini/ # Gemini adapter (generateContent / streamGenerateContent)
│ ├── admin/ # Admin API handlers (incl. Settings hot-reload)
│ ├── auth/ # Auth and JWT
│ ├── claudeconv/ # Claude message format conversion
│ ├── compat/ # Go-version compatibility and regression helpers
│ ├── config/ # Config loading, validation, and hot-reload
│ ├── deepseek/ # DeepSeek API client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture module
│ ├── rawsample/ # Visible-text extraction and replay helpers for raw stream samples
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt construction
│ ├── server/ # HTTP routing and middleware (chi router)
│ ├── sse/ # SSE parsing utilities
│ ├── stream/ # Unified stream consumption engine
│ ├── testsuite/ # End-to-end testsuite framework and case orchestration
│ ├── translatorcliproxy/ # CLIProxy bridge and stream writer components
│ ├── toolcall/ # Tool Call parsing, repair, and formatting (core business logic)
│ ├── util/ # Common utilities (token estimation, JSON helpers, etc.)
│ ├── version/ # Version parsing/comparison and tag normalization
│ └── webui/ # WebUI static file serving and auto-build
├── webui/ # React WebUI source (Vite + Tailwind)
│ └── src/
│ ├── app/ # Routing, auth, and config state management
│ ├── features/ # Feature modules (account/settings/vercel/apiTester)
│ ├── components/ # Shared components (login/landing pages, etc.)
│ └── locales/ # Language packs (zh.json / en.json)
├── scripts/
│ └── build-webui.sh # Manual WebUI build script
├── tests/
│ ├── compat/ # Compatibility fixtures and expected outputs
│ ├── node/ # Node-side unit tests (chat-stream / tool-sieve)
│ ├── raw_stream_samples/ # Raw SSE samples and replay metadata
│ └── scripts/ # Unified test script entrypoints (unit/e2e)
├── docs/ # Deployment / contributing / testing docs
├── static/admin/ # WebUI build output (not committed to Git)
├── .github/
│ ├── workflows/ # GitHub Actions (quality gates + release automation)
│ ├── ISSUE_TEMPLATE/ # Issue templates
│ └── PULL_REQUEST_TEMPLATE.md
├── config.example.json # Config file example
├── .env.example # Environment variable example
├── Dockerfile # Multi-stage build (WebUI + Go)
├── docker-compose.yml # Production Docker Compose
├── docker-compose.dev.yml # Development Docker Compose
├── vercel.json # Vercel routing and build config
└── go.mod / go.sum # Go module dependencies
```
## Documentation Index
| Document | Description |

View File

@@ -16,6 +16,8 @@ Language: [中文](README.MD) | [English](README.en.md)
DS2API converts DeepSeek Web chat capability into OpenAI-compatible, Claude-compatible, and Gemini-compatible APIs. The backend is a **pure Go implementation**, with a React WebUI admin panel (source in `webui/`, build output auto-generated to `static/admin` during deployment).
Documentation entry: [Docs Index](docs/README.md) / [Architecture](docs/ARCHITECTURE.en.md) / [API Reference](API.en.md)
> **Important Disclaimer**
>
> This repository is provided for learning, research, personal experimentation, and internal validation only. It does not grant any commercial authorization and comes with no warranty of fitness, stability, or results.
@@ -24,7 +26,7 @@ DS2API converts DeepSeek Web chat capability into OpenAI-compatible, Claude-comp
>
> Do not use this project in ways that violate service terms, agreements, laws, or platform rules. Before any commercial use, review the `LICENSE`, the relevant terms, and confirm that you have the author's written permission.
## Architecture Overview
## Architecture Overview (Summary)
```mermaid
flowchart LR
@@ -48,7 +50,7 @@ flowchart LR
Auth["Auth Resolver\n(API key / bearer / x-goog-api-key)"]
Pool["Account Pool + Queue\n(in-flight slots + wait queue)"]
DSClient["DeepSeek Client\n(session / auth / HTTP)"]
Pow["PoW WASM\n(wazero preload)"]
Pow["PoW Solver\n(Pure Go ms-level)"]
Tool["Tool Sieve\n(Go/Node semantic parity)"]
end
end
@@ -72,6 +74,8 @@ flowchart LR
Bridge --> Client
```
For the full module-by-module architecture and directory responsibilities, see [docs/ARCHITECTURE.en.md](docs/ARCHITECTURE.en.md).
- **Backend**: Go (`cmd/ds2api/`, `api/`, `internal/`), no Python runtime
- **Frontend**: React admin panel (`webui/`), served as static build at runtime
- **Deployment**: local run, Docker, Vercel serverless, Linux systemd
@@ -95,7 +99,7 @@ flowchart LR
| Gemini compatible | `POST /v1beta/models/{model}:generateContent`, `POST /v1beta/models/{model}:streamGenerateContent` (plus `/v1/models/{model}:*` paths) |
| Multi-account rotation | Auto token refresh, email/mobile dual login |
| Concurrency control | Per-account in-flight limit + waiting queue, dynamic recommended concurrency |
| DeepSeek PoW | WASM solving via `wazero`, no external Node.js dependency |
| DeepSeek PoW | Pure Go high-performance solver (DeepSeekHashV1), ms-level response |
| Tool Calling | Anti-leak handling: non-code-block feature match, early `delta.tool_calls`, structured incremental output |
| Admin API | Config management, runtime settings hot-reload, account testing/batch test, session cleanup, import/export, Vercel sync, version check |
| WebUI Admin Panel | SPA at `/admin` (bilingual Chinese/English, dark mode) |
@@ -344,7 +348,6 @@ cp opencode.json.example opencode.json
| `DS2API_CONFIG_PATH` | Config file path | `config.json` |
| `DS2API_CONFIG_JSON` | Inline config (JSON or Base64) | — |
| `DS2API_ENV_WRITEBACK` | Auto-write env-backed config to file and transition to file mode (`1/true/yes/on`) | Disabled |
| `DS2API_WASM_PATH` | PoW WASM file path | Auto-detect |
| `DS2API_STATIC_ADMIN_DIR` | Admin static assets dir | `static/admin` |
| `DS2API_AUTO_BUILD_WEBUI` | Auto-build WebUI on startup | Enabled locally, disabled on Vercel |
| `DS2API_ACCOUNT_MAX_INFLIGHT` | Max in-flight requests per account | `2` |
@@ -430,72 +433,6 @@ The save endpoint can target a chain by `query`, `chain_key`, or `capture_id`. E
{"query":"Guangzhou weather","sample_id":"gz-weather-from-memory"}
```
## Project Structure
```text
ds2api/
├── app/ # Unified HTTP handler assembly (shared by local + serverless)
├── cmd/
│ ├── ds2api/ # Local / container entrypoint
│ └── ds2api-tests/ # End-to-end testsuite entrypoint
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
│ │ ├── openai/ # OpenAI adapter (incl. tool call parsing, Vercel stream prepare/release)
│ │ ├── claude/ # Claude adapter
│ │ └── gemini/ # Gemini adapter (generateContent / streamGenerateContent)
│ ├── admin/ # Admin API handlers (incl. Settings hot-reload)
│ ├── auth/ # Auth and JWT
│ ├── claudeconv/ # Claude message format conversion
│ ├── compat/ # Go-version compatibility and regression helpers
│ ├── config/ # Config loading, validation, and hot-reload
│ ├── deepseek/ # DeepSeek API client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture module
│ ├── rawsample/ # Visible-text extraction and replay helpers for raw stream samples
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt construction
│ ├── server/ # HTTP routing and middleware (chi router)
│ ├── sse/ # SSE parsing utilities
│ ├── stream/ # Unified stream consumption engine
│ ├── testsuite/ # End-to-end testsuite framework and case orchestration
│ ├── translatorcliproxy/ # CLIProxy bridge and stream writer components
│ ├── toolcall/ # Tool Call parsing, repair, and formatting (core business logic)
│ ├── util/ # Common utilities (Token estimation, JSON helpers, etc.)
│ ├── version/ # Version parsing/comparison and tag normalization
│ └── webui/ # WebUI static file serving and auto-build
├── webui/ # React WebUI source (Vite + Tailwind)
│ └── src/
│ ├── app/ # Routing, auth, config state
│ ├── features/ # Feature modules (account/settings/vercel/apiTester)
│ ├── components/ # Shared UI pieces (login/landing, etc.)
│ └── locales/ # Language packs (zh.json / en.json)
├── scripts/
│ └── build-webui.sh # Manual WebUI build script
├── tests/
│ ├── compat/ # Compatibility fixtures and expected outputs
│ ├── node/ # Node-side unit tests (chat-stream / tool-sieve)
│ ├── raw_stream_samples/ # Raw SSE samples and replay metadata
│ └── scripts/ # Unified test script entrypoints (unit/e2e)
├── docs/ # Deployment / contributing / testing docs
├── static/admin/ # WebUI build output (not committed to Git)
├── .github/
│ ├── workflows/ # GitHub Actions (quality gates + release automation)
│ ├── ISSUE_TEMPLATE/ # Issue templates
│ └── PULL_REQUEST_TEMPLATE.md
├── config.example.json # Config file template
├── .env.example # Environment variable template
├── Dockerfile # Multi-stage build (WebUI + Go)
├── docker-compose.yml # Production Docker Compose
├── docker-compose.dev.yml # Development Docker Compose
├── vercel.json # Vercel routing and build config
└── go.mod / go.sum # Go module dependencies
```
## Documentation Index
| Document | Description |

View File

@@ -1 +1 @@
3.1.2
3.2.0

136
docs/ARCHITECTURE.en.md Normal file
View File

@@ -0,0 +1,136 @@
# DS2API Architecture & Project Layout
Language: [中文](ARCHITECTURE.md) | [English](ARCHITECTURE.en.md)
> This file is the single architecture source for directory layout, module boundaries, and execution flow.
## 1. Top-level Layout (expanded)
> Notes: this is the **fully expanded** project directory list (excluding metadata/dependency dirs such as `.git/` and `webui/node_modules/`), with each folder annotated by purpose.
```text
ds2api/
├── .github/ # GitHub collaboration and CI config
│ ├── ISSUE_TEMPLATE/ # Issue templates
│ └── workflows/ # GitHub Actions workflows
├── api/ # Serverless entrypoints (Vercel Go/Node)
├── app/ # Application-level handler assembly
├── cmd/ # Executable entrypoints
│ ├── ds2api/ # Main service bootstrap
│ └── ds2api-tests/ # E2E testsuite CLI bootstrap
├── docs/ # Project documentation
├── internal/ # Core implementation (non-public packages)
│ ├── account/ # Account pool, inflight slots, waiting queue
│ ├── adapter/ # Multi-protocol adapters
│ │ ├── claude/ # Claude protocol adapter
│ │ ├── gemini/ # Gemini protocol adapter
│ │ └── openai/ # OpenAI adapter and shared execution core
│ ├── admin/ # Admin API (config/accounts/ops)
│ ├── auth/ # Auth/JWT/credential resolution
│ ├── claudeconv/ # Claude message conversion helpers
│ ├── compat/ # Compatibility and regression helpers
│ ├── config/ # Config loading/validation/hot reload
│ ├── deepseek/ # DeepSeek upstream client capabilities
│ │ └── transport/ # DeepSeek transport details
│ ├── devcapture/ # Dev capture and troubleshooting
│ ├── format/ # Response formatting layer
│ │ ├── claude/ # Claude output formatting
│ │ └── openai/ # OpenAI output formatting
│ ├── js/ # Node runtime related logic
│ │ ├── chat-stream/ # Node streaming bridge
│ │ ├── helpers/ # JS helper modules
│ │ │ └── stream-tool-sieve/ # JS implementation of tool sieve
│ │ └── shared/ # Shared semantics between Go/Node
│ ├── prompt/ # Prompt composition
│ ├── rawsample/ # Raw sample read/write and management
│ ├── server/ # Router and middleware assembly
│ ├── sse/ # SSE parsing utilities
│ ├── stream/ # Unified stream consumption engine
│ ├── testsuite/ # Testsuite execution framework
│ ├── textclean/ # Text cleanup
│ ├── toolcall/ # Tool-call parsing and repair
│ ├── translatorcliproxy/ # Cross-protocol translation bridge
│ ├── util/ # Shared utility helpers
│ ├── version/ # Version query/compare
│ └── webui/ # WebUI static hosting logic
├── plans/ # Stage plans and manual QA records
├── pow/ # PoW standalone implementation + benchmarks
├── scripts/ # Build/release helper scripts
├── tests/ # Test assets and scripts
│ ├── compat/ # Compatibility fixtures + expected outputs
│ │ ├── expected/ # Expected output samples
│ │ └── fixtures/ # Fixture inputs
│ │ ├── sse_chunks/ # SSE chunk fixtures
│ │ └── toolcalls/ # Tool-call fixtures
│ ├── node/ # Node unit tests
│ ├── raw_stream_samples/ # Upstream raw SSE samples
│ │ ├── content-filter-trigger-20260405-jwt3/ # Content-filter terminal sample
│ │ ├── continue-thinking-snapshot-replay-20260405/ # Continue-thinking sample
│ │ ├── guangzhou-weather-reasoner-search-20260404/ # Search/reference sample
│ │ ├── markdown-format-example-20260405/ # Markdown sample
│ │ └── markdown-format-example-20260405-spacefix/ # Space-fix sample
│ ├── scripts/ # Test entry scripts
│ └── tools/ # Testing helper tools
└── webui/ # React admin console source
├── public/ # Static assets
└── src/ # Frontend source code
├── app/ # Routing/state scaffolding
├── components/ # Shared UI components
├── features/ # Feature modules
│ ├── account/ # Account management page
│ ├── apiTester/ # API tester page
│ ├── settings/ # Settings page
│ └── vercel/ # Vercel sync page
├── layout/ # Layout components
├── locales/ # i18n strings
└── utils/ # Frontend utilities
```
## 2. Primary Request Flow
```mermaid
flowchart LR
C[Client/SDK] --> R[internal/server/router.go]
R --> OA[OpenAI Adapter]
R --> CA[Claude Adapter]
R --> GA[Gemini Adapter]
R --> AD[Admin API]
CA --> BR[translatorcliproxy]
GA --> BR
BR --> CORE[internal/adapter/openai ChatCompletions]
OA --> CORE
CORE --> AUTH[internal/auth + config key/account resolver]
CORE --> POOL[internal/account queue + concurrency]
CORE --> TOOL[internal/toolcall parser + sieve]
CORE --> DS[internal/deepseek client]
DS --> U[DeepSeek upstream]
```
## 3. Responsibilities in `internal/`
- `internal/server`: router tree + middlewares (health, protocol routes, Admin/WebUI).
- `internal/adapter/openai`: shared execution core (chat/responses/embeddings + tool semantics).
- `internal/adapter/{claude,gemini}`: protocol wrappers only (no duplicated upstream execution).
- `internal/translatorcliproxy`: structure translation between Claude/Gemini and OpenAI.
- `internal/deepseek`: upstream request/session/PoW/SSE handling.
- `internal/stream` + `internal/sse`: stream parsing and incremental assembly.
- `internal/toolcall`: JSON/XML/invoke/text-kv tool-call parsing + anti-leak sieve.
- `internal/admin`: config/accounts/vercel sync/version/dev-capture endpoints.
- `internal/config`: config loading/validation + runtime settings hot-reload.
- `internal/account`: managed account pool, inflight slots, waiting queue.
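The per-account in-flight limit plus waiting queue described for `internal/account` can be modeled with a buffered-channel semaphore. A minimal sketch follows; the names are illustrative, not the package's real API.

```go
package main

import (
	"fmt"
	"sync"
)

// accountSlot caps concurrent requests per account: the buffered
// channel holds the available slots, and callers beyond the limit
// block on acquire until a slot is released.
type accountSlot struct {
	slots chan struct{}
}

func newAccountSlot(maxInflight int) *accountSlot {
	return &accountSlot{slots: make(chan struct{}, maxInflight)}
}

func (a *accountSlot) acquire() { a.slots <- struct{}{} }
func (a *accountSlot) release() { <-a.slots }

func main() {
	pool := newAccountSlot(2) // e.g. a per-account in-flight limit of 2
	var wg sync.WaitGroup
	for i := 0; i < 5; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			pool.acquire()
			defer pool.release()
			fmt.Println("request", id, "holding a slot")
		}(i)
	}
	wg.Wait()
}
```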
## 4. WebUI Runtime Relation
- `webui/` stores frontend source (Vite + React).
- Runtime serves static output from `static/admin`.
- On first local startup, if `static/admin` is missing, DS2API may auto-build it (Node.js required).
## 5. Documentation Split Strategy
- Onboarding & quick start: `README.MD` / `README.en.md`
- Architecture & layout: `docs/ARCHITECTURE*.md` (this file)
- API contracts: `API.md` / `API.en.md`
- Deployment/testing/contributing: `docs/DEPLOY*`, `docs/TESTING.md`, `docs/CONTRIBUTING*`
- Deep topics: `docs/toolcall-semantics.md`, `docs/DeepSeekSSE行为结构说明-2026-04-05.md`

136
docs/ARCHITECTURE.md Normal file
View File

@@ -0,0 +1,136 @@
# DS2API 架构与项目结构说明
语言 / Language: [中文](ARCHITECTURE.md) | [English](ARCHITECTURE.en.md)
> 本文档用于集中维护“代码目录结构 + 模块边界 + 主链路调用关系”。
## 1. 顶层目录结构(展开)
> 说明:以下为仓库内业务相关目录的**完整展开**(排除 `.git/` 与 `webui/node_modules/` 这类依赖/元数据目录),并标注每个文件夹作用。
```text
ds2api/
├── .github/ # GitHub 协作与 CI 配置
│ ├── ISSUE_TEMPLATE/ # Issue 模板
│ └── workflows/ # GitHub Actions 工作流
├── api/ # Serverless 入口Vercel Go/Node
├── app/ # 应用级 handler 装配层
├── cmd/ # 可执行程序入口
│ ├── ds2api/ # 主服务启动入口
│ └── ds2api-tests/ # E2E 测试集 CLI 入口
├── docs/ # 项目文档目录
├── internal/ # 核心业务实现(不对外暴露)
│ ├── account/ # 账号池、并发槽位、等待队列
│ ├── adapter/ # 多协议适配层
│ │ ├── claude/ # Claude 协议适配
│ │ ├── gemini/ # Gemini 协议适配
│ │ └── openai/ # OpenAI 协议与统一执行核心
│ ├── admin/ # Admin API配置/账号/运维)
│ ├── auth/ # 鉴权/JWT/凭证解析
│ ├── claudeconv/ # Claude 消息格式转换工具
│ ├── compat/ # 兼容性辅助与回归支持
│ ├── config/ # 配置加载、校验、热更新
│ ├── deepseek/ # DeepSeek 上游客户端能力
│ │ └── transport/ # DeepSeek 传输层细节
│ ├── devcapture/ # 开发抓包与调试采集
│ ├── format/ # 响应格式化层
│ │ ├── claude/ # Claude 输出格式化
│ │ └── openai/ # OpenAI 输出格式化
│ ├── js/ # Node Runtime 相关逻辑
│ │ ├── chat-stream/ # Node 流式输出桥接
│ │ ├── helpers/ # JS 辅助函数
│ │ │ └── stream-tool-sieve/ # Tool sieve JS 实现
│ │ └── shared/ # Go/Node 共用语义片段
│ ├── prompt/ # Prompt 组装
│ ├── rawsample/ # raw sample 读写与管理
│ ├── server/ # 路由与中间件装配
│ ├── sse/ # SSE 解析工具
│ ├── stream/ # 统一流式消费引擎
│ ├── testsuite/ # 测试集执行框架
│ ├── textclean/ # 文本清洗
│ ├── toolcall/ # 工具调用解析与修复
│ ├── translatorcliproxy/ # 多协议互转桥
│ ├── util/ # 通用工具函数
│ ├── version/ # 版本查询/比较
│ └── webui/ # WebUI 静态托管相关逻辑
├── plans/ # 阶段计划与人工验收记录
├── pow/ # PoW 独立实现与基准
├── scripts/ # 构建/发布/辅助脚本
├── tests/ # 测试资源与脚本
│ ├── compat/ # 兼容性夹具与期望输出
│ │ ├── expected/ # 预期结果样本
│ │ └── fixtures/ # 测试输入夹具
│ │ ├── sse_chunks/ # SSE chunk 夹具
│ │ └── toolcalls/ # toolcall 夹具
│ ├── node/ # Node 单元测试
│ ├── raw_stream_samples/ # 上游原始 SSE 样本
│ │ ├── content-filter-trigger-20260405-jwt3/ # 风控终态样本
│ │ ├── continue-thinking-snapshot-replay-20260405/ # continue 样本
│ │ ├── guangzhou-weather-reasoner-search-20260404/ # 搜索+引用样本
│ │ ├── markdown-format-example-20260405/ # Markdown 样本
│ │ └── markdown-format-example-20260405-spacefix/ # 空格修复样本
│ ├── scripts/ # 测试脚本入口
│ └── tools/ # 测试辅助工具
└── webui/ # React 管理台源码
├── public/ # 静态资源
└── src/ # 前端源码
├── app/ # 路由/状态框架
├── components/ # 共享组件
├── features/ # 功能模块
│ ├── account/ # 账号管理页面
│ ├── apiTester/ # API 测试页面
│ ├── settings/ # 设置页面
│ └── vercel/ # Vercel 同步页面
├── layout/ # 布局组件
├── locales/ # 国际化文案
└── utils/ # 前端工具函数
```
## 2. 请求主链路
```mermaid
flowchart LR
C[Client/SDK] --> R[internal/server/router.go]
R --> OA[OpenAI Adapter]
R --> CA[Claude Adapter]
R --> GA[Gemini Adapter]
R --> AD[Admin API]
CA --> BR[translatorcliproxy]
GA --> BR
BR --> CORE[internal/adapter/openai ChatCompletions]
OA --> CORE
CORE --> AUTH[internal/auth + config key/account resolver]
CORE --> POOL[internal/account queue + concurrency]
CORE --> TOOL[internal/toolcall parser + sieve]
CORE --> DS[internal/deepseek client]
DS --> U[DeepSeek upstream]
```
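The fan-in in the diagram above — several protocol adapters delegating to one OpenAI-shaped execution core — can be sketched as follows. All names here (`coreChatCompletions`, `claudeAdapter`, and so on) are illustrative stand-ins, not the actual ds2api APIs:

```go
package main

import "fmt"

// openAIRequest is a simplified stand-in for the core's canonical request shape.
type openAIRequest struct {
	Model  string
	Prompt string
}

// coreChatCompletions represents the single execution kernel
// (internal/adapter/openai in the real tree); every protocol
// adapter funnels into it instead of calling upstream itself.
func coreChatCompletions(req openAIRequest) string {
	return fmt.Sprintf("core handled model=%s prompt=%q", req.Model, req.Prompt)
}

// claudeAdapter converts a Claude-style input into the canonical
// request and delegates; it never talks to upstream directly.
func claudeAdapter(model, userText string) string {
	return coreChatCompletions(openAIRequest{Model: model, Prompt: userText})
}

// geminiAdapter does the same for Gemini-style input.
func geminiAdapter(model, userText string) string {
	return coreChatCompletions(openAIRequest{Model: model, Prompt: userText})
}

func main() {
	fmt.Println(claudeAdapter("deepseek-chat", "hi"))
	fmt.Println(geminiAdapter("deepseek-chat", "hi"))
}
```

The design point this models: upstream call logic lives in exactly one place, so a fix there benefits every protocol entrypoint.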
## 3. internal/ Submodule Responsibilities
- `internal/server`: route tree and middleware mounting (health checks, protocol entrypoints, Admin/WebUI).
- `internal/adapter/openai`: the unified execution kernel (chat/responses/embeddings and tool-calling semantics).
- `internal/adapter/{claude,gemini}`: protocol input/output adaptation; they do not re-implement the upstream call logic.
- `internal/translatorcliproxy`: structural conversion between Claude/Gemini and OpenAI.
- `internal/deepseek`: upstream requests, sessions, PoW, SSE consumption.
- `internal/stream` + `internal/sse`: stream parsing and incremental processing.
- `internal/toolcall`: JSON/XML/invoke/text-kv tool-call parsing and anti-leak sieving.
- `internal/admin`: config management, account management, Vercel sync, version checks, dev packet capture.
- `internal/config`: config loading, validation, and runtime settings hot reload.
- `internal/account`: managed account pool, concurrency slots, wait queue.
## 4. WebUI and Runtime Relationship
- `webui/` holds the frontend source (Vite + React).
- At runtime the hosted directory is `static/admin` (the build output).
- On first local start, if `static/admin` is missing, an automatic build is attempted (requires Node.js).
## 5. Documentation Split Strategy
- Overview and quick start: `README.MD` / `README.en.md`
- Architecture and layout: `docs/ARCHITECTURE*.md` (this file)
- API protocol: `API.md` / `API.en.md`
- Deployment, testing, contributing: `docs/DEPLOY*`, `docs/TESTING.md`, `docs/CONTRIBUTING*`
- Topics: `docs/toolcall-semantics.md`, `docs/DeepSeekSSE行为结构说明-2026-04-05.md`


@@ -94,58 +94,12 @@ Manually build WebUI to `static/admin/`:
## Project Structure
```text
ds2api/
├── app/ # Shared HTTP handler assembly (local + serverless)
├── cmd/
│ ├── ds2api/ # Local/container entrypoint
│ └── ds2api-tests/ # End-to-end testsuite entrypoint
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
│ │ ├── openai/ # OpenAI adapter
│ │ ├── claude/ # Claude adapter
│ │ └── gemini/ # Gemini adapter
│ ├── admin/ # Admin API handlers
│ ├── auth/ # Auth and JWT
│ ├── claudeconv/ # Claude message conversion
│ ├── compat/ # Go-version compatibility and regression helpers
│ ├── config/ # Config loading, validation, and hot-reload
│ ├── deepseek/ # DeepSeek client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt building
│ ├── server/ # HTTP routing (chi router)
│ ├── sse/ # SSE parsing utilities
│ ├── stream/ # Unified stream consumption engine
│ ├── testsuite/ # Testsuite framework and scenario orchestration
│ ├── translatorcliproxy/ # CLIProxy bridge and stream writer
│ ├── util/ # Common utilities
│ ├── version/ # Version parsing and comparison
│ └── webui/ # WebUI static hosting
├── webui/ # React WebUI source
│ └── src/
│ ├── app/ # Routing, auth, config state
│ ├── features/ # Feature modules
│ ├── components/ # Shared components
│ └── locales/ # Language packs
├── scripts/ # Build and test scripts
├── tests/
│ ├── compat/ # Compatibility fixtures and expected outputs
│ ├── node/ # Node-side unit tests
│ └── scripts/ # Test script entrypoints (unit/e2e)
├── plans/ # Plans, gates, and manual smoke-test records
├── static/admin/ # WebUI build output (not committed)
├── Dockerfile # Multi-stage build
├── docker-compose.yml # Production
├── docker-compose.dev.yml # Development
└── vercel.json # Vercel config
```
To avoid documentation drift, directory layout and module responsibilities were moved to:
- [docs/ARCHITECTURE.en.md](./ARCHITECTURE.en.md)
- [docs/README.md](./README.md)
Before contributing, review the architecture doc sections for request flow and `internal/` module boundaries.
## Reporting Issues


@@ -94,58 +94,12 @@ docker-compose -f docker-compose.dev.yml up
## Project Structure
```text
ds2api/
├── app/ # Shared HTTP handler assembly (local + serverless)
├── cmd/
│ ├── ds2api/ # Local/container entrypoint
│ └── ds2api-tests/ # End-to-end testsuite entrypoint
├── api/
│ ├── index.go # Vercel Serverless Go entry
│ ├── chat-stream.js # Vercel Node.js stream relay
│ └── (rewrite targets in vercel.json)
├── internal/
│ ├── account/ # Account pool and concurrency queue
│ ├── adapter/
│ │ ├── openai/ # OpenAI-compatible adapter
│ │ ├── claude/ # Claude-compatible adapter
│ │ └── gemini/ # Gemini-compatible adapter
│ ├── admin/ # Admin API handlers
│ ├── auth/ # Auth and JWT
│ ├── claudeconv/ # Claude message format conversion
│ ├── compat/ # Go-version compatibility and regression helpers
│ ├── config/ # Config loading, validation, and hot-reload
│ ├── deepseek/ # DeepSeek client, PoW WASM
│ ├── js/ # Node runtime stream/compat logic
│ ├── devcapture/ # Dev packet capture
│ ├── format/ # Output formatting
│ ├── prompt/ # Prompt building
│ ├── server/ # HTTP routing (chi router)
│ ├── sse/ # SSE parsing utilities
│ ├── stream/ # Unified stream consumption engine
│ ├── testsuite/ # Testsuite framework and scenario orchestration
│ ├── translatorcliproxy/ # CLIProxy bridge and stream writer
│ ├── util/ # Common utilities
│ ├── version/ # Version parsing and comparison
│ └── webui/ # WebUI static hosting
├── webui/ # React WebUI source
│ └── src/
│ ├── app/ # Routing, auth, config state
│ ├── features/ # Feature modules
│ ├── components/ # Shared components
│ └── locales/ # Language packs
├── scripts/ # Build and test scripts
├── tests/
│ ├── compat/ # Compatibility fixtures and expected outputs
│ ├── node/ # Node-side unit tests
│ └── scripts/ # Test script entrypoints (unit/e2e)
├── plans/ # Plans, gates, and manual smoke-test records
├── static/admin/ # WebUI build output (not committed)
├── Dockerfile # Multi-stage build
├── docker-compose.yml # Production
├── docker-compose.dev.yml # Development
└── vercel.json # Vercel config
```
To avoid duplicated maintenance across documents, the directory layout and module responsibilities have moved to:
- [docs/ARCHITECTURE.md](./ARCHITECTURE.md)
- [docs/README.md](./README.md)
Before contributing, read the main request path and `internal/` module responsibility sections of the architecture doc, then scope your change.
## Reporting Issues


@@ -4,6 +4,8 @@ Language: [中文](DEPLOY.md) | [English](DEPLOY.en.md)
This guide covers all deployment methods for the current Go-based codebase.
Doc map: [Index](./README.md) | [Architecture](./ARCHITECTURE.en.md) | [API](../API.en.md) | [Testing](./TESTING.md)
---
## Table of Contents
@@ -366,7 +368,6 @@ Each archive includes:
- `ds2api` executable (`ds2api.exe` on Windows)
- `static/admin/` (built WebUI assets)
- `sha3_wasm_bg.7b9ca65ddd.wasm` (optional; binary has embedded fallback)
- `config.example.json`, `.env.example`
- `README.MD`, `README.en.md`, `LICENSE`
@@ -456,8 +457,6 @@ server {
# Copy compiled binary and related files to target directory
sudo mkdir -p /opt/ds2api
sudo cp ds2api config.json /opt/ds2api/
# Optional: if you want to use an external WASM file (override the embedded one, from a release package or build output)
# sudo cp /path/to/sha3_wasm_bg.7b9ca65ddd.wasm /opt/ds2api/
sudo cp -r static/admin /opt/ds2api/static/admin
```


@@ -4,6 +4,8 @@
This guide is based on the current Go codebase and details every deployment method.
Page navigation: [Docs index](./README.md), [Architecture](./ARCHITECTURE.md), [API](../API.md), [Testing guide](./TESTING.md)
---
## Table of Contents
@@ -366,7 +368,6 @@ No Output Directory named "public" found after the Build completed.
- `ds2api` executable (`ds2api.exe` on Windows)
- `static/admin/` (built WebUI assets)
- `sha3_wasm_bg.7b9ca65ddd.wasm` (optional; the binary embeds a fallback)
- `config.example.json``.env.example`
- `README.MD``README.en.md``LICENSE`
@@ -456,8 +457,6 @@ server {
# Copy the compiled binary and related files to the target directory
sudo mkdir -p /opt/ds2api
sudo cp ds2api config.json /opt/ds2api/
# Optional: use an external WASM file (overriding the embedded one; from a release package or build output)
# sudo cp /path/to/sha3_wasm_bg.7b9ca65ddd.wasm /opt/ds2api/
sudo cp -r static/admin /opt/ds2api/static/admin
```


@@ -4,6 +4,8 @@
> The current corpus consists of 4 raw streams, covering behaviors such as search + citations, content-filter terminal states, Markdown output, and whitespace-sensitive output.
> Note: the end of this document also lists a few behaviors that are confirmed in the current implementation but not yet fully covered by the corpus, e.g. the auto-continue state in long-thinking scenarios.
Doc navigation: [Docs index](./README.md) / [Testing guide](./TESTING.md) / [Sample directory notes](../tests/raw_stream_samples/README.md)
## 1. Sample Coverage
The following samples together form the observational basis of this document:
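The raw frames these samples contain follow the `data: {"p":…,"v":…}` shape visible in the test fixtures later in this diff. Below is a toy decoder for that shape; any field semantics beyond `p` (a path such as `response/content`) and `v` (the value delta) are assumptions, not the project's actual parser:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// frame mirrors the minimal upstream SSE payload seen in fixtures:
// "p" is a path like "response/content", "v" is the value delta.
type frame struct {
	P string          `json:"p"`
	V json.RawMessage `json:"v"`
}

// decodeDataLine parses one "data: {...}" SSE line into a frame.
// It returns ok=false for non-data lines and for the [DONE] sentinel.
func decodeDataLine(line string) (frame, bool) {
	payload, found := strings.CutPrefix(line, "data: ")
	if !found || payload == "[DONE]" {
		return frame{}, false
	}
	var f frame
	if err := json.Unmarshal([]byte(payload), &f); err != nil {
		return frame{}, false
	}
	return f, true
}

func main() {
	f, ok := decodeDataLine(`data: {"p":"response/content","v":"hello"}`)
	fmt.Println(ok, f.P, string(f.V))
}
```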

docs/README.md (new file, +53)

@@ -0,0 +1,53 @@
# DS2API 文档导航 | Documentation Index
语言 / Language: [中文](README.md) | [English](README.md#english)
## 中文
To reduce duplicated maintenance, the docs in this repo are split into an "entry doc + topical docs" layout. Recommended reading order:
1. [Project overview (README)](../README.MD)
2. [Architecture and directory layout](./ARCHITECTURE.md)
3. [API reference](../API.md)
4. [Deployment guide](./DEPLOY.md)
5. [Testing guide](./TESTING.md)
6. [Contributing guide](./CONTRIBUTING.md)
### Topical docs
- [Unified tool-calling semantics](./toolcall-semantics.md)
- [DeepSeek SSE behavior notes (reverse-engineered)](./DeepSeekSSE行为结构说明-2026-04-05.md)
### Doc maintenance conventions
- `README.MD` / `README.en.md`: for first-time users; keep to "what it is + how to run it quickly".
- `docs/ARCHITECTURE*.md`: for developers; the single place for project structure, module responsibilities, and call chains.
- `API*.md`: for client integrators; focused on endpoint behavior, auth, and examples.
- Other `docs/*.md`: topical notes; avoid pasting the same section into multiple docs.
---
## English
To reduce maintenance drift, docs are split into an “entry doc + topical docs” layout.
Recommended reading order:
1. [Project overview (README)](../README.en.md)
2. [Architecture and project layout](./ARCHITECTURE.en.md)
3. [API reference](../API.en.md)
4. [Deployment guide](./DEPLOY.en.md)
5. [Testing guide](./TESTING.md)
6. [Contributing guide](./CONTRIBUTING.en.md)
### Topical docs
- [Tool-calling unified semantics](./toolcall-semantics.md)
- [DeepSeek SSE behavior notes (reverse-engineered)](./DeepSeekSSE行为结构说明-2026-04-05.md)
### Maintenance conventions
- `README.MD` / `README.en.md`: onboarding-oriented (“what + quick start”).
- `docs/ARCHITECTURE*.md`: developer-oriented source of truth for module boundaries and execution flow.
- `API*.md`: integration-oriented behavior/contracts.
- Other `docs/*.md`: focused topics, avoid copy-pasting the same section into multiple files.


@@ -2,6 +2,8 @@
Language: 中文 + English (same page)
Doc navigation: [Overview](../README.MD) / [Architecture](./ARCHITECTURE.md) / [Deployment guide](./DEPLOY.md) / [API](../API.md)
## Overview
DS2API provides two levels of tests:
@@ -235,6 +237,7 @@ go run ./cmd/ds2api-tests --no-preflight
Notes:
- By default the tool replays the canonical samples declared in `tests/raw_stream_samples/manifest.json`, doing a 1:1 simulated parse in upstream SSE order.
- By default it verifies that no `FINISHED` text leaks into output and that an end-of-stream signal is present.
- By default it does **not** strictly reconcile the raw `accumulated_token_usage` with locally parsed tokens (the current implementation treats content-based estimation as authoritative); pass `--fail-on-token-mismatch` explicitly to enforce strict checking.
- Every run writes the locally derived result to `artifacts/raw-stream-sim/<run-id>/<sample-id>/replay.output.txt` and emits a structured report.
- If you have a historical baseline directory, pass `--baseline-root` to have the tool do a direct text comparison.
- For a fuller protocol-level behavior description, see [DeepSeekSSE行为结构说明-2026-04-05.md](./DeepSeekSSE行为结构说明-2026-04-05.md).
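As a hedged illustration of the "content estimation is authoritative" rule: the sketch below assumes a simple length-based estimator (the real `util.EstimateTokens` heuristic is not shown in this diff) and derives a usage object purely from content, the way the handlers do after this change — any upstream-reported counts are ignored:

```go
package main

import "fmt"

// estimateTokens is a placeholder heuristic (~4 chars per token);
// the real util.EstimateTokens implementation may differ.
func estimateTokens(s string) int {
	if s == "" {
		return 0
	}
	return (len(s) + 3) / 4
}

// buildUsage derives usage purely from content, mirroring the
// post-change behavior: upstream accumulated_token_usage is ignored.
func buildUsage(prompt, thinking, text string) map[string]int {
	p := estimateTokens(prompt)
	c := estimateTokens(thinking) + estimateTokens(text)
	return map[string]int{
		"prompt_tokens":     p,
		"completion_tokens": c,
		"total_tokens":      p + c,
	}
}

func main() {
	u := buildUsage("what is the weather", "let me think", "sunny")
	fmt.Println(u["total_tokens"] == u["prompt_tokens"]+u["completion_tokens"]) // true
}
```

This invariant (`total_tokens = prompt_tokens + completion_tokens`) is exactly what the new regression tests assert.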


@@ -2,6 +2,8 @@
This document describes the **actual behavior** of `ParseToolCallsDetailed` / `parseToolCallsDetailed` in the current code, and is used to keep Go and the Node runtime aligned.
Doc navigation: [Overview](../README.MD) / [Architecture](./ARCHITECTURE.md) / [Testing guide](./TESTING.md)
## 1) Output Structure (current implementation)
- `calls`: the list of parsed tool calls (`name` + `input`).
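For orientation, here is a minimal sketch of locating the `<tool_calls>` span that the prompt format mandates. This is an illustrative toy, not the actual `ParseToolCallsDetailed` logic, which also handles JSON/invoke/text-kv variants, fence stripping, and repair:

```go
package main

import (
	"fmt"
	"strings"
)

// extractToolCallsBlock returns the inner XML of the first
// <tool_calls>...</tool_calls> span, or "" if none is present.
func extractToolCallsBlock(s string) string {
	const openTag, closeTag = "<tool_calls>", "</tool_calls>"
	i := strings.Index(s, openTag)
	if i < 0 {
		return ""
	}
	rest := s[i+len(openTag):]
	j := strings.Index(rest, closeTag)
	if j < 0 {
		return ""
	}
	return strings.TrimSpace(rest[:j])
}

func main() {
	out := "thinking...\n<tool_calls>\n<invoke name=\"get_weather\"></invoke>\n</tool_calls>"
	fmt.Println(extractToolCallsBlock(out))
}
```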

go.mod (1 change)

@@ -8,7 +8,6 @@ require (
github.com/google/uuid v1.6.0
github.com/refraction-networking/utls v1.8.2
github.com/router-for-me/CLIProxyAPI/v6 v6.9.14
github.com/tetratelabs/wazero v1.11.0
)
require (

go.sum (2 changes)

@@ -18,8 +18,6 @@ github.com/sirupsen/logrus v1.9.4 h1:TsZE7l11zFCLZnZ+teH4Umoq5BhEIfIzfRDZ1Uzql2w
github.com/sirupsen/logrus v1.9.4/go.mod h1:ftWc9WdOfJ0a92nsE2jF5u5ZwH8Bv2zdeOC42RjbV2g=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/tetratelabs/wazero v1.11.0 h1:+gKemEuKCTevU4d7ZTzlsvgd1uaToIDtlQlmNbwqYhA=
github.com/tetratelabs/wazero v1.11.0/go.mod h1:eV28rsN8Q+xwjogd7f4/Pp4xFxO7uOGbLcD/LzB1wiU=
github.com/tidwall/gjson v1.14.2/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/gjson v1.18.0 h1:FIDeeyB800efLX89e5a8Y0BNH+LOngJyGrIWxG2FKQY=
github.com/tidwall/gjson v1.18.0/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=


@@ -24,10 +24,9 @@ type claudeStreamRuntime struct {
bufferToolContent bool
stripReferenceMarkers bool
messageID string
thinking strings.Builder
text strings.Builder
outputTokens int
messageID string
thinking strings.Builder
text strings.Builder
nextBlockIndex int
thinkingBlockOpen bool
@@ -70,9 +69,6 @@ func (s *claudeStreamRuntime) onParsed(parsed sse.LineResult) streamengine.Parse
if !parsed.Parsed {
return streamengine.ParsedDecision{}
}
if parsed.OutputTokens > 0 {
s.outputTokens = parsed.OutputTokens
}
if parsed.ErrorMessage != "" {
s.upstreamErr = parsed.ErrorMessage
return streamengine.ParsedDecision{Stop: true, StopReason: streamengine.StopReason("upstream_error")}


@@ -109,9 +109,6 @@ func (s *claudeStreamRuntime) finalize(stopReason string) {
}
outputTokens := util.EstimateTokens(finalThinking) + util.EstimateTokens(finalText)
if s.outputTokens > 0 {
outputTokens = s.outputTokens
}
s.send("message_delta", map[string]any{
"type": "message_delta",
"delta": map[string]any{


@@ -149,14 +149,13 @@ func (h *Handler) handleNonStreamGenerateContent(w http.ResponseWriter, resp *ht
cleanVisibleOutput(result.Thinking, stripReferenceMarkers),
cleanVisibleOutput(result.Text, stripReferenceMarkers),
toolNames,
result.OutputTokens,
))
}
//nolint:unused // retained for native Gemini non-stream handling path.
func buildGeminiGenerateContentResponse(model, finalPrompt, finalThinking, finalText string, toolNames []string, outputTokens int) map[string]any {
func buildGeminiGenerateContentResponse(model, finalPrompt, finalThinking, finalText string, toolNames []string) map[string]any {
parts := buildGeminiPartsFromFinal(finalText, finalThinking, toolNames)
usage := buildGeminiUsage(finalPrompt, finalThinking, finalText, outputTokens)
usage := buildGeminiUsage(finalPrompt, finalThinking, finalText)
return map[string]any{
"candidates": []map[string]any{
{
@@ -174,14 +173,10 @@ func buildGeminiGenerateContentResponse(model, finalPrompt, finalThinking, final
}
//nolint:unused // retained for native Gemini non-stream handling path.
func buildGeminiUsage(finalPrompt, finalThinking, finalText string, outputTokens int) map[string]any {
func buildGeminiUsage(finalPrompt, finalThinking, finalText string) map[string]any {
promptTokens := util.EstimateTokens(finalPrompt)
reasoningTokens := util.EstimateTokens(finalThinking)
completionTokens := util.EstimateTokens(finalText)
if outputTokens > 0 {
completionTokens = outputTokens
reasoningTokens = 0
}
return map[string]any{
"promptTokenCount": promptTokens,
"candidatesTokenCount": reasoningTokens + completionTokens,


@@ -65,9 +65,8 @@ type geminiStreamRuntime struct {
stripReferenceMarkers bool
toolNames []string
thinking strings.Builder
text strings.Builder
outputTokens int
thinking strings.Builder
text strings.Builder
}
//nolint:unused // retained for native Gemini stream handling path.
@@ -112,9 +111,6 @@ func (s *geminiStreamRuntime) onParsed(parsed sse.LineResult) streamengine.Parse
if !parsed.Parsed {
return streamengine.ParsedDecision{}
}
if parsed.OutputTokens > 0 {
s.outputTokens = parsed.OutputTokens
}
if parsed.ContentFilter || parsed.ErrorMessage != "" || parsed.Stop {
return streamengine.ParsedDecision{Stop: true}
}
@@ -198,6 +194,6 @@ func (s *geminiStreamRuntime) finalize() {
},
},
"modelVersion": s.model,
"usageMetadata": buildGeminiUsage(s.finalPrompt, finalThinking, finalText, s.outputTokens),
"usageMetadata": buildGeminiUsage(s.finalPrompt, finalThinking, finalText),
})
}


@@ -37,7 +37,6 @@ type chatStreamRuntime struct {
streamToolNames map[int]string
thinking strings.Builder
text strings.Builder
outputTokens int
}
func newChatStreamRuntime(
@@ -170,12 +169,6 @@ func (s *chatStreamRuntime) finalize(finishReason string) {
finishReason = "tool_calls"
}
usage := openaifmt.BuildChatUsage(s.finalPrompt, finalThinking, finalText)
if s.outputTokens > 0 {
usage["completion_tokens"] = s.outputTokens
if prompt, ok := usage["prompt_tokens"].(int); ok {
usage["total_tokens"] = prompt + s.outputTokens
}
}
s.sendChunk(openaifmt.BuildChatStreamChunk(
s.completionID,
s.created,
@@ -190,9 +183,6 @@ func (s *chatStreamRuntime) onParsed(parsed sse.LineResult) streamengine.ParsedD
if !parsed.Parsed {
return streamengine.ParsedDecision{}
}
if parsed.OutputTokens > 0 {
s.outputTokens = parsed.OutputTokens
}
if parsed.ContentFilter {
return streamengine.ParsedDecision{Stop: true, StopReason: streamengine.StopReasonHandlerRequested}
}
@@ -243,7 +233,7 @@ func (s *chatStreamRuntime) onParsed(parsed sse.LineResult) streamengine.ParsedD
if !s.emitEarlyToolDeltas {
continue
}
filtered := filterIncrementalToolCallDeltasByAllowed(evt.ToolCallDeltas, s.toolNames, s.streamToolNames)
filtered := filterIncrementalToolCallDeltasByAllowed(evt.ToolCallDeltas, s.streamToolNames)
if len(filtered) == 0 {
continue
}


@@ -131,14 +131,6 @@ func (h *Handler) handleNonStream(w http.ResponseWriter, ctx context.Context, re
return
}
respBody := openaifmt.BuildChatCompletion(completionID, model, finalPrompt, finalThinking, finalText, toolNames)
if result.OutputTokens > 0 {
if usage, ok := respBody["usage"].(map[string]any); ok {
usage["completion_tokens"] = result.OutputTokens
if prompt, ok := usage["prompt_tokens"].(int); ok {
usage["total_tokens"] = prompt + result.OutputTokens
}
}
}
writeJSON(w, http.StatusOK, respBody)
}


@@ -113,7 +113,7 @@ func formatIncrementalStreamToolCallDeltas(deltas []toolCallDelta, ids map[int]s
return out
}
func filterIncrementalToolCallDeltasByAllowed(deltas []toolCallDelta, allowedNames []string, seenNames map[int]string) []toolCallDelta {
func filterIncrementalToolCallDeltasByAllowed(deltas []toolCallDelta, seenNames map[int]string) []toolCallDelta {
if len(deltas) == 0 {
return nil
}


@@ -275,7 +275,7 @@ func TestHandleNonStreamFencedToolCallExamplePromotesToolCall(t *testing.T) {
TestHandleNonStreamFencedToolCallExampleDoesNotPromoteToolCall(t)
}
func TestHandleNonStreamReturns502WhenUpstreamOutputEmpty(t *testing.T) {
func TestHandleNonStreamReturns429WhenUpstreamOutputEmpty(t *testing.T) {
h := &Handler{}
resp := makeSSEHTTPResponse(
`data: {"p":"response/content","v":""}`,
@@ -284,8 +284,8 @@ func TestHandleNonStreamReturns502WhenUpstreamOutputEmpty(t *testing.T) {
rec := httptest.NewRecorder()
h.handleNonStream(rec, context.Background(), resp, "cid-empty", "deepseek-chat", "prompt", false, nil)
if rec.Code != http.StatusBadGateway {
t.Fatalf("expected status 502 for empty upstream output, got %d body=%s", rec.Code, rec.Body.String())
if rec.Code != http.StatusTooManyRequests {
t.Fatalf("expected status 429 for empty upstream output, got %d body=%s", rec.Code, rec.Body.String())
}
out := decodeJSONBody(t, rec.Body.String())
errObj, _ := out["error"].(map[string]any)


@@ -22,6 +22,24 @@ func TestGetModelRouteDirectAndAlias(t *testing.T) {
}
})
t.Run("direct_expert", func(t *testing.T) {
req := httptest.NewRequest(http.MethodGet, "/v1/models/deepseek-expert-chat", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("direct_vision", func(t *testing.T) {
req := httptest.NewRequest(http.MethodGet, "/v1/models/deepseek-vision-chat", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("alias", func(t *testing.T) {
req := httptest.NewRequest(http.MethodGet, "/v1/models/gpt-4.1", nil)
rec := httptest.NewRecorder()


@@ -74,16 +74,13 @@ func TestBuildOpenAIFinalPrompt_VercelPreparePathKeepsFinalAnswerInstruction(t *
}
finalPrompt, _ := buildOpenAIFinalPrompt(messages, tools, "")
if !strings.Contains(finalPrompt, "After receiving a tool result, use it directly.") {
t.Fatalf("vercel prepare finalPrompt missing final-answer instruction: %q", finalPrompt)
}
if !strings.Contains(finalPrompt, "Only call another tool if the result is insufficient.") {
t.Fatalf("vercel prepare finalPrompt missing retry guard instruction: %q", finalPrompt)
if !strings.Contains(finalPrompt, "Remember: Output ONLY the <tool_calls>...</tool_calls> XML block when calling tools.") {
t.Fatalf("vercel prepare finalPrompt missing final tool-call anchor instruction: %q", finalPrompt)
}
if !strings.Contains(finalPrompt, "TOOL CALL FORMAT") {
t.Fatalf("vercel prepare finalPrompt missing xml format instruction: %q", finalPrompt)
}
if !strings.Contains(finalPrompt, "Do NOT wrap the XML in markdown code fences") {
if !strings.Contains(finalPrompt, "Do NOT wrap XML in markdown fences") {
t.Fatalf("vercel prepare finalPrompt missing no-fence xml instruction: %q", finalPrompt)
}
if strings.Contains(finalPrompt, "```json") {


@@ -130,14 +130,6 @@ func (h *Handler) handleResponsesNonStream(w http.ResponseWriter, resp *http.Res
}
responseObj := openaifmt.BuildResponseObject(responseID, model, finalPrompt, sanitizedThinking, sanitizedText, toolNames)
if result.OutputTokens > 0 {
if usage, ok := responseObj["usage"].(map[string]any); ok {
usage["output_tokens"] = result.OutputTokens
if input, ok := usage["input_tokens"].(int); ok {
usage["total_tokens"] = input + result.OutputTokens
}
}
}
h.getResponseStore().put(owner, responseID, responseObj)
writeJSON(w, http.StatusOK, responseObj)
}


@@ -51,7 +51,6 @@ type responsesStreamRuntime struct {
messagePartAdded bool
sequence int
failed bool
outputTokens int
persistResponse func(obj map[string]any)
}
@@ -149,14 +148,6 @@ func (s *responsesStreamRuntime) finalize() {
s.closeIncompleteFunctionItems()
obj := s.buildCompletedResponseObject(finalThinking, finalText, detected)
if s.outputTokens > 0 {
if usage, ok := obj["usage"].(map[string]any); ok {
usage["output_tokens"] = s.outputTokens
if input, ok := usage["input_tokens"].(int); ok {
usage["total_tokens"] = input + s.outputTokens
}
}
}
if s.persistResponse != nil {
s.persistResponse(obj)
}
@@ -185,9 +176,6 @@ func (s *responsesStreamRuntime) onParsed(parsed sse.LineResult) streamengine.Pa
if !parsed.Parsed {
return streamengine.ParsedDecision{}
}
if parsed.OutputTokens > 0 {
s.outputTokens = parsed.OutputTokens
}
if parsed.ContentFilter || parsed.ErrorMessage != "" || parsed.Stop {
return streamengine.ParsedDecision{Stop: true}
}


@@ -48,7 +48,7 @@ func (s *responsesStreamRuntime) processToolStreamEvents(events []toolStreamEven
if !s.emitEarlyToolDeltas {
continue
}
filtered := filterIncrementalToolCallDeltasByAllowed(evt.ToolCallDeltas, s.toolNames, s.functionNames)
filtered := filterIncrementalToolCallDeltasByAllowed(evt.ToolCallDeltas, s.functionNames)
if len(filtered) == 0 {
continue
}


@@ -627,7 +627,7 @@ func TestHandleResponsesNonStreamToolChoiceNoneStillAllowsFunctionCall(t *testin
}
}
func TestHandleResponsesNonStreamReturns502WhenUpstreamOutputEmpty(t *testing.T) {
func TestHandleResponsesNonStreamReturns429WhenUpstreamOutputEmpty(t *testing.T) {
h := &Handler{}
rec := httptest.NewRecorder()
resp := &http.Response{
@@ -639,8 +639,8 @@ func TestHandleResponsesNonStreamReturns502WhenUpstreamOutputEmpty(t *testing.T)
}
h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", false, nil, util.DefaultToolChoicePolicy(), "")
if rec.Code != http.StatusBadGateway {
t.Fatalf("expected 502 for empty upstream output, got %d body=%s", rec.Code, rec.Body.String())
if rec.Code != http.StatusTooManyRequests {
t.Fatalf("expected 429 for empty upstream output, got %d body=%s", rec.Code, rec.Body.String())
}
out := decodeJSONBody(t, rec.Body.String())
errObj, _ := out["error"].(map[string]any)


@@ -238,3 +238,97 @@ func TestChatCompletionsStreamContentFilterStopsNormallyWithoutLeak(t *testing.T
t.Fatalf("expected finish_reason=stop for content-filter upstream stop, got %#v", choice["finish_reason"])
}
}
func TestResponsesStreamUsageIgnoresBatchAccumulatedTokenUsage(t *testing.T) {
statuses := make([]int, 0, 1)
h := &Handler{
Store: mockOpenAIConfig{wideInput: true},
Auth: streamStatusAuthStub{},
DS: streamStatusDSStub{resp: makeOpenAISSEHTTPResponse(
`data: {"p":"response/content","v":"hello"}`,
`data: {"p":"response","o":"BATCH","v":[{"p":"accumulated_token_usage","v":190},{"p":"quasi_status","v":"FINISHED"}]}`,
)},
}
r := chi.NewRouter()
r.Use(captureStatusMiddleware(&statuses))
RegisterRoutes(r, h)
reqBody := `{"model":"deepseek-chat","input":"hi","stream":true}`
req := httptest.NewRequest(http.MethodPost, "/v1/responses", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
if len(statuses) != 1 || statuses[0] != http.StatusOK {
t.Fatalf("expected captured status 200, got %#v", statuses)
}
frames, done := parseSSEDataFrames(t, rec.Body.String())
if !done {
t.Fatalf("expected [DONE], body=%s", rec.Body.String())
}
if len(frames) == 0 {
t.Fatalf("expected at least one json frame, body=%s", rec.Body.String())
}
last := frames[len(frames)-1]
resp, _ := last["response"].(map[string]any)
if resp == nil {
t.Fatalf("expected response payload in final frame, got %#v", last)
}
usage, _ := resp["usage"].(map[string]any)
if usage == nil {
t.Fatalf("expected usage in response payload, got %#v", resp)
}
if got, _ := usage["output_tokens"].(float64); int(got) == 190 {
t.Fatalf("expected upstream accumulated token usage to be ignored, got %#v", usage["output_tokens"])
}
}
func TestResponsesNonStreamUsageIgnoresPromptAndOutputTokenUsage(t *testing.T) {
statuses := make([]int, 0, 1)
h := &Handler{
Store: mockOpenAIConfig{wideInput: true},
Auth: streamStatusAuthStub{},
DS: streamStatusDSStub{resp: makeOpenAISSEHTTPResponse(
`data: {"p":"response/content","v":"ok"}`,
`data: {"p":"response","o":"BATCH","v":[{"p":"token_usage","v":{"prompt_tokens":11,"completion_tokens":29}},{"p":"quasi_status","v":"FINISHED"}]}`,
)},
}
r := chi.NewRouter()
r.Use(captureStatusMiddleware(&statuses))
RegisterRoutes(r, h)
reqBody := `{"model":"deepseek-chat","input":"hi","stream":false}`
req := httptest.NewRequest(http.MethodPost, "/v1/responses", strings.NewReader(reqBody))
req.Header.Set("Authorization", "Bearer direct-token")
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
if len(statuses) != 1 || statuses[0] != http.StatusOK {
t.Fatalf("expected captured status 200, got %#v", statuses)
}
var out map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &out); err != nil {
t.Fatalf("decode response failed: %v body=%s", err, rec.Body.String())
}
usage, _ := out["usage"].(map[string]any)
if usage == nil {
t.Fatalf("expected usage object, got %#v", out)
}
input, _ := usage["input_tokens"].(float64)
output, _ := usage["output_tokens"].(float64)
total, _ := usage["total_tokens"].(float64)
if int(output) == 29 {
t.Fatalf("expected upstream completion token usage to be ignored, got %#v", usage["output_tokens"])
}
if int(total) != int(input)+int(output) {
t.Fatalf("expected total_tokens=input_tokens+output_tokens, usage=%#v", usage)
}
}


@@ -10,6 +10,6 @@ func writeUpstreamEmptyOutputError(w http.ResponseWriter, thinking, text string,
writeOpenAIErrorWithCode(w, http.StatusBadRequest, "Upstream content filtered the response and returned no output.", "content_filter")
return true
}
writeOpenAIErrorWithCode(w, http.StatusBadGateway, "Upstream model returned empty output.", "upstream_empty_output")
writeOpenAIErrorWithCode(w, http.StatusTooManyRequests, "Upstream model returned empty output.", "upstream_empty_output")
return true
}


@@ -26,9 +26,15 @@ func RegisterRoutes(r chi.Router, h *Handler) {
pr.Get("/config/export", h.configExport)
pr.Post("/keys", h.addKey)
pr.Delete("/keys/{key}", h.deleteKey)
pr.Get("/proxies", h.listProxies)
pr.Post("/proxies", h.addProxy)
pr.Put("/proxies/{proxyID}", h.updateProxy)
pr.Delete("/proxies/{proxyID}", h.deleteProxy)
pr.Post("/proxies/test", h.testProxy)
pr.Get("/accounts", h.listAccounts)
pr.Post("/accounts", h.addAccount)
pr.Delete("/accounts/{identifier}", h.deleteAccount)
pr.Put("/accounts/{identifier}/proxy", h.updateAccountProxy)
pr.Get("/queue/status", h.queueStatus)
pr.Post("/accounts/test", h.testSingleAccount)
pr.Post("/accounts/test-all", h.testAllAccounts)


@@ -68,6 +68,7 @@ func (h *Handler) listAccounts(w http.ResponseWriter, r *http.Request) {
"identifier": acc.Identifier(),
"email": acc.Email,
"mobile": acc.Mobile,
"proxy_id": acc.ProxyID,
"has_password": acc.Password != "",
"has_token": token != "",
"token_preview": preview,
@@ -86,6 +87,11 @@ func (h *Handler) addAccount(w http.ResponseWriter, r *http.Request) {
return
}
err := h.Store.Update(func(c *config.Config) error {
if acc.ProxyID != "" {
if _, ok := findProxyByID(*c, acc.ProxyID); !ok {
return fmt.Errorf("代理不存在")
}
}
mobileKey := config.CanonicalMobileKey(acc.Mobile)
for _, a := range c.Accounts {
if acc.Email != "" && a.Email == acc.Email {


@@ -15,8 +15,17 @@ import (
"ds2api/internal/config"
"ds2api/internal/deepseek"
"ds2api/internal/sse"
"ds2api/internal/util"
)
type modelAliasSnapshotReader struct {
aliases map[string]string
}
func (m modelAliasSnapshotReader) ModelAliases() map[string]string {
return m.aliases
}
func (h *Handler) testSingleAccount(w http.ResponseWriter, r *http.Request) {
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
@@ -115,10 +124,11 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
result["message"] = "登录成功但写入运行时 token 失败: " + err.Error()
return result
}
authCtx := &authn.RequestAuth{UseConfigToken: false, DeepSeekToken: token}
sessionID, err := h.DS.CreateSession(ctx, authCtx, 1)
authCtx := &authn.RequestAuth{UseConfigToken: false, DeepSeekToken: token, AccountID: identifier, Account: acc}
proxyCtx := authn.WithAuth(ctx, authCtx)
sessionID, err := h.DS.CreateSession(proxyCtx, authCtx, 1)
if err != nil {
newToken, loginErr := h.DS.Login(ctx, acc)
newToken, loginErr := h.DS.Login(proxyCtx, acc)
if loginErr != nil {
result["message"] = "创建会话失败: " + err.Error()
return result
@@ -129,7 +139,7 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
result["message"] = "刷新 token 成功但写入运行时 token 失败: " + err.Error()
return result
}
sessionID, err = h.DS.CreateSession(ctx, authCtx, 1)
sessionID, err = h.DS.CreateSession(proxyCtx, authCtx, 1)
if err != nil {
result["message"] = "创建会话失败: " + err.Error()
return result
@@ -137,7 +147,7 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
}
// Fetch the session count
sessionStats, sessionErr := h.DS.GetSessionCountForToken(ctx, token)
sessionStats, sessionErr := h.DS.GetSessionCountForToken(proxyCtx, token)
if sessionErr == nil && sessionStats != nil {
result["session_count"] = sessionStats.FirstPageCount
}
@@ -149,17 +159,28 @@ func (h *Handler) testAccount(ctx context.Context, acc config.Account, model, me
return result
}
thinking, search, ok := config.GetModelConfig(model)
resolvedModel, resolved := config.ResolveModel(modelAliasSnapshotReader{
aliases: h.Store.Snapshot().ModelAliases,
}, model)
if resolved {
model = resolvedModel
thinking, search, ok = config.GetModelConfig(model)
}
if !ok {
thinking, search = false, false
}
_ = search
pow, err := h.DS.GetPow(ctx, authCtx, 1)
pow, err := h.DS.GetPow(proxyCtx, authCtx, 1)
if err != nil {
result["message"] = "获取 PoW 失败: " + err.Error()
return result
}
payload := map[string]any{"chat_session_id": sessionID, "prompt": deepseek.MessagesPrepare([]map[string]any{{"role": "user", "content": message}}), "ref_file_ids": []any{}, "thinking_enabled": thinking, "search_enabled": search}
resp, err := h.DS.CallCompletion(ctx, authCtx, payload, pow, 1)
payload := util.StandardRequest{
ResolvedModel: model,
FinalPrompt: deepseek.MessagesPrepare([]map[string]any{{"role": "user", "content": message}}),
Thinking: thinking,
Search: search,
}.CompletionPayload(sessionID)
resp, err := h.DS.CallCompletion(proxyCtx, authCtx, payload, pow, 1)
if err != nil {
result["message"] = "请求失败: " + err.Error()
return result
@@ -244,25 +265,29 @@ func (h *Handler) deleteAllSessions(w http.ResponseWriter, r *http.Request) {
}
// Log in first on every run to refresh the token, avoiding an expired one.
token, err := h.DS.Login(r.Context(), acc)
authCtx := &authn.RequestAuth{UseConfigToken: false, AccountID: acc.Identifier(), Account: acc}
proxyCtx := authn.WithAuth(r.Context(), authCtx)
token, err := h.DS.Login(proxyCtx, acc)
if err != nil {
writeJSON(w, http.StatusOK, map[string]any{"success": false, "message": "登录失败: " + err.Error()})
return
}
_ = h.Store.UpdateAccountToken(acc.Identifier(), token)
authCtx.DeepSeekToken = token
// Delete all sessions
err = h.DS.DeleteAllSessionsForToken(r.Context(), token)
err = h.DS.DeleteAllSessionsForToken(proxyCtx, token)
if err != nil {
// The token may have expired; re-login and retry once
newToken, loginErr := h.DS.Login(r.Context(), acc)
newToken, loginErr := h.DS.Login(proxyCtx, acc)
if loginErr != nil {
writeJSON(w, http.StatusOK, map[string]any{"success": false, "message": "删除失败: " + err.Error()})
return
}
token = newToken
_ = h.Store.UpdateAccountToken(acc.Identifier(), token)
if retryErr := h.DS.DeleteAllSessionsForToken(r.Context(), token); retryErr != nil {
authCtx.DeepSeekToken = token
if retryErr := h.DS.DeleteAllSessionsForToken(proxyCtx, token); retryErr != nil {
writeJSON(w, http.StatusOK, map[string]any{"success": false, "message": "删除失败: " + retryErr.Error()})
return
}

View File

@@ -5,6 +5,7 @@ import (
"context"
"encoding/json"
"errors"
"io"
"net/http"
"net/http/httptest"
"strings"
@@ -133,3 +134,78 @@ func TestDeleteAllSessions_RetryWithReloginOnDeleteFailure(t *testing.T) {
t.Fatalf("expected refreshed token persisted, got %q", updated.Token)
}
}
type completionPayloadDSMock struct {
payload map[string]any
}
func (m *completionPayloadDSMock) Login(_ context.Context, _ config.Account) (string, error) {
return "new-token", nil
}
func (m *completionPayloadDSMock) CreateSession(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
return "session-id", nil
}
func (m *completionPayloadDSMock) GetPow(_ context.Context, _ *auth.RequestAuth, _ int) (string, error) {
return "pow-ok", nil
}
func (m *completionPayloadDSMock) CallCompletion(_ context.Context, _ *auth.RequestAuth, payload map[string]any, _ string, _ int) (*http.Response, error) {
m.payload = payload
return &http.Response{
StatusCode: http.StatusOK,
Body: io.NopCloser(strings.NewReader("data: {\"v\":\"ok\"}\n\ndata: [DONE]\n\n")),
}, nil
}
func (m *completionPayloadDSMock) DeleteAllSessionsForToken(_ context.Context, _ string) error {
return nil
}
func (m *completionPayloadDSMock) GetSessionCountForToken(_ context.Context, _ string) (*deepseek.SessionStats, error) {
return &deepseek.SessionStats{Success: true}, nil
}
func TestTestAccount_MessageModeUsesExpertModelTypeForExpertModel(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"accounts":[{"email":"batch@example.com","password":"pwd","token":"seed-token"}]}`)
store := config.LoadStore()
ds := &completionPayloadDSMock{}
h := &Handler{Store: store, DS: ds}
acc, ok := store.FindAccount("batch@example.com")
if !ok {
t.Fatal("expected test account")
}
result := h.testAccount(context.Background(), acc, "deepseek-expert-chat", "hello")
if ok, _ := result["success"].(bool); !ok {
t.Fatalf("expected success=true, got %#v", result)
}
if got := ds.payload["model_type"]; got != "expert" {
t.Fatalf("expected model_type expert, got %#v", got)
}
if got := ds.payload["chat_session_id"]; got != "session-id" {
t.Fatalf("unexpected chat_session_id: %#v", got)
}
}
func TestTestAccount_MessageModeUsesVisionModelTypeForVisionModel(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"accounts":[{"email":"batch@example.com","password":"pwd","token":"seed-token"}]}`)
store := config.LoadStore()
ds := &completionPayloadDSMock{}
h := &Handler{Store: store, DS: ds}
acc, ok := store.FindAccount("batch@example.com")
if !ok {
t.Fatal("expected test account")
}
result := h.testAccount(context.Background(), acc, "deepseek-vision-chat", "hello")
if ok, _ := result["success"].(bool); !ok {
t.Fatalf("expected success=true, got %#v", result)
}
if got := ds.payload["model_type"]; got != "vision" {
t.Fatalf("expected model_type vision, got %#v", got)
}
}

View File

@@ -3,6 +3,8 @@ package admin
import (
"net/http"
"strings"
"ds2api/internal/config"
)
func (h *Handler) getConfig(w http.ResponseWriter, _ *http.Request) {
@@ -10,6 +12,7 @@ func (h *Handler) getConfig(w http.ResponseWriter, _ *http.Request) {
safe := map[string]any{
"keys": snap.Keys,
"accounts": []map[string]any{},
"proxies": []map[string]any{},
"env_backed": h.Store.IsEnvBacked(),
"env_source_present": h.Store.HasEnvConfigSource(),
"env_writeback_enabled": h.Store.IsEnvWritebackEnabled(),
@@ -36,12 +39,27 @@ func (h *Handler) getConfig(w http.ResponseWriter, _ *http.Request) {
"identifier": acc.Identifier(),
"email": acc.Email,
"mobile": acc.Mobile,
"proxy_id": acc.ProxyID,
"has_password": strings.TrimSpace(acc.Password) != "",
"has_token": token != "",
"token_preview": preview,
})
}
safe["accounts"] = accounts
proxies := make([]map[string]any, 0, len(snap.Proxies))
for _, proxy := range snap.Proxies {
proxy = config.NormalizeProxy(proxy)
proxies = append(proxies, map[string]any{
"id": proxy.ID,
"name": proxy.Name,
"type": proxy.Type,
"host": proxy.Host,
"port": proxy.Port,
"username": proxy.Username,
"has_password": strings.TrimSpace(proxy.Password) != "",
})
}
safe["proxies"] = proxies
writeJSON(w, http.StatusOK, safe)
}

View File

@@ -0,0 +1,202 @@
package admin
import (
"context"
"encoding/json"
"net/http"
"net/url"
"strings"
"github.com/go-chi/chi/v5"
"ds2api/internal/config"
"ds2api/internal/deepseek"
)
var proxyConnectivityTester = func(ctx context.Context, proxy config.Proxy) map[string]any {
return deepseek.TestProxyConnectivity(ctx, proxy)
}
func validateProxyMutation(cfg *config.Config) error {
if cfg == nil {
return nil
}
if err := config.ValidateProxyConfig(cfg.Proxies); err != nil {
return err
}
return config.ValidateAccountProxyReferences(cfg.Accounts, cfg.Proxies)
}
func proxyResponse(proxy config.Proxy) map[string]any {
proxy = config.NormalizeProxy(proxy)
return map[string]any{
"id": proxy.ID,
"name": proxy.Name,
"type": proxy.Type,
"host": proxy.Host,
"port": proxy.Port,
"username": proxy.Username,
"has_password": strings.TrimSpace(proxy.Password) != "",
}
}
func (h *Handler) listProxies(w http.ResponseWriter, _ *http.Request) {
proxies := h.Store.Snapshot().Proxies
items := make([]map[string]any, 0, len(proxies))
for _, proxy := range proxies {
proxy = config.NormalizeProxy(proxy)
items = append(items, map[string]any{
"id": proxy.ID,
"name": proxy.Name,
"type": proxy.Type,
"host": proxy.Host,
"port": proxy.Port,
"username": proxy.Username,
"has_password": strings.TrimSpace(proxy.Password) != "",
})
}
writeJSON(w, http.StatusOK, map[string]any{"items": items, "total": len(items)})
}
func (h *Handler) addProxy(w http.ResponseWriter, r *http.Request) {
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
proxy := toProxy(req)
err := h.Store.Update(func(c *config.Config) error {
c.Proxies = append(c.Proxies, proxy)
return validateProxyMutation(c)
})
if err != nil {
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": err.Error()})
return
}
writeJSON(w, http.StatusOK, map[string]any{"success": true, "proxy": proxyResponse(proxy)})
}
func (h *Handler) updateProxy(w http.ResponseWriter, r *http.Request) {
proxyID := chi.URLParam(r, "proxyID")
if decoded, err := url.PathUnescape(proxyID); err == nil {
proxyID = decoded
}
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
proxy := toProxy(req)
proxy.ID = strings.TrimSpace(proxyID)
err := h.Store.Update(func(c *config.Config) error {
for i, existing := range c.Proxies {
existing = config.NormalizeProxy(existing)
if existing.ID != proxy.ID {
continue
}
if proxy.Password == "" {
proxy.Password = existing.Password
}
c.Proxies[i] = proxy
return validateProxyMutation(c)
}
return newRequestError("代理不存在")
})
if err != nil {
if detail, ok := requestErrorDetail(err); ok {
writeJSON(w, http.StatusNotFound, map[string]any{"detail": detail})
return
}
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": err.Error()})
return
}
writeJSON(w, http.StatusOK, map[string]any{"success": true, "proxy": proxyResponse(proxy)})
}
func (h *Handler) deleteProxy(w http.ResponseWriter, r *http.Request) {
proxyID := chi.URLParam(r, "proxyID")
if decoded, err := url.PathUnescape(proxyID); err == nil {
proxyID = decoded
}
err := h.Store.Update(func(c *config.Config) error {
idx := -1
for i, existing := range c.Proxies {
existing = config.NormalizeProxy(existing)
if existing.ID == strings.TrimSpace(proxyID) {
idx = i
break
}
}
if idx < 0 {
return newRequestError("代理不存在")
}
c.Proxies = append(c.Proxies[:idx], c.Proxies[idx+1:]...)
for i := range c.Accounts {
if strings.TrimSpace(c.Accounts[i].ProxyID) == strings.TrimSpace(proxyID) {
c.Accounts[i].ProxyID = ""
}
}
return validateProxyMutation(c)
})
if err != nil {
if detail, ok := requestErrorDetail(err); ok {
writeJSON(w, http.StatusNotFound, map[string]any{"detail": detail})
return
}
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": err.Error()})
return
}
writeJSON(w, http.StatusOK, map[string]any{"success": true})
}
func (h *Handler) testProxy(w http.ResponseWriter, r *http.Request) {
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
proxyID := fieldString(req, "proxy_id")
var proxy config.Proxy
if proxyID != "" {
var ok bool
proxy, ok = findProxyByID(h.Store.Snapshot(), proxyID)
if !ok {
writeJSON(w, http.StatusNotFound, map[string]any{"detail": "代理不存在"})
return
}
} else {
proxy = toProxy(req)
}
result := proxyConnectivityTester(r.Context(), proxy)
writeJSON(w, http.StatusOK, result)
}
func (h *Handler) updateAccountProxy(w http.ResponseWriter, r *http.Request) {
identifier := chi.URLParam(r, "identifier")
if decoded, err := url.PathUnescape(identifier); err == nil {
identifier = decoded
}
var req map[string]any
_ = json.NewDecoder(r.Body).Decode(&req)
proxyID := fieldString(req, "proxy_id")
err := h.Store.Update(func(c *config.Config) error {
if proxyID != "" {
if _, ok := findProxyByID(*c, proxyID); !ok {
return newRequestError("代理不存在")
}
}
for i, acc := range c.Accounts {
if !accountMatchesIdentifier(acc, identifier) {
continue
}
c.Accounts[i].ProxyID = proxyID
return validateProxyMutation(c)
}
return newRequestError("账号不存在")
})
if err != nil {
if detail, ok := requestErrorDetail(err); ok {
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": detail})
return
}
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": err.Error()})
return
}
h.Pool.Reset()
writeJSON(w, http.StatusOK, map[string]any{"success": true, "proxy_id": proxyID})
}

View File

@@ -0,0 +1,227 @@
package admin
import (
"bytes"
"context"
"encoding/json"
"net/http"
"net/http/httptest"
"testing"
"github.com/go-chi/chi/v5"
"ds2api/internal/account"
"ds2api/internal/config"
)
func newAdminProxyTestHandler(t *testing.T, raw string) *Handler {
t.Helper()
t.Setenv("DS2API_CONFIG_JSON", raw)
store := config.LoadStore()
return &Handler{
Store: store,
Pool: account.NewPool(store),
}
}
func TestAddProxyPersistsNormalizedProxy(t *testing.T) {
h := newAdminProxyTestHandler(t, `{"accounts":[]}`)
r := chi.NewRouter()
r.Post("/admin/proxies", h.addProxy)
req := httptest.NewRequest(http.MethodPost, "/admin/proxies", bytes.NewBufferString(`{
"name":" HK Exit ",
"type":" SOCKS5H ",
"host":" 127.0.0.1 ",
"port":1081,
"username":" user ",
"password":" pass "
}`))
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("unexpected status: %d body=%s", rec.Code, rec.Body.String())
}
proxies := h.Store.Snapshot().Proxies
if len(proxies) != 1 {
t.Fatalf("expected 1 proxy, got %d", len(proxies))
}
if proxies[0].Name != "HK Exit" {
t.Fatalf("unexpected proxy name: %#v", proxies[0])
}
if proxies[0].Type != "socks5h" {
t.Fatalf("unexpected proxy type: %#v", proxies[0])
}
if proxies[0].Username != "user" || proxies[0].Password != "pass" {
t.Fatalf("expected trimmed credentials, got %#v", proxies[0])
}
if proxies[0].ID == "" {
t.Fatalf("expected generated proxy id, got %#v", proxies[0])
}
}
func TestAddProxyDoesNotFailOnUnrelatedInvalidRuntimeConfig(t *testing.T) {
router := newHTTPAdminHarness(t, `{
"keys":["k1"],
"runtime":{
"account_max_inflight":8,
"global_max_inflight":4
}
}`, &testingDSMock{})
rec := httptest.NewRecorder()
router.ServeHTTP(rec, adminReq(http.MethodPost, "/proxies", []byte(`{
"name":"HK Exit",
"type":"socks5h",
"host":"127.0.0.1",
"port":1080
}`)))
if rec.Code != http.StatusOK {
t.Fatalf("expected add proxy success despite unrelated runtime issue, got %d body=%s", rec.Code, rec.Body.String())
}
readRec := httptest.NewRecorder()
router.ServeHTTP(readRec, adminReq(http.MethodGet, "/config", nil))
if readRec.Code != http.StatusOK {
t.Fatalf("config read status=%d body=%s", readRec.Code, readRec.Body.String())
}
var payload map[string]any
if err := json.Unmarshal(readRec.Body.Bytes(), &payload); err != nil {
t.Fatalf("decode config response: %v", err)
}
proxies, _ := payload["proxies"].([]any)
if len(proxies) != 1 {
t.Fatalf("expected proxy to be persisted, got %#v", payload["proxies"])
}
}
func TestDeleteProxyClearsAssignedAccountProxyID(t *testing.T) {
h := newAdminProxyTestHandler(t, `{
"proxies":[{"id":"proxy-1","name":"Node 1","type":"socks5","host":"127.0.0.1","port":1080}],
"accounts":[{"email":"u@example.com","password":"pwd","proxy_id":"proxy-1"}]
}`)
r := chi.NewRouter()
r.Delete("/admin/proxies/{proxyID}", h.deleteProxy)
req := httptest.NewRequest(http.MethodDelete, "/admin/proxies/proxy-1", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("unexpected status: %d body=%s", rec.Code, rec.Body.String())
}
snap := h.Store.Snapshot()
if len(snap.Proxies) != 0 {
t.Fatalf("expected proxy removed, got %#v", snap.Proxies)
}
if len(snap.Accounts) != 1 {
t.Fatalf("expected account kept, got %#v", snap.Accounts)
}
if snap.Accounts[0].ProxyID != "" {
t.Fatalf("expected proxy assignment cleared, got %#v", snap.Accounts[0])
}
}
func TestUpdateProxyResponseDoesNotExposeStoredPassword(t *testing.T) {
h := newAdminProxyTestHandler(t, `{
"proxies":[{"id":"proxy-1","name":"Node 1","type":"socks5h","host":"127.0.0.1","port":1080,"username":"u","password":"secret"}]
}`)
r := chi.NewRouter()
r.Put("/admin/proxies/{proxyID}", h.updateProxy)
req := httptest.NewRequest(http.MethodPut, "/admin/proxies/proxy-1", bytes.NewBufferString(`{
"name":"Node 1",
"type":"socks5h",
"host":"127.0.0.2",
"port":1081,
"username":"u2"
}`))
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("unexpected status: %d body=%s", rec.Code, rec.Body.String())
}
var payload map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &payload); err != nil {
t.Fatalf("decode response: %v", err)
}
proxy, _ := payload["proxy"].(map[string]any)
if _, exists := proxy["password"]; exists {
t.Fatalf("response should not expose password, got %#v", proxy)
}
if hasPassword, _ := proxy["has_password"].(bool); !hasPassword {
t.Fatalf("expected has_password=true, got %#v", proxy)
}
}
func TestUpdateAccountProxyAssignsProxyID(t *testing.T) {
h := newAdminProxyTestHandler(t, `{
"proxies":[{"id":"proxy-1","name":"Node 1","type":"socks5h","host":"127.0.0.1","port":1080}],
"accounts":[{"email":"u@example.com","password":"pwd"}]
}`)
r := chi.NewRouter()
r.Put("/admin/accounts/{identifier}/proxy", h.updateAccountProxy)
req := httptest.NewRequest(http.MethodPut, "/admin/accounts/u@example.com/proxy", bytes.NewBufferString(`{"proxy_id":"proxy-1"}`))
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("unexpected status: %d body=%s", rec.Code, rec.Body.String())
}
acc, ok := h.Store.FindAccount("u@example.com")
if !ok {
t.Fatal("expected account")
}
if acc.ProxyID != "proxy-1" {
t.Fatalf("expected proxy assigned, got %#v", acc)
}
}
func TestTestProxyUsesStoredProxy(t *testing.T) {
h := newAdminProxyTestHandler(t, `{
"proxies":[{"id":"proxy-1","name":"Node 1","type":"socks5h","host":"127.0.0.1","port":1080}]
}`)
original := proxyConnectivityTester
defer func() { proxyConnectivityTester = original }()
var got config.Proxy
proxyConnectivityTester = func(_ context.Context, proxy config.Proxy) map[string]any {
got = proxy
return map[string]any{
"success": true,
"proxy_id": proxy.ID,
"proxy_type": proxy.Type,
"response_time": 12,
}
}
r := chi.NewRouter()
r.Post("/admin/proxies/test", h.testProxy)
req := httptest.NewRequest(http.MethodPost, "/admin/proxies/test", bytes.NewBufferString(`{"proxy_id":"proxy-1"}`))
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("unexpected status: %d body=%s", rec.Code, rec.Body.String())
}
if got.ID != "proxy-1" || got.Type != "socks5h" {
t.Fatalf("expected stored proxy passed to tester, got %#v", got)
}
var payload map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &payload); err != nil {
t.Fatalf("decode response: %v", err)
}
if ok, _ := payload["success"].(bool); !ok {
t.Fatalf("expected success payload, got %#v", payload)
}
}

View File

@@ -65,6 +65,7 @@ func toAccount(m map[string]any) config.Account {
Email: email,
Mobile: mobile,
Password: fieldString(m, "password"),
ProxyID: fieldString(m, "proxy_id"),
}
}
@@ -100,9 +101,36 @@ func accountMatchesIdentifier(acc config.Account, identifier string) bool {
func normalizeAccountForStorage(acc config.Account) config.Account {
acc.Email = strings.TrimSpace(acc.Email)
acc.Mobile = config.NormalizeMobileForStorage(acc.Mobile)
acc.ProxyID = strings.TrimSpace(acc.ProxyID)
return acc
}
func toProxy(m map[string]any) config.Proxy {
return config.NormalizeProxy(config.Proxy{
ID: fieldString(m, "id"),
Name: fieldString(m, "name"),
Type: fieldString(m, "type"),
Host: fieldString(m, "host"),
Port: intFrom(m["port"]),
Username: fieldString(m, "username"),
Password: fieldString(m, "password"),
})
}
func findProxyByID(c config.Config, proxyID string) (config.Proxy, bool) {
id := strings.TrimSpace(proxyID)
if id == "" {
return config.Proxy{}, false
}
for _, proxy := range c.Proxies {
proxy = config.NormalizeProxy(proxy)
if proxy.ID == id {
return proxy, true
}
}
return config.Proxy{}, false
}
func accountDedupeKey(acc config.Account) string {
if email := strings.TrimSpace(acc.Email); email != "" {
return "email:" + email

View File

@@ -37,7 +37,6 @@ func TestGoCompatSSEFixtures(t *testing.T) {
Finished bool `json:"finished"`
NewType string `json:"new_type"`
ContentFilter bool `json:"content_filter"`
OutputTokens int `json:"output_tokens"`
ErrorMessage string `json:"error_message"`
}
mustLoadJSON(t, expectedPath, &expected)
@@ -58,11 +57,10 @@ func TestGoCompatSSEFixtures(t *testing.T) {
res.Stop != expected.Finished ||
res.NextType != expected.NewType ||
res.ContentFilter != expected.ContentFilter ||
res.OutputTokens != expected.OutputTokens ||
res.ErrorMessage != expected.ErrorMessage {
t.Fatalf("fixture %s mismatch:\n got parts=%#v finished=%v newType=%q contentFilter=%v outputTokens=%d errorMessage=%q\nwant parts=%#v finished=%v newType=%q contentFilter=%v outputTokens=%d errorMessage=%q",
name, gotParts, res.Stop, res.NextType, res.ContentFilter, res.OutputTokens, res.ErrorMessage,
expected.Parts, expected.Finished, expected.NewType, expected.ContentFilter, expected.OutputTokens, expected.ErrorMessage)
t.Fatalf("fixture %s mismatch:\n got parts=%#v finished=%v newType=%q contentFilter=%v errorMessage=%q\nwant parts=%#v finished=%v newType=%q contentFilter=%v errorMessage=%q",
name, gotParts, res.Stop, res.NextType, res.ContentFilter, res.ErrorMessage,
expected.Parts, expected.Finished, expected.NewType, expected.ContentFilter, expected.ErrorMessage)
}
}
}

View File

@@ -20,6 +20,9 @@ func (c Config) MarshalJSON() ([]byte, error) {
if len(c.Accounts) > 0 {
m["accounts"] = c.Accounts
}
if len(c.Proxies) > 0 {
m["proxies"] = c.Proxies
}
if len(c.ClaudeMapping) > 0 {
m["claude_mapping"] = c.ClaudeMapping
}
@@ -70,6 +73,10 @@ func (c *Config) UnmarshalJSON(b []byte) error {
if err := json.Unmarshal(v, &c.Accounts); err != nil {
return fmt.Errorf("invalid field %q: %w", k, err)
}
case "proxies":
if err := json.Unmarshal(v, &c.Proxies); err != nil {
return fmt.Errorf("invalid field %q: %w", k, err)
}
case "claude_mapping":
if err := json.Unmarshal(v, &c.ClaudeMapping); err != nil {
return fmt.Errorf("invalid field %q: %w", k, err)
@@ -130,6 +137,7 @@ func (c Config) Clone() Config {
clone := Config{
Keys: slices.Clone(c.Keys),
Accounts: slices.Clone(c.Accounts),
Proxies: slices.Clone(c.Proxies),
ClaudeMapping: cloneStringMap(c.ClaudeMapping),
ClaudeModelMap: cloneStringMap(c.ClaudeModelMap),
ModelAliases: cloneStringMap(c.ModelAliases),

View File

@@ -1,8 +1,16 @@
package config
import (
"crypto/sha1"
"encoding/hex"
"fmt"
"strings"
)
type Config struct {
Keys []string `json:"keys,omitempty"`
Accounts []Account `json:"accounts,omitempty"`
Proxies []Proxy `json:"proxies,omitempty"`
ClaudeMapping map[string]string `json:"claude_mapping,omitempty"`
ClaudeModelMap map[string]string `json:"claude_model_mapping,omitempty"`
ModelAliases map[string]string `json:"model_aliases,omitempty"`
@@ -22,6 +30,38 @@ type Account struct {
Mobile string `json:"mobile,omitempty"`
Password string `json:"password,omitempty"`
Token string `json:"token,omitempty"`
ProxyID string `json:"proxy_id,omitempty"`
}
type Proxy struct {
ID string `json:"id,omitempty"`
Name string `json:"name,omitempty"`
Type string `json:"type,omitempty"`
Host string `json:"host,omitempty"`
Port int `json:"port,omitempty"`
Username string `json:"username,omitempty"`
Password string `json:"password,omitempty"`
}
func NormalizeProxy(p Proxy) Proxy {
p.ID = strings.TrimSpace(p.ID)
p.Name = strings.TrimSpace(p.Name)
p.Type = strings.ToLower(strings.TrimSpace(p.Type))
p.Host = strings.TrimSpace(p.Host)
p.Username = strings.TrimSpace(p.Username)
p.Password = strings.TrimSpace(p.Password)
if p.ID == "" {
p.ID = StableProxyID(p)
}
if p.Name == "" && p.Host != "" && p.Port > 0 {
p.Name = fmt.Sprintf("%s:%d", p.Host, p.Port)
}
return p
}
func StableProxyID(p Proxy) string {
sum := sha1.Sum([]byte(strings.ToLower(strings.TrimSpace(p.Type)) + "|" + strings.ToLower(strings.TrimSpace(p.Host)) + "|" + fmt.Sprintf("%d", p.Port) + "|" + strings.TrimSpace(p.Username)))
return "proxy_" + hex.EncodeToString(sum[:6])
}
func (c *Config) ClearAccountTokens() {

View File

@@ -49,6 +49,51 @@ func TestGetModelConfigDeepSeekReasonerSearch(t *testing.T) {
}
}
func TestGetModelConfigDeepSeekExpertChat(t *testing.T) {
thinking, search, ok := GetModelConfig("deepseek-expert-chat")
if !ok {
t.Fatal("expected ok for deepseek-expert-chat")
}
if thinking || search {
t.Fatalf("expected no thinking/search for deepseek-expert-chat, got thinking=%v search=%v", thinking, search)
}
}
func TestGetModelConfigDeepSeekExpertReasonerSearch(t *testing.T) {
thinking, search, ok := GetModelConfig("deepseek-expert-reasoner-search")
if !ok {
t.Fatal("expected ok for deepseek-expert-reasoner-search")
}
if !thinking || !search {
t.Fatalf("expected both true, got thinking=%v search=%v", thinking, search)
}
}
func TestGetModelConfigDeepSeekVisionReasonerSearch(t *testing.T) {
thinking, search, ok := GetModelConfig("deepseek-vision-reasoner-search")
if !ok {
t.Fatal("expected ok for deepseek-vision-reasoner-search")
}
if !thinking || !search {
t.Fatalf("expected both true, got thinking=%v search=%v", thinking, search)
}
}
func TestGetModelTypeDefaultExpertAndVision(t *testing.T) {
defaultType, ok := GetModelType("deepseek-chat")
if !ok || defaultType != "default" {
t.Fatalf("expected default model_type, got ok=%v model_type=%q", ok, defaultType)
}
expertType, ok := GetModelType("deepseek-expert-chat")
if !ok || expertType != "expert" {
t.Fatalf("expected expert model_type, got ok=%v model_type=%q", ok, expertType)
}
visionType, ok := GetModelType("deepseek-vision-chat")
if !ok || visionType != "vision" {
t.Fatalf("expected vision model_type, got ok=%v model_type=%q", ok, visionType)
}
}
func TestGetModelConfigCaseInsensitive(t *testing.T) {
thinking, search, ok := GetModelConfig("DeepSeek-Chat")
if !ok {
@@ -551,6 +596,30 @@ func TestOpenAIModelsResponse(t *testing.T) {
if len(data) == 0 {
t.Fatal("expected non-empty models list")
}
expected := map[string]bool{
"deepseek-chat": false,
"deepseek-reasoner": false,
"deepseek-chat-search": false,
"deepseek-reasoner-search": false,
"deepseek-expert-chat": false,
"deepseek-expert-reasoner": false,
"deepseek-expert-chat-search": false,
"deepseek-expert-reasoner-search": false,
"deepseek-vision-chat": false,
"deepseek-vision-reasoner": false,
"deepseek-vision-chat-search": false,
"deepseek-vision-reasoner-search": false,
}
for _, model := range data {
if _, ok := expected[model.ID]; ok {
expected[model.ID] = true
}
}
for id, seen := range expected {
if !seen {
t.Fatalf("expected OpenAI model list to include %s", id)
}
}
}
func TestClaudeModelsResponse(t *testing.T) {

View File

@@ -32,6 +32,47 @@ func TestLoadStoreClearsTokensFromConfigInput(t *testing.T) {
}
}
func TestLoadStorePreservesProxiesAndAccountProxyAssignment(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{
"proxies":[
{
"id":"proxy-sh-1",
"name":"Shanghai Exit",
"type":"socks5h",
"host":"127.0.0.1",
"port":1080,
"username":"demo",
"password":"secret"
}
],
"accounts":[
{
"email":"u@example.com",
"password":"p",
"proxy_id":"proxy-sh-1"
}
]
}`)
store := LoadStore()
snap := store.Snapshot()
if len(snap.Proxies) != 1 {
t.Fatalf("expected 1 proxy, got %d", len(snap.Proxies))
}
if snap.Proxies[0].ID != "proxy-sh-1" {
t.Fatalf("unexpected proxy id: %#v", snap.Proxies[0])
}
if snap.Proxies[0].Type != "socks5h" {
t.Fatalf("unexpected proxy type: %#v", snap.Proxies[0])
}
if len(snap.Accounts) != 1 {
t.Fatalf("expected 1 account, got %d", len(snap.Accounts))
}
if snap.Accounts[0].ProxyID != "proxy-sh-1" {
t.Fatalf("expected account proxy assignment preserved, got %#v", snap.Accounts[0])
}
}
func TestLoadStoreDropsLegacyTokenOnlyAccounts(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{
"accounts":[

View File

@@ -2,6 +2,10 @@ package config
import "testing"
type mockModelAliasReader map[string]string
func (m mockModelAliasReader) ModelAliases() map[string]string { return m }
func TestResolveModelDirectDeepSeek(t *testing.T) {
got, ok := ResolveModel(nil, "deepseek-chat")
if !ok || got != "deepseek-chat" {
@@ -30,6 +34,31 @@ func TestResolveModelUnknown(t *testing.T) {
}
}
func TestResolveModelDirectDeepSeekExpert(t *testing.T) {
got, ok := ResolveModel(nil, "deepseek-expert-chat")
if !ok || got != "deepseek-expert-chat" {
t.Fatalf("expected deepseek-expert-chat, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelCustomAliasToExpert(t *testing.T) {
got, ok := ResolveModel(mockModelAliasReader{
"my-expert-model": "deepseek-expert-reasoner-search",
}, "my-expert-model")
if !ok || got != "deepseek-expert-reasoner-search" {
t.Fatalf("expected alias -> deepseek-expert-reasoner-search, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelCustomAliasToVision(t *testing.T) {
got, ok := ResolveModel(mockModelAliasReader{
"my-vision-model": "deepseek-vision-chat-search",
}, "my-vision-model")
if !ok || got != "deepseek-vision-chat-search" {
t.Fatalf("expected alias -> deepseek-vision-chat-search, got ok=%v model=%q", ok, got)
}
}
func TestClaudeModelsResponsePaginationFields(t *testing.T) {
resp := ClaudeModelsResponse()
if _, ok := resp["first_id"]; !ok {

View File

@@ -19,6 +19,14 @@ var DeepSeekModels = []ModelInfo{
{ID: "deepseek-reasoner", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-chat-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-reasoner-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-chat", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-reasoner", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-chat-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-expert-reasoner-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-chat", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-reasoner", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-chat-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
{ID: "deepseek-vision-reasoner-search", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
}
var ClaudeModels = []ModelInfo{
@@ -72,11 +80,40 @@ func GetModelConfig(model string) (thinking bool, search bool, ok bool) {
return false, true, true
case "deepseek-reasoner-search":
return true, true, true
case "deepseek-expert-chat":
return false, false, true
case "deepseek-expert-reasoner":
return true, false, true
case "deepseek-expert-chat-search":
return false, true, true
case "deepseek-expert-reasoner-search":
return true, true, true
case "deepseek-vision-chat":
return false, false, true
case "deepseek-vision-reasoner":
return true, false, true
case "deepseek-vision-chat-search":
return false, true, true
case "deepseek-vision-reasoner-search":
return true, true, true
default:
return false, false, false
}
}
func GetModelType(model string) (modelType string, ok bool) {
switch lower(model) {
case "deepseek-chat", "deepseek-reasoner", "deepseek-chat-search", "deepseek-reasoner-search":
return "default", true
case "deepseek-expert-chat", "deepseek-expert-reasoner", "deepseek-expert-chat-search", "deepseek-expert-reasoner-search":
return "expert", true
case "deepseek-vision-chat", "deepseek-vision-reasoner", "deepseek-vision-chat-search", "deepseek-vision-reasoner-search":
return "vision", true
default:
return "", false
}
}
func IsSupportedDeepSeekModel(model string) bool {
_, _, ok := GetModelConfig(model)
return ok

View File

@@ -33,10 +33,6 @@ func ConfigPath() string {
return ResolvePath("DS2API_CONFIG_PATH", "config.json")
}
func WASMPath() string {
return ResolvePath("DS2API_WASM_PATH", "sha3_wasm_bg.7b9ca65ddd.wasm")
}
func RawStreamSampleRoot() string {
return ResolvePath("DS2API_RAW_STREAM_SAMPLE_ROOT", "tests/raw_stream_samples")
}

View File

@@ -6,6 +6,9 @@ import (
)
func ValidateConfig(c Config) error {
if err := ValidateProxyConfig(c.Proxies); err != nil {
return err
}
if err := ValidateAdminConfig(c.Admin); err != nil {
return err
}
@@ -21,6 +24,55 @@ func ValidateConfig(c Config) error {
if err := ValidateAutoDeleteConfig(c.AutoDelete); err != nil {
return err
}
if err := ValidateAccountProxyReferences(c.Accounts, c.Proxies); err != nil {
return err
}
return nil
}
func ValidateProxyConfig(proxies []Proxy) error {
seen := make(map[string]struct{}, len(proxies))
for _, proxy := range proxies {
proxy = NormalizeProxy(proxy)
if err := ValidateTrimmedString("proxies.id", proxy.ID, true); err != nil {
return err
}
switch proxy.Type {
case "socks5", "socks5h":
default:
return fmt.Errorf("proxies.type must be one of socks5, socks5h")
}
if err := ValidateTrimmedString("proxies.host", proxy.Host, true); err != nil {
return err
}
if err := ValidateIntRange("proxies.port", proxy.Port, 1, 65535, true); err != nil {
return err
}
if _, ok := seen[proxy.ID]; ok {
return fmt.Errorf("duplicate proxy id: %s", proxy.ID)
}
seen[proxy.ID] = struct{}{}
}
return nil
}
func ValidateAccountProxyReferences(accounts []Account, proxies []Proxy) error {
if len(accounts) == 0 {
return nil
}
ids := make(map[string]struct{}, len(proxies))
for _, proxy := range proxies {
ids[NormalizeProxy(proxy).ID] = struct{}{}
}
for _, acc := range accounts {
proxyID := strings.TrimSpace(acc.ProxyID)
if proxyID == "" {
continue
}
if _, ok := ids[proxyID]; !ok {
return fmt.Errorf("account proxy_id references unknown proxy: %s", proxyID)
}
}
return nil
}

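The reference check in `ValidateAccountProxyReferences` above can be read in isolation as: collect the set of known proxy ids, then reject any account whose non-blank `proxy_id` is not in that set. A pared-down sketch of the same logic (the real function additionally runs `NormalizeProxy` on each proxy before collecting ids, which this sketch skips):

```go
package main

import (
	"fmt"
	"strings"
)

type Account struct{ ProxyID string }
type Proxy struct{ ID string }

// validateAccountProxyReferences rejects accounts whose proxy_id points at
// a proxy that does not exist; blank assignments are always allowed.
func validateAccountProxyReferences(accounts []Account, proxies []Proxy) error {
	ids := make(map[string]struct{}, len(proxies))
	for _, p := range proxies {
		ids[p.ID] = struct{}{}
	}
	for _, acc := range accounts {
		proxyID := strings.TrimSpace(acc.ProxyID)
		if proxyID == "" {
			continue // unassigned accounts are always valid
		}
		if _, ok := ids[proxyID]; !ok {
			return fmt.Errorf("account proxy_id references unknown proxy: %s", proxyID)
		}
	}
	return nil
}

func main() {
	proxies := []Proxy{{ID: "proxy-1"}}
	fmt.Println(validateAccountProxyReferences([]Account{{ProxyID: "proxy-1"}}, proxies)) // <nil>
	fmt.Println(validateAccountProxyReferences([]Account{{ProxyID: "proxy-2"}}, proxies))
}
```

This is why `deleteProxy` in the admin handler clears matching `Accounts[i].ProxyID` fields before calling `validateProxyMutation`: removing a proxy without clearing its assignments would trip this check.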
View File

@@ -13,6 +13,7 @@ import (
)
func (c *Client) Login(ctx context.Context, acc config.Account) (string, error) {
clients := c.requestClientsForAccount(acc)
payload := map[string]any{
"password": strings.TrimSpace(acc.Password),
"device_id": "deepseek_to_api",
@@ -27,7 +28,7 @@ func (c *Client) Login(ctx context.Context, acc config.Account) (string, error)
} else {
return "", errors.New("missing email/mobile")
}
resp, err := c.postJSON(ctx, c.regular, DeepSeekLoginURL, BaseHeaders, payload)
resp, err := c.postJSON(ctx, clients.regular, clients.fallback, DeepSeekLoginURL, BaseHeaders, payload)
if err != nil {
return "", err
}
@@ -52,11 +53,12 @@ func (c *Client) CreateSession(ctx context.Context, a *auth.RequestAuth, maxAtte
if maxAttempts <= 0 {
maxAttempts = c.maxRetries
}
clients := c.requestClientsForAuth(ctx, a)
attempts := 0
refreshed := false
for attempts < maxAttempts {
headers := c.authHeaders(a.DeepSeekToken)
resp, status, err := c.postJSONWithStatus(ctx, c.regular, DeepSeekCreateSessionURL, headers, map[string]any{"agent": "chat"})
resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekCreateSessionURL, headers, map[string]any{"agent": "chat"})
if err != nil {
config.Logger.Warn("[create_session] request error", "error", err, "account", a.AccountID)
attempts++
@@ -64,9 +66,7 @@ func (c *Client) CreateSession(ctx context.Context, a *auth.RequestAuth, maxAtte
}
code, bizCode, msg, bizMsg := extractResponseStatus(resp)
if status == http.StatusOK && code == 0 && bizCode == 0 {
data, _ := resp["data"].(map[string]any)
bizData, _ := data["biz_data"].(map[string]any)
sessionID, _ := bizData["id"].(string)
sessionID := extractCreateSessionID(resp)
if sessionID != "" {
return sessionID, nil
}
@@ -94,11 +94,12 @@ func (c *Client) GetPow(ctx context.Context, a *auth.RequestAuth, maxAttempts in
if maxAttempts <= 0 {
maxAttempts = c.maxRetries
}
clients := c.requestClientsForAuth(ctx, a)
attempts := 0
refreshed := false
for attempts < maxAttempts {
headers := c.authHeaders(a.DeepSeekToken)
resp, status, err := c.postJSONWithStatus(ctx, c.regular, DeepSeekCreatePowURL, headers, map[string]any{"target_path": "/api/v0/chat/completion"})
resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekCreatePowURL, headers, map[string]any{"target_path": "/api/v0/chat/completion"})
if err != nil {
config.Logger.Warn("[get_pow] request error", "error", err, "account", a.AccountID)
attempts++
@@ -109,7 +110,7 @@ func (c *Client) GetPow(ctx context.Context, a *auth.RequestAuth, maxAttempts in
data, _ := resp["data"].(map[string]any)
bizData, _ := data["biz_data"].(map[string]any)
challenge, _ := bizData["challenge"].(map[string]any)
answer, err := c.powSolver.Compute(ctx, challenge)
answer, err := ComputePow(ctx, challenge)
if err != nil {
attempts++
continue
@@ -201,6 +202,22 @@ func isAuthIndicativeBizFailure(msg string, bizMsg string) bool {
return false
}
// DeepSeek has returned create-session ids in both biz_data.id and
// biz_data.chat_session.id across observed response variants; accept either.
func extractCreateSessionID(resp map[string]any) string {
data, _ := resp["data"].(map[string]any)
bizData, _ := data["biz_data"].(map[string]any)
if sessionID, _ := bizData["id"].(string); strings.TrimSpace(sessionID) != "" {
return strings.TrimSpace(sessionID)
}
if chatSession, ok := bizData["chat_session"].(map[string]any); ok {
if sessionID, _ := chatSession["id"].(string); strings.TrimSpace(sessionID) != "" {
return strings.TrimSpace(sessionID)
}
}
return ""
}
func extractResponseStatus(resp map[string]any) (code int, bizCode int, msg string, bizMsg string) {
code = intFrom(resp["code"])
msg, _ = resp["msg"].(string)

View File

@@ -0,0 +1,34 @@
package deepseek
import "testing"
func TestExtractCreateSessionIDSupportsLegacyShape(t *testing.T) {
resp := map[string]any{
"data": map[string]any{
"biz_data": map[string]any{
"id": "legacy-session-id",
},
},
}
if got := extractCreateSessionID(resp); got != "legacy-session-id" {
t.Fatalf("expected legacy session id, got %q", got)
}
}
func TestExtractCreateSessionIDSupportsNestedChatSessionShape(t *testing.T) {
resp := map[string]any{
"data": map[string]any{
"biz_data": map[string]any{
"chat_session": map[string]any{
"id": "nested-session-id",
"model_type": "default",
},
},
},
}
if got := extractCreateSessionID(resp); got != "nested-session-id" {
t.Fatalf("expected nested session id, got %q", got)
}
}

View File

@@ -10,18 +10,20 @@ import (
"ds2api/internal/auth"
"ds2api/internal/config"
trans "ds2api/internal/deepseek/transport"
)
func (c *Client) CallCompletion(ctx context.Context, a *auth.RequestAuth, payload map[string]any, powResp string, maxAttempts int) (*http.Response, error) {
if maxAttempts <= 0 {
maxAttempts = c.maxRetries
}
clients := c.requestClientsForAuth(ctx, a)
headers := c.authHeaders(a.DeepSeekToken)
headers["x-ds-pow-response"] = powResp
captureSession := c.capture.Start("deepseek_completion", DeepSeekCompletionURL, a.AccountID, payload)
attempts := 0
for attempts < maxAttempts {
resp, err := c.streamPost(ctx, DeepSeekCompletionURL, headers, payload)
resp, err := c.streamPost(ctx, clients.stream, DeepSeekCompletionURL, headers, payload)
if err != nil {
attempts++
time.Sleep(time.Second)
@@ -44,11 +46,12 @@ func (c *Client) CallCompletion(ctx context.Context, a *auth.RequestAuth, payloa
return nil, errors.New("completion failed")
}
func (c *Client) streamPost(ctx context.Context, url string, headers map[string]string, payload any) (*http.Response, error) {
func (c *Client) streamPost(ctx context.Context, doer trans.Doer, url string, headers map[string]string, payload any) (*http.Response, error) {
b, err := json.Marshal(payload)
if err != nil {
return nil, err
}
clients := c.requestClientsFromContext(ctx)
req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(b))
if err != nil {
return nil, err
@@ -56,7 +59,7 @@ func (c *Client) streamPost(ctx context.Context, url string, headers map[string]
for k, v := range headers {
req.Header.Set(k, v)
}
resp, err := c.stream.Do(req)
resp, err := doer.Do(req)
if err != nil {
config.Logger.Warn("[deepseek] fingerprint stream request failed, fallback to std transport", "url", url, "error", err)
req2, reqErr := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(b))
@@ -66,7 +69,7 @@ func (c *Client) streamPost(ctx context.Context, url string, headers map[string]
for k, v := range headers {
req2.Header.Set(k, v)
}
return c.fallbackS.Do(req2)
return clients.fallbackS.Do(req2)
}
return resp, nil
}

View File

@@ -51,6 +51,7 @@ func (c *Client) callContinue(ctx context.Context, a *auth.RequestAuth, sessionI
if strings.TrimSpace(sessionID) == "" || responseMessageID <= 0 {
return nil, errors.New("missing continue identifiers")
}
clients := c.requestClientsForAuth(ctx, a)
headers := c.authHeaders(a.DeepSeekToken)
headers["x-ds-pow-response"] = powResp
payload := map[string]any{
@@ -60,7 +61,7 @@ func (c *Client) callContinue(ctx context.Context, a *auth.RequestAuth, sessionI
}
config.Logger.Info("[auto_continue] calling continue", "session_id", sessionID, "message_id", responseMessageID)
captureSession := c.capture.Start("deepseek_continue", DeepSeekContinueURL, a.AccountID, payload)
resp, err := c.streamPost(ctx, DeepSeekContinueURL, headers, payload)
resp, err := c.streamPost(ctx, clients.stream, DeepSeekContinueURL, headers, payload)
if err != nil {
return nil, err
}

View File

@@ -3,6 +3,7 @@ package deepseek
import (
"context"
"net/http"
"sync"
"time"
"ds2api/internal/auth"
@@ -23,24 +24,27 @@ type Client struct {
stream trans.Doer
fallback *http.Client
fallbackS *http.Client
powSolver *PowSolver
maxRetries int
proxyClientsMu sync.RWMutex
proxyClients map[string]requestClients
}
func NewClient(store *config.Store, resolver *auth.Resolver) *Client {
return &Client{
Store: store,
Auth: resolver,
capture: devcapture.Global(),
regular: trans.New(60 * time.Second),
stream: trans.New(0),
fallback: &http.Client{Timeout: 60 * time.Second},
fallbackS: &http.Client{Timeout: 0},
powSolver: NewPowSolver(config.WASMPath()),
maxRetries: 3,
Store: store,
Auth: resolver,
capture: devcapture.Global(),
regular: trans.New(60 * time.Second),
stream: trans.New(0),
fallback: &http.Client{Timeout: 60 * time.Second},
fallbackS: &http.Client{Timeout: 0},
maxRetries: 3,
proxyClients: map[string]requestClients{},
}
}
func (c *Client) PreloadPow(ctx context.Context) error {
return c.powSolver.init(ctx)
// PreloadPow is kept for interface compatibility; the pure-Go implementation needs no preloading.
func (c *Client) PreloadPow(_ context.Context) error {
return nil
}

View File

@@ -11,8 +11,8 @@ import (
trans "ds2api/internal/deepseek/transport"
)
func (c *Client) postJSON(ctx context.Context, doer trans.Doer, url string, headers map[string]string, payload any) (map[string]any, error) {
body, status, err := c.postJSONWithStatus(ctx, doer, url, headers, payload)
func (c *Client) postJSON(ctx context.Context, doer trans.Doer, fallback trans.Doer, url string, headers map[string]string, payload any) (map[string]any, error) {
body, status, err := c.postJSONWithStatus(ctx, doer, fallback, url, headers, payload)
if err != nil {
return nil, err
}
@@ -22,7 +22,7 @@ func (c *Client) postJSON(ctx context.Context, doer trans.Doer, url string, head
return body, nil
}
func (c *Client) postJSONWithStatus(ctx context.Context, doer trans.Doer, url string, headers map[string]string, payload any) (map[string]any, int, error) {
func (c *Client) postJSONWithStatus(ctx context.Context, doer trans.Doer, fallback trans.Doer, url string, headers map[string]string, payload any) (map[string]any, int, error) {
b, err := json.Marshal(payload)
if err != nil {
return nil, 0, err
@@ -44,7 +44,7 @@ func (c *Client) postJSONWithStatus(ctx context.Context, doer trans.Doer, url st
for k, v := range headers {
req2.Header.Set(k, v)
}
resp, err = c.fallback.Do(req2)
resp, err = fallback.Do(req2)
if err != nil {
return nil, 0, err
}
@@ -64,6 +64,7 @@ func (c *Client) postJSONWithStatus(ctx context.Context, doer trans.Doer, url st
}
func (c *Client) getJSONWithStatus(ctx context.Context, doer trans.Doer, url string, headers map[string]string) (map[string]any, int, error) {
clients := c.requestClientsFromContext(ctx)
req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
if err != nil {
return nil, 0, err
@@ -81,7 +82,7 @@ func (c *Client) getJSONWithStatus(ctx context.Context, doer trans.Doer, url str
for k, v := range headers {
req2.Header.Set(k, v)
}
resp, err = c.fallback.Do(req2)
resp, err = clients.fallback.Do(req2)
if err != nil {
return nil, 0, err
}

View File

@@ -0,0 +1,52 @@
package deepseek
import (
"context"
"errors"
"io"
"net/http"
"strings"
"testing"
)
func TestPostJSONWithStatusUsesProvidedFallbackClient(t *testing.T) {
var fallbackCalled bool
client := &Client{}
primary := failingDoer{err: errors.New("primary failed")}
fallbackDoer := doerFunc(func(req *http.Request) (*http.Response, error) {
fallbackCalled = true
return &http.Response{
StatusCode: http.StatusOK,
Header: make(http.Header),
Body: io.NopCloser(strings.NewReader(`{"ok":true}`)),
Request: req,
}, nil
})
resp, status, err := client.postJSONWithStatus(
context.Background(),
primary,
fallbackDoer,
"https://example.com/api",
map[string]string{"x-test": "1"},
map[string]any{"foo": "bar"},
)
if err != nil {
t.Fatalf("postJSONWithStatus error: %v", err)
}
if status != http.StatusOK {
t.Fatalf("status=%d want=%d", status, http.StatusOK)
}
if !fallbackCalled {
t.Fatal("expected provided fallback doer to be called")
}
if ok, _ := resp["ok"].(bool); !ok {
t.Fatalf("unexpected response body: %#v", resp)
}
}
type doerFunc func(*http.Request) (*http.Response, error)
func (f doerFunc) Do(req *http.Request) (*http.Response, error) {
return f(req)
}

View File

@@ -36,6 +36,7 @@ func (c *Client) GetSessionCount(ctx context.Context, a *auth.RequestAuth, maxAt
if maxAttempts <= 0 {
maxAttempts = c.maxRetries
}
clients := c.requestClientsForAuth(ctx, a)
stats := &SessionStats{
AccountID: a.AccountID,
@@ -50,7 +51,7 @@ func (c *Client) GetSessionCount(ctx context.Context, a *auth.RequestAuth, maxAt
// build the request URL
reqURL := DeepSeekFetchSessionURL + "?lte_cursor.pinned=false"
resp, status, err := c.getJSONWithStatus(ctx, c.regular, reqURL, headers)
resp, status, err := c.getJSONWithStatus(ctx, clients.regular, reqURL, headers)
if err != nil {
config.Logger.Warn("[get_session_count] request error", "error", err, "account", a.AccountID)
attempts++
@@ -106,10 +107,11 @@ func (c *Client) GetSessionCount(ctx context.Context, a *auth.RequestAuth, maxAt
// GetSessionCountForToken fetches the session count directly with a token (passthrough mode)
func (c *Client) GetSessionCountForToken(ctx context.Context, token string) (*SessionStats, error) {
clients := c.requestClientsFromContext(ctx)
headers := c.authHeaders(token)
reqURL := DeepSeekFetchSessionURL + "?lte_cursor.pinned=false"
resp, status, err := c.getJSONWithStatus(ctx, c.regular, reqURL, headers)
resp, status, err := c.getJSONWithStatus(ctx, clients.regular, reqURL, headers)
if err != nil {
return nil, err
}
@@ -160,7 +162,7 @@ func (c *Client) GetSessionCountAll(ctx context.Context) []*SessionStats {
// if no token is available, try logging in to obtain one
if token == "" {
var err error
token, err = c.Login(ctx, acc)
token, err = c.Login(auth.WithAuth(ctx, &auth.RequestAuth{AccountID: acc.Identifier(), Account: acc}), acc)
if err != nil {
results = append(results, &SessionStats{
AccountID: accountID,
@@ -171,7 +173,8 @@ func (c *Client) GetSessionCountAll(ctx context.Context) []*SessionStats {
}
}
stats, err := c.GetSessionCountForToken(ctx, token)
ctxWithAuth := auth.WithAuth(ctx, &auth.RequestAuth{AccountID: acc.Identifier(), Account: acc, DeepSeekToken: token})
stats, err := c.GetSessionCountForToken(ctxWithAuth, token)
if err != nil {
results = append(results, &SessionStats{
AccountID: accountID,
@@ -190,6 +193,7 @@ func (c *Client) GetSessionCountAll(ctx context.Context) []*SessionStats {
// FetchSessionPage fetches the session list (supports pagination)
func (c *Client) FetchSessionPage(ctx context.Context, a *auth.RequestAuth, cursor string) ([]SessionInfo, bool, error) {
clients := c.requestClientsForAuth(ctx, a)
headers := c.authHeaders(a.DeepSeekToken)
// build the request URL
@@ -200,7 +204,7 @@ func (c *Client) FetchSessionPage(ctx context.Context, a *auth.RequestAuth, curs
}
reqURL := DeepSeekFetchSessionURL + "?" + params.Encode()
resp, status, err := c.getJSONWithStatus(ctx, c.regular, reqURL, headers)
resp, status, err := c.getJSONWithStatus(ctx, clients.regular, reqURL, headers)
if err != nil {
return nil, false, err
}

View File

@@ -22,6 +22,7 @@ func (c *Client) DeleteSession(ctx context.Context, a *auth.RequestAuth, session
if maxAttempts <= 0 {
maxAttempts = c.maxRetries
}
clients := c.requestClientsForAuth(ctx, a)
result := &DeleteSessionResult{
SessionID: sessionID,
@@ -42,7 +43,7 @@ func (c *Client) DeleteSession(ctx context.Context, a *auth.RequestAuth, session
"chat_session_id": sessionID,
}
resp, status, err := c.postJSONWithStatus(ctx, c.regular, DeepSeekDeleteSessionURL, headers, payload)
resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekDeleteSessionURL, headers, payload)
if err != nil {
config.Logger.Warn("[delete_session] request error", "error", err, "session_id", sessionID)
attempts++
@@ -81,6 +82,7 @@ func (c *Client) DeleteSession(ctx context.Context, a *auth.RequestAuth, session
// DeleteSessionForToken deletes a session directly with a token (passthrough mode)
func (c *Client) DeleteSessionForToken(ctx context.Context, token string, sessionID string) (*DeleteSessionResult, error) {
clients := c.requestClientsFromContext(ctx)
result := &DeleteSessionResult{
SessionID: sessionID,
}
@@ -95,7 +97,7 @@ func (c *Client) DeleteSessionForToken(ctx context.Context, token string, sessio
"chat_session_id": sessionID,
}
resp, status, err := c.postJSONWithStatus(ctx, c.regular, DeepSeekDeleteSessionURL, headers, payload)
resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekDeleteSessionURL, headers, payload)
if err != nil {
result.ErrorMessage = err.Error()
return result, err
@@ -114,10 +116,11 @@ func (c *Client) DeleteSessionForToken(ctx context.Context, token string, sessio
// DeleteAllSessions deletes all sessions (use with caution)
func (c *Client) DeleteAllSessions(ctx context.Context, a *auth.RequestAuth) error {
clients := c.requestClientsForAuth(ctx, a)
headers := c.authHeaders(a.DeepSeekToken)
payload := map[string]any{}
resp, status, err := c.postJSONWithStatus(ctx, c.regular, DeepSeekDeleteAllSessionsURL, headers, payload)
resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekDeleteAllSessionsURL, headers, payload)
if err != nil {
config.Logger.Warn("[delete_all_sessions] request error", "error", err)
return err
@@ -135,10 +138,11 @@ func (c *Client) DeleteAllSessions(ctx context.Context, a *auth.RequestAuth) err
// DeleteAllSessionsForToken deletes all sessions directly with a token (passthrough mode)
func (c *Client) DeleteAllSessionsForToken(ctx context.Context, token string) error {
clients := c.requestClientsFromContext(ctx)
headers := c.authHeaders(token)
payload := map[string]any{}
resp, status, err := c.postJSONWithStatus(ctx, c.regular, DeepSeekDeleteAllSessionsURL, headers, payload)
resp, status, err := c.postJSONWithStatus(ctx, clients.regular, clients.fallback, DeepSeekDeleteAllSessionsURL, headers, payload)
if err != nil {
config.Logger.Warn("[delete_all_sessions_for_token] request error", "error", err)
return err

View File

@@ -19,11 +19,11 @@ const (
var defaultBaseHeaders = map[string]string{
"Host": "chat.deepseek.com",
"User-Agent": "DeepSeek/1.6.11 Android/35",
"User-Agent": "DeepSeek/1.8.0 Android/35",
"Accept": "application/json",
"Content-Type": "application/json",
"x-client-platform": "android",
"x-client-version": "1.6.11",
"x-client-version": "1.8.0",
"x-client-locale": "zh_CN",
"accept-charset": "UTF-8",
}

View File

@@ -1,18 +1,17 @@
{
"base_headers": {
"Host": "chat.deepseek.com",
"User-Agent": "DeepSeek/1.6.11 Android/35",
"User-Agent": "DeepSeek/1.8.0 Android/35",
"Accept": "application/json",
"Content-Type": "application/json",
"x-client-platform": "android",
"x-client-version": "1.6.11",
"x-client-version": "1.8.0",
"x-client-locale": "zh_CN",
"accept-charset": "UTF-8"
},
"skip_contains_patterns": [
"quasi_status",
"elapsed_secs",
"token_usage",
"pending_fragment",
"conversation_mode",
"fragments/-1/status",

View File

@@ -105,43 +105,16 @@ func TestBuildPowHeaderEmptyChallenge(t *testing.T) {
}
}
// ─── PowSolver pool size ─────────────────────────────────────────────
func TestPowPoolSizeFromEnvDefault(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "")
got := powPoolSizeFromEnv()
if got < 1 {
t.Fatalf("expected positive default pool size, got %d", got)
}
}
func TestPowPoolSizeFromEnvInvalid(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "abc")
got := powPoolSizeFromEnv()
if got < 1 {
t.Fatalf("expected positive default for invalid, got %d", got)
}
}
func TestPowPoolSizeFromEnvSpecificValue(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "5")
got := powPoolSizeFromEnv()
if got != 5 {
t.Fatalf("expected 5, got %d", got)
}
}
// ─── NewClient ───────────────────────────────────────────────────────
func TestNewClientInitialState(t *testing.T) {
client := NewClient(nil, nil)
if client.powSolver == nil {
t.Fatal("expected powSolver to be initialized")
if client == nil {
t.Fatal("expected non-nil client")
}
}
func TestNewClientPreloadPowIdempotent(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "1")
client := NewClient(nil, nil)
if err := client.PreloadPow(context.Background()); err != nil {
t.Fatalf("first preload failed: %v", err)
@@ -150,16 +123,3 @@ func TestNewClientPreloadPowIdempotent(t *testing.T) {
t.Fatalf("second preload failed: %v", err)
}
}
// ─── PowSolver init and module pool ──────────────────────────────────
func TestPowSolverPoolSizeMatchesEnv(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "2")
solver := NewPowSolver("test.wasm")
if err := solver.init(context.Background()); err != nil {
t.Fatalf("init failed: %v", err)
}
if cap(solver.pool) != 2 {
t.Fatalf("expected pool capacity 2, got %d", cap(solver.pool))
}
}

View File

@@ -1,6 +0,0 @@
package deepseek
import _ "embed"
//go:embed assets/sha3_wasm_bg.7b9ca65ddd.wasm
var embeddedWASM []byte

View File

@@ -3,218 +3,27 @@ package deepseek
import (
"context"
"encoding/base64"
"encoding/binary"
"encoding/json"
"errors"
"math"
"os"
stdruntime "runtime"
"strconv"
"sync"
"ds2api/internal/config"
"github.com/tetratelabs/wazero"
"github.com/tetratelabs/wazero/api"
"ds2api/pow"
)
type PowSolver struct {
wasmPath string
once sync.Once
err error
runtime wazero.Runtime
compiled wazero.CompiledModule
pool chan *pooledModule
poolSize int
}
type pooledModule struct {
mod api.Module
stackFn api.Function
allocFn api.Function
freeFn api.Function
solveFn api.Function
}
func NewPowSolver(wasmPath string) *PowSolver {
return &PowSolver{wasmPath: wasmPath}
}
func (p *PowSolver) init(ctx context.Context) error {
p.once.Do(func() {
wasmBytes, err := os.ReadFile(p.wasmPath)
if err != nil {
if len(embeddedWASM) == 0 {
p.err = err
return
}
wasmBytes = embeddedWASM
}
p.runtime = wazero.NewRuntime(ctx)
p.compiled, p.err = p.runtime.CompileModule(ctx, wasmBytes)
if p.err == nil {
p.poolSize = powPoolSizeFromEnv()
p.pool = make(chan *pooledModule, p.poolSize)
for range p.poolSize {
inst, err := p.createModule(ctx)
if err != nil {
p.err = err
return
}
p.pool <- inst
}
}
})
return p.err
}
func (p *PowSolver) Compute(ctx context.Context, challenge map[string]any) (int64, error) {
if err := p.init(ctx); err != nil {
return 0, err
}
// ComputePow solves the PoW challenge (DeepSeekHashV1) with a pure-Go implementation.
func ComputePow(ctx context.Context, challenge map[string]any) (int64, error) {
algo, _ := challenge["algorithm"].(string)
if algo != "DeepSeekHashV1" {
return 0, errors.New("unsupported algorithm")
}
challengeStr, _ := challenge["challenge"].(string)
salt, _ := challenge["salt"].(string)
signature, _ := challenge["signature"].(string)
targetPath, _ := challenge["target_path"].(string)
_ = signature
_ = targetPath
difficulty := toFloat64(challenge["difficulty"], 144000)
expireAt := toInt64(challenge["expire_at"], 1680000000)
prefix := salt + "_" + itoa(expireAt) + "_"
difficulty := toInt64FromFloat(challenge["difficulty"], 144000)
pm, err := p.acquireModule(ctx)
if err != nil {
return 0, err
}
defer p.releaseModule(pm)
mem := pm.mod.Memory()
if mem == nil {
return 0, errors.New("wasm memory missing")
}
retPtrs, err := pm.stackFn.Call(ctx, uint64(uint32(^uint32(15)))) // -16 i32
if err != nil || len(retPtrs) == 0 {
return 0, errors.New("stack alloc failed")
}
retptr := uint32(retPtrs[0])
defer func() {
_, _ = pm.stackFn.Call(context.Background(), 16)
}()
chPtr, chLen, err := writeUTF8(ctx, pm.allocFn, mem, challengeStr)
if err != nil {
return 0, err
}
defer freeUTF8(pm.freeFn, chPtr, chLen)
prefixPtr, prefixLen, err := writeUTF8(ctx, pm.allocFn, mem, prefix)
if err != nil {
return 0, err
}
defer freeUTF8(pm.freeFn, prefixPtr, prefixLen)
if _, err := pm.solveFn.Call(ctx,
uint64(retptr),
uint64(chPtr), uint64(chLen),
uint64(prefixPtr), uint64(prefixLen),
math.Float64bits(difficulty),
); err != nil {
return 0, err
}
statusBytes, ok := mem.Read(retptr, 4)
if !ok {
return 0, errors.New("read status failed")
}
status := int32(binary.LittleEndian.Uint32(statusBytes))
valueBytes, ok := mem.Read(retptr+8, 8)
if !ok {
return 0, errors.New("read value failed")
}
value := math.Float64frombits(binary.LittleEndian.Uint64(valueBytes))
if status == 0 {
return 0, errors.New("pow solve failed")
}
return int64(value), nil
}
func (p *PowSolver) createModule(ctx context.Context) (*pooledModule, error) {
mod, err := p.runtime.InstantiateModule(ctx, p.compiled, wazero.NewModuleConfig())
if err != nil {
return nil, err
}
stackFn := mod.ExportedFunction("__wbindgen_add_to_stack_pointer")
allocFn := mod.ExportedFunction("__wbindgen_export_0")
solveFn := mod.ExportedFunction("wasm_solve")
if stackFn == nil || allocFn == nil || solveFn == nil {
_ = mod.Close(context.Background())
return nil, errors.New("required wasm exports missing")
}
return &pooledModule{
mod: mod,
stackFn: stackFn,
allocFn: allocFn,
freeFn: mod.ExportedFunction("__wbindgen_export_2"),
solveFn: solveFn,
}, nil
}
func (p *PowSolver) acquireModule(ctx context.Context) (*pooledModule, error) {
if p.pool != nil {
for {
select {
case pm := <-p.pool:
if pm != nil {
return pm, nil
}
case <-ctx.Done():
return nil, ctx.Err()
}
}
}
return p.createModule(ctx)
}
func (p *PowSolver) releaseModule(pm *pooledModule) {
if pm == nil || pm.mod == nil {
return
}
if p.pool != nil {
select {
case p.pool <- pm:
return
default:
}
}
_ = pm.mod.Close(context.Background())
}
func writeUTF8(ctx context.Context, allocFn api.Function, mem api.Memory, text string) (uint32, uint32, error) {
data := []byte(text)
res, err := allocFn.Call(ctx, uint64(len(data)), 1)
if err != nil || len(res) == 0 {
return 0, 0, errors.New("alloc failed")
}
ptr := uint32(res[0])
if !mem.Write(ptr, data) {
return 0, 0, errors.New("mem write failed")
}
return ptr, uint32(len(data)), nil
}
func freeUTF8(freeFn api.Function, ptr, size uint32) {
if freeFn == nil || ptr == 0 || size == 0 {
return
}
_, _ = freeFn.Call(context.Background(), uint64(ptr), uint64(size), 1)
return pow.SolvePow(ctx, challengeStr, salt, expireAt, difficulty)
}
// BuildPowHeader serializes {algorithm,challenge,salt,answer,signature,target_path} as base64(JSON).
func BuildPowHeader(challenge map[string]any, answer int64) (string, error) {
payload := map[string]any{
"algorithm": challenge["algorithm"],
@@ -257,32 +66,7 @@ func toInt64(v any, d int64) int64 {
}
}
func itoa(n int64) string {
return strconv.FormatInt(n, 10)
}
func powPoolSizeFromEnv() int {
const fallback = 4
n := fallback
if cpus := stdruntime.GOMAXPROCS(0); cpus > 0 {
n = cpus
}
if raw := os.Getenv("DS2API_POW_POOL_SIZE"); raw != "" {
if v, err := strconv.Atoi(raw); err == nil && v > 0 {
n = v
}
}
if n > 64 {
return 64
}
return n
}
func PreloadWASM(wasmPath string) {
solver := NewPowSolver(wasmPath)
if err := solver.init(context.Background()); err != nil {
config.Logger.Warn("[WASM] preload failed", "error", err)
return
}
config.Logger.Info("[WASM] module preloaded", "path", wasmPath)
// toInt64FromFloat is equivalent to toInt64; the name only distinguishes intent.
func toInt64FromFloat(v any, d int64) int64 {
return toInt64(v, d)
}

View File

@@ -3,66 +3,18 @@ package deepseek
import (
"context"
"testing"
"time"
)
func TestPowPoolSizeFromEnv(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "3")
if got := powPoolSizeFromEnv(); got != 3 {
t.Fatalf("expected pool size 3, got %d", got)
}
}
func TestPowSolverAcquireReleaseReusesModule(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "1")
solver := NewPowSolver("missing-file.wasm")
if err := solver.init(context.Background()); err != nil {
t.Fatalf("init failed: %v", err)
}
pm1, err := solver.acquireModule(context.Background())
if err != nil {
t.Fatalf("acquire first module failed: %v", err)
}
solver.releaseModule(pm1)
pm2, err := solver.acquireModule(context.Background())
if err != nil {
t.Fatalf("acquire second module failed: %v", err)
}
if pm1 != pm2 {
t.Fatalf("expected pooled module reuse, got different instances")
}
solver.releaseModule(pm2)
}
func TestPowSolverAcquireHonorsContextWhenPoolExhausted(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "1")
solver := NewPowSolver("missing-file.wasm")
if err := solver.init(context.Background()); err != nil {
t.Fatalf("init failed: %v", err)
}
held, err := solver.acquireModule(context.Background())
if err != nil {
t.Fatalf("acquire held module failed: %v", err)
}
defer solver.releaseModule(held)
ctx, cancel := context.WithTimeout(context.Background(), 20*time.Millisecond)
defer cancel()
if _, err := solver.acquireModule(ctx); err == nil {
t.Fatalf("expected context cancellation while pool is exhausted")
}
}
func TestClientPreloadPowUsesClientSolver(t *testing.T) {
t.Setenv("DS2API_POW_POOL_SIZE", "1")
func TestPreloadPowNoOp(t *testing.T) {
client := NewClient(nil, nil)
if err := client.PreloadPow(context.Background()); err != nil {
t.Fatalf("preload failed: %v", err)
}
if client.powSolver.runtime == nil || client.powSolver.compiled == nil {
t.Fatalf("expected client pow solver to be initialized")
t.Fatalf("PreloadPow should be no-op, got error: %v", err)
}
}
func TestComputePowUnsupportedAlgorithm(t *testing.T) {
_, err := ComputePow(context.Background(), map[string]any{"algorithm": "unknown"})
if err == nil {
t.Fatal("expected error for unsupported algorithm")
}
}

239
internal/deepseek/proxy.go Normal file
View File

@@ -0,0 +1,239 @@
package deepseek
import (
"context"
"fmt"
"net"
"net/http"
"strconv"
"strings"
"time"
"golang.org/x/net/proxy"
"ds2api/internal/auth"
"ds2api/internal/config"
trans "ds2api/internal/deepseek/transport"
)
type requestClients struct {
regular trans.Doer
stream trans.Doer
fallback *http.Client
fallbackS *http.Client
}
type hostLookupFunc func(ctx context.Context, network, host string) ([]string, error)
var proxyConnectivityTestURL = "https://chat.deepseek.com/"
var defaultHostLookup hostLookupFunc = func(ctx context.Context, _ string, host string) ([]string, error) {
return net.DefaultResolver.LookupHost(ctx, host)
}
func proxyDialAddress(ctx context.Context, proxyType, address string, lookup hostLookupFunc) (string, error) {
proxyType = strings.ToLower(strings.TrimSpace(proxyType))
if proxyType != "socks5" {
return address, nil
}
host, port, err := net.SplitHostPort(address)
if err != nil {
return "", err
}
if net.ParseIP(host) != nil {
return address, nil
}
if lookup == nil {
lookup = defaultHostLookup
}
addrs, err := lookup(ctx, "ip", host)
if err != nil {
return "", err
}
if len(addrs) == 0 {
return "", fmt.Errorf("no ip address resolved for %s", host)
}
return net.JoinHostPort(addrs[0], port), nil
}
func proxyCacheKey(proxyCfg config.Proxy) string {
proxyCfg = config.NormalizeProxy(proxyCfg)
return strings.Join([]string{
proxyCfg.ID,
proxyCfg.Type,
strings.ToLower(proxyCfg.Host),
strconv.Itoa(proxyCfg.Port),
proxyCfg.Username,
proxyCfg.Password,
}, "|")
}
func proxyDialContext(proxyCfg config.Proxy) (trans.DialContextFunc, error) {
proxyCfg = config.NormalizeProxy(proxyCfg)
var authCfg *proxy.Auth
if proxyCfg.Username != "" || proxyCfg.Password != "" {
authCfg = &proxy.Auth{User: proxyCfg.Username, Password: proxyCfg.Password}
}
forward := &net.Dialer{Timeout: 15 * time.Second, KeepAlive: 30 * time.Second}
dialer, err := proxy.SOCKS5("tcp", net.JoinHostPort(proxyCfg.Host, strconv.Itoa(proxyCfg.Port)), authCfg, forward)
if err != nil {
return nil, err
}
return func(ctx context.Context, network, address string) (net.Conn, error) {
target, err := proxyDialAddress(ctx, proxyCfg.Type, address, defaultHostLookup)
if err != nil {
return nil, err
}
if ctxDialer, ok := dialer.(proxy.ContextDialer); ok {
return ctxDialer.DialContext(ctx, network, target)
}
return dialer.Dial(network, target)
}, nil
}
func (c *Client) defaultRequestClients() requestClients {
return requestClients{
regular: c.regular,
stream: c.stream,
fallback: c.fallback,
fallbackS: c.fallbackS,
}
}
func (c *Client) resolveProxyForAccount(acc config.Account) (config.Proxy, bool) {
if c == nil || c.Store == nil {
return config.Proxy{}, false
}
proxyID := strings.TrimSpace(acc.ProxyID)
if proxyID == "" {
return config.Proxy{}, false
}
snap := c.Store.Snapshot()
for _, proxyCfg := range snap.Proxies {
proxyCfg = config.NormalizeProxy(proxyCfg)
if proxyCfg.ID == proxyID {
return proxyCfg, true
}
}
return config.Proxy{}, false
}
func (c *Client) requestClientsFromContext(ctx context.Context) requestClients {
if a, ok := auth.FromContext(ctx); ok {
return c.requestClientsForAccount(a.Account)
}
return c.defaultRequestClients()
}
func (c *Client) requestClientsForAuth(ctx context.Context, a *auth.RequestAuth) requestClients {
if a != nil {
return c.requestClientsForAccount(a.Account)
}
return c.requestClientsFromContext(ctx)
}
func (c *Client) requestClientsForAccount(acc config.Account) requestClients {
proxyCfg, ok := c.resolveProxyForAccount(acc)
if !ok {
return c.defaultRequestClients()
}
key := proxyCacheKey(proxyCfg)
c.proxyClientsMu.RLock()
cached, ok := c.proxyClients[key]
c.proxyClientsMu.RUnlock()
if ok {
return cached
}
dialContext, err := proxyDialContext(proxyCfg)
if err != nil {
config.Logger.Warn("[proxy] build dialer failed", "proxy_id", proxyCfg.ID, "error", err)
return c.defaultRequestClients()
}
bundle := requestClients{
regular: trans.NewWithDialContext(60*time.Second, dialContext),
stream: trans.NewWithDialContext(0, dialContext),
fallback: trans.NewFallbackClient(60*time.Second, dialContext),
fallbackS: trans.NewFallbackClient(0, dialContext),
}
c.proxyClientsMu.Lock()
if c.proxyClients == nil {
c.proxyClients = make(map[string]requestClients)
}
c.proxyClients[key] = bundle
c.proxyClientsMu.Unlock()
return bundle
}
func applyProxyConnectivityHeaders(req *http.Request) {
if req == nil {
return
}
for key, value := range BaseHeaders {
key = strings.TrimSpace(key)
value = strings.TrimSpace(value)
if key == "" || value == "" {
continue
}
req.Header.Set(key, value)
}
}
func proxyConnectivityStatus(statusCode int) (bool, string) {
switch {
case statusCode >= 200 && statusCode < 300:
return true, fmt.Sprintf("代理可达,目标返回 HTTP %d", statusCode)
case statusCode >= 300 && statusCode < 500:
return true, fmt.Sprintf("代理可达,但目标返回 HTTP %d(可能是风控或挑战)", statusCode)
default:
return false, fmt.Sprintf("目标返回 HTTP %d", statusCode)
}
}
func TestProxyConnectivity(ctx context.Context, proxyCfg config.Proxy) map[string]any {
start := time.Now()
proxyCfg = config.NormalizeProxy(proxyCfg)
result := map[string]any{
"success": false,
"proxy_id": proxyCfg.ID,
"proxy_type": proxyCfg.Type,
"response_time": 0,
}
if err := config.ValidateProxyConfig([]config.Proxy{proxyCfg}); err != nil {
result["message"] = "代理配置无效: " + err.Error()
return result
}
dialContext, err := proxyDialContext(proxyCfg)
if err != nil {
result["message"] = "代理拨号器初始化失败: " + err.Error()
return result
}
client := trans.NewFallbackClient(15*time.Second, dialContext)
req, err := http.NewRequestWithContext(ctx, http.MethodGet, proxyConnectivityTestURL, nil)
if err != nil {
result["message"] = err.Error()
return result
}
applyProxyConnectivityHeaders(req)
resp, err := client.Do(req)
result["response_time"] = int(time.Since(start).Milliseconds())
if err != nil {
result["message"] = err.Error()
return result
}
defer func() {
if closeErr := resp.Body.Close(); closeErr != nil {
config.Logger.Warn("[proxy] close response body failed", "proxy_id", proxyCfg.ID, "error", closeErr)
}
}()
result["status_code"] = resp.StatusCode
result["success"], result["message"] = proxyConnectivityStatus(resp.StatusCode)
return result
}

View File

@@ -0,0 +1,85 @@
package deepseek
import (
"context"
"net/http"
"strings"
"testing"
)
func TestProxyDialAddressUsesLocalResolutionForSocks5(t *testing.T) {
ctx := context.Background()
resolved, err := proxyDialAddress(ctx, "socks5", "example.com:443", func(_ context.Context, network, host string) ([]string, error) {
if network != "ip" {
t.Fatalf("unexpected lookup network: %q", network)
}
if host != "example.com" {
t.Fatalf("unexpected lookup host: %q", host)
}
return []string{"203.0.113.10"}, nil
})
if err != nil {
t.Fatalf("proxyDialAddress returned error: %v", err)
}
if resolved != "203.0.113.10:443" {
t.Fatalf("expected locally resolved address, got %q", resolved)
}
}
func TestProxyDialAddressKeepsHostnameForSocks5h(t *testing.T) {
ctx := context.Background()
lookups := 0
resolved, err := proxyDialAddress(ctx, "socks5h", "example.com:443", func(_ context.Context, network, host string) ([]string, error) {
lookups++
return []string{"203.0.113.10"}, nil
})
if err != nil {
t.Fatalf("proxyDialAddress returned error: %v", err)
}
if resolved != "example.com:443" {
t.Fatalf("expected hostname preserved for remote DNS, got %q", resolved)
}
if lookups != 0 {
t.Fatalf("expected no local DNS lookup for socks5h, got %d", lookups)
}
}
func TestApplyProxyConnectivityHeadersUsesBaseHeaders(t *testing.T) {
req, err := http.NewRequest(http.MethodGet, "https://chat.deepseek.com/", nil)
if err != nil {
t.Fatalf("http.NewRequest returned error: %v", err)
}
applyProxyConnectivityHeaders(req)
for key, want := range BaseHeaders {
if got := req.Header.Get(key); got != want {
t.Fatalf("expected header %q=%q, got %q", key, want, got)
}
}
}
func TestProxyConnectivityStatus(t *testing.T) {
cases := []struct {
name string
statusCode int
success bool
wantText string
}{
{name: "ok", statusCode: 200, success: true, wantText: "HTTP 200"},
{name: "challenge", statusCode: 403, success: true, wantText: "风控或挑战"},
{name: "upstream error", statusCode: 502, success: false, wantText: "HTTP 502"},
}
for _, tc := range cases {
t.Run(tc.name, func(t *testing.T) {
success, message := proxyConnectivityStatus(tc.statusCode)
if success != tc.success {
t.Fatalf("expected success=%v, got %v", tc.success, success)
}
if message == "" || !strings.Contains(message, tc.wantText) {
t.Fatalf("expected message to contain %q, got %q", tc.wantText, message)
}
})
}
}

View File

@@ -15,21 +15,33 @@ type Doer interface {
Do(req *http.Request) (*http.Response, error)
}
type DialContextFunc func(ctx context.Context, network, addr string) (net.Conn, error)
type Client struct {
http *http.Client
}
func New(timeout time.Duration) *Client {
return NewWithDialContext(timeout, nil)
}
func NewWithDialContext(timeout time.Duration, dialContext DialContextFunc) *Client {
useEnvProxy := dialContext == nil
if dialContext == nil {
dialContext = (&net.Dialer{Timeout: 15 * time.Second, KeepAlive: 30 * time.Second}).DialContext
}
base := &http.Transport{
Proxy: http.ProxyFromEnvironment,
ForceAttemptHTTP2: false,
MaxIdleConns: 200,
MaxIdleConnsPerHost: 100,
IdleConnTimeout: 90 * time.Second,
DialContext: (&net.Dialer{Timeout: 15 * time.Second, KeepAlive: 30 * time.Second}).DialContext,
DialTLSContext: safariTLSDialer(),
DialContext: dialContext,
DialTLSContext: safariTLSDialer(dialContext),
TLSClientConfig: &tls.Config{MinVersion: tls.VersionTLS12},
}
if useEnvProxy {
base.Proxy = http.ProxyFromEnvironment
}
return &Client{http: &http.Client{Timeout: timeout, Transport: base}}
}
@@ -37,10 +49,31 @@ func (c *Client) Do(req *http.Request) (*http.Response, error) {
return c.http.Do(req)
}
func safariTLSDialer() func(ctx context.Context, network, addr string) (net.Conn, error) {
var dialer net.Dialer
func NewFallbackClient(timeout time.Duration, dialContext DialContextFunc) *http.Client {
useEnvProxy := dialContext == nil
if dialContext == nil {
dialContext = (&net.Dialer{Timeout: 15 * time.Second, KeepAlive: 30 * time.Second}).DialContext
}
base := &http.Transport{
ForceAttemptHTTP2: false,
MaxIdleConns: 200,
MaxIdleConnsPerHost: 100,
IdleConnTimeout: 90 * time.Second,
DialContext: dialContext,
TLSClientConfig: &tls.Config{MinVersion: tls.VersionTLS12},
}
if useEnvProxy {
base.Proxy = http.ProxyFromEnvironment
}
return &http.Client{Timeout: timeout, Transport: base}
}
func safariTLSDialer(dialContext DialContextFunc) func(ctx context.Context, network, addr string) (net.Conn, error) {
if dialContext == nil {
dialContext = (&net.Dialer{Timeout: 15 * time.Second, KeepAlive: 30 * time.Second}).DialContext
}
return func(ctx context.Context, network, addr string) (net.Conn, error) {
plainConn, err := dialer.DialContext(ctx, network, addr)
plainConn, err := dialContext(ctx, network, addr)
if err != nil {
return nil, err
}

View File

@@ -20,7 +20,9 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
};
}
const outputTokens = extractAccumulatedTokenUsage(chunk);
const usage = extractAccumulatedTokenUsage(chunk);
const promptTokens = usage.prompt;
const outputTokens = usage.output;
if (Object.prototype.hasOwnProperty.call(chunk, 'error')) {
return {
@@ -29,7 +31,8 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: true,
contentFilter: false,
errorMessage: formatErrorMessage(chunk.error),
outputTokens: 0,
promptTokens,
outputTokens,
newType: currentType,
};
}
@@ -43,6 +46,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: true,
contentFilter: true,
errorMessage: '',
promptTokens,
outputTokens,
newType: currentType,
};
@@ -55,6 +59,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: false,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType: currentType,
};
@@ -67,6 +72,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: true,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType: currentType,
};
@@ -77,6 +83,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: false,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType: currentType,
};
@@ -89,6 +96,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: false,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType: currentType,
};
@@ -157,6 +165,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: true,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType,
};
@@ -168,6 +177,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: false,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType,
};
@@ -182,6 +192,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: false,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType,
};
@@ -196,6 +207,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: true,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType,
};
@@ -207,6 +219,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: false,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType,
};
@@ -242,6 +255,7 @@ function parseChunkForContent(chunk, thinkingEnabled, currentType, stripReferenc
finished: false,
contentFilter: false,
errorMessage: '',
promptTokens,
outputTokens,
newType,
};
@@ -429,56 +443,10 @@ function hasContentFilterStatusValue(v) {
}
function extractAccumulatedTokenUsage(chunk) {
return findAccumulatedTokenUsage(chunk);
}
function findAccumulatedTokenUsage(v) {
if (Array.isArray(v)) {
for (const item of v) {
const n = findAccumulatedTokenUsage(item);
if (n > 0) {
return n;
}
}
return 0;
}
if (!v || typeof v !== 'object') {
return 0;
}
const pathValue = asString(v.p);
if (pathValue && pathValue.toLowerCase().includes('accumulated_token_usage')) {
const n = toInt(v.v);
if (n > 0) {
return n;
}
}
const direct = toInt(v.accumulated_token_usage);
if (direct > 0) {
return direct;
}
for (const value of Object.values(v)) {
const n = findAccumulatedTokenUsage(value);
if (n > 0) {
return n;
}
}
return 0;
}
function toInt(v) {
if (typeof v === 'number' && Number.isFinite(v)) {
return Math.trunc(v);
}
if (typeof v === 'string' && v.trim() !== '') {
const n = Number(v);
if (Number.isFinite(n)) {
return Math.trunc(n);
}
}
if (typeof v !== 'number') {
return 0;
}
return Number.isFinite(v) ? Math.trunc(v) : 0;
// Temporary policy: ignore upstream usage fields (accumulated_token_usage / token_usage)
// and always use internal token estimation, avoiding drift from accumulated-context counting.
void chunk;
return { prompt: 0, output: 0 };
}
function formatErrorMessage(v) {

View File

@@ -1,15 +1,17 @@
'use strict';
function buildUsage(prompt, thinking, output, outputTokens = 0) {
const promptTokens = estimateTokens(prompt);
function buildUsage(prompt, thinking, output, outputTokens = 0, providedPromptTokens = 0) {
const reasoningTokens = estimateTokens(thinking);
const completionTokens = estimateTokens(output);
const finalPromptTokens = Number.isFinite(providedPromptTokens) && providedPromptTokens > 0 ? Math.trunc(providedPromptTokens) : estimateTokens(prompt);
const overriddenCompletionTokens = Number.isFinite(outputTokens) && outputTokens > 0 ? Math.trunc(outputTokens) : 0;
const finalCompletionTokens = overriddenCompletionTokens > 0 ? overriddenCompletionTokens : reasoningTokens + completionTokens;
return {
prompt_tokens: promptTokens,
prompt_tokens: finalPromptTokens,
completion_tokens: finalCompletionTokens,
total_tokens: promptTokens + finalCompletionTokens,
total_tokens: finalPromptTokens + finalCompletionTokens,
completion_tokens_details: {
reasoning_tokens: reasoningTokens,
},

View File

@@ -125,7 +125,6 @@ async function handleVercelStream(req, res, rawBody, payload) {
let currentType = thinkingEnabled ? 'thinking' : 'text';
let thinkingText = '';
let outputText = '';
let outputTokens = 0;
const toolSieveEnabled = toolPolicy.toolSieveEnabled;
const toolSieveState = createToolSieveState();
let toolCallsEmitted = false;
@@ -178,7 +177,7 @@ async function handleVercelStream(req, res, rawBody, payload) {
created,
model,
choices: [{ delta: {}, index: 0, finish_reason: reason }],
usage: buildUsage(finalPrompt, thinkingText, outputText, outputTokens),
usage: buildUsage(finalPrompt, thinkingText, outputText),
});
if (!res.writableEnded && !res.destroyed) {
res.write('data: [DONE]\n\n');
@@ -227,9 +226,6 @@ async function handleVercelStream(req, res, rawBody, payload) {
if (!parsed.parsed) {
continue;
}
if (parsed.outputTokens > 0) {
outputTokens = parsed.outputTokens;
}
currentType = parsed.newType;
if (parsed.errorMessage) {
await finish('content_filter');

View File

@@ -5,11 +5,11 @@ const path = require('path');
const DEFAULT_BASE_HEADERS = Object.freeze({
Host: 'chat.deepseek.com',
'User-Agent': 'DeepSeek/1.6.11 Android/35',
'User-Agent': 'DeepSeek/1.8.0 Android/35',
Accept: 'application/json',
'Content-Type': 'application/json',
'x-client-platform': 'android',
'x-client-version': '1.6.11',
'x-client-version': '1.8.0',
'x-client-locale': 'zh_CN',
'accept-charset': 'UTF-8',
});

View File

@@ -42,9 +42,9 @@ func NewApp() (*App, error) {
})
dsClient = deepseek.NewClient(store, resolver)
if err := dsClient.PreloadPow(context.Background()); err != nil {
config.Logger.Warn("[WASM] preload failed", "error", err)
config.Logger.Warn("[PoW] init failed", "error", err)
} else {
config.Logger.Info("[WASM] module preloaded", "path", config.WASMPath())
config.Logger.Info("[PoW] pure Go solver ready")
}
openaiHandler := &openai.Handler{Store: store, Auth: resolver, DS: dsClient}

View File

@@ -12,7 +12,6 @@ import (
type CollectResult struct {
Text string
Thinking string
OutputTokens int
ContentFilter bool
}
@@ -28,7 +27,6 @@ func CollectStream(resp *http.Response, thinkingEnabled bool, closeBody bool) Co
}
text := strings.Builder{}
thinking := strings.Builder{}
outputTokens := 0
contentFilter := false
currentType := "text"
if thinkingEnabled {
@@ -44,14 +42,8 @@ func CollectStream(resp *http.Response, thinkingEnabled bool, closeBody bool) Co
if result.ContentFilter {
contentFilter = true
}
if result.OutputTokens > 0 {
outputTokens = result.OutputTokens
}
return false
}
if result.OutputTokens > 0 {
outputTokens = result.OutputTokens
}
for _, p := range result.Parts {
if p.Type == "thinking" {
trimmed := TrimContinuationOverlap(thinking.String(), p.Text)
@@ -66,7 +58,6 @@ func CollectStream(resp *http.Response, thinkingEnabled bool, closeBody bool) Co
return CollectResult{
Text: text.String(),
Thinking: thinking.String(),
OutputTokens: outputTokens,
ContentFilter: contentFilter,
}
}

View File

@@ -10,7 +10,6 @@ type LineResult struct {
ErrorMessage string
Parts []ContentPart
NextType string
OutputTokens int
}
// ParseDeepSeekContentLine centralizes one-line DeepSeek SSE parsing for both
@@ -20,9 +19,8 @@ func ParseDeepSeekContentLine(raw []byte, thinkingEnabled bool, currentType stri
if !parsed {
return LineResult{NextType: currentType}
}
outputTokens := extractAccumulatedTokenUsage(chunk)
if done {
return LineResult{Parsed: true, Stop: true, NextType: currentType, OutputTokens: outputTokens}
return LineResult{Parsed: true, Stop: true, NextType: currentType}
}
if errObj, hasErr := chunk["error"]; hasErr {
return LineResult{
@@ -30,7 +28,6 @@ func ParseDeepSeekContentLine(raw []byte, thinkingEnabled bool, currentType stri
Stop: true,
ErrorMessage: fmt.Sprintf("%v", errObj),
NextType: currentType,
OutputTokens: outputTokens,
}
}
if code, _ := chunk["code"].(string); code == "content_filter" {
@@ -39,7 +36,6 @@ func ParseDeepSeekContentLine(raw []byte, thinkingEnabled bool, currentType stri
Stop: true,
ContentFilter: true,
NextType: currentType,
OutputTokens: outputTokens,
}
}
if hasContentFilterStatus(chunk) {
@@ -48,16 +44,14 @@ func ParseDeepSeekContentLine(raw []byte, thinkingEnabled bool, currentType stri
Stop: true,
ContentFilter: true,
NextType: currentType,
OutputTokens: outputTokens,
}
}
parts, finished, nextType := ParseSSEChunkForContent(chunk, thinkingEnabled, currentType)
parts = filterLeakedContentFilterParts(parts)
return LineResult{
Parsed: true,
Stop: finished,
Parts: parts,
NextType: nextType,
OutputTokens: outputTokens,
Parsed: true,
Stop: finished,
Parts: parts,
NextType: nextType,
}
}

View File

@@ -26,7 +26,7 @@ func TestParseDeepSeekContentLineContentFilter(t *testing.T) {
}
}
func TestParseDeepSeekContentLineContentFilterCodeIncludesOutputTokens(t *testing.T) {
func TestParseDeepSeekContentLineContentFilterCodeStops(t *testing.T) {
res := ParseDeepSeekContentLine(
[]byte(`data: {"code":"content_filter","accumulated_token_usage":99}`),
false, "text",
@@ -34,9 +34,6 @@ func TestParseDeepSeekContentLineContentFilterCodeIncludesOutputTokens(t *testin
if !res.Parsed || !res.Stop || !res.ContentFilter {
t.Fatalf("expected content-filter stop result: %#v", res)
}
if res.OutputTokens != 99 {
t.Fatalf("expected output token usage 99, got %d", res.OutputTokens)
}
}
func TestParseDeepSeekContentLineContentFilterStatus(t *testing.T) {
@@ -46,28 +43,25 @@ func TestParseDeepSeekContentLineContentFilterStatus(t *testing.T) {
}
}
func TestParseDeepSeekContentLineCapturesAccumulatedTokenUsage(t *testing.T) {
func TestParseDeepSeekContentLineIgnoresAccumulatedTokenUsage(t *testing.T) {
res := ParseDeepSeekContentLine([]byte(`data: {"p":"response","o":"BATCH","v":[{"p":"accumulated_token_usage","v":1383},{"p":"quasi_status","v":"FINISHED"}]}`), false, "text")
if res.OutputTokens != 1383 {
t.Fatalf("expected output token usage 1383, got %d", res.OutputTokens)
if !res.Parsed {
t.Fatalf("expected parsed result")
}
}
func TestParseDeepSeekContentLineCapturesAccumulatedTokenUsageString(t *testing.T) {
func TestParseDeepSeekContentLineIgnoresAccumulatedTokenUsageString(t *testing.T) {
res := ParseDeepSeekContentLine([]byte(`data: {"p":"response","o":"BATCH","v":[{"p":"accumulated_token_usage","v":"190"},{"p":"quasi_status","v":"FINISHED"}]}`), false, "text")
if res.OutputTokens != 190 {
t.Fatalf("expected output token usage 190, got %d", res.OutputTokens)
if !res.Parsed {
t.Fatalf("expected parsed result")
}
}
func TestParseDeepSeekContentLineErrorIncludesOutputTokens(t *testing.T) {
func TestParseDeepSeekContentLineErrorStops(t *testing.T) {
res := ParseDeepSeekContentLine([]byte(`data: {"error":"boom","accumulated_token_usage":123}`), false, "text")
if !res.Parsed || !res.Stop {
t.Fatalf("expected stop on error: %#v", res)
}
if res.OutputTokens != 123 {
t.Fatalf("expected output token usage 123 on error, got %d", res.OutputTokens)
}
}
func TestParseDeepSeekContentLineContent(t *testing.T) {

View File

@@ -3,8 +3,6 @@ package sse
import (
"bytes"
"encoding/json"
"math"
"strconv"
"strings"
"ds2api/internal/deepseek"
@@ -363,70 +361,3 @@ func hasContentFilterStatusValue(v any) bool {
}
return false
}
func extractAccumulatedTokenUsage(chunk map[string]any) int {
return findAccumulatedTokenUsage(chunk)
}
func findAccumulatedTokenUsage(v any) int {
switch x := v.(type) {
case map[string]any:
if p, _ := x["p"].(string); strings.Contains(strings.ToLower(p), "accumulated_token_usage") {
if n, ok := toInt(x["v"]); ok && n > 0 {
return n
}
}
if n, ok := toInt(x["accumulated_token_usage"]); ok && n > 0 {
return n
}
for _, vv := range x {
if n := findAccumulatedTokenUsage(vv); n > 0 {
return n
}
}
case []any:
for _, item := range x {
if n := findAccumulatedTokenUsage(item); n > 0 {
return n
}
}
}
return 0
}
func toInt(v any) (int, bool) {
switch x := v.(type) {
case int:
return x, true
case int32:
return int(x), true
case int64:
return int(x), true
case float64:
if math.IsNaN(x) || math.IsInf(x, 0) {
return 0, false
}
return int(x), true
case json.Number:
i, err := x.Int64()
if err != nil {
return 0, false
}
return int(i), true
case string:
s := strings.TrimSpace(x)
if s == "" {
return 0, false
}
if i, err := strconv.Atoi(s); err == nil {
return i, true
}
f, err := strconv.ParseFloat(s, 64)
if err != nil || math.IsNaN(f) || math.IsInf(f, 0) {
return 0, false
}
return int(f), true
default:
return 0, false
}
}

View File

@@ -50,18 +50,6 @@ func TestShouldSkipPathQuasiStatus(t *testing.T) {
}
}
func TestShouldSkipPathElapsedSecs(t *testing.T) {
if !shouldSkipPath("response/elapsed_secs") {
t.Fatal("expected skip for elapsed_secs path")
}
}
func TestShouldSkipPathTokenUsage(t *testing.T) {
if !shouldSkipPath("response/token_usage") {
t.Fatal("expected skip for token_usage path")
}
}
func TestShouldSkipPathPendingFragment(t *testing.T) {
if !shouldSkipPath("response/pending_fragment") {
t.Fatal("expected skip for pending_fragment path")
@@ -127,7 +115,7 @@ func TestParseSSEChunkForContentNoVField(t *testing.T) {
func TestParseSSEChunkForContentSkippedPath(t *testing.T) {
parts, finished, nextType := ParseSSEChunkForContent(map[string]any{
"p": "response/token_usage",
"p": "response/quasi_status",
"v": "some data",
}, false, "text")
if finished || len(parts) > 0 {
@@ -498,7 +486,7 @@ func TestExtractContentRecursiveFinishedStatus(t *testing.T) {
func TestExtractContentRecursiveSkipsPath(t *testing.T) {
items := []any{
map[string]any{"p": "token_usage", "v": "data"},
map[string]any{"p": "quasi_status", "v": "data"},
}
parts, finished := extractContentRecursive(items, "text")
if finished {

View File

@@ -1,134 +0,0 @@
package sse
import (
"bufio"
"encoding/json"
"errors"
"os"
"path/filepath"
"strconv"
"strings"
"testing"
)
func TestRawStreamSamplesTokenReplay(t *testing.T) {
root := filepath.Join("..", "..", "tests", "raw_stream_samples")
entries, err := os.ReadDir(root)
if err != nil {
t.Fatalf("read samples root: %v", err)
}
found := 0
for _, entry := range entries {
if !entry.IsDir() {
continue
}
ssePath := filepath.Join(root, entry.Name(), "upstream.stream.sse")
if _, err := os.Stat(ssePath); err != nil {
continue
}
found++
t.Run(entry.Name(), func(t *testing.T) {
raw, err := os.ReadFile(ssePath)
if err != nil {
t.Fatalf("read sample: %v", err)
}
parsedTokens, expectedTokens, err := replayAndCollectTokens(string(raw))
if err != nil {
t.Fatalf("replay token collection failed: %v", err)
}
if expectedTokens <= 0 {
t.Fatalf("expected positive token usage from raw stream, got %d", expectedTokens)
}
if parsedTokens != expectedTokens {
t.Fatalf("token mismatch parsed=%d expected=%d", parsedTokens, expectedTokens)
}
})
}
if found == 0 {
t.Fatalf("no upstream.stream.sse samples found under %s", root)
}
}
func replayAndCollectTokens(raw string) (parsedTokens int, expectedTokens int, err error) {
currentType := "thinking"
scanner := bufio.NewScanner(strings.NewReader(raw))
scanner.Buffer(make([]byte, 0, 64*1024), 2*1024*1024)
for scanner.Scan() {
line := strings.TrimSpace(scanner.Text())
if !strings.HasPrefix(line, "data:") {
continue
}
payload := strings.TrimSpace(strings.TrimPrefix(line, "data:"))
if payload == "" || payload == "[DONE]" || !strings.HasPrefix(payload, "{") {
continue
}
var chunk map[string]any
if err := json.Unmarshal([]byte(payload), &chunk); err != nil {
continue
}
if n := rawAccumulatedTokenUsage(chunk); n > 0 {
expectedTokens = n
}
res := ParseDeepSeekContentLine([]byte(line), true, currentType)
currentType = res.NextType
if res.OutputTokens > 0 {
parsedTokens = res.OutputTokens
}
}
if scanErr := scanner.Err(); scanErr != nil {
if errors.Is(scanErr, bufio.ErrTooLong) {
return 0, 0, errors.New("raw stream line exceeds 2MiB scanner limit")
}
return 0, 0, scanErr
}
return parsedTokens, expectedTokens, nil
}
func rawAccumulatedTokenUsage(v any) int {
switch x := v.(type) {
case []any:
for _, item := range x {
if n := rawAccumulatedTokenUsage(item); n > 0 {
return n
}
}
case map[string]any:
if n := rawToInt(x["accumulated_token_usage"]); n > 0 {
return n
}
if p, _ := x["p"].(string); strings.Contains(strings.ToLower(strings.TrimSpace(p)), "accumulated_token_usage") {
if n := rawToInt(x["v"]); n > 0 {
return n
}
}
for _, vv := range x {
if n := rawAccumulatedTokenUsage(vv); n > 0 {
return n
}
}
}
return 0
}
func rawToInt(v any) int {
switch x := v.(type) {
case float64:
return int(x)
case int:
return x
case string:
s := strings.TrimSpace(x)
if s == "" {
return 0
}
if n, err := strconv.Atoi(s); err == nil {
return n
}
if f, err := strconv.ParseFloat(s, 64); err == nil {
return int(f)
}
}
return 0
}

View File

@@ -53,6 +53,10 @@ func (r *Runner) caseModelsOpenAI(ctx context.Context, cc *caseContext) error {
ids := extractModelIDs(resp.Body)
cc.assert("has_deepseek_chat", contains(ids, "deepseek-chat"), strings.Join(ids, ","))
cc.assert("has_deepseek_reasoner", contains(ids, "deepseek-reasoner"), strings.Join(ids, ","))
cc.assert("has_deepseek_expert_chat", contains(ids, "deepseek-expert-chat"), strings.Join(ids, ","))
cc.assert("has_deepseek_expert_reasoner", contains(ids, "deepseek-expert-reasoner"), strings.Join(ids, ","))
cc.assert("has_deepseek_vision_chat", contains(ids, "deepseek-vision-chat"), strings.Join(ids, ","))
cc.assert("has_deepseek_vision_reasoner", contains(ids, "deepseek-vision-reasoner"), strings.Join(ids, ","))
return nil
}

View File

@@ -46,24 +46,33 @@ When calling tools, emit ONLY raw XML at the very end of your response. No text
</tool_calls>
RULES:
1) Output ONLY the XML above when calling tools. Do NOT mix tool XML with regular text.
2) <parameters> MUST contain a strict JSON object. All JSON keys and strings use double quotes.
3) Multiple tools → multiple <tool_call> blocks inside ONE <tool_calls> root.
4) Do NOT wrap the XML in markdown code fences (no triple backticks).
5) After receiving a tool result, use it directly. Only call another tool if the result is insufficient.
6) Parameters MUST use the exact field names from the selected tool schema.
7) CRITICAL: Do NOT invent or add any extra fields (such as "_raw", "_xml"). Use ONLY the fields strictly defined in the schema. Extra fields will cause execution failure.
1) When calling tools, you MUST use the <tool_calls> XML format.
2) No text is allowed AFTER the XML block.
3) <parameters> MUST be a single-line strict JSON object. Use double quotes.
4) Multiple tools must be inside the same <tool_calls> root.
5) Do NOT wrap XML in markdown fences (` + "```" + `).
6) Do NOT invent parameters. Use only the provided schema.
7) CRITICAL: Do NOT use native tool markers like "<Tool>" or "<tool>".
8) CRITICAL: Do NOT output role markers like "<System>", "<User>", or "<Assistant>".
9) CRITICAL: Do NOT output internal monologues (e.g. "I will list files now..."). Just output your answer or the XML.
❌ WRONG — Do NOT do these:
Wrong 1 — mixed text and XML:
I'll read the file for you. <tool_calls><tool_call>...
Wrong 2 — describing tool calls in text:
[Calling Bash] {"command": "ls"}
Wrong 1 — mixed text after XML:
<tool_calls>...</tool_calls> I hope this helps.
Wrong 2 — function-call syntax:
Grep({"pattern": "token"})
Wrong 3 — missing <tool_calls> wrapper:
<tool_call><tool_name>` + ex1 + `</tool_name><parameters>{}</parameters></tool_call>
Wrong 4 — extra/invented fields:
<parameters>{"_raw": "...", "command": "ls"}</parameters>
Wrong 4 — Markdown code fences:
` + "```xml" + `
<tool_calls>...</tool_calls>
` + "```" + `
Wrong 5 — native tool tokens:
<Tool>call_some_tool{"param":1}<Tool>
Wrong 6 — role markers in response:
<Assistant> Here is the result...
Remember: The ONLY valid way to use tools is the <tool_calls> XML block at the end of your response.
✅ CORRECT EXAMPLES:

View File

@@ -2,6 +2,7 @@ package toolcall
import (
"encoding/json"
"html"
"regexp"
"strings"
)
@@ -92,7 +93,7 @@ func parseMarkupSingleToolCall(attrs string, inner string) ParsedToolCall {
}
func parseMarkupInput(raw string) map[string]any {
raw = strings.TrimSpace(raw)
raw = strings.TrimSpace(html.UnescapeString(raw))
if raw == "" {
return map[string]any{}
}
@@ -102,7 +103,7 @@ func parseMarkupInput(raw string) map[string]any {
if kv := parseMarkupKVObject(raw); len(kv) > 0 {
return kv
}
return map[string]any{"_raw": stripTagText(raw)}
return map[string]any{"_raw": html.UnescapeString(stripTagText(raw))}
}
func parseMarkupKVObject(text string) map[string]any {
@@ -123,7 +124,7 @@ func parseMarkupKVObject(text string) map[string]any {
if !strings.EqualFold(key, endKey) {
continue
}
value := strings.TrimSpace(stripTagText(m[2]))
value := strings.TrimSpace(html.UnescapeString(stripTagText(m[2])))
if value == "" {
continue
}

View File

@@ -41,7 +41,7 @@ func ParseToolCallsDetailed(text string, availableToolNames []string) ToolCallPa
continue
}
parsed := tc
calls, rejectedNames := filterToolCallsDetailed(parsed, availableToolNames)
calls, rejectedNames := filterToolCallsDetailed(parsed)
result.Calls = calls
result.RejectedToolNames = rejectedNames
result.RejectedByPolicy = len(rejectedNames) > 0 && len(calls) == 0
@@ -77,7 +77,7 @@ func ParseToolCallsDetailed(text string, availableToolNames []string) ToolCallPa
result.SawToolCallSyntax = true
}
calls, rejectedNames := filterToolCallsDetailed(parsed, availableToolNames)
calls, rejectedNames := filterToolCallsDetailed(parsed)
result.Calls = calls
result.RejectedToolNames = rejectedNames
result.RejectedByPolicy = len(rejectedNames) > 0 && len(calls) == 0
@@ -108,7 +108,7 @@ func ParseStandaloneToolCallsDetailed(text string, availableToolNames []string)
continue
}
result.SawToolCallSyntax = true
calls, rejectedNames := filterToolCallsDetailed(parsed, availableToolNames)
calls, rejectedNames := filterToolCallsDetailed(parsed)
result.Calls = calls
result.RejectedToolNames = rejectedNames
result.RejectedByPolicy = len(rejectedNames) > 0 && len(calls) == 0
@@ -143,14 +143,14 @@ func ParseStandaloneToolCallsDetailed(text string, availableToolNames []string)
}
}
result.SawToolCallSyntax = true
calls, rejectedNames := filterToolCallsDetailed(parsed, availableToolNames)
calls, rejectedNames := filterToolCallsDetailed(parsed)
result.Calls = calls
result.RejectedToolNames = rejectedNames
result.RejectedByPolicy = len(rejectedNames) > 0 && len(calls) == 0
return result
}
func filterToolCallsDetailed(parsed []ParsedToolCall, availableToolNames []string) ([]ParsedToolCall, []string) {
func filterToolCallsDetailed(parsed []ParsedToolCall) ([]ParsedToolCall, []string) {
out := make([]ParsedToolCall, 0, len(parsed))
for _, tc := range parsed {
if tc.Name == "" {

View File

@@ -3,6 +3,7 @@ package toolcall
import (
"encoding/json"
"encoding/xml"
"html"
"regexp"
"strings"
)
@@ -114,10 +115,11 @@ func parseSingleXMLToolCall(block string) (ParsedToolCall, bool) {
if err := dec.DecodeElement(&node, &t); err == nil {
inner := strings.TrimSpace(node.Inner)
if inner != "" {
if parsed := parseToolCallInput(inner); len(parsed) > 0 {
unescapedInner := html.UnescapeString(inner)
if parsed := parseToolCallInput(unescapedInner); len(parsed) > 0 {
if len(parsed) == 1 {
if _, onlyRaw := parsed["_raw"]; onlyRaw {
if kv := parseMarkupKVObject(inner); len(kv) > 0 {
if kv := parseMarkupKVObject(unescapedInner); len(kv) > 0 {
for k, vv := range kv {
params[k] = vv
}
@@ -128,7 +130,7 @@ func parseSingleXMLToolCall(block string) (ParsedToolCall, bool) {
for k, vv := range parsed {
params[k] = vv
}
} else if kv := parseMarkupKVObject(inner); len(kv) > 0 {
} else if kv := parseMarkupKVObject(unescapedInner); len(kv) > 0 {
for k, vv := range kv {
params[k] = vv
}
@@ -158,7 +160,7 @@ func parseSingleXMLToolCall(block string) (ParsedToolCall, bool) {
if inParams || inTool {
var v string
if err := dec.DecodeElement(&v, &t); err == nil {
params[t.Name.Local] = strings.TrimSpace(v)
params[t.Name.Local] = strings.TrimSpace(html.UnescapeString(v))
}
}
}
@@ -173,12 +175,12 @@ func parseSingleXMLToolCall(block string) (ParsedToolCall, bool) {
}
}
if strings.TrimSpace(name) == "" {
name = strings.TrimSpace(extractXMLToolNameByRegex(stripTopLevelXMLParameters(inner)))
name = strings.TrimSpace(html.UnescapeString(extractXMLToolNameByRegex(stripTopLevelXMLParameters(inner))))
}
if strings.TrimSpace(name) == "" {
return ParsedToolCall{}, false
}
return ParsedToolCall{Name: strings.TrimSpace(name), Input: params}, true
return ParsedToolCall{Name: strings.TrimSpace(html.UnescapeString(name)), Input: params}, true
}
func stripTopLevelXMLParameters(inner string) string {
@@ -231,7 +233,7 @@ func parseFunctionCallTagStyle(text string) (ParsedToolCall, bool) {
if len(m) < 2 {
return ParsedToolCall{}, false
}
-name := strings.TrimSpace(m[1])
+name := strings.TrimSpace(html.UnescapeString(m[1]))
if name == "" {
return ParsedToolCall{}, false
}
@@ -241,7 +243,7 @@ func parseFunctionCallTagStyle(text string) (ParsedToolCall, bool) {
continue
}
key := strings.TrimSpace(pm[1])
-val := strings.TrimSpace(pm[2])
+val := strings.TrimSpace(html.UnescapeString(pm[2]))
if key != "" {
input[key] = val
}
@@ -270,11 +272,11 @@ func parseSingleAntmlFunctionCallMatch(m []string) (ParsedToolCall, bool) {
if len(m) < 3 {
return ParsedToolCall{}, false
}
-name := strings.TrimSpace(m[1])
+name := strings.TrimSpace(html.UnescapeString(m[1]))
if name == "" {
return ParsedToolCall{}, false
}
-body := strings.TrimSpace(m[2])
+body := strings.TrimSpace(html.UnescapeString(m[2]))
input := map[string]any{}
if strings.HasPrefix(body, "{") {
if err := json.Unmarshal([]byte(body), &input); err == nil {
@@ -291,7 +293,7 @@ func parseSingleAntmlFunctionCallMatch(m []string) (ParsedToolCall, bool) {
continue
}
k := strings.TrimSpace(am[1])
-v := strings.TrimSpace(am[2])
+v := strings.TrimSpace(html.UnescapeString(am[2]))
if k != "" {
input[k] = v
}
@@ -304,7 +306,7 @@ func parseInvokeFunctionCallStyle(text string) (ParsedToolCall, bool) {
if len(m) < 3 {
return ParsedToolCall{}, false
}
-name := strings.TrimSpace(m[1])
+name := strings.TrimSpace(html.UnescapeString(m[1]))
if name == "" {
return ParsedToolCall{}, false
}
@@ -314,7 +316,7 @@ func parseInvokeFunctionCallStyle(text string) (ParsedToolCall, bool) {
continue
}
k := strings.TrimSpace(pm[1])
-v := strings.TrimSpace(pm[2])
+v := strings.TrimSpace(html.UnescapeString(pm[2]))
if k != "" {
input[k] = v
}
@@ -334,7 +336,7 @@ func parseToolUseFunctionStyle(text string) (ParsedToolCall, bool) {
if len(m) < 3 {
return ParsedToolCall{}, false
}
-name := strings.TrimSpace(m[1])
+name := strings.TrimSpace(html.UnescapeString(m[1]))
if name == "" {
return ParsedToolCall{}, false
}
@@ -345,7 +347,7 @@ func parseToolUseFunctionStyle(text string) (ParsedToolCall, bool) {
continue
}
k := strings.TrimSpace(pm[1])
-v := strings.TrimSpace(pm[2])
+v := strings.TrimSpace(html.UnescapeString(pm[2]))
if k != "" {
input[k] = v
}
@@ -358,11 +360,11 @@ func parseToolUseNameParametersStyle(text string) (ParsedToolCall, bool) {
if len(m) < 3 {
return ParsedToolCall{}, false
}
-name := strings.TrimSpace(m[1])
+name := strings.TrimSpace(html.UnescapeString(m[1]))
if name == "" {
return ParsedToolCall{}, false
}
-raw := strings.TrimSpace(m[2])
+raw := strings.TrimSpace(html.UnescapeString(m[2]))
input := map[string]any{}
if raw != "" {
if parsed := parseToolCallInput(raw); len(parsed) > 0 {
@@ -379,11 +381,11 @@ func parseToolUseFunctionNameParametersStyle(text string) (ParsedToolCall, bool)
if len(m) < 3 {
return ParsedToolCall{}, false
}
-name := strings.TrimSpace(m[1])
+name := strings.TrimSpace(html.UnescapeString(m[1]))
if name == "" {
return ParsedToolCall{}, false
}
-raw := strings.TrimSpace(m[2])
+raw := strings.TrimSpace(html.UnescapeString(m[2]))
input := map[string]any{}
if raw != "" {
if parsed := parseToolCallInput(raw); len(parsed) > 0 {
@@ -400,11 +402,11 @@ func parseToolUseToolNameBodyStyle(text string) (ParsedToolCall, bool) {
if len(m) < 3 {
return ParsedToolCall{}, false
}
-name := strings.TrimSpace(m[1])
+name := strings.TrimSpace(html.UnescapeString(m[1]))
if name == "" {
return ParsedToolCall{}, false
}
-body := strings.TrimSpace(m[2])
+body := strings.TrimSpace(html.UnescapeString(m[2]))
input := map[string]any{}
if body != "" {
if kv := parseXMLChildKV(body); len(kv) > 0 {

View File

@@ -691,3 +691,27 @@ func TestRepairLooseJSONWithNestedObjects(t *testing.T) {
}
}
}
func TestParseToolCallsUnescapesHTMLEntityArguments(t *testing.T) {
text := `<tool_call><tool_name>Bash</tool_name><parameters>{"command":"echo a &gt; out.txt"}</parameters></tool_call>`
calls := ParseToolCalls(text, []string{"bash"})
if len(calls) != 1 {
t.Fatalf("expected one call, got %#v", calls)
}
cmd, _ := calls[0].Input["command"].(string)
if cmd != "echo a > out.txt" {
t.Fatalf("expected html entities to be unescaped in command, got %q", cmd)
}
}
func TestParseToolCallsJSONPayloadKeepsLiteralEntities(t *testing.T) {
text := `{"tool_calls":[{"name":"bash","input":{"command":"echo &gt; literally"}}]}`
calls := ParseToolCalls(text, []string{"bash"})
if len(calls) != 1 {
t.Fatalf("expected one call, got %#v", calls)
}
cmd, _ := calls[0].Input["command"].(string)
if cmd != "echo &gt; literally" {
t.Fatalf("expected json payload to keep literal entities, got %q", cmd)
}
}
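For reference, the entity handling these tests pin down comes straight from the standard library's `html` package: XML-sourced argument text is unescaped before interpretation, while JSON payloads keep literal entities. A minimal standalone sketch (the helper name is illustrative, not from the patch):

```go
package main

import (
	"fmt"
	"html"
)

// unescapeToolArg mirrors the parser's treatment of XML-sourced argument
// text: HTML entities such as &gt; and &amp; become their literal characters.
func unescapeToolArg(s string) string {
	return html.UnescapeString(s)
}

func main() {
	fmt.Println(unescapeToolArg("echo a &gt; out.txt &amp;&amp; cat out.txt"))
	// A JSON string containing "&gt;" would be left alone by the parser,
	// since there it is data rather than markup.
}
```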

View File

@@ -3,6 +3,7 @@ package translatorcliproxy
import (
"bytes"
"context"
"encoding/json"
"strings"
sdktranslator "github.com/router-for-me/CLIProxyAPI/v6/sdk/translator"
@@ -15,7 +16,12 @@ func ToOpenAI(from sdktranslator.Format, model string, raw []byte, stream bool)
func FromOpenAINonStream(to sdktranslator.Format, model string, originalReq, translatedReq, raw []byte) []byte {
var param any
-return sdktranslator.TranslateNonStream(context.Background(), sdktranslator.FormatOpenAI, to, model, originalReq, translatedReq, raw, &param)
+converted := sdktranslator.TranslateNonStream(context.Background(), sdktranslator.FormatOpenAI, to, model, originalReq, translatedReq, raw, &param)
usage, ok := extractOpenAIUsageFromJSON(raw)
if !ok {
return converted
}
return injectNonStreamUsageMetadata(converted, to, usage)
}
func FromOpenAIStream(to sdktranslator.Format, model string, originalReq, translatedReq, streamBody []byte) []byte {
@@ -65,3 +71,57 @@ func ParseFormat(name string) sdktranslator.Format {
func ToOpenAIByName(formatName, model string, raw []byte, stream bool) []byte {
return ToOpenAI(ParseFormat(formatName), model, raw, stream)
}
func extractOpenAIUsageFromJSON(raw []byte) (openAIUsage, bool) {
payload := map[string]any{}
if err := json.Unmarshal(raw, &payload); err != nil {
return openAIUsage{}, false
}
usageObj, _ := payload["usage"].(map[string]any)
if usageObj == nil {
return openAIUsage{}, false
}
p := toInt(usageObj["prompt_tokens"])
c := toInt(usageObj["completion_tokens"])
t := toInt(usageObj["total_tokens"])
if p <= 0 {
p = toInt(usageObj["input_tokens"])
}
if c <= 0 {
c = toInt(usageObj["output_tokens"])
}
if t <= 0 {
t = p + c
}
if p <= 0 && c <= 0 && t <= 0 {
return openAIUsage{}, false
}
return openAIUsage{PromptTokens: p, CompletionTokens: c, TotalTokens: t}, true
}
func injectNonStreamUsageMetadata(converted []byte, target sdktranslator.Format, usage openAIUsage) []byte {
obj := map[string]any{}
if err := json.Unmarshal(converted, &obj); err != nil {
return converted
}
switch target {
case sdktranslator.FormatClaude:
obj["usage"] = map[string]any{
"input_tokens": usage.PromptTokens,
"output_tokens": usage.CompletionTokens,
}
case sdktranslator.FormatGemini:
obj["usageMetadata"] = map[string]any{
"promptTokenCount": usage.PromptTokens,
"candidatesTokenCount": usage.CompletionTokens,
"totalTokenCount": usage.TotalTokens,
}
default:
return converted
}
out, err := json.Marshal(obj)
if err != nil {
return converted
}
return out
}
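`extractOpenAIUsageFromJSON` has to tolerate two field-name conventions (chat-completion `prompt_tokens`/`completion_tokens` versus Responses-style `input_tokens`/`output_tokens`) and counts that arrive as JSON strings. A self-contained sketch of that coercion, with illustrative names:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strconv"
	"strings"
)

// coerceInt accepts the shapes json.Unmarshal produces for an `any` target:
// float64 for JSON numbers, string for quoted counts.
func coerceInt(v any) int {
	switch x := v.(type) {
	case float64:
		return int(x)
	case string:
		n, err := strconv.Atoi(strings.TrimSpace(x))
		if err != nil {
			return 0
		}
		return n
	default:
		return 0
	}
}

// usageTokens reads prompt/completion counts, falling back to the
// Responses-style input_tokens/output_tokens names.
func usageTokens(raw []byte) (int, int) {
	var payload map[string]any
	if err := json.Unmarshal(raw, &payload); err != nil {
		return 0, 0
	}
	u, _ := payload["usage"].(map[string]any)
	p := coerceInt(u["prompt_tokens"])
	if p <= 0 {
		p = coerceInt(u["input_tokens"])
	}
	c := coerceInt(u["completion_tokens"])
	if c <= 0 {
		c = coerceInt(u["output_tokens"])
	}
	return p, c
}

func main() {
	p, c := usageTokens([]byte(`{"usage":{"input_tokens":"11","output_tokens":29}}`))
	fmt.Println(p, c) // 11 29
}
```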

View File

@@ -26,6 +26,42 @@ func TestFromOpenAINonStreamClaude(t *testing.T) {
}
}
func TestFromOpenAINonStreamClaudePreservesUsageFromOpenAI(t *testing.T) {
original := []byte(`{"model":"claude-sonnet-4-5","messages":[{"role":"user","content":"hi"}],"stream":false}`)
translatedReq := []byte(`{"model":"claude-sonnet-4-5","messages":[{"role":"user","content":"hi"}],"stream":false}`)
openaibody := []byte(`{"id":"chatcmpl_1","object":"chat.completion","created":1,"model":"claude-sonnet-4-5","choices":[{"index":0,"message":{"role":"assistant","content":"hello"},"finish_reason":"stop"}],"usage":{"prompt_tokens":11,"completion_tokens":29,"total_tokens":40}}`)
got := string(FromOpenAINonStream(sdktranslator.FormatClaude, "claude-sonnet-4-5", original, translatedReq, openaibody))
if !strings.Contains(got, `"input_tokens":11`) || !strings.Contains(got, `"output_tokens":29`) {
t.Fatalf("expected claude usage to preserve prompt/completion tokens, got: %s", got)
}
}
func TestFromOpenAINonStreamGeminiPreservesUsageFromOpenAI(t *testing.T) {
original := []byte(`{"contents":[{"role":"user","parts":[{"text":"hi"}]}]}`)
translatedReq := []byte(`{"model":"gemini-2.5-pro","messages":[{"role":"user","content":"hi"}],"stream":false}`)
openaibody := []byte(`{"id":"chatcmpl_1","object":"chat.completion","created":1,"model":"gemini-2.5-pro","choices":[{"index":0,"message":{"role":"assistant","content":"hello"},"finish_reason":"stop"}],"usage":{"prompt_tokens":11,"completion_tokens":29,"total_tokens":40}}`)
got := string(FromOpenAINonStream(sdktranslator.FormatGemini, "gemini-2.5-pro", original, translatedReq, openaibody))
if !strings.Contains(got, `"promptTokenCount":11`) || !strings.Contains(got, `"candidatesTokenCount":29`) || !strings.Contains(got, `"totalTokenCount":40`) {
t.Fatalf("expected gemini usageMetadata to preserve prompt/completion tokens, got: %s", got)
}
}
func TestFromOpenAINonStreamPreservesResponsesUsageShape(t *testing.T) {
original := []byte(`{"contents":[{"role":"user","parts":[{"text":"hi"}]}]}`)
translatedReq := []byte(`{"model":"gemini-2.5-pro","messages":[{"role":"user","content":"hi"}],"stream":false}`)
openaibody := []byte(`{"id":"resp_1","object":"response","model":"gemini-2.5-pro","usage":{"input_tokens":"11","output_tokens":"29","total_tokens":"40"}}`)
gotGemini := string(FromOpenAINonStream(sdktranslator.FormatGemini, "gemini-2.5-pro", original, translatedReq, openaibody))
if !strings.Contains(gotGemini, `"promptTokenCount":11`) || !strings.Contains(gotGemini, `"candidatesTokenCount":29`) || !strings.Contains(gotGemini, `"totalTokenCount":40`) {
t.Fatalf("expected gemini usageMetadata from input/output usage fields, got: %s", gotGemini)
}
origClaude := []byte(`{"model":"claude-sonnet-4-5","messages":[{"role":"user","content":"hi"}],"stream":false}`)
gotClaude := string(FromOpenAINonStream(sdktranslator.FormatClaude, "claude-sonnet-4-5", origClaude, origClaude, openaibody))
if !strings.Contains(gotClaude, `"input_tokens":11`) || !strings.Contains(gotClaude, `"output_tokens":29`) {
t.Fatalf("expected claude usage from input/output usage fields, got: %s", gotClaude)
}
}
func TestParseFormatAliases(t *testing.T) {
cases := map[string]sdktranslator.Format{
"responses": sdktranslator.FormatOpenAIResponse,

View File

@@ -3,7 +3,10 @@ package translatorcliproxy
import (
"bytes"
"context"
"encoding/json"
"net/http"
"strconv"
"strings"
sdktranslator "github.com/router-for-me/CLIProxyAPI/v6/sdk/translator"
)
@@ -77,7 +80,13 @@ func (w *OpenAIStreamTranslatorWriter) Write(p []byte) (int, error) {
if !bytes.HasPrefix(trimmed, []byte("data:")) {
continue
}
usage, hasUsage := extractOpenAIUsage(trimmed)
chunks := sdktranslator.TranslateStream(context.Background(), sdktranslator.FormatOpenAI, w.target, w.model, w.originalReq, w.translatedReq, trimmed, &w.param)
if hasUsage {
for i := range chunks {
chunks[i] = injectStreamUsageMetadata(chunks[i], w.target, usage)
}
}
for i := range chunks {
if len(chunks[i]) == 0 {
continue
@@ -118,3 +127,114 @@ func (w *OpenAIStreamTranslatorWriter) readOneLine() ([]byte, bool) {
w.lineBuf.Next(idx + 1)
return line, true
}
type openAIUsage struct {
PromptTokens int
CompletionTokens int
TotalTokens int
}
func extractOpenAIUsage(line []byte) (openAIUsage, bool) {
raw := strings.TrimSpace(strings.TrimPrefix(string(line), "data:"))
if raw == "" || raw == "[DONE]" {
return openAIUsage{}, false
}
var payload map[string]any
if err := json.Unmarshal([]byte(raw), &payload); err != nil {
return openAIUsage{}, false
}
usageObj, _ := payload["usage"].(map[string]any)
if usageObj == nil {
return openAIUsage{}, false
}
p := toInt(usageObj["prompt_tokens"])
c := toInt(usageObj["completion_tokens"])
t := toInt(usageObj["total_tokens"])
if p <= 0 {
p = toInt(usageObj["input_tokens"])
}
if c <= 0 {
c = toInt(usageObj["output_tokens"])
}
if p <= 0 && c <= 0 && t <= 0 {
return openAIUsage{}, false
}
if t <= 0 {
t = p + c
}
return openAIUsage{PromptTokens: p, CompletionTokens: c, TotalTokens: t}, true
}
func injectStreamUsageMetadata(chunk []byte, target sdktranslator.Format, usage openAIUsage) []byte {
if target != sdktranslator.FormatGemini {
return chunk
}
suffix := ""
switch {
case bytes.HasSuffix(chunk, []byte("\n\n")):
suffix = "\n\n"
case bytes.HasSuffix(chunk, []byte("\n")):
suffix = "\n"
}
text := strings.TrimSpace(string(chunk))
if text == "" {
return chunk
}
var (
hasDataPrefix bool
jsonText = text
)
if strings.HasPrefix(jsonText, "data:") {
hasDataPrefix = true
jsonText = strings.TrimSpace(strings.TrimPrefix(jsonText, "data:"))
}
if jsonText == "" || jsonText == "[DONE]" {
return chunk
}
obj := map[string]any{}
if err := json.Unmarshal([]byte(jsonText), &obj); err != nil {
return chunk
}
if _, ok := obj["candidates"]; !ok {
return chunk
}
obj["usageMetadata"] = map[string]any{
"promptTokenCount": usage.PromptTokens,
"candidatesTokenCount": usage.CompletionTokens,
"totalTokenCount": usage.TotalTokens,
}
b, err := json.Marshal(obj)
if err != nil {
return chunk
}
if hasDataPrefix {
return []byte("data: " + string(b) + suffix)
}
if suffix != "" {
return append(b, []byte(suffix)...)
}
return b
}
func toInt(v any) int {
switch x := v.(type) {
case int:
return x
case int32:
return int(x)
case int64:
return int(x)
case float64:
return int(x)
case float32:
return int(x)
case string:
n, err := strconv.Atoi(strings.TrimSpace(x))
if err != nil {
return 0
}
return n
default:
return 0
}
}
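`injectStreamUsageMetadata` goes out of its way to preserve the `data: ` prefix and the `\n\n` frame terminator, because SSE clients split the stream on blank lines; dropping the terminator would glue two events together. The framing discipline can be sketched in isolation (function name is illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// rewriteSSEData applies f to the JSON payload of one "data:" frame while
// keeping the prefix and the blank-line terminator intact. Non-data frames
// and [DONE] sentinels pass through untouched.
func rewriteSSEData(frame string, f func(string) string) string {
	suffix := ""
	switch {
	case strings.HasSuffix(frame, "\n\n"):
		suffix = "\n\n"
	case strings.HasSuffix(frame, "\n"):
		suffix = "\n"
	}
	body := strings.TrimSpace(frame)
	if !strings.HasPrefix(body, "data:") {
		return frame
	}
	payload := strings.TrimSpace(strings.TrimPrefix(body, "data:"))
	if payload == "" || payload == "[DONE]" {
		return frame
	}
	return "data: " + f(payload) + suffix
}

func main() {
	out := rewriteSSEData("data: {\"a\":1}\n\n", func(s string) string { return s })
	fmt.Printf("%q\n", out)
}
```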

View File

@@ -18,12 +18,16 @@ func TestOpenAIStreamTranslatorWriterClaude(t *testing.T) {
w.WriteHeader(200)
_, _ = w.Write([]byte("data: {\"id\":\"chatcmpl_1\",\"object\":\"chat.completion.chunk\",\"created\":1,\"model\":\"claude-sonnet-4-5\",\"choices\":[{\"index\":0,\"delta\":{\"role\":\"assistant\"},\"finish_reason\":null}]}\n\n"))
_, _ = w.Write([]byte("data: {\"id\":\"chatcmpl_1\",\"object\":\"chat.completion.chunk\",\"created\":1,\"model\":\"claude-sonnet-4-5\",\"choices\":[{\"index\":0,\"delta\":{\"content\":\"hi\"},\"finish_reason\":null}]}\n\n"))
_, _ = w.Write([]byte("data: {\"id\":\"chatcmpl_1\",\"object\":\"chat.completion.chunk\",\"created\":1,\"model\":\"claude-sonnet-4-5\",\"choices\":[{\"index\":0,\"delta\":{},\"finish_reason\":\"stop\"}],\"usage\":{\"prompt_tokens\":11,\"completion_tokens\":29,\"total_tokens\":40}}\n\n"))
_, _ = w.Write([]byte("data: [DONE]\n\n"))
body := rec.Body.String()
if !strings.Contains(body, "event: message_start") {
t.Fatalf("expected claude message_start event, got: %s", body)
}
if !strings.Contains(body, `"output_tokens":29`) {
t.Fatalf("expected claude stream usage to preserve output tokens, got: %s", body)
}
}
func TestOpenAIStreamTranslatorWriterGemini(t *testing.T) {
@@ -35,12 +39,16 @@ func TestOpenAIStreamTranslatorWriterGemini(t *testing.T) {
w.Header().Set("Content-Type", "text/event-stream")
w.WriteHeader(200)
_, _ = w.Write([]byte("data: {\"id\":\"chatcmpl_1\",\"object\":\"chat.completion.chunk\",\"created\":1,\"model\":\"gemini-2.5-pro\",\"choices\":[{\"index\":0,\"delta\":{\"content\":\"hi\"},\"finish_reason\":null}]}\n\n"))
_, _ = w.Write([]byte("data: {\"id\":\"chatcmpl_1\",\"object\":\"chat.completion.chunk\",\"created\":1,\"model\":\"gemini-2.5-pro\",\"choices\":[{\"index\":0,\"delta\":{},\"finish_reason\":\"stop\"}],\"usage\":{\"prompt_tokens\":11,\"completion_tokens\":29,\"total_tokens\":40}}\n\n"))
_, _ = w.Write([]byte("data: [DONE]\n\n"))
body := rec.Body.String()
if !strings.Contains(body, "candidates") {
t.Fatalf("expected gemini stream payload, got: %s", body)
}
if !strings.Contains(body, `"promptTokenCount":11`) || !strings.Contains(body, `"candidatesTokenCount":29`) {
t.Fatalf("expected gemini stream usageMetadata to preserve usage, got: %s", body)
}
}
func TestOpenAIStreamTranslatorWriterPreservesKeepAliveComment(t *testing.T) {
@@ -55,3 +63,26 @@ func TestOpenAIStreamTranslatorWriterPreservesKeepAliveComment(t *testing.T) {
t.Fatalf("expected keep-alive comment passthrough, got %q", body)
}
}
func TestInjectStreamUsageMetadataPreservesSSEFrameTerminator(t *testing.T) {
chunk := []byte("data: {\"candidates\":[{\"index\":0}],\"model\":\"gemini-2.5-pro\"}\n\n")
usage := openAIUsage{PromptTokens: 11, CompletionTokens: 29, TotalTokens: 40}
got := injectStreamUsageMetadata(chunk, sdktranslator.FormatGemini, usage)
if !strings.HasSuffix(string(got), "\n\n") {
t.Fatalf("expected injected chunk to preserve \\n\\n frame terminator, got %q", string(got))
}
if !strings.Contains(string(got), `"usageMetadata"`) {
t.Fatalf("expected usageMetadata injected, got %q", string(got))
}
}
func TestExtractOpenAIUsageSupportsResponsesUsageFields(t *testing.T) {
line := []byte(`data: {"usage":{"input_tokens":"11","output_tokens":"29","total_tokens":"40"}}`)
got, ok := extractOpenAIUsage(line)
if !ok {
t.Fatal("expected usage extracted from input/output usage fields")
}
if got.PromptTokens != 11 || got.CompletionTokens != 29 || got.TotalTokens != 40 {
t.Fatalf("unexpected usage extracted: %#v", got)
}
}

View File

@@ -1,5 +1,7 @@
package util
import "ds2api/internal/config"
type StandardRequest struct {
Surface string
RequestedModel string
@@ -51,8 +53,17 @@ func (p ToolChoicePolicy) Allows(name string) bool {
}
func (r StandardRequest) CompletionPayload(sessionID string) map[string]any {
modelID := r.ResolvedModel
if modelID == "" {
modelID = r.RequestedModel
}
modelType := "default"
if resolvedType, ok := config.GetModelType(modelID); ok {
modelType = resolvedType
}
payload := map[string]any{
"chat_session_id": sessionID,
"model_type": modelType,
"parent_message_id": nil,
"prompt": r.FinalPrompt,
"ref_file_ids": []any{},

View File

@@ -0,0 +1,49 @@
package util
import "testing"
func TestStandardRequestCompletionPayloadSetsModelTypeFromResolvedModel(t *testing.T) {
tests := []struct {
name string
model string
thinking bool
search bool
modelType string
}{
{name: "default", model: "deepseek-chat", thinking: false, search: false, modelType: "default"},
{name: "expert", model: "deepseek-expert-reasoner", thinking: true, search: false, modelType: "expert"},
{name: "vision", model: "deepseek-vision-chat-search", thinking: false, search: true, modelType: "vision"},
}
for _, tc := range tests {
t.Run(tc.name, func(t *testing.T) {
req := StandardRequest{
ResolvedModel: tc.model,
FinalPrompt: "hello",
Thinking: tc.thinking,
Search: tc.search,
PassThrough: map[string]any{
"temperature": 0.3,
},
}
payload := req.CompletionPayload("session-123")
if got := payload["model_type"]; got != tc.modelType {
t.Fatalf("expected model_type %s, got %#v", tc.modelType, got)
}
if got := payload["chat_session_id"]; got != "session-123" {
t.Fatalf("unexpected chat_session_id: %#v", got)
}
if got := payload["thinking_enabled"]; got != tc.thinking {
t.Fatalf("unexpected thinking_enabled: %#v", got)
}
if got := payload["search_enabled"]; got != tc.search {
t.Fatalf("unexpected search_enabled: %#v", got)
}
if got := payload["temperature"]; got != 0.3 {
t.Fatalf("expected passthrough temperature, got %#v", got)
}
})
}
}
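`config.GetModelType` itself is not shown in this diff. Going only by the model names the tests above exercise (`deepseek-expert-reasoner` → `expert`, `deepseek-vision-chat-search` → `vision`, `deepseek-chat` → `default`), one plausible shape is a keyword match with a `default` fallback; the sketch below is entirely hypothetical, not the project's implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// modelTypeFor is a hypothetical stand-in for config.GetModelType: it maps a
// resolved model ID onto a DeepSeek model_type family by keyword.
func modelTypeFor(modelID string) (string, bool) {
	switch {
	case modelID == "":
		return "", false
	case strings.Contains(modelID, "expert"):
		return "expert", true
	case strings.Contains(modelID, "vision"):
		return "vision", true
	default:
		return "default", true
	}
}

func main() {
	mt, _ := modelTypeFor("deepseek-vision-chat-search")
	fmt.Println(mt) // vision
}
```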

pow/README.md (new file, 64 lines)

@@ -0,0 +1,64 @@
# DeepSeek PoW pure-Go implementation
Replaces `internal/deepseek/assets/sha3_wasm_bg.*.wasm` + the wazero runtime.
## Algorithm
DeepSeekHashV1 = SHA3-256, except **Keccak-f[1600] skips round 0** (only rounds 1..23 run). All other parameters are unchanged:
rate=136, padding=0x06+0x80, output=32 bytes.
PoW protocol: the server picks answer ∈ [0, difficulty) and computes `challenge = hash(prefix + str(answer))`;
the client scans [0, difficulty) for the matching nonce.
```
prefix = salt + "_" + str(expire_at) + "_"
input = (prefix + str(nonce)).encode("utf-8")
hash = DeepSeekHashV1(input) → 32 bytes
header = base64(json({algorithm, challenge, salt, answer, signature, target_path}))
```
## Performance (Apple M4, Go 1.25)
```
BenchmarkHash 187.5 ns/op 0 alloc → 5.33M hash/s
BenchmarkSolve 13.4 ms/op 2 alloc → 75 solves/s per core (difficulty=144000)
```
Versus calling the WASM through wazero: hash is **5×** faster, solve is **2.8×** faster.
## Tests
```bash
cd pow && go test -v ./... && go test -bench=. -benchmem
```
## Replacing the WASM
Replace `PowSolver.Compute` in `internal/deepseek/pow.go`:
```go
// Before: calls wasm_solve(retptr, chPtr, chLen, prefixPtr, prefixLen, difficulty)
// After:
import "ds2api/pow"
func (c *Client) GetPow(ctx context.Context, a *auth.RequestAuth, ...) (string, error) {
	// ... token/retry logic omitted; only the compute step changes ...
	challenge, _ := bizData["challenge"].(map[string]any)
	ch := &pow.Challenge{
		Algorithm: challenge["algorithm"].(string),
		Challenge: challenge["challenge"].(string),
		Salt: challenge["salt"].(string),
		ExpireAt: int64(challenge["expire_at"].(float64)),
		Difficulty: int64(challenge["difficulty"].(float64)),
		Signature: challenge["signature"].(string),
		TargetPath: challenge["target_path"].(string),
	}
	return pow.SolveAndBuildHeader(ctx, ch)
}
```
The following can then be deleted:
- `internal/deepseek/assets/sha3_wasm_bg.*.wasm`
- `internal/deepseek/embedded_pow.go`
- The `PowSolver` struct and the wazero pooling code in `internal/deepseek/pow.go`
- The `github.com/tetratelabs/wazero` dependency in `go.mod`

pow/deepseek_hash.go (new file, 153 lines)

@@ -0,0 +1,153 @@
// Package pow provides a pure-Go implementation of DeepSeekHashV1.
// DeepSeekHashV1 = SHA3-256 with Keccak-f[1600] round 0 skipped (only rounds 1..23 run).
package pow
import "encoding/binary"
var rc = [24]uint64{
0x0000000000000001, 0x0000000000008082, 0x800000000000808A, 0x8000000080008000,
0x000000000000808B, 0x0000000080000001, 0x8000000080008081, 0x8000000000008009,
0x000000000000008A, 0x0000000000000088, 0x0000000080008009, 0x000000008000000A,
0x000000008000808B, 0x800000000000008B, 0x8000000000008089, 0x8000000000008003,
0x8000000000008002, 0x8000000000000080, 0x000000000000800A, 0x800000008000000A,
0x8000000080008081, 0x8000000000008080, 0x0000000080000001, 0x8000000080008008,
}
func rotl64(v uint64, k uint) uint64 { return v<<k | v>>(64-k) }
func keccakF23(s *[25]uint64) {
a0, a1, a2, a3, a4 := s[0], s[1], s[2], s[3], s[4]
a5, a6, a7, a8, a9 := s[5], s[6], s[7], s[8], s[9]
a10, a11, a12, a13, a14 := s[10], s[11], s[12], s[13], s[14]
a15, a16, a17, a18, a19 := s[15], s[16], s[17], s[18], s[19]
a20, a21, a22, a23, a24 := s[20], s[21], s[22], s[23], s[24]
for r := 1; r < 24; r++ {
c0 := a0 ^ a5 ^ a10 ^ a15 ^ a20
c1 := a1 ^ a6 ^ a11 ^ a16 ^ a21
c2 := a2 ^ a7 ^ a12 ^ a17 ^ a22
c3 := a3 ^ a8 ^ a13 ^ a18 ^ a23
c4 := a4 ^ a9 ^ a14 ^ a19 ^ a24
d0 := c4 ^ rotl64(c1, 1)
d1 := c0 ^ rotl64(c2, 1)
d2 := c1 ^ rotl64(c3, 1)
d3 := c2 ^ rotl64(c4, 1)
d4 := c3 ^ rotl64(c0, 1)
a0 ^= d0
a5 ^= d0
a10 ^= d0
a15 ^= d0
a20 ^= d0
a1 ^= d1
a6 ^= d1
a11 ^= d1
a16 ^= d1
a21 ^= d1
a2 ^= d2
a7 ^= d2
a12 ^= d2
a17 ^= d2
a22 ^= d2
a3 ^= d3
a8 ^= d3
a13 ^= d3
a18 ^= d3
a23 ^= d3
a4 ^= d4
a9 ^= d4
a14 ^= d4
a19 ^= d4
a24 ^= d4
b0 := a0
b10 := rotl64(a1, 1)
b20 := rotl64(a2, 62)
b5 := rotl64(a3, 28)
b15 := rotl64(a4, 27)
b16 := rotl64(a5, 36)
b1 := rotl64(a6, 44)
b11 := rotl64(a7, 6)
b21 := rotl64(a8, 55)
b6 := rotl64(a9, 20)
b7 := rotl64(a10, 3)
b17 := rotl64(a11, 10)
b2 := rotl64(a12, 43)
b12 := rotl64(a13, 25)
b22 := rotl64(a14, 39)
b23 := rotl64(a15, 41)
b8 := rotl64(a16, 45)
b18 := rotl64(a17, 15)
b3 := rotl64(a18, 21)
b13 := rotl64(a19, 8)
b14 := rotl64(a20, 18)
b24 := rotl64(a21, 2)
b9 := rotl64(a22, 61)
b19 := rotl64(a23, 56)
b4 := rotl64(a24, 14)
a0 = b0 ^ (^b1 & b2)
a1 = b1 ^ (^b2 & b3)
a2 = b2 ^ (^b3 & b4)
a3 = b3 ^ (^b4 & b0)
a4 = b4 ^ (^b0 & b1)
a5 = b5 ^ (^b6 & b7)
a6 = b6 ^ (^b7 & b8)
a7 = b7 ^ (^b8 & b9)
a8 = b8 ^ (^b9 & b5)
a9 = b9 ^ (^b5 & b6)
a10 = b10 ^ (^b11 & b12)
a11 = b11 ^ (^b12 & b13)
a12 = b12 ^ (^b13 & b14)
a13 = b13 ^ (^b14 & b10)
a14 = b14 ^ (^b10 & b11)
a15 = b15 ^ (^b16 & b17)
a16 = b16 ^ (^b17 & b18)
a17 = b17 ^ (^b18 & b19)
a18 = b18 ^ (^b19 & b15)
a19 = b19 ^ (^b15 & b16)
a20 = b20 ^ (^b21 & b22)
a21 = b21 ^ (^b22 & b23)
a22 = b22 ^ (^b23 & b24)
a23 = b23 ^ (^b24 & b20)
a24 = b24 ^ (^b20 & b21)
a0 ^= rc[r]
}
s[0], s[1], s[2], s[3], s[4] = a0, a1, a2, a3, a4
s[5], s[6], s[7], s[8], s[9] = a5, a6, a7, a8, a9
s[10], s[11], s[12], s[13], s[14] = a10, a11, a12, a13, a14
s[15], s[16], s[17], s[18], s[19] = a15, a16, a17, a18, a19
s[20], s[21], s[22], s[23], s[24] = a20, a21, a22, a23, a24
}
// DeepSeekHashV1 returns the 32-byte digest of data, equivalent to the WASM wasm_deepseek_hash_v1.
func DeepSeekHashV1(data []byte) [32]byte {
const rate = 136
var s [25]uint64
off := 0
for off+rate <= len(data) {
for i := 0; i < rate/8; i++ {
s[i] ^= binary.LittleEndian.Uint64(data[off+i*8:])
}
keccakF23(&s)
off += rate
}
var final [rate]byte
copy(final[:], data[off:])
final[len(data)-off] = 0x06
final[rate-1] |= 0x80
for i := 0; i < rate/8; i++ {
s[i] ^= binary.LittleEndian.Uint64(final[i*8:])
}
keccakF23(&s)
var out [32]byte
binary.LittleEndian.PutUint64(out[0:], s[0])
binary.LittleEndian.PutUint64(out[8:], s[1])
binary.LittleEndian.PutUint64(out[16:], s[2])
binary.LittleEndian.PutUint64(out[24:], s[3])
return out
}

pow/deepseek_pow.go (new file, 147 lines)

@@ -0,0 +1,147 @@
package pow
import (
"context"
"encoding/base64"
"encoding/binary"
"encoding/hex"
"encoding/json"
"errors"
"strconv"
)
// Challenge mirrors the data.biz_data.challenge object returned by /api/v0/chat/create_pow_challenge.
type Challenge struct {
Algorithm string `json:"algorithm"`
Challenge string `json:"challenge"`
Salt string `json:"salt"`
ExpireAt int64 `json:"expire_at"`
Difficulty int64 `json:"difficulty"`
Signature string `json:"signature"`
TargetPath string `json:"target_path"`
}
// BuildPrefix returns "<salt>_<expire_at>_" (matches pow.go:89).
func BuildPrefix(salt string, expireAt int64) string {
return salt + "_" + strconv.FormatInt(expireAt, 10) + "_"
}
// SolvePow searches for nonce ∈ [0, difficulty) such that DeepSeekHashV1(prefix+str(nonce)) == challenge.
// The prefix is pre-absorbed into the state; the search loop performs zero allocations.
func SolvePow(ctx context.Context, challengeHex, salt string, expireAt, difficulty int64) (int64, error) {
if len(challengeHex) != 64 {
return 0, errors.New("pow: challenge must be 64 hex chars")
}
target, err := hex.DecodeString(challengeHex)
if err != nil {
return 0, err
}
var ta [32]byte
copy(ta[:], target)
t0 := binary.LittleEndian.Uint64(ta[0:])
t1 := binary.LittleEndian.Uint64(ta[8:])
t2 := binary.LittleEndian.Uint64(ta[16:])
t3 := binary.LittleEndian.Uint64(ta[24:])
prefix := []byte(BuildPrefix(salt, expireAt))
const rate = 136
var baseState [25]uint64
off := 0
for off+rate <= len(prefix) {
for i := 0; i < rate/8; i++ {
baseState[i] ^= binary.LittleEndian.Uint64(prefix[off+i*8:])
}
keccakF23(&baseState)
off += rate
}
tailLen := len(prefix) - off
var tail [rate]byte
copy(tail[:], prefix[off:])
var numBuf [20]byte
for n := int64(0); n < difficulty; n++ {
// Periodically check if context is canceled to avoid wasting CPU
if n&0x3FF == 0 {
if err := ctx.Err(); err != nil {
return 0, err
}
}
v := uint64(n)
pos := 20
if v == 0 {
pos--
numBuf[pos] = '0'
} else {
for v > 0 {
pos--
numBuf[pos] = byte('0' + v%10)
v /= 10
}
}
numLen := 20 - pos
s := baseState
totalTail := tailLen + numLen
if totalTail < rate {
var buf [rate]byte
copy(buf[:tailLen], tail[:tailLen])
copy(buf[tailLen:totalTail], numBuf[pos:])
buf[totalTail] = 0x06
buf[rate-1] |= 0x80
for i := 0; i < rate/8; i++ {
s[i] ^= binary.LittleEndian.Uint64(buf[i*8:])
}
keccakF23(&s)
} else {
var buf [rate]byte
copy(buf[:tailLen], tail[:tailLen])
copy(buf[tailLen:rate], numBuf[pos:pos+(rate-tailLen)])
for i := 0; i < rate/8; i++ {
s[i] ^= binary.LittleEndian.Uint64(buf[i*8:])
}
keccakF23(&s)
var buf2 [rate]byte
rem := totalTail - rate
copy(buf2[:rem], numBuf[pos+(rate-tailLen):pos+(rate-tailLen)+rem])
buf2[rem] = 0x06
buf2[rate-1] |= 0x80
for i := 0; i < rate/8; i++ {
s[i] ^= binary.LittleEndian.Uint64(buf2[i*8:])
}
keccakF23(&s)
}
if s[0] == t0 && s[1] == t1 && s[2] == t2 && s[3] == t3 {
return n, nil
}
}
return 0, errors.New("pow: no solution within difficulty")
}
// BuildPowHeader serializes {algorithm,challenge,salt,answer,signature,target_path} as base64(JSON).
// difficulty/expire_at are excluded (matches pow.go:218).
func BuildPowHeader(c *Challenge, answer int64) (string, error) {
b, err := json.Marshal(map[string]any{
"algorithm": c.Algorithm, "challenge": c.Challenge, "salt": c.Salt,
"answer": answer, "signature": c.Signature, "target_path": c.TargetPath,
})
if err != nil {
return "", err
}
return base64.StdEncoding.EncodeToString(b), nil
}
// SolveAndBuildHeader is the end-to-end path: Challenge → x-ds-pow-response header string.
func SolveAndBuildHeader(ctx context.Context, c *Challenge) (string, error) {
if c.Algorithm != "DeepSeekHashV1" {
return "", errors.New("pow: unsupported algorithm: " + c.Algorithm)
}
d := c.Difficulty
if d == 0 {
d = 144000
}
answer, err := SolvePow(ctx, c.Challenge, c.Salt, c.ExpireAt, d)
if err != nil {
return "", err
}
return BuildPowHeader(c, answer)
}
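`SolvePow` keeps its hot loop allocation-free by rendering each candidate nonce into a fixed stack buffer, writing digits backwards from the end, instead of calling `strconv.FormatInt`. The trick in isolation (helper name is illustrative):

```go
package main

import "fmt"

// appendNonce writes the ASCII decimal form of n into buf (len(buf) >= 20)
// and returns the starting offset of the digits, digits-backwards style.
func appendNonce(buf []byte, n uint64) int {
	pos := len(buf)
	if n == 0 {
		pos--
		buf[pos] = '0'
		return pos
	}
	for n > 0 {
		pos--
		buf[pos] = byte('0' + n%10)
		n /= 10
	}
	return pos
}

func main() {
	var buf [20]byte // 20 bytes covers any uint64
	pos := appendNonce(buf[:], 144000)
	fmt.Println(string(buf[pos:])) // 144000
}
```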

pow/deepseek_pow_test.go (new file, 80 lines)

@@ -0,0 +1,80 @@
package pow
import (
"context"
"encoding/base64"
"encoding/hex"
"encoding/json"
"strconv"
"testing"
)
// Test vectors were generated by calling the official DeepSeek WASM directly.
func TestDeepSeekHashV1(t *testing.T) {
for _, tc := range []struct{ in, want string }{
{"", "e594808bc5b7151ac160c6d39a02e0a8e261ed588578403099e3561dc40c26b3"},
{"testsalt_1700000000_42", "d4a2ea58c89e40887c933484868380c6f803eaa8dc53a3b9df8e431b921a4f09"},
{"testsalt_1700000000_100000", "abea2f35796b65486e9be1b36f7878c66cab021e96faa473fdf4decd31f9ba30"},
{"abc123salt_1700000000_12345", "74b3b7452745b70e85eb32ee7f0a9ec0381d42dd5137b695da915e104fc390e1"},
} {
h := DeepSeekHashV1([]byte(tc.in))
got := hex.EncodeToString(h[:])
if got != tc.want {
t.Errorf("hash(%q) = %s, want %s", tc.in, got, tc.want)
}
}
}
func TestSolvePow(t *testing.T) {
for _, tc := range []struct {
salt string
expire int64
answer int64
diff int64
}{
{"testsalt", 1700000000, 42, 1000},
{"testsalt", 1700000000, 500, 2000},
{"abc123salt", 1700000000, 12345, 20000},
} {
h := DeepSeekHashV1([]byte(BuildPrefix(tc.salt, tc.expire) + strconv.FormatInt(tc.answer, 10)))
got, err := SolvePow(context.Background(), hex.EncodeToString(h[:]), tc.salt, tc.expire, tc.diff)
if err != nil || got != tc.answer {
t.Errorf("salt=%q answer=%d: got=%d err=%v", tc.salt, tc.answer, got, err)
}
}
}
func TestSolveAndBuildHeader(t *testing.T) {
t0 := DeepSeekHashV1([]byte("salt_1712345678_777"))
header, err := SolveAndBuildHeader(context.Background(), &Challenge{
Algorithm: "DeepSeekHashV1", Challenge: hex.EncodeToString(t0[:]),
Salt: "salt", ExpireAt: 1712345678, Difficulty: 2000,
Signature: "sig", TargetPath: "/api/v0/chat/completion",
})
if err != nil {
t.Fatal(err)
}
raw, _ := base64.StdEncoding.DecodeString(header)
var m map[string]any
if err := json.Unmarshal(raw, &m); err != nil {
t.Fatal(err)
}
if int64(m["answer"].(float64)) != 777 {
t.Errorf("answer = %v, want 777", m["answer"])
}
}
func BenchmarkHash(b *testing.B) {
d := []byte("realisticsalt_1712345678_12345")
for i := 0; i < b.N; i++ {
DeepSeekHashV1(d)
}
}
func BenchmarkSolve(b *testing.B) {
h := DeepSeekHashV1([]byte("realisticsalt_1712345678_72000"))
ch := hex.EncodeToString(h[:])
for i := 0; i < b.N; i++ {
_, _ = SolvePow(context.Background(), ch, "realisticsalt", 1712345678, 144000)
}
}

View File

@@ -3,6 +3,6 @@
"finished": true,
"new_type": "text",
"content_filter": true,
-"output_tokens": 77,
+"output_tokens": 0,
"error_message": ""
}

Some files were not shown because too many files have changed in this diff.