diff --git a/API.en.md b/API.en.md
index 7c29663..6e93202 100644
--- a/API.en.md
+++ b/API.en.md
@@ -243,6 +243,8 @@ Retired historical families such as `claude-1.*`, `claude-2.*`, `claude-instant-
 ### `POST /v1/chat/completions`
 
+> Path note: besides the canonical `/v1/chat/completions`, DS2API also accepts the root shortcut `/chat/completions`. On Vercel Runtime, `stream=true` on either path is handled by the Node streaming bridge, while non-stream stays on the Go primary path.
+
 **Headers**:
 
 ```http
diff --git a/API.md b/API.md
index ff5f6c1..0470fd5 100644
--- a/API.md
+++ b/API.md
@@ -249,6 +249,8 @@ OpenAI `/v1/*` 仍是规范路径。对于只配置 DS2API 根地址的客户端
 ### `POST /v1/chat/completions`
 
+> 路径说明:除规范路径 `/v1/chat/completions` 外,也支持根路径快捷别名 `/chat/completions`;在 Vercel Runtime 上,这两个路径的 `stream=true` 请求都会进入 Node 流式桥接逻辑,非流式仍走 Go 主链路。
+
 **请求头**:
 
 ```http
diff --git a/README.MD b/README.MD
index 3b8e841..3edf3b8 100644
--- a/README.MD
+++ b/README.MD
@@ -295,7 +295,7 @@ cp config.example.json config.json
 base64 < config.json | tr -d '\n'
 ```
 
-> **流式说明**:`/v1/chat/completions` 在 Vercel 上默认走 `api/chat-stream.js`(Node Runtime)以保证实时 SSE。鉴权、账号选择、会话/PoW 准备仍由 Go 内部 prepare 接口完成;流式响应(含 `tools`)在 Node 侧执行与 Go 对齐的输出组装与防泄漏处理。虽然这里只有 OpenAI chat 流式走 Node,但 CORS 放行策略仍与 Go 主路由保持一致,统一覆盖第三方客户端预检场景。
+> **流式说明**:OpenAI Chat 流式在 Vercel 上会由 `api/chat-stream.js`(Node Runtime)承接,支持规范路径 `/v1/chat/completions` 与根路径快捷别名 `/chat/completions`。鉴权、账号选择、会话/PoW 准备仍由 Go 内部 prepare 接口完成;流式响应(含 `tools`)在 Node 侧执行与 Go 对齐的输出组装与防泄漏处理。虽然这里只有 OpenAI chat 流式走 Node,但 CORS 放行策略仍与 Go 主路由保持一致,统一覆盖第三方客户端预检场景。
 
 详细部署说明请参阅 [部署指南](docs/DEPLOY.md)。
diff --git a/README.en.md b/README.en.md
index e232f61..62503b6 100644
--- a/README.en.md
+++ b/README.en.md
@@ -283,7 +283,7 @@ Recommended: convert `config.json` to Base64 locally, then paste into `DS2API_CO
 base64 < config.json | tr -d '\n'
 ```
 
-> **Streaming note**: `/v1/chat/completions` on Vercel is routed to `api/chat-stream.js` (Node Runtime) for real-time SSE. Auth, account selection, and session/PoW preparation are still handled by the Go internal prepare endpoint; streaming output (including `tools`) is assembled on Node with Go-aligned anti-leak handling. This is the only interface family currently routed through Node, and its CORS allow behavior is kept aligned with the Go router so third-party preflight handling stays unified.
+> **Streaming note**: OpenAI Chat streaming on Vercel is routed to `api/chat-stream.js` (Node Runtime), with both the canonical `/v1/chat/completions` path and the root shortcut `/chat/completions` supported. Auth, account selection, and session/PoW preparation are still handled by the Go internal prepare endpoint; streaming output (including `tools`) is assembled on Node with Go-aligned anti-leak handling. This is the only interface family currently routed through Node, and its CORS allow behavior is kept aligned with the Go router so third-party preflight handling stays unified.
 
 For detailed deployment instructions, see the [Deployment Guide](docs/DEPLOY.en.md).
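
The dispatch rule the patch documents — canonical path plus root shortcut, with `stream=true` going to the Node bridge and non-stream staying on Go — can be sketched as follows. This is a minimal illustration, not DS2API's actual source; `pickRoute` and the route names are invented for this sketch:

```typescript
// Hypothetical sketch of the routing decision described in the docs above.
type Route = "node-stream-bridge" | "go-primary";

const CANONICAL = "/v1/chat/completions";
// Root shortcut alias accepted alongside the canonical path.
const CHAT_PATHS = new Set([CANONICAL, "/chat/completions"]);

function pickRoute(path: string, stream: boolean): Route | null {
  if (!CHAT_PATHS.has(path)) return null;  // not a chat-completions request
  return stream
    ? "node-stream-bridge"                 // SSE handled by the Node runtime
    : "go-primary";                        // non-stream stays on the Go path
}
```

Either spelling of the path resolves identically, so clients configured with only the DS2API root address still reach the streaming bridge.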