mirror of https://github.com/CJackHwang/ds2api.git
synced 2026-05-06 01:15:29 +08:00

Compare commits

15 Commits
| Author | SHA1 | Date |
| --- | --- | --- |
|  | 4678a061d0 |  |
|  | 70076c217f |  |
|  | 554fae6b3f |  |
|  | 76884c0d94 |  |
|  | 269d7cd8f9 |  |
|  | 7870a61bb0 |  |
|  | ec4f178908 |  |
|  | f413d42b0c |  |
|  | 5406f07938 |  |
|  | fe87ded82b |  |
|  | 8ace349f84 |  |
|  | 112bedb05d |  |
|  | c099a6f7bf |  |
|  | 5e55cf36d8 |  |
|  | 837dc74ffc |  |
API.en.md (18 changes)

@@ -660,11 +660,13 @@ Requires JWT: `Authorization: Bearer <jwt>`
 
 ### `GET /admin/vercel/config`
 
-Returns Vercel preconfiguration status.
+Returns Vercel preconfiguration status. Environment variables are preferred, then the saved `vercel` config block is used as a fallback.
 
 ```json
 {
   "has_token": true,
+  "token_preview": "vc****en",
+  "token_source": "config",
   "project_id": "prj_xxx",
   "team_id": null
 }
@@ -685,6 +687,12 @@ Returns sanitized config, including both `keys` and `api_keys`.
   "env_source_present": true,
   "env_writeback_enabled": true,
   "config_path": "/data/config.json",
+  "vercel": {
+    "has_token": true,
+    "token_preview": "vc****en",
+    "project_id": "prj_xxx",
+    "team_id": ""
+  },
   "accounts": [
     {
       "identifier": "user@example.com",
@@ -1096,11 +1104,11 @@ The success payload includes `sample_id`, `dir`, `meta_path`, and `upstream_path
 
 | Field | Required | Notes |
 | --- | --- | --- |
-| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read env |
-| `project_id` | ❌ | Fallback: `VERCEL_PROJECT_ID` |
-| `team_id` | ❌ | Fallback: `VERCEL_TEAM_ID` |
+| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read env, then saved config |
+| `project_id` | ❌ | Fallback: `VERCEL_PROJECT_ID`, then saved config |
+| `team_id` | ❌ | Fallback: `VERCEL_TEAM_ID`, then saved config |
 | `auto_validate` | ❌ | Default `true` |
-| `save_credentials` | ❌ | Default `true` |
+| `save_credentials` | ❌ | Default `true`; saves explicitly supplied Vercel credentials for the next sync |
 
 **Success response**:
 
API.md (22 changes)

@@ -42,7 +42,7 @@
 - The Tool Calling parsing strategy is kept consistent between the Go and Node runtimes: the model is encouraged to emit the DSML shell `<|DSML|tool_calls>` → `<|DSML|invoke name="...">` → `<|DSML|parameter name="...">`; the compatibility layer also accepts DSML wrapper aliases `<dsml|tool_calls>` and `<|tool_calls>`, common forms with a missing DSML delimiter (such as `<|DSML tool_calls>`), common typos where `DSML` is fused with the tool tag name (such as `<DSMLtool_calls>`), and the legacy canonical XML `<tool_calls>` → `<invoke name="...">` → `<parameter name="...">`. The implementation uses a narrow fault-tolerant structural scan: only a `tool_calls` wrapper, or a repairable missing opening wrapper, enters the tool path, and a bare `<invoke>` does not count as supported syntax; streaming continues to apply anti-leak filtering. If a parameter body is itself a valid JSON literal (such as `123`, `true`, `null`, an array, or an object), it is emitted as a structured value rather than always being treated as a string; if a CDATA block occasionally misses its closing marker, a narrow repair is applied during the final parse / flush recovery phase to preserve any fully wrapped outer tool calls.
 - The `Admin API` separates configuration from runtime policy: `/admin/config*` manages static configuration, `/admin/settings*` manages runtime behavior.
 - When the upstream returns a thinking-only response (the model produced a reasoning chain but no visible text), non-streaming completions automatically retry once: a multi-turn follow-up appends the prompt suffix `"Previous reply had no visible output. Please regenerate the visible final answer or tool call now."` and sets `parent_message_id` so the model regenerates within the same DeepSeek session; at most 1 retry.
-- Reference-marker stripping is currently a fixed-on runtime behavior, applied uniformly across all protocol adapters.
+- Reference-marker handling boundary: streaming output hides upstream internal placeholders such as `[citation:N]` / `[reference:N]` by default; non-streaming output converts DeepSeek search reference markers into Markdown citation links by default.
 
 ---
 
@@ -168,6 +168,8 @@ Gemini-compatible clients can also use `x-goog-api-key`, `?key=`, or `?api_key=`
 | GET | `/admin/chat-history/{id}` | Admin | View a single server-side chat record |
 | DELETE | `/admin/chat-history/{id}` | Admin | Delete a single server-side chat record |
 | PUT | `/admin/chat-history/settings` | Admin | Update the chat-record retention count |
 
+Server-side records are essentially archives of DeepSeek upstream responses: the generation endpoints that call DeepSeek directly (OpenAI Chat, OpenAI Responses, Claude Messages, Gemini GenerateContent) write a record after the upstream response arrives and before per-protocol back-translation/trimming; the list is shown newest-first by request creation time, and streaming requests keep refreshing status and detail during generation. Requests sent from the WebUI "API Test" page are recorded as well.
+
 | GET | `/admin/version` | Admin | Query the current version and the latest Release |
 
 OpenAI `/v1/*` remains the canonical path. For clients that only configure the DS2API root address, the same OpenAI handlers are also exposed via root shortcut routes: `/models`, `/models/{id}`, `/chat/completions`, `/responses`, `/responses/{response_id}`, `/embeddings`, `/files`, `/files/{file_id}`.
@@ -669,11 +671,13 @@ data: {"type":"message_stop"}
 
 ### `GET /admin/vercel/config`
 
-Returns the Vercel preconfiguration status.
+Returns the Vercel preconfiguration status. Environment variables are read first, with the saved `vercel` config block as a fallback.
 
 ```json
 {
   "has_token": true,
+  "token_preview": "vc****en",
+  "token_source": "config",
   "project_id": "prj_xxx",
   "team_id": null
 }
@@ -694,6 +698,12 @@ data: {"type":"message_stop"}
   "env_source_present": true,
   "env_writeback_enabled": true,
   "config_path": "/data/config.json",
+  "vercel": {
+    "has_token": true,
+    "token_preview": "vc****en",
+    "project_id": "prj_xxx",
+    "team_id": ""
+  },
   "accounts": [
     {
       "identifier": "user@example.com",
@@ -1107,11 +1117,11 @@ data: {"type":"message_stop"}
 
 | Field | Required | Notes |
 | --- | --- | --- |
-| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read the environment variable |
-| `project_id` | ❌ | If empty, read `VERCEL_PROJECT_ID` |
-| `team_id` | ❌ | If empty, read `VERCEL_TEAM_ID` |
+| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read the environment variable, then fall back to the saved config |
+| `project_id` | ❌ | If empty, read `VERCEL_PROJECT_ID`, then fall back to the saved config |
+| `team_id` | ❌ | If empty, read `VERCEL_TEAM_ID`, then fall back to the saved config |
 | `auto_validate` | ❌ | Default `true` |
-| `save_credentials` | ❌ | Default `true` |
+| `save_credentials` | ❌ | Default `true`; saves the explicitly supplied Vercel credentials for reuse in the next sync |
 
 **Success response**:
 
README.MD (10 changes)

@@ -23,6 +23,16 @@
 
 [Thanks to the developers in the Linux.do and GitHub communities for their support of and contributions to this project]
 
+## Star History
+
+<a href="https://www.star-history.com/?repos=cjackhwang%2Fds2api&type=date&legend=top-left">
+  <picture>
+    <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/chart?repos=cjackhwang/ds2api&type=date&theme=dark&legend=top-left" />
+    <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/chart?repos=cjackhwang/ds2api&type=date&legend=top-left" />
+    <img alt="Star History Chart" src="https://api.star-history.com/chart?repos=cjackhwang/ds2api&type=date&legend=top-left" />
+  </picture>
+</a>
+
 > **Important Disclaimer**
 >
 > This repository is provided for learning, research, personal experimentation, and internal validation only; it grants no commercial authorization of any kind and carries no warranty of fitness or results.
README.en.md (10 changes)

@@ -20,6 +20,16 @@ DS2API converts DeepSeek Web chat capability into OpenAI-compatible, Claude-comp
 
 Documentation entry: [Docs Index](docs/README.md) / [Architecture](docs/ARCHITECTURE.en.md) / [API Reference](API.en.md)
 
+## Star History
+
+<a href="https://www.star-history.com/?repos=cjackhwang%2Fds2api&type=date&legend=top-left">
+  <picture>
+    <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/chart?repos=cjackhwang/ds2api&type=date&theme=dark&legend=top-left" />
+    <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/chart?repos=cjackhwang/ds2api&type=date&legend=top-left" />
+    <img alt="Star History Chart" src="https://api.star-history.com/chart?repos=cjackhwang/ds2api&type=date&legend=top-left" />
+  </picture>
+</a>
+
 > **Important Disclaimer**
 >
 > This repository is provided for learning, research, personal experimentation, and internal validation only. It does not grant any commercial authorization and comes with no warranty of fitness, stability, or results.
@@ -74,6 +74,7 @@ gofmt -w <changed-go-files>
 
 - Admin API: `/admin/chat-history`, `/admin/chat-history/{id}`.
 - Backend storage: `internal/chathistory/store.go`.
+- Output archiving: `internal/responsehistory` records the DeepSeek upstream assistant text / thinking before per-protocol back-translation/trimming; even when tool calls have been converted to structured `tool_calls` in the outward response and stripped from the visible body, the background history should keep the original DSML / XML fragments to make format drift easier to diagnose.
 - Frontend polling and ETag: `webui/src/features/chatHistory/ChatHistoryContainer.jsx`.
 
 For tool-call issues, run first:
@@ -114,7 +114,8 @@ DS2API's current core idea is not to pass the client-supplied `messages` and `tools`
 - For the non-streaming finalization of OpenAI Chat / Responses, if the final visible body is empty, the compatibility layer first tries to parse standalone DSML / XML tool blocks in the reasoning chain as real tool calls. The streaming path performs the same fallback detection at finalization, but never intercepts or rewrites streaming output mid-flight because of reasoning-chain content; real tool detection is always based on the raw upstream text rather than a version that has already been cleaned for visible output, so even though the final visible layer strips complete leaked DSML / XML `tool_calls` wrappers and suppresses all-empty-parameter or invalid wrapper blocks, converting real tool calls into structured `tool_calls` / `function_call` is unaffected. The recovered result is returned as this turn's structured assistant `tool_calls` / `function_call` output rather than stuffed into the `content` text; if the client has not enabled thinking / reasoning, the reasoning chain is used only for detection and is never exposed as `reasoning_content` or visible body. Only when the body is empty and the reasoning chain contains no executable tool call does processing continue as an empty-reply error.
 - The empty-reply handling for OpenAI Chat / Responses previously performed one internal compensation retry by default: after the first upstream stream fully ends, if the final visible body is empty, no tool call was parsed, no tool call has already been streamed to the client, and the stop reason is not `content_filter`, the compatibility layer reuses the same `chat_session_id`, account, token, and tool policy, appends the fixed suffix `Previous reply had no visible output. Please regenerate the visible final answer or tool call now.` to the original completion `prompt`, and resubmits once. The retry follows the DeepSeek multi-turn protocol: it extracts `response_message_id` from the first upstream SSE stream and sets `parent_message_id` to that value in the retry payload, making the retry a follow-up turn in the same session rather than a detached root message; it also fetches a fresh PoW (falling back to the original PoW if that fails). The retry does not re-normalize messages, create a new session, switch accounts, or insert retry markers into the client stream; the second round's thinking / reasoning is appended directly after the first as normal deltas, deduplicated with overlap trim. If the second attempt is still empty, the terminal error code remains the existing `upstream_empty_output`; if any attempt triggers an empty `content_filter`, no compensation retry is performed and the `content_filter` error is kept. The JS Vercel runtime also sets `parent_message_id`, but reuses the original PoW because it cannot call the PoW API directly.
 
-- During final visible-body rendering, OpenAI Chat / Responses replace the `[citation:N]` / `[reference:N]` markers from DeepSeek search results with the corresponding Markdown links. `citation` markers are resolved as one-based indices; `reference` markers are mapped as zero-based only when `[reference:0]` (spaces after the colon allowed) appears in the same body, and they do not affect `citation` markers in the same body.
+- During final visible-body rendering, non-streaming OpenAI Chat / Responses, Claude Messages, and Gemini generateContent replace the `[citation:N]` / `[reference:N]` markers from DeepSeek search results with the corresponding Markdown links. `citation` markers are resolved as one-based indices; `reference` markers are mapped as zero-based only when `[reference:0]` (spaces after the colon allowed) appears in the same body, and they do not affect `citation` markers in the same body.
+- Streaming output still hides upstream internal markers such as `[citation:N]` / `[reference:N]` by default, to avoid leaking citation placeholders whose mapping is not yet complete in chunked output.
 
 ## 5. How the prompt is assembled
 
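The one-based `[citation:N]` rewriting described above can be roughly illustrated as follows. This is a minimal sketch, not the project's actual implementation; real handling also covers zero-based `[reference:N]` mapping and streaming suppression:

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// rewriteCitations replaces [citation:N] markers (one-based) with
// Markdown links using the collected citation URL map; markers with
// no matching link are left untouched.
func rewriteCitations(text string, links map[int]string) string {
	re := regexp.MustCompile(`\[citation:\s*(\d+)\]`)
	return re.ReplaceAllStringFunc(text, func(m string) string {
		n, err := strconv.Atoi(re.FindStringSubmatch(m)[1])
		if err != nil {
			return m
		}
		if url, ok := links[n]; ok {
			return fmt.Sprintf("[%d](%s)", n, url)
		}
		return m
	})
}

func main() {
	out := rewriteCitations("See [citation:1]", map[int]string{1: "https://example.com"})
	fmt.Println(out) // See [1](https://example.com)
}
```

The expected output matches the assertion in `TestBuildTurnFromCollectedTextCitation` below.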
@@ -11,7 +11,7 @@ func TestBuildTurnFromCollectedTextCitation(t *testing.T) {
 	turn := BuildTurnFromCollected(sse.CollectResult{
 		Text:          "See [citation:1]",
 		CitationLinks: map[int]string{1: "https://example.com"},
-	}, BuildOptions{Model: "deepseek-v4-flash", Prompt: "prompt", SearchEnabled: true, StripReferenceMarkers: true})
+	}, BuildOptions{Model: "deepseek-v4-flash", Prompt: "prompt", SearchEnabled: true})
 	if turn.Text != "See [1](https://example.com)" {
 		t.Fatalf("text mismatch: %q", turn.Text)
 	}
@@ -23,6 +23,20 @@ func TestBuildTurnFromCollectedTextCitation(t *testing.T) {
 	}
 }
 
+func TestBuildTurnFromCollectedKeepsNonStreamReferenceLinks(t *testing.T) {
+	turn := BuildTurnFromCollected(sse.CollectResult{
+		Text: "结论[reference:0],补充[reference:1]。",
+		CitationLinks: map[int]string{
+			1: "https://example.com/a",
+			2: "https://example.com/b",
+		},
+	}, BuildOptions{Model: "deepseek-v4-flash-search", Prompt: "prompt", SearchEnabled: true})
+	want := "结论[0](https://example.com/a),补充[1](https://example.com/b)。"
+	if turn.Text != want {
+		t.Fatalf("text mismatch: got %q want %q", turn.Text, want)
+	}
+}
+
 func TestBuildTurnFromCollectedToolCall(t *testing.T) {
 	turn := BuildTurnFromCollected(sse.CollectResult{
 		Text: `<tool_calls><invoke name="Write"><parameter name="content">{"x":1}</parameter></invoke></tool_calls>`,
@@ -43,6 +43,7 @@ type Entry struct {
 	Status    string `json:"status"`
 	CallerID  string `json:"caller_id,omitempty"`
 	AccountID string `json:"account_id,omitempty"`
+	Surface   string `json:"surface,omitempty"`
 	Model     string `json:"model,omitempty"`
 	Stream    bool   `json:"stream"`
 	UserInput string `json:"user_input,omitempty"`
@@ -72,6 +73,7 @@ type SummaryEntry struct {
 	Status    string `json:"status"`
 	CallerID  string `json:"caller_id,omitempty"`
 	AccountID string `json:"account_id,omitempty"`
+	Surface   string `json:"surface,omitempty"`
 	Model     string `json:"model,omitempty"`
 	Stream    bool   `json:"stream"`
 	UserInput string `json:"user_input,omitempty"`
@@ -92,6 +94,7 @@ type File struct {
 type StartParams struct {
 	CallerID  string
 	AccountID string
+	Surface   string
 	Model     string
 	Stream    bool
 	UserInput string
@@ -271,6 +274,7 @@ func (s *Store) Start(params StartParams) (Entry, error) {
 		Status:    "streaming",
 		CallerID:  strings.TrimSpace(params.CallerID),
 		AccountID: strings.TrimSpace(params.AccountID),
+		Surface:   strings.TrimSpace(params.Surface),
 		Model:     strings.TrimSpace(params.Model),
 		Stream:    params.Stream,
 		UserInput: strings.TrimSpace(params.UserInput),
@@ -546,10 +550,13 @@ func (s *Store) rebuildIndexLocked() {
 		summaries = append(summaries, summaryFromEntry(item))
 	}
 	sort.Slice(summaries, func(i, j int) bool {
-		if summaries[i].UpdatedAt == summaries[j].UpdatedAt {
-			return summaries[i].CreatedAt > summaries[j].CreatedAt
+		if summaries[i].CreatedAt == summaries[j].CreatedAt {
+			if summaries[i].Revision == summaries[j].Revision {
+				return summaries[i].UpdatedAt > summaries[j].UpdatedAt
+			}
+			return summaries[i].Revision > summaries[j].Revision
 		}
-		return summaries[i].UpdatedAt > summaries[j].UpdatedAt
+		return summaries[i].CreatedAt > summaries[j].CreatedAt
 	})
 	if s.state.Limit < DisabledLimit || !isAllowedLimit(s.state.Limit) {
 		s.state.Limit = DefaultLimit
@@ -593,6 +600,7 @@ func summaryFromEntry(item Entry) SummaryEntry {
 	Status:    item.Status,
 	CallerID:  item.CallerID,
 	AccountID: item.AccountID,
+	Surface:   item.Surface,
 	Model:     item.Model,
 	Stream:    item.Stream,
 	UserInput: item.UserInput,
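The comparator change in `rebuildIndexLocked` above switches the primary key from update time to creation time, so a streaming update to an old entry no longer moves it to the top of the list. A standalone sketch of that three-level ordering (field names taken from the diff; `summary` is a trimmed stand-in for `SummaryEntry`):

```go
package main

import (
	"fmt"
	"sort"
)

type summary struct {
	ID        string
	CreatedAt int64
	UpdatedAt int64
	Revision  int64
}

// orderSummaries applies the new comparator: creation time (newest
// first), then revision, then update time as the final tiebreak.
func orderSummaries(summaries []summary) []summary {
	sort.Slice(summaries, func(i, j int) bool {
		if summaries[i].CreatedAt == summaries[j].CreatedAt {
			if summaries[i].Revision == summaries[j].Revision {
				return summaries[i].UpdatedAt > summaries[j].UpdatedAt
			}
			return summaries[i].Revision > summaries[j].Revision
		}
		return summaries[i].CreatedAt > summaries[j].CreatedAt
	})
	return summaries
}

func main() {
	items := orderSummaries([]summary{
		{ID: "old", CreatedAt: 1, UpdatedAt: 9}, // refreshed by streaming
		{ID: "new", CreatedAt: 2, UpdatedAt: 2},
	})
	fmt.Println(items[0].ID) // new
}
```

Under the previous `UpdatedAt`-first comparator the refreshed "old" entry would have sorted first; the test added below pins down the new behavior.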
@@ -8,6 +8,7 @@ import (
 	"strings"
 	"sync"
 	"testing"
+	"time"
 	"unicode/utf8"
 )
 
@@ -494,6 +495,36 @@ func TestStoreWritesOnlyChangedDetailFiles(t *testing.T) {
 	}
 }
 
+func TestStoreOrdersByCreationTimeNotStreamingUpdates(t *testing.T) {
+	path := filepath.Join(t.TempDir(), "chat_history.json")
+	store := New(path)
+
+	first, err := store.Start(StartParams{UserInput: "first"})
+	if err != nil {
+		t.Fatalf("start first failed: %v", err)
+	}
+	time.Sleep(time.Millisecond)
+	second, err := store.Start(StartParams{UserInput: "second"})
+	if err != nil {
+		t.Fatalf("start second failed: %v", err)
+	}
+	time.Sleep(time.Millisecond)
+	if _, err := store.Update(first.ID, UpdateParams{Status: "streaming", Content: "still running"}); err != nil {
+		t.Fatalf("update first failed: %v", err)
+	}
+
+	snapshot, err := store.Snapshot()
+	if err != nil {
+		t.Fatalf("snapshot failed: %v", err)
+	}
+	if len(snapshot.Items) != 2 {
+		t.Fatalf("expected two items, got %#v", snapshot.Items)
+	}
+	if snapshot.Items[0].ID != second.ID || snapshot.Items[1].ID != first.ID {
+		t.Fatalf("expected creation-time order to stay stable, got %#v", snapshot.Items)
+	}
+}
+
 func TestUpdatePreservesContentWhenNewContentIsEmpty(t *testing.T) {
 	path := filepath.Join(t.TempDir(), "chat_history.json")
 	store := New(path)
@@ -119,6 +119,29 @@ func TestExecuteNonStreamWithRetryUsesParentMessageForEmptyRetry(t *testing.T) {
 	}
 }
 
+func TestExecuteNonStreamWithRetryConvertsReferenceMarkers(t *testing.T) {
+	ds := &fakeDeepSeekCaller{responses: []*http.Response{sseHTTPResponse(
+		http.StatusOK,
+		`data: {"p":"response/content","v":"答案[reference:0]。","citation":{"cite_index":0,"url":"https://example.com/ref"}}`,
+	)}}
+	stdReq := promptcompat.StandardRequest{
+		Surface:         "test",
+		ResponseModel:   "deepseek-v4-flash-search",
+		PromptTokenText: "prompt",
+		FinalPrompt:     "final prompt",
+		Search:          true,
+	}
+
+	result, outErr := ExecuteNonStreamWithRetry(context.Background(), ds, &auth.RequestAuth{}, stdReq, Options{})
+	if outErr != nil {
+		t.Fatalf("unexpected output error: %#v", outErr)
+	}
+	want := "答案[0](https://example.com/ref)。"
+	if result.Turn.Text != want {
+		t.Fatalf("text mismatch: got %q want %q", result.Turn.Text, want)
+	}
+}
+
 func TestStartCompletionAppliesCurrentInputFileGlobally(t *testing.T) {
 	ds := &fakeDeepSeekCaller{responses: []*http.Response{sseHTTPResponse(http.StatusOK, `data: {"p":"response/content","v":"ok"}`)}}
 	stdReq := promptcompat.StandardRequest{
@@ -48,6 +48,9 @@ func (c Config) MarshalJSON() ([]byte, error) {
 	if c.ThinkingInjection.Enabled != nil || strings.TrimSpace(c.ThinkingInjection.Prompt) != "" {
 		m["thinking_injection"] = c.ThinkingInjection
 	}
+	if strings.TrimSpace(c.Vercel.Token) != "" || strings.TrimSpace(c.Vercel.ProjectID) != "" || strings.TrimSpace(c.Vercel.TeamID) != "" {
+		m["vercel"] = NormalizeVercelConfig(c.Vercel)
+	}
 	if c.VercelSyncHash != "" {
 		m["_vercel_sync_hash"] = c.VercelSyncHash
 	}
@@ -125,6 +128,10 @@ func (c *Config) UnmarshalJSON(b []byte) error {
 		if err := json.Unmarshal(v, &c.ThinkingInjection); err != nil {
 			return fmt.Errorf("invalid field %q: %w", k, err)
 		}
+	case "vercel":
+		if err := json.Unmarshal(v, &c.Vercel); err != nil {
+			return fmt.Errorf("invalid field %q: %w", k, err)
+		}
 	case "_vercel_sync_hash":
 		if err := json.Unmarshal(v, &c.VercelSyncHash); err != nil {
 			return fmt.Errorf("invalid field %q: %w", k, err)
@@ -164,6 +171,7 @@ func (c Config) Clone() Config {
 		Enabled: cloneBoolPtr(c.ThinkingInjection.Enabled),
 		Prompt:  c.ThinkingInjection.Prompt,
 	},
+	Vercel:           c.Vercel,
 	VercelSyncHash:   c.VercelSyncHash,
 	VercelSyncTime:   c.VercelSyncTime,
 	AdditionalFields: map[string]any{},
@@ -20,6 +20,7 @@ type Config struct {
 	AutoDelete        AutoDeleteConfig        `json:"auto_delete"`
 	CurrentInputFile  CurrentInputFileConfig  `json:"current_input_file,omitempty"`
 	ThinkingInjection ThinkingInjectionConfig `json:"thinking_injection,omitempty"`
+	Vercel            VercelConfig            `json:"vercel,omitempty"`
 	VercelSyncHash    string                  `json:"_vercel_sync_hash,omitempty"`
 	VercelSyncTime    int64                   `json:"_vercel_sync_time,omitempty"`
 	AdditionalFields  map[string]any          `json:"-"`
@@ -99,6 +100,7 @@ func (c *Config) NormalizeCredentials() {
 		c.Accounts[i].Remark = strings.TrimSpace(c.Accounts[i].Remark)
 	}
 
+	c.Vercel = NormalizeVercelConfig(c.Vercel)
 	c.normalizeModelAliases()
 }
 
@@ -175,3 +177,24 @@ type ThinkingInjectionConfig struct {
 	Enabled *bool  `json:"enabled,omitempty"`
 	Prompt  string `json:"prompt,omitempty"`
 }
+
+type VercelConfig struct {
+	Token     string `json:"token,omitempty"`
+	ProjectID string `json:"project_id,omitempty"`
+	TeamID    string `json:"team_id,omitempty"`
+}
+
+func NormalizeVercelConfig(v VercelConfig) VercelConfig {
+	return VercelConfig{
+		Token:     strings.TrimSpace(v.Token),
+		ProjectID: strings.TrimSpace(v.ProjectID),
+		TeamID:    strings.TrimSpace(v.TeamID),
+	}
+}
+
+func (c *Config) ClearVercelCredentials() {
+	if c == nil {
+		return
+	}
+	c.Vercel = VercelConfig{}
+}
@@ -173,6 +173,11 @@ func TestConfigJSONRoundtrip(t *testing.T) {
 	Runtime: RuntimeConfig{
 		TokenRefreshIntervalHours: 12,
 	},
+	Vercel: VercelConfig{
+		Token:     " vercel-token ",
+		ProjectID: " prj_123 ",
+		TeamID:    " team_123 ",
+	},
 	VercelSyncHash: "hash123",
 	VercelSyncTime: 1234567890,
 	AdditionalFields: map[string]any{
@@ -205,6 +210,9 @@ func TestConfigJSONRoundtrip(t *testing.T) {
 	if decoded.AutoDelete.Mode != "single" {
 		t.Fatalf("unexpected auto delete mode: %#v", decoded.AutoDelete.Mode)
 	}
+	if decoded.Vercel.Token != "vercel-token" || decoded.Vercel.ProjectID != "prj_123" || decoded.Vercel.TeamID != "team_123" {
+		t.Fatalf("unexpected vercel config: %#v", decoded.Vercel)
+	}
 	if decoded.VercelSyncHash != "hash123" {
 		t.Fatalf("unexpected vercel sync hash: %q", decoded.VercelSyncHash)
 	}
@@ -15,5 +15,6 @@ type Handler struct {
 
 var writeJSON = adminshared.WriteJSON
 var intFrom = adminshared.IntFrom
+var maskSecretPreview = adminshared.MaskSecretPreview
 
 func nilIfEmpty(s string) any { return adminshared.NilIfEmpty(s) }
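The `token_preview` values shown in the API docs above ("vc****en") suggest a keep-first-two/keep-last-two masking scheme. The following is a hypothetical reimplementation for illustration only; the real `adminshared.MaskSecretPreview` is not shown in this diff and may behave differently:

```go
package main

import (
	"fmt"
	"strings"
)

// maskPreview keeps the first and last two characters of a secret and
// hides the middle; short secrets are fully masked. Hypothetical
// stand-in for adminshared.MaskSecretPreview.
func maskPreview(s string) string {
	if len(s) <= 4 {
		return strings.Repeat("*", len(s))
	}
	return s[:2] + "****" + s[len(s)-2:]
}

func main() {
	fmt.Println(maskPreview("vercel-token")) // ve****en
}
```

The point of the preview is that the admin UI can confirm which token is active without the endpoint ever returning the full secret, which the new test below also asserts.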
@@ -61,9 +61,34 @@ func (h *Handler) verify(w http.ResponseWriter, r *http.Request) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
func (h *Handler) getVercelConfig(w http.ResponseWriter, _ *http.Request) {
|
func (h *Handler) getVercelConfig(w http.ResponseWriter, _ *http.Request) {
|
||||||
|
saved := h.Store.Snapshot().Vercel
|
||||||
|
token, tokenSource := firstConfiguredValue(
|
||||||
|
[2]string{"env", os.Getenv("VERCEL_TOKEN")},
|
||||||
|
[2]string{"config", saved.Token},
|
||||||
|
)
|
||||||
|
projectID, _ := firstConfiguredValue(
|
||||||
|
[2]string{"env", os.Getenv("VERCEL_PROJECT_ID")},
|
||||||
|
[2]string{"config", saved.ProjectID},
|
||||||
|
)
|
||||||
|
teamID, _ := firstConfiguredValue(
|
||||||
|
[2]string{"env", os.Getenv("VERCEL_TEAM_ID")},
|
||||||
|
[2]string{"config", saved.TeamID},
|
||||||
|
)
|
||||||
writeJSON(w, http.StatusOK, map[string]any{
|
writeJSON(w, http.StatusOK, map[string]any{
|
||||||
"has_token": strings.TrimSpace(os.Getenv("VERCEL_TOKEN")) != "",
|
"has_token": token != "",
|
||||||
"project_id": strings.TrimSpace(os.Getenv("VERCEL_PROJECT_ID")),
|
"token_preview": maskSecretPreview(token),
|
||||||
"team_id": nilIfEmpty(strings.TrimSpace(os.Getenv("VERCEL_TEAM_ID"))),
|
"token_source": nilIfEmpty(tokenSource),
|
||||||
|
"project_id": projectID,
|
||||||
|
"team_id": nilIfEmpty(teamID),
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func firstConfiguredValue(values ...[2]string) (string, string) {
|
||||||
|
for _, pair := range values {
|
||||||
|
value := strings.TrimSpace(pair[1])
|
||||||
|
if value != "" {
|
||||||
|
return value, strings.TrimSpace(pair[0])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return "", ""
|
||||||
|
}
|
||||||
|
|||||||
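The env-then-config precedence that `getVercelConfig` relies on can be exercised in isolation. The sketch below reuses the `firstConfiguredValue` helper exactly as the patch adds it; the surrounding `main` and the `savedToken` stand-in are illustrative, not project code.

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// firstConfiguredValue returns the first non-blank value together with the
// label of the source it came from, mirroring the helper added in the patch.
func firstConfiguredValue(values ...[2]string) (string, string) {
	for _, pair := range values {
		if value := strings.TrimSpace(pair[1]); value != "" {
			return value, strings.TrimSpace(pair[0])
		}
	}
	return "", ""
}

func main() {
	os.Unsetenv("VERCEL_TOKEN")
	savedToken := "saved-token" // stand-in for the persisted config block
	token, source := firstConfiguredValue(
		[2]string{"env", os.Getenv("VERCEL_TOKEN")},
		[2]string{"config", savedToken},
	)
	fmt.Println(token, source) // config wins only because the env var is blank
}
```

Because each candidate is trimmed before the emptiness check, a whitespace-only environment variable is treated the same as an unset one, which is what lets the saved config block act as a true fallback.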
internal/httpapi/admin/auth/handler_auth_test.go (new file, 38 lines)
@@ -0,0 +1,38 @@
+package auth
+
+import (
+    "encoding/json"
+    "net/http"
+    "net/http/httptest"
+    "testing"
+
+    "ds2api/internal/config"
+)
+
+func TestGetVercelConfigFallsBackToSavedConfig(t *testing.T) {
+    t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"],"vercel":{"token":"saved-token","project_id":"saved-project","team_id":"saved-team"}}`)
+    t.Setenv("VERCEL_TOKEN", "")
+    t.Setenv("VERCEL_PROJECT_ID", "")
+    t.Setenv("VERCEL_TEAM_ID", "")
+    h := &Handler{Store: config.LoadStore()}
+
+    rec := httptest.NewRecorder()
+    h.getVercelConfig(rec, httptest.NewRequest(http.MethodGet, "/admin/vercel/config", nil))
+
+    if rec.Code != http.StatusOK {
+        t.Fatalf("expected 200, got %d: %s", rec.Code, rec.Body.String())
+    }
+    var payload map[string]any
+    if err := json.Unmarshal(rec.Body.Bytes(), &payload); err != nil {
+        t.Fatalf("decode response: %v", err)
+    }
+    if payload["has_token"] != true {
+        t.Fatalf("expected saved token to be detected: %#v", payload)
+    }
+    if payload["token_source"] != "config" || payload["project_id"] != "saved-project" || payload["team_id"] != "saved-team" {
+        t.Fatalf("unexpected preconfig payload: %#v", payload)
+    }
+    if payload["token_preview"] == "saved-token" {
+        t.Fatal("token preview leaked the full token")
+    }
+}
@@ -94,6 +94,10 @@ func (h *Handler) configImport(w http.ResponseWriter, r *http.Request) {
     if strings.TrimSpace(incoming.Embeddings.Provider) != "" {
         next.Embeddings.Provider = incoming.Embeddings.Provider
     }
+    incomingVercel := config.NormalizeVercelConfig(incoming.Vercel)
+    if strings.TrimSpace(incomingVercel.Token) != "" || strings.TrimSpace(incomingVercel.ProjectID) != "" || strings.TrimSpace(incomingVercel.TeamID) != "" {
+        next.Vercel = incomingVercel
+    }
     if strings.TrimSpace(incoming.Admin.PasswordHash) != "" {
         next.Admin.PasswordHash = incoming.Admin.PasswordHash
     }
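The import guard above replaces the stored Vercel block only when the incoming one carries at least one non-blank field, so importing a config that omits Vercel entirely does not wipe saved credentials. A minimal sketch of that merge rule, with simplified stand-in types (not the project's actual `config` package):

```go
package main

import (
	"fmt"
	"strings"
)

// VercelConfig mirrors the three fields the import path cares about.
type VercelConfig struct {
	Token, ProjectID, TeamID string
}

// mergeVercel keeps the existing block unless the incoming block carries at
// least one non-blank field, matching the guard the patch adds to configImport.
func mergeVercel(next, incoming VercelConfig) VercelConfig {
	if strings.TrimSpace(incoming.Token) != "" ||
		strings.TrimSpace(incoming.ProjectID) != "" ||
		strings.TrimSpace(incoming.TeamID) != "" {
		return incoming
	}
	return next
}

func main() {
	existing := VercelConfig{Token: "keep-me"}
	fmt.Println(mergeVercel(existing, VercelConfig{}).Token == "keep-me")
	// A partially filled incoming block replaces the whole struct, so an
	// import that sets only ProjectID clears the stored token.
	fmt.Println(mergeVercel(existing, VercelConfig{ProjectID: "prj_x"}).Token == "")
}
```

Note the replace-wholesale semantics: the guard is per-block, not per-field, which matches the diff but means a partial incoming block overwrites fields it leaves blank.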
@@ -19,6 +19,12 @@ func (h *Handler) getConfig(w http.ResponseWriter, _ *http.Request) {
     "env_writeback_enabled": h.Store.IsEnvWritebackEnabled(),
     "config_path":           h.Store.ConfigPath(),
     "model_aliases":         snap.ModelAliases,
+    "vercel": map[string]any{
+        "has_token":     strings.TrimSpace(snap.Vercel.Token) != "",
+        "token_preview": maskSecretPreview(snap.Vercel.Token),
+        "project_id":    snap.Vercel.ProjectID,
+        "team_id":       snap.Vercel.TeamID,
+    },
 }
 accounts := make([]map[string]any, 0, len(snap.Accounts))
 for _, acc := range snap.Accounts {
@@ -78,6 +78,7 @@ func ComputeSyncHash(store ConfigStore) string {
     }
     snap := store.Snapshot().Clone()
     snap.ClearAccountTokens()
+    snap.ClearVercelCredentials()
     snap.VercelSyncHash = ""
     snap.VercelSyncTime = 0
     b, _ := json.Marshal(snap)
@@ -93,6 +94,7 @@ func SyncHashForJSON(s string) string {
     cfg.VercelSyncHash = ""
     cfg.VercelSyncTime = 0
     cfg.ClearAccountTokens()
+    cfg.ClearVercelCredentials()
     b, err := json.Marshal(cfg)
     if err != nil {
         return ""
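The change above clears Vercel credentials before hashing, so two config snapshots that differ only in local secrets still compare as "in sync". The idea can be sketched with a pared-down config type; the `Config` shape and `syncHash` name here are illustrative stand-ins, not ds2api's actual schema.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"encoding/json"
	"fmt"
)

// Config is a pared-down stand-in for ds2api's config struct.
type Config struct {
	Keys   []string `json:"keys"`
	Vercel struct {
		Token string `json:"token,omitempty"`
	} `json:"vercel"`
}

// syncHash hashes the config only after secrets are cleared, analogous to
// calling ClearVercelCredentials before marshalling in the patch.
func syncHash(cfg Config) string {
	cfg.Vercel.Token = "" // strip credentials so they never affect the hash
	b, _ := json.Marshal(cfg)
	sum := sha256.Sum256(b)
	return hex.EncodeToString(sum[:])
}

func main() {
	a := Config{Keys: []string{"k1"}}
	b := a
	b.Vercel.Token = "secret"
	fmt.Println(syncHash(a) == syncHash(b)) // true: token does not affect the hash
}
```

Clearing happens on a value copy (the `cfg` parameter), so the caller's config keeps its credentials; only the hashed representation is sanitized.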
@@ -23,7 +23,7 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
     writeJSON(w, http.StatusBadRequest, map[string]any{"detail": "invalid json"})
     return
 }
-opts, err := parseVercelSyncOptions(req)
+opts, err := parseVercelSyncOptions(req, h.Store.Snapshot().Vercel)
 if err != nil {
     writeJSON(w, http.StatusBadRequest, map[string]any{"detail": err.Error()})
     return
@@ -50,6 +50,12 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
     return
 }
 savedCreds := h.saveVercelProjectCredentials(r.Context(), client, opts, params, headers, envs)
+credentialsWarning := ""
+if saved, err := h.saveLocalVercelCredentials(opts); err == nil && saved {
+    savedCreds = append(savedCreds, "config.vercel")
+} else if err != nil {
+    credentialsWarning = "保存 Vercel 凭据到本地配置失败: " + err.Error()
+}
 manual, deployURL := triggerVercelDeployment(r.Context(), client, opts.ProjectID, params, headers)
 _ = h.Store.SetVercelSync(syncHashForJSON(cfgJSON), time.Now().Unix())
 result := map[string]any{"success": true, "validated_accounts": validated}
@@ -66,6 +72,9 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
 if len(savedCreds) > 0 {
     result["saved_credentials"] = savedCreds
 }
+if credentialsWarning != "" {
+    result["credentials_warning"] = credentialsWarning
+}
 writeJSON(w, http.StatusOK, result)
 }

@@ -78,7 +87,7 @@ type vercelSyncOptions struct {
     UsePreconfig bool
 }

-func parseVercelSyncOptions(req map[string]any) (vercelSyncOptions, error) {
+func parseVercelSyncOptions(req map[string]any, saved config.VercelConfig) (vercelSyncOptions, error) {
     vercelToken, _ := req["vercel_token"].(string)
     projectID, _ := req["project_id"].(string)
     teamID, _ := req["team_id"].(string)
@@ -92,13 +101,13 @@ func parseVercelSyncOptions(req map[string]any) (vercelSyncOptions, error) {
     }
     usePreconfig := vercelToken == "__USE_PRECONFIG__" || strings.TrimSpace(vercelToken) == ""
     if usePreconfig {
-        vercelToken = strings.TrimSpace(os.Getenv("VERCEL_TOKEN"))
+        vercelToken = firstNonEmpty(os.Getenv("VERCEL_TOKEN"), saved.Token)
     }
     if strings.TrimSpace(projectID) == "" {
-        projectID = strings.TrimSpace(os.Getenv("VERCEL_PROJECT_ID"))
+        projectID = firstNonEmpty(os.Getenv("VERCEL_PROJECT_ID"), saved.ProjectID)
     }
     if strings.TrimSpace(teamID) == "" {
-        teamID = strings.TrimSpace(os.Getenv("VERCEL_TEAM_ID"))
+        teamID = firstNonEmpty(os.Getenv("VERCEL_TEAM_ID"), saved.TeamID)
     }
     vercelToken = strings.TrimSpace(vercelToken)
     projectID = strings.TrimSpace(projectID)
@@ -116,6 +125,15 @@ func parseVercelSyncOptions(req map[string]any) (vercelSyncOptions, error) {
     }, nil
 }

+func firstNonEmpty(values ...string) string {
+    for _, value := range values {
+        if trimmed := strings.TrimSpace(value); trimmed != "" {
+            return trimmed
+        }
+    }
+    return ""
+}
+
 func buildVercelParams(teamID string) url.Values {
     params := url.Values{}
     if strings.TrimSpace(teamID) != "" {
@@ -178,6 +196,25 @@ func (h *Handler) saveVercelProjectCredentials(ctx context.Context, client *http
     return saved
 }

+func (h *Handler) saveLocalVercelCredentials(opts vercelSyncOptions) (bool, error) {
+    if !opts.SaveCreds {
+        return false, nil
+    }
+    err := h.Store.Update(func(c *config.Config) error {
+        token := opts.VercelToken
+        if opts.UsePreconfig {
+            token = c.Vercel.Token
+        }
+        c.Vercel = config.NormalizeVercelConfig(config.VercelConfig{
+            Token:     token,
+            ProjectID: opts.ProjectID,
+            TeamID:    opts.TeamID,
+        })
+        return nil
+    })
+    return err == nil, err
+}
+
 func triggerVercelDeployment(ctx context.Context, client *http.Client, projectID string, params url.Values, headers map[string]string) (bool, string) {
     projectResp, status, _ := vercelRequest(ctx, client, http.MethodGet, "https://api.vercel.com/v9/projects/"+projectID, params, headers, nil)
     if status != http.StatusOK {
@@ -243,7 +280,7 @@ func (h *Handler) vercelStatus(w http.ResponseWriter, r *http.Request) {
 func (h *Handler) exportSyncConfig(req map[string]any) (string, string, error) {
     override, ok := req["config_override"]
     if !ok || override == nil {
-        return h.Store.ExportJSONAndBase64()
+        return encodeVercelSyncConfig(h.Store.Snapshot())
     }
     raw, err := json.Marshal(override)
     if err != nil {
@@ -253,8 +290,13 @@ func (h *Handler) exportSyncConfig(req map[string]any) (string, string, error) {
     if err := json.Unmarshal(raw, &cfg); err != nil {
         return "", "", err
     }
+    return encodeVercelSyncConfig(cfg)
+}
+
+func encodeVercelSyncConfig(cfg config.Config) (string, string, error) {
     cfg.DropInvalidAccounts()
     cfg.ClearAccountTokens()
+    cfg.ClearVercelCredentials()
     cfg.VercelSyncHash = ""
     cfg.VercelSyncTime = 0
     b, err := json.Marshal(cfg)
@@ -272,6 +314,7 @@ func syncHashForJSON(s string) string {
     cfg.VercelSyncHash = ""
     cfg.VercelSyncTime = 0
     cfg.ClearAccountTokens()
+    cfg.ClearVercelCredentials()
     b, err := json.Marshal(cfg)
     if err != nil {
         return ""
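Both admin endpoints expose only a `token_preview` such as `vc****en` rather than the raw token. The real masking lives in `adminshared.MaskSecretPreview`, whose implementation is not shown in this diff; the sketch below is a plausible reconstruction inferred from that documented sample output, and may differ from the project's helper.

```go
package main

import "fmt"

// maskSecretPreview keeps a short prefix and suffix and hides the middle.
// Illustrative reimplementation only: inferred from the documented sample
// preview "vc****en"; the real adminshared.MaskSecretPreview may differ.
func maskSecretPreview(s string) string {
	if s == "" {
		return ""
	}
	if len(s) <= 4 {
		return "****" // too short to reveal any characters safely
	}
	return s[:2] + "****" + s[len(s)-2:]
}

func main() {
	fmt.Println(maskSecretPreview("vercel-token")) // ve****en
}
```

Whatever the exact rule, the invariant the new test enforces is the important part: the preview must never equal the full secret.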
internal/httpapi/admin/vercel/handler_vercel_test.go (new file, 100 lines)
@@ -0,0 +1,100 @@
+package vercel
+
+import (
+    "encoding/json"
+    "strings"
+    "testing"
+
+    "ds2api/internal/config"
+)
+
+func TestParseVercelSyncOptionsFallsBackToSavedConfig(t *testing.T) {
+    t.Setenv("VERCEL_TOKEN", "")
+    t.Setenv("VERCEL_PROJECT_ID", "")
+    t.Setenv("VERCEL_TEAM_ID", "")
+
+    opts, err := parseVercelSyncOptions(map[string]any{
+        "vercel_token": "__USE_PRECONFIG__",
+    }, config.VercelConfig{
+        Token:     " saved-token ",
+        ProjectID: " saved-project ",
+        TeamID:    " saved-team ",
+    })
+    if err != nil {
+        t.Fatalf("parse options error: %v", err)
+    }
+    if opts.VercelToken != "saved-token" || opts.ProjectID != "saved-project" || opts.TeamID != "saved-team" {
+        t.Fatalf("unexpected options: %#v", opts)
+    }
+    if !opts.UsePreconfig {
+        t.Fatal("expected preconfig mode")
+    }
+}
+
+func TestSaveLocalVercelCredentialsStoresExplicitInput(t *testing.T) {
+    t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"]}`)
+    store := config.LoadStore()
+    h := &Handler{Store: store}
+
+    saved, err := h.saveLocalVercelCredentials(vercelSyncOptions{
+        VercelToken: " token ",
+        ProjectID:   " project ",
+        TeamID:      " team ",
+        SaveCreds:   true,
+    })
+    if err != nil {
+        t.Fatalf("save local credentials error: %v", err)
+    }
+    if !saved {
+        t.Fatal("expected credentials to be saved")
+    }
+    got := store.Snapshot().Vercel
+    if got.Token != "token" || got.ProjectID != "project" || got.TeamID != "team" {
+        t.Fatalf("unexpected saved credentials: %#v", got)
+    }
+}
+
+func TestSaveLocalVercelCredentialsPreservesPreconfiguredTokenAndUpdatesProject(t *testing.T) {
+    t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"],"vercel":{"token":"saved-token","project_id":"old-project","team_id":"old-team"}}`)
+    store := config.LoadStore()
+    h := &Handler{Store: store}
+
+    saved, err := h.saveLocalVercelCredentials(vercelSyncOptions{
+        VercelToken:  "resolved-token",
+        ProjectID:    "new-project",
+        TeamID:       "new-team",
+        SaveCreds:    true,
+        UsePreconfig: true,
+    })
+    if err != nil {
+        t.Fatalf("save local credentials error: %v", err)
+    }
+    if !saved {
+        t.Fatal("expected project/team updates to be saved")
+    }
+    got := store.Snapshot().Vercel
+    if got.Token != "saved-token" || got.ProjectID != "new-project" || got.TeamID != "new-team" {
+        t.Fatalf("unexpected saved credentials: %#v", got)
+    }
+}
+
+func TestExportSyncConfigStripsSavedVercelCredentials(t *testing.T) {
+    t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"],"vercel":{"token":"secret-token","project_id":"project","team_id":"team"}}`)
+    store := config.LoadStore()
+    h := &Handler{Store: store}
+
+    jsonStr, _, err := h.exportSyncConfig(map[string]any{})
+    if err != nil {
+        t.Fatalf("export sync config error: %v", err)
+    }
+    if strings.Contains(jsonStr, "secret-token") || strings.Contains(jsonStr, `"vercel"`) {
+        t.Fatalf("expected sync export to strip Vercel credentials, got %s", jsonStr)
+    }
+    var exported config.Config
+    if err := json.Unmarshal([]byte(jsonStr), &exported); err != nil {
+        t.Fatalf("exported config is invalid JSON: %v", err)
+    }
+    if len(exported.Keys) != 1 || exported.Keys[0] != "k1" {
+        t.Fatalf("unexpected exported config: %#v", exported)
+    }
+}
@@ -5,15 +5,25 @@ import (
     "io"
     "net/http"
     "net/http/httptest"
+    "path/filepath"
     "strings"
     "testing"

     "ds2api/internal/auth"
+    "ds2api/internal/chathistory"
     dsclient "ds2api/internal/deepseek/client"
 )

 type claudeCurrentInputAuth struct{}

+type claudeHistoryConfig struct {
+    aliases map[string]string
+}
+
+func (m claudeHistoryConfig) ModelAliases() map[string]string { return m.aliases }
+func (claudeHistoryConfig) CurrentInputFileEnabled() bool     { return false }
+func (claudeHistoryConfig) CurrentInputFileMinChars() int     { return 0 }
+
 func (claudeCurrentInputAuth) Determine(*http.Request) (*auth.RequestAuth, error) {
     return &auth.RequestAuth{
         DeepSeekToken: "direct-token",
@@ -22,6 +32,50 @@ func (claudeCurrentInputAuth) Determine(*http.Request) (*auth.RequestAuth, error
     }, nil
 }

+func TestClaudeDirectRecordsResponseHistory(t *testing.T) {
+    ds := &claudeCurrentInputDS{}
+    historyStore := chathistory.New(filepath.Join(t.TempDir(), "history.json"))
+    h := &Handler{
+        Store:       claudeHistoryConfig{aliases: map[string]string{"claude-sonnet-4-6": "deepseek-v4-flash"}},
+        Auth:        claudeCurrentInputAuth{},
+        DS:          ds,
+        ChatHistory: historyStore,
+    }
+    reqBody := `{"model":"claude-sonnet-4-6","messages":[{"role":"user","content":"hello from claude"}],"max_tokens":1024}`
+    req := httptest.NewRequest(http.MethodPost, "/v1/messages", strings.NewReader(reqBody))
+    req.Header.Set("Content-Type", "application/json")
+    rec := httptest.NewRecorder()
+
+    h.Messages(rec, req)
+
+    if rec.Code != http.StatusOK {
+        t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
+    }
+    snapshot, err := historyStore.Snapshot()
+    if err != nil {
+        t.Fatalf("snapshot history: %v", err)
+    }
+    if len(snapshot.Items) != 1 {
+        t.Fatalf("expected one history item, got %d", len(snapshot.Items))
+    }
+    item, err := historyStore.Get(snapshot.Items[0].ID)
+    if err != nil {
+        t.Fatalf("get history item: %v", err)
+    }
+    if item.Surface != "claude.messages" {
+        t.Fatalf("unexpected surface: %q", item.Surface)
+    }
+    if item.Model != "claude-sonnet-4-6" {
+        t.Fatalf("unexpected model: %q", item.Model)
+    }
+    if item.UserInput != "hello from claude" {
+        t.Fatalf("unexpected user input: %q", item.UserInput)
+    }
+    if item.Content != "ok" {
+        t.Fatalf("expected raw upstream content, got %q", item.Content)
+    }
+}
+
 func (claudeCurrentInputAuth) Release(*auth.RequestAuth) {}

 type claudeCurrentInputDS struct {
@@ -53,10 +107,12 @@ func (d *claudeCurrentInputDS) CallCompletion(_ context.Context, _ *auth.Request

 func TestClaudeDirectAppliesCurrentInputFile(t *testing.T) {
     ds := &claudeCurrentInputDS{}
+    historyStore := chathistory.New(filepath.Join(t.TempDir(), "history.json"))
     h := &Handler{
         Store: mockClaudeConfig{aliases: map[string]string{"claude-sonnet-4-6": "deepseek-v4-flash"}},
         Auth:  claudeCurrentInputAuth{},
         DS:    ds,
+        ChatHistory: historyStore,
     }
     reqBody := `{"model":"claude-sonnet-4-6","messages":[{"role":"user","content":"hello from claude"}],"max_tokens":1024}`
     req := httptest.NewRequest(http.MethodPost, "/v1/messages", strings.NewReader(reqBody))
@@ -82,4 +138,21 @@ func TestClaudeDirectAppliesCurrentInputFile(t *testing.T) {
     if !strings.Contains(prompt, "Continue from the latest state in the attached DS2API_HISTORY.txt context.") {
         t.Fatalf("expected continuation prompt, got %q", prompt)
     }
+    snapshot, err := historyStore.Snapshot()
+    if err != nil {
+        t.Fatalf("snapshot history: %v", err)
+    }
+    if len(snapshot.Items) != 1 {
+        t.Fatalf("expected one history item, got %d", len(snapshot.Items))
+    }
+    full, err := historyStore.Get(snapshot.Items[0].ID)
+    if err != nil {
+        t.Fatalf("get history item: %v", err)
+    }
+    if full.HistoryText != string(ds.uploads[0].Data) {
+        t.Fatalf("expected uploaded current input file to be persisted in history text")
+    }
+    if len(full.Messages) != 1 || !strings.Contains(full.Messages[0].Content, "Continue from the latest state in the attached DS2API_HISTORY.txt context.") {
+        t.Fatalf("expected persisted message to match upstream continuation prompt, got %#v", full.Messages)
+    }
 }
@@ -2,6 +2,7 @@ package claude

 import (
     "bytes"
+    "context"
     "encoding/json"
     "errors"
     "fmt"
@@ -15,8 +16,10 @@ import (
     "ds2api/internal/completionruntime"
     "ds2api/internal/config"
     claudefmt "ds2api/internal/format/claude"
+    "ds2api/internal/httpapi/openai/history"
     "ds2api/internal/httpapi/requestbody"
     "ds2api/internal/promptcompat"
+    "ds2api/internal/responsehistory"
     streamengine "ds2api/internal/stream"
     "ds2api/internal/translatorcliproxy"
     "ds2api/internal/util"
@@ -79,38 +82,70 @@ func (h *Handler) handleClaudeDirect(w http.ResponseWriter, r *http.Request) boo
     return true
 }
 defer h.Auth.Release(a)
-if norm.Standard.Stream {
-    h.handleClaudeDirectStream(w, r, a, norm.Standard)
+stdReq, err := h.applyCurrentInputFile(r.Context(), a, norm.Standard)
+if err != nil {
+    status, message := mapCurrentInputFileError(err)
+    writeClaudeError(w, status, message)
     return true
 }
-result, outErr := completionruntime.ExecuteNonStreamWithRetry(r.Context(), h.DS, a, norm.Standard, completionruntime.Options{
-    StripReferenceMarkers: stripReferenceMarkersEnabled(),
-    RetryEnabled:          true,
-    CurrentInputFile:      h.Store,
+historySession := responsehistory.Start(responsehistory.StartParams{
+    Store:    h.ChatHistory,
+    Request:  r,
+    Auth:     a,
+    Surface:  "claude.messages",
+    Standard: stdReq,
+})
+if stdReq.Stream {
+    h.handleClaudeDirectStream(w, r, a, stdReq, historySession)
+    return true
+}
+result, outErr := completionruntime.ExecuteNonStreamWithRetry(r.Context(), h.DS, a, stdReq, completionruntime.Options{
+    RetryEnabled:     true,
+    CurrentInputFile: h.Store,
 })
 if outErr != nil {
+    if historySession != nil {
+        historySession.ErrorTurn(outErr.Status, outErr.Message, outErr.Code, result.Turn)
+    }
     writeClaudeError(w, outErr.Status, outErr.Message)
     return true
 }
+if historySession != nil {
+    historySession.SuccessTurn(http.StatusOK, result.Turn, responsehistory.GenericUsage(result.Turn))
+}
 writeJSON(w, http.StatusOK, claudefmt.BuildMessageResponseFromTurn(
     fmt.Sprintf("msg_%d", time.Now().UnixNano()),
-    norm.Standard.ResponseModel,
+    stdReq.ResponseModel,
     result.Turn,
     exposeThinking,
 ))
 return true
 }

-func (h *Handler) handleClaudeDirectStream(w http.ResponseWriter, r *http.Request, a *auth.RequestAuth, stdReq promptcompat.StandardRequest) {
+func (h *Handler) applyCurrentInputFile(ctx context.Context, a *auth.RequestAuth, stdReq promptcompat.StandardRequest) (promptcompat.StandardRequest, error) {
+    if h == nil {
+        return stdReq, nil
+    }
+    return (history.Service{Store: h.Store, DS: h.DS}).ApplyCurrentInputFile(ctx, a, stdReq)
+}
+
+func mapCurrentInputFileError(err error) (int, string) {
+    return history.MapError(err)
+}
+
+func (h *Handler) handleClaudeDirectStream(w http.ResponseWriter, r *http.Request, a *auth.RequestAuth, stdReq promptcompat.StandardRequest, historySession *responsehistory.Session) {
     start, outErr := completionruntime.StartCompletion(r.Context(), h.DS, a, stdReq, completionruntime.Options{
         CurrentInputFile: h.Store,
     })
     if outErr != nil {
+        if historySession != nil {
+            historySession.Error(outErr.Status, outErr.Message, outErr.Code, "", "")
+        }
         writeClaudeError(w, outErr.Status, outErr.Message)
         return
     }
     streamReq := start.Request
-    h.handleClaudeStreamRealtime(w, r, start.Response, streamReq.ResponseModel, streamReq.Messages, streamReq.Thinking, streamReq.Search, streamReq.ToolNames, streamReq.ToolsRaw)
+    h.handleClaudeStreamRealtime(w, r, start.Response, streamReq.ResponseModel, streamReq.Messages, streamReq.Thinking, streamReq.Search, streamReq.ToolNames, streamReq.ToolsRaw, historySession)
 }

 func (h *Handler) proxyViaOpenAI(w http.ResponseWriter, r *http.Request, store ConfigReader) bool {
@@ -264,10 +299,17 @@ func stripClaudeThinkingBlocks(raw []byte) []byte {
     return out
 }

-func (h *Handler) handleClaudeStreamRealtime(w http.ResponseWriter, r *http.Request, resp *http.Response, model string, messages []any, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any) {
+func (h *Handler) handleClaudeStreamRealtime(w http.ResponseWriter, r *http.Request, resp *http.Response, model string, messages []any, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, historySessions ...*responsehistory.Session) {
+    var historySession *responsehistory.Session
+    if len(historySessions) > 0 {
+        historySession = historySessions[0]
+    }
     defer func() { _ = resp.Body.Close() }()
     if resp.StatusCode != http.StatusOK {
         body, _ := io.ReadAll(resp.Body)
+        if historySession != nil {
+            historySession.Error(resp.StatusCode, strings.TrimSpace(string(body)), "error", "", "")
+        }
         writeClaudeError(w, http.StatusInternalServerError, string(body))
         return
     }
@@ -294,6 +336,7 @@ func (h *Handler) handleClaudeStreamRealtime(w http.ResponseWriter, r *http.Requ
     toolNames,
     toolsRaw,
     buildClaudePromptTokenText(messages, thinkingEnabled),
+    historySession,
 )
 streamRuntime.sendMessageStart()
@@ -6,6 +6,7 @@ import (

     "github.com/go-chi/chi/v5"

+    "ds2api/internal/chathistory"
     "ds2api/internal/config"
     dsprotocol "ds2api/internal/deepseek/protocol"
     "ds2api/internal/textclean"
@@ -16,10 +17,11 @@ import (
 var writeJSON = util.WriteJSON

 type Handler struct {
     Store  ConfigReader
     Auth   AuthResolver
     DS     DeepSeekCaller
     OpenAI OpenAIChatRunner
+    ChatHistory *chathistory.Store
 }

 func stripReferenceMarkersEnabled() bool {
@@ -6,6 +6,7 @@ import (
     "strings"
     "time"

+    "ds2api/internal/responsehistory"
     "ds2api/internal/sse"
     streamengine "ds2api/internal/stream"
     "ds2api/internal/toolcall"
@@ -46,6 +47,7 @@ type claudeStreamRuntime struct {
     textEmitted bool
    ended       bool
     upstreamErr string
+    history     *responsehistory.Session
 }

 func newClaudeStreamRuntime(
@@ -60,6 +62,7 @@ func newClaudeStreamRuntime(
     toolNames []string,
     toolsRaw any,
     promptTokenText string,
+    history *responsehistory.Session,
 ) *claudeStreamRuntime {
     return &claudeStreamRuntime{
         w: w,
@@ -74,6 +77,7 @@ func newClaudeStreamRuntime(
     toolNames:       toolNames,
     toolsRaw:        toolsRaw,
     promptTokenText: promptTokenText,
+    history:         history,
     messageID:          fmt.Sprintf("msg_%d", time.Now().UnixNano()),
     thinkingBlockIndex: -1,
     textBlockIndex:     -1,
@@ -232,5 +236,11 @@ func (s *claudeStreamRuntime) onParsed(parsed sse.LineResult) streamengine.Parse
     }
 }

+if s.history != nil {
+    s.history.Progress(
+        responsehistory.ThinkingForArchive(s.rawThinking.String(), s.toolDetectionThinking.String(), s.thinking.String()),
+        responsehistory.TextForArchive(s.rawText.String(), s.text.String()),
+    )
+}
 return streamengine.ParsedDecision{ContentSeen: contentSeen}
 }
```diff
@@ -2,6 +2,7 @@ package claude
 
 import (
 	"ds2api/internal/assistantturn"
+	"ds2api/internal/responsehistory"
 	"ds2api/internal/sse"
 	"ds2api/internal/toolcall"
 	"ds2api/internal/toolstream"
@@ -175,6 +176,15 @@ func (s *claudeStreamRuntime) finalize(stopReason string) {
 	if outcome.HasToolCalls {
 		stopReason = "tool_use"
 	}
+	if s.history != nil {
+		s.history.Success(
+			200,
+			responsehistory.ThinkingForArchive(turn.RawThinking, turn.DetectionThinking, turn.Thinking),
+			responsehistory.TextForArchive(turn.RawText, turn.Text),
+			stopReason,
+			responsehistory.GenericUsage(turn),
+		)
+	}
 
 	s.send("message_delta", map[string]any{
 		"type": "message_delta",
@@ -191,10 +201,16 @@ func (s *claudeStreamRuntime) finalize(stopReason string) {
 
 func (s *claudeStreamRuntime) onFinalize(reason streamengine.StopReason, scannerErr error) {
 	if string(reason) == "upstream_error" {
+		if s.history != nil {
+			s.history.Error(500, s.upstreamErr, "upstream_error", responsehistory.ThinkingForArchive(s.rawThinking.String(), s.toolDetectionThinking.String(), s.thinking.String()), responsehistory.TextForArchive(s.rawText.String(), s.text.String()))
+		}
 		s.sendError(s.upstreamErr)
 		return
 	}
 	if scannerErr != nil {
+		if s.history != nil {
+			s.history.Error(500, scannerErr.Error(), "error", responsehistory.ThinkingForArchive(s.rawThinking.String(), s.toolDetectionThinking.String(), s.thinking.String()), responsehistory.TextForArchive(s.rawText.String(), s.text.String()))
+		}
 		s.sendError(scannerErr.Error())
 		return
 	}
```
```diff
@@ -2,6 +2,7 @@ package gemini
 
 import (
 	"bytes"
+	"context"
 	"encoding/json"
 	"errors"
 	"io"
@@ -14,8 +15,10 @@ import (
 	"ds2api/internal/assistantturn"
 	"ds2api/internal/auth"
 	"ds2api/internal/completionruntime"
+	"ds2api/internal/httpapi/openai/history"
 	"ds2api/internal/httpapi/requestbody"
 	"ds2api/internal/promptcompat"
+	"ds2api/internal/responsehistory"
 	"ds2api/internal/sse"
 	"ds2api/internal/toolcall"
 	"ds2api/internal/translatorcliproxy"
@@ -76,33 +79,65 @@ func (h *Handler) handleGeminiDirect(w http.ResponseWriter, r *http.Request, str
 		return true
 	}
 	defer h.Auth.Release(a)
+	stdReq, err = h.applyCurrentInputFile(r.Context(), a, stdReq)
+	if err != nil {
+		status, message := mapCurrentInputFileError(err)
+		writeGeminiError(w, status, message)
+		return true
+	}
+	historySession := responsehistory.Start(responsehistory.StartParams{
+		Store:    h.ChatHistory,
+		Request:  r,
+		Auth:     a,
+		Surface:  "gemini.generate_content",
+		Standard: stdReq,
+	})
 	if stream {
-		h.handleGeminiDirectStream(w, r, a, stdReq)
+		h.handleGeminiDirectStream(w, r, a, stdReq, historySession)
 		return true
 	}
 	result, outErr := completionruntime.ExecuteNonStreamWithRetry(r.Context(), h.DS, a, stdReq, completionruntime.Options{
-		StripReferenceMarkers: stripReferenceMarkersEnabled(),
 		RetryEnabled:     true,
 		CurrentInputFile: h.Store,
 	})
 	if outErr != nil {
+		if historySession != nil {
+			historySession.ErrorTurn(outErr.Status, outErr.Message, outErr.Code, result.Turn)
+		}
 		writeGeminiError(w, outErr.Status, outErr.Message)
 		return true
 	}
+	if historySession != nil {
+		historySession.SuccessTurn(http.StatusOK, result.Turn, responsehistory.GenericUsage(result.Turn))
+	}
 	writeJSON(w, http.StatusOK, buildGeminiGenerateContentResponseFromTurn(result.Turn))
 	return true
 }
 
-func (h *Handler) handleGeminiDirectStream(w http.ResponseWriter, r *http.Request, a *auth.RequestAuth, stdReq promptcompat.StandardRequest) {
+func (h *Handler) applyCurrentInputFile(ctx context.Context, a *auth.RequestAuth, stdReq promptcompat.StandardRequest) (promptcompat.StandardRequest, error) {
+	if h == nil {
+		return stdReq, nil
+	}
+	return (history.Service{Store: h.Store, DS: h.DS}).ApplyCurrentInputFile(ctx, a, stdReq)
+}
+
+func mapCurrentInputFileError(err error) (int, string) {
+	return history.MapError(err)
+}
+
+func (h *Handler) handleGeminiDirectStream(w http.ResponseWriter, r *http.Request, a *auth.RequestAuth, stdReq promptcompat.StandardRequest, historySession *responsehistory.Session) {
 	start, outErr := completionruntime.StartCompletion(r.Context(), h.DS, a, stdReq, completionruntime.Options{
 		CurrentInputFile: h.Store,
 	})
 	if outErr != nil {
+		if historySession != nil {
+			historySession.Error(outErr.Status, outErr.Message, outErr.Code, "", "")
+		}
 		writeGeminiError(w, outErr.Status, outErr.Message)
 		return
 	}
 	streamReq := start.Request
-	h.handleStreamGenerateContent(w, r, start.Response, streamReq.ResponseModel, streamReq.PromptTokenText, streamReq.Thinking, streamReq.Search, streamReq.ToolNames, streamReq.ToolsRaw)
+	h.handleStreamGenerateContent(w, r, start.Response, streamReq.ResponseModel, streamReq.PromptTokenText, streamReq.Thinking, streamReq.Search, streamReq.ToolNames, streamReq.ToolsRaw, historySession)
 }
 
 func (h *Handler) proxyViaOpenAI(w http.ResponseWriter, r *http.Request, stream bool) bool {
@@ -294,12 +329,11 @@ func (h *Handler) handleNonStreamGenerateContent(w http.ResponseWriter, resp *ht
 	}
 
 	result := sse.CollectStream(resp, thinkingEnabled, true)
-	stripReferenceMarkers := stripReferenceMarkersEnabled()
 	writeJSON(w, http.StatusOK, buildGeminiGenerateContentResponse(
 		model,
 		finalPrompt,
-		cleanVisibleOutput(result.Thinking, stripReferenceMarkers),
-		cleanVisibleOutput(result.Text, stripReferenceMarkers),
+		cleanVisibleOutput(result.Thinking, false),
+		cleanVisibleOutput(result.Text, false),
 		toolNames,
 	))
 }
```
```diff
@@ -5,6 +5,7 @@ import (
 
 	"github.com/go-chi/chi/v5"
 
+	"ds2api/internal/chathistory"
 	"ds2api/internal/textclean"
 	"ds2api/internal/util"
 )
@@ -12,10 +13,11 @@ import (
 var writeJSON = util.WriteJSON
 
 type Handler struct {
 	Store  ConfigReader
 	Auth   AuthResolver
 	DS     DeepSeekCaller
 	OpenAI OpenAIChatRunner
+	ChatHistory *chathistory.Store
}
 
 //nolint:unused // used by native Gemini stream/non-stream runtime helpers.
```
```diff
@@ -9,15 +9,23 @@ import (
 
 	"ds2api/internal/assistantturn"
 	dsprotocol "ds2api/internal/deepseek/protocol"
+	"ds2api/internal/responsehistory"
 	"ds2api/internal/sse"
 	streamengine "ds2api/internal/stream"
 )
 
 //nolint:unused // retained for native Gemini stream handling path.
-func (h *Handler) handleStreamGenerateContent(w http.ResponseWriter, r *http.Request, resp *http.Response, model, finalPrompt string, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any) {
+func (h *Handler) handleStreamGenerateContent(w http.ResponseWriter, r *http.Request, resp *http.Response, model, finalPrompt string, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, historySessions ...*responsehistory.Session) {
+	var historySession *responsehistory.Session
+	if len(historySessions) > 0 {
+		historySession = historySessions[0]
+	}
 	defer func() { _ = resp.Body.Close() }()
 	if resp.StatusCode != http.StatusOK {
 		body, _ := io.ReadAll(resp.Body)
+		if historySession != nil {
+			historySession.Error(resp.StatusCode, strings.TrimSpace(string(body)), "error", "", "")
+		}
 		writeGeminiError(w, resp.StatusCode, strings.TrimSpace(string(body)))
 		return
 	}
@@ -29,7 +37,7 @@ func (h *Handler) handleStreamGenerateContent(w http.ResponseWriter, r *http.Req
 
 	rc := http.NewResponseController(w)
 	_, canFlush := w.(http.Flusher)
-	runtime := newGeminiStreamRuntime(w, rc, canFlush, model, finalPrompt, thinkingEnabled, searchEnabled, stripReferenceMarkersEnabled(), toolNames, toolsRaw)
+	runtime := newGeminiStreamRuntime(w, rc, canFlush, model, finalPrompt, thinkingEnabled, searchEnabled, stripReferenceMarkersEnabled(), toolNames, toolsRaw, historySession)
 
 	initialType := "text"
 	if thinkingEnabled {
@@ -70,6 +78,7 @@ type geminiStreamRuntime struct {
 	accumulator       *assistantturn.Accumulator
 	contentFilter     bool
 	responseMessageID int
+	history           *responsehistory.Session
 }
 
 //nolint:unused // retained for native Gemini stream handling path.
@@ -84,6 +93,7 @@ func newGeminiStreamRuntime(
 	stripReferenceMarkers bool,
 	toolNames []string,
 	toolsRaw any,
+	history *responsehistory.Session,
 ) *geminiStreamRuntime {
 	return &geminiStreamRuntime{
 		w: w,
@@ -97,6 +107,7 @@ func newGeminiStreamRuntime(
 		stripReferenceMarkers: stripReferenceMarkers,
 		toolNames:             toolNames,
 		toolsRaw:              toolsRaw,
+		history:               history,
 		accumulator: assistantturn.NewAccumulator(assistantturn.AccumulatorOptions{
 			ThinkingEnabled: thinkingEnabled,
 			SearchEnabled:   searchEnabled,
@@ -170,6 +181,13 @@ func (s *geminiStreamRuntime) onParsed(parsed sse.LineResult) streamengine.Parse
 			"modelVersion": s.model,
 		})
 	}
+	if s.history != nil {
+		rawText, text, rawThinking, thinking, detectionThinking := s.accumulator.Snapshot()
+		s.history.Progress(
+			responsehistory.ThinkingForArchive(rawThinking, detectionThinking, thinking),
+			responsehistory.TextForArchive(rawText, text),
+		)
+	}
 	return streamengine.ParsedDecision{ContentSeen: accumulated.ContentSeen}
 }
 
@@ -193,6 +211,15 @@ func (s *geminiStreamRuntime) finalize() {
 		ToolsRaw: s.toolsRaw,
 	})
 	outcome := assistantturn.FinalizeTurn(turn, assistantturn.FinalizeOptions{})
+	if s.history != nil {
+		s.history.Success(
+			http.StatusOK,
+			responsehistory.ThinkingForArchive(turn.RawThinking, turn.DetectionThinking, turn.Thinking),
+			responsehistory.TextForArchive(turn.RawText, turn.Text),
+			assistantturn.FinishReason(turn),
+			responsehistory.GenericUsage(turn),
+		)
+	}
 
 	if s.bufferContent {
 		parts := buildGeminiPartsFromTurn(turn)
```
```diff
@@ -7,12 +7,14 @@ import (
 	"io"
 	"net/http"
 	"net/http/httptest"
+	"path/filepath"
 	"strings"
 	"testing"
 
 	"github.com/go-chi/chi/v5"
 
 	"ds2api/internal/auth"
+	"ds2api/internal/chathistory"
 	dsclient "ds2api/internal/deepseek/client"
 )
 
@@ -138,10 +140,12 @@ func TestGeminiDirectAppliesCurrentInputFile(t *testing.T) {
 	ds := &testGeminiDS{
 		resp: makeGeminiUpstreamResponse(`data: {"p":"response/content","v":"ok"}`),
 	}
+	historyStore := chathistory.New(filepath.Join(t.TempDir(), "history.json"))
 	h := &Handler{
 		Store: testGeminiConfig{},
 		Auth:  testGeminiAuth{},
 		DS:    ds,
+		ChatHistory: historyStore,
 	}
 	reqBody := `{"contents":[{"role":"user","parts":[{"text":"hello from gemini"}]}]}`
 	req := httptest.NewRequest(http.MethodPost, "/v1beta/models/gemini-2.5-pro:generateContent", strings.NewReader(reqBody))
@@ -172,6 +176,29 @@ func TestGeminiDirectAppliesCurrentInputFile(t *testing.T) {
 	if !strings.Contains(prompt, "Continue from the latest state in the attached DS2API_HISTORY.txt context.") {
 		t.Fatalf("expected continuation prompt, got %q", prompt)
 	}
+	snapshot, err := historyStore.Snapshot()
+	if err != nil {
+		t.Fatalf("snapshot history: %v", err)
+	}
+	if len(snapshot.Items) != 1 {
+		t.Fatalf("expected one history item, got %d", len(snapshot.Items))
+	}
+	full, err := historyStore.Get(snapshot.Items[0].ID)
+	if err != nil {
+		t.Fatalf("get history item: %v", err)
+	}
+	if full.Surface != "gemini.generate_content" {
+		t.Fatalf("unexpected surface: %q", full.Surface)
+	}
+	if full.Content != "ok" {
+		t.Fatalf("expected raw upstream content, got %q", full.Content)
+	}
+	if full.HistoryText != string(ds.uploadCalls[0].Data) {
+		t.Fatalf("expected uploaded current input file to be persisted in history text")
+	}
+	if len(full.Messages) != 1 || !strings.Contains(full.Messages[0].Content, "Continue from the latest state in the attached DS2API_HISTORY.txt context.") {
+		t.Fatalf("expected persisted message to match upstream continuation prompt, got %#v", full.Messages)
+	}
 }
 
 func TestGeminiRoutesRegistered(t *testing.T) {
```
```diff
@@ -14,9 +14,6 @@ import (
 	"ds2api/internal/promptcompat"
 )
 
-const adminWebUISourceHeader = "X-Ds2-Source"
-const adminWebUISourceValue = "admin-webui-api-tester"
-
 type chatHistorySession struct {
 	store   *chathistory.Store
 	entryID string
@@ -40,6 +37,7 @@ func startChatHistory(store *chathistory.Store, r *http.Request, a *auth.Request
 	entry, err := store.Start(chathistory.StartParams{
 		CallerID:  strings.TrimSpace(a.CallerID),
 		AccountID: strings.TrimSpace(a.AccountID),
+		Surface:   "openai.chat_completions",
 		Model:     strings.TrimSpace(stdReq.ResponseModel),
 		Stream:    stdReq.Stream,
 		UserInput: extractSingleUserInput(stdReq.Messages),
@@ -50,6 +48,7 @@ func startChatHistory(store *chathistory.Store, r *http.Request, a *auth.Request
 	startParams := chathistory.StartParams{
 		CallerID:  strings.TrimSpace(a.CallerID),
 		AccountID: strings.TrimSpace(a.AccountID),
+		Surface:   "openai.chat_completions",
 		Model:     strings.TrimSpace(stdReq.ResponseModel),
 		Stream:    stdReq.Stream,
 		UserInput: extractSingleUserInput(stdReq.Messages),
@@ -82,7 +81,7 @@ func shouldCaptureChatHistory(r *http.Request) bool {
 	if isVercelStreamPrepareRequest(r) || isVercelStreamReleaseRequest(r) {
 		return false
 	}
-	return strings.TrimSpace(r.Header.Get(adminWebUISourceHeader)) != adminWebUISourceValue
+	return true
 }
 
 func extractSingleUserInput(messages []any) string {
@@ -188,6 +187,23 @@ func (s *chatHistorySession) stopped(thinking, content, finishReason string) {
 	})
 }
 
+func historyTextForArchive(raw, visible string) string {
+	if strings.TrimSpace(raw) != "" {
+		return raw
+	}
+	return visible
+}
+
+func historyThinkingForArchive(raw, detection, visible string) string {
+	if strings.TrimSpace(raw) != "" {
+		return raw
+	}
+	if strings.TrimSpace(detection) != "" {
+		return detection
+	}
+	return visible
+}
+
 func (s *chatHistorySession) retryMissingEntry() bool {
 	if s == nil || s.store == nil || s.disabled {
 		return false
```
```diff
@@ -6,6 +6,7 @@ import (
 	"net/http/httptest"
 	"os"
 	"path/filepath"
+	"strconv"
 	"strings"
 	"sync"
 	"testing"
@@ -102,6 +103,86 @@ func TestChatCompletionsNonStreamPersistsHistory(t *testing.T) {
 	}
 }
 
+func TestChatHistoryNonStreamArchivesRawToolCallMarkup(t *testing.T) {
+	historyStore := newTestChatHistoryStore(t)
+	entry, err := historyStore.Start(chathistory.StartParams{
+		CallerID:  "caller:test",
+		Model:     "deepseek-v4-flash",
+		UserInput: "call tool",
+	})
+	if err != nil {
+		t.Fatalf("start history failed: %v", err)
+	}
+	session := &chatHistorySession{
+		store:       historyStore,
+		entryID:     entry.ID,
+		startedAt:   time.Now(),
+		lastPersist: time.Now().Add(-time.Second),
+		finalPrompt: "call tool",
+	}
+	rawToolCall := `<tool_calls><invoke name="search"><parameter name="q">golang</parameter></invoke></tool_calls>`
+
+	h := &Handler{}
+	rec := httptest.NewRecorder()
+	resp := makeOpenAISSEHTTPResponse(`data: {"p":"response/content","v":`+strconv.Quote(rawToolCall)+`}`, `data: [DONE]`)
+	h.handleNonStream(rec, resp, "cid-tool-history", "deepseek-v4-flash", "prompt", 0, false, false, []string{"search"}, nil, session)
+
+	if rec.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
+	}
+	full, err := historyStore.Get(entry.ID)
+	if err != nil {
+		t.Fatalf("get detail failed: %v", err)
+	}
+	if full.Content != rawToolCall {
+		t.Fatalf("expected raw tool markup archived, got %q", full.Content)
+	}
+	if full.FinishReason != "tool_calls" {
+		t.Fatalf("expected tool_calls finish reason, got %#v", full.FinishReason)
+	}
+}
+
+func TestChatHistoryStreamArchivesRawToolCallMarkup(t *testing.T) {
+	historyStore := newTestChatHistoryStore(t)
+	entry, err := historyStore.Start(chathistory.StartParams{
+		CallerID:  "caller:test",
+		Model:     "deepseek-v4-flash",
+		Stream:    true,
+		UserInput: "call tool",
+	})
+	if err != nil {
+		t.Fatalf("start history failed: %v", err)
+	}
+	session := &chatHistorySession{
+		store:       historyStore,
+		entryID:     entry.ID,
+		startedAt:   time.Now(),
+		lastPersist: time.Now().Add(-time.Second),
+		finalPrompt: "call tool",
+	}
+	rawToolCall := `<tool_calls><invoke name="search"><parameter name="q">golang</parameter></invoke></tool_calls>`
+
+	h := &Handler{}
+	req := httptest.NewRequest(http.MethodPost, "/v1/chat/completions", nil)
+	rec := httptest.NewRecorder()
+	resp := makeOpenAISSEHTTPResponse(`data: {"p":"response/content","v":`+strconv.Quote(rawToolCall)+`}`, `data: [DONE]`)
+	h.handleStream(rec, req, resp, "cid-stream-tool-history", "deepseek-v4-flash", "prompt", 0, false, false, []string{"search"}, nil, session)
+
+	if rec.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
+	}
+	full, err := historyStore.Get(entry.ID)
+	if err != nil {
+		t.Fatalf("get detail failed: %v", err)
+	}
+	if full.Content != rawToolCall {
+		t.Fatalf("expected raw streamed tool markup archived, got %q", full.Content)
+	}
+	if full.FinishReason != "tool_calls" {
+		t.Fatalf("expected tool_calls finish reason, got %#v", full.FinishReason)
+	}
+}
+
 func TestStartChatHistoryRecoversFromTransientWriteFailure(t *testing.T) {
 	historyStore := newTestChatHistoryStore(t)
 	restore := blockChatHistoryDetailDir(t, historyStore.DetailDir())
@@ -213,7 +294,7 @@ func TestHandleStreamContextCancelledMarksHistoryStopped(t *testing.T) {
 	}
 }
 
-func TestChatCompletionsSkipsAdminWebUISource(t *testing.T) {
+func TestChatCompletionsRecordsAdminWebUISource(t *testing.T) {
 	historyStore := newTestChatHistoryStore(t)
 	h := &Handler{
 		Store: mockOpenAIConfig{},
@@ -226,7 +307,7 @@ func TestChatCompletionsSkipsAdminWebUISource(t *testing.T) {
 	req := httptest.NewRequest(http.MethodPost, "/v1/chat/completions", strings.NewReader(reqBody))
 	req.Header.Set("Authorization", "Bearer direct-token")
 	req.Header.Set("Content-Type", "application/json")
-	req.Header.Set(adminWebUISourceHeader, adminWebUISourceValue)
+	req.Header.Set("X-Ds2-Source", "admin-webui-api-tester")
 	rec := httptest.NewRecorder()
 	h.ChatCompletions(rec, req)
 
@@ -237,8 +318,8 @@
 	if err != nil {
 		t.Fatalf("snapshot failed: %v", err)
 	}
-	if len(snapshot.Items) != 0 {
-		t.Fatalf("expected admin webui source to be skipped, got %#v", snapshot.Items)
+	if len(snapshot.Items) != 1 {
+		t.Fatalf("expected admin webui source to be recorded, got %#v", snapshot.Items)
 	}
 }
```
```diff
@@ -195,6 +195,24 @@ func (s *chatStreamRuntime) markContextCancelled() {
 	s.finalFinishReason = string(streamengine.StopReasonContextCancelled)
 }
 
+func (s *chatStreamRuntime) historyText() string {
+	if s == nil {
+		return ""
+	}
+	return historyTextForArchive(s.accumulator.RawText.String(), s.finalText)
+}
+
+func (s *chatStreamRuntime) historyThinking() string {
+	if s == nil {
+		return ""
+	}
+	return historyThinkingForArchive(
+		s.accumulator.RawThinking.String(),
+		s.accumulator.ToolDetectionThinking.String(),
+		s.finalThinking,
+	)
+}
+
 func (s *chatStreamRuntime) resetStreamToolCallState() {
 	s.streamToolCallIDs = map[int]string{}
 	s.streamToolNames = map[int]string{}
```
@@ -31,6 +31,14 @@ type chatNonStreamResult struct {
|
|||||||
outputError *assistantturn.OutputError
|
outputError *assistantturn.OutputError
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func (r chatNonStreamResult) historyText() string {
|
||||||
|
return historyTextForArchive(r.rawText, r.text)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (r chatNonStreamResult) historyThinking() string {
|
||||||
|
return historyThinkingForArchive(r.rawThinking, r.toolDetectionThinking, r.thinking)
|
||||||
|
}
|
||||||
|
|
||||||
func (h *Handler) handleNonStreamWithRetry(w http.ResponseWriter, ctx context.Context, a *auth.RequestAuth, resp *http.Response, payload map[string]any, pow, completionID, model, finalPrompt string, refFileTokens int, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, historySession *chatHistorySession) {
|
func (h *Handler) handleNonStreamWithRetry(w http.ResponseWriter, ctx context.Context, a *auth.RequestAuth, resp *http.Response, payload map[string]any, pow, completionID, model, finalPrompt string, refFileTokens int, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, historySession *chatHistorySession) {
|
||||||
attempts := 0
|
attempts := 0
|
||||||
currentResp := resp
|
currentResp := resp
|
||||||
@@ -70,7 +78,7 @@ func (h *Handler) handleNonStreamWithRetry(w http.ResponseWriter, ctx context.Co
 		nextResp, err := h.DS.CallCompletion(ctx, a, retryPayload, retryPow, 3)
 		if err != nil {
 			if historySession != nil {
-				historySession.error(http.StatusInternalServerError, "Failed to get completion.", "error", result.thinking, result.text)
+				historySession.error(http.StatusInternalServerError, "Failed to get completion.", "error", result.historyThinking(), result.historyText())
 			}
 			writeOpenAIError(w, http.StatusInternalServerError, "Failed to get completion.")
 			config.Logger.Warn("[openai_empty_retry] retry request failed", "surface", "chat.completions", "stream", false, "retry_attempt", attempts, "error", err)
@@ -90,12 +98,11 @@ func (h *Handler) collectChatNonStreamAttempt(w http.ResponseWriter, resp *http.
 	}
 	result := sse.CollectStream(resp, thinkingEnabled, true)
 	turn := assistantturn.BuildTurnFromCollected(result, assistantturn.BuildOptions{
 		Model:         model,
 		Prompt:        usagePrompt,
 		SearchEnabled: searchEnabled,
-		StripReferenceMarkers: stripReferenceMarkersEnabled(),
 		ToolNames:     toolNames,
 		ToolsRaw:      toolsRaw,
 	})
 	respBody := openaifmt.BuildChatCompletionWithToolCalls(completionID, model, usagePrompt, turn.Thinking, turn.Text, turn.ToolCalls, toolsRaw)
 	return chatNonStreamResult{
@@ -120,14 +127,14 @@ func (h *Handler) finishChatNonStreamResult(w http.ResponseWriter, result chatNo
 			status, message, code = result.outputError.Status, result.outputError.Message, result.outputError.Code
 		}
 		if historySession != nil {
-			historySession.error(status, message, code, result.thinking, result.text)
+			historySession.error(status, message, code, result.historyThinking(), result.historyText())
 		}
 		writeOpenAIErrorWithCode(w, status, message, code)
 		config.Logger.Info("[openai_empty_retry] terminal empty output", "surface", "chat.completions", "stream", false, "retry_attempts", attempts, "success_source", "none", "content_filter", result.contentFilter)
 		return
 	}
 	if historySession != nil {
-		historySession.success(http.StatusOK, result.thinking, result.text, result.finishReason, openaifmt.BuildChatUsageForModel("", usagePrompt, result.thinking, result.text, refFileTokens))
+		historySession.success(http.StatusOK, result.historyThinking(), result.historyText(), result.finishReason, openaifmt.BuildChatUsageForModel("", usagePrompt, result.thinking, result.text, refFileTokens))
 	}
 	writeJSON(w, http.StatusOK, result.body)
 	source := "first_attempt"
@@ -246,7 +253,7 @@ func (h *Handler) consumeChatStreamAttempt(r *http.Request, resp *http.Response,
 		OnParsed: func(parsed sse.LineResult) streamengine.ParsedDecision {
 			decision := streamRuntime.onParsed(parsed)
 			if historySession != nil {
-				historySession.progress(streamRuntime.accumulator.Thinking.String(), streamRuntime.accumulator.Text.String())
+				historySession.progress(streamRuntime.historyThinking(), streamRuntime.historyText())
 			}
 			return decision
 		},
@@ -258,7 +265,7 @@ func (h *Handler) consumeChatStreamAttempt(r *http.Request, resp *http.Response,
 		OnContextDone: func() {
 			streamRuntime.markContextCancelled()
 			if historySession != nil {
-				historySession.stopped(streamRuntime.accumulator.Thinking.String(), streamRuntime.accumulator.Text.String(), string(streamengine.StopReasonContextCancelled))
+				historySession.stopped(streamRuntime.historyThinking(), streamRuntime.historyText(), string(streamengine.StopReasonContextCancelled))
 			}
 		},
 	})
@@ -278,16 +285,16 @@ func recordChatStreamHistory(streamRuntime *chatStreamRuntime, historySession *c
 		return
 	}
 	if streamRuntime.finalErrorMessage != "" {
-		historySession.error(streamRuntime.finalErrorStatus, streamRuntime.finalErrorMessage, streamRuntime.finalErrorCode, streamRuntime.accumulator.Thinking.String(), streamRuntime.accumulator.Text.String())
+		historySession.error(streamRuntime.finalErrorStatus, streamRuntime.finalErrorMessage, streamRuntime.finalErrorCode, streamRuntime.historyThinking(), streamRuntime.historyText())
 		return
 	}
-	historySession.success(http.StatusOK, streamRuntime.finalThinking, streamRuntime.finalText, streamRuntime.finalFinishReason, streamRuntime.finalUsage)
+	historySession.success(http.StatusOK, streamRuntime.historyThinking(), streamRuntime.historyText(), streamRuntime.finalFinishReason, streamRuntime.finalUsage)
 }
 
 func failChatStreamRetry(streamRuntime *chatStreamRuntime, historySession *chatHistorySession, status int, message, code string) {
 	streamRuntime.sendFailedChunk(status, message, code)
 	if historySession != nil {
-		historySession.error(status, message, code, streamRuntime.accumulator.Thinking.String(), streamRuntime.accumulator.Text.String())
+		historySession.error(status, message, code, streamRuntime.historyThinking(), streamRuntime.historyText())
 	}
 }
 
@@ -80,14 +80,13 @@ func (h *Handler) ChatCompletions(w http.ResponseWriter, r *http.Request) {
 
 	if !stdReq.Stream {
 		result, outErr := completionruntime.ExecuteNonStreamWithRetry(r.Context(), h.DS, a, stdReq, completionruntime.Options{
-			StripReferenceMarkers: stripReferenceMarkersEnabled(),
 			RetryEnabled:     true,
 			CurrentInputFile: h.Store,
 		})
 		sessionID = result.SessionID
 		if outErr != nil {
 			if historySession != nil {
-				historySession.error(outErr.Status, outErr.Message, outErr.Code, result.Turn.Thinking, result.Turn.Text)
+				historySession.error(outErr.Status, outErr.Message, outErr.Code, historyThinkingForArchive(result.Turn.RawThinking, result.Turn.DetectionThinking, result.Turn.Thinking), historyTextForArchive(result.Turn.RawText, result.Turn.Text))
 			}
 			writeOpenAIErrorWithCode(w, outErr.Status, outErr.Message, outErr.Code)
 			return
@@ -96,7 +95,7 @@ func (h *Handler) ChatCompletions(w http.ResponseWriter, r *http.Request) {
 		respBody["usage"] = assistantturn.OpenAIChatUsage(result.Turn)
 		finishReason := assistantturn.FinalizeTurn(result.Turn, assistantturn.FinalizeOptions{}).FinishReason
 		if historySession != nil {
-			historySession.success(http.StatusOK, result.Turn.Thinking, result.Turn.Text, finishReason, assistantturn.OpenAIChatUsage(result.Turn))
+			historySession.success(http.StatusOK, historyThinkingForArchive(result.Turn.RawThinking, result.Turn.DetectionThinking, result.Turn.Thinking), historyTextForArchive(result.Turn.RawText, result.Turn.Text), finishReason, assistantturn.OpenAIChatUsage(result.Turn))
 		}
 		writeJSON(w, http.StatusOK, respBody)
 		return
@@ -164,20 +163,19 @@ func (h *Handler) handleNonStream(w http.ResponseWriter, resp *http.Response, co
 	result := sse.CollectStream(resp, thinkingEnabled, true)
 
 	turn := assistantturn.BuildTurnFromCollected(result, assistantturn.BuildOptions{
 		Model:         model,
 		Prompt:        finalPrompt,
 		RefFileTokens: refFileTokens,
 		SearchEnabled: searchEnabled,
-		StripReferenceMarkers: stripReferenceMarkersEnabled(),
 		ToolNames:     toolNames,
 		ToolsRaw:      toolsRaw,
 		ToolChoice:    promptcompat.DefaultToolChoicePolicy(),
 	})
 	outcome := assistantturn.FinalizeTurn(turn, assistantturn.FinalizeOptions{})
 	if outcome.ShouldFail {
 		status, message, code := outcome.Error.Status, outcome.Error.Message, outcome.Error.Code
 		if historySession != nil {
-			historySession.error(status, message, code, turn.Thinking, turn.Text)
+			historySession.error(status, message, code, historyThinkingForArchive(turn.RawThinking, turn.DetectionThinking, turn.Thinking), historyTextForArchive(turn.RawText, turn.Text))
 		}
 		writeOpenAIErrorWithCode(w, status, message, code)
 		return
@@ -185,7 +183,7 @@ func (h *Handler) handleNonStream(w http.ResponseWriter, resp *http.Response, co
 	respBody := openaifmt.BuildChatCompletionWithToolCalls(completionID, model, finalPrompt, turn.Thinking, turn.Text, turn.ToolCalls, toolsRaw)
 	respBody["usage"] = assistantturn.OpenAIChatUsage(turn)
 	if historySession != nil {
-		historySession.success(http.StatusOK, turn.Thinking, turn.Text, outcome.FinishReason, assistantturn.OpenAIChatUsage(turn))
+		historySession.success(http.StatusOK, historyThinkingForArchive(turn.RawThinking, turn.DetectionThinking, turn.Thinking), historyTextForArchive(turn.RawText, turn.Text), outcome.FinishReason, assistantturn.OpenAIChatUsage(turn))
 	}
 	writeJSON(w, http.StatusOK, respBody)
 }
@@ -253,7 +251,7 @@ func (h *Handler) handleStream(w http.ResponseWriter, r *http.Request, resp *htt
 		OnParsed: func(parsed sse.LineResult) streamengine.ParsedDecision {
 			decision := streamRuntime.onParsed(parsed)
 			if historySession != nil {
-				historySession.progress(streamRuntime.accumulator.Thinking.String(), streamRuntime.accumulator.Text.String())
+				historySession.progress(streamRuntime.historyThinking(), streamRuntime.historyText())
 			}
 			return decision
 		},
@@ -267,14 +265,15 @@ func (h *Handler) handleStream(w http.ResponseWriter, r *http.Request, resp *htt
 				return
 			}
 			if streamRuntime.finalErrorMessage != "" {
-				historySession.error(streamRuntime.finalErrorStatus, streamRuntime.finalErrorMessage, streamRuntime.finalErrorCode, streamRuntime.accumulator.Thinking.String(), streamRuntime.accumulator.Text.String())
+				historySession.error(streamRuntime.finalErrorStatus, streamRuntime.finalErrorMessage, streamRuntime.finalErrorCode, streamRuntime.historyThinking(), streamRuntime.historyText())
 				return
 			}
-			historySession.success(http.StatusOK, streamRuntime.finalThinking, streamRuntime.finalText, streamRuntime.finalFinishReason, streamRuntime.finalUsage)
+			historySession.success(http.StatusOK, streamRuntime.historyThinking(), streamRuntime.historyText(), streamRuntime.finalFinishReason, streamRuntime.finalUsage)
 		},
 		OnContextDone: func() {
+			streamRuntime.markContextCancelled()
 			if historySession != nil {
-				historySession.stopped(streamRuntime.accumulator.Thinking.String(), streamRuntime.accumulator.Text.String(), string(streamengine.StopReasonContextCancelled))
+				historySession.stopped(streamRuntime.historyThinking(), streamRuntime.historyText(), string(streamengine.StopReasonContextCancelled))
 			}
 		},
 	})
@@ -10,11 +10,12 @@ import (
 	"ds2api/internal/config"
 	dsprotocol "ds2api/internal/deepseek/protocol"
 	"ds2api/internal/promptcompat"
+	"ds2api/internal/responsehistory"
 	streamengine "ds2api/internal/stream"
 )
 
-func (h *Handler) handleResponsesStreamWithRetry(w http.ResponseWriter, r *http.Request, a *auth.RequestAuth, resp *http.Response, payload map[string]any, pow, owner, responseID, model, finalPrompt string, refFileTokens int, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, toolChoice promptcompat.ToolChoicePolicy, traceID string) {
+func (h *Handler) handleResponsesStreamWithRetry(w http.ResponseWriter, r *http.Request, a *auth.RequestAuth, resp *http.Response, payload map[string]any, pow, owner, responseID, model, finalPrompt string, refFileTokens int, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, toolChoice promptcompat.ToolChoicePolicy, traceID string, historySession *responsehistory.Session) {
-	streamRuntime, initialType, ok := h.prepareResponsesStreamRuntime(w, resp, owner, responseID, model, finalPrompt, refFileTokens, thinkingEnabled, searchEnabled, toolNames, toolsRaw, toolChoice, traceID)
+	streamRuntime, initialType, ok := h.prepareResponsesStreamRuntime(w, resp, owner, responseID, model, finalPrompt, refFileTokens, thinkingEnabled, searchEnabled, toolNames, toolsRaw, toolChoice, traceID, historySession)
 	if !ok {
 		return
 	}
@@ -55,10 +56,13 @@ func (h *Handler) handleResponsesStreamWithRetry(w http.ResponseWriter, r *http.
 	}
 }
 
-func (h *Handler) prepareResponsesStreamRuntime(w http.ResponseWriter, resp *http.Response, owner, responseID, model, finalPrompt string, refFileTokens int, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, toolChoice promptcompat.ToolChoicePolicy, traceID string) (*responsesStreamRuntime, string, bool) {
+func (h *Handler) prepareResponsesStreamRuntime(w http.ResponseWriter, resp *http.Response, owner, responseID, model, finalPrompt string, refFileTokens int, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, toolChoice promptcompat.ToolChoicePolicy, traceID string, historySession *responsehistory.Session) (*responsesStreamRuntime, string, bool) {
 	if resp.StatusCode != http.StatusOK {
 		defer func() { _ = resp.Body.Close() }()
 		body, _ := io.ReadAll(resp.Body)
+		if historySession != nil {
+			historySession.Error(resp.StatusCode, strings.TrimSpace(string(body)), "error", "", "")
+		}
 		writeOpenAIError(w, resp.StatusCode, strings.TrimSpace(string(body)))
 		return nil, "", false
 	}
@@ -78,7 +82,7 @@ func (h *Handler) prepareResponsesStreamRuntime(w http.ResponseWriter, resp *htt
 		h.toolcallFeatureMatchEnabled() && h.toolcallEarlyEmitHighConfidence(),
 		toolChoice, traceID, func(obj map[string]any) {
 			h.getResponseStore().put(owner, responseID, obj)
-		},
+		}, historySession,
 	)
 	streamRuntime.refFileTokens = refFileTokens
 	streamRuntime.sendCreated()
@@ -47,6 +47,7 @@ func TestConsumeResponsesStreamAttemptMarksContextCancelledState(t *testing.T) {
 		promptcompat.DefaultToolChoicePolicy(),
 		"",
 		nil,
+		nil,
 	)
 	resp := makeResponsesOpenAISSEHTTPResponse(
 		`data: {"p":"response/content","v":"hello"}`,
@@ -18,6 +18,7 @@ import (
 	dsprotocol "ds2api/internal/deepseek/protocol"
 	openaifmt "ds2api/internal/format/openai"
 	"ds2api/internal/promptcompat"
+	"ds2api/internal/responsehistory"
 	"ds2api/internal/sse"
 	streamengine "ds2api/internal/stream"
 )
@@ -95,16 +96,28 @@ func (h *Handler) Responses(w http.ResponseWriter, r *http.Request) {
 	}
 
 	responseID := "resp_" + strings.ReplaceAll(uuid.NewString(), "-", "")
+	historySession := responsehistory.Start(responsehistory.StartParams{
+		Store:    h.ChatHistory,
+		Request:  r,
+		Auth:     a,
+		Surface:  "openai.responses",
+		Standard: stdReq,
+	})
 	if !stdReq.Stream {
 		result, outErr := completionruntime.ExecuteNonStreamWithRetry(r.Context(), h.DS, a, stdReq, completionruntime.Options{
-			StripReferenceMarkers: stripReferenceMarkersEnabled(),
 			RetryEnabled:     true,
 			CurrentInputFile: h.Store,
 		})
 		if outErr != nil {
+			if historySession != nil {
+				historySession.ErrorTurn(outErr.Status, outErr.Message, outErr.Code, result.Turn)
+			}
 			writeOpenAIErrorWithCode(w, outErr.Status, outErr.Message, outErr.Code)
 			return
 		}
+		if historySession != nil {
+			historySession.SuccessTurn(http.StatusOK, result.Turn, assistantturn.OpenAIResponsesUsage(result.Turn))
+		}
 		responseObj := openaifmt.BuildResponseObjectWithToolCalls(responseID, stdReq.ResponseModel, result.Turn.Prompt, result.Turn.Thinking, result.Turn.Text, result.Turn.ToolCalls, stdReq.ToolsRaw)
 		responseObj["usage"] = assistantturn.OpenAIResponsesUsage(result.Turn)
 		h.getResponseStore().put(owner, responseID, responseObj)
@@ -116,13 +129,16 @@ func (h *Handler) Responses(w http.ResponseWriter, r *http.Request) {
 		CurrentInputFile: h.Store,
 	})
 	if outErr != nil {
+		if historySession != nil {
+			historySession.Error(outErr.Status, outErr.Message, outErr.Code, "", "")
+		}
 		writeOpenAIErrorWithCode(w, outErr.Status, outErr.Message, outErr.Code)
 		return
 	}
 
 	streamReq := start.Request
 	refFileTokens := streamReq.RefFileTokens
-	h.handleResponsesStreamWithRetry(w, r, a, start.Response, start.Payload, start.Pow, owner, responseID, streamReq.ResponseModel, streamReq.PromptTokenText, refFileTokens, streamReq.Thinking, streamReq.Search, streamReq.ToolNames, streamReq.ToolsRaw, streamReq.ToolChoice, traceID)
+	h.handleResponsesStreamWithRetry(w, r, a, start.Response, start.Payload, start.Pow, owner, responseID, streamReq.ResponseModel, streamReq.PromptTokenText, refFileTokens, streamReq.Thinking, streamReq.Search, streamReq.ToolNames, streamReq.ToolsRaw, streamReq.ToolChoice, traceID, historySession)
 }
 
 func (h *Handler) handleResponsesNonStream(w http.ResponseWriter, resp *http.Response, owner, responseID, model, finalPrompt string, refFileTokens int, thinkingEnabled, searchEnabled bool, toolNames []string, toolsRaw any, toolChoice promptcompat.ToolChoicePolicy, traceID string) {
@@ -135,14 +151,13 @@ func (h *Handler) handleResponsesNonStream(w http.ResponseWriter, resp *http.Res
 	result := sse.CollectStream(resp, thinkingEnabled, true)
 
 	turn := assistantturn.BuildTurnFromCollected(result, assistantturn.BuildOptions{
 		Model:         model,
 		Prompt:        finalPrompt,
 		RefFileTokens: refFileTokens,
 		SearchEnabled: searchEnabled,
-		StripReferenceMarkers: stripReferenceMarkersEnabled(),
 		ToolNames:     toolNames,
 		ToolsRaw:      toolsRaw,
 		ToolChoice:    toolChoice,
 	})
 	logResponsesToolPolicyRejection(traceID, toolChoice, turn.ParsedToolCalls, "text")
 	outcome := assistantturn.FinalizeTurn(turn, assistantturn.FinalizeOptions{})
@@ -198,6 +213,7 @@ func (h *Handler) handleResponsesStream(w http.ResponseWriter, r *http.Request,
 		func(obj map[string]any) {
 			h.getResponseStore().put(owner, responseID, obj)
 		},
+		nil,
 	)
 	streamRuntime.refFileTokens = refFileTokens
 	streamRuntime.sendCreated()
internal/httpapi/openai/responses/responses_history_test.go — Normal file (100 added lines)
@@ -0,0 +1,100 @@
+package responses
+
+import (
+	"context"
+	"io"
+	"net/http"
+	"net/http/httptest"
+	"path/filepath"
+	"strings"
+	"testing"
+
+	"github.com/go-chi/chi/v5"
+
+	"ds2api/internal/auth"
+	"ds2api/internal/chathistory"
+	dsclient "ds2api/internal/deepseek/client"
+)
+
+type responsesHistoryDS struct {
+	payload map[string]any
+}
+
+func (d *responsesHistoryDS) CreateSession(context.Context, *auth.RequestAuth, int) (string, error) {
+	return "session-id", nil
+}
+
+func (d *responsesHistoryDS) GetPow(context.Context, *auth.RequestAuth, int) (string, error) {
+	return "pow", nil
+}
+
+func (d *responsesHistoryDS) UploadFile(context.Context, *auth.RequestAuth, dsclient.UploadFileRequest, int) (*dsclient.UploadFileResult, error) {
+	return &dsclient.UploadFileResult{ID: "file-id"}, nil
+}
+
+func (d *responsesHistoryDS) CallCompletion(_ context.Context, _ *auth.RequestAuth, payload map[string]any, _ string, _ int) (*http.Response, error) {
+	d.payload = payload
+	return &http.Response{
+		StatusCode: http.StatusOK,
+		Header:     make(http.Header),
+		Body:       io.NopCloser(strings.NewReader("data: {\"p\":\"response/content\",\"v\":\"ok\"}\n")),
+	}, nil
+}
+
+func (d *responsesHistoryDS) DeleteSessionForToken(context.Context, string, string) (*dsclient.DeleteSessionResult, error) {
+	return &dsclient.DeleteSessionResult{Success: true}, nil
+}
+
+func (d *responsesHistoryDS) DeleteAllSessionsForToken(context.Context, string) error {
+	return nil
+}
+
+func TestResponsesRecordsResponseHistory(t *testing.T) {
+	store, resolver := newDirectTokenResolver(t)
+	historyStore := chathistory.New(filepath.Join(t.TempDir(), "history.json"))
+	ds := &responsesHistoryDS{}
+	h := &Handler{
+		Store:       store,
+		Auth:        resolver,
+		DS:          ds,
+		ChatHistory: historyStore,
+	}
+	r := chi.NewRouter()
+	RegisterRoutes(r, h)
+
+	req := httptest.NewRequest(http.MethodPost, "/v1/responses", strings.NewReader(`{"model":"deepseek-v4-flash","input":"hello responses"}`))
+	req.Header.Set("Authorization", "Bearer direct-token")
+	req.Header.Set("Content-Type", "application/json")
+	rec := httptest.NewRecorder()
+	r.ServeHTTP(rec, req)
+
+	if rec.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
+	}
+	if ds.payload == nil {
+		t.Fatalf("expected upstream payload to be sent")
+	}
+	snapshot, err := historyStore.Snapshot()
+	if err != nil {
+		t.Fatalf("snapshot history: %v", err)
+	}
+	if len(snapshot.Items) != 1 {
+		t.Fatalf("expected one history item, got %d", len(snapshot.Items))
+	}
+	item, err := historyStore.Get(snapshot.Items[0].ID)
+	if err != nil {
+		t.Fatalf("get history item: %v", err)
+	}
+	if item.Surface != "openai.responses" {
+		t.Fatalf("unexpected surface: %q", item.Surface)
+	}
+	if !strings.Contains(item.UserInput, "Continue from the latest state in the attached DS2API_HISTORY.txt context.") {
+		t.Fatalf("unexpected user input: %q", item.UserInput)
+	}
+	if !strings.Contains(item.HistoryText, "hello responses") {
+		t.Fatalf("expected original input in persisted history text, got %q", item.HistoryText)
+	}
+	if item.Content != "ok" {
+		t.Fatalf("expected raw upstream content, got %q", item.Content)
+	}
+}
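The diff above only shows call sites for the archive helpers (`historyThinkingForArchive`/`historyTextForArchive` on the chat side, `responsehistory.ThinkingForArchive`/`responsehistory.TextForArchive` on the responses side), not their bodies. As a minimal sketch of the precedence those signatures suggest — raw stream buffer first, then the tool-detection buffer, then the finalized value — the helpers might look like this; the function names here are local stand-ins and the real ordering in ds2api may differ:

```go
package main

import (
	"fmt"
	"strings"
)

// thinkingForArchive is a stand-in for the archive helpers called in the
// diff. Assumed precedence: raw upstream thinking, then the tool-detection
// buffer, then the finalized (post-processed) thinking.
func thinkingForArchive(raw, detection, final string) string {
	for _, candidate := range []string{raw, detection, final} {
		if strings.TrimSpace(candidate) != "" {
			return candidate
		}
	}
	return ""
}

// textForArchive is a stand-in for the text-side helper: archive the raw
// upstream text when present, otherwise the processed text.
func textForArchive(raw, text string) string {
	if strings.TrimSpace(raw) != "" {
		return raw
	}
	return text
}

func main() {
	// Raw buffer empty: fall through to the detection buffer.
	fmt.Println(thinkingForArchive("", "tool-call detection buffer", "final thinking"))
	// Raw text empty: archive the processed text instead.
	fmt.Println(textForArchive("", "processed text"))
}
```

This fallback shape would explain why the stream runtimes pass all three accumulator buffers instead of only the finalized strings: the persisted history keeps the richest available form of the turn.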
@@ -10,6 +10,7 @@ import (
 	openaifmt "ds2api/internal/format/openai"
 	"ds2api/internal/httpapi/openai/shared"
 	"ds2api/internal/promptcompat"
+	"ds2api/internal/responsehistory"
 	"ds2api/internal/sse"
 	streamengine "ds2api/internal/stream"
 	"ds2api/internal/toolstream"
@@ -61,6 +62,7 @@ type responsesStreamRuntime struct {
 	finalErrorCode string
 
 	persistResponse func(obj map[string]any)
+	history         *responsehistory.Session
 }
 
 func newResponsesStreamRuntime(
@@ -80,6 +82,7 @@ func newResponsesStreamRuntime(
 	toolChoice promptcompat.ToolChoicePolicy,
 	traceID string,
 	persistResponse func(obj map[string]any),
+	history *responsehistory.Session,
 ) *responsesStreamRuntime {
 	return &responsesStreamRuntime{
 		w: w,
@@ -106,6 +109,7 @@ func newResponsesStreamRuntime(
 		toolChoice:      toolChoice,
 		traceID:         traceID,
 		persistResponse: persistResponse,
+		history:         history,
 		accumulator: shared.StreamAccumulator{
 			ThinkingEnabled: thinkingEnabled,
 			SearchEnabled:   searchEnabled,
@@ -138,6 +142,9 @@ func (s *responsesStreamRuntime) failResponse(status int, message, code string)
 	if s.persistResponse != nil {
 		s.persistResponse(failedResp)
 	}
+	if s.history != nil {
+		s.history.Error(status, message, code, responsehistory.ThinkingForArchive(s.accumulator.RawThinking.String(), s.accumulator.ToolDetectionThinking.String(), s.accumulator.Thinking.String()), responsehistory.TextForArchive(s.accumulator.RawText.String(), s.accumulator.Text.String()))
+	}
 	s.sendEvent("response.failed", openaifmt.BuildResponsesFailedPayload(s.responseID, s.model, status, message, code))
 	s.sendDone()
 }
@@ -214,6 +221,15 @@ func (s *responsesStreamRuntime) finalize(finishReason string, deferEmptyOutput
|
|||||||
if s.persistResponse != nil {
|
if s.persistResponse != nil {
|
||||||
s.persistResponse(obj)
|
s.persistResponse(obj)
|
||||||
}
|
}
|
||||||
|
if s.history != nil {
|
||||||
|
s.history.Success(
|
||||||
|
http.StatusOK,
|
||||||
|
responsehistory.ThinkingForArchive(turn.RawThinking, turn.DetectionThinking, turn.Thinking),
|
||||||
|
responsehistory.TextForArchive(turn.RawText, turn.Text),
|
||||||
|
outcome.FinishReason,
|
||||||
|
assistantturn.OpenAIResponsesUsage(turn),
|
||||||
|
)
|
||||||
|
}
|
||||||
s.sendEvent("response.completed", openaifmt.BuildResponsesCompletedPayload(obj))
|
s.sendEvent("response.completed", openaifmt.BuildResponsesCompletedPayload(obj))
|
||||||
s.sendDone()
|
s.sendDone()
|
||||||
return true
|
return true
|
||||||
@@ -272,5 +288,11 @@ func (s *responsesStreamRuntime) onParsed(parsed sse.LineResult) streamengine.Pa
|
|||||||
}
|
}
|
||||||
|
|
||||||
batch.flush()
|
batch.flush()
|
||||||
|
if s.history != nil {
|
||||||
|
s.history.Progress(
|
||||||
|
responsehistory.ThinkingForArchive(s.accumulator.RawThinking.String(), s.accumulator.ToolDetectionThinking.String(), s.accumulator.Thinking.String()),
|
||||||
|
responsehistory.TextForArchive(s.accumulator.RawText.String(), s.accumulator.Text.String()),
|
||||||
|
)
|
||||||
|
}
|
||||||
return streamengine.ParsedDecision{ContentSeen: accumulated.ContentSeen}
|
return streamengine.ParsedDecision{ContentSeen: accumulated.ContentSeen}
|
||||||
}
|
}
|
||||||
|
|||||||
Stream accumulator (citation detection now runs on cleaned text):

```diff
@@ -89,11 +89,11 @@ func (a *StreamAccumulator) applyTextPart(text string) StreamPartDelta {
 	}
 	a.RawText.WriteString(rawTrimmed)
 	delta := StreamPartDelta{Type: "text", RawText: rawTrimmed}
+	cleanedText := CleanVisibleOutput(rawTrimmed, a.StripReferenceMarkers)
-	if a.SearchEnabled && sse.IsCitation(rawTrimmed) {
+	if a.SearchEnabled && sse.IsCitation(cleanedText) {
 		delta.CitationOnly = true
 		return delta
 	}
-	cleanedText := CleanVisibleOutput(rawTrimmed, a.StripReferenceMarkers)
 	trimmed := sse.TrimContinuationOverlapFromBuilder(&a.Text, cleanedText)
 	if trimmed == "" {
 		return delta
```
Stream accumulator tests:

```diff
@@ -95,3 +95,21 @@ func TestStreamAccumulatorSuppressesCitationTextWhenSearchEnabled(t *testing.T)
 		t.Fatalf("visible text = %q", got)
 	}
 }
+
+func TestStreamAccumulatorStripsInlineCitationAndReferenceMarkers(t *testing.T) {
+	acc := StreamAccumulator{SearchEnabled: true, StripReferenceMarkers: true}
+	result := acc.Apply(sse.LineResult{
+		Parsed: true,
+		Parts:  []sse.ContentPart{{Type: "text", Text: "广州天气[citation:1] 多云[reference:0]"}},
+	})
+
+	if !result.ContentSeen {
+		t.Fatalf("expected marker chunk to mark upstream content")
+	}
+	if got := acc.Text.String(); got != "广州天气 多云" {
+		t.Fatalf("visible text = %q", got)
+	}
+	if len(result.Parts) != 1 || result.Parts[0].VisibleText != "广州天气 多云" {
+		t.Fatalf("unexpected parts: %#v", result.Parts)
+	}
+}
```
WebUI stream parser (marker stripping):

```diff
@@ -621,7 +621,7 @@ function stripReferenceMarkersText(text) {
   if (!text) {
     return text;
   }
-  return text.replace(/\[reference:\s*\d+\]/gi, '');
+  return text.replace(/\[(?:citation|reference):\s*\d+\]/gi, '');
 }

 function asString(v) {
```
internal/responsehistory/session.go (new file, 289 lines):

```go
package responsehistory

import (
	"errors"
	"net/http"
	"strings"
	"time"

	"ds2api/internal/assistantturn"
	"ds2api/internal/auth"
	"ds2api/internal/chathistory"
	"ds2api/internal/config"
	"ds2api/internal/prompt"
	"ds2api/internal/promptcompat"
)

type Session struct {
	store       *chathistory.Store
	entryID     string
	startedAt   time.Time
	lastPersist time.Time
	startParams chathistory.StartParams
	disabled    bool
}

type StartParams struct {
	Store    *chathistory.Store
	Request  *http.Request
	Auth     *auth.RequestAuth
	Surface  string
	Standard promptcompat.StandardRequest
}

func Start(params StartParams) *Session {
	if params.Store == nil || params.Request == nil || params.Auth == nil {
		return nil
	}
	if !params.Store.Enabled() || !shouldCapture(params.Request) {
		return nil
	}
	startParams := chathistory.StartParams{
		CallerID:    strings.TrimSpace(params.Auth.CallerID),
		AccountID:   strings.TrimSpace(params.Auth.AccountID),
		Surface:     strings.TrimSpace(params.Surface),
		Model:       strings.TrimSpace(params.Standard.ResponseModel),
		Stream:      params.Standard.Stream,
		UserInput:   ExtractSingleUserInput(params.Standard.Messages),
		Messages:    ExtractAllMessages(params.Standard.Messages),
		HistoryText: params.Standard.HistoryText,
		FinalPrompt: params.Standard.FinalPrompt,
	}
	entry, err := params.Store.Start(startParams)
	session := &Session{
		store:       params.Store,
		entryID:     entry.ID,
		startedAt:   time.Now(),
		lastPersist: time.Now(),
		startParams: startParams,
	}
	if err != nil {
		if entry.ID == "" {
			config.Logger.Warn("[response_history] start failed", "surface", startParams.Surface, "error", err)
			return nil
		}
		config.Logger.Warn("[response_history] start persisted in memory after write failure", "surface", startParams.Surface, "error", err)
	}
	return session
}

func shouldCapture(r *http.Request) bool {
	if r == nil || r.URL == nil {
		return false
	}
	if strings.TrimSpace(r.URL.Query().Get("__stream_prepare")) == "1" {
		return false
	}
	if strings.TrimSpace(r.URL.Query().Get("__stream_release")) == "1" {
		return false
	}
	return true
}

func ExtractSingleUserInput(messages []any) string {
	for i := len(messages) - 1; i >= 0; i-- {
		msg, ok := messages[i].(map[string]any)
		if !ok {
			continue
		}
		role := strings.ToLower(strings.TrimSpace(asString(msg["role"])))
		if role != "user" {
			continue
		}
		if normalized := strings.TrimSpace(prompt.NormalizeContent(msg["content"])); normalized != "" {
			return normalized
		}
	}
	return ""
}

func ExtractAllMessages(messages []any) []chathistory.Message {
	out := make([]chathistory.Message, 0, len(messages))
	for _, raw := range messages {
		msg, ok := raw.(map[string]any)
		if !ok {
			continue
		}
		role := strings.ToLower(strings.TrimSpace(asString(msg["role"])))
		content := strings.TrimSpace(prompt.NormalizeContent(msg["content"]))
		if role == "" || content == "" {
			continue
		}
		out = append(out, chathistory.Message{
			Role:    role,
			Content: content,
		})
	}
	return out
}

func (s *Session) Progress(thinking, content string) {
	if s == nil || s.store == nil || s.disabled {
		return
	}
	now := time.Now()
	if now.Sub(s.lastPersist) < 250*time.Millisecond {
		return
	}
	s.lastPersist = now
	s.persistUpdate(chathistory.UpdateParams{
		Status:           "streaming",
		ReasoningContent: thinking,
		Content:          content,
		StatusCode:       http.StatusOK,
		ElapsedMs:        time.Since(s.startedAt).Milliseconds(),
	})
}

func (s *Session) Success(statusCode int, thinking, content, finishReason string, usage map[string]any) {
	if s == nil || s.store == nil || s.disabled {
		return
	}
	s.persistUpdate(chathistory.UpdateParams{
		Status:           "success",
		ReasoningContent: thinking,
		Content:          content,
		StatusCode:       statusCode,
		ElapsedMs:        time.Since(s.startedAt).Milliseconds(),
		FinishReason:     finishReason,
		Usage:            usage,
		Completed:        true,
	})
}

func (s *Session) Error(statusCode int, message, finishReason, thinking, content string) {
	if s == nil || s.store == nil || s.disabled {
		return
	}
	s.persistUpdate(chathistory.UpdateParams{
		Status:           "error",
		ReasoningContent: thinking,
		Content:          content,
		Error:            message,
		StatusCode:       statusCode,
		ElapsedMs:        time.Since(s.startedAt).Milliseconds(),
		FinishReason:     finishReason,
		Completed:        true,
	})
}

func (s *Session) SuccessTurn(statusCode int, turn assistantturn.Turn, usage map[string]any) {
	outcome := assistantturn.FinalizeTurn(turn, assistantturn.FinalizeOptions{})
	s.Success(
		statusCode,
		ThinkingForArchive(turn.RawThinking, turn.DetectionThinking, turn.Thinking),
		TextForArchive(turn.RawText, turn.Text),
		outcome.FinishReason,
		usage,
	)
}

func (s *Session) ErrorTurn(statusCode int, message, finishReason string, turn assistantturn.Turn) {
	s.Error(
		statusCode,
		message,
		finishReason,
		ThinkingForArchive(turn.RawThinking, turn.DetectionThinking, turn.Thinking),
		TextForArchive(turn.RawText, turn.Text),
	)
}

func TextForArchive(raw, visible string) string {
	if strings.TrimSpace(raw) != "" {
		return raw
	}
	return visible
}

func ThinkingForArchive(raw, detection, visible string) string {
	if strings.TrimSpace(raw) != "" {
		return raw
	}
	if strings.TrimSpace(detection) != "" {
		return detection
	}
	return visible
}

func GenericUsage(turn assistantturn.Turn) map[string]any {
	return map[string]any{
		"input_tokens":     turn.Usage.InputTokens,
		"output_tokens":    turn.Usage.OutputTokens,
		"reasoning_tokens": turn.Usage.ReasoningTokens,
		"total_tokens":     turn.Usage.TotalTokens,
	}
}

func (s *Session) retryMissingEntry() bool {
	if s == nil || s.store == nil || s.disabled {
		return false
	}
	entry, err := s.store.Start(s.startParams)
	if errors.Is(err, chathistory.ErrDisabled) {
		s.disabled = true
		return false
	}
	if entry.ID == "" {
		if err != nil {
			config.Logger.Warn("[response_history] recreate missing entry failed", "surface", s.startParams.Surface, "error", err)
		}
		return false
	}
	s.entryID = entry.ID
	if err != nil {
		config.Logger.Warn("[response_history] recreate missing entry persisted in memory after write failure", "surface", s.startParams.Surface, "error", err)
	}
	return true
}

func (s *Session) persistUpdate(params chathistory.UpdateParams) {
	if s == nil || s.store == nil || s.disabled {
		return
	}
	if _, err := s.store.Update(s.entryID, params); err != nil {
		s.handlePersistError(params, err)
	}
}

func (s *Session) handlePersistError(params chathistory.UpdateParams, err error) {
	if err == nil || s == nil {
		return
	}
	if errors.Is(err, chathistory.ErrDisabled) {
		s.disabled = true
		return
	}
	if isMissingError(err) {
		if s.retryMissingEntry() {
			if _, retryErr := s.store.Update(s.entryID, params); retryErr != nil {
				if errors.Is(retryErr, chathistory.ErrDisabled) || isMissingError(retryErr) {
					s.disabled = true
					return
				}
				config.Logger.Warn("[response_history] retry after missing entry failed", "surface", s.startParams.Surface, "error", retryErr)
			}
			return
		}
		s.disabled = true
		return
	}
	config.Logger.Warn("[response_history] update failed", "surface", s.startParams.Surface, "error", err)
}

func isMissingError(err error) bool {
	if err == nil {
		return false
	}
	return strings.Contains(strings.ToLower(err.Error()), "not found")
}

func asString(v any) string {
	switch x := v.(type) {
	case string:
		return x
	case nil:
		return ""
	default:
		return strings.TrimSpace(prompt.NormalizeContent(x))
	}
}
```
App wiring (`NewApp`):

```diff
@@ -65,8 +65,8 @@ func NewApp() (*App, error) {
 	responsesHandler := &responses.Handler{Store: store, Auth: resolver, DS: dsClient, ChatHistory: chatHistoryStore}
 	filesHandler := &files.Handler{Store: store, Auth: resolver, DS: dsClient, ChatHistory: chatHistoryStore}
 	embeddingsHandler := &embeddings.Handler{Store: store, Auth: resolver, DS: dsClient, ChatHistory: chatHistoryStore}
-	claudeHandler := &claude.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler}
+	claudeHandler := &claude.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
-	geminiHandler := &gemini.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler}
+	geminiHandler := &gemini.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
 	adminHandler := &admin.Handler{Store: store, Pool: pool, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
 	webuiHandler := webui.NewHandler()

```
textclean (marker pattern widened to cover `[citation:N]`):

```diff
@@ -2,19 +2,18 @@ package textclean

 import "regexp"

-var referenceMarkerPattern = regexp.MustCompile(`(?i)\[reference:\s*\d+\]`)
+var citationReferenceMarkerPattern = regexp.MustCompile(`(?i)\[(citation|reference):\s*\d+\]`)

 func StripReferenceMarkers(text string) string {
 	if text == "" {
 		return text
 	}
-	return referenceMarkerPattern.ReplaceAllString(text, "")
+	return citationReferenceMarkerPattern.ReplaceAllString(text, "")
 }

-// StripReferenceMarkersEnabled returns true while reference-marker
-// stripping remains the fixed runtime default. When the behaviour is
-// eventually removed this function can be deleted and callers can drop
-// the conditional.
+// StripReferenceMarkersEnabled returns the default for streaming surfaces,
+// where partial citation/reference markers are hidden before the final
+// link metadata is available.
 func StripReferenceMarkersEnabled() bool {
 	return true
 }
```
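The widened pattern is easy to exercise in isolation. A minimal self-contained sketch follows, using only the regexp from the hunk above; `stripMarkers` and `markerPattern` are illustrative stand-ins for the package's `StripReferenceMarkers` and its pattern variable.

```go
package main

import (
	"fmt"
	"regexp"
)

// Mirrors the widened pattern: both [citation:N] and [reference:N]
// markers are removed, case-insensitively, with optional whitespace
// allowed after the colon.
var markerPattern = regexp.MustCompile(`(?i)\[(citation|reference):\s*\d+\]`)

func stripMarkers(text string) string {
	if text == "" {
		return text
	}
	return markerPattern.ReplaceAllString(text, "")
}

func main() {
	fmt.Println(stripMarkers("广州天气[citation:1] 多云[Reference: 12]")) // 广州天气 多云
}
```

Keeping the pattern compiled once at package scope, as the real code does, avoids recompiling the regexp on every streamed chunk.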
WebUI static asset handler:

```diff
@@ -55,6 +55,45 @@ func (h *Handler) admin(w http.ResponseWriter, r *http.Request) {
 	http.Error(w, "WebUI not built. Run `cd webui && npm run build` first.", http.StatusNotFound)
 }

+// staticContentTypes pins the Content-Type of common WebUI assets so we do not
+// rely on mime.TypeByExtension, which on Windows consults the registry and can
+// return the wrong type (e.g. application/xml for .css) when third-party
+// software has overwritten HKEY_CLASSES_ROOT entries. Browsers strictly enforce
+// stylesheet/script MIME types and will refuse to apply a misidentified asset,
+// breaking the /admin page on affected machines.
+var staticContentTypes = map[string]string{
+	".css":   "text/css; charset=utf-8",
+	".js":    "text/javascript; charset=utf-8",
+	".mjs":   "text/javascript; charset=utf-8",
+	".html":  "text/html; charset=utf-8",
+	".htm":   "text/html; charset=utf-8",
+	".json":  "application/json; charset=utf-8",
+	".map":   "application/json; charset=utf-8",
+	".svg":   "image/svg+xml",
+	".png":   "image/png",
+	".jpg":   "image/jpeg",
+	".jpeg":  "image/jpeg",
+	".gif":   "image/gif",
+	".webp":  "image/webp",
+	".ico":   "image/x-icon",
+	".woff":  "font/woff",
+	".woff2": "font/woff2",
+	".ttf":   "font/ttf",
+	".otf":   "font/otf",
+	".txt":   "text/plain; charset=utf-8",
+	".wasm":  "application/wasm",
+}
+
+// setStaticContentType pins the response Content-Type by file extension so that
+// http.ServeFile does not fall back to mime.TypeByExtension (which on Windows
+// reads the registry and may return an incorrect type).
+func setStaticContentType(w http.ResponseWriter, fullPath string) {
+	ext := strings.ToLower(filepath.Ext(fullPath))
+	if ct, ok := staticContentTypes[ext]; ok {
+		w.Header().Set("Content-Type", ct)
+	}
+}
+
 func (h *Handler) serveFromDisk(w http.ResponseWriter, r *http.Request, staticDir string) {
 	path := strings.TrimPrefix(r.URL.Path, "/admin")
 	path = strings.TrimPrefix(path, "/")
@@ -70,6 +109,7 @@ func (h *Handler) serveFromDisk(w http.ResponseWriter, r *http.Request, staticDi
 	} else {
 		w.Header().Set("Cache-Control", "no-store, must-revalidate")
 	}
+	setStaticContentType(w, full)
 	http.ServeFile(w, r, full)
 	return
 }
@@ -82,6 +122,7 @@ func (h *Handler) serveFromDisk(w http.ResponseWriter, r *http.Request, staticDi
 		return
 	}
 	w.Header().Set("Cache-Control", "no-store, must-revalidate")
+	setStaticContentType(w, index)
 	http.ServeFile(w, r, index)
 }
```
internal/webui/handler_test.go (new file, 102 lines):

```go
package webui

import (
	"net/http"
	"net/http/httptest"
	"os"
	"path/filepath"
	"strings"
	"testing"
)

// TestServeFromDiskPinsContentType ensures static admin assets are returned
// with an explicit, RFC-compliant Content-Type that does not depend on
// mime.TypeByExtension. On Windows mime.TypeByExtension consults the registry
// (HKEY_CLASSES_ROOT) which third-party software can corrupt — for example
// installing certain editors rewrites .css to application/xml — and Chrome
// then refuses to apply a stylesheet whose Content-Type is not text/css,
// breaking the /admin page entirely. Pinning the type by file extension makes
// the response deterministic across operating systems and machine state.
func TestServeFromDiskPinsContentType(t *testing.T) {
	staticDir := t.TempDir()
	assetsDir := filepath.Join(staticDir, "assets")
	if err := os.MkdirAll(assetsDir, 0o755); err != nil {
		t.Fatalf("mkdir assets: %v", err)
	}

	files := map[string]string{
		"index.html":           "<!doctype html><html></html>",
		"assets/index.css":     "body{}",
		"assets/index.js":      "console.log(1)",
		"assets/icon.svg":      `<svg xmlns="http://www.w3.org/2000/svg"></svg>`,
		"assets/source.js.map": `{"version":3}`,
	}
	for rel, body := range files {
		full := filepath.Join(staticDir, filepath.FromSlash(rel))
		if err := os.MkdirAll(filepath.Dir(full), 0o755); err != nil {
			t.Fatalf("mkdir %s: %v", rel, err)
		}
		if err := os.WriteFile(full, []byte(body), 0o644); err != nil {
			t.Fatalf("write %s: %v", rel, err)
		}
	}

	h := &Handler{StaticDir: staticDir}

	cases := []struct {
		urlPath      string
		wantPrefix   string
		wantCacheCtl string
	}{
		{"/admin/assets/index.css", "text/css", "public, max-age=31536000, immutable"},
		{"/admin/assets/index.js", "text/javascript", "public, max-age=31536000, immutable"},
		{"/admin/assets/icon.svg", "image/svg+xml", "public, max-age=31536000, immutable"},
		{"/admin/assets/source.js.map", "application/json", "public, max-age=31536000, immutable"},
		// "/admin/index.html" is intentionally omitted: http.ServeFile redirects
		// requests for index.html to "./", matching Go's net/http behavior. The
		// route the SPA actually lands on is "/admin/" below.
		{"/admin/", "text/html", "no-store, must-revalidate"},
	}

	for _, tc := range cases {
		t.Run(tc.urlPath, func(t *testing.T) {
			req := httptest.NewRequest(http.MethodGet, tc.urlPath, nil)
			rec := httptest.NewRecorder()
			h.serveFromDisk(rec, req, staticDir)

			if rec.Code != http.StatusOK {
				t.Fatalf("status = %d, want 200", rec.Code)
			}
			ct := rec.Header().Get("Content-Type")
			if !strings.HasPrefix(ct, tc.wantPrefix) {
				t.Fatalf("Content-Type = %q, want prefix %q", ct, tc.wantPrefix)
			}
			if got := rec.Header().Get("Cache-Control"); got != tc.wantCacheCtl {
				t.Fatalf("Cache-Control = %q, want %q", got, tc.wantCacheCtl)
			}
		})
	}
}

// TestSetStaticContentTypeUnknownExtensionFallsThrough verifies that unknown
// extensions leave the Content-Type header unset, so http.ServeFile can apply
// its own detection (sniffing or mime.TypeByExtension) for cases the pinned
// table does not cover.
func TestSetStaticContentTypeUnknownExtensionFallsThrough(t *testing.T) {
	rec := httptest.NewRecorder()
	setStaticContentType(rec, "/tmp/data.unknownext")
	if got := rec.Header().Get("Content-Type"); got != "" {
		t.Fatalf("Content-Type = %q, want empty for unknown extension", got)
	}
}

// TestSetStaticContentTypeIsCaseInsensitive guards against a regression where
// uppercase extensions (e.g. STYLE.CSS shipped from some build pipelines)
// would bypass the pinned table and fall back to the registry on Windows.
func TestSetStaticContentTypeIsCaseInsensitive(t *testing.T) {
	rec := httptest.NewRecorder()
	setStaticContentType(rec, "/tmp/STYLE.CSS")
	if got := rec.Header().Get("Content-Type"); !strings.HasPrefix(got, "text/css") {
		t.Fatalf("Content-Type = %q, want text/css prefix", got)
	}
}
```
Chat history utils tests:

```diff
@@ -58,3 +58,47 @@ test('chat history strict parser inserts history after system messages', async (
     { role: 'user', content: 'latest' },
   ]);
 });
+
+test('chat history transcript parser replaces current input file placeholder', async () => {
+  const {
+    buildListModeMessages,
+  } = await loadUtils();
+  const t = (key) => key;
+  const item = {
+    messages: [{
+      role: 'user',
+      content: 'Continue from the latest state in the attached DS2API_HISTORY.txt context. Treat it as the current working state and answer the latest user request directly.',
+    }],
+    history_text: [
+      '# DS2API_HISTORY.txt',
+      'Prior conversation history and tool progress.',
+      '',
+      '=== 1. SYSTEM ===',
+      'policy',
+      '',
+      '=== 2. USER ===',
+      'hello',
+      '',
+      '=== 3. ASSISTANT ===',
+      'hi',
+      '',
+      '=== 4. TOOL ===',
+      '[name=search_web tool_call_id=call_1]',
+      '{"ok":true}',
+      '',
+      '=== 5. USER ===',
+      'latest',
+      '',
+    ].join('\n'),
+  };
+
+  const result = buildListModeMessages(item, t);
+  assert.equal(result.historyMerged, true);
+  assert.deepEqual(result.messages, [
+    { role: 'system', content: 'policy' },
+    { role: 'user', content: 'hello' },
+    { role: 'assistant', content: 'hi' },
+    { role: 'tool', content: '[name=search_web tool_call_id=call_1]\n{"ok":true}' },
+    { role: 'user', content: 'latest' },
+  ]);
+});
```
Stream parser tests:

```diff
@@ -615,17 +615,17 @@ test('parseChunkForContent preserves space-only content tokens', () => {
   assert.deepEqual(parsed.parts, [{ text: ' ', type: 'text' }]);
 });

-test('parseChunkForContent strips reference markers from fragment content', () => {
+test('parseChunkForContent strips citation and reference markers from fragment content', () => {
   const chunk = {
     p: 'response/fragments',
     o: 'APPEND',
     v: [
-      { type: 'RESPONSE', content: '广州天气 [reference:12] 多云' },
+      { type: 'RESPONSE', content: '广州天气 [citation:1] [reference:12] 多云' },
     ],
   };
   const parsed = parseChunkForContent(chunk, false, 'text');
   assert.equal(parsed.finished, false);
   assert.deepEqual(parsed.parts, [{ text: '广州天气 多云', type: 'text' }]);
 });

 test('parseChunkForContent detects content_filter status and ignores upstream output tokens', () => {
```
useChatStreamClient (drops the tester source header):

```diff
@@ -123,7 +123,6 @@ export function useChatStreamClient({
   const headers = {
     'Content-Type': 'application/json',
     'Authorization': `Bearer ${effectiveKey}`,
-    'X-Ds2-Source': 'admin-webui-api-tester',
   }
   if (requestAccount) {
     headers['X-Ds2-Target-Account'] = requestAccount
```
**ChatHistoryContainer** — the hard-coded polling intervals become named constants:

```diff
@@ -10,6 +10,9 @@ import {
   VIEW_MODE_KEY,
 } from './chatHistoryUtils'
 
+const LIST_REFRESH_MS = 1500
+const STREAMING_DETAIL_REFRESH_MS = 750
+
 export default function ChatHistoryContainer({ authFetch, onMessage }) {
   const { t, lang } = useI18n()
   const apiFetch = authFetch || fetch
@@ -136,7 +139,7 @@ export default function ChatHistoryContainer({ authFetch, onMessage }) {
     if (!autoRefreshReady || limit === DISABLED_LIMIT) return undefined
     const timer = window.setInterval(() => {
       loadList({ mode: 'silent', announceError: false })
-    }, 5000)
+    }, LIST_REFRESH_MS)
     return () => window.clearInterval(timer)
   }, [autoRefreshReady, limit])
@@ -144,7 +147,7 @@ export default function ChatHistoryContainer({ authFetch, onMessage }) {
     if (!autoRefreshReady || !selectedId || selectedSummary?.status !== 'streaming') return undefined
     const timer = window.setInterval(() => {
       loadDetail(selectedId, { announceError: false })
-    }, 1000)
+    }, STREAMING_DETAIL_REFRESH_MS)
     return () => window.clearInterval(timer)
   }, [autoRefreshReady, selectedId, selectedSummary?.status])
```
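The refresh loops above follow the usual effect pattern: start an interval, return a cleanup that clears it. A minimal framework-free sketch (the constant value is taken from the diff; `startPolling` is an illustrative helper, not part of the codebase):

```javascript
// Sketch of the polling pattern: start an interval and hand back a
// cleanup function, as a useEffect callback would return it.
const LIST_REFRESH_MS = 1500 // value taken from the diff

function startPolling(tick, intervalMs = LIST_REFRESH_MS) {
  const timer = setInterval(tick, intervalMs)
  // Cleanup: React would invoke this on unmount or dependency change.
  return () => clearInterval(timer)
}

// Usage: stop() plays the role of the effect's cleanup.
const stop = startPolling(() => { /* e.g. loadList({ mode: 'silent' }) */ }, 1000)
stop()
```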
**MetaGrid** — the detail pane gains a "Surface" card before the model card:

```diff
@@ -207,6 +207,10 @@ function MetaGrid({ selectedItem, t }) {
           {formatElapsed(selectedItem.elapsed_ms, t)}
         </div>
       </div>
+      <div className="rounded-lg border border-border bg-card px-3 py-2">
+        <div className="text-[11px] text-muted-foreground">{t('chatHistory.metaSurface')}</div>
+        <div className="text-sm font-medium text-foreground break-all">{selectedItem.surface || t('chatHistory.metaUnknown')}</div>
+      </div>
       <div className="rounded-lg border border-border bg-card px-3 py-2">
         <div className="text-[11px] text-muted-foreground">{t('chatHistory.metaModel')}</div>
         <div className="text-sm font-medium text-foreground break-all">{selectedItem.model || t('chatHistory.metaUnknown')}</div>
```
**ChatHistoryListPane** — the list item's secondary label now shows surface and model:

```diff
@@ -69,7 +69,7 @@ export function ChatHistoryListPane({ items, selectedItem, deletingId, t, lang,
           {item.user_input || t('chatHistory.untitled')}
         </div>
         <div className="text-[11px] text-muted-foreground mt-1 truncate">
-          {item.model || '-'}
+          {[item.surface, item.model].filter(Boolean).join(' · ') || '-'}
         </div>
       </div>
       <div className="flex items-center gap-2 shrink-0">
```
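The new label expression can be checked in isolation: `filter(Boolean)` drops empty or missing values before joining. A small sketch (`secondaryLabel` is an illustrative name; the diff inlines the expression, and the sample surface/model values are made up):

```javascript
// Joins whichever of surface/model are set, falling back to '-'.
function secondaryLabel(surface, model) {
  return [surface, model].filter(Boolean).join(' · ') || '-'
}

console.log(secondaryLabel('openai_chat', 'deepseek-chat')) // → 'openai_chat · deepseek-chat'
console.log(secondaryLabel('', 'deepseek-chat'))            // → 'deepseek-chat'
console.log(secondaryLabel('', ''))                         // → '-'
```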
**chatHistoryUtils** — transcript constants and a role normalizer are added:

```diff
@@ -15,12 +15,24 @@ const CURRENT_INPUT_FILE_PROMPT = 'Continue from the latest state in the attache
 const LEGACY_CURRENT_INPUT_FILE_PROMPTS = new Set([
   'The current request and prior conversation context have already been provided. Answer the latest user request directly.',
 ])
+const HISTORY_TRANSCRIPT_TITLE = '# DS2API_HISTORY.txt'
+const HISTORY_TRANSCRIPT_ENTRY_RE = /^===\s+\d+\.\s+([A-Z][A-Z_ -]*)\s+===\s*$/gm
+
 function isCurrentInputFilePrompt(value) {
   const text = String(value || '').trim()
   return text === CURRENT_INPUT_FILE_PROMPT || LEGACY_CURRENT_INPUT_FILE_PROMPTS.has(text)
 }
+
+function normalizeHistoryRole(role) {
+  const normalized = String(role || '').trim().toLowerCase()
+  if (normalized === 'function') return 'tool'
+  if (normalized === 'developer') return 'system'
+  if (normalized === 'system' || normalized === 'user' || normalized === 'assistant' || normalized === 'tool') {
+    return normalized
+  }
+  return normalized || 'system'
+}
+
 export function formatDateTime(value, lang) {
   if (!value) return '-'
   try {
```
**chatHistoryUtils** — a transcript fallback parser, with `parseHistoryMessages` trying strict parsing first:

```diff
@@ -221,11 +233,37 @@ export function parseStrictHistoryMessages(historyText) {
   return parsed
 }
 
+export function parseTranscriptHistoryMessages(historyText) {
+  const rawText = String(historyText || '')
+  const titleIndex = rawText.indexOf(HISTORY_TRANSCRIPT_TITLE)
+  const transcript = titleIndex >= 0 ? rawText.slice(titleIndex) : rawText
+  const matches = [...transcript.matchAll(HISTORY_TRANSCRIPT_ENTRY_RE)]
+  if (!matches.length) return null
+
+  const parsed = []
+  for (let i = 0; i < matches.length; i += 1) {
+    const match = matches[i]
+    const next = matches[i + 1]
+    const role = normalizeHistoryRole(match[1])
+    const start = (match.index || 0) + match[0].length
+    const end = next ? next.index : transcript.length
+    const content = transcript.slice(start, end).replace(/^\r?\n/, '').trim()
+    if (!content) continue
+    parsed.push({ role, content })
+  }
+
+  return parsed.length ? parsed : null
+}
+
+export function parseHistoryMessages(historyText) {
+  return parseStrictHistoryMessages(historyText) || parseTranscriptHistoryMessages(historyText)
+}
+
 export function buildListModeMessages(item, t) {
   const liveMessages = Array.isArray(item?.messages) && item.messages.length > 0
     ? item.messages
     : [{ role: 'user', content: item?.user_input || t('chatHistory.emptyUserInput') }]
-  const historyMessages = parseStrictHistoryMessages(item?.history_text)
+  const historyMessages = parseHistoryMessages(item?.history_text)
 
   if (!historyMessages?.length) {
     return { messages: liveMessages, historyMerged: false }
```
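The transcript fallback can be exercised standalone. The constants and functions below are copied from the diff; the sample transcript is illustrative, built to match the `=== N. ROLE ===` header format the regex expects:

```javascript
// Copied from the diff: title marker, entry header regex, role normalizer,
// and the transcript fallback parser.
const HISTORY_TRANSCRIPT_TITLE = '# DS2API_HISTORY.txt'
const HISTORY_TRANSCRIPT_ENTRY_RE = /^===\s+\d+\.\s+([A-Z][A-Z_ -]*)\s+===\s*$/gm

function normalizeHistoryRole(role) {
  const normalized = String(role || '').trim().toLowerCase()
  if (normalized === 'function') return 'tool'
  if (normalized === 'developer') return 'system'
  if (normalized === 'system' || normalized === 'user' || normalized === 'assistant' || normalized === 'tool') {
    return normalized
  }
  return normalized || 'system'
}

function parseTranscriptHistoryMessages(historyText) {
  const rawText = String(historyText || '')
  const titleIndex = rawText.indexOf(HISTORY_TRANSCRIPT_TITLE)
  const transcript = titleIndex >= 0 ? rawText.slice(titleIndex) : rawText
  const matches = [...transcript.matchAll(HISTORY_TRANSCRIPT_ENTRY_RE)]
  if (!matches.length) return null
  const parsed = []
  for (let i = 0; i < matches.length; i += 1) {
    const match = matches[i]
    const next = matches[i + 1]
    // Content runs from the end of this header to the start of the next one.
    const start = (match.index || 0) + match[0].length
    const end = next ? next.index : transcript.length
    const content = transcript.slice(start, end).replace(/^\r?\n/, '').trim()
    if (content) parsed.push({ role: normalizeHistoryRole(match[1]), content })
  }
  return parsed.length ? parsed : null
}

// Illustrative transcript in the expected format:
const sample = [
  '# DS2API_HISTORY.txt',
  '=== 1. SYSTEM ===',
  'You are helpful.',
  '=== 2. USER ===',
  'Hello',
].join('\n')

console.log(parseTranscriptHistoryMessages(sample))
// → [ { role: 'system', content: 'You are helpful.' },
//     { role: 'user', content: 'Hello' } ]
```

Text without any `=== N. ROLE ===` headers yields `null`, which is why `parseHistoryMessages` can safely fall through to it after the strict parser.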
**VercelSyncContainer** — the save-credentials state is threaded through to the form:

```diff
@@ -15,6 +15,8 @@ export default function VercelSyncContainer({ onMessage, authFetch, isVercel = f
     setProjectId,
     teamId,
     setTeamId,
+    saveCredentials,
+    setSaveCredentials,
     loading,
     result,
     preconfig,
@@ -46,6 +48,8 @@ export default function VercelSyncContainer({ onMessage, authFetch, isVercel = f
       setProjectId={setProjectId}
       teamId={teamId}
       setTeamId={setTeamId}
+      saveCredentials={saveCredentials}
+      setSaveCredentials={setSaveCredentials}
       loading={loading}
       onSync={handleSync}
     />
```
**VercelSyncForm** — new props and a save-credentials checkbox below the team ID field:

```diff
@@ -14,6 +14,8 @@ export default function VercelSyncForm({
   setProjectId,
   teamId,
   setTeamId,
+  saveCredentials,
+  setSaveCredentials,
   loading,
   onSync,
 }) {
@@ -124,6 +126,19 @@ export default function VercelSyncForm({
           onChange={e => setTeamId(e.target.value)}
         />
       </div>
+
+      <label className="flex items-start gap-3 text-sm">
+        <input
+          type="checkbox"
+          className="mt-1 h-4 w-4 rounded border-border text-primary focus:ring-ring"
+          checked={saveCredentials}
+          onChange={e => setSaveCredentials(e.target.checked)}
+        />
+        <span className="space-y-1">
+          <span className="block font-medium">{t('vercel.saveCredentials')}</span>
+          <span className="block text-xs text-muted-foreground">{t('vercel.saveCredentialsHint')}</span>
+        </span>
+      </label>
     </div>
 
     <div className="pt-4">
```
**useVercelSyncState** — the hook owns the flag (defaulting to on) and forwards it as `save_credentials` in the sync request:

```diff
@@ -12,6 +12,7 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
   const [vercelToken, setVercelToken] = useState('')
   const [projectId, setProjectId] = useState('')
   const [teamId, setTeamId] = useState('')
+  const [saveCredentials, setSaveCredentials] = useState(true)
   const [loading, setLoading] = useState(false)
   const [result, setResult] = useState(null)
   const [preconfig, setPreconfig] = useState(null)
@@ -117,6 +118,7 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
         vercel_token: tokenToUse,
         project_id: projectId,
         team_id: teamId || undefined,
+        save_credentials: saveCredentials,
       }),
     })
     const data = await res.json()
@@ -133,7 +135,7 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
     } finally {
       setLoading(false)
     }
-  }, [apiFetch, fetchSyncStatus, onMessage, preconfig?.has_token, projectId, t, teamId, vercelToken])
+  }, [apiFetch, fetchSyncStatus, onMessage, preconfig?.has_token, projectId, saveCredentials, t, teamId, vercelToken])
 
   return {
     vercelToken,
@@ -142,6 +144,8 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
     setProjectId,
     teamId,
     setTeamId,
+    saveCredentials,
+    setSaveCredentials,
     loading,
     result,
     preconfig,
```
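The request body the hook now posts can be sketched as follows. Field names come from the diff; `buildSyncBody` is an illustrative helper (the hook inlines the object literal) and the values are made up:

```javascript
// Builds the JSON body for the sync request. An empty team ID becomes
// undefined, which JSON.stringify drops from the output entirely.
function buildSyncBody({ token, projectId, teamId, saveCredentials }) {
  return JSON.stringify({
    vercel_token: token,
    project_id: projectId,
    team_id: teamId || undefined,
    save_credentials: saveCredentials,
  })
}

console.log(buildSyncBody({ token: 'tok', projectId: 'prj_x', teamId: '', saveCredentials: true }))
// → {"vercel_token":"tok","project_id":"prj_x","save_credentials":true}
```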
**English locale strings** — history wording shifts from "/v1/chat/completions conversations" to upstream response records, plus new surface and save-credentials keys:

```diff
@@ -18,8 +18,8 @@
       "desc": "Test API connectivity and responses"
     },
     "history": {
-      "label": "Conversations",
-      "desc": "Browse server-side external chat history"
+      "label": "Responses",
+      "desc": "Browse server-side upstream response records"
     },
     "import": {
       "label": "Batch Import",
@@ -261,7 +261,7 @@
     "loading": "Loading conversation history...",
     "loadFailed": "Failed to load conversation history.",
     "retentionTitle": "Retention",
-    "retentionDesc": "The server keeps only the latest N external /v1/chat/completions conversations.",
+    "retentionDesc": "The server keeps only the latest N DeepSeek upstream response records across OpenAI Chat, OpenAI Responses, Claude, and Gemini direct interfaces.",
     "off": "OFF",
     "refresh": "Refresh",
     "clearAll": "Clear all",
@@ -277,7 +277,7 @@
     "viewModeList": "List mode",
     "viewModeMerged": "Merged mode",
     "emptyTitle": "No conversation history yet",
-    "emptyDesc": "When external clients call /v1/chat/completions, the server will save the results here automatically.",
+    "emptyDesc": "When a supported interface talks to DeepSeek upstream and receives a response, the server saves the result here automatically.",
     "untitled": "Untitled conversation",
     "noPreview": "No preview available.",
     "selectPrompt": "Select a record on the left to view details.",
@@ -303,6 +303,7 @@
     "metaTitle": "Metadata",
     "metaAccount": "Account",
     "metaElapsed": "Elapsed",
+    "metaSurface": "Surface",
     "metaModel": "Model",
     "metaStatusCode": "Status code",
     "metaStream": "Output mode",
@@ -461,6 +462,8 @@
     "projectIdHint": "Find it in Project Settings → General.",
     "teamIdLabel": "Team ID",
     "optional": "optional",
+    "saveCredentials": "Remember Vercel credentials",
+    "saveCredentialsHint": "Save the token, project ID, and team ID for the next sync.",
     "syncing": "Syncing...",
     "syncRedeploy": "Sync & redeploy",
     "redeployHint": "This triggers a Vercel redeploy and usually takes 30–60 seconds.",
```
**Chinese locale strings** — the same changes mirrored in the zh resource file (string values kept verbatim, as they are locale content):

```diff
@@ -18,8 +18,8 @@
       "desc": "测试 API 连接与响应"
     },
     "history": {
-      "label": "对话记录",
-      "desc": "查看服务器保存的外部对话历史"
+      "label": "响应记录",
+      "desc": "查看服务器保存的上游响应归档"
     },
     "import": {
       "label": "批量导入",
@@ -261,7 +261,7 @@
     "loading": "正在加载对话记录...",
     "loadFailed": "加载对话记录失败",
     "retentionTitle": "保留条数",
-    "retentionDesc": "服务器端只保留最新 N 条外部 /v1/chat/completions 对话记录。",
+    "retentionDesc": "服务器端只保留最新 N 条 DeepSeek 上游响应记录,覆盖 OpenAI Chat、OpenAI Responses、Claude 和 Gemini 直连接口。",
     "off": "OFF",
     "refresh": "刷新",
     "clearAll": "清空全部",
@@ -277,7 +277,7 @@
     "viewModeList": "列表模式",
     "viewModeMerged": "合并模式",
     "emptyTitle": "还没有可用的对话记录",
-    "emptyDesc": "当外部客户端调用 /v1/chat/completions 时,服务端会自动把结果写入这里。",
+    "emptyDesc": "当支持的接口与 DeepSeek 上游交互并收到响应时,服务端会自动把结果写入这里。",
     "untitled": "未命名对话",
     "noPreview": "暂无预览内容",
     "selectPrompt": "从左侧选择一条记录查看详情。",
@@ -303,6 +303,7 @@
     "metaTitle": "元信息",
     "metaAccount": "使用账号",
     "metaElapsed": "耗时",
+    "metaSurface": "接口",
     "metaModel": "模型",
     "metaStatusCode": "状态码",
     "metaStream": "输出模式",
@@ -461,6 +462,8 @@
     "projectIdHint": "可在项目设置 (Project Settings) → 常规 (General) 中找到",
     "teamIdLabel": "团队 ID",
     "optional": "可选",
+    "saveCredentials": "记住 Vercel 凭据",
+    "saveCredentialsHint": "保存访问令牌、项目 ID 和团队 ID,供下次同步直接复用。",
     "syncing": "正在同步...",
     "syncRedeploy": "同步并重新部署",
     "redeployHint": "这将触发 Vercel 的重新部署,大约需要 30-60 秒。",
```