Compare commits


31 Commits

Author SHA1 Message Date
CJACK.
7ab5a0e66d Merge pull request #458 from CJackHwang/dev
Avoid lowercasing ignored XML tails in toolcall
2026-05-08 17:13:00 +08:00
CJACK.
410efbd70b Merge pull request #457 from NgoQuocViet2001/ai/skipxml-lower-hotpath
fix(toolcall): avoid lowercasing ignored XML tails
2026-05-08 17:05:28 +08:00
NgoQuocViet2001
7179b995bb fix(toolcall): avoid lowercasing ignored XML tails 2026-05-08 14:15:32 +07:00
CJACK.
fef3798e5e Merge pull request #453 from CJackHwang/dev
Fix character length calculation issue
2026-05-08 13:40:47 +08:00
CJACK.
00fe18b505 Update VERSION 2026-05-08 13:36:17 +08:00
CJACK.
9b746e32d8 Merge pull request #452 from waiwaic/fix/turkish-i-boundary-panic
fix(toolcall): use len(lower) not len(text) after ToLower to prevent out-of-bounds panic
2026-05-08 13:34:28 +08:00
waiwai
ace440481a refactor(toolcall): remove lower param from skipXMLIgnoredSection
The lower parameter was a footgun: callers had to keep it in sync with the
loop bound over text. Instead, skipXMLIgnoredSection now accepts only text
and constructs strings.ToLower(tail) internally for its prefix checks.

This eliminates the entire class of len(text) vs len(lower) boundary bugs
along with the min() workaround.

Also changes:
- findToolCDATAEnd: drop lower param, use text directly for closeMarker
  search (]]> is ASCII, ToLower is a no-op for it)
- cdataEndLooksStructural: drop lower param, use raw text byte comparison
- All external callers: loop bound reverts to plain len(text)

The inner tag-matching functions (findXMLStartTagOutsideCDATA,
findMatchingXMLEndTagOutsideCDATA) retain their own local lower for
HasPrefix comparisons against the target tag name, keeping concerns
properly separated.

Fixes #435.
2026-05-08 13:29:21 +08:00
CJACK.
66e0fa568f Merge pull request #449 from CJackHwang/dev
Update
2026-05-08 01:24:16 +08:00
CJACK.
fa489248bc Merge pull request #450 from CJackHwang/codex/add-json-tag-for-ollama-model-id
Add Ollama-compatible API endpoints and model capability support
2026-05-08 01:21:41 +08:00
CJACK.
657b9379ed test(docs): assert ollama show id field and document ollama endpoints 2026-05-08 01:11:35 +08:00
CJACK.
9062330104 Merge pull request #446 from dinhnn/main
feat: add Ollama API endpoints /api/version, /api/tags, /api/show for Copilot integration
2026-05-08 00:54:16 +08:00
Dinh Nguyen
d0d61a5d77 Update ollama api test 2026-05-07 14:23:12 +07:00
dinhnn
ffef451f7a Fixbug test typing 2026-05-07 13:48:03 +07:00
Dinh Nguyen
a68a79e087 Add ollama api for copilot support 2026-05-07 09:41:46 +07:00
CJACK.
c8db66615c Update VERSION 2026-05-06 13:04:16 +08:00
CJACK.
79ae9c8970 Merge pull request #436 from waiwaic/main
fix: auto-detect Vercel for chat history path
2026-05-06 13:03:30 +08:00
waiwai
2378f0fbe7 fix: auto-detect Vercel for chat history path
On Vercel, /var/task is read-only at runtime. ChatHistoryPath() now
auto-detects Vercel via IsVercel() and defaults to /tmp/chat_history.json
when no explicit DS2API_CHAT_HISTORY_PATH is set. Manual env var still
works as explicit override.
2026-05-06 11:10:14 +08:00
CJACK.
aa29084038 Merge pull request #434 from CJackHwang/dev
Merge pull request #433 from CJackHwang/codex/flash-searchpro-search

Remove heuristic model name resolution and require explicit aliases or canonical IDs
2026-05-06 00:38:20 +08:00
CJACK.
21c1527c79 Merge pull request #433 from CJackHwang/codex/flash-searchpro-search
Remove heuristic model name resolution and require explicit aliases or canonical IDs
2026-05-06 00:03:44 +08:00
CJACK.
7ec0d99702 Merge pull request #431 from CJackHwang/main
Fix OpenAI stream heartbeat and avoid empty choices
2026-05-06 00:02:25 +08:00
CJACK.
7e639667f8 refactor: remove heuristic model resolution and enforce allowlist 2026-05-06 00:00:27 +08:00
CJACK.
066c48c107 Bump version from 4.4.2 to 4.4.3 2026-05-05 22:29:36 +08:00
CJACK.
d69b0658ea Merge pull request #430 from NgoQuocViet2001/ai/openai-stream-empty-choices
fix(openai): avoid empty choices stream heartbeat
2026-05-05 22:24:21 +08:00
NgoQuocViet2001
4315b424bf fix(openai): keep stream heartbeat choice-free 2026-05-05 21:13:38 +07:00
CJACK.
4678a061d0 Merge pull request #423 from CJackHwang/dev
Dev push
2026-05-04 23:20:00 +08:00
CJACK.
70076c217f Update VERSION 2026-05-04 23:16:33 +08:00
CJACK.
554fae6b3f Merge pull request #421 from NgoQuocViet2001/ai/vercel-credential-cache
feat(admin): remember Vercel sync credentials
2026-05-04 23:11:00 +08:00
NgoQuocViet2001
76884c0d94 feat(admin): remember Vercel sync credentials 2026-05-04 21:28:02 +07:00
CJACK.
269d7cd8f9 Merge pull request #418 from lwz762/fix/admin-css-mime-windows
fix(webui): fix Windows registry MIME error breaking /admin styles
2026-05-04 14:31:47 +08:00
lin
7870a61bb0 fix(webui): pin Content-Type for /admin static assets
http.ServeFile relies on mime.TypeByExtension, which on Windows reads
HKEY_CLASSES_ROOT to resolve the MIME type. Third-party software (some
editors and registry-cleaning tools) can rewrite ".css" to
"application/xml", causing Chrome to refuse the stylesheet and breaking
the /admin panel with a fully unstyled page. The same class of bug
affects ".js" -> "text/plain" in some setups.

Pin the Content-Type by file extension before delegating to ServeFile,
covering the WebUI asset surface (css, js, mjs, html, json, map, svg,
common image and font formats, wasm). Unknown extensions still fall
through to ServeFile's default detection.

Tests cover the pinned types, case-insensitive extension matching, and
the unknown-extension passthrough.
2026-05-04 10:09:50 +08:00
CJACK.
ec4f178908 Merge pull request #416 from CJackHwang/main
Add Star History section to README

Added a Star History section with a chart to the README.
2026-05-03 20:48:53 +08:00
33 changed files with 954 additions and 167 deletions

View File

@@ -18,6 +18,7 @@ Docs: [Overview](README.en.md) / [Architecture](docs/ARCHITECTURE.en.md) / [Depl
- [OpenAI-Compatible API](#openai-compatible-api)
- [Claude-Compatible API](#claude-compatible-api)
- [Gemini-Compatible API](#gemini-compatible-api)
- [Ollama API](#ollama-api)
- [Admin API](#admin-api)
- [Error Payloads](#error-payloads)
- [cURL Examples](#curl-examples)
@@ -123,6 +124,9 @@ Gemini-compatible clients can also send `x-goog-api-key`, `?key=`, or `?api_key=
| POST | `/v1beta/models/{model}:streamGenerateContent` | Business | Gemini stream |
| POST | `/v1/models/{model}:generateContent` | Business | Gemini non-stream compat path |
| POST | `/v1/models/{model}:streamGenerateContent` | Business | Gemini stream compat path |
| GET | `/api/version` | None | Ollama version endpoint |
| GET | `/api/tags` | None | Ollama model list |
| POST | `/api/show` | None | Ollama model capability query (returns `id` + `capabilities`) |
| POST | `/admin/login` | None | Admin login |
| GET | `/admin/verify` | JWT | Verify admin JWT |
| GET | `/admin/vercel/config` | Admin | Read preconfigured Vercel creds |
@@ -617,6 +621,20 @@ Returns SSE (`text/event-stream`), each chunk as `data: <json>`:
---
## Ollama API
- `POST /api/show` request body: `{"model":"<model-id>"}`.
- Response uses lowercase `id` (not `ID`) and includes `capabilities` for Ollama-style clients and strict schemas.
Example response:
```json
{
"id": "deepseek-v4-flash",
"capabilities": ["tools", "thinking"]
}
```
## Admin API
### `POST /admin/login`
@@ -660,11 +678,13 @@ Requires JWT: `Authorization: Bearer <jwt>`
### `GET /admin/vercel/config`
Returns Vercel preconfiguration status.
Returns Vercel preconfiguration status. Environment variables take precedence, with the saved `vercel` config block as a fallback.
```json
{
"has_token": true,
"token_preview": "vc****en",
"token_source": "config",
"project_id": "prj_xxx",
"team_id": null
}
@@ -685,6 +705,12 @@ Returns sanitized config, including both `keys` and `api_keys`.
"env_source_present": true,
"env_writeback_enabled": true,
"config_path": "/data/config.json",
"vercel": {
"has_token": true,
"token_preview": "vc****en",
"project_id": "prj_xxx",
"team_id": ""
},
"accounts": [
{
"identifier": "user@example.com",
@@ -1096,11 +1122,11 @@ The success payload includes `sample_id`, `dir`, `meta_path`, and `upstream_path
| Field | Required | Notes |
| --- | --- | --- |
| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read env |
| `project_id` | ❌ | Fallback: `VERCEL_PROJECT_ID` |
| `team_id` | ❌ | Fallback: `VERCEL_TEAM_ID` |
| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read env, then saved config |
| `project_id` | ❌ | Fallback: `VERCEL_PROJECT_ID`, then saved config |
| `team_id` | ❌ | Fallback: `VERCEL_TEAM_ID`, then saved config |
| `auto_validate` | ❌ | Default `true` |
| `save_credentials` | ❌ | Default `true` |
| `save_credentials` | ❌ | Default `true`; saves explicitly supplied Vercel credentials for the next sync |
**Success response**:

36
API.md
View File

@@ -18,6 +18,7 @@
- [OpenAI-Compatible API](#openai-兼容接口)
- [Claude-Compatible API](#claude-兼容接口)
- [Gemini-Compatible API](#gemini-兼容接口)
- [Ollama-Compatible API](#ollama-兼容接口)
- [Admin API](#admin-接口)
- [Error Payloads](#错误响应格式)
- [cURL Examples](#curl-示例)
@@ -125,6 +126,9 @@ Gemini-compatible clients can also use `x-goog-api-key`, `?key=`, or `?api_key=`
| POST | `/v1beta/models/{model}:streamGenerateContent` | Business | Gemini stream |
| POST | `/v1/models/{model}:generateContent` | Business | Gemini non-stream compat path |
| POST | `/v1/models/{model}:streamGenerateContent` | Business | Gemini stream compat path |
| GET | `/api/version` | None | Ollama version endpoint |
| GET | `/api/tags` | None | Ollama model list |
| POST | `/api/show` | None | Ollama single-model capability query (returns `id` and `capabilities`) |
| POST | `/admin/login` | None | Admin login |
| GET | `/admin/verify` | JWT | Verify admin JWT |
| GET | `/admin/vercel/config` | Admin | Read preconfigured Vercel creds |
@@ -628,6 +632,20 @@ data: {"type":"message_stop"}
---
## Ollama-Compatible API
- `POST /api/show` request body: `{"model":"<model-id>"}`
- The response uses lowercase `id` (not `ID`) and returns a `capabilities` array to align with Ollama-style clients and strict schemas.
Example response:
```json
{
"id": "deepseek-v4-flash",
"capabilities": ["tools", "thinking"]
}
```
## Admin API
### `POST /admin/login`
@@ -671,11 +689,13 @@ data: {"type":"message_stop"}
### `GET /admin/vercel/config`
Returns Vercel preconfiguration status.
Returns Vercel preconfiguration status. Environment variables are read first, falling back to the saved `vercel` config block.
```json
{
"has_token": true,
"token_preview": "vc****en",
"token_source": "config",
"project_id": "prj_xxx",
"team_id": null
}
@@ -696,6 +716,12 @@ data: {"type":"message_stop"}
"env_source_present": true,
"env_writeback_enabled": true,
"config_path": "/data/config.json",
"vercel": {
"has_token": true,
"token_preview": "vc****en",
"project_id": "prj_xxx",
"team_id": ""
},
"accounts": [
{
"identifier": "user@example.com",
@@ -1109,11 +1135,11 @@ data: {"type":"message_stop"}
| Field | Required | Notes |
| --- | --- | --- |
| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read env |
| `project_id` | ❌ | If empty, read `VERCEL_PROJECT_ID` |
| `team_id` | ❌ | If empty, read `VERCEL_TEAM_ID` |
| `vercel_token` | ❌ | If empty or `__USE_PRECONFIG__`, read env, then the saved config |
| `project_id` | ❌ | If empty, read `VERCEL_PROJECT_ID`, then the saved config |
| `team_id` | ❌ | If empty, read `VERCEL_TEAM_ID`, then the saved config |
| `auto_validate` | ❌ | Default `true` |
| `save_credentials` | ❌ | Default `true` |
| `save_credentials` | ❌ | Default `true`; saves explicitly supplied Vercel credentials for the next sync |
**Success response**:

View File

@@ -1 +1 @@
4.4.1
4.4.5

View File

@@ -48,6 +48,9 @@ func (c Config) MarshalJSON() ([]byte, error) {
if c.ThinkingInjection.Enabled != nil || strings.TrimSpace(c.ThinkingInjection.Prompt) != "" {
m["thinking_injection"] = c.ThinkingInjection
}
if strings.TrimSpace(c.Vercel.Token) != "" || strings.TrimSpace(c.Vercel.ProjectID) != "" || strings.TrimSpace(c.Vercel.TeamID) != "" {
m["vercel"] = NormalizeVercelConfig(c.Vercel)
}
if c.VercelSyncHash != "" {
m["_vercel_sync_hash"] = c.VercelSyncHash
}
@@ -125,6 +128,10 @@ func (c *Config) UnmarshalJSON(b []byte) error {
if err := json.Unmarshal(v, &c.ThinkingInjection); err != nil {
return fmt.Errorf("invalid field %q: %w", k, err)
}
case "vercel":
if err := json.Unmarshal(v, &c.Vercel); err != nil {
return fmt.Errorf("invalid field %q: %w", k, err)
}
case "_vercel_sync_hash":
if err := json.Unmarshal(v, &c.VercelSyncHash); err != nil {
return fmt.Errorf("invalid field %q: %w", k, err)
@@ -164,6 +171,7 @@ func (c Config) Clone() Config {
Enabled: cloneBoolPtr(c.ThinkingInjection.Enabled),
Prompt: c.ThinkingInjection.Prompt,
},
Vercel: c.Vercel,
VercelSyncHash: c.VercelSyncHash,
VercelSyncTime: c.VercelSyncTime,
AdditionalFields: map[string]any{},

View File

@@ -20,6 +20,7 @@ type Config struct {
AutoDelete AutoDeleteConfig `json:"auto_delete"`
CurrentInputFile CurrentInputFileConfig `json:"current_input_file,omitempty"`
ThinkingInjection ThinkingInjectionConfig `json:"thinking_injection,omitempty"`
Vercel VercelConfig `json:"vercel,omitempty"`
VercelSyncHash string `json:"_vercel_sync_hash,omitempty"`
VercelSyncTime int64 `json:"_vercel_sync_time,omitempty"`
AdditionalFields map[string]any `json:"-"`
@@ -99,6 +100,7 @@ func (c *Config) NormalizeCredentials() {
c.Accounts[i].Remark = strings.TrimSpace(c.Accounts[i].Remark)
}
c.Vercel = NormalizeVercelConfig(c.Vercel)
c.normalizeModelAliases()
}
@@ -175,3 +177,24 @@ type ThinkingInjectionConfig struct {
Enabled *bool `json:"enabled,omitempty"`
Prompt string `json:"prompt,omitempty"`
}
type VercelConfig struct {
Token string `json:"token,omitempty"`
ProjectID string `json:"project_id,omitempty"`
TeamID string `json:"team_id,omitempty"`
}
func NormalizeVercelConfig(v VercelConfig) VercelConfig {
return VercelConfig{
Token: strings.TrimSpace(v.Token),
ProjectID: strings.TrimSpace(v.ProjectID),
TeamID: strings.TrimSpace(v.TeamID),
}
}
func (c *Config) ClearVercelCredentials() {
if c == nil {
return
}
c.Vercel = VercelConfig{}
}

View File

@@ -173,6 +173,11 @@ func TestConfigJSONRoundtrip(t *testing.T) {
Runtime: RuntimeConfig{
TokenRefreshIntervalHours: 12,
},
Vercel: VercelConfig{
Token: " vercel-token ",
ProjectID: " prj_123 ",
TeamID: " team_123 ",
},
VercelSyncHash: "hash123",
VercelSyncTime: 1234567890,
AdditionalFields: map[string]any{
@@ -205,6 +210,9 @@ func TestConfigJSONRoundtrip(t *testing.T) {
if decoded.AutoDelete.Mode != "single" {
t.Fatalf("unexpected auto delete mode: %#v", decoded.AutoDelete.Mode)
}
if decoded.Vercel.Token != "vercel-token" || decoded.Vercel.ProjectID != "prj_123" || decoded.Vercel.TeamID != "team_123" {
t.Fatalf("unexpected vercel config: %#v", decoded.Vercel)
}
if decoded.VercelSyncHash != "hash123" {
t.Fatalf("unexpected vercel sync hash: %q", decoded.VercelSyncHash)
}

View File

@@ -75,20 +75,6 @@ func TestResolveExpandedHistoricalAliases(t *testing.T) {
}
}
func TestResolveModelHeuristicReasoner(t *testing.T) {
got, ok := ResolveModel(nil, "o3-super")
if !ok || got != "deepseek-v4-pro" {
t.Fatalf("expected heuristic reasoner, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelHeuristicReasonerNoThinking(t *testing.T) {
got, ok := ResolveModel(nil, "o3-super-nothinking")
if !ok || got != "deepseek-v4-pro-nothinking" {
t.Fatalf("expected heuristic reasoner nothinking, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelUnknown(t *testing.T) {
_, ok := ResolveModel(nil, "totally-custom-model")
if ok {
@@ -96,6 +82,13 @@ func TestResolveModelUnknown(t *testing.T) {
}
}
func TestResolveModelUnknownKnownFamilyName(t *testing.T) {
_, ok := ResolveModel(nil, "gpt-5.5-pro-search")
if ok {
t.Fatal("expected unknown known-family model to fail resolve without alias")
}
}
func TestResolveModelRejectsLegacyDeepSeekIDs(t *testing.T) {
legacyModels := []string{
"deepseek-chat",
@@ -151,13 +144,6 @@ func TestResolveModelCustomAliasToVision(t *testing.T) {
}
}
func TestResolveModelHeuristicVisionIgnoresSearchSuffix(t *testing.T) {
got, ok := ResolveModel(nil, "gemini-vision-search")
if !ok || got != "deepseek-v4-vision" {
t.Fatalf("expected heuristic vision alias to resolve without search variant, got ok=%v model=%q", ok, got)
}
}
func TestClaudeModelsResponsePaginationFields(t *testing.T) {
resp := ClaudeModelsResponse()
if _, ok := resp["first_id"]; !ok {

View File

@@ -1,6 +1,9 @@
package config
import "strings"
import (
"strings"
"time"
)
type ModelInfo struct {
ID string `json:"id"`
@@ -9,6 +12,16 @@ type ModelInfo struct {
OwnedBy string `json:"owned_by"`
Permission []any `json:"permission,omitempty"`
}
type OllamaModelInfo struct {
Name string `json:"name"`
Model string `json:"model"`
Size int64 `json:"size"`
ModifiedAt string `json:"modified_at"`
}
type OllamaCapabilitiesModelInfo struct {
ID string `json:"id"`
Capabilities []string `json:"capabilities"`
}
type ModelAliasReader interface {
ModelAliases() map[string]string
@@ -24,8 +37,21 @@ var deepSeekBaseModels = []ModelInfo{
{ID: "deepseek-v4-vision", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
}
var DeepSeekModels = appendNoThinkingVariants(deepSeekBaseModels)
var OllamaCapabilitiesModels = []OllamaCapabilitiesModelInfo{
{ID: "deepseek-v4-flash", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-pro", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-flash-search", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-pro-search", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-vision", Capabilities: []string{"tools", "thinking", "vision"}},
{ID: "deepseek-v4-flash-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-pro-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-flash-search-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-pro-search-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-vision-nothinking", Capabilities: []string{"tools", "vision"}},
}
var DeepSeekModels = appendNoThinkingVariants(deepSeekBaseModels)
var OllamaModels = mapToOllamaModels(DeepSeekModels)
var claudeBaseModels = []ModelInfo{
// Current aliases
{ID: "claude-opus-4-6", Object: "model", Created: 1715635200, OwnedBy: "anthropic"},
@@ -214,26 +240,10 @@ func ResolveModel(store ModelAliasReader, requested string) (string, bool) {
return mapped, true
}
baseModel, noThinking := splitNoThinkingModel(model)
resolvedModel, ok := resolveCanonicalModel(aliases, baseModel)
if !ok {
return "", false
}
return withNoThinkingVariant(resolvedModel, noThinking), true
}
func isRetiredHistoricalModel(model string) bool {
switch {
case strings.HasPrefix(model, "claude-1."):
return true
case strings.HasPrefix(model, "claude-2."):
return true
case strings.HasPrefix(model, "claude-instant-"):
return true
case strings.HasPrefix(model, "gpt-3.5"):
return true
default:
return false
if mapped, ok := aliases[baseModel]; ok && IsSupportedDeepSeekModel(mapped) {
return withNoThinkingVariant(mapped, noThinking), true
}
return "", false
}
func lower(s string) string {
@@ -263,6 +273,23 @@ func OpenAIModelByID(store ModelAliasReader, id string) (ModelInfo, bool) {
return ModelInfo{}, false
}
func OllamaModelsResponse() map[string]any {
return map[string]any{"models": OllamaModels}
}
func OllamaModelByID(store ModelAliasReader, id string) (OllamaCapabilitiesModelInfo, bool) {
canonical, ok := ResolveModel(store, id)
if !ok {
return OllamaCapabilitiesModelInfo{}, false
}
for _, model := range OllamaCapabilitiesModels {
if model.ID == canonical {
return model, true
}
}
return OllamaCapabilitiesModelInfo{}, false
}
func ClaudeModelsResponse() map[string]any {
resp := map[string]any{"object": "list", "data": ClaudeModels}
if len(ClaudeModels) > 0 {
@@ -286,6 +313,23 @@ func appendNoThinkingVariants(models []ModelInfo) []ModelInfo {
}
return out
}
func mapToOllamaModels(models []ModelInfo) []OllamaModelInfo {
out := make([]OllamaModelInfo, 0, len(models))
for _, model := range models {
var modifiedAt string
if model.Created > 0 {
modifiedAt = time.Unix(model.Created, 0).Format(time.RFC3339)
}
ollamaModel := OllamaModelInfo{
Name: model.ID,
Model: model.ID,
Size: 0,
ModifiedAt: modifiedAt,
}
out = append(out, ollamaModel)
}
return out
}
func splitNoThinkingModel(model string) (string, bool) {
model = lower(strings.TrimSpace(model))
@@ -315,58 +359,3 @@ func loadModelAliases(store ModelAliasReader) map[string]string {
}
return aliases
}
func resolveCanonicalModel(aliases map[string]string, model string) (string, bool) {
model = lower(strings.TrimSpace(model))
if model == "" {
return "", false
}
if isRetiredHistoricalModel(model) {
return "", false
}
if IsSupportedDeepSeekModel(model) {
return model, true
}
if mapped, ok := aliases[model]; ok && IsSupportedDeepSeekModel(mapped) {
return mapped, true
}
if strings.HasPrefix(model, "deepseek-") {
return "", false
}
knownFamily := false
for _, prefix := range []string{
"gpt-", "o1", "o3", "claude-", "gemini-", "llama-", "qwen-", "mistral-", "command-",
} {
if strings.HasPrefix(model, prefix) {
knownFamily = true
break
}
}
if !knownFamily {
return "", false
}
useVision := strings.Contains(model, "vision")
useReasoner := strings.Contains(model, "reason") ||
strings.Contains(model, "reasoner") ||
strings.HasPrefix(model, "o1") ||
strings.HasPrefix(model, "o3") ||
strings.Contains(model, "opus") ||
strings.Contains(model, "slow") ||
strings.Contains(model, "r1")
useSearch := strings.Contains(model, "search")
switch {
case useVision:
return "deepseek-v4-vision", true
case useReasoner && useSearch:
return "deepseek-v4-pro-search", true
case useReasoner:
return "deepseek-v4-pro", true
case useSearch:
return "deepseek-v4-flash-search", true
default:
return "deepseek-v4-flash", true
}
}

View File

@@ -58,6 +58,11 @@ func RawStreamSampleRoot() string {
}
func ChatHistoryPath() string {
// On Vercel, /var/task is read-only at runtime. If no explicit path is set,
// default to /tmp/chat_history.json (the only writable directory).
if IsVercel() && strings.TrimSpace(os.Getenv("DS2API_CHAT_HISTORY_PATH")) == "" {
return "/tmp/chat_history.json"
}
return ResolvePath("DS2API_CHAT_HISTORY_PATH", "data/chat_history.json")
}

View File

@@ -15,5 +15,6 @@ type Handler struct {
var writeJSON = adminshared.WriteJSON
var intFrom = adminshared.IntFrom
var maskSecretPreview = adminshared.MaskSecretPreview
func nilIfEmpty(s string) any { return adminshared.NilIfEmpty(s) }

View File

@@ -61,9 +61,34 @@ func (h *Handler) verify(w http.ResponseWriter, r *http.Request) {
}
func (h *Handler) getVercelConfig(w http.ResponseWriter, _ *http.Request) {
saved := h.Store.Snapshot().Vercel
token, tokenSource := firstConfiguredValue(
[2]string{"env", os.Getenv("VERCEL_TOKEN")},
[2]string{"config", saved.Token},
)
projectID, _ := firstConfiguredValue(
[2]string{"env", os.Getenv("VERCEL_PROJECT_ID")},
[2]string{"config", saved.ProjectID},
)
teamID, _ := firstConfiguredValue(
[2]string{"env", os.Getenv("VERCEL_TEAM_ID")},
[2]string{"config", saved.TeamID},
)
writeJSON(w, http.StatusOK, map[string]any{
"has_token": strings.TrimSpace(os.Getenv("VERCEL_TOKEN")) != "",
"project_id": strings.TrimSpace(os.Getenv("VERCEL_PROJECT_ID")),
"team_id": nilIfEmpty(strings.TrimSpace(os.Getenv("VERCEL_TEAM_ID"))),
"has_token": token != "",
"token_preview": maskSecretPreview(token),
"token_source": nilIfEmpty(tokenSource),
"project_id": projectID,
"team_id": nilIfEmpty(teamID),
})
}
func firstConfiguredValue(values ...[2]string) (string, string) {
for _, pair := range values {
value := strings.TrimSpace(pair[1])
if value != "" {
return value, strings.TrimSpace(pair[0])
}
}
return "", ""
}

View File

@@ -0,0 +1,38 @@
package auth
import (
"encoding/json"
"net/http"
"net/http/httptest"
"testing"
"ds2api/internal/config"
)
func TestGetVercelConfigFallsBackToSavedConfig(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"],"vercel":{"token":"saved-token","project_id":"saved-project","team_id":"saved-team"}}`)
t.Setenv("VERCEL_TOKEN", "")
t.Setenv("VERCEL_PROJECT_ID", "")
t.Setenv("VERCEL_TEAM_ID", "")
h := &Handler{Store: config.LoadStore()}
rec := httptest.NewRecorder()
h.getVercelConfig(rec, httptest.NewRequest(http.MethodGet, "/admin/vercel/config", nil))
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d: %s", rec.Code, rec.Body.String())
}
var payload map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &payload); err != nil {
t.Fatalf("decode response: %v", err)
}
if payload["has_token"] != true {
t.Fatalf("expected saved token to be detected: %#v", payload)
}
if payload["token_source"] != "config" || payload["project_id"] != "saved-project" || payload["team_id"] != "saved-team" {
t.Fatalf("unexpected preconfig payload: %#v", payload)
}
if payload["token_preview"] == "saved-token" {
t.Fatal("token preview leaked the full token")
}
}

View File

@@ -94,6 +94,10 @@ func (h *Handler) configImport(w http.ResponseWriter, r *http.Request) {
if strings.TrimSpace(incoming.Embeddings.Provider) != "" {
next.Embeddings.Provider = incoming.Embeddings.Provider
}
incomingVercel := config.NormalizeVercelConfig(incoming.Vercel)
if strings.TrimSpace(incomingVercel.Token) != "" || strings.TrimSpace(incomingVercel.ProjectID) != "" || strings.TrimSpace(incomingVercel.TeamID) != "" {
next.Vercel = incomingVercel
}
if strings.TrimSpace(incoming.Admin.PasswordHash) != "" {
next.Admin.PasswordHash = incoming.Admin.PasswordHash
}

View File

@@ -19,6 +19,12 @@ func (h *Handler) getConfig(w http.ResponseWriter, _ *http.Request) {
"env_writeback_enabled": h.Store.IsEnvWritebackEnabled(),
"config_path": h.Store.ConfigPath(),
"model_aliases": snap.ModelAliases,
"vercel": map[string]any{
"has_token": strings.TrimSpace(snap.Vercel.Token) != "",
"token_preview": maskSecretPreview(snap.Vercel.Token),
"project_id": snap.Vercel.ProjectID,
"team_id": snap.Vercel.TeamID,
},
}
accounts := make([]map[string]any, 0, len(snap.Accounts))
for _, acc := range snap.Accounts {

View File

@@ -78,6 +78,7 @@ func ComputeSyncHash(store ConfigStore) string {
}
snap := store.Snapshot().Clone()
snap.ClearAccountTokens()
snap.ClearVercelCredentials()
snap.VercelSyncHash = ""
snap.VercelSyncTime = 0
b, _ := json.Marshal(snap)
@@ -93,6 +94,7 @@ func SyncHashForJSON(s string) string {
cfg.VercelSyncHash = ""
cfg.VercelSyncTime = 0
cfg.ClearAccountTokens()
cfg.ClearVercelCredentials()
b, err := json.Marshal(cfg)
if err != nil {
return ""

View File

@@ -23,7 +23,7 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": "invalid json"})
return
}
opts, err := parseVercelSyncOptions(req)
opts, err := parseVercelSyncOptions(req, h.Store.Snapshot().Vercel)
if err != nil {
writeJSON(w, http.StatusBadRequest, map[string]any{"detail": err.Error()})
return
@@ -50,6 +50,12 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
return
}
savedCreds := h.saveVercelProjectCredentials(r.Context(), client, opts, params, headers, envs)
credentialsWarning := ""
if saved, err := h.saveLocalVercelCredentials(opts); err == nil && saved {
savedCreds = append(savedCreds, "config.vercel")
} else if err != nil {
credentialsWarning = "failed to save Vercel credentials to local config: " + err.Error()
}
manual, deployURL := triggerVercelDeployment(r.Context(), client, opts.ProjectID, params, headers)
_ = h.Store.SetVercelSync(syncHashForJSON(cfgJSON), time.Now().Unix())
result := map[string]any{"success": true, "validated_accounts": validated}
@@ -66,6 +72,9 @@ func (h *Handler) syncVercel(w http.ResponseWriter, r *http.Request) {
if len(savedCreds) > 0 {
result["saved_credentials"] = savedCreds
}
if credentialsWarning != "" {
result["credentials_warning"] = credentialsWarning
}
writeJSON(w, http.StatusOK, result)
}
@@ -78,7 +87,7 @@ type vercelSyncOptions struct {
UsePreconfig bool
}
func parseVercelSyncOptions(req map[string]any) (vercelSyncOptions, error) {
func parseVercelSyncOptions(req map[string]any, saved config.VercelConfig) (vercelSyncOptions, error) {
vercelToken, _ := req["vercel_token"].(string)
projectID, _ := req["project_id"].(string)
teamID, _ := req["team_id"].(string)
@@ -92,13 +101,13 @@ func parseVercelSyncOptions(req map[string]any) (vercelSyncOptions, error) {
}
usePreconfig := vercelToken == "__USE_PRECONFIG__" || strings.TrimSpace(vercelToken) == ""
if usePreconfig {
vercelToken = strings.TrimSpace(os.Getenv("VERCEL_TOKEN"))
vercelToken = firstNonEmpty(os.Getenv("VERCEL_TOKEN"), saved.Token)
}
if strings.TrimSpace(projectID) == "" {
projectID = strings.TrimSpace(os.Getenv("VERCEL_PROJECT_ID"))
projectID = firstNonEmpty(os.Getenv("VERCEL_PROJECT_ID"), saved.ProjectID)
}
if strings.TrimSpace(teamID) == "" {
teamID = strings.TrimSpace(os.Getenv("VERCEL_TEAM_ID"))
teamID = firstNonEmpty(os.Getenv("VERCEL_TEAM_ID"), saved.TeamID)
}
vercelToken = strings.TrimSpace(vercelToken)
projectID = strings.TrimSpace(projectID)
@@ -116,6 +125,15 @@ func parseVercelSyncOptions(req map[string]any) (vercelSyncOptions, error) {
}, nil
}
func firstNonEmpty(values ...string) string {
for _, value := range values {
if trimmed := strings.TrimSpace(value); trimmed != "" {
return trimmed
}
}
return ""
}
func buildVercelParams(teamID string) url.Values {
params := url.Values{}
if strings.TrimSpace(teamID) != "" {
@@ -178,6 +196,25 @@ func (h *Handler) saveVercelProjectCredentials(ctx context.Context, client *http
return saved
}
func (h *Handler) saveLocalVercelCredentials(opts vercelSyncOptions) (bool, error) {
if !opts.SaveCreds {
return false, nil
}
err := h.Store.Update(func(c *config.Config) error {
token := opts.VercelToken
if opts.UsePreconfig {
token = c.Vercel.Token
}
c.Vercel = config.NormalizeVercelConfig(config.VercelConfig{
Token: token,
ProjectID: opts.ProjectID,
TeamID: opts.TeamID,
})
return nil
})
return err == nil, err
}
func triggerVercelDeployment(ctx context.Context, client *http.Client, projectID string, params url.Values, headers map[string]string) (bool, string) {
projectResp, status, _ := vercelRequest(ctx, client, http.MethodGet, "https://api.vercel.com/v9/projects/"+projectID, params, headers, nil)
if status != http.StatusOK {
@@ -243,7 +280,7 @@ func (h *Handler) vercelStatus(w http.ResponseWriter, r *http.Request) {
func (h *Handler) exportSyncConfig(req map[string]any) (string, string, error) {
override, ok := req["config_override"]
if !ok || override == nil {
return h.Store.ExportJSONAndBase64()
return encodeVercelSyncConfig(h.Store.Snapshot())
}
raw, err := json.Marshal(override)
if err != nil {
@@ -253,8 +290,13 @@ func (h *Handler) exportSyncConfig(req map[string]any) (string, string, error) {
if err := json.Unmarshal(raw, &cfg); err != nil {
return "", "", err
}
return encodeVercelSyncConfig(cfg)
}
func encodeVercelSyncConfig(cfg config.Config) (string, string, error) {
cfg.DropInvalidAccounts()
cfg.ClearAccountTokens()
cfg.ClearVercelCredentials()
cfg.VercelSyncHash = ""
cfg.VercelSyncTime = 0
b, err := json.Marshal(cfg)
@@ -272,6 +314,7 @@ func syncHashForJSON(s string) string {
cfg.VercelSyncHash = ""
cfg.VercelSyncTime = 0
cfg.ClearAccountTokens()
cfg.ClearVercelCredentials()
b, err := json.Marshal(cfg)
if err != nil {
return ""

View File

@@ -0,0 +1,100 @@
package vercel
import (
"encoding/json"
"strings"
"testing"
"ds2api/internal/config"
)
func TestParseVercelSyncOptionsFallsBackToSavedConfig(t *testing.T) {
t.Setenv("VERCEL_TOKEN", "")
t.Setenv("VERCEL_PROJECT_ID", "")
t.Setenv("VERCEL_TEAM_ID", "")
opts, err := parseVercelSyncOptions(map[string]any{
"vercel_token": "__USE_PRECONFIG__",
}, config.VercelConfig{
Token: " saved-token ",
ProjectID: " saved-project ",
TeamID: " saved-team ",
})
if err != nil {
t.Fatalf("parse options error: %v", err)
}
if opts.VercelToken != "saved-token" || opts.ProjectID != "saved-project" || opts.TeamID != "saved-team" {
t.Fatalf("unexpected options: %#v", opts)
}
if !opts.UsePreconfig {
t.Fatal("expected preconfig mode")
}
}
func TestSaveLocalVercelCredentialsStoresExplicitInput(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"]}`)
store := config.LoadStore()
h := &Handler{Store: store}
saved, err := h.saveLocalVercelCredentials(vercelSyncOptions{
VercelToken: " token ",
ProjectID: " project ",
TeamID: " team ",
SaveCreds: true,
})
if err != nil {
t.Fatalf("save local credentials error: %v", err)
}
if !saved {
t.Fatal("expected credentials to be saved")
}
got := store.Snapshot().Vercel
if got.Token != "token" || got.ProjectID != "project" || got.TeamID != "team" {
t.Fatalf("unexpected saved credentials: %#v", got)
}
}
func TestSaveLocalVercelCredentialsPreservesPreconfiguredTokenAndUpdatesProject(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"],"vercel":{"token":"saved-token","project_id":"old-project","team_id":"old-team"}}`)
store := config.LoadStore()
h := &Handler{Store: store}
saved, err := h.saveLocalVercelCredentials(vercelSyncOptions{
VercelToken: "resolved-token",
ProjectID: "new-project",
TeamID: "new-team",
SaveCreds: true,
UsePreconfig: true,
})
if err != nil {
t.Fatalf("save local credentials error: %v", err)
}
if !saved {
t.Fatal("expected project/team updates to be saved")
}
got := store.Snapshot().Vercel
if got.Token != "saved-token" || got.ProjectID != "new-project" || got.TeamID != "new-team" {
t.Fatalf("unexpected saved credentials: %#v", got)
}
}
func TestExportSyncConfigStripsSavedVercelCredentials(t *testing.T) {
t.Setenv("DS2API_CONFIG_JSON", `{"keys":["k1"],"vercel":{"token":"secret-token","project_id":"project","team_id":"team"}}`)
store := config.LoadStore()
h := &Handler{Store: store}
jsonStr, _, err := h.exportSyncConfig(map[string]any{})
if err != nil {
t.Fatalf("export sync config error: %v", err)
}
if strings.Contains(jsonStr, "secret-token") || strings.Contains(jsonStr, `"vercel"`) {
t.Fatalf("expected sync export to strip Vercel credentials, got %s", jsonStr)
}
var exported config.Config
if err := json.Unmarshal([]byte(jsonStr), &exported); err != nil {
t.Fatalf("exported config is invalid JSON: %v", err)
}
if len(exported.Keys) != 1 || exported.Keys[0] != "k1" {
t.Fatalf("unexpected exported config: %#v", exported)
}
}
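The export and hash paths above zero `VercelSyncHash`/`VercelSyncTime` and strip credentials before marshaling, so the stored hash depends only on the content actually being exported. A minimal sketch of that idea — the `cfg` type and its field names here are illustrative, not the project's actual config schema:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"encoding/json"
	"fmt"
)

// cfg stands in for the real config type; the field names are illustrative.
// Volatile sync metadata is zeroed before hashing so the hash depends only
// on the content being exported.
type cfg struct {
	Keys     []string `json:"keys"`
	SyncHash string   `json:"sync_hash,omitempty"`
	SyncTime int64    `json:"sync_time,omitempty"`
}

// syncHash clears the volatile fields, then hashes the canonical JSON.
func syncHash(c cfg) string {
	c.SyncHash = ""
	c.SyncTime = 0
	b, err := json.Marshal(c)
	if err != nil {
		return ""
	}
	sum := sha256.Sum256(b)
	return hex.EncodeToString(sum[:])
}

func main() {
	a := cfg{Keys: []string{"k1"}, SyncHash: "old", SyncTime: 123}
	b := cfg{Keys: []string{"k1"}}
	fmt.Println(syncHash(a) == syncHash(b)) // true: volatile fields never reach the hash
}
```

Because the fields use `omitempty` and are zeroed first, a config that was previously synced hashes identically to a fresh one with the same content, which is what makes change detection reliable.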

View File

@@ -0,0 +1,58 @@
package ollama
import (
"ds2api/internal/config"
"ds2api/internal/util"
"encoding/json"
"github.com/go-chi/chi/v5"
"log/slog"
"net/http"
)
var WriteJSON = util.WriteJSON
type ConfigReader interface {
ModelAliases() map[string]string
}
type Handler struct {
Store ConfigReader
}
type OllamaModelRequest struct {
Model string `json:"model"`
}
func RegisterRoutes(r chi.Router, h *Handler) {
r.Get("/api/version", h.GetVersion)
r.Get("/api/tags", h.ListOllamaModels)
r.Post("/api/show", h.GetOllamaModel)
}
func (h *Handler) GetVersion(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(http.StatusOK)
_, _ = w.Write([]byte(`{"version":"0.23.1"}`))
}
func (h *Handler) ListOllamaModels(w http.ResponseWriter, r *http.Request) {
WriteJSON(w, http.StatusOK, config.OllamaModelsResponse())
}
func (h *Handler) GetOllamaModel(w http.ResponseWriter, r *http.Request) {
var payload OllamaModelRequest
if err := json.NewDecoder(r.Body).Decode(&payload); err != nil {
http.Error(w, "Invalid JSON body: "+err.Error(), http.StatusBadRequest)
return
}
defer func() {
if err := r.Body.Close(); err != nil {
slog.Warn("[ollama] failed to close request body", "error", err)
}
}()
modelID := payload.Model
model, ok := config.OllamaModelByID(h.Store, modelID)
if !ok {
http.Error(w, "Model not found.", http.StatusNotFound)
return
}
WriteJSON(w, http.StatusOK, model)
}

View File

@@ -0,0 +1,127 @@
package ollama
import (
"encoding/json"
"github.com/go-chi/chi/v5"
"net/http"
"net/http/httptest"
"strings"
"testing"
)
type ollamaTestSurface struct {
Store ConfigReader
handler *Handler
}
func (h *ollamaTestSurface) apiHandler() *Handler {
if h.handler == nil {
h.handler = &Handler{Store: h.Store}
}
return h.handler
}
func registerOllamaTestRoutes(r chi.Router, h *ollamaTestSurface) {
r.Get("/api/version", h.apiHandler().GetVersion)
r.Get("/api/tags", h.apiHandler().ListOllamaModels)
r.Post("/api/show", h.apiHandler().GetOllamaModel)
}
func TestGetOllamaVersionRoute(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
req := httptest.NewRequest(http.MethodGet, "/api/version", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
}
func TestGetOllamaModelsRoute(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
req := httptest.NewRequest(http.MethodGet, "/api/tags", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
}
func TestGetOllamaModelRoute(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
t.Run("direct", func(t *testing.T) {
body := `{"model":"deepseek-v4-flash"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
var payload map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &payload); err != nil {
t.Fatalf("expected valid json body, got err=%v body=%s", err, rec.Body.String())
}
if _, ok := payload["id"]; !ok {
t.Fatalf("expected response to expose lowercase id field, body=%s", rec.Body.String())
}
if _, ok := payload["ID"]; ok {
t.Fatalf("expected response not to expose uppercase ID field, body=%s", rec.Body.String())
}
})
t.Run("direct_nothinking", func(t *testing.T) {
body := `{"model":"deepseek-v4-flash-nothinking"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("direct_expert", func(t *testing.T) {
body := `{"model":"deepseek-v4-pro"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("direct_vision", func(t *testing.T) {
body := `{"model":"deepseek-v4-vision"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
}
func TestGetOllamaModelRouteNotFound(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
body := `{"model":"not-exists"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusNotFound {
t.Fatalf("expected 404, got %d body=%s", rec.Code, rec.Body.String())
}
}

View File

@@ -127,13 +127,7 @@ func (s *chatStreamRuntime) sendKeepAlive() {
return
}
_, _ = s.w.Write([]byte(": keep-alive\n\n"))
s.sendChunk(openaifmt.BuildChatStreamChunk(
s.completionID,
s.created,
s.model,
[]map[string]any{},
nil,
))
_ = s.rc.Flush()
}
func (s *chatStreamRuntime) sendChunk(v any) {

View File

@@ -10,7 +10,7 @@ import (
"ds2api/internal/promptcompat"
)
func TestChatStreamKeepAliveEmitsEmptyChoiceDataFrame(t *testing.T) {
func TestChatStreamKeepAliveUsesCommentOnly(t *testing.T) {
rec := httptest.NewRecorder()
runtime := newChatStreamRuntime(
rec,
@@ -40,18 +40,8 @@ func TestChatStreamKeepAliveEmitsEmptyChoiceDataFrame(t *testing.T) {
if done {
t.Fatalf("keep-alive must not emit [DONE], body=%q", body)
}
if len(frames) != 1 {
t.Fatalf("expected one data frame, got %d body=%q", len(frames), body)
}
if got := asString(frames[0]["id"]); got != "chatcmpl-test" {
t.Fatalf("expected completion id to be preserved, got %q", got)
}
if got := asString(frames[0]["object"]); got != "chat.completion.chunk" {
t.Fatalf("expected chat chunk object, got %q", got)
}
choices, _ := frames[0]["choices"].([]any)
if len(choices) != 0 {
t.Fatalf("expected empty choices heartbeat, got %#v", choices)
if len(frames) != 0 {
t.Fatalf("keep-alive must not emit JSON data frames, got %#v body=%q", frames, body)
}
}
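The keep-alive change this test pins down relies on an SSE detail: lines beginning with `:` are comments that EventSource-style parsers must discard, so a `: keep-alive\n\n` frame keeps the connection warm without producing a JSON chunk clients have to handle. A toy parser illustrating the behavior (not the project's actual client code):

```go
package main

import (
	"fmt"
	"strings"
)

// parseSSE is a toy reader for a Server-Sent Events stream: per the SSE
// spec, lines starting with ':' are comments and are discarded, so a
// ": keep-alive" frame never surfaces as a data event.
func parseSSE(stream string) []string {
	var events []string
	for _, line := range strings.Split(stream, "\n") {
		if strings.HasPrefix(line, ":") {
			continue // comment line: keeps the connection warm, carries no data
		}
		if strings.HasPrefix(line, "data: ") {
			events = append(events, strings.TrimPrefix(line, "data: "))
		}
	}
	return events
}

func main() {
	stream := ": keep-alive\n\ndata: {\"id\":\"chatcmpl-test\"}\n\n"
	fmt.Println(len(parseSSE(stream))) // 1: only the real chunk survives
}
```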

View File

@@ -22,6 +22,7 @@ import (
"ds2api/internal/httpapi/admin"
"ds2api/internal/httpapi/claude"
"ds2api/internal/httpapi/gemini"
"ds2api/internal/httpapi/ollama"
"ds2api/internal/httpapi/openai/chat"
"ds2api/internal/httpapi/openai/embeddings"
"ds2api/internal/httpapi/openai/files"
@@ -68,6 +69,7 @@ func NewApp() (*App, error) {
claudeHandler := &claude.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
geminiHandler := &gemini.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
adminHandler := &admin.Handler{Store: store, Pool: pool, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
ollamaHandler := &ollama.Handler{Store: store}
webuiHandler := webui.NewHandler()
r := chi.NewRouter()
@@ -112,6 +114,7 @@ func NewApp() (*App, error) {
r.Post("/embeddings", embeddingsHandler.Embeddings)
claude.RegisterRoutes(r, claudeHandler)
gemini.RegisterRoutes(r, geminiHandler)
ollama.RegisterRoutes(r, ollamaHandler)
r.Route("/admin", func(ar chi.Router) {
admin.RegisterRoutes(ar, adminHandler)
})

View File

@@ -17,11 +17,10 @@ func rewriteDSMLToolMarkupOutsideIgnored(text string) string {
if text == "" {
return ""
}
lower := strings.ToLower(text)
var b strings.Builder
b.Grow(len(text))
for i := 0; i < len(text); {
next, advanced, blocked := skipXMLIgnoredSection(text, lower, i)
next, advanced, blocked := skipXMLIgnoredSection(text, i)
if blocked {
b.WriteString(text[i:])
break

View File

@@ -144,7 +144,7 @@ func findXMLStartTagOutsideCDATA(text, tag string, from int) (start, bodyStart i
lower := strings.ToLower(text)
target := "<" + strings.ToLower(tag)
for i := maxInt(from, 0); i < len(text); {
next, advanced, blocked := skipXMLIgnoredSection(text, lower, i)
next, advanced, blocked := skipXMLIgnoredSection(text, i)
if blocked {
return -1, -1, "", false
}
@@ -170,7 +170,7 @@ func findMatchingXMLEndTagOutsideCDATA(text, tag string, from int) (closeStart,
closeTarget := "</" + strings.ToLower(tag)
depth := 1
for i := maxInt(from, 0); i < len(text); {
next, advanced, blocked := skipXMLIgnoredSection(text, lower, i)
next, advanced, blocked := skipXMLIgnoredSection(text, i)
if blocked {
return -1, -1, false
}
@@ -206,16 +206,19 @@ func findMatchingXMLEndTagOutsideCDATA(text, tag string, from int) (closeStart,
return -1, -1, false
}
func skipXMLIgnoredSection(text, lower string, i int) (next int, advanced bool, blocked bool) {
func skipXMLIgnoredSection(text string, i int) (next int, advanced bool, blocked bool) {
if i < 0 || i >= len(text) {
return i, false, false
}
switch {
case strings.HasPrefix(lower[i:], "<![cdata["):
end := findToolCDATAEnd(text, lower, i+len("<![cdata["))
case hasASCIIPrefixFoldAt(text, i, "<![cdata["):
end := findToolCDATAEnd(text, i+len("<![cdata["))
if end < 0 {
return 0, false, true
}
return end + len("]]>"), true, false
case strings.HasPrefix(lower[i:], "<!--"):
end := strings.Index(lower[i+len("<!--"):], "-->")
case strings.HasPrefix(text[i:], "<!--"):
end := strings.Index(text[i+len("<!--"):], "-->")
if end < 0 {
return 0, false, true
}
@@ -225,14 +228,33 @@ func skipXMLIgnoredSection(text, lower string, i int) (next int, advanced bool,
}
}
func findToolCDATAEnd(text, lower string, from int) int {
if from < 0 || from > len(text) {
func hasASCIIPrefixFoldAt(text string, start int, prefix string) bool {
if start < 0 || len(text)-start < len(prefix) {
return false
}
for j := 0; j < len(prefix); j++ {
if asciiLower(text[start+j]) != asciiLower(prefix[j]) {
return false
}
}
return true
}
func asciiLower(b byte) byte {
if b >= 'A' && b <= 'Z' {
return b + ('a' - 'A')
}
return b
}
func findToolCDATAEnd(text string, from int) int {
if from < 0 || from >= len(text) {
return -1
}
const closeMarker = "]]>"
firstNonFenceEnd := -1
for searchFrom := from; searchFrom < len(text); {
rel := strings.Index(lower[searchFrom:], closeMarker)
rel := strings.Index(text[searchFrom:], closeMarker)
if rel < 0 {
break
}
@@ -241,27 +263,28 @@ func findToolCDATAEnd(text, lower string, from int) int {
if cdataOffsetIsInsideMarkdownFence(text[from:end]) {
continue
}
if cdataEndLooksStructural(text, searchFrom) {
return end
}
if firstNonFenceEnd < 0 {
firstNonFenceEnd = end
}
if cdataEndLooksStructural(lower, searchFrom) {
return end
}
}
return firstNonFenceEnd
}
func cdataEndLooksStructural(lower string, after int) bool {
for after < len(lower) {
switch lower[after] {
case ' ', '\t', '\r', '\n':
func cdataEndLooksStructural(text string, after int) bool {
for after < len(text) {
switch {
case text[after] == ' ' || text[after] == '\t' || text[after] == '\r' || text[after] == '\n':
after++
continue
case after+1 < len(text) && text[after] == '<' && text[after+1] == '/':
return true
default:
return false
}
break
}
return strings.HasPrefix(lower[after:], "</")
return false
}
func cdataOffsetIsInsideMarkdownFence(fragment string) bool {

View File

@@ -28,9 +28,8 @@ type ToolMarkupTag struct {
}
func ContainsToolMarkupSyntaxOutsideIgnored(text string) (hasDSML, hasCanonical bool) {
lower := strings.ToLower(text)
for i := 0; i < len(text); {
next, advanced, blocked := skipXMLIgnoredSection(text, lower, i)
next, advanced, blocked := skipXMLIgnoredSection(text, i)
if blocked {
return hasDSML, hasCanonical
}
@@ -56,9 +55,8 @@ func ContainsToolMarkupSyntaxOutsideIgnored(text string) (hasDSML, hasCanonical
}
func ContainsToolCallWrapperSyntaxOutsideIgnored(text string) (hasDSML, hasCanonical bool) {
lower := strings.ToLower(text)
for i := 0; i < len(text); {
next, advanced, blocked := skipXMLIgnoredSection(text, lower, i)
next, advanced, blocked := skipXMLIgnoredSection(text, i)
if blocked {
return hasDSML, hasCanonical
}
@@ -88,9 +86,8 @@ func ContainsToolCallWrapperSyntaxOutsideIgnored(text string) (hasDSML, hasCanon
}
func FindToolMarkupTagOutsideIgnored(text string, start int) (ToolMarkupTag, bool) {
lower := strings.ToLower(text)
for i := maxInt(start, 0); i < len(text); {
next, advanced, blocked := skipXMLIgnoredSection(text, lower, i)
next, advanced, blocked := skipXMLIgnoredSection(text, i)
if blocked {
return ToolMarkupTag{}, false
}
@@ -107,7 +104,7 @@ func FindToolMarkupTagOutsideIgnored(text string, start int) (ToolMarkupTag, boo
}
func FindMatchingToolMarkupClose(text string, open ToolMarkupTag) (ToolMarkupTag, bool) {
if text == "" || open.Name == "" || open.Closing {
if text == "" || open.Name == "" || open.Closing || open.End >= len(text) {
return ToolMarkupTag{}, false
}
depth := 1

View File

@@ -892,3 +892,139 @@ func TestParseToolCallsSkipsProseMentionOfSameWrapperVariant(t *testing.T) {
t.Fatalf("expected command to parse, got %q", got)
}
}
func TestTurkishILowercaseMapping(t *testing.T) {
tests := []struct {
name string
text string
start int
wantOk bool
wantName string
}{
{"turkish_i_at_name_start", "İ<tool>", 0, false, ""},
{"turkish_i_at_name_end", "<toolİ>", 0, false, ""},
{"turkish_i_before_tag", "İ<tool>", 0, false, ""},
{"normal_tool_calls", "<tool_calls>", 0, true, "tool_calls"},
{"normal_invoke", "<invoke name=\"test\">", 0, true, "invoke"},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, ok := FindToolMarkupTagOutsideIgnored(tt.text, tt.start)
if ok != tt.wantOk {
t.Errorf("FindToolMarkupTagOutsideIgnored(%q, %d) ok = %v, want %v", tt.text, tt.start, ok, tt.wantOk)
return
}
if ok && got.Name != tt.wantName {
t.Errorf("FindToolMarkupTagOutsideIgnored(%q, %d) name = %q, want %q", tt.text, tt.start, got.Name, tt.wantName)
}
})
}
}
func TestSkipXMLIgnoredSectionBoundaryConditions(t *testing.T) {
text := "hello"
tests := []struct {
name string
i int
wantNext int
wantAdv bool
wantBlk bool
}{
{"valid_index", 2, 2, false, false},
{"at_end_equal_len", 5, 5, false, false},
{"beyond_end", 6, 6, false, false},
{"negative", -1, -1, false, false},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
next, adv, blk := skipXMLIgnoredSection(text, tt.i)
if next != tt.wantNext || adv != tt.wantAdv || blk != tt.wantBlk {
t.Errorf("skipXMLIgnoredSection(%q, %d) = (%d, %v, %v), want (%d, %v, %v)",
text, tt.i, next, adv, blk, tt.wantNext, tt.wantAdv, tt.wantBlk)
}
})
}
}
func TestSkipXMLIgnoredSectionCommentWithUnicodeKeepsByteOffset(t *testing.T) {
text := "<!-- İ -->x<tool_calls>"
next, adv, blk := skipXMLIgnoredSection(text, 0)
if blk || !adv {
t.Fatalf("skipXMLIgnoredSection() = (%d, %v, %v), want advanced unblocked comment", next, adv, blk)
}
if want := len("<!-- İ -->"); next != want {
t.Fatalf("skipXMLIgnoredSection() next = %d, want %d", next, want)
}
}
func TestSkipXMLIgnoredSectionMatchesCDATAWithoutAllocatingTail(t *testing.T) {
text := "<![cDaTa[<tool_calls>]]><tool_calls>"
next, adv, blk := skipXMLIgnoredSection(text, 0)
if blk || !adv {
t.Fatalf("skipXMLIgnoredSection() = (%d, %v, %v), want advanced unblocked CDATA", next, adv, blk)
}
if want := len("<![cDaTa[<tool_calls>]]>"); next != want {
t.Fatalf("skipXMLIgnoredSection() next = %d, want %d", next, want)
}
tag, ok := FindToolMarkupTagOutsideIgnored(text, 0)
if !ok {
t.Fatal("expected tool tag after skipped CDATA")
}
if tag.Start != next {
t.Fatalf("FindToolMarkupTagOutsideIgnored() start = %d, want %d", tag.Start, next)
}
}
func TestFindToolCDATAEndBoundaryConditions(t *testing.T) {
text := "<![CDATA[hello]]>"
tests := []struct {
name string
from int
wantResult int
}{
{"valid", 12, 14},
{"at_end", 17, -1},
{"beyond_end", 18, -1},
{"negative", -1, -1},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got := findToolCDATAEnd(text, tt.from)
if got != tt.wantResult {
t.Errorf("findToolCDATAEnd(%q, %d) = %d, want %d",
text, tt.from, got, tt.wantResult)
}
})
}
}
func TestFindMatchingToolMarkupCloseBoundaryConditions(t *testing.T) {
tests := []struct {
name string
text string
open ToolMarkupTag
wantOk bool
}{
{"empty_text", "", ToolMarkupTag{Name: "tool_calls", End: 0}, false},
{"open_end_beyond_text", "hello", ToolMarkupTag{Name: "tool_calls", End: 100}, false},
{"open_end_equals_len", "hello", ToolMarkupTag{Name: "tool_calls", End: 5}, false},
{"valid_simple", "<tool_calls></tool_calls>", ToolMarkupTag{Name: "tool_calls", End: 11}, true},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
_, ok := FindMatchingToolMarkupClose(tt.text, tt.open)
if ok != tt.wantOk {
t.Errorf("FindMatchingToolMarkupClose(%q, %+v) ok = %v, want %v", tt.text, tt.open, ok, tt.wantOk)
}
})
}
}

View File

@@ -55,6 +55,45 @@ func (h *Handler) admin(w http.ResponseWriter, r *http.Request) {
http.Error(w, "WebUI not built. Run `cd webui && npm run build` first.", http.StatusNotFound)
}
// staticContentTypes pins the Content-Type of common WebUI assets so we do not
// rely on mime.TypeByExtension, which on Windows consults the registry and can
// return the wrong type (e.g. application/xml for .css) when third-party
// software has overwritten HKEY_CLASSES_ROOT entries. Browsers strictly enforce
// stylesheet/script MIME types and will refuse to apply a misidentified asset,
// breaking the /admin page on affected machines.
var staticContentTypes = map[string]string{
".css": "text/css; charset=utf-8",
".js": "text/javascript; charset=utf-8",
".mjs": "text/javascript; charset=utf-8",
".html": "text/html; charset=utf-8",
".htm": "text/html; charset=utf-8",
".json": "application/json; charset=utf-8",
".map": "application/json; charset=utf-8",
".svg": "image/svg+xml",
".png": "image/png",
".jpg": "image/jpeg",
".jpeg": "image/jpeg",
".gif": "image/gif",
".webp": "image/webp",
".ico": "image/x-icon",
".woff": "font/woff",
".woff2": "font/woff2",
".ttf": "font/ttf",
".otf": "font/otf",
".txt": "text/plain; charset=utf-8",
".wasm": "application/wasm",
}
// setStaticContentType pins the response Content-Type by file extension so that
// http.ServeFile does not fall back to mime.TypeByExtension (which on Windows
// reads the registry and may return an incorrect type).
func setStaticContentType(w http.ResponseWriter, fullPath string) {
ext := strings.ToLower(filepath.Ext(fullPath))
if ct, ok := staticContentTypes[ext]; ok {
w.Header().Set("Content-Type", ct)
}
}
func (h *Handler) serveFromDisk(w http.ResponseWriter, r *http.Request, staticDir string) {
path := strings.TrimPrefix(r.URL.Path, "/admin")
path = strings.TrimPrefix(path, "/")
@@ -70,6 +109,7 @@ func (h *Handler) serveFromDisk(w http.ResponseWriter, r *http.Request, staticDi
} else {
w.Header().Set("Cache-Control", "no-store, must-revalidate")
}
setStaticContentType(w, full)
http.ServeFile(w, r, full)
return
}
@@ -82,6 +122,7 @@ func (h *Handler) serveFromDisk(w http.ResponseWriter, r *http.Request, staticDi
return
}
w.Header().Set("Cache-Control", "no-store, must-revalidate")
setStaticContentType(w, index)
http.ServeFile(w, r, index)
}

View File

@@ -0,0 +1,102 @@
package webui
import (
"net/http"
"net/http/httptest"
"os"
"path/filepath"
"strings"
"testing"
)
// TestServeFromDiskPinsContentType ensures static admin assets are returned
// with an explicit, RFC-compliant Content-Type that does not depend on
// mime.TypeByExtension. On Windows mime.TypeByExtension consults the registry
// (HKEY_CLASSES_ROOT) which third-party software can corrupt — for example
// installing certain editors rewrites .css to application/xml — and Chrome
// then refuses to apply a stylesheet whose Content-Type is not text/css,
// breaking the /admin page entirely. Pinning the type by file extension makes
// the response deterministic across operating systems and machine state.
func TestServeFromDiskPinsContentType(t *testing.T) {
staticDir := t.TempDir()
assetsDir := filepath.Join(staticDir, "assets")
if err := os.MkdirAll(assetsDir, 0o755); err != nil {
t.Fatalf("mkdir assets: %v", err)
}
files := map[string]string{
"index.html": "<!doctype html><html></html>",
"assets/index.css": "body{}",
"assets/index.js": "console.log(1)",
"assets/icon.svg": `<svg xmlns="http://www.w3.org/2000/svg"></svg>`,
"assets/source.js.map": `{"version":3}`,
}
for rel, body := range files {
full := filepath.Join(staticDir, filepath.FromSlash(rel))
if err := os.MkdirAll(filepath.Dir(full), 0o755); err != nil {
t.Fatalf("mkdir %s: %v", rel, err)
}
if err := os.WriteFile(full, []byte(body), 0o644); err != nil {
t.Fatalf("write %s: %v", rel, err)
}
}
h := &Handler{StaticDir: staticDir}
cases := []struct {
urlPath string
wantPrefix string
wantCacheCtl string
}{
{"/admin/assets/index.css", "text/css", "public, max-age=31536000, immutable"},
{"/admin/assets/index.js", "text/javascript", "public, max-age=31536000, immutable"},
{"/admin/assets/icon.svg", "image/svg+xml", "public, max-age=31536000, immutable"},
{"/admin/assets/source.js.map", "application/json", "public, max-age=31536000, immutable"},
// "/admin/index.html" is intentionally omitted: http.ServeFile redirects
// requests for index.html to "./", matching Go's net/http behavior. The
// route the SPA actually lands on is "/admin/" below.
{"/admin/", "text/html", "no-store, must-revalidate"},
}
for _, tc := range cases {
t.Run(tc.urlPath, func(t *testing.T) {
req := httptest.NewRequest(http.MethodGet, tc.urlPath, nil)
rec := httptest.NewRecorder()
h.serveFromDisk(rec, req, staticDir)
if rec.Code != http.StatusOK {
t.Fatalf("status = %d, want 200", rec.Code)
}
ct := rec.Header().Get("Content-Type")
if !strings.HasPrefix(ct, tc.wantPrefix) {
t.Fatalf("Content-Type = %q, want prefix %q", ct, tc.wantPrefix)
}
if got := rec.Header().Get("Cache-Control"); got != tc.wantCacheCtl {
t.Fatalf("Cache-Control = %q, want %q", got, tc.wantCacheCtl)
}
})
}
}
// TestSetStaticContentTypeUnknownExtensionFallsThrough verifies that unknown
// extensions leave the Content-Type header unset, so http.ServeFile can apply
// its own detection (sniffing or mime.TypeByExtension) for cases the pinned
// table does not cover.
func TestSetStaticContentTypeUnknownExtensionFallsThrough(t *testing.T) {
rec := httptest.NewRecorder()
setStaticContentType(rec, "/tmp/data.unknownext")
if got := rec.Header().Get("Content-Type"); got != "" {
t.Fatalf("Content-Type = %q, want empty for unknown extension", got)
}
}
// TestSetStaticContentTypeIsCaseInsensitive guards against a regression where
// uppercase extensions (e.g. STYLE.CSS shipped from some build pipelines)
// would bypass the pinned table and fall back to the registry on Windows.
func TestSetStaticContentTypeIsCaseInsensitive(t *testing.T) {
rec := httptest.NewRecorder()
setStaticContentType(rec, "/tmp/STYLE.CSS")
if got := rec.Header().Get("Content-Type"); !strings.HasPrefix(got, "text/css") {
t.Fatalf("Content-Type = %q, want text/css prefix", got)
}
}

View File

@@ -15,6 +15,8 @@ export default function VercelSyncContainer({ onMessage, authFetch, isVercel = f
setProjectId,
teamId,
setTeamId,
saveCredentials,
setSaveCredentials,
loading,
result,
preconfig,
@@ -46,6 +48,8 @@ export default function VercelSyncContainer({ onMessage, authFetch, isVercel = f
setProjectId={setProjectId}
teamId={teamId}
setTeamId={setTeamId}
saveCredentials={saveCredentials}
setSaveCredentials={setSaveCredentials}
loading={loading}
onSync={handleSync}
/>

View File

@@ -14,6 +14,8 @@ export default function VercelSyncForm({
setProjectId,
teamId,
setTeamId,
saveCredentials,
setSaveCredentials,
loading,
onSync,
}) {
@@ -124,6 +126,19 @@ export default function VercelSyncForm({
onChange={e => setTeamId(e.target.value)}
/>
</div>
<label className="flex items-start gap-3 text-sm">
<input
type="checkbox"
className="mt-1 h-4 w-4 rounded border-border text-primary focus:ring-ring"
checked={saveCredentials}
onChange={e => setSaveCredentials(e.target.checked)}
/>
<span className="space-y-1">
<span className="block font-medium">{t('vercel.saveCredentials')}</span>
<span className="block text-xs text-muted-foreground">{t('vercel.saveCredentialsHint')}</span>
</span>
</label>
</div>
<div className="pt-4">

View File

@@ -12,6 +12,7 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
const [vercelToken, setVercelToken] = useState('')
const [projectId, setProjectId] = useState('')
const [teamId, setTeamId] = useState('')
const [saveCredentials, setSaveCredentials] = useState(true)
const [loading, setLoading] = useState(false)
const [result, setResult] = useState(null)
const [preconfig, setPreconfig] = useState(null)
@@ -117,6 +118,7 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
vercel_token: tokenToUse,
project_id: projectId,
team_id: teamId || undefined,
save_credentials: saveCredentials,
}),
})
const data = await res.json()
@@ -133,7 +135,7 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
} finally {
setLoading(false)
}
}, [apiFetch, fetchSyncStatus, onMessage, preconfig?.has_token, projectId, t, teamId, vercelToken])
}, [apiFetch, fetchSyncStatus, onMessage, preconfig?.has_token, projectId, saveCredentials, t, teamId, vercelToken])
return {
vercelToken,
@@ -142,6 +144,8 @@ export function useVercelSyncState({ apiFetch, onMessage, t, isVercel = false })
setProjectId,
teamId,
setTeamId,
saveCredentials,
setSaveCredentials,
loading,
result,
preconfig,

View File

@@ -462,6 +462,8 @@
"projectIdHint": "Find it in Project Settings → General.",
"teamIdLabel": "Team ID",
"optional": "optional",
"saveCredentials": "Remember Vercel credentials",
"saveCredentialsHint": "Save the token, project ID, and team ID for the next sync.",
"syncing": "Syncing...",
"syncRedeploy": "Sync & redeploy",
"redeployHint": "This triggers a Vercel redeploy and usually takes 30-60 seconds.",

View File

@@ -462,6 +462,8 @@
"projectIdHint": "可在项目设置 (Project Settings) → 常规 (General) 中找到",
"teamIdLabel": "团队 ID",
"optional": "可选",
"saveCredentials": "记住 Vercel 凭据",
"saveCredentialsHint": "保存访问令牌、项目 ID 和团队 ID,供下次同步直接复用。",
"syncing": "正在同步...",
"syncRedeploy": "同步并重新部署",
"redeployHint": "这将触发 Vercel 的重新部署,大约需要 30-60 秒。",