Compare commits


9 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| CJACK. | 31e64ff31d | Merge pull request #272 from CJackHwang/dev: Remove outdated architecture documentation and improve citation parsing | 2026-04-20 18:38:26 +08:00 |
| CJACK. | 5984802df4 | Merge pull request #273 from CJackHwang/codex/fix-citation-index-normalization-issue: Fix zero-based citation index normalization | 2026-04-20 18:35:46 +08:00 |
| CJACK. | e0ed4ba238 | Handle one-based and zero-based citation indices safely | 2026-04-20 18:29:58 +08:00 |
| CJACK. | ae37654893 | Fix zero-based citation index normalization | 2026-04-20 18:18:00 +08:00 |
| CJACK. | aa7f821151 | Bump version from 3.5.0 to 3.5.1 | 2026-04-20 17:32:05 +08:00 |
| CJACK. | f7426f9f04 | Remove detailed backend architecture explanations: removed detailed descriptions of routing, execution, adapter layers, tool calling, configuration, streaming capabilities, and observability enhancements | 2026-04-20 17:18:37 +08:00 |
| CJACK. | 787e034174 | Merge pull request #271 from livesRan/citation注释解析: resolve unparsed [citation:1][citation:2] style tags in /v1/chat/completions responses | 2026-04-20 13:06:38 +08:00 |
| songguoliang | d73f7b8b73 | The /v1/chat/completions response body contained unparsed tags such as [citation:1][citation:2]; this change resolves those tags in the returned payload | 2026-04-20 11:22:31 +08:00 |
| CJACK | b8d844e2f6 | docs: remove outdated 3.X architecture documentation from README files | 2026-04-20 01:44:58 +08:00 |
12 changed files with 309 additions and 33 deletions

View File

@@ -82,16 +82,6 @@ flowchart LR
 - **Frontend**: React admin console (`webui/`); the static build artifact is hosted at runtime
 - **Deployment**: local run, Docker, Vercel Serverless, Linux systemd
-### 3.X Low-Level Architecture Changes (vs older releases)
-- **Unified routing core**: all protocol entries converge on `internal/server/router.go`, and OpenAI / Claude / Gemini / Admin / WebUI routes are registered in the same routing tree, avoiding behavior drift across multiple entry points.
-- **Unified execution chain**: Claude / Gemini entries first go through `internal/translatorcliproxy` for protocol translation, then into `openai.ChatCompletions` for shared tool-calling and streaming semantics, and are finally translated back into the original protocol's response.
-- **Clearer adapter layering**: `internal/adapter/{claude,gemini}` handles inbound/outbound protocol wrapping, `internal/adapter/openai` handles core execution; DeepSeek-side calls are kept only in the OpenAI core.
-- **Tool-calling parity across both runtimes**: the Go side (`internal/toolcall`) and the Vercel Node side (`internal/js/helpers/stream-tool-sieve`) keep consistent parsing/anti-leak semantics, currently centered on the XML/Markup family (`<tool_call>` / `<function_call>` / `<invoke>` / `tool_use` / antml variants).
-- **Config decoupled from runtime settings**: static configuration (`config`) and runtime policy (`settings`) are managed separately through the Admin API, supporting hot updates and password rotation that invalidates old JWTs.
-- **Streaming upgrade**: `/v1/responses` and `/v1/chat/completions` share a more consistent incremental tool-call output strategy, reducing behavioral differences across SDKs.
-- **Observability and operability enhancements**: `/healthz`, `/readyz`, `/admin/version`, and `/admin/dev/captures` form a closed troubleshooting loop for post-release verification.
 ## Core Capabilities
 | Capability | Description |

View File

@@ -80,16 +80,6 @@ For the full module-by-module architecture and directory responsibilities, see [
 - **Frontend**: React admin panel (`webui/`), served as static build at runtime
 - **Deployment**: local run, Docker, Vercel serverless, Linux systemd
-### 3.X Architecture Changes (vs older releases)
-- **Unified routing core**: all protocol entries are now centralized through `internal/server/router.go`, with OpenAI / Claude / Gemini / Admin / WebUI routes registered in one tree to avoid multi-entry drift.
-- **Unified execution chain**: Claude/Gemini entries are translated by `internal/translatorcliproxy`, then executed through `openai.ChatCompletions` for shared tool-calling and stream semantics, then translated back to the client protocol.
-- **Cleaner adapter boundaries**: `internal/adapter/{claude,gemini}` handles protocol wrappers, while `internal/adapter/openai` remains the execution core; upstream DeepSeek calls are retained only in the OpenAI core.
-- **Tool-calling parity across runtimes**: Go (`internal/toolcall`) and Vercel Node (`internal/js/helpers/stream-tool-sieve`) share aligned parsing/anti-leak semantics, now centered on XML/Markup-family payloads (`<tool_call>` / `<function_call>` / `<invoke>` / `tool_use` / antml variants).
-- **Config/runtime separation**: static config (`config`) and runtime policy (`settings`) are managed independently via Admin APIs, enabling hot updates and password rotation with JWT invalidation.
-- **Streaming behavior upgrade**: `/v1/responses` and `/v1/chat/completions` now share a more consistent incremental tool-call emission strategy across SDK ecosystems.
-- **Improved operability**: `/healthz`, `/readyz`, `/admin/version`, and `/admin/dev/captures` form a tighter post-deploy diagnostics loop.
 ## Key Capabilities
 | Capability | Details |

View File

@@ -1 +1 @@
-3.5.0
+3.5.1

View File

@@ -0,0 +1,31 @@
package openai

import (
    "fmt"
    "regexp"
    "strconv"
    "strings"
)

// citationMarkerPattern matches inline markers such as [citation:3],
// case-insensitively and tolerating whitespace before the index.
var citationMarkerPattern = regexp.MustCompile(`(?i)\[citation:\s*(\d+)\]`)

// replaceCitationMarkersWithLinks rewrites [citation:N] markers as Markdown
// links [N](url) using the collected citation links; markers with unknown or
// invalid indices are left untouched.
func replaceCitationMarkersWithLinks(text string, links map[int]string) string {
    if strings.TrimSpace(text) == "" || len(links) == 0 {
        return text
    }
    return citationMarkerPattern.ReplaceAllStringFunc(text, func(match string) string {
        sub := citationMarkerPattern.FindStringSubmatch(match)
        if len(sub) < 2 {
            return match
        }
        idx, err := strconv.Atoi(strings.TrimSpace(sub[1]))
        if err != nil || idx <= 0 {
            return match
        }
        url := strings.TrimSpace(links[idx])
        if url == "" {
            return match
        }
        return fmt.Sprintf("[%d](%s)", idx, url)
    })
}
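The rewriting behavior can be exercised standalone. The sketch below is a condensed re-statement of `replaceCitationMarkersWithLinks` above in a runnable `main` package, not an import of the project's code; the function name `rewrite` is mine.

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
	"strings"
)

var marker = regexp.MustCompile(`(?i)\[citation:\s*(\d+)\]`)

// rewrite mirrors replaceCitationMarkersWithLinks: markers with a known
// one-based index become Markdown links; anything else is left untouched.
func rewrite(text string, links map[int]string) string {
	return marker.ReplaceAllStringFunc(text, func(m string) string {
		sub := marker.FindStringSubmatch(m)
		idx, err := strconv.Atoi(strings.TrimSpace(sub[1]))
		url := strings.TrimSpace(links[idx])
		if err != nil || idx <= 0 || url == "" {
			return m
		}
		return fmt.Sprintf("[%d](%s)", idx, url)
	})
}

func main() {
	links := map[int]string{1: "https://example.com/a"}
	fmt.Println(rewrite("ok[citation:1] unknown[citation:2]", links))
	// prints: ok[1](https://example.com/a) unknown[citation:2]
}
```

Note the deliberate asymmetry: known indices are rewritten, while unknown ones pass through verbatim, so a missing link never silently drops a citation marker.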

View File

@@ -0,0 +1,28 @@
package openai

import "testing"

func TestReplaceCitationMarkersWithLinks(t *testing.T) {
    raw := "这是一条更新[citation:1],更多信息见[citation:2]。"
    links := map[int]string{
        1: "https://example.com/news-1",
        2: "https://example.com/news-2",
    }

    got := replaceCitationMarkersWithLinks(raw, links)

    want := "这是一条更新[1](https://example.com/news-1),更多信息见[2](https://example.com/news-2)。"
    if got != want {
        t.Fatalf("expected %q, got %q", want, got)
    }
}

func TestReplaceCitationMarkersWithLinksKeepsUnknownIndex(t *testing.T) {
    raw := "只有一个来源[citation:1],未知来源[citation:3]。"
    links := map[int]string{1: "https://example.com/a"}

    got := replaceCitationMarkersWithLinks(raw, links)

    want := "只有一个来源[1](https://example.com/a),未知来源[citation:3]。"
    if got != want {
        t.Fatalf("expected %q, got %q", want, got)
    }
}

View File

@@ -88,7 +88,7 @@ func (h *Handler) ChatCompletions(w http.ResponseWriter, r *http.Request) {
         h.handleStream(w, r, resp, sessionID, stdReq.ResponseModel, stdReq.FinalPrompt, stdReq.Thinking, stdReq.Search, stdReq.ToolNames)
         return
     }
-    h.handleNonStream(w, r.Context(), resp, sessionID, stdReq.ResponseModel, stdReq.FinalPrompt, stdReq.Thinking, stdReq.ToolNames)
+    h.handleNonStream(w, r.Context(), resp, sessionID, stdReq.ResponseModel, stdReq.FinalPrompt, stdReq.Thinking, stdReq.Search, stdReq.ToolNames)
 }
 
 func (h *Handler) autoDeleteRemoteSession(ctx context.Context, a *auth.RequestAuth, sessionID string) {
@@ -124,7 +124,7 @@ func (h *Handler) autoDeleteRemoteSession(ctx context.Context, a *auth.RequestAu
     }
 }
 
-func (h *Handler) handleNonStream(w http.ResponseWriter, ctx context.Context, resp *http.Response, completionID, model, finalPrompt string, thinkingEnabled bool, toolNames []string) {
+func (h *Handler) handleNonStream(w http.ResponseWriter, ctx context.Context, resp *http.Response, completionID, model, finalPrompt string, thinkingEnabled, searchEnabled bool, toolNames []string) {
     if resp.StatusCode != http.StatusOK {
         defer func() { _ = resp.Body.Close() }()
         body, _ := io.ReadAll(resp.Body)
@@ -137,6 +137,9 @@ func (h *Handler) handleNonStream(w http.ResponseWriter, ctx context.Context, re
     stripReferenceMarkers := h.compatStripReferenceMarkers()
     finalThinking := cleanVisibleOutput(result.Thinking, stripReferenceMarkers)
     finalText := cleanVisibleOutput(result.Text, stripReferenceMarkers)
+    if searchEnabled {
+        finalText = replaceCitationMarkersWithLinks(finalText, result.CitationLinks)
+    }
     if writeUpstreamEmptyOutputError(w, finalText, result.ContentFilter) {
         return
     }

View File

@@ -94,7 +94,7 @@ func TestHandleNonStreamReturns429WhenUpstreamOutputEmpty(t *testing.T) {
     )
     rec := httptest.NewRecorder()
 
-    h.handleNonStream(rec, context.Background(), resp, "cid-empty", "deepseek-chat", "prompt", false, nil)
+    h.handleNonStream(rec, context.Background(), resp, "cid-empty", "deepseek-chat", "prompt", false, false, nil)
     if rec.Code != http.StatusTooManyRequests {
         t.Fatalf("expected status 429 for empty upstream output, got %d body=%s", rec.Code, rec.Body.String())
     }
@@ -113,7 +113,7 @@ func TestHandleNonStreamReturnsContentFilterErrorWhenUpstreamFilteredWithoutOutp
     )
     rec := httptest.NewRecorder()
 
-    h.handleNonStream(rec, context.Background(), resp, "cid-empty-filtered", "deepseek-chat", "prompt", false, nil)
+    h.handleNonStream(rec, context.Background(), resp, "cid-empty-filtered", "deepseek-chat", "prompt", false, false, nil)
     if rec.Code != http.StatusBadRequest {
         t.Fatalf("expected status 400 for filtered upstream output, got %d body=%s", rec.Code, rec.Body.String())
     }
@@ -132,7 +132,7 @@ func TestHandleNonStreamReturns429WhenUpstreamHasOnlyThinking(t *testing.T) {
     )
     rec := httptest.NewRecorder()
 
-    h.handleNonStream(rec, context.Background(), resp, "cid-thinking-only", "deepseek-reasoner", "prompt", true, nil)
+    h.handleNonStream(rec, context.Background(), resp, "cid-thinking-only", "deepseek-reasoner", "prompt", true, false, nil)
     if rec.Code != http.StatusTooManyRequests {
         t.Fatalf("expected status 429 for thinking-only upstream output, got %d body=%s", rec.Code, rec.Body.String())
     }

View File

@@ -112,10 +112,10 @@ func (h *Handler) Responses(w http.ResponseWriter, r *http.Request) {
         h.handleResponsesStream(w, r, resp, owner, responseID, stdReq.ResponseModel, stdReq.FinalPrompt, stdReq.Thinking, stdReq.Search, stdReq.ToolNames, stdReq.ToolChoice, traceID)
         return
     }
-    h.handleResponsesNonStream(w, resp, owner, responseID, stdReq.ResponseModel, stdReq.FinalPrompt, stdReq.Thinking, stdReq.ToolNames, stdReq.ToolChoice, traceID)
+    h.handleResponsesNonStream(w, resp, owner, responseID, stdReq.ResponseModel, stdReq.FinalPrompt, stdReq.Thinking, stdReq.Search, stdReq.ToolNames, stdReq.ToolChoice, traceID)
 }
 
-func (h *Handler) handleResponsesNonStream(w http.ResponseWriter, resp *http.Response, owner, responseID, model, finalPrompt string, thinkingEnabled bool, toolNames []string, toolChoice util.ToolChoicePolicy, traceID string) {
+func (h *Handler) handleResponsesNonStream(w http.ResponseWriter, resp *http.Response, owner, responseID, model, finalPrompt string, thinkingEnabled, searchEnabled bool, toolNames []string, toolChoice util.ToolChoicePolicy, traceID string) {
     defer func() { _ = resp.Body.Close() }()
     if resp.StatusCode != http.StatusOK {
         body, _ := io.ReadAll(resp.Body)
@@ -126,6 +126,9 @@ func (h *Handler) handleResponsesNonStream(w http.ResponseWriter, resp *http.Res
     stripReferenceMarkers := h.compatStripReferenceMarkers()
     sanitizedThinking := cleanVisibleOutput(result.Thinking, stripReferenceMarkers)
     sanitizedText := cleanVisibleOutput(result.Text, stripReferenceMarkers)
+    if searchEnabled {
+        sanitizedText = replaceCitationMarkersWithLinks(sanitizedText, result.CitationLinks)
+    }
     if writeUpstreamEmptyOutputError(w, sanitizedText, result.ContentFilter) {
         return
     }

View File

@@ -196,7 +196,7 @@ func TestHandleResponsesNonStreamRequiredToolChoiceViolation(t *testing.T) {
         Allowed: map[string]struct{}{"read_file": {}},
     }
 
-    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", false, []string{"read_file"}, policy, "")
+    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", false, false, []string{"read_file"}, policy, "")
     if rec.Code != http.StatusUnprocessableEntity {
         t.Fatalf("expected 422 for required tool_choice violation, got %d body=%s", rec.Code, rec.Body.String())
     }
@@ -223,7 +223,7 @@ func TestHandleResponsesNonStreamRequiredToolChoiceIgnoresThinkingToolPayload(t
         Allowed: map[string]struct{}{"read_file": {}},
     }
 
-    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", true, []string{"read_file"}, policy, "")
+    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", true, false, []string{"read_file"}, policy, "")
     if rec.Code != http.StatusUnprocessableEntity {
         t.Fatalf("expected 422 for required tool_choice violation, got %d body=%s", rec.Code, rec.Body.String())
     }
@@ -245,7 +245,7 @@ func TestHandleResponsesNonStreamReturns429WhenUpstreamOutputEmpty(t *testing.T)
         )),
     }
 
-    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", false, nil, util.DefaultToolChoicePolicy(), "")
+    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", false, false, nil, util.DefaultToolChoicePolicy(), "")
     if rec.Code != http.StatusTooManyRequests {
         t.Fatalf("expected 429 for empty upstream output, got %d body=%s", rec.Code, rec.Body.String())
     }
@@ -267,7 +267,7 @@ func TestHandleResponsesNonStreamReturnsContentFilterErrorWhenUpstreamFilteredWi
         )),
     }
 
-    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", false, nil, util.DefaultToolChoicePolicy(), "")
+    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-chat", "prompt", false, false, nil, util.DefaultToolChoicePolicy(), "")
     if rec.Code != http.StatusBadRequest {
         t.Fatalf("expected 400 for filtered empty upstream output, got %d body=%s", rec.Code, rec.Body.String())
     }
@@ -289,7 +289,7 @@ func TestHandleResponsesNonStreamReturns429WhenUpstreamHasOnlyThinking(t *testin
         )),
     }
 
-    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-reasoner", "prompt", true, nil, util.DefaultToolChoicePolicy(), "")
+    h.handleResponsesNonStream(rec, resp, "owner-a", "resp_test", "deepseek-reasoner", "prompt", true, false, nil, util.DefaultToolChoicePolicy(), "")
     if rec.Code != http.StatusTooManyRequests {
         t.Fatalf("expected 429 for thinking-only upstream output, got %d body=%s", rec.Code, rec.Body.String())
     }

View File

@@ -0,0 +1,174 @@
package sse

import (
    "strconv"
    "strings"
)

// citationLinkCollector gathers citation URLs from streamed DeepSeek chunks,
// tracking both first-seen order and any explicit cite_index values.
type citationLinkCollector struct {
    ordered     []string
    seen        map[string]struct{}
    explicitRaw map[int]string
    hasZeroIdx  bool
}

func newCitationLinkCollector() *citationLinkCollector {
    return &citationLinkCollector{
        seen:        map[string]struct{}{},
        explicitRaw: map[int]string{},
    }
}

func (c *citationLinkCollector) ingestChunk(chunk map[string]any) {
    if c == nil || len(chunk) == 0 {
        return
    }
    c.walkValue(chunk)
}

// build returns the final one-based index-to-URL map, filling any gaps from
// the first-seen order of URLs.
func (c *citationLinkCollector) build() map[int]string {
    out := make(map[int]string, len(c.explicitRaw)+len(c.ordered))
    for idx, u := range c.buildNormalizedExplicit() {
        out[idx] = u
    }
    for i, u := range c.ordered {
        idx := i + 1
        if _, exists := out[idx]; !exists {
            out[idx] = u
        }
    }
    return out
}

func (c *citationLinkCollector) buildNormalizedExplicit() map[int]string {
    out := make(map[int]string, len(c.explicitRaw))
    // Default behavior keeps positive indices as-is (one-based payloads).
    for idx, u := range c.explicitRaw {
        if idx <= 0 || strings.TrimSpace(u) == "" {
            continue
        }
        out[idx] = u
    }
    if !c.hasZeroIdx {
        return out
    }
    // If a zero index appears, upstream may be using zero-based indices.
    // Add shifted candidates and resolve conflicts using ordered appearance,
    // which matches the visible citation marker order in the response text.
    for rawIdx, u := range c.explicitRaw {
        if rawIdx < 0 || strings.TrimSpace(u) == "" {
            continue
        }
        normalized := rawIdx + 1
        existing, exists := out[normalized]
        if !exists {
            out[normalized] = u
            continue
        }
        if c.preferURLForIndex(normalized, existing, u) == u {
            out[normalized] = u
        }
    }
    return out
}

func (c *citationLinkCollector) preferURLForIndex(idx int, current, candidate string) string {
    if idx <= 0 || idx > len(c.ordered) {
        return current
    }
    expected := c.ordered[idx-1]
    switch {
    case strings.TrimSpace(expected) == "":
        return current
    case candidate == expected && current != expected:
        return candidate
    default:
        return current
    }
}

// walkValue recursively descends arrays and objects, capturing any map that
// looks like a citation result.
func (c *citationLinkCollector) walkValue(v any) {
    switch x := v.(type) {
    case []any:
        for _, item := range x {
            c.walkValue(item)
        }
    case map[string]any:
        c.captureURLAndIndex(x)
        for _, vv := range x {
            c.walkValue(vv)
        }
    }
}

func (c *citationLinkCollector) captureURLAndIndex(m map[string]any) {
    url := strings.TrimSpace(asString(m["url"]))
    if !isWebURL(url) {
        return
    }
    c.addOrdered(url)
    idx, hasIdx := citationIndexFromAny(m["cite_index"])
    if !hasIdx {
        return
    }
    if idx < 0 {
        return
    }
    if idx == 0 {
        c.hasZeroIdx = true
    }
    // The first explicit URL recorded for an index wins.
    if existing, ok := c.explicitRaw[idx]; ok && strings.TrimSpace(existing) != "" {
        return
    }
    c.explicitRaw[idx] = url
}

func (c *citationLinkCollector) addOrdered(url string) {
    if _, ok := c.seen[url]; ok {
        return
    }
    c.seen[url] = struct{}{}
    c.ordered = append(c.ordered, url)
}

func citationIndexFromAny(v any) (int, bool) {
    switch x := v.(type) {
    case int:
        return x, true
    case int32:
        return int(x), true
    case int64:
        return int(x), true
    case float32:
        return int(x), true
    case float64:
        return int(x), true
    case string:
        s := strings.TrimSpace(x)
        if s == "" {
            return 0, false
        }
        n, err := strconv.Atoi(s)
        if err != nil {
            return 0, false
        }
        return n, true
    default:
        return 0, false
    }
}

func isWebURL(v string) bool {
    v = strings.ToLower(strings.TrimSpace(v))
    return strings.HasPrefix(v, "http://") || strings.HasPrefix(v, "https://")
}

func asString(v any) string {
    s, _ := v.(string)
    return s
}
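The zero-index heuristic in `buildNormalizedExplicit` can be exercised in isolation: when any `cite_index` of 0 appears, shifted `idx+1` candidates are added, and a conflict is resolved in favor of the URL whose first-seen position matches the citation number. The condensed standalone sketch below re-states that rule outside the collector type; the function name and parameters are mine, not the project's.

```go
package main

import "fmt"

// normalize mirrors the collector's rule: positive indices are kept as-is;
// when a zero index was seen, idx+1 candidates are added too, and conflicts
// prefer the URL at the matching position of the first-seen order.
func normalize(raw map[int]string, ordered []string, hasZero bool) map[int]string {
	out := map[int]string{}
	for idx, u := range raw {
		if idx > 0 && u != "" {
			out[idx] = u
		}
	}
	if !hasZero {
		return out
	}
	for idx, u := range raw {
		if idx < 0 || u == "" {
			continue
		}
		n := idx + 1
		cur, ok := out[n]
		if !ok {
			out[n] = u
			continue
		}
		// Conflict: prefer the URL that appeared n-th in the stream.
		if n <= len(ordered) && ordered[n-1] == u && cur != u {
			out[n] = u
		}
	}
	return out
}

func main() {
	// Zero-based upstream: {0:a, 1:b}, URLs first seen in order [a, b].
	got := normalize(map[int]string{0: "a", 1: "b"}, []string{"a", "b"}, true)
	fmt.Println(got[1], got[2]) // prints: a b
}
```

With a zero-based payload the map shifts to one-based, while a purely one-based payload (no zero index) passes through unchanged, matching the two `CollectStream` tests below.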

View File

@@ -13,6 +13,7 @@ type CollectResult struct {
 	Text          string
 	Thinking      string
 	ContentFilter bool
+	CitationLinks map[int]string
 }
 
 // CollectStream fully consumes a DeepSeek SSE response and separates
@@ -28,11 +29,15 @@ func CollectStream(resp *http.Response, thinkingEnabled bool, closeBody bool) Co
 	text := strings.Builder{}
 	thinking := strings.Builder{}
 	contentFilter := false
+	collector := newCitationLinkCollector()
 	currentType := "text"
 	if thinkingEnabled {
 		currentType = "thinking"
 	}
 	_ = deepseek.ScanSSELines(resp, func(line []byte) bool {
+		if chunk, done, parsed := ParseDeepSeekSSELine(line); parsed && !done {
+			collector.ingestChunk(chunk)
+		}
 		result := ParseDeepSeekContentLine(line, thinkingEnabled, currentType)
 		currentType = result.NextType
 		if !result.Parsed {
@@ -59,5 +64,6 @@ func CollectStream(resp *http.Response, thinkingEnabled bool, closeBody bool) Co
 		Text:          text.String(),
 		Thinking:      thinking.String(),
 		ContentFilter: contentFilter,
+		CitationLinks: collector.build(),
 	}
 }
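Feeding the collector is plain line-oriented SSE parsing: scan `data: ` lines, decode each JSON chunk, walk the decoded value. A minimal standalone sketch of that scan-decode-walk loop, with dedup and `cite_index` handling omitted (the payload shape is taken from the tests below; everything else here is illustrative):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"strings"
)

// collectURLs scans "data: " SSE lines, decodes each JSON chunk, and walks
// the decoded value recursively, recording every map that carries a URL.
func collectURLs(stream string) []string {
	var urls []string
	var walk func(v any)
	walk = func(v any) {
		switch x := v.(type) {
		case []any:
			for _, item := range x {
				walk(item)
			}
		case map[string]any:
			if u, _ := x["url"].(string); strings.HasPrefix(u, "http") {
				urls = append(urls, u)
			}
			for _, vv := range x {
				walk(vv)
			}
		}
	}
	sc := bufio.NewScanner(strings.NewReader(stream))
	for sc.Scan() {
		line := sc.Text()
		if !strings.HasPrefix(line, "data: ") || line == "data: [DONE]" {
			continue // skip non-data lines and the stream terminator
		}
		var chunk map[string]any
		if err := json.Unmarshal([]byte(line[len("data: "):]), &chunk); err == nil {
			walk(chunk)
		}
	}
	return urls
}

func main() {
	s := "data: {\"p\":\"response/fragments/-1/results\",\"v\":[{\"url\":\"https://example.com/a\",\"cite_index\":0}]}\n" +
		"data: [DONE]\n"
	fmt.Println(collectURLs(s)) // prints [https://example.com/a]
}
```

The recursive walk is what lets the real collector stay agnostic about where in the chunk the results array sits, at the cost of visiting every nested value once.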

View File

@@ -115,6 +115,57 @@ func TestCollectStreamWithCitation(t *testing.T) {
 	}
 }
 
+func TestCollectStreamExtractsCitationLinks(t *testing.T) {
+	resp := makeHTTPResponse(
+		"data: {\"p\":\"response/fragments/-1/results\",\"v\":[{\"url\":\"https://example.com/a\",\"cite_index\":0},{\"url\":\"https://example.com/b\",\"cite_index\":1}]}\n" +
+			"data: {\"p\":\"response/content\",\"v\":\"结论[citation:1][citation:2]\"}\n" +
+			"data: [DONE]\n",
+	)
+
+	result := CollectStream(resp, false, false)
+
+	if got := result.CitationLinks[1]; got != "https://example.com/a" {
+		t.Fatalf("expected citation 1 link, got %q", got)
+	}
+	if got := result.CitationLinks[2]; got != "https://example.com/b" {
+		t.Fatalf("expected citation 2 link, got %q", got)
+	}
+}
+
+func TestCollectStreamExtractsCitationLinksForSequentialZeroBasedIndices(t *testing.T) {
+	resp := makeHTTPResponse(
+		"data: {\"p\":\"response/fragments/-1/results\",\"v\":[{\"url\":\"https://example.com/a\",\"cite_index\":0},{\"url\":\"https://example.com/b\",\"cite_index\":1},{\"url\":\"https://example.com/c\",\"cite_index\":2}]}\n" +
+			"data: {\"p\":\"response/content\",\"v\":\"结论[citation:1][citation:2][citation:3]\"}\n" +
+			"data: [DONE]\n",
+	)
+
+	result := CollectStream(resp, false, false)
+
+	if got := result.CitationLinks[1]; got != "https://example.com/a" {
+		t.Fatalf("expected citation 1 link, got %q", got)
+	}
+	if got := result.CitationLinks[2]; got != "https://example.com/b" {
+		t.Fatalf("expected citation 2 link, got %q", got)
+	}
+	if got := result.CitationLinks[3]; got != "https://example.com/c" {
+		t.Fatalf("expected citation 3 link, got %q", got)
+	}
+}
+
+func TestCollectStreamExtractsCitationLinksForOneBasedIndices(t *testing.T) {
+	resp := makeHTTPResponse(
+		"data: {\"p\":\"response/fragments/-1/results\",\"v\":[{\"url\":\"https://example.com/a\",\"cite_index\":1},{\"url\":\"https://example.com/b\",\"cite_index\":2}]}\n" +
+			"data: {\"p\":\"response/content\",\"v\":\"结论[citation:1][citation:2]\"}\n" +
+			"data: [DONE]\n",
+	)
+
+	result := CollectStream(resp, false, false)
+
+	if got := result.CitationLinks[1]; got != "https://example.com/a" {
+		t.Fatalf("expected citation 1 link, got %q", got)
+	}
+	if got := result.CitationLinks[2]; got != "https://example.com/b" {
+		t.Fatalf("expected citation 2 link, got %q", got)
+	}
+}
+
 func TestCollectStreamMultipleThinkingChunks(t *testing.T) {
 	resp := makeHTTPResponse(
 		"data: {\"p\":\"response/thinking_content\",\"v\":\"part1\"}\n" +