Compare commits

...

17 Commits

Author SHA1 Message Date
CJACK.
66e0fa568f Merge pull request #449 from CJackHwang/dev
Update
2026-05-08 01:24:16 +08:00
CJACK.
fa489248bc Merge pull request #450 from CJackHwang/codex/add-json-tag-for-ollama-model-id
Add Ollama-compatible API endpoints and model capability support
2026-05-08 01:21:41 +08:00
CJACK.
657b9379ed test(docs): assert ollama show id field and document ollama endpoints 2026-05-08 01:11:35 +08:00
CJACK.
9062330104 Merge pull request #446 from dinhnn/main
feat: add Ollama API endpoints /api/version, /api/tags, /api/show for Copilot integration
2026-05-08 00:54:16 +08:00
Dinh Nguyen
d0d61a5d77 Update ollama api test 2026-05-07 14:23:12 +07:00
dinhnn
ffef451f7a Fixbug test typing 2026-05-07 13:48:03 +07:00
Dinh Nguyen
a68a79e087 Add ollama api for copilot support 2026-05-07 09:41:46 +07:00
CJACK.
c8db66615c Update VERSION 2026-05-06 13:04:16 +08:00
CJACK.
79ae9c8970 Merge pull request #436 from waiwaic/main
fix: auto-detect Vercel for chat history path
2026-05-06 13:03:30 +08:00
waiwai
2378f0fbe7 fix: auto-detect Vercel for chat history path
On Vercel, /var/task is read-only at runtime. ChatHistoryPath() now
auto-detects Vercel via IsVercel() and defaults to /tmp/chat_history.json
when no explicit DS2API_CHAT_HISTORY_PATH is set. Manual env var still
works as explicit override.
2026-05-06 11:10:14 +08:00
CJACK.
aa29084038 Merge pull request #434 from CJackHwang/dev
Merge pull request #433 from CJackHwang/codex/flash-searchpro-search

Remove heuristic model name resolution and require explicit aliases or canonical IDs
2026-05-06 00:38:20 +08:00
CJACK.
21c1527c79 Merge pull request #433 from CJackHwang/codex/flash-searchpro-search
Remove heuristic model name resolution and require explicit aliases or canonical IDs
2026-05-06 00:03:44 +08:00
CJACK.
7ec0d99702 Merge pull request #431 from CJackHwang/main
Fix OpenAI stream heartbeat and avoid empty choices
2026-05-06 00:02:25 +08:00
CJACK.
7e639667f8 refactor: remove heuristic model resolution and enforce allowlist 2026-05-06 00:00:27 +08:00
CJACK.
066c48c107 Bump version from 4.4.2 to 4.4.3 2026-05-05 22:29:36 +08:00
CJACK.
d69b0658ea Merge pull request #430 from NgoQuocViet2001/ai/openai-stream-empty-choices
fix(openai): avoid empty choices stream heartbeat
2026-05-05 22:24:21 +08:00
NgoQuocViet2001
4315b424bf fix(openai): keep stream heartbeat choice-free 2026-05-05 21:13:38 +07:00
11 changed files with 306 additions and 118 deletions

View File

@@ -18,6 +18,7 @@ Docs: [Overview](README.en.md) / [Architecture](docs/ARCHITECTURE.en.md) / [Depl
- [OpenAI-Compatible API](#openai-compatible-api)
- [Claude-Compatible API](#claude-compatible-api)
- [Gemini-Compatible API](#gemini-compatible-api)
- [Ollama API](#ollama-api)
- [Admin API](#admin-api)
- [Error Payloads](#error-payloads)
- [cURL Examples](#curl-examples)
@@ -123,6 +124,9 @@ Gemini-compatible clients can also send `x-goog-api-key`, `?key=`, or `?api_key=
| POST | `/v1beta/models/{model}:streamGenerateContent` | Business | Gemini stream |
| POST | `/v1/models/{model}:generateContent` | Business | Gemini non-stream compat path |
| POST | `/v1/models/{model}:streamGenerateContent` | Business | Gemini stream compat path |
| GET | `/api/version` | None | Ollama version endpoint |
| GET | `/api/tags` | None | Ollama model list |
| POST | `/api/show` | None | Ollama model capability query (returns `id` + `capabilities`) |
| POST | `/admin/login` | None | Admin login |
| GET | `/admin/verify` | JWT | Verify admin JWT |
| GET | `/admin/vercel/config` | Admin | Read preconfigured Vercel creds |
@@ -617,6 +621,20 @@ Returns SSE (`text/event-stream`), each chunk as `data: <json>`:
---
## Ollama API
- `POST /api/show` request body: `{"model":"<model-id>"}`.
- Response uses lowercase `id` (not `ID`) and includes `capabilities` for Ollama-style clients and strict schemas.
Example response:
```json
{
"id": "deepseek-v4-flash",
"capabilities": ["tools", "thinking"]
}
```
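For reference, a minimal Go client sketch for this endpoint. The base URL `http://localhost:8080` is an illustrative assumption (point it at wherever ds2api is deployed); the `showResponse` struct is defined here only to mirror the documented payload.
```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// showResponse mirrors the documented /api/show payload: lowercase "id" plus "capabilities".
type showResponse struct {
	ID           string   `json:"id"`
	Capabilities []string `json:"capabilities"`
}

func main() {
	body := bytes.NewBufferString(`{"model":"deepseek-v4-flash"}`)
	// Hypothetical base URL; adjust to your ds2api deployment.
	resp, err := http.Post("http://localhost:8080/api/show", "application/json", body)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out showResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.ID, out.Capabilities) // e.g. deepseek-v4-flash [tools thinking]
}
```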
## Admin API
### `POST /admin/login`

API.md
View File

@@ -18,6 +18,7 @@
- [OpenAI-Compatible API](#openai-兼容接口)
- [Claude-Compatible API](#claude-兼容接口)
- [Gemini-Compatible API](#gemini-兼容接口)
- [Ollama-Compatible API](#ollama-兼容接口)
- [Admin API](#admin-接口)
- [Error Payloads](#错误响应格式)
- [cURL Examples](#curl-示例)
@@ -125,6 +126,9 @@ Gemini-compatible clients can also use `x-goog-api-key`, `?key=`, or `?api_key=`
| POST | `/v1beta/models/{model}:streamGenerateContent` | Business | Gemini stream |
| POST | `/v1/models/{model}:generateContent` | Business | Gemini non-stream compat path |
| POST | `/v1/models/{model}:streamGenerateContent` | Business | Gemini stream compat path |
| GET | `/api/version` | None | Ollama version endpoint |
| GET | `/api/tags` | None | Ollama model list |
| POST | `/api/show` | None | Ollama model capability query (returns `id` + `capabilities`) |
| POST | `/admin/login` | None | Admin login |
| GET | `/admin/verify` | JWT | Verify admin JWT |
| GET | `/admin/vercel/config` | Admin | Read preconfigured Vercel creds |
@@ -628,6 +632,20 @@ data: {"type":"message_stop"}
---
## Ollama-Compatible API
- `POST /api/show` request body: `{"model":"<model-id>"}`.
- The response uses a lowercase `id` field (not `ID`) and returns a `capabilities` array, for alignment with Ollama-style clients and strict schemas.
Example response:
```json
{
"id": "deepseek-v4-flash",
"capabilities": ["tools", "thinking"]
}
```
## Admin API
### `POST /admin/login`

View File

@@ -1 +1 @@
4.4.2
4.4.4

View File

@@ -75,20 +75,6 @@ func TestResolveExpandedHistoricalAliases(t *testing.T) {
}
}
func TestResolveModelHeuristicReasoner(t *testing.T) {
got, ok := ResolveModel(nil, "o3-super")
if !ok || got != "deepseek-v4-pro" {
t.Fatalf("expected heuristic reasoner, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelHeuristicReasonerNoThinking(t *testing.T) {
got, ok := ResolveModel(nil, "o3-super-nothinking")
if !ok || got != "deepseek-v4-pro-nothinking" {
t.Fatalf("expected heuristic reasoner nothinking, got ok=%v model=%q", ok, got)
}
}
func TestResolveModelUnknown(t *testing.T) {
_, ok := ResolveModel(nil, "totally-custom-model")
if ok {
@@ -96,6 +82,13 @@ func TestResolveModelUnknown(t *testing.T) {
}
}
func TestResolveModelUnknownKnownFamilyName(t *testing.T) {
_, ok := ResolveModel(nil, "gpt-5.5-pro-search")
if ok {
t.Fatal("expected unknown known-family model to fail resolve without alias")
}
}
func TestResolveModelRejectsLegacyDeepSeekIDs(t *testing.T) {
legacyModels := []string{
"deepseek-chat",
@@ -151,13 +144,6 @@ func TestResolveModelCustomAliasToVision(t *testing.T) {
}
}
func TestResolveModelHeuristicVisionIgnoresSearchSuffix(t *testing.T) {
got, ok := ResolveModel(nil, "gemini-vision-search")
if !ok || got != "deepseek-v4-vision" {
t.Fatalf("expected heuristic vision alias to resolve without search variant, got ok=%v model=%q", ok, got)
}
}
func TestClaudeModelsResponsePaginationFields(t *testing.T) {
resp := ClaudeModelsResponse()
if _, ok := resp["first_id"]; !ok {

View File

@@ -1,6 +1,9 @@
package config
import "strings"
import (
"strings"
"time"
)
type ModelInfo struct {
ID string `json:"id"`
@@ -9,6 +12,16 @@ type ModelInfo struct {
OwnedBy string `json:"owned_by"`
Permission []any `json:"permission,omitempty"`
}
type OllamaModelInfo struct {
Name string `json:"name"`
Model string `json:"model"`
Size int64 `json:"size"`
ModifiedAt string `json:"modified_at"`
}
type OllamaCapabilitiesModelInfo struct {
ID string `json:"id"`
Capabilities []string `json:"capabilities"`
}
type ModelAliasReader interface {
ModelAliases() map[string]string
@@ -24,8 +37,21 @@ var deepSeekBaseModels = []ModelInfo{
{ID: "deepseek-v4-vision", Object: "model", Created: 1677610602, OwnedBy: "deepseek", Permission: []any{}},
}
var DeepSeekModels = appendNoThinkingVariants(deepSeekBaseModels)
var OllamaCapabilitiesModels = []OllamaCapabilitiesModelInfo{
{ID: "deepseek-v4-flash", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-pro", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-flash-search", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-pro-search", Capabilities: []string{"tools", "thinking"}},
{ID: "deepseek-v4-vision", Capabilities: []string{"tools", "thinking", "vision"}},
{ID: "deepseek-v4-flash-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-pro-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-flash-search-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-pro-search-nothinking", Capabilities: []string{"tools"}},
{ID: "deepseek-v4-vision-nothinking", Capabilities: []string{"tools", "vision"}},
}
var DeepSeekModels = appendNoThinkingVariants(deepSeekBaseModels)
var OllamaModels = mapToOllamaModels(DeepSeekModels)
var claudeBaseModels = []ModelInfo{
// Current aliases
{ID: "claude-opus-4-6", Object: "model", Created: 1715635200, OwnedBy: "anthropic"},
@@ -214,26 +240,10 @@ func ResolveModel(store ModelAliasReader, requested string) (string, bool) {
return mapped, true
}
baseModel, noThinking := splitNoThinkingModel(model)
resolvedModel, ok := resolveCanonicalModel(aliases, baseModel)
if !ok {
return "", false
}
return withNoThinkingVariant(resolvedModel, noThinking), true
}
func isRetiredHistoricalModel(model string) bool {
switch {
case strings.HasPrefix(model, "claude-1."):
return true
case strings.HasPrefix(model, "claude-2."):
return true
case strings.HasPrefix(model, "claude-instant-"):
return true
case strings.HasPrefix(model, "gpt-3.5"):
return true
default:
return false
if mapped, ok := aliases[baseModel]; ok && IsSupportedDeepSeekModel(mapped) {
return withNoThinkingVariant(mapped, noThinking), true
}
return "", false
}
func lower(s string) string {
@@ -263,6 +273,23 @@ func OpenAIModelByID(store ModelAliasReader, id string) (ModelInfo, bool) {
return ModelInfo{}, false
}
func OllamaModelsResponse() map[string]any {
return map[string]any{"models": OllamaModels}
}
func OllamaModelByID(store ModelAliasReader, id string) (OllamaCapabilitiesModelInfo, bool) {
canonical, ok := ResolveModel(store, id)
if !ok {
return OllamaCapabilitiesModelInfo{}, false
}
for _, model := range OllamaCapabilitiesModels {
if model.ID == canonical {
return model, true
}
}
return OllamaCapabilitiesModelInfo{}, false
}
func ClaudeModelsResponse() map[string]any {
resp := map[string]any{"object": "list", "data": ClaudeModels}
if len(ClaudeModels) > 0 {
@@ -286,6 +313,23 @@ func appendNoThinkingVariants(models []ModelInfo) []ModelInfo {
}
return out
}
func mapToOllamaModels(models []ModelInfo) []OllamaModelInfo {
out := make([]OllamaModelInfo, 0, len(models))
for _, model := range models {
var modifiedAt string
if model.Created > 0 {
modifiedAt = time.Unix(model.Created, 0).Format(time.RFC3339)
}
ollamaModel := OllamaModelInfo{
Name: model.ID,
Model: model.ID,
Size: 0,
ModifiedAt: modifiedAt,
}
out = append(out, ollamaModel)
}
return out
}
func splitNoThinkingModel(model string) (string, bool) {
model = lower(strings.TrimSpace(model))
@@ -315,58 +359,3 @@ func loadModelAliases(store ModelAliasReader) map[string]string {
}
return aliases
}
func resolveCanonicalModel(aliases map[string]string, model string) (string, bool) {
model = lower(strings.TrimSpace(model))
if model == "" {
return "", false
}
if isRetiredHistoricalModel(model) {
return "", false
}
if IsSupportedDeepSeekModel(model) {
return model, true
}
if mapped, ok := aliases[model]; ok && IsSupportedDeepSeekModel(mapped) {
return mapped, true
}
if strings.HasPrefix(model, "deepseek-") {
return "", false
}
knownFamily := false
for _, prefix := range []string{
"gpt-", "o1", "o3", "claude-", "gemini-", "llama-", "qwen-", "mistral-", "command-",
} {
if strings.HasPrefix(model, prefix) {
knownFamily = true
break
}
}
if !knownFamily {
return "", false
}
useVision := strings.Contains(model, "vision")
useReasoner := strings.Contains(model, "reason") ||
strings.Contains(model, "reasoner") ||
strings.HasPrefix(model, "o1") ||
strings.HasPrefix(model, "o3") ||
strings.Contains(model, "opus") ||
strings.Contains(model, "slow") ||
strings.Contains(model, "r1")
useSearch := strings.Contains(model, "search")
switch {
case useVision:
return "deepseek-v4-vision", true
case useReasoner && useSearch:
return "deepseek-v4-pro-search", true
case useReasoner:
return "deepseek-v4-pro", true
case useSearch:
return "deepseek-v4-flash-search", true
default:
return "deepseek-v4-flash", true
}
}
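To make the behavioral change concrete, a small sketch of the stricter resolution path. `staticAliases` and the `my-team-model` alias are hypothetical, written only for this example; `config.ResolveModel`, `config.ModelAliasReader`, and the model IDs come from the code above, and the expected results assume the allowlist shown in this diff.
```go
package main

import (
	"fmt"

	"ds2api/internal/config"
)

// staticAliases is a minimal config.ModelAliasReader used only for this sketch.
type staticAliases map[string]string

func (s staticAliases) ModelAliases() map[string]string { return s }

func main() {
	store := staticAliases{"my-team-model": "deepseek-v4-flash"}

	// Canonical IDs still resolve directly.
	fmt.Println(config.ResolveModel(nil, "deepseek-v4-pro")) // deepseek-v4-pro true

	// Explicit aliases resolve, including their -nothinking variants.
	fmt.Println(config.ResolveModel(store, "my-team-model-nothinking")) // deepseek-v4-flash-nothinking true

	// Heuristic family matching is gone: unrecognized names fail outright.
	fmt.Println(config.ResolveModel(store, "gpt-5.5-pro-search")) // "" false
}
```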

View File

@@ -58,6 +58,11 @@ func RawStreamSampleRoot() string {
}
func ChatHistoryPath() string {
// On Vercel, /var/task is read-only at runtime. If no explicit path is set,
// default to /tmp/chat_history.json (the only writable directory).
if IsVercel() && strings.TrimSpace(os.Getenv("DS2API_CHAT_HISTORY_PATH")) == "" {
return "/tmp/chat_history.json"
}
return ResolvePath("DS2API_CHAT_HISTORY_PATH", "data/chat_history.json")
}
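A self-contained sketch of the precedence this introduces: an explicit `DS2API_CHAT_HISTORY_PATH` wins, then the Vercel `/tmp` default, then the repo-relative default. `isVercel` and `resolvePath` below are simplified stand-ins for the package's real helpers (not shown in this diff), and checking the `VERCEL` environment variable is an assumption about how detection works.
```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// isVercel is a stand-in for config.IsVercel (not shown in this diff);
// Vercel's runtime sets the VERCEL environment variable, which is assumed here.
func isVercel() bool {
	return strings.TrimSpace(os.Getenv("VERCEL")) != ""
}

// resolvePath is a stand-in for config.ResolvePath: explicit env value, else fallback.
func resolvePath(envKey, fallback string) string {
	if v := strings.TrimSpace(os.Getenv(envKey)); v != "" {
		return v
	}
	return fallback
}

// chatHistoryPath reproduces the decision order of the patched ChatHistoryPath.
func chatHistoryPath() string {
	if isVercel() && strings.TrimSpace(os.Getenv("DS2API_CHAT_HISTORY_PATH")) == "" {
		return "/tmp/chat_history.json" // only writable location on Vercel
	}
	return resolvePath("DS2API_CHAT_HISTORY_PATH", "data/chat_history.json")
}

func main() {
	fmt.Println(chatHistoryPath())
}
```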

View File

@@ -0,0 +1,58 @@
package ollama
import (
"ds2api/internal/config"
"ds2api/internal/util"
"encoding/json"
"github.com/go-chi/chi/v5"
"log/slog"
"net/http"
)
var WriteJSON = util.WriteJSON
type ConfigReader interface {
ModelAliases() map[string]string
}
type Handler struct {
Store ConfigReader
}
type OllamaModelRequest struct {
Model string `json:"model"`
}
func RegisterRoutes(r chi.Router, h *Handler) {
r.Get("/api/version", h.GetVersion)
r.Get("/api/tags", h.ListOllamaModels)
r.Post("/api/show", h.GetOllamaModel)
}
func (h *Handler) GetVersion(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(http.StatusOK)
_, _ = w.Write([]byte(`{"version":"0.23.1"}`))
}
func (h *Handler) ListOllamaModels(w http.ResponseWriter, r *http.Request) {
WriteJSON(w, http.StatusOK, config.OllamaModelsResponse())
}
func (h *Handler) GetOllamaModel(w http.ResponseWriter, r *http.Request) {
var payload OllamaModelRequest
if err := json.NewDecoder(r.Body).Decode(&payload); err != nil {
http.Error(w, "Invalid JSON body: "+err.Error(), http.StatusBadRequest)
return
}
defer func() {
if err := r.Body.Close(); err != nil {
slog.Warn("[ollama] failed to close request body", "error", err)
}
}()
modelID := payload.Model
model, ok := config.OllamaModelByID(h.Store, modelID)
if !ok {
http.Error(w, "Model not found.", http.StatusNotFound)
return
}
WriteJSON(w, http.StatusOK, model)
}

View File

@@ -0,0 +1,127 @@
package ollama
import (
"encoding/json"
"github.com/go-chi/chi/v5"
"net/http"
"net/http/httptest"
"strings"
"testing"
)
type ollamaTestSurface struct {
Store ConfigReader
handler *Handler
}
func (h *ollamaTestSurface) apiHandler() *Handler {
if h.handler == nil {
h.handler = &Handler{Store: h.Store}
}
return h.handler
}
func registerOllamaTestRoutes(r chi.Router, h *ollamaTestSurface) {
r.Get("/api/version", h.apiHandler().GetVersion)
r.Get("/api/tags", h.apiHandler().ListOllamaModels)
r.Post("/api/show", h.apiHandler().GetOllamaModel)
}
func TestGetOllamaVersionRoute(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
req := httptest.NewRequest(http.MethodGet, "/api/version", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
}
func TestGetOllamaModelsRoute(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
req := httptest.NewRequest(http.MethodGet, "/api/tags", nil)
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
}
func TestGetOllamaModelRoute(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
t.Run("direct", func(t *testing.T) {
body := `{"model":"deepseek-v4-flash"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
var payload map[string]any
if err := json.Unmarshal(rec.Body.Bytes(), &payload); err != nil {
t.Fatalf("expected valid json body, got err=%v body=%s", err, rec.Body.String())
}
if _, ok := payload["id"]; !ok {
t.Fatalf("expected response has lowercase id field, body=%s", rec.Body.String())
}
if _, ok := payload["ID"]; ok {
t.Fatalf("expected response does not expose uppercase ID field, body=%s", rec.Body.String())
}
})
t.Run("direct_nothinking", func(t *testing.T) {
body := `{"model":"deepseek-v4-flash-nothinking"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("direct_expert", func(t *testing.T) {
body := `{"model":"deepseek-v4-pro"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
t.Run("direct_vision", func(t *testing.T) {
body := `{"model":"deepseek-v4-vision"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusOK {
t.Fatalf("expected 200, got %d body=%s", rec.Code, rec.Body.String())
}
})
}
func TestGetOllamaModelRouteNotFound(t *testing.T) {
h := &ollamaTestSurface{}
r := chi.NewRouter()
registerOllamaTestRoutes(r, h)
body := `{"model":"not-exists"}`
req := httptest.NewRequest(http.MethodPost, "/api/show", strings.NewReader(body))
req.Header.Set("Content-Type", "application/json")
rec := httptest.NewRecorder()
r.ServeHTTP(rec, req)
if rec.Code != http.StatusNotFound {
t.Fatalf("expected 404, got %d body=%s", rec.Code, rec.Body.String())
}
}

View File

@@ -127,13 +127,7 @@ func (s *chatStreamRuntime) sendKeepAlive() {
return
}
_, _ = s.w.Write([]byte(": keep-alive\n\n"))
s.sendChunk(openaifmt.BuildChatStreamChunk(
s.completionID,
s.created,
s.model,
[]map[string]any{},
nil,
))
_ = s.rc.Flush()
}
func (s *chatStreamRuntime) sendChunk(v any) {

View File

@@ -10,7 +10,7 @@ import (
"ds2api/internal/promptcompat"
)
func TestChatStreamKeepAliveEmitsEmptyChoiceDataFrame(t *testing.T) {
func TestChatStreamKeepAliveUsesCommentOnly(t *testing.T) {
rec := httptest.NewRecorder()
runtime := newChatStreamRuntime(
rec,
@@ -40,18 +40,8 @@ func TestChatStreamKeepAliveEmitsEmptyChoiceDataFrame(t *testing.T) {
if done {
t.Fatalf("keep-alive must not emit [DONE], body=%q", body)
}
if len(frames) != 1 {
t.Fatalf("expected one data frame, got %d body=%q", len(frames), body)
}
if got := asString(frames[0]["id"]); got != "chatcmpl-test" {
t.Fatalf("expected completion id to be preserved, got %q", got)
}
if got := asString(frames[0]["object"]); got != "chat.completion.chunk" {
t.Fatalf("expected chat chunk object, got %q", got)
}
choices, _ := frames[0]["choices"].([]any)
if len(choices) != 0 {
t.Fatalf("expected empty choices heartbeat, got %#v", choices)
if len(frames) != 0 {
t.Fatalf("keep-alive must not emit JSON data frames, got %#v body=%q", frames, body)
}
}
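The updated test relies on a property of the SSE format: any line beginning with `:` is a comment that clients must ignore, so `: keep-alive\n\n` keeps the connection warm without emitting a chunk that strict OpenAI clients would try to parse. A minimal client-side sketch of that rule (illustrative, not code from this repository):
```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// readSSE returns only the data payloads from a text/event-stream body,
// dropping comment lines such as the ": keep-alive" heartbeat.
func readSSE(stream string) []string {
	var payloads []string
	sc := bufio.NewScanner(strings.NewReader(stream))
	for sc.Scan() {
		line := sc.Text()
		switch {
		case strings.HasPrefix(line, ":"):
			// SSE comment: ignored by spec-compliant clients.
		case strings.HasPrefix(line, "data: "):
			payloads = append(payloads, strings.TrimPrefix(line, "data: "))
		}
	}
	return payloads
}

func main() {
	body := ": keep-alive\n\ndata: {\"id\":\"chatcmpl-test\"}\n\ndata: [DONE]\n\n"
	fmt.Println(readSSE(body)) // [{"id":"chatcmpl-test"} [DONE]]
}
```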

View File

@@ -22,6 +22,7 @@ import (
"ds2api/internal/httpapi/admin"
"ds2api/internal/httpapi/claude"
"ds2api/internal/httpapi/gemini"
"ds2api/internal/httpapi/ollama"
"ds2api/internal/httpapi/openai/chat"
"ds2api/internal/httpapi/openai/embeddings"
"ds2api/internal/httpapi/openai/files"
@@ -68,6 +69,7 @@ func NewApp() (*App, error) {
claudeHandler := &claude.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
geminiHandler := &gemini.Handler{Store: store, Auth: resolver, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
adminHandler := &admin.Handler{Store: store, Pool: pool, DS: dsClient, OpenAI: chatHandler, ChatHistory: chatHistoryStore}
ollamaHandler := &ollama.Handler{Store: store}
webuiHandler := webui.NewHandler()
r := chi.NewRouter()
@@ -112,6 +114,7 @@ func NewApp() (*App, error) {
r.Post("/embeddings", embeddingsHandler.Embeddings)
claude.RegisterRoutes(r, claudeHandler)
gemini.RegisterRoutes(r, geminiHandler)
ollama.RegisterRoutes(r, ollamaHandler)
r.Route("/admin", func(ar chi.Router) {
admin.RegisterRoutes(ar, adminHandler)
})