LangBot/pkg/provider/modelmgr/requesters/modelscopechatcmpl.yaml
devin-ai-integration[bot] d2b93b3296 feat: add embeddings model management (#1461)
* feat: add embeddings model management backend support

Co-Authored-By: Junyan Qin <Chin> <rockchinq@gmail.com>

* feat: add embeddings model management frontend support

Co-Authored-By: Junyan Qin <Chin> <rockchinq@gmail.com>

* chore: revert HttpClient URL to production setting

Co-Authored-By: Junyan Qin <Chin> <rockchinq@gmail.com>

* refactor: integrate embeddings models into models page with tabs

Co-Authored-By: Junyan Qin <Chin> <rockchinq@gmail.com>

* perf: move files

* perf: remove `s`

* feat: allow requester to declare supported types in manifest

* feat(embedding): delete dimension and encoding format

* feat: add extra_args for embedding moels

* perf: i18n ref

* fix: linter err

* fix: lint err

* fix: linter err

---------

Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com>
Co-authored-by: Junyan Qin <Chin> <rockchinq@gmail.com>
2025-07-05 20:07:15 +08:00


apiVersion: v1
kind: LLMAPIRequester
metadata:
  name: modelscope-chat-completions
  label:
    en_US: ModelScope
    zh_Hans: 魔搭社区
  icon: modelscope.svg
spec:
  config:
    - name: base_url
      label:
        en_US: Base URL
        zh_Hans: 基础 URL
      type: string
      required: true
      default: "https://api-inference.modelscope.cn/v1"
    - name: args
      label:
        en_US: Args
        zh_Hans: 附加参数
      type: object
      required: true
      default: {}
    - name: timeout
      label:
        en_US: Timeout
        zh_Hans: 超时时间
      type: int
      required: true
      default: 120
  support_type:
    - llm
execution:
  python:
    path: ./modelscopechatcmpl.py
    attr: ModelScopeChatCompletions
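
The manifest above only declares the requester's configuration schema and points execution at modelscopechatcmpl.py. As a rough illustration of how the three declared fields (base_url, args, timeout) map onto a call against ModelScope's OpenAI-compatible inference endpoint, here is a minimal standalone sketch using the openai Python client. The model ID and API key are placeholders, and this is not LangBot's actual ModelScopeChatCompletions implementation.

# Illustrative sketch only -- not LangBot's ModelScopeChatCompletions class.
# It shows how the manifest's base_url / timeout / args fields could be used
# against the OpenAI-compatible ModelScope inference endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api-inference.modelscope.cn/v1",  # manifest default: base_url
    api_key="YOUR_MODELSCOPE_API_KEY",                   # placeholder credential
    timeout=120,                                         # manifest default: timeout
)

extra_args = {}  # corresponds to the manifest's `args` object

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",  # placeholder model ID
    messages=[{"role": "user", "content": "Hello"}],
    **extra_args,
)
print(response.choices[0].message.content)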