mirror of
https://github.com/langbot-app/LangBot.git
synced 2025-11-25 11:29:39 +08:00
Compare commits
5 Commits
| Author | SHA1 | Date |
| --- | --- | --- |
| | dbe6272bd8 | |
| | eceaf85807 | |
| | d0606b79b0 | |
| | 412f290606 | |
| | 21e1acc4f5 | |
.github/ISSUE_TEMPLATE/bug-report.yml (vendored, 5 changes)
@@ -11,9 +11,12 @@ body:
- Other (or not yet in use)
- Nakuru (go-cqhttp)
- aiocqhttp (connected via the OneBot protocol)
- qq-botpy (QQ official API)
- qq-botpy (QQ official API, WebSocket)
- qqofficial (QQ official API, Webhook)
- lark (Feishu)
- wecom (WeCom)
- gewechat (personal WeChat)
- discord
validations:
  required: true
- type: input
@@ -26,11 +26,14 @@
[](https://github.com/RockChinQ/LangBot/releases/latest)

<img src="https://img.shields.io/badge/python-3.10 | 3.11 | 3.12-blue.svg" alt="python">

[简体中文](README.md) / [English](README_EN.md)

</div>

</p>

## ✨ Features

- 💬 LLM chat and Agent: supports multiple LLMs and adapts to both group and private chats; provides multi-turn conversation, tool calling, and multimodal capabilities, with deep [Dify](https://dify.ai) integration. Currently supports QQ, QQ Channel, WeCom, Lark, Discord, and personal WeChat; WhatsApp, Telegram, and other platforms are planned.
- 🛠️ High stability, feature-complete: native support for access control, rate limiting, sensitive-word filtering, and similar mechanisms; simple to configure, with multiple deployment methods supported.
README_EN.md (new file, 123 lines)

@@ -0,0 +1,123 @@
<p align="center">
<a href="https://langbot.app">
<img src="https://docs.langbot.app/social.png" alt="LangBot"/>
</a>

<div align="center">

<a href="https://trendshift.io/repositories/6187" target="_blank"><img src="https://trendshift.io/api/badge/repositories/6187" alt="RockChinQ%2FQChatGPT | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>

<a href="https://docs.langbot.app">Home</a> |
<a href="https://docs.langbot.app/insight/intro.html">Features</a> |
<a href="https://docs.langbot.app/insight/guide.html">Deployment</a> |
<a href="https://docs.langbot.app/usage/faq.html">FAQ</a> |
<a href="https://docs.langbot.app/plugin/plugin-intro.html">Plugin</a> |
<a href="https://github.com/RockChinQ/LangBot/issues/new?assignees=&labels=%E7%8B%AC%E7%AB%8B%E6%8F%92%E4%BB%B6&projects=&template=submit-plugin.yml&title=%5BPlugin%5D%3A+%E8%AF%B7%E6%B1%82%E7%99%BB%E8%AE%B0%E6%96%B0%E6%8F%92%E4%BB%B6">Submit Plugin</a>

<div align="center">
😎 High Stability, 🧩 Extension Support, 🦄 Multi-modal - LLM-native Instant Messaging Bot Platform 🤖
</div>

<br/>

[](https://github.com/RockChinQ/LangBot/releases/latest)
<img src="https://img.shields.io/badge/python-3.10 | 3.11 | 3.12-blue.svg" alt="python">

[简体中文](README.md) / [English](README_EN.md)

</div>

</p>
## ✨ Features

- 💬 Chat with LLM / Agent: supports multiple LLMs and adapts to both group chats and private chats; provides multi-round conversations, tool calls, and multi-modal capabilities, and deeply integrates with [Dify](https://dify.ai). Currently supports QQ, QQ Channel, WeCom, Lark, Discord, and personal WeChat; WhatsApp, Telegram, and other platforms will be supported in the future.
- 🛠️ High stability, feature-rich: native access control, rate limiting, sensitive-word filtering, and similar mechanisms; easy to use, with multiple deployment methods supported.
- 🧩 Plugin extensions, active community: supports event-driven and component-extension plugin mechanisms; a rich ecosystem with dozens of [plugins](https://docs.langbot.app/plugin/plugin-intro.html) available.
- 😻 [New] Web UI: manage your LangBot instance through the browser; see the [documentation](https://docs.langbot.app/webui/intro.html) for details.
## 📦 Getting Started

> [!IMPORTANT]
>
> - Before deploying in any way, please read the [New User Guide](https://docs.langbot.app/insight/guide.html).
> - All documentation is currently in Chinese; an i18n version will be provided in the near future.

#### Docker Compose Deployment

Suitable for users familiar with Docker; see the [Docker Deployment](https://docs.langbot.app/deploy/langbot/docker.html) documentation.

#### One-click Deployment on BTPanel

LangBot is listed on BTPanel; if you have BTPanel installed, follow the [documentation](https://docs.langbot.app/deploy/langbot/one-click/bt.html) to use it.

#### Zeabur Cloud Deployment

A community-contributed Zeabur template.

[](https://zeabur.com/zh-CN/templates/ZKTBDH)

#### Railway Cloud Deployment

[](https://railway.app/template/yRrAyL?referralCode=vogKPF)

#### Other Deployment Methods

Run the released version directly; see the [Manual Deployment](https://docs.langbot.app/deploy/langbot/manual.html) documentation.
## 📸 Demo

<img alt="Reply Effect (with Internet Plugin)" src="https://docs.langbot.app/QChatGPT-0516.png" width="500px"/>

- WebUI demo: https://demo.langbot.dev/
  - Login information: email `demo@langbot.app`, password `langbot123456`
  - Note: this demo only shows the WebUI; please do not enter any sensitive information in this public environment.
## 🔌 Component Compatibility

### Message Platform

| Platform | Status | Remarks |
| --- | --- | --- |
| Personal QQ | ✅ | |
| QQ Official API | ✅ | |
| WeCom | ✅ | |
| Lark | ✅ | |
| Discord | ✅ | |
| Personal WeChat | ✅ | Accessed via [Gewechat](https://github.com/Devo919/Gewechat) |
| Telegram | 🚧 | |
| WhatsApp | 🚧 | |
| DingTalk | 🚧 | |

🚧: in development
### LLMs

| LLM | Status | Remarks |
| --- | --- | --- |
| [OpenAI](https://platform.openai.com/) | ✅ | Works with any model exposing the OpenAI interface format |
| [DeepSeek](https://www.deepseek.com/) | ✅ | |
| [Moonshot](https://www.moonshot.cn/) | ✅ | |
| [Anthropic](https://www.anthropic.com/) | ✅ | |
| [xAI](https://x.ai/) | ✅ | |
| [Zhipu AI](https://open.bigmodel.cn/) | ✅ | |
| [Dify](https://dify.ai) | ✅ | LLMOps platform |
| [Ollama](https://ollama.com/) | ✅ | Local LLM runtime |
| [LMStudio](https://lmstudio.ai/) | ✅ | Local LLM runtime |
| [GiteeAI](https://ai.gitee.com/) | ✅ | LLM gateway (MaaS) |
| [SiliconFlow](https://siliconflow.cn/) | ✅ | LLM gateway (MaaS) |
| [Aliyun Bailian](https://bailian.console.aliyun.com/) | ✅ | LLM gateway (MaaS) |
## 🤝 Community Contribution

Thanks to the following contributors and everyone in the community for their contributions.

<a href="https://github.com/RockChinQ/LangBot/graphs/contributors">
<img src="https://contrib.rocks/image?repo=RockChinQ/LangBot" />
</a>
pkg/core/notes/n003_print_version.py (new file, 21 lines)

@@ -0,0 +1,21 @@
from __future__ import annotations

import typing
import os
import sys
import logging

from .. import note, app


@note.note_class("PrintVersion", 3)
class PrintVersion(note.LaunchNote):
    """Print version information."""

    async def need_show(self) -> bool:
        return True

    async def yield_note(self) -> typing.AsyncGenerator[typing.Tuple[str, int], None]:
        yield f"当前版本:{self.ap.ver_mgr.get_current_version()}", logging.INFO
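The new launch note follows a small async-generator protocol: `need_show` gates whether the note is displayed, and `yield_note` streams `(message, log level)` pairs. A minimal self-contained sketch of that pattern (the `LaunchNote` base class and version string below are simplified stand-ins, not the real `pkg.core.note` module):

```python
import asyncio
import logging
import typing


class LaunchNote:
    """Simplified stand-in for LangBot's note.LaunchNote base class."""

    async def need_show(self) -> bool:
        return True

    async def yield_note(self) -> typing.AsyncGenerator[typing.Tuple[str, int], None]:
        raise NotImplementedError
        yield  # makes this an async generator function

class PrintVersion(LaunchNote):
    """Mirrors the diff's PrintVersion: yields (message, log level) pairs."""

    async def yield_note(self):
        yield "Current version: v3.4.6.1", logging.INFO


async def show_notes(notes):
    # Collect (message, level) pairs from every note that wants to be shown.
    shown = []
    for n in notes:
        if await n.need_show():
            async for msg, level in n.yield_note():
                shown.append((msg, level))
    return shown


print(asyncio.run(show_notes([PrintVersion()])))
```

A stage like `ShowNotesStage` would iterate registered notes this way and forward each pair to the logger at the yielded level.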
@@ -1,7 +1,7 @@
from __future__ import annotations

from .. import stage, app, note
from ..notes import n001_classic_msgs, n002_selection_mode_on_windows
from ..notes import n001_classic_msgs, n002_selection_mode_on_windows, n003_print_version


@stage.stage_class("ShowNotesStage")
@@ -102,7 +102,7 @@ class ResponseWrapper(stage.PipelineStage):
            new_query=query
        )

        if result.tool_calls is not None:  # has function calls
        if result.tool_calls is not None and len(result.tool_calls) > 0:  # has function calls

            function_names = [tc.function.name for tc in result.tool_calls]
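The one-line change above guards against providers that return an empty `tool_calls` list rather than `None`. A minimal illustration of why the extra length check matters (the `Result` class here is a hypothetical stand-in, not LangBot's actual result type):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Result:
    # Hypothetical stand-in for the pipeline's result object.
    tool_calls: Optional[list] = None


def has_function_calls(result: Result) -> bool:
    # Old check (`is not None` alone) wrongly treated [] as "has calls";
    # the new check also requires the list to be non-empty.
    return result.tool_calls is not None and len(result.tool_calls) > 0


print(has_function_calls(Result(tool_calls=None)))   # no calls
print(has_function_calls(Result(tool_calls=[])))     # empty list: also no calls
print(has_function_calls(Result(tool_calls=["tc"])))
```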
@@ -8,6 +8,7 @@ from typing import AsyncGenerator

import openai
import openai.types.chat.chat_completion as chat_completion
import openai.types.chat.chat_completion_message_tool_call as chat_completion_message_tool_call
import httpx
import aiohttp
import async_lru
@@ -40,6 +41,7 @@ class OpenAIChatCompletions(requester.LLMAPIRequester):
    timeout=self.requester_cfg['timeout'],
    http_client=httpx.AsyncClient(
        trust_env=True,
        timeout=self.requester_cfg['timeout']
    )
)
@@ -47,7 +49,67 @@ class OpenAIChatCompletions(requester.LLMAPIRequester):
        self,
        args: dict,
    ) -> chat_completion.ChatCompletion:
        return await self.client.chat.completions.create(**args)
        args["stream"] = True

        chunk = None
        pending_content = ""
        tool_calls = []

        resp_gen: openai.AsyncStream = await self.client.chat.completions.create(**args)

        async for chunk in resp_gen:
            # print(chunk)
            if not chunk:
                continue

            if chunk.choices[0].delta.content is not None:
                pending_content += chunk.choices[0].delta.content

            if chunk.choices[0].delta.tool_calls is not None:
                for tool_call in chunk.choices[0].delta.tool_calls:
                    for tc in tool_calls:
                        if tc.index == tool_call.index:
                            tc.function.arguments += tool_call.function.arguments
                            break
                    else:
                        tool_calls.append(tool_call)

        real_tool_calls = []

        for tc in tool_calls:
            function = chat_completion_message_tool_call.Function(
                name=tc.function.name,
                arguments=tc.function.arguments
            )
            real_tool_calls.append(chat_completion_message_tool_call.ChatCompletionMessageToolCall(
                id=tc.id,
                function=function,
                type="function"
            ))

        return chat_completion.ChatCompletion(
            id=chunk.id,
            object="chat.completion",
            created=chunk.created,
            choices=[
                chat_completion.Choice(
                    index=0,
                    message=chat_completion.ChatCompletionMessage(
                        role="assistant",
                        content=pending_content,
                        tool_calls=real_tool_calls if len(real_tool_calls) > 0 else None
                    ),
                    finish_reason=chunk.choices[0].finish_reason,
                    logprobs=chunk.choices[0].logprobs,
                )
            ],
            model=args["model"],
            service_tier=chunk.service_tier,
            system_fingerprint=chunk.system_fingerprint,
            usage=chunk.usage
        ) if chunk else None

    async def _make_msg(
        self,
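The streaming logic above accumulates `delta.content` text and merges `delta.tool_calls` fragments by `index`, since each streamed chunk may carry only a slice of a tool call's arguments. A simplified, dependency-free sketch of that merge (plain dicts stand in for the OpenAI SDK's chunk objects):

```python
def merge_stream(chunks):
    """Merge streamed deltas into final content plus completed tool calls."""
    content = ""
    tool_calls = []  # each: {"index": int, "name": str, "arguments": str}
    for chunk in chunks:
        delta = chunk["delta"]
        if delta.get("content") is not None:
            content += delta["content"]
        for frag in delta.get("tool_calls", []):
            for existing in tool_calls:
                if existing["index"] == frag["index"]:
                    # Later fragments append to the same call's arguments.
                    existing["arguments"] += frag["arguments"]
                    break
            else:
                tool_calls.append(dict(frag))
    return content, tool_calls


chunks = [
    {"delta": {"content": "Hello "}},
    {"delta": {"content": "world"}},
    {"delta": {"tool_calls": [{"index": 0, "name": "get_weather", "arguments": '{"city":'}]}},
    {"delta": {"tool_calls": [{"index": 0, "name": "get_weather", "arguments": ' "Paris"}'}]}},
]
print(merge_stream(chunks))
```

The diff's implementation then rebuilds a non-streaming `ChatCompletion` object from the merged pieces so downstream code is unchanged.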
@@ -46,6 +46,9 @@ class DeepseekChatCompletions(chatcmpl.OpenAIChatCompletions):
        # send the request
        resp = await self._req(args)

        if resp is None:
            raise errors.RequesterError('接口返回为空,请确定模型提供商服务是否正常')

        # process the result
        message = await self._make_msg(resp)
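The added guard turns a silent `None` response into an explicit error before `_make_msg` dereferences it. The pattern in isolation (the `RequesterError` class and `make_msg` helper below are hypothetical stand-ins; the message paraphrases the diff's Chinese error string in English):

```python
class RequesterError(Exception):
    """Raised when the upstream LLM provider misbehaves."""


def make_msg(resp):
    # Fail fast with a clear message instead of an AttributeError later.
    if resp is None:
        raise RequesterError(
            "Empty response from the API; check that the model provider's service is up"
        )
    return resp["message"]


print(make_msg({"message": "ok"}))
```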
@@ -1,4 +1,4 @@
semantic_version = "v3.4.6"
semantic_version = "v3.4.6.1"

debug_mode = False