Compare commits

...

31 Commits

Author SHA1 Message Date
Junyan Qin
b7c4c21796 feat: add message_chain field to *NormalMessageReceived events 2025-11-22 14:59:12 +08:00
Copilot
66602da9cb feat: add model_config parameter support for Dify assistant type apps (#1796)
* Initial plan

* feat: add model_config parameter support for Dify assistant type

- Add model_config parameter to AsyncDifyServiceClient.chat_messages method
- Add _get_model_config helper method to DifyServiceAPIRunner
- Pass model_config from pipeline configuration to all chat_messages calls
- Add model-config configuration field to dify-service-api schema in ai.yaml
- Support optional model configuration for assistant type apps in open-source Dify
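A minimal sketch of how such a pass-through could look. The names `AsyncDifyServiceClient.chat_messages` and the model-config lookup follow the commit description above, but the payload shape, config layout, and helper below are illustrative assumptions, not LangBot's actual implementation:

```python
# Illustrative sketch only: names follow the commit description; the payload
# shape and config layout are assumptions, not LangBot's real code.
import typing


class AsyncDifyServiceClient:
    """Hypothetical minimal Dify service-API client."""

    async def chat_messages(
        self,
        inputs: dict,
        query: str,
        user: str,
        model_config: typing.Optional[dict] = None,  # new optional parameter
    ) -> dict:
        payload: dict = {'inputs': inputs, 'query': query, 'user': user}
        if model_config is not None:
            # Only assistant-type apps on open-source Dify honor this field.
            payload['model_config'] = model_config
        # A real client would POST `payload` to the Dify API here.
        return payload


def get_model_config(pipeline_config: dict) -> typing.Optional[dict]:
    """Read the optional model-config field from the dify-service-api section."""
    return pipeline_config.get('dify-service-api', {}).get('model-config')
```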

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* refactor: improve model_config implementation based on code review

- Simplify _get_model_config method logic
- Add more descriptive comment about model_config usage
- Clarify when model_config is used (assistant type apps)

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* feat: only modify client.py

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>
Co-authored-by: Junyan Qin <rockchinq@gmail.com>
2025-11-22 14:38:26 +08:00
wrj97
31b483509c fix: fix n8n streaming support issue (#1787)
* fix: fix n8n streaming support issue

Add streaming support detection and proper message type handling for
n8n service API runner. Previously, when streaming was enabled, n8n
integration would fail due to incorrect message type usage.

1. Added streaming capability detection by checking adapter's
is_stream_output_supported method
2. Implemented conditional message generation using MessageChunk for
streaming mode and Message for non-streaming mode
3. Added proper error handling for adapters that don't support streaming
detection
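A minimal sketch of the detection and message-type switch described above. The probe name `is_stream_output_supported` comes from the commit text; the `Message`/`MessageChunk` shapes here are simplified stand-ins for the real types:

```python
# Illustrative sketch only: simplified stand-ins for the real message types.
import dataclasses
import typing


@dataclasses.dataclass
class Message:
    content: str


@dataclasses.dataclass
class MessageChunk:
    content: str
    is_final: bool = False


def adapter_supports_streaming(adapter: object) -> bool:
    """Probe the adapter, tolerating adapters without the detection method."""
    try:
        return bool(adapter.is_stream_output_supported())  # type: ignore[attr-defined]
    except AttributeError:
        # Adapter does not expose the probe; fall back to non-streaming.
        return False


def wrap_reply(text: str, streaming: bool) -> typing.Union[Message, MessageChunk]:
    """Use MessageChunk in streaming mode and Message otherwise."""
    return MessageChunk(content=text) if streaming else Message(content=text)
```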

* fix: add n8n webhook streaming model; optimize the streaming output when calling n8n.

---------

Co-authored-by: Dong_master <2213070223@qq.com>
2025-11-22 14:17:46 +08:00
Junyan Qin
ba7cf69c9d doc: update READMEs 2025-11-22 00:20:39 +08:00
Junyan Qin
37296be67e feat: refactor plugin market interaction and migrate to LangBot Space
- Replace plugin detail dialog with hover buttons interaction
- Add "Install" and "View Details" hover buttons on plugin cards
- Remove PluginDetailDialog component
- Update plugin marketplace URL format to /market/{author}/{plugin}
- Redirect all plugin detail views to LangBot Space
- Add i18n support for 4 languages (zh-Hans, en-US, zh-Hant, ja-JP)
- Optimize hover overlay styles for light/dark theme

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-21 14:23:22 +08:00
Junyan Qin
6c03a1dd31 perf: add support for showing multilingual plugin READMEs 2025-11-21 12:14:04 +08:00
Junyan Qin
b75ec9e989 doc: add product hunt badges 2025-11-21 10:31:57 +08:00
Copilot
5c8523e4ef docs: Add multilingual README files (Spanish, French, Korean, Russian, Vietnamese) (#1794)
* Initial plan

* Add multilingual README files (ES, FR, KO, RU, VI)

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>
2025-11-21 00:36:35 +08:00
Junyan Qin
9802a42a9e perf: add request plugin button to marketplace 2025-11-20 23:41:45 +08:00
Junyan Qin
99e3abec72 chore: bump version 4.5.4 2025-11-20 23:19:37 +08:00
Junyan Qin
fc2efdf994 chore: bump langbot-plugin 0.1.12 2025-11-20 21:51:44 +08:00
Junyan Qin
6ed672d996 perf: tips msg for tool call 2025-11-20 21:45:22 +08:00
Junyan Qin
2bf593fa6b feat: pass session and query_id to tool call 2025-11-20 21:17:47 +08:00
Junyan Qin
3182214663 fix: linter errors 2025-11-20 19:48:34 +08:00
Junyan Qin
20614b20b7 feat: add component filter to marketplace page 2025-11-20 19:46:33 +08:00
Junyan Qin
da323817f7 feat: add plugin components displaying in marketplace page 2025-11-20 18:50:00 +08:00
Junyan Qin
763c1a885c perf: url display in webhook dialog 2025-11-20 16:48:06 +08:00
Junyan Qin
dbc09f46f4 perf: provider icon rounded in hovercard 2025-11-20 10:25:29 +08:00
Junyan Qin
cf43f09aff perf: auto refresh logic in market 2025-11-20 10:18:28 +08:00
Copilot
c3c51b0fbf perf: Add "Select All" checkbox to Plugin and MCP Server selection dialogs (#1790)
* Initial plan

* Add "Select All" checkbox to Plugin and MCP Server selection dialogs

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* Make "Select All" text clickable by adding onClick handler to container

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>
2025-11-18 17:00:05 +08:00
Duke
8a42daa63f Fix WeCom image message send failure issue (#1789)
* Fix wecom image upload issue

* Fix log
2025-11-18 16:02:13 +08:00
Junyan Qin
d91d98c9d4 chore: bump version 4.5.3 2025-11-18 11:31:28 +08:00
Copilot
2e82f2b2d1 fix: plugin pages scroll entire viewport instead of content area only (#1788)
* Initial plan

* Fix scroll behavior in plugin pages - only content areas scroll now

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>
2025-11-18 11:16:41 +08:00
Junyan Qin
f459c7017a chore: update pr template 2025-11-17 16:02:39 +08:00
Copilot
c27ccb8475 feat(web): Add centered empty state messages to pipeline extension dialogs (#1784)
* Initial plan

* feat: add empty state messages in pipeline extension dialogs

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* fix: center empty state messages in dialog content area

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>
2025-11-16 23:37:40 +08:00
Copilot
abb2f7ae05 feat(web): Move Get Help button to account menu (#1782)
* Initial plan

* feat: Move Get Help button to account options menu

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>
2025-11-16 22:44:46 +08:00
Junyan Qin
80606ed32c docs: update README_JP 2025-11-16 20:44:33 +08:00
Junyan Qin
bc7c5fa864 chore: push first pypi package 2025-11-16 20:25:48 +08:00
Junyan Qin
ed0ea68037 doc: add uv link to READMEs 2025-11-16 20:04:34 +08:00
Junyan Qin
6ac4dbc011 doc: update README 2025-11-16 20:00:43 +08:00
Copilot
e642ffa5b3 chore: Add PyPI package support for uvx/pip installation (#1764)
* Initial plan

* Add package structure and resource path utilities

- Created langbot/ package with __init__.py and __main__.py entry point
- Added paths utility to find frontend and resource files from package installation
- Updated config loading to use resource paths
- Updated frontend serving to use resource paths
- Added MANIFEST.in for package data inclusion
- Updated pyproject.toml with build system and entry points
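A rough sketch of what a resource-path helper along these lines might look like; the directory names and fallback logic are assumptions for illustration, not the actual langbot paths utility:

```python
# Illustrative sketch only: directory names and fallback logic are assumptions.
import importlib.util
import pathlib


def get_resource_dir(*parts: str) -> pathlib.Path:
    """Resolve a data directory bundled with the installed langbot package,
    falling back to the source-checkout layout when the package is absent."""
    spec = importlib.util.find_spec('langbot')
    if spec is not None and spec.origin is not None:
        pkg_root = pathlib.Path(spec.origin).parent  # installed package directory
    else:
        pkg_root = pathlib.Path(__file__).resolve().parent  # source checkout fallback
    return pkg_root.joinpath(*parts)


# e.g. where the pre-built frontend would live under this layout
frontend_out = get_resource_dir('web', 'out')
```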

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* Add PyPI publishing workflow and update license

- Created GitHub Actions workflow to build frontend and publish to PyPI
- Added license field to pyproject.toml to fix deprecation warning
- Updated .gitignore to exclude build artifacts
- Tested package building successfully

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* Add PyPI installation documentation

- Created PYPI_INSTALLATION.md with detailed installation and usage instructions
- Updated README.md to feature uvx/pip installation as recommended method
- Updated README_EN.md with same changes for English documentation

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* Address code review feedback

- Made package-data configuration more specific to langbot package only
- Improved path detection with caching to avoid repeated file I/O
- Removed sys.path searching which was incorrect for package data
- Removed interactive input() call for non-interactive environment compatibility
- Simplified error messages for version check
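The caching point above could look something like the following sketch; the candidate directories are hypothetical, while the pattern (compute once, reuse the cached result) is what the commit describes:

```python
# Illustrative sketch of cached path detection; candidate paths are hypothetical.
import functools
import pathlib
import typing


@functools.lru_cache(maxsize=1)
def find_frontend_dir() -> typing.Optional[pathlib.Path]:
    """Locate the built frontend once; later calls reuse the cached result."""
    candidates = (
        pathlib.Path(__file__).resolve().parent / 'web' / 'out',  # packaged layout
        pathlib.Path.cwd() / 'web' / 'out',                       # working directory
    )
    for candidate in candidates:
        if candidate.is_dir():  # the only file I/O, performed at most once
            return candidate
    return None
```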

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* Fix code review issues

- Use specific exception types instead of bare except
- Fix misleading comments about directory levels
- Remove redundant existence check before makedirs with exist_ok=True
- Use context manager for file opening to ensure proper cleanup
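A generic illustration of the last two review points (specific exception types and a context manager for file handles); the file format and function are invented for the example and are not LangBot code:

```python
# Illustrative example of the review points above; not actual LangBot code.
import json
import pathlib
import typing


def read_version_field(path: pathlib.Path) -> typing.Optional[str]:
    """Read a small JSON metadata file defensively."""
    try:
        # The context manager closes the handle even if parsing fails.
        with path.open('r', encoding='utf-8') as f:
            return json.load(f).get('version')
    except (FileNotFoundError, json.JSONDecodeError):
        # Catch only the failures we expect instead of a bare `except`.
        return None
```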

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* Simplify package configuration and document behavioral differences

- Removed redundant package-data configuration, relying on MANIFEST.in
- Added documentation about behavioral differences between package and source installation
- Clarified that include-package-data=true uses MANIFEST.in for data files

Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>

* chore: update pyproject.toml

* chore: try pack templates in langbot/

* chore: update

* chore: update

* chore: update

* chore: update

* chore: update

* chore: adjust dir structure

* chore: fix imports

* fix: read default-pipeline-config.json

* fix: read default-pipeline-config.json

* fix: tests

* ci: publish pypi

* chore: bump version 4.6.0-beta.1 for testing

* chore: add templates/**

* fix: send adapters and requesters icons

* chore: bump version 4.6.0b2 for testing

* chore: add platform field for docker-compose.yaml

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: RockChinQ <45992437+RockChinQ@users.noreply.github.com>
Co-authored-by: Junyan Qin <rockchinq@gmail.com>
2025-11-16 19:53:01 +08:00
509 changed files with 2624 additions and 1823 deletions


@@ -2,6 +2,17 @@
> 请在此部分填写你实现/解决/优化的内容:
> Summary of what you implemented/solved/optimized:
>
### 更改前后对比截图 / Screenshots
> 请在此部分粘贴更改前后对比截图(可以是界面截图、控制台输出、对话截图等):
> Please paste the screenshots of changes before and after here (can be interface screenshots, console output, conversation screenshots, etc.):
>
> 修改前 / Before:
>
> 修改后 / After:
>
## 检查清单 / Checklist

.github/workflows/publish-to-pypi.yml (new file, 46 lines added)

@@ -0,0 +1,46 @@
name: Build and Publish to PyPI

on:
  workflow_dispatch:
  release:
    types: [published]

jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      id-token: write  # Required for trusted publishing to PyPI
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          persist-credentials: false

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '22'

      - name: Build frontend
        run: |
          cd web
          npm install -g pnpm
          pnpm install
          pnpm build
          mkdir -p ../src/langbot/web/out
          cp -r out ../src/langbot/web/

      - name: Install the latest version of uv
        uses: astral-sh/setup-uv@v6
        with:
          version: "latest"

      - name: Build package
        run: |
          uv build

      - name: Publish to PyPI
        run: |
          uv publish --token ${{ secrets.PYPI_TOKEN }}

.gitignore (6 lines added)

@@ -47,3 +47,9 @@ uv.lock
plugins.bak
coverage.xml
.coverage
src/langbot/web/
# Build artifacts
/dist
/build
*.egg-info

README.md

@@ -8,7 +8,7 @@
<a href="https://hellogithub.com/repository/langbot-app/LangBot" target="_blank"><img src="https://abroad.hellogithub.com/v1/widgets/recommend.svg?rid=5ce8ae2aa4f74316bf393b57b952433c&claim_uid=gtmc6YWjMZkT21R" alt="FeaturedHelloGitHub" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / 简体中文 / [繁體中文](README_TW.md) / [日本語](README_JP.md) / (PR for your language)
[English](README_EN.md) / 简体中文 / [繁體中文](README_TW.md) / [日本語](README_JP.md) / [Español](README_ES.md) / [Français](README_FR.md) / [한국어](README_KO.md) / [Русский](README_RU.md) / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![QQ Group](https://img.shields.io/badge/%E7%A4%BE%E5%8C%BAQQ%E7%BE%A4-966235608-blue)](https://qm.qq.com/q/JLi38whHum)
@@ -31,6 +31,16 @@ LangBot 是一个开源的大语言模型原生即时通信机器人开发平台
## 📦 开始使用
#### 快速部署
使用 `uvx` 一键启动(需要先安装 [uv](https://docs.astral.sh/uv/getting-started/installation/)):
```bash
uvx langbot
```
访问 http://localhost:5300 即可开始使用。
#### Docker Compose 部署
```bash
@@ -73,10 +83,10 @@ docker compose up -d
## ✨ 特性
- 💬 大模型对话、Agent:支持多种大模型,适配群聊和私聊;具有多轮对话、工具调用、多模态、流式输出能力,自带 RAG(知识库)实现,并深度适配 [Dify](https://dify.ai)。
- 💬 大模型对话、Agent:支持多种大模型,适配群聊和私聊;具有多轮对话、工具调用、多模态、流式输出能力,自带 RAG(知识库)实现,并深度适配 [Dify](https://dify.ai)、[Coze](https://coze.com)、[n8n](https://n8n.io) 等 LLMOps 平台。
- 🤖 多平台支持:目前支持 QQ、QQ频道、企业微信、个人微信、飞书、Discord、Telegram 等平台。
- 🛠️ 高稳定性、功能完备:原生支持访问控制、限速、敏感词过滤等机制;配置简单,支持多种部署方式。支持多流水线配置,不同机器人用于不同应用场景。
- 🧩 插件扩展、活跃社区:支持事件驱动、组件扩展等插件机制;适配 Anthropic [MCP 协议](https://modelcontextprotocol.io/);目前已有数百个插件。
- 🧩 插件扩展、活跃社区:高稳定性、高安全性的生产级插件系统,支持事件驱动、组件扩展等插件机制;适配 Anthropic [MCP 协议](https://modelcontextprotocol.io/);目前已有数百个插件。
- 😻 Web 管理面板:支持通过浏览器管理 LangBot 实例,不再需要手动编写配置文件。
详细规格特性请访问[文档](https://docs.langbot.app/zh/insight/features.html)。

README_EN.md

@@ -5,7 +5,9 @@
<div align="center">
English / [简体中文](README.md) / [繁體中文](README_TW.md) / [日本語](README_JP.md) / (PR for your language)
<a href="https://www.producthunt.com/products/langbot?utm_source=badge-follow&utm_medium=badge&utm_source=badge-langbot" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/follow.svg?product_id=1077185&theme=light" alt="LangBot - Production&#0045;grade&#0032;IM&#0032;bot&#0032;made&#0032;easy&#0046; | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
English / [简体中文](README.md) / [繁體中文](README_TW.md) / [日本語](README_JP.md) / [Español](README_ES.md) / [Français](README_FR.md) / [한국어](README_KO.md) / [Русский](README_RU.md) / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/langbot-app/LangBot)
@@ -25,6 +27,16 @@ LangBot is an open-source LLM native instant messaging robot development platfor
## 📦 Getting Started
#### Quick Start
Use `uvx` to start with one command (need to install [uv](https://docs.astral.sh/uv/getting-started/installation/)):
```bash
uvx langbot
```
Visit http://localhost:5300 to start using it.
#### Docker Compose Deployment
```bash
@@ -67,10 +79,10 @@ Click the Star and Watch button in the upper right corner of the repository to g
## ✨ Features
- 💬 Chat with LLM / Agent: Supports multiple LLMs, adapt to group chats and private chats; Supports multi-round conversations, tool calls, multi-modal, and streaming output capabilities. Built-in RAG (knowledge base) implementation, and deeply integrates with [Dify](https://dify.ai).
- 💬 Chat with LLM / Agent: Supports multiple LLMs, adapt to group chats and private chats; Supports multi-round conversations, tool calls, multi-modal, and streaming output capabilities. Built-in RAG (knowledge base) implementation, and deeply integrates with [Dify](https://dify.ai), [Coze](https://coze.com), [n8n](https://n8n.io), and other LLMOps platforms.
- 🤖 Multi-platform Support: Currently supports QQ, QQ Channel, WeCom, personal WeChat, Lark, DingTalk, Discord, Telegram, etc.
- 🛠️ High Stability, Feature-rich: Native access control, rate limiting, sensitive word filtering, etc. mechanisms; Easy to use, supports multiple deployment methods. Supports multiple pipeline configurations, different bots can be used for different scenarios.
- 🧩 Plugin Extension, Active Community: Support event-driven, component extension, etc. plugin mechanisms; Integrate Anthropic [MCP protocol](https://modelcontextprotocol.io/); Currently has hundreds of plugins.
- 🧩 Plugin Extension, Active Community: High stability, high security production-level plugin system; Support event-driven, component extension, etc. plugin mechanisms; Integrate Anthropic [MCP protocol](https://modelcontextprotocol.io/); Currently has hundreds of plugins.
- 😻 Web UI: Support management LangBot instance through the browser. No need to manually write configuration files.
For more detailed specifications, please refer to the [documentation](https://docs.langbot.app/en/insight/features.html).

README_ES.md (new file, 143 lines added)

@@ -0,0 +1,143 @@
<p align="center">
<a href="https://langbot.app">
<img src="https://docs.langbot.app/social_en.png" alt="LangBot"/>
</a>
<div align="center">
<a href="https://www.producthunt.com/products/langbot?utm_source=badge-follow&utm_medium=badge&utm_source=badge-langbot" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/follow.svg?product_id=1077185&theme=light" alt="LangBot - Production&#0045;grade&#0032;IM&#0032;bot&#0032;made&#0032;easy&#0046; | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / [简体中文](README.md) / [繁體中文](README_TW.md) / [日本語](README_JP.md) / Español / [Français](README_FR.md) / [한국어](README_KO.md) / [Русский](README_RU.md) / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/langbot-app/LangBot)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/langbot-app/LangBot)](https://github.com/langbot-app/LangBot/releases/latest)
<img src="https://img.shields.io/badge/python-3.10 ~ 3.13 -blue.svg" alt="python">
<a href="https://langbot.app">Inicio</a>
<a href="https://docs.langbot.app/en/insight/guide.html">Despliegue</a>
<a href="https://docs.langbot.app/en/plugin/plugin-intro.html">Plugin</a>
<a href="https://github.com/langbot-app/LangBot/issues/new?assignees=&labels=%E7%8B%AC%E7%AB%8B%E6%8F%92%E4%BB%B6&projects=&template=submit-plugin.yml&title=%5BPlugin%5D%3A+%E8%AF%B7%E6%B1%82%E7%99%BB%E8%AE%B0%E6%96%B0%E6%8F%92%E4%BB%B6">Enviar Plugin</a>
</div>
</p>
LangBot es una plataforma de desarrollo de robots de mensajería instantánea nativa de LLM de código abierto, con el objetivo de proporcionar una experiencia de desarrollo de robots de mensajería instantánea lista para usar, con funciones de aplicación LLM como Agent, RAG, MCP, adaptándose a plataformas de mensajería instantánea globales y proporcionando interfaces API ricas, compatible con desarrollo personalizado.
## 📦 Comenzar
#### Inicio Rápido
Use `uvx` para iniciar con un comando (necesita instalar [uv](https://docs.astral.sh/uv/getting-started/installation/)):
```bash
uvx langbot
```
Visite http://localhost:5300 para comenzar a usarlo.
#### Despliegue con Docker Compose
```bash
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
```
Visite http://localhost:5300 para comenzar a usarlo.
Documentación detallada [Despliegue con Docker](https://docs.langbot.app/en/deploy/langbot/docker.html).
#### Despliegue con un clic en BTPanel
LangBot ha sido listado en BTPanel. Si tiene BTPanel instalado, puede usar la [documentación](https://docs.langbot.app/en/deploy/langbot/one-click/bt.html) para usarlo.
#### Despliegue en la Nube Zeabur
Plantilla de Zeabur contribuida por la comunidad.
[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/en-US/templates/ZKTBDH)
#### Despliegue en la Nube Railway
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.app/template/yRrAyL?referralCode=vogKPF)
#### Otros Métodos de Despliegue
Use directamente la versión publicada para ejecutar, consulte la documentación de [Despliegue Manual](https://docs.langbot.app/en/deploy/langbot/manual.html).
#### Despliegue en Kubernetes
Consulte la documentación de [Despliegue en Kubernetes](./docker/README_K8S.md).
## 😎 Manténgase Actualizado
Haga clic en los botones Star y Watch en la esquina superior derecha del repositorio para obtener las últimas actualizaciones.
![star gif](https://docs.langbot.app/star.gif)
## ✨ Características
- 💬 Chat con LLM / Agent: Compatible con múltiples LLMs, adaptado para chats grupales y privados; Admite conversaciones de múltiples rondas, llamadas a herramientas, capacidades multimodales y de salida en streaming. Implementación RAG (base de conocimientos) incorporada, e integración profunda con [Dify](https://dify.ai), [Coze](https://coze.com), [n8n](https://n8n.io) y otras plataformas LLMOps.
- 🤖 Soporte Multiplataforma: Actualmente compatible con QQ, QQ Channel, WeCom, WeChat personal, Lark, DingTalk, Discord, Telegram, etc.
- 🛠️ Alta Estabilidad, Rico en Funciones: Control de acceso nativo, limitación de velocidad, filtrado de palabras sensibles, etc.; Fácil de usar, admite múltiples métodos de despliegue. Compatible con múltiples configuraciones de pipeline, diferentes bots para diferentes escenarios.
- 🧩 Extensión de Plugin, Comunidad Activa: Sistema de plugin de alta estabilidad, alta seguridad de nivel de producción; Compatible con mecanismos de plugin impulsados por eventos, extensión de componentes, etc.; Integración del protocolo [MCP](https://modelcontextprotocol.io/) de Anthropic; Actualmente cuenta con cientos de plugins.
- 😻 Interfaz Web: Admite la gestión de instancias de LangBot a través del navegador. No es necesario escribir archivos de configuración manualmente.
Para especificaciones más detalladas, consulte la [documentación](https://docs.langbot.app/en/insight/features.html).
O visite el entorno de demostración: https://demo.langbot.dev/
- Información de inicio de sesión: Correo electrónico: `demo@langbot.app` Contraseña: `langbot123456`
- Nota: Solo para demostración de WebUI, por favor no ingrese información confidencial en el entorno público.
### Plataformas de Mensajería
| Plataforma | Estado | Observaciones |
| --- | --- | --- |
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| QQ Personal | ✅ | |
| QQ API Oficial | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| WeChat Personal | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
### LLMs
| LLM | Estado | Observaciones |
| --- | --- | --- |
| [OpenAI](https://platform.openai.com/) | ✅ | Disponible para cualquier modelo con formato de interfaz OpenAI |
| [DeepSeek](https://www.deepseek.com/) | ✅ | |
| [Moonshot](https://www.moonshot.cn/) | ✅ | |
| [Anthropic](https://www.anthropic.com/) | ✅ | |
| [xAI](https://x.ai/) | ✅ | |
| [Zhipu AI](https://open.bigmodel.cn/) | ✅ | |
| [CompShare](https://www.compshare.cn/?ytag=GPU_YY-gh_langbot) | ✅ | Plataforma de recursos LLM y GPU |
| [PPIO](https://ppinfra.com/user/register?invited_by=QJKFYD&utm_source=github_langbot) | ✅ | Plataforma de recursos LLM y GPU |
| [接口 AI](https://jiekou.ai/) | ✅ | Plataforma de agregación LLM |
| [ShengSuanYun](https://www.shengsuanyun.com/?from=CH_KYIPP758) | ✅ | Plataforma de recursos LLM y GPU |
| [302.AI](https://share.302.ai/SuTG99) | ✅ | Gateway LLM (MaaS) |
| [Google Gemini](https://aistudio.google.com/prompts/new_chat) | ✅ | |
| [Dify](https://dify.ai) | ✅ | Plataforma LLMOps |
| [Ollama](https://ollama.com/) | ✅ | Plataforma de ejecución de LLM local |
| [LMStudio](https://lmstudio.ai/) | ✅ | Plataforma de ejecución de LLM local |
| [GiteeAI](https://ai.gitee.com/) | ✅ | Gateway de interfaz LLM (MaaS) |
| [SiliconFlow](https://siliconflow.cn/) | ✅ | Gateway LLM (MaaS) |
| [Aliyun Bailian](https://bailian.console.aliyun.com/) | ✅ | Gateway LLM (MaaS), plataforma LLMOps |
| [Volc Engine Ark](https://console.volcengine.com/ark/region:ark+cn-beijing/model?vendor=Bytedance&view=LIST_VIEW) | ✅ | Gateway LLM (MaaS), plataforma LLMOps |
| [ModelScope](https://modelscope.cn/docs/model-service/API-Inference/intro) | ✅ | Gateway LLM (MaaS) |
| [MCP](https://modelcontextprotocol.io/) | ✅ | Compatible con acceso a herramientas a través del protocolo MCP |
## 🤝 Contribución de la Comunidad
Gracias a los siguientes [contribuidores de código](https://github.com/langbot-app/LangBot/graphs/contributors) y otros miembros de la comunidad por sus contribuciones a LangBot:
<a href="https://github.com/langbot-app/LangBot/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langbot-app/LangBot" />
</a>

README_FR.md (new file, 143 lines added)

@@ -0,0 +1,143 @@
<p align="center">
<a href="https://langbot.app">
<img src="https://docs.langbot.app/social_en.png" alt="LangBot"/>
</a>
<div align="center">
<a href="https://www.producthunt.com/products/langbot?utm_source=badge-follow&utm_medium=badge&utm_source=badge-langbot" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/follow.svg?product_id=1077185&theme=light" alt="LangBot - Production&#0045;grade&#0032;IM&#0032;bot&#0032;made&#0032;easy&#0046; | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / [简体中文](README.md) / [繁體中文](README_TW.md) / [日本語](README_JP.md) / [Español](README_ES.md) / Français / [한국어](README_KO.md) / [Русский](README_RU.md) / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/langbot-app/LangBot)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/langbot-app/LangBot)](https://github.com/langbot-app/LangBot/releases/latest)
<img src="https://img.shields.io/badge/python-3.10 ~ 3.13 -blue.svg" alt="python">
<a href="https://langbot.app">Accueil</a>
<a href="https://docs.langbot.app/en/insight/guide.html">Déploiement</a>
<a href="https://docs.langbot.app/en/plugin/plugin-intro.html">Plugin</a>
<a href="https://github.com/langbot-app/LangBot/issues/new?assignees=&labels=%E7%8B%AC%E7%AB%8B%E6%8F%92%E4%BB%B6&projects=&template=submit-plugin.yml&title=%5BPlugin%5D%3A+%E8%AF%B7%E6%B1%82%E7%99%BB%E8%AE%B0%E6%96%B0%E6%8F%92%E4%BB%B6">Soumettre un Plugin</a>
</div>
</p>
LangBot est une plateforme de développement de robots de messagerie instantanée native LLM open source, visant à fournir une expérience de développement de robots de messagerie instantanée prête à l'emploi, avec des fonctionnalités d'application LLM telles qu'Agent, RAG, MCP, s'adaptant aux plateformes de messagerie instantanée mondiales et fournissant des interfaces API riches, prenant en charge le développement personnalisé.
## 📦 Commencer
#### Démarrage Rapide
Utilisez `uvx` pour démarrer avec une commande (besoin d'installer [uv](https://docs.astral.sh/uv/getting-started/installation/)) :
```bash
uvx langbot
```
Visitez http://localhost:5300 pour commencer à l'utiliser.
#### Déploiement avec Docker Compose
```bash
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
```
Visitez http://localhost:5300 pour commencer à l'utiliser.
Documentation détaillée [Déploiement Docker](https://docs.langbot.app/en/deploy/langbot/docker.html).
#### Déploiement en un clic sur BTPanel
LangBot a été répertorié sur BTPanel. Si vous avez installé BTPanel, vous pouvez utiliser la [documentation](https://docs.langbot.app/en/deploy/langbot/one-click/bt.html) pour l'utiliser.
#### Déploiement Cloud Zeabur
Modèle Zeabur contribué par la communauté.
[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/en-US/templates/ZKTBDH)
#### Déploiement Cloud Railway
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.app/template/yRrAyL?referralCode=vogKPF)
#### Autres Méthodes de Déploiement
Utilisez directement la version publiée pour exécuter, consultez la documentation de [Déploiement Manuel](https://docs.langbot.app/en/deploy/langbot/manual.html).
#### Déploiement Kubernetes
Consultez la documentation de [Déploiement Kubernetes](./docker/README_K8S.md).
## 😎 Restez à Jour
Cliquez sur les boutons Star et Watch dans le coin supérieur droit du dépôt pour obtenir les dernières mises à jour.
![star gif](https://docs.langbot.app/star.gif)
## ✨ Fonctionnalités
- 💬 Chat avec LLM / Agent : Prend en charge plusieurs LLM, adapté aux chats de groupe et privés ; Prend en charge les conversations multi-tours, les appels d'outils, les capacités multimodales et de sortie en streaming. Implémentation RAG (base de connaissances) intégrée, et intégration profonde avec [Dify](https://dify.ai), [Coze](https://coze.com), [n8n](https://n8n.io) et d'autres plateformes LLMOps.
- 🤖 Support Multi-plateforme : Actuellement compatible avec QQ, QQ Channel, WeCom, WeChat personnel, Lark, DingTalk, Discord, Telegram, etc.
- 🛠️ Haute Stabilité, Riche en Fonctionnalités : Contrôle d'accès natif, limitation de débit, filtrage de mots sensibles, etc. ; Facile à utiliser, prend en charge plusieurs méthodes de déploiement. Prend en charge plusieurs configurations de pipeline, différents bots pour différents scénarios.
- 🧩 Extension de Plugin, Communauté Active : Système de plugin de haute stabilité, haute sécurité de niveau production; Prend en charge les mécanismes de plugin pilotés par événements, l'extension de composants, etc. ; Intégration du protocole [MCP](https://modelcontextprotocol.io/) d'Anthropic ; Dispose actuellement de centaines de plugins.
- 😻 Interface Web : Prend en charge la gestion des instances LangBot via le navigateur. Pas besoin d'écrire manuellement les fichiers de configuration.
Pour des spécifications plus détaillées, veuillez consulter la [documentation](https://docs.langbot.app/en/insight/features.html).
Ou visitez l'environnement de démonstration : https://demo.langbot.dev/
- Informations de connexion : Email : `demo@langbot.app` Mot de passe : `langbot123456`
- Note : Pour la démonstration WebUI uniquement, veuillez ne pas entrer d'informations sensibles dans l'environnement public.
### Plateformes de Messagerie
| Plateforme | Statut | Remarques |
| --- | --- | --- |
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| QQ Personnel | ✅ | |
| API Officielle QQ | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| WeChat Personnel | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
### LLMs
| LLM | Statut | Remarques |
| --- | --- | --- |
| [OpenAI](https://platform.openai.com/) | ✅ | Disponible pour tout modèle au format d'interface OpenAI |
| [DeepSeek](https://www.deepseek.com/) | ✅ | |
| [Moonshot](https://www.moonshot.cn/) | ✅ | |
| [Anthropic](https://www.anthropic.com/) | ✅ | |
| [xAI](https://x.ai/) | ✅ | |
| [Zhipu AI](https://open.bigmodel.cn/) | ✅ | |
| [CompShare](https://www.compshare.cn/?ytag=GPU_YY-gh_langbot) | ✅ | Plateforme de ressources LLM et GPU |
| [PPIO](https://ppinfra.com/user/register?invited_by=QJKFYD&utm_source=github_langbot) | ✅ | Plateforme de ressources LLM et GPU |
| [接口 AI](https://jiekou.ai/) | ✅ | Plateforme d'agrégation LLM |
| [ShengSuanYun](https://www.shengsuanyun.com/?from=CH_KYIPP758) | ✅ | Plateforme de ressources LLM et GPU |
| [302.AI](https://share.302.ai/SuTG99) | ✅ | Passerelle LLM (MaaS) |
| [Google Gemini](https://aistudio.google.com/prompts/new_chat) | ✅ | |
| [Dify](https://dify.ai) | ✅ | Plateforme LLMOps |
| [Ollama](https://ollama.com/) | ✅ | Plateforme d'exécution LLM locale |
| [LMStudio](https://lmstudio.ai/) | ✅ | Plateforme d'exécution LLM locale |
| [GiteeAI](https://ai.gitee.com/) | ✅ | Passerelle d'interface LLM (MaaS) |
| [SiliconFlow](https://siliconflow.cn/) | ✅ | Passerelle LLM (MaaS) |
| [Aliyun Bailian](https://bailian.console.aliyun.com/) | ✅ | Passerelle LLM (MaaS), plateforme LLMOps |
| [Volc Engine Ark](https://console.volcengine.com/ark/region:ark+cn-beijing/model?vendor=Bytedance&view=LIST_VIEW) | ✅ | Passerelle LLM (MaaS), plateforme LLMOps |
| [ModelScope](https://modelscope.cn/docs/model-service/API-Inference/intro) | ✅ | Passerelle LLM (MaaS) |
| [MCP](https://modelcontextprotocol.io/) | ✅ | Prend en charge l'accès aux outils via le protocole MCP |
## 🤝 Contribution de la Communauté
Merci aux [contributeurs de code](https://github.com/langbot-app/LangBot/graphs/contributors) suivants et aux autres membres de la communauté pour leurs contributions à LangBot :
<a href="https://github.com/langbot-app/LangBot/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langbot-app/LangBot" />
</a>

README_JP.md

@@ -5,7 +5,9 @@
<div align="center">
[English](README_EN.md) / [简体中文](README.md) / [繁體中文](README_TW.md) / 日本語 / (PR for your language)
<a href="https://www.producthunt.com/products/langbot?utm_source=badge-follow&utm_medium=badge&utm_source=badge-langbot" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/follow.svg?product_id=1077185&theme=light" alt="LangBot - Production&#0045;grade&#0032;IM&#0032;bot&#0032;made&#0032;easy&#0046; | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / [简体中文](README.md) / [繁體中文](README_TW.md) / 日本語 / [Español](README_ES.md) / [Français](README_FR.md) / [한국어](README_KO.md) / [Русский](README_RU.md) / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/langbot-app/LangBot)
@@ -25,6 +27,16 @@ LangBot は、エージェント、RAG、MCP などの LLM アプリケーショ
## 📦 始め方
#### クイックスタート
`uvx` を使用した迅速なデプロイ([uv](https://docs.astral.sh/uv/getting-started/installation/) が必要です):
```bash
uvx langbot
```
http://localhost:5300 にアクセスして使用を開始します。
#### Docker Compose デプロイ
```bash
@@ -67,10 +79,10 @@ LangBotはBTPanelにリストされています。BTPanelをインストール
## ✨ 機能
- 💬 LLM / エージェントとのチャット: 複数のLLMをサポートし、グループチャットとプライベートチャットに対応。マルチラウンドの会話、ツールの呼び出し、マルチモーダル、ストリーミング出力機能をサポート、RAG知識ベースを組み込み、[Dify](https://dify.ai) と深く統合。
- 💬 LLM / エージェントとのチャット: 複数のLLMをサポートし、グループチャットとプライベートチャットに対応。マルチラウンドの会話、ツールの呼び出し、マルチモーダル、ストリーミング出力機能をサポート、RAG知識ベースを組み込み、[Dify](https://dify.ai)、[Coze](https://coze.com)、[n8n](https://n8n.io) などの LLMOps プラットフォームと深く統合。
- 🤖 多プラットフォーム対応: 現在、QQ、QQ チャンネル、WeChat、個人 WeChat、Lark、DingTalk、Discord、Telegram など、複数のプラットフォームをサポートしています。
- 🛠️ 高い安定性、豊富な機能: ネイティブのアクセス制御、レート制限、敏感な単語のフィルタリングなどのメカニズムをサポート。使いやすく、複数のデプロイ方法をサポート。複数のパイプライン設定をサポートし、異なるボットを異なる用途に使用できます。
- 🧩 プラグイン拡張、活発なコミュニティ: イベント駆動、コンポーネント拡張などのプラグインメカニズムをサポート。適配 Anthropic [MCP プロトコル](https://modelcontextprotocol.io/);豊富なエコシステム、現在数百のプラグインが存在。
- 🧩 プラグイン拡張、活発なコミュニティ: 高い安定性、高いセキュリティの生産レベルのプラグインシステム;イベント駆動、コンポーネント拡張などのプラグインメカニズムをサポート。適配 Anthropic [MCP プロトコル](https://modelcontextprotocol.io/);豊富なエコシステム、現在数百のプラグインが存在。
- 😻 Web UI: ブラウザを通じてLangBotインスタンスを管理することをサポート。
詳細な仕様については、[ドキュメント](https://docs.langbot.app/en/insight/features.html)を参照してください。

README_KO.md (new file, 143 lines added)

@@ -0,0 +1,143 @@
<p align="center">
<a href="https://langbot.app">
<img src="https://docs.langbot.app/social_en.png" alt="LangBot"/>
</a>
<div align="center">
<a href="https://www.producthunt.com/products/langbot?utm_source=badge-follow&utm_medium=badge&utm_source=badge-langbot" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/follow.svg?product_id=1077185&theme=light" alt="LangBot - Production&#0045;grade&#0032;IM&#0032;bot&#0032;made&#0032;easy&#0046; | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / [简体中文](README.md) / [繁體中文](README_TW.md) / [日本語](README_JP.md) / [Español](README_ES.md) / [Français](README_FR.md) / 한국어 / [Русский](README_RU.md) / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/langbot-app/LangBot)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/langbot-app/LangBot)](https://github.com/langbot-app/LangBot/releases/latest)
<img src="https://img.shields.io/badge/python-3.10 ~ 3.13 -blue.svg" alt="python">
<a href="https://langbot.app">홈</a>
<a href="https://docs.langbot.app/en/insight/guide.html">배포</a>
<a href="https://docs.langbot.app/en/plugin/plugin-intro.html">플러그인</a>
<a href="https://github.com/langbot-app/LangBot/issues/new?assignees=&labels=%E7%8B%AC%E7%AB%8B%E6%8F%92%E4%BB%B6&projects=&template=submit-plugin.yml&title=%5BPlugin%5D%3A+%E8%AF%B7%E6%B1%82%E7%99%BB%E8%AE%B0%E6%96%B0%E6%8F%92%E4%BB%B6">플러그인 제출</a>
</div>
</p>
LangBot은 오픈 소스 LLM 네이티브 인스턴트 메시징 로봇 개발 플랫폼으로, Agent, RAG, MCP 등 다양한 LLM 애플리케이션 기능을 갖춘 즉시 사용 가능한 IM 로봇 개발 경험을 제공하며, 글로벌 인스턴트 메시징 플랫폼에 적응하고 풍부한 API 인터페이스를 제공하여 맞춤형 개발을 지원합니다.
## 📦 시작하기
#### 빠른 시작
`uvx`를 사용하여 한 명령으로 시작하세요 ([uv](https://docs.astral.sh/uv/getting-started/installation/) 설치 필요):
```bash
uvx langbot
```
http://localhost:5300을 방문하여 사용을 시작하세요.
#### Docker Compose 배포
```bash
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
```
http://localhost:5300을 방문하여 사용을 시작하세요.
자세한 문서는 [Docker 배포](https://docs.langbot.app/en/deploy/langbot/docker.html)를 참조하세요.
#### BTPanel 원클릭 배포
LangBot은 BTPanel에 등록되어 있습니다. BTPanel을 설치한 경우 [문서](https://docs.langbot.app/en/deploy/langbot/one-click/bt.html)를 사용하여 사용할 수 있습니다.
#### Zeabur 클라우드 배포
커뮤니티에서 제공하는 Zeabur 템플릿입니다.
[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/en-US/templates/ZKTBDH)
#### Railway 클라우드 배포
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.app/template/yRrAyL?referralCode=vogKPF)
#### 기타 배포 방법
릴리스 버전을 직접 사용하여 실행하려면 [수동 배포](https://docs.langbot.app/en/deploy/langbot/manual.html) 문서를 참조하세요.
#### Kubernetes 배포
[Kubernetes 배포](./docker/README_K8S.md) 문서를 참조하세요.
## 😎 최신 정보 받기
리포지토리 오른쪽 상단의 Star 및 Watch 버튼을 클릭하여 최신 업데이트를 받으세요.
![star gif](https://docs.langbot.app/star.gif)
## ✨ 기능
- 💬 LLM / Agent와 채팅: 여러 LLM을 지원하며 그룹 채팅 및 개인 채팅에 적응; 멀티 라운드 대화, 도구 호출, 멀티모달, 스트리밍 출력 기능을 지원합니다. 내장된 RAG(지식 베이스) 구현 및 [Dify](https://dify.ai)、[Coze](https://coze.com)、[n8n](https://n8n.io) 등의 LLMOps 플랫폼과 깊이 통합됩니다.
- 🤖 다중 플랫폼 지원: 현재 QQ, QQ Channel, WeCom, 개인 WeChat, Lark, DingTalk, Discord, Telegram 등을 지원합니다.
- 🛠️ 높은 안정성, 풍부한 기능: 네이티브 액세스 제어, 속도 제한, 민감한 단어 필터링 등의 메커니즘; 사용하기 쉽고 여러 배포 방법을 지원합니다. 여러 파이프라인 구성을 지원하며 다양한 시나리오에 대해 다른 봇을 사용할 수 있습니다.
- 🧩 플러그인 확장, 활발한 커뮤니티: 고안정성, 고보안 생산 수준의 플러그인 시스템; 이벤트 기반, 컴포넌트 확장 등의 플러그인 메커니즘을 지원; Anthropic [MCP 프로토콜](https://modelcontextprotocol.io/) 통합; 현재 수백 개의 플러그인이 있습니다.
- 😻 웹 UI: 브라우저를 통해 LangBot 인스턴스 관리를 지원합니다. 구성 파일을 수동으로 작성할 필요가 없습니다.
더 자세한 사양은 [문서](https://docs.langbot.app/en/insight/features.html)를 참조하세요.
또는 데모 환경을 방문하세요: https://demo.langbot.dev/
- 로그인 정보: 이메일: `demo@langbot.app` 비밀번호: `langbot123456`
- 참고: WebUI 데모 전용이므로 공개 환경에서는 민감한 정보를 입력하지 마세요.
### 메시징 플랫폼
| 플랫폼 | 상태 | 비고 |
| --- | --- | --- |
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| 개인 QQ | ✅ | |
| QQ 공식 API | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| 개인 WeChat | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
### LLMs
| LLM | 상태 | 비고 |
| --- | --- | --- |
| [OpenAI](https://platform.openai.com/) | ✅ | 모든 OpenAI 인터페이스 형식 모델에 사용 가능 |
| [DeepSeek](https://www.deepseek.com/) | ✅ | |
| [Moonshot](https://www.moonshot.cn/) | ✅ | |
| [Anthropic](https://www.anthropic.com/) | ✅ | |
| [xAI](https://x.ai/) | ✅ | |
| [Zhipu AI](https://open.bigmodel.cn/) | ✅ | |
| [CompShare](https://www.compshare.cn/?ytag=GPU_YY-gh_langbot) | ✅ | LLM 및 GPU 리소스 플랫폼 |
| [PPIO](https://ppinfra.com/user/register?invited_by=QJKFYD&utm_source=github_langbot) | ✅ | LLM 및 GPU 리소스 플랫폼 |
| [接口 AI](https://jiekou.ai/) | ✅ | LLM 집계 플랫폼 |
| [ShengSuanYun](https://www.shengsuanyun.com/?from=CH_KYIPP758) | ✅ | LLM 및 GPU 리소스 플랫폼 |
| [302.AI](https://share.302.ai/SuTG99) | ✅ | LLM 게이트웨이(MaaS) |
| [Google Gemini](https://aistudio.google.com/prompts/new_chat) | ✅ | |
| [Dify](https://dify.ai) | ✅ | LLMOps 플랫폼 |
| [Ollama](https://ollama.com/) | ✅ | 로컬 LLM 실행 플랫폼 |
| [LMStudio](https://lmstudio.ai/) | ✅ | 로컬 LLM 실행 플랫폼 |
| [GiteeAI](https://ai.gitee.com/) | ✅ | LLM 인터페이스 게이트웨이(MaaS) |
| [SiliconFlow](https://siliconflow.cn/) | ✅ | LLM 게이트웨이(MaaS) |
| [Aliyun Bailian](https://bailian.console.aliyun.com/) | ✅ | LLM 게이트웨이(MaaS), LLMOps 플랫폼 |
| [Volc Engine Ark](https://console.volcengine.com/ark/region:ark+cn-beijing/model?vendor=Bytedance&view=LIST_VIEW) | ✅ | LLM 게이트웨이(MaaS), LLMOps 플랫폼 |
| [ModelScope](https://modelscope.cn/docs/model-service/API-Inference/intro) | ✅ | LLM 게이트웨이(MaaS) |
| [MCP](https://modelcontextprotocol.io/) | ✅ | MCP 프로토콜을 통한 도구 액세스 지원 |
## 🤝 커뮤니티 기여
다음 [코드 기여자](https://github.com/langbot-app/LangBot/graphs/contributors) 및 커뮤니티의 다른 구성원들의 LangBot 기여에 감사드립니다:
<a href="https://github.com/langbot-app/LangBot/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langbot-app/LangBot" />
</a>

README_RU.md (new file, 143 lines added)

@@ -0,0 +1,143 @@
<p align="center">
<a href="https://langbot.app">
<img src="https://docs.langbot.app/social_en.png" alt="LangBot"/>
</a>
<div align="center">
<a href="https://www.producthunt.com/products/langbot?utm_source=badge-follow&utm_medium=badge&utm_source=badge-langbot" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/follow.svg?product_id=1077185&theme=light" alt="LangBot - Production&#0045;grade&#0032;IM&#0032;bot&#0032;made&#0032;easy&#0046; | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / [简体中文](README.md) / [繁體中文](README_TW.md) / [日本語](README_JP.md) / [Español](README_ES.md) / [Français](README_FR.md) / [한국어](README_KO.md) / Русский / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/langbot-app/LangBot)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/langbot-app/LangBot)](https://github.com/langbot-app/LangBot/releases/latest)
<img src="https://img.shields.io/badge/python-3.10 ~ 3.13 -blue.svg" alt="python">
<a href="https://langbot.app">Главная</a>
<a href="https://docs.langbot.app/en/insight/guide.html">Развертывание</a>
<a href="https://docs.langbot.app/en/plugin/plugin-intro.html">Плагин</a>
<a href="https://github.com/langbot-app/LangBot/issues/new?assignees=&labels=%E7%8B%AC%E7%AB%8B%E6%8F%92%E4%BB%B6&projects=&template=submit-plugin.yml&title=%5BPlugin%5D%3A+%E8%AF%B7%E6%B1%82%E7%99%BB%E8%AE%B0%E6%96%B0%E6%8F%92%E4%BB%B6">Отправить плагин</a>
</div>
</p>
LangBot — это платформа разработки ботов для мгновенных сообщений на основе LLM с открытым исходным кодом, целью которой является предоставление готового к использованию опыта разработки ботов для IM, с функциями приложений LLM, такими как Agent, RAG, MCP, адаптацией к глобальным платформам мгновенных сообщений и предоставлением богатых API-интерфейсов, поддерживающих пользовательскую разработку.
## 📦 Начало работы
#### Быстрый старт
Используйте `uvx` для запуска одной командой (требуется установка [uv](https://docs.astral.sh/uv/getting-started/installation/)):
```bash
uvx langbot
```
Посетите http://localhost:5300, чтобы начать использование.
#### Развертывание с Docker Compose
```bash
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
```
Посетите http://localhost:5300, чтобы начать использование.
Подробная документация [Развертывание Docker](https://docs.langbot.app/en/deploy/langbot/docker.html).
#### Развертывание одним кликом на BTPanel
LangBot добавлен в BTPanel. Если у вас установлен BTPanel, вы можете использовать [документацию](https://docs.langbot.app/en/deploy/langbot/one-click/bt.html) для его использования.
#### Облачное развертывание Zeabur
Шаблон Zeabur, предоставленный сообществом.
[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/en-US/templates/ZKTBDH)
#### Облачное развертывание Railway
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.app/template/yRrAyL?referralCode=vogKPF)
#### Другие методы развертывания
Используйте выпущенную версию напрямую для запуска, см. документацию [Ручное развертывание](https://docs.langbot.app/en/deploy/langbot/manual.html).
#### Развертывание Kubernetes
См. документацию [Развертывание Kubernetes](./docker/README_K8S.md).
## 😎 Оставайтесь в курсе
Нажмите кнопки Star и Watch в правом верхнем углу репозитория, чтобы получать последние обновления.
![star gif](https://docs.langbot.app/star.gif)
## ✨ Функции
- 💬 Чат с LLM / Agent: Поддержка нескольких LLM, адаптация к групповым и личным чатам; Поддержка многораундовых разговоров, вызовов инструментов, мультимодальных возможностей и потоковой передачи. Встроенная реализация RAG (база знаний) и глубокая интеграция с [Dify](https://dify.ai), [Coze](https://coze.com), [n8n](https://n8n.io) и другими платформами LLMOps.
- 🤖 Многоплатформенная поддержка: В настоящее время поддерживает QQ, QQ Channel, WeCom, личный WeChat, Lark, DingTalk, Discord, Telegram и т.д.
- 🛠️ Высокая стабильность, богатство функций: Нативный контроль доступа, ограничение скорости, фильтрация чувствительных слов и т.д.; Простота в использовании, поддержка нескольких методов развертывания. Поддержка нескольких конфигураций конвейера, разные боты для разных сценариев.
- 🧩 Расширение плагинов, активное сообщество: Система плагинов производственного уровня с высокой стабильностью и безопасностью; Поддержка механизмов плагинов, управляемых событиями, расширения компонентов и т.д.; Интеграция протокола [MCP](https://modelcontextprotocol.io/) от Anthropic; В настоящее время сотни плагинов.
- 😻 Веб-интерфейс: Поддержка управления экземплярами LangBot через браузер. Нет необходимости вручную писать конфигурационные файлы.
Для более подробных спецификаций обратитесь к [документации](https://docs.langbot.app/en/insight/features.html).
Или посетите демонстрационную среду: https://demo.langbot.dev/
- Информация для входа: Email: `demo@langbot.app` Пароль: `langbot123456`
- Примечание: Только для демонстрации WebUI, пожалуйста, не вводите конфиденциальную информацию в общедоступной среде.
### Платформы обмена сообщениями
| Платформа | Статус | Примечания |
| --- | --- | --- |
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| Личный QQ | ✅ | |
| Официальный API QQ | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| Личный WeChat | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
### LLMs
| LLM | Статус | Примечания |
| --- | --- | --- |
| [OpenAI](https://platform.openai.com/) | ✅ | Доступна для любой модели формата интерфейса OpenAI |
| [DeepSeek](https://www.deepseek.com/) | ✅ | |
| [Moonshot](https://www.moonshot.cn/) | ✅ | |
| [Anthropic](https://www.anthropic.com/) | ✅ | |
| [xAI](https://x.ai/) | ✅ | |
| [Zhipu AI](https://open.bigmodel.cn/) | ✅ | |
| [CompShare](https://www.compshare.cn/?ytag=GPU_YY-gh_langbot) | ✅ | Платформа ресурсов LLM и GPU |
| [PPIO](https://ppinfra.com/user/register?invited_by=QJKFYD&utm_source=github_langbot) | ✅ | Платформа ресурсов LLM и GPU |
| [接口 AI](https://jiekou.ai/) | ✅ | Платформа агрегации LLM |
| [ShengSuanYun](https://www.shengsuanyun.com/?from=CH_KYIPP758) | ✅ | Платформа ресурсов LLM и GPU |
| [302.AI](https://share.302.ai/SuTG99) | ✅ | Шлюз LLM (MaaS) |
| [Google Gemini](https://aistudio.google.com/prompts/new_chat) | ✅ | |
| [Dify](https://dify.ai) | ✅ | Платформа LLMOps |
| [Ollama](https://ollama.com/) | ✅ | Платформа локального запуска LLM |
| [LMStudio](https://lmstudio.ai/) | ✅ | Платформа локального запуска LLM |
| [GiteeAI](https://ai.gitee.com/) | ✅ | Шлюз интерфейса LLM (MaaS) |
| [SiliconFlow](https://siliconflow.cn/) | ✅ | Шлюз LLM (MaaS) |
| [Aliyun Bailian](https://bailian.console.aliyun.com/) | ✅ | Шлюз LLM (MaaS), платформа LLMOps |
| [Volc Engine Ark](https://console.volcengine.com/ark/region:ark+cn-beijing/model?vendor=Bytedance&view=LIST_VIEW) | ✅ | Шлюз LLM (MaaS), платформа LLMOps |
| [ModelScope](https://modelscope.cn/docs/model-service/API-Inference/intro) | ✅ | Шлюз LLM (MaaS) |
| [MCP](https://modelcontextprotocol.io/) | ✅ | Поддержка доступа к инструментам через протокол MCP |
## 🤝 Вклад сообщества
Спасибо следующим [контрибьюторам кода](https://github.com/langbot-app/LangBot/graphs/contributors) и другим членам сообщества за их вклад в LangBot:
<a href="https://github.com/langbot-app/LangBot/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langbot-app/LangBot" />
</a>

README_TW.md

@@ -5,7 +5,7 @@
<div align="center"><a href="https://hellogithub.com/repository/langbot-app/LangBot" target="_blank"><img src="https://abroad.hellogithub.com/v1/widgets/recommend.svg?rid=5ce8ae2aa4f74316bf393b57b952433c&claim_uid=gtmc6YWjMZkT21R" alt="FeaturedHelloGitHub" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / [简体中文](README.md) / 繁體中文 / [日本語](README_JP.md) / (PR for your language)
[English](README_EN.md) / [简体中文](README.md) / 繁體中文 / [日本語](README_JP.md) / [Español](README_ES.md) / [Français](README_FR.md) / [한국어](README_KO.md) / [Русский](README_RU.md) / [Tiếng Việt](README_VI.md)
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![QQ Group](https://img.shields.io/badge/%E7%A4%BE%E5%8C%BAQQ%E7%BE%A4-966235608-blue)](https://qm.qq.com/q/JLi38whHum)
@@ -27,6 +27,16 @@ LangBot 是一個開源的大語言模型原生即時通訊機器人開發平台
## 📦 開始使用
#### 快速部署
使用 `uvx` 一鍵啟動(需要先安裝 [uv](https://docs.astral.sh/uv/getting-started/installation/)):
```bash
uvx langbot
```
訪問 http://localhost:5300 即可開始使用。
#### Docker Compose 部署
```bash
@@ -69,10 +79,10 @@ docker compose up -d
## ✨ 特性
- 💬 大模型對話、Agent:支援多種大模型,適配群聊和私聊;具有多輪對話、工具調用、多模態、流式輸出能力,自帶 RAG(知識庫)實現,並深度適配 [Dify](https://dify.ai)。
- 💬 大模型對話、Agent:支援多種大模型,適配群聊和私聊;具有多輪對話、工具調用、多模態、流式輸出能力,自帶 RAG(知識庫)實現,並深度適配 [Dify](https://dify.ai)、[Coze](https://coze.com)、[n8n](https://n8n.io) 等 LLMOps 平台。
- 🤖 多平台支援:目前支援 QQ、QQ頻道、企業微信、個人微信、飛書、Discord、Telegram 等平台。
- 🛠️ 高穩定性、功能完備:原生支援訪問控制、限速、敏感詞過濾等機制;配置簡單,支援多種部署方式。支援多流水線配置,不同機器人用於不同應用場景。
- 🧩 外掛擴展、活躍社群:支援事件驅動、組件擴展等外掛機制;適配 Anthropic [MCP 協議](https://modelcontextprotocol.io/);目前已有數百個外掛。
- 🧩 外掛擴展、活躍社群:高穩定性、高安全性的生產級外掛系統;支援事件驅動、組件擴展等外掛機制;適配 Anthropic [MCP 協議](https://modelcontextprotocol.io/);目前已有數百個外掛。
- 😻 Web 管理面板:支援通過瀏覽器管理 LangBot 實例,不再需要手動編寫配置文件。
詳細規格特性請訪問[文件](https://docs.langbot.app/zh/insight/features.html)。

README_VI.md (new file, 143 lines added)

@@ -0,0 +1,143 @@
<p align="center">
<a href="https://langbot.app">
<img src="https://docs.langbot.app/social_en.png" alt="LangBot"/>
</a>
<div align="center">
<a href="https://www.producthunt.com/products/langbot?utm_source=badge-follow&utm_medium=badge&utm_source=badge-langbot" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/follow.svg?product_id=1077185&theme=light" alt="LangBot - Production&#0045;grade&#0032;IM&#0032;bot&#0032;made&#0032;easy&#0046; | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[English](README_EN.md) / [简体中文](README.md) / [繁體中文](README_TW.md) / [日本語](README_JP.md) / [Español](README_ES.md) / [Français](README_FR.md) / [한국어](README_KO.md) / [Русский](README_RU.md) / Tiếng Việt
[![Discord](https://img.shields.io/discord/1335141740050649118?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb)](https://discord.gg/wdNEHETs87)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/langbot-app/LangBot)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/langbot-app/LangBot)](https://github.com/langbot-app/LangBot/releases/latest)
<img src="https://img.shields.io/badge/python-3.10 ~ 3.13 -blue.svg" alt="python">
<a href="https://langbot.app">Trang chủ</a>
<a href="https://docs.langbot.app/en/insight/guide.html">Triển khai</a>
<a href="https://docs.langbot.app/en/plugin/plugin-intro.html">Plugin</a>
<a href="https://github.com/langbot-app/LangBot/issues/new?assignees=&labels=%E7%8B%AC%E7%AB%8B%E6%8F%92%E4%BB%B6&projects=&template=submit-plugin.yml&title=%5BPlugin%5D%3A+%E8%AF%B7%E6%B1%82%E7%99%BB%E8%AE%B0%E6%96%B0%E6%8F%92%E4%BB%B6">Gửi Plugin</a>
</div>
</p>
LangBot là một nền tảng phát triển robot nhắn tin tức thời gốc LLM mã nguồn mở, nhằm mục đích cung cấp trải nghiệm phát triển robot IM sẵn sàng sử dụng, với các chức năng ứng dụng LLM như Agent, RAG, MCP, thích ứng với các nền tảng nhắn tin tức thời toàn cầu và cung cấp giao diện API phong phú, hỗ trợ phát triển tùy chỉnh.
## 📦 Bắt đầu
#### Khởi động Nhanh
Sử dụng `uvx` để khởi động bằng một lệnh (cần cài đặt [uv](https://docs.astral.sh/uv/getting-started/installation/)):
```bash
uvx langbot
```
Truy cập http://localhost:5300 để bắt đầu sử dụng.
#### Triển khai Docker Compose
```bash
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
```
Truy cập http://localhost:5300 để bắt đầu sử dụng.
Tài liệu chi tiết [Triển khai Docker](https://docs.langbot.app/en/deploy/langbot/docker.html).
#### Triển khai Một cú nhấp chuột trên BTPanel
LangBot đã được liệt kê trên BTPanel. Nếu bạn đã cài đặt BTPanel, bạn có thể sử dụng [tài liệu](https://docs.langbot.app/en/deploy/langbot/one-click/bt.html) để sử dụng nó.
#### Triển khai Cloud Zeabur
Mẫu Zeabur được đóng góp bởi cộng đồng.
[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/en-US/templates/ZKTBDH)
#### Triển khai Cloud Railway
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.app/template/yRrAyL?referralCode=vogKPF)
#### Các Phương pháp Triển khai Khác
Sử dụng trực tiếp phiên bản phát hành để chạy, xem tài liệu [Triển khai Thủ công](https://docs.langbot.app/en/deploy/langbot/manual.html).
#### Triển khai Kubernetes
Tham khảo tài liệu [Triển khai Kubernetes](./docker/README_K8S.md).
## 😎 Cập nhật Mới nhất
Nhấp vào các nút Star và Watch ở góc trên bên phải của kho lưu trữ để nhận các bản cập nhật mới nhất.
![star gif](https://docs.langbot.app/star.gif)
## ✨ Tính năng
- 💬 Chat với LLM / Agent: Hỗ trợ nhiều LLM, thích ứng với chat nhóm và chat riêng tư; Hỗ trợ các cuộc trò chuyện nhiều vòng, gọi công cụ, khả năng đa phương thức và đầu ra streaming. Triển khai RAG (cơ sở kiến thức) tích hợp sẵn và tích hợp sâu với [Dify](https://dify.ai), [Coze](https://coze.com), [n8n](https://n8n.io) và các nền tảng LLMOps khác.
- 🤖 Hỗ trợ Đa nền tảng: Hiện hỗ trợ QQ, QQ Channel, WeCom, WeChat cá nhân, Lark, DingTalk, Discord, Telegram, v.v.
- 🛠️ Độ ổn định Cao, Tính năng Phong phú: Kiểm soát truy cập gốc, giới hạn tốc độ, lọc từ nhạy cảm, v.v.; Dễ sử dụng, hỗ trợ nhiều phương pháp triển khai. Hỗ trợ nhiều cấu hình pipeline, các bot khác nhau cho các kịch bản khác nhau.
- 🧩 Mở rộng Plugin, Cộng đồng Hoạt động: Hỗ trợ các cơ chế plugin hướng sự kiện, mở rộng thành phần, v.v.; Tích hợp giao thức [MCP](https://modelcontextprotocol.io/) của Anthropic; Hiện có hàng trăm plugin.
- 😻 Giao diện Web: Hỗ trợ quản lý các phiên bản LangBot thông qua trình duyệt. Không cần viết tệp cấu hình thủ công.
Để biết thêm thông số kỹ thuật chi tiết, vui lòng tham khảo [tài liệu](https://docs.langbot.app/en/insight/features.html).
Hoặc truy cập môi trường demo: https://demo.langbot.dev/
- Thông tin đăng nhập: Email: `demo@langbot.app` Mật khẩu: `langbot123456`
- Lưu ý: Chỉ dành cho demo WebUI, vui lòng không nhập bất kỳ thông tin nhạy cảm nào trong môi trường công cộng.
### Nền tảng Nhắn tin
| Nền tảng | Trạng thái | Ghi chú |
| --- | --- | --- |
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| QQ Cá nhân | ✅ | |
| QQ API Chính thức | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| WeChat Cá nhân | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
### LLMs
| LLM | Trạng thái | Ghi chú |
| --- | --- | --- |
| [OpenAI](https://platform.openai.com/) | ✅ | Có sẵn cho bất kỳ mô hình định dạng giao diện OpenAI nào |
| [DeepSeek](https://www.deepseek.com/) | ✅ | |
| [Moonshot](https://www.moonshot.cn/) | ✅ | |
| [Anthropic](https://www.anthropic.com/) | ✅ | |
| [xAI](https://x.ai/) | ✅ | |
| [Zhipu AI](https://open.bigmodel.cn/) | ✅ | |
| [CompShare](https://www.compshare.cn/?ytag=GPU_YY-gh_langbot) | ✅ | LLM and GPU resource platform |
| [PPIO](https://ppinfra.com/user/register?invited_by=QJKFYD&utm_source=github_langbot) | ✅ | LLM and GPU resource platform |
| [接口 AI](https://jiekou.ai/) | ✅ | LLM aggregation platform |
| [ShengSuanYun](https://www.shengsuanyun.com/?from=CH_KYIPP758) | ✅ | LLM and GPU resource platform |
| [302.AI](https://share.302.ai/SuTG99) | ✅ | LLM gateway (MaaS) |
| [Google Gemini](https://aistudio.google.com/prompts/new_chat) | ✅ | |
| [Dify](https://dify.ai) | ✅ | LLMOps platform |
| [Ollama](https://ollama.com/) | ✅ | Local LLM runtime |
| [LMStudio](https://lmstudio.ai/) | ✅ | Local LLM runtime |
| [GiteeAI](https://ai.gitee.com/) | ✅ | LLM interface gateway (MaaS) |
| [SiliconFlow](https://siliconflow.cn/) | ✅ | LLM gateway (MaaS) |
| [Aliyun Bailian](https://bailian.console.aliyun.com/) | ✅ | LLM gateway (MaaS), LLMOps platform |
| [Volc Engine Ark](https://console.volcengine.com/ark/region:ark+cn-beijing/model?vendor=Bytedance&view=LIST_VIEW) | ✅ | LLM gateway (MaaS), LLMOps platform |
| [ModelScope](https://modelscope.cn/docs/model-service/API-Inference/intro) | ✅ | LLM gateway (MaaS) |
| [MCP](https://modelcontextprotocol.io/) | ✅ | Tool access supported via the MCP protocol |
## 🤝 Community Contributions
Thanks to the following [code contributors](https://github.com/langbot-app/LangBot/graphs/contributors) and other community members for their contributions to LangBot:
<a href="https://github.com/langbot-app/LangBot/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langbot-app/LangBot" />
</a>


@@ -7,6 +7,7 @@ services:
langbot_plugin_runtime:
image: rockchin/langbot:latest
container_name: langbot_plugin_runtime
platform: linux/amd64 # For Apple Silicon compatibility
volumes:
- ./data/plugins:/app/data/plugins
ports:
@@ -21,6 +22,7 @@ services:
langbot:
image: rockchin/langbot:latest
container_name: langbot
platform: linux/amd64 # For Apple Silicon compatibility
volumes:
- ./data:/app/data
- ./plugins:/app/plugins

docs/PYPI_INSTALLATION.md Normal file

@@ -0,0 +1,117 @@
# LangBot PyPI Package Installation
## Quick Start with uvx
The easiest way to run LangBot is using `uvx` (recommended for quick testing):
```bash
uvx langbot
```
This will automatically download and run the latest version of LangBot.
## Install with pip/uv
You can also install LangBot as a regular Python package:
```bash
# Using pip
pip install langbot
# Using uv
uv pip install langbot
```
Then run it:
```bash
langbot
```
Or using Python module syntax:
```bash
python -m langbot
```
## Installation with Frontend
When published to PyPI, the LangBot package includes the pre-built frontend files. You don't need to build the frontend separately.
## Data Directory
When running LangBot as a package, it will create a `data/` directory in your current working directory to store configuration, logs, and other runtime data. You can run LangBot from any directory, and it will set up its data directory there.
## Command Line Options
LangBot supports the following command line options:
- `--standalone-runtime`: Use standalone plugin runtime
- `--debug`: Enable debug mode
Example:
```bash
langbot --debug
```
## Comparison with Other Installation Methods
### PyPI Package (uvx/pip)
- **Pros**: Easy to install and update, no need to clone repository or build frontend
- **Cons**: Less flexible for development/customization
### Docker
- **Pros**: Isolated environment, easy deployment
- **Cons**: Requires Docker
### Manual Source Installation
- **Pros**: Full control, easy to customize and develop
- **Cons**: Requires building frontend, managing dependencies manually
## Development
If you want to contribute or customize LangBot, you should still use the manual installation method by cloning the repository:
```bash
git clone https://github.com/langbot-app/LangBot
cd LangBot
uv sync
cd web
npm install
npm run build
cd ..
uv run main.py
```
## Updating
To update to the latest version:
```bash
# With pip
pip install --upgrade langbot
# With uv
uv pip install --upgrade langbot
# With uvx (automatically uses latest)
uvx langbot
```
## System Requirements
- Python 3.10.1 or higher
- Operating System: Linux, macOS, or Windows
## Differences from Source Installation
When running LangBot from the PyPI package (via uvx or pip), there are a few behavioral differences compared to running from source:
1. **Version Check**: The package version does not prompt for user input when the Python version is incompatible. It simply prints an error message and exits. This makes it compatible with non-interactive environments like containers and CI/CD.
2. **Working Directory**: The package version does not require being run from the LangBot project root. You can run `langbot` from any directory, and it will create a `data/` directory in your current working directory.
3. **Frontend Files**: The frontend is pre-built and included in the package, so you don't need to run `npm run build` separately.
These differences are intentional to make the package more user-friendly and suitable for various deployment scenarios.


@@ -1,45 +0,0 @@
from v1 import client # type: ignore
import asyncio
import os
import json
class TestDifyClient:
async def test_chat_messages(self):
cln = client.AsyncDifyServiceClient(api_key=os.getenv('DIFY_API_KEY'), base_url=os.getenv('DIFY_BASE_URL'))
async for chunk in cln.chat_messages(inputs={}, query='调用工具查看现在几点?', user='test'):
print(json.dumps(chunk, ensure_ascii=False, indent=4))
async def test_upload_file(self):
cln = client.AsyncDifyServiceClient(api_key=os.getenv('DIFY_API_KEY'), base_url=os.getenv('DIFY_BASE_URL'))
file_bytes = open('img.png', 'rb').read()
print(type(file_bytes))
file = ('img2.png', file_bytes, 'image/png')
resp = await cln.upload_file(file=file, user='test')
print(json.dumps(resp, ensure_ascii=False, indent=4))
async def test_workflow_run(self):
cln = client.AsyncDifyServiceClient(api_key=os.getenv('DIFY_API_KEY'), base_url=os.getenv('DIFY_BASE_URL'))
# resp = await cln.workflow_run(inputs={}, user="test")
# # print(json.dumps(resp, ensure_ascii=False, indent=4))
# print(resp)
chunks = []
ignored_events = ['text_chunk']
async for chunk in cln.workflow_run(inputs={}, user='test'):
if chunk['event'] in ignored_events:
continue
chunks.append(chunk)
print(json.dumps(chunks, ensure_ascii=False, indent=4))
if __name__ == '__main__':
asyncio.run(TestDifyClient().test_chat_messages())

main.py

@@ -1,117 +1,3 @@
import asyncio
import argparse
# LangBot 终端启动入口
# 在此层级解决依赖项检查。
# LangBot/main.py
import langbot.__main__
asciiart = r"""
_ ___ _
| | __ _ _ _ __ _| _ ) ___| |_
| |__/ _` | ' \/ _` | _ \/ _ \ _|
|____\__,_|_||_\__, |___/\___/\__|
|___/
⭐️ Open Source 开源地址: https://github.com/langbot-app/LangBot
📖 Documentation 文档地址: https://docs.langbot.app
"""
async def main_entry(loop: asyncio.AbstractEventLoop):
parser = argparse.ArgumentParser(description='LangBot')
parser.add_argument(
'--standalone-runtime',
action='store_true',
help='Use standalone plugin runtime / 使用独立插件运行时',
default=False,
)
parser.add_argument('--debug', action='store_true', help='Debug mode / 调试模式', default=False)
args = parser.parse_args()
if args.standalone_runtime:
from pkg.utils import platform
platform.standalone_runtime = True
if args.debug:
from pkg.utils import constants
constants.debug_mode = True
print(asciiart)
import sys
# 检查依赖
from pkg.core.bootutils import deps
missing_deps = await deps.check_deps()
if missing_deps:
print('以下依赖包未安装,将自动安装,请完成后重启程序:')
print(
'These dependencies are missing, they will be installed automatically, please restart the program after completion:'
)
for dep in missing_deps:
print('-', dep)
await deps.install_deps(missing_deps)
print('已自动安装缺失的依赖包,请重启程序。')
print('The missing dependencies have been installed automatically, please restart the program.')
sys.exit(0)
# # 检查pydantic版本如果没有 pydantic.v1则把 pydantic 映射为 v1
# import pydantic.version
# if pydantic.version.VERSION < '2.0':
# import pydantic
# sys.modules['pydantic.v1'] = pydantic
# 检查配置文件
from pkg.core.bootutils import files
generated_files = await files.generate_files()
if generated_files:
print('以下文件不存在,已自动生成:')
print('Following files do not exist and have been automatically generated:')
for file in generated_files:
print('-', file)
from pkg.core import boot
await boot.main(loop)
if __name__ == '__main__':
import os
import sys
# 必须大于 3.10.1
if sys.version_info < (3, 10, 1):
print('需要 Python 3.10.1 及以上版本,当前 Python 版本为:', sys.version)
input('按任意键退出...')
print('Your Python version is not supported. Please exit the program by pressing any key.')
exit(1)
# Check if the current directory is the LangBot project root directory
invalid_pwd = False
if not os.path.exists('main.py'):
invalid_pwd = True
else:
with open('main.py', 'r', encoding='utf-8') as f:
content = f.read()
if 'LangBot/main.py' not in content:
invalid_pwd = True
if invalid_pwd:
print('请在 LangBot 项目根目录下以命令形式运行此程序。')
input('按任意键退出...')
print('Please run this program in the LangBot project root directory in command form.')
print('Press any key to exit...')
exit(1)
loop = asyncio.new_event_loop()
loop.run_until_complete(main_entry(loop))
langbot.__main__.main()


@@ -1,115 +0,0 @@
from __future__ import annotations
import json
import typing
import os
import base64
import logging
import pydantic
import requests
from ..core import app
class Announcement(pydantic.BaseModel):
"""公告"""
id: int
time: str
timestamp: int
content: str
enabled: typing.Optional[bool] = True
def to_dict(self) -> dict:
return {
'id': self.id,
'time': self.time,
'timestamp': self.timestamp,
'content': self.content,
'enabled': self.enabled,
}
class AnnouncementManager:
"""公告管理器"""
ap: app.Application = None
def __init__(self, ap: app.Application):
self.ap = ap
async def fetch_all(self) -> list[Announcement]:
"""获取所有公告"""
try:
resp = requests.get(
url='https://api.github.com/repos/langbot-app/LangBot/contents/res/announcement.json',
proxies=self.ap.proxy_mgr.get_forward_proxies(),
timeout=5,
)
resp.raise_for_status() # 检查请求是否成功
obj_json = resp.json()
b64_content = obj_json['content']
# 解码
content = base64.b64decode(b64_content).decode('utf-8')
return [Announcement(**item) for item in json.loads(content)]
except (requests.RequestException, json.JSONDecodeError, KeyError) as e:
self.ap.logger.warning(f'获取公告失败: {e}')
pass
return [] # 请求失败时返回空列表
async def fetch_saved(self) -> list[Announcement]:
if not os.path.exists('data/labels/announcement_saved.json'):
with open('data/labels/announcement_saved.json', 'w', encoding='utf-8') as f:
f.write('[]')
with open('data/labels/announcement_saved.json', 'r', encoding='utf-8') as f:
content = f.read()
if not content:
content = '[]'
return [Announcement(**item) for item in json.loads(content)]
async def write_saved(self, content: list[Announcement]):
with open('data/labels/announcement_saved.json', 'w', encoding='utf-8') as f:
f.write(json.dumps([item.to_dict() for item in content], indent=4, ensure_ascii=False))
async def fetch_new(self) -> list[Announcement]:
"""获取新公告"""
all = await self.fetch_all()
saved = await self.fetch_saved()
to_show: list[Announcement] = []
for item in all:
# 遍历saved检查是否有相同id的公告
for saved_item in saved:
if saved_item.id == item.id:
break
else:
if item.enabled:
# 没有相同id的公告
to_show.append(item)
await self.write_saved(all)
return to_show
async def show_announcements(self) -> typing.Tuple[str, int]:
"""显示公告"""
try:
announcements = await self.fetch_new()
ann_text = ''
for ann in announcements:
ann_text += f'[公告] {ann.time}: {ann.content}\n'
# TODO statistics
return ann_text, logging.INFO
except Exception as e:
return f'获取公告时出错: {e}', logging.WARNING


@@ -1,8 +1,9 @@
[project]
name = "langbot"
version = "4.5.0"
version = "4.5.4"
description = "Easy-to-use global IM bot platform designed for LLM era"
readme = "README.md"
license-files = ["LICENSE"]
requires-python = ">=3.10.1,<4.0"
dependencies = [
"aiocqhttp>=1.4.4",
@@ -45,7 +46,6 @@ dependencies = [
"urllib3>=2.4.0",
"websockets>=15.0.1",
"python-socks>=2.7.1", # dingtalk missing dependency
"taskgroup==0.0.0a4", # graingert/taskgroup#20
"pip>=25.1.1",
"ruff>=0.11.9",
"pre-commit>=4.2.0",
@@ -63,7 +63,7 @@ dependencies = [
"langchain-text-splitters>=0.0.1",
"chromadb>=0.4.24",
"qdrant-client (>=1.15.1,<2.0.0)",
"langbot-plugin==0.1.11b1",
"langbot-plugin==0.1.12",
"asyncpg>=0.30.0",
"line-bot-sdk>=3.19.0",
"tboxsdk>=0.0.10",
@@ -85,11 +85,10 @@ keywords = [
"onebot",
]
classifiers = [
"Development Status :: 3 - Alpha",
"Development Status :: 5 - Production/Stable",
"Framework :: AsyncIO",
"Framework :: Robot Framework",
"Framework :: Robot Framework :: Library",
"License :: OSI Approved :: AGPL-3 License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Communications :: Chat",
@@ -100,6 +99,16 @@ Homepage = "https://langbot.app"
Documentation = "https://docs.langbot.app"
Repository = "https://github.com/langbot-app/LangBot"
[project.scripts]
langbot = "langbot.__main__:main"
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"
[tool.setuptools]
package-data = { "langbot" = ["templates/**", "pkg/provider/modelmgr/requesters/*", "pkg/platform/sources/*", "web/out/**"] }
[dependency-groups]
dev = [
"pre-commit>=4.2.0",


@@ -26,7 +26,7 @@ markers =
# Coverage options (when using pytest-cov)
[coverage:run]
source = pkg
source = langbot.pkg
omit =
*/tests/*
*/test_*.py

src/langbot/__init__.py Normal file

@@ -0,0 +1,3 @@
"""LangBot - Easy-to-use global IM bot platform designed for LLM era"""
__version__ = '4.5.4'

src/langbot/__main__.py Normal file

@@ -0,0 +1,104 @@
"""LangBot entry point for package execution"""
import asyncio
import argparse
import sys
import os
# ASCII art banner
asciiart = r"""
_ ___ _
| | __ _ _ _ __ _| _ ) ___| |_
| |__/ _` | ' \/ _` | _ \/ _ \ _|
|____\__,_|_||_\__, |___/\___/\__|
|___/
⭐️ Open Source 开源地址: https://github.com/langbot-app/LangBot
📖 Documentation 文档地址: https://docs.langbot.app
"""
async def main_entry(loop: asyncio.AbstractEventLoop):
"""Main entry point for LangBot"""
parser = argparse.ArgumentParser(description='LangBot')
parser.add_argument(
'--standalone-runtime',
action='store_true',
help='Use standalone plugin runtime / 使用独立插件运行时',
default=False,
)
parser.add_argument('--debug', action='store_true', help='Debug mode / 调试模式', default=False)
args = parser.parse_args()
if args.standalone_runtime:
from langbot.pkg.utils import platform
platform.standalone_runtime = True
if args.debug:
from langbot.pkg.utils import constants
constants.debug_mode = True
print(asciiart)
# Check dependencies
from langbot.pkg.core.bootutils import deps
missing_deps = await deps.check_deps()
if missing_deps:
print('以下依赖包未安装,将自动安装,请完成后重启程序:')
print(
'These dependencies are missing, they will be installed automatically, please restart the program after completion:'
)
for dep in missing_deps:
print('-', dep)
await deps.install_deps(missing_deps)
print('已自动安装缺失的依赖包,请重启程序。')
print('The missing dependencies have been installed automatically, please restart the program.')
sys.exit(0)
# Check configuration files
from langbot.pkg.core.bootutils import files
generated_files = await files.generate_files()
if generated_files:
print('以下文件不存在,已自动生成:')
print('Following files do not exist and have been automatically generated:')
for file in generated_files:
print('-', file)
from langbot.pkg.core import boot
await boot.main(loop)
def main():
"""Main function to be called by console script entry point"""
# Check Python version
if sys.version_info < (3, 10, 1):
print('需要 Python 3.10.1 及以上版本,当前 Python 版本为:', sys.version)
print('Your Python version is not supported.')
print('Python 3.10.1 or higher is required. Current version:', sys.version)
sys.exit(1)
# Set up the working directory
# When installed as a package, we need to handle the working directory differently
# We'll create data directory in current working directory if not exists
os.makedirs('data', exist_ok=True)
loop = asyncio.new_event_loop()
try:
loop.run_until_complete(main_entry(loop))
except KeyboardInterrupt:
print('\n正在退出...')
print('Exiting...')
finally:
loop.close()
if __name__ == '__main__':
main()


@@ -7,10 +7,8 @@ import os
from pathlib import Path
class AsyncCozeAPIClient:
def __init__(self, api_key: str, api_base: str = "https://api.coze.cn"):
def __init__(self, api_key: str, api_base: str = 'https://api.coze.cn'):
self.api_key = api_key
self.api_base = api_base
self.session = None
@@ -24,13 +22,11 @@ class AsyncCozeAPIClient:
"""退出时自动关闭会话"""
await self.close()
async def coze_session(self):
"""确保HTTP session存在"""
if self.session is None:
connector = aiohttp.TCPConnector(
ssl=False if self.api_base.startswith("http://") else True,
ssl=False if self.api_base.startswith('http://') else True,
limit=100,
limit_per_host=30,
keepalive_timeout=30,
@@ -42,12 +38,10 @@ class AsyncCozeAPIClient:
sock_read=120,
)
headers = {
"Authorization": f"Bearer {self.api_key}",
"Accept": "text/event-stream",
'Authorization': f'Bearer {self.api_key}',
'Accept': 'text/event-stream',
}
self.session = aiohttp.ClientSession(
headers=headers, timeout=timeout, connector=connector
)
self.session = aiohttp.ClientSession(headers=headers, timeout=timeout, connector=connector)
return self.session
async def close(self):
@@ -63,15 +57,15 @@ class AsyncCozeAPIClient:
# 处理 Path 对象
if isinstance(file, Path):
if not file.exists():
raise ValueError(f"File not found: {file}")
with open(file, "rb") as f:
raise ValueError(f'File not found: {file}')
with open(file, 'rb') as f:
file = f.read()
# 处理文件路径字符串
elif isinstance(file, str):
if not os.path.isfile(file):
raise ValueError(f"File not found: {file}")
with open(file, "rb") as f:
raise ValueError(f'File not found: {file}')
with open(file, 'rb') as f:
file = f.read()
# 处理文件对象
@@ -79,43 +73,39 @@ class AsyncCozeAPIClient:
file = file.read()
session = await self.coze_session()
url = f"{self.api_base}/v1/files/upload"
url = f'{self.api_base}/v1/files/upload'
try:
file_io = io.BytesIO(file)
async with session.post(
url,
data={
"file": file_io,
'file': file_io,
},
timeout=aiohttp.ClientTimeout(total=60),
) as response:
if response.status == 401:
raise Exception("Coze API 认证失败,请检查 API Key 是否正确")
raise Exception('Coze API 认证失败,请检查 API Key 是否正确')
response_text = await response.text()
if response.status != 200:
raise Exception(
f"文件上传失败,状态码: {response.status}, 响应: {response_text}"
)
raise Exception(f'文件上传失败,状态码: {response.status}, 响应: {response_text}')
try:
result = await response.json()
except json.JSONDecodeError:
raise Exception(f"文件上传响应解析失败: {response_text}")
raise Exception(f'文件上传响应解析失败: {response_text}')
if result.get("code") != 0:
raise Exception(f"文件上传失败: {result.get('msg', '未知错误')}")
if result.get('code') != 0:
raise Exception(f'文件上传失败: {result.get("msg", "未知错误")}')
file_id = result["data"]["id"]
file_id = result['data']['id']
return file_id
except asyncio.TimeoutError:
raise Exception("文件上传超时")
raise Exception('文件上传超时')
except Exception as e:
raise Exception(f"文件上传失败: {str(e)}")
raise Exception(f'文件上传失败: {str(e)}')
async def chat_messages(
self,
@@ -139,22 +129,21 @@ class AsyncCozeAPIClient:
timeout: 超时时间
"""
session = await self.coze_session()
url = f"{self.api_base}/v3/chat"
url = f'{self.api_base}/v3/chat'
payload = {
"bot_id": bot_id,
"user_id": user_id,
"stream": stream,
"auto_save_history": auto_save_history,
'bot_id': bot_id,
'user_id': user_id,
'stream': stream,
'auto_save_history': auto_save_history,
}
if additional_messages:
payload["additional_messages"] = additional_messages
payload['additional_messages'] = additional_messages
params = {}
if conversation_id:
params["conversation_id"] = conversation_id
params['conversation_id'] = conversation_id
try:
async with session.post(
@@ -164,29 +153,25 @@ class AsyncCozeAPIClient:
timeout=aiohttp.ClientTimeout(total=timeout),
) as response:
if response.status == 401:
raise Exception("Coze API 认证失败,请检查 API Key 是否正确")
raise Exception('Coze API 认证失败,请检查 API Key 是否正确')
if response.status != 200:
raise Exception(f"Coze API 流式请求失败,状态码: {response.status}")
raise Exception(f'Coze API 流式请求失败,状态码: {response.status}')
async for chunk in response.content:
chunk = chunk.decode("utf-8")
chunk = chunk.decode('utf-8')
if chunk != '\n':
if chunk.startswith("event:"):
chunk_type = chunk.replace("event:", "", 1).strip()
elif chunk.startswith("data:"):
chunk_data = chunk.replace("data:", "", 1).strip()
if chunk.startswith('event:'):
chunk_type = chunk.replace('event:', '', 1).strip()
elif chunk.startswith('data:'):
chunk_data = chunk.replace('data:', '', 1).strip()
else:
yield {"event": chunk_type, "data": json.loads(chunk_data) if chunk_data else {}} # 处理本地部署时接口返回的data为空值
yield {
'event': chunk_type,
'data': json.loads(chunk_data) if chunk_data else {},
} # 处理本地部署时接口返回的data为空值
except asyncio.TimeoutError:
raise Exception(f"Coze API 流式请求超时 ({timeout}秒)")
raise Exception(f'Coze API 流式请求超时 ({timeout}秒)')
except Exception as e:
raise Exception(f"Coze API 流式请求失败: {str(e)}")
raise Exception(f'Coze API 流式请求失败: {str(e)}')
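The reformatted Coze client above is easiest to read next to a usage sketch. The following is a minimal, hedged example: the module path `langbot.libs.coze_api.client`, the bot and user identifiers, and the message payload shape are assumptions, while the constructor and `chat_messages` parameters are taken from the diff.

```python
# Hedged sketch: consuming AsyncCozeAPIClient.chat_messages as shown above.
# The import path, bot_id/user_id values, and message shape are placeholders.
import asyncio

from langbot.libs.coze_api.client import AsyncCozeAPIClient  # path assumed


async def demo():
    client = AsyncCozeAPIClient(api_key='pat_xxx')  # api_base defaults to https://api.coze.cn
    async for event in client.chat_messages(
        bot_id='your-bot-id',
        user_id='demo-user',
        additional_messages=[{'role': 'user', 'content': 'Hello', 'content_type': 'text'}],
        stream=True,
        auto_save_history=True,
    ):
        # Each yielded item is {'event': <sse event name>, 'data': <parsed json>}.
        print(event['event'], event['data'])
    await client.close()


asyncio.run(demo())
```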


@@ -32,6 +32,7 @@ class AsyncDifyServiceClient:
conversation_id: str = '',
files: list[dict[str, typing.Any]] = [],
timeout: float = 30.0,
model_config: dict[str, typing.Any] | None = None,
) -> typing.AsyncGenerator[dict[str, typing.Any], None]:
"""发送消息"""
if response_mode != 'streaming':
@@ -42,6 +43,16 @@ class AsyncDifyServiceClient:
trust_env=True,
timeout=timeout,
) as client:
payload = {
'inputs': inputs,
'query': query,
'user': user,
'response_mode': response_mode,
'conversation_id': conversation_id,
'files': files,
'model_config': model_config or {},
}
async with client.stream(
'POST',
'/chat-messages',
@@ -49,14 +60,7 @@ class AsyncDifyServiceClient:
'Authorization': f'Bearer {self.api_key}',
'Content-Type': 'application/json',
},
json={
'inputs': inputs,
'query': query,
'user': user,
'response_mode': response_mode,
'conversation_id': conversation_id,
'files': files,
},
json=payload,
) as r:
async for chunk in r.aiter_lines():
if r.status_code != 200:
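The diff above threads the new `model_config` parameter into the `/chat-messages` request payload. Below is a minimal sketch of calling it, assuming the client lives at `langbot.libs.dify_service_api.v1.client`; the API key, base URL, and `model_config` contents are placeholders, and the accepted fields depend on your Dify deployment.

```python
# Hedged sketch: passing model_config (for assistant-type apps) to chat_messages.
# Import path, credentials, and the model_config shape are placeholders.
import asyncio

from langbot.libs.dify_service_api.v1.client import AsyncDifyServiceClient  # path assumed


async def demo():
    client = AsyncDifyServiceClient(
        api_key='app-xxxxxxxx',              # placeholder
        base_url='https://api.dify.ai/v1',   # placeholder
    )
    async for chunk in client.chat_messages(
        inputs={},
        query='Hello',
        user='demo-user',
        model_config={'model': {'provider': 'openai', 'name': 'gpt-4o-mini'}},  # shape assumed
    ):
        print(chunk)


asyncio.run(demo())
```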


@@ -194,28 +194,23 @@ class DingTalkClient:
'Type': 'richText',
'Elements': [], # 按顺序存储所有元素
'SimpleContent': '', # 兼容字段:纯文本内容
'SimplePicture': '' # 兼容字段:第一张图片
'SimplePicture': '', # 兼容字段:第一张图片
}
# 先收集所有文本和图片占位符
text_elements = []
image_placeholders = []
# 解析富文本内容,保持原始顺序
for item in data['richText']:
# 处理文本内容
if 'text' in item and item['text'] != "\n":
element = {
'Type': 'text',
'Content': item['text']
}
if 'text' in item and item['text'] != '\n':
element = {'Type': 'text', 'Content': item['text']}
rich_content['Elements'].append(element)
text_elements.append(item['text'])
# 检查是否是图片元素 - 根据钉钉API的实际结构调整
# 钉钉富文本中的图片通常有特定标识,可能需要根据实际返回调整
elif item.get("type") == "picture":
elif item.get('type') == 'picture':
# 创建图片占位符
element = {
'Type': 'image_placeholder',
@@ -232,10 +227,7 @@ class DingTalkClient:
if element['Type'] == 'image_placeholder':
if image_index < len(image_list) and image_list[image_index]:
image_url = await self.download_image(image_list[image_index])
new_elements.append({
'Type': 'image',
'Picture': image_url
})
new_elements.append({'Type': 'image', 'Picture': image_url})
image_index += 1
else:
# 如果没有对应的图片,保留占位符或跳过
@@ -245,7 +237,6 @@ class DingTalkClient:
rich_content['Elements'] = new_elements
# 设置兼容字段
all_texts = [elem['Content'] for elem in rich_content['Elements'] if elem.get('Type') == 'text']
rich_content['SimpleContent'] = '\n'.join(all_texts) if all_texts else ''
@@ -261,8 +252,6 @@ class DingTalkClient:
if all_images:
message_data['Picture'] = all_images[0]
elif incoming_message.message_type == 'text':
message_data['Content'] = incoming_message.get_text_list()[0]


@@ -43,7 +43,6 @@ class DingTalkEvent(dict):
def name(self):
return self.get('Name', '')
@property
def conversation(self):
return self.get('conversation_type', '')


@@ -1,12 +1,12 @@
# 微信公众号的加解密算法与企业微信一样,所以直接使用企业微信的加解密算法文件
import time
import traceback
from libs.wecom_api.WXBizMsgCrypt3 import WXBizMsgCrypt
from langbot.libs.wecom_api.WXBizMsgCrypt3 import WXBizMsgCrypt
import xml.etree.ElementTree as ET
from quart import Quart, request
import hashlib
from typing import Callable
from .oaevent import OAEvent
from langbot.libs.official_account_api.oaevent import OAEvent
import asyncio


@@ -1,4 +1,4 @@
from libs.wechatpad_api.util.http_util import post_json
from langbot.libs.wechatpad_api.util.http_util import post_json
class ChatRoomApi:


@@ -1,4 +1,4 @@
from libs.wechatpad_api.util.http_util import post_json
from langbot.libs.wechatpad_api.util.http_util import post_json
import httpx
import base64


@@ -1,4 +1,4 @@
from libs.wechatpad_api.util.http_util import post_json, get_json
from langbot.libs.wechatpad_api.util.http_util import post_json, get_json
class LoginApi:


@@ -1,4 +1,4 @@
from libs.wechatpad_api.util.http_util import post_json
from langbot.libs.wechatpad_api.util.http_util import post_json
class MessageApi:


@@ -1,4 +1,4 @@
from libs.wechatpad_api.util.http_util import post_json, async_request, get_json
from langbot.libs.wechatpad_api.util.http_util import post_json, async_request, get_json
class UserApi:


@@ -1,9 +1,9 @@
from libs.wechatpad_api.api.login import LoginApi
from libs.wechatpad_api.api.friend import FriendApi
from libs.wechatpad_api.api.message import MessageApi
from libs.wechatpad_api.api.user import UserApi
from libs.wechatpad_api.api.downloadpai import DownloadApi
from libs.wechatpad_api.api.chatroom import ChatRoomApi
from langbot.libs.wechatpad_api.api.login import LoginApi
from langbot.libs.wechatpad_api.api.friend import FriendApi
from langbot.libs.wechatpad_api.api.message import MessageApi
from langbot.libs.wechatpad_api.api.user import UserApi
from langbot.libs.wechatpad_api.api.downloadpai import DownloadApi
from langbot.libs.wechatpad_api.api.chatroom import ChatRoomApi
class WeChatPadClient:


@@ -16,7 +16,7 @@ import struct
from Crypto.Cipher import AES
import xml.etree.cElementTree as ET
import socket
from libs.wecom_ai_bot_api import ierror
from langbot.libs.wecom_ai_bot_api import ierror
"""


@@ -13,9 +13,9 @@ import httpx
from Crypto.Cipher import AES
from quart import Quart, request, Response, jsonify
from libs.wecom_ai_bot_api import wecombotevent
from libs.wecom_ai_bot_api.WXBizMsgCrypt3 import WXBizMsgCrypt
from pkg.platform.logger import EventLogger
from langbot.libs.wecom_ai_bot_api import wecombotevent
from langbot.libs.wecom_ai_bot_api.WXBizMsgCrypt3 import WXBizMsgCrypt
from langbot.pkg.platform.logger import EventLogger
@dataclass
@@ -219,10 +219,7 @@ class WecomBotClient:
self.ReceiveId = ''
self.app = Quart(__name__)
self.app.add_url_rule(
'/callback/command',
'handle_callback',
self.handle_callback_request,
methods=['POST', 'GET']
'/callback/command', 'handle_callback', self.handle_callback_request, methods=['POST', 'GET']
)
self._message_handlers = {
'example': [],
@@ -420,7 +417,7 @@ class WecomBotClient:
await self.logger.error("请求体中缺少 'encrypt' 字段")
return Response('Bad Request', status=400)
xml_post_data = f"<xml><Encrypt><![CDATA[{encrypted_msg}]]></Encrypt></xml>"
xml_post_data = f'<xml><Encrypt><![CDATA[{encrypted_msg}]]></Encrypt></xml>'
ret, decrypted_xml = self.wxcpt.DecryptMsg(xml_post_data, msg_signature, timestamp, nonce)
if ret != 0:
await self.logger.error('解密失败')
@@ -458,7 +455,7 @@ class WecomBotClient:
picurl = item.get('image', {}).get('url')
if texts:
message_data['content'] = "".join(texts) # 拼接所有 text
message_data['content'] = ''.join(texts) # 拼接所有 text
if picurl:
base64 = await self.download_url_to_base64(picurl, self.EnCodingAESKey)
message_data['picurl'] = base64 # 只保留第一个 image
@@ -466,7 +463,9 @@ class WecomBotClient:
# Extract user information
from_info = msg_json.get('from', {})
message_data['userid'] = from_info.get('userid', '')
message_data['username'] = from_info.get('alias', '') or from_info.get('name', '') or from_info.get('userid', '')
message_data['username'] = (
from_info.get('alias', '') or from_info.get('name', '') or from_info.get('userid', '')
)
# Extract chat/group information
if msg_json.get('chattype', '') == 'group':
@@ -555,7 +554,7 @@ class WecomBotClient:
encrypted_bytes = response.content
aes_key = base64.b64decode(encoding_aes_key + "=") # base64 补齐
aes_key = base64.b64decode(encoding_aes_key + '=') # base64 补齐
iv = aes_key[:16]
cipher = AES.new(aes_key, AES.MODE_CBC, iv)
@@ -564,22 +563,22 @@ class WecomBotClient:
pad_len = decrypted[-1]
decrypted = decrypted[:-pad_len]
if decrypted.startswith(b"\xff\xd8"): # JPEG
mime_type = "image/jpeg"
elif decrypted.startswith(b"\x89PNG"): # PNG
mime_type = "image/png"
elif decrypted.startswith((b"GIF87a", b"GIF89a")): # GIF
mime_type = "image/gif"
elif decrypted.startswith(b"BM"): # BMP
mime_type = "image/bmp"
elif decrypted.startswith(b"II*\x00") or decrypted.startswith(b"MM\x00*"): # TIFF
mime_type = "image/tiff"
if decrypted.startswith(b'\xff\xd8'): # JPEG
mime_type = 'image/jpeg'
elif decrypted.startswith(b'\x89PNG'): # PNG
mime_type = 'image/png'
elif decrypted.startswith((b'GIF87a', b'GIF89a')): # GIF
mime_type = 'image/gif'
elif decrypted.startswith(b'BM'): # BMP
mime_type = 'image/bmp'
elif decrypted.startswith(b'II*\x00') or decrypted.startswith(b'MM\x00*'): # TIFF
mime_type = 'image/tiff'
else:
mime_type = "application/octet-stream"
mime_type = 'application/octet-stream'
# 转 base64
base64_str = base64.b64encode(decrypted).decode("utf-8")
return f"data:{mime_type};base64,{base64_str}"
base64_str = base64.b64encode(decrypted).decode('utf-8')
return f'data:{mime_type};base64,{base64_str}'
async def run_task(self, host: str, port: int, *args, **kwargs):
"""


@@ -29,7 +29,12 @@ class WecomBotEvent(dict):
"""
用户名称
"""
return self.get('username', '') or self.get('from', {}).get('alias', '') or self.get('from', {}).get('name', '') or self.userid
return (
self.get('username', '')
or self.get('from', {}).get('alias', '')
or self.get('from', {}).get('name', '')
or self.userid
)
@property
def chatname(self) -> str:
@@ -65,7 +70,7 @@ class WecomBotEvent(dict):
消息id
"""
return self.get('msgid', '')
@property
def ai_bot_id(self) -> str:
"""


@@ -109,14 +109,13 @@ class WecomClient:
async def send_image(self, user_id: str, agent_id: int, media_id: str):
if not await self.check_access_token():
self.access_token = await self.get_access_token(self.secret)
url = self.base_url + '/media/upload?access_token=' + self.access_token
url = self.base_url + '/message/send?access_token=' + self.access_token
async with httpx.AsyncClient() as client:
params = {
'touser': user_id,
'toparty': '',
'totag': '',
'agentid': agent_id,
'msgtype': 'image',
'agentid': agent_id,
'image': {
'media_id': media_id,
},
@@ -125,19 +124,13 @@ class WecomClient:
'enable_duplicate_check': 0,
'duplicate_check_interval': 1800,
}
try:
response = await client.post(url, json=params)
data = response.json()
except Exception as e:
await self.logger.error(f'发送图片失败:{data}')
raise Exception('Failed to send image: ' + str(e))
# 企业微信错误码40014和42001代表accesstoken问题
response = await client.post(url, json=params)
data = response.json()
if data['errcode'] == 40014 or data['errcode'] == 42001:
self.access_token = await self.get_access_token(self.secret)
return await self.send_image(user_id, agent_id, media_id)
if data['errcode'] != 0:
await self.logger.error(f'发送图片失败:{data}')
raise Exception('Failed to send image: ' + str(data))
async def send_private_msg(self, user_id: str, agent_id: int, content: str):
@@ -340,4 +333,3 @@ class WecomClient:
async def get_media_id(self, image: platform_message.Image):
media_id = await self.upload_to_work(image=image)
return media_id


@@ -110,7 +110,7 @@ class RouterGroup(abc.ABC):
elif auth_type == AuthType.USER_TOKEN_OR_API_KEY:
# Try API key first (check X-API-Key header)
api_key = quart.request.headers.get('X-API-Key', '')
if api_key:
# API key authentication
try:
@@ -124,7 +124,9 @@ class RouterGroup(abc.ABC):
token = quart.request.headers.get('Authorization', '').replace('Bearer ', '')
if not token:
return self.http_status(401, -1, 'No valid authentication provided (user token or API key required)')
return self.http_status(
401, -1, 'No valid authentication provided (user token or API key required)'
)
try:
user_email = await self.ap.user_service.verify_jwt_token(token)
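With the change above, endpoints marked `USER_TOKEN_OR_API_KEY` accept an `X-API-Key` header as an alternative to a JWT user token. A hedged sketch of such a call follows, assuming a locally running instance and a `/api/v1/pipelines` endpoint; the host, port, path, and key value are placeholders, and only the header name comes from the router code above.

```python
# Hedged sketch: authenticating a management API call with an API key.
# Base URL, endpoint path, and key value are placeholders.
import httpx

resp = httpx.get(
    'http://localhost:5300/api/v1/pipelines',  # host/port/path assumed
    headers={'X-API-Key': 'your-api-key'},     # header name from the router code above
)
resp.raise_for_status()
print(resp.json())
```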


@@ -27,7 +27,9 @@ class PipelinesRouterGroup(group.RouterGroup):
async def _() -> str:
return self.success(data={'configs': await self.ap.pipeline_service.get_pipeline_metadata()})
@self.route('/<pipeline_uuid>', methods=['GET', 'PUT', 'DELETE'], auth_type=group.AuthType.USER_TOKEN_OR_API_KEY)
@self.route(
'/<pipeline_uuid>', methods=['GET', 'PUT', 'DELETE'], auth_type=group.AuthType.USER_TOKEN_OR_API_KEY
)
async def _(pipeline_uuid: str) -> str:
if quart.request.method == 'GET':
pipeline = await self.ap.pipeline_service.get_pipeline(pipeline_uuid)
@@ -47,7 +49,9 @@ class PipelinesRouterGroup(group.RouterGroup):
return self.success()
@self.route('/<pipeline_uuid>/extensions', methods=['GET', 'PUT'], auth_type=group.AuthType.USER_TOKEN_OR_API_KEY)
@self.route(
'/<pipeline_uuid>/extensions', methods=['GET', 'PUT'], auth_type=group.AuthType.USER_TOKEN_OR_API_KEY
)
async def _(pipeline_uuid: str) -> str:
if quart.request.method == 'GET':
# Get current extensions and available plugins


@@ -1,6 +1,7 @@
import quart
import mimetypes
from ... import group
from langbot.pkg.utils import importutil
@group.group_class('adapters', '/api/v1/platform/adapters')
@@ -31,4 +32,6 @@ class AdaptersRouterGroup(group.RouterGroup):
if icon_path is None:
return self.http_status(404, -1, 'icon not found')
return await quart.send_file(icon_path)
return quart.Response(
importutil.read_resource_file_bytes(icon_path), mimetype=mimetypes.guess_type(icon_path)[0]
)


@@ -1,6 +1,8 @@
import quart
import mimetypes
from ... import group
from langbot.pkg.utils import importutil
@group.group_class('provider/requesters', '/api/v1/provider/requesters')
@@ -32,4 +34,6 @@ class RequestersRouterGroup(group.RouterGroup):
if icon_path is None:
return self.http_status(404, -1, 'icon not found')
return await quart.send_file(icon_path)
return quart.Response(
importutil.read_resource_file_bytes(icon_path), mimetype=mimetypes.guess_type(icon_path)[0]
)
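Both icon routes now return bytes read through `importutil.read_resource_file_bytes` instead of `quart.send_file`, so icons still resolve when LangBot is installed as a wheel and its resources live inside the package rather than on a plain filesystem checkout. Below is a hedged sketch of one plausible implementation of such a helper; the real `langbot.pkg.utils.importutil` may differ.

```python
# Hedged sketch only: one plausible shape for read_resource_file_bytes.
import importlib.resources


def read_resource_file_bytes(relative_path: str) -> bytes:
    """Read a file bundled inside the installed langbot package and return its raw bytes."""
    return importlib.resources.files('langbot').joinpath(relative_path).read_bytes()
```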


@@ -55,17 +55,6 @@ class SystemRouterGroup(group.RouterGroup):
return self.success(data=exec(py_code, {'ap': ap}))
@self.route('/debug/tools/call', methods=['POST'], auth_type=group.AuthType.USER_TOKEN)
async def _() -> str:
if not constants.debug_mode:
return self.http_status(403, 403, 'Forbidden')
data = await quart.request.json
return self.success(
data=await self.ap.tool_mgr.execute_func_call(data['tool_name'], data['tool_parameters'])
)
@self.route(
'/debug/plugin/action',
methods=['POST'],


@@ -86,7 +86,9 @@ class HTTPController:
ginst = g(self.ap, self.quart_app)
await ginst.initialize()
frontend_path = 'web/out'
from ....utils import paths
frontend_path = paths.get_frontend_path()
@self.quart_app.route('/')
async def index():


@@ -61,9 +61,7 @@ class ApiKeyService:
async def delete_api_key(self, key_id: int) -> None:
"""Delete an API key"""
await self.ap.persistence_mgr.execute_async(
sqlalchemy.delete(apikey.ApiKey).where(apikey.ApiKey.id == key_id)
)
await self.ap.persistence_mgr.execute_async(sqlalchemy.delete(apikey.ApiKey).where(apikey.ApiKey.id == key_id))
async def update_api_key(self, key_id: int, name: str = None, description: str = None) -> None:
"""Update an API key's metadata (name, description)"""

Some files were not shown because too many files have changed in this diff.