LangBot is an open-source, LLM-native instant messaging bot development platform. It aims to provide an out-of-the-box IM bot development experience, with Agent, RAG, MCP, and other LLM application capabilities, adapters for instant messaging platforms worldwide, and rich API interfaces for custom development.
📦 Getting Started
Docker Compose Deployment
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
Visit http://localhost:5300 to start using it.
For details, see the Docker Deployment documentation.
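As an optional sanity check after `docker compose up -d`, a short Python snippet like the one below can confirm that the web UI is answering on the default port 5300; it simply probes the root URL and assumes you kept the default port mapping.

```python
# Quick post-deployment check: confirm the LangBot web UI answers on port 5300.
# Assumes the default Docker Compose port mapping; adjust the URL if you changed it.
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:5300", timeout=5) as resp:
        print(f"LangBot web UI is up (HTTP {resp.status})")
except OSError as exc:
    print(f"LangBot web UI is not reachable yet: {exc}")
```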
One-click Deployment on BTPanel
LangBot has been listed on BTPanel. If you have BTPanel installed, you can follow the documentation to deploy it.
Zeabur Cloud Deployment
Community-contributed Zeabur template.
Railway Cloud Deployment
Other Deployment Methods
Run the released version directly; see the Manual Deployment documentation.
😎 Stay Ahead
Click the Star and Watch buttons in the upper right corner of the repository to get the latest updates.
✨ Features
- 💬 Chat with LLM / Agent: Supports multiple LLMs and adapts to both group chats and private chats; supports multi-turn conversations, tool calls, multi-modal input, and streaming output. Ships with a built-in RAG (knowledge base) implementation and integrates deeply with Dify.
- 🤖 Multi-platform Support: Currently supports QQ, QQ Channel, WeCom, personal WeChat, Lark, DingTalk, Discord, Telegram, etc.
- 🛠️ High Stability, Feature-rich: Native access control, rate limiting, sensitive-word filtering, and other mechanisms; easy to use and supports multiple deployment methods. Multiple pipelines can be configured, so different bots can serve different usage scenarios.
- 🧩 Plugin Extension, Active Community: Supports event-driven and component-extension plugin mechanisms; integrates the Anthropic MCP protocol; hundreds of plugins are currently available (see the hypothetical sketch after this list).
- 😻 Web UI: Manage your LangBot instance through the browser, with no need to hand-write configuration files.
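As a rough illustration of the event-driven plugin mechanism mentioned above, here is a minimal, purely hypothetical sketch; the event, decorator, and class names are invented for illustration and are not LangBot's actual plugin SDK, so refer to the plugin development documentation for the real interfaces.

```python
# Purely hypothetical sketch of an event-driven plugin. on_event and
# PersonMessageReceived are illustrative names, NOT LangBot's real SDK.
from dataclasses import dataclass
from typing import Callable


@dataclass
class PersonMessageReceived:
    """Hypothetical event carrying an incoming private-chat message."""
    sender_id: str
    text: str
    reply: str | None = None


_HANDLERS: dict[type, Callable] = {}


def on_event(event_type: type):
    """Hypothetical decorator that registers a coroutine as an event handler."""
    def wrap(func: Callable) -> Callable:
        _HANDLERS[event_type] = func
        return func
    return wrap


class HelloPlugin:
    """Replies to private messages that start with 'hello'."""

    @on_event(PersonMessageReceived)
    async def handle(self, event: PersonMessageReceived) -> None:
        if event.text.lower().startswith("hello"):
            event.reply = "Hello! I am a LangBot plugin."
```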
For more detailed specifications, please refer to the documentation.
Or visit the demo environment: https://demo.langbot.dev/
- Login information: Email: `demo@langbot.app` Password: `langbot123456`
- Note: For WebUI demo only; please do not fill in any sensitive information in the public environment.
Message Platform
| Platform | Status | Remarks |
|---|---|---|
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| Personal QQ | ✅ | |
| QQ Official API | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| Personal WeChat | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
LLMs
| LLM | Status | Remarks |
|---|---|---|
| OpenAI | ✅ | Usable with any model served via an OpenAI-compatible interface |
| DeepSeek | ✅ | |
| Moonshot | ✅ | |
| Anthropic | ✅ | |
| xAI | ✅ | |
| Zhipu AI | ✅ | |
| CompShare | ✅ | LLM and GPU resource platform |
| Dify | ✅ | LLMOps platform |
| PPIO | ✅ | LLM and GPU resource platform |
| ShengSuanYun | ✅ | LLM and GPU resource platform |
| 302.AI | ✅ | LLM gateway (MaaS) |
| Google Gemini | ✅ | |
| Ollama | ✅ | Local LLM running platform |
| LMStudio | ✅ | Local LLM running platform |
| GiteeAI | ✅ | LLM interface gateway (MaaS) |
| SiliconFlow | ✅ | LLM gateway (MaaS) |
| Aliyun Bailian | ✅ | LLM gateway (MaaS), LLMOps platform |
| Volc Engine Ark | ✅ | LLM gateway (MaaS), LLMOps platform |
| ModelScope | ✅ | LLM gateway (MaaS) |
| MCP | ✅ | Support tool access through MCP protocol |
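For the MCP entry in the table above, the following is a minimal sketch of an external tool server built with the official `mcp` Python SDK (FastMCP) and exposed over SSE; the server name, tool, and port are arbitrary examples, and registering the resulting SSE endpoint in LangBot is described in the MCP documentation rather than in this snippet.

```python
# Minimal MCP tool server sketch using the official `mcp` Python SDK.
# The server name, tool, and port are illustrative examples only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools", port=8001)


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b


if __name__ == "__main__":
    # Serve over SSE so an MCP-capable client can connect to http://localhost:8001/sse
    mcp.run(transport="sse")
```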
🤝 Community Contribution
Thanks to the following code contributors and other community members for their contributions to LangBot:
