English / 简体中文 / 繁體中文 / 日本語 / Español / Français / 한국어 / Русский / Tiếng Việt
Home | Deployment | Plugin | Submit Plugin
LangBot is an open-source, LLM-native instant messaging bot development platform. It aims to provide an out-of-the-box IM bot development experience, ships with Agent, RAG, MCP, and other LLM application features, adapts to instant messaging platforms worldwide, and offers rich API interfaces for custom development.
📦 Getting Started
Quick Start
Start with a single command using uvx (uv must be installed first):
uvx langbot
Visit http://localhost:5300 to start using it.
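If uv is not installed yet, the standalone installer documented by the uv project can be used on Linux/macOS (see the uv documentation for Windows and other install methods):
curl -LsSf https://astral.sh/uv/install.sh | sh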
Docker Compose Deployment
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
Visit http://localhost:5300 to start using it.
See the Docker Deployment documentation for details.
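If you prefer plain Docker over Compose, a minimal sketch looks like the following; the image reference and volume path are placeholders rather than values from the LangBot docs, so check the Compose file under LangBot/docker for the actual settings:
# <langbot-image> and the volume path are placeholders; see the Compose file for real values
docker run -d --name langbot -p 5300:5300 -v $(pwd)/data:/app/data <langbot-image>
Then visit http://localhost:5300 as above.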
One-click Deployment on BTPanel
LangBot is listed in BTPanel. If you have BTPanel installed, follow the documentation to deploy it.
Zeabur Cloud Deployment
Community-contributed Zeabur template.
Railway Cloud Deployment
Other Deployment Methods
Run directly from a released version; see the Manual Deployment documentation.
Kubernetes Deployment
Refer to the Kubernetes Deployment documentation.
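As a rough sketch of what a minimal cluster deployment could look like (the image reference is a placeholder and the resource names are illustrative; the Kubernetes Deployment documentation remains the source of truth):
kubectl create deployment langbot --image=<langbot-image> --port=5300
kubectl expose deployment langbot --port=5300 --target-port=5300
kubectl port-forward deployment/langbot 5300:5300
After the port-forward, the Web UI is reachable at http://localhost:5300, as with the other deployment methods.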
😎 Stay Ahead
Click the Star and Watch buttons in the upper right corner of the repository to get the latest updates.
✨ Features
- 💬 Chat with LLM / Agent: Supports multiple LLMs and adapts to both group chats and private chats; supports multi-turn conversations, tool calls, multimodal input, and streaming output. Ships with a built-in RAG (knowledge base) implementation and integrates deeply with LLMOps platforms such as Dify, Coze, and n8n.
- 🤖 Multi-platform Support: Currently supports QQ, QQ Channel, WeCom, personal WeChat, Lark, DingTalk, Discord, Telegram, and more.
- 🛠️ High Stability, Feature-rich: Built-in access control, rate limiting, sensitive-word filtering, and other mechanisms; easy to use and supports multiple deployment methods. Multiple pipeline configurations are supported, so different bots can serve different scenarios.
- 🧩 Plugin Extension, Active Community: A stable, secure, production-grade plugin system; supports event-driven and component-extension plugin mechanisms; integrates the Anthropic MCP protocol; hundreds of plugins are already available.
- 😻 Web UI: Manage your LangBot instance from the browser, with no need to hand-write configuration files.
For more detailed specifications, please refer to the documentation.
Or visit the demo environment: https://demo.langbot.dev/
- Login information: Email: demo@langbot.app Password: langbot123456
- Note: For WebUI demo only; please do not enter any sensitive information in this public environment.
Message Platform
| Platform | Status | Remarks |
|---|---|---|
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| Personal QQ | ✅ | |
| QQ Official API | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| Personal WeChat | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
LLMs
| LLM | Status | Remarks |
|---|---|---|
| OpenAI | ✅ | Works with any model served through an OpenAI-compatible API |
| DeepSeek | ✅ | |
| Moonshot | ✅ | |
| Anthropic | ✅ | |
| xAI | ✅ | |
| Zhipu AI | ✅ | |
| CompShare | ✅ | LLM and GPU resource platform |
| Dify | ✅ | LLMOps platform |
| PPIO | ✅ | LLM and GPU resource platform |
| 接口 AI | ✅ | LLM aggregation platform, dedicated to global LLMs |
| ShengSuanYun | ✅ | LLM and GPU resource platform |
| 302.AI | ✅ | LLM gateway (MaaS) |
| Google Gemini | ✅ | |
| Ollama | ✅ | Local LLM running platform |
| LMStudio | ✅ | Local LLM running platform |
| GiteeAI | ✅ | LLM interface gateway (MaaS) |
| SiliconFlow | ✅ | LLM gateway (MaaS) |
| Aliyun Bailian | ✅ | LLM gateway (MaaS), LLMOps platform |
| Volc Engine Ark | ✅ | LLM gateway (MaaS), LLMOps platform |
| ModelScope | ✅ | LLM gateway (MaaS) |
| MCP | ✅ | Supports tool access via the MCP protocol |
🤝 Community Contribution
Thanks to the following code contributors and other community members for their contributions to LangBot:
