LangBot is an open-source, LLM-native instant messaging bot development platform. It aims to provide an out-of-the-box IM bot development experience, offering Agent, RAG, MCP, and other LLM application capabilities, adapters for instant messaging platforms worldwide, and rich APIs for custom development.
📦 Getting Started
Quick Start (Recommended)
Use uvx to start with one command (no installation required):
uvx langbot
Or install with pip and run:
pip install langbot
langbot
Visit http://localhost:5300 to start using it.
Detailed documentation: PyPI Installation.
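If you installed with pip, upgrading follows the usual pip workflow. The package also ships a `__main__` entry point, so it can be launched as a module as well; a minimal sketch, assuming a standard installation:

```bash
# Upgrade to the latest release published on PyPI
pip install --upgrade langbot

# Launch via the module entry point (assumed to start the same app as the `langbot` command)
python -m langbot
```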
Docker Compose Deployment
git clone https://github.com/langbot-app/LangBot
cd LangBot/docker
docker compose up -d
Visit http://localhost:5300 to start using it.
Detailed documentation: Docker Deployment.
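For reference, a compose service for LangBot looks roughly like the sketch below. This is a hypothetical minimal example: the image name, volume path, and `platform` value are assumptions, and the `docker-compose.yaml` shipped in the repository's `docker/` directory is the authoritative version.

```yaml
# Hypothetical minimal compose file; use the repository's docker/docker-compose.yaml in practice.
services:
  langbot:
    image: rockchin/langbot:latest   # assumed image name
    platform: linux/amd64            # optional; pins the image architecture
    ports:
      - "5300:5300"                  # WebUI port, matches http://localhost:5300 above
    volumes:
      - ./data:/app/data             # assumed path for persistent data and configuration
    restart: unless-stopped
```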
One-click Deployment on BTPanel
LangBot is available on BTPanel. If you have BTPanel installed, you can deploy it by following the documentation.
Zeabur Cloud Deployment
A community-contributed Zeabur template is available.
Railway Cloud Deployment
Other Deployment Methods
Run a released version directly; see the Manual Deployment documentation.
Kubernetes Deployment
Refer to the Kubernetes Deployment documentation.
😎 Stay Ahead
Click the Star and Watch buttons in the upper-right corner of the repository to get the latest updates.
✨ Features
- 💬 Chat with LLM / Agent: Supports multiple LLMs and works in both group chats and private chats; supports multi-turn conversations, tool calls, multimodal input, and streaming output. Ships a built-in RAG (knowledge base) implementation and integrates deeply with Dify.
- 🤖 Multi-platform Support: Currently supports QQ, QQ Channel, WeCom, personal WeChat, Lark, DingTalk, Discord, Telegram, and more.
- 🛠️ High Stability, Feature-rich: Built-in access control, rate limiting, and sensitive-word filtering; easy to use and supports multiple deployment methods. Multiple pipeline configurations are supported, so different bots can serve different scenarios.
- 🧩 Plugin Extension, Active Community: Supports event-driven and component-extension plugin mechanisms; integrates the Anthropic MCP protocol; hundreds of plugins are currently available.
- 😻 Web UI: Manage your LangBot instance through the browser, with no need to hand-edit configuration files.
For more detailed specifications, please refer to the documentation.
Or visit the demo environment: https://demo.langbot.dev/
- Login information: Email: demo@langbot.app Password: langbot123456
- Note: This is a WebUI demo only; please do not enter any sensitive information in this public environment.
Message Platform
| Platform | Status | Remarks |
|---|---|---|
| Discord | ✅ | |
| Telegram | ✅ | |
| Slack | ✅ | |
| LINE | ✅ | |
| Personal QQ | ✅ | |
| QQ Official API | ✅ | |
| WeCom | ✅ | |
| WeComCS | ✅ | |
| WeCom AI Bot | ✅ | |
| Personal WeChat | ✅ | |
| Lark | ✅ | |
| DingTalk | ✅ | |
LLMs
| LLM | Status | Remarks |
|---|---|---|
| OpenAI | ✅ | Works with any model exposing an OpenAI-compatible API |
| DeepSeek | ✅ | |
| Moonshot | ✅ | |
| Anthropic | ✅ | |
| xAI | ✅ | |
| Zhipu AI | ✅ | |
| CompShare | ✅ | LLM and GPU resource platform |
| Dify | ✅ | LLMOps platform |
| PPIO | ✅ | LLM and GPU resource platform |
| 接口 AI | ✅ | LLM aggregation platform, dedicated to global LLMs |
| ShengSuanYun | ✅ | LLM and GPU resource platform |
| 302.AI | ✅ | LLM gateway (MaaS) |
| Google Gemini | ✅ | |
| Ollama | ✅ | Local LLM running platform |
| LMStudio | ✅ | Local LLM running platform |
| GiteeAI | ✅ | LLM interface gateway (MaaS) |
| SiliconFlow | ✅ | LLM gateway (MaaS) |
| Aliyun Bailian | ✅ | LLM gateway (MaaS), LLMOps platform |
| Volc Engine Ark | ✅ | LLM gateway (MaaS), LLMOps platform |
| ModelScope | ✅ | LLM gateway (MaaS) |
| MCP | ✅ | Supports tool access through the MCP protocol |
🤝 Community Contribution
Thanks to the following code contributors and other community members for their contributions to LangBot:
