Compare commits

...

84 Commits

Author SHA1 Message Date
Xinrea
7e54231bef ci/cd: fix get previous tag 2025-09-07 19:11:49 +08:00
Xinrea
80a885dbf3 Release 2.12.0 (#179)
* chore: add devcontainer config (#175)

* refactor: refactor error handling and update dependencies (#176)

* docs: update webhook

* feat: add webhook module

* feat: webhook url settings

* feat: update webhook poster instead of recreating

* feat: add link for webhook docs

* refactor: using relative path for all covers

* fix: webhook in headless mode

* feat: implement all webhook events

* fix: webhook in headless mode

* feat: static host cache/output directory

* tests: add more tests (#178)

* chore: add tests

* chore: update

* fix: wrong cover type

* bump version to 2.12.0

* feat: change default clip bitrate to 6000k

---------

Co-authored-by: Sieluna <seele.peng@gmail.com>
2025-09-07 18:38:16 +08:00
Xinrea
134c6bbb5f chore: update dependencies 2025-08-31 11:01:32 +08:00
Xinrea
49a153adf7 chore: add cursor rules 2025-08-31 10:52:44 +08:00
Xinrea
99e15b0bda ci/cd: publish workflow with release body 2025-08-30 23:35:06 +08:00
Xinrea
4de8a73af2 chore: add logs for ts filename 2025-08-30 23:19:06 +08:00
Xinrea
d104ba3180 chore: update issue template 2025-08-30 10:52:00 +08:00
Xinrea
abf0d4748f chore: update issue template 2025-08-30 10:46:45 +08:00
Xinrea
d2a9c44601 feat: clip note support (#173)
* refactor: move components

* feat: note for clip (close #170)

* fix: import video handler

* fix: sort by note

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-08-30 10:24:53 +08:00
Xinrea
c269558bae refactor: using post for all handlers (#172)
* refactor: using post for all handlers

* chore: code format

* ci/cd: tests verbose

* ci/cd: remove compilation test

* ci/cd: add ffmpeg in test
2025-08-24 22:48:51 +08:00
Xinrea
cc22453a40 chore: fix tests 2025-08-24 21:20:26 +08:00
Xinrea
d525d92de4 ci/cd: add checks for pr 2025-08-24 21:13:16 +08:00
Xinrea
2197dfe65c chore: format code 2025-08-24 21:05:22 +08:00
Xinrea
38ee00f474 bump version to 2.11.7 2025-08-24 20:58:20 +08:00
Eeeeep4
8fdad41c71 fix: resolve subtitle drag sync issues with timeline scaling (#171)
- Fix double scaling in getSubtitleStyle causing position misalignment
- Improve edge detection logic for better drag precision
- Add time boundary constraints to prevent negative/overflow values
- Refactor drag handlers with helper functions for better maintainability
- Ensure subtitle blocks align with timeline markers at all zoom levels
2025-08-24 17:08:35 +08:00
Eeeeep4
f269995bb7 feat: batch video import (#163)
* feat(import): auto-detect and import new videos from the import directory

- Scan the configured import directory for newly added video files
- Automatically enqueue/import detected files into the application library
- Extend config and app initialization to support import directory settings
- Update video/config handlers to expose and trigger the import flow

* refactor(import): improve video import and conversion workflow

- Optimize auto-import loading state management with proper polling
- Separate large file (>500MB) async conversion from small file sync conversion
- Extract reusable helper functions for file cleanup and thumbnail generation
- Enhance UI progress display for large file conversions

* feat(import): add batch video import functionality

- Add batch_import_external_videos API endpoint with progress tracking
- Support multiple file selection in import dialog
- Display batch import progress with current file information

* feat: improve video import on headless mode

- Enhance batch video import workflow and progress tracking
- Improve HTTP server API structure and error handling

* fix(video): prevent OOM in headless mode by using fixed 64KB buffers
- introduce sanitize_filename_advanced with tests
- use fixed 64KB buffer and 1% progress reporting

* fix: resolve compilation warnings and improve SSE reliability

  - Add feature-gated compilation for platform-specific functions
  - Enhance SSE error handling and connection stability
  - Increase broadcast channel capacity and optimize keep-alive timing
2025-08-24 10:10:38 +08:00
Xinrea
03a2db8c44 feat: randomly choose stream variant (#168) 2025-08-20 23:40:56 +08:00
Xinrea
6d9cd3c6a8 fix: danmu reconnection (#167) 2025-08-20 23:08:41 +08:00
Xinrea
303b2f7036 fix: record breaks after stream expired (#166) 2025-08-20 22:55:06 +08:00
Xinrea
ec25c2ffd9 bump version to 2.11.6 2025-08-20 22:23:25 +08:00
Xinrea
50ab608ddb fix: cache/output dir migration (close #159) (#165)
* fix: cache/output dir migration (close #159)

* chore: adjust wrong log info

* fix: more accurate way to check path
2025-08-20 22:15:46 +08:00
Xinrea
3c76be9b81 feat: add batch delete for archives API and tool 2025-08-19 00:27:59 +08:00
Xinrea
ab7f0cf0b4 bump version to 2.11.5 2025-08-15 22:50:38 +08:00
Xinrea
f9f590c4dc fix: docker start with nscd 2025-08-15 22:47:52 +08:00
Xinrea
8d38fe582a fix: ffprobe segment fault in docker environment 2025-08-15 22:31:10 +08:00
Xinrea
dc4a26561d bump version to 2.11.4 2025-08-14 22:08:44 +08:00
Xinrea
10c1d1f3a8 feat: add video export button in clip list (close #156) 2025-08-14 22:05:11 +08:00
Xinrea
66bcf53d01 fix: database operation optimization (close #157) 2025-08-14 21:52:08 +08:00
Xinrea
8ab4b7d693 bump version to 2.11.3 2025-08-14 00:13:09 +08:00
Xinrea
ce2f097d32 fix: adjust body size limit for video importing 2025-08-14 00:12:02 +08:00
Xinrea
f7575cd327 bump version to 2.11.2 2025-08-10 21:35:21 +08:00
Xinrea
8634c6a211 fix: always start a new recording when update entries error 2025-08-10 21:34:41 +08:00
Xinrea
b070013efc bump version to 2.11.1 2025-08-10 20:56:48 +08:00
Eeeeep4
d2d9112f6c feat: smart flv conversion with progress (#155)
* fix: auto-close batch delete dialog after completion

* feat: smart FLV conversion with detailed progress and better UX

- Intelligent FLV→MP4 conversion (lossless stream copy + high-quality fallback)
- Real-time import progress with percentage tracking
- Smart file size display (auto GB/MB units)
- Optimized thumbnail generation and network file handling

* refactor: reorganize FFmpeg functions and fix network detection

- Move FFmpeg functions from video.rs to ffmpeg.rs
- Fix Windows drive letters misidentified as network paths
- Improve network vs local file detection logic

* fix: delete thumbnails when removing clipped videos
2025-08-10 17:00:31 +08:00
Xinrea
9fea18f2de ci/cd: remove unused rust-cache 2025-08-10 10:17:09 +08:00
Xinrea
74480f91ce ci/cd: fix rust cache 2025-08-10 00:05:44 +08:00
Xinrea
b2e13b631f bump version to 2.11.0 2025-08-09 23:47:39 +08:00
Xinrea
001d995c8f chore: style adjustment 2025-08-09 23:46:47 +08:00
Eeeeep4
8cb2acea88 feat: add universal video clipping support (close #78) (#145)
* feat: add universal video clipping support

- Allow clipping for all video types including imported and recorded videos
- Support secondary precise clipping from rough clips
- Fix video status validation to include completed recordings (status 0 and 1)

* feat: integrate video clipping into player and enhance subtitle timeline

- Optimize subtitle progress bar styling and timeline interaction
- Unify video clipping workflow across all video types
- Fix build issues and improve code quality with safer path handling

* feat: improve video clipping and UI consistency

- fix: resolve FFmpeg clipping start time offset issue
- fix: enhance clip selection creation logic
- style: unify interface styling and colors
- feat: complete HTTP APIs for headless mode

* fix: improve headless mode file handling and path resolution

- Add multipart file upload support for external video import in headless mode
- Fix file path resolution issues for both video files and thumbnails
- Make convertFileSrc and convertCoverSrc async to properly handle absolute path conversion in Tauri

* fix: correct video cover size and file size display in GUI import

Fix the cover size when importing videos
Fix file size display during GUI-side import
2025-08-09 23:22:53 +08:00
Xinrea
7c0d57d84e fix: clip danmu offset (#153)
* fix: account tutorial link

* feat: encoding-fix option for clip

* fix: clip on stream

* fix: danmu encoding offset

* fix: clip when no range provided
2025-08-09 21:00:28 +08:00
Xinrea
8cb875f449 bump version to 2.10.6 2025-08-07 23:31:17 +08:00
Xinrea
e6bbe65723 feat: backup api for douyin room info (#146) 2025-08-07 23:09:43 +08:00
Xinrea
f4a71a2476 bump version to 2.10.5 2025-08-07 01:01:05 +08:00
Xinrea
47b9362b0a fix: douyin manifest and ts fetch error 2025-08-06 23:28:46 +08:00
Xinrea
c1aad0806e fix: subtitle result not saved 2025-08-06 23:08:29 +08:00
Xinrea
4ccc90f9fb docs: update 2025-08-05 08:35:46 +08:00
Xinrea
7dc63440e6 docs: update 2025-08-04 23:29:17 +08:00
Xinrea
4094e8b80d docs: update 2025-08-04 22:05:31 +08:00
Xinrea
e27cbaf715 bump version to 2.10.4 2025-08-04 00:20:12 +08:00
Xinrea
1f39b27d79 fix: creation_flags on windows 2025-08-03 23:56:56 +08:00
Xinrea
f45891fd95 fix: cmd window on windows 2025-08-03 23:17:36 +08:00
Xinrea
18fe644715 bump version to 2.10.3 2025-08-03 21:18:21 +08:00
Xinrea
40cde8c69a fix: no danmaku after adding with short room id 2025-08-03 21:17:41 +08:00
Xinrea
4b0af47906 fix: douyin room info params 2025-08-03 20:45:07 +08:00
Xinrea
9365b3c8cd bump version to 2.10.2 2025-08-02 21:08:29 +08:00
Xinrea
4b9f015ea7 fix: introduce user-agent configuration to avoid access limit 2025-08-02 21:07:06 +08:00
Xinrea
c42d4a084e doc: update 2025-08-02 01:18:04 +08:00
Xinrea
5bb3feb05b bump version to 2.10.1 2025-07-31 23:08:45 +08:00
Xinrea
05f776ed8b chore: adjust logs 2025-07-31 23:07:37 +08:00
Xinrea
9cec809485 fix: button disabled when triggered by deeplinking 2025-07-31 22:50:57 +08:00
Xinrea
429f909152 feat: break recording when resolution changes (close #144) 2025-07-31 22:39:45 +08:00
Xinrea
084dd23df1 Revert "fix: start a new recording when header changes"
This reverts commit 955e284d41.
2025-07-31 21:15:26 +08:00
Xinrea
e55afdd739 docs: update 2025-07-31 00:30:02 +08:00
Xinrea
72128a132b docs: update 2025-07-30 01:48:29 +08:00
Xinrea
92ca2cddad fix: dependencies 2025-07-29 00:59:21 +08:00
Xinrea
3db0d1dfe5 feat: manual input model name (close #143) 2025-07-29 00:09:06 +08:00
Xinrea
57907323e6 bump version to 2.10.0 2025-07-27 19:52:53 +08:00
Xinrea
dbdca44c5f feat: deep-link support bsr:// 2025-07-27 19:51:58 +08:00
Xinrea
fe1dd2201f fix: prevent list corruption when deleting archived items 2025-07-26 22:52:45 +08:00
Xinrea
e0ae194cc3 bump version to 2.9.5 2025-07-26 22:40:50 +08:00
Xinrea
6fc5700457 ci/cd: add script to bump version 2025-07-26 22:40:49 +08:00
Xinrea
c4fdcf86d4 fix: bilibili stream pathway not update (close #117) 2025-07-26 22:40:46 +08:00
Xinrea
3088500c8d bump version to 2.9.4 2025-07-25 21:10:04 +08:00
Xinrea
861f3a3624 fix: tauri schema not handled by custom plugin for shaka-player 2025-07-25 21:09:41 +08:00
Xinrea
c55783e4d9 chore: update @tauri-apps/api 2025-07-25 20:13:04 +08:00
Xinrea
955e284d41 fix: start a new recording when header changes 2025-07-24 23:03:09 +08:00
Xinrea
fc4c47427e chore: adjust log level 2025-07-24 21:57:04 +08:00
Xinrea
e2d7563faa bump version to 2.9.3 2025-07-24 21:28:35 +08:00
Xinrea
27d69f7f8d fix: clip video cover not loaded 2025-07-24 21:28:10 +08:00
Xinrea
a77bb5af44 bump version to 2.9.2 2025-07-24 00:32:28 +08:00
Xinrea
00286261a4 fix: range offset caused by duration error 2025-07-24 00:23:58 +08:00
Xinrea
0b898dccaa fix: bilibili stream url extraction error caused 404 2025-07-23 22:27:57 +08:00
Xinrea
a1d9ac4e68 chore: remove ai generated docs 2025-07-23 21:56:56 +08:00
Xinrea
4150939e23 Only create records after successful ts download (#141)
* Defer record creation until first successful stream segment download

Co-authored-by: shenwuol <shenwuol@gmail.com>

* Checkpoint before follow-up message

* Improve recording logic with directory management and error handling

Co-authored-by: shenwuol <shenwuol@gmail.com>

* Add recorder flow diagrams for Bilibili and Douyin recorders

Co-authored-by: shenwuol <shenwuol@gmail.com>

* Refactor recorder update_entries to prevent empty records and directories

Co-authored-by: shenwuol <shenwuol@gmail.com>

* Refactor recorder update_entries to prevent empty records and dirs

Co-authored-by: shenwuol <shenwuol@gmail.com>

* Fix panic in non-FMP4 stream recording by safely handling entry store

Co-authored-by: shenwuol <shenwuol@gmail.com>

---------

Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: shenwuol <shenwuol@gmail.com>
2025-07-23 17:35:00 +08:00
137 changed files with 13631 additions and 3140 deletions

View File

@@ -0,0 +1,44 @@
# AI Features and LangChain Integration
## AI Components
- **LangChain Integration**: Uses `@langchain/core`, `@langchain/deepseek`, `@langchain/langgraph`, `@langchain/ollama`
- **Whisper Transcription**: Local and online transcription via `whisper-rs` in Rust backend
- **AI Agent**: Located in [src/lib/agent/](mdc:src/lib/agent/) directory
## Frontend AI Features
- **AI Page**: [src/page/AI.svelte](mdc:src/page/AI.svelte) - Main AI interface
- **Agent Logic**: [src/lib/agent/](mdc:src/lib/agent/) - AI agent implementation
- **Interface**: [src/lib/interface.ts](mdc:src/lib/interface.ts) - AI communication layer
## Backend AI Features
- **Subtitle Generation**: [src-tauri/src/subtitle_generator/](mdc:src-tauri/src/subtitle_generator/) - AI-powered subtitle creation
- **Whisper Integration**: [src-tauri/src/subtitle_generator.rs](mdc:src-tauri/src/subtitle_generator.rs) - Speech-to-text processing
- **CUDA Support**: Optional CUDA acceleration for Whisper via feature flag
## AI Workflows
- **Live Transcription**: Real-time speech-to-text during live streams
- **Content Summarization**: AI-powered content analysis and summarization
- **Smart Editing**: AI-assisted video editing and clip generation
- **Danmaku Processing**: AI analysis of danmaku (bullet comments) streams
## Configuration
- **LLM Settings**: Configure AI models in [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)
- **Whisper Models**: Local model configuration for offline transcription
- **API Keys**: External AI service configuration for online features
## Development Notes
- AI features require proper model configuration
- CUDA feature enables GPU acceleration for Whisper
- LangChain integration supports multiple AI providers
- AI agent can work with both local and cloud-based models
description:
globs:
alwaysApply: true
---

View File

@@ -0,0 +1,53 @@
# Build and Deployment Configuration
## Build Scripts
- **PowerShell**: [build.ps1](mdc:build.ps1) - Windows build script
- **FFmpeg Setup**: [ffmpeg_setup.ps1](mdc:ffmpeg_setup.ps1) - FFmpeg installation script
- **Version Bump**: [scripts/bump.cjs](mdc:scripts/bump.cjs) - Version management script
## Package Management
- **Node.js**: [package.json](mdc:package.json) - Frontend dependencies and scripts
- **Rust**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml) - Backend dependencies and features
- **Lock Files**: [yarn.lock](mdc:yarn.lock) - Yarn dependency lock
## Build Configuration
- **Vite**: [vite.config.ts](mdc:vite.config.ts) - Frontend build tool configuration
- **Tailwind**: [tailwind.config.cjs](mdc:tailwind.config.cjs) - CSS framework configuration
- **PostCSS**: [postcss.config.cjs](mdc:postcss.config.cjs) - CSS processing configuration
- **TypeScript**: [tsconfig.json](mdc:tsconfig.json), [tsconfig.node.json](mdc:tsconfig.node.json) - TypeScript configuration
## Tauri Configuration
- **Main Config**: [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json) - Core Tauri settings
- **Platform Configs**:
- [src-tauri/tauri.macos.conf.json](mdc:src-tauri/tauri.macos.conf.json) - macOS specific
- [src-tauri/tauri.linux.conf.json](mdc:src-tauri/tauri.linux.conf.json) - Linux specific
- [src-tauri/tauri.windows.conf.json](mdc:src-tauri/tauri.windows.conf.json) - Windows specific
- [src-tauri/tauri.windows.cuda.conf.json](mdc:src-tauri/tauri.windows.cuda.conf.json) - Windows with CUDA
## Docker Support
- **Dockerfile**: [Dockerfile](mdc:Dockerfile) - Container deployment configuration
- **Documentation**: [docs/](mdc:docs/) - VitePress-based documentation site
## Build Commands
- **Frontend**: `yarn build` - Build production frontend
- **Tauri**: `yarn tauri build` - Build desktop application
- **Documentation**: `yarn docs:build` - Build documentation site
- **Type Check**: `yarn check` - TypeScript and Svelte validation
## Deployment Targets
- **Desktop**: Native Tauri applications for Windows, macOS, Linux
- **Docker**: Containerized deployment option
- **Documentation**: Static site deployment via VitePress
- **Assets**: Static asset distribution for web components
description:
globs:
alwaysApply: true
---

View File

@@ -0,0 +1,53 @@
# Database and Data Management
## Database Architecture
- **SQLite Database**: Primary data storage using `sqlx` with async runtime
- **Database Module**: [src-tauri/src/database/](mdc:src-tauri/src/database/) - Core database operations
- **Migration System**: [src-tauri/src/migration.rs](mdc:src-tauri/src/migration.rs) - Database schema management
## Data Models
- **Recording Data**: Stream metadata, recording sessions, and file information
- **Room Configuration**: Stream room settings and platform credentials
- **Task Management**: Recording task status and progress tracking
- **User Preferences**: Application settings and user configurations
## Frontend Data Layer
- **Database Interface**: [src/lib/db.ts](mdc:src/lib/db.ts) - Frontend database operations
- **Stores**: [src/lib/stores/](mdc:src/lib/stores/) - State management for data
- **Version Management**: [src/lib/stores/version.ts](mdc:src/lib/stores/version.ts) - Version tracking
## Data Operations
- **CRUD Operations**: Create, read, update, delete for all data entities
- **Query Optimization**: Efficient SQL queries with proper indexing
- **Transaction Support**: ACID compliance for critical operations
- **Data Validation**: Input validation and sanitization
## File Management
- **Cache Directory**: [src-tauri/cache/](mdc:src-tauri/cache/) - Temporary file storage
- **Upload Directory**: [src-tauri/cache/uploads/](mdc:src-tauri/cache/uploads/) - User upload storage
- **Bilibili Cache**: [src-tauri/cache/bilibili/](mdc:src-tauri/cache/bilibili/) - Platform-specific cache
## Data Persistence
- **SQLite Files**: [src-tauri/data/data_v2.db](mdc:src-tauri/data/data_v2.db) - Main database file
- **Write-Ahead Logging**: WAL mode for concurrent access and performance
- **Backup Strategy**: Database backup and recovery procedures
- **Migration Handling**: Automatic schema updates and data migration
## Development Guidelines
- Use prepared statements to prevent SQL injection
- Implement proper error handling for database operations
- Use transactions for multi-step operations
- Follow database naming conventions consistently
- Test database operations with sample data
description:
globs:
alwaysApply: true
---
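
A minimal sketch of the prepared-statement and transaction guidance above, assuming the `sqlx` SQLite pool the backend uses; the table and column names are illustrative, not the project's actual schema:

```rust
use sqlx::SqlitePool;

// Prepared statement with bound parameters; never build SQL by string formatting.
async fn update_note(pool: &SqlitePool, clip_id: i64, note: &str) -> Result<(), sqlx::Error> {
    sqlx::query("UPDATE clips SET note = ?1 WHERE id = ?2")
        .bind(note)
        .bind(clip_id)
        .execute(pool)
        .await?;
    Ok(())
}

// Multi-step change wrapped in a transaction: both statements apply or neither does.
async fn move_clip(pool: &SqlitePool, clip_id: i64, room_id: i64) -> Result<(), sqlx::Error> {
    let mut tx = pool.begin().await?;
    sqlx::query("UPDATE clips SET room_id = ?1 WHERE id = ?2")
        .bind(room_id)
        .bind(clip_id)
        .execute(&mut *tx)
        .await?;
    sqlx::query("UPDATE rooms SET clip_count = clip_count + 1 WHERE id = ?1")
        .bind(room_id)
        .execute(&mut *tx)
        .await?;
    tx.commit().await?;
    Ok(())
}
```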

View File

@@ -0,0 +1,45 @@
# Frontend Development Guidelines
## Svelte 3 Best Practices
- Use Svelte 3 syntax with `<script>` tags for component logic
- Prefer reactive statements with `$:` for derived state
- Use stores from [src/lib/stores/](mdc:src/lib/stores/) for global state management
- Import components from [src/lib/components/](mdc:src/lib/components/)
## TypeScript Configuration
- Follow the configuration in [tsconfig.json](mdc:tsconfig.json)
- Use strict type checking with `checkJs: true`
- Extends `@tsconfig/svelte` for Svelte-specific TypeScript settings
- Base URL is set to workspace root for clean imports
## Component Structure
- **Page components**: Located in [src/page/](mdc:src/page/) directory
- **Reusable components**: Located in [src/lib/components/](mdc:src/lib/components/) directory
- **Layout components**: [src/App.svelte](mdc:src/App.svelte), [src/AppClip.svelte](mdc:src/AppClip.svelte), [src/AppLive.svelte](mdc:src/AppLive.svelte)
## Styling
- Use Tailwind CSS classes for styling
- Configuration in [tailwind.config.cjs](mdc:tailwind.config.cjs)
- PostCSS configuration in [postcss.config.cjs](mdc:postcss.config.cjs)
- Global styles in [src/styles.css](mdc:src/styles.css)
## Entry Points
- **Main app**: [src/main.ts](mdc:src/main.ts) - Main application entry
- **Clip mode**: [src/main_clip.ts](mdc:src/main_clip.ts) - Clip editing interface
- **Live mode**: [src/main_live.ts](mdc:src/main_live.ts) - Live streaming interface
## Development Workflow
- Use `yarn dev` for frontend-only development
- Use `yarn tauri dev` for full Tauri development
- Use `yarn check` for TypeScript and Svelte type checking
description:
globs:
alwaysApply: true
---

View File

@@ -0,0 +1,47 @@
# BiliBili ShadowReplay Project Overview
This is a Tauri-based desktop application for caching live streams and performing real-time editing and submission. It supports Bilibili and Douyin platforms.
## Project Structure
### Frontend (Svelte + TypeScript)
- **Main entry points**: [src/main.ts](mdc:src/main.ts), [src/main_clip.ts](mdc:src/main_clip.ts), [src/main_live.ts](mdc:src/main_live.ts)
- **App components**: [src/App.svelte](mdc:src/App.svelte), [src/AppClip.svelte](mdc:src/AppClip.svelte), [src/AppLive.svelte](mdc:src/AppLive.svelte)
- **Pages**: Located in [src/page/](mdc:src/page/) directory
- **Components**: Located in [src/lib/components/](mdc:src/lib/components/) directory
- **Stores**: Located in [src/lib/stores/](mdc:src/lib/stores/) directory
### Backend (Rust + Tauri)
- **Main entry**: [src-tauri/src/main.rs](mdc:src-tauri/src/main.rs)
- **Core modules**:
- [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Stream recording functionality
- [src-tauri/src/database/](mdc:src-tauri/src/database/) - Database operations
- [src-tauri/src/handlers/](mdc:src-tauri/src/handlers/) - Tauri command handlers
- **Custom crate**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Danmaku stream processing
### Configuration
- **Frontend config**: [tsconfig.json](mdc:tsconfig.json), [vite.config.ts](mdc:vite.config.ts), [tailwind.config.cjs](mdc:tailwind.config.cjs)
- **Backend config**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml), [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json)
- **Example config**: [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)
## Key Technologies
- **Frontend**: Svelte 3, TypeScript, Tailwind CSS, Flowbite
- **Backend**: Rust, Tauri 2, SQLite, FFmpeg
- **AI Features**: LangChain, Whisper for transcription
- **Build Tools**: Vite, VitePress for documentation
## Development Commands
- `yarn dev` - Start development server
- `yarn tauri dev` - Start Tauri development
- `yarn build` - Build frontend
- `yarn docs:dev` - Start documentation server
description:
globs:
alwaysApply: true
---

View File

@@ -0,0 +1,47 @@
# Rust Backend Development Guidelines
## Project Structure
- **Main entry**: [src-tauri/src/main.rs](mdc:src-tauri/src/main.rs) - Application entry point
- **Core modules**:
- [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Stream recording and management
- [src-tauri/src/database/](mdc:src-tauri/src/database/) - SQLite database operations
- [src-tauri/src/handlers/](mdc:src-tauri/src/handlers/) - Tauri command handlers
- [src-tauri/src/subtitle_generator/](mdc:src-tauri/src/subtitle_generator/) - AI-powered subtitle generation
## Custom Crates
- **danmu_stream**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Danmaku stream processing library
## Dependencies
- **Tauri 2**: Core framework for desktop app functionality
- **FFmpeg**: Video/audio processing via `async-ffmpeg-sidecar`
- **Whisper**: AI transcription via `whisper-rs` (CUDA support available)
- **LangChain**: AI agent functionality
- **SQLite**: Database via `sqlx` with async runtime
## Configuration
- **Cargo.toml**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml) - Dependencies and features
- **Tauri config**: [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json) - App configuration
- **Example config**: [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml) - User configuration template
## Features
- **default**: Includes GUI and core functionality
- **cuda**: Enables CUDA acceleration for Whisper transcription
- **headless**: Headless mode without GUI
- **custom-protocol**: Required for production builds
## Development Commands
- `yarn tauri dev` - Start Tauri development with hot reload
- `yarn tauri build` - Build production application
- `cargo check` - Check Rust code without building
- `cargo test` - Run Rust tests
description:
globs:
alwaysApply: true
---
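
As a rough sketch of how the `gui`/`headless` feature split above plays out in code (the function and its bodies are illustrative, not taken from the project):

```rust
// Feature-gated compilation: exactly one of these is compiled in,
// depending on whether the build enables `gui` or `headless`.
#[cfg(feature = "gui")]
fn notify_user(msg: &str) {
    // GUI build: could surface a desktop notification here.
    println!("[gui notification] {msg}");
}

#[cfg(feature = "headless")]
fn notify_user(msg: &str) {
    // Headless build: no window to notify, so just log.
    println!("[headless log] {msg}");
}
```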

View File

@@ -0,0 +1,53 @@
# Streaming and Recording System
## Core Recording Components
- **Recorder Manager**: [src-tauri/src/recorder_manager.rs](mdc:src-tauri/src/recorder_manager.rs) - Main recording orchestration
- **Recorder**: [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Individual stream recording logic
- **Danmaku Stream**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Custom crate for bullet comment processing
## Supported Platforms
- **Bilibili**: Main platform support with live stream caching
- **Douyin**: TikTok's Chinese platform support
- **Multi-stream**: Support for recording multiple streams simultaneously
## Recording Features
- **Live Caching**: Real-time stream recording and buffering
- **Time-based Clipping**: Extract specific time segments from recorded streams
- **Danmaku Capture**: Record bullet comments and chat messages
- **Quality Control**: Configurable recording quality and format options
## Frontend Interfaces
- **Live Mode**: [src/AppLive.svelte](mdc:src/AppLive.svelte) - Live streaming interface
- **Clip Mode**: [src/AppClip.svelte](mdc:src/AppClip.svelte) - Video editing and clipping
- **Room Management**: [src/page/Room.svelte](mdc:src/page/Room.svelte) - Stream room configuration
- **Task Management**: [src/page/Task.svelte](mdc:src/page/Task.svelte) - Recording task monitoring
## Technical Implementation
- **FFmpeg Integration**: Video/audio processing via `async-ffmpeg-sidecar`
- **M3U8 Support**: HLS stream processing with `m3u8-rs`
- **Async Processing**: Non-blocking I/O with `tokio` runtime
- **Database Storage**: SQLite for metadata and recording information
## Configuration
- **Recording Settings**: Configure in [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)
- **FFmpeg Path**: Set FFmpeg binary location for video processing
- **Storage Paths**: Configure cache and output directories
- **Quality Settings**: Adjust recording bitrate and format options
## Development Workflow
- Use [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) for core recording logic
- Test with [src-tauri/tests/](mdc:src-tauri/tests/) directory
- Monitor recording progress via progress manager
- Handle errors gracefully with custom error types
description:
globs:
alwaysApply: true
---
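
For a feel of the M3U8 handling mentioned above, a minimal sketch with `m3u8-rs` (the function is illustrative; the real logic lives in [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/)):

```rust
use m3u8_rs::Playlist;

// Sum the segment durations of an HLS media playlist (illustrative only).
fn playlist_duration(bytes: &[u8]) -> Option<f32> {
    match m3u8_rs::parse_playlist_res(bytes).ok()? {
        Playlist::MediaPlaylist(mp) => Some(mp.segments.iter().map(|s| s.duration).sum()),
        // Master playlists list stream variants, not segments.
        Playlist::MasterPlaylist(_) => None,
    }
}
```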

.devcontainer/Dockerfile (new file)
View File

@@ -0,0 +1,36 @@
ARG VARIANT=bookworm-slim
FROM debian:${VARIANT}
ENV DEBIAN_FRONTEND=noninteractive
# Arguments
ARG CONTAINER_USER=vscode
ARG CONTAINER_GROUP=vscode
# Install dependencies
RUN apt-get update \
&& apt-get install -y \
build-essential \
clang \
cmake \
curl \
file \
git \
libayatana-appindicator3-dev \
librsvg2-dev \
libssl-dev \
libwebkit2gtk-4.1-dev \
libxdo-dev \
pkg-config \
wget \
&& apt-get clean -y && rm -rf /var/lib/apt/lists/* /tmp/library-scripts
# Set users
RUN adduser --disabled-password --gecos "" ${CONTAINER_USER}
USER ${CONTAINER_USER}
WORKDIR /home/${CONTAINER_USER}
# Install rustup
RUN curl --proto "=https" --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --profile minimal
ENV PATH=${PATH}:/home/${CONTAINER_USER}/.cargo/bin
CMD [ "/bin/bash" ]

View File

@@ -0,0 +1,31 @@
{
"name": "vscode",
"build": {
"dockerfile": "Dockerfile",
"args": {
"CONTAINER_USER": "vscode",
"CONTAINER_GROUP": "vscode"
}
},
"features": {
"ghcr.io/devcontainers/features/node:1": {
"version": "latest"
}
},
"customizations": {
"vscode": {
"settings": {
"lldb.executable": "/usr/bin/lldb",
"files.watcherExclude": {
"**/target/**": true
}
},
"extensions": [
"vadimcn.vscode-lldb",
"rust-lang.rust-analyzer",
"tamasfe.even-better-toml"
]
}
},
"remoteUser": "vscode"
}

View File

@@ -1,21 +0,0 @@
---
name: Bug report
about: Report a bug
title: "[BUG]"
labels: bug
assignees: Xinrea
---
**Description:**
Briefly describe the bug's symptoms
**Logs and screenshots:**
If possible, attach relevant screenshots and log files (the log is the file named bsr.log in the installation directory).
**Related information:**
- Program version:
- OS type:
**Other**
Anything else you want to add

.github/ISSUE_TEMPLATE/bug_report.yml (new file)
View File

@@ -0,0 +1,47 @@
name: Bug Report
description: File a bug report.
title: "[bug] "
labels: ["bug"]
assignees:
- Xinrea
body:
- type: checkboxes
attributes:
label: Submission checklist
description: Please confirm the following
options:
- label: I found this problem on the latest version
required: true
- label: I have read the [FAQ](https://bsr.xinrea.cn/usage/faq.html)
required: true
- type: dropdown
id: app_type
attributes:
label: How do you use this software?
multiple: false
options:
- Docker image
- Desktop app
- type: dropdown
id: os
attributes:
label: Runtime environment
multiple: false
options:
- Linux
- Windows
- MacOS
- Docker
- type: textarea
attributes:
label: Bug description
description: Describe the bug's symptoms and how to reproduce it in as much detail as possible
validations:
required: true
- type: textarea
id: logs
attributes:
label: Logs
description: Paste the log content or upload the log file (the Settings page in the main window has a button that opens the log directory; once there, enter the logs directory and find the file with the log extension)
validations:
required: true

View File

@@ -1,20 +0,0 @@
---
name: Feature request
about: Suggest a new feature
title: "[feature]"
labels: enhancement
assignees: Xinrea
---
**Problem encountered:**
What problem during use prompted this suggestion
**Desired feature:**
What new feature would solve this problem
**Possible implementation (if you have ideas):**
If you have related implementation ideas or references, provide them here
**Other:**
Anything else you want to say

View File

@@ -0,0 +1,13 @@
name: Feature Request
description: Request a new feature
title: "[feature] "
labels: ["feature"]
assignees:
- Xinrea
body:
- type: textarea
attributes:
label: Feature description
description: Describe the feature you want in as much detail as possible
validations:
required: true

.github/workflows/check.yml (new file)
View File

@@ -0,0 +1,43 @@
name: Check
on:
pull_request:
branches: [ "main" ]
paths:
- 'src-tauri/**'
- '.github/workflows/check.yml'
jobs:
check:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
components: rustfmt
- name: Cache cargo registry
uses: actions/cache@v4
with:
path: |
~/.cargo/registry
~/.cargo/git
src-tauri/target
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
- name: Install dependencies (ubuntu only)
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf ffmpeg
- name: Check formatting
run: cargo fmt --check
working-directory: src-tauri
- name: Check tests
run: cargo test -v && cargo test --no-default-features --features headless -v
working-directory: src-tauri

View File

@@ -59,11 +59,6 @@ jobs:
if: matrix.platform == 'windows-latest' && matrix.features == 'cuda'
uses: Jimver/cuda-toolkit@v0.2.24
- name: Rust cache
uses: swatinem/rust-cache@v2
with:
workspaces: "./src-tauri -> target"
- name: Setup ffmpeg
if: matrix.platform == 'windows-latest'
working-directory: ./
@@ -87,6 +82,19 @@ jobs:
Copy-Item "$cudaPath\cublas64*.dll" -Destination $targetPath
Copy-Item "$cudaPath\cublasLt64*.dll" -Destination $targetPath
- name: Get previous tag
id: get_previous_tag
run: |
# Get the previous tag (excluding the current one being pushed)
PREVIOUS_TAG=$(git describe --tags --abbrev=0 HEAD~1 2>/dev/null || echo "")
if [ -z "$PREVIOUS_TAG" ]; then
# If no previous tag found, use the first commit
PREVIOUS_TAG=$(git rev-list --max-parents=0 HEAD | head -1)
fi
echo "previous_tag=$PREVIOUS_TAG" >> $GITHUB_OUTPUT
echo "current_tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
shell: bash
- uses: tauri-apps/tauri-action@v0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@@ -96,7 +104,7 @@ jobs:
with:
tagName: v__VERSION__
releaseName: "BiliBili ShadowReplay v__VERSION__"
releaseBody: "See the assets to download this version and install."
releaseBody: "> [!NOTE]\n> 如果你是第一次下载安装,请参考 [安装准备](https://bsr.xinrea.cn/getting-started/installation/desktop.html) 选择合适的版本。\n> Changelog: https://github.com/Xinrea/bili-shadowreplay/compare/${{ steps.get_previous_tag.outputs.previous_tag }}...${{ steps.get_previous_tag.outputs.current_tag }}"
releaseDraft: true
prerelease: false
args: ${{ matrix.args }} ${{ matrix.platform == 'windows-latest' && matrix.features == 'cuda' && '--config src-tauri/tauri.windows.cuda.conf.json' || '' }}

.gitignore
View File

@@ -11,6 +11,7 @@ node_modules
dist
dist-ssr
*.local
/target/
# Editor directories and files
.vscode/*

View File

@@ -65,9 +65,16 @@ RUN apt-get update && apt-get install -y \
libssl3 \
ca-certificates \
fonts-wqy-microhei \
netbase \
nscd \
&& update-ca-certificates \
&& rm -rf /var/lib/apt/lists/*
RUN touch /etc/netgroup
RUN mkdir -p /var/run/nscd && chmod 755 /var/run/nscd
RUN nscd
# Add /app to PATH
ENV PATH="/app:${PATH}"
@@ -83,4 +90,4 @@ COPY --from=rust-builder /app/src-tauri/ffprobe ./ffprobe
EXPOSE 3000
# Run the application
CMD ["./bili-shadowreplay"]
CMD ["sh", "-c", "nscd && ./bili-shadowreplay"]

View File

@@ -4,23 +4,27 @@
![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/xinrea/bili-shadowreplay/main.yml?label=Application%20Build)
![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/Xinrea/bili-shadowreplay/package.yml?label=Docker%20Build)
![GitHub Release](https://img.shields.io/github/v/release/xinrea/bili-shadowreplay)
![GitHub Downloads (all assets, all releases)](https://img.shields.io/github/downloads/xinrea/bili-shadowreplay/total)
[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/Xinrea/bili-shadowreplay)
BiliBili ShadowReplay is a tool for caching live streams and performing real-time editing and submission. By marking a time range and filling in a few required details, you can turn a live stream into a clip and submit it, compressing the whole workflow to minutes. Cached past streams can likewise be replayed and taken through the same clip-edit-submit flow.
Currently only Bilibili and Douyin live streams are supported.
![rooms](docs/public/images/summary.png)
[![Star History Chart](https://api.star-history.com/svg?repos=Xinrea/bili-shadowreplay&type=Date)](https://www.star-history.com/#Xinrea/bili-shadowreplay&Date)
## Installation and Usage
![rooms](docs/public/images/summary.png)
See the documentation site for instructions: [BiliBili ShadowReplay](https://bsr.xinrea.cn/)
## Contributing
[Contributing](.github/CONTRIBUTING.md)
You can explore this project via [DeepWiki](https://deepwiki.com/Xinrea/bili-shadowreplay).
Contribution guide: [Contributing](.github/CONTRIBUTING.md)
## Sponsorship

View File

@@ -1,7 +1,8 @@
import { defineConfig } from "vitepress";
import { withMermaid } from "vitepress-plugin-mermaid";
// https://vitepress.dev/reference/site-config
export default defineConfig({
export default withMermaid({
title: "BiliBili ShadowReplay",
description: "直播录制/实时回放/剪辑/投稿工具",
themeConfig: {
@@ -18,21 +19,55 @@ export default defineConfig({
{
text: "开始使用",
items: [
{ text: "安装准备", link: "/getting-started/installation" },
{ text: "配置使用", link: "/getting-started/configuration" },
{ text: "FFmpeg 配置", link: "/getting-started/ffmpeg" },
{
text: "安装准备",
items: [
{
text: "桌面端安装",
link: "/getting-started/installation/desktop",
},
{
text: "Docker 安装",
link: "/getting-started/installation/docker",
},
],
},
{
text: "配置使用",
items: [
{ text: "账号配置", link: "/getting-started/config/account" },
{ text: "FFmpeg 配置", link: "/getting-started/config/ffmpeg" },
{ text: "Whisper 配置", link: "/getting-started/config/whisper" },
{ text: "LLM 配置", link: "/getting-started/config/llm" },
],
},
],
},
{
text: "说明文档",
items: [
{ text: "功能说明", link: "/usage/features" },
{
text: "功能说明",
items: [
{ text: "工作流程", link: "/usage/features/workflow" },
{ text: "直播间管理", link: "/usage/features/room" },
{ text: "切片功能", link: "/usage/features/clip" },
{ text: "字幕功能", link: "/usage/features/subtitle" },
{ text: "弹幕功能", link: "/usage/features/danmaku" },
{ text: "Webhook", link: "/usage/features/webhook" },
],
},
{ text: "常见问题", link: "/usage/faq" },
],
},
{
text: "开发文档",
items: [{ text: "架构设计", link: "/develop/architecture" }],
items: [
{
text: "DeepWiki",
link: "https://deepwiki.com/Xinrea/bili-shadowreplay",
},
],
},
],

View File

@@ -1 +0,0 @@
# Architecture Design

View File

@@ -0,0 +1,12 @@
# Account Setup
To add a live room, at least one account on the same platform must be configured. On the Accounts page, you can add an account via the Add Account button.
- Bilibili accounts: QR-code login and manual Cookie configuration are both supported; QR-code login is recommended
- Douyin accounts: only manual Cookie configuration is currently supported
## Douyin Account Setup
First make sure you are logged in to Douyin, then open your [profile page](https://www.douyin.com/user/self). Right-click the page, choose `Inspect` to open the developer tools, switch to the `Network` tab, and refresh the page. Find the `self` request in the list (usually the first entry), click it, and look at the `Request Headers`. Find the `Cookie` header, copy its value, and paste it into the `Cookie` field on the configuration page; be careful to copy the whole value.
![DouyinCookie](/images/douyin_cookie.png)

View File

@@ -0,0 +1,9 @@
# LLM Setup
![LLM](/images/model_config.png)
The AI Agent assistant on the Assistant page requires a large language model to be configured; currently only OpenAI-protocol-compatible model services are supported.
This software does not provide an LLM service itself; choose a service provider on your own. Note that the AI Agent assistant consumes more tokens than ordinary chat, so make sure you have a sufficient token balance.
In addition, the AI Agent requires a model that supports Function Calling; otherwise it cannot invoke tools.

View File

@@ -1,30 +1,11 @@
# Configuration
## Account Setup
To add a live room, at least one account on the same platform must be configured. On the Accounts page, you can add an account via the Add Account button.
- Bilibili accounts: QR-code login and manual Cookie configuration are both supported; QR-code login is recommended
- Douyin accounts: only manual Cookie configuration is currently supported
### Douyin Account Setup
First make sure you are logged in to Douyin, then open your [profile page](https://www.douyin.com/user/self). Right-click the page, choose `Inspect` to open the developer tools, switch to the `Network` tab, and refresh the page. Find the `self` request in the list (usually the first entry), click it, and look at the `Request Headers`. Find the `Cookie` header, copy its value, and paste it into the `Cookie` field on the configuration page; be careful to copy the whole value.
![DouyinCookie](/images/douyin_cookie.png)
## FFmpeg Setup
To use clip generation and subtitle burn-in, make sure FFmpeg is configured correctly. Apart from Windows, where the package ships with FFmpeg, other platforms require a manual FFmpeg install; see [FFmpeg Setup](/getting-started/ffmpeg).
## Whisper Setup
# Whisper Setup
To use AI subtitle recognition, Whisper must be configured on the Settings page. You can either run a Whisper model locally or use an online Whisper service (usually requiring a paid API Key).
> [!NOTE]
> There are many better Chinese speech-recognition solutions, but such services usually require uploading files to object storage for asynchronous processing. Given the implementation complexity, local Whisper models and online Whisper services were chosen because they return subtitle results directly in the response.
### Running a Whisper Model Locally
## Running a Whisper Model Locally
![WhisperLocal](/images/whisper_local.png)
@@ -37,13 +18,13 @@
The size of a model file generally reflects its runtime resource usage, so choose a model that fits your machine. Also, the GPU build differs **hugely** from the CPU build in subtitle generation speed, so the GPU build is recommended for local processing (currently Nvidia GPUs only).
### Using an Online Whisper Service
## Using an Online Whisper Service
![WhisperOnline](/images/whisper_online.png)
To use an online Whisper service for subtitle generation, switch to online Whisper in the settings and configure an API Key. OpenAI is not the only Whisper provider; many cloud platforms offer Whisper services as well.
### Tuning Subtitle Recognition Quality
## Tuning Subtitle Recognition Quality
The settings support a Whisper language and a Whisper prompt; both apply to local and online Whisper services.
@@ -52,6 +33,3 @@
Afrikaans, Arabic, Armenian, Azerbaijani, Belarusian, Bosnian, Bulgarian, Catalan, Chinese, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, Galician, German, Greek, Hebrew, Hindi, Hungarian, Icelandic, Indonesian, Italian, Japanese, Kannada, Kazakh, Korean, Latvian, Lithuanian, Macedonian, Malay, Marathi, Maori, Nepali, Norwegian, Persian, Polish, Portuguese, Romanian, Russian, Serbian, Slovak, Slovenian, Spanish, Swahili, Swedish, Tagalog, Tamil, Thai, Turkish, Ukrainian, Urdu, Vietnamese, and Welsh.
The prompt can shape the style of the generated subtitles and also affects quality to some degree. Note that Whisper cannot understand complex prompts; use simple descriptions that bias word choice toward the prompt's domain to avoid vocabulary from unrelated fields, or have it follow the prompt's punctuation style.
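
For orientation, here is a minimal sketch of how the language and prompt settings map onto `whisper-rs`, which the backend uses; it is illustrative only, and the actual integration lives in `src-tauri/src/subtitle_generator.rs`:

```rust
use whisper_rs::{
    FullParams, SamplingStrategy, WhisperContext, WhisperContextParameters, WhisperError,
};

// Transcribe 16 kHz mono PCM samples with a local model file.
fn transcribe(model_path: &str, pcm: &[f32]) -> Result<Vec<String>, WhisperError> {
    let ctx = WhisperContext::new_with_params(model_path, WhisperContextParameters::default())?;
    let mut state = ctx.create_state()?;
    let mut params = FullParams::new(SamplingStrategy::Greedy { best_of: 1 });
    params.set_language(Some("zh")); // the "Whisper language" setting
    params.set_initial_prompt("这是一段中文 你们好"); // the "Whisper prompt" setting
    state.full(params, pcm)?;
    let mut lines = Vec::new();
    for i in 0..state.full_n_segments()? {
        lines.push(state.full_get_segment_text(i)?);
    }
    Ok(lines)
}
```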

View File

@@ -1,66 +0,0 @@
# Installation
## Desktop Installation
Desktop installers are currently provided for Windows, Linux, and macOS.
Installers come in two flavors, a regular build and a debug build. The regular build suits most users; the debug build carries extra debugging information for developers. Since the program manages sensitive data such as accounts, download only from trusted sources; all builds can be downloaded from the [GitHub Releases](https://github.com/Xinrea/bili-shadowreplay/releases) page.
### Windows
Because the program bundles Whisper subtitle-recognition model support, the Windows build comes in two variants:
- **Regular build**: bundles Whisper GPU acceleration; faster subtitle recognition, larger download, Nvidia GPUs only
- **CPU build**: runs subtitle inference on the CPU; slower
Choose the variant that matches your GPU.
### Linux
The Linux build currently supports CPU inference only and has seen less testing, so issues may remain; please report any you encounter.
### MacOS
The macOS build bundles Metal GPU acceleration. On first launch after installation, macOS warns that it cannot open software downloaded from the internet; go to Settings > Privacy & Security and choose Open Anyway to allow the program to run.
## Docker Deployment
BiliBili ShadowReplay also supports server-side deployment with a web control interface, for use on servers and other environments without a GUI.
### Getting the Image
```bash
# Pull the latest version
docker pull ghcr.io/xinrea/bili-shadowreplay:latest
# Pull a specific version
docker pull ghcr.io/xinrea/bili-shadowreplay:2.5.0
# Too slow? Pull from a mirror
docker pull ghcr.nju.edu.cn/xinrea/bili-shadowreplay:latest
```
### Using the Image
Usage:
```bash
sudo docker run -it -d\
-p 3000:3000 \
-v $DATA_DIR:/app/data \
-v $CACHE_DIR:/app/cache \
-v $OUTPUT_DIR:/app/output \
-v $WHISPER_MODEL:/app/whisper_model.bin \
--name bili-shadowreplay \
ghcr.io/xinrea/bili-shadowreplay:latest
```
Where:
- `$DATA_DIR`: the data directory, matching the desktop version's data directory;
on Windows it is `C:\Users\{username}\AppData\Roaming\cn.vjoi.bilishadowreplay`,
on macOS `/Users/{user}/Library/Application Support/cn.vjoi.bilishadowreplay`
- `$CACHE_DIR`: the cache directory, matching the desktop version's cache directory;
- `$OUTPUT_DIR`: the output directory, matching the desktop version's output directory;
- `$WHISPER_MODEL`: the path to the Whisper model file, matching the desktop version's Whisper model path.

View File

@@ -0,0 +1,22 @@
# Desktop Installation
Desktop installers are currently provided for Windows, Linux, and macOS.
Installers come in two flavors, a regular build and a debug build. The regular build suits most users; the debug build carries extra debugging information for developers. Since the program manages sensitive data such as accounts, download only from trusted sources; all builds can be downloaded from the [GitHub Releases](https://github.com/Xinrea/bili-shadowreplay/releases) page.
## Windows
Because the program bundles Whisper subtitle-recognition model support, the Windows build comes in two variants:
- **Regular build**: bundles Whisper GPU acceleration; faster subtitle recognition, larger download, Nvidia GPUs only
- **CPU build**: runs subtitle inference on the CPU; slower
Choose the variant that matches your GPU.
## Linux
The Linux build currently supports CPU inference only and has seen less testing, so issues may remain; please report any you encounter.
## MacOS
The macOS build bundles Metal GPU acceleration. On first launch after installation, macOS warns that it cannot open software downloaded from the internet; go to Settings > Privacy & Security and choose Open Anyway to allow the program to run.

View File

@@ -0,0 +1,41 @@
# Docker Deployment
BiliBili ShadowReplay also supports server-side deployment with a web control interface, for use on servers and other environments without a GUI.
## Getting the Image
```bash
# Pull the latest version
docker pull ghcr.io/xinrea/bili-shadowreplay:latest
# Pull a specific version
docker pull ghcr.io/xinrea/bili-shadowreplay:2.5.0
# Too slow? Pull from a mirror
docker pull ghcr.nju.edu.cn/xinrea/bili-shadowreplay:latest
```
## Using the Image
Usage:
```bash
sudo docker run -it -d\
-p 3000:3000 \
-v $DATA_DIR:/app/data \
-v $CACHE_DIR:/app/cache \
-v $OUTPUT_DIR:/app/output \
-v $WHISPER_MODEL:/app/whisper_model.bin \
--name bili-shadowreplay \
ghcr.io/xinrea/bili-shadowreplay:latest
```
Where:
- `$DATA_DIR`: the data directory, matching the desktop version's data directory;
on Windows it is `C:\Users\{username}\AppData\Roaming\cn.vjoi.bilishadowreplay`,
on macOS `/Users/{user}/Library/Application Support/cn.vjoi.bilishadowreplay`
- `$CACHE_DIR`: the cache directory, matching the desktop version's cache directory;
- `$OUTPUT_DIR`: the output directory, matching the desktop version's output directory;
- `$WHISPER_MODEL`: the path to the Whisper model file, matching the desktop version's Whisper model path.

View File

@@ -11,10 +11,10 @@ hero:
actions:
- theme: brand
text: Getting Started
link: /getting-started/installation
link: /getting-started/installation/desktop
- theme: alt
text: Documentation
link: /usage/features
link: /usage/features/workflow
features:
- icon: 📹

(Four binary files changed, not shown: two images added, 383 KiB and 67 KiB, plus two other binary files.)

View File

@@ -0,0 +1,31 @@
# FAQ
## 1. Where do I report problems?
You can file an issue at [Github Issues](https://github.com/Xinrea/bili-shadowreplay/issues/new?template=bug_report.md), or join the [feedback group](https://qm.qq.com/q/v4lrE6gyum).
1. Before filing, read the other FAQ entries to make sure your question is not already answered;
2. Next, make sure your program is updated to the latest version;
3. Finally, be ready to provide your log files so the problem can be located more easily.
## 2. Where are the logs?
The Settings page in the main window has a button that opens the log directory. Once there, enter the `logs` directory and find the file with the `log` extension; that is the log file to provide to the developers.
## 3. Cannot preview streams or generate clips
If you are on macOS or Linux, make sure `ffmpeg` and `ffprobe` are installed; if you are unsure how, see [FFmpeg Setup](/getting-started/config/ffmpeg).
If you are on Windows, the program directory should ship with `ffmpeg` and `ffprobe`; if you still cannot preview streams or generate clips, report it to the developers.
## 4. Error -352 when adding a Bilibili live room
The `-352` error comes from Bilibili's anti-abuse mechanism. If you record a large number of Bilibili rooms, increase the room-status check interval on the Settings page to avoid triggering it; if the error appears with only a few rooms, report it to the developers.
## 5. Why are recordings all fragment files?
The recording files in the cache directory are not meant for direct playback or submission; they are used for stream preview and real-time replay. If you need a recording for submission, open that recording's preview window, create a selection with the shortcut keys, and generate a clip of the desired range; clips are regular mp4 files placed in your configured clip directory.
If you use BSR purely as recording software, you can enable `Whole-session clip generation` in the settings; after a stream ends, BSR will then automatically generate a clip of the entire session.
![Whole-session recording](/images/whole_clip.png)

View File

View File

@@ -0,0 +1 @@
# Clipping

View File

@@ -0,0 +1 @@
# Danmaku

View File

@@ -0,0 +1,38 @@
# Live Rooms
> [!WARNING]
> Before adding and managing live rooms, make sure the account list has a usable account for the corresponding platform.
## Adding a Live Room
### Adding a Room Manually
On the BSR rooms page, you can add a room manually via the button. Select the platform and enter the room ID.
The room ID is usually the number at the end of the live room's URL, e.g. `123456` in `https://live.bilibili.com/123456` or `123456` in `https://live.douyin.com/123456`.
Douyin rooms are special: while the streamer is offline you cannot find an entry to the room, so you need to find the room's URL while the stream is live and note down its room ID.
Douyin rooms also require the streamer's sec_uid, which you can find in the URL of the streamer's profile page, e.g. `MS4wLjABAAAA` in `https://www.douyin.com/user/MS4wLjABAAAA`.
### Quick-Adding a Room via Deep Linking
<video src="/videos/deeplinking.mp4" loop autoplay muted style="border-radius: 10px;"></video>
While watching a stream in the browser, replace the `https://` in the room URL in the address bar with `bsr://` to bring up BSR and add the room instantly.
## Enabling/Disabling a Room
Click the menu button at the top right of a room card and choose enable/disable.
- When enabled, recording starts automatically when the room goes live
- When disabled, recording does not start automatically when the room goes live
## Removing a Room
> [!CAUTION]
> Removing a room deletes all recordings associated with it; proceed with caution.
Click the menu button at the top right of a room card and choose remove.
<video src="/videos/room_remove.mp4" loop autoplay muted style="border-radius: 10px;"></video>

View File

@@ -0,0 +1 @@
# Subtitles

View File

@@ -0,0 +1,245 @@
# Webhook
> [!NOTE]
> You can use <https://webhook.site> to test the Webhook feature.
## Setting the Webhook
Open the BSR Settings page and set the Webhook URL under Basic Settings.
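
A minimal receiver sketch, assuming `axum` (which this project's backend already uses); the port and route are arbitrary examples:

```rust
use axum::{routing::post, Json, Router};
use serde::Deserialize;

// Mirrors the event envelope shown in the examples below.
#[derive(Debug, Deserialize)]
struct WebhookEvent {
    id: String,
    event: String,
    payload: serde_json::Value,
    timestamp: i64,
}

async fn handle(Json(ev): Json<WebhookEvent>) {
    println!("received {} ({})", ev.event, ev.id);
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/webhook", post(handle));
    let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```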
## Webhook Events
### Room Events
#### Room Added
```json
{
"id": "a96a5e9f-9857-4c13-b889-91da2ace208a",
"event": "recorder.added",
"payload": {
"room_id": 26966466,
"created_at": "2025-09-07T03:33:14.258796+00:00",
"platform": "bilibili",
"auto_start": true,
"extra": ""
},
"timestamp": 1757215994
}
```
#### Room Removed
```json
{
"id": "e33623d4-e040-4390-88f5-d351ceeeace7",
"event": "recorder.removed",
"payload": {
"room_id": 27183290,
"created_at": "2025-08-30T10:54:18.569198+00:00",
"platform": "bilibili",
"auto_start": true,
"extra": ""
},
"timestamp": 1757217015
}
```
### Live Events
> [!NOTE]
> A live stream starting or ending does not imply that recording starts or ends.
#### Live Started
```json
{
"id": "f12f3424-f7d8-4b2f-a8b7-55477411482e",
"event": "live.started",
"payload": {
"room_id": 843610,
"room_info": {
"room_id": 843610,
"room_title": "登顶!",
"room_cover": "https://i0.hdslb.com/bfs/live/new_room_cover/73aea43f4b4624c314d62fea4b424822fb506dfb.jpg"
},
"user_info": {
"user_id": "475210",
"user_name": "Xinrea",
"user_avatar": "https://i1.hdslb.com/bfs/face/91beb3bf444b295fe12bae1f3dc6d9fc4fe4c224.jpg"
},
"total_length": 0,
"current_live_id": "",
"live_status": false,
"is_recording": false,
"auto_start": true,
"platform": "bilibili"
},
"timestamp": 1757217190
}
```
#### Live Ended
```json
{
"id": "e8b0756a-02f9-4655-b5ae-a170bf9547bd",
"event": "live.ended",
"payload": {
"room_id": 843610,
"room_info": {
"room_id": 843610,
"room_title": "登顶!",
"room_cover": "https://i0.hdslb.com/bfs/live/new_room_cover/73aea43f4b4624c314d62fea4b424822fb506dfb.jpg"
},
"user_info": {
"user_id": "475210",
"user_name": "Xinrea",
"user_avatar": "https://i1.hdslb.com/bfs/face/91beb3bf444b295fe12bae1f3dc6d9fc4fe4c224.jpg"
},
"total_length": 0,
"current_live_id": "",
"live_status": true,
"is_recording": false,
"auto_start": true,
"platform": "bilibili"
},
"timestamp": 1757217365
}
```
### Recording Events
#### Recording Started
```json
{
"id": "5ec1ea10-2b31-48fd-8deb-f2d7d2ea5985",
"event": "record.started",
"payload": {
"room_id": 26966466,
"room_info": {
"room_id": 26966466,
"room_title": "早安獭獭栞下播前抽fufu",
"room_cover": "https://i0.hdslb.com/bfs/live/user_cover/b810c36855168034557e905e5916b1dba1761fa4.jpg"
},
"user_info": {
"user_id": "1609526545",
"user_name": "栞栞Shiori",
"user_avatar": "https://i1.hdslb.com/bfs/face/47e8dbabb895de44ec6cace085d4dc1d40307277.jpg"
},
"total_length": 0,
"current_live_id": "1757216045412",
"live_status": true,
"is_recording": false,
"auto_start": true,
"platform": "bilibili"
},
"timestamp": 1757216045
}
```
#### Recording Ended
```json
{
"id": "56fd03e5-3965-4c2e-a6a9-bb6932347eb3",
"event": "record.ended",
"payload": {
"room_id": 26966466,
"room_info": {
"room_id": 26966466,
"room_title": "早安獭獭栞下播前抽fufu",
"room_cover": "https://i0.hdslb.com/bfs/live/user_cover/b810c36855168034557e905e5916b1dba1761fa4.jpg"
},
"user_info": {
"user_id": "1609526545",
"user_name": "栞栞Shiori",
"user_avatar": "https://i1.hdslb.com/bfs/face/47e8dbabb895de44ec6cace085d4dc1d40307277.jpg"
},
"total_length": 52.96700000000001,
"current_live_id": "1757215994597",
"live_status": true,
"is_recording": true,
"auto_start": true,
"platform": "bilibili"
},
"timestamp": 1757216040
}
```
#### Recording Deleted
```json
{
"id": "c32bc811-ab4b-49fd-84c7-897727905d16",
"event": "archive.deleted",
"payload": {
"platform": "bilibili",
"live_id": "1756607084705",
"room_id": 1967212929,
"title": "灶台O.o",
"length": 9,
"size": 1927112,
"created_at": "2025-08-31T02:24:44.728616+00:00",
"cover": "bilibili/1967212929/1756607084705/cover.jpg"
},
"timestamp": 1757176219
}
```
### Clip Events
#### Clip Generated
```json
{
"id": "f542e0e1-688b-4f1a-8ce1-e5e51530cf5d",
"event": "clip.generated",
"payload": {
"id": 316,
"room_id": 27183290,
"cover": "[27183290][1757172501727][一起看凡人修仙传][2025-09-07_00-16-11].jpg",
"file": "[27183290][1757172501727][一起看凡人修仙传][2025-09-07_00-16-11].mp4",
"note": "",
"length": 121,
"size": 53049119,
"status": 0,
"bvid": "",
"title": "",
"desc": "",
"tags": "",
"area": 0,
"created_at": "2025-09-07T00:16:11.747461+08:00",
"platform": "bilibili"
},
"timestamp": 1757175371
}
```
#### Clip Deleted
```json
{
"id": "5c7ca728-753d-4a7d-a0b4-02c997ad2f92",
"event": "clip.deleted",
"payload": {
"id": 313,
"room_id": 27183290,
"cover": "[27183290][1756903953470][不出非洲之心不下播][2025-09-03_21-10-54].jpg",
"file": "[27183290][1756903953470][不出非洲之心不下播][2025-09-03_21-10-54].mp4",
"note": "",
"length": 32,
"size": 18530098,
"status": 0,
"bvid": "",
"title": "",
"desc": "",
"tags": "",
"area": 0,
"created_at": "2025-09-03T21:10:54.943682+08:00",
"platform": "bilibili"
},
"timestamp": 1757147617
}
```

View File

@@ -0,0 +1,30 @@
# Workflow
- Live room: a live room on one of the supported platforms
- Recording: an archive of a live stream; each recording session automatically creates a recording entry
- Clip: a video segment cut out of the live stream
- Submission: uploading a clip to a platform (currently only Bilibili)
The diagram below shows how they relate:
```mermaid
flowchart TD
A[Live room] -->|record| B[Recording 01]
A -->|record| C[Recording 02]
A -->|record| E[Recording N]
B --> F[Stream preview window]
F -->|range generate| G[Clip 01]
F -->|range generate| H[Clip 02]
F -->|range generate| I[Clip N]
G --> J[Clip preview window]
J -->|subtitle burn-in| K[New clip]
K --> J
J -->|submit| L[Bilibili]
```

View File

@@ -1,7 +1,7 @@
{
"name": "bili-shadowreplay",
"private": true,
"version": "2.9.1",
"version": "2.12.0",
"type": "module",
"scripts": {
"dev": "vite",
@@ -11,14 +11,16 @@
"tauri": "tauri",
"docs:dev": "vitepress dev docs",
"docs:build": "vitepress build docs",
"docs:preview": "vitepress preview docs"
"docs:preview": "vitepress preview docs",
"bump": "node scripts/bump.cjs"
},
"dependencies": {
"@langchain/core": "^0.3.64",
"@langchain/deepseek": "^0.1.0",
"@langchain/langgraph": "^0.3.10",
"@langchain/ollama": "^0.2.3",
"@tauri-apps/api": "^2.4.1",
"@tauri-apps/api": "^2.6.2",
"@tauri-apps/plugin-deep-link": "~2",
"@tauri-apps/plugin-dialog": "~2",
"@tauri-apps/plugin-fs": "~2",
"@tauri-apps/plugin-http": "~2",
@@ -40,6 +42,7 @@
"flowbite": "^2.5.1",
"flowbite-svelte": "^0.46.16",
"flowbite-svelte-icons": "^1.6.1",
"mermaid": "^11.9.0",
"postcss": "^8.4.21",
"svelte": "^3.54.0",
"svelte-check": "^3.0.0",
@@ -47,8 +50,9 @@
"tailwindcss": "^3.3.0",
"ts-node": "^10.9.1",
"tslib": "^2.4.1",
"typescript": "^4.6.4",
"typescript": "^5.0.0",
"vite": "^4.0.0",
"vitepress": "^1.6.3"
"vitepress": "^1.6.3",
"vitepress-plugin-mermaid": "^2.0.17"
}
}

scripts/bump.cjs (new file)
View File

@@ -0,0 +1,58 @@
#!/usr/bin/env node
const fs = require("fs");
const path = require("path");
function updatePackageJson(version) {
const packageJsonPath = path.join(process.cwd(), "package.json");
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, "utf8"));
packageJson.version = version;
fs.writeFileSync(
packageJsonPath,
JSON.stringify(packageJson, null, 2) + "\n"
);
console.log(`✅ Updated package.json version to ${version}`);
}
function updateCargoToml(version) {
const cargoTomlPath = path.join(process.cwd(), "src-tauri", "Cargo.toml");
let cargoToml = fs.readFileSync(cargoTomlPath, "utf8");
// Update the version in the [package] section
cargoToml = cargoToml.replace(/^version = ".*"$/m, `version = "${version}"`);
fs.writeFileSync(cargoTomlPath, cargoToml);
console.log(`✅ Updated Cargo.toml version to ${version}`);
}
function main() {
const args = process.argv.slice(2);
if (args.length === 0) {
console.error("❌ Please provide a version number");
console.error("Usage: yarn bump <version>");
console.error("Example: yarn bump 3.1.0");
process.exit(1);
}
const version = args[0];
// Validate version format (simple check)
if (!/^\d+\.\d+\.\d+/.test(version)) {
console.error(
"❌ Invalid version format. Please use semantic versioning (e.g., 3.1.0)"
);
process.exit(1);
}
try {
updatePackageJson(version);
updateCargoToml(version);
console.log(`🎉 Successfully bumped version to ${version}`);
} catch (error) {
console.error("❌ Error updating version:", error.message);
process.exit(1);
}
}
main();

src-tauri/Cargo.lock (generated)

File diff suppressed because it is too large.

View File

@@ -4,7 +4,7 @@ resolver = "2"
[package]
name = "bili-shadowreplay"
version = "1.0.0"
version = "2.12.0"
description = "BiliBili ShadowReplay"
authors = ["Xinrea"]
license = ""
@@ -44,7 +44,7 @@ async-trait = "0.1.87"
whisper-rs = "0.14.2"
hound = "3.5.1"
uuid = { version = "1.4", features = ["v4"] }
axum = { version = "0.7", features = ["macros"] }
axum = { version = "0.7", features = ["macros", "multipart"] }
tower-http = { version = "0.5", features = ["cors", "fs"] }
futures-core = "0.3"
futures = "0.3"
@@ -52,6 +52,7 @@ tokio-util = { version = "0.7", features = ["io"] }
clap = { version = "4.5.37", features = ["derive"] }
url = "2.5.4"
srtparse = "0.2.0"
thiserror = "1.0"
[features]
# this feature is used for production builds or when `devPath` points to the filesystem
@@ -71,6 +72,7 @@ gui = [
"tauri-utils",
"tauri-plugin-os",
"tauri-plugin-notification",
"tauri-plugin-deep-link",
"fix-path-env",
"tauri-build",
]
@@ -83,6 +85,7 @@ optional = true
[dependencies.tauri-plugin-single-instance]
version = "2"
optional = true
features = ["deep-link"]
[dependencies.tauri-plugin-dialog]
version = "2"
@@ -117,6 +120,10 @@ optional = true
version = "2"
optional = true
[dependencies.tauri-plugin-deep-link]
version = "2"
optional = true
[dependencies.fix-path-env]
git = "https://github.com/tauri-apps/fix-path-env-rs"
optional = true

View File

@@ -2,7 +2,11 @@
"identifier": "migrated",
"description": "permissions that were migrated from v1",
"local": true,
"windows": ["main", "Live*", "Clip*"],
"windows": [
"main",
"Live*",
"Clip*"
],
"permissions": [
"core:default",
"fs:allow-read-file",
@@ -16,7 +20,9 @@
"fs:allow-exists",
{
"identifier": "fs:scope",
"allow": ["**"]
"allow": [
"**"
]
},
"core:window:default",
"core:window:allow-start-dragging",
@@ -65,6 +71,7 @@
"shell:default",
"sql:default",
"os:default",
"dialog:default"
"dialog:default",
"deep-link:default"
]
}
}

View File

@@ -10,6 +10,9 @@ whisper_model = "./whisper_model.bin"
whisper_prompt = "这是一段中文 你们好"
openai_api_key = ""
clip_name_format = "[{room_id}][{live_id}][{title}][{created_at}].mp4"
# Automatically clean up the source FLV file after conversion
# When enabled, once an imported FLV video has been converted to MP4, the original FLV file is deleted to save storage space
cleanup_source_flv_after_import = false
[auto_generate]
enabled = false

View File

@@ -7,38 +7,42 @@ edition = "2021"
name = "danmu_stream"
path = "src/lib.rs"
[[example]]
name = "bilibili"
path = "examples/bilibili.rs"
[[example]]
name = "douyin"
path = "examples/douyin.rs"
[dependencies]
tokio = { version = "1.0", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
tokio = { version = "1", features = ["full"] }
tokio-tungstenite = { version = "0.27", features = ["native-tls"] }
futures-util = "0.3"
prost = "0.12"
prost = "0.14"
chrono = "0.4"
log = "0.4"
env_logger = "0.10"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
reqwest = { version = "0.11", features = ["json"] }
env_logger = "0.11"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
url = "2.4"
md5 = "0.7"
md5 = "0.8"
regex = "1.9"
deno_core = "0.242.0"
pct-str = "2.0.0"
custom_error = "1.9.2"
deno_core = "0.355"
pct-str = "2.0"
thiserror = "2.0"
flate2 = "1.0"
scroll = "0.13.0"
scroll_derive = "0.13.0"
brotli = "8.0.1"
scroll = "0.13"
scroll_derive = "0.13"
brotli = "8.0"
http = "1.0"
rand = "0.9.1"
urlencoding = "2.1.3"
rand = "0.9"
urlencoding = "2.1"
gzip = "0.1.2"
hex = "0.4.3"
async-trait = "0.1.88"
uuid = "1.17.0"
async-trait = "0.1"
uuid = "1"
[build-dependencies]
tonic-build = "0.10"
tonic-build = "0.14"

View File

@@ -1,10 +1,11 @@
use std::sync::Arc;
use tokio::sync::{mpsc, RwLock};
use crate::{
provider::{new, DanmuProvider, ProviderType},
DanmuMessageType, DanmuStreamError,
};
use tokio::sync::{mpsc, RwLock};
#[derive(Clone)]
pub struct DanmuStream {

View File

@@ -1,19 +1,8 @@
use std::time::Duration;
use crate::DanmuStreamError;
use reqwest::header::HeaderMap;
impl From<reqwest::Error> for DanmuStreamError {
fn from(value: reqwest::Error) -> Self {
Self::HttpError { err: value }
}
}
impl From<url::ParseError> for DanmuStreamError {
fn from(value: url::ParseError) -> Self {
Self::ParseError { err: value }
}
}
use crate::DanmuStreamError;
pub struct ApiClient {
client: reqwest::Client,

View File

@@ -2,16 +2,24 @@ pub mod danmu_stream;
mod http_client;
pub mod provider;
use custom_error::custom_error;
use thiserror::Error;
custom_error! {pub DanmuStreamError
HttpError {err: reqwest::Error} = "HttpError {err}",
ParseError {err: url::ParseError} = "ParseError {err}",
WebsocketError {err: String } = "WebsocketError {err}",
PackError {err: String} = "PackError {err}",
UnsupportProto {proto: u16} = "UnsupportProto {proto}",
MessageParseError {err: String} = "MessageParseError {err}",
InvalidIdentifier {err: String} = "InvalidIdentifier {err}"
#[derive(Error, Debug)]
pub enum DanmuStreamError {
#[error("HttpError {0:?}")]
HttpError(#[from] reqwest::Error),
#[error("ParseError {0:?}")]
ParseError(#[from] url::ParseError),
#[error("WebsocketError {err}")]
WebsocketError { err: String },
#[error("PackError {err}")]
PackError { err: String },
#[error("UnsupportProto {proto}")]
UnsupportProto { proto: u16 },
#[error("MessageParseError {err}")]
MessageParseError { err: String },
#[error("InvalidIdentifier {err}")]
InvalidIdentifier { err: String },
}
#[derive(Debug)]
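The move from custom_error to thiserror also derives the From conversions via #[from], which is exactly what makes the hand-written impl From blocks deleted above redundant: the ? operator now lifts reqwest and url errors into DanmuStreamError directly. A minimal sketch (hypothetical helper, assuming the enum stays exported at the crate root):

use danmu_stream::DanmuStreamError;

// Hypothetical helper, not part of this diff
async fn fetch_text(url: &str) -> Result<String, DanmuStreamError> {
    let parsed = url::Url::parse(url)?; // url::ParseError -> DanmuStreamError::ParseError
    let body = reqwest::get(parsed).await?.text().await?; // reqwest::Error -> DanmuStreamError::HttpError
    Ok(body)
}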

View File

@@ -65,7 +65,6 @@ impl DanmuProvider for BiliDanmu {
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError> {
let mut retry_count = 0;
const MAX_RETRIES: u32 = 5;
const RETRY_DELAY: Duration = Duration::from_secs(5);
info!(
"Bilibili WebSocket connection started, room_id: {}",
@@ -74,33 +73,37 @@ impl DanmuProvider for BiliDanmu {
loop {
if *self.stop.read().await {
info!(
"Bilibili WebSocket connection stopped, room_id: {}",
self.room_id
);
break;
}
match self.connect_and_handle(tx.clone()).await {
Ok(_) => {
info!("Bilibili WebSocket connection closed normally");
info!(
"Bilibili WebSocket connection closed normally, room_id: {}",
self.room_id
);
break;
}
Err(e) => {
error!("Bilibili WebSocket connection error: {}", e);
retry_count += 1;
if retry_count >= MAX_RETRIES {
return Err(DanmuStreamError::WebsocketError {
err: format!("Failed to connect after {} retries", MAX_RETRIES),
});
}
info!(
"Retrying connection in {} seconds... (Attempt {}/{})",
RETRY_DELAY.as_secs(),
retry_count,
MAX_RETRIES
error!(
"Bilibili WebSocket connection error, room_id: {}, error: {}",
self.room_id, e
);
tokio::time::sleep(RETRY_DELAY).await;
retry_count += 1;
}
}
info!(
"Retrying connection in {} seconds... (Attempt {}), room_id: {}",
RETRY_DELAY.as_secs(),
retry_count,
self.room_id
);
tokio::time::sleep(RETRY_DELAY).await;
}
Ok(())
@@ -123,7 +126,8 @@ impl BiliDanmu {
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError> {
let wbi_key = self.get_wbi_key().await?;
let danmu_info = self.get_danmu_info(&wbi_key, self.room_id).await?;
let real_room = self.get_real_room(&wbi_key, self.room_id).await?;
let danmu_info = self.get_danmu_info(&wbi_key, real_room).await?;
let ws_hosts = danmu_info.data.host_list.clone();
let mut conn = None;
log::debug!("ws_hosts: {:?}", ws_hosts);
@@ -152,7 +156,7 @@ impl BiliDanmu {
*self.write.write().await = Some(write);
let json = serde_json::to_string(&WsSend {
roomid: self.room_id,
roomid: real_room,
key: danmu_info.data.token,
uid: self.user_id,
protover: 3,
@@ -239,7 +243,6 @@ impl BiliDanmu {
wbi_key: &str,
room_id: u64,
) -> Result<DanmuInfo, DanmuStreamError> {
let room_id = self.get_real_room(wbi_key, room_id).await?;
let params = self
.get_sign(
wbi_key,

View File

@@ -1,6 +1,8 @@
use serde::Deserialize;
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;
use crate::DanmuStreamError;
#[derive(Debug, Deserialize)]
#[allow(dead_code)]

View File

@@ -1,4 +1,6 @@
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;
use crate::DanmuStreamError;
#[derive(Debug)]
#[allow(dead_code)]

View File

@@ -24,7 +24,7 @@ struct PackHotCount {
type BilibiliPackCtx<'a> = (BilibiliPackHeader, &'a [u8]);
fn pack(buffer: &[u8]) -> Result<BilibiliPackCtx, DanmuStreamError> {
fn pack(buffer: &[u8]) -> Result<BilibiliPackCtx<'_>, DanmuStreamError> {
let data = buffer
.pread_with(0, scroll::BE)
.map_err(|e: scroll::Error| DanmuStreamError::PackError { err: e.to_string() })?;

View File

@@ -1,6 +1,8 @@
use serde::Deserialize;
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;
use crate::DanmuStreamError;
#[derive(Debug, Deserialize)]
#[allow(dead_code)]

View File

@@ -1,10 +1,9 @@
use serde::Deserialize;
use serde_json::Value;
use crate::{
provider::{bilibili::dannmu_msg::BiliDanmuMessage, DanmuMessageType},
DanmuMessage, DanmuStreamError,
};
use super::dannmu_msg::BiliDanmuMessage;
use crate::{provider::DanmuMessageType, DanmuMessage, DanmuStreamError};
#[derive(Debug, Deserialize, Clone)]
pub struct WsStreamCtx {

View File

@@ -1,6 +1,8 @@
use serde::Deserialize;
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;
use crate::DanmuStreamError;
#[derive(Debug, Deserialize)]
#[allow(dead_code)]

View File

@@ -1,4 +1,9 @@
use crate::{provider::DanmuProvider, DanmuMessage, DanmuMessageType, DanmuStreamError};
mod messages;
use std::io::Read;
use std::sync::Arc;
use std::time::{Duration, SystemTime};
use async_trait::async_trait;
use deno_core::v8;
use deno_core::JsRuntime;
@@ -7,11 +12,9 @@ use flate2::read::GzDecoder;
use futures_util::{SinkExt, StreamExt, TryStreamExt};
use log::debug;
use log::{error, info};
use messages::*;
use prost::bytes::Bytes;
use prost::Message;
use std::io::Read;
use std::sync::Arc;
use std::time::{Duration, SystemTime};
use tokio::net::TcpStream;
use tokio::sync::mpsc;
use tokio::sync::RwLock;
@@ -19,8 +22,7 @@ use tokio_tungstenite::{
connect_async, tungstenite::Message as WsMessage, MaybeTlsStream, WebSocketStream,
};
mod messages;
use messages::*;
use crate::{provider::DanmuProvider, DanmuMessage, DanmuMessageType, DanmuStreamError};
const USER_AGENT: &str = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36";
@@ -109,7 +111,7 @@ impl DouyinDanmu {
runtime
.execute_script(
"<crypto-js.min.js>",
deno_core::FastString::Static(crypto_js),
deno_core::FastString::from_static(crypto_js),
)
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to execute crypto-js: {}", e),
@@ -118,7 +120,7 @@ impl DouyinDanmu {
// Load and execute the sign.js file
let js_code = include_str!("douyin/webmssdk.js");
runtime
.execute_script("<sign.js>", deno_core::FastString::Static(js_code))
.execute_script("<sign.js>", deno_core::FastString::from_static(js_code))
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to execute JavaScript: {}", e),
})?;
@@ -126,10 +128,7 @@ impl DouyinDanmu {
// Call the get_wss_url function
let sign_call = format!("get_wss_url(\"{}\")", self.room_id);
let result = runtime
.execute_script(
"<sign_call>",
deno_core::FastString::Owned(sign_call.into_boxed_str()),
)
.execute_script("<sign_call>", deno_core::FastString::from(sign_call))
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to execute JavaScript: {}", e),
})?;
@@ -214,7 +213,7 @@ impl DouyinDanmu {
if let Ok(Some(ack)) = handle_binary_message(&data, &tx, room_id).await {
if let Some(write) = write.write().await.as_mut() {
if let Err(e) =
write.send(WsMessage::Binary(ack.encode_to_vec())).await
write.send(WsMessage::binary(ack.encode_to_vec())).await
{
error!("Failed to send ack: {}", e);
}
@@ -257,7 +256,7 @@ impl DouyinDanmu {
async fn send_heartbeat(tx: &mpsc::Sender<WsMessage>) -> Result<(), DanmuStreamError> {
// heartbeat message: 3A 02 68 62
tx.send(WsMessage::Binary(vec![0x3A, 0x02, 0x68, 0x62]))
tx.send(WsMessage::binary(vec![0x3A, 0x02, 0x68, 0x62]))
.await
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to send heartbeat message: {}", e),

View File

@@ -1,6 +1,7 @@
use prost::Message;
use std::collections::HashMap;
use prost::Message;
// message Response {
// repeated Message messagesList = 1;
// string cursor = 2;

View File

@@ -4,10 +4,10 @@ mod douyin;
use async_trait::async_trait;
use tokio::sync::mpsc;
use crate::{
provider::bilibili::BiliDanmu, provider::douyin::DouyinDanmu, DanmuMessageType,
DanmuStreamError,
};
use self::bilibili::BiliDanmu;
use self::douyin::DouyinDanmu;
use crate::{DanmuMessageType, DanmuStreamError};
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum ProviderType {

File diff suppressed because one or more lines are too long

View File

@@ -1 +1 @@
{"migrated":{"identifier":"migrated","description":"permissions that were migrated from v1","local":true,"windows":["main","Live*","Clip*"],"permissions":["core:default","fs:allow-read-file","fs:allow-write-file","fs:allow-read-dir","fs:allow-copy-file","fs:allow-mkdir","fs:allow-remove","fs:allow-remove","fs:allow-rename","fs:allow-exists",{"identifier":"fs:scope","allow":["**"]},"core:window:default","core:window:allow-start-dragging","core:window:allow-close","core:window:allow-minimize","core:window:allow-maximize","core:window:allow-unmaximize","core:window:allow-set-title","sql:allow-execute","shell:allow-open","dialog:allow-open","dialog:allow-save","dialog:allow-message","dialog:allow-ask","dialog:allow-confirm",{"identifier":"http:default","allow":[{"url":"https://*.hdslb.com/"},{"url":"https://afdian.com/"},{"url":"https://*.afdiancdn.com/"},{"url":"https://*.douyin.com/"},{"url":"https://*.douyinpic.com/"}]},"dialog:default","shell:default","fs:default","http:default","sql:default","os:default","notification:default","dialog:default","fs:default","http:default","shell:default","sql:default","os:default","dialog:default"]}}
{"migrated":{"identifier":"migrated","description":"permissions that were migrated from v1","local":true,"windows":["main","Live*","Clip*"],"permissions":["core:default","fs:allow-read-file","fs:allow-write-file","fs:allow-read-dir","fs:allow-copy-file","fs:allow-mkdir","fs:allow-remove","fs:allow-remove","fs:allow-rename","fs:allow-exists",{"identifier":"fs:scope","allow":["**"]},"core:window:default","core:window:allow-start-dragging","core:window:allow-close","core:window:allow-minimize","core:window:allow-maximize","core:window:allow-unmaximize","core:window:allow-set-title","sql:allow-execute","shell:allow-open","dialog:allow-open","dialog:allow-save","dialog:allow-message","dialog:allow-ask","dialog:allow-confirm",{"identifier":"http:default","allow":[{"url":"https://*.hdslb.com/"},{"url":"https://afdian.com/"},{"url":"https://*.afdiancdn.com/"},{"url":"https://*.douyin.com/"},{"url":"https://*.douyinpic.com/"}]},"dialog:default","shell:default","fs:default","http:default","sql:default","os:default","notification:default","dialog:default","fs:default","http:default","shell:default","sql:default","os:default","dialog:default","deep-link:default"]}}

View File

@@ -4220,6 +4220,60 @@
"const": "core:window:deny-unminimize",
"markdownDescription": "Denies the unminimize command without any pre-configured scope."
},
{
"description": "Allows reading the opened deep link via the get_current command\n#### This default permission set includes:\n\n- `allow-get-current`",
"type": "string",
"const": "deep-link:default",
"markdownDescription": "Allows reading the opened deep link via the get_current command\n#### This default permission set includes:\n\n- `allow-get-current`"
},
{
"description": "Enables the get_current command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-get-current",
"markdownDescription": "Enables the get_current command without any pre-configured scope."
},
{
"description": "Enables the is_registered command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-is-registered",
"markdownDescription": "Enables the is_registered command without any pre-configured scope."
},
{
"description": "Enables the register command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-register",
"markdownDescription": "Enables the register command without any pre-configured scope."
},
{
"description": "Enables the unregister command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-unregister",
"markdownDescription": "Enables the unregister command without any pre-configured scope."
},
{
"description": "Denies the get_current command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-get-current",
"markdownDescription": "Denies the get_current command without any pre-configured scope."
},
{
"description": "Denies the is_registered command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-is-registered",
"markdownDescription": "Denies the is_registered command without any pre-configured scope."
},
{
"description": "Denies the register command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-register",
"markdownDescription": "Denies the register command without any pre-configured scope."
},
{
"description": "Denies the unregister command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-unregister",
"markdownDescription": "Denies the unregister command without any pre-configured scope."
},
{
"description": "This permission set configures the types of dialogs\navailable from the dialog plugin.\n\n#### Granted Permissions\n\nAll dialog types are enabled.\n\n\n\n#### This default permission set includes:\n\n- `allow-ask`\n- `allow-confirm`\n- `allow-message`\n- `allow-save`\n- `allow-open`",
"type": "string",

View File

@@ -4220,6 +4220,60 @@
"const": "core:window:deny-unminimize",
"markdownDescription": "Denies the unminimize command without any pre-configured scope."
},
{
"description": "Allows reading the opened deep link via the get_current command\n#### This default permission set includes:\n\n- `allow-get-current`",
"type": "string",
"const": "deep-link:default",
"markdownDescription": "Allows reading the opened deep link via the get_current command\n#### This default permission set includes:\n\n- `allow-get-current`"
},
{
"description": "Enables the get_current command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-get-current",
"markdownDescription": "Enables the get_current command without any pre-configured scope."
},
{
"description": "Enables the is_registered command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-is-registered",
"markdownDescription": "Enables the is_registered command without any pre-configured scope."
},
{
"description": "Enables the register command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-register",
"markdownDescription": "Enables the register command without any pre-configured scope."
},
{
"description": "Enables the unregister command without any pre-configured scope.",
"type": "string",
"const": "deep-link:allow-unregister",
"markdownDescription": "Enables the unregister command without any pre-configured scope."
},
{
"description": "Denies the get_current command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-get-current",
"markdownDescription": "Denies the get_current command without any pre-configured scope."
},
{
"description": "Denies the is_registered command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-is-registered",
"markdownDescription": "Denies the is_registered command without any pre-configured scope."
},
{
"description": "Denies the register command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-register",
"markdownDescription": "Denies the register command without any pre-configured scope."
},
{
"description": "Denies the unregister command without any pre-configured scope.",
"type": "string",
"const": "deep-link:deny-unregister",
"markdownDescription": "Denies the unregister command without any pre-configured scope."
},
{
"description": "This permission set configures the types of dialogs\navailable from the dialog plugin.\n\n#### Granted Permissions\n\nAll dialog types are enabled.\n\n\n\n#### This default permission set includes:\n\n- `allow-ask`\n- `allow-confirm`\n- `allow-message`\n- `allow-save`\n- `allow-open`",
"type": "string",

File diff suppressed because it is too large Load Diff

View File

@@ -1,56 +0,0 @@
use std::path::PathBuf;
use std::sync::Arc;
use chrono::Utc;
use crate::database::Database;
use crate::recorder::PlatformType;
pub async fn try_rebuild_archives(
db: &Arc<Database>,
cache_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
let rooms = db.get_recorders().await?;
for room in rooms {
let room_id = room.room_id;
let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
let mut files = tokio::fs::read_dir(room_cache_path).await?;
while let Some(file) = files.next_entry().await? {
if file.file_type().await?.is_dir() {
// use folder name as live_id
let live_id = file.file_name();
let live_id = live_id.to_str().unwrap();
// check if live_id is in db
let record = db.get_record(room_id, live_id).await;
if record.is_ok() {
continue;
}
// get created_at from folder metadata
let metadata = file.metadata().await?;
let created_at = metadata.created();
if created_at.is_err() {
continue;
}
let created_at = created_at.unwrap();
let created_at = chrono::DateTime::<Utc>::from(created_at)
.format("%Y-%m-%dT%H:%M:%S.%fZ")
.to_string();
// create a record for this live_id
let record = db
.add_record(
PlatformType::from_str(room.platform.as_str()).unwrap(),
live_id,
room_id,
&format!("UnknownLive {}", live_id),
None,
Some(&created_at),
)
.await?;
log::info!("rebuild archive {:?}", record);
}
}
}
Ok(())
}

View File

@@ -1,6 +1,6 @@
use std::path::{Path, PathBuf};
use chrono::Utc;
use chrono::Local;
use serde::{Deserialize, Serialize};
use crate::{recorder::PlatformType, recorder_manager::ClipRangeParams};
@@ -35,6 +35,12 @@ pub struct Config {
pub config_path: String,
#[serde(default = "default_whisper_language")]
pub whisper_language: String,
#[serde(default = "default_user_agent")]
pub user_agent: String,
#[serde(default = "default_cleanup_source_flv")]
pub cleanup_source_flv_after_import: bool,
#[serde(default = "default_webhook_url")]
pub webhook_url: String,
}
#[derive(Deserialize, Serialize, Clone)]
@@ -86,6 +92,18 @@ fn default_whisper_language() -> String {
"auto".to_string()
}
fn default_user_agent() -> String {
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36".to_string()
}
fn default_cleanup_source_flv() -> bool {
false
}
fn default_webhook_url() -> String {
"".to_string()
}
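Taken together, the new fields serialize into the config file with these defaults (a sketch of the resulting TOML keys; the exact layout depends on serde ordering):

user_agent = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36"
cleanup_source_flv_after_import = false
webhook_url = ""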
impl Config {
pub fn load(
config_path: &PathBuf,
@@ -123,6 +141,9 @@ impl Config {
status_check_interval: default_status_check_interval(),
config_path: config_path.to_str().unwrap().into(),
whisper_language: default_whisper_language(),
user_agent: default_user_agent(),
cleanup_source_flv_after_import: default_cleanup_source_flv(),
webhook_url: default_webhook_url(),
};
config.save();
@@ -155,6 +176,18 @@ impl Config {
self.save();
}
#[allow(dead_code)]
pub fn set_user_agent(&mut self, user_agent: &str) {
self.user_agent = user_agent.to_string();
self.save();
}
#[allow(dead_code)]
pub fn set_cleanup_source_flv(&mut self, cleanup: bool) {
self.cleanup_source_flv_after_import = cleanup;
self.save();
}
pub fn generate_clip_name(&self, params: &ClipRangeParams) -> PathBuf {
let platform = PlatformType::from_str(&params.platform).unwrap();
@@ -170,13 +203,31 @@ impl Config {
let format_config = format_config.replace("{platform}", platform.as_str());
let format_config = format_config.replace("{room_id}", &params.room_id.to_string());
let format_config = format_config.replace("{live_id}", &params.live_id);
let format_config = format_config.replace("{x}", &params.x.to_string());
let format_config = format_config.replace("{y}", &params.y.to_string());
let format_config = format_config.replace(
"{x}",
&params
.range
.as_ref()
.map_or("0".to_string(), |r| r.start.to_string()),
);
let format_config = format_config.replace(
"{y}",
&params
.range
.as_ref()
.map_or("0".to_string(), |r| r.end.to_string()),
);
let format_config = format_config.replace(
"{created_at}",
&Utc::now().format("%Y-%m-%d_%H-%M-%S").to_string(),
&Local::now().format("%Y-%m-%d_%H-%M-%S").to_string(),
);
let format_config = format_config.replace("{length}", &(params.y - params.x).to_string());
let format_config = format_config.replace(
"{length}",
&params
.range
.as_ref()
.map_or("0".to_string(), |r| r.duration().to_string()),
);
let output = self.output.clone();

View File

@@ -0,0 +1,4 @@
pub const PREFIX_SUBTITLE: &str = "[subtitle]";
pub const PREFIX_IMPORTED: &str = "[imported]";
pub const PREFIX_DANMAKU: &str = "[danmaku]";
pub const PREFIX_CLIP: &str = "[clip]";
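These constants tag derived files by prefixing the original file name, as the ffmpeg helpers later in this diff do. A hypothetical one-liner (not in the diff) illustrating the pattern:

// Hypothetical helper: prefix a file name with one of the markers above
fn tag_with(prefix: &str, file: &std::path::Path) -> std::path::PathBuf {
    let name = file.file_name().unwrap().to_string_lossy();
    file.with_file_name(format!("{prefix}{name}"))
}
// tag_with(PREFIX_DANMAKU, Path::new("clip.mp4")) yields "[danmaku]clip.mp4"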

View File

@@ -6,10 +6,10 @@ use chrono::Utc;
use rand::seq::SliceRandom;
use rand::Rng;
#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
pub struct AccountRow {
pub platform: String,
pub uid: u64, // Keep for Bilibili compatibility
pub uid: u64, // Keep for Bilibili compatibility
pub id_str: Option<String>, // New field for string IDs like Douyin sec_uid
pub name: String,
pub avatar: String,
@@ -133,7 +133,7 @@ impl Database {
avatar: &str,
) -> Result<(), DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
// If the id_str changed, we need to delete the old record and create a new one
if old_account.id_str.as_deref() != Some(new_id_str) {
// Delete the old record (for Douyin accounts, we use uid to identify)
@@ -142,7 +142,7 @@ impl Database {
.bind(&old_account.platform)
.execute(&lock)
.await?;
// Insert the new record with updated id_str
sqlx::query("INSERT INTO accounts (uid, platform, id_str, name, avatar, csrf, cookies, created_at) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)")
.bind(old_account.uid as i64)
@@ -157,15 +157,17 @@ impl Database {
.await?;
} else {
// id_str is the same, just update name and avatar
sqlx::query("UPDATE accounts SET name = $1, avatar = $2 WHERE uid = $3 and platform = $4")
.bind(name)
.bind(avatar)
.bind(old_account.uid as i64)
.bind(&old_account.platform)
.execute(&lock)
.await?;
sqlx::query(
"UPDATE accounts SET name = $1, avatar = $2 WHERE uid = $3 and platform = $4",
)
.bind(name)
.bind(avatar)
.bind(old_account.uid as i64)
.bind(&old_account.platform)
.execute(&lock)
.await?;
}
Ok(())
}

View File

@@ -4,7 +4,7 @@ use super::Database;
use super::DatabaseError;
use chrono::Utc;
#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
pub struct RecordRow {
pub platform: String,
pub live_id: String,
@@ -18,14 +18,21 @@ pub struct RecordRow {
// CREATE TABLE records (live_id INTEGER PRIMARY KEY, room_id INTEGER, title TEXT, length INTEGER, size INTEGER, created_at TEXT);
impl Database {
pub async fn get_records(&self, room_id: u64) -> Result<Vec<RecordRow>, DatabaseError> {
pub async fn get_records(
&self,
room_id: u64,
offset: u64,
limit: u64,
) -> Result<Vec<RecordRow>, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
Ok(
sqlx::query_as::<_, RecordRow>("SELECT * FROM records WHERE room_id = $1")
.bind(room_id as i64)
.fetch_all(&lock)
.await?,
Ok(sqlx::query_as::<_, RecordRow>(
"SELECT * FROM records WHERE room_id = $1 ORDER BY created_at DESC LIMIT $2 OFFSET $3",
)
.bind(room_id as i64)
.bind(limit as i64)
.bind(offset as i64)
.fetch_all(&lock)
.await?)
}
pub async fn get_record(
@@ -35,10 +42,10 @@ impl Database {
) -> Result<RecordRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
Ok(sqlx::query_as::<_, RecordRow>(
"SELECT * FROM records WHERE live_id = $1 and room_id = $2",
"SELECT * FROM records WHERE room_id = $1 and live_id = $2",
)
.bind(live_id)
.bind(room_id as i64)
.bind(live_id)
.fetch_one(&lock)
.await?)
}
@@ -73,13 +80,17 @@ impl Database {
Ok(record)
}
pub async fn remove_record(&self, live_id: &str) -> Result<(), DatabaseError> {
pub async fn remove_record(&self, live_id: &str) -> Result<RecordRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let to_delete = sqlx::query_as::<_, RecordRow>("SELECT * FROM records WHERE live_id = $1")
.bind(live_id)
.fetch_one(&lock)
.await?;
sqlx::query("DELETE FROM records WHERE live_id = $1")
.bind(live_id)
.execute(&lock)
.await?;
Ok(())
Ok(to_delete)
}
pub async fn update_record(
@@ -98,6 +109,20 @@ impl Database {
Ok(())
}
pub async fn update_record_cover(
&self,
live_id: &str,
cover: Option<String>,
) -> Result<(), DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
sqlx::query("UPDATE records SET cover = $1 WHERE live_id = $2")
.bind(cover)
.bind(live_id)
.execute(&lock)
.await?;
Ok(())
}
pub async fn get_total_length(&self) -> Result<i64, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let result: (i64,) = sqlx::query_as("SELECT SUM(length) FROM records;")
@@ -133,9 +158,9 @@ impl Database {
"SELECT * FROM records ORDER BY created_at DESC LIMIT $1 OFFSET $2",
)
.bind(limit as i64)
.bind(offset as i64)
.fetch_all(&lock)
.await?)
.bind(offset as i64)
.fetch_all(&lock)
.await?)
} else {
Ok(sqlx::query_as::<_, RecordRow>(
"SELECT * FROM records WHERE room_id = $1 ORDER BY created_at DESC LIMIT $2 OFFSET $3",
@@ -147,4 +172,12 @@ impl Database {
.await?)
}
}
pub async fn get_record_disk_usage(&self) -> Result<u64, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let result: (i64,) = sqlx::query_as("SELECT SUM(size) FROM records;")
.fetch_one(&lock)
.await?;
Ok(result.0 as u64)
}
}
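get_records is now paginated and sorted by created_at, so callers page through history instead of loading everything at once. A hedged usage sketch (loop and variable names assumed, not from the diff):

let page_size = 50u64;
let mut offset = 0u64;
loop {
    let page = db.get_records(room_id, offset, page_size).await?;
    if page.is_empty() {
        break; // no more records
    }
    for record in &page {
        println!("{} ({})", record.live_id, record.created_at);
    }
    offset += page.len() as u64;
}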

View File

@@ -4,12 +4,13 @@ use crate::recorder::PlatformType;
use chrono::Utc;
/// Recorder in database is pretty simple
/// because many room infos are collected in realtime
#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
pub struct RecorderRow {
pub room_id: u64,
pub created_at: String,
pub platform: String,
pub auto_start: bool,
pub extra: String,
}
// recorders
@@ -18,6 +19,7 @@ impl Database {
&self,
platform: PlatformType,
room_id: u64,
extra: &str,
) -> Result<RecorderRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let recorder = RecorderRow {
@@ -25,21 +27,28 @@ impl Database {
created_at: Utc::now().to_rfc3339(),
platform: platform.as_str().to_string(),
auto_start: true,
extra: extra.to_string(),
};
let _ = sqlx::query(
"INSERT INTO recorders (room_id, created_at, platform, auto_start) VALUES ($1, $2, $3, $4)",
"INSERT OR REPLACE INTO recorders (room_id, created_at, platform, auto_start, extra) VALUES ($1, $2, $3, $4, $5)",
)
.bind(room_id as i64)
.bind(&recorder.created_at)
.bind(platform.as_str())
.bind(recorder.auto_start)
.bind(extra)
.execute(&lock)
.await?;
Ok(recorder)
}
pub async fn remove_recorder(&self, room_id: u64) -> Result<(), DatabaseError> {
pub async fn remove_recorder(&self, room_id: u64) -> Result<RecorderRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let recorder =
sqlx::query_as::<_, RecorderRow>("SELECT * FROM recorders WHERE room_id = $1")
.bind(room_id as i64)
.fetch_one(&lock)
.await?;
let sql = sqlx::query("DELETE FROM recorders WHERE room_id = $1")
.bind(room_id as i64)
.execute(&lock)
@@ -50,13 +59,13 @@ impl Database {
// remove related archive
let _ = self.remove_archive(room_id).await;
Ok(())
Ok(recorder)
}
pub async fn get_recorders(&self) -> Result<Vec<RecorderRow>, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
Ok(sqlx::query_as::<_, RecorderRow>(
"SELECT room_id, created_at, platform, auto_start FROM recorders",
"SELECT room_id, created_at, platform, auto_start, extra FROM recorders",
)
.fetch_all(&lock)
.await?)

View File

@@ -2,29 +2,13 @@ use super::Database;
use super::DatabaseError;
// CREATE TABLE videos (id INTEGER PRIMARY KEY, room_id INTEGER, cover TEXT, file TEXT, length INTEGER, size INTEGER, status INTEGER, bvid TEXT, title TEXT, desc TEXT, tags TEXT, area INTEGER, created_at TEXT);
#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
pub struct VideoRow {
pub id: i64,
pub room_id: u64,
pub cover: String,
pub file: String,
pub length: i64,
pub size: i64,
pub status: i64,
pub bvid: String,
pub title: String,
pub desc: String,
pub tags: String,
pub area: i64,
pub created_at: String,
pub platform: String,
}
#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
pub struct VideoNoCover {
pub id: i64,
pub room_id: u64,
pub file: String,
pub note: String,
pub length: i64,
pub size: i64,
pub status: i64,
@@ -38,9 +22,9 @@ pub struct VideoNoCover {
}
impl Database {
pub async fn get_videos(&self, room_id: u64) -> Result<Vec<VideoNoCover>, DatabaseError> {
pub async fn get_videos(&self, room_id: u64) -> Result<Vec<VideoRow>, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let videos = sqlx::query_as::<_, VideoNoCover>("SELECT * FROM videos WHERE room_id = $1;")
let videos = sqlx::query_as::<_, VideoRow>("SELECT * FROM videos WHERE room_id = $1;")
.bind(room_id as i64)
.fetch_all(&lock)
.await?;
@@ -59,13 +43,14 @@ impl Database {
pub async fn update_video(&self, video_row: &VideoRow) -> Result<(), DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
sqlx::query("UPDATE videos SET status = $1, bvid = $2, title = $3, desc = $4, tags = $5, area = $6 WHERE id = $7")
sqlx::query("UPDATE videos SET status = $1, bvid = $2, title = $3, desc = $4, tags = $5, area = $6, note = $7 WHERE id = $8")
.bind(video_row.status)
.bind(&video_row.bvid)
.bind(&video_row.title)
.bind(&video_row.desc)
.bind(&video_row.tags)
.bind(video_row.area)
.bind(&video_row.note)
.bind(video_row.id)
.execute(&lock)
.await?;
@@ -83,10 +68,11 @@ impl Database {
pub async fn add_video(&self, video: &VideoRow) -> Result<VideoRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let sql = sqlx::query("INSERT INTO videos (room_id, cover, file, length, size, status, bvid, title, desc, tags, area, created_at, platform) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)")
let sql = sqlx::query("INSERT INTO videos (room_id, cover, file, note, length, size, status, bvid, title, desc, tags, area, created_at, platform) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14)")
.bind(video.room_id as i64)
.bind(&video.cover)
.bind(&video.file)
.bind(&video.note)
.bind(video.length)
.bind(video.size)
.bind(video.status)
@@ -116,10 +102,10 @@ impl Database {
Ok(())
}
pub async fn get_all_videos(&self) -> Result<Vec<VideoNoCover>, DatabaseError> {
pub async fn get_all_videos(&self) -> Result<Vec<VideoRow>, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let videos =
sqlx::query_as::<_, VideoNoCover>("SELECT * FROM videos ORDER BY created_at DESC;")
sqlx::query_as::<_, VideoRow>("SELECT * FROM videos ORDER BY created_at DESC;")
.fetch_all(&lock)
.await?;
Ok(videos)

View File

@@ -1,17 +1,56 @@
use std::fmt;
use std::path::{Path, PathBuf};
use std::process::Stdio;
use crate::constants;
use crate::progress_reporter::{ProgressReporter, ProgressReporterTrait};
use crate::subtitle_generator::{whisper_cpp, GenerateResult, SubtitleGenerator, SubtitleGeneratorType};
use crate::subtitle_generator::whisper_online;
use crate::subtitle_generator::{
whisper_cpp, GenerateResult, SubtitleGenerator, SubtitleGeneratorType,
};
use async_ffmpeg_sidecar::event::{FfmpegEvent, LogLevel};
use async_ffmpeg_sidecar::log_parser::FfmpegLogParser;
use tokio::io::BufReader;
use serde::{Deserialize, Serialize};
use tokio::io::{AsyncBufReadExt, BufReader};
// Video metadata structure
#[derive(Debug)]
pub struct VideoMetadata {
pub duration: f64,
pub width: u32,
pub height: u32,
}
#[cfg(target_os = "windows")]
const CREATE_NO_WINDOW: u32 = 0x08000000;
#[cfg(target_os = "windows")]
#[allow(unused_imports)]
use std::os::windows::process::CommandExt;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Range {
pub start: f64,
pub end: f64,
}
impl fmt::Display for Range {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "[{}, {}]", self.start, self.end)
}
}
impl Range {
pub fn duration(&self) -> f64 {
self.end - self.start
}
}
pub async fn clip_from_m3u8(
reporter: Option<&impl ProgressReporterTrait>,
m3u8_index: &Path,
output_path: &Path,
range: Option<&Range>,
fix_encoding: bool,
) -> Result<(), String> {
// first check output folder exists
let output_folder = output_path.parent().unwrap();
@@ -23,9 +62,28 @@ pub async fn clip_from_m3u8(
std::fs::create_dir_all(output_folder).unwrap();
}
let child = tokio::process::Command::new(ffmpeg_path())
.args(["-i", &format!("{}", m3u8_index.display())])
.args(["-c", "copy"])
let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
let child_command = ffmpeg_process.args(["-i", &format!("{}", m3u8_index.display())]);
if let Some(range) = range {
child_command
.args(["-ss", &range.start.to_string()])
.args(["-t", &range.duration().to_string()]);
}
if fix_encoding {
child_command
.args(["-c:v", "libx264"])
.args(["-c:a", "copy"])
.args(["-b:v", "6000k"]);
} else {
child_command.args(["-c", "copy"]);
}
let child = child_command
.args(["-y", output_path.to_str().unwrap()])
.args(["-progress", "pipe:2"])
.stderr(Stdio::piped())
@@ -47,6 +105,7 @@ pub async fn clip_from_m3u8(
if reporter.is_none() {
continue;
}
log::debug!("Clip progress: {}", p.time);
reporter
.unwrap()
.update(format!("编码中:{}", p.time).as_str())
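A hedged call-site sketch for the extended signature above (paths are assumptions, and it presumes ProgressReporter implements ProgressReporterTrait): clip a 30-second window starting at the 60-second mark and force a re-encode.

let range = Range { start: 60.0, end: 90.0 };
clip_from_m3u8(
    None::<&ProgressReporter>, // no progress reporting in this sketch
    Path::new("cache/bilibili/12345/live_id/playlist.m3u8"),
    Path::new("output/clip.mp4"),
    Some(&range),
    true, // fix_encoding: re-encode video with libx264 at 6000k instead of stream copy
)
.await?;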
@@ -161,10 +220,11 @@ pub async fn extract_audio_chunks(file: &Path, format: &str) -> Result<PathBuf,
args.push(segment_pattern_str);
let child = tokio::process::Command::new(ffmpeg_path())
.args(&args)
.stderr(Stdio::piped())
.spawn();
let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
let child = ffmpeg_process.args(&args).stderr(Stdio::piped()).spawn();
if let Err(e) = child {
return Err(e.to_string());
@@ -208,7 +268,11 @@ pub async fn extract_audio_chunks(file: &Path, format: &str) -> Result<PathBuf,
/// Get the duration of an audio/video file in seconds
async fn get_audio_duration(file: &Path) -> Result<u64, String> {
// Use ffprobe with format option to get duration
let child = tokio::process::Command::new(ffprobe_path())
let mut ffprobe_process = tokio::process::Command::new(ffprobe_path());
#[cfg(target_os = "windows")]
ffprobe_process.creation_flags(CREATE_NO_WINDOW);
let child = ffprobe_process
.args(["-v", "quiet"])
.args(["-show_entries", "format=duration"])
.args(["-of", "csv=p=0"])
@@ -249,6 +313,58 @@ async fn get_audio_duration(file: &Path) -> Result<u64, String> {
duration.ok_or_else(|| "Failed to parse duration".to_string())
}
/// Get the precise duration of a video segment (TS/MP4) in seconds
pub async fn get_segment_duration(file: &Path) -> Result<f64, String> {
// Use ffprobe to get the exact duration of the segment
let mut ffprobe_process = tokio::process::Command::new(ffprobe_path());
#[cfg(target_os = "windows")]
ffprobe_process.creation_flags(CREATE_NO_WINDOW);
let child = ffprobe_process
.args(["-v", "quiet"])
.args(["-show_entries", "format=duration"])
.args(["-of", "csv=p=0"])
.args(["-i", file.to_str().unwrap()])
.stdout(Stdio::piped())
.stderr(Stdio::piped())
.spawn();
if let Err(e) = child {
return Err(format!(
"Failed to spawn ffprobe process for segment: {}",
e
));
}
let mut child = child.unwrap();
let stdout = child.stdout.take().unwrap();
let reader = BufReader::new(stdout);
let mut parser = FfmpegLogParser::new(reader);
let mut duration = None;
while let Ok(event) = parser.parse_next_event().await {
match event {
FfmpegEvent::LogEOF => break,
FfmpegEvent::Log(_level, content) => {
// Parse the exact duration as f64 for precise timing
if let Ok(seconds_f64) = content.trim().parse::<f64>() {
duration = Some(seconds_f64);
log::debug!("Parsed segment duration: {} seconds", seconds_f64);
}
}
_ => {}
}
}
if let Err(e) = child.wait().await {
log::error!("Failed to get segment duration: {}", e);
return Err(e.to_string());
}
duration.ok_or_else(|| "Failed to parse segment duration".to_string())
}
/// Encode video subtitle using ffmpeg, output is file name with prefix [subtitle]
pub async fn encode_video_subtitle(
reporter: &impl ProgressReporterTrait,
file: &Path,
@@ -257,15 +373,21 @@ pub async fn encode_video_subtitle(
) -> Result<String, String> {
// ffmpeg -i fixed_\[30655190\]1742887114_0325084106_81.5.mp4 -vf "subtitles=test.srt:force_style='FontSize=24'" -c:v libx264 -c:a copy output.mp4
log::info!("Encode video subtitle task start: {}", file.display());
log::info!("srt_style: {}", srt_style);
log::info!("SRT style: {}", srt_style);
// output path is file with prefix [subtitle]
let output_filename = format!("[subtitle]{}", file.file_name().unwrap().to_str().unwrap());
let output_filename = format!(
"{}{}",
constants::PREFIX_SUBTITLE,
file.file_name().unwrap().to_str().unwrap()
);
let output_path = file.with_file_name(&output_filename);
// check output path exists
// check output path exists - log but allow overwrite
if output_path.exists() {
log::info!("Output path already exists: {}", output_path.display());
return Err("Output path already exists".to_string());
log::info!(
"Output path already exists, will overwrite: {}",
output_path.display()
);
}
let mut command_error = None;
@@ -285,11 +407,16 @@ pub async fn encode_video_subtitle(
let vf = format!("subtitles={}:force_style='{}'", subtitle, srt_style);
log::info!("vf: {}", vf);
let child = tokio::process::Command::new(ffmpeg_path())
let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
let child = ffmpeg_process
.args(["-i", file.to_str().unwrap()])
.args(["-vf", vf.as_str()])
.args(["-c:v", "libx264"])
.args(["-c:a", "copy"])
.args(["-b:v", "6000k"])
.args([output_path.to_str().unwrap()])
.args(["-y"])
.args(["-progress", "pipe:2"])
@@ -342,13 +469,19 @@ pub async fn encode_video_danmu(
) -> Result<PathBuf, String> {
// ffmpeg -i fixed_\[30655190\]1742887114_0325084106_81.5.mp4 -vf ass=subtitle.ass -c:v libx264 -c:a copy output.mp4
log::info!("Encode video danmu task start: {}", file.display());
let danmu_filename = format!("[danmu]{}", file.file_name().unwrap().to_str().unwrap());
let output_path = file.with_file_name(danmu_filename);
let danmu_filename = format!(
"{}{}",
constants::PREFIX_DANMAKU,
file.file_name().unwrap().to_str().unwrap()
);
let output_file_path = file.with_file_name(danmu_filename);
// check output path exists
if output_path.exists() {
log::info!("Output path already exists: {}", output_path.display());
return Err("Output path already exists".to_string());
// check output path exists - log but allow overwrite
if output_file_path.exists() {
log::info!(
"Output path already exists, will overwrite: {}",
output_file_path.display()
);
}
let mut command_error = None;
@@ -366,12 +499,17 @@ pub async fn encode_video_danmu(
format!("'{}'", subtitle.display())
};
let child = tokio::process::Command::new(ffmpeg_path())
let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
let child = ffmpeg_process
.args(["-i", file.to_str().unwrap()])
.args(["-vf", &format!("ass={}", subtitle)])
.args(["-c:v", "libx264"])
.args(["-c:a", "copy"])
.args([output_path.to_str().unwrap()])
.args(["-b:v", "6000k"])
.args([output_file_path.to_str().unwrap()])
.args(["-y"])
.args(["-progress", "pipe:2"])
.stderr(Stdio::piped())
@@ -393,7 +531,7 @@ pub async fn encode_video_danmu(
command_error = Some(e.to_string());
}
FfmpegEvent::Progress(p) => {
log::info!("Encode video danmu progress: {}", p.time);
log::debug!("Encode video danmu progress: {}", p.time);
if reporter.is_none() {
continue;
}
@@ -416,19 +554,20 @@ pub async fn encode_video_danmu(
log::error!("Encode video danmu error: {}", error);
Err(error)
} else {
log::info!("Encode video danmu task end: {}", output_path.display());
Ok(output_path)
log::info!(
"Encode video danmu task end: {}",
output_file_path.display()
);
Ok(output_file_path)
}
}
pub async fn generic_ffmpeg_command(
args: &[&str],
) -> Result<String, String> {
let child = tokio::process::Command::new(ffmpeg_path())
.args(args)
.stderr(Stdio::piped())
.spawn();
pub async fn generic_ffmpeg_command(args: &[&str]) -> Result<String, String> {
let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
let child = ffmpeg_process.args(args).stderr(Stdio::piped()).spawn();
if let Err(e) = child {
return Err(e.to_string());
}
@@ -474,8 +613,7 @@ pub async fn generate_video_subtitle(
if whisper_model.is_empty() {
return Err("Whisper model not configured".to_string());
}
if let Ok(generator) =
whisper_cpp::new(Path::new(&whisper_model), whisper_prompt).await
if let Ok(generator) = whisper_cpp::new(Path::new(&whisper_model), whisper_prompt).await
{
let chunk_dir = extract_audio_chunks(file, "wav").await?;
@@ -584,7 +722,6 @@ pub async fn generate_video_subtitle(
}
}
/// Trying to run ffmpeg for version
pub async fn check_ffmpeg() -> Result<String, String> {
let child = tokio::process::Command::new(ffmpeg_path())
@@ -624,6 +761,52 @@ pub async fn check_ffmpeg() -> Result<String, String> {
}
}
pub async fn get_video_resolution(file: &str) -> Result<String, String> {
// ffprobe -v error -select_streams v:0 -show_entries stream=width,height -of csv=s=x:p=0 input.mp4
let mut ffprobe_process = tokio::process::Command::new(ffprobe_path());
#[cfg(target_os = "windows")]
ffprobe_process.creation_flags(CREATE_NO_WINDOW);
let child = ffprobe_process
.arg("-i")
.arg(file)
.arg("-v")
.arg("error")
.arg("-select_streams")
.arg("v:0")
.arg("-show_entries")
.arg("stream=width,height")
.arg("-of")
.arg("csv=s=x:p=0")
.stdout(Stdio::piped())
.spawn();
if let Err(e) = child {
log::error!("Faild to spwan ffprobe process: {e}");
return Err(e.to_string());
}
let mut child = child.unwrap();
let stdout = child.stdout.take();
if stdout.is_none() {
log::error!("Failed to take ffprobe output");
return Err("Failed to take ffprobe output".into());
}
let stdout = stdout.unwrap();
let reader = BufReader::new(stdout);
let mut lines = reader.lines();
let line = lines.next_line().await.unwrap();
if line.is_none() {
return Err("Failed to parse resolution from output".into());
}
let line = line.unwrap();
let resolution = line.split("x").collect::<Vec<&str>>();
if resolution.len() != 2 {
return Err("Failed to parse resolution from output".into());
}
Ok(format!("{}x{}", resolution[0], resolution[1]))
}
fn ffmpeg_path() -> PathBuf {
let mut path = Path::new("ffmpeg").to_path_buf();
if cfg!(windows) {
@@ -641,3 +824,567 @@ fn ffprobe_path() -> PathBuf {
path
}
// Clip from a video file
pub async fn clip_from_video_file(
reporter: Option<&impl ProgressReporterTrait>,
input_path: &Path,
output_path: &Path,
start_time: f64,
duration: f64,
) -> Result<(), String> {
let output_folder = output_path.parent().unwrap();
if !output_folder.exists() {
std::fs::create_dir_all(output_folder).unwrap();
}
let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
let child = ffmpeg_process
.args(["-i", &format!("{}", input_path.display())])
.args(["-ss", &start_time.to_string()])
.args(["-t", &duration.to_string()])
.args(["-c:v", "libx264"])
.args(["-c:a", "aac"])
.args(["-b:v", "6000k"])
.args(["-avoid_negative_ts", "make_zero"])
.args(["-y", output_path.to_str().unwrap()])
.args(["-progress", "pipe:2"])
.stderr(Stdio::piped())
.spawn();
if let Err(e) = child {
return Err(format!("启动ffmpeg进程失败: {}", e));
}
let mut child = child.unwrap();
let stderr = child.stderr.take().unwrap();
let reader = BufReader::new(stderr);
let mut parser = FfmpegLogParser::new(reader);
let mut clip_error = None;
while let Ok(event) = parser.parse_next_event().await {
match event {
FfmpegEvent::Progress(p) => {
if let Some(reporter) = reporter {
reporter.update(&format!("切片进度: {}", p.time));
}
}
FfmpegEvent::LogEOF => break,
FfmpegEvent::Log(level, content) => {
if content.contains("error") || level == LogLevel::Error {
log::error!("切片错误: {}", content);
}
}
FfmpegEvent::Error(e) => {
log::error!("切片错误: {}", e);
clip_error = Some(e.to_string());
}
_ => {}
}
}
if let Err(e) = child.wait().await {
return Err(e.to_string());
}
if let Some(error) = clip_error {
Err(error)
} else {
log::info!("切片任务完成: {}", output_path.display());
Ok(())
}
}
/// Extract basic information from a video file.
///
/// # Arguments
/// * `file_path` - The path to the video file.
///
/// # Returns
/// A `Result` containing the video metadata or an error message.
pub async fn extract_video_metadata(file_path: &Path) -> Result<VideoMetadata, String> {
let mut ffprobe_process = tokio::process::Command::new("ffprobe");
#[cfg(target_os = "windows")]
ffprobe_process.creation_flags(CREATE_NO_WINDOW);
let output = ffprobe_process
.args([
"-v",
"quiet",
"-print_format",
"json",
"-show_format",
"-show_streams",
"-select_streams",
"v:0",
&format!("{}", file_path.display()),
])
.output()
.await
.map_err(|e| format!("执行ffprobe失败: {}", e))?;
if !output.status.success() {
return Err(format!(
"ffprobe执行失败: {}",
String::from_utf8_lossy(&output.stderr)
));
}
let json_str = String::from_utf8_lossy(&output.stdout);
let json: serde_json::Value =
serde_json::from_str(&json_str).map_err(|e| format!("解析ffprobe输出失败: {}", e))?;
// Parse the video stream info
let streams = json["streams"].as_array().ok_or("未找到视频流信息")?;
if streams.is_empty() {
return Err("未找到视频流".to_string());
}
let video_stream = &streams[0];
let format = &json["format"];
let duration = format["duration"]
.as_str()
.and_then(|d| d.parse::<f64>().ok())
.unwrap_or(0.0);
let width = video_stream["width"].as_u64().unwrap_or(0) as u32;
let height = video_stream["height"].as_u64().unwrap_or(0) as u32;
Ok(VideoMetadata {
duration,
width,
height,
})
}
/// Generate thumbnail file from video, capturing a frame at the specified timestamp.
///
/// # Arguments
/// * `video_full_path` - The full path to the video file.
/// * `timestamp` - The timestamp (in seconds) to capture the thumbnail.
///
/// # Returns
/// The path to the generated thumbnail image.
pub async fn generate_thumbnail(video_full_path: &Path, timestamp: f64) -> Result<PathBuf, String> {
let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
let thumbnail_full_path = video_full_path.with_extension("jpg");
let output = ffmpeg_process
.args(["-i", &format!("{}", video_full_path.display())])
.args(["-ss", &timestamp.to_string()])
.args(["-vframes", "1"])
.args(["-y", thumbnail_full_path.to_str().unwrap()])
.output()
.await
.map_err(|e| format!("生成缩略图失败: {}", e))?;
if !output.status.success() {
return Err(format!(
"ffmpeg生成缩略图失败: {}",
String::from_utf8_lossy(&output.stderr)
));
}
// Log information about the generated thumbnail
if let Ok(metadata) = std::fs::metadata(&thumbnail_full_path) {
log::info!(
"生成缩略图完成: {} (文件大小: {} bytes)",
thumbnail_full_path.display(),
metadata.len()
);
} else {
log::info!("生成缩略图完成: {}", thumbnail_full_path.display());
}
Ok(thumbnail_full_path)
}
// Generic helper that runs an FFmpeg conversion
pub async fn execute_ffmpeg_conversion(
mut cmd: tokio::process::Command,
reporter: &ProgressReporter,
mode_name: &str,
) -> Result<(), String> {
use async_ffmpeg_sidecar::event::FfmpegEvent;
use async_ffmpeg_sidecar::log_parser::FfmpegLogParser;
use std::process::Stdio;
use tokio::io::BufReader;
let mut child = cmd
.stderr(Stdio::piped())
.spawn()
.map_err(|e| format!("启动FFmpeg进程失败: {}", e))?;
let stderr = child.stderr.take().unwrap();
let reader = BufReader::new(stderr);
let mut parser = FfmpegLogParser::new(reader);
let mut conversion_error = None;
while let Ok(event) = parser.parse_next_event().await {
match event {
FfmpegEvent::Progress(p) => {
reporter.update(&format!("正在转换视频格式... {} ({})", p.time, mode_name));
}
FfmpegEvent::LogEOF => break,
FfmpegEvent::Log(level, content) => {
if matches!(level, async_ffmpeg_sidecar::event::LogLevel::Error)
&& content.contains("Error")
{
conversion_error = Some(content);
}
}
FfmpegEvent::Error(e) => {
conversion_error = Some(e);
}
_ => {} // Ignore other event types
}
}
let status = child
.wait()
.await
.map_err(|e| format!("等待FFmpeg进程失败: {}", e))?;
if !status.success() {
let error_msg = conversion_error
.unwrap_or_else(|| format!("FFmpeg退出码: {}", status.code().unwrap_or(-1)));
return Err(format!("视频格式转换失败 ({}): {}", mode_name, error_msg));
}
reporter.update(&format!("视频格式转换完成 100% ({})", mode_name));
Ok(())
}
// Try stream-copy conversion (lossless and fast)
pub async fn try_stream_copy_conversion(
source: &Path,
dest: &Path,
reporter: &ProgressReporter,
) -> Result<(), String> {
reporter.update("正在转换视频格式... 0% (无损模式)");
// Build the ffmpeg command - stream copy mode
let mut cmd = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
cmd.creation_flags(0x08000000); // CREATE_NO_WINDOW
cmd.args([
"-i",
&source.to_string_lossy(),
"-c:v",
"copy", // Copy the video stream as-is, zero loss
"-c:a",
"copy", // Copy the audio stream as-is, zero loss
"-avoid_negative_ts",
"make_zero", // Fix timestamp issues
"-movflags",
"+faststart", // Optimize for web playback
"-progress",
"pipe:2", // Report progress on stderr
"-y", // Overwrite the output file
&dest.to_string_lossy(),
]);
execute_ffmpeg_conversion(cmd, reporter, "无损转换").await
}
// High-quality re-encode conversion (good compatibility, high quality)
pub async fn try_high_quality_conversion(
source: &Path,
dest: &Path,
reporter: &ProgressReporter,
) -> Result<(), String> {
reporter.update("正在转换视频格式... 0% (高质量模式)");
// Build the ffmpeg command - high-quality re-encode
let mut cmd = tokio::process::Command::new(ffmpeg_path());
#[cfg(target_os = "windows")]
cmd.creation_flags(0x08000000); // CREATE_NO_WINDOW
cmd.args([
"-i",
&source.to_string_lossy(),
"-c:v",
"libx264", // H.264 encoder
"-preset",
"slow", // Slow preset for better compression efficiency
"-crf",
"18", // High quality (range 18-23; lower means higher quality)
"-c:a",
"aac", // AAC audio encoder
"-b:a",
"192k", // High audio bitrate
"-avoid_negative_ts",
"make_zero", // Fix timestamp issues
"-movflags",
"+faststart", // Optimize for web playback
"-progress",
"pipe:2", // Report progress on stderr
"-y", // Overwrite the output file
&dest.to_string_lossy(),
]);
execute_ffmpeg_conversion(cmd, reporter, "高质量转换").await
}
// Video format conversion with progress reporting (smart quality-preserving strategy)
pub async fn convert_video_format(
source: &Path,
dest: &Path,
reporter: &ProgressReporter,
) -> Result<(), String> {
// Try lossless stream copy first; fall back to high-quality re-encoding if it fails
match try_stream_copy_conversion(source, dest, reporter).await {
Ok(()) => Ok(()),
Err(stream_copy_error) => {
reporter.update("流复制失败,使用高质量重编码模式...");
log::warn!(
"Stream copy failed: {}, falling back to re-encoding",
stream_copy_error
);
try_high_quality_conversion(source, dest, reporter).await
}
}
}
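A hedged call-site sketch for the fallback strategy above (paths and the reporter binding are assumptions, not from the diff):

// Convert an imported FLV to MP4: lossless stream copy first, re-encode on failure
convert_video_format(
    Path::new("imports/source.flv"),
    Path::new("imports/source.mp4"),
    &reporter, // an existing ProgressReporter
)
.await?;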
// tests
#[cfg(test)]
mod tests {
use super::*;
// Tests for the Range struct
#[test]
fn test_range_creation() {
let range = Range {
start: 10.0,
end: 30.0,
};
assert_eq!(range.start, 10.0);
assert_eq!(range.end, 30.0);
assert_eq!(range.duration(), 20.0);
}
#[test]
fn test_range_duration() {
let range = Range {
start: 0.0,
end: 60.0,
};
assert_eq!(range.duration(), 60.0);
let range2 = Range {
start: 15.5,
end: 45.5,
};
assert_eq!(range2.duration(), 30.0);
}
#[test]
fn test_range_display() {
let range = Range {
start: 5.0,
end: 25.0,
};
assert_eq!(range.to_string(), "[5, 25]");
}
#[test]
fn test_range_edge_cases() {
let zero_range = Range {
start: 0.0,
end: 0.0,
};
assert_eq!(zero_range.duration(), 0.0);
let negative_start = Range {
start: -5.0,
end: 10.0,
};
assert_eq!(negative_start.duration(), 15.0);
let large_range = Range {
start: 1000.0,
end: 2000.0,
};
assert_eq!(large_range.duration(), 1000.0);
}
// Test video metadata extraction
#[tokio::test]
async fn test_extract_video_metadata() {
let test_video = Path::new("tests/video/test.mp4");
if test_video.exists() {
let metadata = extract_video_metadata(test_video).await.unwrap();
assert!(metadata.duration > 0.0);
assert!(metadata.width > 0);
assert!(metadata.height > 0);
}
}
// Test audio duration retrieval
#[tokio::test]
async fn test_get_audio_duration() {
let test_audio = Path::new("tests/audio/test.wav");
if test_audio.exists() {
let duration = get_audio_duration(test_audio).await.unwrap();
assert!(duration > 0);
}
}
// Test video resolution retrieval
#[tokio::test]
async fn test_get_video_resolution() {
let file = Path::new("tests/video/h_test.m4s");
if file.exists() {
let resolution = get_video_resolution(file.to_str().unwrap()).await.unwrap();
assert_eq!(resolution, "1920x1080");
}
}
// Test thumbnail generation
#[tokio::test]
async fn test_generate_thumbnail() {
let file = Path::new("tests/video/test.mp4");
if file.exists() {
let thumbnail_file = generate_thumbnail(file, 0.0).await.unwrap();
assert!(thumbnail_file.exists());
assert_eq!(thumbnail_file.extension().unwrap(), "jpg");
// clean up
let _ = std::fs::remove_file(thumbnail_file);
}
}
// Test the FFmpeg version check
#[tokio::test]
async fn test_check_ffmpeg() {
let result = check_ffmpeg().await;
match result {
Ok(version) => {
assert!(!version.is_empty());
// The FFmpeg version string may not contain the word "ffmpeg", so check for digits instead
assert!(version.chars().any(|c| c.is_ascii_digit()));
}
Err(_) => {
// FFmpeg may not be installed; that is fine
println!("FFmpeg not available for testing");
}
}
}
// Test the generic FFmpeg command helper
#[tokio::test]
async fn test_generic_ffmpeg_command() {
let result = generic_ffmpeg_command(&["-version"]).await;
match result {
Ok(_output) => {
// Output may be empty or may not contain "ffmpeg"; just verify the call succeeds
println!("FFmpeg command executed successfully");
}
Err(_) => {
// FFmpeg may not be installed; that is fine
println!("FFmpeg not available for testing");
}
}
}
// Test subtitle generation error handling
#[tokio::test]
async fn test_generate_video_subtitle_errors() {
let test_file = Path::new("tests/video/test.mp4");
// Whisper type - model not configured
let result =
generate_video_subtitle(None, test_file, "whisper", "", "", "", "", "zh").await;
assert!(result.is_err());
assert!(result.unwrap_err().contains("Whisper model not configured"));
// Whisper Online type - API key not configured
let result =
generate_video_subtitle(None, test_file, "whisper_online", "", "", "", "", "zh").await;
assert!(result.is_err());
assert!(result.unwrap_err().contains("API key not configured"));
// Unknown generator type
let result =
generate_video_subtitle(None, test_file, "unknown_type", "", "", "", "", "").await;
assert!(result.is_err());
assert!(result
.unwrap_err()
.contains("Unknown subtitle generator type"));
}
// Test path construction helpers
#[test]
fn test_ffmpeg_paths() {
let ffmpeg_path = ffmpeg_path();
let ffprobe_path = ffprobe_path();
#[cfg(windows)]
{
assert_eq!(ffmpeg_path.extension().unwrap(), "exe");
assert_eq!(ffprobe_path.extension().unwrap(), "exe");
}
#[cfg(not(windows))]
{
assert_eq!(ffmpeg_path.file_name().unwrap(), "ffmpeg");
assert_eq!(ffprobe_path.file_name().unwrap(), "ffprobe");
}
}
// Test error handling
#[tokio::test]
async fn test_error_handling() {
// A file that does not exist
let non_existent_file = Path::new("tests/nonexistent/test.mp4");
let result = extract_video_metadata(non_existent_file).await;
assert!(result.is_err());
let result = get_video_resolution("tests/nonexistent/test.mp4").await;
assert!(result.is_err());
}
// Test filename and path handling
#[test]
fn test_filename_processing() {
let test_file = Path::new("tests/video/test.mp4");
// Subtitle filename generation
let subtitle_filename = format!(
"{}{}",
constants::PREFIX_SUBTITLE,
test_file.file_name().unwrap().to_str().unwrap()
);
assert!(subtitle_filename.starts_with(constants::PREFIX_SUBTITLE));
assert!(subtitle_filename.contains("test.mp4"));
// Danmaku filename generation
let danmu_filename = format!(
"{}{}",
constants::PREFIX_DANMAKU,
test_file.file_name().unwrap().to_str().unwrap()
);
assert!(danmu_filename.starts_with(constants::PREFIX_DANMAKU));
assert!(danmu_filename.contains("test.mp4"));
}
// Test the audio chunk directory structure
#[test]
fn test_audio_chunk_directory_structure() {
let test_file = Path::new("tests/audio/test.wav");
let output_path = test_file.with_extension("wav");
let output_dir = output_path.parent().unwrap();
let base_name = output_path.file_stem().unwrap().to_str().unwrap();
let chunk_dir = output_dir.join(format!("{}_chunks", base_name));
assert!(chunk_dir.to_string_lossy().contains("_chunks"));
assert!(chunk_dir.to_string_lossy().contains("test"));
}
}

View File

@@ -39,12 +39,20 @@ pub async fn add_account(
.await?;
} else if platform == "douyin" {
// Get user info from Douyin API
let douyin_client = crate::recorder::douyin::client::DouyinClient::new(&account);
let douyin_client = crate::recorder::douyin::client::DouyinClient::new(
&state.config.read().await.user_agent,
&account,
);
match douyin_client.get_user_info().await {
Ok(user_info) => {
// For Douyin, use sec_uid as the primary identifier in id_str field
let avatar_url = user_info.avatar_thumb.url_list.first().cloned().unwrap_or_default();
let avatar_url = user_info
.avatar_thumb
.url_list
.first()
.cloned()
.unwrap_or_default();
state
.db
.update_account_with_id_str(

View File

@@ -14,10 +14,27 @@ pub async fn get_config(state: state_type!()) -> Result<Config, ()> {
#[allow(dead_code)]
pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<(), String> {
let old_cache_path = state.config.read().await.cache.clone();
log::info!(
"Try to set cache path: {} -> {}",
old_cache_path,
cache_path
);
if old_cache_path == cache_path {
return Ok(());
}
let old_cache_path_obj = std::path::Path::new(&old_cache_path);
let new_cache_path_obj = std::path::Path::new(&cache_path);
// check if new cache path is under old cache path
if new_cache_path_obj.starts_with(old_cache_path_obj) {
log::error!(
"New cache path is under old cache path: {} -> {}",
old_cache_path,
cache_path
);
return Err("New cache path cannot be under old cache path".to_string());
}
state.recorder_manager.set_migrating(true).await;
// stop and clear all recorders
state.recorder_manager.stop_all().await;
@@ -52,9 +69,11 @@ pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<
if entry.is_dir() {
if let Err(e) = crate::handlers::utils::copy_dir_all(entry, &new_entry) {
log::error!("Copy old cache to new cache error: {}", e);
return Err(e.to_string());
}
} else if let Err(e) = std::fs::copy(entry, &new_entry) {
log::error!("Copy old cache to new cache error: {}", e);
return Err(e.to_string());
}
}
@@ -79,12 +98,30 @@ pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<
#[cfg_attr(feature = "gui", tauri::command)]
#[allow(dead_code)]
pub async fn set_output_path(state: state_type!(), output_path: String) -> Result<(), ()> {
pub async fn set_output_path(state: state_type!(), output_path: String) -> Result<(), String> {
let mut config = state.config.write().await;
let old_output_path = config.output.clone();
log::info!(
"Try to set output path: {} -> {}",
old_output_path,
output_path
);
if old_output_path == output_path {
return Ok(());
}
let old_output_path_obj = std::path::Path::new(&old_output_path);
let new_output_path_obj = std::path::Path::new(&output_path);
// check if new output path is under old output path
if new_output_path_obj.starts_with(old_output_path_obj) {
log::error!(
"New output path is under old output path: {} -> {}",
old_output_path,
output_path
);
return Err("New output path cannot be under old output path".to_string());
}
// list all file and folder in old output
let mut old_output_entries = vec![];
if let Ok(entries) = std::fs::read_dir(&old_output_path) {
@@ -103,10 +140,12 @@ pub async fn set_output_path(state: state_type!(), output_path: String) -> Resul
// if entry is a folder
if entry.is_dir() {
if let Err(e) = crate::handlers::utils::copy_dir_all(entry, &new_entry) {
log::error!("Copy old cache to new cache error: {}", e);
log::error!("Copy old output to new output error: {}", e);
return Err(e.to_string());
}
} else if let Err(e) = std::fs::copy(entry, &new_entry) {
log::error!("Copy old cache to new cache error: {}", e);
log::error!("Copy old output to new output error: {}", e);
return Err(e.to_string());
}
}
@@ -114,10 +153,10 @@ pub async fn set_output_path(state: state_type!(), output_path: String) -> Resul
for entry in old_output_entries {
if entry.is_dir() {
if let Err(e) = std::fs::remove_dir_all(&entry) {
log::error!("Remove old cache error: {}", e);
log::error!("Remove old output error: {}", e);
}
} else if let Err(e) = std::fs::remove_file(&entry) {
log::error!("Remove old cache error: {}", e);
log::error!("Remove old output error: {}", e);
}
}
@@ -245,3 +284,33 @@ pub async fn update_whisper_language(
state.config.write().await.save();
Ok(())
}
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_user_agent(state: state_type!(), user_agent: String) -> Result<(), ()> {
log::info!("Updating user agent to {}", user_agent);
state.config.write().await.set_user_agent(&user_agent);
Ok(())
}
#[cfg_attr(feature = "gui", tauri::command)]
#[cfg(feature = "gui")]
pub async fn update_cleanup_source_flv(state: state_type!(), cleanup: bool) -> Result<(), ()> {
log::info!("Updating cleanup source FLV after import to {}", cleanup);
state.config.write().await.set_cleanup_source_flv(cleanup);
Ok(())
}
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_webhook_url(state: state_type!(), webhook_url: String) -> Result<(), ()> {
log::info!("Updating webhook url to {}", webhook_url);
let _ = state
.webhook_poster
.update_config(crate::webhook::poster::WebhookConfig {
url: webhook_url.clone(),
..Default::default()
})
.await;
state.config.write().await.webhook_url = webhook_url;
state.config.write().await.save();
Ok(())
}
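Note: the nesting guards added to set_cache_path and set_output_path rely on std::path::Path::starts_with, which compares whole path components rather than raw string prefixes. A minimal standalone sketch of the check (hypothetical validate_move helper, not part of this diff):

    use std::path::Path;

    // Reject a destination inside the source tree; copying the source into
    // one of its own subdirectories would otherwise recurse during migration.
    fn validate_move(old: &str, new: &str) -> Result<(), String> {
        if Path::new(new).starts_with(Path::new(old)) {
            return Err("New path cannot be under old path".to_string());
        }
        Ok(())
    }

    fn main() {
        // starts_with compares whole components: "/data/cache2" is not under
        // "/data/cache", but "/data/cache/sub" is.
        assert!(validate_move("/data/cache", "/data/cache2").is_ok());
        assert!(validate_move("/data/cache", "/data/cache/sub").is_err());
    }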

View File

@@ -7,6 +7,7 @@ use crate::recorder::RecorderInfo;
use crate::recorder_manager::RecorderList;
use crate::state::State;
use crate::state_type;
use crate::webhook::events;
#[cfg(feature = "gui")]
use tauri::State as TauriState;
@@ -24,6 +25,7 @@ pub async fn add_recorder(
state: state_type!(),
platform: String,
room_id: u64,
extra: String,
) -> Result<RecorderRow, String> {
log::info!("Add recorder: {} {}", platform, room_id);
let platform = PlatformType::from_str(&platform).unwrap();
@@ -50,15 +52,23 @@ pub async fn add_recorder(
match account {
Ok(account) => match state
.recorder_manager
.add_recorder(&account, platform, room_id, true)
.add_recorder(&account, platform, room_id, &extra, true)
.await
{
Ok(()) => {
let room = state.db.add_recorder(platform, room_id).await?;
let room = state.db.add_recorder(platform, room_id, &extra).await?;
state
.db
.new_message("添加直播间", &format!("添加了新直播间 {}", room_id))
.await?;
// post webhook event
let event = events::new_webhook_event(
events::RECORDER_ADDED,
events::Payload::Recorder(room.clone()),
);
if let Err(e) = state.webhook_poster.post_event(&event).await {
log::error!("Post webhook event error: {}", e);
}
Ok(room)
}
Err(e) => {
@@ -86,11 +96,19 @@ pub async fn remove_recorder(
.remove_recorder(platform, room_id)
.await
{
Ok(()) => {
Ok(recorder) => {
state
.db
.new_message("移除直播间", &format!("移除了直播间 {}", room_id))
.await?;
// post webhook event
let event = events::new_webhook_event(
events::RECORDER_REMOVED,
events::Payload::Recorder(recorder),
);
if let Err(e) = state.webhook_poster.post_event(&event).await {
log::error!("Post webhook event error: {}", e);
}
log::info!("Removed recorder: {} {}", platform.as_str(), room_id);
Ok(())
}
@@ -120,8 +138,21 @@ pub async fn get_room_info(
}
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_archives(state: state_type!(), room_id: u64) -> Result<Vec<RecordRow>, String> {
Ok(state.recorder_manager.get_archives(room_id).await?)
pub async fn get_archive_disk_usage(state: state_type!()) -> Result<u64, String> {
Ok(state.recorder_manager.get_archive_disk_usage().await?)
}
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_archives(
state: state_type!(),
room_id: u64,
offset: u64,
limit: u64,
) -> Result<Vec<RecordRow>, String> {
Ok(state
.recorder_manager
.get_archives(room_id, offset, limit)
.await?)
}
#[cfg_attr(feature = "gui", tauri::command)]
@@ -147,7 +178,10 @@ pub async fn get_archive_subtitle(
if platform.is_none() {
return Err("Unsupported platform".to_string());
}
Ok(state.recorder_manager.get_archive_subtitle(platform.unwrap(), room_id, &live_id).await?)
Ok(state
.recorder_manager
.get_archive_subtitle(platform.unwrap(), room_id, &live_id)
.await?)
}
#[cfg_attr(feature = "gui", tauri::command)]
@@ -161,7 +195,10 @@ pub async fn generate_archive_subtitle(
if platform.is_none() {
return Err("Unsupported platform".to_string());
}
Ok(state.recorder_manager.generate_archive_subtitle(platform.unwrap(), room_id, &live_id).await?)
Ok(state
.recorder_manager
.generate_archive_subtitle(platform.unwrap(), room_id, &live_id)
.await?)
}
#[cfg_attr(feature = "gui", tauri::command)]
@@ -175,7 +212,7 @@ pub async fn delete_archive(
if platform.is_none() {
return Err("Unsupported platform".to_string());
}
state
let to_delete = state
.recorder_manager
.delete_archive(platform.unwrap(), room_id, &live_id)
.await?;
@@ -186,6 +223,49 @@ pub async fn delete_archive(
&format!("删除了房间 {} 的历史缓存 {}", room_id, live_id),
)
.await?;
// post webhook event
let event =
events::new_webhook_event(events::ARCHIVE_DELETED, events::Payload::Archive(to_delete));
if let Err(e) = state.webhook_poster.post_event(&event).await {
log::error!("Post webhook event error: {}", e);
}
Ok(())
}
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn delete_archives(
state: state_type!(),
platform: String,
room_id: u64,
live_ids: Vec<String>,
) -> Result<(), String> {
let platform = PlatformType::from_str(&platform);
if platform.is_none() {
return Err("Unsupported platform".to_string());
}
let to_deletes = state
.recorder_manager
.delete_archives(
platform.unwrap(),
room_id,
&live_ids.iter().map(|s| s.as_str()).collect::<Vec<&str>>(),
)
.await?;
state
.db
.new_message(
"删除历史缓存",
&format!("删除了房间 {} 的历史缓存 {}", room_id, live_ids.join(", ")),
)
.await?;
for to_delete in to_deletes {
// post webhook event
let event =
events::new_webhook_event(events::ARCHIVE_DELETED, events::Payload::Archive(to_delete));
if let Err(e) = state.webhook_poster.post_event(&event).await {
log::error!("Post webhook event error: {}", e);
}
}
Ok(())
}
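Note: the webhook::poster module itself is collapsed in this diff. A minimal sketch of the fire-and-forget pattern these handlers rely on; the struct shape and field names below are assumptions for illustration, the real payloads live in crate::webhook::events:

    use serde::Serialize;

    // Field and event names are assumptions, not the project's actual schema.
    #[derive(Serialize)]
    struct WebhookEvent<T: Serialize> {
        event: String, // e.g. "archive.deleted"
        payload: T,
    }

    async fn post_event<T: Serialize>(
        client: &reqwest::Client,
        url: &str,
        event: &WebhookEvent<T>,
    ) -> Result<(), String> {
        client
            .post(url)
            .json(event)
            .send()
            .await
            .map_err(|e| e.to_string())?
            .error_for_status()
            .map_err(|e| e.to_string())?;
        Ok(())
    }

The handlers above only log a post failure instead of propagating it, so recording and deletion never block on an unreachable webhook endpoint.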

View File

@@ -301,4 +301,129 @@ pub async fn list_folder(_state: state_type!(), path: String) -> Result<Vec<Stri
files.push(entry.path().to_str().unwrap().to_string());
}
Ok(files)
}
}
/// Advanced filename sanitizer that handles dangerous and control characters
///
/// Intended for scenarios that require strict filename sanitization; supports Chinese characters
///
/// # Arguments
/// - `name`: the filename to sanitize
/// - `max_length`: maximum length limit (defaults to 100 characters)
///
/// # Returns
/// A fully sanitized, safe filename
#[cfg(feature = "headless")]
pub fn sanitize_filename_advanced(name: &str, max_length: Option<usize>) -> String {
let max_len = max_length.unwrap_or(100);
// First, sanitize every character
let cleaned: String = name
.chars()
.map(|c| match c {
// Characters dangerous to filesystems
'\\' | '/' | ':' | '*' | '?' | '"' | '<' | '>' | '|' => '_',
// Control and other invisible characters
c if c.is_control() => '_',
// Keep only safe characters (whitelist)
c if c.is_alphanumeric()
|| c == ' '
|| c == '.'
|| c == '-'
|| c == '_'
|| c == '('
|| c == ')'
|| c == '['
|| c == ']'
|| c == '《'
|| c == '》'
|| c == '（'
|| c == '）' =>
{
c
}
// Replace everything else with an underscore
_ => '_',
})
.collect();
// If the cleaned name is already within the limit, return it directly
if cleaned.chars().count() <= max_len {
return cleaned;
}
// Smart truncation: preserve the file extension
if let Some(dot_pos) = cleaned.rfind('.') {
let extension = &cleaned[dot_pos..];
let main_part = &cleaned[..dot_pos];
// Make sure the extension is not too long (at most 10 characters, including the dot)
if extension.chars().count() <= 10 {
let ext_len = extension.chars().count();
let available_for_main = max_len.saturating_sub(ext_len);
if available_for_main > 0 {
let truncated_main: String = main_part.chars().take(available_for_main).collect();
return format!("{}{}", truncated_main, extension);
}
}
}
// If there is no extension, or it is too long, truncate directly
cleaned.chars().take(max_len).collect()
}
#[cfg(test)]
mod tests {
#[test]
#[cfg(feature = "headless")]
fn test_sanitize_filename_advanced() {
use super::sanitize_filename_advanced;
assert_eq!(
sanitize_filename_advanced("test<>file.txt", None),
"test__file.txt"
);
assert_eq!(sanitize_filename_advanced("文件名.txt", None), "文件名.txt");
assert_eq!(
sanitize_filename_advanced("《视频》(高清).mp4", None),
"《视频》(高清).mp4"
);
assert_eq!(
sanitize_filename_advanced("file\x00with\x01control.txt", None),
"file_with_control.txt"
);
// Test whitespace handling (the function does not strip whitespace automatically)
assert_eq!(
sanitize_filename_advanced(" .hidden_file.txt ", None),
" .hidden_file.txt "
);
assert_eq!(
sanitize_filename_advanced(" normal_file.mp4 ", None),
" normal_file.mp4 "
);
// Test special-character replacement
assert_eq!(
sanitize_filename_advanced("file@#$%^&.txt", None),
"file______.txt"
);
// Test the length limit - no extension
let long_name = "测试".repeat(60);
let result = sanitize_filename_advanced(&long_name, Some(10));
assert_eq!(result.chars().count(), 10);
// Test the length limit - with extension
let long_name_with_ext = format!("{}.txt", "测试".repeat(60));
let result = sanitize_filename_advanced(&long_name_with_ext, Some(10));
assert!(result.ends_with(".txt"));
assert_eq!(result.chars().count(), 10); // 6 characters from the name + ".txt" (4 characters)
// Test that short filenames are not truncated
let short_name = "test.mp4";
let result = sanitize_filename_advanced(short_name, Some(50));
assert_eq!(result, "test.mp4");
}
}
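Note on the length checks in these tests: Rust's String::len counts UTF-8 bytes, not characters, which is why sanitize_filename_advanced measures and truncates with chars(). A quick illustration:

    fn main() {
        let s = "测试.txt";
        assert_eq!(s.chars().count(), 6); // 2 CJK characters + ".txt"
        assert_eq!(s.len(), 10); // each CJK character is 3 bytes in UTF-8
        // Truncating by chars always yields valid UTF-8; slicing by bytes
        // (e.g. &s[..5]) would panic in the middle of a codepoint.
        let truncated: String = s.chars().take(2).collect();
        assert_eq!(truncated, "测试");
    }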

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -1,15 +1,14 @@
// Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]
mod archive_migration;
mod config;
mod constants;
mod danmu2ass;
mod database;
mod ffmpeg;
mod handlers;
#[cfg(feature = "headless")]
mod http_server;
#[cfg(feature = "headless")]
mod migration;
mod progress_manager;
mod progress_reporter;
@@ -19,12 +18,15 @@ mod state;
mod subtitle_generator;
#[cfg(feature = "gui")]
mod tray;
mod webhook;
use archive_migration::try_rebuild_archives;
use async_std::fs;
use chrono::Utc;
use config::Config;
use database::Database;
use migration::migration_methods::try_convert_clip_covers;
use migration::migration_methods::try_convert_live_covers;
use migration::migration_methods::try_rebuild_archives;
use recorder::bilibili::client::BiliClient;
use recorder::PlatformType;
use recorder_manager::RecorderManager;
@@ -117,6 +119,9 @@ async fn setup_logging(log_dir: &Path) -> Result<(), Box<dyn std::error::Error>>
),
])?;
// logging current package version
log::info!("Current version: {}", env!("CARGO_PKG_VERSION"));
Ok(())
}
@@ -162,6 +167,32 @@ fn get_migrations() -> Vec<Migration> {
sql: r#"ALTER TABLE accounts ADD COLUMN id_str TEXT;"#,
kind: MigrationKind::Up,
},
// add extra column to recorders
Migration {
version: 6,
description: "add_extra_column_to_recorders",
sql: r#"ALTER TABLE recorders ADD COLUMN extra TEXT;"#,
kind: MigrationKind::Up,
},
// add indexes
Migration {
version: 7,
description: "add_indexes",
sql: r#"
CREATE INDEX idx_records_live_id ON records (room_id, live_id);
CREATE INDEX idx_records_created_at ON records (room_id, created_at);
CREATE INDEX idx_videos_room_id ON videos (room_id);
CREATE INDEX idx_videos_created_at ON videos (created_at);
"#,
kind: MigrationKind::Up,
},
// add note column for video
Migration {
version: 8,
description: "add_note_column_for_video",
sql: r#"ALTER TABLE videos ADD COLUMN note TEXT;"#,
kind: MigrationKind::Up,
},
]
}
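Note: the headless migration runner lives in the migration module, which is collapsed in this diff, so the following is only a sketch of how version-gated migrations are typically applied with sqlx; the user_version bookkeeping is an assumption, not necessarily the project's actual scheme:

    use sqlx::Executor;

    // Sketch only: run every migration newer than the stored schema version.
    // Assumes Migration.sql is a &'static str, as the struct above suggests.
    async fn apply_pending(
        pool: &sqlx::SqlitePool,
        migrations: &[Migration],
        current_version: i64,
    ) -> Result<(), sqlx::Error> {
        for m in migrations.iter().filter(|m| m.version > current_version) {
            // Unprepared execute, since one migration (version 7 above)
            // batches several CREATE INDEX statements.
            pool.execute(m.sql).await?;
            // Persist progress so each migration is applied exactly once.
            pool.execute(format!("PRAGMA user_version = {}", m.version).as_str())
                .await?;
        }
        Ok(())
    }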
@@ -209,7 +240,7 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
return Err(e.into());
}
};
let client = Arc::new(BiliClient::new()?);
let client = Arc::new(BiliClient::new(&config.user_agent)?);
let config = Arc::new(RwLock::new(config));
let db = Arc::new(Database::new());
// connect to sqlite database
@@ -239,7 +270,14 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
let progress_manager = Arc::new(ProgressManager::new());
let emitter = EventEmitter::new(progress_manager.get_event_sender());
let recorder_manager = Arc::new(RecorderManager::new(emitter, db.clone(), config.clone()));
let webhook_poster =
webhook::poster::create_webhook_poster(&config.read().await.webhook_url, None).unwrap();
let recorder_manager = Arc::new(RecorderManager::new(
emitter,
db.clone(),
config.clone(),
webhook_poster.clone(),
));
// Update account infos for headless mode
let accounts = db.get_accounts().await?;
@@ -268,7 +306,7 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
} else if platform == PlatformType::Douyin {
// Update Douyin account info
use crate::recorder::douyin::client::DouyinClient;
let douyin_client = DouyinClient::new(&account);
let douyin_client = DouyinClient::new(&config.read().await.user_agent, &account);
match douyin_client.get_user_info().await {
Ok(user_info) => {
let avatar_url = user_info
@@ -298,11 +336,14 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
}
let _ = try_rebuild_archives(&db, config.read().await.cache.clone().into()).await;
let _ = try_convert_live_covers(&db, config.read().await.cache.clone().into()).await;
let _ = try_convert_clip_covers(&db, config.read().await.output.clone().into()).await;
Ok(State {
db,
client,
config,
webhook_poster,
recorder_manager,
progress_manager,
readonly: args.readonly,
@@ -331,7 +372,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
}
};
let client = Arc::new(BiliClient::new()?);
let client = Arc::new(BiliClient::new(&config.user_agent)?);
let config = Arc::new(RwLock::new(config));
let config_clone = config.clone();
let dbs = app.state::<tauri_plugin_sql::DbInstances>().inner();
@@ -346,12 +387,15 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
};
db_clone.set(sqlite_pool.unwrap().clone()).await;
db_clone.finish_pending_tasks().await?;
let webhook_poster =
webhook::poster::create_webhook_poster(&config.read().await.webhook_url, None).unwrap();
let recorder_manager = Arc::new(RecorderManager::new(
app.app_handle().clone(),
emitter,
db.clone(),
config.clone(),
webhook_poster.clone(),
));
let accounts = db_clone.get_accounts().await?;
@@ -363,10 +407,11 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
config,
recorder_manager,
app_handle: app.handle().clone(),
webhook_poster,
});
}
// update account infos
// update account info
for account in accounts {
let platform = PlatformType::from_str(&account.platform).unwrap();
@@ -392,7 +437,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
} else if platform == PlatformType::Douyin {
// Update Douyin account info
use crate::recorder::douyin::client::DouyinClient;
let douyin_client = DouyinClient::new(&account);
let douyin_client = DouyinClient::new(&config_clone.read().await.user_agent, &account);
match douyin_client.get_user_info().await {
Ok(user_info) => {
let avatar_url = user_info
@@ -423,9 +468,12 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
// try to rebuild archive table
let cache_path = config_clone.read().await.cache.clone();
if let Err(e) = try_rebuild_archives(&db_clone, cache_path.into()).await {
let output_path = config_clone.read().await.output.clone();
if let Err(e) = try_rebuild_archives(&db_clone, cache_path.clone().into()).await {
log::warn!("Rebuilding archive table failed: {}", e);
}
let _ = try_convert_live_covers(&db_clone, cache_path.into()).await;
let _ = try_convert_clip_covers(&db_clone, output_path.into()).await;
Ok(State {
db,
@@ -433,6 +481,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
config,
recorder_manager,
app_handle: app.handle().clone(),
webhook_poster,
})
}
@@ -501,6 +550,9 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
crate::handlers::config::update_auto_generate,
crate::handlers::config::update_status_check_interval,
crate::handlers::config::update_whisper_language,
crate::handlers::config::update_user_agent,
crate::handlers::config::update_cleanup_source_flv,
crate::handlers::config::update_webhook_url,
crate::handlers::message::get_messages,
crate::handlers::message::read_message,
crate::handlers::message::delete_message,
@@ -508,11 +560,13 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
crate::handlers::recorder::add_recorder,
crate::handlers::recorder::remove_recorder,
crate::handlers::recorder::get_room_info,
crate::handlers::recorder::get_archive_disk_usage,
crate::handlers::recorder::get_archives,
crate::handlers::recorder::get_archive,
crate::handlers::recorder::get_archive_subtitle,
crate::handlers::recorder::generate_archive_subtitle,
crate::handlers::recorder::delete_archive,
crate::handlers::recorder::delete_archives,
crate::handlers::recorder::get_danmu_record,
crate::handlers::recorder::export_danmu,
crate::handlers::recorder::send_danmaku,
@@ -534,8 +588,17 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
crate::handlers::video::generate_video_subtitle,
crate::handlers::video::get_video_subtitle,
crate::handlers::video::update_video_subtitle,
crate::handlers::video::update_video_note,
crate::handlers::video::encode_video_subtitle,
crate::handlers::video::generic_ffmpeg_command,
crate::handlers::video::import_external_video,
crate::handlers::video::batch_import_external_videos,
crate::handlers::video::clip_video,
crate::handlers::video::get_file_size,
crate::handlers::video::scan_imported_directory,
crate::handlers::video::import_file_in_place,
crate::handlers::video::batch_import_in_place,
crate::handlers::video::get_import_progress,
crate::handlers::task::get_tasks,
crate::handlers::task::delete_task,
crate::handlers::utils::show_in_folder,
@@ -553,7 +616,7 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
fn main() -> Result<(), Box<dyn std::error::Error>> {
let _ = fix_path_env::fix();
let builder = tauri::Builder::default();
let builder = tauri::Builder::default().plugin(tauri_plugin_deep_link::init());
let builder = setup_plugins(builder);
let builder = setup_event_handlers(builder);
let builder = setup_invoke_handlers(builder);

View File

@@ -0,0 +1,134 @@
use std::path::PathBuf;
use std::sync::Arc;
use base64::Engine;
use chrono::Utc;
use crate::database::Database;
use crate::recorder::PlatformType;
pub async fn try_rebuild_archives(
db: &Arc<Database>,
cache_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
let rooms = db.get_recorders().await?;
for room in rooms {
let room_id = room.room_id;
let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
let mut files = tokio::fs::read_dir(room_cache_path).await?;
while let Some(file) = files.next_entry().await? {
if file.file_type().await?.is_dir() {
// use folder name as live_id
let live_id = file.file_name();
let live_id = live_id.to_str().unwrap();
// check if live_id is in db
let record = db.get_record(room_id, live_id).await;
if record.is_ok() {
continue;
}
// get created_at from folder metadata
let metadata = file.metadata().await?;
let created_at = metadata.created();
if created_at.is_err() {
continue;
}
let created_at = created_at.unwrap();
let created_at = chrono::DateTime::<Utc>::from(created_at)
.format("%Y-%m-%dT%H:%M:%S.%fZ")
.to_string();
// create a record for this live_id
let record = db
.add_record(
PlatformType::from_str(room.platform.as_str()).unwrap(),
live_id,
room_id,
&format!("UnknownLive {}", live_id),
None,
Some(&created_at),
)
.await?;
log::info!("rebuild archive {:?}", record);
}
}
}
Ok(())
}
pub async fn try_convert_live_covers(
db: &Arc<Database>,
cache_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
let rooms = db.get_recorders().await?;
for room in rooms {
let room_id = room.room_id;
let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
let records = db.get_records(room_id, 0, 999999999).await?;
for record in &records {
let record_path = room_cache_path.join(record.live_id.clone());
let cover = record.cover.clone();
if cover.is_none() {
continue;
}
let cover = cover.unwrap();
if cover.starts_with("data:") {
let base64 = cover.split("base64,").nth(1).unwrap();
let bytes = base64::engine::general_purpose::STANDARD
.decode(base64)
.unwrap();
let path = record_path.join("cover.jpg");
tokio::fs::write(&path, bytes).await?;
log::info!("convert live cover: {}", path.display());
// update record
db.update_record_cover(
record.live_id.as_str(),
Some(format!(
"{}/{}/{}/cover.jpg",
room.platform, room_id, record.live_id
)),
)
.await?;
}
}
}
Ok(())
}
pub async fn try_convert_clip_covers(
db: &Arc<Database>,
output_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
let videos = db.get_all_videos().await?;
log::debug!("videos: {}", videos.len());
for video in &videos {
let cover = video.cover.clone();
if cover.starts_with("data:") {
let base64 = cover.split("base64,").nth(1).unwrap();
let bytes = base64::engine::general_purpose::STANDARD
.decode(base64)
.unwrap();
let video_file_path = output_path.join(video.file.clone());
let cover_file_path = video_file_path.with_extension("jpg");
log::debug!("cover_file_path: {}", cover_file_path.display());
tokio::fs::write(&cover_file_path, bytes).await?;
log::info!("convert clip cover: {}", cover_file_path.display());
// update record
db.update_video_cover(
video.id,
cover_file_path
.file_name()
.unwrap()
.to_str()
.unwrap()
.to_string(),
)
.await?;
}
}
Ok(())
}
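Note: both converters above share the same data-URL handling. A self-contained sketch of that decoding step (decode_data_url is a hypothetical helper, not code from this diff):

    use base64::Engine;

    // Strip the "data:<mime>;base64," prefix, then decode the remainder.
    fn decode_data_url(cover: &str) -> Option<Vec<u8>> {
        let b64 = cover.split("base64,").nth(1)?;
        base64::engine::general_purpose::STANDARD.decode(b64).ok()
    }

    fn main() {
        // "aGVsbG8=" is base64 for "hello".
        let bytes = decode_data_url("data:text/plain;base64,aGVsbG8=").unwrap();
        assert_eq!(bytes, b"hello");
    }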

View File

@@ -1,3 +1,5 @@
pub mod migration_methods;
use sqlx::migrate::MigrationType;
#[derive(Debug)]
@@ -6,6 +8,7 @@ pub enum MigrationKind {
Down,
}
#[cfg(feature = "headless")]
#[derive(Debug)]
pub struct Migration {
pub version: i64,

View File

@@ -30,7 +30,7 @@ pub struct ProgressManager {
#[cfg(feature = "headless")]
impl ProgressManager {
pub fn new() -> Self {
let (progress_sender, progress_receiver) = broadcast::channel(16);
let (progress_sender, progress_receiver) = broadcast::channel(256);
Self {
progress_sender,
progress_receiver,

View File

@@ -82,7 +82,10 @@ pub trait Recorder: Send + Sync + 'static {
async fn comments(&self, live_id: &str) -> Result<Vec<DanmuEntry>, errors::RecorderError>;
async fn is_recording(&self, live_id: &str) -> bool;
async fn get_archive_subtitle(&self, live_id: &str) -> Result<String, errors::RecorderError>;
async fn generate_archive_subtitle(&self, live_id: &str) -> Result<String, errors::RecorderError>;
async fn generate_archive_subtitle(
&self,
live_id: &str,
) -> Result<String, errors::RecorderError>;
async fn enable(&self);
async fn disable(&self);
}

File diff suppressed because it is too large

View File

@@ -9,7 +9,7 @@ use super::response::VideoSubmitData;
use crate::database::account::AccountRow;
use crate::progress_reporter::ProgressReporter;
use crate::progress_reporter::ProgressReporterTrait;
use base64::Engine;
use chrono::TimeZone;
use pct_str::PctString;
use pct_str::URIReserved;
use regex::Regex;
@@ -42,6 +42,7 @@ pub struct RoomInfo {
pub room_keyframe_url: String,
pub room_title: String,
pub user_id: u64,
pub live_start_time: i64,
}
#[derive(Serialize, Deserialize, Clone, Debug)]
@@ -138,25 +139,12 @@ impl BiliStream {
}
})
}
pub fn is_same(&self, other: &BiliStream) -> bool {
// Extract live_id part from path (e.g., live_1848752274_71463808)
let get_live_id = |path: &str| {
path.split('/')
.find(|part| part.starts_with("live_"))
.unwrap_or("")
.to_string()
};
let self_live_id = get_live_id(&self.path);
let other_live_id = get_live_id(&other.path);
self_live_id == other_live_id
}
}
impl BiliClient {
pub fn new() -> Result<BiliClient, BiliClientError> {
pub fn new(user_agent: &str) -> Result<BiliClient, BiliClientError> {
let mut headers = reqwest::header::HeaderMap::new();
headers.insert("user-agent", "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36".parse().unwrap());
headers.insert("user-agent", user_agent.parse().unwrap());
if let Ok(client) = Client::builder().timeout(Duration::from_secs(10)).build() {
Ok(BiliClient { client, headers })
@@ -344,6 +332,22 @@ impl BiliClient {
let live_status = res["data"]["live_status"]
.as_u64()
.ok_or(BiliClientError::InvalidValue)? as u8;
// "live_time": "2025-08-09 18:33:35",
let live_start_time_str = res["data"]["live_time"]
.as_str()
.ok_or(BiliClientError::InvalidValue)?;
let live_start_time = if live_start_time_str == "0000-00-00 00:00:00" {
0
} else {
let naive =
chrono::NaiveDateTime::parse_from_str(live_start_time_str, "%Y-%m-%d %H:%M:%S")
.map_err(|_| BiliClientError::InvalidValue)?;
chrono::Local
.from_local_datetime(&naive)
.earliest()
.ok_or(BiliClientError::InvalidValue)?
.timestamp()
};
Ok(RoomInfo {
room_id,
room_title,
@@ -351,18 +355,21 @@ impl BiliClient {
room_keyframe_url,
user_id,
live_status,
live_start_time,
})
}
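Note: a standalone sketch of the live_time handling added above. The API reports live_time as a local-time string, with "0000-00-00 00:00:00" meaning "not live"; from_local_datetime can be ambiguous (e.g. across a DST transition), hence earliest():

    use chrono::TimeZone;

    fn parse_live_start(s: &str) -> Option<i64> {
        if s == "0000-00-00 00:00:00" {
            return Some(0);
        }
        let naive = chrono::NaiveDateTime::parse_from_str(s, "%Y-%m-%d %H:%M:%S").ok()?;
        Some(chrono::Local.from_local_datetime(&naive).earliest()?.timestamp())
    }

    fn main() {
        assert_eq!(parse_live_start("0000-00-00 00:00:00"), Some(0));
        assert!(parse_live_start("2025-08-09 18:33:35").is_some());
    }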
/// Get and encode response data into base64
pub async fn get_cover_base64(&self, url: &str) -> Result<String, BiliClientError> {
/// Download file from url to path
pub async fn download_file(&self, url: &str, path: &Path) -> Result<(), BiliClientError> {
if !path.parent().unwrap().exists() {
std::fs::create_dir_all(path.parent().unwrap()).unwrap();
}
let response = self.client.get(url).send().await?;
let bytes = response.bytes().await?;
let base64 = base64::engine::general_purpose::STANDARD.encode(bytes);
let mime_type = mime_guess::from_path(url)
.first_or_octet_stream()
.to_string();
Ok(format!("data:{};base64,{}", mime_type, base64))
let mut file = tokio::fs::File::create(&path).await?;
let mut content = std::io::Cursor::new(bytes);
tokio::io::copy(&mut content, &mut file).await?;
Ok(())
}
pub async fn get_index_content(

View File

@@ -67,12 +67,16 @@ impl DanmuStorage {
// get entries with ts relative to live start time
pub async fn get_entries(&self, live_start_ts: i64) -> Vec<DanmuEntry> {
let mut danmus: Vec<DanmuEntry> = self.cache.read().await.iter().map(|entry| {
DanmuEntry {
let mut danmus: Vec<DanmuEntry> = self
.cache
.read()
.await
.iter()
.map(|entry| DanmuEntry {
ts: entry.ts - live_start_ts,
content: entry.content.clone(),
}
}).collect();
})
.collect();
// filter out danmus with ts < 0
danmus.retain(|entry| entry.ts >= 0);
danmus

View File

@@ -19,11 +19,11 @@ use danmu_stream::danmu_stream::DanmuStream;
use danmu_stream::provider::ProviderType;
use danmu_stream::DanmuMessageType;
use rand::random;
use tokio::fs::File;
use tokio::io::{AsyncReadExt, AsyncWriteExt, BufReader};
use std::path::Path;
use std::sync::Arc;
use std::time::Duration;
use tokio::fs::File;
use tokio::io::{AsyncReadExt, AsyncWriteExt, BufReader};
use tokio::sync::{broadcast, Mutex, RwLock};
use tokio::task::JoinHandle;
@@ -59,7 +59,8 @@ pub struct DouyinRecorder {
db: Arc<Database>,
account: AccountRow,
room_id: u64,
room_info: Arc<RwLock<Option<response::DouyinRoomInfoResponse>>>,
sec_user_id: String,
room_info: Arc<RwLock<Option<client::DouyinBasicRoomInfo>>>,
stream_url: Arc<RwLock<Option<String>>>,
entry_store: Arc<RwLock<Option<EntryStore>>>,
danmu_store: Arc<RwLock<Option<DanmuStorage>>>,
@@ -84,16 +85,17 @@ impl DouyinRecorder {
#[cfg(not(feature = "headless"))] app_handle: AppHandle,
emitter: EventEmitter,
room_id: u64,
sec_user_id: &str,
config: Arc<RwLock<Config>>,
account: &AccountRow,
db: &Arc<Database>,
enabled: bool,
channel: broadcast::Sender<RecorderEvent>,
) -> Result<Self, super::errors::RecorderError> {
let client = client::DouyinClient::new(account);
let room_info = client.get_room_info(room_id).await?;
let client = client::DouyinClient::new(&config.read().await.user_agent, account);
let room_info = client.get_room_info(room_id, sec_user_id).await?;
let mut live_status = LiveStatus::Offline;
if room_info.data.room_status == 0 {
if room_info.status == 0 {
live_status = LiveStatus::Live;
}
@@ -104,6 +106,7 @@ impl DouyinRecorder {
db: db.clone(),
account: account.clone(),
room_id,
sec_user_id: sec_user_id.to_string(),
live_id: Arc::new(RwLock::new(String::new())),
danmu_room_id: Arc::new(RwLock::new(String::new())),
entry_store: Arc::new(RwLock::new(None)),
@@ -134,9 +137,13 @@ impl DouyinRecorder {
}
async fn check_status(&self) -> bool {
match self.client.get_room_info(self.room_id).await {
match self
.client
.get_room_info(self.room_id, &self.sec_user_id)
.await
{
Ok(info) => {
let live_status = info.data.room_status == 0; // room_status == 0 means the stream is live
let live_status = info.status == 0; // room_status == 0 means the stream is live
*self.room_info.write().await = Some(info.clone());
@@ -157,7 +164,7 @@ impl DouyinRecorder {
.title("BiliShadowReplay - 直播开始")
.body(format!(
"{} 开启了直播:{}",
info.data.user.nickname, info.data.data[0].title
info.user_name, info.room_title
))
.show()
.unwrap();
@@ -169,14 +176,14 @@ impl DouyinRecorder {
.title("BiliShadowReplay - 直播结束")
.body(format!(
"{} 关闭了直播:{}",
info.data.user.nickname, info.data.data[0].title
info.user_name, info.room_title
))
.show()
.unwrap();
let _ = self.live_end_channel.send(RecorderEvent::LiveEnd {
platform: PlatformType::Douyin,
room_id: self.room_id,
live_id: self.live_id.read().await.clone(),
recorder: self.info().await,
});
}
@@ -202,66 +209,9 @@ impl DouyinRecorder {
}
// Get stream URL when live starts
if !info.data.data[0]
.stream_url
.as_ref()
.unwrap()
.hls_pull_url
.is_empty()
{
*self.live_id.write().await = Utc::now().timestamp_millis().to_string();
*self.danmu_room_id.write().await = info.data.data[0].id_str.clone();
// create a new record
let cover_url = info.data.data[0]
.cover
.as_ref()
.map(|cover| cover.url_list[0].clone());
let cover = if let Some(url) = cover_url {
Some(self.client.get_cover_base64(&url).await.unwrap())
} else {
None
};
if let Err(e) = self
.db
.add_record(
PlatformType::Douyin,
self.live_id.read().await.as_str(),
self.room_id,
&info.data.data[0].title,
cover,
None,
)
.await
{
log::error!("Failed to add record: {}", e);
}
// setup entry store
let work_dir = self.get_work_dir(self.live_id.read().await.as_str()).await;
let entry_store = EntryStore::new(&work_dir).await;
*self.entry_store.write().await = Some(entry_store);
// setup danmu store
let danmu_file_path = format!("{}{}", work_dir, "danmu.txt");
let danmu_store = DanmuStorage::new(&danmu_file_path).await;
*self.danmu_store.write().await = danmu_store;
// start danmu task
if let Some(danmu_task) = self.danmu_task.lock().await.as_mut() {
danmu_task.abort();
}
if let Some(danmu_stream_task) = self.danmu_stream_task.lock().await.as_mut() {
danmu_stream_task.abort();
}
let live_id = self.live_id.read().await.clone();
let self_clone = self.clone();
*self.danmu_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start fetching danmu for live {}", live_id);
let _ = self_clone.danmu().await;
}));
// setup stream url
if !info.hls_url.is_empty() {
// Only set stream URL, don't create record yet
// Record will be created when first ts download succeeds
let new_stream_url = self.get_best_stream_url(&info).await;
if new_stream_url.is_none() {
log::error!("No stream url found in room_info: {:#?}", info);
@@ -270,6 +220,7 @@ impl DouyinRecorder {
log::info!("New douyin stream URL: {}", new_stream_url.clone().unwrap());
*self.stream_url.write().await = Some(new_stream_url.unwrap());
*self.danmu_room_id.write().await = info.room_id_str.clone();
}
true
@@ -283,7 +234,13 @@ impl DouyinRecorder {
async fn danmu(&self) -> Result<(), super::errors::RecorderError> {
let cookies = self.account.cookies.clone();
let danmu_room_id = self.danmu_room_id.read().await.clone().parse::<u64>().unwrap_or(0);
let danmu_room_id = self
.danmu_room_id
.read()
.await
.clone()
.parse::<u64>()
.unwrap_or(0);
let danmu_stream = DanmuStream::new(ProviderType::Douyin, &cookies, danmu_room_id).await;
if danmu_stream.is_err() {
let err = danmu_stream.err().unwrap();
@@ -340,18 +297,8 @@ impl DouyinRecorder {
)
}
async fn get_best_stream_url(
&self,
room_info: &response::DouyinRoomInfoResponse,
) -> Option<String> {
let stream_data = room_info.data.data[0]
.stream_url
.as_ref()
.unwrap()
.live_core_sdk_data
.pull_data
.stream_data
.clone();
async fn get_best_stream_url(&self, room_info: &client::DouyinBasicRoomInfo) -> Option<String> {
let stream_data = room_info.stream_data.clone();
// parse stream_data into stream_info
let stream_info = serde_json::from_str::<stream_info::StreamInfo>(&stream_data);
if let Ok(stream_info) = stream_info {
@@ -372,24 +319,22 @@ impl DouyinRecorder {
fn parse_stream_url(&self, stream_url: &str) -> (String, String) {
// Parse stream URL to extract base URL and query parameters
// Example: http://7167739a741646b4651b6949b2f3eb8e.livehwc3.cn/pull-hls-l26.douyincdn.com/third/stream-693342996808860134_or4.m3u8?sub_m3u8=true&user_session_id=16090eb45ab8a2f042f7c46563936187&major_anchor_level=common&edge_slice=true&expire=67d944ec&sign=47b95cc6e8de20d82f3d404412fa8406
let base_url = stream_url
.rfind('/')
.map(|i| &stream_url[..=i])
.unwrap_or(stream_url)
.to_string();
let query_params = stream_url
.find('?')
.map(|i| &stream_url[i..])
.unwrap_or("")
.to_string();
(base_url, query_params)
}
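A worked example of the split performed by parse_stream_url, using a shortened, made-up URL in the same shape as the one in the comment above:

    fn main() {
        let url = "http://cdn.example.cn/third/stream-123_or4.m3u8?sub_m3u8=true&sign=abc";
        // Everything up to and including the last '/' ...
        let base_url = url.rfind('/').map(|i| &url[..=i]).unwrap_or(url);
        // ... and everything from '?' onward.
        let query_params = url.find('?').map(|i| &url[i..]).unwrap_or("");
        assert_eq!(base_url, "http://cdn.example.cn/third/");
        assert_eq!(query_params, "?sub_m3u8=true&sign=abc");
    }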
async fn update_entries(&self) -> Result<u128, RecorderError> {
let task_begin_time = std::time::Instant::now();
@@ -404,26 +349,38 @@ impl DouyinRecorder {
return Err(RecorderError::NoStreamAvailable);
}
let stream_url = self.stream_url.read().await.as_ref().unwrap().clone();
let mut stream_url = self.stream_url.read().await.as_ref().unwrap().clone();
// Get m3u8 playlist
let (playlist, updated_stream_url) = self.client.get_m3u8_content(&stream_url).await?;
*self.stream_url.write().await = Some(updated_stream_url);
*self.stream_url.write().await = Some(updated_stream_url.clone());
stream_url = updated_stream_url;
let mut new_segment_fetched = false;
let work_dir = self.get_work_dir(self.live_id.read().await.as_str()).await;
let mut is_first_segment = self.entry_store.read().await.is_none();
let work_dir;
// Create work directory if not exists
tokio::fs::create_dir_all(&work_dir).await?;
// If this is the first segment, prepare but don't create directories yet
if is_first_segment {
// Generate live_id for potential use
let live_id = Utc::now().timestamp_millis().to_string();
*self.live_id.write().await = live_id.clone();
work_dir = self.get_work_dir(&live_id).await;
} else {
work_dir = self.get_work_dir(self.live_id.read().await.as_str()).await;
}
let last_sequence = self
.entry_store
.read()
.await
.as_ref()
.unwrap()
.last_sequence;
let last_sequence = if is_first_segment {
0
} else {
self.entry_store
.read()
.await
.as_ref()
.unwrap()
.last_sequence
};
for segment in playlist.segments.iter() {
let formated_ts_name = segment.uri.clone();
@@ -449,7 +406,7 @@ impl DouyinRecorder {
} else {
// Parse the stream URL to extract base URL and query parameters
let (base_url, query_params) = self.parse_stream_url(&stream_url);
// Check if the segment URI already has query parameters
if uri.contains('?') {
// If segment URI has query params, append m3u8 query params with &
@@ -459,22 +416,28 @@ impl DouyinRecorder {
format!("{}{}{}", base_url, uri, query_params)
}
};
// Download segment with retry mechanism
let mut retry_count = 0;
let max_retries = 3;
let mut download_success = false;
let mut work_dir_created = false;
while retry_count < max_retries && !download_success {
let file_name = format!("{}.ts", sequence);
let file_path = format!("{}/{}", work_dir, file_name);
match self
.client
.download_ts(&ts_url, &file_path)
.await
{
// If this is the first segment, create work directory before first download attempt
if is_first_segment && !work_dir_created {
// Create work directory only when we're about to download
if let Err(e) = tokio::fs::create_dir_all(&work_dir).await {
log::error!("Failed to create work directory: {}", e);
return Err(e.into());
}
work_dir_created = true;
}
match self.client.download_ts(&ts_url, &file_path).await {
Ok(size) => {
if size == 0 {
log::error!("Download segment failed (empty response): {}", ts_url);
@@ -485,6 +448,61 @@ impl DouyinRecorder {
}
break;
}
// If this is the first successful download, create record and initialize stores
if is_first_segment {
// Create database record
let room_info = room_info.as_ref().unwrap();
let cover_url = room_info.cover.clone();
let cover = if let Some(url) = cover_url {
Some(self.client.get_cover_base64(&url).await.unwrap_or_default())
} else {
None
};
if let Err(e) = self
.db
.add_record(
PlatformType::Douyin,
self.live_id.read().await.as_str(),
self.room_id,
&room_info.room_title,
cover,
None,
)
.await
{
log::error!("Failed to add record: {}", e);
}
// Setup entry store
let entry_store = EntryStore::new(&work_dir).await;
*self.entry_store.write().await = Some(entry_store);
// Setup danmu store
let danmu_file_path = format!("{}{}", work_dir, "danmu.txt");
let danmu_store = DanmuStorage::new(&danmu_file_path).await;
*self.danmu_store.write().await = danmu_store;
// Start danmu task
if let Some(danmu_task) = self.danmu_task.lock().await.as_mut() {
danmu_task.abort();
}
if let Some(danmu_stream_task) =
self.danmu_stream_task.lock().await.as_mut()
{
danmu_stream_task.abort();
}
let live_id = self.live_id.read().await.clone();
let self_clone = self.clone();
*self.danmu_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start fetching danmu for live {}", live_id);
let _ = self_clone.danmu().await;
}));
is_first_segment = false;
}
let ts_entry = TsEntry {
url: file_name,
sequence,
@@ -506,26 +524,76 @@ impl DouyinRecorder {
download_success = true;
}
Err(e) => {
log::warn!("Failed to download segment (attempt {}/{}): {} - URL: {}",
retry_count + 1, max_retries, e, ts_url);
log::warn!(
"Failed to download segment (attempt {}/{}): {} - URL: {}",
retry_count + 1,
max_retries,
e,
ts_url
);
retry_count += 1;
if retry_count < max_retries {
tokio::time::sleep(Duration::from_millis(1000 * retry_count as u64)).await;
tokio::time::sleep(Duration::from_millis(1000 * retry_count as u64))
.await;
continue;
}
// If all retries failed, check if it's a 400 error
if e.to_string().contains("400") {
log::error!("HTTP 400 error for segment, stream URL may be expired: {}", ts_url);
log::error!(
"HTTP 400 error for segment, stream URL may be expired: {}",
ts_url
);
*self.stream_url.write().await = None;
// Clean up empty directory if first segment failed
if is_first_segment && work_dir_created {
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await
{
log::warn!(
"Failed to cleanup empty work directory {}: {}",
work_dir,
cleanup_err
);
}
}
return Err(RecorderError::NoStreamAvailable);
}
// Clean up empty directory if first segment failed
if is_first_segment && work_dir_created {
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await {
log::warn!(
"Failed to cleanup empty work directory {}: {}",
work_dir,
cleanup_err
);
}
}
return Err(e.into());
}
}
}
if !download_success {
log::error!("Failed to download segment after {} retries: {}", max_retries, ts_url);
log::error!(
"Failed to download segment after {} retries: {}",
max_retries,
ts_url
);
// Clean up empty directory if first segment failed after all retries
if is_first_segment && work_dir_created {
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await {
log::warn!(
"Failed to cleanup empty work directory {}: {}",
work_dir,
cleanup_err
);
}
}
continue;
}
}
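Note: the download loop above retries each segment up to three times, sleeping 1000 * retry_count ms between failures. The same pattern as a generic helper (a sketch, not code from this diff):

    use std::time::Duration;

    // Up to max_retries attempts with a linearly growing sleep
    // (1000 ms, 2000 ms, ...) between failures.
    async fn with_retries<T, E, F, Fut>(max_retries: u32, mut op: F) -> Result<T, E>
    where
        F: FnMut() -> Fut,
        Fut: std::future::Future<Output = Result<T, E>>,
    {
        let mut attempt = 0u32;
        loop {
            match op().await {
                Ok(v) => return Ok(v),
                Err(e) => {
                    attempt += 1;
                    if attempt >= max_retries {
                        return Err(e);
                    }
                    tokio::time::sleep(Duration::from_millis(1000 * u64::from(attempt))).await;
                }
            }
        }
    }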
@@ -686,7 +754,10 @@ impl Recorder for DouyinRecorder {
m3u8_content
}
async fn get_archive_subtitle(&self, live_id: &str) -> Result<String, super::errors::RecorderError> {
async fn get_archive_subtitle(
&self,
live_id: &str,
) -> Result<String, super::errors::RecorderError> {
let work_dir = self.get_work_dir(live_id).await;
let subtitle_file_path = format!("{}/{}", work_dir, "subtitle.srt");
let subtitle_file = File::open(subtitle_file_path).await;
@@ -702,7 +773,10 @@ impl Recorder for DouyinRecorder {
Ok(subtitle_content)
}
async fn generate_archive_subtitle(&self, live_id: &str) -> Result<String, super::errors::RecorderError> {
async fn generate_archive_subtitle(
&self,
live_id: &str,
) -> Result<String, super::errors::RecorderError> {
// generate subtitle file under work_dir
let work_dir = self.get_work_dir(live_id).await;
let subtitle_file_path = format!("{}/{}", work_dir, "subtitle.srt");
@@ -714,22 +788,45 @@ impl Recorder for DouyinRecorder {
tokio::fs::write(&m3u8_index_file_path, m3u8_content).await?;
// generate a tmp clip file
let clip_file_path = format!("{}/{}", work_dir, "tmp.mp4");
if let Err(e) = crate::ffmpeg::clip_from_m3u8(None::<&crate::progress_reporter::ProgressReporter>, Path::new(&m3u8_index_file_path), Path::new(&clip_file_path)).await {
if let Err(e) = crate::ffmpeg::clip_from_m3u8(
None::<&crate::progress_reporter::ProgressReporter>,
Path::new(&m3u8_index_file_path),
Path::new(&clip_file_path),
None,
false,
)
.await
{
return Err(super::errors::RecorderError::SubtitleGenerationFailed {
error: e.to_string(),
});
}
// generate subtitle file
let config = self.config.read().await;
let result = crate::ffmpeg::generate_video_subtitle(None, Path::new(&clip_file_path), "whisper", &config.whisper_model, &config.whisper_prompt, &config.openai_api_key, &config.openai_api_endpoint, &config.whisper_language).await;
let result = crate::ffmpeg::generate_video_subtitle(
None,
Path::new(&clip_file_path),
"whisper",
&config.whisper_model,
&config.whisper_prompt,
&config.openai_api_key,
&config.openai_api_endpoint,
&config.whisper_language,
)
.await;
// write subtitle file
if let Err(e) = result {
return Err(super::errors::RecorderError::SubtitleGenerationFailed {
error: e.to_string(),
});
}
}
let result = result.unwrap();
let subtitle_content = result.subtitle_content.iter().map(item_to_srt).collect::<Vec<String>>().join("");
let subtitle_content = result
.subtitle_content
.iter()
.map(item_to_srt)
.collect::<Vec<String>>()
.join("");
subtitle_file.write_all(subtitle_content.as_bytes()).await?;
// remove tmp file
@@ -756,17 +853,11 @@ impl Recorder for DouyinRecorder {
let room_info = self.room_info.read().await;
let room_cover_url = room_info
.as_ref()
.and_then(|info| {
info.data
.data
.first()
.and_then(|data| data.cover.as_ref())
.map(|cover| cover.url_list[0].clone())
})
.and_then(|info| info.cover.clone())
.unwrap_or_default();
let room_title = room_info
.as_ref()
.and_then(|info| info.data.data.first().map(|data| data.title.clone()))
.map(|info| info.room_title.clone())
.unwrap_or_default();
RecorderInfo {
room_id: self.room_id,
@@ -778,15 +869,15 @@ impl Recorder for DouyinRecorder {
user_info: UserInfo {
user_id: room_info
.as_ref()
.map(|info| info.data.user.sec_uid.clone())
.map(|info| info.sec_user_id.clone())
.unwrap_or_default(),
user_name: room_info
.as_ref()
.map(|info| info.data.user.nickname.clone())
.map(|info| info.user_name.clone())
.unwrap_or_default(),
user_avatar: room_info
.as_ref()
.map(|info| info.data.user.avatar_thumb.url_list[0].clone())
.map(|info| info.user_avatar.clone())
.unwrap_or_default(),
},
total_length: if let Some(store) = self.entry_store.read().await.as_ref() {
@@ -806,7 +897,11 @@ impl Recorder for DouyinRecorder {
Ok(if live_id == *self.live_id.read().await {
// just return current cache content
match self.danmu_store.read().await.as_ref() {
Some(storage) => storage.get_entries(self.first_segment_ts(live_id).await).await,
Some(storage) => {
storage
.get_entries(self.first_segment_ts(live_id).await)
.await
}
None => Vec::new(),
}
} else {
@@ -824,7 +919,9 @@ impl Recorder for DouyinRecorder {
return Ok(Vec::new());
}
let storage = storage.unwrap();
storage.get_entries(self.first_segment_ts(live_id).await).await
storage
.get_entries(self.first_segment_ts(live_id).await)
.await
})
}

View File

@@ -6,11 +6,9 @@ use reqwest::{Client, Error as ReqwestError};
use super::response::DouyinRoomInfoResponse;
use std::fmt;
const USER_AGENT: &str = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36";
#[derive(Debug)]
pub enum DouyinClientError {
Network(ReqwestError),
Network(String),
Io(std::io::Error),
Playlist(String),
}
@@ -27,7 +25,7 @@ impl fmt::Display for DouyinClientError {
impl From<ReqwestError> for DouyinClientError {
fn from(err: ReqwestError) -> Self {
DouyinClientError::Network(err)
DouyinClientError::Network(err.to_string())
}
}
@@ -37,27 +35,42 @@ impl From<std::io::Error> for DouyinClientError {
}
}
#[derive(Debug, Clone)]
pub struct DouyinBasicRoomInfo {
pub room_id_str: String,
pub room_title: String,
pub cover: Option<String>,
pub status: i64,
pub hls_url: String,
pub stream_data: String,
// user related
pub user_name: String,
pub user_avatar: String,
pub sec_user_id: String,
}
#[derive(Clone)]
pub struct DouyinClient {
client: Client,
cookies: String,
account: AccountRow,
}
impl DouyinClient {
pub fn new(account: &AccountRow) -> Self {
let client = Client::builder().user_agent(USER_AGENT).build().unwrap();
pub fn new(user_agent: &str, account: &AccountRow) -> Self {
let client = Client::builder().user_agent(user_agent).build().unwrap();
Self {
client,
cookies: account.cookies.clone(),
account: account.clone(),
}
}
pub async fn get_room_info(
&self,
room_id: u64,
) -> Result<DouyinRoomInfoResponse, DouyinClientError> {
sec_user_id: &str,
) -> Result<DouyinBasicRoomInfo, DouyinClientError> {
let url = format!(
"https://live.douyin.com/webcast/room/web/enter/?aid=6383&app_name=douyin_web&live_id=1&device_platform=web&language=zh-CN&enter_from=web_live&cookie_enabled=true&screen_width=1920&screen_height=1080&browser_language=zh-CN&browser_platform=MacIntel&browser_name=Chrome&browser_version=122.0.0.0&web_rid={}",
"https://live.douyin.com/webcast/room/web/enter/?aid=6383&app_name=douyin_web&live_id=1&device_platform=web&language=zh-CN&enter_from=web_live&a_bogus=0&cookie_enabled=true&screen_width=1920&screen_height=1080&browser_language=zh-CN&browser_platform=MacIntel&browser_name=Chrome&browser_version=122.0.0.0&web_rid={}",
room_id
);
@@ -65,14 +78,190 @@ impl DouyinClient {
.client
.get(&url)
.header("Referer", "https://live.douyin.com/")
.header("User-Agent", USER_AGENT)
.header("Cookie", self.cookies.clone())
.header("Cookie", self.account.cookies.clone())
.send()
.await?
.json::<DouyinRoomInfoResponse>()
.await?;
Ok(resp)
let status = resp.status();
let text = resp.text().await?;
if text.is_empty() {
log::warn!("Empty room info response, trying H5 API");
return self.get_room_info_h5(room_id, sec_user_id).await;
}
if status.is_success() {
if let Ok(data) = serde_json::from_str::<DouyinRoomInfoResponse>(&text) {
let cover = data
.data
.data
.first()
.and_then(|data| data.cover.as_ref())
.map(|cover| cover.url_list[0].clone());
return Ok(DouyinBasicRoomInfo {
room_id_str: data.data.data[0].id_str.clone(),
sec_user_id: sec_user_id.to_string(),
cover,
room_title: data.data.data[0].title.clone(),
user_name: data.data.user.nickname.clone(),
user_avatar: data.data.user.avatar_thumb.url_list[0].clone(),
status: data.data.room_status,
hls_url: data.data.data[0]
.stream_url
.as_ref()
.map(|stream_url| stream_url.hls_pull_url.clone())
.unwrap_or_default(),
stream_data: data.data.data[0]
.stream_url
.as_ref()
.map(|s| s.live_core_sdk_data.pull_data.stream_data.clone())
.unwrap_or_default(),
});
} else {
log::error!("Failed to parse room info response: {}", text);
return self.get_room_info_h5(room_id, sec_user_id).await;
}
}
log::error!("Failed to get room info: {}", status);
return self.get_room_info_h5(room_id, sec_user_id).await;
}
pub async fn get_room_info_h5(
&self,
room_id: u64,
sec_user_id: &str,
) -> Result<DouyinBasicRoomInfo, DouyinClientError> {
// Build the full URL parameters, following the biliup implementation
let room_id_str = room_id.to_string();
// https://webcast.amemv.com/webcast/room/reflow/info/?type_id=0&live_id=1&version_code=99.99.99&app_id=1128&room_id=10000&sec_user_id=MS4wLjAB&aid=6383&device_platform=web&browser_language=zh-CN&browser_platform=Win32&browser_name=Mozilla&browser_version=5.0
let url_params = [
("type_id", "0"),
("live_id", "1"),
("version_code", "99.99.99"),
("app_id", "1128"),
("room_id", &room_id_str),
("sec_user_id", sec_user_id),
("aid", "6383"),
("device_platform", "web"),
];
// Build the URL
let query_string = url_params
.iter()
.map(|(k, v)| format!("{}={}", k, v))
.collect::<Vec<_>>()
.join("&");
let url = format!(
"https://webcast.amemv.com/webcast/room/reflow/info/?{}",
query_string
);
log::info!("get_room_info_h5: {}", url);
let resp = self
.client
.get(&url)
.header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36")
.header("Referer", "https://live.douyin.com/")
.header("Cookie", self.account.cookies.clone())
.send()
.await?;
let status = resp.status();
let text = resp.text().await?;
if status.is_success() {
// Try to parse as H5 response format
if let Ok(h5_data) =
serde_json::from_str::<super::response::DouyinH5RoomInfoResponse>(&text)
{
// Extract RoomBasicInfo from H5 response
let room = &h5_data.data.room;
let owner = &room.owner;
let cover = room
.cover
.as_ref()
.and_then(|c| c.url_list.first().cloned());
let hls_url = room
.stream_url
.as_ref()
.map(|s| s.hls_pull_url.clone())
.unwrap_or_default();
return Ok(DouyinBasicRoomInfo {
room_id_str: room.id_str.clone(),
room_title: room.title.clone(),
cover,
status: if room.status == 2 { 0 } else { 1 },
hls_url,
user_name: owner.nickname.clone(),
user_avatar: owner
.avatar_thumb
.url_list
.first()
.unwrap_or(&String::new())
.clone(),
sec_user_id: owner.sec_uid.clone(),
stream_data: room
.stream_url
.as_ref()
.map(|s| s.live_core_sdk_data.pull_data.stream_data.clone())
.unwrap_or_default(),
});
}
// If that fails, try to parse as a generic JSON to see what we got
if let Ok(json_value) = serde_json::from_str::<serde_json::Value>(&text) {
log::error!(
"Unexpected response structure: {}",
serde_json::to_string_pretty(&json_value).unwrap_or_default()
);
// Check if it's an error response
if let Some(status_code) = json_value.get("status_code").and_then(|v| v.as_i64()) {
if status_code != 0 {
let error_msg = json_value
.get("status_message")
.and_then(|v| v.as_str())
.unwrap_or("Unknown error");
return Err(DouyinClientError::Network(format!(
"API returned error status_code: {} - {}",
status_code, error_msg
)));
}
}
// Check for an "invalid session" error
if let Some(status_message) =
json_value.get("status_message").and_then(|v| v.as_str())
{
if status_message.contains("invalid session") {
return Err(DouyinClientError::Network(
"Invalid session - please check your cookies. Make sure you have valid sessionid, passport_csrf_token, and other authentication cookies from douyin.com".to_string(),
));
}
}
return Err(DouyinClientError::Network(format!(
"Failed to parse h5 room info response: {}",
text
)));
} else {
log::error!("Failed to parse h5 room info response: {}", text);
return Err(DouyinClientError::Network(format!(
"Failed to parse h5 room info response: {}",
text
)));
}
}
log::error!("Failed to get h5 room info: {}", status);
Err(DouyinClientError::Network(format!(
"Failed to get h5 room info: {} {}",
status, text
)))
}
pub async fn get_user_info(&self) -> Result<super::response::User, DouyinClientError> {
@@ -82,16 +271,19 @@ impl DouyinClient {
.client
.get(url)
.header("Referer", "https://www.douyin.com/")
.header("User-Agent", USER_AGENT)
.header("Cookie", self.cookies.clone())
.header("Cookie", self.account.cookies.clone())
.send()
.await?;
if resp.status().is_success() {
if let Ok(data) = resp.json::<super::response::DouyinRelationResponse>().await {
let status = resp.status();
let text = resp.text().await?;
if status.is_success() {
if let Ok(data) = serde_json::from_str::<super::response::DouyinRelationResponse>(&text)
{
if data.status_code == 0 {
let owner_sec_uid = &data.owner_sec_uid;
// Find the user's own info in the followings list by matching sec_uid
if let Some(followings) = &data.followings {
for following in followings {
@@ -109,27 +301,33 @@ impl DouyinClient {
}
}
}
// If not found in followings, create a minimal user info from owner_sec_uid
let user = super::response::User {
id_str: "".to_string(), // We don't have the numeric UID
sec_uid: owner_sec_uid.clone(),
nickname: "抖音用户".to_string(), // Default nickname
avatar_thumb: super::response::AvatarThumb {
url_list: vec![],
},
avatar_thumb: super::response::AvatarThumb { url_list: vec![] },
follow_info: super::response::FollowInfo::default(),
foreign_user: 0,
open_id_str: "".to_string(),
};
return Ok(user);
}
} else {
log::error!("Failed to parse user info response: {}", text);
return Err(DouyinClientError::Network(format!(
"Failed to parse user info response: {}",
text
)));
}
}
log::error!("Failed to get user info: {}", status);
Err(DouyinClientError::Io(std::io::Error::new(
std::io::ErrorKind::NotFound,
"Failed to get user info from Douyin relation API"
"Failed to get user info from Douyin relation API",
)))
}
@@ -148,27 +346,13 @@ impl DouyinClient {
&self,
url: &str,
) -> Result<(MediaPlaylist, String), DouyinClientError> {
let content = self.client
.get(url)
.header("Referer", "https://live.douyin.com/")
.header("User-Agent", USER_AGENT)
.header("Cookie", self.cookies.clone())
.header("Accept", "*/*")
.header("Accept-Language", "zh-CN,zh;q=0.9,en;q=0.8")
.header("Accept-Encoding", "gzip, deflate, br")
.header("Connection", "keep-alive")
.header("Sec-Fetch-Dest", "empty")
.header("Sec-Fetch-Mode", "cors")
.header("Sec-Fetch-Site", "cross-site")
.send()
.await?
.text()
.await?;
let content = self.client.get(url).send().await?.text().await?;
// m3u8 content: #EXTM3U
// #EXT-X-VERSION:3
// #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2560000
// http://7167739a741646b4651b6949b2f3eb8e.livehwc3.cn/pull-hls-l26.douyincdn.com/third/stream-693342996808860134_or4.m3u8?sub_m3u8=true&user_session_id=16090eb45ab8a2f042f7c46563936187&major_anchor_level=common&edge_slice=true&expire=67d944ec&sign=47b95cc6e8de20d82f3d404412fa8406
if content.contains("BANDWIDTH") {
log::info!("Master manifest with playlist URL: {}", url);
let new_url = content.lines().last().unwrap();
return Box::pin(self.get_m3u8_content(new_url)).await;
}
@@ -183,25 +367,12 @@ impl DouyinClient {
}
pub async fn download_ts(&self, url: &str, path: &str) -> Result<u64, DouyinClientError> {
let response = self.client
.get(url)
.header("Referer", "https://live.douyin.com/")
.header("User-Agent", USER_AGENT)
.header("Cookie", self.cookies.clone())
.header("Accept", "*/*")
.header("Accept-Language", "zh-CN,zh;q=0.9,en;q=0.8")
.header("Accept-Encoding", "gzip, deflate, br")
.header("Connection", "keep-alive")
.header("Sec-Fetch-Dest", "empty")
.header("Sec-Fetch-Mode", "cors")
.header("Sec-Fetch-Site", "cross-site")
.send()
.await?;
let response = self.client.get(url).send().await?;
if response.status() != reqwest::StatusCode::OK {
let error = response.error_for_status().unwrap_err();
log::error!("HTTP error: {} for URL: {}", error, url);
return Err(DouyinClientError::Network(error));
return Err(DouyinClientError::Network(error.to_string()));
}
let mut file = tokio::fs::File::create(path).await?;
@@ -212,5 +383,3 @@ impl DouyinClient {
Ok(size)
}
}
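The change to `Network(error.to_string())` implies the error enum now carries a plain String rather than a `reqwest::Error`, matching the `Network(format!(...))` construction earlier in this file. A hedged reconstruction of the shape this hunk implies (the real definition lives elsewhere in the PR; variant names beyond Network and Io are not shown here):

// Sketch only: Network wraps a String, so reqwest errors are stringified
// at the boundary; the `?` operator on reqwest and tokio::fs calls
// suggests From impls like these exist.
#[derive(Debug)]
pub enum DouyinClientError {
    Network(String),
    Io(std::io::Error),
}

impl From<reqwest::Error> for DouyinClientError {
    fn from(e: reqwest::Error) -> Self {
        Self::Network(e.to_string())
    }
}

impl From<std::io::Error> for DouyinClientError {
    fn from(e: std::io::Error) -> Self {
        Self::Io(e)
    }
}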

View File

@@ -182,8 +182,7 @@ pub struct Extra {
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
- pub struct PullDatas {
- }
+ pub struct PullDatas {}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
@@ -436,8 +435,7 @@ pub struct Stats {
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
- pub struct LinkerMap {
- }
+ pub struct LinkerMap {}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
@@ -478,13 +476,11 @@ pub struct LinkerDetail {
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
- pub struct LinkerMapStr {
- }
+ pub struct LinkerMapStr {}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
- pub struct PlaymodeDetail {
- }
+ pub struct PlaymodeDetail {}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
@@ -676,4 +672,121 @@ pub struct AvatarSmall {
pub uri: String,
#[serde(rename = "url_list")]
pub url_list: Vec<String>,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct DouyinH5RoomInfoResponse {
pub data: H5Data,
pub extra: H5Extra,
#[serde(rename = "status_code")]
pub status_code: i64,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Data {
pub room: H5Room,
pub user: H5User,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Room {
pub id: u64,
#[serde(rename = "id_str")]
pub id_str: String,
pub status: i64,
pub title: String,
pub cover: Option<H5Cover>,
#[serde(rename = "stream_url")]
pub stream_url: Option<H5StreamUrl>,
pub owner: H5Owner,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Cover {
#[serde(rename = "url_list")]
pub url_list: Vec<String>,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5StreamUrl {
pub provider: i64,
pub id: u64,
#[serde(rename = "id_str")]
pub id_str: String,
#[serde(rename = "default_resolution")]
pub default_resolution: String,
#[serde(rename = "rtmp_pull_url")]
pub rtmp_pull_url: String,
#[serde(rename = "flv_pull_url")]
pub flv_pull_url: H5FlvPullUrl,
#[serde(rename = "hls_pull_url")]
pub hls_pull_url: String,
#[serde(rename = "hls_pull_url_map")]
pub hls_pull_url_map: H5HlsPullUrlMap,
#[serde(rename = "live_core_sdk_data")]
pub live_core_sdk_data: LiveCoreSdkData,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5FlvPullUrl {
#[serde(rename = "FULL_HD1")]
pub full_hd1: Option<String>,
#[serde(rename = "HD1")]
pub hd1: Option<String>,
#[serde(rename = "SD1")]
pub sd1: Option<String>,
#[serde(rename = "SD2")]
pub sd2: Option<String>,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5HlsPullUrlMap {
#[serde(rename = "FULL_HD1")]
pub full_hd1: Option<String>,
#[serde(rename = "HD1")]
pub hd1: Option<String>,
#[serde(rename = "SD1")]
pub sd1: Option<String>,
#[serde(rename = "SD2")]
pub sd2: Option<String>,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Owner {
pub nickname: String,
#[serde(rename = "avatar_thumb")]
pub avatar_thumb: H5AvatarThumb,
#[serde(rename = "sec_uid")]
pub sec_uid: String,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5AvatarThumb {
#[serde(rename = "url_list")]
pub url_list: Vec<String>,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5User {
pub nickname: String,
#[serde(rename = "avatar_thumb")]
pub avatar_thumb: Option<H5AvatarThumb>,
#[serde(rename = "sec_uid")]
pub sec_uid: String,
}
#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Extra {
pub now: i64,
}
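A small usage sketch for the new H5 response types above (not part of the diff): deserialize a room-info body and pick the highest-quality FLV url that is present. The `body` string is assumed to be JSON fetched elsewhere.

// Partial moves out of the owned response are fine here because each
// Option field is consumed exactly once.
fn best_flv_url(body: &str) -> Option<String> {
    let resp: DouyinH5RoomInfoResponse = serde_json::from_str(body).ok()?;
    let stream = resp.data.room.stream_url?;
    stream.flv_pull_url.full_hd1
        .or(stream.flv_pull_url.hd1)
        .or(stream.flv_pull_url.sd1)
        .or(stream.flv_pull_url.sd2)
}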

View File

@@ -83,7 +83,7 @@ impl TsEntry {
let mut content = String::new();
content += &format!("#EXTINF:{:.2},\n", self.length);
content += &format!("#EXTINF:{:.4},\n", self.length);
content += &format!("{}\n", self.url);
content
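The precision bump matters because EXTINF durations accumulate: at two decimals a 4.0166 s segment is written as 4.02 s, and over thousands of segments the rounding error can add up to seconds of drift between manifest time and wall-clock time. A tiny illustration (values made up):

#[test]
fn extinf_precision() {
    // Two vs four decimal places for the same segment length.
    let length = 4.0166_f64;
    assert_eq!(format!("#EXTINF:{:.2},", length), "#EXTINF:4.02,");
    assert_eq!(format!("#EXTINF:{:.4},", length), "#EXTINF:4.0166,");
}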
@@ -200,10 +200,12 @@ impl EntryStore {
self.total_size
}
/// Get first timestamp in milliseconds
pub fn first_ts(&self) -> Option<i64> {
self.entries.first().map(|x| x.ts_mili())
}
/// Get last timestamp in milliseconds
pub fn last_ts(&self) -> Option<i64> {
self.entries.last().map(|x| x.ts_mili())
}
@@ -212,7 +214,6 @@ impl EntryStore {
/// `vod` indicates whether the manifest is for a live stream or a finished video.
/// `force_time` adds a DATE-TIME tag for each entry.
pub fn manifest(&self, vod: bool, force_time: bool, range: Option<Range>) -> String {
log::info!("Generate manifest for range: {:?} with vod: {} and force_time: {}", range, vod, force_time);
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
m3u8_content += if vod {
@@ -240,12 +241,6 @@ impl EntryStore {
// Collect entries in range
let first_entry = self.entries.first().unwrap();
let first_entry_ts = first_entry.ts_seconds();
log::debug!("First entry ts: {}", first_entry_ts);
let last_entry = self.entries.last().unwrap();
let last_entry_ts = last_entry.ts_seconds();
log::debug!("Last entry ts: {}", last_entry_ts);
log::debug!("Full length: {}", last_entry_ts - first_entry_ts);
log::debug!("Range: {:?}", range);
let mut entries_in_range = vec![];
for e in &self.entries {
// ignore the header entry, since it's already referenced in EXT-X-MAP

View File

@@ -22,4 +22,6 @@ custom_error! {pub RecorderError
DanmuStreamError {err: danmu_stream::DanmuStreamError} = "Danmu stream error: {err}",
SubtitleNotFound {live_id: String} = "Subtitle not found: {live_id}",
SubtitleGenerationFailed {error: String} = "Subtitle generation failed: {error}",
FfmpegError {err: String} = "FFmpeg error: {err}",
ResolutionChanged {err: String} = "Resolution changed: {err}",
}
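A hedged sketch of how a recording loop might react to the two new error variants; the recovery actions shown are illustrative, not taken from the PR:

fn on_recorder_error(err: &RecorderError) {
    match err {
        RecorderError::ResolutionChanged { err } => {
            // A mid-stream resolution switch invalidates the current ts
            // sequence, so a recorder would typically rotate to a new
            // live entry rather than keep appending.
            log::warn!("rotating stream after resolution change: {err}");
        }
        RecorderError::FfmpegError { err } => log::error!("ffmpeg failed: {err}"),
        other => log::error!("recorder error: {other}"),
    }
}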

View File

@@ -1,22 +0,0 @@
use actix_web::Response;
fn handle_hls_request(ts_path: Option<&str>) -> Response {
if let Some(ts_path) = ts_path {
if let Ok(content) = std::fs::read(ts_path) {
return Response::builder()
.status(200)
.header("Content-Type", "video/mp2t")
.header("Cache-Control", "no-cache")
.header("Access-Control-Allow-Origin", "*")
.body(content)
.unwrap();
}
}
Response::builder()
.status(404)
.header("Content-Type", "text/plain")
.header("Cache-Control", "no-cache")
.header("Access-Control-Allow-Origin", "*")
.body(b"Not Found".to_vec())
.unwrap()
}

View File

@@ -1,9 +1,10 @@
use crate::config::Config;
use crate::danmu2ass;
use crate::database::recorder::RecorderRow;
use crate::database::video::VideoRow;
use crate::database::{account::AccountRow, record::RecordRow};
use crate::database::{Database, DatabaseError};
use crate::ffmpeg::{clip_from_m3u8, encode_video_danmu};
use crate::ffmpeg::{clip_from_m3u8, encode_video_danmu, Range};
use crate::progress_reporter::{EventEmitter, ProgressReporter};
use crate::recorder::bilibili::{BiliRecorder, BiliRecorderOptions};
use crate::recorder::danmu::DanmuEntry;
@@ -12,6 +13,8 @@ use crate::recorder::errors::RecorderError;
use crate::recorder::PlatformType;
use crate::recorder::Recorder;
use crate::recorder::RecorderInfo;
use crate::webhook::events::{self, Payload};
use crate::webhook::poster::WebhookPoster;
use chrono::Utc;
use custom_error::custom_error;
use serde::{Deserialize, Serialize};
@@ -35,30 +38,38 @@ pub struct RecorderList {
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct ClipRangeParams {
pub title: String,
pub note: String,
pub cover: String,
pub platform: String,
pub room_id: u64,
pub live_id: String,
- /// Clip range start in seconds
- pub x: i64,
- /// Clip range end in seconds
- pub y: i64,
- /// Timestamp of first stream segment in seconds
- pub offset: i64,
+ pub range: Option<Range>,
/// Encode danmu after clip
pub danmu: bool,
pub local_offset: i64,
/// Fix encoding after clip
pub fix_encoding: bool,
}
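With `x`/`y`/`offset` replaced by a single `range`, a clip request now carries the window explicitly. An illustrative construction of the new shape; `Range { start, end }` in seconds is an assumption inferred from the `range.start` and `range.duration()` uses later in this file:

let params = ClipRangeParams {
    title: "My clip".into(),
    note: "funny moment".into(),     // values here are illustrative only
    cover: "".into(),
    platform: "douyin".into(),
    room_id: 123456,
    live_id: "live-0001".into(),
    range: Some(Range { start: 30.0, end: 90.0 }),
    danmu: true,
    local_offset: 0,
    fix_encoding: false,
};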
#[derive(Debug, Clone)]
pub enum RecorderEvent {
LiveStart {
recorder: RecorderInfo,
},
LiveEnd {
- platform: PlatformType,
room_id: u64,
- live_id: String,
+ platform: PlatformType,
+ recorder: RecorderInfo,
},
RecordStart {
recorder: RecorderInfo,
},
RecordEnd {
recorder: RecorderInfo,
},
}
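Every variant now carries a full RecorderInfo snapshot, so the webhook side never has to look recorder state up again. A minimal emission sketch (the platform variant, ids, and `recorder_info` binding are illustrative, not from the diff):

// event_tx is the broadcast sender created in RecorderManager::new; send
// is fire-and-forget, so a closed channel just yields an ignorable Err.
let _ = event_tx.send(RecorderEvent::LiveEnd {
    room_id: 26966466,
    platform: PlatformType::BiliBili,
    recorder: recorder_info.clone(),
});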
#[derive(Clone)]
pub struct RecorderManager {
#[cfg(not(feature = "headless"))]
app_handle: AppHandle,
@@ -69,6 +80,7 @@ pub struct RecorderManager {
to_remove: Arc<RwLock<HashSet<String>>>,
event_tx: broadcast::Sender<RecorderEvent>,
is_migrating: Arc<AtomicBool>,
webhook_poster: WebhookPoster,
}
custom_error! {pub RecorderManagerError
@@ -113,6 +125,7 @@ impl RecorderManager {
emitter: EventEmitter,
db: Arc<Database>,
config: Arc<RwLock<Config>>,
webhook_poster: WebhookPoster,
) -> RecorderManager {
let (event_tx, _) = broadcast::channel(100);
let manager = RecorderManager {
@@ -125,6 +138,7 @@ impl RecorderManager {
to_remove: Arc::new(RwLock::new(HashSet::new())),
event_tx,
is_migrating: Arc::new(AtomicBool::new(false)),
webhook_poster,
};
// Start event listener
@@ -141,20 +155,6 @@ impl RecorderManager {
manager
}
- pub fn clone(&self) -> Self {
- RecorderManager {
- #[cfg(not(feature = "headless"))]
- app_handle: self.app_handle.clone(),
- emitter: self.emitter.clone(),
- db: self.db.clone(),
- config: self.config.clone(),
- recorders: self.recorders.clone(),
- to_remove: self.to_remove.clone(),
- event_tx: self.event_tx.clone(),
- is_migrating: self.is_migrating.clone(),
- }
- }
pub fn get_event_sender(&self) -> broadcast::Sender<RecorderEvent> {
self.event_tx.clone()
}
@@ -163,25 +163,46 @@ impl RecorderManager {
let mut rx = self.event_tx.subscribe();
while let Ok(event) = rx.recv().await {
match event {
RecorderEvent::LiveStart { recorder } => {
let event =
events::new_webhook_event(events::LIVE_STARTED, Payload::Room(recorder));
let _ = self.webhook_poster.post_event(&event).await;
}
RecorderEvent::LiveEnd {
platform,
room_id,
- live_id,
+ recorder,
} => {
- self.handle_live_end(platform, room_id, &live_id).await;
+ let event = events::new_webhook_event(
+ events::LIVE_ENDED,
+ Payload::Room(recorder.clone()),
+ );
+ let _ = self.webhook_poster.post_event(&event).await;
+ self.handle_live_end(platform, room_id, &recorder).await;
}
RecorderEvent::RecordStart { recorder } => {
let event =
events::new_webhook_event(events::RECORD_STARTED, Payload::Room(recorder));
let _ = self.webhook_poster.post_event(&event).await;
}
RecorderEvent::RecordEnd { recorder } => {
let event =
events::new_webhook_event(events::RECORD_ENDED, Payload::Room(recorder));
let _ = self.webhook_poster.post_event(&event).await;
}
}
}
}
- async fn handle_live_end(&self, platform: PlatformType, room_id: u64, live_id: &str) {
+ async fn handle_live_end(&self, platform: PlatformType, room_id: u64, recorder: &RecorderInfo) {
if !self.config.read().await.auto_generate.enabled {
return;
}
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
log::info!("Start auto generate for {}", recorder_id);
- let live_record = self.db.get_record(room_id, live_id).await;
+ let live_id = recorder.current_live_id.clone();
+ let live_record = self.db.get_record(room_id, &live_id).await;
if live_record.is_err() {
log::error!("Live not found in record: {} {}", room_id, live_id);
return;
@@ -201,15 +222,15 @@ impl RecorderManager {
let clip_config = ClipRangeParams {
title: live_record.title,
note: "".into(),
cover: "".into(),
platform: live_record.platform.clone(),
room_id,
live_id: live_id.to_string(),
- x: 0,
- y: 0,
- offset: recorder.first_segment_ts(live_id).await,
+ range: None,
danmu: encode_danmu,
local_offset: 0,
fix_encoding: false,
};
let clip_filename = self.config.read().await.generate_clip_name(&clip_config);
@@ -242,6 +263,7 @@ impl RecorderManager {
created_at: Utc::now().to_rfc3339(),
cover: "".into(),
file: f.file_name().unwrap().to_str().unwrap().to_string(),
note: "".into(),
length: live_record.length,
size: metadata.len() as i64,
bvid: "".into(),
@@ -292,7 +314,8 @@ impl RecorderManager {
let platform = PlatformType::from_str(&recorder.platform).unwrap();
let room_id = recorder.room_id;
let auto_start = recorder.auto_start;
- recorder_map.insert((platform, room_id), auto_start);
+ let extra = recorder.extra;
+ recorder_map.insert((platform, room_id), (auto_start, extra));
}
let mut recorders_to_add = Vec::new();
for (platform, room_id) in recorder_map.keys() {
@@ -307,7 +330,7 @@ impl RecorderManager {
if self.is_migrating.load(std::sync::atomic::Ordering::Relaxed) {
break;
}
- let auto_start = recorder_map.get(&(platform, room_id)).unwrap();
+ let (auto_start, extra) = recorder_map.get(&(platform, room_id)).unwrap();
let account = self
.db
.get_account_by_platform(platform.clone().as_str())
@@ -319,7 +342,7 @@ impl RecorderManager {
let account = account.unwrap();
if let Err(e) = self
- .add_recorder(&account, platform, room_id, *auto_start)
+ .add_recorder(&account, platform, room_id, extra, *auto_start)
.await
{
log::error!("Failed to add recorder: {}", e);
@@ -334,6 +357,7 @@ impl RecorderManager {
account: &AccountRow,
platform: PlatformType,
room_id: u64,
extra: &str,
auto_start: bool,
) -> Result<(), RecorderManagerError> {
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -363,6 +387,7 @@ impl RecorderManager {
self.app_handle.clone(),
self.emitter.clone(),
room_id,
extra,
self.config.clone(),
account,
&self.db,
@@ -404,7 +429,7 @@ impl RecorderManager {
&self,
platform: PlatformType,
room_id: u64,
- ) -> Result<(), RecorderManagerError> {
+ ) -> Result<RecorderRow, RecorderManagerError> {
// check recorder exists
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
if !self.recorders.read().await.contains_key(&recorder_id) {
@@ -412,7 +437,7 @@ impl RecorderManager {
}
// remove from db
- self.db.remove_recorder(room_id).await?;
+ let recorder = self.db.remove_recorder(room_id).await?;
// add to to_remove
log::debug!("Add to to_remove: {}", recorder_id);
@@ -443,7 +468,7 @@ impl RecorderManager {
let _ = tokio::fs::remove_dir_all(cache_folder).await;
log::info!("Recorder {} cache folder removed", room_id);
- Ok(())
+ Ok(recorder)
}
pub async fn clip_range(
@@ -475,14 +500,21 @@ impl RecorderManager {
params: &ClipRangeParams,
) -> Result<PathBuf, RecorderManagerError> {
let range_m3u8 = format!(
"{}/{}/{}/playlist.m3u8?start={}&end={}",
params.platform, params.room_id, params.live_id, params.x, params.y
"{}/{}/{}/playlist.m3u8",
params.platform, params.room_id, params.live_id
);
let manifest_content = self.handle_hls_request(&range_m3u8).await?;
- let manifest_content = String::from_utf8(manifest_content)
+ let mut manifest_content = String::from_utf8(manifest_content)
.map_err(|e| RecorderManagerError::ClipError { err: e.to_string() })?;
// if the manifest is for a live stream, replace EXT-X-PLAYLIST-TYPE:EVENT with EXT-X-PLAYLIST-TYPE:VOD and add #EXT-X-ENDLIST
if manifest_content.contains("#EXT-X-PLAYLIST-TYPE:EVENT") {
manifest_content =
manifest_content.replace("#EXT-X-PLAYLIST-TYPE:EVENT", "#EXT-X-PLAYLIST-TYPE:VOD");
manifest_content += "\n#EXT-X-ENDLIST\n";
}
let cache_path = self.config.read().await.cache.clone();
let cache_path = Path::new(&cache_path);
let random_filename = format!("manifest_{}.m3u8", uuid::Uuid::new_v4());
@@ -497,7 +529,15 @@ impl RecorderManager {
.await
.map_err(|e| RecorderManagerError::ClipError { err: e.to_string() })?;
- if let Err(e) = clip_from_m3u8(reporter, &tmp_manifest_file_path, &clip_file).await {
+ if let Err(e) = clip_from_m3u8(
+ reporter,
+ &tmp_manifest_file_path,
+ &clip_file,
+ params.range.as_ref(),
+ params.fix_encoding,
+ )
+ .await
+ {
log::error!("Failed to generate clip file: {}", e);
return Err(RecorderManagerError::ClipError { err: e.to_string() });
}
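To see what the EVENT-to-VOD rewrite in this hunk produces, here is the same transformation applied to a toy manifest (the content is illustrative; the real one comes from handle_hls_request):

let mut m = "#EXTM3U\n#EXT-X-PLAYLIST-TYPE:EVENT\n#EXTINF:4.0000,\n0.ts\n".to_string();
if m.contains("#EXT-X-PLAYLIST-TYPE:EVENT") {
    m = m.replace("#EXT-X-PLAYLIST-TYPE:EVENT", "#EXT-X-PLAYLIST-TYPE:VOD");
    m += "\n#EXT-X-ENDLIST\n";
}
// ffmpeg now sees a finite VOD playlist instead of a still-growing EVENT one.
assert!(m.contains("#EXT-X-PLAYLIST-TYPE:VOD") && m.ends_with("#EXT-X-ENDLIST\n"));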
@@ -517,12 +557,6 @@ impl RecorderManager {
return Ok(clip_file);
}
- let mut clip_offset = params.offset;
- if clip_offset > 0 {
- clip_offset -= recorder.first_segment_ts(&params.live_id).await;
- clip_offset = clip_offset.max(0);
- }
let danmus = recorder.comments(&params.live_id).await;
if danmus.is_err() {
log::error!("Failed to get danmus");
@@ -530,20 +564,24 @@ impl RecorderManager {
}
log::info!(
"Filter danmus in range [{}, {}] with global offset {} and local offset {}",
params.x,
params.y,
clip_offset,
"Filter danmus in range {} with local offset {}",
params
.range
.as_ref()
.map_or("None".to_string(), |r| r.to_string()),
params.local_offset
);
let mut danmus = danmus.unwrap();
log::debug!("First danmu entry: {:?}", danmus.first());
- // update entry ts to offset
- for d in &mut danmus {
- d.ts -= (params.x + clip_offset + params.local_offset) * 1000;
- }
- if params.x != 0 || params.y != 0 {
- danmus.retain(|x| x.ts >= 0 && x.ts <= (params.y - params.x) * 1000);
+ if let Some(range) = &params.range {
+ // update entry ts to offset and filter danmus in range
+ for d in &mut danmus {
+ d.ts -= (range.start as i64 + params.local_offset) * 1000;
+ }
+ if range.duration() > 0.0 {
+ danmus.retain(|x| x.ts >= 0 && x.ts <= (range.duration() as i64) * 1000);
+ }
}
if danmus.is_empty() {
@@ -601,8 +639,17 @@ impl RecorderManager {
}
}
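A worked example of the rebasing above, with illustrative numbers: a range starting at 30 s plus a 2 s local offset shifts a danmu stamped 35 000 ms to 3 000 ms relative to the clip, and it survives the retain as long as the clip is at least 3 s long.

let (start, local_offset) = (30i64, 2i64);
let duration = 60.0f64;                  // range.duration(), illustrative
let mut ts = 35_000i64;                  // original danmu timestamp in ms
ts -= (start + local_offset) * 1000;     // 35_000 - 32_000 = 3_000 ms
assert!(ts >= 0 && ts <= (duration as i64) * 1000);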
- pub async fn get_archives(&self, room_id: u64) -> Result<Vec<RecordRow>, RecorderManagerError> {
- Ok(self.db.get_records(room_id).await?)
+ pub async fn get_archive_disk_usage(&self) -> Result<u64, RecorderManagerError> {
+ Ok(self.db.get_record_disk_usage().await?)
+ }
+ pub async fn get_archives(
+ &self,
+ room_id: u64,
+ offset: u64,
+ limit: u64,
+ ) -> Result<Vec<RecordRow>, RecorderManagerError> {
+ Ok(self.db.get_records(room_id, offset, limit).await?)
}
pub async fn get_archive(
@@ -613,7 +660,12 @@ impl RecorderManager {
Ok(self.db.get_record(room_id, live_id).await?)
}
- pub async fn get_archive_subtitle(&self, platform: PlatformType, room_id: u64, live_id: &str) -> Result<String, RecorderManagerError> {
+ pub async fn get_archive_subtitle(
+ &self,
+ platform: PlatformType,
+ room_id: u64,
+ live_id: &str,
+ ) -> Result<String, RecorderManagerError> {
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
let recorder = recorder_ref.as_ref();
@@ -623,7 +675,12 @@ impl RecorderManager {
}
}
- pub async fn generate_archive_subtitle(&self, platform: PlatformType, room_id: u64, live_id: &str) -> Result<String, RecorderManagerError> {
+ pub async fn generate_archive_subtitle(
+ &self,
+ platform: PlatformType,
+ room_id: u64,
+ live_id: &str,
+ ) -> Result<String, RecorderManagerError> {
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
let recorder = recorder_ref.as_ref();
@@ -638,7 +695,7 @@ impl RecorderManager {
platform: PlatformType,
room_id: u64,
live_id: &str,
- ) -> Result<(), RecorderManagerError> {
+ ) -> Result<RecordRow, RecorderManagerError> {
log::info!("Deleting {}:{}", room_id, live_id);
// check if this is still recording
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -650,13 +707,28 @@ impl RecorderManager {
});
}
}
- self.db.remove_record(live_id).await?;
+ let to_delete = self.db.remove_record(live_id).await?;
let cache_folder = Path::new(self.config.read().await.cache.as_str())
.join(platform.as_str())
.join(room_id.to_string())
.join(live_id);
let _ = tokio::fs::remove_dir_all(cache_folder).await;
- Ok(())
+ Ok(to_delete)
}
pub async fn delete_archives(
&self,
platform: PlatformType,
room_id: u64,
live_ids: &[&str],
) -> Result<Vec<RecordRow>, RecorderManagerError> {
log::info!("Deleting archives in batch: {:?}", live_ids);
let mut to_deletes = Vec::new();
for live_id in live_ids {
let to_delete = self.delete_archive(platform, room_id, live_id).await?;
to_deletes.push(to_delete);
}
Ok(to_deletes)
}
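delete_archives simply loops the single-archive path, so a failure part-way returns early and the rows already removed are not reported back. A usage sketch (the platform variant and ids are illustrative):

let removed = manager
    .delete_archives(PlatformType::Douyin, 123456, &["live-a", "live-b"])
    .await?;
log::info!("removed {} archives", removed.len());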
pub async fn get_danmu(

View File

@@ -6,6 +6,7 @@ use crate::config::Config;
use crate::database::Database;
use crate::recorder::bilibili::client::BiliClient;
use crate::recorder_manager::RecorderManager;
use crate::webhook::poster::WebhookPoster;
#[cfg(feature = "headless")]
use crate::progress_manager::ProgressManager;
@@ -21,6 +22,7 @@ pub struct State {
pub db: Arc<Database>,
pub client: Arc<BiliClient>,
pub config: Arc<RwLock<Config>>,
pub webhook_poster: WebhookPoster,
pub recorder_manager: Arc<RecorderManager>,
#[cfg(not(feature = "headless"))]
pub app_handle: tauri::AppHandle,

View File

@@ -178,6 +178,7 @@ mod tests {
}
#[tokio::test]
#[ignore = "Might not have enough memory to run this test"]
async fn generate_subtitle() {
let whisper = new(Path::new("tests/model/ggml-tiny-q5_1.bin"), "")
.await

View File

@@ -228,6 +228,7 @@ mod tests {
}
#[tokio::test]
#[ignore = "requres api key"]
async fn test_generate_subtitle() {
let result = new(Some("https://api.openai.com/v1"), Some("sk-****"), None).await;
assert!(result.is_ok());

View File

@@ -0,0 +1,47 @@
use uuid::Uuid;
use crate::{
database::{account::AccountRow, record::RecordRow, recorder::RecorderRow, video::VideoRow},
recorder::RecorderInfo,
};
pub const CLIP_GENERATED: &str = "clip.generated";
pub const CLIP_DELETED: &str = "clip.deleted";
pub const RECORD_STARTED: &str = "record.started";
pub const RECORD_ENDED: &str = "record.ended";
pub const LIVE_STARTED: &str = "live.started";
pub const LIVE_ENDED: &str = "live.ended";
pub const ARCHIVE_DELETED: &str = "archive.deleted";
pub const RECORDER_REMOVED: &str = "recorder.removed";
pub const RECORDER_ADDED: &str = "recorder.added";
#[derive(serde::Serialize, serde::Deserialize, Debug, Clone)]
pub struct WebhookEvent {
pub id: String,
pub event: String,
pub payload: Payload,
pub timestamp: i64,
}
#[derive(serde::Serialize, serde::Deserialize, Debug, Clone)]
#[serde(untagged)]
pub enum Payload {
Account(AccountRow),
Recorder(RecorderRow),
Room(RecorderInfo),
Clip(VideoRow),
Archive(RecordRow),
}
pub fn new_webhook_event(event_type: &str, payload: Payload) -> WebhookEvent {
WebhookEvent {
id: Uuid::new_v4().to_string(),
event: event_type.to_string(),
payload,
timestamp: chrono::Utc::now().timestamp(),
}
}
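Because Payload is #[serde(untagged)], each variant serializes as its bare inner object with no discriminator key; webhook consumers must infer the payload shape from the event field. An illustrative round trip (`video_row` is assumed to be a VideoRow already in scope; the JSON shown is made up):

let event = new_webhook_event(CLIP_GENERATED, Payload::Clip(video_row));
let body = serde_json::to_string(&event).unwrap();
// body ≈ {"id":"<uuid>","event":"clip.generated","payload":{ /* VideoRow fields */ },"timestamp":1725700000}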

View File

@@ -0,0 +1,2 @@
pub mod events;
pub mod poster;

Some files were not shown because too many files have changed in this diff.