Mirror of https://github.com/Xinrea/bili-shadowreplay.git (synced 2025-11-25 04:22:24 +08:00)
Compare commits
114 Commits
| SHA1 |
|---|
| 08979d2079 |
| c6efe07303 |
| 7294f0ca6d |
| eac1c09149 |
| 1e9cd61eba |
| 7b7f341fa0 |
| ac806b49b2 |
| f20636a107 |
| 787a30e6f7 |
| d1d217be18 |
| 944d0a371a |
| 0df03e0c9c |
| 7ffdf65705 |
| 89cdf91a48 |
| 43ebc27044 |
| e6159555f3 |
| 1f2508aae9 |
| ad13f58fa7 |
| de4959d49f |
| b5b75129e7 |
| 84346a486f |
| 3bdcddf5a2 |
| 98f68a5e14 |
| 2249b86af3 |
| fd889922d8 |
| 8db7c6e320 |
| 5bc4ed6dfd |
| 22ad5f7fea |
| c0369c1a14 |
| 322f4a3ca5 |
| 4e32453441 |
| 66725b8a64 |
| f7bcbbca83 |
| 07a3b33040 |
| 2f9b4582f8 |
| c3f63c58cf |
| 4a3529bc2e |
| b0355a919f |
| cfe1a0b4b9 |
| b655e98f35 |
| 2d1021bc42 |
| 33d74999b9 |
| 84b7dd7a3c |
| 0c678fbda3 |
| 3486f7d050 |
| d42a1010b8 |
| ece6ceea45 |
| b22ebb399e |
| 4431b10cb7 |
| 01a0c929e8 |
| b06f6e8d09 |
| 753227acbb |
| c7dd9091d0 |
| bae20ce011 |
| 8da4759668 |
| eb7c6d91e9 |
| 3c24dfe8a6 |
| bb916daaaf |
| 3931e484c2 |
| b67e258c31 |
| 1a7e6f5a43 |
| 437204dbe6 |
| af105277d9 |
| 7efd327a36 |
| 0141586fa9 |
| df1d8ccac6 |
| 10b6b95e4d |
| a58e6f77bd |
| fe2bd80ac6 |
| 870b44a973 |
| 48fd9ca7b2 |
| 14d03b7eb9 |
| 6f1db6c038 |
| cd2d208e5c |
| 7d6ec72002 |
| 837cb6a978 |
| aeeb0c08d7 |
| 72d8a7f485 |
| 5d3692c7a0 |
| 7e54231bef |
| 80a885dbf3 |
| 134c6bbb5f |
| 49a153adf7 |
| 99e15b0bda |
| 4de8a73af2 |
| d104ba3180 |
| abf0d4748f |
| d2a9c44601 |
| c269558bae |
| cc22453a40 |
| d525d92de4 |
| 2197dfe65c |
| 38ee00f474 |
| 8fdad41c71 |
| f269995bb7 |
| 03a2db8c44 |
| 6d9cd3c6a8 |
| 303b2f7036 |
| ec25c2ffd9 |
| 50ab608ddb |
| 3c76be9b81 |
| ab7f0cf0b4 |
| f9f590c4dc |
| 8d38fe582a |
| dc4a26561d |
| 10c1d1f3a8 |
| 66bcf53d01 |
| 8ab4b7d693 |
| ce2f097d32 |
| f7575cd327 |
| 8634c6a211 |
| b070013efc |
| d2d9112f6c |
| 9fea18f2de |
51  .cursor/rules/ai-features.mdc  Normal file
@@ -0,0 +1,51 @@
# AI Features and LangChain Integration

## AI Components

- **LangChain Integration**: Uses `@langchain/core`, `@langchain/deepseek`, `@langchain/langgraph`, `@langchain/ollama`
- **Whisper Transcription**: Local and online transcription via `whisper-rs` in the Rust backend
- **AI Agent**: Located in the [src/lib/agent/](mdc:src/lib/agent/) directory

## Frontend AI Features

- **AI Page**: [src/page/AI.svelte](mdc:src/page/AI.svelte) - Main AI interface
- **Agent Logic**: [src/lib/agent/](mdc:src/lib/agent/) - AI agent implementation
- **Interface**: [src/lib/interface.ts](mdc:src/lib/interface.ts) - AI communication layer

## Backend AI Features

- **Subtitle Generation**: [src-tauri/src/subtitle_generator/](mdc:src-tauri/src/subtitle_generator/) - AI-powered subtitle creation
- **Whisper Integration**: [src-tauri/src/subtitle_generator.rs](mdc:src-tauri/src/subtitle_generator.rs) - Speech-to-text processing
- **CUDA Support**: Optional CUDA acceleration for Whisper via a feature flag

## AI Workflows

- **Live Transcription**: Real-time speech-to-text during live streams
- **Content Summarization**: AI-powered content analysis and summarization
- **Smart Editing**: AI-assisted video editing and clip generation
- **Danmaku Processing**: AI analysis of danmaku (bullet comment) streams

## Configuration

- **LLM Settings**: Configure AI models in [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)
- **Whisper Models**: Local model configuration for offline transcription
- **API Keys**: External AI service configuration for online features

## Development Notes

- AI features require proper model configuration
- The CUDA feature enables GPU acceleration for Whisper
- LangChain integration supports multiple AI providers
- The AI agent can work with both local and cloud-based models

description:
globs:
alwaysApply: true

---
62  .cursor/rules/build-deployment.mdc  Normal file
@@ -0,0 +1,62 @@
# Build and Deployment Configuration

## Build Scripts

- **PowerShell**: [build.ps1](mdc:build.ps1) - Windows build script
- **FFmpeg Setup**: [ffmpeg_setup.ps1](mdc:ffmpeg_setup.ps1) - FFmpeg installation script
- **Version Bump**: [scripts/bump.cjs](mdc:scripts/bump.cjs) - Version management script

## Package Management

- **Node.js**: [package.json](mdc:package.json) - Frontend dependencies and scripts
- **Rust**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml) - Backend dependencies and features
- **Lock Files**: [yarn.lock](mdc:yarn.lock) - Yarn dependency lock

## Build Configuration

- **Vite**: [vite.config.ts](mdc:vite.config.ts) - Frontend build tool configuration
- **Tailwind**: [tailwind.config.cjs](mdc:tailwind.config.cjs) - CSS framework configuration
- **PostCSS**: [postcss.config.cjs](mdc:postcss.config.cjs) - CSS processing configuration
- **TypeScript**: [tsconfig.json](mdc:tsconfig.json), [tsconfig.node.json](mdc:tsconfig.node.json) - TypeScript configuration

## Tauri Configuration

- **Main Config**: [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json) - Core Tauri settings
- **Platform Configs**:
  - [src-tauri/tauri.macos.conf.json](mdc:src-tauri/tauri.macos.conf.json) - macOS specific
  - [src-tauri/tauri.linux.conf.json](mdc:src-tauri/tauri.linux.conf.json) - Linux specific
  - [src-tauri/tauri.windows.conf.json](mdc:src-tauri/tauri.windows.conf.json) - Windows specific
  - [src-tauri/tauri.windows.cuda.conf.json](mdc:src-tauri/tauri.windows.cuda.conf.json) - Windows with CUDA

## Docker Support

- **Dockerfile**: [Dockerfile](mdc:Dockerfile) - Container deployment configuration
- **Documentation**: [docs/](mdc:docs/) - VitePress-based documentation site

## Build Commands

- **Frontend**: `yarn build` - Build production frontend
- **Tauri**: `yarn tauri build` - Build desktop application
- **Documentation**: `yarn docs:build` - Build documentation site
- **Type Check**: `yarn check` - TypeScript and Svelte validation

## Deployment Targets

- **Desktop**: Native Tauri applications for Windows, macOS, Linux
- **Docker**: Containerized deployment option
- **Documentation**: Static site deployment via VitePress
- **Assets**: Static asset distribution for web components

description:
globs:
alwaysApply: true

---
61  .cursor/rules/database-data.mdc  Normal file
@@ -0,0 +1,61 @@
# Database and Data Management

## Database Architecture

- **SQLite Database**: Primary data storage using `sqlx` with an async runtime
- **Database Module**: [src-tauri/src/database/](mdc:src-tauri/src/database/) - Core database operations
- **Migration System**: [src-tauri/src/migration.rs](mdc:src-tauri/src/migration.rs) - Database schema management

## Data Models

- **Recording Data**: Stream metadata, recording sessions, and file information
- **Room Configuration**: Stream room settings and platform credentials
- **Task Management**: Recording task status and progress tracking
- **User Preferences**: Application settings and user configurations

## Frontend Data Layer

- **Database Interface**: [src/lib/db.ts](mdc:src/lib/db.ts) - Frontend database operations
- **Stores**: [src/lib/stores/](mdc:src/lib/stores/) - State management for data
- **Version Management**: [src/lib/stores/version.ts](mdc:src/lib/stores/version.ts) - Version tracking

## Data Operations

- **CRUD Operations**: Create, read, update, delete for all data entities
- **Query Optimization**: Efficient SQL queries with proper indexing
- **Transaction Support**: ACID compliance for critical operations
- **Data Validation**: Input validation and sanitization

## File Management

- **Cache Directory**: [src-tauri/cache/](mdc:src-tauri/cache/) - Temporary file storage
- **Upload Directory**: [src-tauri/cache/uploads/](mdc:src-tauri/cache/uploads/) - User upload storage
- **Bilibili Cache**: [src-tauri/cache/bilibili/](mdc:src-tauri/cache/bilibili/) - Platform-specific cache

## Data Persistence

- **SQLite Files**: [src-tauri/data/data_v2.db](mdc:src-tauri/data/data_v2.db) - Main database file
- **Write-Ahead Logging**: WAL mode for concurrent access and performance
- **Backup Strategy**: Database backup and recovery procedures
- **Migration Handling**: Automatic schema updates and data migration

## Development Guidelines

- Use prepared statements to prevent SQL injection (see the sketch after this file)
- Implement proper error handling for database operations
- Use transactions for multi-step operations
- Follow database naming conventions consistently
- Test database operations with sample data

description:
globs:
alwaysApply: true

---
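As a reader's aid, here is a minimal sketch of the two guidelines flagged above (prepared statements and transactions) using `sqlx` with SQLite. The table and column names are hypothetical illustrations, not the project's actual schema:

```rust
use sqlx::sqlite::SqlitePool;

// Hypothetical schema: `rooms` and `audit_log` are illustrative names,
// not tables from data_v2.db.
async fn rename_room(pool: &SqlitePool, room_id: i64, title: &str) -> Result<(), sqlx::Error> {
    let mut tx = pool.begin().await?;
    // Prepared statement: values are bound, never formatted into the SQL string.
    sqlx::query("UPDATE rooms SET title = ? WHERE room_id = ?")
        .bind(title)
        .bind(room_id)
        .execute(&mut *tx)
        .await?;
    sqlx::query("INSERT INTO audit_log (room_id, action) VALUES (?, 'rename')")
        .bind(room_id)
        .execute(&mut *tx)
        .await?;
    // Both statements succeed together or roll back together.
    tx.commit().await?;
    Ok(())
}
```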
47  .cursor/rules/frontend-development.mdc  Normal file
@@ -0,0 +1,47 @@
# Frontend Development Guidelines

## Svelte 3 Best Practices

- Use Svelte 3 syntax with `<script>` tags for component logic
- Prefer reactive statements with `$:` for derived state
- Use stores from [src/lib/stores/](mdc:src/lib/stores/) for global state management
- Import components from [src/lib/components/](mdc:src/lib/components/)

## TypeScript Configuration

- Follow the configuration in [tsconfig.json](mdc:tsconfig.json)
- Use strict type checking with `checkJs: true`
- Extends `@tsconfig/svelte` for Svelte-specific TypeScript settings
- The base URL is set to the workspace root for clean imports

## Component Structure

- **Page components**: Located in the [src/page/](mdc:src/page/) directory
- **Reusable components**: Located in the [src/lib/components/](mdc:src/lib/components/) directory
- **Layout components**: [src/App.svelte](mdc:src/App.svelte), [src/AppClip.svelte](mdc:src/AppClip.svelte), [src/AppLive.svelte](mdc:src/AppLive.svelte)

## Styling

- Use Tailwind CSS classes for styling
- Configuration in [tailwind.config.cjs](mdc:tailwind.config.cjs)
- PostCSS configuration in [postcss.config.cjs](mdc:postcss.config.cjs)
- Global styles in [src/styles.css](mdc:src/styles.css)

## Entry Points

- **Main app**: [src/main.ts](mdc:src/main.ts) - Main application entry
- **Clip mode**: [src/main_clip.ts](mdc:src/main_clip.ts) - Clip editing interface
- **Live mode**: [src/main_live.ts](mdc:src/main_live.ts) - Live streaming interface

## Development Workflow

- Use `yarn dev` for frontend-only development
- Use `yarn tauri dev` for full Tauri development
- Use `yarn check` for TypeScript and Svelte type checking

description:
globs:
alwaysApply: true

---
53  .cursor/rules/project-overview.mdc  Normal file
@@ -0,0 +1,53 @@
# BiliBili ShadowReplay Project Overview

This is a Tauri-based desktop application for caching live streams and performing real-time editing and submission. It supports the Bilibili and Douyin platforms.

## Project Structure

### Frontend (Svelte + TypeScript)

- **Main entry points**: [src/main.ts](mdc:src/main.ts), [src/main_clip.ts](mdc:src/main_clip.ts), [src/main_live.ts](mdc:src/main_live.ts)
- **App components**: [src/App.svelte](mdc:src/App.svelte), [src/AppClip.svelte](mdc:src/AppClip.svelte), [src/AppLive.svelte](mdc:src/AppLive.svelte)
- **Pages**: Located in the [src/page/](mdc:src/page/) directory
- **Components**: Located in the [src/lib/components/](mdc:src/lib/components/) directory
- **Stores**: Located in the [src/lib/stores/](mdc:src/lib/stores/) directory

### Backend (Rust + Tauri)

- **Main entry**: [src-tauri/src/main.rs](mdc:src-tauri/src/main.rs)
- **Core modules**:
  - [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Stream recording functionality
  - [src-tauri/src/database/](mdc:src-tauri/src/database/) - Database operations
  - [src-tauri/src/handlers/](mdc:src-tauri/src/handlers/) - Tauri command handlers
- **Custom crate**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Danmaku stream processing

### Configuration

- **Frontend config**: [tsconfig.json](mdc:tsconfig.json), [vite.config.ts](mdc:vite.config.ts), [tailwind.config.cjs](mdc:tailwind.config.cjs)
- **Backend config**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml), [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json)
- **Example config**: [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)

## Key Technologies

- **Frontend**: Svelte 3, TypeScript, Tailwind CSS, Flowbite
- **Backend**: Rust, Tauri 2, SQLite, FFmpeg
- **AI Features**: LangChain, Whisper for transcription
- **Build Tools**: Vite, VitePress for documentation

## Development Commands

- `yarn dev` - Start development server
- `yarn tauri dev` - Start Tauri development
- `yarn build` - Build frontend
- `yarn docs:dev` - Start documentation server

description:
globs:
alwaysApply: true

---
56  .cursor/rules/rust-backend.mdc  Normal file
@@ -0,0 +1,56 @@
# Rust Backend Development Guidelines

## Project Structure

- **Main entry**: [src-tauri/src/main.rs](mdc:src-tauri/src/main.rs) - Application entry point
- **Core modules**:
  - [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Stream recording and management
  - [src-tauri/src/database/](mdc:src-tauri/src/database/) - SQLite database operations
  - [src-tauri/src/handlers/](mdc:src-tauri/src/handlers/) - Tauri command handlers
  - [src-tauri/src/subtitle_generator/](mdc:src-tauri/src/subtitle_generator/) - AI-powered subtitle generation

## Custom Crates

- **danmu_stream**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Danmaku stream processing library

## Dependencies

- **Tauri 2**: Core framework for desktop app functionality
- **FFmpeg**: Video/audio processing via `async-ffmpeg-sidecar`
- **Whisper**: AI transcription via `whisper-rs` (CUDA support available)
- **LangChain**: AI agent functionality
- **SQLite**: Database via `sqlx` with async runtime

## Configuration

- **Cargo.toml**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml) - Dependencies and features
- **Tauri config**: [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json) - App configuration
- **Example config**: [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml) - User configuration template

## Features

- **default**: Includes GUI and core functionality
- **cuda**: Enables CUDA acceleration for Whisper transcription
- **headless**: Headless mode without GUI
- **custom-protocol**: Required for production builds

## Development Commands

- `yarn tauri dev` - Start Tauri development with hot reload
- `yarn tauri build` - Build production application
- `cargo check` - Check Rust code without building
- `cargo test` - Run Rust tests

description:
globs:
alwaysApply: true

---
60  .cursor/rules/streaming-recording.mdc  Normal file
@@ -0,0 +1,60 @@
# Streaming and Recording System

## Core Recording Components

- **Recorder Manager**: [src-tauri/src/recorder_manager.rs](mdc:src-tauri/src/recorder_manager.rs) - Main recording orchestration
- **Recorder**: [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Individual stream recording logic
- **Danmaku Stream**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Custom crate for bullet comment processing

## Supported Platforms

- **Bilibili**: Main platform support with live stream caching
- **Douyin**: Support for Douyin, TikTok's Chinese counterpart
- **Multi-stream**: Support for recording multiple streams simultaneously

## Recording Features

- **Live Caching**: Real-time stream recording and buffering
- **Time-based Clipping**: Extract specific time segments from recorded streams
- **Danmaku Capture**: Record bullet comments and chat messages
- **Quality Control**: Configurable recording quality and format options

## Frontend Interfaces

- **Live Mode**: [src/AppLive.svelte](mdc:src/AppLive.svelte) - Live streaming interface
- **Clip Mode**: [src/AppClip.svelte](mdc:src/AppClip.svelte) - Video editing and clipping
- **Room Management**: [src/page/Room.svelte](mdc:src/page/Room.svelte) - Stream room configuration
- **Task Management**: [src/page/Task.svelte](mdc:src/page/Task.svelte) - Recording task monitoring

## Technical Implementation

- **FFmpeg Integration**: Video/audio processing via `async-ffmpeg-sidecar`
- **M3U8 Support**: HLS stream processing with `m3u8-rs` (see the sketch after this file)
- **Async Processing**: Non-blocking I/O with the `tokio` runtime
- **Database Storage**: SQLite for metadata and recording information

## Configuration

- **Recording Settings**: Configure in [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)
- **FFmpeg Path**: Set the FFmpeg binary location for video processing
- **Storage Paths**: Configure cache and output directories
- **Quality Settings**: Adjust recording bitrate and format options

## Development Workflow

- Use [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) for core recording logic
- Test with the [src-tauri/tests/](mdc:src-tauri/tests/) directory
- Monitor recording progress via the progress manager
- Handle errors gracefully with custom error types

description:
globs:
alwaysApply: true

---
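To illustrate the M3U8 support mentioned above: a media playlist can be parsed with `m3u8-rs` and its segment durations summed, which is roughly how a time-based clip can locate the segments covering a time range. This is a sketch assuming the m3u8-rs 5.x API, not code from the recorder itself:

```rust
// Sketch only: parse an HLS media playlist and report its total duration.
fn playlist_duration(bytes: &[u8]) -> f32 {
    match m3u8_rs::parse_playlist_res(bytes) {
        Ok(m3u8_rs::Playlist::MediaPlaylist(pl)) => {
            // Each MediaSegment carries its #EXTINF duration in seconds.
            pl.segments.iter().map(|seg| seg.duration).sum()
        }
        _ => 0.0, // master playlist or parse failure
    }
}
```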
36  .devcontainer/Dockerfile  Normal file
@@ -0,0 +1,36 @@
ARG VARIANT=bookworm-slim
FROM debian:${VARIANT}
ENV DEBIAN_FRONTEND=noninteractive

# Arguments
ARG CONTAINER_USER=vscode
ARG CONTAINER_GROUP=vscode

# Install dependencies
RUN apt-get update \
    && apt-get install -y \
    build-essential \
    clang \
    cmake \
    curl \
    file \
    git \
    libayatana-appindicator3-dev \
    librsvg2-dev \
    libssl-dev \
    libwebkit2gtk-4.1-dev \
    libxdo-dev \
    pkg-config \
    wget \
    && apt-get clean -y && rm -rf /var/lib/apt/lists/* /tmp/library-scripts

# Set users
RUN adduser --disabled-password --gecos "" ${CONTAINER_USER}
USER ${CONTAINER_USER}
WORKDIR /home/${CONTAINER_USER}

# Install rustup
RUN curl --proto "=https" --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --profile minimal
ENV PATH=${PATH}:/home/${CONTAINER_USER}/.cargo/bin

CMD [ "/bin/bash" ]
31  .devcontainer/devcontainer.json  Normal file
@@ -0,0 +1,31 @@
{
  "name": "vscode",
  "build": {
    "dockerfile": "Dockerfile",
    "args": {
      "CONTAINER_USER": "vscode",
      "CONTAINER_GROUP": "vscode"
    }
  },
  "features": {
    "ghcr.io/devcontainers/features/node:1": {
      "version": "latest"
    }
  },
  "customizations": {
    "vscode": {
      "settings": {
        "lldb.executable": "/usr/bin/lldb",
        "files.watcherExclude": {
          "**/target/**": true
        }
      },
      "extensions": [
        "vadimcn.vscode-lldb",
        "rust-lang.rust-analyzer",
        "tamasfe.even-better-toml"
      ]
    }
  },
  "remoteUser": "vscode"
}
7  .github/CONTRIBUTING.md  vendored
@@ -12,7 +12,8 @@

### Windows

Windows 下分为两个版本,分别是 `cpu` 和 `cuda` 版本。区别在于 Whisper 是否使用 GPU 加速。`cpu` 版本使用 CPU 进行推理,`cuda` 版本使用 GPU 进行推理。
Windows 下分为两个版本,分别是 `cpu` 和 `cuda` 版本。区别在于 Whisper 是否使用 GPU 加速。
`cpu` 版本使用 CPU 进行推理,`cuda` 版本使用 GPU 进行推理。

默认运行为 `cpu` 版本,使用 `yarn tauri dev --features cuda` 命令运行 `cuda` 版本。

@@ -20,7 +21,9 @@ Windows 下分为两个版本,分别是 `cpu` 和 `cuda` 版本。区别在于

1. 安装 LLVM 且配置相关环境变量,详情见 [LLVM Windows Setup](https://llvm.org/docs/GettingStarted.html#building-llvm-on-windows);

2. 安装 CUDA Toolkit,详情见 [CUDA Windows Setup](https://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html);要注意,安装时请勾选 **VisualStudio integration**。
2. 安装 CUDA Toolkit,详情见
   [CUDA Windows Setup](https://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html);
   要注意,安装时请勾选 **VisualStudio integration**。

### 常见问题
21  .github/ISSUE_TEMPLATE/bug_report.md  vendored
@@ -1,21 +0,0 @@
---
name: Bug report
about: 提交一个 BUG
title: "[BUG]"
labels: bug
assignees: Xinrea
---

**描述:**
简要描述一下这个 BUG 的现象

**日志和截图:**
如果可以的话,请尽量附上相关截图和日志文件(日志是位于安装目录下,名为 bsr.log 的文件)。

**相关信息:**

- 程序版本:
- 系统类型:

**其他**
任何其他想说的
47  .github/ISSUE_TEMPLATE/bug_report.yml  vendored  Normal file
@@ -0,0 +1,47 @@
name: Bug Report
description: 提交 BUG 报告.
title: "[bug] "
labels: ["bug"]
assignees:
  - Xinrea
body:
  - type: checkboxes
    attributes:
      label: 提交须知
      description: 请确认以下内容
      options:
        - label: 我是在最新版本上发现的此问题
          required: true
        - label: 我已阅读 [常见问题](https://bsr.xinrea.cn/usage/faq.html) 的说明
          required: true
  - type: dropdown
    id: app_type
    attributes:
      label: 以哪种方式使用的该软件?
      multiple: false
      options:
        - Docker 镜像
        - 桌面应用
  - type: dropdown
    id: os
    attributes:
      label: 运行环境
      multiple: false
      options:
        - Linux
        - Windows
        - MacOS
        - Docker
  - type: textarea
    attributes:
      label: BUG 描述
      description: 请尽可能详细描述 BUG 的现象以及复现的方法
    validations:
      required: true
  - type: textarea
    id: logs
    attributes:
      label: 日志
      description: 请粘贴日志内容或是上传日志文件(在主窗口的设置页面,提供了一键打开日志目录所在位置的按钮;当你打开日志目录所在位置后,进入 logs 目录,找到后缀名为 log 的文件)
    validations:
      required: true
20  .github/ISSUE_TEMPLATE/feature_request.md  vendored
@@ -1,20 +0,0 @@
---
name: Feature request
about: 提交一个新功能的建议
title: "[feature]"
labels: enhancement
assignees: Xinrea

---

**遇到的问题:**
在使用过程中遇到了什么问题让你想要提出建议

**想要的功能:**
想要怎样的新功能来解决这个问题

**通过什么方式实现(有思路的话):**
如果有相关的实现思路或者是参考,可以在此提供

**其他:**
其他任何想说的话
13  .github/ISSUE_TEMPLATE/feature_request.yml  vendored  Normal file
@@ -0,0 +1,13 @@
name: Feature Request
description: 提交新功能的需求
title: "[feature] "
labels: ["feature"]
assignees:
  - Xinrea
body:
  - type: textarea
    attributes:
      label: 需求描述
      description: 请尽可能详细描述你想要的新功能
    validations:
      required: true
46  .github/workflows/check.yml  vendored  Normal file
@@ -0,0 +1,46 @@
name: Rust Check

on:
  push:
    paths:
      - "**/*.rs"
      - "src-tauri/Cargo.toml"
      - "src-tauri/Cargo.lock"

jobs:
  check:
    runs-on: self-linux

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Rust
        uses: dtolnay/rust-toolchain@stable
        with:
          components: rustfmt clippy

      - name: Install dependencies (ubuntu only)
        run: |
          sudo apt-get update
          sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf ffmpeg

      - name: Check formatting
        run: cargo fmt --check
        working-directory: src-tauri

      - name: Check clippy
        run: cargo clippy
        working-directory: src-tauri

      - name: Check clippy (headless)
        run: cargo clippy --no-default-features --features headless
        working-directory: src-tauri

      - name: Check tests
        run: cargo test -v
        working-directory: src-tauri

      - name: Check tests (headless)
        run: cargo test --no-default-features --features headless -v
        working-directory: src-tauri
22  .github/workflows/main.yml  vendored
@@ -59,12 +59,6 @@ jobs:
        if: matrix.platform == 'windows-latest' && matrix.features == 'cuda'
        uses: Jimver/cuda-toolkit@v0.2.24

      - name: Rust cache
        uses: swatinem/rust-cache@v2
        with:
          workspaces: "./src-tauri -> target"
          shared-key: ${{ matrix.platform }}

      - name: Setup ffmpeg
        if: matrix.platform == 'windows-latest'
        working-directory: ./

@@ -88,6 +82,19 @@ jobs:
          Copy-Item "$cudaPath\cublas64*.dll" -Destination $targetPath
          Copy-Item "$cudaPath\cublasLt64*.dll" -Destination $targetPath

      - name: Get previous tag
        id: get_previous_tag
        run: |
          # Get the previous tag (excluding the current one being pushed)
          PREVIOUS_TAG=$(git describe --tags --abbrev=0 HEAD~1 2>/dev/null || echo "")
          if [ -z "$PREVIOUS_TAG" ]; then
            # If no previous tag found, use the first commit
            PREVIOUS_TAG=$(git rev-list --max-parents=0 HEAD | head -1)
          fi
          echo "previous_tag=$PREVIOUS_TAG" >> $GITHUB_OUTPUT
          echo "current_tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
        shell: bash

      - uses: tauri-apps/tauri-action@v0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

@@ -97,8 +104,7 @@ jobs:
        with:
          tagName: v__VERSION__
          releaseName: "BiliBili ShadowReplay v__VERSION__"
          releaseBody: "See the assets to download this version and install."
          releaseBody: "> [!NOTE]\n> 如果你是第一次下载安装,请参考 [安装准备](https://bsr.xinrea.cn/getting-started/installation/desktop.html) 选择合适的版本。\n> Changelog: https://github.com/Xinrea/bili-shadowreplay/compare/${{ steps.get_previous_tag.outputs.previous_tag }}...${{ steps.get_previous_tag.outputs.current_tag }}"
          releaseDraft: true
          prerelease: false
          args: ${{ matrix.args }} ${{ matrix.platform == 'windows-latest' && matrix.features == 'cuda' && '--config src-tauri/tauri.windows.cuda.conf.json' || '' }}
          includeDebug: true
1  .gitignore  vendored
@@ -11,6 +11,7 @@ node_modules
dist
dist-ssr
*.local
/target/

# Editor directories and files
.vscode/*
5  .markdownlint.json  Normal file
@@ -0,0 +1,5 @@
{
  "MD033": {
    "allowed_elements": ["nobr", "sup"]
  }
}
51  .pre-commit-config.yaml  Normal file
@@ -0,0 +1,51 @@
fail_fast: true

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v6.0.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
        exclude: \.json$

  - repo: https://github.com/crate-ci/typos
    rev: v1.36.2
    hooks:
      - id: typos

  - repo: local
    hooks:
      - id: cargo-fmt
        name: cargo fmt
        entry: cargo fmt --manifest-path src-tauri/Cargo.toml --
        language: system
        types: [rust]
        pass_filenames: false # This makes it a lot faster

      - id: cargo-clippy
        name: cargo clippy
        language: system
        types: [rust]
        pass_filenames: false
        entry: cargo clippy --manifest-path src-tauri/Cargo.toml

      - id: cargo-clippy-headless
        name: cargo clippy headless
        language: system
        types: [rust]
        pass_filenames: false
        entry: cargo clippy --manifest-path src-tauri/Cargo.toml --no-default-features --features headless

      - id: cargo-test
        name: cargo test
        language: system
        types: [rust]
        pass_filenames: false
        entry: cargo test --manifest-path src-tauri/Cargo.toml

      - id: cargo-test-headless
        name: cargo test headless
        language: system
        types: [rust]
        pass_filenames: false
        entry: cargo test --manifest-path src-tauri/Cargo.toml --no-default-features --features headless
21  Dockerfile
@@ -23,7 +23,7 @@ COPY . .
RUN yarn build

# Build Rust backend
FROM rust:1.86-slim AS rust-builder
FROM rust:1.90-slim AS rust-builder

WORKDIR /app

@@ -48,15 +48,9 @@ COPY src-tauri/crates ./src-tauri/crates
WORKDIR /app/src-tauri
RUN rustup component add rustfmt
RUN cargo build --no-default-features --features headless --release
# Download and install FFmpeg static build
RUN wget https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz \
    && tar xf ffmpeg-release-amd64-static.tar.xz \
    && mv ffmpeg-*-static/ffmpeg ./ \
    && mv ffmpeg-*-static/ffprobe ./ \
    && rm -rf ffmpeg-*-static ffmpeg-release-amd64-static.tar.xz

# Final stage
FROM debian:bookworm-slim AS final
FROM debian:trixie-slim AS final

WORKDIR /app

@@ -65,9 +59,16 @@ RUN apt-get update && apt-get install -y \
    libssl3 \
    ca-certificates \
    fonts-wqy-microhei \
    netbase \
    nscd \
    ffmpeg \
    && update-ca-certificates \
    && rm -rf /var/lib/apt/lists/*


RUN touch /etc/netgroup
RUN mkdir -p /var/run/nscd && chmod 755 /var/run/nscd

# Add /app to PATH
ENV PATH="/app:${PATH}"

@@ -76,11 +77,9 @@ COPY --from=frontend-builder /app/dist ./dist

# Copy built Rust binary
COPY --from=rust-builder /app/src-tauri/target/release/bili-shadowreplay .
COPY --from=rust-builder /app/src-tauri/ffmpeg ./ffmpeg
COPY --from=rust-builder /app/src-tauri/ffprobe ./ffprobe

# Expose port
EXPOSE 3000

# Run the application
CMD ["./bili-shadowreplay"]
CMD ["sh", "-c", "nscd && ./bili-shadowreplay"]
@@ -28,4 +28,5 @@ BiliBili ShadowReplay 是一个缓存直播并进行实时编辑投稿的工具

## 赞助


<!-- markdownlint-disable MD033 -->
<img src="docs/public/images/donate.png" alt="donate" width="300">
2  _typos.toml  Normal file
@@ -0,0 +1,2 @@
[default.extend-identifiers]
pull_datas = "pull_datas"
@@ -54,6 +54,7 @@ export default withMermaid({
          { text: "切片功能", link: "/usage/features/clip" },
          { text: "字幕功能", link: "/usage/features/subtitle" },
          { text: "弹幕功能", link: "/usage/features/danmaku" },
          { text: "Webhook", link: "/usage/features/webhook" },
        ],
      },
      { text: "常见问题", link: "/usage/faq" },
@@ -1,9 +1,11 @@
# Whisper 配置

要使用 AI 字幕识别功能,需要在设置页面配置 Whisper。目前可以选择使用本地运行 Whisper 模型,或是使用在线的 Whisper 服务(通常需要付费获取 API Key)。
要使用 AI 字幕识别功能,需要在设置页面配置 Whisper。目前可以选择使用本地运行 Whisper 模型,或是使用在线的 Whisper 服务(通常需要付
费获取 API Key)。

> [!NOTE]
> 其实有许多更好的中文字幕识别解决方案,但是这类服务通常需要将文件上传到对象存储后异步处理,考虑到实现的复杂度,选择了使用本地运行 Whisper 模型或是使用在线的 Whisper 服务,在请求返回时能够直接获取字幕生成结果。
> 其实有许多更好的中文字幕识别解决方案,但是这类服务通常需要将文件上传到对象存储后异步处理,考虑到实现的复杂度,选择了使用本地运行 Whisper 模型或是使
> 用在线的 Whisper 服务,在请求返回时能够直接获取字幕生成结果。

## 本地运行 Whisper 模型

@@ -16,20 +18,29 @@

可以跟据自己的需求选择不同的模型,要注意带有 `en` 的模型是英文模型,其他模型为多语言模型。

模型文件的大小通常意味着其在运行时资源占用的大小,因此请根据电脑配置选择合适的模型。此外,GPU 版本与 CPU 版本在字幕生成速度上存在**巨大差异**,因此推荐使用 GPU 版本进行本地处理(目前仅支持 Nvidia GPU)。
模型文件的大小通常意味着其在运行时资源占用的大小,因此请根据电脑配置选择合适的模型。此外,GPU 版本与 CPU 版本在字幕生成速度上存在**巨大差异**,因此
推荐使用 GPU 版本进行本地处理(目前仅支持 Nvidia GPU)。

## 使用在线 Whisper 服务



如果需要使用在线的 Whisper 服务进行字幕生成,可以在设置中切换为在线 Whisper,并配置好 API Key。提供 Whisper 服务的平台并非只有 OpenAI 一家,许多云服务平台也提供 Whisper 服务。
如果需要使用在线的 Whisper 服务进行字幕生成,可以在设置中切换为在线 Whisper,并配置好 API Key。提供 Whisper 服务的平台并非只有
OpenAI 一家,许多云服务平台也提供 Whisper 服务。

## 字幕识别质量的调优

目前在设置中支持设置 Whisper 语言和 Whisper 提示词,这些设置对于本地和在线的 Whisper 服务都有效。

通常情况下,`auto` 语言选项能够自动识别语音语言,并生成相应语言的字幕。如果需要生成其他语言的字幕,或是生成的字幕语言不匹配,可以手动配置指定的语言。根据 OpenAI 官方文档中对于 `language` 参数的描述,目前支持的语言包括
通常情况下,`auto` 语言选项能够自动识别语音语言,并生成相应语言的字幕。如果需要生成其他语言的字幕,或是生成的字幕语言不匹配,可以手动配置指定的语言。
根据 OpenAI 官方文档中对于 `language` 参数的描述,目前支持的语言包括

Afrikaans, Arabic, Armenian, Azerbaijani, Belarusian, Bosnian, Bulgarian, Catalan, Chinese, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, Galician, German, Greek, Hebrew, Hindi, Hungarian, Icelandic, Indonesian, Italian, Japanese, Kannada, Kazakh, Korean, Latvian, Lithuanian, Macedonian, Malay, Marathi, Maori, Nepali, Norwegian, Persian, Polish, Portuguese, Romanian, Russian, Serbian, Slovak, Slovenian, Spanish, Swahili, Swedish, Tagalog, Tamil, Thai, Turkish, Ukrainian, Urdu, Vietnamese, and Welsh.
Afrikaans, Arabic, Armenian, Azerbaijani, Belarusian, Bosnian, Bulgarian,
Catalan, Chinese, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish,
French, Galician, German, Greek, Hebrew, Hindi, Hungarian, Icelandic,
Indonesian, Italian, Japanese, Kannada, Kazakh, Korean, Latvian, Lithuanian,
Macedonian, Malay, Marathi, Maori, Nepali, Norwegian, Persian, Polish,
Portuguese, Romanian, Russian, Serbian, Slovak, Slovenian, Spanish, Swahili,
Swedish, Tagalog, Tamil, Thai, Turkish, Ukrainian, Urdu, Vietnamese, and Welsh.

提示词可以优化生成的字幕的风格(也会一定程度上影响质量),要注意,Whisper 无法理解复杂的提示词,你可以在提示词中使用一些简单的描述,让其在选择词汇时使用偏向于提示词所描述的领域相关的词汇,以避免出现毫不相干领域的词汇;或是让它在标点符号的使用上参照提示词的风格。
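For reference, the `language` and `prompt` settings discussed in the doc above map directly onto the request fields of an OpenAI-compatible transcription endpoint. A minimal Rust sketch with `reqwest` (its `multipart` feature enabled); the URL, model name, and form fields follow OpenAI's public API and may differ on other providers:

```rust
use reqwest::multipart;

// Sketch: send audio to an online Whisper service. "whisper-1" and the
// endpoint below are OpenAI's; other platforms use their own values.
async fn transcribe(api_key: &str, audio: Vec<u8>) -> Result<String, reqwest::Error> {
    let form = multipart::Form::new()
        .part("file", multipart::Part::bytes(audio).file_name("clip.wav"))
        .text("model", "whisper-1")
        .text("language", "zh") // omit to auto-detect, like the `auto` option
        .text("prompt", "直播切片,使用简体中文标点。");
    reqwest::Client::new()
        .post("https://api.openai.com/v1/audio/transcriptions")
        .bearer_auth(api_key)
        .multipart(form)
        .send()
        .await?
        .text()
        .await
}
```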
@@ -2,7 +2,7 @@

桌面端目前提供了 Windows、Linux 和 MacOS 三个平台的安装包。

安装包分为两个版本,普通版和 debug 版,普通版适合大部分用户使用,debug 版包含了更多的调试信息,适合开发者使用;由于程序会对账号等敏感信息进行管理,请从信任的来源进行下载;所有版本均可在 [GitHub Releases](https://github.com/Xinrea/bili-shadowreplay/releases) 页面下载安装。
由于程序会对账号等敏感信息进行管理,请从信任的来源进行下载;所有版本均可在 [GitHub Releases](https://github.com/Xinrea/bili-shadowreplay/releases) 页面下载安装。

## Windows
@@ -17,6 +17,8 @@

### 使用 DeepLinking 快速添加直播间

<!-- MD033 -->

<video src="/videos/deeplinking.mp4" loop autoplay muted style="border-radius: 10px;"></video>

在浏览器中观看直播时,替换地址栏中直播间地址中的 `https://` 为 `bsr://` 即可快速唤起 BSR 添加直播间。
245  docs/usage/features/webhook.md  Normal file
@@ -0,0 +1,245 @@
# Webhook

> [!NOTE]
> 你可以使用 <https://webhook.site> 来测试 Webhook 功能。

## 设置 Webhook

打开 BSR 设置页面,在基础设置中设置 Webhook 地址。

## Webhook Events

### 直播间相关

#### 添加直播间

```json
{
  "id": "a96a5e9f-9857-4c13-b889-91da2ace208a",
  "event": "recorder.added",
  "payload": {
    "room_id": 26966466,
    "created_at": "2025-09-07T03:33:14.258796+00:00",
    "platform": "bilibili",
    "auto_start": true,
    "extra": ""
  },
  "timestamp": 1757215994
}
```

#### 移除直播间

```json
{
  "id": "e33623d4-e040-4390-88f5-d351ceeeace7",
  "event": "recorder.removed",
  "payload": {
    "room_id": 27183290,
    "created_at": "2025-08-30T10:54:18.569198+00:00",
    "platform": "bilibili",
    "auto_start": true,
    "extra": ""
  },
  "timestamp": 1757217015
}
```

### 直播相关

> [!NOTE]
> 直播开始和结束,不意味着录制的开始和结束。

#### 直播开始

```json
{
  "id": "f12f3424-f7d8-4b2f-a8b7-55477411482e",
  "event": "live.started",
  "payload": {
    "room_id": 843610,
    "room_info": {
      "room_id": 843610,
      "room_title": "登顶!",
      "room_cover": "https://i0.hdslb.com/bfs/live/new_room_cover/73aea43f4b4624c314d62fea4b424822fb506dfb.jpg"
    },
    "user_info": {
      "user_id": "475210",
      "user_name": "Xinrea",
      "user_avatar": "https://i1.hdslb.com/bfs/face/91beb3bf444b295fe12bae1f3dc6d9fc4fe4c224.jpg"
    },
    "total_length": 0,
    "current_live_id": "",
    "live_status": false,
    "is_recording": false,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757217190
}
```

#### 直播结束

```json
{
  "id": "e8b0756a-02f9-4655-b5ae-a170bf9547bd",
  "event": "live.ended",
  "payload": {
    "room_id": 843610,
    "room_info": {
      "room_id": 843610,
      "room_title": "登顶!",
      "room_cover": "https://i0.hdslb.com/bfs/live/new_room_cover/73aea43f4b4624c314d62fea4b424822fb506dfb.jpg"
    },
    "user_info": {
      "user_id": "475210",
      "user_name": "Xinrea",
      "user_avatar": "https://i1.hdslb.com/bfs/face/91beb3bf444b295fe12bae1f3dc6d9fc4fe4c224.jpg"
    },
    "total_length": 0,
    "current_live_id": "",
    "live_status": true,
    "is_recording": false,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757217365
}
```

### 录播相关

#### 开始录制

```json
{
  "id": "5ec1ea10-2b31-48fd-8deb-f2d7d2ea5985",
  "event": "record.started",
  "payload": {
    "room_id": 26966466,
    "room_info": {
      "room_id": 26966466,
      "room_title": "早安獭獭栞!下播前抽fufu",
      "room_cover": "https://i0.hdslb.com/bfs/live/user_cover/b810c36855168034557e905e5916b1dba1761fa4.jpg"
    },
    "user_info": {
      "user_id": "1609526545",
      "user_name": "栞栞Shiori",
      "user_avatar": "https://i1.hdslb.com/bfs/face/47e8dbabb895de44ec6cace085d4dc1d40307277.jpg"
    },
    "total_length": 0,
    "current_live_id": "1757216045412",
    "live_status": true,
    "is_recording": false,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757216045
}
```

#### 结束录制

```json
{
  "id": "56fd03e5-3965-4c2e-a6a9-bb6932347eb3",
  "event": "record.ended",
  "payload": {
    "room_id": 26966466,
    "room_info": {
      "room_id": 26966466,
      "room_title": "早安獭獭栞!下播前抽fufu",
      "room_cover": "https://i0.hdslb.com/bfs/live/user_cover/b810c36855168034557e905e5916b1dba1761fa4.jpg"
    },
    "user_info": {
      "user_id": "1609526545",
      "user_name": "栞栞Shiori",
      "user_avatar": "https://i1.hdslb.com/bfs/face/47e8dbabb895de44ec6cace085d4dc1d40307277.jpg"
    },
    "total_length": 52.96700000000001,
    "current_live_id": "1757215994597",
    "live_status": true,
    "is_recording": true,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757216040
}
```

#### 删除录播

```json
{
  "id": "c32bc811-ab4b-49fd-84c7-897727905d16",
  "event": "archive.deleted",
  "payload": {
    "platform": "bilibili",
    "live_id": "1756607084705",
    "room_id": 1967212929,
    "title": "灶台O.o",
    "length": 9,
    "size": 1927112,
    "created_at": "2025-08-31T02:24:44.728616+00:00",
    "cover": "bilibili/1967212929/1756607084705/cover.jpg"
  },
  "timestamp": 1757176219
}
```

### 切片相关

#### 切片生成

```json
{
  "id": "f542e0e1-688b-4f1a-8ce1-e5e51530cf5d",
  "event": "clip.generated",
  "payload": {
    "id": 316,
    "room_id": 27183290,
    "cover": "[27183290][1757172501727][一起看凡人修仙传][2025-09-07_00-16-11].jpg",
    "file": "[27183290][1757172501727][一起看凡人修仙传][2025-09-07_00-16-11].mp4",
    "note": "",
    "length": 121,
    "size": 53049119,
    "status": 0,
    "bvid": "",
    "title": "",
    "desc": "",
    "tags": "",
    "area": 0,
    "created_at": "2025-09-07T00:16:11.747461+08:00",
    "platform": "bilibili"
  },
  "timestamp": 1757175371
}
```

#### 切片删除

```json
{
  "id": "5c7ca728-753d-4a7d-a0b4-02c997ad2f92",
  "event": "clip.deleted",
  "payload": {
    "id": 313,
    "room_id": 27183290,
    "cover": "[27183290][1756903953470][不出非洲之心不下播][2025-09-03_21-10-54].jpg",
    "file": "[27183290][1756903953470][不出非洲之心不下播][2025-09-03_21-10-54].mp4",
    "note": "",
    "length": 32,
    "size": 18530098,
    "status": 0,
    "bvid": "",
    "title": "",
    "desc": "",
    "tags": "",
    "area": 0,
    "created_at": "2025-09-03T21:10:54.943682+08:00",
    "platform": "bilibili"
  },
  "timestamp": 1757147617
}
```
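To make the payloads above concrete, here is a minimal receiver sketch. It uses axum purely as an example HTTP server (any framework works); the field names mirror the event JSON shown in this document:

```rust
use axum::{routing::post, Json, Router};
use serde::Deserialize;

// Envelope shared by every event above; `payload` differs per event type.
#[derive(Deserialize)]
struct WebhookEvent {
    id: String,
    event: String,
    payload: serde_json::Value,
    timestamp: i64,
}

async fn handle(Json(ev): Json<WebhookEvent>) {
    match ev.event.as_str() {
        "live.started" => println!("room {} went live", ev.payload["room_id"]),
        "clip.generated" => println!("new clip: {}", ev.payload["file"]),
        other => println!("{} ({}) at {}", other, ev.id, ev.timestamp),
    }
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/webhook", post(handle));
    let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await.unwrap();
    // Point the BSR webhook URL at http://<host>:8080/webhook
    axum::serve(listener, app).await.unwrap();
}
```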
@@ -1,7 +1,7 @@
{
  "name": "bili-shadowreplay",
  "private": true,
  "version": "2.11.0",
  "version": "2.13.8",
  "type": "module",
  "scripts": {
    "dev": "vite",

@@ -30,7 +30,8 @@
    "@tauri-apps/plugin-sql": "~2",
    "lucide-svelte": "^0.479.0",
    "marked": "^16.1.1",
    "qrcode": "^1.5.4"
    "qrcode": "^1.5.4",
    "socket.io-client": "^4.8.1"
  },
  "devDependencies": {
    "@sveltejs/vite-plugin-svelte": "^2.0.0",

@@ -50,7 +51,7 @@
    "tailwindcss": "^3.3.0",
    "ts-node": "^10.9.1",
    "tslib": "^2.4.1",
    "typescript": "^4.6.4",
    "typescript": "^5.0.0",
    "vite": "^4.0.0",
    "vitepress": "^1.6.3",
    "vitepress-plugin-mermaid": "^2.0.17"
BIN  public/imgs/bilibili.png  Normal file (binary file not shown; after: 26 KiB)
BIN  public/imgs/bilibili_avatar.png  Normal file (binary file not shown; after: 38 KiB)
Binary file not shown (before: 32 KiB, after: 246 KiB)
BIN  public/imgs/douyin_avatar.png  Normal file (binary file not shown; after: 153 KiB)
697  src-tauri/Cargo.lock  generated (file diff suppressed because it is too large)
@@ -4,13 +4,20 @@ resolver = "2"

[package]
name = "bili-shadowreplay"
version = "2.11.0"
version = "2.13.8"
description = "BiliBili ShadowReplay"
authors = ["Xinrea"]
license = ""
repository = ""
edition = "2021"

[lints.clippy]
correctness="deny"
suspicious="deny"
complexity="deny"
style="deny"
perf="deny"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]

@@ -25,7 +32,6 @@ async-std = "1.12.0"
async-ffmpeg-sidecar = "0.0.1"
chrono = { version = "0.4.24", features = ["serde"] }
toml = "0.7.3"
custom_error = "1.9.2"
regex = "1.7.3"
tokio = { version = "1.27.0", features = ["process"] }
platform-dirs = "0.3.0"

@@ -49,9 +55,14 @@ tower-http = { version = "0.5", features = ["cors", "fs"] }
futures-core = "0.3"
futures = "0.3"
tokio-util = { version = "0.7", features = ["io"] }
tokio-stream = "0.1"
clap = { version = "4.5.37", features = ["derive"] }
url = "2.5.4"
srtparse = "0.2.0"
thiserror = "2"
deno_core = "0.355"
sanitize-filename = "0.6.0"
socketioxide = "0.17.2"

[features]
# this feature is used for production builds or when `devPath` points to the filesystem
@@ -1,4 +1,4 @@
fn main() {
    #[cfg(feature = "gui")]
    tauri_build::build()
    tauri_build::build();
}
@@ -2,11 +2,7 @@
  "identifier": "migrated",
  "description": "permissions that were migrated from v1",
  "local": true,
  "windows": [
    "main",
    "Live*",
    "Clip*"
  ],
  "windows": ["main", "Live*", "Clip*"],
  "permissions": [
    "core:default",
    "fs:allow-read-file",

@@ -20,9 +16,7 @@
    "fs:allow-exists",
    {
      "identifier": "fs:scope",
      "allow": [
        "**"
      ]
      "allow": ["**"]
    },
    "core:window:default",
    "core:window:allow-start-dragging",

@@ -55,6 +49,12 @@
    },
    {
      "url": "https://*.douyinpic.com/"
    },
    {
      "url": "http://tauri.localhost/*"
    },
    {
      "url": "http://localhost:8054/*"
    }
  ]
},

@@ -74,4 +74,4 @@
    "dialog:default",
    "deep-link:default"
  ]
}
}
@@ -7,38 +7,42 @@ edition = "2021"
name = "danmu_stream"
path = "src/lib.rs"

[[example]]
name = "bilibili"
path = "examples/bilibili.rs"

[[example]]
name = "douyin"
path = "examples/douyin.rs"

[dependencies]
tokio = { version = "1.0", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
tokio = { version = "1", features = ["full"] }
tokio-tungstenite = { version = "0.27", features = ["native-tls"] }
futures-util = "0.3"
prost = "0.12"
prost = "0.14"
chrono = "0.4"
log = "0.4"
env_logger = "0.10"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
reqwest = { version = "0.11", features = ["json"] }
env_logger = "0.11"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
url = "2.4"
md5 = "0.7"
md5 = "0.8"
regex = "1.9"
deno_core = "0.242.0"
pct-str = "2.0.0"
custom_error = "1.9.2"
deno_core = "0.355"
pct-str = "2.0"
thiserror = "2.0"
flate2 = "1.0"
scroll = "0.13.0"
scroll_derive = "0.13.0"
brotli = "8.0.1"
scroll = "0.13"
scroll_derive = "0.13"
brotli = "8.0"
http = "1.0"
rand = "0.9.1"
urlencoding = "2.1.3"
rand = "0.9"
urlencoding = "2.1"
gzip = "0.1.2"
hex = "0.4.3"
async-trait = "0.1.88"
uuid = "1.17.0"
async-trait = "0.1"
uuid = "1"

[build-dependencies]
tonic-build = "0.10"
tonic-build = "0.14"
@@ -1,16 +1,17 @@
use std::sync::Arc;

use tokio::sync::{mpsc, RwLock};

use crate::{
    provider::{new, DanmuProvider, ProviderType},
    DanmuMessageType, DanmuStreamError,
};
use tokio::sync::{mpsc, RwLock};

#[derive(Clone)]
pub struct DanmuStream {
    pub provider_type: ProviderType,
    pub identifier: String,
    pub room_id: u64,
    pub room_id: i64,
    pub provider: Arc<RwLock<Box<dyn DanmuProvider>>>,
    tx: mpsc::UnboundedSender<DanmuMessageType>,
    rx: Arc<RwLock<mpsc::UnboundedReceiver<DanmuMessageType>>>,

@@ -20,7 +21,7 @@ impl DanmuStream {
    pub async fn new(
        provider_type: ProviderType,
        identifier: &str,
        room_id: u64,
        room_id: i64,
    ) -> Result<Self, DanmuStreamError> {
        let (tx, rx) = mpsc::unbounded_channel();
        let provider = new(provider_type, identifier, room_id).await?;
@@ -1,19 +1,8 @@
use std::time::Duration;

use crate::DanmuStreamError;
use reqwest::header::HeaderMap;

impl From<reqwest::Error> for DanmuStreamError {
    fn from(value: reqwest::Error) -> Self {
        Self::HttpError { err: value }
    }
}

impl From<url::ParseError> for DanmuStreamError {
    fn from(value: url::ParseError) -> Self {
        Self::ParseError { err: value }
    }
}
use crate::DanmuStreamError;

pub struct ApiClient {
    client: reqwest::Client,
@@ -2,16 +2,24 @@ pub mod danmu_stream;
mod http_client;
pub mod provider;

use custom_error::custom_error;
use thiserror::Error;

custom_error! {pub DanmuStreamError
    HttpError {err: reqwest::Error} = "HttpError {err}",
    ParseError {err: url::ParseError} = "ParseError {err}",
    WebsocketError {err: String } = "WebsocketError {err}",
    PackError {err: String} = "PackError {err}",
    UnsupportProto {proto: u16} = "UnsupportProto {proto}",
    MessageParseError {err: String} = "MessageParseError {err}",
    InvalidIdentifier {err: String} = "InvalidIdentifier {err}"
#[derive(Error, Debug)]
pub enum DanmuStreamError {
    #[error("HttpError {0:?}")]
    HttpError(#[from] reqwest::Error),
    #[error("ParseError {0:?}")]
    ParseError(#[from] url::ParseError),
    #[error("WebsocketError {err}")]
    WebsocketError { err: String },
    #[error("PackError {err}")]
    PackError { err: String },
    #[error("UnsupportProto {proto}")]
    UnsupportProto { proto: u16 },
    #[error("MessageParseError {err}")]
    MessageParseError { err: String },
    #[error("InvalidIdentifier {err}")]
    InvalidIdentifier { err: String },
}

#[derive(Debug)]
@@ -21,7 +29,7 @@ pub enum DanmuMessageType {

#[derive(Debug, Clone)]
pub struct DanmuMessage {
    pub room_id: u64,
    pub room_id: i64,
    pub user_id: u64,
    pub user_name: String,
    pub message: String,
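The hunk above migrates `custom_error!` to `thiserror`, and the `#[from]` attributes replace the hand-written `From` impls deleted from http_client.rs earlier in this diff. An illustrative sketch of the effect, assuming the `DanmuStreamError` enum above is in scope:

```rust
// With `#[from]` on HttpError and ParseError, both `?` operators below
// convert into DanmuStreamError without any manual From impl.
async fn fetch(raw: &str) -> Result<String, DanmuStreamError> {
    let url = url::Url::parse(raw)?; // url::ParseError -> DanmuStreamError::ParseError
    let body = reqwest::get(url).await?.text().await?; // reqwest::Error -> HttpError
    Ok(body)
}
```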
@@ -36,15 +36,15 @@ type WsWriteType = futures_util::stream::SplitSink<
|
||||
|
||||
pub struct BiliDanmu {
|
||||
client: ApiClient,
|
||||
room_id: u64,
|
||||
user_id: u64,
|
||||
room_id: i64,
|
||||
user_id: i64,
|
||||
stop: Arc<RwLock<bool>>,
|
||||
write: Arc<RwLock<Option<WsWriteType>>>,
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl DanmuProvider for BiliDanmu {
|
||||
async fn new(cookie: &str, room_id: u64) -> Result<Self, DanmuStreamError> {
|
||||
async fn new(cookie: &str, room_id: i64) -> Result<Self, DanmuStreamError> {
|
||||
// find DedeUserID=<user_id> in cookie str
|
||||
let user_id = BiliDanmu::parse_user_id(cookie)?;
|
||||
// add buvid3 to cookie
|
||||
@@ -65,7 +65,6 @@ impl DanmuProvider for BiliDanmu {
|
||||
tx: mpsc::UnboundedSender<DanmuMessageType>,
|
||||
) -> Result<(), DanmuStreamError> {
|
||||
let mut retry_count = 0;
|
||||
const MAX_RETRIES: u32 = 5;
|
||||
const RETRY_DELAY: Duration = Duration::from_secs(5);
|
||||
info!(
|
||||
"Bilibili WebSocket connection started, room_id: {}",
|
||||
@@ -74,33 +73,37 @@ impl DanmuProvider for BiliDanmu {
|
||||
|
||||
loop {
|
||||
if *self.stop.read().await {
|
||||
info!(
|
||||
"Bilibili WebSocket connection stopped, room_id: {}",
|
||||
self.room_id
|
||||
);
|
||||
break;
|
||||
}
|
||||
|
||||
match self.connect_and_handle(tx.clone()).await {
|
||||
Ok(_) => {
|
||||
info!("Bilibili WebSocket connection closed normally");
|
||||
break;
|
||||
info!(
|
||||
"Bilibili WebSocket connection closed normally, room_id: {}",
|
||||
self.room_id
|
||||
);
|
||||
retry_count = 0;
|
||||
}
|
||||
Err(e) => {
|
||||
error!("Bilibili WebSocket connection error: {}", e);
|
||||
retry_count += 1;
|
||||
|
||||
if retry_count >= MAX_RETRIES {
|
||||
return Err(DanmuStreamError::WebsocketError {
|
||||
err: format!("Failed to connect after {} retries", MAX_RETRIES),
|
||||
});
|
||||
}
|
||||
|
||||
info!(
|
||||
"Retrying connection in {} seconds... (Attempt {}/{})",
|
||||
RETRY_DELAY.as_secs(),
|
||||
retry_count,
|
||||
MAX_RETRIES
|
||||
error!(
|
||||
"Bilibili WebSocket connection error, room_id: {}, error: {}",
|
||||
self.room_id, e
|
||||
);
|
||||
tokio::time::sleep(RETRY_DELAY).await;
|
||||
retry_count += 1;
|
||||
}
|
||||
}
|
||||
|
||||
info!(
|
||||
"Retrying connection in {} seconds... (Attempt {}), room_id: {}",
|
||||
RETRY_DELAY.as_secs(),
|
||||
retry_count,
|
||||
self.room_id
|
||||
);
|
||||
tokio::time::sleep(RETRY_DELAY).await;
|
||||
}
|
||||
|
||||
Ok(())
@@ -238,7 +241,7 @@ impl BiliDanmu {
     async fn get_danmu_info(
         &self,
         wbi_key: &str,
-        room_id: u64,
+        room_id: i64,
     ) -> Result<DanmuInfo, DanmuStreamError> {
         let params = self
             .get_sign(
@@ -265,7 +268,7 @@ impl BiliDanmu {
         Ok(resp)
     }

-    async fn get_real_room(&self, wbi_key: &str, room_id: u64) -> Result<u64, DanmuStreamError> {
+    async fn get_real_room(&self, wbi_key: &str, room_id: i64) -> Result<i64, DanmuStreamError> {
         let params = self
             .get_sign(
                 wbi_key,
@@ -293,14 +296,14 @@ impl BiliDanmu {
         Ok(resp)
     }

-    fn parse_user_id(cookie: &str) -> Result<u64, DanmuStreamError> {
+    fn parse_user_id(cookie: &str) -> Result<i64, DanmuStreamError> {
         let mut user_id = None;

         // find DedeUserID=<user_id> in cookie str
         let re = Regex::new(r"DedeUserID=(\d+)").unwrap();
         if let Some(captures) = re.captures(cookie) {
             if let Some(user) = captures.get(1) {
-                user_id = Some(user.as_str().parse::<u64>().unwrap());
+                user_id = Some(user.as_str().parse::<i64>().unwrap());
             }
         }

@@ -404,8 +407,8 @@ impl BiliDanmu {

 #[derive(Serialize)]
 struct WsSend {
-    uid: u64,
-    roomid: u64,
+    uid: i64,
+    roomid: i64,
     key: String,
     protover: u32,
     platform: String,
@@ -436,5 +439,5 @@ pub struct RoomInit {

 #[derive(Debug, Deserialize, Clone)]
 pub struct RoomInitData {
-    room_id: u64,
+    room_id: i64,
 }
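`parse_user_id` now returns `i64` but still unwraps both the regex compilation and the numeric parse. For illustration, an unwrap-free, `Option`-based variant of the same lookup (a sketch, not what the patch does):

```rust
use regex::Regex;

// Pull the numeric DedeUserID out of a Bilibili cookie string; i64 to match
// the database schema used elsewhere in this changeset.
fn parse_user_id(cookie: &str) -> Option<i64> {
    let re = Regex::new(r"DedeUserID=(\d+)").ok()?;
    re.captures(cookie)?.get(1)?.as_str().parse().ok()
}

fn main() {
    let cookie = "buvid3=x; DedeUserID=12345; bili_jct=y";
    assert_eq!(parse_user_id(cookie), Some(12345));
}
```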
@@ -1,6 +1,8 @@
 use serde::Deserialize;

-use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
+use super::stream::WsStreamCtx;
+
+use crate::DanmuStreamError;

 #[derive(Debug, Deserialize)]
 #[allow(dead_code)]

@@ -1,4 +1,6 @@
-use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
+use super::stream::WsStreamCtx;
+
+use crate::DanmuStreamError;

 #[derive(Debug)]
 #[allow(dead_code)]

@@ -24,7 +24,7 @@ struct PackHotCount {

 type BilibiliPackCtx<'a> = (BilibiliPackHeader, &'a [u8]);

-fn pack(buffer: &[u8]) -> Result<BilibiliPackCtx, DanmuStreamError> {
+fn pack(buffer: &[u8]) -> Result<BilibiliPackCtx<'_>, DanmuStreamError> {
     let data = buffer
         .pread_with(0, scroll::BE)
         .map_err(|e: scroll::Error| DanmuStreamError::PackError { err: e.to_string() })?;
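The only change to `pack` is the `<'_>` in the return type: the alias borrows from `buffer`, and spelling out the anonymous lifetime satisfies the elided-lifetimes-in-paths lint without changing semantics. A self-contained illustration:

```rust
type Ctx<'a> = (u8, &'a [u8]);

// `Ctx<'_>` makes the borrow visible in the signature; bare `Ctx` compiles
// too, but hides that the return value borrows from `buf`.
fn split_first(buf: &[u8]) -> Ctx<'_> {
    (buf[0], &buf[1..])
}

fn main() {
    let (head, tail) = split_first(&[1, 2, 3]);
    assert_eq!((head, tail), (1, &[2, 3][..]));
}
```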
@@ -1,6 +1,8 @@
 use serde::Deserialize;

-use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
+use super::stream::WsStreamCtx;
+
+use crate::DanmuStreamError;

 #[derive(Debug, Deserialize)]
 #[allow(dead_code)]

@@ -1,10 +1,9 @@
 use serde::Deserialize;
 use serde_json::Value;

-use crate::{
-    provider::{bilibili::dannmu_msg::BiliDanmuMessage, DanmuMessageType},
-    DanmuMessage, DanmuStreamError,
-};
+use super::dannmu_msg::BiliDanmuMessage;
+
+use crate::{provider::DanmuMessageType, DanmuMessage, DanmuStreamError};

 #[derive(Debug, Deserialize, Clone)]
 pub struct WsStreamCtx {

@@ -1,6 +1,8 @@
 use serde::Deserialize;

-use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
+use super::stream::WsStreamCtx;
+
+use crate::DanmuStreamError;

 #[derive(Debug, Deserialize)]
 #[allow(dead_code)]

@@ -1,4 +1,9 @@
-use crate::{provider::DanmuProvider, DanmuMessage, DanmuMessageType, DanmuStreamError};
+mod messages;
+
+use std::io::Read;
+use std::sync::Arc;
+use std::time::{Duration, SystemTime};

 use async_trait::async_trait;
 use deno_core::v8;
 use deno_core::JsRuntime;
@@ -7,11 +12,9 @@ use flate2::read::GzDecoder;
 use futures_util::{SinkExt, StreamExt, TryStreamExt};
 use log::debug;
 use log::{error, info};
+use messages::*;
 use prost::bytes::Bytes;
 use prost::Message;
-use std::io::Read;
-use std::sync::Arc;
-use std::time::{Duration, SystemTime};
 use tokio::net::TcpStream;
 use tokio::sync::mpsc;
 use tokio::sync::RwLock;
@@ -19,8 +22,7 @@ use tokio_tungstenite::{
     connect_async, tungstenite::Message as WsMessage, MaybeTlsStream, WebSocketStream,
 };

-mod messages;
-use messages::*;
+use crate::{provider::DanmuProvider, DanmuMessage, DanmuMessageType, DanmuStreamError};

 const USER_AGENT: &str = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36";

@@ -31,7 +33,7 @@ type WsWriteType =
     futures_util::stream::SplitSink<WebSocketStream<MaybeTlsStream<TcpStream>>, WsMessage>;

 pub struct DouyinDanmu {
-    room_id: u64,
+    room_id: i64,
     cookie: String,
     stop: Arc<RwLock<bool>>,
     write: Arc<RwLock<Option<WsWriteType>>>,
@@ -109,7 +111,7 @@ impl DouyinDanmu {
         runtime
             .execute_script(
                 "<crypto-js.min.js>",
-                deno_core::FastString::Static(crypto_js),
+                deno_core::FastString::from_static(crypto_js),
             )
             .map_err(|e| DanmuStreamError::WebsocketError {
                 err: format!("Failed to execute crypto-js: {}", e),
@@ -118,7 +120,7 @@ impl DouyinDanmu {
         // Load and execute the sign.js file
         let js_code = include_str!("douyin/webmssdk.js");
         runtime
-            .execute_script("<sign.js>", deno_core::FastString::Static(js_code))
+            .execute_script("<sign.js>", deno_core::FastString::from_static(js_code))
             .map_err(|e| DanmuStreamError::WebsocketError {
                 err: format!("Failed to execute JavaScript: {}", e),
            })?;
@@ -126,10 +128,7 @@ impl DouyinDanmu {
         // Call the get_wss_url function
         let sign_call = format!("get_wss_url(\"{}\")", self.room_id);
         let result = runtime
-            .execute_script(
-                "<sign_call>",
-                deno_core::FastString::Owned(sign_call.into_boxed_str()),
-            )
+            .execute_script("<sign_call>", deno_core::FastString::from(sign_call))
             .map_err(|e| DanmuStreamError::WebsocketError {
                 err: format!("Failed to execute JavaScript: {}", e),
             })?;
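All three `execute_script` call sites move off the old `FastString` enum variants (`Static`, `Owned`) onto constructors, which is how newer deno_core releases expose the type. A sketch under that assumption:

```rust
// Sketch (assumes a recent deno_core where FastString is built via
// constructors rather than matched enum variants):
fn make_sources(dynamic: String) -> (deno_core::FastString, deno_core::FastString) {
    // Replaces FastString::Static(...): zero-copy for &'static str.
    let static_src = deno_core::FastString::from_static("1 + 1");
    // Replaces FastString::Owned(boxed_str): takes ownership of a String.
    let owned_src = deno_core::FastString::from(dynamic);
    (static_src, owned_src)
}
```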
@@ -214,7 +213,7 @@ impl DouyinDanmu {
                     if let Ok(Some(ack)) = handle_binary_message(&data, &tx, room_id).await {
                         if let Some(write) = write.write().await.as_mut() {
                             if let Err(e) =
-                                write.send(WsMessage::Binary(ack.encode_to_vec())).await
+                                write.send(WsMessage::binary(ack.encode_to_vec())).await
                             {
                                 error!("Failed to send ack: {}", e);
                             }
@@ -257,7 +256,7 @@ impl DouyinDanmu {

 async fn send_heartbeat(tx: &mpsc::Sender<WsMessage>) -> Result<(), DanmuStreamError> {
     // heartbeat message: 3A 02 68 62
-    tx.send(WsMessage::Binary(vec![0x3A, 0x02, 0x68, 0x62]))
+    tx.send(WsMessage::binary(vec![0x3A, 0x02, 0x68, 0x62]))
         .await
         .map_err(|e| DanmuStreamError::WebsocketError {
             err: format!("Failed to send heartbeat message: {}", e),
@@ -269,7 +268,7 @@ impl DouyinDanmu {
 async fn handle_binary_message(
     data: &[u8],
     tx: &mpsc::UnboundedSender<DanmuMessageType>,
-    room_id: u64,
+    room_id: i64,
 ) -> Result<Option<PushFrame>, DanmuStreamError> {
     // First decode the PushFrame
     let push_frame = PushFrame::decode(Bytes::from(data.to_vec())).map_err(|e| {
@@ -395,7 +394,7 @@ async fn handle_binary_message(

 #[async_trait]
 impl DanmuProvider for DouyinDanmu {
-    async fn new(identifier: &str, room_id: u64) -> Result<Self, DanmuStreamError> {
+    async fn new(identifier: &str, room_id: i64) -> Result<Self, DanmuStreamError> {
         Ok(Self {
             room_id,
             cookie: identifier.to_string(),
@@ -409,7 +408,6 @@ impl DanmuProvider for DouyinDanmu {
         tx: mpsc::UnboundedSender<DanmuMessageType>,
     ) -> Result<(), DanmuStreamError> {
         let mut retry_count = 0;
-        const MAX_RETRIES: u32 = 5;
         const RETRY_DELAY: Duration = Duration::from_secs(5);
         info!(
             "Douyin WebSocket connection started, room_id: {}",
@@ -423,28 +421,25 @@ impl DanmuProvider for DouyinDanmu {

             match self.connect_and_handle(tx.clone()).await {
                 Ok(_) => {
-                    info!("Douyin WebSocket connection closed normally");
-                    break;
+                    info!(
+                        "Douyin WebSocket connection closed normally, room_id: {}",
+                        self.room_id
+                    );
+                    retry_count = 0;
                 }
                 Err(e) => {
                     error!("Douyin WebSocket connection error: {}", e);
                     retry_count += 1;
-
-                    if retry_count >= MAX_RETRIES {
-                        return Err(DanmuStreamError::WebsocketError {
-                            err: format!("Failed to connect after {} retries", MAX_RETRIES),
-                        });
-                    }
-
-                    info!(
-                        "Retrying connection in {} seconds... (Attempt {}/{})",
-                        RETRY_DELAY.as_secs(),
-                        retry_count,
-                        MAX_RETRIES
-                    );
-                    tokio::time::sleep(RETRY_DELAY).await;
                 }
             }

+            info!(
+                "Retrying connection in {} seconds... (Attempt {}), room_id: {}",
+                RETRY_DELAY.as_secs(),
+                retry_count,
+                self.room_id
+            );
+            tokio::time::sleep(RETRY_DELAY).await;
         }

         Ok(())

@@ -1,6 +1,7 @@
-use prost::Message;
 use std::collections::HashMap;

+use prost::Message;
+
 // message Response {
 //     repeated Message messagesList = 1;
 //     string cursor = 2;

@@ -4,10 +4,10 @@ mod douyin;
 use async_trait::async_trait;
 use tokio::sync::mpsc;

-use crate::{
-    provider::bilibili::BiliDanmu, provider::douyin::DouyinDanmu, DanmuMessageType,
-    DanmuStreamError,
-};
+use self::bilibili::BiliDanmu;
+use self::douyin::DouyinDanmu;
+
+use crate::{DanmuMessageType, DanmuStreamError};

 #[derive(Debug, Clone, Copy, PartialEq, Eq)]
 pub enum ProviderType {
@@ -17,7 +17,7 @@ pub enum ProviderType {

 #[async_trait]
 pub trait DanmuProvider: Send + Sync {
-    async fn new(identifier: &str, room_id: u64) -> Result<Self, DanmuStreamError>
+    async fn new(identifier: &str, room_id: i64) -> Result<Self, DanmuStreamError>
     where
         Self: Sized;

@@ -57,7 +57,7 @@ pub trait DanmuProvider: Send + Sync {
 pub async fn new(
     provider_type: ProviderType,
     identifier: &str,
-    room_id: u64,
+    room_id: i64,
 ) -> Result<Box<dyn DanmuProvider>, DanmuStreamError> {
     match provider_type {
         ProviderType::BiliBili => {
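Every `room_id` in the provider API flips from `u64` to `i64` here, matching the database layer: SQLite's INTEGER is a signed 64-bit value and sqlx binds `i64`, so carrying `i64` end-to-end removes the `as i64` cast at every bind site. Where an external source still hands back a `u64`, the checked conversion looks like this (a sketch; the database module in this changeset does the same via `DatabaseError::NumberExceedI64Range`):

```rust
// Checked conversion instead of a silent `as` cast.
fn to_db_id(raw: u64) -> Result<i64, String> {
    i64::try_from(raw).map_err(|_| format!("id {raw} exceeds i64 range"))
}

fn main() {
    assert_eq!(to_db_id(22747736), Ok(22747736));
    assert!(to_db_id(u64::MAX).is_err());
}
```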
@@ -1 +1 @@
-{"migrated":{"identifier":"migrated","description":"permissions that were migrated from v1","local":true,"windows":["main","Live*","Clip*"],"permissions":["core:default","fs:allow-read-file","fs:allow-write-file","fs:allow-read-dir","fs:allow-copy-file","fs:allow-mkdir","fs:allow-remove","fs:allow-remove","fs:allow-rename","fs:allow-exists",{"identifier":"fs:scope","allow":["**"]},"core:window:default","core:window:allow-start-dragging","core:window:allow-close","core:window:allow-minimize","core:window:allow-maximize","core:window:allow-unmaximize","core:window:allow-set-title","sql:allow-execute","shell:allow-open","dialog:allow-open","dialog:allow-save","dialog:allow-message","dialog:allow-ask","dialog:allow-confirm",{"identifier":"http:default","allow":[{"url":"https://*.hdslb.com/"},{"url":"https://afdian.com/"},{"url":"https://*.afdiancdn.com/"},{"url":"https://*.douyin.com/"},{"url":"https://*.douyinpic.com/"}]},"dialog:default","shell:default","fs:default","http:default","sql:default","os:default","notification:default","dialog:default","fs:default","http:default","shell:default","sql:default","os:default","dialog:default","deep-link:default"]}}
+{"migrated":{"identifier":"migrated","description":"permissions that were migrated from v1","local":true,"windows":["main","Live*","Clip*"],"permissions":["core:default","fs:allow-read-file","fs:allow-write-file","fs:allow-read-dir","fs:allow-copy-file","fs:allow-mkdir","fs:allow-remove","fs:allow-remove","fs:allow-rename","fs:allow-exists",{"identifier":"fs:scope","allow":["**"]},"core:window:default","core:window:allow-start-dragging","core:window:allow-close","core:window:allow-minimize","core:window:allow-maximize","core:window:allow-unmaximize","core:window:allow-set-title","sql:allow-execute","shell:allow-open","dialog:allow-open","dialog:allow-save","dialog:allow-message","dialog:allow-ask","dialog:allow-confirm",{"identifier":"http:default","allow":[{"url":"https://*.hdslb.com/"},{"url":"https://afdian.com/"},{"url":"https://*.afdiancdn.com/"},{"url":"https://*.douyin.com/"},{"url":"https://*.douyinpic.com/"},{"url":"http://tauri.localhost/*"},{"url":"http://localhost:8054/*"}]},"dialog:default","shell:default","fs:default","http:default","sql:default","os:default","notification:default","dialog:default","fs:default","http:default","shell:default","sql:default","os:default","dialog:default","deep-link:default"]}}

@@ -1,56 +0,0 @@
-use std::path::PathBuf;
-use std::sync::Arc;
-
-use chrono::Utc;
-
-use crate::database::Database;
-use crate::recorder::PlatformType;
-
-pub async fn try_rebuild_archives(
-    db: &Arc<Database>,
-    cache_path: PathBuf,
-) -> Result<(), Box<dyn std::error::Error>> {
-    let rooms = db.get_recorders().await?;
-    for room in rooms {
-        let room_id = room.room_id;
-        let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
-        let mut files = tokio::fs::read_dir(room_cache_path).await?;
-        while let Some(file) = files.next_entry().await? {
-            if file.file_type().await?.is_dir() {
-                // use folder name as live_id
-                let live_id = file.file_name();
-                let live_id = live_id.to_str().unwrap();
-                // check if live_id is in db
-                let record = db.get_record(room_id, live_id).await;
-                if record.is_ok() {
-                    continue;
-                }
-
-                // get created_at from folder metadata
-                let metadata = file.metadata().await?;
-                let created_at = metadata.created();
-                if created_at.is_err() {
-                    continue;
-                }
-                let created_at = created_at.unwrap();
-                let created_at = chrono::DateTime::<Utc>::from(created_at)
-                    .format("%Y-%m-%dT%H:%M:%S.%fZ")
-                    .to_string();
-                // create a record for this live_id
-                let record = db
-                    .add_record(
-                        PlatformType::from_str(room.platform.as_str()).unwrap(),
-                        live_id,
-                        room_id,
-                        &format!("UnknownLive {}", live_id),
-                        None,
-                        Some(&created_at),
-                    )
-                    .await?;
-
-                log::info!("rebuild archive {:?}", record);
-            }
-        }
-    }
-    Ok(())
-}
@@ -35,8 +35,8 @@ pub struct Config {
     pub config_path: String,
     #[serde(default = "default_whisper_language")]
     pub whisper_language: String,
-    #[serde(default = "default_user_agent")]
-    pub user_agent: String,
+    #[serde(default = "default_webhook_url")]
+    pub webhook_url: String,
 }

 #[derive(Deserialize, Serialize, Clone)]
@@ -66,7 +66,7 @@ fn default_openai_api_endpoint() -> String {
 }

 fn default_openai_api_key() -> String {
-    "".to_string()
+    String::new()
 }

 fn default_clip_name_format() -> String {
@@ -88,8 +88,8 @@ fn default_whisper_language() -> String {
     "auto".to_string()
 }

-fn default_user_agent() -> String {
-    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36".to_string()
+fn default_webhook_url() -> String {
+    String::new()
 }

 impl Config {
@@ -129,7 +129,7 @@ impl Config {
             status_check_interval: default_status_check_interval(),
             config_path: config_path.to_str().unwrap().into(),
             whisper_language: default_whisper_language(),
-            user_agent: default_user_agent(),
+            webhook_url: default_webhook_url(),
         };

         config.save();
@@ -162,12 +162,6 @@ impl Config {
         self.save();
     }

-    #[allow(dead_code)]
-    pub fn set_user_agent(&mut self, user_agent: &str) {
-        self.user_agent = user_agent.to_string();
-        self.save();
-    }
-
     pub fn generate_clip_name(&self, params: &ClipRangeParams) -> PathBuf {
         let platform = PlatformType::from_str(&params.platform).unwrap();
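`default_user_agent` gives way to `default_webhook_url`, and the `#[serde(default = "...")]` attributes are what keep old config files loading: a field missing from the file is filled by the named function instead of failing deserialization. A minimal illustration (JSON used here only as a convenient serde format):

```rust
use serde::Deserialize;

#[derive(Deserialize)]
struct AppConfig {
    // Absent in old config files; serde falls back to the named function.
    #[serde(default = "default_webhook_url")]
    webhook_url: String,
}

fn default_webhook_url() -> String {
    String::new()
}

fn main() {
    let cfg: AppConfig = serde_json::from_str("{}").unwrap();
    assert_eq!(cfg.webhook_url, "");
}
```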
src-tauri/src/constants.rs (new file, 4 lines)
@@ -0,0 +1,4 @@
+pub const PREFIX_SUBTITLE: &str = "[subtitle]";
+pub const PREFIX_IMPORTED: &str = "[imported]";
+pub const PREFIX_DANMAKU: &str = "[danmaku]";
+pub const PREFIX_CLIP: &str = "[clip]";

@@ -24,32 +24,32 @@ struct DanmakuPosition {
     time: f64,
 }

-const PLAY_RES_X: f64 = 1920.0;
-const PLAY_RES_Y: f64 = 1080.0;
+const PLAY_RES_X: f64 = 1280.0;
+const PLAY_RES_Y: f64 = 720.0;
 const BOTTOM_RESERVED: f64 = 50.0;
 const R2L_TIME: f64 = 8.0;
 const MAX_DELAY: f64 = 6.0;

 pub fn danmu_to_ass(danmus: Vec<DanmuEntry>) -> String {
     // ASS header
-    let header = r#"[Script Info]
+    let header = r"[Script Info]
 Title: Bilibili Danmaku
 ScriptType: v4.00+
 Collisions: Normal
-PlayResX: 1920
-PlayResY: 1080
+PlayResX: 1280
+PlayResY: 720
 Timer: 10.0000

 [V4+ Styles]
 Format: Name, Fontname, Fontsize, PrimaryColour, SecondaryColour, OutlineColour, BackColour, Bold, Italic, Underline, StrikeOut, ScaleX, ScaleY, Spacing, Angle, BorderStyle, Outline, Shadow, Alignment, MarginL, MarginR, MarginV, Encoding
-Style: Default,Microsoft YaHei,48,&H00FFFFFF,&H000000FF,&H00000000,&H00000000,0,0,0,0,100,100,0,0,1,2,0,2,20,20,2,0
+Style: Default,微软雅黑,36,&H7fFFFFFF,&H7fFFFFFF,&H7f000000,&H7f000000,0,0,0,0,100,100,0,0,1,1,0,2,20,20,2,0

 [Events]
 Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
-"#;
+";

     let mut normal = normal_danmaku();
-    let font_size = 48.0; // Default font size
+    let font_size = 36.0; // Default font size

     // Convert danmus to ASS events
     let events = danmus
@@ -76,7 +76,7 @@ Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
             "Dialogue: 0,{},{},Default,,0,0,0,,{{\\move({},{},{},{})}}{}",
             start_time,
             end_time,
-            PLAY_RES_X,
+            PLAY_RES_X + text_width / 2.0,
             pos.top + font_size, // Start position
             -text_width,
             pos.top + font_size, // End position
@@ -87,22 +87,22 @@ Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
         .join("\n");

     // Combine header and events
-    format!("{}\n{}", header, events)
+    format!("{header}\n{events}")
 }

 fn format_time(seconds: f64) -> String {
     let hours = (seconds / 3600.0) as i32;
     let minutes = ((seconds % 3600.0) / 60.0) as i32;
     let seconds = seconds % 60.0;
-    format!("{}:{:02}:{:05.2}", hours, minutes, seconds)
+    format!("{hours}:{minutes:02}:{seconds:05.2}")
 }
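`format_time` switches to inline format arguments without changing output. The ASS timestamp it produces is `H:MM:SS.CC`; a worked example:

```rust
fn format_time(seconds: f64) -> String {
    let hours = (seconds / 3600.0) as i32;
    let minutes = ((seconds % 3600.0) / 60.0) as i32;
    let seconds = seconds % 60.0;
    format!("{hours}:{minutes:02}:{seconds:05.2}")
}

fn main() {
    // 3725.5 s = 1 h 2 min 5.5 s; width 5 with 2 decimals pads to "05.50".
    assert_eq!(format_time(3725.5), "1:02:05.50");
}
```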
 fn escape_text(text: &str) -> String {
-    text.replace("\\", "\\\\")
-        .replace("{", "{")
-        .replace("}", "}")
-        .replace("\r", "")
-        .replace("\n", "\\N")
+    text.replace('\\', "\\\\")
+        .replace('{', "{")
+        .replace('}', "}")
+        .replace('\r', "")
+        .replace('\n', "\\N")
 }

 fn normal_danmaku() -> impl FnMut(f64, f64, f64, bool) -> Option<DanmakuPosition> {
@@ -144,8 +144,8 @@ fn normal_danmaku() -> impl FnMut(f64, f64, f64, bool) -> Option<DanmakuPosition

         let p = space.m;
         let m = p + hv;
-        let mut tas = t0s;
-        let mut tal = t0l;
+        let mut time_actual_start = t0s;
+        let mut time_actual_leave = t0l;

         for other in &used {
             if other.p >= m || other.m <= p {
@@ -154,13 +154,13 @@ fn normal_danmaku() -> impl FnMut(f64, f64, f64, bool) -> Option<DanmakuPosition
             if other.b && b {
                 continue;
             }
-            tas = tas.max(other.tf);
-            tal = tal.max(other.td);
+            time_actual_start = time_actual_start.max(other.tf);
+            time_actual_leave = time_actual_leave.max(other.td);
         }

         suggestions.push(PositionSuggestion {
             p,
-            r: (tas - t0s).max(tal - t0l),
+            r: (time_actual_start - t0s).max(time_actual_leave - t0l),
         });
     }
@@ -6,10 +6,10 @@ use chrono::Utc;
 use rand::seq::SliceRandom;
 use rand::Rng;

-#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
+#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
 pub struct AccountRow {
     pub platform: String,
-    pub uid: u64, // Keep for Bilibili compatibility
+    pub uid: i64, // Keep for Bilibili compatibility
     pub id_str: Option<String>, // New field for string IDs like Douyin sec_uid
     pub name: String,
     pub avatar: String,
@@ -30,74 +30,76 @@ impl Database {
         let platform = PlatformType::from_str(platform).unwrap();

         let csrf = if platform == PlatformType::Douyin {
-            Some("".to_string())
+            Some(String::new())
         } else {
             // parse cookies
             cookies
                 .split(';')
-                .map(|cookie| cookie.trim())
+                .map(str::trim)
                 .find_map(|cookie| -> Option<String> {
-                    match cookie.starts_with("bili_jct=") {
-                        true => {
-                            let var_name = &"bili_jct=";
-                            Some(cookie[var_name.len()..].to_string())
-                        }
-                        false => None,
+                    if cookie.starts_with("bili_jct=") {
+                        let var_name = &"bili_jct=";
+                        Some(cookie[var_name.len()..].to_string())
+                    } else {
+                        None
                     }
                 })
         };

         if csrf.is_none() {
-            return Err(DatabaseError::InvalidCookiesError);
+            return Err(DatabaseError::InvalidCookies);
         }

         // parse uid and id_str based on platform
         let (uid, id_str) = if platform == PlatformType::BiliBili {
             // For Bilibili, extract numeric uid from cookies
-            let uid = cookies
+            let uid = (*cookies
                 .split("DedeUserID=")
                 .collect::<Vec<&str>>()
                 .get(1)
                 .unwrap()
-                .split(";")
+                .split(';')
                 .collect::<Vec<&str>>()
                 .first()
-                .unwrap()
-                .to_string()
-                .parse::<u64>()
-                .map_err(|_| DatabaseError::InvalidCookiesError)?;
+                .unwrap())
+            .to_string()
+            .parse::<u64>()
+            .map_err(|_| DatabaseError::InvalidCookies)?;
             (uid, None)
         } else {
             // For Douyin, use temporary uid and will set id_str later with real sec_uid
-            let temp_uid = rand::thread_rng().gen_range(10000..=i32::MAX) as u64;
-            (temp_uid, Some(format!("temp_{}", temp_uid)))
+            // Fix: generate within the desired range as u64 to avoid `clippy::cast-sign-loss`.
+            let temp_uid = rand::thread_rng().gen_range(10000u64..=i32::MAX as u64);
+            (temp_uid, Some(format!("temp_{temp_uid}")))
         };

+        let uid = i64::try_from(uid).map_err(|_| DatabaseError::InvalidCookies)?;
+
         let account = AccountRow {
             platform: platform.as_str().to_string(),
             uid,
             id_str,
-            name: "".into(),
-            avatar: "".into(),
+            name: String::new(),
+            avatar: String::new(),
             csrf: csrf.unwrap(),
             cookies: cookies.into(),
             created_at: Utc::now().to_rfc3339(),
         };

-        sqlx::query("INSERT INTO accounts (uid, platform, id_str, name, avatar, csrf, cookies, created_at) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)").bind(account.uid as i64).bind(&account.platform).bind(&account.id_str).bind(&account.name).bind(&account.avatar).bind(&account.csrf).bind(&account.cookies).bind(&account.created_at).execute(&lock).await?;
+        sqlx::query("INSERT INTO accounts (uid, platform, id_str, name, avatar, csrf, cookies, created_at) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)").bind(uid).bind(&account.platform).bind(&account.id_str).bind(&account.name).bind(&account.avatar).bind(&account.csrf).bind(&account.cookies).bind(&account.created_at).execute(&lock).await?;

         Ok(account)
     }

-    pub async fn remove_account(&self, platform: &str, uid: u64) -> Result<(), DatabaseError> {
+    pub async fn remove_account(&self, platform: &str, uid: i64) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         let sql = sqlx::query("DELETE FROM accounts WHERE uid = $1 and platform = $2")
-            .bind(uid as i64)
+            .bind(uid)
             .bind(platform)
             .execute(&lock)
             .await?;
         if sql.rows_affected() != 1 {
-            return Err(DatabaseError::NotFoundError);
+            return Err(DatabaseError::NotFound);
         }
         Ok(())
     }
@@ -105,7 +107,7 @@ impl Database {
     pub async fn update_account(
         &self,
         platform: &str,
-        uid: u64,
+        uid: i64,
         name: &str,
         avatar: &str,
     ) -> Result<(), DatabaseError> {
@@ -115,12 +117,12 @@ impl Database {
         )
         .bind(name)
         .bind(avatar)
-        .bind(uid as i64)
+        .bind(uid)
         .bind(platform)
         .execute(&lock)
         .await?;
         if sql.rows_affected() != 1 {
-            return Err(DatabaseError::NotFoundError);
+            return Err(DatabaseError::NotFound);
         }
         Ok(())
     }
@@ -135,17 +137,28 @@ impl Database {
         let lock = self.db.read().await.clone().unwrap();

         // If the id_str changed, we need to delete the old record and create a new one
-        if old_account.id_str.as_deref() != Some(new_id_str) {
+        if old_account.id_str.as_deref() == Some(new_id_str) {
+            // id_str is the same, just update name and avatar
+            sqlx::query(
+                "UPDATE accounts SET name = $1, avatar = $2 WHERE uid = $3 and platform = $4",
+            )
+            .bind(name)
+            .bind(avatar)
+            .bind(old_account.uid)
+            .bind(&old_account.platform)
+            .execute(&lock)
+            .await?;
+        } else {
             // Delete the old record (for Douyin accounts, we use uid to identify)
             sqlx::query("DELETE FROM accounts WHERE uid = $1 and platform = $2")
-                .bind(old_account.uid as i64)
+                .bind(old_account.uid)
                 .bind(&old_account.platform)
                 .execute(&lock)
                 .await?;

             // Insert the new record with updated id_str
             sqlx::query("INSERT INTO accounts (uid, platform, id_str, name, avatar, csrf, cookies, created_at) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)")
-                .bind(old_account.uid as i64)
+                .bind(old_account.uid)
                 .bind(&old_account.platform)
                 .bind(new_id_str)
                 .bind(name)
@@ -155,17 +168,6 @@ impl Database {
                 .bind(&old_account.created_at)
                 .execute(&lock)
                 .await?;
-        } else {
-            // id_str is the same, just update name and avatar
-            sqlx::query(
-                "UPDATE accounts SET name = $1, avatar = $2 WHERE uid = $3 and platform = $4",
-            )
-            .bind(name)
-            .bind(avatar)
-            .bind(old_account.uid as i64)
-            .bind(&old_account.platform)
-            .execute(&lock)
-            .await?;
         }

         Ok(())
@@ -178,12 +180,12 @@ impl Database {
         .await?)
     }

-    pub async fn get_account(&self, platform: &str, uid: u64) -> Result<AccountRow, DatabaseError> {
+    pub async fn get_account(&self, platform: &str, uid: i64) -> Result<AccountRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         Ok(sqlx::query_as::<_, AccountRow>(
             "SELECT * FROM accounts WHERE uid = $1 and platform = $2",
         )
-        .bind(uid as i64)
+        .bind(uid)
         .bind(platform)
         .fetch_one(&lock)
         .await?)
@@ -200,7 +202,7 @@ impl Database {
         .fetch_all(&lock)
         .await?;
         if accounts.is_empty() {
-            return Err(DatabaseError::NotFoundError);
+            return Err(DatabaseError::NotFound);
         }
         // randomly select one account
         let account = accounts.choose(&mut rand::thread_rng()).unwrap();
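The bool-`match` on `starts_with` above becomes a plain if/else; the slicing itself is unchanged. For illustration only, `strip_prefix` would collapse the check and the slice into one call (a rewrite suggestion, not what the patch does):

```rust
fn extract_csrf(cookies: &str) -> Option<String> {
    cookies
        .split(';')
        .map(str::trim)
        .find_map(|c| c.strip_prefix("bili_jct=").map(str::to_string))
}

fn main() {
    assert_eq!(
        extract_csrf("DedeUserID=1; bili_jct=abc123"),
        Some("abc123".to_string())
    );
}
```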
@@ -1,6 +1,6 @@
-use custom_error::custom_error;
 use sqlx::Pool;
 use sqlx::Sqlite;
+use thiserror::Error;
 use tokio::sync::RwLock;

 pub mod account;
@@ -14,23 +14,25 @@ pub struct Database {
     db: RwLock<Option<Pool<Sqlite>>>,
 }

-custom_error! { pub DatabaseError
-    InsertError = "Entry insert failed",
-    NotFoundError = "Entry not found",
-    InvalidCookiesError = "Cookies are invalid",
-    DBError {err: sqlx::Error } = "DB error: {err}",
-    SQLError { sql: String } = "SQL is incorret: {sql}"
+#[derive(Error, Debug)]
+pub enum DatabaseError {
+    #[error("Entry insert failed")]
+    Insert,
+    #[error("Entry not found")]
+    NotFound,
+    #[error("Cookies are invalid")]
+    InvalidCookies,
+    #[error("Number exceed i64 range")]
+    NumberExceedI64Range,
+    #[error("DB error: {0}")]
+    DB(#[from] sqlx::Error),
+    #[error("SQL is incorret: {sql}")]
+    Sql { sql: String },
 }

 impl From<DatabaseError> for String {
-    fn from(value: DatabaseError) -> Self {
-        value.to_string()
-    }
-}
-
-impl From<sqlx::Error> for DatabaseError {
-    fn from(value: sqlx::Error) -> Self {
-        DatabaseError::DBError { err: value }
+    fn from(err: DatabaseError) -> Self {
+        err.to_string()
     }
 }
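`custom_error!` is replaced by a thiserror enum; the `#[from]` on the sqlx variant is what lets every `?` on an sqlx call convert automatically, which is why the hand-written `From<sqlx::Error>` impl could be deleted. A usage sketch:

```rust
use thiserror::Error;

#[derive(Error, Debug)]
enum DatabaseError {
    #[error("Entry not found")]
    NotFound,
    // #[from] generates `impl From<sqlx::Error> for DatabaseError`.
    #[error("DB error: {0}")]
    Db(#[from] sqlx::Error),
}

// The `?` below converts sqlx::Error into DatabaseError::Db automatically.
async fn count_rows(pool: &sqlx::SqlitePool) -> Result<i64, DatabaseError> {
    let row: (i64,) = sqlx::query_as("SELECT COUNT(*) FROM records")
        .fetch_one(pool)
        .await?;
    Ok(row.0)
}
```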
@@ -4,11 +4,12 @@ use super::Database;
 use super::DatabaseError;
 use chrono::Utc;

-#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
+#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
 pub struct RecordRow {
     pub platform: String,
+    pub parent_id: String,
     pub live_id: String,
-    pub room_id: u64,
+    pub room_id: i64,
     pub title: String,
     pub length: i64,
     pub size: i64,
@@ -18,53 +19,76 @@ pub struct RecordRow {

 // CREATE TABLE records (live_id INTEGER PRIMARY KEY, room_id INTEGER, title TEXT, length INTEGER, size INTEGER, created_at TEXT);
 impl Database {
-    pub async fn get_records(&self, room_id: u64) -> Result<Vec<RecordRow>, DatabaseError> {
+    pub async fn get_records(
+        &self,
+        room_id: i64,
+        offset: i64,
+        limit: i64,
+    ) -> Result<Vec<RecordRow>, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
-        Ok(
-            sqlx::query_as::<_, RecordRow>("SELECT * FROM records WHERE room_id = $1")
-                .bind(room_id as i64)
-                .fetch_all(&lock)
-                .await?,
-        )
+        Ok(sqlx::query_as::<_, RecordRow>(
+            "SELECT * FROM records WHERE room_id = $1 ORDER BY created_at DESC LIMIT $2 OFFSET $3",
+        )
+        .bind(room_id)
+        .bind(limit)
+        .bind(offset)
+        .fetch_all(&lock)
+        .await?)
     }

     pub async fn get_record(
         &self,
-        room_id: u64,
+        room_id: i64,
         live_id: &str,
     ) -> Result<RecordRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         Ok(sqlx::query_as::<_, RecordRow>(
-            "SELECT * FROM records WHERE live_id = $1 and room_id = $2",
+            "SELECT * FROM records WHERE room_id = $1 and live_id = $2",
         )
+        .bind(room_id)
         .bind(live_id)
-        .bind(room_id as i64)
         .fetch_one(&lock)
         .await?)
     }

+    pub async fn get_archives_by_parent_id(
+        &self,
+        room_id: i64,
+        parent_id: &str,
+    ) -> Result<Vec<RecordRow>, DatabaseError> {
+        let lock = self.db.read().await.clone().unwrap();
+        Ok(sqlx::query_as::<_, RecordRow>(
+            "SELECT * FROM records WHERE room_id = $1 and parent_id = $2",
+        )
+        .bind(room_id)
+        .bind(parent_id)
+        .fetch_all(&lock)
+        .await?)
+    }
+
     pub async fn add_record(
         &self,
         platform: PlatformType,
+        parent_id: &str,
         live_id: &str,
-        room_id: u64,
+        room_id: i64,
         title: &str,
         cover: Option<String>,
-        created_at: Option<&str>,
     ) -> Result<RecordRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         let record = RecordRow {
             platform: platform.as_str().to_string(),
+            parent_id: parent_id.to_string(),
             live_id: live_id.to_string(),
             room_id,
             title: title.into(),
             length: 0,
             size: 0,
-            created_at: created_at.unwrap_or(&Utc::now().to_rfc3339()).to_string(),
+            created_at: Utc::now().to_rfc3339().to_string(),
             cover,
         };
-        if let Err(e) = sqlx::query("INSERT INTO records (live_id, room_id, title, length, size, cover, created_at, platform) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)").bind(record.live_id.clone())
-            .bind(record.room_id as i64).bind(&record.title).bind(0).bind(0).bind(&record.cover).bind(&record.created_at).bind(platform.as_str().to_string()).execute(&lock).await {
+        if let Err(e) = sqlx::query("INSERT INTO records (live_id, room_id, title, length, size, cover, created_at, platform, parent_id) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)").bind(record.live_id.clone())
+            .bind(record.room_id).bind(&record.title).bind(0).bind(0).bind(&record.cover).bind(&record.created_at).bind(platform.as_str().to_string()).bind(parent_id).execute(&lock).await {
             // if the record already exists, return the existing record
             if e.to_string().contains("UNIQUE constraint failed") {
                 return self.get_record(room_id, live_id).await;
@@ -73,13 +97,17 @@ impl Database {
         Ok(record)
     }

-    pub async fn remove_record(&self, live_id: &str) -> Result<(), DatabaseError> {
+    pub async fn remove_record(&self, live_id: &str) -> Result<RecordRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
+        let to_delete = sqlx::query_as::<_, RecordRow>("SELECT * FROM records WHERE live_id = $1")
+            .bind(live_id)
+            .fetch_one(&lock)
+            .await?;
         sqlx::query("DELETE FROM records WHERE live_id = $1")
             .bind(live_id)
             .execute(&lock)
             .await?;
-        Ok(())
+        Ok(to_delete)
     }

     pub async fn update_record(
@@ -89,9 +117,38 @@ impl Database {
         size: u64,
     ) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
+        let size = i64::try_from(size).map_err(|_| DatabaseError::NumberExceedI64Range)?;
         sqlx::query("UPDATE records SET length = $1, size = $2 WHERE live_id = $3")
             .bind(length)
-            .bind(size as i64)
+            .bind(size)
             .bind(live_id)
             .execute(&lock)
             .await?;
         Ok(())
     }

+    pub async fn update_record_parent_id(
+        &self,
+        live_id: &str,
+        parent_id: &str,
+    ) -> Result<(), DatabaseError> {
+        let lock = self.db.read().await.clone().unwrap();
+        sqlx::query("UPDATE records SET parent_id = $1 WHERE live_id = $2")
+            .bind(parent_id)
+            .bind(live_id)
+            .execute(&lock)
+            .await?;
+        Ok(())
+    }
+
     pub async fn update_record_cover(
         &self,
         live_id: &str,
         cover: Option<String>,
     ) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         sqlx::query("UPDATE records SET cover = $1 WHERE live_id = $2")
             .bind(cover)
             .bind(live_id)
             .execute(&lock)
             .await?;
@@ -123,28 +180,36 @@ impl Database {

     pub async fn get_recent_record(
         &self,
-        room_id: u64,
-        offset: u64,
-        limit: u64,
+        room_id: i64,
+        offset: i64,
+        limit: i64,
     ) -> Result<Vec<RecordRow>, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         if room_id == 0 {
             Ok(sqlx::query_as::<_, RecordRow>(
                 "SELECT * FROM records ORDER BY created_at DESC LIMIT $1 OFFSET $2",
             )
-            .bind(limit as i64)
-            .bind(offset as i64)
+            .bind(limit)
+            .bind(offset)
             .fetch_all(&lock)
             .await?)
         } else {
             Ok(sqlx::query_as::<_, RecordRow>(
                 "SELECT * FROM records WHERE room_id = $1 ORDER BY created_at DESC LIMIT $2 OFFSET $3",
             )
-            .bind(room_id as i64)
-            .bind(limit as i64)
-            .bind(offset as i64)
+            .bind(room_id)
+            .bind(limit)
+            .bind(offset)
             .fetch_all(&lock)
             .await?)
         }
     }

     pub async fn get_record_disk_usage(&self) -> Result<i64, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         let result: (i64,) = sqlx::query_as("SELECT SUM(size) FROM records;")
             .fetch_one(&lock)
             .await?;
         Ok(result.0)
     }
 }
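`get_records` gains LIMIT/OFFSET pagination with newest-first ordering; note the bind order is room_id, limit, offset to match `$1..$3` in the SQL. A hypothetical call site:

```rust
// Hypothetical helper: fetch page `page` (0-based) of a room's records,
// 20 rows per page, using the new (room_id, offset, limit) signature.
async fn load_page(
    db: &Database,
    room_id: i64,
    page: i64,
) -> Result<Vec<RecordRow>, DatabaseError> {
    const PAGE_SIZE: i64 = 20;
    db.get_records(room_id, page * PAGE_SIZE, PAGE_SIZE).await
}
```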
@@ -4,9 +4,9 @@ use crate::recorder::PlatformType;
 use chrono::Utc;
 /// Recorder in database is pretty simple
 /// because many room infos are collected in realtime
-#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
+#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
 pub struct RecorderRow {
-    pub room_id: u64,
+    pub room_id: i64,
     pub created_at: String,
     pub platform: String,
     pub auto_start: bool,
@@ -18,7 +18,7 @@ impl Database {
     pub async fn add_recorder(
         &self,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
         extra: &str,
     ) -> Result<RecorderRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
@@ -32,7 +32,7 @@ impl Database {
         let _ = sqlx::query(
             "INSERT OR REPLACE INTO recorders (room_id, created_at, platform, auto_start, extra) VALUES ($1, $2, $3, $4, $5)",
         )
-        .bind(room_id as i64)
+        .bind(room_id)
         .bind(&recorder.created_at)
         .bind(platform.as_str())
         .bind(recorder.auto_start)
@@ -42,19 +42,24 @@ impl Database {
         Ok(recorder)
     }

-    pub async fn remove_recorder(&self, room_id: u64) -> Result<(), DatabaseError> {
+    pub async fn remove_recorder(&self, room_id: i64) -> Result<RecorderRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
+        let recorder =
+            sqlx::query_as::<_, RecorderRow>("SELECT * FROM recorders WHERE room_id = $1")
+                .bind(room_id)
+                .fetch_one(&lock)
+                .await?;
         let sql = sqlx::query("DELETE FROM recorders WHERE room_id = $1")
-            .bind(room_id as i64)
+            .bind(room_id)
             .execute(&lock)
             .await?;
         if sql.rows_affected() != 1 {
-            return Err(DatabaseError::NotFoundError);
+            return Err(DatabaseError::NotFound);
         }

         // remove related archive
         let _ = self.remove_archive(room_id).await;
-        Ok(())
+        Ok(recorder)
     }

     pub async fn get_recorders(&self) -> Result<Vec<RecorderRow>, DatabaseError> {
@@ -66,10 +71,10 @@ impl Database {
         .await?)
     }

-    pub async fn remove_archive(&self, room_id: u64) -> Result<(), DatabaseError> {
+    pub async fn remove_archive(&self, room_id: i64) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         let _ = sqlx::query("DELETE FROM records WHERE room_id = $1")
-            .bind(room_id as i64)
+            .bind(room_id)
             .execute(&lock)
             .await?;
         Ok(())
@@ -78,7 +83,7 @@ impl Database {
     pub async fn update_recorder(
         &self,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
         auto_start: bool,
     ) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
@@ -87,7 +92,7 @@ impl Database {
         )
         .bind(auto_start)
         .bind(platform.as_str().to_string())
-        .bind(room_id as i64)
+        .bind(room_id)
         .execute(&lock)
         .await?;
         Ok(())
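`remove_recorder` now returns the deleted row, implemented as a SELECT followed by a DELETE. Between those two statements another writer could touch the row; wrapping both in a transaction closes that window (a sketch, not part of the patch):

```rust
// Select-then-delete inside one transaction, so no other writer can slip
// in between the two statements.
async fn remove_recorder_tx(
    pool: &sqlx::SqlitePool,
    room_id: i64,
) -> Result<RecorderRow, sqlx::Error> {
    let mut tx = pool.begin().await?;
    let recorder: RecorderRow =
        sqlx::query_as("SELECT * FROM recorders WHERE room_id = $1")
            .bind(room_id)
            .fetch_one(&mut *tx)
            .await?;
    sqlx::query("DELETE FROM recorders WHERE room_id = $1")
        .bind(room_id)
        .execute(&mut *tx)
        .await?;
    tx.commit().await?;
    Ok(recorder)
}
```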
@@ -13,6 +13,27 @@ pub struct TaskRow {
 }

 impl Database {
+    pub async fn generate_task(
+        &self,
+        task_type: &str,
+        message: &str,
+        metadata: &str,
+    ) -> Result<TaskRow, DatabaseError> {
+        let task_id = uuid::Uuid::new_v4().to_string();
+        let task = TaskRow {
+            id: task_id,
+            task_type: task_type.to_string(),
+            status: "pending".to_string(),
+            message: message.to_string(),
+            metadata: metadata.to_string(),
+            created_at: chrono::Utc::now().to_rfc3339(),
+        };
+
+        self.add_task(&task).await?;
+
+        Ok(task)
+    }
+
     pub async fn add_task(&self, task: &TaskRow) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         let _ = sqlx::query(
@@ -2,29 +2,13 @@ use super::Database;
 use super::DatabaseError;

 // CREATE TABLE videos (id INTEGER PRIMARY KEY, room_id INTEGER, cover TEXT, file TEXT, length INTEGER, size INTEGER, status INTEGER, bvid TEXT, title TEXT, desc TEXT, tags TEXT, area INTEGER, created_at TEXT);
-#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
+#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
 pub struct VideoRow {
     pub id: i64,
-    pub room_id: u64,
+    pub room_id: i64,
     pub cover: String,
     pub file: String,
     pub length: i64,
     pub size: i64,
     pub status: i64,
     pub bvid: String,
     pub title: String,
     pub desc: String,
     pub tags: String,
     pub area: i64,
     pub created_at: String,
     pub platform: String,
 }

-#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
-pub struct VideoNoCover {
-    pub id: i64,
-    pub room_id: u64,
-    pub file: String,
-    pub note: String,
-    pub length: i64,
-    pub size: i64,
-    pub status: i64,
@@ -38,10 +22,10 @@ pub struct VideoNoCover {
 }

 impl Database {
-    pub async fn get_videos(&self, room_id: u64) -> Result<Vec<VideoNoCover>, DatabaseError> {
+    pub async fn get_videos(&self, room_id: i64) -> Result<Vec<VideoRow>, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
-        let videos = sqlx::query_as::<_, VideoNoCover>("SELECT * FROM videos WHERE room_id = $1;")
-            .bind(room_id as i64)
+        let videos = sqlx::query_as::<_, VideoRow>("SELECT * FROM videos WHERE room_id = $1;")
+            .bind(room_id)
             .fetch_all(&lock)
             .await?;
         Ok(videos)
@@ -59,13 +43,14 @@ impl Database {

     pub async fn update_video(&self, video_row: &VideoRow) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
-        sqlx::query("UPDATE videos SET status = $1, bvid = $2, title = $3, desc = $4, tags = $5, area = $6 WHERE id = $7")
+        sqlx::query("UPDATE videos SET status = $1, bvid = $2, title = $3, desc = $4, tags = $5, area = $6, note = $7 WHERE id = $8")
             .bind(video_row.status)
             .bind(&video_row.bvid)
             .bind(&video_row.title)
             .bind(&video_row.desc)
             .bind(&video_row.tags)
             .bind(video_row.area)
+            .bind(&video_row.note)
             .bind(video_row.id)
             .execute(&lock)
             .await?;
@@ -83,10 +68,11 @@ impl Database {

     pub async fn add_video(&self, video: &VideoRow) -> Result<VideoRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
-        let sql = sqlx::query("INSERT INTO videos (room_id, cover, file, length, size, status, bvid, title, desc, tags, area, created_at, platform) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)")
-            .bind(video.room_id as i64)
+        let sql = sqlx::query("INSERT INTO videos (room_id, cover, file, note, length, size, status, bvid, title, desc, tags, area, created_at, platform) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14)")
+            .bind(video.room_id)
             .bind(&video.cover)
             .bind(&video.file)
+            .bind(&video.note)
             .bind(video.length)
             .bind(video.size)
             .bind(video.status)
@@ -106,7 +92,7 @@ impl Database {
         Ok(video)
     }

-    pub async fn update_video_cover(&self, id: i64, cover: String) -> Result<(), DatabaseError> {
+    pub async fn update_video_cover(&self, id: i64, cover: &str) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         sqlx::query("UPDATE videos SET cover = $1 WHERE id = $2")
             .bind(cover)
@@ -116,10 +102,10 @@ impl Database {
         Ok(())
     }

-    pub async fn get_all_videos(&self) -> Result<Vec<VideoNoCover>, DatabaseError> {
+    pub async fn get_all_videos(&self) -> Result<Vec<VideoRow>, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         let videos =
-            sqlx::query_as::<_, VideoNoCover>("SELECT * FROM videos ORDER BY created_at DESC;")
+            sqlx::query_as::<_, VideoRow>("SELECT * FROM videos ORDER BY created_at DESC;")
                 .fetch_all(&lock)
                 .await?;
         Ok(videos)
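`get_videos` and `get_all_videos` now return full `VideoRow` values, covers included, and the `VideoNoCover` projection is being retired. The old type worked because sqlx's derived `FromRow` maps columns by field name and simply ignores row columns the struct does not declare:

```rust
// Columns in the result set with no matching field are ignored by the
// derived FromRow impl - that is how VideoNoCover skipped the cover column.
#[derive(sqlx::FromRow)]
struct VideoSummary {
    id: i64,
    title: String,
}

async fn summaries(pool: &sqlx::SqlitePool) -> Result<Vec<VideoSummary>, sqlx::Error> {
    sqlx::query_as::<_, VideoSummary>("SELECT * FROM videos")
        .fetch_all(pool)
        .await
}
```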
@@ -1,999 +0,0 @@
-use std::fmt;
-use std::path::{Path, PathBuf};
-use std::process::Stdio;
-
-use crate::progress_reporter::{ProgressReporter, ProgressReporterTrait};
-use crate::subtitle_generator::whisper_online;
-use crate::subtitle_generator::{
-    whisper_cpp, GenerateResult, SubtitleGenerator, SubtitleGeneratorType,
-};
-use async_ffmpeg_sidecar::event::{FfmpegEvent, LogLevel};
-use async_ffmpeg_sidecar::log_parser::FfmpegLogParser;
-use serde::{Deserialize, Serialize};
-use tokio::io::{AsyncBufReadExt, BufReader};
-
-// Video metadata structure
-#[derive(Debug)]
-pub struct VideoMetadata {
-    pub duration: f64,
-    pub width: u32,
-    pub height: u32,
-}
-
-#[cfg(target_os = "windows")]
-const CREATE_NO_WINDOW: u32 = 0x08000000;
-#[cfg(target_os = "windows")]
-#[allow(unused_imports)]
-use std::os::windows::process::CommandExt;
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct Range {
-    pub start: f64,
-    pub end: f64,
-}
-
-impl fmt::Display for Range {
-    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        write!(f, "[{}, {}]", self.start, self.end)
-    }
-}
-
-impl Range {
-    pub fn duration(&self) -> f64 {
-        self.end - self.start
-    }
-}
-
-pub async fn clip_from_m3u8(
-    reporter: Option<&impl ProgressReporterTrait>,
-    m3u8_index: &Path,
-    output_path: &Path,
-    range: Option<&Range>,
-    fix_encoding: bool,
-) -> Result<(), String> {
-    // first check output folder exists
-    let output_folder = output_path.parent().unwrap();
-    if !output_folder.exists() {
-        log::warn!(
-            "Output folder does not exist, creating: {}",
-            output_folder.display()
-        );
-        std::fs::create_dir_all(output_folder).unwrap();
-    }
-
-    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
-    #[cfg(target_os = "windows")]
-    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
-
-    let child_command = ffmpeg_process.args(["-i", &format!("{}", m3u8_index.display())]);
-
-    if let Some(range) = range {
-        child_command
-            .args(["-ss", &range.start.to_string()])
-            .args(["-t", &range.duration().to_string()]);
-    }
-
-    if fix_encoding {
-        child_command
-            .args(["-c:v", "libx264"])
-            .args(["-c:a", "aac"])
-            .args(["-preset", "fast"]);
-    } else {
-        child_command.args(["-c", "copy"]);
-    }
-
-    let child = child_command
-        .args(["-y", output_path.to_str().unwrap()])
-        .args(["-progress", "pipe:2"])
-        .stderr(Stdio::piped())
-        .spawn();
-
-    if let Err(e) = child {
-        return Err(format!("Spawn ffmpeg process failed: {}", e));
-    }
-
-    let mut child = child.unwrap();
-    let stderr = child.stderr.take().unwrap();
-    let reader = BufReader::new(stderr);
-    let mut parser = FfmpegLogParser::new(reader);
-
-    let mut clip_error = None;
-    while let Ok(event) = parser.parse_next_event().await {
-        match event {
-            FfmpegEvent::Progress(p) => {
-                if reporter.is_none() {
-                    continue;
-                }
-                log::debug!("Clip progress: {}", p.time);
-                reporter
-                    .unwrap()
-                    .update(format!("编码中:{}", p.time).as_str())
-            }
-            FfmpegEvent::LogEOF => break,
-            FfmpegEvent::Log(level, content) => {
-                // log error if content contains error
-                if content.contains("error") || level == LogLevel::Error {
-                    log::error!("Clip error: {}", content);
-                }
-            }
-            FfmpegEvent::Error(e) => {
-                log::error!("Clip error: {}", e);
-                clip_error = Some(e.to_string());
-            }
-            _ => {}
-        }
-    }
-
-    if let Err(e) = child.wait().await {
-        log::error!("Clip error: {}", e);
-        return Err(e.to_string());
-    }
-
-    if let Some(error) = clip_error {
-        log::error!("Clip error: {}", error);
-        Err(error)
-    } else {
-        log::info!("Clip task end: {}", output_path.display());
-        Ok(())
-    }
-}
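For orientation, a hypothetical call into the removed `clip_from_m3u8`: cut 60-90 s out of a recorded HLS playlist without re-encoding, since `fix_encoding = false` maps to `-c copy`. The paths and the concrete reporter type here are illustrative only:

```rust
use std::path::Path;

async fn cut_highlight() -> Result<(), String> {
    let range = Range { start: 60.0, end: 90.0 };
    clip_from_m3u8(
        None::<&ProgressReporter>, // no progress updates wanted
        Path::new("cache/bilibili/123/live/playlist.m3u8"), // hypothetical path
        Path::new("clips/highlight.mp4"),                   // hypothetical path
        Some(&range),
        false, // stream copy, no re-encode
    )
    .await
}
```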
-pub async fn extract_audio_chunks(file: &Path, format: &str) -> Result<PathBuf, String> {
-    // ffmpeg -i fixed_\[30655190\]1742887114_0325084106_81.5.mp4 -ar 16000 test.wav
-    log::info!("Extract audio task start: {}", file.display());
-    let output_path = file.with_extension(format);
-    let mut extract_error = None;
-
-    // Lower the sample rate to speed up processing while keeping enough quality for speech recognition
-    let sample_rate = if format == "mp3" { "22050" } else { "16000" };
-
-    // First, get the duration of the input file
-    let duration = get_audio_duration(file).await?;
-    log::info!("Audio duration: {} seconds", duration);
-
-    // Split into chunks of 30 seconds
-    let chunk_duration = 30;
-    let chunk_count = (duration as f64 / chunk_duration as f64).ceil() as usize;
-    log::info!(
-        "Splitting into {} chunks of {} seconds each",
-        chunk_count,
-        chunk_duration
-    );
-
-    // Create output directory for chunks
-    let output_dir = output_path.parent().unwrap();
-    let base_name = output_path.file_stem().unwrap().to_str().unwrap();
-    let chunk_dir = output_dir.join(format!("{}_chunks", base_name));
-
-    if !chunk_dir.exists() {
-        std::fs::create_dir_all(&chunk_dir)
-            .map_err(|e| format!("Failed to create chunk directory: {}", e))?;
-    }
-
-    // Use ffmpeg segment feature to split audio into chunks
-    let segment_pattern = chunk_dir.join(format!("{}_%03d.{}", base_name, format));
-
-    // Build the optimized ffmpeg argument list
-    let file_str = file.to_str().unwrap();
-    let chunk_duration_str = chunk_duration.to_string();
-    let segment_pattern_str = segment_pattern.to_str().unwrap();
-
-    let mut args = vec![
-        "-i",
-        file_str,
-        "-ar",
-        sample_rate,
-        "-vn",
-        "-f",
-        "segment",
-        "-segment_time",
-        &chunk_duration_str,
-        "-reset_timestamps",
-        "1",
-        "-y",
-        "-progress",
-        "pipe:2",
-    ];
-
-    // Add format-specific encoder options
-    if format == "mp3" {
-        args.extend_from_slice(&[
-            "-c:a",
-            "mp3",
-            "-b:a",
-            "64k", // lower bitrate for speed
-            "-compression_level",
-            "0", // fastest compression
-        ]);
-    } else {
-        args.extend_from_slice(&[
-            "-c:a",
-            "pcm_s16le", // PCM encoding is faster
-        ]);
-    }
-
-    // Performance options
-    args.extend_from_slice(&[
-        "-threads", "0", // use all available CPU cores
-    ]);
-
-    args.push(segment_pattern_str);
-
-    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
-    #[cfg(target_os = "windows")]
-    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
-
-    let child = ffmpeg_process.args(&args).stderr(Stdio::piped()).spawn();
-
-    if let Err(e) = child {
-        return Err(e.to_string());
-    }
-
-    let mut child = child.unwrap();
-    let stderr = child.stderr.take().unwrap();
-    let reader = BufReader::new(stderr);
-    let mut parser = FfmpegLogParser::new(reader);
-
-    while let Ok(event) = parser.parse_next_event().await {
-        match event {
-            FfmpegEvent::Error(e) => {
-                log::error!("Extract audio error: {}", e);
-                extract_error = Some(e.to_string());
-            }
-            FfmpegEvent::LogEOF => break,
-            FfmpegEvent::Log(_level, _content) => {}
-            _ => {}
-        }
-    }
-
-    if let Err(e) = child.wait().await {
-        log::error!("Extract audio error: {}", e);
-        return Err(e.to_string());
-    }
-
-    if let Some(error) = extract_error {
-        log::error!("Extract audio error: {}", error);
-        Err(error)
-    } else {
-        log::info!(
-            "Extract audio task end: {} chunks created in {}",
-            chunk_count,
-            chunk_dir.display()
-        );
-        Ok(chunk_dir)
-    }
-}
-
-/// Get the duration of an audio/video file in seconds
-async fn get_audio_duration(file: &Path) -> Result<u64, String> {
-    // Use ffprobe with format option to get duration
-    let mut ffprobe_process = tokio::process::Command::new(ffprobe_path());
-    #[cfg(target_os = "windows")]
-    ffprobe_process.creation_flags(CREATE_NO_WINDOW);
-
-    let child = ffprobe_process
-        .args(["-v", "quiet"])
-        .args(["-show_entries", "format=duration"])
-        .args(["-of", "csv=p=0"])
-        .args(["-i", file.to_str().unwrap()])
-        .stdout(Stdio::piped())
-        .stderr(Stdio::piped())
-        .spawn();
-
-    if let Err(e) = child {
-        return Err(format!("Failed to spawn ffprobe process: {}", e));
-    }
-
-    let mut child = child.unwrap();
-    let stdout = child.stdout.take().unwrap();
-    let reader = BufReader::new(stdout);
-    let mut parser = FfmpegLogParser::new(reader);
-
-    let mut duration = None;
-    while let Ok(event) = parser.parse_next_event().await {
-        match event {
-            FfmpegEvent::LogEOF => break,
-            FfmpegEvent::Log(_level, content) => {
-                // The new command outputs duration directly as a float
-                if let Ok(seconds_f64) = content.trim().parse::<f64>() {
-                    duration = Some(seconds_f64.ceil() as u64);
-                    log::debug!("Parsed duration: {} seconds", seconds_f64);
-                }
-            }
-            _ => {}
-        }
-    }
-
-    if let Err(e) = child.wait().await {
-        log::error!("Failed to get duration: {}", e);
-        return Err(e.to_string());
-    }
-
-    duration.ok_or_else(|| "Failed to parse duration".to_string())
-}
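With `-show_entries format=duration -of csv=p=0`, ffprobe prints a bare float such as `123.456`; the Rust side just trims, parses, and rounds up to whole seconds. Extracted as a standalone function:

```rust
fn parse_ffprobe_duration(output: &str) -> Option<u64> {
    output.trim().parse::<f64>().ok().map(|secs| secs.ceil() as u64)
}

fn main() {
    assert_eq!(parse_ffprobe_duration("123.456\n"), Some(124));
    assert_eq!(parse_ffprobe_duration("not a number"), None);
}
```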
-/// Get the precise duration of a video segment (TS/MP4) in seconds
-pub async fn get_segment_duration(file: &Path) -> Result<f64, String> {
-    // Use ffprobe to get the exact duration of the segment
-    let mut ffprobe_process = tokio::process::Command::new(ffprobe_path());
-    #[cfg(target_os = "windows")]
-    ffprobe_process.creation_flags(CREATE_NO_WINDOW);
-
-    let child = ffprobe_process
-        .args(["-v", "quiet"])
-        .args(["-show_entries", "format=duration"])
-        .args(["-of", "csv=p=0"])
-        .args(["-i", file.to_str().unwrap()])
-        .stdout(Stdio::piped())
-        .stderr(Stdio::piped())
-        .spawn();
-
-    if let Err(e) = child {
-        return Err(format!(
-            "Failed to spawn ffprobe process for segment: {}",
-            e
-        ));
-    }
-
-    let mut child = child.unwrap();
-    let stdout = child.stdout.take().unwrap();
-    let reader = BufReader::new(stdout);
-    let mut parser = FfmpegLogParser::new(reader);
-
-    let mut duration = None;
-    while let Ok(event) = parser.parse_next_event().await {
-        match event {
-            FfmpegEvent::LogEOF => break,
-            FfmpegEvent::Log(_level, content) => {
-                // Parse the exact duration as f64 for precise timing
-                if let Ok(seconds_f64) = content.trim().parse::<f64>() {
-                    duration = Some(seconds_f64);
-                    log::debug!("Parsed segment duration: {} seconds", seconds_f64);
-                }
-            }
-            _ => {}
-        }
-    }
-
-    if let Err(e) = child.wait().await {
-        log::error!("Failed to get segment duration: {}", e);
-        return Err(e.to_string());
-    }
-
-    duration.ok_or_else(|| "Failed to parse segment duration".to_string())
-}
-
-pub async fn encode_video_subtitle(
-    reporter: &impl ProgressReporterTrait,
-    file: &Path,
-    subtitle: &Path,
-    srt_style: String,
-) -> Result<String, String> {
-    // ffmpeg -i fixed_\[30655190\]1742887114_0325084106_81.5.mp4 -vf "subtitles=test.srt:force_style='FontSize=24'" -c:v libx264 -c:a copy output.mp4
-    log::info!("Encode video subtitle task start: {}", file.display());
-    log::info!("srt_style: {}", srt_style);
-    // output path is file with prefix [subtitle]
-    let output_filename = format!("[subtitle]{}", file.file_name().unwrap().to_str().unwrap());
-    let output_path = file.with_file_name(&output_filename);
-
-    // check output path exists - log but allow overwrite
-    if output_path.exists() {
-        log::info!("Output path already exists, will overwrite: {}", output_path.display());
-    }
-
-    let mut command_error = None;
-
-    // if windows
-    let subtitle = if cfg!(target_os = "windows") {
-        // escape characters in subtitle path
-        let subtitle = subtitle
-            .to_str()
-            .unwrap()
-            .replace("\\", "\\\\")
-            .replace(":", "\\:");
-        format!("'{}'", subtitle)
-    } else {
-        format!("'{}'", subtitle.display())
-    };
-    let vf = format!("subtitles={}:force_style='{}'", subtitle, srt_style);
-    log::info!("vf: {}", vf);
-
-    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
-    #[cfg(target_os = "windows")]
-    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
-
-    let child = ffmpeg_process
-        .args(["-i", file.to_str().unwrap()])
-        .args(["-vf", vf.as_str()])
-        .args(["-c:v", "libx264"])
-        .args(["-c:a", "copy"])
-        .args([output_path.to_str().unwrap()])
-        .args(["-y"])
-        .args(["-progress", "pipe:2"])
-        .stderr(Stdio::piped())
-        .spawn();
-
-    if let Err(e) = child {
-        return Err(e.to_string());
-    }
-
-    let mut child = child.unwrap();
-    let stderr = child.stderr.take().unwrap();
-    let reader = BufReader::new(stderr);
-    let mut parser = FfmpegLogParser::new(reader);
-
-    while let Ok(event) = parser.parse_next_event().await {
-        match event {
-            FfmpegEvent::Error(e) => {
-                log::error!("Encode video subtitle error: {}", e);
-                command_error = Some(e.to_string());
-            }
-            FfmpegEvent::Progress(p) => {
-                log::info!("Encode video subtitle progress: {}", p.time);
-                reporter.update(format!("压制中:{}", p.time).as_str());
-            }
-            FfmpegEvent::LogEOF => break,
-            FfmpegEvent::Log(_level, _content) => {}
-            _ => {}
-        }
-    }
-
-    if let Err(e) = child.wait().await {
-        log::error!("Encode video subtitle error: {}", e);
-        return Err(e.to_string());
-    }
-
-    if let Some(error) = command_error {
-        log::error!("Encode video subtitle error: {}", error);
-        Err(error)
-    } else {
-        log::info!("Encode video subtitle task end: {}", output_path.display());
-        Ok(output_filename)
-    }
-}
|
||||
|
||||
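Note the Windows-only escaping above: ffmpeg's subtitles filter treats backslashes and colons as filter syntax, so the path must be escaped before being embedded. A small sketch of that transformation, using a hypothetical path:

// Sketch: how the escaping above rewrites a Windows subtitle path.
fn escape_subtitle_path_example() {
    let raw = r"C:\clips\out.srt"; // hypothetical path
    let escaped = raw.replace('\\', r"\\").replace(':', r"\:");
    assert_eq!(escaped, r"C\:\\clips\\out.srt");
    // The resulting filter argument is then:
    // subtitles='C\:\\clips\\out.srt':force_style='FontSize=24'
}
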
pub async fn encode_video_danmu(
    reporter: Option<&impl ProgressReporterTrait>,
    file: &Path,
    subtitle: &Path,
) -> Result<PathBuf, String> {
    // ffmpeg -i fixed_\[30655190\]1742887114_0325084106_81.5.mp4 -vf ass=subtitle.ass -c:v libx264 -c:a copy output.mp4
    log::info!("Encode video danmu task start: {}", file.display());
    let danmu_filename = format!("[danmu]{}", file.file_name().unwrap().to_str().unwrap());
    let output_path = file.with_file_name(danmu_filename);

    // check output path exists - log but allow overwrite
    if output_path.exists() {
        log::info!(
            "Output path already exists, will overwrite: {}",
            output_path.display()
        );
    }

    let mut command_error = None;

    // if windows
    let subtitle = if cfg!(target_os = "windows") {
        // escape characters in subtitle path
        let subtitle = subtitle
            .to_str()
            .unwrap()
            .replace("\\", "\\\\")
            .replace(":", "\\:");
        format!("'{}'", subtitle)
    } else {
        format!("'{}'", subtitle.display())
    };

    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
    #[cfg(target_os = "windows")]
    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);

    let child = ffmpeg_process
        .args(["-i", file.to_str().unwrap()])
        .args(["-vf", &format!("ass={}", subtitle)])
        .args(["-c:v", "libx264"])
        .args(["-c:a", "copy"])
        .args([output_path.to_str().unwrap()])
        .args(["-y"])
        .args(["-progress", "pipe:2"])
        .stderr(Stdio::piped())
        .spawn();

    if let Err(e) = child {
        return Err(e.to_string());
    }

    let mut child = child.unwrap();
    let stderr = child.stderr.take().unwrap();
    let reader = BufReader::new(stderr);
    let mut parser = FfmpegLogParser::new(reader);

    while let Ok(event) = parser.parse_next_event().await {
        match event {
            FfmpegEvent::Error(e) => {
                log::error!("Encode video danmu error: {}", e);
                command_error = Some(e.to_string());
            }
            FfmpegEvent::Progress(p) => {
                log::debug!("Encode video danmu progress: {}", p.time);
                if reporter.is_none() {
                    continue;
                }
                reporter
                    .unwrap()
                    .update(format!("压制中:{}", p.time).as_str());
            }
            FfmpegEvent::Log(_level, _content) => {}
            FfmpegEvent::LogEOF => break,
            _ => {}
        }
    }

    if let Err(e) = child.wait().await {
        log::error!("Encode video danmu error: {}", e);
        return Err(e.to_string());
    }

    if let Some(error) = command_error {
        log::error!("Encode video danmu error: {}", error);
        Err(error)
    } else {
        log::info!("Encode video danmu task end: {}", output_path.display());
        Ok(output_path)
    }
}

pub async fn generic_ffmpeg_command(args: &[&str]) -> Result<String, String> {
    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
    #[cfg(target_os = "windows")]
    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);

    let child = ffmpeg_process.args(args).stderr(Stdio::piped()).spawn();
    if let Err(e) = child {
        return Err(e.to_string());
    }

    let mut child = child.unwrap();
    let stderr = child.stderr.take().unwrap();
    let reader = BufReader::new(stderr);
    let mut parser = FfmpegLogParser::new(reader);

    let mut logs = Vec::new();

    while let Ok(event) = parser.parse_next_event().await {
        match event {
            FfmpegEvent::Log(_level, content) => {
                logs.push(content);
            }
            FfmpegEvent::LogEOF => break,
            _ => {}
        }
    }

    if let Err(e) = child.wait().await {
        log::error!("Generic ffmpeg command error: {}", e);
        return Err(e.to_string());
    }

    Ok(logs.join("\n"))
}

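A usage sketch for the generic runner; the flags are illustrative (-buildconf just makes ffmpeg print its build configuration and exit):

// Sketch only: run a harmless ffmpeg invocation and collect its log output.
async fn buildconf_example() -> Result<(), String> {
    let logs = generic_ffmpeg_command(&["-hide_banner", "-buildconf"]).await?;
    println!("{logs}");
    Ok(())
}
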
#[allow(clippy::too_many_arguments)]
pub async fn generate_video_subtitle(
    reporter: Option<&ProgressReporter>,
    file: &Path,
    generator_type: &str,
    whisper_model: &str,
    whisper_prompt: &str,
    openai_api_key: &str,
    openai_api_endpoint: &str,
    language_hint: &str,
) -> Result<GenerateResult, String> {
    match generator_type {
        "whisper" => {
            if whisper_model.is_empty() {
                return Err("Whisper model not configured".to_string());
            }
            if let Ok(generator) = whisper_cpp::new(Path::new(&whisper_model), whisper_prompt).await
            {
                let chunk_dir = extract_audio_chunks(file, "wav").await?;

                let mut full_result = GenerateResult {
                    subtitle_id: "".to_string(),
                    subtitle_content: vec![],
                    generator_type: SubtitleGeneratorType::Whisper,
                };

                let mut chunk_paths = vec![];
                for entry in std::fs::read_dir(&chunk_dir)
                    .map_err(|e| format!("Failed to read chunk directory: {}", e))?
                {
                    let entry =
                        entry.map_err(|e| format!("Failed to read directory entry: {}", e))?;
                    let path = entry.path();
                    chunk_paths.push(path);
                }

                // sort chunk paths by name
                chunk_paths
                    .sort_by_key(|path| path.file_name().unwrap().to_str().unwrap().to_string());

                let mut results = Vec::new();
                for path in chunk_paths {
                    let result = generator
                        .generate_subtitle(reporter, &path, language_hint)
                        .await;
                    results.push(result);
                }

                for (i, result) in results.iter().enumerate() {
                    if let Ok(result) = result {
                        full_result.subtitle_id = result.subtitle_id.clone();
                        full_result.concat(result, 30 * i as u64);
                    }
                }

                // delete chunk directory
                let _ = tokio::fs::remove_dir_all(chunk_dir).await;

                Ok(full_result)
            } else {
                Err("Failed to initialize Whisper model".to_string())
            }
        }
        "whisper_online" => {
            if openai_api_key.is_empty() {
                return Err("API key not configured".to_string());
            }
            if let Ok(generator) = whisper_online::new(
                Some(openai_api_endpoint),
                Some(openai_api_key),
                Some(whisper_prompt),
            )
            .await
            {
                let chunk_dir = extract_audio_chunks(file, "mp3").await?;

                let mut full_result = GenerateResult {
                    subtitle_id: "".to_string(),
                    subtitle_content: vec![],
                    generator_type: SubtitleGeneratorType::WhisperOnline,
                };

                let mut chunk_paths = vec![];
                for entry in std::fs::read_dir(&chunk_dir)
                    .map_err(|e| format!("Failed to read chunk directory: {}", e))?
                {
                    let entry =
                        entry.map_err(|e| format!("Failed to read directory entry: {}", e))?;
                    let path = entry.path();
                    chunk_paths.push(path);
                }
                // sort chunk paths by name
                chunk_paths
                    .sort_by_key(|path| path.file_name().unwrap().to_str().unwrap().to_string());

                let mut results = Vec::new();
                for path in chunk_paths {
                    let result = generator
                        .generate_subtitle(reporter, &path, language_hint)
                        .await;
                    results.push(result);
                }

                for (i, result) in results.iter().enumerate() {
                    if let Ok(result) = result {
                        full_result.subtitle_id = result.subtitle_id.clone();
                        full_result.concat(result, 30 * i as u64);
                    }
                }

                // delete chunk directory
                let _ = tokio::fs::remove_dir_all(chunk_dir).await;

                Ok(full_result)
            } else {
                Err("Failed to initialize Whisper Online".to_string())
            }
        }
        _ => Err(format!(
            "Unknown subtitle generator type: {}",
            generator_type
        )),
    }
}

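The 30 * i offset in the concat step implies that extract_audio_chunks emits fixed 30-second chunks, so each chunk's subtitles are shifted by its index times 30 seconds. A caller sketch for the local whisper path; the model path and language hint are hypothetical:

// Sketch only: generate subtitles with a local whisper model.
async fn whisper_example(video: &std::path::Path) -> Result<(), String> {
    let result = generate_video_subtitle(
        None,                    // no progress reporter
        video,
        "whisper",
        "/models/ggml-base.bin", // hypothetical model path
        "",                      // no initial prompt
        "",                      // OpenAI key unused for local whisper
        "",                      // OpenAI endpoint unused for local whisper
        "auto",                  // hypothetical language hint
    )
    .await?;
    log::info!("generated subtitle id: {}", result.subtitle_id);
    Ok(())
}
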
/// Try running ffmpeg to check that it is available and parse its version
pub async fn check_ffmpeg() -> Result<String, String> {
    let child = tokio::process::Command::new(ffmpeg_path())
        .arg("-version")
        .stdout(Stdio::piped())
        .spawn();
    if let Err(e) = child {
        log::error!("Failed to spawn ffmpeg process: {e}");
        return Err(e.to_string());
    }

    let mut child = child.unwrap();

    let stdout = child.stdout.take();
    if stdout.is_none() {
        log::error!("Failed to take ffmpeg output");
        return Err("Failed to take ffmpeg output".into());
    }

    let stdout = stdout.unwrap();
    let reader = BufReader::new(stdout);
    let mut parser = FfmpegLogParser::new(reader);

    let mut version = None;
    while let Ok(event) = parser.parse_next_event().await {
        match event {
            FfmpegEvent::ParsedVersion(v) => version = Some(v.version),
            FfmpegEvent::LogEOF => break,
            _ => {}
        }
    }

    if let Some(version) = version {
        Ok(version)
    } else {
        Err("Failed to parse version from output".into())
    }
}

pub async fn get_video_resolution(file: &str) -> Result<String, String> {
    // ffprobe -v error -select_streams v:0 -show_entries stream=width,height -of csv=s=x:p=0 input.mp4
    let mut ffprobe_process = tokio::process::Command::new(ffprobe_path());
    #[cfg(target_os = "windows")]
    ffprobe_process.creation_flags(CREATE_NO_WINDOW);

    let child = ffprobe_process
        .arg("-i")
        .arg(file)
        .arg("-v")
        .arg("error")
        .arg("-select_streams")
        .arg("v:0")
        .arg("-show_entries")
        .arg("stream=width,height")
        .arg("-of")
        .arg("csv=s=x:p=0")
        .stdout(Stdio::piped())
        .spawn();
    if let Err(e) = child {
        log::error!("Failed to spawn ffprobe process: {e}");
        return Err(e.to_string());
    }

    let mut child = child.unwrap();
    let stdout = child.stdout.take();
    if stdout.is_none() {
        log::error!("Failed to take ffprobe output");
        return Err("Failed to take ffprobe output".into());
    }

    let stdout = stdout.unwrap();
    let reader = BufReader::new(stdout);
    let mut lines = reader.lines();
    let line = lines.next_line().await.unwrap();
    if line.is_none() {
        return Err("Failed to parse resolution from output".into());
    }
    let line = line.unwrap();
    let resolution = line.split("x").collect::<Vec<&str>>();
    if resolution.len() != 2 {
        return Err("Failed to parse resolution from output".into());
    }
    Ok(format!("{}x{}", resolution[0], resolution[1]))
}

fn ffmpeg_path() -> PathBuf {
    let mut path = Path::new("ffmpeg").to_path_buf();
    if cfg!(windows) {
        path.set_extension("exe");
    }

    path
}

fn ffprobe_path() -> PathBuf {
    let mut path = Path::new("ffprobe").to_path_buf();
    if cfg!(windows) {
        path.set_extension("exe");
    }

    path
}

// Parse an FFmpeg time string (e.g. "00:01:23.45") into seconds
fn parse_time_string(time_str: &str) -> Result<f64, String> {
    let parts: Vec<&str> = time_str.split(':').collect();
    if parts.len() != 3 {
        return Err("Invalid time format".to_string());
    }

    let hours: f64 = parts[0].parse().map_err(|_| "Invalid hours")?;
    let minutes: f64 = parts[1].parse().map_err(|_| "Invalid minutes")?;
    let seconds: f64 = parts[2].parse().map_err(|_| "Invalid seconds")?;

    Ok(hours * 3600.0 + minutes * 60.0 + seconds)
}

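A quick worked example: "00:01:23.45" parses as 0 * 3600 + 1 * 60 + 23.45 = 83.45 seconds. A test sketch against the function above:

#[cfg(test)]
mod parse_time_string_tests {
    use super::parse_time_string;

    #[test]
    fn parses_hh_mm_ss() {
        let secs = parse_time_string("00:01:23.45").unwrap();
        assert!((secs - 83.45).abs() < 1e-9);
        assert!(parse_time_string("1:23").is_err()); // not HH:MM:SS
    }
}
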
// Clip a segment from a local video file
pub async fn clip_from_video_file(
    reporter: Option<&impl ProgressReporterTrait>,
    input_path: &Path,
    output_path: &Path,
    start_time: f64,
    duration: f64,
) -> Result<(), String> {
    let output_folder = output_path.parent().unwrap();
    if !output_folder.exists() {
        std::fs::create_dir_all(output_folder).unwrap();
    }

    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
    #[cfg(target_os = "windows")]
    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);

    let child = ffmpeg_process
        .args(["-i", &format!("{}", input_path.display())])
        .args(["-ss", &start_time.to_string()])
        .args(["-t", &duration.to_string()])
        .args(["-c:v", "libx264"])
        .args(["-c:a", "aac"])
        .args(["-preset", "fast"])
        .args(["-crf", "23"])
        .args(["-avoid_negative_ts", "make_zero"])
        .args(["-y", output_path.to_str().unwrap()])
        .args(["-progress", "pipe:2"])
        .stderr(Stdio::piped())
        .spawn();

    if let Err(e) = child {
        return Err(format!("启动ffmpeg进程失败: {}", e));
    }

    let mut child = child.unwrap();
    let stderr = child.stderr.take().unwrap();
    let reader = BufReader::new(stderr);
    let mut parser = FfmpegLogParser::new(reader);

    let mut clip_error = None;
    while let Ok(event) = parser.parse_next_event().await {
        match event {
            FfmpegEvent::Progress(p) => {
                if let Some(reporter) = reporter {
                    // Parse the time string (e.g. "00:01:23.45")
                    if let Ok(current_time) = parse_time_string(&p.time) {
                        let progress = (current_time / duration * 100.0).min(100.0);
                        reporter.update(&format!("切片进度: {:.1}%", progress));
                    }
                }
            }
            FfmpegEvent::LogEOF => break,
            FfmpegEvent::Log(level, content) => {
                if content.contains("error") || level == LogLevel::Error {
                    log::error!("切片错误: {}", content);
                }
            }
            FfmpegEvent::Error(e) => {
                log::error!("切片错误: {}", e);
                clip_error = Some(e.to_string());
            }
            _ => {}
        }
    }

    if let Err(e) = child.wait().await {
        return Err(e.to_string());
    }

    if let Some(error) = clip_error {
        Err(error)
    } else {
        log::info!("切片任务完成: {}", output_path.display());
        Ok(())
    }
}

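A usage sketch for the file-based clipper; paths and times are hypothetical:

// Sketch only: cut 30 seconds starting at the 60-second mark.
async fn clip_example(reporter: Option<&impl ProgressReporterTrait>) -> Result<(), String> {
    clip_from_video_file(
        reporter,
        std::path::Path::new("/videos/full.mp4"), // hypothetical input
        std::path::Path::new("/videos/clip.mp4"), // hypothetical output
        60.0,
        30.0,
    )
    .await
}
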
// Extract video metadata
pub async fn extract_video_metadata(file_path: &Path) -> Result<VideoMetadata, String> {
    let mut ffprobe_process = tokio::process::Command::new(ffprobe_path());
    #[cfg(target_os = "windows")]
    ffprobe_process.creation_flags(CREATE_NO_WINDOW);

    let output = ffprobe_process
        .args([
            "-v", "quiet",
            "-print_format", "json",
            "-show_format",
            "-show_streams",
            "-select_streams", "v:0",
            &format!("{}", file_path.display()),
        ])
        .output()
        .await
        .map_err(|e| format!("执行ffprobe失败: {}", e))?;

    if !output.status.success() {
        return Err(format!(
            "ffprobe执行失败: {}",
            String::from_utf8_lossy(&output.stderr)
        ));
    }

    let json_str = String::from_utf8_lossy(&output.stdout);
    let json: serde_json::Value =
        serde_json::from_str(&json_str).map_err(|e| format!("解析ffprobe输出失败: {}", e))?;

    // Parse the video stream info
    let streams = json["streams"].as_array().ok_or("未找到视频流信息")?;

    if streams.is_empty() {
        return Err("未找到视频流".to_string());
    }

    let video_stream = &streams[0];
    let format = &json["format"];

    let duration = format["duration"]
        .as_str()
        .and_then(|d| d.parse::<f64>().ok())
        .unwrap_or(0.0);

    let width = video_stream["width"].as_u64().unwrap_or(0) as u32;
    let height = video_stream["height"].as_u64().unwrap_or(0) as u32;

    Ok(VideoMetadata {
        duration,
        width,
        height,
    })
}

// Generate a video thumbnail
pub async fn generate_thumbnail(
    video_path: &Path,
    output_path: &Path,
    timestamp: f64,
) -> Result<(), String> {
    let output_folder = output_path.parent().unwrap();
    if !output_folder.exists() {
        std::fs::create_dir_all(output_folder).unwrap();
    }

    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
    #[cfg(target_os = "windows")]
    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);

    let output = ffmpeg_process
        .args(["-i", &format!("{}", video_path.display())])
        .args(["-ss", &timestamp.to_string()])
        .args(["-vframes", "1"])
        .args(["-y", output_path.to_str().unwrap()])
        .output()
        .await
        .map_err(|e| format!("生成缩略图失败: {}", e))?;

    if !output.status.success() {
        return Err(format!(
            "ffmpeg生成缩略图失败: {}",
            String::from_utf8_lossy(&output.stderr)
        ));
    }

    // Log the generated thumbnail info
    if let Ok(metadata) = std::fs::metadata(output_path) {
        log::info!(
            "生成缩略图完成: {} (文件大小: {} bytes)",
            output_path.display(),
            metadata.len()
        );
    } else {
        log::info!("生成缩略图完成: {}", output_path.display());
    }
    Ok(())
}

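The thumbnail helper is a one-frame seek-and-grab; in the comment convention used elsewhere in this file, the equivalent invocation (with hypothetical paths) and a caller sketch are:

// ffmpeg -i input.mp4 -ss 5 -vframes 1 -y cover.jpg
async fn thumbnail_example() -> Result<(), String> {
    generate_thumbnail(
        std::path::Path::new("/videos/input.mp4"), // hypothetical input
        std::path::Path::new("/videos/cover.jpg"), // hypothetical output
        5.0,
    )
    .await
}
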
// tests
#[cfg(test)]
mod tests {
    use super::*;

    #[tokio::test]
    async fn test_get_video_size() {
        let file = Path::new("/Users/xinreasuper/Desktop/shadowreplay-test/output2/[1789714684][1753965688317][摄像头被前夫抛妻弃子直播挣点奶粉][2025-07-31_12-58-14].mp4");
        let resolution = get_video_resolution(file.to_str().unwrap()).await.unwrap();
        println!("Resolution: {}", resolution);
    }
}

src-tauri/src/ffmpeg/general.rs (new file, 122 lines)
@@ -0,0 +1,122 @@
use std::{
    path::{Path, PathBuf},
    process::Stdio,
};

use async_ffmpeg_sidecar::{event::FfmpegEvent, log_parser::FfmpegLogParser};
use tokio::io::{AsyncWriteExt, BufReader};

use crate::progress::progress_reporter::ProgressReporterTrait;

use super::ffmpeg_path;

#[cfg(target_os = "windows")]
const CREATE_NO_WINDOW: u32 = 0x08000000;
#[cfg(target_os = "windows")]
#[allow(unused_imports)]
use std::os::windows::process::CommandExt;

/// Generate a random filename in hex
pub async fn random_filename() -> String {
    format!("{:x}", rand::random::<u64>())
}

pub async fn handle_ffmpeg_process(
    reporter: Option<&impl ProgressReporterTrait>,
    ffmpeg_process: &mut tokio::process::Command,
) -> Result<(), String> {
    let child = ffmpeg_process.stderr(Stdio::piped()).spawn();
    if let Err(e) = child {
        return Err(e.to_string());
    }
    let mut child = child.unwrap();
    let stderr = child.stderr.take().unwrap();
    let reader = BufReader::new(stderr);
    let mut parser = FfmpegLogParser::new(reader);
    while let Ok(event) = parser.parse_next_event().await {
        match event {
            FfmpegEvent::Progress(p) => {
                if let Some(reporter) = reporter {
                    reporter.update(p.time.to_string().as_str());
                }
            }
            FfmpegEvent::LogEOF => break,
            FfmpegEvent::Error(e) => {
                return Err(e.to_string());
            }
            _ => {}
        }
    }
    if let Err(e) = child.wait().await {
        return Err(e.to_string());
    }

    if let Some(reporter) = reporter {
        reporter.update("合成完成");
    }

    Ok(())
}

pub async fn concat_videos(
    reporter: Option<&impl ProgressReporterTrait>,
    videos: &[PathBuf],
    output_path: &Path,
) -> Result<(), String> {
    // ffmpeg -i input1.mp4 -i input2.mp4 -i input3.mp4 -c copy output.mp4
    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
    #[cfg(target_os = "windows")]
    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);

    let output_folder = output_path.parent().unwrap();
    if !output_folder.exists() {
        std::fs::create_dir_all(output_folder).unwrap();
    }

    let filelist_filename = format!("filelist_{}.txt", random_filename().await);

    let mut filelist = tokio::fs::File::create(&output_folder.join(&filelist_filename))
        .await
        .unwrap();
    for video in videos {
        filelist
            .write_all(format!("file '{}'\n", video.display()).as_bytes())
            .await
            .unwrap();
    }
    filelist.flush().await.unwrap();

    // Convert &[PathBuf] to &[&Path] for check_videos
    let video_refs: Vec<&Path> = videos.iter().map(|p| p.as_path()).collect();
    let should_encode = !super::check_videos(&video_refs).await;

    ffmpeg_process.args([
        "-f",
        "concat",
        "-safe",
        "0",
        "-i",
        output_folder.join(&filelist_filename).to_str().unwrap(),
    ]);
    if should_encode {
        ffmpeg_process.args(["-vf", "scale=1920:1080"]);
        ffmpeg_process.args(["-r", "60"]);
        ffmpeg_process.args(["-c", "libx264"]);
        ffmpeg_process.args(["-c:a", "aac"]);
        ffmpeg_process.args(["-b:v", "6000k"]);
        ffmpeg_process.args(["-b:a", "128k"]);
        ffmpeg_process.args(["-threads", "0"]);
    } else {
        ffmpeg_process.args(["-c", "copy"]);
    }
    ffmpeg_process.args([output_path.to_str().unwrap()]);
    ffmpeg_process.args(["-progress", "pipe:2"]);
    ffmpeg_process.args(["-y"]);

    handle_ffmpeg_process(reporter, &mut ffmpeg_process).await?;

    // clean up filelist
    let _ = tokio::fs::remove_file(output_folder.join(&filelist_filename)).await;

    Ok(())
}

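For reference, the temporary filelist consumed by ffmpeg's concat demuxer is just one file directive per input, e.g. (names hypothetical):

file '/cache/part_0.mp4'
file '/cache/part_1.mp4'
file '/cache/part_2.mp4'

When check_videos reports that the inputs' parameters do not match, the function re-encodes everything to a common 1920x1080/60fps profile instead of stream-copying, which avoids glitches at the joins.
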
src-tauri/src/ffmpeg/mod.rs (new file, 1530 lines)
File diff suppressed because it is too large.

src-tauri/src/ffmpeg/playlist.rs (new file, 135 lines)
@@ -0,0 +1,135 @@
use std::path::Path;

use m3u8_rs::Map;
use tokio::io::AsyncWriteExt;

use crate::progress::progress_reporter::ProgressReporterTrait;

use super::Range;

pub async fn playlist_to_video(
    reporter: Option<&impl ProgressReporterTrait>,
    playlist_path: &Path,
    output_path: &Path,
    range: Option<Range>,
) -> Result<(), String> {
    let (_, playlist) = m3u8_rs::parse_media_playlist(
        &tokio::fs::read(playlist_path)
            .await
            .map_err(|e| e.to_string())?,
    )
    .unwrap();
    let mut start_offset = None;
    let mut segments = Vec::new();
    if let Some(range) = &range {
        let mut duration = 0.0;
        for s in playlist.segments.clone() {
            if range.is_in(duration) || range.is_in(duration + s.duration as f64) {
                segments.push(s.clone());
                if start_offset.is_none() {
                    start_offset = Some(range.start - duration);
                }
            }
            duration += s.duration as f64;
        }
    } else {
        segments = playlist.segments.clone();
    }

    if segments.is_empty() {
        return Err("No segments found".to_string());
    }

    let first_segment = playlist.segments.first().unwrap().clone();
    let mut header_url = first_segment
        .unknown_tags
        .iter()
        .find(|t| t.tag == "X-MAP")
        .map(|t| {
            let rest = t.rest.clone().unwrap();
            rest.split('=').nth(1).unwrap().replace("\\\"", "")
        });
    if header_url.is_none() {
        // map: Some(Map { uri: "h1758725308.m4s"
        if let Some(Map { uri, .. }) = &first_segment.map {
            header_url = Some(uri.clone());
        }
    }

    // write all segments to clip_file
    {
        let playlist_folder = playlist_path.parent().unwrap();
        let output_folder = output_path.parent().unwrap();
        if !output_folder.exists() {
            std::fs::create_dir_all(output_folder).unwrap();
        }
        let mut file = tokio::fs::File::create(&output_path).await.unwrap();
        if let Some(header_url) = header_url {
            let header_data = tokio::fs::read(playlist_folder.join(header_url))
                .await
                .unwrap();
            file.write_all(&header_data).await.unwrap();
        }
        for s in segments {
            // read segment
            let segment_file_path = playlist_folder.join(s.uri);
            let segment_data = tokio::fs::read(&segment_file_path).await.unwrap();
            // append segment data to clip_file
            file.write_all(&segment_data).await.unwrap();
        }
        file.flush().await.unwrap();
    }

    // transcode copy to fix timestamps
    {
        let tmp_output_path = output_path.with_extension("tmp.mp4");
        super::transcode(reporter, output_path, &tmp_output_path, true).await?;

        // remove original file
        let _ = tokio::fs::remove_file(output_path).await;
        // rename tmp_output_path to output_path
        let _ = tokio::fs::rename(tmp_output_path, output_path).await;
    }

    // trim for precise duration
    if let Some(start_offset) = start_offset {
        let tmp_output_path = output_path.with_extension("tmp.mp4");
        super::trim_video(
            reporter,
            output_path,
            &tmp_output_path,
            start_offset,
            range.as_ref().unwrap().duration(),
        )
        .await?;

        // remove original file
        let _ = tokio::fs::remove_file(output_path).await;
        // rename tmp_output_path to output_path
        let _ = tokio::fs::rename(tmp_output_path, output_path).await;
    }

    Ok(())
}

pub async fn playlists_to_video(
    reporter: Option<&impl ProgressReporterTrait>,
    playlists: &[&Path],
    output_path: &Path,
) -> Result<(), String> {
    let mut segments = Vec::new();
    for (i, playlist) in playlists.iter().enumerate() {
        let video_path = output_path.with_extension(format!("{}.mp4", i));
        playlist_to_video(reporter, playlist, &video_path, None).await?;
        segments.push(video_path);
    }

    super::general::concat_videos(reporter, &segments, output_path).await?;

    // clean up segments
    for segment in segments {
        let _ = tokio::fs::remove_file(segment).await;
    }

    Ok(())
}

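A merge sketch for stitching several recorded playlists into a single MP4; the paths are hypothetical:

// Sketch only: convert each playlist to MP4, then concat the intermediates.
async fn merge_example(reporter: Option<&impl ProgressReporterTrait>) -> Result<(), String> {
    let lists = [
        Path::new("/cache/live_a/playlist.m3u8"), // hypothetical
        Path::new("/cache/live_b/playlist.m3u8"), // hypothetical
    ];
    playlists_to_video(reporter, &lists, Path::new("/output/whole.mp4")).await
}
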
@@ -23,7 +23,7 @@ pub async fn add_account(
 ) -> Result<AccountRow, String> {
     // check if cookies is valid
     if let Err(e) = cookies.parse::<HeaderValue>() {
-        return Err(format!("Invalid cookies: {}", e));
+        return Err(format!("Invalid cookies: {e}"));
     }
     let account = state.db.add_account(&platform, cookies).await?;
     if platform == "bilibili" {
@@ -39,10 +39,7 @@ pub async fn add_account(
             .await?;
     } else if platform == "douyin" {
         // Get user info from Douyin API
-        let douyin_client = crate::recorder::douyin::client::DouyinClient::new(
-            &state.config.read().await.user_agent,
-            &account,
-        );
+        let douyin_client = crate::recorder::douyin::client::DouyinClient::new(&account);
         match douyin_client.get_user_info().await {
             Ok(user_info) => {
                 // For Douyin, use sec_uid as the primary identifier in id_str field
@@ -64,7 +61,7 @@ pub async fn add_account(
                 .await?;
             }
             Err(e) => {
-                log::warn!("Failed to get Douyin user info: {}", e);
+                log::warn!("Failed to get Douyin user info: {e}");
                 // Keep the account but with default values
             }
         }
@@ -76,7 +73,7 @@ pub async fn add_account(
 pub async fn remove_account(
     state: state_type!(),
     platform: String,
-    uid: u64,
+    uid: i64,
 ) -> Result<(), String> {
     if platform == "bilibili" {
         let account = state.db.get_account(&platform, uid).await?;

@@ -14,16 +14,25 @@ pub async fn get_config(state: state_type!()) -> Result<Config, ()> {
 #[allow(dead_code)]
 pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<(), String> {
     let old_cache_path = state.config.read().await.cache.clone();
+    log::info!("Try to set cache path: {old_cache_path} -> {cache_path}");
     if old_cache_path == cache_path {
         return Ok(());
     }

-    state.recorder_manager.set_migrating(true).await;
+    let old_cache_path_obj = std::path::Path::new(&old_cache_path);
+    let new_cache_path_obj = std::path::Path::new(&cache_path);
+    // check if new cache path is under old cache path
+    if new_cache_path_obj.starts_with(old_cache_path_obj) {
+        log::error!("New cache path is under old cache path: {old_cache_path} -> {cache_path}");
+        return Err("New cache path cannot be under old cache path".to_string());
+    }
+
+    state.recorder_manager.set_migrating(true);
     // stop and clear all recorders
     state.recorder_manager.stop_all().await;
     // first switch to new cache
     state.config.write().await.set_cache_path(&cache_path);
-    log::info!("Cache path changed: {}", cache_path);
+    log::info!("Cache path changed: {cache_path}");
     // Copy old cache to new cache
     log::info!("Start copy old cache to new cache");
     state
@@ -51,26 +60,28 @@ pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<
         // if entry is a folder
         if entry.is_dir() {
             if let Err(e) = crate::handlers::utils::copy_dir_all(entry, &new_entry) {
-                log::error!("Copy old cache to new cache error: {}", e);
+                log::error!("Copy old cache to new cache error: {e}");
                 return Err(e.to_string());
             }
         } else if let Err(e) = std::fs::copy(entry, &new_entry) {
-            log::error!("Copy old cache to new cache error: {}", e);
+            log::error!("Copy old cache to new cache error: {e}");
             return Err(e.to_string());
         }
     }

     log::info!("Copy old cache to new cache done");
     state.db.new_message("缓存目录切换", "缓存切换完成").await?;

-    state.recorder_manager.set_migrating(false).await;
+    state.recorder_manager.set_migrating(false);

     // remove all old cache entries
     for entry in old_cache_entries {
         if entry.is_dir() {
             if let Err(e) = std::fs::remove_dir_all(&entry) {
-                log::error!("Remove old cache error: {}", e);
+                log::error!("Remove old cache error: {e}");
             }
         } else if let Err(e) = std::fs::remove_file(&entry) {
-            log::error!("Remove old cache error: {}", e);
+            log::error!("Remove old cache error: {e}");
         }
     }

@@ -79,12 +90,22 @@ pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<

 #[cfg_attr(feature = "gui", tauri::command)]
 #[allow(dead_code)]
-pub async fn set_output_path(state: state_type!(), output_path: String) -> Result<(), ()> {
+pub async fn set_output_path(state: state_type!(), output_path: String) -> Result<(), String> {
     let mut config = state.config.write().await;
     let old_output_path = config.output.clone();
+    log::info!("Try to set output path: {old_output_path} -> {output_path}");
     if old_output_path == output_path {
         return Ok(());
     }

+    let old_output_path_obj = std::path::Path::new(&old_output_path);
+    let new_output_path_obj = std::path::Path::new(&output_path);
+    // check if new output path is under old output path
+    if new_output_path_obj.starts_with(old_output_path_obj) {
+        log::error!("New output path is under old output path: {old_output_path} -> {output_path}");
+        return Err("New output path cannot be under old output path".to_string());
+    }
+
     // list all file and folder in old output
     let mut old_output_entries = vec![];
     if let Ok(entries) = std::fs::read_dir(&old_output_path) {
@@ -103,10 +124,12 @@ pub async fn set_output_path(state: state_type!(), output_path: String) -> Resul
         // if entry is a folder
         if entry.is_dir() {
             if let Err(e) = crate::handlers::utils::copy_dir_all(entry, &new_entry) {
-                log::error!("Copy old cache to new cache error: {}", e);
+                log::error!("Copy old output to new output error: {e}");
                 return Err(e.to_string());
             }
         } else if let Err(e) = std::fs::copy(entry, &new_entry) {
-            log::error!("Copy old cache to new cache error: {}", e);
+            log::error!("Copy old output to new output error: {e}");
             return Err(e.to_string());
         }
     }
@@ -114,10 +137,10 @@ pub async fn set_output_path(state: state_type!(), output_path: String) -> Resul
     for entry in old_output_entries {
         if entry.is_dir() {
             if let Err(e) = std::fs::remove_dir_all(&entry) {
-                log::error!("Remove old cache error: {}", e);
+                log::error!("Remove old output error: {e}");
             }
         } else if let Err(e) = std::fs::remove_file(&entry) {
-            log::error!("Remove old cache error: {}", e);
+            log::error!("Remove old output error: {e}");
         }
     }

@@ -177,10 +200,7 @@ pub async fn update_subtitle_generator_type(
     state: state_type!(),
     subtitle_generator_type: String,
 ) -> Result<(), ()> {
-    log::info!(
-        "Updating subtitle generator type to {}",
-        subtitle_generator_type
-    );
+    log::info!("Updating subtitle generator type to {subtitle_generator_type}");
     let mut config = state.config.write().await;
     config.subtitle_generator_type = subtitle_generator_type;
     config.save();
@@ -201,7 +221,7 @@ pub async fn update_openai_api_endpoint(
     state: state_type!(),
     openai_api_endpoint: String,
 ) -> Result<(), ()> {
-    log::info!("Updating openai api endpoint to {}", openai_api_endpoint);
+    log::info!("Updating openai api endpoint to {openai_api_endpoint}");
     let mut config = state.config.write().await;
     config.openai_api_endpoint = openai_api_endpoint;
     config.save();
@@ -229,7 +249,7 @@ pub async fn update_status_check_interval(
     if interval < 10 {
         interval = 10; // Minimum interval of 10 seconds
     }
-    log::info!("Updating status check interval to {} seconds", interval);
+    log::info!("Updating status check interval to {interval} seconds");
     state.config.write().await.status_check_interval = interval;
     state.config.write().await.save();
     Ok(())
@@ -240,15 +260,23 @@ pub async fn update_whisper_language(
     state: state_type!(),
     whisper_language: String,
 ) -> Result<(), ()> {
-    log::info!("Updating whisper language to {}", whisper_language);
+    log::info!("Updating whisper language to {whisper_language}");
     state.config.write().await.whisper_language = whisper_language;
     state.config.write().await.save();
     Ok(())
 }

 #[cfg_attr(feature = "gui", tauri::command)]
-pub async fn update_user_agent(state: state_type!(), user_agent: String) -> Result<(), ()> {
-    log::info!("Updating user agent to {}", user_agent);
-    state.config.write().await.set_user_agent(&user_agent);
+pub async fn update_webhook_url(state: state_type!(), webhook_url: String) -> Result<(), ()> {
+    log::info!("Updating webhook url to {webhook_url}");
+    let _ = state
+        .webhook_poster
+        .update_config(crate::webhook::poster::WebhookConfig {
+            url: webhook_url.clone(),
+            ..Default::default()
+        })
+        .await;
+    state.config.write().await.webhook_url = webhook_url;
     state.config.write().await.save();
     Ok(())
 }

@@ -13,10 +13,3 @@ use crate::database::account::AccountRow;
 pub struct AccountInfo {
     pub accounts: Vec<AccountRow>,
 }
-
-#[derive(serde::Serialize)]
-pub struct DiskInfo {
-    pub disk: String,
-    pub total: u64,
-    pub free: u64,
-}

@@ -1,12 +1,17 @@
 use crate::danmu2ass;
 use crate::database::record::RecordRow;
 use crate::database::recorder::RecorderRow;
+use crate::database::task::TaskRow;
+use crate::progress::progress_reporter::EventEmitter;
+use crate::progress::progress_reporter::ProgressReporter;
+use crate::progress::progress_reporter::ProgressReporterTrait;
 use crate::recorder::danmu::DanmuEntry;
 use crate::recorder::PlatformType;
 use crate::recorder::RecorderInfo;
 use crate::recorder_manager::RecorderList;
 use crate::state::State;
 use crate::state_type;
+use crate::webhook::events;

 #[cfg(feature = "gui")]
 use tauri::State as TauriState;

@@ -23,10 +28,10 @@ pub async fn get_recorder_list(state: state_type!()) -> Result<RecorderList, ()>
 pub async fn add_recorder(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
     extra: String,
 ) -> Result<RecorderRow, String> {
-    log::info!("Add recorder: {} {}", platform, room_id);
+    log::info!("Add recorder: {platform} {room_id}");
     let platform = PlatformType::from_str(&platform).unwrap();
     let account = match platform {
         PlatformType::BiliBili => {
@@ -58,18 +63,26 @@ pub async fn add_recorder(
             let room = state.db.add_recorder(platform, room_id, &extra).await?;
             state
                 .db
-                .new_message("添加直播间", &format!("添加了新直播间 {}", room_id))
+                .new_message("添加直播间", &format!("添加了新直播间 {room_id}"))
                 .await?;
+            // post webhook event
+            let event = events::new_webhook_event(
+                events::RECORDER_ADDED,
+                events::Payload::Recorder(room.clone()),
+            );
+            if let Err(e) = state.webhook_poster.post_event(&event).await {
+                log::error!("Post webhook event error: {e}");
+            }
             Ok(room)
         }
         Err(e) => {
-            log::error!("Failed to add recorder: {}", e);
-            Err(format!("添加失败: {}", e))
+            log::error!("Failed to add recorder: {e}");
+            Err(format!("添加失败: {e}"))
         }
     },
     Err(e) => {
-        log::error!("Failed to add recorder: {}", e);
-        Err(format!("添加失败: {}", e))
+        log::error!("Failed to add recorder: {e}");
+        Err(format!("添加失败: {e}"))
     }
 }
 }

@@ -78,25 +91,33 @@ pub async fn add_recorder(
 pub async fn remove_recorder(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
 ) -> Result<(), String> {
-    log::info!("Remove recorder: {} {}", platform, room_id);
+    log::info!("Remove recorder: {platform} {room_id}");
     let platform = PlatformType::from_str(&platform).unwrap();
     match state
         .recorder_manager
         .remove_recorder(platform, room_id)
         .await
     {
-        Ok(()) => {
+        Ok(recorder) => {
             state
                 .db
-                .new_message("移除直播间", &format!("移除了直播间 {}", room_id))
+                .new_message("移除直播间", &format!("移除了直播间 {room_id}"))
                 .await?;
+            // post webhook event
+            let event = events::new_webhook_event(
+                events::RECORDER_REMOVED,
+                events::Payload::Recorder(recorder),
+            );
+            if let Err(e) = state.webhook_poster.post_event(&event).await {
+                log::error!("Post webhook event error: {e}");
+            }
             log::info!("Removed recorder: {} {}", platform.as_str(), room_id);
             Ok(())
         }
         Err(e) => {
-            log::error!("Failed to remove recorder: {}", e);
+            log::error!("Failed to remove recorder: {e}");
             Err(e.to_string())
         }
     }

@@ -106,7 +127,7 @@ pub async fn remove_recorder(
 pub async fn get_room_info(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
 ) -> Result<RecorderInfo, String> {
     let platform = PlatformType::from_str(&platform).unwrap();
     if let Some(info) = state
@@ -121,14 +142,27 @@ pub async fn get_room_info(
 }

 #[cfg_attr(feature = "gui", tauri::command)]
-pub async fn get_archives(state: state_type!(), room_id: u64) -> Result<Vec<RecordRow>, String> {
-    Ok(state.recorder_manager.get_archives(room_id).await?)
+pub async fn get_archive_disk_usage(state: state_type!()) -> Result<i64, String> {
+    Ok(state.recorder_manager.get_archive_disk_usage().await?)
+}
+
+#[cfg_attr(feature = "gui", tauri::command)]
+pub async fn get_archives(
+    state: state_type!(),
+    room_id: i64,
+    offset: i64,
+    limit: i64,
+) -> Result<Vec<RecordRow>, String> {
+    Ok(state
+        .recorder_manager
+        .get_archives(room_id, offset, limit)
+        .await?)
 }

 #[cfg_attr(feature = "gui", tauri::command)]
 pub async fn get_archive(
     state: state_type!(),
-    room_id: u64,
+    room_id: i64,
     live_id: String,
 ) -> Result<RecordRow, String> {
     Ok(state
@@ -137,11 +171,23 @@ pub async fn get_archive(
         .await?)
 }

+#[cfg_attr(feature = "gui", tauri::command)]
+pub async fn get_archives_by_parent_id(
+    state: state_type!(),
+    room_id: i64,
+    parent_id: String,
+) -> Result<Vec<RecordRow>, String> {
+    Ok(state
+        .db
+        .get_archives_by_parent_id(room_id, &parent_id)
+        .await?)
+}
+
 #[cfg_attr(feature = "gui", tauri::command)]
 pub async fn get_archive_subtitle(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
     live_id: String,
 ) -> Result<String, String> {
     let platform = PlatformType::from_str(&platform);
@@ -158,7 +204,7 @@ pub async fn get_archive_subtitle(
 pub async fn generate_archive_subtitle(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
     live_id: String,
 ) -> Result<String, String> {
     let platform = PlatformType::from_str(&platform);
@@ -175,14 +221,14 @@ pub async fn generate_archive_subtitle(
 pub async fn delete_archive(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
     live_id: String,
 ) -> Result<(), String> {
     let platform = PlatformType::from_str(&platform);
     if platform.is_none() {
         return Err("Unsupported platform".to_string());
     }
-    state
+    let to_delete = state
         .recorder_manager
         .delete_archive(platform.unwrap(), room_id, &live_id)
         .await?;
@@ -190,9 +236,55 @@ pub async fn delete_archive(
         .db
         .new_message(
             "删除历史缓存",
-            &format!("删除了房间 {} 的历史缓存 {}", room_id, live_id),
+            &format!("删除了房间 {room_id} 的历史缓存 {live_id}"),
         )
         .await?;
+    // post webhook event
+    let event =
+        events::new_webhook_event(events::ARCHIVE_DELETED, events::Payload::Archive(to_delete));
+    if let Err(e) = state.webhook_poster.post_event(&event).await {
+        log::error!("Post webhook event error: {e}");
+    }
     Ok(())
 }

+#[cfg_attr(feature = "gui", tauri::command)]
+pub async fn delete_archives(
+    state: state_type!(),
+    platform: String,
+    room_id: i64,
+    live_ids: Vec<String>,
+) -> Result<(), String> {
+    let platform = PlatformType::from_str(&platform);
+    if platform.is_none() {
+        return Err("Unsupported platform".to_string());
+    }
+    let to_deletes = state
+        .recorder_manager
+        .delete_archives(
+            platform.unwrap(),
+            room_id,
+            &live_ids
+                .iter()
+                .map(std::string::String::as_str)
+                .collect::<Vec<&str>>(),
+        )
+        .await?;
+    state
+        .db
+        .new_message(
+            "删除历史缓存",
+            &format!("删除了房间 {} 的历史缓存 {}", room_id, live_ids.join(", ")),
+        )
+        .await?;
+    for to_delete in to_deletes {
+        // post webhook event
+        let event =
+            events::new_webhook_event(events::ARCHIVE_DELETED, events::Payload::Archive(to_delete));
+        if let Err(e) = state.webhook_poster.post_event(&event).await {
+            log::error!("Post webhook event error: {e}");
+        }
+    }
+    Ok(())
+}

@@ -200,7 +292,7 @@ pub async fn delete_archive(
 pub async fn get_danmu_record(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
     live_id: String,
 ) -> Result<Vec<DanmuEntry>, String> {
     let platform = PlatformType::from_str(&platform);
@@ -217,7 +309,7 @@ pub async fn get_danmu_record(
 #[serde(rename_all = "camelCase")]
 pub struct ExportDanmuOptions {
     platform: String,
-    room_id: u64,
+    room_id: i64,
     live_id: String,
     x: i64,
     y: i64,
@@ -262,8 +354,8 @@ pub async fn export_danmu(
 #[cfg_attr(feature = "gui", tauri::command)]
 pub async fn send_danmaku(
     state: state_type!(),
-    uid: u64,
-    room_id: u64,
+    uid: i64,
+    room_id: i64,
     message: String,
 ) -> Result<(), String> {
     let account = state.db.get_account("bilibili", uid).await?;
@@ -278,7 +370,7 @@ pub async fn send_danmaku(
 pub async fn get_total_length(state: state_type!()) -> Result<i64, String> {
     match state.db.get_total_length().await {
         Ok(total_length) => Ok(total_length),
-        Err(e) => Err(format!("Failed to get total length: {}", e)),
+        Err(e) => Err(format!("Failed to get total length: {e}")),
     }
 }

@@ -286,20 +378,20 @@ pub async fn get_total_length(state: state_type!()) -> Result<i64, String> {
 pub async fn get_today_record_count(state: state_type!()) -> Result<i64, String> {
     match state.db.get_today_record_count().await {
         Ok(count) => Ok(count),
-        Err(e) => Err(format!("Failed to get today record count: {}", e)),
+        Err(e) => Err(format!("Failed to get today record count: {e}")),
     }
 }

 #[cfg_attr(feature = "gui", tauri::command)]
 pub async fn get_recent_record(
     state: state_type!(),
-    room_id: u64,
-    offset: u64,
-    limit: u64,
+    room_id: i64,
+    offset: i64,
+    limit: i64,
 ) -> Result<Vec<RecordRow>, String> {
     match state.db.get_recent_record(room_id, offset, limit).await {
         Ok(records) => Ok(records),
-        Err(e) => Err(format!("Failed to get recent record: {}", e)),
+        Err(e) => Err(format!("Failed to get recent record: {e}")),
     }
 }

@@ -307,7 +399,7 @@ pub async fn get_recent_record(
 pub async fn set_enable(
     state: state_type!(),
     platform: String,
-    room_id: u64,
+    room_id: i64,
     enabled: bool,
 ) -> Result<(), String> {
     log::info!("Set enable for recorder {platform} {room_id} {enabled}");

@@ -330,9 +422,71 @@ pub async fn fetch_hls(state: state_type!(), uri: String) -> Result<Vec<u8>, Str
     } else {
         uri
     };
-    state
+    Ok(state
         .recorder_manager
         .handle_hls_request(&uri)
         .await
         .map_err(|e| e.to_string())
+        .unwrap())
 }

+#[cfg_attr(feature = "gui", tauri::command)]
+pub async fn generate_whole_clip(
+    state: state_type!(),
+    platform: String,
+    room_id: i64,
+    parent_id: String,
+) -> Result<TaskRow, String> {
+    log::info!("Generate whole clip for {platform} {room_id} {parent_id}");
+
+    let task = state
+        .db
+        .generate_task(
+            "generate_whole_clip",
+            "",
+            &serde_json::json!({
+                "platform": platform,
+                "room_id": room_id,
+                "parent_id": parent_id,
+            })
+            .to_string(),
+        )
+        .await?;
+
+    #[cfg(feature = "gui")]
+    let emitter = EventEmitter::new(state.app_handle.clone());
+    #[cfg(feature = "headless")]
+    let emitter = EventEmitter::new(state.progress_manager.get_event_sender());
+    let reporter = ProgressReporter::new(&emitter, &task.id).await?;
+
+    log::info!("Create task: {} {}", task.id, task.task_type);
+    // create a tokio task to run in background
+    #[cfg(feature = "gui")]
+    let state_clone = (*state).clone();
+    #[cfg(feature = "headless")]
+    let state_clone = state.clone();
+
+    let task_id = task.id.clone();
+    tokio::spawn(async move {
+        match state_clone
+            .recorder_manager
+            .generate_whole_clip(Some(&reporter), platform, room_id, parent_id)
+            .await
+        {
+            Ok(()) => {
+                reporter.finish(true, "切片生成完成").await;
+                let _ = state_clone
+                    .db
+                    .update_task(&task_id, "success", "切片生成完成", None)
+                    .await;
+            }
+            Err(e) => {
+                reporter.finish(false, &format!("切片生成失败: {e}")).await;
+                let _ = state_clone
+                    .db
+                    .update_task(&task_id, "failed", &format!("切片生成失败: {e}"), None)
+                    .await;
+            }
+        }
+    });
+    Ok(task)
+}

@@ -57,9 +57,13 @@ pub fn show_in_folder(path: String) {
             path2.into_os_string().into_string().unwrap()
         }
     };
-    Command::new("xdg-open").arg(&new_path).spawn().unwrap();
+    let _ = Command::new("xdg-open")
+        .arg(&new_path)
+        .spawn()
+        .unwrap()
+        .wait();
 } else {
-    Command::new("dbus-send")
+    let _ = Command::new("dbus-send")
         .args([
             "--session",
             "--dest=org.freedesktop.FileManager1",
@@ -70,7 +74,8 @@ pub fn show_in_folder(path: String) {
             "string:\"\"",
         ])
         .spawn()
-        .unwrap();
+        .unwrap()
+        .wait();
     }
 }

@@ -109,10 +114,10 @@ pub async fn get_disk_info(state: state_type!()) -> Result<DiskInfo, ()> {
 #[cfg_attr(feature = "gui", tauri::command)]
 pub async fn console_log(_state: state_type!(), level: &str, message: &str) -> Result<(), ()> {
     match level {
-        "error" => log::error!("[frontend] {}", message),
-        "warn" => log::warn!("[frontend] {}", message),
-        "info" => log::info!("[frontend] {}", message),
-        _ => log::debug!("[frontend] {}", message),
+        "error" => log::error!("[frontend] {message}"),
+        "warn" => log::warn!("[frontend] {message}"),
+        "info" => log::info!("[frontend] {message}"),
+        _ => log::debug!("[frontend] {message}"),
     }
     Ok(())
 }
@@ -139,7 +144,7 @@ pub async fn get_disk_info_inner(target: PathBuf) -> Result<DiskInfo, ()> {
|
||||
let total = parts[1].parse::<u64>().unwrap() * 1024;
|
||||
let free = parts[3].parse::<u64>().unwrap() * 1024;
|
||||
|
||||
return Ok(DiskInfo { disk, total, free });
|
||||
Ok(DiskInfo { disk, total, free })
|
||||
}
|
||||
|
||||
#[cfg(any(target_os = "windows", target_os = "macos"))]
|
||||
@@ -148,7 +153,7 @@ pub async fn get_disk_info_inner(target: PathBuf) -> Result<DiskInfo, ()> {
    let disks = sysinfo::Disks::new_with_refreshed_list();
    // get target disk info
    let mut disk_info = DiskInfo {
-       disk: "".into(),
+       disk: String::new(),
        total: 0,
        free: 0,
    };
@@ -157,11 +162,11 @@ pub async fn get_disk_info_inner(target: PathBuf) -> Result<DiskInfo, ()> {
    let mut longest_match = 0;
    for disk in disks.list() {
        let mount_point = disk.mount_point().to_str().unwrap();
-       if target.starts_with(mount_point) && mount_point.split("/").count() > longest_match {
+       if target.starts_with(mount_point) && mount_point.split('/').count() > longest_match {
            disk_info.disk = mount_point.into();
            disk_info.total = disk.total_space();
            disk_info.free = disk.available_space();
-           longest_match = mount_point.split("/").count();
+           longest_match = mount_point.split('/').count();
        }
    }

@@ -187,10 +192,10 @@ pub async fn export_to_file(
    }
    let mut file = file.unwrap();
    if let Err(e) = file.write_all(content.as_bytes()).await {
-       return Err(format!("Write file failed: {}", e));
+       return Err(format!("Write file failed: {e}"));
    }
    if let Err(e) = file.flush().await {
-       return Err(format!("Flush file failed: {}", e));
+       return Err(format!("Flush file failed: {e}"));
    }
    Ok(())
}
@@ -211,10 +216,10 @@ pub async fn open_log_folder(state: state_type!()) -> Result<(), String> {
pub async fn open_live(
    state: state_type!(),
    platform: String,
-   room_id: u64,
+   room_id: i64,
    live_id: String,
) -> Result<(), String> {
-   log::info!("Open player window: {} {}", room_id, live_id);
+   log::info!("Open player window: {room_id} {live_id}");
    #[cfg(feature = "gui")]
    {
        let platform = PlatformType::from_str(&platform).unwrap();
@@ -225,7 +230,7 @@ pub async fn open_live(
        .unwrap();
        let builder = tauri::WebviewWindowBuilder::new(
            &state.app_handle,
-           format!("Live:{}:{}", room_id, live_id),
+           format!("Live:{room_id}:{live_id}"),
            tauri::WebviewUrl::App(
                format!(
                    "index_live.html?platform={}&room_id={}&live_id={}",
@@ -253,7 +258,7 @@ pub async fn open_live(
        });

        if let Err(e) = builder.decorations(true).build() {
-           log::error!("live window build failed: {}", e);
+           log::error!("live window build failed: {e}");
        }
    }

@@ -263,13 +268,13 @@ pub async fn open_live(
#[cfg(feature = "gui")]
#[tauri::command]
pub async fn open_clip(state: state_type!(), video_id: i64) -> Result<(), String> {
-   log::info!("Open clip window: {}", video_id);
+   log::info!("Open clip window: {video_id}");
    let builder = tauri::WebviewWindowBuilder::new(
        &state.app_handle,
-       format!("Clip:{}", video_id),
-       tauri::WebviewUrl::App(format!("index_clip.html?id={}", video_id).into()),
+       format!("Clip:{video_id}"),
+       tauri::WebviewUrl::App(format!("index_clip.html?id={video_id}").into()),
    )
-   .title(format!("Clip window:{}", video_id))
+   .title(format!("Clip window:{video_id}"))
    .theme(Some(Theme::Light))
    .inner_size(1200.0, 800.0)
    .effects(WindowEffectsConfig {
@@ -283,7 +288,7 @@ pub async fn open_clip(state: state_type!(), video_id: i64) -> Result<(), String
    });

    if let Err(e) = builder.decorations(true).build() {
-       log::error!("clip window build failed: {}", e);
+       log::error!("clip window build failed: {e}");
    }

    Ok(())
@@ -302,3 +307,128 @@ pub async fn list_folder(_state: state_type!(), path: String) -> Result<Vec<Stri
}
    Ok(files)
}
+
+/// Advanced filename sanitizer that handles all kinds of dangerous and control characters
+///
+/// Intended for scenarios that need strict filename cleanup; Chinese characters are supported
+///
+/// # Arguments
+/// - `name`: the filename to sanitize
+/// - `max_length`: maximum length limit (100 characters by default)
+///
+/// # Returns
+/// A thoroughly sanitized, safe filename
+#[cfg(feature = "headless")]
+pub fn sanitize_filename_advanced(name: &str, max_length: Option<usize>) -> String {
+    let max_len = max_length.unwrap_or(100);
+
+    // Clean every character first
+    let cleaned: String = name
+        .chars()
+        .map(|c| match c {
+            // Characters dangerous to file systems
+            '\\' | '/' | ':' | '*' | '?' | '"' | '<' | '>' | '|' => '_',
+            // Control and invisible characters
+            c if c.is_control() => '_',
+            // Keep safe characters (whitelist)
+            c if c.is_alphanumeric()
+                || c == ' '
+                || c == '.'
+                || c == '-'
+                || c == '_'
+                || c == '('
+                || c == ')'
+                || c == '['
+                || c == ']'
+                || c == '《'
+                || c == '》'
+                || c == '（'
+                || c == '）' =>
+            {
+                c
+            }
+            // Replace everything else with an underscore
+            _ => '_',
+        })
+        .collect();
+
+    // If the cleaned name is already within the limit, return it as-is
+    if cleaned.chars().count() <= max_len {
+        return cleaned;
+    }
+
+    // Smart truncation: protect the file extension
+    if let Some(dot_pos) = cleaned.rfind('.') {
+        let extension = &cleaned[dot_pos..];
+        let main_part = &cleaned[..dot_pos];
+
+        // Make sure the extension is not overly long (at most 10 characters, dot included)
+        if extension.chars().count() <= 10 {
+            let ext_len = extension.chars().count();
+            let available_for_main = max_len.saturating_sub(ext_len);
+
+            if available_for_main > 0 {
+                let truncated_main: String = main_part.chars().take(available_for_main).collect();
+                return format!("{}{}", truncated_main, extension);
+            }
+        }
+    }
+
+    // No extension, or the extension is too long: truncate directly
+    cleaned.chars().take(max_len).collect()
+}
+
+#[cfg(test)]
+mod tests {
+    #[test]
+    #[cfg(feature = "headless")]
+    fn test_sanitize_filename_advanced() {
+        use super::sanitize_filename_advanced;
+
+        assert_eq!(
+            sanitize_filename_advanced("test<>file.txt", None),
+            "test__file.txt"
+        );
+        assert_eq!(sanitize_filename_advanced("文件名.txt", None), "文件名.txt");
+        assert_eq!(
+            sanitize_filename_advanced("《视频》(高清).mp4", None),
+            "《视频》(高清).mp4"
+        );
+        assert_eq!(
+            sanitize_filename_advanced("file\x00with\x01control.txt", None),
+            "file_with_control.txt"
+        );
+
+        // Whitespace handling (the function does not strip whitespace on its own)
+        assert_eq!(
+            sanitize_filename_advanced(" .hidden_file.txt ", None),
+            " .hidden_file.txt "
+        );
+        assert_eq!(
+            sanitize_filename_advanced(" normal_file.mp4 ", None),
+            " normal_file.mp4 "
+        );
+
+        // Special characters are replaced
+        assert_eq!(
+            sanitize_filename_advanced("file@#$%^&.txt", None),
+            "file______.txt"
+        );
+
+        // Length limit - without an extension
+        let long_name = "测试".repeat(60);
+        let result = sanitize_filename_advanced(&long_name, Some(10));
+        assert_eq!(result.chars().count(), 10);
+
+        // Length limit - with an extension
+        let long_name_with_ext = format!("{}.txt", "测试".repeat(60));
+        let result = sanitize_filename_advanced(&long_name_with_ext, Some(10));
+        assert!(result.ends_with(".txt"));
+        assert_eq!(result.chars().count(), 10); // 6 characters of "测试" + ".txt" (4 characters)
+
+        // Short filenames are not truncated
+        let short_name = "test.mp4";
+        let result = sanitize_filename_advanced(short_name, Some(50));
+        assert_eq!(result, "test.mp4");
+    }
+}
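A quick usage sketch of the sanitizer added above. The `use` path and the inputs are illustrative assumptions; only the function itself comes from this diff:

```rust
// Illustrative usage of sanitize_filename_advanced; assumes it is in scope,
// e.g. `use crate::handlers::utils::sanitize_filename_advanced;` (headless builds).
fn main() {
    // Path separators and reserved characters collapse to underscores.
    assert_eq!(sanitize_filename_advanced("a/b:c*.mp4", None), "a_b_c_.mp4");
    // Truncation keeps the extension intact.
    let long = format!("{}.mp4", "x".repeat(200));
    let safe = sanitize_filename_advanced(&long, Some(64));
    assert!(safe.ends_with(".mp4"));
    assert_eq!(safe.chars().count(), 64);
}
```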
(File diff suppressed because it is too large)
(File diff suppressed because it is too large)

src-tauri/src/http_server/mod.rs (new file, 2 lines)
@@ -0,0 +1,2 @@
+pub mod api_server;
+pub mod websocket;
src-tauri/src/http_server/websocket.rs (new file, 92 lines)
@@ -0,0 +1,92 @@
+use serde_json::{json, Value};
+use socketioxide::{
+    extract::{Data, SocketRef},
+    layer::SocketIoLayer,
+    SocketIo,
+};
+use tokio::sync::broadcast;
+
+use crate::progress::progress_manager::Event;
+use crate::state::State;
+
+pub async fn create_websocket_server(state: State) -> SocketIoLayer {
+    let (layer, io) = SocketIo::new_layer();
+
+    // Clone the state for the namespace handler
+    let state_clone = state.clone();
+
+    io.ns("/ws", move |socket: SocketRef| {
+        let state = state_clone.clone();
+
+        // Subscribe to progress events
+        let mut rx = state.progress_manager.subscribe();
+
+        // Spawn a task to handle progress events for this socket
+        let socket_clone = socket.clone();
+        tokio::spawn(async move {
+            loop {
+                match rx.recv().await {
+                    Ok(event) => {
+                        let (event_type, message) = match event {
+                            Event::ProgressUpdate { id, content } => (
+                                "progress-update",
+                                json!({
+                                    "id": id,
+                                    "content": content
+                                }),
+                            ),
+                            Event::ProgressFinished {
+                                id,
+                                success,
+                                message,
+                            } => (
+                                "progress-finished",
+                                json!({
+                                    "id": id,
+                                    "success": success,
+                                    "message": message
+                                }),
+                            ),
+                            Event::DanmuReceived { room, ts, content } => (
+                                "danmu",
+                                json!({
+                                    "room": room,
+                                    "ts": ts,
+                                    "content": content
+                                }),
+                            ),
+                        };
+
+                        if let Err(e) = socket_clone.emit(event_type, &message) {
+                            log::warn!("Failed to emit progress event to WebSocket client: {}", e);
+                            break;
+                        }
+                    }
+                    Err(broadcast::error::RecvError::Closed) => {
+                        log::info!("Progress channel closed, stopping WebSocket progress stream");
+                        break;
+                    }
+                    Err(broadcast::error::RecvError::Lagged(skipped)) => {
+                        log::warn!("WebSocket client lagged, skipped {} events", skipped);
+                    }
+                }
+            }
+        });
+
+        // Handle client messages
+        socket.on("message", |socket: SocketRef, Data::<Value>(data)| {
+            log::debug!("Received WebSocket message: {:?}", data);
+            // Echo back the message for testing
+            socket.emit("echo", &data).ok();
+        });
+
+        // Handle client disconnect
+        socket.on_disconnect(|socket: SocketRef| {
+            log::info!("WebSocket client disconnected: {}", socket.id);
+        });
+
+        log::info!("WebSocket client connected: {}", socket.id);
+    });
+
+    layer
+}
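For reference, a minimal consumer of this Socket.IO endpoint might look like the following sketch. It assumes the `rust_socketio` crate and a server listening at `http://127.0.0.1:3000`; neither is part of this diff:

```rust
// Hypothetical client sketch using the rust_socketio crate (not part of this repo).
use rust_socketio::{ClientBuilder, Payload, RawClient};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let socket = ClientBuilder::new("http://127.0.0.1:3000") // assumed server address
        .namespace("/ws")
        .on("danmu", |payload: Payload, _socket: RawClient| {
            // Each danmu event carries { room, ts, content } as emitted above.
            println!("danmu event: {payload:?}");
        })
        .on("progress-update", |payload: Payload, _socket: RawClient| {
            println!("progress: {payload:?}");
        })
        .connect()?;

    std::thread::sleep(std::time::Duration::from_secs(60));
    socket.disconnect()?;
    Ok(())
}
```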
@@ -1,30 +1,33 @@
// Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]

-mod archive_migration;
mod config;
mod constants;
mod danmu2ass;
mod database;
mod ffmpeg;
mod handlers;
#[cfg(feature = "headless")]
mod http_server;
#[cfg(feature = "headless")]
mod migration;
-mod progress_manager;
-mod progress_reporter;
+mod progress;
mod recorder;
mod recorder_manager;
mod state;
mod subtitle_generator;
#[cfg(feature = "gui")]
mod tray;
mod webhook;

-use archive_migration::try_rebuild_archives;
use async_std::fs;
use chrono::Utc;
use config::Config;
use database::Database;
+use migration::migration_methods::try_add_parent_id_to_records;
+use migration::migration_methods::try_convert_clip_covers;
+use migration::migration_methods::try_convert_entry_to_m3u8;
+use migration::migration_methods::try_convert_live_covers;
+use migration::migration_methods::try_rebuild_archives;
use recorder::bilibili::client::BiliClient;
use recorder::PlatformType;
use recorder_manager::RecorderManager;
@@ -128,48 +131,73 @@ fn get_migrations() -> Vec<Migration> {
    Migration {
        version: 1,
        description: "create_initial_tables",
-       sql: r#"
+       sql: r"
        CREATE TABLE accounts (uid INTEGER, platform TEXT NOT NULL DEFAULT 'bilibili', name TEXT, avatar TEXT, csrf TEXT, cookies TEXT, created_at TEXT, PRIMARY KEY(uid, platform));
        CREATE TABLE recorders (room_id INTEGER PRIMARY KEY, platform TEXT NOT NULL DEFAULT 'bilibili', created_at TEXT);
        CREATE TABLE records (live_id TEXT PRIMARY KEY, platform TEXT NOT NULL DEFAULT 'bilibili', room_id INTEGER, title TEXT, length INTEGER, size INTEGER, cover BLOB, created_at TEXT);
        CREATE TABLE danmu_statistics (live_id TEXT PRIMARY KEY, room_id INTEGER, value INTEGER, time_point TEXT);
        CREATE TABLE messages (id INTEGER PRIMARY KEY AUTOINCREMENT, title TEXT, content TEXT, read INTEGER, created_at TEXT);
        CREATE TABLE videos (id INTEGER PRIMARY KEY AUTOINCREMENT, room_id INTEGER, cover TEXT, file TEXT, length INTEGER, size INTEGER, status INTEGER, bvid TEXT, title TEXT, desc TEXT, tags TEXT, area INTEGER, created_at TEXT);
-       "#,
+       ",
        kind: MigrationKind::Up,
    },
    Migration {
        version: 2,
        description: "add_auto_start_column",
-       sql: r#"ALTER TABLE recorders ADD COLUMN auto_start INTEGER NOT NULL DEFAULT 1;"#,
+       sql: r"ALTER TABLE recorders ADD COLUMN auto_start INTEGER NOT NULL DEFAULT 1;",
        kind: MigrationKind::Up,
    },
    // add platform column to videos table
    Migration {
        version: 3,
        description: "add_platform_column",
-       sql: r#"ALTER TABLE videos ADD COLUMN platform TEXT;"#,
+       sql: r"ALTER TABLE videos ADD COLUMN platform TEXT;",
        kind: MigrationKind::Up,
    },
    // add task table to record encode/upload task
    Migration {
        version: 4,
        description: "add_task_table",
-       sql: r#"CREATE TABLE tasks (id TEXT PRIMARY KEY, type TEXT, status TEXT, message TEXT, metadata TEXT, created_at TEXT);"#,
+       sql: r"CREATE TABLE tasks (id TEXT PRIMARY KEY, type TEXT, status TEXT, message TEXT, metadata TEXT, created_at TEXT);",
        kind: MigrationKind::Up,
    },
    // add id_str column to support string IDs like Douyin sec_uid while keeping uid for Bilibili compatibility
    Migration {
        version: 5,
        description: "add_id_str_column",
-       sql: r#"ALTER TABLE accounts ADD COLUMN id_str TEXT;"#,
+       sql: r"ALTER TABLE accounts ADD COLUMN id_str TEXT;",
        kind: MigrationKind::Up,
    },
    // add extra column to recorders
    Migration {
        version: 6,
        description: "add_extra_column_to_recorders",
-       sql: r#"ALTER TABLE recorders ADD COLUMN extra TEXT;"#,
+       sql: r"ALTER TABLE recorders ADD COLUMN extra TEXT;",
        kind: MigrationKind::Up,
    },
+   // add indexes
+   Migration {
+       version: 7,
+       description: "add_indexes",
+       sql: r"
+       CREATE INDEX idx_records_live_id ON records (room_id, live_id);
+       CREATE INDEX idx_records_created_at ON records (room_id, created_at);
+       CREATE INDEX idx_videos_room_id ON videos (room_id);
+       CREATE INDEX idx_videos_created_at ON videos (created_at);
+       ",
+       kind: MigrationKind::Up,
+   },
+   // add note column for video
+   Migration {
+       version: 8,
+       description: "add_note_column_for_video",
+       sql: r"ALTER TABLE videos ADD COLUMN note TEXT;",
+       kind: MigrationKind::Up,
+   },
+   Migration {
+       version: 9,
+       description: "add_parent_id_column_for_record",
+       sql: r"ALTER TABLE records ADD COLUMN parent_id TEXT;",
+       kind: MigrationKind::Up,
+   },
]
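The list above is consumed by the sqlx/tauri-plugin-sql migration runner; the sketch below only illustrates the version-ordered application idea and is not this repository's runner:

```rust
// Illustrative sketch of how a versioned migration list like the one above is
// typically applied. The real runner here is tauri-plugin-sql / sqlx.
struct Migration {
    version: i64,
    description: &'static str,
    sql: &'static str,
}

fn apply_pending(current_version: i64, mut migrations: Vec<Migration>) {
    // Apply strictly in version order, skipping anything already applied.
    migrations.sort_by_key(|m| m.version);
    for m in migrations.into_iter().filter(|m| m.version > current_version) {
        println!("applying v{}: {}", m.version, m.description);
        // execute(m.sql) would run here, followed by recording m.version.
        let _ = m.sql;
    }
}
```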
@@ -204,8 +232,8 @@ impl MigrationSource<'static> for MigrationList {
async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Error>> {
    use std::path::PathBuf;

-   use progress_manager::ProgressManager;
-   use progress_reporter::EventEmitter;
+   use progress::progress_manager::ProgressManager;
+   use progress::progress_reporter::EventEmitter;

    setup_logging(Path::new("./")).await?;
    log::info!("Setting up server state...");
@@ -219,7 +247,7 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
            return Err(e.into());
        }
    };
-   let client = Arc::new(BiliClient::new(&config.user_agent)?);
+   let client = Arc::new(BiliClient::new()?);
    let config = Arc::new(RwLock::new(config));
    let db = Arc::new(Database::new());
    // connect to sqlite database
@@ -249,7 +277,14 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err

    let progress_manager = Arc::new(ProgressManager::new());
    let emitter = EventEmitter::new(progress_manager.get_event_sender());
-   let recorder_manager = Arc::new(RecorderManager::new(emitter, db.clone(), config.clone()));
+   let webhook_poster =
+       webhook::poster::create_webhook_poster(&config.read().await.webhook_url, None).unwrap();
+   let recorder_manager = Arc::new(RecorderManager::new(
+       emitter,
+       db.clone(),
+       config.clone(),
+       webhook_poster.clone(),
+   ));

    // Update account infos for headless mode
    let accounts = db.get_accounts().await?;
@@ -278,7 +313,7 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
        } else if platform == PlatformType::Douyin {
            // Update Douyin account info
            use crate::recorder::douyin::client::DouyinClient;
-           let douyin_client = DouyinClient::new(&config.read().await.user_agent, &account);
+           let douyin_client = DouyinClient::new(&account);
            match douyin_client.get_user_info().await {
                Ok(user_info) => {
                    let avatar_url = user_info
@@ -308,11 +343,16 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
    }

    let _ = try_rebuild_archives(&db, config.read().await.cache.clone().into()).await;
+   let _ = try_convert_live_covers(&db, config.read().await.cache.clone().into()).await;
+   let _ = try_convert_clip_covers(&db, config.read().await.output.clone().into()).await;
+   let _ = try_add_parent_id_to_records(&db).await;
+   let _ = try_convert_entry_to_m3u8(&db, config.read().await.cache.clone().into()).await;

    Ok(State {
        db,
        client,
        config,
+       webhook_poster,
        recorder_manager,
        progress_manager,
        readonly: args.readonly,
@@ -322,7 +362,7 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
#[cfg(feature = "gui")]
async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::Error>> {
    use platform_dirs::AppDirs;
-   use progress_reporter::EventEmitter;
+   use progress::progress_reporter::EventEmitter;

    let log_dir = app.path().app_log_dir()?;
    setup_logging(&log_dir).await?;
@@ -332,7 +372,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
    let config_path = app_dirs.config_dir.join("Conf.toml");
    let cache_path = app_dirs.cache_dir.join("cache");
    let output_path = app_dirs.data_dir.join("output");
-   log::info!("Loading config from {:?}", config_path);
+   log::info!("Loading config from {config_path:?}");
    let config = match Config::load(&config_path, &cache_path, &output_path) {
        Ok(config) => config,
        Err(e) => {
@@ -341,7 +381,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
        }
    };

-   let client = Arc::new(BiliClient::new(&config.user_agent)?);
+   let client = Arc::new(BiliClient::new()?);
    let config = Arc::new(RwLock::new(config));
    let config_clone = config.clone();
    let dbs = app.state::<tauri_plugin_sql::DbInstances>().inner();
@@ -356,12 +396,15 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
    };
    db_clone.set(sqlite_pool.unwrap().clone()).await;
    db_clone.finish_pending_tasks().await?;
+   let webhook_poster =
+       webhook::poster::create_webhook_poster(&config.read().await.webhook_url, None).unwrap();

    let recorder_manager = Arc::new(RecorderManager::new(
        app.app_handle().clone(),
        emitter,
        db.clone(),
        config.clone(),
+       webhook_poster.clone(),
    ));

    let accounts = db_clone.get_accounts().await?;
@@ -373,10 +416,11 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
            config,
            recorder_manager,
            app_handle: app.handle().clone(),
+           webhook_poster,
        });
    }

-   // update account infos
+   // update account info
    for account in accounts {
        let platform = PlatformType::from_str(&account.platform).unwrap();

@@ -392,17 +436,17 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
                )
                .await
                {
-                   log::error!("Error when updating Bilibili account info {}", e);
+                   log::error!("Error when updating Bilibili account info {e}");
                }
            }
            Err(e) => {
-               log::error!("Get Bilibili user info failed {}", e);
+               log::error!("Get Bilibili user info failed {e}");
            }
        }
    } else if platform == PlatformType::Douyin {
        // Update Douyin account info
        use crate::recorder::douyin::client::DouyinClient;
-       let douyin_client = DouyinClient::new(&config_clone.read().await.user_agent, &account);
+       let douyin_client = DouyinClient::new(&account);
        match douyin_client.get_user_info().await {
            Ok(user_info) => {
                let avatar_url = user_info
@@ -421,11 +465,11 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
                )
                .await
                {
-                   log::error!("Error when updating Douyin account info {}", e);
+                   log::error!("Error when updating Douyin account info {e}");
                }
            }
            Err(e) => {
-               log::error!("Get Douyin user info failed {}", e);
+               log::error!("Get Douyin user info failed {e}");
            }
        }
    }
@@ -433,9 +477,14 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::

    // try to rebuild archive table
    let cache_path = config_clone.read().await.cache.clone();
-   if let Err(e) = try_rebuild_archives(&db_clone, cache_path.into()).await {
-       log::warn!("Rebuilding archive table failed: {}", e);
+   let output_path = config_clone.read().await.output.clone();
+   if let Err(e) = try_rebuild_archives(&db_clone, cache_path.clone().into()).await {
+       log::warn!("Rebuilding archive table failed: {e}");
    }
+   let _ = try_convert_live_covers(&db_clone, cache_path.clone().into()).await;
+   let _ = try_convert_clip_covers(&db_clone, output_path.clone().into()).await;
+   let _ = try_add_parent_id_to_records(&db_clone).await;
+   let _ = try_convert_entry_to_m3u8(&db_clone, cache_path.clone().into()).await;

    Ok(State {
        db,
@@ -443,6 +492,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
        config,
        recorder_manager,
        app_handle: app.handle().clone(),
+       webhook_poster,
    })
}

@@ -511,7 +561,7 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
        crate::handlers::config::update_auto_generate,
        crate::handlers::config::update_status_check_interval,
        crate::handlers::config::update_whisper_language,
-       crate::handlers::config::update_user_agent,
+       crate::handlers::config::update_webhook_url,
        crate::handlers::message::get_messages,
        crate::handlers::message::read_message,
        crate::handlers::message::delete_message,
@@ -519,11 +569,14 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
        crate::handlers::recorder::add_recorder,
        crate::handlers::recorder::remove_recorder,
        crate::handlers::recorder::get_room_info,
+       crate::handlers::recorder::get_archive_disk_usage,
        crate::handlers::recorder::get_archives,
+       crate::handlers::recorder::get_archive,
+       crate::handlers::recorder::get_archives_by_parent_id,
        crate::handlers::recorder::get_archive_subtitle,
        crate::handlers::recorder::generate_archive_subtitle,
        crate::handlers::recorder::delete_archive,
        crate::handlers::recorder::delete_archives,
        crate::handlers::recorder::get_danmu_record,
        crate::handlers::recorder::export_danmu,
        crate::handlers::recorder::send_danmaku,
@@ -532,6 +585,7 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
        crate::handlers::recorder::get_recent_record,
        crate::handlers::recorder::set_enable,
        crate::handlers::recorder::fetch_hls,
+       crate::handlers::recorder::generate_whole_clip,
        crate::handlers::video::clip_range,
        crate::handlers::video::upload_procedure,
        crate::handlers::video::cancel,
@@ -545,11 +599,14 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
        crate::handlers::video::generate_video_subtitle,
        crate::handlers::video::get_video_subtitle,
        crate::handlers::video::update_video_subtitle,
+       crate::handlers::video::update_video_note,
        crate::handlers::video::encode_video_subtitle,
        crate::handlers::video::generic_ffmpeg_command,
+       crate::handlers::video::import_external_video,
+       crate::handlers::video::batch_import_external_videos,
        crate::handlers::video::clip_video,
        crate::handlers::video::get_file_size,
        crate::handlers::video::get_import_progress,
        crate::handlers::task::get_tasks,
        crate::handlers::task::delete_task,
        crate::handlers::utils::show_in_folder,
@@ -625,6 +682,6 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
        Ok(v) => log::info!("Checked ffmpeg version: {v}"),
    }

-   http_server::start_api_server(state).await;
+   http_server::api_server::start_api_server(state).await;
    Ok(())
}
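The `gui`/`headless` split seen throughout `main.rs` boils down to cargo feature gates selecting one of two startup paths. A minimal standalone illustration, assuming mutually exclusive `gui` and `headless` features declared in `Cargo.toml`:

```rust
// Sketch only: feature names mirror this repo, the bodies do not.
// Build with `cargo build --features headless` or `--features gui` (assumed
// to be mutually exclusive; enabling both would define run() twice).
#[cfg(feature = "headless")]
fn run() {
    println!("starting HTTP API server");
}

#[cfg(feature = "gui")]
fn run() {
    println!("starting Tauri window");
}

fn main() {
    run();
}
```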
src-tauri/src/migration/migration_methods.rs (new file, 170 lines)
@@ -0,0 +1,170 @@
+use std::path::PathBuf;
+use std::sync::Arc;
+
+use base64::Engine;
+
+use crate::database::Database;
+use crate::recorder::entry::EntryStore;
+use crate::recorder::PlatformType;
+
+pub async fn try_rebuild_archives(
+    db: &Arc<Database>,
+    cache_path: PathBuf,
+) -> Result<(), Box<dyn std::error::Error>> {
+    let rooms = db.get_recorders().await?;
+    for room in rooms {
+        let room_id = room.room_id;
+        let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
+        let mut files = tokio::fs::read_dir(room_cache_path).await?;
+        while let Some(file) = files.next_entry().await? {
+            if file.file_type().await?.is_dir() {
+                // use folder name as live_id
+                let live_id = file.file_name();
+                let live_id = live_id.to_str().unwrap();
+                // check if live_id is in db
+                let record = db.get_record(room_id, live_id).await;
+                if record.is_ok() {
+                    continue;
+                }
+
+                // create a record for this live_id
+                let record = db
+                    .add_record(
+                        PlatformType::from_str(room.platform.as_str()).unwrap(),
+                        live_id,
+                        live_id,
+                        room_id,
+                        &format!("UnknownLive {live_id}"),
+                        None,
+                    )
+                    .await?;
+
+                log::info!("rebuild archive {record:?}");
+            }
+        }
+    }
+    Ok(())
+}
+
+pub async fn try_convert_live_covers(
+    db: &Arc<Database>,
+    cache_path: PathBuf,
+) -> Result<(), Box<dyn std::error::Error>> {
+    let rooms = db.get_recorders().await?;
+    for room in rooms {
+        let room_id = room.room_id;
+        let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
+        let records = db.get_records(room_id, 0, 999_999_999).await?;
+        for record in &records {
+            let record_path = room_cache_path.join(record.live_id.clone());
+            let cover = record.cover.clone();
+            if cover.is_none() {
+                continue;
+            }
+
+            let cover = cover.unwrap();
+            if cover.starts_with("data:") {
+                let base64 = cover.split("base64,").nth(1).unwrap();
+                let bytes = base64::engine::general_purpose::STANDARD
+                    .decode(base64)
+                    .unwrap();
+                let path = record_path.join("cover.jpg");
+                tokio::fs::write(&path, bytes).await?;
+
+                log::info!("convert live cover: {}", path.display());
+                // update record
+                db.update_record_cover(
+                    record.live_id.as_str(),
+                    Some(format!(
+                        "{}/{}/{}/cover.jpg",
+                        room.platform, room_id, record.live_id
+                    )),
+                )
+                .await?;
+            }
+        }
+    }
+    Ok(())
+}
+
+pub async fn try_convert_clip_covers(
+    db: &Arc<Database>,
+    output_path: PathBuf,
+) -> Result<(), Box<dyn std::error::Error>> {
+    let videos = db.get_all_videos().await?;
+    log::debug!("videos: {}", videos.len());
+    for video in &videos {
+        let cover = video.cover.clone();
+        if cover.starts_with("data:") {
+            let base64 = cover.split("base64,").nth(1).unwrap();
+            let bytes = base64::engine::general_purpose::STANDARD
+                .decode(base64)
+                .unwrap();
+
+            let video_file_path = output_path.join(video.file.clone());
+            let cover_file_path = video_file_path.with_extension("jpg");
+            log::debug!("cover_file_path: {}", cover_file_path.display());
+            tokio::fs::write(&cover_file_path, bytes).await?;
+
+            log::info!("convert clip cover: {}", cover_file_path.display());
+            // update record
+            db.update_video_cover(
+                video.id,
+                cover_file_path.file_name().unwrap().to_str().unwrap(),
+            )
+            .await?;
+        }
+    }
+    Ok(())
+}
+
+pub async fn try_add_parent_id_to_records(
+    db: &Arc<Database>,
+) -> Result<(), Box<dyn std::error::Error>> {
+    let rooms = db.get_recorders().await?;
+    for room in &rooms {
+        let records = db.get_records(room.room_id, 0, 999_999_999).await?;
+        for record in &records {
+            if record.parent_id.is_empty() {
+                db.update_record_parent_id(record.live_id.as_str(), record.live_id.as_str())
+                    .await?;
+            }
+        }
+    }
+    Ok(())
+}
+
+pub async fn try_convert_entry_to_m3u8(
+    db: &Arc<Database>,
+    cache_path: PathBuf,
+) -> Result<(), Box<dyn std::error::Error>> {
+    let rooms = db.get_recorders().await?;
+    for room in &rooms {
+        let records = db.get_records(room.room_id, 0, 999_999_999).await?;
+        for record in &records {
+            let record_path = cache_path.join(format!(
+                "{}/{}/{}",
+                room.platform, room.room_id, record.live_id
+            ));
+            let entry_file = record_path.join("entries.log");
+            let m3u8_file_path = record_path.join("playlist.m3u8");
+            if !entry_file.exists() || m3u8_file_path.exists() {
+                continue;
+            }
+            let entry_store = EntryStore::new(record_path.to_str().unwrap()).await;
+            if entry_store.len() == 0 {
+                continue;
+            }
+            let m3u8_content = entry_store.manifest(true, true, None);
+
+            tokio::fs::write(&m3u8_file_path, m3u8_content).await?;
+            log::info!(
+                "Convert entry to m3u8: {} => {}",
+                entry_file.display(),
+                m3u8_file_path.display()
+            );
+        }
+    }
+
+    Ok(())
+}
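The two cover-conversion helpers above both peel a base64 payload out of a `data:` URI before writing it to disk. A standalone sketch of that step, with error handling reduced to `Option`:

```rust
// Sketch of the data-URI decoding done by try_convert_live_covers /
// try_convert_clip_covers above; same base64 crate as the diff.
use base64::Engine;

fn decode_data_uri(cover: &str) -> Option<Vec<u8>> {
    // "data:image/jpeg;base64,<payload>" -> raw image bytes
    let payload = cover.split("base64,").nth(1)?;
    base64::engine::general_purpose::STANDARD.decode(payload).ok()
}

fn main() {
    let uri = "data:image/jpeg;base64,aGVsbG8=";
    assert_eq!(decode_data_uri(uri).as_deref(), Some(b"hello".as_ref()));
}
```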
@@ -1,3 +1,5 @@
+pub mod migration_methods;
+
use sqlx::migrate::MigrationType;

#[derive(Debug)]
@@ -6,6 +8,7 @@ pub enum MigrationKind {
    Down,
}

+#[cfg(feature = "headless")]
#[derive(Debug)]
pub struct Migration {
    pub version: i64,
src-tauri/src/progress/mod.rs (new file, 2 lines)
@@ -0,0 +1,2 @@
+pub mod progress_manager;
+pub mod progress_reporter;
@@ -15,7 +15,7 @@ pub enum Event {
        message: String,
    },
    DanmuReceived {
-       room: u64,
+       room: i64,
        ts: i64,
        content: String,
    },
@@ -30,7 +30,7 @@ pub struct ProgressManager {
#[cfg(feature = "headless")]
impl ProgressManager {
    pub fn new() -> Self {
-       let (progress_sender, progress_receiver) = broadcast::channel(16);
+       let (progress_sender, progress_receiver) = broadcast::channel(256);
        Self {
            progress_sender,
            progress_receiver,
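`broadcast::channel(256)` enlarges the ring buffer that slow subscribers read from; a receiver that falls further behind than the capacity gets `RecvError::Lagged`, which the WebSocket forwarder earlier in this diff logs and tolerates. A standalone demonstration:

```rust
// Demonstrates why broadcast capacity matters: overflow the ring buffer and
// the next recv() reports how many events the receiver missed.
use tokio::sync::broadcast;

#[tokio::main]
async fn main() {
    let (tx, mut rx) = broadcast::channel::<u32>(16);
    for i in 0..32 {
        tx.send(i).unwrap(); // overflow the 16-slot ring buffer
    }
    match rx.recv().await {
        Err(broadcast::error::RecvError::Lagged(n)) => {
            println!("receiver lagged, {n} events dropped");
        }
        other => println!("unexpected: {other:?}"),
    }
}
```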
@@ -4,7 +4,7 @@ use std::sync::Arc;
use std::sync::LazyLock;
use tokio::sync::RwLock;

-use crate::progress_manager::Event;
+use crate::progress::progress_manager::Event;

#[cfg(feature = "gui")]
use {
@@ -76,7 +76,10 @@ impl EventEmitter {
        match event {
            Event::ProgressUpdate { id, content } => {
                self.app_handle
-                   .emit("progress-update", UpdateEvent { id, content })
+                   .emit(
+                       &format!("progress-update:{}", id),
+                       UpdateEvent { id, content },
+                   )
                    .unwrap();
            }
            Event::ProgressFinished {
@@ -86,7 +89,7 @@ impl EventEmitter {
            } => {
                self.app_handle
                    .emit(
-                       "progress-finished",
+                       &format!("progress-finished:{}", id),
                        FinishEvent {
                            id,
                            success: *success,
@@ -98,7 +101,7 @@ impl EventEmitter {
            Event::DanmuReceived { room, ts, content } => {
                self.app_handle
                    .emit(
-                       &format!("danmu:{}", room),
+                       &format!("danmu:{room}"),
                        DanmuEntry {
                            ts: *ts,
                            content: content.clone(),
@@ -117,7 +120,7 @@ impl ProgressReporter {
    pub async fn new(emitter: &EventEmitter, event_id: &str) -> Result<Self, String> {
        // if already exists, return
        if CANCEL_FLAG_MAP.read().await.get(event_id).is_some() {
-           log::error!("Task already exists: {}", event_id);
+           log::error!("Task already exists: {event_id}");
            emitter.emit(&Event::ProgressFinished {
                id: event_id.to_string(),
                success: false,
(File diff suppressed because it is too large)
@@ -7,12 +7,13 @@ use super::response::PostVideoMetaResponse;
use super::response::PreuploadResponse;
use super::response::VideoSubmitData;
use crate::database::account::AccountRow;
-use crate::progress_reporter::ProgressReporter;
-use crate::progress_reporter::ProgressReporterTrait;
-use base64::Engine;
+use crate::progress::progress_reporter::ProgressReporter;
+use crate::progress::progress_reporter::ProgressReporterTrait;
+use crate::recorder::user_agent_generator;
use chrono::TimeZone;
use pct_str::PctString;
use pct_str::URIReserved;
+use rand::seq::SliceRandom;
use regex::Regex;
use reqwest::Client;
use serde::Deserialize;
@@ -39,16 +40,16 @@ struct UploadParams<'a> {
pub struct RoomInfo {
    pub live_status: u8,
    pub room_cover_url: String,
-   pub room_id: u64,
+   pub room_id: i64,
    pub room_keyframe_url: String,
    pub room_title: String,
-   pub user_id: u64,
+   pub user_id: i64,
    pub live_start_time: i64,
}

#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct UserInfo {
-   pub user_id: u64,
+   pub user_id: i64,
    pub user_name: String,
    pub user_sign: String,
    pub user_avatar_url: String,
@@ -68,71 +69,138 @@ pub struct QrStatus {
    pub cookies: String,
}

-/// BiliClient is thread safe
+/// `BiliClient` is thread safe
pub struct BiliClient {
    client: Client,
-   headers: reqwest::header::HeaderMap,
}

-#[derive(Clone, Copy, PartialEq, Eq, Debug)]
-pub enum StreamType {
-   TS,
-   FMP4,
-}

#[derive(Clone, Debug)]
pub struct BiliStream {
-   pub format: StreamType,
+   pub format: Format,
+   pub codec: Codec,
+   pub base_url: String,
+   pub url_info: Vec<UrlInfo>,
+   pub drm: bool,
+   pub master_url: Option<String>,
}

#[derive(Clone, Debug)]
pub struct UrlInfo {
    pub host: String,
    pub path: String,
    pub extra: String,
    pub expire: i64,
}

+#[derive(Clone, Debug)]
+#[allow(dead_code)]
+pub enum Protocol {
+   HttpStream,
+   HttpHls,
+}
+
+impl fmt::Display for Protocol {
+   fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+       write!(f, "{:?}", self)
+   }
+}
+
+#[derive(Clone, Debug, PartialEq)]
+pub enum Format {
+   Flv,
+   TS,
+   FMP4,
+}
+
+impl fmt::Display for Format {
+   fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+       write!(f, "{:?}", self)
+   }
+}
+
+#[derive(Clone, Debug)]
+pub enum Codec {
+   Avc,
+   Hevc,
+}
+
+impl fmt::Display for Codec {
+   fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+       write!(f, "{:?}", self)
+   }
+}

// 30000 Dolby
// 20000 4K
// 15000 2K
// 10000 original quality
// 400 Blu-ray
// 250 ultra HD
// 150 HD
// 80 smooth

#[derive(Clone, Debug)]
#[allow(dead_code)]
pub enum Qn {
    Dolby = 30000,
    Q4K = 20000,
    Q2K = 15000,
    Q1080PH = 10000,
    Q1080P = 400,
    Q720P = 250,
    Hd = 150,
    Smooth = 80,
}

impl fmt::Display for Qn {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{:?}", self)
    }
}

impl fmt::Display for BiliStream {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(
            f,
-           "type: {:?}, host: {}, path: {}, extra: {}, expire: {}",
-           self.format, self.host, self.path, self.extra, self.expire
+           "type: {:?}, codec: {:?}, base_url: {}, url_info: {:?}, drm: {}, master_url: {:?}",
+           self.format, self.codec, self.base_url, self.url_info, self.drm, self.master_url
        )
    }
}

impl BiliStream {
-   pub fn new(format: StreamType, base_url: &str, host: &str, extra: &str) -> BiliStream {
+   pub fn new(
+       format: Format,
+       codec: Codec,
+       base_url: &str,
+       url_info: Vec<UrlInfo>,
+       drm: bool,
+       master_url: Option<String>,
+   ) -> BiliStream {
        BiliStream {
            format,
-           host: host.into(),
-           path: BiliStream::get_path(base_url),
-           extra: extra.into(),
-           expire: BiliStream::get_expire(extra).unwrap_or(600000),
+           codec,
+           base_url: base_url.into(),
+           url_info,
+           drm,
+           master_url,
        }
    }

    pub fn index(&self) -> String {
-       format!(
-           "https://{}/{}/{}?{}",
-           self.host, self.path, "index.m3u8", self.extra
-       )
+       // random choose a url_info
+       let url_info = self.url_info.choose(&mut rand::thread_rng()).unwrap();
+       format!("{}{}{}", url_info.host, self.base_url, url_info.extra)
    }

    pub fn ts_url(&self, seg_name: &str) -> String {
-       format!(
-           "https://{}/{}/{}?{}",
-           self.host, self.path, seg_name, self.extra
-       )
+       let m3u8_filename = self.base_url.split('/').next_back().unwrap();
+       let base_url = self.base_url.replace(m3u8_filename, seg_name);
+       let url_info = self.url_info.choose(&mut rand::thread_rng()).unwrap();
+       format!("{}{}?{}", url_info.host, base_url, url_info.extra)
    }

    pub fn get_path(base_url: &str) -> String {
        match base_url.rfind('/') {
            Some(pos) => base_url[..pos + 1].to_string(),
            None => base_url.to_string(),
        }
    }

-   pub fn get_expire(extra: &str) -> Option<i64> {
-       extra.split('&').find_map(|param| {
+   pub fn get_expire(&self) -> Option<i64> {
+       let url_info = self.url_info.choose(&mut rand::thread_rng()).unwrap();
+       url_info.extra.split('&').find_map(|param| {
            if param.starts_with("expires=") {
                param.split('=').nth(1)?.parse().ok()
            } else {
@@ -143,22 +211,27 @@ impl BiliStream {
    }
}

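Under the new scheme a stream URL is assembled as `host + base_url + extra`, with the expiry recoverable from the `extra` query string. A standalone sketch with made-up host and token values:

```rust
// Standalone sketch of the URL composition in BiliStream::index()/ts_url();
// the host and query values below are hypothetical.
struct UrlInfo {
    host: String,
    extra: String,
}

fn compose(u: &UrlInfo, base_url: &str) -> String {
    format!("{}{}{}", u.host, base_url, u.extra)
}

fn expire_of(extra: &str) -> Option<i64> {
    // Same idea as get_expire(): scan query params for "expires=".
    extra.split('&').find_map(|p| p.strip_prefix("expires=")?.parse().ok())
}

fn main() {
    let u = UrlInfo {
        host: "https://cn-example.bilivideo.com".into(), // hypothetical host
        extra: "?expires=1700000000&token=abc".into(),   // hypothetical query
    };
    println!("{}", compose(&u, "/live-bvc/playlist.m3u8"));
    assert_eq!(expire_of(&u.extra[1..]), Some(1_700_000_000));
}
```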
impl BiliClient {
-   pub fn new(user_agent: &str) -> Result<BiliClient, BiliClientError> {
-       let mut headers = reqwest::header::HeaderMap::new();
-       headers.insert("user-agent", user_agent.parse().unwrap());
-
+   pub fn new() -> Result<BiliClient, BiliClientError> {
        if let Ok(client) = Client::builder().timeout(Duration::from_secs(10)).build() {
-           Ok(BiliClient { client, headers })
+           Ok(BiliClient { client })
        } else {
            Err(BiliClientError::InitClientError)
        }
    }

+   fn generate_user_agent_header(&self) -> reqwest::header::HeaderMap {
+       let user_agent = user_agent_generator::UserAgentGenerator::new().generate();
+       let mut headers = reqwest::header::HeaderMap::new();
+       headers.insert("user-agent", user_agent.parse().unwrap());
+       headers
+   }

    pub async fn get_qr(&self) -> Result<QrInfo, BiliClientError> {
+       let headers = self.generate_user_agent_header();
        let res: serde_json::Value = self
            .client
            .get("https://passport.bilibili.com/x/passport-login/web/qrcode/generate")
-           .headers(self.headers.clone())
+           .headers(headers)
            .send()
            .await?
            .json()
@@ -176,19 +249,19 @@ impl BiliClient {
    }

    pub async fn get_qr_status(&self, qrcode_key: &str) -> Result<QrStatus, BiliClientError> {
+       let headers = self.generate_user_agent_header();
        let res: serde_json::Value = self
            .client
            .get(format!(
-               "https://passport.bilibili.com/x/passport-login/web/qrcode/poll?qrcode_key={}",
-               qrcode_key
+               "https://passport.bilibili.com/x/passport-login/web/qrcode/poll?qrcode_key={qrcode_key}"
            ))
-           .headers(self.headers.clone())
+           .headers(headers)
            .send()
            .await?
            .json()
            .await?;
        let code: u8 = res["data"]["code"].as_u64().unwrap_or(400) as u8;
-       let mut cookies: String = "".to_string();
+       let mut cookies: String = String::new();
        if code == 0 {
            let url = res["data"]["url"]
                .as_str()
@@ -201,8 +274,8 @@ impl BiliClient {
    }

    pub async fn logout(&self, account: &AccountRow) -> Result<(), BiliClientError> {
+       let mut headers = self.generate_user_agent_header();
        let url = "https://passport.bilibili.com/login/exit/v2";
-       let mut headers = self.headers.clone();
        if let Ok(cookies) = account.cookies.parse() {
            headers.insert("cookie", cookies);
        } else {
@@ -223,7 +296,7 @@ impl BiliClient {
    pub async fn get_user_info(
        &self,
        account: &AccountRow,
-       user_id: u64,
+       user_id: i64,
    ) -> Result<UserInfo, BiliClientError> {
        let params: Value = json!({
            "mid": user_id.to_string(),
@@ -233,7 +306,7 @@ impl BiliClient {
            "w_webid": "",
        });
        let params = self.get_sign(params).await?;
-       let mut headers = self.headers.clone();
+       let mut headers = self.generate_user_agent_header();
        if let Ok(cookies) = account.cookies.parse() {
            headers.insert("cookie", cookies);
        } else {
@@ -242,8 +315,7 @@ impl BiliClient {
        let resp = self
            .client
            .get(format!(
-               "https://api.bilibili.com/x/space/wbi/acc/info?{}",
-               params
+               "https://api.bilibili.com/x/space/wbi/acc/info?{params}"
            ))
            .headers(headers)
            .send()
@@ -263,7 +335,7 @@ impl BiliClient {
            .as_u64()
            .ok_or(BiliClientError::InvalidResponseJson { resp: res.clone() })?;
        if code != 0 {
-           log::error!("Get user info failed {}", code);
+           log::error!("Get user info failed {code}");
            return Err(BiliClientError::InvalidMessageCode { code });
        }
        Ok(UserInfo {
@@ -277,9 +349,9 @@ impl BiliClient {
    pub async fn get_room_info(
        &self,
        account: &AccountRow,
-       room_id: u64,
+       room_id: i64,
    ) -> Result<RoomInfo, BiliClientError> {
-       let mut headers = self.headers.clone();
+       let mut headers = self.generate_user_agent_header();
        if let Ok(cookies) = account.cookies.parse() {
            headers.insert("cookie", cookies);
        } else {
@@ -288,8 +360,7 @@ impl BiliClient {
        let response = self
            .client
            .get(format!(
-               "https://api.live.bilibili.com/room/v1/Room/get_info?room_id={}",
-               room_id
+               "https://api.live.bilibili.com/room/v1/Room/get_info?room_id={room_id}"
            ))
            .headers(headers)
            .send()
@@ -313,7 +384,7 @@ impl BiliClient {
        }

        let room_id = res["data"]["room_id"]
-           .as_u64()
+           .as_i64()
            .ok_or(BiliClientError::InvalidValue)?;
        let room_title = res["data"]["title"]
            .as_str()
@@ -328,7 +399,7 @@ impl BiliClient {
            .ok_or(BiliClientError::InvalidValue)?
            .to_string();
        let user_id = res["data"]["uid"]
-           .as_u64()
+           .as_i64()
            .ok_or(BiliClientError::InvalidValue)?;
        let live_status = res["data"]["live_status"]
            .as_u64()
@@ -340,35 +411,166 @@ impl BiliClient {
        let live_start_time = if live_start_time_str == "0000-00-00 00:00:00" {
            0
        } else {
-           // this is a fixed Asia/Shanghai datetime str
            let naive =
                chrono::NaiveDateTime::parse_from_str(live_start_time_str, "%Y-%m-%d %H:%M:%S")
                    .map_err(|_| BiliClientError::InvalidValue)?;
-           chrono::Local
+           // parse as UTC datetime and convert to timestamp
+           chrono::Utc
                .from_local_datetime(&naive)
                .earliest()
                .ok_or(BiliClientError::InvalidValue)?
                .timestamp()
+               - 8 * 3600
        };
        Ok(RoomInfo {
-           room_id,
-           room_title,
-           room_cover_url,
-           room_keyframe_url,
-           user_id,
            live_status,
+           room_cover_url,
+           room_id,
+           room_keyframe_url,
+           room_title,
+           user_id,
            live_start_time,
        })
    }

-   /// Get and encode response data into base64
-   pub async fn get_cover_base64(&self, url: &str) -> Result<String, BiliClientError> {
+   /// Get stream info from room id
+   ///
+   /// https://socialsisteryi.github.io/bilibili-API-collect/docs/live/info.html#%E8%8E%B7%E5%8F%96%E7%9B%B4%E6%92%AD%E9%97%B4%E4%BF%A1%E6%81%AF-1
+   /// https://api.live.bilibili.com/xlive/web-room/v2/index/getRoomPlayInfo?room_id=31368705&protocol=1&format=1&codec=0&qn=10000&platform=h5
+   pub async fn get_stream_info(
+       &self,
+       account: &AccountRow,
+       room_id: i64,
+       protocol: Protocol,
+       format: Format,
+       codec: &[Codec],
+       qn: Qn,
+   ) -> Result<BiliStream, BiliClientError> {
+       let url = format!(
+           "https://api.live.bilibili.com/xlive/web-room/v2/index/getRoomPlayInfo?room_id={}&protocol={}&format={}&codec={}&qn={}&platform=h5",
+           room_id,
+           protocol.clone() as u8,
+           format.clone() as u8,
+           codec.iter().map(|c| (c.clone() as u8).to_string()).collect::<Vec<String>>().join(","),
+           qn as i64,
+       );
+       let mut headers = self.generate_user_agent_header();
+       if let Ok(cookies) = account.cookies.parse() {
+           headers.insert("cookie", cookies);
+       } else {
+           return Err(BiliClientError::InvalidCookie);
+       }
+       let response = self.client.get(url).headers(headers).send().await?;
+       let res: serde_json::Value = response.json().await?;
+
+       let code = res["code"].as_u64().unwrap_or(0);
+       let message = res["message"].as_str().unwrap_or("");
+       if code != 0 {
+           return Err(BiliClientError::ApiError(format!(
+               "Code {} not found, message: {}",
+               code, message
+           )));
+       }
+
+       log::debug!("Get stream info response: {res}");
+
+       // Parse the new API response structure
+       let playurl_info = &res["data"]["playurl_info"]["playurl"];
+       let empty_vec = vec![];
+       let streams = playurl_info["stream"].as_array().unwrap_or(&empty_vec);
+
+       if streams.is_empty() {
+           return Err(BiliClientError::ApiError(
+               "No streams available".to_string(),
+           ));
+       }
+
+       // Find the matching protocol
+       let target_protocol = match protocol {
+           Protocol::HttpStream => "http_stream",
+           Protocol::HttpHls => "http_hls",
+       };
+
+       let stream = streams
+           .iter()
+           .find(|s| s["protocol_name"].as_str() == Some(target_protocol))
+           .ok_or_else(|| {
+               BiliClientError::ApiError(format!("Protocol {} not found", target_protocol))
+           })?;
+
+       // Find the matching format
+       let target_format = match format {
+           Format::Flv => "flv",
+           Format::TS => "ts",
+           Format::FMP4 => "fmp4",
+       };
+
+       let empty_vec = vec![];
+       let format_info = stream["format"]
+           .as_array()
+           .unwrap_or(&empty_vec)
+           .iter()
+           .find(|f| f["format_name"].as_str() == Some(target_format))
+           .ok_or_else(|| BiliClientError::FormatNotFound(target_format.to_owned()))?;
+
+       // Find the matching codec
+       let target_codecs = codec
+           .iter()
+           .map(|c| match c {
+               Codec::Avc => "avc",
+               Codec::Hevc => "hevc",
+           })
+           .collect::<Vec<&str>>();
+
+       let codec_info = format_info["codec"]
+           .as_array()
+           .unwrap_or(&empty_vec)
+           .iter()
+           .find(|c| target_codecs.contains(&c["codec_name"].as_str().unwrap_or("")))
+           .ok_or_else(|| BiliClientError::CodecNotFound(target_codecs.join(",")))?;
+
+       let url_info = codec_info["url_info"].as_array().unwrap_or(&empty_vec);
+
+       let url_info = url_info
+           .iter()
+           .map(|u| UrlInfo {
+               host: u["host"].as_str().unwrap_or("").to_string(),
+               extra: u["extra"].as_str().unwrap_or("").to_string(),
+           })
+           .collect();
+
+       let drm = codec_info["drm"].as_bool().unwrap_or(false);
+       let base_url = codec_info["base_url"].as_str().unwrap_or("").to_string();
+       let master_url = format_info["master_url"].as_str().map(|s| s.to_string());
+       let codec = codec_info["codec_name"].as_str().unwrap_or("");
+       let codec = match codec {
+           "avc" => Codec::Avc,
+           "hevc" => Codec::Hevc,
+           _ => return Err(BiliClientError::CodecNotFound(codec.to_string())),
+       };
+
+       Ok(BiliStream {
+           format,
+           codec,
+           base_url,
+           url_info,
+           drm,
+           master_url,
+       })
+   }

    /// Download file from url to path
    pub async fn download_file(&self, url: &str, path: &Path) -> Result<(), BiliClientError> {
        if !path.parent().unwrap().exists() {
            std::fs::create_dir_all(path.parent().unwrap()).unwrap();
        }
        let response = self.client.get(url).send().await?;
        let bytes = response.bytes().await?;
-       let base64 = base64::engine::general_purpose::STANDARD.encode(bytes);
-       let mime_type = mime_guess::from_path(url)
-           .first_or_octet_stream()
-           .to_string();
-       Ok(format!("data:{};base64,{}", mime_type, base64))
+       let mut file = tokio::fs::File::create(&path).await?;
+       let mut content = std::io::Cursor::new(bytes);
+       tokio::io::copy(&mut content, &mut file).await?;
+       Ok(())
    }

    pub async fn get_index_content(
@@ -376,7 +578,7 @@ impl BiliClient {
        account: &AccountRow,
        url: &String,
    ) -> Result<String, BiliClientError> {
-       let mut headers = self.headers.clone();
+       let mut headers = self.generate_user_agent_header();
        if let Ok(cookies) = account.cookies.parse() {
            headers.insert("cookie", cookies);
        } else {
@@ -397,11 +599,12 @@ impl BiliClient {
        }
    }

+   #[allow(unused)]
    pub async fn download_ts(&self, url: &str, file_path: &str) -> Result<u64, BiliClientError> {
        let res = self
            .client
            .get(url)
-           .headers(self.headers.clone())
+           .headers(self.generate_user_agent_header())
            .send()
            .await?;
        let mut file = tokio::fs::File::create(file_path).await?;
@@ -412,6 +615,12 @@ impl BiliClient {
        Ok(size)
    }

+   pub async fn download_ts_raw(&self, url: &str) -> Result<Vec<u8>, BiliClientError> {
+       let res = self.client.get(url).send().await?;
+       let bytes = res.bytes().await?;
+       Ok(bytes.to_vec())
+   }

    // Method from js code
    pub async fn get_sign(&self, mut parameters: Value) -> Result<String, BiliClientError> {
        let table = vec![
@@ -422,7 +631,7 @@ impl BiliClient {
        let nav_info: Value = self
            .client
            .get("https://api.bilibili.com/x/web-interface/nav")
-           .headers(self.headers.clone())
+           .headers(self.generate_user_agent_header())
            .send()
            .await?
            .json()
@@ -440,13 +649,13 @@ impl BiliClient {
            .get(1)
            .unwrap()
            .as_str();
-       let raw_string = format!("{}{}", img, sub);
+       let raw_string = format!("{img}{sub}");
        let mut encoded = Vec::new();
-       table.into_iter().for_each(|x| {
+       for x in table {
            if x < raw_string.len() {
                encoded.push(raw_string.as_bytes()[x]);
            }
-       });
+       }
        // only keep 32 bytes of encoded
        encoded = encoded[0..32].to_vec();
        let encoded = String::from_utf8(encoded).unwrap();
@@ -464,12 +673,12 @@ impl BiliClient {
            .as_object()
            .unwrap()
            .keys()
-           .map(|x| x.to_owned())
+           .map(std::borrow::ToOwned::to_owned)
            .collect::<Vec<String>>();
        // sort keys
        keys.sort();
        let mut params = String::new();
-       keys.iter().for_each(|x| {
+       for x in &keys {
            params.push_str(x);
            params.push('=');
            // Value filters !'()* characters
@@ -485,10 +694,10 @@ impl BiliClient {
            if x != keys.last().unwrap() {
                params.push('&');
            }
-       });
+       }
        // md5 params+encoded
        let w_rid = md5::compute(params.to_string() + encoded.as_str());
-       let params = params + format!("&w_rid={:x}", w_rid).as_str();
+       let params = params + format!("&w_rid={w_rid:x}").as_str();
        Ok(params)
    }
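`get_sign` implements Bilibili's WBI signing: characters of the `wbi_img` keys are re-mixed through a fixed index table, the query keys are sorted, and the md5 of `query + mixin_key` is appended as `w_rid`. The sketch below captures the shape of the scheme with a placeholder table; the real indices live in the source above:

```rust
// Sketch of the WBI signing flow; PLACEHOLDER_TABLE is an assumption,
// not the real mixin index table used by get_sign().
fn wbi_sign(mut pairs: Vec<(String, String)>, img_key: &str, sub_key: &str) -> String {
    const PLACEHOLDER_TABLE: [usize; 4] = [46, 47, 18, 2]; // illustrative only
    let raw: Vec<u8> = format!("{img_key}{sub_key}").into_bytes();
    let mixin: String = PLACEHOLDER_TABLE
        .iter()
        .filter_map(|&i| raw.get(i).map(|&b| b as char))
        .collect();

    pairs.sort_by(|a, b| a.0.cmp(&b.0)); // keys must be sorted before hashing
    let query = pairs
        .iter()
        .map(|(k, v)| format!("{k}={v}"))
        .collect::<Vec<_>>()
        .join("&");
    let w_rid = md5::compute(format!("{query}{mixin}"));
    format!("{query}&w_rid={w_rid:x}")
}
```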
@@ -497,7 +706,7 @@ impl BiliClient {
|
||||
account: &AccountRow,
|
||||
video_file: &Path,
|
||||
) -> Result<PreuploadResponse, BiliClientError> {
|
||||
let mut headers = self.headers.clone();
|
||||
let mut headers = self.generate_user_agent_header();
|
||||
if let Ok(cookies) = account.cookies.parse() {
|
||||
headers.insert("cookie", cookies);
|
||||
} else {
|
||||
@@ -566,7 +775,7 @@ impl BiliClient {
|
||||
}
|
||||
|
||||
read_total += size;
|
||||
log::debug!("size: {}, total: {}", size, read_total);
|
||||
log::debug!("size: {size}, total: {read_total}");
|
||||
if size > 0 && (read_total as u64) < chunk_size {
|
||||
continue;
|
||||
}
|
||||
@@ -628,7 +837,7 @@ impl BiliClient {
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
log::error!("Upload error: {}", e);
|
||||
log::error!("Upload error: {e}");
|
||||
retry_count += 1;
|
||||
if retry_count < max_retries {
|
||||
tokio::time::sleep(Duration::from_secs(2u64.pow(retry_count as u32)))
|
||||
@@ -640,10 +849,7 @@ impl BiliClient {
|
||||
|
||||
if !success {
|
||||
return Err(BiliClientError::UploadError {
|
||||
err: format!(
|
||||
"Failed to upload chunk {} after {} retries",
|
||||
chunk, max_retries
|
||||
),
|
||||
err: format!("Failed to upload chunk {chunk} after {max_retries} retries"),
|
||||
});
|
||||
}
|
||||
|
||||
@@ -708,9 +914,9 @@ impl BiliClient {
|
||||
) -> Result<profile::Video, BiliClientError> {
|
||||
log::info!("Start Preparing Video: {}", video_file.to_str().unwrap());
|
||||
let preupload = self.preupload_video(account, video_file).await?;
|
||||
log::info!("Preupload Response: {:?}", preupload);
|
||||
log::info!("Preupload Response: {preupload:?}");
|
||||
let metaposted = self.post_video_meta(&preupload, video_file).await?;
|
||||
log::info!("Post Video Meta Response: {:?}", metaposted);
|
||||
log::info!("Post Video Meta Response: {metaposted:?}");
|
||||
let uploaded = self
|
||||
.upload_video(UploadParams {
|
||||
reporter,
|
||||
@@ -719,7 +925,7 @@ impl BiliClient {
|
||||
video_file,
|
||||
})
|
||||
.await?;
|
||||
log::info!("Uploaded: {}", uploaded);
|
||||
log::info!("Uploaded: {uploaded}");
|
||||
self.end_upload(&preupload, &metaposted, uploaded).await?;
|
||||
let filename = Path::new(&metaposted.key)
|
||||
.file_stem()
|
||||
@@ -727,9 +933,9 @@ impl BiliClient {
|
||||
.to_str()
|
||||
.unwrap();
|
||||
Ok(profile::Video {
|
||||
title: "".to_string(),
|
||||
title: filename.to_string(),
|
||||
filename: filename.to_string(),
|
||||
desc: "".to_string(),
|
||||
desc: String::new(),
|
||||
cid: preupload.biz_id,
|
||||
})
|
||||
}
|
||||
@@ -740,7 +946,7 @@ impl BiliClient {
|
||||
profile_template: &Profile,
|
||||
video: &profile::Video,
|
||||
) -> Result<VideoSubmitData, BiliClientError> {
|
||||
let mut headers = self.headers.clone();
|
||||
let mut headers = self.generate_user_agent_header();
|
||||
if let Ok(cookies) = account.cookies.parse() {
|
||||
headers.insert("cookie", cookies);
|
||||
} else {
|
||||
@@ -758,7 +964,7 @@ impl BiliClient {
|
||||
.post(&url)
|
||||
.headers(headers)
|
||||
.header("Content-Type", "application/json; charset=UTF-8")
|
||||
.body(serde_json::ser::to_string(&preprofile).unwrap_or("".to_string()))
|
||||
.body(serde_json::ser::to_string(&preprofile).unwrap_or_default())
|
||||
.send()
|
||||
.await
|
||||
{
|
||||
@@ -770,12 +976,12 @@ impl BiliClient {
|
||||
_ => Err(BiliClientError::InvalidResponse),
|
||||
}
|
||||
} else {
|
||||
log::error!("Parse response failed: {}", json);
|
||||
log::error!("Parse response failed: {json}");
|
||||
Err(BiliClientError::InvalidResponse)
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
log::error!("Send failed {}", e);
|
||||
log::error!("Send failed {e}");
|
||||
Err(BiliClientError::InvalidResponse)
|
||||
}
|
||||
}
|
||||
@@ -790,7 +996,7 @@ impl BiliClient {
|
||||
"https://member.bilibili.com/x/vu/web/cover/up?ts={}",
|
||||
chrono::Local::now().timestamp(),
|
||||
);
|
||||
let mut headers = self.headers.clone();
|
||||
let mut headers = self.generate_user_agent_header();
|
||||
if let Ok(cookies) = account.cookies.parse() {
|
||||
headers.insert("cookie", cookies);
|
||||
} else {
|
||||
@@ -814,12 +1020,12 @@ impl BiliClient {
|
||||
_ => Err(BiliClientError::InvalidResponse),
|
||||
}
|
||||
} else {
|
||||
log::error!("Parse response failed: {}", json);
|
||||
log::error!("Parse response failed: {json}");
|
||||
Err(BiliClientError::InvalidResponse)
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
log::error!("Send failed {}", e);
|
||||
log::error!("Send failed {e}");
|
||||
Err(BiliClientError::InvalidResponse)
|
||||
}
|
||||
}
|
||||
@@ -828,11 +1034,11 @@ impl BiliClient {
|
||||
pub async fn send_danmaku(
|
||||
&self,
|
||||
account: &AccountRow,
|
||||
room_id: u64,
|
||||
room_id: i64,
|
||||
message: &str,
|
||||
) -> Result<(), BiliClientError> {
|
||||
let url = "https://api.live.bilibili.com/msg/send".to_string();
|
||||
let mut headers = self.headers.clone();
|
||||
let mut headers = self.generate_user_agent_header();
|
||||
if let Ok(cookies) = account.cookies.parse() {
|
||||
headers.insert("cookie", cookies);
|
||||
} else {
|
||||
@@ -846,7 +1052,7 @@ impl BiliClient {
|
||||
("fontsize", "25"),
|
||||
("room_type", "0"),
|
||||
("rnd", &format!("{}", chrono::Local::now().timestamp())),
|
||||
("roomid", &format!("{}", room_id)),
|
||||
("roomid", &format!("{room_id}")),
|
||||
("csrf", &account.csrf),
|
||||
("csrf_token", &account.csrf),
|
||||
];
|
||||
@@ -866,7 +1072,7 @@ impl BiliClient {
|
||||
account: &AccountRow,
|
||||
) -> Result<Vec<response::Typelist>, BiliClientError> {
|
||||
let url = "https://member.bilibili.com/x/vupre/web/archive/pre?lang=cn";
|
||||
let mut headers = self.headers.clone();
|
||||
let mut headers = self.generate_user_agent_header();
|
||||
if let Ok(cookies) = account.cookies.parse() {
|
||||
headers.insert("cookie", cookies);
|
||||
} else {
|
||||
|
||||
@@ -1,38 +1,49 @@
use custom_error::custom_error;
use thiserror::Error;

custom_error! {pub BiliClientError
InvalidResponse = "Invalid response",
InitClientError = "Client init error",
InvalidResponseStatus{ status: reqwest::StatusCode } = "Invalid response status: {status}",
InvalidResponseJson{ resp: serde_json::Value } = "Invalid response json: {resp}",
InvalidMessageCode{ code: u64 } = "Invalid message code: {code}",
InvalidValue = "Invalid value",
InvalidUrl = "Invalid url",
InvalidFormat = "Invalid stream format",
InvalidStream = "Invalid stream",
InvalidCookie = "Invalid cookie",
UploadError{err: String} = "Upload error: {err}",
UploadCancelled = "Upload was cancelled by user",
EmptyCache = "Empty cache",
ClientError{err: reqwest::Error} = "Client error: {err}",
IOError{err: std::io::Error} = "IO error: {err}",
SecurityControlError = "Security control error",
}

impl From<reqwest::Error> for BiliClientError {
fn from(e: reqwest::Error) -> Self {
BiliClientError::ClientError { err: e }
}
}

impl From<std::io::Error> for BiliClientError {
fn from(e: std::io::Error) -> Self {
BiliClientError::IOError { err: e }
}
#[derive(Error, Debug)]
pub enum BiliClientError {
#[error("Invalid response")]
InvalidResponse,
#[error("Client init error")]
InitClientError,
#[error("Invalid response status: {status}")]
InvalidResponseStatus { status: reqwest::StatusCode },
#[error("Invalid response json: {resp}")]
InvalidResponseJson { resp: serde_json::Value },
#[error("Invalid message code: {code}")]
InvalidMessageCode { code: u64 },
#[error("Invalid value")]
InvalidValue,
#[error("Invalid url")]
InvalidUrl,
#[error("Invalid stream format")]
InvalidFormat,
#[error("Invalid stream")]
InvalidStream,
#[error("Invalid cookie")]
InvalidCookie,
#[error("Upload error: {err}")]
UploadError { err: String },
#[error("Upload was cancelled by user")]
UploadCancelled,
#[error("Empty cache")]
EmptyCache,
#[error("Client error: {0}")]
ClientError(#[from] reqwest::Error),
#[error("IO error: {0}")]
IOError(#[from] std::io::Error),
#[error("Security control error")]
SecurityControlError,
#[error("API error: {0}")]
ApiError(String),
#[error("Format not found: {0}")]
FormatNotFound(String),
#[error("Codec not found: {0}")]
CodecNotFound(String),
}

impl From<BiliClientError> for String {
fn from(value: BiliClientError) -> Self {
value.to_string()
fn from(err: BiliClientError) -> Self {
err.to_string()
}
}
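Note on the migration above: the `custom_error!` macro is replaced by `thiserror`'s derive, which generates both the `Display` impl (from the `#[error(...)]` strings) and the `From` impls (from `#[from]`) that previously had to be written by hand. A minimal standalone sketch of the same mechanics, not code from this repository:

use thiserror::Error;

#[derive(Error, Debug)]
enum DemoError {
    // #[error(...)] derives Display, so demo_error.to_string() works.
    #[error("invalid response")]
    InvalidResponse,
    // #[from] derives From<std::io::Error>, replacing the hand-written
    // impl From blocks removed in this diff.
    #[error("IO error: {0}")]
    Io(#[from] std::io::Error),
}

fn read_config() -> Result<String, DemoError> {
    // The ? operator uses the derived From impl to convert io::Error.
    Ok(std::fs::read_to_string("config.toml")?)
}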
@@ -38,7 +38,7 @@ impl DanmuStorage {
let parts: Vec<&str> = line.split(':').collect();
let ts: i64 = parts[0].parse().unwrap();
let content = parts[1].to_string();
preload_cache.push(DanmuEntry { ts, content })
preload_cache.push(DanmuEntry { ts, content });
}
let file = OpenOptions::new()
.append(true)
@@ -61,7 +61,7 @@ impl DanmuStorage {
.file
.write()
.await
.write(format!("{}:{}\n", ts, content).as_bytes())
.write(format!("{ts}:{content}\n").as_bytes())
.await;
}
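For context, the danmu cache written here is one `ts:content` line per entry. A standalone sketch of a parser for that format (a hypothetical helper, not part of this diff); note that `splitn(2, ':')` also preserves colons inside the message, whereas the `split(':')` plus `parts[1]` above keeps only the text before a second colon:

fn parse_danmu_line(line: &str) -> Option<(i64, String)> {
    // "1700000000:hello: world" -> (1700000000, "hello: world")
    let mut parts = line.splitn(2, ':');
    let ts: i64 = parts.next()?.parse().ok()?;
    let content = parts.next()?.to_string();
    Some((ts, content))
}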
@@ -1,14 +1,15 @@
pub mod client;
mod response;
mod stream_info;
use super::entry::{EntryStore, Range, TsEntry};
use super::entry::Range;
use super::{
danmu::DanmuEntry, errors::RecorderError, PlatformType, Recorder, RecorderInfo, RoomInfo,
UserInfo,
};
use crate::database::Database;
use crate::progress_manager::Event;
use crate::progress_reporter::EventEmitter;
use crate::ffmpeg::extract_video_metadata;
use crate::progress::progress_manager::Event;
use crate::progress::progress_reporter::EventEmitter;
use crate::recorder_manager::RecorderEvent;
use crate::subtitle_generator::item_to_srt;
use crate::{config::Config, database::account::AccountRow};
@@ -18,6 +19,7 @@ use client::DouyinClientError;
use danmu_stream::danmu_stream::DanmuStream;
use danmu_stream::provider::ProviderType;
use danmu_stream::DanmuMessageType;
use m3u8_rs::{MediaPlaylist, MediaPlaylistType, MediaSegment};
use rand::random;
use std::path::Path;
use std::sync::Arc;
@@ -38,18 +40,6 @@ pub enum LiveStatus {
Offline,
}

impl From<std::io::Error> for RecorderError {
fn from(err: std::io::Error) -> Self {
RecorderError::IoError { err }
}
}

impl From<DouyinClientError> for RecorderError {
fn from(err: DouyinClientError) -> Self {
RecorderError::DouyinClientError { err }
}
}

#[derive(Clone)]
pub struct DouyinRecorder {
#[cfg(not(feature = "headless"))]
@@ -58,25 +48,76 @@ pub struct DouyinRecorder {
client: client::DouyinClient,
db: Arc<Database>,
account: AccountRow,
room_id: u64,
room_id: i64,
sec_user_id: String,
room_info: Arc<RwLock<Option<client::DouyinBasicRoomInfo>>>,
stream_url: Arc<RwLock<Option<String>>>,
entry_store: Arc<RwLock<Option<EntryStore>>>,
danmu_store: Arc<RwLock<Option<DanmuStorage>>>,
live_id: Arc<RwLock<String>>,
danmu_room_id: Arc<RwLock<String>>,
platform_live_id: Arc<RwLock<String>>,
live_status: Arc<RwLock<LiveStatus>>,
is_recording: Arc<RwLock<bool>>,
running: Arc<RwLock<bool>>,
last_update: Arc<RwLock<i64>>,
config: Arc<RwLock<Config>>,
live_end_channel: broadcast::Sender<RecorderEvent>,
event_channel: broadcast::Sender<RecorderEvent>,
enabled: Arc<RwLock<bool>>,

danmu_stream_task: Arc<Mutex<Option<JoinHandle<()>>>>,
danmu_task: Arc<Mutex<Option<JoinHandle<()>>>>,
record_task: Arc<Mutex<Option<JoinHandle<()>>>>,

playlist: Arc<RwLock<MediaPlaylist>>,
last_sequence: Arc<RwLock<u64>>,
total_duration: Arc<RwLock<f64>>,
total_size: Arc<RwLock<u64>>,
}

fn get_best_stream_url(room_info: &client::DouyinBasicRoomInfo) -> Option<String> {
let stream_data = room_info.stream_data.clone();
// parse stream_data into stream_info
let stream_info = serde_json::from_str::<stream_info::StreamInfo>(&stream_data);
if let Ok(stream_info) = stream_info {
// find the best stream url
if stream_info.data.origin.main.hls.is_empty() {
log::error!("No stream url found in stream_data: {stream_data}");
return None;
}

Some(stream_info.data.origin.main.hls)
} else {
let err = stream_info.unwrap_err();
log::error!("Failed to parse stream data: {err} {stream_data}");
None
}
}

fn parse_stream_url(stream_url: &str) -> (String, String) {
// Parse stream URL to extract base URL and query parameters
// Example: http://7167739a741646b4651b6949b2f3eb8e.livehwc3.cn/pull-hls-l26.douyincdn.com/third/stream-693342996808860134_or4.m3u8?sub_m3u8=true&user_session_id=16090eb45ab8a2f042f7c46563936187&major_anchor_level=common&edge_slice=true&expire=67d944ec&sign=47b95cc6e8de20d82f3d404412fa8406

let base_url = stream_url
.rfind('/')
.map_or(stream_url, |i| &stream_url[..=i])
.to_string();

let query_params = stream_url
.find('?')
.map_or("", |i| &stream_url[i..])
.to_string();

(base_url, query_params)
}
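A quick standalone check of the split performed by the new free function, using a shortened placeholder URL (cdn.example is not a real host):

fn main() {
    let (base, query) = parse_stream_url("http://cdn.example/pull/stream-1_or4.m3u8?expire=67d944ec&sign=abc");
    assert_eq!(base, "http://cdn.example/pull/");    // up to and including the last '/'
    assert_eq!(query, "?expire=67d944ec&sign=abc");  // '?' and everything after it
}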

fn default_m3u8_playlist() -> MediaPlaylist {
MediaPlaylist {
version: Some(6),
target_duration: 4.0,
end_list: true,
playlist_type: Some(MediaPlaylistType::Vod),
segments: Vec::new(),
..Default::default()
}
}
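This default playlist round-trips through `m3u8_rs` the same way `save_playlist` and `load_playlist` below do. A minimal sketch of the serialization side (the helper name is hypothetical, and this assumes the m3u8_rs version the diff builds against):

fn playlist_to_bytes() -> Vec<u8> {
    let playlist = default_m3u8_playlist();
    let mut bytes: Vec<u8> = Vec::new();
    // write_to renders the #EXTM3U text into any std::io::Write.
    playlist.write_to(&mut bytes).unwrap();
    bytes
}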

impl DouyinRecorder {
@@ -84,7 +125,7 @@ impl DouyinRecorder {
pub async fn new(
#[cfg(not(feature = "headless"))] app_handle: AppHandle,
emitter: EventEmitter,
room_id: u64,
room_id: i64,
sec_user_id: &str,
config: Arc<RwLock<Config>>,
account: &AccountRow,
@@ -92,7 +133,7 @@ impl DouyinRecorder {
enabled: bool,
channel: broadcast::Sender<RecorderEvent>,
) -> Result<Self, super::errors::RecorderError> {
let client = client::DouyinClient::new(&config.read().await.user_agent, account);
let client = client::DouyinClient::new(account);
let room_info = client.get_room_info(room_id, sec_user_id).await?;
let mut live_status = LiveStatus::Offline;
if room_info.status == 0 {
@@ -108,8 +149,7 @@ impl DouyinRecorder {
room_id,
sec_user_id: sec_user_id.to_string(),
live_id: Arc::new(RwLock::new(String::new())),
danmu_room_id: Arc::new(RwLock::new(String::new())),
entry_store: Arc::new(RwLock::new(None)),
platform_live_id: Arc::new(RwLock::new(String::new())),
danmu_store: Arc::new(RwLock::new(None)),
client,
room_info: Arc::new(RwLock::new(Some(room_info))),
@@ -120,11 +160,16 @@ impl DouyinRecorder {
enabled: Arc::new(RwLock::new(enabled)),
last_update: Arc::new(RwLock::new(Utc::now().timestamp())),
config,
live_end_channel: channel,
event_channel: channel,

danmu_stream_task: Arc::new(Mutex::new(None)),
danmu_task: Arc::new(Mutex::new(None)),
record_task: Arc::new(Mutex::new(None)),

playlist: Arc::new(RwLock::new(default_m3u8_playlist())),
last_sequence: Arc::new(RwLock::new(0)),
total_duration: Arc::new(RwLock::new(0.0)),
total_size: Arc::new(RwLock::new(0)),
})
}

@@ -168,6 +213,10 @@ impl DouyinRecorder {
))
.show()
.unwrap();

let _ = self.event_channel.send(RecorderEvent::LiveStart {
recorder: self.info().await,
});
} else {
#[cfg(not(feature = "headless"))]
self.app_handle
@@ -180,11 +229,12 @@ impl DouyinRecorder {
))
.show()
.unwrap();
let _ = self.live_end_channel.send(RecorderEvent::LiveEnd {
let _ = self.event_channel.send(RecorderEvent::LiveEnd {
platform: PlatformType::Douyin,
room_id: self.room_id,
live_id: self.live_id.read().await.clone(),
recorder: self.info().await,
});
*self.live_id.write().await = String::new();
}

self.reset().await;
@@ -212,20 +262,24 @@ impl DouyinRecorder {
if !info.hls_url.is_empty() {
// Only set stream URL, don't create record yet
// Record will be created when first ts download succeeds
let new_stream_url = self.get_best_stream_url(&info).await;
let new_stream_url = get_best_stream_url(&info);
if new_stream_url.is_none() {
log::error!("No stream url found in room_info: {:#?}", info);
log::error!("No stream url found in room_info: {info:#?}");
return false;
}

log::info!("New douyin stream URL: {}", new_stream_url.clone().unwrap());
*self.stream_url.write().await = Some(new_stream_url.unwrap());
*self.danmu_room_id.write().await = info.room_id_str.clone();
(*self.platform_live_id.write().await).clone_from(&info.room_id_str);
}

true
}
Err(e) => {
if let DouyinClientError::H5NotLive(e) = e {
log::debug!("[{}]Live maybe not started: {}", self.room_id, e);
return false;
}
log::error!("[{}]Update room status failed: {}", self.room_id, e);
*self.live_status.read().await == LiveStatus::Live
}
@@ -235,17 +289,17 @@ impl DouyinRecorder {
async fn danmu(&self) -> Result<(), super::errors::RecorderError> {
let cookies = self.account.cookies.clone();
let danmu_room_id = self
.danmu_room_id
.platform_live_id
.read()
.await
.clone()
.parse::<u64>()
.parse::<i64>()
.unwrap_or(0);
let danmu_stream = DanmuStream::new(ProviderType::Douyin, &cookies, danmu_room_id).await;
if danmu_stream.is_err() {
let err = danmu_stream.err().unwrap();
log::error!("Failed to create danmu stream: {}", err);
return Err(super::errors::RecorderError::DanmuStreamError { err });
log::error!("Failed to create danmu stream: {err}");
return Err(super::errors::RecorderError::DanmuStreamError(err));
}
let danmu_stream = danmu_stream.unwrap();

@@ -271,21 +325,26 @@ impl DouyinRecorder {
}
} else {
log::error!("Failed to receive danmu message");
return Err(super::errors::RecorderError::DanmuStreamError {
err: danmu_stream::DanmuStreamError::WebsocketError {
return Err(super::errors::RecorderError::DanmuStreamError(
danmu_stream::DanmuStreamError::WebsocketError {
err: "Failed to receive danmu message".to_string(),
},
});
));
}
}
}

async fn reset(&self) {
*self.entry_store.write().await = None;
*self.live_id.write().await = String::new();
*self.danmu_room_id.write().await = String::new();
let live_id = self.live_id.read().await.clone();
if !live_id.is_empty() {
self.save_playlist().await;
}
*self.playlist.write().await = default_m3u8_playlist();
*self.platform_live_id.write().await = String::new();
*self.last_update.write().await = Utc::now().timestamp();
*self.stream_url.write().await = None;
*self.total_duration.write().await = 0.0;
*self.total_size.write().await = 0;
}

async fn get_work_dir(&self, live_id: &str) -> String {
@@ -297,42 +356,35 @@ impl DouyinRecorder {
)
}

async fn get_best_stream_url(&self, room_info: &client::DouyinBasicRoomInfo) -> Option<String> {
let stream_data = room_info.stream_data.clone();
// parse stream_data into stream_info
let stream_info = serde_json::from_str::<stream_info::StreamInfo>(&stream_data);
if let Ok(stream_info) = stream_info {
// find the best stream url
if stream_info.data.origin.main.hls.is_empty() {
log::error!("No stream url found in stream_data: {}", stream_data);
return None;
}

Some(stream_info.data.origin.main.hls)
} else {
let err = stream_info.unwrap_err();
log::error!("Failed to parse stream data: {} {}", err, stream_data);
None
}
async fn load_playlist(
&self,
live_id: &str,
) -> Result<MediaPlaylist, super::errors::RecorderError> {
let playlist_file_path =
format!("{}/{}", self.get_work_dir(live_id).await, "playlist.m3u8");
let playlist_content = tokio::fs::read(&playlist_file_path).await.unwrap();
let playlist = m3u8_rs::parse_media_playlist(&playlist_content).unwrap().1;
Ok(playlist)
}

fn parse_stream_url(&self, stream_url: &str) -> (String, String) {
// Parse stream URL to extract base URL and query parameters
// Example: http://7167739a741646b4651b6949b2f3eb8e.livehwc3.cn/pull-hls-l26.douyincdn.com/third/stream-693342996808860134_or4.m3u8?sub_m3u8=true&user_session_id=16090eb45ab8a2f042f7c46563936187&major_anchor_level=common&edge_slice=true&expire=67d944ec&sign=47b95cc6e8de20d82f3d404412fa8406
async fn save_playlist(&self) {
let playlist = self.playlist.read().await.clone();
let mut bytes: Vec<u8> = Vec::new();
playlist.write_to(&mut bytes).unwrap();
let playlist_file_path = format!(
"{}/{}",
self.get_work_dir(self.live_id.read().await.as_str()).await,
"playlist.m3u8"
);
tokio::fs::write(&playlist_file_path, bytes).await.unwrap();
}

let base_url = stream_url
.rfind('/')
.map(|i| &stream_url[..=i])
.unwrap_or(stream_url)
.to_string();

let query_params = stream_url
.find('?')
.map(|i| &stream_url[i..])
.unwrap_or("")
.to_string();

(base_url, query_params)
async fn add_segment(&self, sequence: u64, segment: MediaSegment) {
self.playlist.write().await.segments.push(segment);
let current_last_sequence = *self.last_sequence.read().await;
let new_last_sequence = std::cmp::max(current_last_sequence, sequence);
*self.last_sequence.write().await = new_last_sequence;
self.save_playlist().await;
}
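`add_segment` replaces the old `EntryStore` bookkeeping: append the segment, advance the high-water-mark sequence, persist. A lock-free standalone sketch of the same bookkeeping with stand-in types (the `Segment` struct here is hypothetical, not the m3u8_rs type):

#[derive(Clone, Default)]
struct Segment {
    uri: String,
    duration: f32,
}

#[derive(Default)]
struct PlaylistState {
    segments: Vec<Segment>,
    last_sequence: u64,
    total_duration: f64,
    total_size: u64,
}

impl PlaylistState {
    fn add_segment(&mut self, sequence: u64, segment: Segment, size: u64) {
        // Counters mirror total_duration/total_size updated in update_entries.
        self.total_duration += f64::from(segment.duration);
        self.total_size += size;
        self.segments.push(segment);
        // max() keeps the sequence monotonic even if segments arrive out of order.
        self.last_sequence = self.last_sequence.max(sequence);
    }
}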

async fn update_entries(&self) -> Result<u128, RecorderError> {
@@ -358,7 +410,7 @@ impl DouyinRecorder {
stream_url = updated_stream_url;

let mut new_segment_fetched = false;
let mut is_first_segment = self.entry_store.read().await.is_none();
let mut is_first_segment = self.playlist.read().await.segments.is_empty();
let work_dir;

// If this is the first segment, prepare but don't create directories yet
@@ -371,25 +423,15 @@ impl DouyinRecorder {
work_dir = self.get_work_dir(self.live_id.read().await.as_str()).await;
}

let last_sequence = if is_first_segment {
0
} else {
self.entry_store
.read()
.await
.as_ref()
.unwrap()
.last_sequence
};
self.playlist.write().await.target_duration = playlist.target_duration;

for segment in playlist.segments.iter() {
let formated_ts_name = segment.uri.clone();
let sequence = extract_sequence_from(&formated_ts_name);
let last_sequence = *self.last_sequence.read().await;

for segment in &playlist.segments {
let formatted_ts_name = segment.uri.clone();
let sequence = extract_sequence_from(&formatted_ts_name);
if sequence.is_none() {
log::error!(
"No timestamp extracted from douyin ts name: {}",
formated_ts_name
);
log::error!("No timestamp extracted from douyin ts name: {formatted_ts_name}");
continue;
}

@@ -405,7 +447,7 @@ impl DouyinRecorder {
uri.clone()
} else {
// Parse the stream URL to extract base URL and query parameters
let (base_url, query_params) = self.parse_stream_url(&stream_url);
let (base_url, query_params) = parse_stream_url(&stream_url);

// Check if the segment URI already has query parameters
if uri.contains('?') {
@@ -413,7 +455,7 @@ impl DouyinRecorder {
format!("{}{}&{}", base_url, uri, &query_params[1..]) // Remove leading ? from query_params
} else {
// If segment URI has no query params, append m3u8 query params with ?
format!("{}{}{}", base_url, uri, query_params)
format!("{base_url}{uri}{query_params}")
}
};

@@ -424,14 +466,14 @@ impl DouyinRecorder {
let mut work_dir_created = false;

while retry_count < max_retries && !download_success {
let file_name = format!("{}.ts", sequence);
let file_path = format!("{}/{}", work_dir, file_name);
let file_name = format!("{sequence}.ts");
let file_path = format!("{work_dir}/{file_name}");

// If this is the first segment, create work directory before first download attempt
if is_first_segment && !work_dir_created {
// Create work directory only when we're about to download
if let Err(e) = tokio::fs::create_dir_all(&work_dir).await {
log::error!("Failed to create work directory: {}", e);
log::error!("Failed to create work directory: {e}");
return Err(e.into());
}
work_dir_created = true;
@@ -440,7 +482,7 @@ impl DouyinRecorder {
match self.client.download_ts(&ts_url, &file_path).await {
Ok(size) => {
if size == 0 {
log::error!("Download segment failed (empty response): {}", ts_url);
log::error!("Download segment failed (empty response): {ts_url}");
retry_count += 1;
if retry_count < max_retries {
tokio::time::sleep(Duration::from_millis(500)).await;
@@ -454,30 +496,35 @@ impl DouyinRecorder {
// Create database record
let room_info = room_info.as_ref().unwrap();
let cover_url = room_info.cover.clone();
let cover = if let Some(url) = cover_url {
Some(self.client.get_cover_base64(&url).await.unwrap_or_default())
} else {
None
};
let room_cover_path = Path::new(PlatformType::Douyin.as_str())
.join(self.room_id.to_string())
.join("cover.jpg");
if let Some(url) = cover_url {
let full_room_cover_path =
Path::new(&self.config.read().await.cache)
.join(&room_cover_path);
let _ =
self.client.download_file(&url, &full_room_cover_path).await;
}

if let Err(e) = self
.db
.add_record(
PlatformType::Douyin,
self.platform_live_id.read().await.as_str(),
self.live_id.read().await.as_str(),
self.room_id,
&room_info.room_title,
cover,
None,
Some(room_cover_path.to_str().unwrap().to_string()),
)
.await
{
log::error!("Failed to add record: {}", e);
log::error!("Failed to add record: {e}");
}

// Setup entry store
let entry_store = EntryStore::new(&work_dir).await;
*self.entry_store.write().await = Some(entry_store);
let _ = self.event_channel.send(RecorderEvent::RecordStart {
recorder: self.info().await,
});

// Setup danmu store
let danmu_file_path = format!("{}{}", work_dir, "danmu.txt");
@@ -496,29 +543,25 @@ impl DouyinRecorder {
let live_id = self.live_id.read().await.clone();
let self_clone = self.clone();
*self.danmu_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start fetching danmu for live {}", live_id);
log::info!("Start fetching danmu for live {live_id}");
let _ = self_clone.danmu().await;
}));

is_first_segment = false;
}

let ts_entry = TsEntry {
url: file_name,
sequence,
length: segment.duration as f64,
size,
ts: Utc::now().timestamp_millis(),
is_header: false,
};
let mut pl = segment.clone();
pl.uri = file_name;

self.entry_store
.write()
.await
.as_mut()
.unwrap()
.add_entry(ts_entry)
.await;
let metadata = extract_video_metadata(Path::new(&file_path)).await;
if let Ok(metadata) = metadata {
pl.duration = metadata.duration as f32;
}

*self.total_duration.write().await += segment.duration as f64;
*self.total_size.write().await += size;

self.add_segment(sequence, pl).await;

new_segment_fetched = true;
download_success = true;
@@ -540,8 +583,7 @@ impl DouyinRecorder {
// If all retries failed, check if it's a 400 error
if e.to_string().contains("400") {
log::error!(
"HTTP 400 error for segment, stream URL may be expired: {}",
ts_url
"HTTP 400 error for segment, stream URL may be expired: {ts_url}"
);
*self.stream_url.write().await = None;

@@ -550,9 +592,7 @@ impl DouyinRecorder {
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await
{
log::warn!(
"Failed to cleanup empty work directory {}: {}",
work_dir,
cleanup_err
"Failed to cleanup empty work directory {work_dir}: {cleanup_err}"
);
}
}
@@ -564,9 +604,7 @@ impl DouyinRecorder {
if is_first_segment && work_dir_created {
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await {
log::warn!(
"Failed to cleanup empty work directory {}: {}",
work_dir,
cleanup_err
"Failed to cleanup empty work directory {work_dir}: {cleanup_err}"
);
}
}
@@ -577,24 +615,16 @@ impl DouyinRecorder {
}

if !download_success {
log::error!(
"Failed to download segment after {} retries: {}",
max_retries,
ts_url
);
log::error!("Failed to download segment after {max_retries} retries: {ts_url}");

// Clean up empty directory if first segment failed after all retries
if is_first_segment && work_dir_created {
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await {
log::warn!(
"Failed to cleanup empty work directory {}: {}",
work_dir,
cleanup_err
"Failed to cleanup empty work directory {work_dir}: {cleanup_err}"
);
}
}

continue;
}
}

@@ -619,22 +649,16 @@ impl DouyinRecorder {
.db
.update_record(
self.live_id.read().await.as_str(),
self.entry_store
.read()
.await
.as_ref()
.unwrap()
.total_duration() as i64,
self.entry_store.read().await.as_ref().unwrap().total_size(),
*self.total_duration.read().await as i64,
*self.total_size.read().await,
)
.await
{
log::error!("Failed to update record: {}", e);
log::error!("Failed to update record: {e}");
}
}

async fn generate_m3u8(&self, live_id: &str, start: i64, end: i64) -> String {
log::debug!("Generate m3u8 for {live_id}:{start}:{end}");
async fn generate_m3u8(&self, live_id: &str, start: i64, end: i64) -> MediaPlaylist {
let range = if start != 0 || end != 0 {
Some(Range {
x: start as f32,
@@ -646,17 +670,48 @@ impl DouyinRecorder {

// if requires a range, we need to filter entries and only use entries in the range, so m3u8 type is VOD.
if live_id == *self.live_id.read().await {
self.entry_store
.read()
.await
.as_ref()
.unwrap()
.manifest(range.is_some(), false, range)
let mut playlist = self.playlist.read().await.clone();
if let Some(range) = range {
let mut duration = 0.0;
let mut segments = Vec::new();
for s in playlist.segments {
if range.is_in(duration) || range.is_in(duration + s.duration) {
segments.push(s.clone());
}
duration += s.duration;
}
playlist.segments = segments;

playlist.end_list = true;
playlist.playlist_type = Some(MediaPlaylistType::Vod);
} else {
playlist.end_list = false;
playlist.playlist_type = Some(MediaPlaylistType::Event);
}

playlist
} else {
let work_dir = self.get_work_dir(live_id).await;
EntryStore::new(&work_dir)
.await
.manifest(true, false, range)
let playlist = self.load_playlist(live_id).await;
if playlist.is_err() {
return MediaPlaylist::default();
}
let mut playlist = playlist.unwrap();
playlist.playlist_type = Some(MediaPlaylistType::Vod);
playlist.end_list = true;

if let Some(range) = range {
let mut duration = 0.0;
let mut segments = Vec::new();
for s in playlist.segments {
if range.is_in(duration) || range.is_in(duration + s.duration) {
segments.push(s.clone());
}
duration += s.duration;
}
playlist.segments = segments;
}

playlist
}
}
}
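Both branches above select segments whose start or end falls inside the requested window. A standalone sketch of that selection, with a hypothetical `Range` mirroring the assumed semantics of `super::entry::Range::is_in`:

#[derive(Clone, Copy)]
struct Range {
    x: f32,
    y: f32,
}

impl Range {
    fn is_in(&self, t: f32) -> bool {
        t >= self.x && t <= self.y
    }
}

fn filter_segments(durations: &[f32], range: Range) -> Vec<usize> {
    let (mut kept, mut t) = (Vec::new(), 0.0f32);
    for (i, d) in durations.iter().enumerate() {
        // Keep a segment if either edge of it lies inside the window.
        if range.is_in(t) || range.is_in(t + d) {
            kept.push(i);
        }
        t += d;
    }
    kept
}

fn main() {
    // Four 4-second segments; a 5s..10s window keeps segments 1 and 2.
    assert_eq!(filter_segments(&[4.0; 4], Range { x: 5.0, y: 10.0 }), vec![1, 2]);
}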
@@ -683,8 +738,10 @@ impl Recorder for DouyinRecorder {
match self_clone.update_entries().await {
Ok(ms) => {
if ms < 1000 {
tokio::time::sleep(Duration::from_millis(1000 - ms as u64))
.await;
tokio::time::sleep(Duration::from_millis(
(1000 - ms).try_into().unwrap(),
))
.await;
}
if ms >= 3000 {
log::warn!(
@@ -698,7 +755,7 @@ impl Recorder for DouyinRecorder {
}
Err(e) => {
log::error!("[{}]Update entries error: {}", self_clone.room_id, e);
if let RecorderError::DouyinClientError { err: _e } = e {
if let RecorderError::DouyinClientError(_) = e {
connection_fail_count =
std::cmp::min(5, connection_fail_count + 1);
}
@@ -706,7 +763,13 @@ impl Recorder for DouyinRecorder {
}
}
}
if *self_clone.is_recording.read().await {
let _ = self_clone.event_channel.send(RecorderEvent::RecordEnd {
recorder: self_clone.info().await,
});
}
*self_clone.is_recording.write().await = false;
self_clone.reset().await;
// Check status again after some seconds
let secs = random::<u64>() % 5;
tokio::time::sleep(Duration::from_secs(
@@ -727,33 +790,21 @@ impl Recorder for DouyinRecorder {
*self.running.write().await = false;
// stop 3 tasks
if let Some(danmu_task) = self.danmu_task.lock().await.as_mut() {
let _ = danmu_task.abort();
let () = danmu_task.abort();
}
if let Some(danmu_stream_task) = self.danmu_stream_task.lock().await.as_mut() {
let _ = danmu_stream_task.abort();
let () = danmu_stream_task.abort();
}
if let Some(record_task) = self.record_task.lock().await.as_mut() {
let _ = record_task.abort();
let () = record_task.abort();
}
log::info!("Recorder for room {} quit.", self.room_id);
}

async fn m3u8_content(&self, live_id: &str, start: i64, end: i64) -> String {
async fn playlist(&self, live_id: &str, start: i64, end: i64) -> MediaPlaylist {
self.generate_m3u8(live_id, start, end).await
}

async fn master_m3u8(&self, live_id: &str, start: i64, end: i64) -> String {
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
m3u8_content += format!(
"#EXT-X-STREAM-INF:BANDWIDTH=1280000,RESOLUTION=1920x1080,CODECS=\"avc1.64001F,mp4a.40.2\",DANMU={}\n",
self.first_segment_ts(live_id).await / 1000
)
.as_str();
m3u8_content += &format!("playlist.m3u8?start={}&end={}\n", start, end);
m3u8_content
}

async fn get_archive_subtitle(
&self,
live_id: &str,
@@ -784,16 +835,18 @@ impl Recorder for DouyinRecorder {
// first generate a tmp clip file
// generate a tmp m3u8 index file
let m3u8_index_file_path = format!("{}/{}", work_dir, "tmp.m3u8");
let m3u8_content = self.m3u8_content(live_id, 0, 0).await;
let playlist = self.playlist(live_id, 0, 0).await;
let mut v: Vec<u8> = Vec::new();
playlist.write_to(&mut v).unwrap();
let m3u8_content: &str = std::str::from_utf8(&v).unwrap();
tokio::fs::write(&m3u8_index_file_path, m3u8_content).await?;
// generate a tmp clip file
let clip_file_path = format!("{}/{}", work_dir, "tmp.mp4");
if let Err(e) = crate::ffmpeg::clip_from_m3u8(
None::<&crate::progress_reporter::ProgressReporter>,
if let Err(e) = crate::ffmpeg::playlist::playlist_to_video(
None::<&crate::progress::progress_reporter::ProgressReporter>,
Path::new(&m3u8_index_file_path),
Path::new(&clip_file_path),
None,
false,
)
.await
{
@@ -825,8 +878,7 @@ impl Recorder for DouyinRecorder {
.subtitle_content
.iter()
.map(item_to_srt)
.collect::<Vec<String>>()
.join("");
.collect::<String>();
subtitle_file.write_all(subtitle_content.as_bytes()).await?;

// remove tmp file
@@ -835,18 +887,34 @@ impl Recorder for DouyinRecorder {
Ok(subtitle_content)
}

async fn first_segment_ts(&self, live_id: &str) -> i64 {
if *self.live_id.read().await == live_id {
let entry_store = self.entry_store.read().await;
if entry_store.is_some() {
entry_store.as_ref().unwrap().first_ts().unwrap_or(0)
} else {
0
}
} else {
let work_dir = self.get_work_dir(live_id).await;
EntryStore::new(&work_dir).await.first_ts().unwrap_or(0)
async fn get_related_playlists(&self, parent_id: &str) -> Vec<(String, String)> {
let playlists = self
.db
.get_archives_by_parent_id(self.room_id, parent_id)
.await;
if playlists.is_err() {
return Vec::new();
}
let ids: Vec<(String, String)> = playlists
.unwrap()
.iter()
.map(|a| (a.title.clone(), a.live_id.clone()))
.collect();
let playlists = ids
.iter()
.map(async |a| {
(
a.0.clone(),
format!(
"{}/{}",
self.get_work_dir(a.1.as_str()).await,
"playlist.m3u8"
),
)
})
.collect::<Vec<_>>();
let playlists = futures::future::join_all(playlists).await;
return playlists;
}

async fn info(&self) -> RecorderInfo {
@@ -880,11 +948,7 @@ impl Recorder for DouyinRecorder {
.map(|info| info.user_avatar.clone())
.unwrap_or_default(),
},
total_length: if let Some(store) = self.entry_store.read().await.as_ref() {
store.total_duration()
} else {
0.0
},
total_length: *self.total_duration.read().await,
current_live_id: self.live_id.read().await.clone(),
live_status: *self.live_status.read().await == LiveStatus::Live,
is_recording: *self.is_recording.read().await,
@@ -897,11 +961,7 @@ impl Recorder for DouyinRecorder {
Ok(if live_id == *self.live_id.read().await {
// just return current cache content
match self.danmu_store.read().await.as_ref() {
Some(storage) => {
storage
.get_entries(self.first_segment_ts(live_id).await)
.await
}
Some(storage) => storage.get_entries(0).await,
None => Vec::new(),
}
} else {
@@ -913,15 +973,13 @@ impl Recorder for DouyinRecorder {
live_id,
"danmu.txt"
);
log::debug!("loading danmu cache from {}", cache_file_path);
log::debug!("loading danmu cache from {cache_file_path}");
let storage = DanmuStorage::new(&cache_file_path).await;
if storage.is_none() {
return Ok(Vec::new());
}
let storage = storage.unwrap();
storage
.get_entries(self.first_segment_ts(live_id).await)
.await
storage.get_entries(0).await
})
}

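The new `get_related_playlists` above maps each archive row to an async path lookup and awaits them together. A minimal standalone version of that join_all pattern, with in-memory data standing in for the database:

#[tokio::main]
async fn main() {
    let ids = vec![
        ("Live A".to_string(), "a1".to_string()),
        ("Live B".to_string(), "b2".to_string()),
    ];
    let futs = ids.iter().map(|(title, live_id)| async move {
        // Stand-in for get_work_dir(live_id).await + "/playlist.m3u8".
        (title.clone(), format!("/cache/douyin/{live_id}/playlist.m3u8"))
    });
    let playlists: Vec<(String, String)> = futures::future::join_all(futs).await;
    assert_eq!(playlists[1].1, "/cache/douyin/b2/playlist.m3u8");
}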
@@ -1,38 +1,28 @@
use crate::database::account::AccountRow;
use base64::Engine;
use crate::{database::account::AccountRow, recorder::user_agent_generator};
use deno_core::JsRuntime;
use deno_core::RuntimeOptions;
use m3u8_rs::{MediaPlaylist, Playlist};
use reqwest::{Client, Error as ReqwestError};
use reqwest::Client;
use uuid::Uuid;

use super::response::DouyinRoomInfoResponse;
use std::fmt;
use std::path::Path;
use thiserror::Error;

#[derive(Debug)]
#[derive(Error, Debug)]
pub enum DouyinClientError {
#[error("Network error: {0}")]
Network(String),
Io(std::io::Error),
#[error("IO error: {0}")]
Io(#[from] std::io::Error),
#[error("Playlist error: {0}")]
Playlist(String),
}

impl fmt::Display for DouyinClientError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
Self::Network(e) => write!(f, "Network error: {}", e),
Self::Io(e) => write!(f, "IO error: {}", e),
Self::Playlist(e) => write!(f, "Playlist error: {}", e),
}
}
}

impl From<ReqwestError> for DouyinClientError {
fn from(err: ReqwestError) -> Self {
DouyinClientError::Network(err.to_string())
}
}

impl From<std::io::Error> for DouyinClientError {
fn from(err: std::io::Error) -> Self {
DouyinClientError::Io(err)
}
#[error("H5 live not started: {0}")]
H5NotLive(String),
#[error("JS runtime error: {0}")]
JsRuntimeError(String),
#[error("Reqwest error: {0}")]
ReqwestError(#[from] reqwest::Error),
}

#[derive(Debug, Clone)]
@@ -55,38 +45,96 @@ pub struct DouyinClient {
account: AccountRow,
}

fn setup_js_runtime() -> Result<JsRuntime, DouyinClientError> {
// Create a new V8 runtime
let mut runtime = JsRuntime::new(RuntimeOptions::default());

// Add global CryptoJS object
let crypto_js = include_str!("js/a_bogus.js");
runtime
.execute_script(
"<a_bogus.js>",
deno_core::FastString::from_static(crypto_js),
)
.map_err(|e| {
DouyinClientError::JsRuntimeError(format!("Failed to execute crypto-js: {e}"))
})?;
Ok(runtime)
}

impl DouyinClient {
pub fn new(user_agent: &str, account: &AccountRow) -> Self {
let client = Client::builder().user_agent(user_agent).build().unwrap();
pub fn new(account: &AccountRow) -> Self {
let client = Client::builder().build().unwrap();
Self {
client,
account: account.clone(),
}
}

async fn generate_a_bogus(
&self,
params: &str,
user_agent: &str,
) -> Result<String, DouyinClientError> {
let mut runtime = setup_js_runtime()?;
// Call the get_wss_url function
let sign_call = format!("generate_a_bogus(\"{params}\", \"{user_agent}\")");
let result = runtime
.execute_script("<sign_call>", deno_core::FastString::from(sign_call))
.map_err(|e| {
DouyinClientError::JsRuntimeError(format!("Failed to execute JavaScript: {e}"))
})?;

// Get the result from the V8 runtime
let mut scope = runtime.handle_scope();
let local = deno_core::v8::Local::new(&mut scope, result);
let url = local
.to_string(&mut scope)
.unwrap()
.to_rust_string_lossy(&mut scope);
Ok(url)
}
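`generate_a_bogus` signs request parameters by evaluating the bundled a_bogus.js inside an embedded V8 runtime. The same execute-then-read-back round-trip on a trivial inline script, assuming the deno_core version this diff builds against:

use deno_core::{JsRuntime, RuntimeOptions};

fn main() {
    let mut runtime = JsRuntime::new(RuntimeOptions::default());
    // execute_script returns a v8::Global<v8::Value> handle to the result.
    let result = runtime
        .execute_script("<demo>", deno_core::FastString::from(String::from("'a'.repeat(3)")))
        .expect("script failed");
    // Materialize the handle in a scope and convert it to a Rust String.
    let mut scope = runtime.handle_scope();
    let local = deno_core::v8::Local::new(&mut scope, result);
    let value = local
        .to_string(&mut scope)
        .unwrap()
        .to_rust_string_lossy(&mut scope);
    assert_eq!(value, "aaa");
}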

async fn generate_ms_token(&self) -> String {
// generate a random 32 characters uuid string
let uuid = Uuid::new_v4();
uuid.to_string()
}

pub fn generate_user_agent_header(&self) -> reqwest::header::HeaderMap {
let user_agent = user_agent_generator::UserAgentGenerator::new().generate();
let mut headers = reqwest::header::HeaderMap::new();
headers.insert("user-agent", user_agent.parse().unwrap());
headers
}

pub async fn get_room_info(
&self,
room_id: u64,
room_id: i64,
sec_user_id: &str,
) -> Result<DouyinBasicRoomInfo, DouyinClientError> {
let mut headers = self.generate_user_agent_header();
headers.insert("Referer", "https://live.douyin.com/".parse().unwrap());
headers.insert("Cookie", self.account.cookies.clone().parse().unwrap());
let ms_token = self.generate_ms_token().await;
let user_agent = headers.get("user-agent").unwrap().to_str().unwrap();
let params = format!(
"aid=6383&app_name=douyin_web&live_id=1&device_platform=web&language=zh-CN&enter_from=web_live&cookie_enabled=true&screen_width=1920&screen_height=1080&browser_language=zh-CN&browser_platform=MacIntel&browser_name=Chrome&browser_version=122.0.0.0&web_rid={room_id}&ms_token={ms_token}");
let a_bogus = self.generate_a_bogus(&params, user_agent).await?;
// log::debug!("params: {params}");
// log::debug!("user_agent: {user_agent}");
// log::debug!("a_bogus: {a_bogus}");
let url = format!(
"https://live.douyin.com/webcast/room/web/enter/?aid=6383&app_name=douyin_web&live_id=1&device_platform=web&language=zh-CN&enter_from=web_live&a_bogus=0&cookie_enabled=true&screen_width=1920&screen_height=1080&browser_language=zh-CN&browser_platform=MacIntel&browser_name=Chrome&browser_version=122.0.0.0&web_rid={}",
room_id
"https://live.douyin.com/webcast/room/web/enter/?aid=6383&app_name=douyin_web&live_id=1&device_platform=web&language=zh-CN&enter_from=web_live&cookie_enabled=true&screen_width=1920&screen_height=1080&browser_language=zh-CN&browser_platform=MacIntel&browser_name=Chrome&browser_version=122.0.0.0&web_rid={room_id}&ms_token={ms_token}&a_bogus={a_bogus}"
);

let resp = self
.client
.get(&url)
.header("Referer", "https://live.douyin.com/")
.header("Cookie", self.account.cookies.clone())
.send()
.await?;
let resp = self.client.get(&url).headers(headers).send().await?;

let status = resp.status();
let text = resp.text().await?;

if text.is_empty() {
log::warn!("Empty room info response, trying H5 API");
log::debug!("Empty room info response, trying H5 API");
return self.get_room_info_h5(room_id, sec_user_id).await;
}

@@ -117,19 +165,18 @@ impl DouyinClient {
.map(|s| s.live_core_sdk_data.pull_data.stream_data.clone())
.unwrap_or_default(),
});
} else {
log::error!("Failed to parse room info response: {}", text);
return self.get_room_info_h5(room_id, sec_user_id).await;
}
log::error!("Failed to parse room info response: {text}");
return self.get_room_info_h5(room_id, sec_user_id).await;
}

log::error!("Failed to get room info: {}", status);
log::error!("Failed to get room info: {status}");
return self.get_room_info_h5(room_id, sec_user_id).await;
}

pub async fn get_room_info_h5(
&self,
room_id: u64,
room_id: i64,
sec_user_id: &str,
) -> Result<DouyinBasicRoomInfo, DouyinClientError> {
// Following the biliup implementation, build the full set of URL parameters
@@ -149,24 +196,16 @@ impl DouyinClient {
// Build the URL
let query_string = url_params
.iter()
.map(|(k, v)| format!("{}={}", k, v))
.map(|(k, v)| format!("{k}={v}"))
.collect::<Vec<_>>()
.join("&");
let url = format!(
"https://webcast.amemv.com/webcast/room/reflow/info/?{}",
query_string
);
let url = format!("https://webcast.amemv.com/webcast/room/reflow/info/?{query_string}");

log::info!("get_room_info_h5: {}", url);
let mut headers = self.generate_user_agent_header();
headers.insert("Referer", "https://live.douyin.com/".parse().unwrap());
headers.insert("Cookie", self.account.cookies.clone().parse().unwrap());

let resp = self
.client
.get(&url)
.header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36")
.header("Referer", "https://live.douyin.com/")
.header("Cookie", self.account.cookies.clone())
.send()
.await?;
let resp = self.client.get(&url).headers(headers).send().await?;

let status = resp.status();
let text = resp.text().await?;
@@ -214,21 +253,23 @@ impl DouyinClient {

// If that fails, try to parse as a generic JSON to see what we got
if let Ok(json_value) = serde_json::from_str::<serde_json::Value>(&text) {
log::error!(
"Unexpected response structure: {}",
serde_json::to_string_pretty(&json_value).unwrap_or_default()
);

// Check if it's an error response
if let Some(status_code) = json_value.get("status_code").and_then(|v| v.as_i64()) {
if let Some(status_code) = json_value
.get("status_code")
.and_then(serde_json::Value::as_i64)
{
if status_code != 0 {
let error_msg = json_value
.get("status_message")
.and_then(|v| v.as_str())
.get("data")
.and_then(|v| v.get("message").and_then(|v| v.as_str()))
.unwrap_or("Unknown error");

if status_code == 10011 {
return Err(DouyinClientError::H5NotLive(error_msg.to_string()));
}

return Err(DouyinClientError::Network(format!(
"API returned error status_code: {} - {}",
status_code, error_msg
"API returned error status_code: {status_code} - {error_msg}"
)));
}
}
@@ -245,35 +286,29 @@ impl DouyinClient {
}

return Err(DouyinClientError::Network(format!(
"Failed to parse h5 room info response: {}",
text
)));
} else {
log::error!("Failed to parse h5 room info response: {}", text);
return Err(DouyinClientError::Network(format!(
"Failed to parse h5 room info response: {}",
text
"Failed to parse h5 room info response: {text}"
)));
}
log::error!("Failed to parse h5 room info response: {text}");
return Err(DouyinClientError::Network(format!(
"Failed to parse h5 room info response: {text}"
)));
}

log::error!("Failed to get h5 room info: {}", status);
log::error!("Failed to get h5 room info: {status}");
Err(DouyinClientError::Network(format!(
"Failed to get h5 room info: {} {}",
status, text
"Failed to get h5 room info: {status} {text}"
)))
}

pub async fn get_user_info(&self) -> Result<super::response::User, DouyinClientError> {
// Use the IM spotlight relation API to get user info
let url = "https://www.douyin.com/aweme/v1/web/im/spotlight/relation/";
let resp = self
.client
.get(url)
.header("Referer", "https://www.douyin.com/")
.header("Cookie", self.account.cookies.clone())
.send()
.await?;
let mut headers = self.generate_user_agent_header();
headers.insert("Referer", "https://www.douyin.com/".parse().unwrap());
headers.insert("Cookie", self.account.cookies.clone().parse().unwrap());

let resp = self.client.get(url).headers(headers).send().await?;

let status = resp.status();
let text = resp.text().await?;
@@ -295,7 +330,7 @@ impl DouyinClient {
avatar_thumb: following.avatar_thumb.clone(),
follow_info: super::response::FollowInfo::default(),
foreign_user: 0,
open_id_str: "".to_string(),
open_id_str: String::new(),
};
return Ok(user);
}
@@ -304,26 +339,25 @@ impl DouyinClient {

// If not found in followings, create a minimal user info from owner_sec_uid
let user = super::response::User {
id_str: "".to_string(), // We don't have the numeric UID
id_str: String::new(), // We don't have the numeric UID
sec_uid: owner_sec_uid.clone(),
nickname: "抖音用户".to_string(), // Default nickname
avatar_thumb: super::response::AvatarThumb { url_list: vec![] },
follow_info: super::response::FollowInfo::default(),
foreign_user: 0,
open_id_str: "".to_string(),
open_id_str: String::new(),
};
return Ok(user);
}
} else {
log::error!("Failed to parse user info response: {}", text);
log::error!("Failed to parse user info response: {text}");
return Err(DouyinClientError::Network(format!(
"Failed to parse user info response: {}",
text
"Failed to parse user info response: {text}"
)));
}
}

log::error!("Failed to get user info: {}", status);
log::error!("Failed to get user info: {status}");

Err(DouyinClientError::Io(std::io::Error::new(
std::io::ErrorKind::NotFound,
@@ -331,15 +365,17 @@ impl DouyinClient {
)))
}

pub async fn get_cover_base64(&self, url: &str) -> Result<String, DouyinClientError> {
log::info!("get_cover_base64: {}", url);
/// Download file from url to path
pub async fn download_file(&self, url: &str, path: &Path) -> Result<(), DouyinClientError> {
if !path.parent().unwrap().exists() {
std::fs::create_dir_all(path.parent().unwrap()).unwrap();
}
let response = self.client.get(url).send().await?;
let bytes = response.bytes().await?;
let base64 = base64::engine::general_purpose::STANDARD.encode(bytes);
let mime_type = mime_guess::from_path(url)
.first_or_octet_stream()
.to_string();
Ok(format!("data:{};base64,{}", mime_type, base64))
let mut file = tokio::fs::File::create(&path).await?;
let mut content = std::io::Cursor::new(bytes);
tokio::io::copy(&mut content, &mut file).await?;
Ok(())
}

pub async fn get_m3u8_content(
@@ -352,7 +388,7 @@ impl DouyinClient {
// #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2560000
// http://7167739a741646b4651b6949b2f3eb8e.livehwc3.cn/pull-hls-l26.douyincdn.com/third/stream-693342996808860134_or4.m3u8?sub_m3u8=true&user_session_id=16090eb45ab8a2f042f7c46563936187&major_anchor_level=common&edge_slice=true&expire=67d944ec&sign=47b95cc6e8de20d82f3d404412fa8406
if content.contains("BANDWIDTH") {
log::info!("Master manifest with playlist URL: {}", url);
log::info!("Master manifest with playlist URL: {url}");
let new_url = content.lines().last().unwrap();
return Box::pin(self.get_m3u8_content(new_url)).await;
}
@@ -371,7 +407,7 @@ impl DouyinClient {

if response.status() != reqwest::StatusCode::OK {
let error = response.error_for_status().unwrap_err();
log::error!("HTTP error: {} for URL: {}", error, url);
log::error!("HTTP error: {error} for URL: {url}");
return Err(DouyinClientError::Network(error.to_string()));
}

src-tauri/src/recorder/douyin/js/a_bogus.js (new file, 550 lines)
@@ -0,0 +1,550 @@
// Script from https://github.com/JoeanAmier/TikTokDownloader/blob/master/static/js/a_bogus.js
// All the content in this article is only for learning and communication use, not for any other purpose, strictly prohibited for commercial use and illegal use, otherwise all the consequences are irrelevant to the author!
function rc4_encrypt(plaintext, key) {
var s = [];
for (var i = 0; i < 256; i++) {
s[i] = i;
}
var j = 0;
for (var i = 0; i < 256; i++) {
j = (j + s[i] + key.charCodeAt(i % key.length)) % 256;
var temp = s[i];
s[i] = s[j];
s[j] = temp;
}

var i = 0;
var j = 0;
var cipher = [];
for (var k = 0; k < plaintext.length; k++) {
i = (i + 1) % 256;
j = (j + s[i]) % 256;
var temp = s[i];
s[i] = s[j];
s[j] = temp;
var t = (s[i] + s[j]) % 256;
cipher.push(String.fromCharCode(s[t] ^ plaintext.charCodeAt(k)));
}
return cipher.join("");
}
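rc4_encrypt above is textbook RC4: a key schedule followed by XOR of the keystream into the plaintext, character by character. For reference, a compact Rust sketch of the same keystream over bytes (illustration only; the JS variant operates on UTF-16 char codes):

fn rc4(key: &[u8], data: &[u8]) -> Vec<u8> {
    // Key-scheduling algorithm (KSA).
    let mut s: Vec<u8> = (0..=255).collect();
    let mut j: u8 = 0;
    for i in 0..256 {
        j = j.wrapping_add(s[i]).wrapping_add(key[i % key.len()]);
        s.swap(i, j as usize);
    }
    // Pseudo-random generation (PRGA), XORed into the data.
    let (mut i, mut j) = (0u8, 0u8);
    data.iter()
        .map(|&byte| {
            i = i.wrapping_add(1);
            j = j.wrapping_add(s[i as usize]);
            s.swap(i as usize, j as usize);
            let t = s[i as usize].wrapping_add(s[j as usize]);
            byte ^ s[t as usize]
        })
        .collect()
}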
|
||||
|
||||
function le(e, r) {
|
||||
return ((e << (r %= 32)) | (e >>> (32 - r))) >>> 0;
|
||||
}
|
||||
|
||||
function de(e) {
|
||||
return 0 <= e && e < 16
|
||||
? 2043430169
|
||||
: 16 <= e && e < 64
|
||||
? 2055708042
|
||||
: void console["error"]("invalid j for constant Tj");
|
||||
}
|
||||
|
||||
function pe(e, r, t, n) {
|
||||
return 0 <= e && e < 16
|
||||
? (r ^ t ^ n) >>> 0
|
||||
: 16 <= e && e < 64
|
||||
? ((r & t) | (r & n) | (t & n)) >>> 0
|
||||
: (console["error"]("invalid j for bool function FF"), 0);
|
||||
}
|
||||
|
||||
function he(e, r, t, n) {
|
||||
return 0 <= e && e < 16
|
||||
? (r ^ t ^ n) >>> 0
|
||||
: 16 <= e && e < 64
|
||||
? ((r & t) | (~r & n)) >>> 0
|
||||
: (console["error"]("invalid j for bool function GG"), 0);
|
||||
}
|
||||
|
||||
function reset() {
|
||||
(this.reg[0] = 1937774191),
|
||||
(this.reg[1] = 1226093241),
|
||||
(this.reg[2] = 388252375),
|
||||
(this.reg[3] = 3666478592),
|
||||
(this.reg[4] = 2842636476),
|
||||
(this.reg[5] = 372324522),
|
||||
(this.reg[6] = 3817729613),
|
||||
(this.reg[7] = 2969243214),
|
||||
(this["chunk"] = []),
|
||||
(this["size"] = 0);
|
||||
}
|
||||
|
||||
function write(e) {
|
||||
var a =
|
||||
"string" == typeof e
|
||||
? (function (e) {
|
||||
(n = encodeURIComponent(e)["replace"](
|
||||
/%([0-9A-F]{2})/g,
|
||||
function (e, r) {
|
||||
return String["fromCharCode"]("0x" + r);
|
||||
}
|
||||
)),
|
||||
(a = new Array(n["length"]));
|
||||
return (
|
||||
Array["prototype"]["forEach"]["call"](n, function (e, r) {
|
||||
a[r] = e.charCodeAt(0);
|
||||
}),
|
||||
a
|
||||
);
|
||||
})(e)
|
||||
: e;
|
||||
this.size += a.length;
|
||||
var f = 64 - this["chunk"]["length"];
|
||||
if (a["length"] < f) this["chunk"] = this["chunk"].concat(a);
|
||||
else
|
||||
for (
|
||||
this["chunk"] = this["chunk"].concat(a.slice(0, f));
|
||||
this["chunk"].length >= 64;
|
||||
|
||||
)
|
||||
this["_compress"](this["chunk"]),
|
||||
f < a["length"]
|
||||
? (this["chunk"] = a["slice"](f, Math["min"](f + 64, a["length"])))
|
||||
: (this["chunk"] = []),
|
||||
(f += 64);
|
||||
}
|
||||
|
||||
function sum(e, t) {
|
||||
e && (this["reset"](), this["write"](e)), this["_fill"]();
|
||||
for (var f = 0; f < this.chunk["length"]; f += 64)
|
||||
this._compress(this["chunk"]["slice"](f, f + 64));
|
||||
var i = null;
|
||||
if (t == "hex") {
|
||||
i = "";
|
||||
for (f = 0; f < 8; f++) i += se(this["reg"][f]["toString"](16), 8, "0");
|
||||
} else
|
||||
for (i = new Array(32), f = 0; f < 8; f++) {
|
||||
var c = this.reg[f];
|
||||
(i[4 * f + 3] = (255 & c) >>> 0),
|
||||
(c >>>= 8),
|
||||
(i[4 * f + 2] = (255 & c) >>> 0),
|
||||
(c >>>= 8),
|
||||
(i[4 * f + 1] = (255 & c) >>> 0),
|
||||
(c >>>= 8),
|
||||
(i[4 * f] = (255 & c) >>> 0);
|
||||
}
|
||||
return this["reset"](), i;
|
||||
}
|
||||
|
||||
function _compress(t) {
|
||||
if (t < 64) console.error("compress error: not enough data");
|
||||
else {
|
||||
for (
|
||||
var f = (function (e) {
|
||||
for (var r = new Array(132), t = 0; t < 16; t++)
|
||||
(r[t] = e[4 * t] << 24),
|
||||
(r[t] |= e[4 * t + 1] << 16),
|
||||
(r[t] |= e[4 * t + 2] << 8),
|
||||
(r[t] |= e[4 * t + 3]),
|
||||
(r[t] >>>= 0);
|
||||
for (var n = 16; n < 68; n++) {
|
||||
var a = r[n - 16] ^ r[n - 9] ^ le(r[n - 3], 15);
|
||||
(a = a ^ le(a, 15) ^ le(a, 23)),
|
||||
(r[n] = (a ^ le(r[n - 13], 7) ^ r[n - 6]) >>> 0);
|
||||
}
|
||||
for (n = 0; n < 64; n++) r[n + 68] = (r[n] ^ r[n + 4]) >>> 0;
|
||||
return r;
|
||||
})(t),
|
||||
i = this["reg"].slice(0),
|
||||
c = 0;
|
||||
c < 64;
|
||||
c++
|
||||
) {
|
||||
var o = le(i[0], 12) + i[4] + le(de(c), c),
|
||||
s = ((o = le((o = (4294967295 & o) >>> 0), 7)) ^ le(i[0], 12)) >>> 0,
|
||||
u = pe(c, i[0], i[1], i[2]);
|
||||
u = (4294967295 & (u = u + i[3] + s + f[c + 68])) >>> 0;
|
||||
var b = he(c, i[4], i[5], i[6]);
|
||||
(b = (4294967295 & (b = b + i[7] + o + f[c])) >>> 0),
|
||||
(i[3] = i[2]),
|
||||
(i[2] = le(i[1], 9)),
|
||||
(i[1] = i[0]),
|
||||
(i[0] = u),
|
||||
(i[7] = i[6]),
|
||||
(i[6] = le(i[5], 19)),
|
||||
(i[5] = i[4]),
|
||||
(i[4] = (b ^ le(b, 9) ^ le(b, 17)) >>> 0);
|
||||
}
|
||||
for (var l = 0; l < 8; l++) this["reg"][l] = (this["reg"][l] ^ i[l]) >>> 0;
|
||||
}
|
||||
}
|
||||
|
||||
function _fill() {
  var a = 8 * this["size"],
    f = this["chunk"]["push"](128) % 64;
  for (64 - f < 8 && (f -= 64); f < 56; f++) this.chunk["push"](0);
  for (var i = 0; i < 4; i++) {
    var c = Math["floor"](a / 4294967296);
    this["chunk"].push((c >>> (8 * (3 - i))) & 255);
  }
  for (i = 0; i < 4; i++) this["chunk"]["push"]((a >>> (8 * (3 - i))) & 255);
}

function SM3() {
  this.reg = [];
  this.chunk = [];
  this.size = 0;
  this.reset();
}
SM3.prototype.reset = reset;
SM3.prototype.write = write;
SM3.prototype.sum = sum;
SM3.prototype._compress = _compress;
SM3.prototype._fill = _fill;

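A minimal usage sketch of this class (illustrative only, not part of the original file; it assumes the helpers le/de/pe/he/se defined alongside this SM3 implementation are loaded): sum() hashes a string or byte array and returns a 32-byte array, or a hex digest when "hex" is passed, resetting the internal state afterwards.

// Usage sketch (hypothetical; assumes le/de/pe/he/se helpers from this file)
let hasher = new SM3();
let bytes = hasher.sum("hello"); // array of 32 byte values
let hex = hasher.sum("hello", "hex"); // 64-character hex digest string
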
function result_encrypt(long_str, num = null) {
  let s_obj = {
    s0: "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=",
    s1: "Dkdpgh4ZKsQB80/Mfvw36XI1R25+WUAlEi7NLboqYTOPuzmFjJnryx9HVGcaStCe=",
    s2: "Dkdpgh4ZKsQB80/Mfvw36XI1R25-WUAlEi7NLboqYTOPuzmFjJnryx9HVGcaStCe=",
    s3: "ckdp1h4ZKsUB80/Mfvw36XIgR25+WQAlEi7NLboqYTOPuzmFjJnryx9HVGDaStCe",
    s4: "Dkdpgh2ZmsQB80/MfvV36XI1R45-WUAlEixNLwoqYTOPuzKFjJnry79HbGcaStCe",
  };
  let constant = {
    0: 16515072,
    1: 258048,
    2: 4032,
    str: s_obj[num],
  };

  let result = "";
  let lound = 0;
  let long_int = get_long_int(lound, long_str);
  // fix: declare temp_int locally instead of leaking an implicit global
  let temp_int;
  for (let i = 0; i < (long_str.length / 3) * 4; i++) {
    if (Math.floor(i / 4) !== lound) {
      lound += 1;
      long_int = get_long_int(lound, long_str);
    }
    let key = i % 4;
    switch (key) {
      case 0:
        temp_int = (long_int & constant["0"]) >> 18;
        result += constant["str"].charAt(temp_int);
        break;
      case 1:
        temp_int = (long_int & constant["1"]) >> 12;
        result += constant["str"].charAt(temp_int);
        break;
      case 2:
        temp_int = (long_int & constant["2"]) >> 6;
        result += constant["str"].charAt(temp_int);
        break;
      case 3:
        temp_int = long_int & 63;
        result += constant["str"].charAt(temp_int);
        break;
      default:
        break;
    }
  }
  return result;
}

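In effect, result_encrypt is base64 encoding over the custom alphabet selected by num; there is no padding branch, so only inputs whose length is a multiple of 3 round-trip cleanly. A quick sanity check (illustrative, not from the original file):

// Sanity check: with the standard "s0" alphabet, 3-byte-aligned input
// matches plain base64
console.log(result_encrypt("Man", "s0")); // "TWFu", same as btoa("Man")
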
function get_long_int(round, long_str) {
  round = round * 3;
  return (
    (long_str.charCodeAt(round) << 16) |
    (long_str.charCodeAt(round + 1) << 8) |
    long_str.charCodeAt(round + 2)
  );
}

function gener_random(random, option) {
  return [
    (random & 255 & 170) | (option[0] & 85), // 163
    (random & 255 & 85) | (option[0] & 170), // 87
    ((random >> 8) & 255 & 170) | (option[1] & 85), // 37
    ((random >> 8) & 255 & 85) | (option[1] & 170), // 41
  ];
}

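gener_random interleaves bit positions: for each output pair, one byte keeps the bits of random selected by mask 170 (0b10101010) and fills the rest from the option byte masked with 85 (0b01010101), and the next byte flips the masks. A small illustrative check (not part of the original file):

// Illustrative check: with option = [0, 0], only random's masked bits survive
console.log(gener_random(0xffff, [0, 0])); // [170, 85, 170, 85]
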
//////////////////////////////////////////////
function generate_rc4_bb_str(
  url_search_params,
  user_agent,
  window_env_str,
  suffix = "cus",
  Arguments = [0, 1, 14]
) {
  let sm3 = new SM3();
  let start_time = Date.now();
  /**
   * Three hashing passes:
   * 1: double SM3 over url_search_params (+ suffix)
   * 2: double SM3 over the suffix
   * 3: SM3 over the processed user agent
   */
  // double SM3 over url_search_params + suffix
  let url_search_params_list = sm3.sum(sm3.sum(url_search_params + suffix));
  // double SM3 over the suffix
  let cus = sm3.sum(sm3.sum(suffix));
  // SM3 over the processed user agent
  let ua = sm3.sum(
    result_encrypt(
      rc4_encrypt(
        user_agent,
        String.fromCharCode.apply(null, [0.00390625, 1, 14])
      ),
      "s3"
    )
  );
  //
  let end_time = Date.now();
  // b
  let b = {
    8: 3, // fixed
    10: end_time, // end time of the three hashing passes
    15: {
      aid: 6383,
      pageId: 6241,
      boe: false,
      ddrt: 7,
      paths: {
        include: [{}, {}, {}, {}, {}, {}, {}],
        exclude: [],
      },
      track: {
        mode: 0,
        delay: 300,
        paths: [],
      },
      dump: true,
      rpU: "",
    },
    16: start_time, // start time of the three hashing passes
    18: 44, // fixed
    19: [1, 0, 1, 5],
  };

  // start time of the three hashing passes
  b[20] = (b[16] >> 24) & 255;
  b[21] = (b[16] >> 16) & 255;
  b[22] = (b[16] >> 8) & 255;
  b[23] = b[16] & 255;
  b[24] = (b[16] / 256 / 256 / 256 / 256) >> 0;
  b[25] = (b[16] / 256 / 256 / 256 / 256 / 256) >> 0;

  // Arguments parameter [0, 1, 14, ...]
  // let Arguments = [0, 1, 14]
  b[26] = (Arguments[0] >> 24) & 255;
  b[27] = (Arguments[0] >> 16) & 255;
  b[28] = (Arguments[0] >> 8) & 255;
  b[29] = Arguments[0] & 255;

  b[30] = (Arguments[1] / 256) & 255;
  b[31] = (Arguments[1] % 256) & 255;
  b[32] = (Arguments[1] >> 24) & 255;
  b[33] = (Arguments[1] >> 16) & 255;

  b[34] = (Arguments[2] >> 24) & 255;
  b[35] = (Arguments[2] >> 16) & 255;
  b[36] = (Arguments[2] >> 8) & 255;
  b[37] = Arguments[2] & 255;

  // result of double SM3 over (url_search_params + "cus")
  /** let url_search_params_list = [
        91, 186, 35, 86, 143, 253, 6, 76,
        34, 21, 167, 148, 7, 42, 192, 219,
        188, 20, 182, 85, 213, 74, 213, 147,
        37, 155, 93, 139, 85, 118, 228, 213
      ] */
  b[38] = url_search_params_list[21];
  b[39] = url_search_params_list[22];

  // result of double SM3 over the suffix ("cus")
  /**
   * let cus = [
        136, 101, 114, 147, 58, 77, 207, 201,
        215, 162, 154, 93, 248, 13, 142, 160,
        105, 73, 215, 241, 83, 58, 51, 43,
        255, 38, 168, 141, 216, 194, 35, 236
      ] */
  b[40] = cus[21];
  b[41] = cus[22];

  // result of processing the user agent
  /**
   * let ua = [
        129, 190, 70, 186, 86, 196, 199, 53,
        99, 38, 29, 209, 243, 17, 157, 69,
        147, 104, 53, 23, 114, 126, 66, 228,
        135, 30, 168, 185, 109, 156, 251, 88
      ] */
  b[42] = ua[23];
  b[43] = ua[24];

  // end time of the three hashing passes
  b[44] = (b[10] >> 24) & 255;
  b[45] = (b[10] >> 16) & 255;
  b[46] = (b[10] >> 8) & 255;
  b[47] = b[10] & 255;
  b[48] = b[8];
  b[49] = (b[10] / 256 / 256 / 256 / 256) >> 0;
  b[50] = (b[10] / 256 / 256 / 256 / 256 / 256) >> 0;

  // object config fields
  b[51] = b[15]["pageId"];
  b[52] = (b[15]["pageId"] >> 24) & 255;
  b[53] = (b[15]["pageId"] >> 16) & 255;
  b[54] = (b[15]["pageId"] >> 8) & 255;
  b[55] = b[15]["pageId"] & 255;

  b[56] = b[15]["aid"];
  b[57] = b[15]["aid"] & 255;
  b[58] = (b[15]["aid"] >> 8) & 255;
  b[59] = (b[15]["aid"] >> 16) & 255;
  b[60] = (b[15]["aid"] >> 24) & 255;

  // an environment check happens in between
  // code index: 2496, index value: 17 (key condition for index 64)
  // '1536|747|1536|834|0|30|0|0|1536|834|1536|864|1525|747|24|24|Win32'.charCodeAt() yields a 65-element array
  /**
   * let window_env_list = [49, 53, 51, 54, 124, 55, 52, 55, 124, 49, 53, 51, 54, 124, 56, 51, 52, 124, 48, 124, 51,
   *   48, 124, 48, 124, 48, 124, 49, 53, 51, 54, 124, 56, 51, 52, 124, 49, 53, 51, 54, 124, 56,
   *   54, 52, 124, 49, 53, 50, 53, 124, 55, 52, 55, 124, 50, 52, 124, 50, 52, 124, 87, 105, 110,
   *   51, 50]
   */
  let window_env_list = [];
  for (let index = 0; index < window_env_str.length; index++) {
    window_env_list.push(window_env_str.charCodeAt(index));
  }
  b[64] = window_env_list.length;
  b[65] = b[64] & 255;
  b[66] = (b[64] >> 8) & 255;

  b[69] = [].length;
  b[70] = b[69] & 255;
  b[71] = (b[69] >> 8) & 255;

  b[72] =
    b[18] ^ b[20] ^ b[26] ^ b[30] ^ b[38] ^ b[40] ^ b[42] ^
    b[21] ^ b[27] ^ b[31] ^ b[35] ^ b[39] ^ b[41] ^ b[43] ^
    b[22] ^ b[28] ^ b[32] ^ b[36] ^ b[23] ^ b[29] ^ b[33] ^
    b[37] ^ b[44] ^ b[45] ^ b[46] ^ b[47] ^ b[48] ^ b[49] ^
    b[50] ^ b[24] ^ b[25] ^ b[52] ^ b[53] ^ b[54] ^ b[55] ^
    b[57] ^ b[58] ^ b[59] ^ b[60] ^ b[65] ^ b[66] ^ b[70] ^
    b[71];
  let bb = [
    b[18], b[20], b[52], b[26], b[30], b[34], b[58], b[38],
    b[40], b[53], b[42], b[21], b[27], b[54], b[55], b[31],
    b[35], b[57], b[39], b[41], b[43], b[22], b[28], b[32],
    b[60], b[36], b[23], b[29], b[33], b[37], b[44], b[45],
    b[59], b[46], b[47], b[48], b[49], b[50], b[24], b[25],
    b[65], b[66], b[70], b[71],
  ];
  bb = bb.concat(window_env_list).concat(b[72]);
  return rc4_encrypt(
    String.fromCharCode.apply(null, bb),
    String.fromCharCode.apply(null, [121])
  );
}

function generate_random_str() {
  let random_str_list = [];
  random_str_list = random_str_list.concat(
    gener_random(Math.random() * 10000, [3, 45])
  );
  random_str_list = random_str_list.concat(
    gener_random(Math.random() * 10000, [1, 0])
  );
  random_str_list = random_str_list.concat(
    gener_random(Math.random() * 10000, [1, 5])
  );
  return String.fromCharCode.apply(null, random_str_list);
}

function generate_a_bogus(url_search_params, user_agent) {
  /**
   * url_search_params:"device_platform=webapp&aid=6383&channel=channel_pc_web&update_version_code=170400&pc_client_type=1&version_code=170400&version_name=17.4.0&cookie_enabled=true&screen_width=1536&screen_height=864&browser_language=zh-CN&browser_platform=Win32&browser_name=Chrome&browser_version=123.0.0.0&browser_online=true&engine_name=Blink&engine_version=123.0.0.0&os_name=Windows&os_version=10&cpu_core_num=16&device_memory=8&platform=PC&downlink=10&effective_type=4g&round_trip_time=50&webid=7362810250930783783&msToken=VkDUvz1y24CppXSl80iFPr6ez-3FiizcwD7fI1OqBt6IICq9RWG7nCvxKb8IVi55mFd-wnqoNkXGnxHrikQb4PuKob5Q-YhDp5Um215JzlBszkUyiEvR"
   * user_agent:"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36"
   */
  let result_str =
    generate_random_str() +
    generate_rc4_bb_str(
      url_search_params,
      user_agent,
      "1536|747|1536|834|0|30|0|0|1536|834|1536|864|1525|747|24|24|Win32"
    );

  return encodeURIComponent(result_encrypt(result_str, "s4") + "=");
}

// Test call
// console.log(generate_a_bogus(
//   "device_platform=webapp&aid=6383&channel=channel_pc_web&update_version_code=170400&pc_client_type=1&version_code=170400&version_name=17.4.0&cookie_enabled=true&screen_width=1536&screen_height=864&browser_language=zh-CN&browser_platform=Win32&browser_name=Chrome&browser_version=123.0.0.0&browser_online=true&engine_name=Blink&engine_version=123.0.0.0&os_name=Windows&os_version=10&cpu_core_num=16&device_memory=8&platform=PC&downlink=10&effective_type=4g&round_trip_time=50&webid=7362810250930783783&msToken=VkDUvz1y24CppXSl80iFPr6ez-3FiizcwD7fI1OqBt6IICq9RWG7nCvxKb8IVi55mFd-wnqoNkXGnxHrikQb4PuKob5Q-YhDp5Um215JzlBszkUyiEvR",
//   "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36"
// ));

@@ -1,11 +1,14 @@
 use serde_derive::Deserialize;
 use serde_derive::Serialize;
 use serde_json::Value;
+use std::collections::HashMap;
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
 pub struct DouyinRoomInfoResponse {
     pub data: Data,
+    #[serde(default)]
+    pub extra: Option<serde_json::Value>,
     #[serde(rename = "status_code")]
     pub status_code: i64,
 }
@@ -14,9 +17,29 @@ pub struct DouyinRoomInfoResponse {
 #[serde(rename_all = "camelCase")]
 pub struct Data {
     pub data: Vec<Daum>,
+    #[serde(rename = "enter_room_id", default)]
+    pub enter_room_id: Option<String>,
+    #[serde(default)]
+    pub extra: Option<serde_json::Value>,
     pub user: User,
+    #[serde(rename = "qrcode_url", default)]
+    pub qrcode_url: Option<String>,
+    #[serde(rename = "enter_mode", default)]
+    pub enter_mode: Option<i64>,
     #[serde(rename = "room_status")]
     pub room_status: i64,
+    #[serde(rename = "partition_road_map", default)]
+    pub partition_road_map: Option<serde_json::Value>,
+    #[serde(rename = "similar_rooms", default)]
+    pub similar_rooms: Option<Vec<serde_json::Value>>,
+    #[serde(rename = "shark_decision_conf", default)]
+    pub shark_decision_conf: Option<String>,
+    #[serde(rename = "web_stream_url", default)]
+    pub web_stream_url: Option<serde_json::Value>,
+    #[serde(rename = "login_lead", default)]
+    pub login_lead: Option<serde_json::Value>,
+    #[serde(rename = "auth_cert_info", default)]
+    pub auth_cert_info: Option<String>,
 }
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
@@ -28,9 +51,36 @@ pub struct Daum {
     #[serde(rename = "status_str")]
     pub status_str: String,
     pub title: String,
+    #[serde(rename = "user_count_str", default)]
+    pub user_count_str: Option<String>,
     pub cover: Option<Cover>,
     #[serde(rename = "stream_url")]
     pub stream_url: Option<StreamUrl>,
+    #[serde(default)]
+    pub owner: Option<Owner>,
+    #[serde(rename = "room_auth", default)]
+    pub room_auth: Option<RoomAuth>,
+    #[serde(rename = "live_room_mode", default)]
+    pub live_room_mode: Option<i64>,
+    #[serde(default)]
+    pub stats: Option<Stats>,
+    #[serde(rename = "has_commerce_goods", default)]
+    pub has_commerce_goods: Option<bool>,
+    #[serde(rename = "linker_map", default)]
+    pub linker_map: Option<LinkerMap>,
+    #[serde(rename = "linker_detail", default)]
+    pub linker_detail: Option<LinkerDetail>,
+    #[serde(rename = "room_view_stats", default)]
+    pub room_view_stats: Option<RoomViewStats>,
+    #[serde(rename = "scene_type_info", default)]
+    pub scene_type_info: Option<SceneTypeInfo>,
+    #[serde(rename = "like_count", default)]
+    pub like_count: Option<i64>,
+    #[serde(rename = "owner_user_id_str", default)]
+    pub owner_user_id_str: Option<String>,
+    // Many other fields that can be ignored for now
+    #[serde(flatten)]
+    pub other_fields: HashMap<String, serde_json::Value>,
 }
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
@@ -56,8 +106,8 @@ pub struct StreamUrl {
     #[serde(rename = "live_core_sdk_data")]
     pub live_core_sdk_data: LiveCoreSdkData,
     pub extra: Extra,
-    #[serde(rename = "pull_datas")]
-    pub pull_datas: PullDatas,
+    #[serde(rename = "pull_datas", default)]
+    pub pull_datas: Option<serde_json::Value>,
 }
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
@@ -182,10 +232,7 @@ pub struct Extra {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
-pub struct PullDatas {}
-
-#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Owner {
     #[serde(rename = "id_str")]
     pub id_str: String,
@@ -234,6 +281,7 @@ pub struct Subscribe {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct RoomAuth {
     #[serde(rename = "Chat")]
     pub chat: bool,
@@ -383,6 +431,7 @@ pub struct RoomAuth {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct SpecialStyle {
     #[serde(rename = "Chat")]
     pub chat: Chat,
@@ -392,6 +441,7 @@ pub struct SpecialStyle {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Chat {
     #[serde(rename = "UnableStyle")]
     pub unable_style: i64,
@@ -407,6 +457,7 @@ pub struct Chat {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Like {
     #[serde(rename = "UnableStyle")]
     pub unable_style: i64,
@@ -422,6 +473,7 @@ pub struct Like {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Stats {
     #[serde(rename = "total_user_desp")]
     pub total_user_desp: String,
@@ -435,10 +487,12 @@ pub struct Stats {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct LinkerMap {}
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct LinkerDetail {
     #[serde(rename = "linker_play_modes")]
     pub linker_play_modes: Vec<Value>,
@@ -476,14 +530,17 @@ pub struct LinkerDetail {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct LinkerMapStr {}
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct PlaymodeDetail {}
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct RoomViewStats {
     #[serde(rename = "is_hidden")]
     pub is_hidden: bool,
@@ -510,6 +567,7 @@ pub struct RoomViewStats {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct SceneTypeInfo {
     #[serde(rename = "is_union_live_room")]
     pub is_union_live_room: bool,
@@ -529,6 +587,7 @@ pub struct SceneTypeInfo {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct EntranceList {
     #[serde(rename = "group_id")]
     pub group_id: i64,
@@ -549,6 +608,7 @@ pub struct EntranceList {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Icon {
     #[serde(rename = "url_list")]
     pub url_list: Vec<String>,
@@ -770,6 +830,7 @@ pub struct H5Owner {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct H5AvatarThumb {
     #[serde(rename = "url_list")]
     pub url_list: Vec<String>,

@@ -15,6 +15,7 @@ pub struct Data {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Ld {
     pub main: Main,
 }
@@ -28,6 +29,7 @@ pub struct Main {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Md {
     pub main: Main,
 }
@@ -40,23 +42,27 @@ pub struct Origin {
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Sd {
     pub main: Main,
 }
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Hd {
     pub main: Main,
 }
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Ao {
     pub main: Main,
 }
 
 #[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
 #[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
 pub struct Uhd {
     pub main: Main,
 }

@@ -2,8 +2,8 @@ use core::fmt;
 use std::fmt::Display;
 
 use async_std::{
-    fs::{File, OpenOptions},
-    io::{prelude::BufReadExt, BufReader, WriteExt},
+    fs::OpenOptions,
+    io::{prelude::BufReadExt, BufReader},
     path::Path,
     stream::StreamExt,
 };
@@ -31,19 +31,19 @@ impl TsEntry {
             url: parts[0].to_string(),
             sequence: parts[1]
                 .parse()
-                .map_err(|e| format!("Failed to parse sequence: {}", e))?,
+                .map_err(|e| format!("Failed to parse sequence: {e}"))?,
             length: parts[2]
                 .parse()
-                .map_err(|e| format!("Failed to parse length: {}", e))?,
+                .map_err(|e| format!("Failed to parse length: {e}"))?,
             size: parts[3]
                 .parse()
-                .map_err(|e| format!("Failed to parse size: {}", e))?,
+                .map_err(|e| format!("Failed to parse size: {e}"))?,
             ts: parts[4]
                 .parse()
-                .map_err(|e| format!("Failed to parse timestamp: {}", e))?,
+                .map_err(|e| format!("Failed to parse timestamp: {e}"))?,
             is_header: parts[5]
                 .parse()
-                .map_err(|e| format!("Failed to parse is_header: {}", e))?,
+                .map_err(|e| format!("Failed to parse is_header: {e}"))?,
         })
     }
 
@@ -51,34 +51,25 @@ impl TsEntry {
     pub fn ts_seconds(&self) -> i64 {
         // For some legacy problem, douyin entry's ts is s, bilibili entry's ts is ms.
         // This should be fixed after 2.5.6, but we need to support entry.log generated by previous version.
-        if self.ts > 10000000000 {
+        if self.ts > 10_000_000_000 {
             self.ts / 1000
         } else {
             self.ts
         }
     }
 
-    pub fn ts_mili(&self) -> i64 {
-        // if already in ms, return as is
-        if self.ts > 10000000000 {
-            self.ts
-        } else {
-            self.ts * 1000
-        }
-    }
-
     pub fn date_time(&self) -> String {
         let date_str = Utc
             .timestamp_opt(self.ts_seconds(), 0)
             .unwrap()
             .to_rfc3339();
-        format!("#EXT-X-PROGRAM-DATE-TIME:{}\n", date_str)
+        format!("#EXT-X-PROGRAM-DATE-TIME:{date_str}\n")
     }
 
     /// Convert entry into a segment in HLS manifest.
     pub fn to_segment(&self) -> String {
         if self.is_header {
-            return "".into();
+            return String::new();
         }
 
         let mut content = String::new();
@@ -100,11 +91,9 @@ impl Display for TsEntry {
     }
 }
 
-/// EntryStore is used to management stream segments, which is basicly a simple version of hls manifest,
-/// and of course, provids methods to generate hls manifest for frontend player.
+/// `EntryStore` is used to management stream segments, which is basically a simple version of hls manifest,
+/// and of course, provides methods to generate hls manifest for frontend player.
 pub struct EntryStore {
-    // append only log file
-    log_file: File,
     header: Option<TsEntry>,
     entries: Vec<TsEntry>,
     total_duration: f64,
@@ -118,15 +107,8 @@ impl EntryStore {
         if !Path::new(work_dir).exists().await {
             std::fs::create_dir_all(work_dir).unwrap();
         }
-        // open append only log file
-        let log_file = OpenOptions::new()
-            .create(true)
-            .append(true)
-            .open(format!("{}/{}", work_dir, ENTRY_FILE_NAME))
-            .await
-            .unwrap();
 
         let mut entry_store = Self {
-            log_file,
             header: None,
             entries: vec![],
             total_duration: 0.0,
@@ -143,14 +125,14 @@ impl EntryStore {
         let file = OpenOptions::new()
             .create(false)
             .read(true)
-            .open(format!("{}/{}", work_dir, ENTRY_FILE_NAME))
+            .open(format!("{work_dir}/{ENTRY_FILE_NAME}"))
             .await
             .unwrap();
         let mut lines = BufReader::new(file).lines();
         while let Some(Ok(line)) = lines.next().await {
             let entry = TsEntry::from(&line);
             if let Err(e) = entry {
-                log::error!("Failed to parse entry: {} {}", e, line);
+                log::error!("Failed to parse entry: {e} {line}");
                 continue;
             }
 
@@ -169,45 +151,8 @@ impl EntryStore {
         }
     }
 
-    pub async fn add_entry(&mut self, entry: TsEntry) {
-        if entry.is_header {
-            self.header = Some(entry.clone());
-        } else {
-            self.entries.push(entry.clone());
-        }
-
-        if let Err(e) = self.log_file.write_all(entry.to_string().as_bytes()).await {
-            log::error!("Failed to write entry to log file: {}", e);
-        }
-
-        self.log_file.flush().await.unwrap();
-
-        self.last_sequence = std::cmp::max(self.last_sequence, entry.sequence);
-
-        self.total_duration += entry.length;
-        self.total_size += entry.size;
-    }
-
-    pub fn get_header(&self) -> Option<&TsEntry> {
-        self.header.as_ref()
-    }
-
-    pub fn total_duration(&self) -> f64 {
-        self.total_duration
-    }
-
-    pub fn total_size(&self) -> u64 {
-        self.total_size
-    }
-
-    /// Get first timestamp in milliseconds
-    pub fn first_ts(&self) -> Option<i64> {
-        self.entries.first().map(|x| x.ts_mili())
-    }
-
-    /// Get last timestamp in milliseconds
-    pub fn last_ts(&self) -> Option<i64> {
-        self.entries.last().map(|x| x.ts_mili())
+    pub fn len(&self) -> usize {
+        self.entries.len()
     }
 
     /// Generate a hls manifest for selected range.

@@ -1,27 +1,49 @@
 use super::bilibili::client::BiliStream;
 use super::douyin::client::DouyinClientError;
-use custom_error::custom_error;
+use thiserror::Error;
 
-custom_error! {pub RecorderError
-    IndexNotFound {url: String} = "Index not found: {url}",
-    ArchiveInUse {live_id: String} = "Can not delete current stream: {live_id}",
-    EmptyCache = "Cache is empty",
-    M3u8ParseFailed {content: String } = "Parse m3u8 content failed: {content}",
-    NoStreamAvailable = "No available stream provided",
-    FreezedStream {stream: BiliStream} = "Stream is freezed: {stream}",
-    StreamExpired {stream: BiliStream} = "Stream is nearly expired: {stream}",
-    NoRoomInfo = "No room info provided",
-    InvalidStream {stream: BiliStream} = "Invalid stream: {stream}",
-    SlowStream {stream: BiliStream} = "Stream is too slow: {stream}",
-    EmptyHeader = "Header url is empty",
-    InvalidTimestamp = "Header timestamp is invalid",
-    InvalidDBOP {err: crate::database::DatabaseError } = "Database error: {err}",
-    BiliClientError {err: super::bilibili::errors::BiliClientError} = "BiliClient error: {err}",
-    DouyinClientError {err: DouyinClientError} = "DouyinClient error: {err}",
-    IoError {err: std::io::Error} = "IO error: {err}",
-    DanmuStreamError {err: danmu_stream::DanmuStreamError} = "Danmu stream error: {err}",
-    SubtitleNotFound {live_id: String} = "Subtitle not found: {live_id}",
-    SubtitleGenerationFailed {error: String} = "Subtitle generation failed: {error}",
-    FfmpegError {err: String} = "FFmpeg error: {err}",
-    ResolutionChanged {err: String} = "Resolution changed: {err}",
+#[derive(Error, Debug)]
+pub enum RecorderError {
+    #[error("Index not found: {url}")]
+    IndexNotFound { url: String },
+    #[error("Can not delete current stream: {live_id}")]
+    ArchiveInUse { live_id: String },
+    #[error("Cache is empty")]
+    EmptyCache,
+    #[error("Parse m3u8 content failed: {content}")]
+    M3u8ParseFailed { content: String },
+    #[error("No available stream provided")]
+    NoStreamAvailable,
+    #[error("Stream is freezed: {stream}")]
+    FreezedStream { stream: BiliStream },
+    #[error("Stream is nearly expired: {stream}")]
+    StreamExpired { stream: BiliStream },
+    #[error("No room info provided")]
+    NoRoomInfo,
+    #[error("Invalid stream: {stream}")]
+    InvalidStream { stream: BiliStream },
+    #[error("Stream is too slow: {stream}")]
+    SlowStream { stream: BiliStream },
+    #[error("Header url is empty")]
+    EmptyHeader,
+    #[error("Header timestamp is invalid")]
+    InvalidTimestamp,
+    #[error("Database error: {0}")]
+    InvalidDBOP(#[from] crate::database::DatabaseError),
+    #[error("BiliClient error: {0}")]
+    BiliClientError(#[from] super::bilibili::errors::BiliClientError),
+    #[error("DouyinClient error: {0}")]
+    DouyinClientError(#[from] DouyinClientError),
+    #[error("IO error: {0}")]
+    IoError(#[from] std::io::Error),
+    #[error("Danmu stream error: {0}")]
+    DanmuStreamError(#[from] danmu_stream::DanmuStreamError),
+    #[error("Subtitle not found: {live_id}")]
+    SubtitleNotFound { live_id: String },
+    #[error("Subtitle generation failed: {error}")]
+    SubtitleGenerationFailed { error: String },
+    #[error("Resolution changed: {err}")]
+    ResolutionChanged { err: String },
+    #[error("Ffmpeg error: {0}")]
+    FfmpegError(String),
 }

@@ -2,11 +2,13 @@ pub mod bilibili;
 pub mod danmu;
 pub mod douyin;
 pub mod errors;
+mod user_agent_generator;
 
-mod entry;
+pub mod entry;
 
 use async_trait::async_trait;
 use danmu::DanmuEntry;
+use m3u8_rs::MediaPlaylist;
 use std::hash::{Hash, Hasher};
 
 #[derive(Debug, Clone, Copy, PartialEq, Eq)]
@@ -46,7 +48,7 @@ impl Hash for PlatformType {
 
 #[derive(serde::Deserialize, serde::Serialize, Clone, Debug)]
 pub struct RecorderInfo {
-    pub room_id: u64,
+    pub room_id: i64,
     pub room_info: RoomInfo,
     pub user_info: UserInfo,
     pub total_length: f64,
@@ -59,7 +61,7 @@ pub struct RecorderInfo {
 
 #[derive(serde::Deserialize, serde::Serialize, Clone, Debug)]
 pub struct RoomInfo {
-    pub room_id: u64,
+    pub room_id: i64,
     pub room_title: String,
     pub room_cover: String,
 }
@@ -75,9 +77,8 @@ pub struct UserInfo {
 pub trait Recorder: Send + Sync + 'static {
     async fn run(&self);
     async fn stop(&self);
-    async fn first_segment_ts(&self, live_id: &str) -> i64;
-    async fn m3u8_content(&self, live_id: &str, start: i64, end: i64) -> String;
-    async fn master_m3u8(&self, live_id: &str, start: i64, end: i64) -> String;
+    async fn playlist(&self, live_id: &str, start: i64, end: i64) -> MediaPlaylist;
+    async fn get_related_playlists(&self, parent_id: &str) -> Vec<(String, String)>;
     async fn info(&self) -> RecorderInfo;
     async fn comments(&self, live_id: &str) -> Result<Vec<DanmuEntry>, errors::RecorderError>;
     async fn is_recording(&self, live_id: &str) -> bool;
src-tauri/src/recorder/user_agent_generator.rs (new file, 154 lines)
@@ -0,0 +1,154 @@
use rand::prelude::*;

pub struct UserAgentGenerator {
    rng: ThreadRng,
}

impl UserAgentGenerator {
    pub fn new() -> Self {
        Self { rng: thread_rng() }
    }

    pub fn generate(&mut self) -> String {
        let browser_type = self.rng.gen_range(0..4);

        match browser_type {
            0 => self.generate_chrome(),
            1 => self.generate_firefox(),
            2 => self.generate_safari(),
            _ => self.generate_edge(),
        }
    }

    fn generate_chrome(&mut self) -> String {
        let chrome_versions = [
            "120.0.0.0",
            "119.0.0.0",
            "118.0.0.0",
            "117.0.0.0",
            "116.0.0.0",
            "115.0.0.0",
            "114.0.0.0",
        ];
        let webkit_versions = ["537.36", "537.35", "537.34"];

        let os = self.get_random_os();
        let chrome_version = chrome_versions.choose(&mut self.rng).unwrap();
        let webkit_version = webkit_versions.choose(&mut self.rng).unwrap();

        format!(
            "Mozilla/5.0 ({os}) AppleWebKit/{webkit_version} (KHTML, like Gecko) Chrome/{chrome_version} Safari/{webkit_version}"
        )
    }

    fn generate_firefox(&mut self) -> String {
        let firefox_versions = ["121.0", "120.0", "119.0", "118.0", "117.0", "116.0"];

        let os = self.get_random_os_firefox();
        let firefox_version = firefox_versions.choose(&mut self.rng).unwrap();

        format!("Mozilla/5.0 ({os}; rv:{firefox_version}) Gecko/20100101 Firefox/{firefox_version}")
    }

    fn generate_safari(&mut self) -> String {
        let safari_versions = ["17.1", "17.0", "16.6", "16.5", "16.4", "16.3"];
        let webkit_versions = ["605.1.15", "605.1.14", "605.1.13"];

        let safari_version = safari_versions.choose(&mut self.rng).unwrap();
        let webkit_version = webkit_versions.choose(&mut self.rng).unwrap();

        // Safari only ships on macOS and iOS
        let is_mobile = self.rng.gen_bool(0.3);

        if is_mobile {
            let ios_versions = ["17_1", "16_7", "16_6", "15_7"];
            let ios_version = ios_versions.choose(&mut self.rng).unwrap();
            let device = ["iPhone; CPU iPhone OS", "iPad; CPU OS"]
                .choose(&mut self.rng)
                .unwrap();

            format!(
                "Mozilla/5.0 ({device} {ios_version} like Mac OS X) AppleWebKit/{webkit_version} (KHTML, like Gecko) Version/{safari_version} Mobile/15E148 Safari/{webkit_version}"
            )
        } else {
            let macos_versions = ["14_1", "13_6", "12_7"];
            let macos_version = macos_versions.choose(&mut self.rng).unwrap();

            format!(
                "Mozilla/5.0 (Macintosh; Intel Mac OS X {macos_version}) AppleWebKit/{webkit_version} (KHTML, like Gecko) Version/{safari_version} Safari/{webkit_version}"
            )
        }
    }

    fn generate_edge(&mut self) -> String {
        let edge_versions = ["119.0.0.0", "118.0.0.0", "117.0.0.0", "116.0.0.0"];
        let chrome_versions = ["119.0.0.0", "118.0.0.0", "117.0.0.0", "116.0.0.0"];

        let os = self.get_random_os();
        let edge_version = edge_versions.choose(&mut self.rng).unwrap();
        let chrome_version = chrome_versions.choose(&mut self.rng).unwrap();

        format!(
            "Mozilla/5.0 ({os}) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/{chrome_version} Safari/537.36 Edg/{edge_version}"
        )
    }

    fn get_random_os(&mut self) -> &'static str {
        let os_list = [
            "Windows NT 10.0; Win64; x64",
            "Windows NT 11.0; Win64; x64",
            "Macintosh; Intel Mac OS X 10_15_7",
            "Macintosh; Intel Mac OS X 10_14_6",
            "X11; Linux x86_64",
            "X11; Ubuntu; Linux x86_64",
        ];

        os_list.choose(&mut self.rng).unwrap()
    }

    fn get_random_os_firefox(&mut self) -> &'static str {
        let os_list = [
            "Windows NT 10.0; Win64; x64",
            "Windows NT 11.0; Win64; x64",
            "Macintosh; Intel Mac OS X 10.15",
            "X11; Linux x86_64",
            "X11; Ubuntu; Linux i686",
        ];

        os_list.choose(&mut self.rng).unwrap()
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_generate_user_agents() {
        let mut generator = UserAgentGenerator::new();

        for _ in 0..100 {
            let ua = generator.generate();
            assert!(!ua.is_empty());
            assert!(ua.starts_with("Mozilla/5.0"));

            // verify a common browser token is present
            assert!(
                ua.contains("Chrome")
                    || ua.contains("Firefox")
                    || ua.contains("Safari")
                    || ua.contains("Edg")
            );
        }
    }

    #[test]
    fn test_chrome_user_agent_format() {
        let mut generator = UserAgentGenerator::new();
        let ua = generator.generate_chrome();

        assert!(ua.contains("Chrome"));
        assert!(ua.contains("Safari"));
        assert!(ua.contains("AppleWebKit"));
    }
}
@@ -1,24 +1,26 @@
 use crate::config::Config;
 use crate::danmu2ass;
+use crate::database::recorder::RecorderRow;
 use crate::database::video::VideoRow;
 use crate::database::{account::AccountRow, record::RecordRow};
 use crate::database::{Database, DatabaseError};
-use crate::ffmpeg::{clip_from_m3u8, encode_video_danmu, Range};
-use crate::progress_reporter::{EventEmitter, ProgressReporter};
+use crate::ffmpeg::{encode_video_danmu, transcode, Range};
+use crate::progress::progress_reporter::{EventEmitter, ProgressReporter};
 use crate::recorder::bilibili::{BiliRecorder, BiliRecorderOptions};
 use crate::recorder::danmu::DanmuEntry;
 use crate::recorder::douyin::DouyinRecorder;
 use crate::recorder::errors::RecorderError;
-use crate::recorder::PlatformType;
-use crate::recorder::Recorder;
 use crate::recorder::RecorderInfo;
-use chrono::Utc;
-use custom_error::custom_error;
+use crate::recorder::{PlatformType, RoomInfo};
+use crate::recorder::{Recorder, UserInfo};
+use crate::webhook::events::{self, Payload};
+use crate::webhook::poster::WebhookPoster;
 use serde::{Deserialize, Serialize};
 use std::collections::{HashMap, HashSet};
 use std::path::{Path, PathBuf};
 use std::sync::atomic::AtomicBool;
 use std::sync::Arc;
+use thiserror::Error;
 use tokio::fs::{remove_file, write};
 use tokio::sync::broadcast;
 use tokio::sync::RwLock;
@@ -35,9 +37,10 @@ pub struct RecorderList {
 #[derive(Debug, Deserialize, Serialize, Clone)]
 pub struct ClipRangeParams {
     pub title: String,
+    pub note: String,
     pub cover: String,
     pub platform: String,
-    pub room_id: u64,
+    pub room_id: i64,
     pub live_id: String,
     pub range: Option<Range>,
     /// Encode danmu after clip
@@ -49,13 +52,23 @@ pub struct ClipRangeParams {
 
 #[derive(Debug, Clone)]
 pub enum RecorderEvent {
+    LiveStart {
+        recorder: RecorderInfo,
+    },
     LiveEnd {
+        room_id: i64,
         platform: PlatformType,
-        room_id: u64,
-        live_id: String,
+        recorder: RecorderInfo,
+    },
+    RecordStart {
+        recorder: RecorderInfo,
     },
+    RecordEnd {
+        recorder: RecorderInfo,
+    },
 }
 
 #[derive(Clone)]
 pub struct RecorderManager {
     #[cfg(not(feature = "headless"))]
     app_handle: AppHandle,
@@ -66,41 +79,34 @@ pub struct RecorderManager {
     to_remove: Arc<RwLock<HashSet<String>>>,
     event_tx: broadcast::Sender<RecorderEvent>,
     is_migrating: Arc<AtomicBool>,
+    webhook_poster: WebhookPoster,
 }
 
-custom_error! {pub RecorderManagerError
-    AlreadyExisted { room_id: u64 } = "房间 {room_id} 已存在",
-    NotFound {room_id: u64 } = "房间 {room_id} 不存在",
-    InvalidPlatformType { platform: String } = "不支持的平台: {platform}",
-    RecorderError { err: RecorderError } = "录播器错误: {err}",
-    IOError {err: std::io::Error } = "IO 错误: {err}",
-    HLSError { err: String } = "HLS 服务器错误: {err}",
-    DatabaseError { err: DatabaseError } = "数据库错误: {err}",
-    Recording { live_id: String } = "无法删除正在录制的直播 {live_id}",
-    ClipError { err: String } = "切片错误: {err}",
-}
-
-impl From<std::io::Error> for RecorderManagerError {
-    fn from(value: std::io::Error) -> Self {
-        RecorderManagerError::IOError { err: value }
-    }
-}
-
-impl From<RecorderError> for RecorderManagerError {
-    fn from(value: RecorderError) -> Self {
-        RecorderManagerError::RecorderError { err: value }
-    }
-}
-
-impl From<DatabaseError> for RecorderManagerError {
-    fn from(value: DatabaseError) -> Self {
-        RecorderManagerError::DatabaseError { err: value }
-    }
+#[derive(Error, Debug)]
+pub enum RecorderManagerError {
+    #[error("Recorder already exists: {room_id}")]
+    AlreadyExisted { room_id: i64 },
+    #[error("Recorder not found: {room_id}")]
+    NotFound { room_id: i64 },
+    #[error("Invalid platform type: {platform}")]
+    InvalidPlatformType { platform: String },
+    #[error("Recorder error: {0}")]
+    RecorderError(#[from] RecorderError),
+    #[error("IO error: {0}")]
+    IOError(#[from] std::io::Error),
+    #[error("HLS error: {err}")]
+    HLSError { err: String },
+    #[error("Database error: {0}")]
+    DatabaseError(#[from] DatabaseError),
+    #[error("Recording: {live_id}")]
+    Recording { live_id: String },
+    #[error("Clip error: {err}")]
+    ClipError { err: String },
 }
 
 impl From<RecorderManagerError> for String {
-    fn from(value: RecorderManagerError) -> Self {
-        value.to_string()
+    fn from(err: RecorderManagerError) -> Self {
+        err.to_string()
     }
 }
 
@@ -110,6 +116,7 @@ impl RecorderManager {
         emitter: EventEmitter,
         db: Arc<Database>,
         config: Arc<RwLock<Config>>,
+        webhook_poster: WebhookPoster,
     ) -> RecorderManager {
         let (event_tx, _) = broadcast::channel(100);
         let manager = RecorderManager {
@@ -122,6 +129,7 @@ impl RecorderManager {
             to_remove: Arc::new(RwLock::new(HashSet::new())),
             event_tx,
             is_migrating: Arc::new(AtomicBool::new(false)),
+            webhook_poster,
         };
 
         // Start event listener
@@ -138,20 +146,6 @@ impl RecorderManager {
         manager
     }
 
-    pub fn clone(&self) -> Self {
-        RecorderManager {
-            #[cfg(not(feature = "headless"))]
-            app_handle: self.app_handle.clone(),
-            emitter: self.emitter.clone(),
-            db: self.db.clone(),
-            config: self.config.clone(),
-            recorders: self.recorders.clone(),
-            to_remove: self.to_remove.clone(),
-            event_tx: self.event_tx.clone(),
-            is_migrating: self.is_migrating.clone(),
-        }
-    }
-
     pub fn get_event_sender(&self) -> broadcast::Sender<RecorderEvent> {
         self.event_tx.clone()
     }
@@ -160,108 +154,67 @@ impl RecorderManager {
         let mut rx = self.event_tx.subscribe();
         while let Ok(event) = rx.recv().await {
             match event {
+                RecorderEvent::LiveStart { recorder } => {
+                    let event =
+                        events::new_webhook_event(events::LIVE_STARTED, Payload::Room(recorder));
+                    let _ = self.webhook_poster.post_event(&event).await;
+                }
                 RecorderEvent::LiveEnd {
+                    room_id,
                     platform,
-                    room_id,
-                    live_id,
+                    recorder,
                 } => {
-                    self.handle_live_end(platform, room_id, &live_id).await;
+                    let event = events::new_webhook_event(
+                        events::LIVE_ENDED,
+                        Payload::Room(recorder.clone()),
+                    );
+                    let _ = self.webhook_poster.post_event(&event).await;
+                    self.handle_live_end(platform, room_id, &recorder).await;
                 }
+                RecorderEvent::RecordStart { recorder } => {
+                    let event =
+                        events::new_webhook_event(events::RECORD_STARTED, Payload::Room(recorder));
+                    let _ = self.webhook_poster.post_event(&event).await;
+                }
+                RecorderEvent::RecordEnd { recorder } => {
+                    let event =
+                        events::new_webhook_event(events::RECORD_ENDED, Payload::Room(recorder));
+                    let _ = self.webhook_poster.post_event(&event).await;
+                }
             }
         }
     }
 
-    async fn handle_live_end(&self, platform: PlatformType, room_id: u64, live_id: &str) {
+    async fn handle_live_end(&self, platform: PlatformType, room_id: i64, recorder: &RecorderInfo) {
         if !self.config.read().await.auto_generate.enabled {
             return;
         }
 
         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
-        log::info!("Start auto generate for {}", recorder_id);
-        let live_record = self.db.get_record(room_id, live_id).await;
+        log::info!("Start auto generate for {recorder_id}");
+        let live_id = recorder.current_live_id.clone();
+        let live_record = self.db.get_record(room_id, &live_id).await;
         if live_record.is_err() {
-            log::error!("Live not found in record: {} {}", room_id, live_id);
+            log::error!("Live not found in record: {room_id} {live_id}");
             return;
         }
 
-        let recorders = self.recorders.read().await;
-        let recorder = match recorders.get(&recorder_id) {
-            Some(recorder) => recorder,
-            None => {
-                log::error!("Recorder not found: {}", recorder_id);
-                return;
-            }
-        };
-
         let live_record = live_record.unwrap();
-        let encode_danmu = self.config.read().await.auto_generate.encode_danmu;
-
-        let clip_config = ClipRangeParams {
-            title: live_record.title,
-            cover: "".into(),
-            platform: live_record.platform.clone(),
-            room_id,
-            live_id: live_id.to_string(),
-            range: None,
-            danmu: encode_danmu,
-            local_offset: 0,
-            fix_encoding: false,
-        };
-
-        let clip_filename = self.config.read().await.generate_clip_name(&clip_config);
-
-        // add prefix [full] for clip_filename
-        let name_with_prefix = format!(
-            "[full]{}",
-            clip_filename.file_name().unwrap().to_str().unwrap()
-        );
-        let _ = clip_filename.with_file_name(name_with_prefix);
-
-        match self
-            .clip_range_on_recorder(&**recorder, None, clip_filename, &clip_config)
+        if let Err(e) = self
+            .generate_whole_clip(
+                None,
+                platform.as_str().to_string(),
+                room_id,
+                live_record.parent_id,
+            )
             .await
         {
-            Ok(f) => {
-                let metadata = match std::fs::metadata(&f) {
-                    Ok(metadata) => metadata,
-                    Err(e) => {
-                        log::error!("Failed to detect auto generated clip: {}", e);
-                        return;
-                    }
-                };
-                match self
-                    .db
-                    .add_video(&VideoRow {
-                        id: 0,
-                        status: 0,
-                        room_id,
-                        created_at: Utc::now().to_rfc3339(),
-                        cover: "".into(),
-                        file: f.file_name().unwrap().to_str().unwrap().to_string(),
-                        length: live_record.length,
-                        size: metadata.len() as i64,
-                        bvid: "".into(),
-                        title: "".into(),
-                        desc: "".into(),
-                        tags: "".into(),
-                        area: 0,
-                        platform: live_record.platform.clone(),
-                    })
-                    .await
-                {
-                    Ok(_) => {}
-                    Err(e) => {
-                        log::error!("Add auto generate clip record failed: {}", e)
-                    }
-                };
-            }
-            Err(e) => {
-                log::error!("Auto generate clip failed: {}", e)
-            }
+            log::error!("Failed to generate whole clip: {e}");
         }
     }
 
-    pub async fn set_migrating(&self, migrating: bool) {
+    pub fn set_migrating(&self, migrating: bool) {
         self.is_migrating
             .store(migrating, std::sync::atomic::Ordering::Relaxed);
     }
@@ -319,7 +272,12 @@ impl RecorderManager {
                 .add_recorder(&account, platform, room_id, extra, *auto_start)
                 .await
             {
-                log::error!("Failed to add recorder: {}", e);
+                log::error!(
+                    "Failed to add recorder: {} {} {}",
+                    platform.as_str(),
+                    room_id,
+                    e
+                );
             }
         }
         interval.tick().await;
@@ -330,7 +288,7 @@ impl RecorderManager {
         &self,
         account: &AccountRow,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
         extra: &str,
         auto_start: bool,
     ) -> Result<(), RecorderManagerError> {
@@ -402,8 +360,8 @@ impl RecorderManager {
     pub async fn remove_recorder(
         &self,
         platform: PlatformType,
-        room_id: u64,
-    ) -> Result<(), RecorderManagerError> {
+        room_id: i64,
+    ) -> Result<RecorderRow, RecorderManagerError> {
         // check recorder exists
         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
         if !self.recorders.read().await.contains_key(&recorder_id) {
@@ -411,24 +369,24 @@ impl RecorderManager {
         }
 
         // remove from db
-        self.db.remove_recorder(room_id).await?;
+        let recorder = self.db.remove_recorder(room_id).await?;
 
         // add to to_remove
-        log::debug!("Add to to_remove: {}", recorder_id);
+        log::debug!("Add to to_remove: {recorder_id}");
         self.to_remove.write().await.insert(recorder_id.clone());
 
         // stop recorder
-        log::debug!("Stop recorder: {}", recorder_id);
+        log::debug!("Stop recorder: {recorder_id}");
         if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
             recorder_ref.stop().await;
         }
 
         // remove recorder
-        log::debug!("Remove recorder from manager: {}", recorder_id);
+        log::debug!("Remove recorder from manager: {recorder_id}");
         self.recorders.write().await.remove(&recorder_id);
 
         // remove from to_remove
-        log::debug!("Remove from to_remove: {}", recorder_id);
+        log::debug!("Remove from to_remove: {recorder_id}");
         self.to_remove.write().await.remove(&recorder_id);
 
         // remove related cache folder
@@ -438,11 +396,11 @@ impl RecorderManager {
             platform.as_str(),
             room_id
         );
-        log::debug!("Remove cache folder: {}", cache_folder);
+        log::debug!("Remove cache folder: {cache_folder}");
         let _ = tokio::fs::remove_dir_all(cache_folder).await;
-        log::info!("Recorder {} cache folder removed", room_id);
+        log::info!("Recorder {room_id} cache folder removed");
 
-        Ok(())
+        Ok(recorder)
     }
 
     pub async fn clip_range(
@@ -454,7 +412,7 @@ impl RecorderManager {
         let recorders = self.recorders.read().await;
         let recorder_id = format!("{}:{}", params.platform, params.room_id);
         if !recorders.contains_key(&recorder_id) {
-            log::error!("Recorder {} not found", recorder_id);
+            log::error!("Recorder {recorder_id} not found");
             return Err(RecorderManagerError::NotFound {
                 room_id: params.room_id,
             });
@@ -473,64 +431,64 @@ impl RecorderManager {
         clip_file: PathBuf,
         params: &ClipRangeParams,
     ) -> Result<PathBuf, RecorderManagerError> {
-        let range_m3u8 = format!(
-            "{}/{}/{}/playlist.m3u8",
-            params.platform, params.room_id, params.live_id
-        );
-
-        let manifest_content = self.handle_hls_request(&range_m3u8).await?;
-        let mut manifest_content = String::from_utf8(manifest_content)
-            .map_err(|e| RecorderManagerError::ClipError { err: e.to_string() })?;
-
-        // if manifest is for stream, replace EXT-X-PLAYLIST-TYPE:EVENT to EXT-X-PLAYLIST-TYPE:VOD, and add #EXT-X-ENDLIST
-        if manifest_content.contains("#EXT-X-PLAYLIST-TYPE:EVENT") {
-            manifest_content =
-                manifest_content.replace("#EXT-X-PLAYLIST-TYPE:EVENT", "#EXT-X-PLAYLIST-TYPE:VOD");
-            manifest_content += "\n#EXT-X-ENDLIST\n";
-        }
-
         let cache_path = self.config.read().await.cache.clone();
         let cache_path = Path::new(&cache_path);
-        let random_filename = format!("manifest_{}.m3u8", uuid::Uuid::new_v4());
-        let tmp_manifest_file_path = cache_path
-            .join(&params.platform)
+        let playlist_path = cache_path
+            .join(params.platform.clone())
             .join(params.room_id.to_string())
-            .join(&params.live_id)
-            .join(random_filename);
+            .join(params.live_id.clone())
+            .join("playlist.m3u8");
 
-        // Write manifest content to temporary file
-        tokio::fs::write(&tmp_manifest_file_path, manifest_content.as_bytes())
-            .await
-            .map_err(|e| RecorderManagerError::ClipError { err: e.to_string() })?;
-
-        if let Err(e) = clip_from_m3u8(
-            reporter,
-            &tmp_manifest_file_path,
-            &clip_file,
-            params.range.as_ref(),
-            params.fix_encoding,
-        )
-        .await
-        {
-            log::error!("Failed to generate clip file: {}", e);
-            return Err(RecorderManagerError::ClipError { err: e.to_string() });
-        }
-
-        // remove temp file
-        let _ = tokio::fs::remove_file(tmp_manifest_file_path).await;
-
-        // check clip_file exists
-        if !clip_file.exists() {
-            log::error!("Clip file not found: {}", clip_file.display());
+        if !playlist_path.exists() {
+            log::error!("Playlist file not found: {}", playlist_path.display());
             return Err(RecorderManagerError::ClipError {
-                err: "Clip file not found".into(),
+                err: "Playlist file not found".to_string(),
             });
         }
 
+        crate::ffmpeg::playlist::playlist_to_video(
+            reporter,
+            &playlist_path,
+            &clip_file,
+            params.range.clone(),
+        )
+        .await
+        .map_err(|e| RecorderManagerError::ClipError { err: e.to_string() })?;
+
+        if params.fix_encoding {
+            // transcode clip_file
+            let tmp_clip_file = clip_file.with_extension("tmp.mp4");
+            if let Err(e) = transcode(reporter, &clip_file, &tmp_clip_file, false).await {
+                log::error!("Failed to transcode clip file: {e}");
+                return Err(RecorderManagerError::ClipError { err: e.to_string() });
+            }
+
+            // remove clip_file
+            let _ = tokio::fs::remove_file(&clip_file).await;
+
+            // rename tmp_clip_file to clip_file
+            let _ = tokio::fs::rename(tmp_clip_file, &clip_file).await;
+        }
+
         if !params.danmu {
             log::info!("Skip danmu encoding");
             return Ok(clip_file);
         }
 
+        let stream_start_timestamp_milis = recorder
+            .playlist(
+                &params.live_id,
+                params.range.as_ref().unwrap().start as i64,
+                params.range.as_ref().unwrap().end as i64,
+            )
+            .await
+            .segments
+            .first()
+            .unwrap()
+            .program_date_time
+            .unwrap()
+            .timestamp_millis();
+
         let danmus = recorder.comments(&params.live_id).await;
         if danmus.is_err() {
             log::error!("Failed to get danmus");
@@ -542,19 +500,23 @@ impl RecorderManager {
             params
                 .range
                 .as_ref()
-                .map_or("None".to_string(), |r| r.to_string()),
+                .map_or("None".to_string(), std::string::ToString::to_string),
             params.local_offset
         );
         let mut danmus = danmus.unwrap();
         log::debug!("First danmu entry: {:?}", danmus.first());
         log::debug!("Last danmu entry: {:?}", danmus.last());
+        log::debug!("Stream start timestamp: {}", stream_start_timestamp_milis);
+        log::debug!("Local offset: {}", params.local_offset);
+        log::debug!("Range: {:?}", params.range);
 
         if let Some(range) = &params.range {
             // update entry ts to offset and filter danmus in range
             for d in &mut danmus {
-                d.ts -= (range.start as i64 + params.local_offset) * 1000;
+                d.ts -= stream_start_timestamp_milis + params.local_offset * 1000;
             }
             if range.duration() > 0.0 {
-                danmus.retain(|x| x.ts >= 0 && x.ts <= (range.duration() as i64) * 1000);
+                danmus.retain(|x| x.ts >= 0 && x.ts <= (range.duration() * 1000.0).round() as i64);
             }
         }
 
@@ -586,13 +548,52 @@ impl RecorderManager {

     pub async fn get_recorder_list(&self) -> RecorderList {
         let mut summary = RecorderList {
-            count: self.recorders.read().await.len(),
+            count: 0,
             recorders: Vec::new(),
         };

+        // initialized recorder set
+        let mut recorder_set = HashSet::new();
         for recorder_ref in self.recorders.read().await.iter() {
             let room_info = recorder_ref.1.info().await;
-            summary.recorders.push(room_info);
+            summary.recorders.push(room_info.clone());
+            recorder_set.insert(room_info.room_id);
         }

+        // get recorders from db
+        let recorders = self.db.get_recorders().await;
+        if recorders.is_err() {
+            log::error!(
+                "Failed to get recorders from db: {}",
+                recorders.err().unwrap()
+            );
+            return summary;
+        }
+        let recorders = recorders.unwrap();
+        summary.count = recorders.len();
+        for recorder in recorders {
+            // check if recorder is in recorder_set
+            if !recorder_set.contains(&recorder.room_id) {
+                summary.recorders.push(RecorderInfo {
+                    room_id: recorder.room_id,
+                    platform: recorder.platform,
+                    auto_start: recorder.auto_start,
+                    live_status: false,
+                    is_recording: false,
+                    total_length: 0.0,
+                    current_live_id: "".to_string(),
+                    room_info: RoomInfo {
+                        room_id: recorder.room_id,
+                        room_title: recorder.room_id.to_string(),
+                        room_cover: "".to_string(),
+                    },
+                    user_info: UserInfo {
+                        user_id: "".to_string(),
+                        user_name: "".to_string(),
+                        user_avatar: "".to_string(),
+                    },
+                });
+            }
+        }
+
         summary.recorders.sort_by(|a, b| a.room_id.cmp(&b.room_id));
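The merge above unions currently-running recorders with rows persisted in the database, deduplicating by room id so offline rooms still show up as placeholder entries. A reduced sketch of the same pattern over bare room ids:

```rust
use std::collections::HashSet;

// Active rooms win; DB-only rooms are appended, then the result is sorted.
fn merge_rooms(active: &[i64], from_db: &[i64]) -> Vec<i64> {
    let mut seen: HashSet<i64> = active.iter().copied().collect();
    let mut merged: Vec<i64> = active.to_vec();
    for room in from_db {
        if seen.insert(*room) {
            merged.push(*room);
        }
    }
    merged.sort_unstable();
    merged
}
```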
@@ -602,7 +603,7 @@ impl RecorderManager {
     pub async fn get_recorder_info(
         &self,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
     ) -> Option<RecorderInfo> {
         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
         if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
@@ -613,13 +614,22 @@ impl RecorderManager {
         }
     }

-    pub async fn get_archives(&self, room_id: u64) -> Result<Vec<RecordRow>, RecorderManagerError> {
-        Ok(self.db.get_records(room_id).await?)
+    pub async fn get_archive_disk_usage(&self) -> Result<i64, RecorderManagerError> {
+        Ok(self.db.get_record_disk_usage().await?)
     }

+    pub async fn get_archives(
+        &self,
+        room_id: i64,
+        offset: i64,
+        limit: i64,
+    ) -> Result<Vec<RecordRow>, RecorderManagerError> {
+        Ok(self.db.get_records(room_id, offset, limit).await?)
+    }
+
     pub async fn get_archive(
         &self,
-        room_id: u64,
+        room_id: i64,
         live_id: &str,
     ) -> Result<RecordRow, RecorderManagerError> {
         Ok(self.db.get_record(room_id, live_id).await?)
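`get_archives` now pages through records with `offset`/`limit` instead of loading every row at once. A hypothetical caller draining all pages could look like this (sketch inside some async context, assuming a `manager: &RecorderManager` and a `room_id` in scope):

```rust
// Hypothetical pagination loop over the new offset/limit API.
let page_size: i64 = 50;
let mut offset: i64 = 0;
let mut all = Vec::new();
loop {
    let page = manager.get_archives(room_id, offset, page_size).await?;
    if page.is_empty() {
        break;
    }
    offset += page.len() as i64;
    all.extend(page);
}
```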
@@ -628,7 +638,7 @@ impl RecorderManager {
     pub async fn get_archive_subtitle(
         &self,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
         live_id: &str,
     ) -> Result<String, RecorderManagerError> {
         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -643,7 +653,7 @@ impl RecorderManager {
     pub async fn generate_archive_subtitle(
         &self,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
         live_id: &str,
     ) -> Result<String, RecorderManagerError> {
         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -658,10 +668,10 @@ impl RecorderManager {
     pub async fn delete_archive(
         &self,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
         live_id: &str,
-    ) -> Result<(), RecorderManagerError> {
-        log::info!("Deleting {}:{}", room_id, live_id);
+    ) -> Result<RecordRow, RecorderManagerError> {
+        log::info!("Deleting {room_id}:{live_id}");
         // check if this is still recording
         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
         if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
@@ -672,19 +682,34 @@ impl RecorderManager {
                 });
             }
         }
-        self.db.remove_record(live_id).await?;
+        let to_delete = self.db.remove_record(live_id).await?;
         let cache_folder = Path::new(self.config.read().await.cache.as_str())
             .join(platform.as_str())
             .join(room_id.to_string())
             .join(live_id);
         let _ = tokio::fs::remove_dir_all(cache_folder).await;
-        Ok(())
+        Ok(to_delete)
     }

+    pub async fn delete_archives(
+        &self,
+        platform: PlatformType,
+        room_id: i64,
+        live_ids: &[&str],
+    ) -> Result<Vec<RecordRow>, RecorderManagerError> {
+        log::info!("Deleting archives in batch: {live_ids:?}");
+        let mut to_deletes = Vec::new();
+        for live_id in live_ids {
+            let to_delete = self.delete_archive(platform, room_id, live_id).await?;
+            to_deletes.push(to_delete);
+        }
+        Ok(to_deletes)
+    }
+
     pub async fn get_danmu(
         &self,
         platform: PlatformType,
-        room_id: u64,
+        room_id: i64,
         live_id: &str,
     ) -> Result<Vec<DanmuEntry>, RecorderManagerError> {
         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -702,7 +727,7 @@ impl RecorderManager {
         let path_segs: Vec<&str> = path.split('/').collect();

         if path_segs.len() != 4 {
-            log::warn!("Invalid request path: {}", path);
+            log::warn!("Invalid request path: {path}");
             return Err(RecorderManagerError::HLSError {
                 err: "Invalid hls path".into(),
             });
@@ -710,7 +735,7 @@ impl RecorderManager {
         // parse recorder type
         let platform = path_segs[0];
         // parse room id
-        let room_id = path_segs[1].parse::<u64>().unwrap();
+        let room_id = path_segs[1].parse::<i64>().unwrap();
         // parse live id
         let live_id = path_segs[2];

@@ -733,8 +758,7 @@ impl RecorderManager {
             params
                 .iter()
                 .find(|param| param[0] == "start")
-                .map(|param| param[1].parse::<i64>().unwrap())
-                .unwrap_or(0)
+                .map_or(0, |param| param[1].parse::<i64>().unwrap())
         } else {
             0
         };
@@ -742,18 +766,18 @@ impl RecorderManager {
             params
                 .iter()
                 .find(|param| param[0] == "end")
-                .map(|param| param[1].parse::<i64>().unwrap())
-                .unwrap_or(0)
+                .map_or(0, |param| param[1].parse::<i64>().unwrap())
         } else {
             0
         };

         if path_segs[3] == "playlist.m3u8" {
             // get recorder
-            let recorder_key = format!("{}:{}", platform, room_id);
+            let recorder_key = format!("{platform}:{room_id}");
             let recorders = self.recorders.read().await;
             let recorder = recorders.get(&recorder_key);
             if recorder.is_none() {
+                log::warn!("Recorder not found: {recorder_key}");
                 return Err(RecorderManagerError::HLSError {
                     err: "Recorder not found".into(),
                 });
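Both the `start` and `end` query parameters collapse the previous `map(...).unwrap_or(0)` chain into a single `map_or`. A self-contained illustration of the pattern (the input shape is hypothetical, and this variant also defaults on parse failure instead of panicking):

```rust
// Parse a "key=value" pair list, defaulting to 0 when the key is absent.
fn parse_param(pairs: &[[&str; 2]], key: &str) -> i64 {
    pairs
        .iter()
        .find(|pair| pair[0] == key)
        .map_or(0, |pair| pair[1].parse::<i64>().unwrap_or(0))
}

fn main() {
    let pairs = [["start", "120"], ["end", "300"]];
    assert_eq!(parse_param(&pairs, "start"), 120);
    assert_eq!(parse_param(&pairs, "missing"), 0);
}
```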
@@ -761,38 +785,29 @@ impl RecorderManager {
             let recorder = recorder.unwrap();

-            // response with recorder generated m3u8, which contains ts entries that cached in local
-            let m3u8_content = recorder.m3u8_content(live_id, start, end).await;
+            log::debug!("Generating m3u8 for {live_id} with start {start} and end {end}");
+            let playlist = recorder.playlist(live_id, start, end).await;
+            let mut v: Vec<u8> = Vec::new();
+            playlist.write_to(&mut v).unwrap();
+            let m3u8_content: &str = std::str::from_utf8(&v).unwrap();

             Ok(m3u8_content.into())
-        } else if path_segs[3] == "master.m3u8" {
-            // get recorder
-            let recorder_key = format!("{}:{}", platform, room_id);
-            let recorders = self.recorders.read().await;
-            let recorder = recorders.get(&recorder_key);
-            if recorder.is_none() {
-                return Err(RecorderManagerError::HLSError {
-                    err: "Recorder not found".into(),
-                });
-            }
-            let recorder = recorder.unwrap();
-            let m3u8_content = recorder.master_m3u8(live_id, start, end).await;
-            Ok(m3u8_content.into())
         } else {
             // try to find requested ts file in recorder's cache
             // cache files are stored in {cache_dir}/{room_id}/{timestamp}/{ts_file}
             let ts_file = format!("{}/{}", cache_path, path.replace("%7C", "|"));
             let recorders = self.recorders.read().await;
-            let recorder_id = format!("{}:{}", platform, room_id);
+            let recorder_id = format!("{platform}:{room_id}");
             let recorder = recorders.get(&recorder_id);
             if recorder.is_none() {
-                log::warn!("Recorder not found: {}", recorder_id);
+                log::warn!("Recorder not found: {recorder_id}");
                 return Err(RecorderManagerError::HLSError {
                     err: "Recorder not found".into(),
                 });
             }
             let ts_file_content = tokio::fs::read(&ts_file).await;
             if ts_file_content.is_err() {
-                log::warn!("Segment file not found: {}", ts_file);
+                log::warn!("Segment file not found: {ts_file}");
                 return Err(RecorderManagerError::HLSError {
                     err: "Segment file not found".into(),
                 });
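The playlist branch now serializes a structured playlist object through `write_to` instead of templating m3u8 text by hand. A reduced sketch of that serialization step, assuming the `m3u8-rs` crate (which provides `MediaPlaylist::write_to` over any `Write`):

```rust
use m3u8_rs::MediaPlaylist;

// Serialize a MediaPlaylist into UTF-8 m3u8 text, as the handler above does.
fn playlist_to_string(playlist: &MediaPlaylist) -> Result<String, String> {
    let mut buf: Vec<u8> = Vec::new();
    playlist
        .write_to(&mut buf)
        .map_err(|e| format!("write failed: {e}"))?;
    String::from_utf8(buf).map_err(|e| format!("invalid utf-8: {e}"))
}
```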
@@ -802,10 +817,10 @@ impl RecorderManager {
         }
     }

-    pub async fn set_enable(&self, platform: PlatformType, room_id: u64, enabled: bool) {
+    pub async fn set_enable(&self, platform: PlatformType, room_id: i64, enabled: bool) {
         // update RecordRow auto_start field
         if let Err(e) = self.db.update_recorder(platform, room_id, enabled).await {
-            log::error!("Failed to update recorder auto_start: {}", e);
+            log::error!("Failed to update recorder auto_start: {e}");
         }

         let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -817,4 +832,112 @@ impl RecorderManager {
             }
         }
     }
+
+    pub async fn generate_whole_clip(
+        &self,
+        reporter: Option<&ProgressReporter>,
+        platform: String,
+        room_id: i64,
+        parent_id: String,
+    ) -> Result<(), RecorderManagerError> {
+        let recorder_id = format!("{}:{}", platform, room_id);
+        let recorders = self.recorders.read().await;
+        let recorder_ref = recorders.get(&recorder_id);
+        if recorder_ref.is_none() {
+            return Err(RecorderManagerError::NotFound { room_id });
+        };
+
+        let recorder_ref = recorder_ref.unwrap();
+        let playlists = recorder_ref.get_related_playlists(&parent_id).await;
+        if playlists.is_empty() {
+            log::error!("No related playlists found: {parent_id}");
+            return Ok(());
+        }
+
+        let title = playlists.first().unwrap().0.clone();
+        let playlists = playlists
+            .iter()
+            .map(|p| p.1.clone())
+            .collect::<Vec<String>>();
+
+        let sanitized_filename = sanitize_filename::sanitize(format!(
+            "[full][{platform}][{room_id}][{parent_id}]{title}.mp4"
+        ));
+        let output_filename = Path::new(&sanitized_filename);
+        let cover_filename = output_filename.with_extension("jpg");
+
+        let output_path =
+            Path::new(&self.config.read().await.output.as_str()).join(output_filename);
+
+        log::info!("Concat playlists: {playlists:?}");
+        log::info!("Output path: {output_path:?}");
+
+        let owned_path_bufs: Vec<std::path::PathBuf> =
+            playlists.iter().map(std::path::PathBuf::from).collect();
+
+        let playlists_refs: Vec<&std::path::Path> = owned_path_bufs
+            .iter()
+            .map(std::path::PathBuf::as_path)
+            .collect();
+
+        if let Err(e) =
+            crate::ffmpeg::playlist::playlists_to_video(reporter, &playlists_refs, &output_path)
+                .await
+        {
+            log::error!("Failed to concat playlists: {e}");
+            return Err(RecorderManagerError::HLSError {
+                err: "Failed to concat playlists".into(),
+            });
+        }
+
+        let metadata = std::fs::metadata(&output_path);
+        if metadata.is_err() {
+            return Err(RecorderManagerError::HLSError {
+                err: "Failed to get file metadata".into(),
+            });
+        }
+        let size = metadata.unwrap().len() as i64;
+
+        let video_metadata = crate::ffmpeg::extract_video_metadata(Path::new(&output_path)).await;
+        let mut length = 0;
+        if let Ok(video_metadata) = video_metadata {
+            length = video_metadata.duration as i64;
+        } else {
+            log::error!(
+                "Failed to get video metadata: {}",
+                video_metadata.err().unwrap()
+            );
+        }
+
+        let _ = crate::ffmpeg::generate_thumbnail(Path::new(&output_path), 0.0).await;
+
+        let video = self
+            .db
+            .add_video(&VideoRow {
+                id: 0,
+                status: 0,
+                room_id,
+                created_at: chrono::Local::now().to_rfc3339(),
+                cover: cover_filename.to_string_lossy().to_string(),
+                file: output_filename.to_string_lossy().to_string(),
+                note: "".into(),
+                length,
+                size,
+                bvid: String::new(),
+                title: String::new(),
+                desc: String::new(),
+                tags: String::new(),
+                area: 0,
+                platform: platform.clone(),
+            })
+            .await?;
+
+        let event =
+            events::new_webhook_event(events::CLIP_GENERATED, events::Payload::Clip(video.clone()));
+        if let Err(e) = self.webhook_poster.post_event(&event).await {
+            log::error!("Post webhook event error: {e}");
+        }
+
+        Ok(())
+    }
 }
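`generate_whole_clip` stitches every playlist that shares a `parent_id` into one file, probes its size and duration, writes a thumbnail, records a `VideoRow`, and fires a `clip.generated` webhook. A hypothetical invocation from some async caller (argument values are illustrative):

```rust
// Concatenate all recordings under one parent live into a single clip,
// without progress reporting.
manager
    .generate_whole_clip(None, "bilibili".to_string(), 123_456, "parent-live-id".to_string())
    .await?;
```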
@@ -1,4 +1,3 @@
-use custom_error::custom_error;
 use std::sync::Arc;
 use tokio::sync::RwLock;

@@ -6,21 +5,17 @@ use crate::config::Config;
 use crate::database::Database;
 use crate::recorder::bilibili::client::BiliClient;
 use crate::recorder_manager::RecorderManager;
+use crate::webhook::poster::WebhookPoster;

 #[cfg(feature = "headless")]
-use crate::progress_manager::ProgressManager;
-
-custom_error! {
-    StateError
-    RecorderAlreadyExists = "Recorder already exists",
-    RecorderCreateError = "Recorder create error",
-}
+use crate::progress::progress_manager::ProgressManager;

 #[derive(Clone)]
 pub struct State {
     pub db: Arc<Database>,
     pub client: Arc<BiliClient>,
     pub config: Arc<RwLock<Config>>,
+    pub webhook_poster: WebhookPoster,
     pub recorder_manager: Arc<RecorderManager>,
     #[cfg(not(feature = "headless"))]
     pub app_handle: tauri::AppHandle,
@@ -1,7 +1,7 @@
 use async_trait::async_trait;
 use std::path::Path;

-use crate::progress_reporter::ProgressReporterTrait;
+use crate::progress::progress_reporter::ProgressReporterTrait;

 pub mod whisper_cpp;
 pub mod whisper_online;
@@ -1,7 +1,7 @@
 use async_trait::async_trait;

 use crate::{
-    progress_reporter::ProgressReporterTrait,
+    progress::progress_reporter::ProgressReporterTrait,
     subtitle_generator::{GenerateResult, SubtitleGeneratorType},
 };
 use async_std::sync::{Arc, RwLock};
@@ -22,7 +22,7 @@ pub async fn new(model: &Path, prompt: &str) -> Result<WhisperCPP, String> {
         WhisperContextParameters::default(),
     )
     .map_err(|e| {
-        log::error!("Create whisper context failed: {}", e);
+        log::error!("Create whisper context failed: {e}");
         e.to_string()
     })?;
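Context creation around this hunk follows the `whisper-rs` API. A reduced sketch of loading a model with default parameters (assumes the `whisper-rs` crate):

```rust
use whisper_rs::{WhisperContext, WhisperContextParameters};

// Load a ggml model file into a reusable whisper context.
fn load_model(model_path: &str) -> Result<WhisperContext, String> {
    WhisperContext::new_with_params(model_path, WhisperContextParameters::default())
        .map_err(|e| format!("Create whisper context failed: {e}"))
}
```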
@@ -65,7 +65,7 @@ impl SubtitleGenerator for WhisperCPP {
         params.set_print_timestamps(false);

         params.set_progress_callback_safe(move |p| {
-            log::info!("Progress: {}%", p);
+            log::info!("Progress: {p}%");
         });

         let mut inter_samples = vec![Default::default(); samples.len()];
@@ -88,7 +88,7 @@ impl SubtitleGenerator for WhisperCPP {
             reporter.update("生成字幕中");
         }
         if let Err(e) = state.full(params, &samples[..]) {
-            log::error!("failed to run model: {}", e);
+            log::error!("failed to run model: {e}");
             return Err(e.to_string());
         }

@@ -107,10 +107,7 @@ impl SubtitleGenerator for WhisperCPP {
             let milliseconds = ((timestamp - hours * 3600.0 - minutes * 60.0 - seconds)
                 * 1000.0)
                 .floor() as u32;
-            format!(
-                "{:02}:{:02}:{:02},{:03}",
-                hours, minutes, seconds, milliseconds
-            )
+            format!("{hours:02}:{minutes:02}:{seconds:02},{milliseconds:03}")
         };

         let line = format!(
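The hunk above only swaps positional format arguments for Rust 2021 inline captures; the timestamp math is unchanged. A standalone version of the SRT timestamp formatter for reference (the derivation of `hours`/`minutes`/`seconds` is reconstructed from the surrounding code):

```rust
// Convert seconds (f64) into an SRT "HH:MM:SS,mmm" timestamp.
fn srt_timestamp(timestamp: f64) -> String {
    let hours = (timestamp / 3600.0).floor();
    let minutes = ((timestamp - hours * 3600.0) / 60.0).floor();
    let seconds = (timestamp - hours * 3600.0 - minutes * 60.0).floor();
    let milliseconds =
        ((timestamp - hours * 3600.0 - minutes * 60.0 - seconds) * 1000.0).floor() as u32;
    format!("{hours:02}:{minutes:02}:{seconds:02},{milliseconds:03}")
}

fn main() {
    assert_eq!(srt_timestamp(3723.5), "01:02:03,500");
}
```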
@@ -126,12 +123,12 @@ impl SubtitleGenerator for WhisperCPP {

         log::info!("Time taken: {} seconds", start_time.elapsed().as_secs_f64());

-        let subtitle_content = srtparse::from_str(&subtitle)
-            .map_err(|e| format!("Failed to parse subtitle: {}", e))?;
+        let subtitle_content =
+            srtparse::from_str(&subtitle).map_err(|e| format!("Failed to parse subtitle: {e}"))?;

         Ok(GenerateResult {
             generator_type: SubtitleGeneratorType::Whisper,
-            subtitle_id: "".to_string(),
+            subtitle_id: String::new(),
             subtitle_content,
         })
     }
@@ -154,14 +151,14 @@ mod tests {
     #[async_trait]
     impl ProgressReporterTrait for MockReporter {
         fn update(&self, message: &str) {
-            println!("Mock update: {}", message);
+            println!("Mock update: {message}");
         }

         async fn finish(&self, success: bool, message: &str) {
             if success {
-                println!("Mock finish: {}", message);
+                println!("Mock finish: {message}");
             } else {
-                println!("Mock error: {}", message);
+                println!("Mock error: {message}");
             }
         }
     }
@@ -178,6 +175,7 @@ mod tests {
     }

     #[tokio::test]
+    #[ignore = "Might not have enough memory to run this test"]
     async fn generate_subtitle() {
         let whisper = new(Path::new("tests/model/ggml-tiny-q5_1.bin"), "")
             .await
@@ -188,7 +186,7 @@ mod tests {
             .generate_subtitle(Some(&reporter), audio_path, "auto")
             .await;
         if let Err(e) = result {
-            println!("Error: {}", e);
+            println!("Error: {e}");
             panic!("Failed to generate subtitle");
         }
     }
@@ -5,7 +5,7 @@ use std::path::Path;
 use tokio::fs;

 use crate::{
-    progress_reporter::ProgressReporterTrait,
+    progress::progress_reporter::ProgressReporterTrait,
     subtitle_generator::{GenerateResult, SubtitleGenerator, SubtitleGeneratorType},
 };

@@ -37,7 +37,7 @@ pub async fn new(
     let client = Client::builder()
         .timeout(std::time::Duration::from_secs(300)) // 5 minutes timeout
         .build()
-        .map_err(|e| format!("Failed to create HTTP client: {}", e))?;
+        .map_err(|e| format!("Failed to create HTTP client: {e}"))?;

     let api_url = api_url.unwrap_or("https://api.openai.com/v1");
     let api_url = api_url.to_string() + "/audio/transcriptions";
@@ -45,8 +45,8 @@ pub async fn new(
     Ok(WhisperOnline {
         client,
         api_url: api_url.to_string(),
-        api_key: api_key.map(|k| k.to_string()),
-        prompt: prompt.map(|p| p.to_string()),
+        api_key: api_key.map(std::string::ToString::to_string),
+        prompt: prompt.map(std::string::ToString::to_string),
     })
 }

@@ -67,7 +67,7 @@ impl SubtitleGenerator for WhisperOnline {
         }
         let audio_data = fs::read(audio_path)
             .await
-            .map_err(|e| format!("Failed to read audio file: {}", e))?;
+            .map_err(|e| format!("Failed to read audio file: {e}"))?;

         // Get file extension for proper MIME type
         let file_extension = audio_path
@@ -86,7 +86,7 @@ impl SubtitleGenerator for WhisperOnline {
         // Build form data with proper file part
         let file_part = reqwest::multipart::Part::bytes(audio_data)
             .mime_str(mime_type)
-            .map_err(|e| format!("Failed to set MIME type: {}", e))?
+            .map_err(|e| format!("Failed to set MIME type: {e}"))?
             .file_name(
                 audio_path
                     .file_name()
@@ -111,7 +111,7 @@ impl SubtitleGenerator for WhisperOnline {
         let mut req_builder = self.client.post(&self.api_url);

         if let Some(api_key) = &self.api_key {
-            req_builder = req_builder.header("Authorization", format!("Bearer {}", api_key));
+            req_builder = req_builder.header("Authorization", format!("Bearer {api_key}"));
         }

         if let Some(reporter) = reporter {
@@ -122,15 +122,14 @@ impl SubtitleGenerator for WhisperOnline {
             .multipart(form)
             .send()
             .await
-            .map_err(|e| format!("HTTP request failed: {}", e))?;
+            .map_err(|e| format!("HTTP request failed: {e}"))?;

         let status = response.status();
         if !status.is_success() {
             let error_text = response.text().await.unwrap_or_default();
-            log::error!("API request failed with status {}: {}", status, error_text);
+            log::error!("API request failed with status {status}: {error_text}");
             return Err(format!(
-                "API request failed with status {}: {}",
-                status, error_text
+                "API request failed with status {status}: {error_text}"
             ));
         }
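The upload path here is standard `reqwest` multipart. A reduced, self-contained sketch of the same request shape (assumes `reqwest` with the `multipart` feature; the `file`/`model` field names follow the OpenAI-style transcription endpoint this module targets):

```rust
// Minimal multipart transcription upload; MIME type and file name are fixed
// here for brevity, whereas the code above derives them from the input path.
async fn upload(client: &reqwest::Client, url: &str, audio: Vec<u8>) -> Result<String, String> {
    let part = reqwest::multipart::Part::bytes(audio)
        .mime_str("audio/wav")
        .map_err(|e| format!("Failed to set MIME type: {e}"))?
        .file_name("audio.wav");
    let form = reqwest::multipart::Form::new()
        .part("file", part)
        .text("model", "whisper-1");
    let response = client
        .post(url)
        .multipart(form)
        .send()
        .await
        .map_err(|e| format!("HTTP request failed: {e}"))?;
    response
        .text()
        .await
        .map_err(|e| format!("Failed to get response text: {e}"))
}
```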
@@ -138,17 +137,14 @@ impl SubtitleGenerator for WhisperOnline {
         let response_text = response
             .text()
             .await
-            .map_err(|e| format!("Failed to get response text: {}", e))?;
+            .map_err(|e| format!("Failed to get response text: {e}"))?;

         // Try to parse as JSON
         let whisper_response: WhisperResponse =
             serde_json::from_str(&response_text).map_err(|e| {
-                println!("{}", response_text);
-                log::error!(
-                    "Failed to parse JSON response. Raw response: {}",
-                    response_text
-                );
-                format!("Failed to parse response: {}", e)
+                println!("{response_text}");
+                log::error!("Failed to parse JSON response. Raw response: {response_text}");
+                format!("Failed to parse response: {e}")
             })?;

         // Generate SRT format subtitle
@@ -161,10 +157,7 @@ impl SubtitleGenerator for WhisperOnline {
             let milliseconds = ((timestamp - hours * 3600.0 - minutes * 60.0 - seconds)
                 * 1000.0)
                 .floor() as u32;
-            format!(
-                "{:02}:{:02}:{:02},{:03}",
-                hours, minutes, seconds, milliseconds
-            )
+            format!("{hours:02}:{minutes:02}:{seconds:02},{milliseconds:03}")
         };

         let line = format!(
@@ -180,12 +173,12 @@ impl SubtitleGenerator for WhisperOnline {

         log::info!("Time taken: {} seconds", start_time.elapsed().as_secs_f64());

-        let subtitle_content = srtparse::from_str(&subtitle)
-            .map_err(|e| format!("Failed to parse subtitle: {}", e))?;
+        let subtitle_content =
+            srtparse::from_str(&subtitle).map_err(|e| format!("Failed to parse subtitle: {e}"))?;

         Ok(GenerateResult {
             generator_type: SubtitleGeneratorType::WhisperOnline,
-            subtitle_id: "".to_string(),
+            subtitle_id: String::new(),
             subtitle_content,
         })
     }
@@ -203,14 +196,14 @@ mod tests {
     #[async_trait]
     impl ProgressReporterTrait for MockReporter {
         fn update(&self, message: &str) {
-            println!("Mock update: {}", message);
+            println!("Mock update: {message}");
         }

         async fn finish(&self, success: bool, message: &str) {
             if success {
-                println!("Mock finish: {}", message);
+                println!("Mock finish: {message}");
             } else {
-                println!("Mock error: {}", message);
+                println!("Mock error: {message}");
             }
         }
     }
@@ -228,6 +221,7 @@ mod tests {
     }

     #[tokio::test]
+    #[ignore = "requres api key"]
     async fn test_generate_subtitle() {
         let result = new(Some("https://api.openai.com/v1"), Some("sk-****"), None).await;
         assert!(result.is_ok());
@@ -239,7 +233,7 @@ mod tests {
             "auto",
         )
         .await;
-        println!("{:?}", result);
+        println!("{result:?}");
         assert!(result.is_ok());
         let result = result.unwrap();
         println!("{:?}", result.subtitle_content);
src-tauri/src/webhook/events.rs (new file, 47 lines)
@@ -0,0 +1,47 @@
+use uuid::Uuid;
+
+use crate::{
+    database::{account::AccountRow, record::RecordRow, recorder::RecorderRow, video::VideoRow},
+    recorder::RecorderInfo,
+};
+
+pub const CLIP_GENERATED: &str = "clip.generated";
+pub const CLIP_DELETED: &str = "clip.deleted";
+
+pub const RECORD_STARTED: &str = "record.started";
+pub const RECORD_ENDED: &str = "record.ended";
+
+pub const LIVE_STARTED: &str = "live.started";
+pub const LIVE_ENDED: &str = "live.ended";
+
+pub const ARCHIVE_DELETED: &str = "archive.deleted";
+
+pub const RECORDER_REMOVED: &str = "recorder.removed";
+pub const RECORDER_ADDED: &str = "recorder.added";
+
+#[derive(serde::Serialize, serde::Deserialize, Debug, Clone)]
+pub struct WebhookEvent {
+    pub id: String,
+    pub event: String,
+    pub payload: Payload,
+    pub timestamp: i64,
+}
+
+#[derive(serde::Serialize, serde::Deserialize, Debug, Clone)]
+#[serde(untagged)]
+pub enum Payload {
+    Account(AccountRow),
+    Recorder(RecorderRow),
+    Room(RecorderInfo),
+    Clip(VideoRow),
+    Archive(RecordRow),
+}
+
+pub fn new_webhook_event(event_type: &str, payload: Payload) -> WebhookEvent {
+    WebhookEvent {
+        id: Uuid::new_v4().to_string(),
+        event: event_type.to_string(),
+        payload,
+        timestamp: chrono::Utc::now().timestamp(),
+    }
+}
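Because `Payload` is `#[serde(untagged)]`, the payload's fields serialize directly with no variant wrapper. A sketch of what a delivered `clip.generated` event body might look like (all values illustrative, built with `serde_json` for clarity):

```rust
// Illustrative shape of a serialized WebhookEvent carrying a Clip payload;
// the `payload` object holds the VideoRow's own fields (file, size, ...).
let example = serde_json::json!({
    "id": "2f6e2d9c-1b62-4e3a-9a57-0c9a4d1c2b3e",
    "event": "clip.generated",
    "payload": {
        "room_id": 123456,
        "file": "[full][bilibili][123456][parent-live-id]title.mp4",
        "size": 1073741824_i64
    },
    "timestamp": 1735689600_i64
});
```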
src-tauri/src/webhook/mod.rs (new file, 2 lines)
@@ -0,0 +1,2 @@
+pub mod events;
+pub mod poster;
Some files were not shown because too many files have changed in this diff.