Mirror of <https://github.com/Xinrea/bili-shadowreplay.git>, synced 2025-11-25 04:22:24 +08:00.

# Compare Commits

56 commits.
Commit SHAs:

`14d03b7eb9` `6f1db6c038` `cd2d208e5c` `7d6ec72002` `837cb6a978` `aeeb0c08d7` `72d8a7f485` `5d3692c7a0` `7e54231bef` `80a885dbf3` `134c6bbb5f` `49a153adf7` `99e15b0bda` `4de8a73af2` `d104ba3180` `abf0d4748f` `d2a9c44601` `c269558bae` `cc22453a40` `d525d92de4` `2197dfe65c` `38ee00f474` `8fdad41c71` `f269995bb7` `03a2db8c44` `6d9cd3c6a8` `303b2f7036` `ec25c2ffd9` `50ab608ddb` `3c76be9b81` `ab7f0cf0b4` `f9f590c4dc` `8d38fe582a` `dc4a26561d` `10c1d1f3a8` `66bcf53d01` `8ab4b7d693` `ce2f097d32` `f7575cd327` `8634c6a211` `b070013efc` `d2d9112f6c` `9fea18f2de` `74480f91ce` `b2e13b631f` `001d995c8f` `8cb2acea88` `7c0d57d84e` `8cb875f449` `e6bbe65723` `f4a71a2476` `47b9362b0a` `c1aad0806e` `4ccc90f9fb` `7dc63440e6` `4094e8b80d`
**`.cursor/rules/ai-features.mdc`** (new file, +44)

# AI Features and LangChain Integration

## AI Components

- **LangChain Integration**: Uses `@langchain/core`, `@langchain/deepseek`, `@langchain/langgraph`, `@langchain/ollama`
- **Whisper Transcription**: Local and online transcription via `whisper-rs` in Rust backend
- **AI Agent**: Located in [src/lib/agent/](mdc:src/lib/agent/) directory

## Frontend AI Features

- **AI Page**: [src/page/AI.svelte](mdc:src/page/AI.svelte) - Main AI interface
- **Agent Logic**: [src/lib/agent/](mdc:src/lib/agent/) - AI agent implementation
- **Interface**: [src/lib/interface.ts](mdc:src/lib/interface.ts) - AI communication layer

## Backend AI Features

- **Subtitle Generation**: [src-tauri/src/subtitle_generator/](mdc:src-tauri/src/subtitle_generator/) - AI-powered subtitle creation
- **Whisper Integration**: [src-tauri/src/subtitle_generator.rs](mdc:src-tauri/src/subtitle_generator.rs) - Speech-to-text processing
- **CUDA Support**: Optional CUDA acceleration for Whisper via feature flag

## AI Workflows

- **Live Transcription**: Real-time speech-to-text during live streams
- **Content Summarization**: AI-powered content analysis and summarization
- **Smart Editing**: AI-assisted video editing and clip generation
- **Danmaku Processing**: AI analysis of danmaku (bullet comments) streams

## Configuration

- **LLM Settings**: Configure AI models in [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)
- **Whisper Models**: Local model configuration for offline transcription
- **API Keys**: External AI service configuration for online features

## Development Notes

- AI features require proper model configuration
- CUDA feature enables GPU acceleration for Whisper
- LangChain integration supports multiple AI providers
- AI agent can work with both local and cloud-based models

description:
globs:
alwaysApply: true

---
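The subtitle generator's output format is not shown in this diff; as an illustrative sketch only (in TypeScript rather than the project's Rust), formatting an SRT cue timestamp — a common target for speech-to-text subtitle output — looks like:

```typescript
// Illustrative sketch: format seconds as an SRT timestamp ("HH:MM:SS,mmm").
// The project's real generator lives in src-tauri/src/subtitle_generator/.
function srtTimestamp(seconds: number): string {
  const ms = Math.round(seconds * 1000);
  const h = Math.floor(ms / 3_600_000);
  const m = Math.floor((ms % 3_600_000) / 60_000);
  const s = Math.floor((ms % 60_000) / 1000);
  const rem = ms % 1000;
  const pad = (n: number, w: number) => String(n).padStart(w, "0");
  return `${pad(h, 2)}:${pad(m, 2)}:${pad(s, 2)},${pad(rem, 3)}`;
}

console.log(srtTimestamp(61.5)); // "00:01:01,500"
```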
**`.cursor/rules/build-deployment.mdc`** (new file, +53)

# Build and Deployment Configuration

## Build Scripts

- **PowerShell**: [build.ps1](mdc:build.ps1) - Windows build script
- **FFmpeg Setup**: [ffmpeg_setup.ps1](mdc:ffmpeg_setup.ps1) - FFmpeg installation script
- **Version Bump**: [scripts/bump.cjs](mdc:scripts/bump.cjs) - Version management script

## Package Management

- **Node.js**: [package.json](mdc:package.json) - Frontend dependencies and scripts
- **Rust**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml) - Backend dependencies and features
- **Lock Files**: [yarn.lock](mdc:yarn.lock) - Yarn dependency lock

## Build Configuration

- **Vite**: [vite.config.ts](mdc:vite.config.ts) - Frontend build tool configuration
- **Tailwind**: [tailwind.config.cjs](mdc:tailwind.config.cjs) - CSS framework configuration
- **PostCSS**: [postcss.config.cjs](mdc:postcss.config.cjs) - CSS processing configuration
- **TypeScript**: [tsconfig.json](mdc:tsconfig.json), [tsconfig.node.json](mdc:tsconfig.node.json) - TypeScript configuration

## Tauri Configuration

- **Main Config**: [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json) - Core Tauri settings
- **Platform Configs**:
  - [src-tauri/tauri.macos.conf.json](mdc:src-tauri/tauri.macos.conf.json) - macOS specific
  - [src-tauri/tauri.linux.conf.json](mdc:src-tauri/tauri.linux.conf.json) - Linux specific
  - [src-tauri/tauri.windows.conf.json](mdc:src-tauri/tauri.windows.conf.json) - Windows specific
  - [src-tauri/tauri.windows.cuda.conf.json](mdc:src-tauri/tauri.windows.cuda.conf.json) - Windows with CUDA

## Docker Support

- **Dockerfile**: [Dockerfile](mdc:Dockerfile) - Container deployment configuration
- **Documentation**: [docs/](mdc:docs/) - VitePress-based documentation site

## Build Commands

- **Frontend**: `yarn build` - Build production frontend
- **Tauri**: `yarn tauri build` - Build desktop application
- **Documentation**: `yarn docs:build` - Build documentation site
- **Type Check**: `yarn check` - TypeScript and Svelte validation

## Deployment Targets

- **Desktop**: Native Tauri applications for Windows, macOS, Linux
- **Docker**: Containerized deployment option
- **Documentation**: Static site deployment via VitePress
- **Assets**: Static asset distribution for web components

description:
globs:
alwaysApply: true

---
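The diff does not show what `scripts/bump.cjs` actually does; as a hedged sketch of what a semver-style version-bump helper typically looks like (TypeScript, illustrative only — not the real script):

```typescript
// Hypothetical sketch of a version-bump helper in the spirit of scripts/bump.cjs.
type Level = "major" | "minor" | "patch";

function bump(version: string, level: Level): string {
  // Parse "MAJOR.MINOR.PATCH" into numbers.
  let [major, minor, patch] = version.split(".").map(Number);
  if (level === "major") { major += 1; minor = 0; patch = 0; }
  else if (level === "minor") { minor += 1; patch = 0; }
  else { patch += 1; }
  return `${major}.${minor}.${patch}`;
}

console.log(bump("2.13.4", "minor")); // "2.14.0"
```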
**`.cursor/rules/database-data.mdc`** (new file, +53)

# Database and Data Management

## Database Architecture

- **SQLite Database**: Primary data storage using `sqlx` with async runtime
- **Database Module**: [src-tauri/src/database/](mdc:src-tauri/src/database/) - Core database operations
- **Migration System**: [src-tauri/src/migration.rs](mdc:src-tauri/src/migration.rs) - Database schema management

## Data Models

- **Recording Data**: Stream metadata, recording sessions, and file information
- **Room Configuration**: Stream room settings and platform credentials
- **Task Management**: Recording task status and progress tracking
- **User Preferences**: Application settings and user configurations

## Frontend Data Layer

- **Database Interface**: [src/lib/db.ts](mdc:src/lib/db.ts) - Frontend database operations
- **Stores**: [src/lib/stores/](mdc:src/lib/stores/) - State management for data
- **Version Management**: [src/lib/stores/version.ts](mdc:src/lib/stores/version.ts) - Version tracking

## Data Operations

- **CRUD Operations**: Create, read, update, delete for all data entities
- **Query Optimization**: Efficient SQL queries with proper indexing
- **Transaction Support**: ACID compliance for critical operations
- **Data Validation**: Input validation and sanitization

## File Management

- **Cache Directory**: [src-tauri/cache/](mdc:src-tauri/cache/) - Temporary file storage
- **Upload Directory**: [src-tauri/cache/uploads/](mdc:src-tauri/cache/uploads/) - User upload storage
- **Bilibili Cache**: [src-tauri/cache/bilibili/](mdc:src-tauri/cache/bilibili/) - Platform-specific cache

## Data Persistence

- **SQLite Files**: [src-tauri/data/data_v2.db](mdc:src-tauri/data/data_v2.db) - Main database file
- **Write-Ahead Logging**: WAL mode for concurrent access and performance
- **Backup Strategy**: Database backup and recovery procedures
- **Migration Handling**: Automatic schema updates and data migration

## Development Guidelines

- Use prepared statements to prevent SQL injection
- Implement proper error handling for database operations
- Use transactions for multi-step operations
- Follow database naming conventions consistently
- Test database operations with sample data

description:
globs:
alwaysApply: true

---
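The "automatic schema updates" idea can be sketched as versioned migrations applied in ascending order. This is an illustrative TypeScript sketch under assumed names; the real logic lives in `src-tauri/src/migration.rs` and its details are not shown in this diff:

```typescript
// Hypothetical sketch of ordered schema migrations.
interface Migration {
  version: number;
  sql: string; // schema change to apply at this version
}

// Return the migrations that still need to run, in ascending version order.
function pendingMigrations(all: Migration[], currentVersion: number): Migration[] {
  return [...all]
    .sort((a, b) => a.version - b.version)
    .filter((m) => m.version > currentVersion);
}

const migrations: Migration[] = [
  { version: 1, sql: "CREATE TABLE rooms (id INTEGER PRIMARY KEY);" },
  { version: 2, sql: "ALTER TABLE rooms ADD COLUMN platform TEXT;" },
];
// A database at schema version 1 only needs migration 2.
console.log(pendingMigrations(migrations, 1).map((m) => m.version)); // [2]
```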
**`.cursor/rules/frontend-development.mdc`** (new file, +45)

# Frontend Development Guidelines

## Svelte 3 Best Practices

- Use Svelte 3 syntax with `<script>` tags for component logic
- Prefer reactive statements with `$:` for derived state
- Use stores from [src/lib/stores/](mdc:src/lib/stores/) for global state management
- Import components from [src/lib/components/](mdc:src/lib/components/)

## TypeScript Configuration

- Follow the configuration in [tsconfig.json](mdc:tsconfig.json)
- Use strict type checking with `checkJs: true`
- Extends `@tsconfig/svelte` for Svelte-specific TypeScript settings
- Base URL is set to workspace root for clean imports

## Component Structure

- **Page components**: Located in [src/page/](mdc:src/page/) directory
- **Reusable components**: Located in [src/lib/components/](mdc:src/lib/components/) directory
- **Layout components**: [src/App.svelte](mdc:src/App.svelte), [src/AppClip.svelte](mdc:src/AppClip.svelte), [src/AppLive.svelte](mdc:src/AppLive.svelte)

## Styling

- Use Tailwind CSS classes for styling
- Configuration in [tailwind.config.cjs](mdc:tailwind.config.cjs)
- PostCSS configuration in [postcss.config.cjs](mdc:postcss.config.cjs)
- Global styles in [src/styles.css](mdc:src/styles.css)

## Entry Points

- **Main app**: [src/main.ts](mdc:src/main.ts) - Main application entry
- **Clip mode**: [src/main_clip.ts](mdc:src/main_clip.ts) - Clip editing interface
- **Live mode**: [src/main_live.ts](mdc:src/main_live.ts) - Live streaming interface

## Development Workflow

- Use `yarn dev` for frontend-only development
- Use `yarn tauri dev` for full Tauri development
- Use `yarn check` for TypeScript and Svelte type checking

description:
globs:
alwaysApply: true

---
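The store pattern recommended above (subscribe/set/update) can be sketched as a hand-rolled writable store. This is illustrative only — Svelte's actual `svelte/store` implementation differs in details:

```typescript
// Minimal sketch of the writable-store contract used for global state.
type Subscriber<T> = (value: T) => void;

function writable<T>(value: T) {
  const subs = new Set<Subscriber<T>>();
  return {
    subscribe(fn: Subscriber<T>): () => void {
      subs.add(fn);
      fn(value); // Svelte stores notify new subscribers immediately
      return () => subs.delete(fn);
    },
    set(next: T): void {
      value = next;
      subs.forEach((fn) => fn(value));
    },
    update(fn: (v: T) => T): void {
      this.set(fn(value));
    },
  };
}

const count = writable(0);
const seen: number[] = [];
const unsubscribe = count.subscribe((v) => seen.push(v));
count.update((n) => n + 1);
unsubscribe();
console.log(seen); // [0, 1]
```

In a Svelte component the same store would be read with the `$count` auto-subscription syntax instead of calling `subscribe` by hand.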
**`.cursor/rules/project-overview.mdc`** (new file, +47)

# BiliBili ShadowReplay Project Overview

This is a Tauri-based desktop application for caching live streams and performing real-time editing and submission. It supports Bilibili and Douyin platforms.

## Project Structure

### Frontend (Svelte + TypeScript)

- **Main entry points**: [src/main.ts](mdc:src/main.ts), [src/main_clip.ts](mdc:src/main_clip.ts), [src/main_live.ts](mdc:src/main_live.ts)
- **App components**: [src/App.svelte](mdc:src/App.svelte), [src/AppClip.svelte](mdc:src/AppClip.svelte), [src/AppLive.svelte](mdc:src/AppLive.svelte)
- **Pages**: Located in [src/page/](mdc:src/page/) directory
- **Components**: Located in [src/lib/components/](mdc:src/lib/components/) directory
- **Stores**: Located in [src/lib/stores/](mdc:src/lib/stores/) directory

### Backend (Rust + Tauri)

- **Main entry**: [src-tauri/src/main.rs](mdc:src-tauri/src/main.rs)
- **Core modules**:
  - [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Stream recording functionality
  - [src-tauri/src/database/](mdc:src-tauri/src/database/) - Database operations
  - [src-tauri/src/handlers/](mdc:src-tauri/src/handlers/) - Tauri command handlers
- **Custom crate**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Danmaku stream processing

### Configuration

- **Frontend config**: [tsconfig.json](mdc:tsconfig.json), [vite.config.ts](mdc:vite.config.ts), [tailwind.config.cjs](mdc:tailwind.config.cjs)
- **Backend config**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml), [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json)
- **Example config**: [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)

## Key Technologies

- **Frontend**: Svelte 3, TypeScript, Tailwind CSS, Flowbite
- **Backend**: Rust, Tauri 2, SQLite, FFmpeg
- **AI Features**: LangChain, Whisper for transcription
- **Build Tools**: Vite, VitePress for documentation

## Development Commands

- `yarn dev` - Start development server
- `yarn tauri dev` - Start Tauri development
- `yarn build` - Build frontend
- `yarn docs:dev` - Start documentation server

description:
globs:
alwaysApply: true

---
**`.cursor/rules/rust-backend.mdc`** (new file, +47)

# Rust Backend Development Guidelines

## Project Structure

- **Main entry**: [src-tauri/src/main.rs](mdc:src-tauri/src/main.rs) - Application entry point
- **Core modules**:
  - [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Stream recording and management
  - [src-tauri/src/database/](mdc:src-tauri/src/database/) - SQLite database operations
  - [src-tauri/src/handlers/](mdc:src-tauri/src/handlers/) - Tauri command handlers
  - [src-tauri/src/subtitle_generator/](mdc:src-tauri/src/subtitle_generator/) - AI-powered subtitle generation

## Custom Crates

- **danmu_stream**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Danmaku stream processing library

## Dependencies

- **Tauri 2**: Core framework for desktop app functionality
- **FFmpeg**: Video/audio processing via `async-ffmpeg-sidecar`
- **Whisper**: AI transcription via `whisper-rs` (CUDA support available)
- **LangChain**: AI agent functionality
- **SQLite**: Database via `sqlx` with async runtime

## Configuration

- **Cargo.toml**: [src-tauri/Cargo.toml](mdc:src-tauri/Cargo.toml) - Dependencies and features
- **Tauri config**: [src-tauri/tauri.conf.json](mdc:src-tauri/tauri.conf.json) - App configuration
- **Example config**: [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml) - User configuration template

## Features

- **default**: Includes GUI and core functionality
- **cuda**: Enables CUDA acceleration for Whisper transcription
- **headless**: Headless mode without GUI
- **custom-protocol**: Required for production builds

## Development Commands

- `yarn tauri dev` - Start Tauri development with hot reload
- `yarn tauri build` - Build production application
- `cargo check` - Check Rust code without building
- `cargo test` - Run Rust tests

description:
globs:
alwaysApply: true

---
**`.cursor/rules/streaming-recording.mdc`** (new file, +53)

# Streaming and Recording System

## Core Recording Components

- **Recorder Manager**: [src-tauri/src/recorder_manager.rs](mdc:src-tauri/src/recorder_manager.rs) - Main recording orchestration
- **Recorder**: [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) - Individual stream recording logic
- **Danmaku Stream**: [src-tauri/crates/danmu_stream/](mdc:src-tauri/crates/danmu_stream/) - Custom crate for bullet comment processing

## Supported Platforms

- **Bilibili**: Main platform support with live stream caching
- **Douyin**: Support for TikTok's Chinese counterpart
- **Multi-stream**: Support for recording multiple streams simultaneously

## Recording Features

- **Live Caching**: Real-time stream recording and buffering
- **Time-based Clipping**: Extract specific time segments from recorded streams
- **Danmaku Capture**: Record bullet comments and chat messages
- **Quality Control**: Configurable recording quality and format options

## Frontend Interfaces

- **Live Mode**: [src/AppLive.svelte](mdc:src/AppLive.svelte) - Live streaming interface
- **Clip Mode**: [src/AppClip.svelte](mdc:src/AppClip.svelte) - Video editing and clipping
- **Room Management**: [src/page/Room.svelte](mdc:src/page/Room.svelte) - Stream room configuration
- **Task Management**: [src/page/Task.svelte](mdc:src/page/Task.svelte) - Recording task monitoring

## Technical Implementation

- **FFmpeg Integration**: Video/audio processing via `async-ffmpeg-sidecar`
- **M3U8 Support**: HLS stream processing with `m3u8-rs`
- **Async Processing**: Non-blocking I/O with `tokio` runtime
- **Database Storage**: SQLite for metadata and recording information

## Configuration

- **Recording Settings**: Configure in [src-tauri/config.example.toml](mdc:src-tauri/config.example.toml)
- **FFmpeg Path**: Set FFmpeg binary location for video processing
- **Storage Paths**: Configure cache and output directories
- **Quality Settings**: Adjust recording bitrate and format options

## Development Workflow

- Use [src-tauri/src/recorder/](mdc:src-tauri/src/recorder/) for core recording logic
- Test with [src-tauri/tests/](mdc:src-tauri/tests/) directory
- Monitor recording progress via progress manager
- Handle errors gracefully with custom error types

description:
globs:
alwaysApply: true

---
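Time-based clipping reduces to selecting the cached HLS segments that overlap a requested time range. A sketch under assumed names (TypeScript; the real recorder logic is in Rust and its segment model is not shown here):

```typescript
// Hypothetical sketch: pick cached segments overlapping [start, end).
interface Segment {
  file: string;
  start: number;    // seconds from the beginning of the recording
  duration: number; // seconds
}

function segmentsForClip(segments: Segment[], start: number, end: number): Segment[] {
  // A segment overlaps the range iff it begins before the range ends
  // and ends after the range begins.
  return segments.filter((s) => s.start < end && s.start + s.duration > start);
}

const cached: Segment[] = [
  { file: "0.ts", start: 0, duration: 4 },
  { file: "1.ts", start: 4, duration: 4 },
  { file: "2.ts", start: 8, duration: 4 },
];
console.log(segmentsForClip(cached, 5, 9).map((s) => s.file)); // ["1.ts", "2.ts"]
```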
**`.devcontainer/Dockerfile`** (new file, +36)

    ARG VARIANT=bookworm-slim
    FROM debian:${VARIANT}
    ENV DEBIAN_FRONTEND=noninteractive

    # Arguments
    ARG CONTAINER_USER=vscode
    ARG CONTAINER_GROUP=vscode

    # Install dependencies
    RUN apt-get update \
        && apt-get install -y \
        build-essential \
        clang \
        cmake \
        curl \
        file \
        git \
        libayatana-appindicator3-dev \
        librsvg2-dev \
        libssl-dev \
        libwebkit2gtk-4.1-dev \
        libxdo-dev \
        pkg-config \
        wget \
        && apt-get clean -y && rm -rf /var/lib/apt/lists/* /tmp/library-scripts

    # Set users
    RUN adduser --disabled-password --gecos "" ${CONTAINER_USER}
    USER ${CONTAINER_USER}
    WORKDIR /home/${CONTAINER_USER}

    # Install rustup
    RUN curl --proto "=https" --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --profile minimal
    ENV PATH=${PATH}:/home/${CONTAINER_USER}/.cargo/bin

    CMD [ "/bin/bash" ]
**`.devcontainer/devcontainer.json`** (new file, +31)

    {
      "name": "vscode",
      "build": {
        "dockerfile": "Dockerfile",
        "args": {
          "CONTAINER_USER": "vscode",
          "CONTAINER_GROUP": "vscode"
        }
      },
      "features": {
        "ghcr.io/devcontainers/features/node:1": {
          "version": "latest"
        }
      },
      "customizations": {
        "vscode": {
          "settings": {
            "lldb.executable": "/usr/bin/lldb",
            "files.watcherExclude": {
              "**/target/**": true
            }
          },
          "extensions": [
            "vadimcn.vscode-lldb",
            "rust-lang.rust-analyzer",
            "tamasfe.even-better-toml"
          ]
        }
      },
      "remoteUser": "vscode"
    }
**`.github/ISSUE_TEMPLATE/bug_report.md`** (deleted, -21)

---
name: Bug report
about: Submit a bug
title: "[BUG]"
labels: bug
assignees: Xinrea
---

**Description:**
Briefly describe what the bug looks like.

**Logs and screenshots:**
If possible, attach relevant screenshots and log files (the log is a file named bsr.log in the installation directory).

**Related information:**

- Program version:
- OS type:

**Other**
Anything else you want to add.
**`.github/ISSUE_TEMPLATE/bug_report.yml`** (new file, +47)

    name: Bug Report
    description: Submit a bug report.
    title: "[bug] "
    labels: ["bug"]
    assignees:
      - Xinrea
    body:
      - type: checkboxes
        attributes:
          label: Submission checklist
          description: Please confirm the following
          options:
            - label: I found this issue on the latest version
              required: true
            - label: I have read the [FAQ](https://bsr.xinrea.cn/usage/faq.html)
              required: true
      - type: dropdown
        id: app_type
        attributes:
          label: How are you using the software?
          multiple: false
          options:
            - Docker image
            - Desktop app
      - type: dropdown
        id: os
        attributes:
          label: Operating environment
          multiple: false
          options:
            - Linux
            - Windows
            - MacOS
            - Docker
      - type: textarea
        attributes:
          label: Bug description
          description: Please describe the bug and how to reproduce it in as much detail as possible
        validations:
          required: true
      - type: textarea
        id: logs
        attributes:
          label: Logs
          description: Please paste the log content or upload the log file (the settings page of the main window has a button that opens the log directory; once there, enter the logs directory and find the file with the log extension)
        validations:
          required: true
**`.github/ISSUE_TEMPLATE/feature_request.md`** (deleted, -20)

---
name: Feature request
about: Suggest a new feature
title: "[feature]"
labels: enhancement
assignees: Xinrea

---

**Problem encountered:**
What problem during usage made you want to make this suggestion?

**Desired feature:**
What new feature would solve this problem?

**Possible implementation (if you have ideas):**
If you have an implementation idea or a reference, you can provide it here.

**Other:**
Anything else you want to say.
**`.github/ISSUE_TEMPLATE/feature_request.yml`** (new file, +13)

    name: Feature Request
    description: Submit a feature request
    title: "[feature] "
    labels: ["feature"]
    assignees:
      - Xinrea
    body:
      - type: textarea
        attributes:
          label: Feature description
          description: Please describe the feature you want in as much detail as possible
        validations:
          required: true
**`.github/workflows/check.yml`** (new file, +43)

    name: Check

    on:
      pull_request:
        branches: [ "main" ]
        paths:
          - 'src-tauri/**'
          - '.github/workflows/check.yml'

    jobs:
      check:
        runs-on: ubuntu-latest

        steps:
          - name: Checkout repository
            uses: actions/checkout@v4

          - name: Setup Rust
            uses: dtolnay/rust-toolchain@stable
            with:
              components: rustfmt

          - name: Cache cargo registry
            uses: actions/cache@v4
            with:
              path: |
                ~/.cargo/registry
                ~/.cargo/git
                src-tauri/target
              key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}

          - name: Install dependencies (ubuntu only)
            run: |
              sudo apt-get update
              sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf ffmpeg

          - name: Check formatting
            run: cargo fmt --check
            working-directory: src-tauri

          - name: Check tests
            run: cargo test -v && cargo test --no-default-features --features headless -v
            working-directory: src-tauri
**`.github/workflows/main.yml`**

    @@ -59,11 +59,6 @@ jobs:
            if: matrix.platform == 'windows-latest' && matrix.features == 'cuda'
            uses: Jimver/cuda-toolkit@v0.2.24

          - name: Rust cache
            uses: swatinem/rust-cache@v2
            with:
              workspaces: "./src-tauri -> target"

          - name: Setup ffmpeg
            if: matrix.platform == 'windows-latest'
            working-directory: ./
    @@ -87,6 +82,19 @@
              Copy-Item "$cudaPath\cublas64*.dll" -Destination $targetPath
              Copy-Item "$cudaPath\cublasLt64*.dll" -Destination $targetPath

          - name: Get previous tag
            id: get_previous_tag
            run: |
              # Get the previous tag (excluding the current one being pushed)
              PREVIOUS_TAG=$(git describe --tags --abbrev=0 HEAD~1 2>/dev/null || echo "")
              if [ -z "$PREVIOUS_TAG" ]; then
                # If no previous tag found, use the first commit
                PREVIOUS_TAG=$(git rev-list --max-parents=0 HEAD | head -1)
              fi
              echo "previous_tag=$PREVIOUS_TAG" >> $GITHUB_OUTPUT
              echo "current_tag=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
            shell: bash

          - uses: tauri-apps/tauri-action@v0
            env:
              GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    @@ -96,7 +104,7 @@
            with:
              tagName: v__VERSION__
              releaseName: "BiliBili ShadowReplay v__VERSION__"
              releaseBody: "See the assets to download this version and install."
              releaseBody: "> [!NOTE]\n> If this is your first time downloading and installing, please see [installation guide](https://bsr.xinrea.cn/getting-started/installation/desktop.html) to choose the right build.\n> Changelog: https://github.com/Xinrea/bili-shadowreplay/compare/${{ steps.get_previous_tag.outputs.previous_tag }}...${{ steps.get_previous_tag.outputs.current_tag }}"
              releaseDraft: true
              prerelease: false
              args: ${{ matrix.args }} ${{ matrix.platform == 'windows-latest' && matrix.features == 'cuda' && '--config src-tauri/tauri.windows.cuda.conf.json' || '' }}
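The fallback logic of the "Get previous tag" workflow step — use the tag before the current one, otherwise fall back to the first commit — can be restated outside the shell script (illustrative TypeScript; tag values here are made up):

```typescript
// Sketch of the changelog base-ref selection: given the repo's tags in
// oldest-first order (the current tag being the last entry), pick the
// previous tag, or the first commit when no earlier tag exists.
function previousRef(tagsOldestFirst: string[], firstCommit: string): string {
  return tagsOldestFirst.length >= 2
    ? tagsOldestFirst[tagsOldestFirst.length - 2]
    : firstCommit;
}

console.log(previousRef(["v2.13.0", "v2.14.0"], "abc123")); // "v2.13.0"
console.log(previousRef(["v1.0.0"], "abc123"));             // "abc123"
```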
**`.gitignore`** (+1)

    @@ -11,6 +11,7 @@ node_modules
    dist
    dist-ssr
    *.local
    /target/

    # Editor directories and files
    .vscode/*
**`Dockerfile`**

    @@ -65,9 +65,16 @@ RUN apt-get update && apt-get install -y \
        libssl3 \
        ca-certificates \
        fonts-wqy-microhei \
        netbase \
        nscd \
        && update-ca-certificates \
        && rm -rf /var/lib/apt/lists/*

    RUN touch /etc/netgroup
    RUN mkdir -p /var/run/nscd && chmod 755 /var/run/nscd
    RUN nscd

    # Add /app to PATH
    ENV PATH="/app:${PATH}"

    @@ -83,4 +90,4 @@ COPY --from=rust-builder /app/src-tauri/ffprobe ./ffprobe
    EXPOSE 3000

    # Run the application
    CMD ["./bili-shadowreplay"]
    CMD ["sh", "-c", "nscd && ./bili-shadowreplay"]
**`README.md`**

@@ -4,9 +4,9 @@

(project badges)

[](https://deepwiki.com/Xinrea/bili-shadowreplay)

BiliBili ShadowReplay is a tool for caching live streams and performing real-time editing and submission. By marking a time range and filling in a few required details, you can produce a clip of a live stream and submit it, compressing the whole workflow to minutes. It also supports replaying cached past streams, with the same clip-editing and submission workflow.

@@ -22,7 +22,9 @@

## Contributing

[Contributing](.github/CONTRIBUTING.md)

You can get to know this project through [DeepWiki](https://deepwiki.com/Xinrea/bili-shadowreplay).

Contribution guide: [Contributing](.github/CONTRIBUTING.md)

## Sponsorship
**`docs/.vitepress/config.mts`**

    @@ -1,7 +1,8 @@
    import { defineConfig } from "vitepress";
    import { withMermaid } from "vitepress-plugin-mermaid";

    // https://vitepress.dev/reference/site-config
    export default defineConfig({
    export default withMermaid({
      title: "BiliBili ShadowReplay",
      description: "Live recording / real-time replay / clipping / submission tool",
      themeConfig: {
    @@ -53,6 +54,7 @@
              { text: "Clipping", link: "/usage/features/clip" },
              { text: "Subtitles", link: "/usage/features/subtitle" },
              { text: "Danmaku", link: "/usage/features/danmaku" },
              { text: "Webhook", link: "/usage/features/webhook" },
            ],
          },
          { text: "FAQ", link: "/usage/faq" },
    @@ -60,7 +62,12 @@
        },
        {
          text: "Developer Docs",
          items: [{ text: "Architecture Design", link: "/develop/architecture" }],
          items: [
            {
              text: "DeepWiki",
              link: "https://deepwiki.com/Xinrea/bili-shadowreplay",
            },
          ],
        },
      ],
**`docs/develop/architecture.md`** (deleted, -1)

@@ -1 +0,0 @@

# Architecture Design
**`docs/index.md`**

    @@ -14,7 +14,7 @@ hero:
          link: /getting-started/installation/desktop
        - theme: alt
          text: Documentation
          link: /usage/features/room_manage
          link: /usage/features/workflow

    features:
      - icon: 📹
**`docs/public/images/whole_clip.png`** (new binary file, 67 KiB; not shown)

A binary image was removed (516 KiB; its name is not shown in this view).

**`docs/public/videos/room_remove.mp4`** (new binary file; not shown)
**`docs/usage/faq.md`** (new file, +31)

# FAQ

## 1. Where do I report problems?

You can file an issue at [Github Issues](https://github.com/Xinrea/bili-shadowreplay/issues/new?template=bug_report.md), or join the [feedback group](https://qm.qq.com/q/v4lrE6gyum).

1. Before filing, read the other FAQ entries to make sure your question is not already answered;
2. Make sure your program is updated to the latest version;
3. Be prepared to provide your program's log file so the problem can be located.

## 2. Where do I find the logs?

The settings page of the main window has a button that opens the log directory. Once there, enter the `logs` directory and find the file with the `log` extension; this is the log file to provide to the developers.

## 3. Cannot preview live streams or generate clips

If you are a macOS or Linux user, make sure `ffmpeg` and `ffprobe` are installed; if you don't know how to install them, see [FFmpeg configuration](/getting-started/config/ffmpeg).

If you are a Windows user, the program directory ships with `ffmpeg` and `ffprobe`; if you still cannot preview streams or generate clips, please report it to the developers.

## 4. Error -352 when adding a Bilibili live room

The `-352` error is caused by Bilibili's risk-control mechanism. If you add a large number of Bilibili rooms for recording, you can increase the room-status polling interval on the settings page to avoid triggering it; if you hit this error with only a few rooms, please report it to the developers.

## 5. Why are recordings stored as fragment files?

The recording files in the cache directory are not meant for direct playback or submission; they exist for stream preview and real-time replay. If you need a recording file for submission, open the recording's preview window, use the keyboard shortcut to create a selection, and generate a clip of the desired range; clips are regular mp4 files stored in your configured clip directory.

If you use BSR purely as recording software, you can enable "whole-stream clip generation" in the settings; BSR will then automatically generate a clip of the entire stream after it ends.
@@ -13,8 +13,26 @@

Douyin live rooms are special: when the stream is offline you cannot find the room's entry page, so you need to find the room's web address while it is live and note down its room number.

Douyin rooms require the streamer's sec_uid, which you can find in the URL of the streamer's profile page, e.g. the `MS4wLjABAAAA` in `https://www.douyin.com/user/MS4wLjABAAAA`.

### Quickly adding a room via deep linking

<video src="/videos/deeplinking.mp4" loop autoplay muted style="border-radius: 10px;"></video>

While watching a stream in the browser, replace the `https://` in the room URL in the address bar with `bsr://` to bring up BSR and add the room.

## Enabling/disabling a room

Click the menu button in the top-right corner of a room card and choose enable/disable.

- When enabled, recording starts automatically when the room goes live
- When disabled, recording does not start automatically when the room goes live

## Removing a room

> [!CAUTION]
> Removing a room deletes all recordings associated with it. Proceed with care.

Click the menu button in the top-right corner of the room card and choose remove.

<video src="/videos/room_remove.mp4" loop autoplay muted style="border-radius: 10px;"></video>
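The `https://` to `bsr://` substitution used for deep linking can be expressed as a tiny helper (hypothetical; the docs describe editing the address bar by hand):

```typescript
// Turn a live-room URL into a bsr:// deep link that launches BSR.
function toDeepLink(liveUrl: string): string {
  if (!liveUrl.startsWith("https://")) {
    throw new Error("expected an https:// live-room URL");
  }
  return "bsr://" + liveUrl.slice("https://".length);
}

console.log(toDeepLink("https://live.bilibili.com/843610"));
// "bsr://live.bilibili.com/843610"
```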
245
docs/usage/features/webhook.md
Normal file
245
docs/usage/features/webhook.md
Normal file
@@ -0,0 +1,245 @@
# Webhook

> [!NOTE]
> You can use <https://webhook.site> to test the webhook feature.

## Setting the Webhook

Open the BSR settings page and set the webhook URL in the basic settings.

## Webhook Events
### Live-Room Events

#### Room Added

```json
{
  "id": "a96a5e9f-9857-4c13-b889-91da2ace208a",
  "event": "recorder.added",
  "payload": {
    "room_id": 26966466,
    "created_at": "2025-09-07T03:33:14.258796+00:00",
    "platform": "bilibili",
    "auto_start": true,
    "extra": ""
  },
  "timestamp": 1757215994
}
```

#### Room Removed

```json
{
  "id": "e33623d4-e040-4390-88f5-d351ceeeace7",
  "event": "recorder.removed",
  "payload": {
    "room_id": 27183290,
    "created_at": "2025-08-30T10:54:18.569198+00:00",
    "platform": "bilibili",
    "auto_start": true,
    "extra": ""
  },
  "timestamp": 1757217015
}
```
### Live-Stream Events

> [!NOTE]
> The start and end of a live stream do not necessarily coincide with the start and end of recording.

#### Live Started

```json
{
  "id": "f12f3424-f7d8-4b2f-a8b7-55477411482e",
  "event": "live.started",
  "payload": {
    "room_id": 843610,
    "room_info": {
      "room_id": 843610,
      "room_title": "登顶!",
      "room_cover": "https://i0.hdslb.com/bfs/live/new_room_cover/73aea43f4b4624c314d62fea4b424822fb506dfb.jpg"
    },
    "user_info": {
      "user_id": "475210",
      "user_name": "Xinrea",
      "user_avatar": "https://i1.hdslb.com/bfs/face/91beb3bf444b295fe12bae1f3dc6d9fc4fe4c224.jpg"
    },
    "total_length": 0,
    "current_live_id": "",
    "live_status": false,
    "is_recording": false,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757217190
}
```

#### Live Ended

```json
{
  "id": "e8b0756a-02f9-4655-b5ae-a170bf9547bd",
  "event": "live.ended",
  "payload": {
    "room_id": 843610,
    "room_info": {
      "room_id": 843610,
      "room_title": "登顶!",
      "room_cover": "https://i0.hdslb.com/bfs/live/new_room_cover/73aea43f4b4624c314d62fea4b424822fb506dfb.jpg"
    },
    "user_info": {
      "user_id": "475210",
      "user_name": "Xinrea",
      "user_avatar": "https://i1.hdslb.com/bfs/face/91beb3bf444b295fe12bae1f3dc6d9fc4fe4c224.jpg"
    },
    "total_length": 0,
    "current_live_id": "",
    "live_status": true,
    "is_recording": false,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757217365
}
```
### Recording Events

#### Recording Started

```json
{
  "id": "5ec1ea10-2b31-48fd-8deb-f2d7d2ea5985",
  "event": "record.started",
  "payload": {
    "room_id": 26966466,
    "room_info": {
      "room_id": 26966466,
      "room_title": "早安獭獭栞!下播前抽fufu",
      "room_cover": "https://i0.hdslb.com/bfs/live/user_cover/b810c36855168034557e905e5916b1dba1761fa4.jpg"
    },
    "user_info": {
      "user_id": "1609526545",
      "user_name": "栞栞Shiori",
      "user_avatar": "https://i1.hdslb.com/bfs/face/47e8dbabb895de44ec6cace085d4dc1d40307277.jpg"
    },
    "total_length": 0,
    "current_live_id": "1757216045412",
    "live_status": true,
    "is_recording": false,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757216045
}
```

#### Recording Ended

```json
{
  "id": "56fd03e5-3965-4c2e-a6a9-bb6932347eb3",
  "event": "record.ended",
  "payload": {
    "room_id": 26966466,
    "room_info": {
      "room_id": 26966466,
      "room_title": "早安獭獭栞!下播前抽fufu",
      "room_cover": "https://i0.hdslb.com/bfs/live/user_cover/b810c36855168034557e905e5916b1dba1761fa4.jpg"
    },
    "user_info": {
      "user_id": "1609526545",
      "user_name": "栞栞Shiori",
      "user_avatar": "https://i1.hdslb.com/bfs/face/47e8dbabb895de44ec6cace085d4dc1d40307277.jpg"
    },
    "total_length": 52.96700000000001,
    "current_live_id": "1757215994597",
    "live_status": true,
    "is_recording": true,
    "auto_start": true,
    "platform": "bilibili"
  },
  "timestamp": 1757216040
}
```

#### Recording Deleted

```json
{
  "id": "c32bc811-ab4b-49fd-84c7-897727905d16",
  "event": "archive.deleted",
  "payload": {
    "platform": "bilibili",
    "live_id": "1756607084705",
    "room_id": 1967212929,
    "title": "灶台O.o",
    "length": 9,
    "size": 1927112,
    "created_at": "2025-08-31T02:24:44.728616+00:00",
    "cover": "bilibili/1967212929/1756607084705/cover.jpg"
  },
  "timestamp": 1757176219
}
```
### Clip Events

#### Clip Generated

```json
{
  "id": "f542e0e1-688b-4f1a-8ce1-e5e51530cf5d",
  "event": "clip.generated",
  "payload": {
    "id": 316,
    "room_id": 27183290,
    "cover": "[27183290][1757172501727][一起看凡人修仙传][2025-09-07_00-16-11].jpg",
    "file": "[27183290][1757172501727][一起看凡人修仙传][2025-09-07_00-16-11].mp4",
    "note": "",
    "length": 121,
    "size": 53049119,
    "status": 0,
    "bvid": "",
    "title": "",
    "desc": "",
    "tags": "",
    "area": 0,
    "created_at": "2025-09-07T00:16:11.747461+08:00",
    "platform": "bilibili"
  },
  "timestamp": 1757175371
}
```

#### Clip Deleted

```json
{
  "id": "5c7ca728-753d-4a7d-a0b4-02c997ad2f92",
  "event": "clip.deleted",
  "payload": {
    "id": 313,
    "room_id": 27183290,
    "cover": "[27183290][1756903953470][不出非洲之心不下播][2025-09-03_21-10-54].jpg",
    "file": "[27183290][1756903953470][不出非洲之心不下播][2025-09-03_21-10-54].mp4",
    "note": "",
    "length": 32,
    "size": 18530098,
    "status": 0,
    "bvid": "",
    "title": "",
    "desc": "",
    "tags": "",
    "area": 0,
    "created_at": "2025-09-03T21:10:54.943682+08:00",
    "platform": "bilibili"
  },
  "timestamp": 1757147617
}
```
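Every delivery above shares the same envelope: an `id`, an `event` name, a `payload`, and a `timestamp`. A minimal sketch of a consumer that parses the body and dispatches on the event name (field names follow the examples above; the handler bodies are illustrative):

```python
import json

def handle_delivery(body: str) -> str:
    """Parse one webhook delivery and dispatch on its "event" field."""
    delivery = json.loads(body)
    event = delivery["event"]
    payload = delivery["payload"]
    if event == "record.started":
        return f"recording started in room {payload['room_id']}"
    if event == "record.ended":
        return f"recording ended, length {payload['total_length']:.1f}s"
    # Unknown events should be ignored, not rejected, so new event
    # types can be added without breaking existing consumers.
    return f"unhandled event: {event}"

body = json.dumps({
    "id": "5ec1ea10-2b31-48fd-8deb-f2d7d2ea5985",
    "event": "record.started",
    "payload": {"room_id": 26966466, "total_length": 0},
    "timestamp": 1757216045,
})
print(handle_delivery(body))  # recording started in room 26966466
```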
@@ -1,7 +1,30 @@

# Workflow


- Live room: a live room on any supported platform
- Recording: an archive of the live stream; each recording session automatically produces a recording entry
- Clip: a video segment cut from the live stream
- Upload: publishing a clip to a platform (currently only Bilibili supported)

## 1. Live Rooms and Recording

The diagram below shows how they relate:

After adding a live room, recording starts automatically when the room goes live, and each recording session automatically produces a recording entry. Click the history button in the bottom-right corner of the room card to view the recording entries.

```mermaid
flowchart TD
    A[Live room] -->|record| B[Recording 01]
    A -->|record| C[Recording 02]
    A -->|record| E[Recording N]

    B --> F[Stream preview window]

    F -->|range selection| G[Clip 01]
    F -->|range selection| H[Clip 02]
    F -->|range selection| I[Clip N]

    G --> J[Clip preview window]

    J -->|subtitle burn-in| K[New clip]

    K --> J

    J -->|upload| L[Bilibili]
```
@@ -1,7 +1,7 @@
{
  "name": "bili-shadowreplay",
  "private": true,
  "version": "2.10.4",
  "version": "2.12.1",
  "type": "module",
  "scripts": {
    "dev": "vite",
@@ -42,6 +42,7 @@
    "flowbite": "^2.5.1",
    "flowbite-svelte": "^0.46.16",
    "flowbite-svelte-icons": "^1.6.1",
    "mermaid": "^11.9.0",
    "postcss": "^8.4.21",
    "svelte": "^3.54.0",
    "svelte-check": "^3.0.0",
@@ -49,8 +50,9 @@
    "tailwindcss": "^3.3.0",
    "ts-node": "^10.9.1",
    "tslib": "^2.4.1",
    "typescript": "^4.6.4",
    "typescript": "^5.0.0",
    "vite": "^4.0.0",
    "vitepress": "^1.6.3"
    "vitepress": "^1.6.3",
    "vitepress-plugin-mermaid": "^2.0.17"
  }
}
546 src-tauri/Cargo.lock generated
File diff suppressed because it is too large
@@ -4,7 +4,7 @@ resolver = "2"

[package]
name = "bili-shadowreplay"
version = "2.10.4"
version = "2.12.1"
description = "BiliBili ShadowReplay"
authors = ["Xinrea"]
license = ""

@@ -44,7 +44,7 @@ async-trait = "0.1.87"
whisper-rs = "0.14.2"
hound = "3.5.1"
uuid = { version = "1.4", features = ["v4"] }
axum = { version = "0.7", features = ["macros"] }
axum = { version = "0.7", features = ["macros", "multipart"] }
tower-http = { version = "0.5", features = ["cors", "fs"] }
futures-core = "0.3"
futures = "0.3"

@@ -52,6 +52,7 @@ tokio-util = { version = "0.7", features = ["io"] }
clap = { version = "4.5.37", features = ["derive"] }
url = "2.5.4"
srtparse = "0.2.0"
thiserror = "1.0"

[features]
# this feature is used for production builds or when `devPath` points to the filesystem
@@ -10,6 +10,9 @@ whisper_model = "./whisper_model.bin"
whisper_prompt = "这是一段中文 你们好"
openai_api_key = ""
clip_name_format = "[{room_id}][{live_id}][{title}][{created_at}].mp4"
# Automatically clean up source files after FLV conversion.
# When enabled, the original FLV file is deleted after an imported FLV video
# has been converted to MP4, to save storage space.
cleanup_source_flv_after_import = false

[auto_generate]
enabled = false
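The `clip_name_format` template above is expanded by straightforward placeholder substitution; a minimal sketch of that expansion (the sample values are taken from the webhook examples and are illustrative):

```python
def expand_clip_name(fmt: str, values: dict) -> str:
    """Replace each {placeholder} token in the clip-name template."""
    for key, value in values.items():
        fmt = fmt.replace("{" + key + "}", str(value))
    return fmt

name = expand_clip_name(
    "[{room_id}][{live_id}][{title}][{created_at}].mp4",
    {
        "room_id": 27183290,
        "live_id": "1757172501727",
        "title": "一起看凡人修仙传",
        "created_at": "2025-09-07_00-16-11",
    },
)
print(name)  # [27183290][1757172501727][一起看凡人修仙传][2025-09-07_00-16-11].mp4
```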
@@ -7,38 +7,42 @@ edition = "2021"
name = "danmu_stream"
path = "src/lib.rs"

[[example]]
name = "bilibili"
path = "examples/bilibili.rs"

[[example]]
name = "douyin"
path = "examples/douyin.rs"

[dependencies]
tokio = { version = "1.0", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
tokio = { version = "1", features = ["full"] }
tokio-tungstenite = { version = "0.27", features = ["native-tls"] }
futures-util = "0.3"
prost = "0.12"
prost = "0.14"
chrono = "0.4"
log = "0.4"
env_logger = "0.10"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
reqwest = { version = "0.11", features = ["json"] }
env_logger = "0.11"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
url = "2.4"
md5 = "0.7"
md5 = "0.8"
regex = "1.9"
deno_core = "0.242.0"
pct-str = "2.0.0"
custom_error = "1.9.2"
deno_core = "0.355"
pct-str = "2.0"
thiserror = "2.0"
flate2 = "1.0"
scroll = "0.13.0"
scroll_derive = "0.13.0"
brotli = "8.0.1"
scroll = "0.13"
scroll_derive = "0.13"
brotli = "8.0"
http = "1.0"
rand = "0.9.1"
urlencoding = "2.1.3"
rand = "0.9"
urlencoding = "2.1"
gzip = "0.1.2"
hex = "0.4.3"
async-trait = "0.1.88"
uuid = "1.17.0"
async-trait = "0.1"
uuid = "1"

[build-dependencies]
tonic-build = "0.10"
tonic-build = "0.14"
@@ -1,10 +1,11 @@
use std::sync::Arc;

use tokio::sync::{mpsc, RwLock};

use crate::{
    provider::{new, DanmuProvider, ProviderType},
    DanmuMessageType, DanmuStreamError,
};
use tokio::sync::{mpsc, RwLock};

#[derive(Clone)]
pub struct DanmuStream {

@@ -1,19 +1,8 @@
use std::time::Duration;

use crate::DanmuStreamError;
use reqwest::header::HeaderMap;

impl From<reqwest::Error> for DanmuStreamError {
    fn from(value: reqwest::Error) -> Self {
        Self::HttpError { err: value }
    }
}

impl From<url::ParseError> for DanmuStreamError {
    fn from(value: url::ParseError) -> Self {
        Self::ParseError { err: value }
    }
}
use crate::DanmuStreamError;

pub struct ApiClient {
    client: reqwest::Client,
@@ -2,16 +2,24 @@ pub mod danmu_stream;
mod http_client;
pub mod provider;

use custom_error::custom_error;
use thiserror::Error;

custom_error! {pub DanmuStreamError
    HttpError {err: reqwest::Error} = "HttpError {err}",
    ParseError {err: url::ParseError} = "ParseError {err}",
    WebsocketError {err: String } = "WebsocketError {err}",
    PackError {err: String} = "PackError {err}",
    UnsupportProto {proto: u16} = "UnsupportProto {proto}",
    MessageParseError {err: String} = "MessageParseError {err}",
    InvalidIdentifier {err: String} = "InvalidIdentifier {err}"
#[derive(Error, Debug)]
pub enum DanmuStreamError {
    #[error("HttpError {0:?}")]
    HttpError(#[from] reqwest::Error),
    #[error("ParseError {0:?}")]
    ParseError(#[from] url::ParseError),
    #[error("WebsocketError {err}")]
    WebsocketError { err: String },
    #[error("PackError {err}")]
    PackError { err: String },
    #[error("UnsupportProto {proto}")]
    UnsupportProto { proto: u16 },
    #[error("MessageParseError {err}")]
    MessageParseError { err: String },
    #[error("InvalidIdentifier {err}")]
    InvalidIdentifier { err: String },
}

#[derive(Debug)]
@@ -65,7 +65,6 @@ impl DanmuProvider for BiliDanmu {
        tx: mpsc::UnboundedSender<DanmuMessageType>,
    ) -> Result<(), DanmuStreamError> {
        let mut retry_count = 0;
        const MAX_RETRIES: u32 = 5;
        const RETRY_DELAY: Duration = Duration::from_secs(5);
        info!(
            "Bilibili WebSocket connection started, room_id: {}",
@@ -74,33 +73,37 @@ impl DanmuProvider for BiliDanmu {

        loop {
            if *self.stop.read().await {
                info!(
                    "Bilibili WebSocket connection stopped, room_id: {}",
                    self.room_id
                );
                break;
            }

            match self.connect_and_handle(tx.clone()).await {
                Ok(_) => {
                    info!("Bilibili WebSocket connection closed normally");
                    info!(
                        "Bilibili WebSocket connection closed normally, room_id: {}",
                        self.room_id
                    );
                    break;
                }
                Err(e) => {
                    error!("Bilibili WebSocket connection error: {}", e);
                    retry_count += 1;

                    if retry_count >= MAX_RETRIES {
                        return Err(DanmuStreamError::WebsocketError {
                            err: format!("Failed to connect after {} retries", MAX_RETRIES),
                        });
                    }

                    info!(
                        "Retrying connection in {} seconds... (Attempt {}/{})",
                        RETRY_DELAY.as_secs(),
                        retry_count,
                        MAX_RETRIES
                    error!(
                        "Bilibili WebSocket connection error, room_id: {}, error: {}",
                        self.room_id, e
                    );
                    tokio::time::sleep(RETRY_DELAY).await;
                    retry_count += 1;
                }
            }

            info!(
                "Retrying connection in {} seconds... (Attempt {}), room_id: {}",
                RETRY_DELAY.as_secs(),
                retry_count,
                self.room_id
            );
            tokio::time::sleep(RETRY_DELAY).await;
        }

        Ok(())
@@ -1,6 +1,8 @@
use serde::Deserialize;

use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;

use crate::DanmuStreamError;

#[derive(Debug, Deserialize)]
#[allow(dead_code)]

@@ -1,4 +1,6 @@
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;

use crate::DanmuStreamError;

#[derive(Debug)]
#[allow(dead_code)]

@@ -24,7 +24,7 @@ struct PackHotCount {

type BilibiliPackCtx<'a> = (BilibiliPackHeader, &'a [u8]);

fn pack(buffer: &[u8]) -> Result<BilibiliPackCtx, DanmuStreamError> {
fn pack(buffer: &[u8]) -> Result<BilibiliPackCtx<'_>, DanmuStreamError> {
    let data = buffer
        .pread_with(0, scroll::BE)
        .map_err(|e: scroll::Error| DanmuStreamError::PackError { err: e.to_string() })?;

@@ -1,6 +1,8 @@
use serde::Deserialize;

use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;

use crate::DanmuStreamError;

#[derive(Debug, Deserialize)]
#[allow(dead_code)]

@@ -1,10 +1,9 @@
use serde::Deserialize;
use serde_json::Value;

use crate::{
    provider::{bilibili::dannmu_msg::BiliDanmuMessage, DanmuMessageType},
    DanmuMessage, DanmuStreamError,
};
use super::dannmu_msg::BiliDanmuMessage;

use crate::{provider::DanmuMessageType, DanmuMessage, DanmuStreamError};

#[derive(Debug, Deserialize, Clone)]
pub struct WsStreamCtx {

@@ -1,6 +1,8 @@
use serde::Deserialize;

use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
use super::stream::WsStreamCtx;

use crate::DanmuStreamError;

#[derive(Debug, Deserialize)]
#[allow(dead_code)]
@@ -1,4 +1,9 @@
use crate::{provider::DanmuProvider, DanmuMessage, DanmuMessageType, DanmuStreamError};
mod messages;

use std::io::Read;
use std::sync::Arc;
use std::time::{Duration, SystemTime};

use async_trait::async_trait;
use deno_core::v8;
use deno_core::JsRuntime;
@@ -7,11 +12,9 @@ use flate2::read::GzDecoder;
use futures_util::{SinkExt, StreamExt, TryStreamExt};
use log::debug;
use log::{error, info};
use messages::*;
use prost::bytes::Bytes;
use prost::Message;
use std::io::Read;
use std::sync::Arc;
use std::time::{Duration, SystemTime};
use tokio::net::TcpStream;
use tokio::sync::mpsc;
use tokio::sync::RwLock;
@@ -19,8 +22,7 @@ use tokio_tungstenite::{
    connect_async, tungstenite::Message as WsMessage, MaybeTlsStream, WebSocketStream,
};

mod messages;
use messages::*;
use crate::{provider::DanmuProvider, DanmuMessage, DanmuMessageType, DanmuStreamError};

const USER_AGENT: &str = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36";

@@ -109,7 +111,7 @@ impl DouyinDanmu {
        runtime
            .execute_script(
                "<crypto-js.min.js>",
                deno_core::FastString::Static(crypto_js),
                deno_core::FastString::from_static(crypto_js),
            )
            .map_err(|e| DanmuStreamError::WebsocketError {
                err: format!("Failed to execute crypto-js: {}", e),
@@ -118,7 +120,7 @@ impl DouyinDanmu {
        // Load and execute the sign.js file
        let js_code = include_str!("douyin/webmssdk.js");
        runtime
            .execute_script("<sign.js>", deno_core::FastString::Static(js_code))
            .execute_script("<sign.js>", deno_core::FastString::from_static(js_code))
            .map_err(|e| DanmuStreamError::WebsocketError {
                err: format!("Failed to execute JavaScript: {}", e),
            })?;
@@ -126,10 +128,7 @@ impl DouyinDanmu {
        // Call the get_wss_url function
        let sign_call = format!("get_wss_url(\"{}\")", self.room_id);
        let result = runtime
            .execute_script(
                "<sign_call>",
                deno_core::FastString::Owned(sign_call.into_boxed_str()),
            )
            .execute_script("<sign_call>", deno_core::FastString::from(sign_call))
            .map_err(|e| DanmuStreamError::WebsocketError {
                err: format!("Failed to execute JavaScript: {}", e),
            })?;
@@ -214,7 +213,7 @@ impl DouyinDanmu {
        if let Ok(Some(ack)) = handle_binary_message(&data, &tx, room_id).await {
            if let Some(write) = write.write().await.as_mut() {
                if let Err(e) =
                    write.send(WsMessage::Binary(ack.encode_to_vec())).await
                    write.send(WsMessage::binary(ack.encode_to_vec())).await
                {
                    error!("Failed to send ack: {}", e);
                }
@@ -257,7 +256,7 @@ impl DouyinDanmu {

async fn send_heartbeat(tx: &mpsc::Sender<WsMessage>) -> Result<(), DanmuStreamError> {
    // heartbeat message: 3A 02 68 62
    tx.send(WsMessage::Binary(vec![0x3A, 0x02, 0x68, 0x62]))
    tx.send(WsMessage::binary(vec![0x3A, 0x02, 0x68, 0x62]))
        .await
        .map_err(|e| DanmuStreamError::WebsocketError {
            err: format!("Failed to send heartbeat message: {}", e),
@@ -1,6 +1,7 @@
use prost::Message;
use std::collections::HashMap;

use prost::Message;

// message Response {
//     repeated Message messagesList = 1;
//     string cursor = 2;

@@ -4,10 +4,10 @@ mod douyin;
use async_trait::async_trait;
use tokio::sync::mpsc;

use crate::{
    provider::bilibili::BiliDanmu, provider::douyin::DouyinDanmu, DanmuMessageType,
    DanmuStreamError,
};
use self::bilibili::BiliDanmu;
use self::douyin::DouyinDanmu;

use crate::{DanmuMessageType, DanmuStreamError};

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum ProviderType {
File diff suppressed because it is too large
@@ -1,56 +0,0 @@
use std::path::PathBuf;
use std::sync::Arc;

use chrono::Utc;

use crate::database::Database;
use crate::recorder::PlatformType;

pub async fn try_rebuild_archives(
    db: &Arc<Database>,
    cache_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
    let rooms = db.get_recorders().await?;
    for room in rooms {
        let room_id = room.room_id;
        let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
        let mut files = tokio::fs::read_dir(room_cache_path).await?;
        while let Some(file) = files.next_entry().await? {
            if file.file_type().await?.is_dir() {
                // use folder name as live_id
                let live_id = file.file_name();
                let live_id = live_id.to_str().unwrap();
                // check if live_id is in db
                let record = db.get_record(room_id, live_id).await;
                if record.is_ok() {
                    continue;
                }

                // get created_at from folder metadata
                let metadata = file.metadata().await?;
                let created_at = metadata.created();
                if created_at.is_err() {
                    continue;
                }
                let created_at = created_at.unwrap();
                let created_at = chrono::DateTime::<Utc>::from(created_at)
                    .format("%Y-%m-%dT%H:%M:%S.%fZ")
                    .to_string();
                // create a record for this live_id
                let record = db
                    .add_record(
                        PlatformType::from_str(room.platform.as_str()).unwrap(),
                        live_id,
                        room_id,
                        &format!("UnknownLive {}", live_id),
                        None,
                        Some(&created_at),
                    )
                    .await?;

                log::info!("rebuild archive {:?}", record);
            }
        }
    }
    Ok(())
}
@@ -1,6 +1,6 @@
use std::path::{Path, PathBuf};

use chrono::Utc;
use chrono::Local;
use serde::{Deserialize, Serialize};

use crate::{recorder::PlatformType, recorder_manager::ClipRangeParams};
@@ -37,6 +37,10 @@ pub struct Config {
    pub whisper_language: String,
    #[serde(default = "default_user_agent")]
    pub user_agent: String,
    #[serde(default = "default_cleanup_source_flv")]
    pub cleanup_source_flv_after_import: bool,
    #[serde(default = "default_webhook_url")]
    pub webhook_url: String,
}

#[derive(Deserialize, Serialize, Clone)]
@@ -92,6 +96,14 @@ fn default_user_agent() -> String {
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36".to_string()
}

fn default_cleanup_source_flv() -> bool {
    false
}

fn default_webhook_url() -> String {
    "".to_string()
}

impl Config {
    pub fn load(
        config_path: &PathBuf,
@@ -130,6 +142,8 @@ impl Config {
            config_path: config_path.to_str().unwrap().into(),
            whisper_language: default_whisper_language(),
            user_agent: default_user_agent(),
            cleanup_source_flv_after_import: default_cleanup_source_flv(),
            webhook_url: default_webhook_url(),
        };

        config.save();
@@ -168,6 +182,12 @@ impl Config {
        self.save();
    }

    #[allow(dead_code)]
    pub fn set_cleanup_source_flv(&mut self, cleanup: bool) {
        self.cleanup_source_flv_after_import = cleanup;
        self.save();
    }

    pub fn generate_clip_name(&self, params: &ClipRangeParams) -> PathBuf {
        let platform = PlatformType::from_str(&params.platform).unwrap();

@@ -183,13 +203,31 @@ impl Config {
        let format_config = format_config.replace("{platform}", platform.as_str());
        let format_config = format_config.replace("{room_id}", &params.room_id.to_string());
        let format_config = format_config.replace("{live_id}", &params.live_id);
        let format_config = format_config.replace("{x}", &params.x.to_string());
        let format_config = format_config.replace("{y}", &params.y.to_string());
        let format_config = format_config.replace(
            "{x}",
            &params
                .range
                .as_ref()
                .map_or("0".to_string(), |r| r.start.to_string()),
        );
        let format_config = format_config.replace(
            "{y}",
            &params
                .range
                .as_ref()
                .map_or("0".to_string(), |r| r.end.to_string()),
        );
        let format_config = format_config.replace(
            "{created_at}",
            &Utc::now().format("%Y-%m-%d_%H-%M-%S").to_string(),
            &Local::now().format("%Y-%m-%d_%H-%M-%S").to_string(),
        );
        let format_config = format_config.replace(
            "{length}",
            &params
                .range
                .as_ref()
                .map_or("0".to_string(), |r| r.duration().to_string()),
        );
        let format_config = format_config.replace("{length}", &(params.y - params.x).to_string());

        let output = self.output.clone();
4 src-tauri/src/constants.rs Normal file
@@ -0,0 +1,4 @@
pub const PREFIX_SUBTITLE: &str = "[subtitle]";
pub const PREFIX_IMPORTED: &str = "[imported]";
pub const PREFIX_DANMAKU: &str = "[danmaku]";
pub const PREFIX_CLIP: &str = "[clip]";
@@ -24,8 +24,8 @@ struct DanmakuPosition {
    time: f64,
}

const PLAY_RES_X: f64 = 1920.0;
const PLAY_RES_Y: f64 = 1080.0;
const PLAY_RES_X: f64 = 1280.0;
const PLAY_RES_Y: f64 = 720.0;
const BOTTOM_RESERVED: f64 = 50.0;
const R2L_TIME: f64 = 8.0;
const MAX_DELAY: f64 = 6.0;
@@ -36,20 +36,20 @@ pub fn danmu_to_ass(danmus: Vec<DanmuEntry>) -> String {
Title: Bilibili Danmaku
ScriptType: v4.00+
Collisions: Normal
PlayResX: 1920
PlayResY: 1080
PlayResX: 1280
PlayResY: 720
Timer: 10.0000

[V4+ Styles]
Format: Name, Fontname, Fontsize, PrimaryColour, SecondaryColour, OutlineColour, BackColour, Bold, Italic, Underline, StrikeOut, ScaleX, ScaleY, Spacing, Angle, BorderStyle, Outline, Shadow, Alignment, MarginL, MarginR, MarginV, Encoding
Style: Default,Microsoft YaHei,48,&H00FFFFFF,&H000000FF,&H00000000,&H00000000,0,0,0,0,100,100,0,0,1,2,0,2,20,20,2,0
Style: Default,微软雅黑,36,&H7fFFFFFF,&H7fFFFFFF,&H7f000000,&H7f000000,0,0,0,0,100,100,0,0,1,1,0,2,20,20,2,0

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
"#;

    let mut normal = normal_danmaku();
    let font_size = 48.0; // Default font size
    let font_size = 36.0; // Default font size

    // Convert danmus to ASS events
    let events = danmus
@@ -76,7 +76,7 @@ Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
            "Dialogue: 0,{},{},Default,,0,0,0,,{{\\move({},{},{},{})}}{}",
            start_time,
            end_time,
            PLAY_RES_X,
            PLAY_RES_X + text_width / 2.0,
            pos.top + font_size, // Start position
            -text_width,
            pos.top + font_size, // End position
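The diff above shrinks the ASS canvas from 1920x1080 to 1280x720 and the default font from 48 to 36. Each right-to-left danmaku becomes a `Dialogue` line whose `\move` tag scrolls the text from the right edge to past the left edge over `R2L_TIME` seconds. A minimal sketch of building such a line (the width estimate and start-x are simplified relative to the Rust code):

```python
PLAY_RES_X = 1280.0  # canvas width, matching the new PlayResX
R2L_TIME = 8.0       # seconds a danmaku takes to cross the screen
FONT_SIZE = 36.0

def ass_time(seconds: float) -> str:
    """Format seconds as H:MM:SS.cc, the timestamp style used by ASS events."""
    h = int(seconds // 3600)
    m = int(seconds % 3600 // 60)
    s = seconds % 60
    return f"{h}:{m:02d}:{s:05.2f}"

def dialogue(text: str, start: float, top: float) -> str:
    text_width = len(text) * FONT_SIZE  # crude per-glyph width estimate
    return (
        f"Dialogue: 0,{ass_time(start)},{ass_time(start + R2L_TIME)},Default,,0,0,0,,"
        f"{{\\move({PLAY_RES_X},{top + FONT_SIZE},{-text_width},{top + FONT_SIZE})}}{text}"
    )

print(dialogue("hello", 1.5, 0.0))
```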
@@ -6,7 +6,7 @@ use chrono::Utc;
use rand::seq::SliceRandom;
use rand::Rng;

#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
pub struct AccountRow {
    pub platform: String,
    pub uid: u64, // Keep for Bilibili compatibility

@@ -4,7 +4,7 @@ use super::Database;
use super::DatabaseError;
use chrono::Utc;

#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
pub struct RecordRow {
    pub platform: String,
    pub live_id: String,
@@ -18,14 +18,21 @@ pub struct RecordRow {

// CREATE TABLE records (live_id INTEGER PRIMARY KEY, room_id INTEGER, title TEXT, length INTEGER, size INTEGER, created_at TEXT);
impl Database {
    pub async fn get_records(&self, room_id: u64) -> Result<Vec<RecordRow>, DatabaseError> {
    pub async fn get_records(
        &self,
        room_id: u64,
        offset: u64,
        limit: u64,
    ) -> Result<Vec<RecordRow>, DatabaseError> {
        let lock = self.db.read().await.clone().unwrap();
        Ok(
            sqlx::query_as::<_, RecordRow>("SELECT * FROM records WHERE room_id = $1")
                .bind(room_id as i64)
                .fetch_all(&lock)
                .await?,
        Ok(sqlx::query_as::<_, RecordRow>(
            "SELECT * FROM records WHERE room_id = $1 ORDER BY created_at DESC LIMIT $2 OFFSET $3",
        )
        .bind(room_id as i64)
        .bind(limit as i64)
        .bind(offset as i64)
        .fetch_all(&lock)
        .await?)
    }

    pub async fn get_record(
@@ -35,10 +42,10 @@ impl Database {
    ) -> Result<RecordRow, DatabaseError> {
        let lock = self.db.read().await.clone().unwrap();
        Ok(sqlx::query_as::<_, RecordRow>(
            "SELECT * FROM records WHERE live_id = $1 and room_id = $2",
            "SELECT * FROM records WHERE room_id = $1 and live_id = $2",
        )
        .bind(live_id)
        .bind(room_id as i64)
        .bind(live_id)
        .fetch_one(&lock)
        .await?)
    }
@@ -73,13 +80,17 @@ impl Database {
        Ok(record)
    }

    pub async fn remove_record(&self, live_id: &str) -> Result<(), DatabaseError> {
    pub async fn remove_record(&self, live_id: &str) -> Result<RecordRow, DatabaseError> {
        let lock = self.db.read().await.clone().unwrap();
        let to_delete = sqlx::query_as::<_, RecordRow>("SELECT * FROM records WHERE live_id = $1")
            .bind(live_id)
            .fetch_one(&lock)
            .await?;
        sqlx::query("DELETE FROM records WHERE live_id = $1")
            .bind(live_id)
            .execute(&lock)
            .await?;
        Ok(())
        Ok(to_delete)
    }

    pub async fn update_record(
@@ -98,6 +109,20 @@ impl Database {
        Ok(())
    }

    pub async fn update_record_cover(
        &self,
        live_id: &str,
        cover: Option<String>,
    ) -> Result<(), DatabaseError> {
        let lock = self.db.read().await.clone().unwrap();
        sqlx::query("UPDATE records SET cover = $1 WHERE live_id = $2")
            .bind(cover)
            .bind(live_id)
            .execute(&lock)
            .await?;
        Ok(())
    }

    pub async fn get_total_length(&self) -> Result<i64, DatabaseError> {
        let lock = self.db.read().await.clone().unwrap();
        let result: (i64,) = sqlx::query_as("SELECT SUM(length) FROM records;")
@@ -147,4 +172,12 @@ impl Database {
            .await?)
    }
}

    pub async fn get_record_disk_usage(&self) -> Result<u64, DatabaseError> {
        let lock = self.db.read().await.clone().unwrap();
        let result: (i64,) = sqlx::query_as("SELECT SUM(size) FROM records;")
            .fetch_one(&lock)
            .await?;
        Ok(result.0 as u64)
    }
}
@@ -4,12 +4,13 @@ use crate::recorder::PlatformType;
|
||||
use chrono::Utc;
|
||||
/// Recorder in database is pretty simple
|
||||
/// because many room infos are collected in realtime
|
||||
#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
|
||||
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
|
||||
pub struct RecorderRow {
|
||||
pub room_id: u64,
|
||||
pub created_at: String,
|
||||
pub platform: String,
|
||||
pub auto_start: bool,
|
||||
pub extra: String,
|
||||
}
|
||||
|
||||
// recorders
|
||||
@@ -18,6 +19,7 @@ impl Database {
|
||||
&self,
|
||||
platform: PlatformType,
|
||||
room_id: u64,
|
||||
extra: &str,
|
||||
) -> Result<RecorderRow, DatabaseError> {
|
||||
let lock = self.db.read().await.clone().unwrap();
|
||||
let recorder = RecorderRow {
|
||||
@@ -25,21 +27,28 @@ impl Database {
|
||||
created_at: Utc::now().to_rfc3339(),
|
||||
platform: platform.as_str().to_string(),
|
||||
auto_start: true,
|
||||
extra: extra.to_string(),
|
||||
};
|
||||
let _ = sqlx::query(
|
||||
"INSERT INTO recorders (room_id, created_at, platform, auto_start) VALUES ($1, $2, $3, $4)",
|
||||
"INSERT OR REPLACE INTO recorders (room_id, created_at, platform, auto_start, extra) VALUES ($1, $2, $3, $4, $5)",
|
||||
)
|
||||
.bind(room_id as i64)
|
||||
.bind(&recorder.created_at)
|
||||
.bind(platform.as_str())
|
||||
.bind(recorder.auto_start)
|
||||
.bind(extra)
|
||||
.execute(&lock)
|
||||
.await?;
|
||||
Ok(recorder)
|
||||
}
|
||||
|
||||
pub async fn remove_recorder(&self, room_id: u64) -> Result<(), DatabaseError> {
|
||||
pub async fn remove_recorder(&self, room_id: u64) -> Result<RecorderRow, DatabaseError> {
|
||||
let lock = self.db.read().await.clone().unwrap();
|
||||
let recorder =
|
||||
sqlx::query_as::<_, RecorderRow>("SELECT * FROM recorders WHERE room_id = $1")
|
||||
.bind(room_id as i64)
|
||||
.fetch_one(&lock)
|
||||
.await?;
|
||||
let sql = sqlx::query("DELETE FROM recorders WHERE room_id = $1")
|
||||
.bind(room_id as i64)
|
||||
.execute(&lock)
|
||||
@@ -50,13 +59,13 @@ impl Database {
|
||||
|
||||
// remove related archive
|
||||
let _ = self.remove_archive(room_id).await;
|
||||
Ok(())
|
||||
Ok(recorder)
|
||||
}
|
||||
|
||||
pub async fn get_recorders(&self) -> Result<Vec<RecorderRow>, DatabaseError> {
|
||||
let lock = self.db.read().await.clone().unwrap();
|
||||
Ok(sqlx::query_as::<_, RecorderRow>(
|
||||
"SELECT room_id, created_at, platform, auto_start FROM recorders",
|
||||
"SELECT room_id, created_at, platform, auto_start, extra FROM recorders",
|
||||
)
|
||||
.fetch_all(&lock)
|
||||
.await?)
|
||||
|
||||
```diff
@@ -2,29 +2,13 @@ use super::Database;
 use super::DatabaseError;
 
 // CREATE TABLE videos (id INTEGER PRIMARY KEY, room_id INTEGER, cover TEXT, file TEXT, length INTEGER, size INTEGER, status INTEGER, bvid TEXT, title TEXT, desc TEXT, tags TEXT, area INTEGER, created_at TEXT);
-#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
+#[derive(Debug, Clone, serde::Serialize, serde::Deserialize, sqlx::FromRow)]
 pub struct VideoRow {
     pub id: i64,
     pub room_id: u64,
     pub cover: String,
     pub file: String,
+    pub note: String,
     pub length: i64,
     pub size: i64,
     pub status: i64,
     pub bvid: String,
     pub title: String,
     pub desc: String,
     pub tags: String,
     pub area: i64,
     pub created_at: String,
     pub platform: String,
 }
 
-#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
-pub struct VideoNoCover {
-    pub id: i64,
-    pub room_id: u64,
-    pub file: String,
-    pub length: i64,
-    pub size: i64,
-    pub status: i64,
@@ -38,9 +22,9 @@ pub struct VideoNoCover {
-}
 
 impl Database {
-    pub async fn get_videos(&self, room_id: u64) -> Result<Vec<VideoNoCover>, DatabaseError> {
+    pub async fn get_videos(&self, room_id: u64) -> Result<Vec<VideoRow>, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
-        let videos = sqlx::query_as::<_, VideoNoCover>("SELECT * FROM videos WHERE room_id = $1;")
+        let videos = sqlx::query_as::<_, VideoRow>("SELECT * FROM videos WHERE room_id = $1;")
             .bind(room_id as i64)
             .fetch_all(&lock)
             .await?;
@@ -59,13 +43,14 @@ impl Database {
 
     pub async fn update_video(&self, video_row: &VideoRow) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
-        sqlx::query("UPDATE videos SET status = $1, bvid = $2, title = $3, desc = $4, tags = $5, area = $6 WHERE id = $7")
+        sqlx::query("UPDATE videos SET status = $1, bvid = $2, title = $3, desc = $4, tags = $5, area = $6, note = $7 WHERE id = $8")
             .bind(video_row.status)
             .bind(&video_row.bvid)
             .bind(&video_row.title)
             .bind(&video_row.desc)
             .bind(&video_row.tags)
             .bind(video_row.area)
+            .bind(&video_row.note)
             .bind(video_row.id)
             .execute(&lock)
             .await?;
@@ -83,10 +68,11 @@ impl Database {
 
     pub async fn add_video(&self, video: &VideoRow) -> Result<VideoRow, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
-        let sql = sqlx::query("INSERT INTO videos (room_id, cover, file, length, size, status, bvid, title, desc, tags, area, created_at, platform) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)")
+        let sql = sqlx::query("INSERT INTO videos (room_id, cover, file, note, length, size, status, bvid, title, desc, tags, area, created_at, platform) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14)")
             .bind(video.room_id as i64)
             .bind(&video.cover)
             .bind(&video.file)
+            .bind(&video.note)
             .bind(video.length)
             .bind(video.size)
             .bind(video.status)
@@ -106,7 +92,7 @@ impl Database {
         Ok(video)
     }
 
-    pub async fn update_video_cover(&self, id: i64, cover: String) -> Result<(), DatabaseError> {
+    pub async fn update_video_cover(&self, id: i64, cover: &str) -> Result<(), DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         sqlx::query("UPDATE videos SET cover = $1 WHERE id = $2")
             .bind(cover)
@@ -116,10 +102,10 @@ impl Database {
         Ok(())
     }
 
-    pub async fn get_all_videos(&self) -> Result<Vec<VideoNoCover>, DatabaseError> {
+    pub async fn get_all_videos(&self) -> Result<Vec<VideoRow>, DatabaseError> {
         let lock = self.db.read().await.clone().unwrap();
         let videos =
-            sqlx::query_as::<_, VideoNoCover>("SELECT * FROM videos ORDER BY created_at DESC;")
+            sqlx::query_as::<_, VideoRow>("SELECT * FROM videos ORDER BY created_at DESC;")
                 .fetch_all(&lock)
                 .await?;
         Ok(videos)
```
```diff
@@ -1,6 +1,8 @@
+use std::fmt;
 use std::path::{Path, PathBuf};
 use std::process::Stdio;
 
+use crate::constants;
 use crate::progress_reporter::{ProgressReporter, ProgressReporterTrait};
 use crate::subtitle_generator::whisper_online;
 use crate::subtitle_generator::{
@@ -8,17 +10,49 @@ use crate::subtitle_generator::{
 };
 use async_ffmpeg_sidecar::event::{FfmpegEvent, LogLevel};
 use async_ffmpeg_sidecar::log_parser::FfmpegLogParser;
+use serde::{Deserialize, Serialize};
 use tokio::io::{AsyncBufReadExt, BufReader};
 
+// Video metadata structure
+#[derive(Debug)]
+pub struct VideoMetadata {
+    pub duration: f64,
+    #[allow(unused)]
+    pub width: u32,
+    #[allow(unused)]
+    pub height: u32,
+}
+
 #[cfg(target_os = "windows")]
 const CREATE_NO_WINDOW: u32 = 0x08000000;
 #[cfg(target_os = "windows")]
 #[allow(unused_imports)]
 use std::os::windows::process::CommandExt;
 
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct Range {
+    pub start: f64,
+    pub end: f64,
+}
+
+impl fmt::Display for Range {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        write!(f, "[{}, {}]", self.start, self.end)
+    }
+}
+
+impl Range {
+    pub fn duration(&self) -> f64 {
+        self.end - self.start
+    }
+}
+
 pub async fn clip_from_m3u8(
     reporter: Option<&impl ProgressReporterTrait>,
     m3u8_index: &Path,
     output_path: &Path,
+    range: Option<&Range>,
+    fix_encoding: bool,
 ) -> Result<(), String> {
     // first check output folder exists
     let output_folder = output_path.parent().unwrap();
@@ -34,9 +68,24 @@ pub async fn clip_from_m3u8(
     #[cfg(target_os = "windows")]
     ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
 
-    let child = ffmpeg_process
-        .args(["-i", &format!("{}", m3u8_index.display())])
-        .args(["-c", "copy"])
+    let child_command = ffmpeg_process.args(["-i", &format!("{}", m3u8_index.display())]);
+
+    if let Some(range) = range {
+        child_command
+            .args(["-ss", &range.start.to_string()])
+            .args(["-t", &range.duration().to_string()]);
+    }
+
+    if fix_encoding {
+        child_command
+            .args(["-c:v", "libx264"])
+            .args(["-c:a", "copy"])
+            .args(["-b:v", "6000k"]);
+    } else {
+        child_command.args(["-c", "copy"]);
+    }
+
+    let child = child_command
         .args(["-y", output_path.to_str().unwrap()])
         .args(["-progress", "pipe:2"])
         .stderr(Stdio::piped())
@@ -58,6 +107,7 @@ pub async fn clip_from_m3u8(
                 if reporter.is_none() {
                     continue;
                 }
+                log::debug!("Clip progress: {}", p.time);
                 reporter
                     .unwrap()
                     .update(format!("编码中:{}", p.time).as_str())
@@ -316,6 +366,7 @@ pub async fn get_segment_duration(file: &Path) -> Result<f64, String> {
     duration.ok_or_else(|| "Failed to parse segment duration".to_string())
 }
 
+/// Encode video subtitle using ffmpeg, output is file name with prefix [subtitle]
 pub async fn encode_video_subtitle(
     reporter: &impl ProgressReporterTrait,
     file: &Path,
@@ -324,15 +375,21 @@ pub async fn encode_video_subtitle(
 ) -> Result<String, String> {
     // ffmpeg -i fixed_\[30655190\]1742887114_0325084106_81.5.mp4 -vf "subtitles=test.srt:force_style='FontSize=24'" -c:v libx264 -c:a copy output.mp4
     log::info!("Encode video subtitle task start: {}", file.display());
-    log::info!("srt_style: {}", srt_style);
+    log::info!("SRT style: {}", srt_style);
     // output path is file with prefix [subtitle]
-    let output_filename = format!("[subtitle]{}", file.file_name().unwrap().to_str().unwrap());
+    let output_filename = format!(
+        "{}{}",
+        constants::PREFIX_SUBTITLE,
+        file.file_name().unwrap().to_str().unwrap()
+    );
     let output_path = file.with_file_name(&output_filename);
 
-    // check output path exists
+    // check output path exists - log but allow overwrite
     if output_path.exists() {
-        log::info!("Output path already exists: {}", output_path.display());
-        return Err("Output path already exists".to_string());
+        log::info!(
+            "Output path already exists, will overwrite: {}",
+            output_path.display()
+        );
     }
 
     let mut command_error = None;
@@ -361,6 +418,7 @@ pub async fn encode_video_subtitle(
         .args(["-vf", vf.as_str()])
         .args(["-c:v", "libx264"])
         .args(["-c:a", "copy"])
+        .args(["-b:v", "6000k"])
        .args([output_path.to_str().unwrap()])
        .args(["-y"])
        .args(["-progress", "pipe:2"])
@@ -413,13 +471,19 @@ pub async fn encode_video_danmu(
 ) -> Result<PathBuf, String> {
     // ffmpeg -i fixed_\[30655190\]1742887114_0325084106_81.5.mp4 -vf ass=subtitle.ass -c:v libx264 -c:a copy output.mp4
     log::info!("Encode video danmu task start: {}", file.display());
-    let danmu_filename = format!("[danmu]{}", file.file_name().unwrap().to_str().unwrap());
-    let output_path = file.with_file_name(danmu_filename);
+    let danmu_filename = format!(
+        "{}{}",
+        constants::PREFIX_DANMAKU,
+        file.file_name().unwrap().to_str().unwrap()
+    );
+    let output_file_path = file.with_file_name(danmu_filename);
 
-    // check output path exists
-    if output_path.exists() {
-        log::info!("Output path already exists: {}", output_path.display());
-        return Err("Output path already exists".to_string());
+    // check output path exists - log but allow overwrite
+    if output_file_path.exists() {
+        log::info!(
+            "Output path already exists, will overwrite: {}",
+            output_file_path.display()
+        );
     }
 
     let mut command_error = None;
@@ -446,7 +510,8 @@ pub async fn encode_video_danmu(
         .args(["-vf", &format!("ass={}", subtitle)])
         .args(["-c:v", "libx264"])
         .args(["-c:a", "copy"])
-        .args([output_path.to_str().unwrap()])
+        .args(["-b:v", "6000k"])
+        .args([output_file_path.to_str().unwrap()])
         .args(["-y"])
         .args(["-progress", "pipe:2"])
         .stderr(Stdio::piped())
@@ -468,7 +533,7 @@ pub async fn encode_video_danmu(
                 command_error = Some(e.to_string());
             }
             FfmpegEvent::Progress(p) => {
-                log::info!("Encode video danmu progress: {}", p.time);
+                log::debug!("Encode video danmu progress: {}", p.time);
                 if reporter.is_none() {
                     continue;
                 }
@@ -491,8 +556,11 @@ pub async fn encode_video_danmu(
         log::error!("Encode video danmu error: {}", error);
         Err(error)
     } else {
-        log::info!("Encode video danmu task end: {}", output_path.display());
-        Ok(output_path)
+        log::info!(
+            "Encode video danmu task end: {}",
+            output_file_path.display()
+        );
+        Ok(output_file_path)
     }
 }
```
```diff
@@ -759,15 +827,566 @@ fn ffprobe_path() -> PathBuf {
     path
 }
 
+// Clip a segment from a video file
+pub async fn clip_from_video_file(
+    reporter: Option<&impl ProgressReporterTrait>,
+    input_path: &Path,
+    output_path: &Path,
+    start_time: f64,
+    duration: f64,
+) -> Result<(), String> {
+    let output_folder = output_path.parent().unwrap();
+    if !output_folder.exists() {
+        std::fs::create_dir_all(output_folder).unwrap();
+    }
+
+    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
+    #[cfg(target_os = "windows")]
+    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
+
+    let child = ffmpeg_process
+        .args(["-i", &format!("{}", input_path.display())])
+        .args(["-ss", &start_time.to_string()])
+        .args(["-t", &duration.to_string()])
+        .args(["-c:v", "libx264"])
+        .args(["-c:a", "aac"])
+        .args(["-b:v", "6000k"])
+        .args(["-avoid_negative_ts", "make_zero"])
+        .args(["-y", output_path.to_str().unwrap()])
+        .args(["-progress", "pipe:2"])
+        .stderr(Stdio::piped())
+        .spawn();
+
+    if let Err(e) = child {
+        return Err(format!("启动ffmpeg进程失败: {}", e));
+    }
+
+    let mut child = child.unwrap();
+    let stderr = child.stderr.take().unwrap();
+    let reader = BufReader::new(stderr);
+    let mut parser = FfmpegLogParser::new(reader);
+
+    let mut clip_error = None;
+    while let Ok(event) = parser.parse_next_event().await {
+        match event {
+            FfmpegEvent::Progress(p) => {
+                if let Some(reporter) = reporter {
+                    reporter.update(&format!("切片进度: {}", p.time));
+                }
+            }
+            FfmpegEvent::LogEOF => break,
+            FfmpegEvent::Log(level, content) => {
+                if content.contains("error") || level == LogLevel::Error {
+                    log::error!("切片错误: {}", content);
+                }
+            }
+            FfmpegEvent::Error(e) => {
+                log::error!("切片错误: {}", e);
+                clip_error = Some(e.to_string());
+            }
+            _ => {}
+        }
+    }
+
+    if let Err(e) = child.wait().await {
+        return Err(e.to_string());
+    }
+
+    if let Some(error) = clip_error {
+        Err(error)
+    } else {
+        log::info!("切片任务完成: {}", output_path.display());
+        Ok(())
+    }
+}
+
+/// Extract basic information from a video file.
+///
+/// # Arguments
+/// * `file_path` - The path to the video file.
+///
+/// # Returns
+/// A `Result` containing the video metadata or an error message.
+pub async fn extract_video_metadata(file_path: &Path) -> Result<VideoMetadata, String> {
+    let mut ffprobe_process = tokio::process::Command::new("ffprobe");
+    #[cfg(target_os = "windows")]
+    ffprobe_process.creation_flags(CREATE_NO_WINDOW);
+
+    let output = ffprobe_process
+        .args([
+            "-v",
+            "quiet",
+            "-print_format",
+            "json",
+            "-show_format",
+            "-show_streams",
+            "-select_streams",
+            "v:0",
+            &format!("{}", file_path.display()),
+        ])
+        .output()
+        .await
+        .map_err(|e| format!("执行ffprobe失败: {}", e))?;
+
+    if !output.status.success() {
+        return Err(format!(
+            "ffprobe执行失败: {}",
+            String::from_utf8_lossy(&output.stderr)
+        ));
+    }
+
+    let json_str = String::from_utf8_lossy(&output.stdout);
+    let json: serde_json::Value =
+        serde_json::from_str(&json_str).map_err(|e| format!("解析ffprobe输出失败: {}", e))?;
+
+    // Parse the video stream info
+    let streams = json["streams"].as_array().ok_or("未找到视频流信息")?;
+
+    if streams.is_empty() {
+        return Err("未找到视频流".to_string());
+    }
+
+    let video_stream = &streams[0];
+    let format = &json["format"];
+
+    let duration = format["duration"]
+        .as_str()
+        .and_then(|d| d.parse::<f64>().ok())
+        .unwrap_or(0.0);
+
+    let width = video_stream["width"].as_u64().unwrap_or(0) as u32;
+    let height = video_stream["height"].as_u64().unwrap_or(0) as u32;
+
+    Ok(VideoMetadata {
+        duration,
+        width,
+        height,
+    })
+}
+
+/// Generate thumbnail file from video, capturing a frame at the specified timestamp.
+///
+/// # Arguments
+/// * `video_full_path` - The full path to the video file.
+/// * `timestamp` - The timestamp (in seconds) to capture the thumbnail.
+///
+/// # Returns
+/// The path to the generated thumbnail image.
+pub async fn generate_thumbnail(video_full_path: &Path, timestamp: f64) -> Result<PathBuf, String> {
+    let mut ffmpeg_process = tokio::process::Command::new(ffmpeg_path());
+    #[cfg(target_os = "windows")]
+    ffmpeg_process.creation_flags(CREATE_NO_WINDOW);
+
+    let thumbnail_full_path = video_full_path.with_extension("jpg");
+
+    let output = ffmpeg_process
+        .args(["-i", &format!("{}", video_full_path.display())])
+        .args(["-ss", &timestamp.to_string()])
+        .args(["-vframes", "1"])
+        .args(["-y", thumbnail_full_path.to_str().unwrap()])
+        .output()
+        .await
+        .map_err(|e| format!("生成缩略图失败: {}", e))?;
+
+    if !output.status.success() {
+        return Err(format!(
+            "ffmpeg生成缩略图失败: {}",
+            String::from_utf8_lossy(&output.stderr)
+        ));
+    }
+
+    // Log info about the generated thumbnail
+    if let Ok(metadata) = std::fs::metadata(&thumbnail_full_path) {
+        log::info!(
+            "生成缩略图完成: {} (文件大小: {} bytes)",
+            thumbnail_full_path.display(),
+            metadata.len()
+        );
+    } else {
+        log::info!("生成缩略图完成: {}", thumbnail_full_path.display());
+    }
+    Ok(thumbnail_full_path)
+}
+
+// Generic helper that runs an FFmpeg conversion
+pub async fn execute_ffmpeg_conversion(
+    mut cmd: tokio::process::Command,
+    reporter: &ProgressReporter,
+    mode_name: &str,
+) -> Result<(), String> {
+    use async_ffmpeg_sidecar::event::FfmpegEvent;
+    use async_ffmpeg_sidecar::log_parser::FfmpegLogParser;
+    use std::process::Stdio;
+    use tokio::io::BufReader;
+
+    let mut child = cmd
+        .stderr(Stdio::piped())
+        .spawn()
+        .map_err(|e| format!("启动FFmpeg进程失败: {}", e))?;
+
+    let stderr = child.stderr.take().unwrap();
+    let reader = BufReader::new(stderr);
+    let mut parser = FfmpegLogParser::new(reader);
+
+    let mut conversion_error = None;
+    while let Ok(event) = parser.parse_next_event().await {
+        match event {
+            FfmpegEvent::Progress(p) => {
+                reporter.update(&format!("正在转换视频格式... {} ({})", p.time, mode_name));
+            }
+            FfmpegEvent::LogEOF => break,
+            FfmpegEvent::Log(level, content) => {
+                if matches!(level, async_ffmpeg_sidecar::event::LogLevel::Error)
+                    && content.contains("Error")
+                {
+                    conversion_error = Some(content);
+                }
+            }
+            FfmpegEvent::Error(e) => {
+                conversion_error = Some(e);
+            }
+            _ => {} // ignore other event types
+        }
+    }
+
+    let status = child
+        .wait()
+        .await
+        .map_err(|e| format!("等待FFmpeg进程失败: {}", e))?;
+
+    if !status.success() {
+        let error_msg = conversion_error
+            .unwrap_or_else(|| format!("FFmpeg退出码: {}", status.code().unwrap_or(-1)));
+        return Err(format!("视频格式转换失败 ({}): {}", mode_name, error_msg));
+    }
+
+    reporter.update(&format!("视频格式转换完成 100% ({})", mode_name));
+    Ok(())
+}
+
+// Try stream-copy conversion (lossless and fast)
+pub async fn try_stream_copy_conversion(
+    source: &Path,
+    dest: &Path,
+    reporter: &ProgressReporter,
+) -> Result<(), String> {
+    reporter.update("正在转换视频格式... 0% (无损模式)");
+
+    // Build the ffmpeg command - stream copy mode
+    let mut cmd = tokio::process::Command::new(ffmpeg_path());
+    #[cfg(target_os = "windows")]
+    cmd.creation_flags(0x08000000); // CREATE_NO_WINDOW
+
+    cmd.args([
+        "-i",
+        &source.to_string_lossy(),
+        "-c:v",
+        "copy", // copy the video stream as-is, zero loss
+        "-c:a",
+        "copy", // copy the audio stream as-is, zero loss
+        "-avoid_negative_ts",
+        "make_zero", // fix timestamp issues
+        "-movflags",
+        "+faststart", // optimize for web playback
+        "-progress",
+        "pipe:2", // report progress on stderr
+        "-y", // overwrite output file
+        &dest.to_string_lossy(),
+    ]);
+
+    execute_ffmpeg_conversion(cmd, reporter, "无损转换").await
+}
+
+// High-quality re-encode conversion (good compatibility, high quality)
+pub async fn try_high_quality_conversion(
+    source: &Path,
+    dest: &Path,
+    reporter: &ProgressReporter,
+) -> Result<(), String> {
+    reporter.update("正在转换视频格式... 0% (高质量模式)");
+
+    // Build the ffmpeg command - high-quality re-encode
+    let mut cmd = tokio::process::Command::new(ffmpeg_path());
+    #[cfg(target_os = "windows")]
+    cmd.creation_flags(0x08000000); // CREATE_NO_WINDOW
+
+    cmd.args([
+        "-i",
+        &source.to_string_lossy(),
+        "-c:v",
+        "libx264", // H.264 encoder
+        "-preset",
+        "slow", // slow preset for better compression efficiency
+        "-crf",
+        "18", // high quality (range 18-23, lower is higher quality)
+        "-c:a",
+        "aac", // AAC audio encoder
+        "-b:a",
+        "192k", // high audio bitrate
+        "-avoid_negative_ts",
+        "make_zero", // fix timestamp issues
+        "-movflags",
+        "+faststart", // optimize for web playback
+        "-progress",
+        "pipe:2", // report progress on stderr
+        "-y", // overwrite output file
+        &dest.to_string_lossy(),
+    ]);
+
+    execute_ffmpeg_conversion(cmd, reporter, "高质量转换").await
+}
+
+// Video format conversion with progress (smart quality-preserving strategy)
+pub async fn convert_video_format(
+    source: &Path,
+    dest: &Path,
+    reporter: &ProgressReporter,
+) -> Result<(), String> {
+    // Try stream copy (lossless) first; fall back to high-quality re-encode on failure
+    match try_stream_copy_conversion(source, dest, reporter).await {
+        Ok(()) => Ok(()),
+        Err(stream_copy_error) => {
+            reporter.update("流复制失败,使用高质量重编码模式...");
+            log::warn!(
+                "Stream copy failed: {}, falling back to re-encoding",
+                stream_copy_error
+            );
+            try_high_quality_conversion(source, dest, reporter).await
+        }
+    }
+}
```
```diff
 // tests
 #[cfg(test)]
 mod tests {
     use super::*;
 
+    // Tests for the Range struct
+    #[test]
+    fn test_range_creation() {
+        let range = Range {
+            start: 10.0,
+            end: 30.0,
+        };
+        assert_eq!(range.start, 10.0);
+        assert_eq!(range.end, 30.0);
+        assert_eq!(range.duration(), 20.0);
+    }
+
+    #[test]
+    fn test_range_duration() {
+        let range = Range {
+            start: 0.0,
+            end: 60.0,
+        };
+        assert_eq!(range.duration(), 60.0);
+
+        let range2 = Range {
+            start: 15.5,
+            end: 45.5,
+        };
+        assert_eq!(range2.duration(), 30.0);
+    }
+
+    #[test]
+    fn test_range_display() {
+        let range = Range {
+            start: 5.0,
+            end: 25.0,
+        };
+        assert_eq!(range.to_string(), "[5, 25]");
+    }
+
+    #[test]
+    fn test_range_edge_cases() {
+        let zero_range = Range {
+            start: 0.0,
+            end: 0.0,
+        };
+        assert_eq!(zero_range.duration(), 0.0);
+
+        let negative_start = Range {
+            start: -5.0,
+            end: 10.0,
+        };
+        assert_eq!(negative_start.duration(), 15.0);
+
+        let large_range = Range {
+            start: 1000.0,
+            end: 2000.0,
+        };
+        assert_eq!(large_range.duration(), 1000.0);
+    }
+
+    // Test video metadata extraction
     #[tokio::test]
-    async fn test_get_video_size() {
-        let file = Path::new("/Users/xinreasuper/Desktop/shadowreplay-test/output2/[1789714684][1753965688317][摄像头被前夫抛妻弃子直播挣点奶粉][2025-07-31_12-58-14].mp4");
-        let resolution = get_video_resolution(file.to_str().unwrap()).await.unwrap();
-        println!("Resolution: {}", resolution);
-    }
+    async fn test_extract_video_metadata() {
+        let test_video = Path::new("tests/video/test.mp4");
+        if test_video.exists() {
+            let metadata = extract_video_metadata(test_video).await.unwrap();
+            assert!(metadata.duration > 0.0);
+            assert!(metadata.width > 0);
+            assert!(metadata.height > 0);
+        }
+    }
+
+    // Test audio duration retrieval
+    #[tokio::test]
+    async fn test_get_audio_duration() {
+        let test_audio = Path::new("tests/audio/test.wav");
+        if test_audio.exists() {
+            let duration = get_audio_duration(test_audio).await.unwrap();
+            assert!(duration > 0);
+        }
+    }
+
+    // Test video resolution retrieval
+    #[tokio::test]
+    async fn test_get_video_resolution() {
+        let file = Path::new("tests/video/h_test.m4s");
+        if file.exists() {
+            let resolution = get_video_resolution(file.to_str().unwrap()).await.unwrap();
+            assert_eq!(resolution, "1920x1080");
+        }
+    }
+
+    // Test thumbnail generation
+    #[tokio::test]
+    async fn test_generate_thumbnail() {
+        let file = Path::new("tests/video/test.mp4");
+        if file.exists() {
+            let thumbnail_file = generate_thumbnail(file, 0.0).await.unwrap();
+            assert!(thumbnail_file.exists());
+            assert_eq!(thumbnail_file.extension().unwrap(), "jpg");
+            // clean up
+            let _ = std::fs::remove_file(thumbnail_file);
+        }
+    }
+
+    // Test FFmpeg version check
+    #[tokio::test]
+    async fn test_check_ffmpeg() {
+        let result = check_ffmpeg().await;
+        match result {
+            Ok(version) => {
+                assert!(!version.is_empty());
+                // The version string may not contain the word "ffmpeg", so just check for digits
+                assert!(version.chars().any(|c| c.is_ascii_digit()));
+            }
+            Err(_) => {
+                // FFmpeg may not be installed, which is fine
+                println!("FFmpeg not available for testing");
+            }
+        }
+    }
+
+    // Test generic FFmpeg command
+    #[tokio::test]
+    async fn test_generic_ffmpeg_command() {
+        let result = generic_ffmpeg_command(&["-version"]).await;
+        match result {
+            Ok(_output) => {
+                // Output may be empty or lack the "ffmpeg" string; we only check the call succeeds
+                println!("FFmpeg command executed successfully");
+            }
+            Err(_) => {
+                // FFmpeg may not be installed, which is fine
+                println!("FFmpeg not available for testing");
+            }
+        }
+    }
+
+    // Test subtitle generation error handling
+    #[tokio::test]
+    async fn test_generate_video_subtitle_errors() {
+        let test_file = Path::new("tests/video/test.mp4");
+
+        // Whisper type - model not configured
+        let result =
+            generate_video_subtitle(None, test_file, "whisper", "", "", "", "", "zh").await;
+        assert!(result.is_err());
+        assert!(result.unwrap_err().contains("Whisper model not configured"));
+
+        // Whisper Online type - API key not configured
+        let result =
+            generate_video_subtitle(None, test_file, "whisper_online", "", "", "", "", "zh").await;
+        assert!(result.is_err());
+        assert!(result.unwrap_err().contains("API key not configured"));
+
+        // Unknown type
+        let result =
+            generate_video_subtitle(None, test_file, "unknown_type", "", "", "", "", "").await;
+        assert!(result.is_err());
+        assert!(result
+            .unwrap_err()
+            .contains("Unknown subtitle generator type"));
+    }
+
+    // Test path construction helpers
+    #[test]
+    fn test_ffmpeg_paths() {
+        let ffmpeg_path = ffmpeg_path();
+        let ffprobe_path = ffprobe_path();
+
+        #[cfg(windows)]
+        {
+            assert_eq!(ffmpeg_path.extension().unwrap(), "exe");
+            assert_eq!(ffprobe_path.extension().unwrap(), "exe");
+        }
+
+        #[cfg(not(windows))]
+        {
+            assert_eq!(ffmpeg_path.file_name().unwrap(), "ffmpeg");
+            assert_eq!(ffprobe_path.file_name().unwrap(), "ffprobe");
+        }
+    }
+
+    // Test error handling
+    #[tokio::test]
+    async fn test_error_handling() {
+        // Nonexistent file
+        let non_existent_file = Path::new("tests/nonexistent/test.mp4");
+        let result = extract_video_metadata(non_existent_file).await;
+        assert!(result.is_err());
+
+        let result = get_video_resolution("tests/nonexistent/test.mp4").await;
+        assert!(result.is_err());
+    }
+
+    // Test filename and path handling
+    #[test]
+    fn test_filename_processing() {
+        let test_file = Path::new("tests/video/test.mp4");
+
+        // Subtitle filename generation
+        let subtitle_filename = format!(
+            "{}{}",
+            constants::PREFIX_SUBTITLE,
+            test_file.file_name().unwrap().to_str().unwrap()
+        );
+        assert!(subtitle_filename.starts_with(constants::PREFIX_SUBTITLE));
+        assert!(subtitle_filename.contains("test.mp4"));
+
+        // Danmaku filename generation
+        let danmu_filename = format!(
+            "{}{}",
+            constants::PREFIX_DANMAKU,
+            test_file.file_name().unwrap().to_str().unwrap()
+        );
+        assert!(danmu_filename.starts_with(constants::PREFIX_DANMAKU));
+        assert!(danmu_filename.contains("test.mp4"));
+    }
+
+    // Test audio chunk directory structure
+    #[test]
+    fn test_audio_chunk_directory_structure() {
+        let test_file = Path::new("tests/audio/test.wav");
+        let output_path = test_file.with_extension("wav");
+        let output_dir = output_path.parent().unwrap();
+        let base_name = output_path.file_stem().unwrap().to_str().unwrap();
+        let chunk_dir = output_dir.join(format!("{}_chunks", base_name));
+
+        assert!(chunk_dir.to_string_lossy().contains("_chunks"));
+        assert!(chunk_dir.to_string_lossy().contains("test"));
+    }
 }
```
@@ -14,10 +14,27 @@ pub async fn get_config(state: state_type!()) -> Result<Config, ()> {
|
||||
#[allow(dead_code)]
|
||||
pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<(), String> {
|
||||
let old_cache_path = state.config.read().await.cache.clone();
|
||||
log::info!(
|
||||
"Try to set cache path: {} -> {}",
|
||||
old_cache_path,
|
||||
cache_path
|
||||
);
|
||||
if old_cache_path == cache_path {
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
let old_cache_path_obj = std::path::Path::new(&old_cache_path);
|
||||
let new_cache_path_obj = std::path::Path::new(&cache_path);
|
||||
// check if new cache path is under old cache path
|
||||
if new_cache_path_obj.starts_with(old_cache_path_obj) {
|
||||
log::error!(
|
||||
"New cache path is under old cache path: {} -> {}",
|
||||
old_cache_path,
|
||||
cache_path
|
||||
);
|
||||
return Err("New cache path cannot be under old cache path".to_string());
|
||||
}
|
||||
|
||||
state.recorder_manager.set_migrating(true).await;
|
||||
// stop and clear all recorders
|
||||
state.recorder_manager.stop_all().await;
|
||||
@@ -52,9 +69,11 @@ pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<
|
||||
if entry.is_dir() {
|
||||
if let Err(e) = crate::handlers::utils::copy_dir_all(entry, &new_entry) {
|
||||
log::error!("Copy old cache to new cache error: {}", e);
|
||||
return Err(e.to_string());
|
||||
}
|
||||
} else if let Err(e) = std::fs::copy(entry, &new_entry) {
|
||||
log::error!("Copy old cache to new cache error: {}", e);
|
||||
return Err(e.to_string());
|
||||
}
|
||||
}
|
||||
|
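The migration refuses to move the cache into a subdirectory of itself, guarding with a plain `Path::starts_with` check before any copying happens. A minimal sketch of that guard (note that `starts_with` compares path components lexically, so real code may want to canonicalize both paths first):

```rust
use std::path::Path;

// Returns true when `new_path` is lexically inside `old_path`,
// mirroring the guard used before migrating the cache directory.
fn is_nested(old_path: &str, new_path: &str) -> bool {
    Path::new(new_path).starts_with(Path::new(old_path))
}

fn main() {
    assert!(is_nested("/data/cache", "/data/cache/sub"));
    // starts_with matches whole components, not string prefixes:
    assert!(!is_nested("/data/cache", "/data/cache2"));
    assert!(!is_nested("/data/cache", "/data/other"));
    println!("ok");
}
```

The handler returns early when old and new paths are equal, which is why the equal-path case never reaches this check.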

@@ -79,12 +98,30 @@ pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<

#[cfg_attr(feature = "gui", tauri::command)]
#[allow(dead_code)]
pub async fn set_output_path(state: state_type!(), output_path: String) -> Result<(), ()> {
pub async fn set_output_path(state: state_type!(), output_path: String) -> Result<(), String> {
    let mut config = state.config.write().await;
    let old_output_path = config.output.clone();
    log::info!(
        "Try to set output path: {} -> {}",
        old_output_path,
        output_path
    );
    if old_output_path == output_path {
        return Ok(());
    }

    let old_output_path_obj = std::path::Path::new(&old_output_path);
    let new_output_path_obj = std::path::Path::new(&output_path);
    // check if new output path is under old output path
    if new_output_path_obj.starts_with(old_output_path_obj) {
        log::error!(
            "New output path is under old output path: {} -> {}",
            old_output_path,
            output_path
        );
        return Err("New output path cannot be under old output path".to_string());
    }

    // list all file and folder in old output
    let mut old_output_entries = vec![];
    if let Ok(entries) = std::fs::read_dir(&old_output_path) {
@@ -103,10 +140,12 @@ pub async fn set_output_path(state: state_type!(), output_path: String) -> Resul
    // if entry is a folder
    if entry.is_dir() {
        if let Err(e) = crate::handlers::utils::copy_dir_all(entry, &new_entry) {
            log::error!("Copy old cache to new cache error: {}", e);
            log::error!("Copy old output to new output error: {}", e);
            return Err(e.to_string());
        }
    } else if let Err(e) = std::fs::copy(entry, &new_entry) {
        log::error!("Copy old cache to new cache error: {}", e);
        log::error!("Copy old output to new output error: {}", e);
        return Err(e.to_string());
    }
}

@@ -114,10 +153,10 @@ pub async fn set_output_path(state: state_type!(), output_path: String) -> Resul
    for entry in old_output_entries {
        if entry.is_dir() {
            if let Err(e) = std::fs::remove_dir_all(&entry) {
                log::error!("Remove old cache error: {}", e);
                log::error!("Remove old output error: {}", e);
            }
        } else if let Err(e) = std::fs::remove_file(&entry) {
            log::error!("Remove old cache error: {}", e);
            log::error!("Remove old output error: {}", e);
        }
    }

@@ -251,4 +290,27 @@ pub async fn update_user_agent(state: state_type!(), user_agent: String) -> Resu
    log::info!("Updating user agent to {}", user_agent);
    state.config.write().await.set_user_agent(&user_agent);
    Ok(())
}
}

#[cfg_attr(feature = "gui", tauri::command)]
#[cfg(feature = "gui")]
pub async fn update_cleanup_source_flv(state: state_type!(), cleanup: bool) -> Result<(), ()> {
    log::info!("Updating cleanup source FLV after import to {}", cleanup);
    state.config.write().await.set_cleanup_source_flv(cleanup);
    Ok(())
}

#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_webhook_url(state: state_type!(), webhook_url: String) -> Result<(), ()> {
    log::info!("Updating webhook url to {}", webhook_url);
    let _ = state
        .webhook_poster
        .update_config(crate::webhook::poster::WebhookConfig {
            url: webhook_url.clone(),
            ..Default::default()
        })
        .await;
    state.config.write().await.webhook_url = webhook_url;
    state.config.write().await.save();
    Ok(())
}
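`update_webhook_url` takes the config write lock twice in a row: once to set the field, once to save. Whether that can interleave with another writer depends on the lock in use, but the common pattern is to hold one guard across both steps. A std-only sketch of that pattern (the `Config` type and `save` counter here are stand-ins, not the project's actual config type):

```rust
use std::sync::RwLock;

#[derive(Default)]
struct Config {
    webhook_url: String,
    saved: u32,
}

impl Config {
    fn save(&mut self) {
        self.saved += 1; // stand-in for persisting to disk
    }
}

fn main() {
    let config = RwLock::new(Config::default());
    {
        // One guard covers both the field update and the save,
        // so no other writer can interleave between them.
        let mut cfg = config.write().unwrap();
        cfg.webhook_url = "http://example.com/hook".to_string();
        cfg.save();
    }
    let cfg = config.read().unwrap();
    assert_eq!(cfg.webhook_url, "http://example.com/hook");
    assert_eq!(cfg.saved, 1);
    println!("ok");
}
```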

@@ -7,6 +7,7 @@ use crate::recorder::RecorderInfo;
use crate::recorder_manager::RecorderList;
use crate::state::State;
use crate::state_type;
use crate::webhook::events;

#[cfg(feature = "gui")]
use tauri::State as TauriState;
@@ -24,6 +25,7 @@ pub async fn add_recorder(
    state: state_type!(),
    platform: String,
    room_id: u64,
    extra: String,
) -> Result<RecorderRow, String> {
    log::info!("Add recorder: {} {}", platform, room_id);
    let platform = PlatformType::from_str(&platform).unwrap();
@@ -50,15 +52,23 @@ pub async fn add_recorder(
    match account {
        Ok(account) => match state
            .recorder_manager
            .add_recorder(&account, platform, room_id, true)
            .add_recorder(&account, platform, room_id, &extra, true)
            .await
        {
            Ok(()) => {
                let room = state.db.add_recorder(platform, room_id).await?;
                let room = state.db.add_recorder(platform, room_id, &extra).await?;
                state
                    .db
                    .new_message("添加直播间", &format!("添加了新直播间 {}", room_id))
                    .await?;
                // post webhook event
                let event = events::new_webhook_event(
                    events::RECORDER_ADDED,
                    events::Payload::Recorder(room.clone()),
                );
                if let Err(e) = state.webhook_poster.post_event(&event).await {
                    log::error!("Post webhook event error: {}", e);
                }
                Ok(room)
            }
            Err(e) => {
@@ -86,11 +96,19 @@ pub async fn remove_recorder(
        .remove_recorder(platform, room_id)
        .await
    {
        Ok(()) => {
        Ok(recorder) => {
            state
                .db
                .new_message("移除直播间", &format!("移除了直播间 {}", room_id))
                .await?;
            // post webhook event
            let event = events::new_webhook_event(
                events::RECORDER_REMOVED,
                events::Payload::Recorder(recorder),
            );
            if let Err(e) = state.webhook_poster.post_event(&event).await {
                log::error!("Post webhook event error: {}", e);
            }
            log::info!("Removed recorder: {} {}", platform.as_str(), room_id);
            Ok(())
        }
@@ -120,8 +138,21 @@ pub async fn get_room_info(
}

#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_archives(state: state_type!(), room_id: u64) -> Result<Vec<RecordRow>, String> {
    Ok(state.recorder_manager.get_archives(room_id).await?)
pub async fn get_archive_disk_usage(state: state_type!()) -> Result<u64, String> {
    Ok(state.recorder_manager.get_archive_disk_usage().await?)
}

#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_archives(
    state: state_type!(),
    room_id: u64,
    offset: u64,
    limit: u64,
) -> Result<Vec<RecordRow>, String> {
    Ok(state
        .recorder_manager
        .get_archives(room_id, offset, limit)
        .await?)
}
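`get_archives` now pages results with an `offset`/`limit` pair instead of returning every record. The backend presumably pushes this down to SQL (`LIMIT ? OFFSET ?`); an in-memory equivalent of the same semantics:

```rust
// In-memory equivalent of offset/limit paging; the real handler
// presumably delegates this to SQL (LIMIT ? OFFSET ?).
fn page<T: Clone>(rows: &[T], offset: u64, limit: u64) -> Vec<T> {
    rows.iter()
        .skip(offset as usize)
        .take(limit as usize)
        .cloned()
        .collect()
}

fn main() {
    let rows: Vec<u32> = (0..10).collect();
    assert_eq!(page(&rows, 0, 3), vec![0, 1, 2]);
    assert_eq!(page(&rows, 8, 5), vec![8, 9]); // clamped at the end
    assert!(page(&rows, 20, 5).is_empty());
    println!("ok");
}
```

`skip`/`take` clamp gracefully, so an offset past the end yields an empty page rather than an error, which matches typical pagination behavior.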

#[cfg_attr(feature = "gui", tauri::command)]
@@ -181,7 +212,7 @@ pub async fn delete_archive(
    if platform.is_none() {
        return Err("Unsupported platform".to_string());
    }
    state
    let to_delete = state
        .recorder_manager
        .delete_archive(platform.unwrap(), room_id, &live_id)
        .await?;
@@ -192,6 +223,49 @@ pub async fn delete_archive(
            &format!("删除了房间 {} 的历史缓存 {}", room_id, live_id),
        )
        .await?;
    // post webhook event
    let event =
        events::new_webhook_event(events::ARCHIVE_DELETED, events::Payload::Archive(to_delete));
    if let Err(e) = state.webhook_poster.post_event(&event).await {
        log::error!("Post webhook event error: {}", e);
    }
    Ok(())
}

#[cfg_attr(feature = "gui", tauri::command)]
pub async fn delete_archives(
    state: state_type!(),
    platform: String,
    room_id: u64,
    live_ids: Vec<String>,
) -> Result<(), String> {
    let platform = PlatformType::from_str(&platform);
    if platform.is_none() {
        return Err("Unsupported platform".to_string());
    }
    let to_deletes = state
        .recorder_manager
        .delete_archives(
            platform.unwrap(),
            room_id,
            &live_ids.iter().map(|s| s.as_str()).collect::<Vec<&str>>(),
        )
        .await?;
    state
        .db
        .new_message(
            "删除历史缓存",
            &format!("删除了房间 {} 的历史缓存 {}", room_id, live_ids.join(", ")),
        )
        .await?;
    for to_delete in to_deletes {
        // post webhook event
        let event =
            events::new_webhook_event(events::ARCHIVE_DELETED, events::Payload::Archive(to_delete));
        if let Err(e) = state.webhook_poster.post_event(&event).await {
            log::error!("Post webhook event error: {}", e);
        }
    }
    Ok(())
}


@@ -302,3 +302,128 @@ pub async fn list_folder(_state: state_type!(), path: String) -> Result<Vec<Stri
    }
    Ok(files)
}

/// Advanced filename sanitizer that handles all kinds of dangerous and control characters
///
/// Intended for scenarios requiring strict filename sanitization; supports Chinese characters
///
/// # Arguments
/// - `name`: the filename to sanitize
/// - `max_length`: maximum length limit (default: 100 characters)
///
/// # Returns
/// A thoroughly sanitized, safe filename
#[cfg(feature = "headless")]
pub fn sanitize_filename_advanced(name: &str, max_length: Option<usize>) -> String {
    let max_len = max_length.unwrap_or(100);

    // First, sanitize every character
    let cleaned: String = name
        .chars()
        .map(|c| match c {
            // Characters dangerous to file systems
            '\\' | '/' | ':' | '*' | '?' | '"' | '<' | '>' | '|' => '_',
            // Control and invisible characters
            c if c.is_control() => '_',
            // Keep safe characters (whitelist)
            c if c.is_alphanumeric()
                || c == ' '
                || c == '.'
                || c == '-'
                || c == '_'
                || c == '('
                || c == ')'
                || c == '['
                || c == ']'
                || c == '《'
                || c == '》'
                || c == '('
                || c == ')' =>
            {
                c
            }
            // Replace everything else with an underscore
            _ => '_',
        })
        .collect();

    // If the cleaned name is within the limit, return it as-is
    if cleaned.chars().count() <= max_len {
        return cleaned;
    }

    // Smart truncation: preserve the file extension
    if let Some(dot_pos) = cleaned.rfind('.') {
        let extension = &cleaned[dot_pos..];
        let main_part = &cleaned[..dot_pos];

        // Make sure the extension is not too long (at most 10 characters, including the dot)
        if extension.chars().count() <= 10 {
            let ext_len = extension.chars().count();
            let available_for_main = max_len.saturating_sub(ext_len);

            if available_for_main > 0 {
                let truncated_main: String = main_part.chars().take(available_for_main).collect();
                return format!("{}{}", truncated_main, extension);
            }
        }
    }

    // If there is no extension, or the extension is too long, truncate directly
    cleaned.chars().take(max_len).collect()
}

#[cfg(test)]
mod tests {
    #[test]
    #[cfg(feature = "headless")]
    fn test_sanitize_filename_advanced() {
        use super::sanitize_filename_advanced;

        assert_eq!(
            sanitize_filename_advanced("test<>file.txt", None),
            "test__file.txt"
        );
        assert_eq!(sanitize_filename_advanced("文件名.txt", None), "文件名.txt");
        assert_eq!(
            sanitize_filename_advanced("《视频》(高清).mp4", None),
            "《视频》(高清).mp4"
        );
        assert_eq!(
            sanitize_filename_advanced("file\x00with\x01control.txt", None),
            "file_with_control.txt"
        );

        // Test whitespace handling (the function does not strip whitespace automatically)
        assert_eq!(
            sanitize_filename_advanced(" .hidden_file.txt ", None),
            " .hidden_file.txt "
        );
        assert_eq!(
            sanitize_filename_advanced(" normal_file.mp4 ", None),
            " normal_file.mp4 "
        );

        // Test special-character replacement
        assert_eq!(
            sanitize_filename_advanced("file@#$%^&.txt", None),
            "file______.txt"
        );

        // Test the length limit - no extension
        let long_name = "测试".repeat(60);
        let result = sanitize_filename_advanced(&long_name, Some(10));
        assert_eq!(result.chars().count(), 10);

        // Test the length limit - with extension
        let long_name_with_ext = format!("{}.txt", "测试".repeat(60));
        let result = sanitize_filename_advanced(&long_name_with_ext, Some(10));
        assert!(result.ends_with(".txt"));
        assert_eq!(result.chars().count(), 10); // 6 characters + ".txt" (4 characters)

        // Test that short filenames are not truncated
        let short_name = "test.mp4";
        let result = sanitize_filename_advanced(short_name, Some(50));
        assert_eq!(result, "test.mp4");
    }
}
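A detail worth noting in the sanitizer above: both the length check and the truncation are done in `chars()`, not bytes. Byte-slicing a multi-byte name (e.g. Chinese) can split a codepoint and panic. A minimal demonstration of why character-based truncation is the safe choice:

```rust
// Truncation by characters: safe on multi-byte UTF-8, unlike byte slicing.
fn truncate_chars(s: &str, max: usize) -> String {
    s.chars().take(max).collect()
}

fn main() {
    let name = "测试测试测试"; // 6 chars, 18 bytes
    let cut = truncate_chars(name, 4);
    assert_eq!(cut, "测试测试");
    assert_eq!(cut.chars().count(), 4);
    // A naive byte slice like &name[..4] would panic, since
    // byte offset 4 falls inside a 3-byte codepoint:
    assert!(!name.is_char_boundary(4));
    println!("ok");
}
```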
File diff suppressed because it is too large
@@ -1,15 +1,14 @@
// Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]

mod archive_migration;
mod config;
mod constants;
mod danmu2ass;
mod database;
mod ffmpeg;
mod handlers;
#[cfg(feature = "headless")]
mod http_server;
#[cfg(feature = "headless")]
mod migration;
mod progress_manager;
mod progress_reporter;
@@ -19,12 +18,15 @@ mod state;
mod subtitle_generator;
#[cfg(feature = "gui")]
mod tray;
mod webhook;

use archive_migration::try_rebuild_archives;
use async_std::fs;
use chrono::Utc;
use config::Config;
use database::Database;
use migration::migration_methods::try_convert_clip_covers;
use migration::migration_methods::try_convert_live_covers;
use migration::migration_methods::try_rebuild_archives;
use recorder::bilibili::client::BiliClient;
use recorder::PlatformType;
use recorder_manager::RecorderManager;
@@ -165,6 +167,32 @@ fn get_migrations() -> Vec<Migration> {
        sql: r#"ALTER TABLE accounts ADD COLUMN id_str TEXT;"#,
        kind: MigrationKind::Up,
    },
    // add extra column to recorders
    Migration {
        version: 6,
        description: "add_extra_column_to_recorders",
        sql: r#"ALTER TABLE recorders ADD COLUMN extra TEXT;"#,
        kind: MigrationKind::Up,
    },
    // add indexes
    Migration {
        version: 7,
        description: "add_indexes",
        sql: r#"
        CREATE INDEX idx_records_live_id ON records (room_id, live_id);
        CREATE INDEX idx_records_created_at ON records (room_id, created_at);
        CREATE INDEX idx_videos_room_id ON videos (room_id);
        CREATE INDEX idx_videos_created_at ON videos (created_at);
        "#,
        kind: MigrationKind::Up,
    },
    // add note column for video
    Migration {
        version: 8,
        description: "add_note_column_for_video",
        sql: r#"ALTER TABLE videos ADD COLUMN note TEXT;"#,
        kind: MigrationKind::Up,
    },
]
}
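The migration list carries monotonically increasing versions (6, 7, 8 added here); on startup, the runner applies those above the stored schema version, in order. The project delegates the actual execution to its migration machinery (sqlx/Tauri), but the version-selection step can be sketched on its own:

```rust
// Minimal stand-in for a migration entry: (version, sql).
// Returns the versions still to apply, strictly ordered.
fn pending(migrations: &[(i64, &str)], current: i64) -> Vec<i64> {
    let mut versions: Vec<i64> = migrations
        .iter()
        .filter(|(v, _)| *v > current)
        .map(|(v, _)| *v)
        .collect();
    versions.sort_unstable(); // apply strictly in version order
    versions
}

fn main() {
    let migrations = [
        (6, "ALTER TABLE recorders ADD COLUMN extra TEXT;"),
        (8, "ALTER TABLE videos ADD COLUMN note TEXT;"),
        (7, "CREATE INDEX idx_videos_room_id ON videos (room_id);"),
    ];
    // A database already at version 6 only needs 7 and 8:
    assert_eq!(pending(&migrations, 6), vec![7, 8]);
    assert!(pending(&migrations, 8).is_empty());
    println!("ok");
}
```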

@@ -242,7 +270,14 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err

    let progress_manager = Arc::new(ProgressManager::new());
    let emitter = EventEmitter::new(progress_manager.get_event_sender());
    let recorder_manager = Arc::new(RecorderManager::new(emitter, db.clone(), config.clone()));
    let webhook_poster =
        webhook::poster::create_webhook_poster(&config.read().await.webhook_url, None).unwrap();
    let recorder_manager = Arc::new(RecorderManager::new(
        emitter,
        db.clone(),
        config.clone(),
        webhook_poster.clone(),
    ));

    // Update account infos for headless mode
    let accounts = db.get_accounts().await?;
@@ -301,11 +336,14 @@ async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Err
    }

    let _ = try_rebuild_archives(&db, config.read().await.cache.clone().into()).await;
    let _ = try_convert_live_covers(&db, config.read().await.cache.clone().into()).await;
    let _ = try_convert_clip_covers(&db, config.read().await.output.clone().into()).await;

    Ok(State {
        db,
        client,
        config,
        webhook_poster,
        recorder_manager,
        progress_manager,
        readonly: args.readonly,
@@ -349,12 +387,15 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
    };
    db_clone.set(sqlite_pool.unwrap().clone()).await;
    db_clone.finish_pending_tasks().await?;
    let webhook_poster =
        webhook::poster::create_webhook_poster(&config.read().await.webhook_url, None).unwrap();

    let recorder_manager = Arc::new(RecorderManager::new(
        app.app_handle().clone(),
        emitter,
        db.clone(),
        config.clone(),
        webhook_poster.clone(),
    ));

    let accounts = db_clone.get_accounts().await?;
@@ -366,10 +407,11 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
            config,
            recorder_manager,
            app_handle: app.handle().clone(),
            webhook_poster,
        });
    }

    // update account infos
    // update account info
    for account in accounts {
        let platform = PlatformType::from_str(&account.platform).unwrap();

@@ -426,9 +468,12 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::

    // try to rebuild archive table
    let cache_path = config_clone.read().await.cache.clone();
    if let Err(e) = try_rebuild_archives(&db_clone, cache_path.into()).await {
    let output_path = config_clone.read().await.output.clone();
    if let Err(e) = try_rebuild_archives(&db_clone, cache_path.clone().into()).await {
        log::warn!("Rebuilding archive table failed: {}", e);
    }
    let _ = try_convert_live_covers(&db_clone, cache_path.into()).await;
    let _ = try_convert_clip_covers(&db_clone, output_path.into()).await;

    Ok(State {
        db,
@@ -436,6 +481,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
        config,
        recorder_manager,
        app_handle: app.handle().clone(),
        webhook_poster,
    })
}

@@ -505,6 +551,8 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
    crate::handlers::config::update_status_check_interval,
    crate::handlers::config::update_whisper_language,
    crate::handlers::config::update_user_agent,
    crate::handlers::config::update_cleanup_source_flv,
    crate::handlers::config::update_webhook_url,
    crate::handlers::message::get_messages,
    crate::handlers::message::read_message,
    crate::handlers::message::delete_message,
@@ -512,11 +560,13 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
    crate::handlers::recorder::add_recorder,
    crate::handlers::recorder::remove_recorder,
    crate::handlers::recorder::get_room_info,
    crate::handlers::recorder::get_archive_disk_usage,
    crate::handlers::recorder::get_archives,
    crate::handlers::recorder::get_archive,
    crate::handlers::recorder::get_archive_subtitle,
    crate::handlers::recorder::generate_archive_subtitle,
    crate::handlers::recorder::delete_archive,
    crate::handlers::recorder::delete_archives,
    crate::handlers::recorder::get_danmu_record,
    crate::handlers::recorder::export_danmu,
    crate::handlers::recorder::send_danmaku,
@@ -538,8 +588,14 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
    crate::handlers::video::generate_video_subtitle,
    crate::handlers::video::get_video_subtitle,
    crate::handlers::video::update_video_subtitle,
    crate::handlers::video::update_video_note,
    crate::handlers::video::encode_video_subtitle,
    crate::handlers::video::generic_ffmpeg_command,
    crate::handlers::video::import_external_video,
    crate::handlers::video::batch_import_external_videos,
    crate::handlers::video::clip_video,
    crate::handlers::video::get_file_size,
    crate::handlers::video::get_import_progress,
    crate::handlers::task::get_tasks,
    crate::handlers::task::delete_task,
    crate::handlers::utils::show_in_folder,
src-tauri/src/migration/migration_methods.rs (new file, 129 lines)
@@ -0,0 +1,129 @@
use std::path::PathBuf;
use std::sync::Arc;

use base64::Engine;
use chrono::Utc;

use crate::database::Database;
use crate::recorder::PlatformType;

pub async fn try_rebuild_archives(
    db: &Arc<Database>,
    cache_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
    let rooms = db.get_recorders().await?;
    for room in rooms {
        let room_id = room.room_id;
        let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
        let mut files = tokio::fs::read_dir(room_cache_path).await?;
        while let Some(file) = files.next_entry().await? {
            if file.file_type().await?.is_dir() {
                // use folder name as live_id
                let live_id = file.file_name();
                let live_id = live_id.to_str().unwrap();
                // check if live_id is in db
                let record = db.get_record(room_id, live_id).await;
                if record.is_ok() {
                    continue;
                }

                // get created_at from folder metadata
                let metadata = file.metadata().await?;
                let created_at = metadata.created();
                if created_at.is_err() {
                    continue;
                }
                let created_at = created_at.unwrap();
                let created_at = chrono::DateTime::<Utc>::from(created_at)
                    .format("%Y-%m-%dT%H:%M:%S.%fZ")
                    .to_string();
                // create a record for this live_id
                let record = db
                    .add_record(
                        PlatformType::from_str(room.platform.as_str()).unwrap(),
                        live_id,
                        room_id,
                        &format!("UnknownLive {}", live_id),
                        None,
                        Some(&created_at),
                    )
                    .await?;

                log::info!("rebuild archive {:?}", record);
            }
        }
    }
    Ok(())
}

pub async fn try_convert_live_covers(
    db: &Arc<Database>,
    cache_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
    let rooms = db.get_recorders().await?;
    for room in rooms {
        let room_id = room.room_id;
        let room_cache_path = cache_path.join(format!("{}/{}", room.platform, room_id));
        let records = db.get_records(room_id, 0, 999999999).await?;
        for record in &records {
            let record_path = room_cache_path.join(record.live_id.clone());
            let cover = record.cover.clone();
            if cover.is_none() {
                continue;
            }

            let cover = cover.unwrap();
            if cover.starts_with("data:") {
                let base64 = cover.split("base64,").nth(1).unwrap();
                let bytes = base64::engine::general_purpose::STANDARD
                    .decode(base64)
                    .unwrap();
                let path = record_path.join("cover.jpg");
                tokio::fs::write(&path, bytes).await?;

                log::info!("convert live cover: {}", path.display());
                // update record
                db.update_record_cover(
                    record.live_id.as_str(),
                    Some(format!(
                        "{}/{}/{}/cover.jpg",
                        room.platform, room_id, record.live_id
                    )),
                )
                .await?;
            }
        }
    }
    Ok(())
}

pub async fn try_convert_clip_covers(
    db: &Arc<Database>,
    output_path: PathBuf,
) -> Result<(), Box<dyn std::error::Error>> {
    let videos = db.get_all_videos().await?;
    log::debug!("videos: {}", videos.len());
    for video in &videos {
        let cover = video.cover.clone();
        if cover.starts_with("data:") {
            let base64 = cover.split("base64,").nth(1).unwrap();
            let bytes = base64::engine::general_purpose::STANDARD
                .decode(base64)
                .unwrap();

            let video_file_path = output_path.join(video.file.clone());
            let cover_file_path = video_file_path.with_extension("jpg");
            log::debug!("cover_file_path: {}", cover_file_path.display());
            tokio::fs::write(&cover_file_path, bytes).await?;

            log::info!("convert clip cover: {}", cover_file_path.display());
            // update record
            db.update_video_cover(
                video.id,
                cover_file_path.file_name().unwrap().to_str().unwrap(),
            )
            .await?;
        }
    }
    Ok(())
}
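Both cover converters detect inline `data:` URLs and split the base64 payload on `"base64,"` before decoding it with the `base64` crate and writing the bytes to a `cover.jpg`. A std-only sketch of the payload-extraction step (the actual decode uses `base64::engine::general_purpose::STANDARD` as in the new file):

```rust
// Extract the base64 payload from a data: URL, as the cover converters
// do with `cover.split("base64,").nth(1)`. Returns None for plain paths.
fn data_url_payload(url: &str) -> Option<&str> {
    if !url.starts_with("data:") {
        return None;
    }
    url.split("base64,").nth(1)
}

fn main() {
    let cover = "data:image/jpeg;base64,/9j/4AAQSkZJRg==";
    assert_eq!(data_url_payload(cover), Some("/9j/4AAQSkZJRg=="));
    // Already-migrated covers are relative file paths, left untouched:
    assert_eq!(data_url_payload("bilibili/123/live1/cover.jpg"), None);
    println!("ok");
}
```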
@@ -1,3 +1,5 @@
pub mod migration_methods;

use sqlx::migrate::MigrationType;

#[derive(Debug)]
@@ -6,6 +8,7 @@ pub enum MigrationKind {
    Down,
}

#[cfg(feature = "headless")]
#[derive(Debug)]
pub struct Migration {
    pub version: i64,
@@ -30,7 +30,7 @@ pub struct ProgressManager {
#[cfg(feature = "headless")]
impl ProgressManager {
    pub fn new() -> Self {
        let (progress_sender, progress_receiver) = broadcast::channel(16);
        let (progress_sender, progress_receiver) = broadcast::channel(256);
        Self {
            progress_sender,
            progress_receiver,
@@ -9,6 +9,7 @@ use crate::database::account::AccountRow;
use crate::ffmpeg::get_video_resolution;
use crate::progress_manager::Event;
use crate::progress_reporter::EventEmitter;
use crate::recorder::Recorder;
use crate::recorder_manager::RecorderEvent;
use crate::subtitle_generator::item_to_srt;

@@ -21,9 +22,9 @@ use danmu_stream::provider::ProviderType;
use danmu_stream::DanmuMessageType;
use errors::BiliClientError;
use m3u8_rs::{Playlist, QuotedOrUnquoted, VariantStream};
use rand::seq::SliceRandom;
use regex::Regex;
use std::path::Path;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::time::Duration;
use tokio::fs::File;
@@ -62,12 +63,11 @@ pub struct BiliRecorder {
    cover: Arc<RwLock<Option<String>>>,
    entry_store: Arc<RwLock<Option<EntryStore>>>,
    is_recording: Arc<RwLock<bool>>,
    force_update: Arc<AtomicBool>,
    last_update: Arc<RwLock<i64>>,
    quit: Arc<Mutex<bool>>,
    live_stream: Arc<RwLock<Option<BiliStream>>>,
    danmu_storage: Arc<RwLock<Option<DanmuStorage>>>,
    live_end_channel: broadcast::Sender<RecorderEvent>,
    event_channel: broadcast::Sender<RecorderEvent>,
    enabled: Arc<RwLock<bool>>,
    last_segment_offset: Arc<RwLock<Option<i64>>>, // offset of the last segment handled in the previous pass
    current_header_info: Arc<RwLock<Option<HeaderInfo>>>, // current resolution info
@@ -121,9 +121,18 @@ impl BiliRecorder {
    if room_info.live_status == 1 {
        live_status = true;

        let room_cover_path = Path::new(PlatformType::BiliBili.as_str())
            .join(options.room_id.to_string())
            .join("cover.jpg");
        let full_room_cover_path =
            Path::new(&options.config.read().await.cache).join(&room_cover_path);
        // Get cover image
        if let Ok(cover_base64) = client.get_cover_base64(&room_info.room_cover_url).await {
            cover = Some(cover_base64);
        if (client
            .download_file(&room_info.room_cover_url, &full_room_cover_path)
            .await)
            .is_ok()
        {
            cover = Some(room_cover_path.to_str().unwrap().to_string());
        }
    }

@@ -144,11 +153,10 @@ impl BiliRecorder {
    live_id: Arc::new(RwLock::new(String::new())),
    cover: Arc::new(RwLock::new(cover)),
    last_update: Arc::new(RwLock::new(Utc::now().timestamp())),
    force_update: Arc::new(AtomicBool::new(false)),
    quit: Arc::new(Mutex::new(false)),
    live_stream: Arc::new(RwLock::new(None)),
    danmu_storage: Arc::new(RwLock::new(None)),
    live_end_channel: options.channel,
    event_channel: options.channel,
    enabled: Arc::new(RwLock::new(options.auto_start)),
    last_segment_offset: Arc::new(RwLock::new(None)),
    current_header_info: Arc::new(RwLock::new(None)),
@@ -216,41 +224,54 @@ impl BiliRecorder {
    }

    // Get cover image
    if let Ok(cover_base64) = self
    let room_cover_path = Path::new(PlatformType::BiliBili.as_str())
        .join(self.room_id.to_string())
        .join("cover.jpg");
    let full_room_cover_path =
        Path::new(&self.config.read().await.cache).join(&room_cover_path);
    if (self
        .client
        .read()
        .await
        .get_cover_base64(&room_info.room_cover_url)
        .await
        .download_file(&room_info.room_cover_url, &full_room_cover_path)
        .await)
        .is_ok()
    {
        *self.cover.write().await = Some(cover_base64);
        *self.cover.write().await =
            Some(room_cover_path.to_str().unwrap().to_string());
    }
} else if self.config.read().await.live_end_notify {
    #[cfg(feature = "gui")]
    self.app_handle
        .notification()
        .builder()
        .title("BiliShadowReplay - 直播结束")
        .body(format!(
            "{} 的直播结束了",
            self.user_info.read().await.user_name
        ))
        .show()
        .unwrap();
    let _ = self.live_end_channel.send(RecorderEvent::LiveEnd {
    let _ = self.event_channel.send(RecorderEvent::LiveStart {
        recorder: self.info().await,
    });
} else {
    if self.config.read().await.live_end_notify {
        #[cfg(feature = "gui")]
        self.app_handle
            .notification()
            .builder()
            .title("BiliShadowReplay - 直播结束")
            .body(format!(
                "{} 的直播结束了",
                self.user_info.read().await.user_name
            ))
            .show()
            .unwrap();
    }
    let _ = self.event_channel.send(RecorderEvent::LiveEnd {
        platform: PlatformType::BiliBili,
        room_id: self.room_id,
        live_id: self.live_id.read().await.clone(),
        recorder: self.info().await,
    });
}

// just doing reset
// just doing reset, cuz live status is changed
self.reset().await;
}

*self.live_status.write().await = live_status;

if !live_status {
    // reset cuz live is ended
    self.reset().await;

    return false;
@@ -293,32 +314,36 @@ impl BiliRecorder {
let variant = match master_manifest {
Playlist::MasterPlaylist(playlist) => {
let variants = playlist.variants.clone();
variants.into_iter().find(|variant| {
if let Some(other_attributes) = &variant.other_attributes {
if let Some(QuotedOrUnquoted::Quoted(bili_display)) =
other_attributes.get("BILI-DISPLAY")
{
bili_display == "原画"
variants
.into_iter()
.filter(|variant| {
if let Some(other_attributes) = &variant.other_attributes {
if let Some(QuotedOrUnquoted::Quoted(bili_display)) =
other_attributes.get("BILI-DISPLAY")
{
bili_display == "原画"
} else {
false
}
} else {
false
}
} else {
false
}
})
})
.collect::<Vec<_>>()
}
_ => {
log::error!("[{}]Master manifest is not a media playlist", self.room_id);
None
vec![]
}
};

if variant.is_none() {
if variant.is_empty() {
log::error!("[{}]No variant found", self.room_id);
return true;
}

let variant = variant.unwrap();
// random select a variant
let variant = variant.choose(&mut rand::thread_rng()).unwrap();

let new_stream = self.stream_from_variant(variant).await;
if new_stream.is_err() {
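The hunk above switches original-quality ("原画") selection from `find` (first match only) to `filter` + `collect`, with a random pick among all matches afterwards. A standalone sketch of that filtering, using a hypothetical `Variant` stand-in for m3u8-rs's `VariantStream` and leaving the random choice out:

```rust
// Hypothetical stand-in for m3u8-rs's VariantStream; only the fields
// needed to illustrate the "原画" (original quality) filter.
#[derive(Debug, Clone, PartialEq)]
struct Variant {
    uri: String,
    bili_display: Option<String>,
}

// Keep every variant whose BILI-DISPLAY attribute equals "原画",
// mirroring the filter + collect in the diff above.
fn original_quality_variants(variants: Vec<Variant>) -> Vec<Variant> {
    variants
        .into_iter()
        .filter(|v| v.bili_display.as_deref() == Some("原画"))
        .collect()
}
```

In the real code the filtered list is then fed to `choose(&mut rand::thread_rng())`; the sketch stops at the filtered list.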
@@ -332,34 +357,27 @@ impl BiliRecorder {

let stream = new_stream.unwrap();

let should_update_stream = self.live_stream.read().await.is_none()
|| self.force_update.load(Ordering::Relaxed);

if should_update_stream {
self.force_update.store(false, Ordering::Relaxed);

let new_stream = self.fetch_real_stream(&stream).await;
if new_stream.is_err() {
log::error!(
"[{}]Fetch real stream failed: {}",
self.room_id,
new_stream.err().unwrap()
);
return true;
}

let new_stream = new_stream.unwrap();
*self.live_stream.write().await = Some(new_stream);
*self.last_update.write().await = Utc::now().timestamp();

log::info!(
"[{}]Update to a new stream: {:?} => {}",
let new_stream = self.fetch_real_stream(&stream).await;
if new_stream.is_err() {
log::error!(
"[{}]Fetch real stream failed: {}",
self.room_id,
self.live_stream.read().await.clone(),
stream
new_stream.err().unwrap()
);
return true;
}

let new_stream = new_stream.unwrap();
*self.live_stream.write().await = Some(new_stream);
*self.last_update.write().await = Utc::now().timestamp();

log::info!(
"[{}]Update to a new stream: {:?} => {}",
self.room_id,
self.live_stream.read().await.clone(),
stream
);

true
}
Err(e) => {
@@ -372,7 +390,7 @@ impl BiliRecorder {

async fn stream_from_variant(
&self,
variant: VariantStream,
variant: &VariantStream,
) -> Result<BiliStream, super::errors::RecorderError> {
let url = variant.uri.clone();
// example url: https://cn-hnld-ct-01-47.bilivideo.com/live-bvc/931676/live_1789460279_3538985/index.m3u8?expires=1745927098&len=0&oi=3729149990&pt=h5&qn=10000&trid=10075ceab17d4c9498264eb76d572b6810ad&sigparams=cdn,expires,len,oi,pt,qn,trid&cdn=cn-gotcha01&sign=686434f3ad01d33e001c80bfb7e1713d&site=3124fc9e0fabc664ace3d1b33638f7f2&free_type=0&mid=0&sche=ban&bvchls=1&sid=cn-hnld-ct-01-47&chash=0&bmt=1&sg=lr&trace=25&isp=ct&rg=East&pv=Shanghai&sk=28cc07215ff940102a1d60dade11467e&codec=0&pp=rtmp&hdr_type=0&hot_cdn=57345&suffix=origin&flvsk=c9154f5b3c6b14808bc5569329cf7f94&origin_bitrate=1281767&score=1&source=puv3_master&p2p_type=-1&deploy_env=prod&sl=1&info_source=origin&vd=nc&zoneid_l=151355393&sid_l=stream_name_cold&src=puv3&order=1
@@ -397,7 +415,7 @@ impl BiliRecorder {
let danmu_stream = DanmuStream::new(ProviderType::BiliBili, &cookies, room_id).await;
if danmu_stream.is_err() {
let err = danmu_stream.err().unwrap();
log::error!("Failed to create danmu stream: {}", err);
log::error!("[{}]Failed to create danmu stream: {}", self.room_id, err);
return Err(super::errors::RecorderError::DanmuStreamError { err });
}
let danmu_stream = danmu_stream.unwrap();
@@ -424,7 +442,7 @@ impl BiliRecorder {
}
}
} else {
log::error!("Failed to receive danmu message");
log::error!("[{}]Failed to receive danmu message", self.room_id);
return Err(super::errors::RecorderError::DanmuStreamError {
err: danmu_stream::DanmuStreamError::WebsocketError {
err: "Failed to receive danmu message".to_string(),
@@ -463,9 +481,14 @@ impl BiliRecorder {
})
}
Err(e) => {
log::error!("Failed fetching index content from {}", stream.index());
log::error!(
"Master manifest: {}",
"[{}]Failed fetching index content from {}",
self.room_id,
stream.index()
);
log::error!(
"[{}]Master manifest: {}",
self.room_id,
self.master_manifest.read().await.as_ref().unwrap()
);
Err(super::errors::RecorderError::BiliClientError { err: e })
@@ -501,7 +524,11 @@ impl BiliRecorder {
header_url = captures.get(0).unwrap().as_str().to_string();
}
if header_url.is_empty() {
log::warn!("Parse header url failed: {}", index_content);
log::warn!(
"[{}]Parse header url failed: {}",
self.room_id,
index_content
);
}

Ok(header_url)
@@ -579,7 +606,6 @@ impl BiliRecorder {
let current_stream = current_stream.unwrap();
let parsed = self.get_playlist().await;
if parsed.is_err() {
self.force_update.store(true, Ordering::Relaxed);
return Err(parsed.err().unwrap());
}

@@ -638,11 +664,16 @@ impl BiliRecorder {
{
Ok(size) => {
if size == 0 {
log::error!("Download header failed: {}", full_header_url);
log::error!(
"[{}]Download header failed: {}",
self.room_id,
full_header_url
);
// Clean up empty directory since header download failed
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await {
log::warn!(
"Failed to cleanup empty work directory {}: {}",
"[{}]Failed to cleanup empty work directory {}: {}",
self.room_id,
work_dir,
cleanup_err
);
@@ -653,6 +684,17 @@ impl BiliRecorder {
}
header.size = size;

if self.cover.read().await.is_some() {
let current_cover_full_path = Path::new(&self.config.read().await.cache)
.join(self.cover.read().await.clone().unwrap());
// copy current cover to work_dir
let _ = tokio::fs::copy(
current_cover_full_path,
&format!("{}/{}", work_dir, "cover.jpg"),
)
.await;
}

// Now that download succeeded, create the record and setup stores
self.db
.add_record(
@@ -660,7 +702,14 @@ impl BiliRecorder {
timestamp.to_string().as_str(),
self.room_id,
&self.room_info.read().await.room_title,
self.cover.read().await.clone(),
format!(
"{}/{}/{}/{}",
PlatformType::BiliBili.as_str(),
self.room_id,
timestamp,
"cover.jpg"
)
.into(),
None,
)
.await?;
@@ -693,13 +742,18 @@ impl BiliRecorder {
url: header_url.clone(),
resolution: new_resolution,
});

let _ = self.event_channel.send(RecorderEvent::RecordStart {
recorder: self.info().await,
});
}
Err(e) => {
log::error!("Download header failed: {}", e);
log::error!("[{}]Download header failed: {}", self.room_id, e);
// Clean up empty directory since header download failed
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await {
log::warn!(
"Failed to cleanup empty work directory {}: {}",
"[{}]Failed to cleanup empty work directory {}: {}",
self.room_id,
work_dir,
cleanup_err
);
@@ -746,7 +800,9 @@ impl BiliRecorder {
}

match playlist {
Playlist::MasterPlaylist(pl) => log::debug!("Master playlist:\n{:?}", pl),
Playlist::MasterPlaylist(pl) => {
log::debug!("[{}]Master playlist:\n{:?}", self.room_id, pl)
}
Playlist::MediaPlaylist(pl) => {
let mut new_segment_fetched = false;
let last_sequence = self
@@ -779,30 +835,7 @@ impl BiliRecorder {
}

// Extract stream start timestamp from header if available for FMP4
let stream_start_timestamp = if current_stream.format == StreamType::FMP4 {
if let Some(header_entry) = self
.entry_store
.read()
.await
.as_ref()
.and_then(|store| store.get_header())
{
// Parse timestamp from header filename like "h1753276580.m4s"
if let Some(timestamp_str) = header_entry
.url
.strip_prefix("h")
.and_then(|s| s.strip_suffix(".m4s"))
{
timestamp_str.parse::<i64>().unwrap_or(0)
} else {
0
}
} else {
0
}
} else {
0
};
let stream_start_timestamp = self.room_info.read().await.live_start_time;

// Get the last segment offset from previous processing
let mut last_offset = *self.last_segment_offset.read().await;
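The deleted branch above derived the FMP4 stream start time by parsing the header file name (e.g. `h1753276580.m4s`); the replacement reads `live_start_time` from the room info instead. The removed parsing, rewritten as a small standalone helper for illustration (the helper name is ours, not the project's):

```rust
// Parse the leading timestamp out of an FMP4 header file name such as
// "h1753276580.m4s"; any non-matching name yields 0, matching the
// unwrap_or(0) fallbacks in the removed code.
fn header_timestamp(file_name: &str) -> i64 {
    file_name
        .strip_prefix('h')
        .and_then(|s| s.strip_suffix(".m4s"))
        .and_then(|s| s.parse::<i64>().ok())
        .unwrap_or(0)
}
```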
@@ -815,7 +848,12 @@ impl BiliRecorder {

let ts_url = current_stream.ts_url(&ts.uri);
if Url::parse(&ts_url).is_err() {
log::error!("Ts url is invalid. ts_url={} original={}", ts_url, ts.uri);
log::error!(
"[{}]Ts url is invalid. ts_url={} original={}",
self.room_id,
ts_url,
ts.uri
);
continue;
}

@@ -879,7 +917,7 @@ impl BiliRecorder {

loop {
if retry > 3 {
log::error!("Download ts failed after retry");
log::error!("[{}]Download ts failed after retry", self.room_id);

// Clean up empty directory if first ts download failed for non-FMP4
if is_first_record
@@ -889,7 +927,8 @@ impl BiliRecorder {
if let Err(cleanup_err) = tokio::fs::remove_dir_all(&work_dir).await
{
log::warn!(
"Failed to cleanup empty work directory {}: {}",
"[{}]Failed to cleanup empty work directory {}: {}",
self.room_id,
work_dir,
cleanup_err
);
@@ -906,7 +945,10 @@ impl BiliRecorder {
{
Ok(size) => {
if size == 0 {
log::error!("Segment with size 0, stream might be corrupted");
log::error!(
"[{}]Segment with size 0, stream might be corrupted",
self.room_id
);

// Clean up empty directory if first ts download failed for non-FMP4
if is_first_record
@@ -917,7 +959,8 @@ impl BiliRecorder {
tokio::fs::remove_dir_all(&work_dir).await
{
log::warn!(
"Failed to cleanup empty work directory {}: {}",
"[{}]Failed to cleanup empty work directory {}: {}",
self.room_id,
work_dir,
cleanup_err
);
@@ -968,20 +1011,30 @@ impl BiliRecorder {
{
Ok(duration) => {
log::debug!(
"Precise TS segment duration: {}s (original: {}s)",
"[{}]Precise TS segment duration: {}s (original: {}s)",
self.room_id,
duration,
ts_length
);
duration
}
Err(e) => {
log::warn!("Failed to get precise TS duration for {}: {}, using fallback", file_name, e);
log::warn!(
"[{}]Failed to get precise TS duration for {}: {}, using fallback",
self.room_id,
file_name,
e
);
ts_length
}
}
} else {
// FMP4 segment without BILI-AUX info, use fallback
log::debug!("No BILI-AUX data available for FMP4 segment {}, using target duration", file_name);
log::debug!(
"[{}]No BILI-AUX data available for FMP4 segment {}, using target duration",
self.room_id,
file_name
);
ts_length
};

@@ -1012,7 +1065,14 @@ impl BiliRecorder {
}
Err(e) => {
retry += 1;
log::warn!("Download ts failed, retry {}: {}", retry, e);
log::warn!(
"[{}]Download ts failed, retry {}: {}",
self.room_id,
retry,
e
);
log::warn!("[{}]file_name: {}", self.room_id, file_name);
log::warn!("[{}]ts_url: {}", self.room_id, ts_url);

// If this is the last retry and it's the first record for non-FMP4, clean up
if retry > 3
@@ -1024,7 +1084,8 @@ impl BiliRecorder {
tokio::fs::remove_dir_all(&work_dir).await
{
log::warn!(
"Failed to cleanup empty work directory {}: {}",
"[{}]Failed to cleanup empty work directory {}: {}",
self.room_id,
work_dir,
cleanup_err
);
@@ -1058,7 +1119,10 @@ impl BiliRecorder {
} else {
// if index content is not changed for a long time, we should return a error to fetch a new stream
if *self.last_update.read().await < Utc::now().timestamp() - 10 {
log::error!("Stream content is not updating for 10s, maybe not started yet or not closed properly.");
log::error!(
"[{}]Stream content is not updating for 10s, maybe not started yet or not closed properly.",
self.room_id
);
return Err(super::errors::RecorderError::FreezedStream {
stream: current_stream,
});
@@ -1068,7 +1132,11 @@ impl BiliRecorder {
if let Some(entry_store) = self.entry_store.read().await.as_ref() {
if let Some(last_ts) = entry_store.last_ts() {
if last_ts < Utc::now().timestamp() - 10 {
log::error!("Stream is too slow, last entry ts is at {}", last_ts);
log::error!(
"[{}]Stream is too slow, last entry ts is at {}",
self.room_id,
last_ts
);
return Err(super::errors::RecorderError::SlowStream {
stream: current_stream,
});
@@ -1088,8 +1156,7 @@ impl BiliRecorder {
.as_ref()
.is_some_and(|s| s.expire - Utc::now().timestamp() < pre_offset as i64)
{
log::info!("Stream is nearly expired, force update");
self.force_update.store(true, Ordering::Relaxed);
log::info!("[{}]Stream is nearly expired", self.room_id);
return Err(super::errors::RecorderError::StreamExpired {
stream: current_stream.unwrap(),
});
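Both the freeze and slow-stream checks above compare a stored timestamp against the current time minus a 10-second window. Reduced to a pure function so the boundary is explicit (the name and shape are ours, not the project's API):

```rust
// A stream counts as frozen or slow when its last update is more than
// 10 seconds behind the current time; exactly 10 seconds is still fine.
fn is_stale(last_ts: i64, now_ts: i64) -> bool {
    last_ts < now_ts - 10
}
```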
@@ -1138,17 +1205,18 @@ impl super::Recorder for BiliRecorder {
async fn run(&self) {
let self_clone = self.clone();
*self.danmu_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start fetching danmu for room {}", self_clone.room_id);
log::info!("[{}]Start fetching danmu", self_clone.room_id);
let _ = self_clone.danmu().await;
}));

let self_clone = self.clone();
*self.record_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start running recorder for room {}", self_clone.room_id);
log::info!("[{}]Start running recorder", self_clone.room_id);
while !*self_clone.quit.lock().await {
let mut connection_fail_count = 0;
if self_clone.check_status().await {
// Live status is ok, start recording.
let mut continue_record = false;
while self_clone.should_record().await {
match self_clone.update_entries().await {
Ok(ms) => {
@@ -1172,11 +1240,28 @@ impl super::Recorder for BiliRecorder {
connection_fail_count =
std::cmp::min(5, connection_fail_count + 1);
}
// if error is stream expired, we should not break, cuz we need to fetch a new stream
if let RecorderError::StreamExpired { stream: _ } = e {
continue_record = true;
}
break;
}
}
}

if continue_record {
log::info!("[{}]Continue recording without reset", self_clone.room_id);
continue;
}

// whatever error happened during update entries, reset to start another recording.
if self_clone.current_header_info.read().await.is_some() {
let _ = self_clone.event_channel.send(RecorderEvent::RecordEnd {
recorder: self_clone.info().await,
});
}
*self_clone.is_recording.write().await = false;
self_clone.reset().await;
// go check status again after random 2-5 secs
let secs = rand::random::<u64>() % 4 + 2;
tokio::time::sleep(Duration::from_secs(
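The re-check delay above is drawn from 2 to 5 seconds via `rand::random::<u64>() % 4 + 2`. The arithmetic isolated as a function, with a deterministic input replacing the RNG so the range is easy to verify:

```rust
// Map an arbitrary random u64 into the 2..=5 second re-check delay
// used by the recorder loop above.
fn backoff_secs(random: u64) -> u64 {
    random % 4 + 2
}
```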
@@ -1193,7 +1278,7 @@ impl super::Recorder for BiliRecorder {
}

async fn stop(&self) {
log::debug!("Stop recorder for room {}", self.room_id);
log::debug!("[{}]Stop recorder", self.room_id);
*self.quit.lock().await = true;
if let Some(danmu_task) = self.danmu_task.lock().await.as_mut() {
let _ = danmu_task.abort();
@@ -1201,7 +1286,7 @@ impl super::Recorder for BiliRecorder {
if let Some(record_task) = self.record_task.lock().await.as_mut() {
let _ = record_task.abort();
}
log::info!("Recorder for room {} quit.", self.room_id);
log::info!("[{}]Recorder quit.", self.room_id);
}

/// timestamp is the id of live stream
@@ -1214,7 +1299,10 @@ impl super::Recorder for BiliRecorder {
}

async fn master_m3u8(&self, live_id: &str, start: i64, end: i64) -> String {
log::info!("Master manifest for {live_id} {start}-{end}");
log::info!(
"[{}]Master manifest for {live_id} {start}-{end}",
self.room_id
);
let offset = self.first_segment_ts(live_id).await / 1000;
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
@@ -1291,7 +1379,11 @@ impl super::Recorder for BiliRecorder {
live_id,
"danmu.txt"
);
log::debug!("loading danmu cache from {}", cache_file_path);
log::debug!(
"[{}]loading danmu cache from {}",
self.room_id,
cache_file_path
);
let storage = DanmuStorage::new(&cache_file_path).await;
if storage.is_none() {
return Ok(Vec::new());
@@ -1340,13 +1432,19 @@ impl super::Recorder for BiliRecorder {
let m3u8_index_file_path = format!("{}/{}", work_dir, "tmp.m3u8");
let m3u8_content = self.m3u8_content(live_id, 0, 0).await;
tokio::fs::write(&m3u8_index_file_path, m3u8_content).await?;
log::info!("M3U8 index file generated: {}", m3u8_index_file_path);
log::info!(
"[{}]M3U8 index file generated: {}",
self.room_id,
m3u8_index_file_path
);
// generate a tmp clip file
let clip_file_path = format!("{}/{}", work_dir, "tmp.mp4");
if let Err(e) = crate::ffmpeg::clip_from_m3u8(
None::<&crate::progress_reporter::ProgressReporter>,
Path::new(&m3u8_index_file_path),
Path::new(&clip_file_path),
None,
false,
)
.await
{
@@ -1354,7 +1452,11 @@ impl super::Recorder for BiliRecorder {
error: e.to_string(),
});
}
log::info!("Temp clip file generated: {}", clip_file_path);
log::info!(
"[{}]Temp clip file generated: {}",
self.room_id,
clip_file_path
);
// generate subtitle file
let config = self.config.read().await;
let result = crate::ffmpeg::generate_video_subtitle(
@@ -1374,7 +1476,7 @@ impl super::Recorder for BiliRecorder {
error: e.to_string(),
});
}
log::info!("Subtitle generated");
log::info!("[{}]Subtitle generated", self.room_id);
let result = result.unwrap();
let subtitle_content = result
.subtitle_content
@@ -1383,11 +1485,11 @@ impl super::Recorder for BiliRecorder {
.collect::<Vec<String>>()
.join("");
subtitle_file.write_all(subtitle_content.as_bytes()).await?;
log::info!("Subtitle file written");
log::info!("[{}]Subtitle file written", self.room_id);
// remove tmp file
tokio::fs::remove_file(&m3u8_index_file_path).await?;
tokio::fs::remove_file(&clip_file_path).await?;
log::info!("Tmp file removed");
log::info!("[{}]Tmp file removed", self.room_id);
Ok(subtitle_content)
}

@@ -9,7 +9,7 @@ use super::response::VideoSubmitData;
use crate::database::account::AccountRow;
use crate::progress_reporter::ProgressReporter;
use crate::progress_reporter::ProgressReporterTrait;
use base64::Engine;
use chrono::TimeZone;
use pct_str::PctString;
use pct_str::URIReserved;
use regex::Regex;
@@ -42,6 +42,7 @@ pub struct RoomInfo {
pub room_keyframe_url: String,
pub room_title: String,
pub user_id: u64,
pub live_start_time: i64,
}

#[derive(Serialize, Deserialize, Clone, Debug)]
@@ -331,6 +332,22 @@ impl BiliClient {
let live_status = res["data"]["live_status"]
.as_u64()
.ok_or(BiliClientError::InvalidValue)? as u8;
// "live_time": "2025-08-09 18:33:35",
let live_start_time_str = res["data"]["live_time"]
.as_str()
.ok_or(BiliClientError::InvalidValue)?;
let live_start_time = if live_start_time_str == "0000-00-00 00:00:00" {
0
} else {
let naive =
chrono::NaiveDateTime::parse_from_str(live_start_time_str, "%Y-%m-%d %H:%M:%S")
.map_err(|_| BiliClientError::InvalidValue)?;
chrono::Local
.from_local_datetime(&naive)
.earliest()
.ok_or(BiliClientError::InvalidValue)?
.timestamp()
};
Ok(RoomInfo {
room_id,
room_title,
@@ -338,18 +355,21 @@ impl BiliClient {
room_keyframe_url,
user_id,
live_status,
live_start_time,
})
}
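The new `live_start_time` logic above maps the API's `0000-00-00 00:00:00` sentinel to timestamp 0 and otherwise parses the local datetime with chrono. A sketch of that control flow with the chrono conversion stubbed out as a caller-supplied closure (the function is illustrative, not the crate's API):

```rust
// Returns Some(0) for the "never went live" sentinel; otherwise defers
// to `parse`, which stands in for the chrono NaiveDateTime -> local
// timestamp conversion in the real code (None models a parse error).
fn live_start_timestamp<F>(live_time: &str, parse: F) -> Option<i64>
where
    F: Fn(&str) -> Option<i64>,
{
    if live_time == "0000-00-00 00:00:00" {
        Some(0)
    } else {
        parse(live_time)
    }
}
```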

/// Get and encode response data into base64
pub async fn get_cover_base64(&self, url: &str) -> Result<String, BiliClientError> {
/// Download file from url to path
pub async fn download_file(&self, url: &str, path: &Path) -> Result<(), BiliClientError> {
if !path.parent().unwrap().exists() {
std::fs::create_dir_all(path.parent().unwrap()).unwrap();
}
let response = self.client.get(url).send().await?;
let bytes = response.bytes().await?;
let base64 = base64::engine::general_purpose::STANDARD.encode(bytes);
let mime_type = mime_guess::from_path(url)
.first_or_octet_stream()
.to_string();
Ok(format!("data:{};base64,{}", mime_type, base64))
let mut file = tokio::fs::File::create(&path).await?;
let mut content = std::io::Cursor::new(bytes);
tokio::io::copy(&mut content, &mut file).await?;
Ok(())
}
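The new `download_file` above creates the missing parent directory, then streams the response bytes into a file with tokio. The same flow in blocking std form, with the HTTP fetch replaced by an in-memory byte slice (a sketch, not the async implementation):

```rust
use std::fs;
use std::io::Write;
use std::path::Path;

// Ensure the parent directory exists, then persist `bytes` at `path`,
// mirroring the create_dir_all guard and file write in download_file.
fn save_bytes(path: &Path, bytes: &[u8]) -> std::io::Result<()> {
    if let Some(parent) = path.parent() {
        fs::create_dir_all(parent)?;
    }
    let mut file = fs::File::create(path)?;
    file.write_all(bytes)?;
    Ok(())
}
```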

pub async fn get_index_content(
@@ -59,7 +59,8 @@ pub struct DouyinRecorder {
db: Arc<Database>,
account: AccountRow,
room_id: u64,
room_info: Arc<RwLock<Option<response::DouyinRoomInfoResponse>>>,
sec_user_id: String,
room_info: Arc<RwLock<Option<client::DouyinBasicRoomInfo>>>,
stream_url: Arc<RwLock<Option<String>>>,
entry_store: Arc<RwLock<Option<EntryStore>>>,
danmu_store: Arc<RwLock<Option<DanmuStorage>>>,
@@ -84,6 +85,7 @@ impl DouyinRecorder {
#[cfg(not(feature = "headless"))] app_handle: AppHandle,
emitter: EventEmitter,
room_id: u64,
sec_user_id: &str,
config: Arc<RwLock<Config>>,
account: &AccountRow,
db: &Arc<Database>,
@@ -91,9 +93,9 @@ impl DouyinRecorder {
channel: broadcast::Sender<RecorderEvent>,
) -> Result<Self, super::errors::RecorderError> {
let client = client::DouyinClient::new(&config.read().await.user_agent, account);
let room_info = client.get_room_info(room_id).await?;
let room_info = client.get_room_info(room_id, sec_user_id).await?;
let mut live_status = LiveStatus::Offline;
if room_info.data.room_status == 0 {
if room_info.status == 0 {
live_status = LiveStatus::Live;
}

@@ -104,6 +106,7 @@ impl DouyinRecorder {
db: db.clone(),
account: account.clone(),
room_id,
sec_user_id: sec_user_id.to_string(),
live_id: Arc::new(RwLock::new(String::new())),
danmu_room_id: Arc::new(RwLock::new(String::new())),
entry_store: Arc::new(RwLock::new(None)),
@@ -134,9 +137,13 @@ impl DouyinRecorder {
}

async fn check_status(&self) -> bool {
match self.client.get_room_info(self.room_id).await {
match self
.client
.get_room_info(self.room_id, &self.sec_user_id)
.await
{
Ok(info) => {
let live_status = info.data.room_status == 0; // room_status == 0 表示正在直播
let live_status = info.status == 0; // room_status == 0 表示正在直播

*self.room_info.write().await = Some(info.clone());

@@ -157,7 +164,7 @@ impl DouyinRecorder {
.title("BiliShadowReplay - 直播开始")
.body(format!(
"{} 开启了直播:{}",
info.data.user.nickname, info.data.data[0].title
info.user_name, info.room_title
))
.show()
.unwrap();
@@ -169,14 +176,14 @@ impl DouyinRecorder {
.title("BiliShadowReplay - 直播结束")
.body(format!(
"{} 关闭了直播:{}",
info.data.user.nickname, info.data.data[0].title
info.user_name, info.room_title
))
.show()
.unwrap();
let _ = self.live_end_channel.send(RecorderEvent::LiveEnd {
platform: PlatformType::Douyin,
room_id: self.room_id,
live_id: self.live_id.read().await.clone(),
recorder: self.info().await,
});
}

@@ -202,13 +209,7 @@ impl DouyinRecorder {
}

// Get stream URL when live starts
if !info.data.data[0]
.stream_url
.as_ref()
.unwrap()
.hls_pull_url
.is_empty()
{
if !info.hls_url.is_empty() {
// Only set stream URL, don't create record yet
// Record will be created when first ts download succeeds
let new_stream_url = self.get_best_stream_url(&info).await;
@@ -219,7 +220,7 @@ impl DouyinRecorder {

log::info!("New douyin stream URL: {}", new_stream_url.clone().unwrap());
*self.stream_url.write().await = Some(new_stream_url.unwrap());
*self.danmu_room_id.write().await = info.data.data[0].id_str.clone();
*self.danmu_room_id.write().await = info.room_id_str.clone();
}

true
@@ -296,18 +297,8 @@ impl DouyinRecorder {
)
}

async fn get_best_stream_url(
&self,
room_info: &response::DouyinRoomInfoResponse,
) -> Option<String> {
let stream_data = room_info.data.data[0]
.stream_url
.as_ref()
.unwrap()
.live_core_sdk_data
.pull_data
.stream_data
.clone();
async fn get_best_stream_url(&self, room_info: &client::DouyinBasicRoomInfo) -> Option<String> {
let stream_data = room_info.stream_data.clone();
// parse stream_data into stream_info
let stream_info = serde_json::from_str::<stream_info::StreamInfo>(&stream_data);
if let Ok(stream_info) = stream_info {
@@ -358,12 +349,13 @@ impl DouyinRecorder {
return Err(RecorderError::NoStreamAvailable);
}

let stream_url = self.stream_url.read().await.as_ref().unwrap().clone();
let mut stream_url = self.stream_url.read().await.as_ref().unwrap().clone();

// Get m3u8 playlist
let (playlist, updated_stream_url) = self.client.get_m3u8_content(&stream_url).await?;

*self.stream_url.write().await = Some(updated_stream_url);
*self.stream_url.write().await = Some(updated_stream_url.clone());
stream_url = updated_stream_url;

let mut new_segment_fetched = false;
let mut is_first_segment = self.entry_store.read().await.is_none();
@@ -461,10 +453,7 @@ impl DouyinRecorder {
if is_first_segment {
// Create database record
let room_info = room_info.as_ref().unwrap();
let cover_url = room_info.data.data[0]
.cover
.as_ref()
.map(|cover| cover.url_list[0].clone());
let cover_url = room_info.cover.clone();
let cover = if let Some(url) = cover_url {
Some(self.client.get_cover_base64(&url).await.unwrap_or_default())
} else {
@@ -477,7 +466,7 @@ impl DouyinRecorder {
PlatformType::Douyin,
self.live_id.read().await.as_str(),
self.room_id,
&room_info.data.data[0].title,
&room_info.room_title,
cover,
None,
)
@@ -803,6 +792,8 @@ impl Recorder for DouyinRecorder {
None::<&crate::progress_reporter::ProgressReporter>,
Path::new(&m3u8_index_file_path),
Path::new(&clip_file_path),
None,
false,
)
.await
{
@@ -862,17 +853,11 @@ impl Recorder for DouyinRecorder {
let room_info = self.room_info.read().await;
let room_cover_url = room_info
.as_ref()
.and_then(|info| {
info.data
.data
.first()
.and_then(|data| data.cover.as_ref())
.map(|cover| cover.url_list[0].clone())
})
.and_then(|info| info.cover.clone())
.unwrap_or_default();
let room_title = room_info
.as_ref()
.and_then(|info| info.data.data.first().map(|data| data.title.clone()))
.map(|info| info.room_title.clone())
.unwrap_or_default();
RecorderInfo {
room_id: self.room_id,
@@ -884,15 +869,15 @@ impl Recorder for DouyinRecorder {
user_info: UserInfo {
user_id: room_info
.as_ref()
.map(|info| info.data.user.sec_uid.clone())
.map(|info| info.sec_user_id.clone())
.unwrap_or_default(),
user_name: room_info
.as_ref()
.map(|info| info.data.user.nickname.clone())
.map(|info| info.user_name.clone())
.unwrap_or_default(),
user_avatar: room_info
.as_ref()
.map(|info| info.data.user.avatar_thumb.url_list[0].clone())
.map(|info| info.user_avatar.clone())
.unwrap_or_default(),
},
total_length: if let Some(store) = self.entry_store.read().await.as_ref() {

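The `info()` changes above replace deep walks into the raw response (`info.data.data.first()...`) with single-step reads from the pre-flattened `DouyinBasicRoomInfo`. A minimal sketch of that accessor pattern with a cut-down stand-in struct:

```rust
// Cut-down stand-in for DouyinBasicRoomInfo: only the fields needed to
// show the map/and_then + unwrap_or_default accessors used by info().
#[derive(Clone, Default)]
struct BasicRoomInfo {
    room_title: String,
    cover: Option<String>,
}

fn room_title(info: Option<&BasicRoomInfo>) -> String {
    info.map(|i| i.room_title.clone()).unwrap_or_default()
}

fn room_cover_url(info: Option<&BasicRoomInfo>) -> String {
    info.and_then(|i| i.cover.clone()).unwrap_or_default()
}
```

Each field the UI needs is now one `map` (or `and_then` for nested `Option`s) away, with `unwrap_or_default` covering the "room info not fetched yet" case.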
@@ -35,10 +35,24 @@ impl From<std::io::Error> for DouyinClientError {
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct DouyinBasicRoomInfo {
|
||||
pub room_id_str: String,
|
||||
pub room_title: String,
|
||||
pub cover: Option<String>,
|
||||
pub status: i64,
|
||||
pub hls_url: String,
|
||||
pub stream_data: String,
|
||||
// user related
|
||||
pub user_name: String,
|
||||
pub user_avatar: String,
|
||||
pub sec_user_id: String,
|
||||
}
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct DouyinClient {
|
||||
client: Client,
|
||||
cookies: String,
|
||||
account: AccountRow,
|
||||
}
|
||||
|
||||
impl DouyinClient {
|
||||
@@ -46,14 +60,15 @@ impl DouyinClient {
|
||||
let client = Client::builder().user_agent(user_agent).build().unwrap();
|
||||
Self {
|
||||
client,
|
||||
cookies: account.cookies.clone(),
|
||||
account: account.clone(),
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn get_room_info(
|
||||
&self,
|
||||
room_id: u64,
|
||||
) -> Result<DouyinRoomInfoResponse, DouyinClientError> {
|
||||
sec_user_id: &str,
|
||||
) -> Result<DouyinBasicRoomInfo, DouyinClientError> {
|
||||
let url = format!(
|
||||
"https://live.douyin.com/webcast/room/web/enter/?aid=6383&app_name=douyin_web&live_id=1&device_platform=web&language=zh-CN&enter_from=web_live&a_bogus=0&cookie_enabled=true&screen_width=1920&screen_height=1080&browser_language=zh-CN&browser_platform=MacIntel&browser_name=Chrome&browser_version=122.0.0.0&web_rid={}",
|
||||
room_id
|
||||
@@ -63,7 +78,93 @@ impl DouyinClient {
            .client
            .get(&url)
            .header("Referer", "https://live.douyin.com/")
            .header("Cookie", self.cookies.clone())
            .header("Cookie", self.account.cookies.clone())
            .send()
            .await?;

        let status = resp.status();
        let text = resp.text().await?;

        if text.is_empty() {
            log::warn!("Empty room info response, trying H5 API");
            return self.get_room_info_h5(room_id, sec_user_id).await;
        }

        if status.is_success() {
            if let Ok(data) = serde_json::from_str::<DouyinRoomInfoResponse>(&text) {
                let cover = data
                    .data
                    .data
                    .first()
                    .and_then(|data| data.cover.as_ref())
                    .map(|cover| cover.url_list[0].clone());
                return Ok(DouyinBasicRoomInfo {
                    room_id_str: data.data.data[0].id_str.clone(),
                    sec_user_id: sec_user_id.to_string(),
                    cover,
                    room_title: data.data.data[0].title.clone(),
                    user_name: data.data.user.nickname.clone(),
                    user_avatar: data.data.user.avatar_thumb.url_list[0].clone(),
                    status: data.data.room_status,
                    hls_url: data.data.data[0]
                        .stream_url
                        .as_ref()
                        .map(|stream_url| stream_url.hls_pull_url.clone())
                        .unwrap_or_default(),
                    stream_data: data.data.data[0]
                        .stream_url
                        .as_ref()
                        .map(|s| s.live_core_sdk_data.pull_data.stream_data.clone())
                        .unwrap_or_default(),
                });
            } else {
                log::error!("Failed to parse room info response: {}", text);
                return self.get_room_info_h5(room_id, sec_user_id).await;
            }
        }

        log::error!("Failed to get room info: {}", status);
        return self.get_room_info_h5(room_id, sec_user_id).await;
    }

    pub async fn get_room_info_h5(
        &self,
        room_id: u64,
        sec_user_id: &str,
    ) -> Result<DouyinBasicRoomInfo, DouyinClientError> {
        // Following the biliup implementation, build the full set of URL parameters
        let room_id_str = room_id.to_string();
        // https://webcast.amemv.com/webcast/room/reflow/info/?type_id=0&live_id=1&version_code=99.99.99&app_id=1128&room_id=10000&sec_user_id=MS4wLjAB&aid=6383&device_platform=web&browser_language=zh-CN&browser_platform=Win32&browser_name=Mozilla&browser_version=5.0
        let url_params = [
            ("type_id", "0"),
            ("live_id", "1"),
            ("version_code", "99.99.99"),
            ("app_id", "1128"),
            ("room_id", &room_id_str),
            ("sec_user_id", sec_user_id),
            ("aid", "6383"),
            ("device_platform", "web"),
        ];

        // Build the URL
        let query_string = url_params
            .iter()
            .map(|(k, v)| format!("{}={}", k, v))
            .collect::<Vec<_>>()
            .join("&");
        let url = format!(
            "https://webcast.amemv.com/webcast/room/reflow/info/?{}",
            query_string
        );

        log::info!("get_room_info_h5: {}", url);

        let resp = self
            .client
            .get(&url)
            .header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36")
            .header("Referer", "https://live.douyin.com/")
            .header("Cookie", self.account.cookies.clone())
            .send()
            .await?;
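The H5 fallback above assembles its query string with an iter/map/join over key/value pairs. A minimal standalone sketch of that pattern (the `build_query` helper name is ours, and real request code would also need percent-encoding for unsafe characters):

```rust
/// Join key/value pairs into a URL query string, in order.
/// Values are NOT percent-encoded here; callers must ensure
/// they are URL-safe, as the parameters above happen to be.
fn build_query(params: &[(&str, &str)]) -> String {
    params
        .iter()
        .map(|(k, v)| format!("{}={}", k, v))
        .collect::<Vec<_>>()
        .join("&")
}

fn main() {
    let q = build_query(&[("live_id", "1"), ("aid", "6383")]);
    assert_eq!(q, "live_id=1&aid=6383");
    println!("{}", q);
}
```

Keeping the parameters in a slice rather than a hand-written string makes it easy to add or drop one (e.g. `sec_user_id`) without touching the format string.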
@@ -71,20 +172,94 @@ impl DouyinClient {
        let text = resp.text().await?;

        if status.is_success() {
            if let Ok(data) = serde_json::from_str::<DouyinRoomInfoResponse>(&text) {
                return Ok(data);
            } else {
                log::error!("Failed to parse room info response: {}", text);
                // Try to parse as H5 response format
                if let Ok(h5_data) =
                    serde_json::from_str::<super::response::DouyinH5RoomInfoResponse>(&text)
                {
                    // Extract RoomBasicInfo from H5 response
                    let room = &h5_data.data.room;
                    let owner = &room.owner;

                    let cover = room
                        .cover
                        .as_ref()
                        .and_then(|c| c.url_list.first().cloned());
                    let hls_url = room
                        .stream_url
                        .as_ref()
                        .map(|s| s.hls_pull_url.clone())
                        .unwrap_or_default();

                    return Ok(DouyinBasicRoomInfo {
                        room_id_str: room.id_str.clone(),
                        room_title: room.title.clone(),
                        cover,
                        status: if room.status == 2 { 0 } else { 1 },
                        hls_url,
                        user_name: owner.nickname.clone(),
                        user_avatar: owner
                            .avatar_thumb
                            .url_list
                            .first()
                            .unwrap_or(&String::new())
                            .clone(),
                        sec_user_id: owner.sec_uid.clone(),
                        stream_data: room
                            .stream_url
                            .as_ref()
                            .map(|s| s.live_core_sdk_data.pull_data.stream_data.clone())
                            .unwrap_or_default(),
                    });
                }

                // If that fails, try to parse as a generic JSON to see what we got
                if let Ok(json_value) = serde_json::from_str::<serde_json::Value>(&text) {
                    log::error!(
                        "Unexpected response structure: {}",
                        serde_json::to_string_pretty(&json_value).unwrap_or_default()
                    );

                    // Check if it's an error response
                    if let Some(status_code) = json_value.get("status_code").and_then(|v| v.as_i64()) {
                        if status_code != 0 {
                            let error_msg = json_value
                                .get("status_message")
                                .and_then(|v| v.as_str())
                                .unwrap_or("Unknown error");
                            return Err(DouyinClientError::Network(format!(
                                "API returned error status_code: {} - {}",
                                status_code, error_msg
                            )));
                        }
                    }

                    // Check for an "invalid session" error
                    if let Some(status_message) =
                        json_value.get("status_message").and_then(|v| v.as_str())
                    {
                        if status_message.contains("invalid session") {
                            return Err(DouyinClientError::Network(
                                "Invalid session - please check your cookies. Make sure you have valid sessionid, passport_csrf_token, and other authentication cookies from douyin.com".to_string(),
                            ));
                        }
                    }

                    return Err(DouyinClientError::Network(format!(
                        "Failed to parse room info response: {}",
                        "Failed to parse h5 room info response: {}",
                        text
                    )));
                } else {
                    log::error!("Failed to parse h5 room info response: {}", text);
                    return Err(DouyinClientError::Network(format!(
                        "Failed to parse h5 room info response: {}",
                        text
                    )));
                }
            }
        }

        log::error!("Failed to get room info: {}", status);
        log::error!("Failed to get h5 room info: {}", status);
        Err(DouyinClientError::Network(format!(
            "Failed to get room info: {} {}",
            "Failed to get h5 room info: {} {}",
            status, text
        )))
    }
@@ -96,7 +271,7 @@ impl DouyinClient {
            .client
            .get(url)
            .header("Referer", "https://www.douyin.com/")
            .header("Cookie", self.cookies.clone())
            .header("Cookie", self.account.cookies.clone())
            .send()
            .await?;
@@ -171,27 +346,13 @@ impl DouyinClient {
        &self,
        url: &str,
    ) -> Result<(MediaPlaylist, String), DouyinClientError> {
        let content = self
            .client
            .get(url)
            .header("Referer", "https://live.douyin.com/")
            .header("Cookie", self.cookies.clone())
            .header("Accept", "*/*")
            .header("Accept-Language", "zh-CN,zh;q=0.9,en;q=0.8")
            .header("Accept-Encoding", "gzip, deflate, br")
            .header("Connection", "keep-alive")
            .header("Sec-Fetch-Dest", "empty")
            .header("Sec-Fetch-Mode", "cors")
            .header("Sec-Fetch-Site", "cross-site")
            .send()
            .await?
            .text()
            .await?;
        let content = self.client.get(url).send().await?.text().await?;
        // m3u8 content: #EXTM3U
        // #EXT-X-VERSION:3
        // #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2560000
        // http://7167739a741646b4651b6949b2f3eb8e.livehwc3.cn/pull-hls-l26.douyincdn.com/third/stream-693342996808860134_or4.m3u8?sub_m3u8=true&user_session_id=16090eb45ab8a2f042f7c46563936187&major_anchor_level=common&edge_slice=true&expire=67d944ec&sign=47b95cc6e8de20d82f3d404412fa8406
        if content.contains("BANDWIDTH") {
            log::info!("Master manifest with playlist URL: {}", url);
            let new_url = content.lines().last().unwrap();
            return Box::pin(self.get_m3u8_content(new_url)).await;
        }
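The branch above treats any manifest containing `BANDWIDTH` as a master playlist and recurses on its last line, which in these CDN responses is the media-playlist URL. A self-contained sketch of that dispatch (the `variant_url` function name is ours):

```rust
/// Return the media-playlist URL if `content` looks like a master
/// manifest, i.e. it advertises variant streams via BANDWIDTH
/// attributes. Mirrors the heuristic above: the last line of the
/// master manifest is taken as the variant URL.
fn variant_url(content: &str) -> Option<&str> {
    if content.contains("BANDWIDTH") {
        content.lines().last()
    } else {
        None
    }
}

fn main() {
    let master = "#EXTM3U\n#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2560000\nhttp://example.com/stream.m3u8";
    assert_eq!(variant_url(master), Some("http://example.com/stream.m3u8"));
    let media = "#EXTM3U\n#EXTINF:2.0,\nseg0.ts";
    assert_eq!(variant_url(media), None);
}
```

This is a heuristic rather than a full HLS parse: it assumes the variant URL sits on the final line, which holds for the single-variant manifests Douyin's CDN returns but not for arbitrary master playlists.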
@@ -206,20 +367,7 @@ impl DouyinClient {
    }

    pub async fn download_ts(&self, url: &str, path: &str) -> Result<u64, DouyinClientError> {
        let response = self
            .client
            .get(url)
            .header("Referer", "https://live.douyin.com/")
            .header("Cookie", self.cookies.clone())
            .header("Accept", "*/*")
            .header("Accept-Language", "zh-CN,zh;q=0.9,en;q=0.8")
            .header("Accept-Encoding", "gzip, deflate, br")
            .header("Connection", "keep-alive")
            .header("Sec-Fetch-Dest", "empty")
            .header("Sec-Fetch-Mode", "cors")
            .header("Sec-Fetch-Site", "cross-site")
            .send()
            .await?;
        let response = self.client.get(url).send().await?;

        if response.status() != reqwest::StatusCode::OK {
            let error = response.error_for_status().unwrap_err();
@@ -673,3 +673,120 @@ pub struct AvatarSmall {
    #[serde(rename = "url_list")]
    pub url_list: Vec<String>,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct DouyinH5RoomInfoResponse {
    pub data: H5Data,
    pub extra: H5Extra,
    #[serde(rename = "status_code")]
    pub status_code: i64,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Data {
    pub room: H5Room,
    pub user: H5User,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Room {
    pub id: u64,
    #[serde(rename = "id_str")]
    pub id_str: String,
    pub status: i64,
    pub title: String,
    pub cover: Option<H5Cover>,
    #[serde(rename = "stream_url")]
    pub stream_url: Option<H5StreamUrl>,
    pub owner: H5Owner,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Cover {
    #[serde(rename = "url_list")]
    pub url_list: Vec<String>,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5StreamUrl {
    pub provider: i64,
    pub id: u64,
    #[serde(rename = "id_str")]
    pub id_str: String,
    #[serde(rename = "default_resolution")]
    pub default_resolution: String,
    #[serde(rename = "rtmp_pull_url")]
    pub rtmp_pull_url: String,
    #[serde(rename = "flv_pull_url")]
    pub flv_pull_url: H5FlvPullUrl,
    #[serde(rename = "hls_pull_url")]
    pub hls_pull_url: String,
    #[serde(rename = "hls_pull_url_map")]
    pub hls_pull_url_map: H5HlsPullUrlMap,
    #[serde(rename = "live_core_sdk_data")]
    pub live_core_sdk_data: LiveCoreSdkData,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5FlvPullUrl {
    #[serde(rename = "FULL_HD1")]
    pub full_hd1: Option<String>,
    #[serde(rename = "HD1")]
    pub hd1: Option<String>,
    #[serde(rename = "SD1")]
    pub sd1: Option<String>,
    #[serde(rename = "SD2")]
    pub sd2: Option<String>,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5HlsPullUrlMap {
    #[serde(rename = "FULL_HD1")]
    pub full_hd1: Option<String>,
    #[serde(rename = "HD1")]
    pub hd1: Option<String>,
    #[serde(rename = "SD1")]
    pub sd1: Option<String>,
    #[serde(rename = "SD2")]
    pub sd2: Option<String>,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Owner {
    pub nickname: String,
    #[serde(rename = "avatar_thumb")]
    pub avatar_thumb: H5AvatarThumb,
    #[serde(rename = "sec_uid")]
    pub sec_uid: String,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5AvatarThumb {
    #[serde(rename = "url_list")]
    pub url_list: Vec<String>,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5User {
    pub nickname: String,
    #[serde(rename = "avatar_thumb")]
    pub avatar_thumb: Option<H5AvatarThumb>,
    #[serde(rename = "sec_uid")]
    pub sec_uid: String,
}

#[derive(Default, Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct H5Extra {
    pub now: i64,
}
@@ -83,7 +83,7 @@ impl TsEntry {

        let mut content = String::new();

        content += &format!("#EXTINF:{:.2},\n", self.length);
        content += &format!("#EXTINF:{:.4},\n", self.length);
        content += &format!("{}\n", self.url);

        content
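The one-character change above widens the `#EXTINF` segment duration from two to four decimal places, which reduces cumulative rounding drift when a player sums many short segment durations. A sketch of the formatting difference (the `extinf` helper is ours; the real code formats `self.length` directly):

```rust
/// Render an #EXTINF line with a configurable number of decimal
/// places, using Rust's `{:.*}` dynamic-precision format specifier.
fn extinf(length: f64, decimals: usize) -> String {
    format!("#EXTINF:{:.*},", decimals, length)
}

fn main() {
    // Two decimals hides the trailing fraction entirely:
    assert_eq!(extinf(2.000099, 2), "#EXTINF:2.00,");
    // Four decimals keeps it:
    assert_eq!(extinf(2.000099, 4), "#EXTINF:2.0001,");
}
```

Over thousands of ~2 s segments, a per-segment error of up to 0.005 s at two decimals can add up to several seconds of drift between the playlist's claimed timeline and the actual media, which matters when clip ranges are computed against the playlist.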
@@ -200,10 +200,12 @@ impl EntryStore {
        self.total_size
    }

    /// Get first timestamp in milliseconds
    pub fn first_ts(&self) -> Option<i64> {
        self.entries.first().map(|x| x.ts_mili())
    }

    /// Get last timestamp in milliseconds
    pub fn last_ts(&self) -> Option<i64> {
        self.entries.last().map(|x| x.ts_mili())
    }
@@ -1,22 +0,0 @@
use actix_web::Response;

fn handle_hls_request(ts_path: Option<&str>) -> Response {
    if let Some(ts_path) = ts_path {
        if let Ok(content) = std::fs::read(ts_path) {
            return Response::builder()
                .status(200)
                .header("Content-Type", "video/mp2t")
                .header("Cache-Control", "no-cache")
                .header("Access-Control-Allow-Origin", "*")
                .body(content)
                .unwrap();
        }
    }
    Response::builder()
        .status(404)
        .header("Content-Type", "text/plain")
        .header("Cache-Control", "no-cache")
        .header("Access-Control-Allow-Origin", "*")
        .body(b"Not Found".to_vec())
        .unwrap()
}
@@ -1,9 +1,10 @@
use crate::config::Config;
use crate::danmu2ass;
use crate::database::recorder::RecorderRow;
use crate::database::video::VideoRow;
use crate::database::{account::AccountRow, record::RecordRow};
use crate::database::{Database, DatabaseError};
use crate::ffmpeg::{clip_from_m3u8, encode_video_danmu};
use crate::ffmpeg::{clip_from_m3u8, encode_video_danmu, Range};
use crate::progress_reporter::{EventEmitter, ProgressReporter};
use crate::recorder::bilibili::{BiliRecorder, BiliRecorderOptions};
use crate::recorder::danmu::DanmuEntry;
@@ -12,6 +13,8 @@ use crate::recorder::errors::RecorderError;
use crate::recorder::PlatformType;
use crate::recorder::Recorder;
use crate::recorder::RecorderInfo;
use crate::webhook::events::{self, Payload};
use crate::webhook::poster::WebhookPoster;
use chrono::Utc;
use custom_error::custom_error;
use serde::{Deserialize, Serialize};
@@ -35,30 +38,38 @@ pub struct RecorderList {
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct ClipRangeParams {
    pub title: String,
    pub note: String,
    pub cover: String,
    pub platform: String,
    pub room_id: u64,
    pub live_id: String,
    /// Clip range start in seconds
    pub x: i64,
    /// Clip range end in seconds
    pub y: i64,
    /// Timestamp of first stream segment in seconds
    pub offset: i64,
    pub range: Option<Range>,
    /// Encode danmu after clip
    pub danmu: bool,
    pub local_offset: i64,
    /// Fix encoding after clip
    pub fix_encoding: bool,
}

#[derive(Debug, Clone)]
pub enum RecorderEvent {
    LiveStart {
        recorder: RecorderInfo,
    },
    LiveEnd {
        platform: PlatformType,
        room_id: u64,
        live_id: String,
        platform: PlatformType,
        recorder: RecorderInfo,
    },
    RecordStart {
        recorder: RecorderInfo,
    },
    RecordEnd {
        recorder: RecorderInfo,
    },
}

#[derive(Clone)]
pub struct RecorderManager {
    #[cfg(not(feature = "headless"))]
    app_handle: AppHandle,
@@ -69,6 +80,7 @@ pub struct RecorderManager {
    to_remove: Arc<RwLock<HashSet<String>>>,
    event_tx: broadcast::Sender<RecorderEvent>,
    is_migrating: Arc<AtomicBool>,
    webhook_poster: WebhookPoster,
}

custom_error! {pub RecorderManagerError
@@ -113,6 +125,7 @@ impl RecorderManager {
        emitter: EventEmitter,
        db: Arc<Database>,
        config: Arc<RwLock<Config>>,
        webhook_poster: WebhookPoster,
    ) -> RecorderManager {
        let (event_tx, _) = broadcast::channel(100);
        let manager = RecorderManager {
@@ -125,6 +138,7 @@ impl RecorderManager {
            to_remove: Arc::new(RwLock::new(HashSet::new())),
            event_tx,
            is_migrating: Arc::new(AtomicBool::new(false)),
            webhook_poster,
        };

        // Start event listener
@@ -141,20 +155,6 @@ impl RecorderManager {
        manager
    }

    pub fn clone(&self) -> Self {
        RecorderManager {
            #[cfg(not(feature = "headless"))]
            app_handle: self.app_handle.clone(),
            emitter: self.emitter.clone(),
            db: self.db.clone(),
            config: self.config.clone(),
            recorders: self.recorders.clone(),
            to_remove: self.to_remove.clone(),
            event_tx: self.event_tx.clone(),
            is_migrating: self.is_migrating.clone(),
        }
    }

    pub fn get_event_sender(&self) -> broadcast::Sender<RecorderEvent> {
        self.event_tx.clone()
    }
@@ -163,25 +163,46 @@ impl RecorderManager {
        let mut rx = self.event_tx.subscribe();
        while let Ok(event) = rx.recv().await {
            match event {
                RecorderEvent::LiveStart { recorder } => {
                    let event =
                        events::new_webhook_event(events::LIVE_STARTED, Payload::Room(recorder));
                    let _ = self.webhook_poster.post_event(&event).await;
                }
                RecorderEvent::LiveEnd {
                    platform,
                    room_id,
                    live_id,
                    recorder,
                } => {
                    self.handle_live_end(platform, room_id, &live_id).await;
                    let event = events::new_webhook_event(
                        events::LIVE_ENDED,
                        Payload::Room(recorder.clone()),
                    );
                    let _ = self.webhook_poster.post_event(&event).await;
                    self.handle_live_end(platform, room_id, &recorder).await;
                }
                RecorderEvent::RecordStart { recorder } => {
                    let event =
                        events::new_webhook_event(events::RECORD_STARTED, Payload::Room(recorder));
                    let _ = self.webhook_poster.post_event(&event).await;
                }
                RecorderEvent::RecordEnd { recorder } => {
                    let event =
                        events::new_webhook_event(events::RECORD_ENDED, Payload::Room(recorder));
                    let _ = self.webhook_poster.post_event(&event).await;
                }
            }
        }
    }

    async fn handle_live_end(&self, platform: PlatformType, room_id: u64, live_id: &str) {
    async fn handle_live_end(&self, platform: PlatformType, room_id: u64, recorder: &RecorderInfo) {
        if !self.config.read().await.auto_generate.enabled {
            return;
        }

        let recorder_id = format!("{}:{}", platform.as_str(), room_id);
        log::info!("Start auto generate for {}", recorder_id);
        let live_record = self.db.get_record(room_id, live_id).await;
        let live_id = recorder.current_live_id.clone();
        let live_record = self.db.get_record(room_id, &live_id).await;
        if live_record.is_err() {
            log::error!("Live not found in record: {} {}", room_id, live_id);
            return;
@@ -201,15 +222,15 @@ impl RecorderManager {

        let clip_config = ClipRangeParams {
            title: live_record.title,
            note: "".into(),
            cover: "".into(),
            platform: live_record.platform.clone(),
            room_id,
            live_id: live_id.to_string(),
            x: 0,
            y: 0,
            offset: recorder.first_segment_ts(live_id).await,
            range: None,
            danmu: encode_danmu,
            local_offset: 0,
            fix_encoding: false,
        };

        let clip_filename = self.config.read().await.generate_clip_name(&clip_config);
@@ -242,6 +263,7 @@ impl RecorderManager {
            created_at: Utc::now().to_rfc3339(),
            cover: "".into(),
            file: f.file_name().unwrap().to_str().unwrap().to_string(),
            note: "".into(),
            length: live_record.length,
            size: metadata.len() as i64,
            bvid: "".into(),
@@ -292,7 +314,8 @@ impl RecorderManager {
            let platform = PlatformType::from_str(&recorder.platform).unwrap();
            let room_id = recorder.room_id;
            let auto_start = recorder.auto_start;
            recorder_map.insert((platform, room_id), auto_start);
            let extra = recorder.extra;
            recorder_map.insert((platform, room_id), (auto_start, extra));
        }
        let mut recorders_to_add = Vec::new();
        for (platform, room_id) in recorder_map.keys() {
@@ -307,7 +330,7 @@ impl RecorderManager {
            if self.is_migrating.load(std::sync::atomic::Ordering::Relaxed) {
                break;
            }
            let auto_start = recorder_map.get(&(platform, room_id)).unwrap();
            let (auto_start, extra) = recorder_map.get(&(platform, room_id)).unwrap();
            let account = self
                .db
                .get_account_by_platform(platform.clone().as_str())
@@ -319,7 +342,7 @@ impl RecorderManager {
            let account = account.unwrap();

            if let Err(e) = self
                .add_recorder(&account, platform, room_id, *auto_start)
                .add_recorder(&account, platform, room_id, extra, *auto_start)
                .await
            {
                log::error!("Failed to add recorder: {}", e);
@@ -334,6 +357,7 @@ impl RecorderManager {
        account: &AccountRow,
        platform: PlatformType,
        room_id: u64,
        extra: &str,
        auto_start: bool,
    ) -> Result<(), RecorderManagerError> {
        let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -363,6 +387,7 @@ impl RecorderManager {
                    self.app_handle.clone(),
                    self.emitter.clone(),
                    room_id,
                    extra,
                    self.config.clone(),
                    account,
                    &self.db,
@@ -404,7 +429,7 @@ impl RecorderManager {
        &self,
        platform: PlatformType,
        room_id: u64,
    ) -> Result<(), RecorderManagerError> {
    ) -> Result<RecorderRow, RecorderManagerError> {
        // check recorder exists
        let recorder_id = format!("{}:{}", platform.as_str(), room_id);
        if !self.recorders.read().await.contains_key(&recorder_id) {
@@ -412,7 +437,7 @@ impl RecorderManager {
        }

        // remove from db
        self.db.remove_recorder(room_id).await?;
        let recorder = self.db.remove_recorder(room_id).await?;

        // add to to_remove
        log::debug!("Add to to_remove: {}", recorder_id);
@@ -443,7 +468,7 @@ impl RecorderManager {
        let _ = tokio::fs::remove_dir_all(cache_folder).await;
        log::info!("Recorder {} cache folder removed", room_id);

        Ok(())
        Ok(recorder)
    }

    pub async fn clip_range(
@@ -475,14 +500,21 @@ impl RecorderManager {
        params: &ClipRangeParams,
    ) -> Result<PathBuf, RecorderManagerError> {
        let range_m3u8 = format!(
            "{}/{}/{}/playlist.m3u8?start={}&end={}",
            params.platform, params.room_id, params.live_id, params.x, params.y
            "{}/{}/{}/playlist.m3u8",
            params.platform, params.room_id, params.live_id
        );

        let manifest_content = self.handle_hls_request(&range_m3u8).await?;
        let manifest_content = String::from_utf8(manifest_content)
        let mut manifest_content = String::from_utf8(manifest_content)
            .map_err(|e| RecorderManagerError::ClipError { err: e.to_string() })?;

        // if manifest is for stream, replace EXT-X-PLAYLIST-TYPE:EVENT to EXT-X-PLAYLIST-TYPE:VOD, and add #EXT-X-ENDLIST
        if manifest_content.contains("#EXT-X-PLAYLIST-TYPE:EVENT") {
            manifest_content =
                manifest_content.replace("#EXT-X-PLAYLIST-TYPE:EVENT", "#EXT-X-PLAYLIST-TYPE:VOD");
            manifest_content += "\n#EXT-X-ENDLIST\n";
        }

        let cache_path = self.config.read().await.cache.clone();
        let cache_path = Path::new(&cache_path);
        let random_filename = format!("manifest_{}.m3u8", uuid::Uuid::new_v4());
@@ -497,7 +529,15 @@ impl RecorderManager {
            .await
            .map_err(|e| RecorderManagerError::ClipError { err: e.to_string() })?;

        if let Err(e) = clip_from_m3u8(reporter, &tmp_manifest_file_path, &clip_file).await {
        if let Err(e) = clip_from_m3u8(
            reporter,
            &tmp_manifest_file_path,
            &clip_file,
            params.range.as_ref(),
            params.fix_encoding,
        )
        .await
        {
            log::error!("Failed to generate clip file: {}", e);
            return Err(RecorderManagerError::ClipError { err: e.to_string() });
        }
@@ -517,12 +557,6 @@ impl RecorderManager {
            return Ok(clip_file);
        }

        let mut clip_offset = params.offset;
        if clip_offset > 0 {
            clip_offset -= recorder.first_segment_ts(&params.live_id).await;
            clip_offset = clip_offset.max(0);
        }

        let danmus = recorder.comments(&params.live_id).await;
        if danmus.is_err() {
            log::error!("Failed to get danmus");
@@ -530,20 +564,24 @@ impl RecorderManager {
        }

        log::info!(
            "Filter danmus in range [{}, {}] with global offset {} and local offset {}",
            params.x,
            params.y,
            clip_offset,
            "Filter danmus in range {} with local offset {}",
            params
                .range
                .as_ref()
                .map_or("None".to_string(), |r| r.to_string()),
            params.local_offset
        );
        let mut danmus = danmus.unwrap();
        log::debug!("First danmu entry: {:?}", danmus.first());
        // update entry ts to offset
        for d in &mut danmus {
            d.ts -= (params.x + clip_offset + params.local_offset) * 1000;
        }
        if params.x != 0 || params.y != 0 {
            danmus.retain(|x| x.ts >= 0 && x.ts <= (params.y - params.x) * 1000);

        if let Some(range) = &params.range {
            // update entry ts to offset and filter danmus in range
            for d in &mut danmus {
                d.ts -= (range.start as i64 + params.local_offset) * 1000;
            }
            if range.duration() > 0.0 {
                danmus.retain(|x| x.ts >= 0 && x.ts <= (range.duration() as i64) * 1000);
            }
        }

        if danmus.is_empty() {
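The replacement logic above rebases every danmu timestamp onto the clip's own timeline (shifting by the range start plus a local offset) and then keeps only entries that land inside the clip. A standalone sketch over plain millisecond timestamps (the `Range` here is a stand-in for the project's `crate::ffmpeg::Range`, and `rebase_danmus` is our name for the inline loop):

```rust
/// Stand-in for crate::ffmpeg::Range: a clip window in seconds.
struct Range {
    start: f64,
    end: f64,
}

impl Range {
    fn duration(&self) -> f64 {
        self.end - self.start
    }
}

/// Shift timestamps (ms) so the clip starts at 0, then drop
/// entries outside [0, duration_ms].
fn rebase_danmus(ts_ms: &mut Vec<i64>, range: &Range, local_offset: i64) {
    for t in ts_ms.iter_mut() {
        *t -= (range.start as i64 + local_offset) * 1000;
    }
    if range.duration() > 0.0 {
        ts_ms.retain(|&t| t >= 0 && t <= (range.duration() as i64) * 1000);
    }
}

fn main() {
    let range = Range { start: 10.0, end: 20.0 };
    let mut ts = vec![5_000, 12_000, 25_000];
    rebase_danmus(&mut ts, &range, 0);
    // Only the 12 s entry falls inside [10 s, 20 s); it becomes 2 s.
    assert_eq!(ts, vec![2_000]);
}
```

Compared with the old `params.x`/`params.y` arithmetic, carrying an optional `Range` makes the no-range case explicit: when `range` is `None`, timestamps are left untouched instead of being shifted by a zero-filled pair.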
@@ -601,8 +639,17 @@ impl RecorderManager {
        }
    }

    pub async fn get_archives(&self, room_id: u64) -> Result<Vec<RecordRow>, RecorderManagerError> {
        Ok(self.db.get_records(room_id).await?)
    pub async fn get_archive_disk_usage(&self) -> Result<u64, RecorderManagerError> {
        Ok(self.db.get_record_disk_usage().await?)
    }

    pub async fn get_archives(
        &self,
        room_id: u64,
        offset: u64,
        limit: u64,
    ) -> Result<Vec<RecordRow>, RecorderManagerError> {
        Ok(self.db.get_records(room_id, offset, limit).await?)
    }

    pub async fn get_archive(
@@ -648,7 +695,7 @@ impl RecorderManager {
        platform: PlatformType,
        room_id: u64,
        live_id: &str,
    ) -> Result<(), RecorderManagerError> {
    ) -> Result<RecordRow, RecorderManagerError> {
        log::info!("Deleting {}:{}", room_id, live_id);
        // check if this is still recording
        let recorder_id = format!("{}:{}", platform.as_str(), room_id);
@@ -660,13 +707,28 @@ impl RecorderManager {
                });
            }
        }
        self.db.remove_record(live_id).await?;
        let to_delete = self.db.remove_record(live_id).await?;
        let cache_folder = Path::new(self.config.read().await.cache.as_str())
            .join(platform.as_str())
            .join(room_id.to_string())
            .join(live_id);
        let _ = tokio::fs::remove_dir_all(cache_folder).await;
        Ok(())
        Ok(to_delete)
    }

    pub async fn delete_archives(
        &self,
        platform: PlatformType,
        room_id: u64,
        live_ids: &[&str],
    ) -> Result<Vec<RecordRow>, RecorderManagerError> {
        log::info!("Deleting archives in batch: {:?}", live_ids);
        let mut to_deletes = Vec::new();
        for live_id in live_ids {
            let to_delete = self.delete_archive(platform, room_id, live_id).await?;
            to_deletes.push(to_delete);
        }
        Ok(to_deletes)
    }

    pub async fn get_danmu(
@@ -6,6 +6,7 @@ use crate::config::Config;
use crate::database::Database;
use crate::recorder::bilibili::client::BiliClient;
use crate::recorder_manager::RecorderManager;
use crate::webhook::poster::WebhookPoster;

#[cfg(feature = "headless")]
use crate::progress_manager::ProgressManager;
@@ -21,6 +22,7 @@ pub struct State {
    pub db: Arc<Database>,
    pub client: Arc<BiliClient>,
    pub config: Arc<RwLock<Config>>,
    pub webhook_poster: WebhookPoster,
    pub recorder_manager: Arc<RecorderManager>,
    #[cfg(not(feature = "headless"))]
    pub app_handle: tauri::AppHandle,
@@ -178,6 +178,7 @@ mod tests {
    }

    #[tokio::test]
    #[ignore = "Might not have enough memory to run this test"]
    async fn generate_subtitle() {
        let whisper = new(Path::new("tests/model/ggml-tiny-q5_1.bin"), "")
            .await
@@ -228,6 +228,7 @@ mod tests {
    }

    #[tokio::test]
    #[ignore = "requres api key"]
    async fn test_generate_subtitle() {
        let result = new(Some("https://api.openai.com/v1"), Some("sk-****"), None).await;
        assert!(result.is_ok());
src-tauri/src/webhook/events.rs (new file, 47 lines)
@@ -0,0 +1,47 @@
use uuid::Uuid;

use crate::{
    database::{account::AccountRow, record::RecordRow, recorder::RecorderRow, video::VideoRow},
    recorder::RecorderInfo,
};

pub const CLIP_GENERATED: &str = "clip.generated";
pub const CLIP_DELETED: &str = "clip.deleted";

pub const RECORD_STARTED: &str = "record.started";
pub const RECORD_ENDED: &str = "record.ended";

pub const LIVE_STARTED: &str = "live.started";
pub const LIVE_ENDED: &str = "live.ended";

pub const ARCHIVE_DELETED: &str = "archive.deleted";

pub const RECORDER_REMOVED: &str = "recorder.removed";
pub const RECORDER_ADDED: &str = "recorder.added";

#[derive(serde::Serialize, serde::Deserialize, Debug, Clone)]
pub struct WebhookEvent {
    pub id: String,
    pub event: String,
    pub payload: Payload,
    pub timestamp: i64,
}

#[derive(serde::Serialize, serde::Deserialize, Debug, Clone)]
#[serde(untagged)]
pub enum Payload {
    Account(AccountRow),
    Recorder(RecorderRow),
    Room(RecorderInfo),
    Clip(VideoRow),
    Archive(RecordRow),
}

pub fn new_webhook_event(event_type: &str, payload: Payload) -> WebhookEvent {
    WebhookEvent {
        id: Uuid::new_v4().to_string(),
        event: event_type.to_string(),
        payload,
        timestamp: chrono::Utc::now().timestamp(),
    }
}
2
src-tauri/src/webhook/mod.rs
Normal file
@@ -0,0 +1,2 @@
pub mod events;
pub mod poster;
256
src-tauri/src/webhook/poster.rs
Normal file
@@ -0,0 +1,256 @@
//! Webhook Event Poster
//!
//! This module provides functionality for posting webhook events to external URLs.
//! It includes retry logic, custom headers support, and proper error handling.
//!
//! # Examples
//!
//! ## Basic Usage
//! ```rust,no_run
//! use std::collections::HashMap;
//! use bili_shadowreplay::webhook::poster::create_webhook_poster;
//!
//! # async fn example() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
//! let mut headers = HashMap::new();
//! headers.insert("Authorization".to_string(), "Bearer token".to_string());
//!
//! let poster = create_webhook_poster("https://api.example.com/webhook", Some(headers))?;
//! // Use the poster...
//! # Ok(())
//! # }
//! ```
//!
//! ## Custom Configuration
//! ```rust,no_run
//! use std::time::Duration;
//! use bili_shadowreplay::webhook::poster::{WebhookPoster, WebhookConfig};
//!
//! # async fn example() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
//! let config = WebhookConfig {
//!     url: "https://your-webhook-url.com/endpoint".to_string(),
//!     timeout: Duration::from_secs(60),
//!     retry_attempts: 5,
//!     retry_delay: Duration::from_secs(2),
//!     headers: None,
//! };
//!
//! let poster = WebhookPoster::new(config)?;
//! // Use the poster...
//! # Ok(())
//! # }
//! ```

use log::{error, info, warn};
use reqwest::Client;
use serde_json;
use std::{sync::Arc, time::Duration};
use tokio::{sync::RwLock, time::sleep};

use crate::webhook::events::WebhookEvent;

/// Configuration for webhook posting
#[derive(Debug, Clone)]
pub struct WebhookConfig {
    pub url: String,
    pub timeout: Duration,
    pub retry_attempts: u32,
    pub retry_delay: Duration,
    pub headers: Option<std::collections::HashMap<String, String>>,
}

impl Default for WebhookConfig {
    fn default() -> Self {
        Self {
            url: String::new(),
            timeout: Duration::from_secs(30),
            retry_attempts: 3,
            retry_delay: Duration::from_secs(1),
            headers: None,
        }
    }
}

/// Webhook event poster for sending events to specified URLs.
/// All methods are thread-safe.
#[derive(Clone)]
pub struct WebhookPoster {
    client: Arc<RwLock<Client>>,
    config: Arc<RwLock<WebhookConfig>>,
}

impl WebhookPoster {
    /// Create a new webhook poster with the given configuration
    pub fn new(config: WebhookConfig) -> Result<Self, Box<dyn std::error::Error + Send + Sync>> {
        let client = Client::builder().timeout(config.timeout).build()?;

        Ok(Self {
            client: Arc::new(RwLock::new(client)),
            config: Arc::new(RwLock::new(config)),
        })
    }

    /// Post a webhook event to the configured URL
    pub async fn post_event(&self, event: &WebhookEvent) -> Result<(), WebhookPostError> {
        if self.config.read().await.url.is_empty() {
            log::debug!("Webhook URL is empty, skipping");
            return Ok(());
        }

        let serialized_event = serde_json::to_string(event)
            .map_err(|e| WebhookPostError::Serialization(e.to_string()))?;

        let self_clone = self.clone();
        tokio::task::spawn(async move {
            let result = self_clone.post_with_retry(&serialized_event).await;
            if let Err(e) = result {
                log::error!("Post webhook event error: {}", e);
            }
        });

        Ok(())
    }

    /// Post raw JSON data to the configured URL
    #[allow(dead_code)]
    pub async fn post_json(&self, json_data: &str) -> Result<(), WebhookPostError> {
        if self.config.read().await.url.is_empty() {
            log::debug!("Webhook URL is empty, skipping");
            return Ok(());
        }

        self.post_with_retry(json_data).await
    }

    /// Post data with retry logic
    async fn post_with_retry(&self, data: &str) -> Result<(), WebhookPostError> {
        if self.config.read().await.url.is_empty() {
            log::debug!("Webhook URL is empty, skipping");
            return Ok(());
        }

        let mut last_error = None;

        for attempt in 1..=self.config.read().await.retry_attempts {
            match self.send_request(data).await {
                Ok(_) => {
                    if attempt > 1 {
                        info!("Webhook posted successfully on attempt {}", attempt);
                    }
                    return Ok(());
                }
                Err(e) => {
                    last_error = Some(e);
                    if attempt < self.config.read().await.retry_attempts {
                        warn!(
                            "Webhook post attempt {} failed, retrying in {:?}",
                            attempt,
                            self.config.read().await.retry_delay
                        );
                        sleep(self.config.read().await.retry_delay).await;
                    }
                }
            }
        }

        error!("All webhook post attempts failed");
        Err(last_error.unwrap())
    }

    /// Send the actual HTTP request
    async fn send_request(&self, data: &str) -> Result<(), WebhookPostError> {
        let webhook_url = self.config.read().await.url.clone();
        let mut request = self.client.read().await.post(&webhook_url);

        // Add custom headers if configured
        if let Some(headers) = &self.config.read().await.headers {
            for (key, value) in headers {
                request = request.header(key, value);
            }
        }

        log::debug!("Sending webhook request to: {}", webhook_url);

        // Set content type to JSON
        request = request.header("Content-Type", "application/json");

        let response = request
            .body(data.to_string())
            .send()
            .await
            .map_err(|e| WebhookPostError::Network(e.to_string()))?;

        if !response.status().is_success() {
            let status = response.status();
            let body = response.text().await.unwrap_or_default();
            return Err(WebhookPostError::Http {
                status: status.as_u16(),
                body,
            });
        }

        Ok(())
    }

    /// Update the webhook configuration
    pub async fn update_config(
        &self,
        config: WebhookConfig,
    ) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
        *self.client.write().await = Client::builder().timeout(config.timeout).build()?;
        *self.config.write().await = config;
        Ok(())
    }
}

/// Errors that can occur during webhook posting
#[derive(Debug, thiserror::Error)]
pub enum WebhookPostError {
    #[error("Network error: {0}")]
    Network(String),

    #[error("HTTP error: status {status}, body: {body}")]
    Http { status: u16, body: String },

    #[error("Serialization error: {0}")]
    Serialization(String),
}

/// Convenience function to create a webhook poster with custom headers
pub fn create_webhook_poster(
    url: &str,
    headers: Option<std::collections::HashMap<String, String>>,
) -> Result<WebhookPoster, Box<dyn std::error::Error + Send + Sync>> {
    let config = WebhookConfig {
        url: url.to_string(),
        headers,
        ..Default::default()
    };
    log::info!("Creating webhook poster with URL: {}", url);
    WebhookPoster::new(config)
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::collections::HashMap;

    #[tokio::test]
    async fn test_webhook_poster_creation() {
        let config = WebhookConfig {
            url: "https://httpbin.org/post".to_string(),
            ..Default::default()
        };

        let poster = WebhookPoster::new(config);
        assert!(poster.is_ok());
    }

    #[tokio::test]
    async fn test_webhook_poster_with_headers() {
        let mut headers = HashMap::new();
        headers.insert("Authorization".to_string(), "Bearer token".to_string());

        let poster = create_webhook_poster("https://httpbin.org/post", Some(headers));
        assert!(poster.is_ok());
    }
}
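The retry loop in `post_with_retry` above has a simple contract: a fixed delay between attempts, return on the first success, and surface the last error once every attempt has failed. A standalone sketch of that contract (the names here are illustrative, not part of the crate):

```typescript
// Same shape as post_with_retry: fixed delay, first success wins,
// the last error is thrown once attempts are exhausted.
async function postWithRetry(
  send: () => Promise<void>,
  retryAttempts: number,
  retryDelayMs: number,
): Promise<void> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= retryAttempts; attempt++) {
    try {
      await send();
      return; // first success wins
    } catch (e) {
      lastError = e;
      if (attempt < retryAttempts) {
        // wait only between attempts, not after the final failure
        await new Promise((r) => setTimeout(r, retryDelayMs));
      }
    }
  }
  throw lastError;
}

// Demo: fails twice, succeeds on the third attempt.
let calls = 0;
const demo = postWithRetry(
  async () => {
    calls++;
    if (calls < 3) throw new Error("network error");
  },
  3,
  1,
);
```

Note that `post_event` runs this loop on a spawned task, so a slow or failing webhook endpoint never blocks the caller; only `post_json` awaits the retries inline.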
BIN
src-tauri/tests/video/h_test.m4s
Normal file
Binary file not shown.
BIN
src-tauri/tests/video/test.mp4
Normal file
Binary file not shown.
@@ -1,6 +1,6 @@
 <script lang="ts">
   import Room from "./page/Room.svelte";
-  import BSidebar from "./lib/BSidebar.svelte";
+  import BSidebar from "./lib/components/BSidebar.svelte";
   import Summary from "./page/Summary.svelte";
   import Setting from "./page/Setting.svelte";
   import Account from "./page/Account.svelte";
@@ -1,9 +1,9 @@
 <script lang="ts">
-  import { invoke } from "./lib/invoker";
+  import { invoke, convertFileSrc, get_cover } from "./lib/invoker";
   import { onMount } from "svelte";
-  import VideoPreview from "./lib/VideoPreview.svelte";
+  import VideoPreview from "./lib/components/VideoPreview.svelte";
   import type { Config, VideoItem } from "./lib/interface";
-  import { convertFileSrc, set_title } from "./lib/invoker";
+  import { set_title } from "./lib/invoker";

   let video: VideoItem | null = null;
   let videos: any[] = [];
@@ -14,7 +14,6 @@

   invoke("get_config").then((c) => {
     config = c as Config;
-    console.log(config);
   });

   onMount(async () => {
@@ -27,18 +26,21 @@
       // update window title to file name
       set_title((videoData as VideoItem).file);
       // fetch the full video list for this room
-      if (roomId) {
-        videos = (
-          (await invoke("get_videos", { roomId: roomId })) as VideoItem[]
-        ).map((v) => {
-          return {
-            id: v.id,
-            value: v.id,
-            name: v.file,
-            file: convertFileSrc(config.output + "/" + v.file),
-            cover: v.cover,
-          };
-        });
+      if (roomId !== null && roomId !== undefined) {
+        const videoList = (await invoke("get_videos", {
+          roomId: roomId,
+        })) as VideoItem[];
+        videos = await Promise.all(
+          videoList.map(async (v) => {
+            return {
+              id: v.id,
+              value: v.id,
+              name: v.file,
+              file: await convertFileSrc(v.file),
+              cover: v.cover,
+            };
+          })
+        );
       }

       // find video in videos
@@ -52,36 +54,38 @@
       console.error("Failed to load video:", error);
     }
   }
-
-  console.log(video);
   });

   async function handleVideoChange(newVideo: VideoItem) {
     if (newVideo) {
-      // get cover from video
-      const cover = await invoke("get_video_cover", { id: newVideo.id });
-      newVideo.cover = cover as string;
+      if (newVideo.cover && newVideo.cover.trim() !== "") {
+        newVideo.cover = await get_cover("output", newVideo.cover);
+      } else {
+        newVideo.cover = "";
+      }
     }
     video = newVideo;
   }

   async function handleVideoListUpdate() {
-    if (roomId) {
+    if (roomId !== null && roomId !== undefined) {
       const videosData = await invoke("get_videos", { roomId });
-      videos = (videosData as VideoItem[]).map((v) => {
-        return {
-          id: v.id,
-          value: v.id,
-          name: v.file,
-          file: convertFileSrc(config.output + "/" + v.file),
-          cover: v.cover,
-        };
-      });
+      videos = await Promise.all(
+        (videosData as VideoItem[]).map(async (v) => {
+          return {
+            id: v.id,
+            value: v.id,
+            name: v.file,
+            file: await convertFileSrc(v.file),
+            cover: v.cover,
+          };
+        })
+      );
     }
   }
 </script>

-{#if showVideoPreview && video && roomId}
+{#if showVideoPreview && video && roomId !== null && roomId !== undefined}
 <VideoPreview
   bind:show={showVideoPreview}
   {video}
@@ -6,8 +6,9 @@
   convertFileSrc,
   listen,
   log,
+  get_cover,
 } from "./lib/invoker";
-import Player from "./lib/Player.svelte";
+import Player from "./lib/components/Player.svelte";
 import type { RecordItem } from "./lib/db";
 import { ChevronRight, ChevronLeft, Play, Pen } from "lucide-svelte";
 import {
@@ -20,7 +21,7 @@
   clipRange,
   generateEventId,
 } from "./lib/interface";
-import MarkerPanel from "./lib/MarkerPanel.svelte";
+import MarkerPanel from "./lib/components/MarkerPanel.svelte";
 import { onDestroy, onMount } from "svelte";

 const urlParams = new URLSearchParams(window.location.search);
@@ -36,11 +37,12 @@

 invoke("get_config").then((c) => {
   config = c as Config;
-  console.log(config);
 });

 let current_clip_event_id = null;
 let danmu_enabled = false;
+let fix_encoding = false;
+let clip_note: string = "";

 // danmaku-related variables
 let danmu_records: DanmuEntry[] = [];
@@ -142,12 +144,27 @@
   function format_time(milliseconds: number): string {
     const seconds = Math.floor(milliseconds / 1000);
     const minutes = Math.floor(seconds / 60);
-    const hours = Math.floor(minutes / 60).toString().padStart(2, "0");
+    const hours = Math.floor(minutes / 60)
+      .toString()
+      .padStart(2, "0");
     const remaining_seconds = (seconds % 60).toString().padStart(2, "0");
     const remaining_minutes = (minutes % 60).toString().padStart(2, "0");
     return `${hours}:${remaining_minutes}:${remaining_seconds}`;
   }

+  // Format a duration (in seconds) as "X小时 Y分 Z秒"
+  function format_duration_seconds(totalSecondsFloat: number): string {
+    const totalSeconds = Math.max(0, Math.floor(totalSecondsFloat));
+    const hours = Math.floor(totalSeconds / 3600);
+    const minutes = Math.floor((totalSeconds % 3600) / 60);
+    const seconds = totalSeconds % 60;
+    const parts = [] as string[];
+    if (hours > 0) parts.push(`${hours} 小时`);
+    if (minutes > 0) parts.push(`${minutes} 分`);
+    parts.push(`${seconds} 秒`);
+    return parts.join(" ");
+  }
+
   // Seek to a danmaku's timestamp
   function seek_to_danmu(danmu: DanmuEntry) {
     if (player) {
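The `format_duration_seconds` helper added above can be exercised on its own; this sketch mirrors its logic outside Svelte (same function body, renamed only to camelCase, no other assumptions):

```typescript
// Mirrors format_duration_seconds from the diff above: clamps negatives,
// drops zero-valued leading units, always emits the seconds part.
function formatDurationSeconds(totalSecondsFloat: number): string {
  const totalSeconds = Math.max(0, Math.floor(totalSecondsFloat));
  const hours = Math.floor(totalSeconds / 3600);
  const minutes = Math.floor((totalSeconds % 3600) / 60);
  const seconds = totalSeconds % 60;
  const parts: string[] = [];
  if (hours > 0) parts.push(`${hours} 小时`);
  if (minutes > 0) parts.push(`${minutes} 分`);
  parts.push(`${seconds} 秒`);
  return parts.join(" ");
}

console.log(formatDurationSeconds(3725.9)); // "1 小时 2 分 5 秒"
console.log(formatDurationSeconds(59));     // "59 秒"
```

This is what the clip-confirmation dialog uses instead of the raw `(end - start).toFixed(2)` seconds readout.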
@@ -157,7 +174,6 @@
 }

 const update_listener = listen<ProgressUpdate>(`progress-update`, (e) => {
-  console.log("progress-update event", e.payload.id);
   let event_id = e.payload.id;
   if (event_id === current_clip_event_id) {
     update_clip_prompt(e.payload.content);
@@ -166,10 +182,8 @@
 const finished_listener = listen<ProgressFinished>(
   `progress-finished`,
   (e) => {
-    console.log("progress-finished event", e.payload.id);
     let event_id = e.payload.id;
     if (event_id === current_clip_event_id) {
-      console.log("clip event finished", event_id);
       update_clip_prompt(`生成切片`);
       if (!e.payload.success) {
         alert("请检查 ffmpeg 是否配置正确:" + e.payload.message);
@@ -203,8 +217,6 @@
     end = parseFloat(localStorage.getItem(`${live_id}_end`)) - focus_start;
   }

-  console.log("Loaded start and end", start, end);
-
   function generateCover() {
     const video = document.getElementById("video") as HTMLVideoElement;
     var w = video.videoWidth;
@@ -252,7 +264,6 @@

   invoke("get_archive", { roomId: room_id, liveId: live_id }).then(
     (a: RecordItem) => {
-      console.log(a);
       archive = a;
       set_title(`[${room_id}]${archive.title}`);
     }
@@ -267,17 +278,20 @@
   }

   async function get_video_list() {
-    videos = (
-      (await invoke("get_videos", { roomId: room_id })) as VideoItem[]
-    ).map((v) => {
-      return {
-        id: v.id,
-        value: v.id,
-        name: v.file,
-        file: convertFileSrc(config.output + "/" + v.file),
-        cover: v.cover,
-      };
-    });
+    const videoList = (await invoke("get_videos", {
+      roomId: room_id,
+    })) as VideoItem[];
+    videos = await Promise.all(
+      videoList.map(async (v) => {
+        return {
+          id: v.id,
+          value: v.id,
+          name: v.file,
+          file: await convertFileSrc(v.file),
+          cover: v.cover,
+        };
+      })
+    );
   }

   async function find_video(e) {
@@ -290,10 +304,9 @@
       return v.value == id;
     });
     if (target_video) {
-      target_video.cover = await invoke("get_video_cover", { id: id });
+      target_video.cover = await get_cover("output", target_video.cover);
     }
     selected_video = target_video;
-    console.log("video selected", videos, selected_video, e, id);
   }

   async function generate_clip() {
@@ -317,19 +330,23 @@
     current_clip_event_id = event_id;
     let new_video = (await clipRange(event_id, {
       title: archive.title,
+      note: clip_note,
       room_id: room_id,
       platform: platform,
       cover: new_cover,
       live_id: live_id,
-      x: Math.floor(focus_start + start),
-      y: Math.floor(focus_start + end),
+      range: {
+        start: focus_start + start,
+        end: focus_start + end,
+      },
       danmu: danmu_enabled,
       offset: global_offset,
       local_offset:
         parseInt(localStorage.getItem(`local_offset:${live_id}`) || "0", 10) ||
         0,
+      fix_encoding,
     })) as VideoItem;
     await get_video_list();
+    new_video.cover = await get_cover("output", new_video.cover);
     video_selected = new_video.id;
     selected_video = videos.find((v) => {
       return v.value == new_video.id;
@@ -337,6 +354,9 @@
     if (selected_video) {
       selected_video.cover = new_video.cover;
     }
+
+    // clean up previous input data
+    clip_note = "";
   }

   async function cancel_clip() {
@@ -673,41 +693,91 @@

 <!-- Clip Confirmation Dialog -->
 {#if show_clip_confirm}
-  <div
-    class="fixed inset-0 bg-gray-900/50 backdrop-blur-sm flex items-center justify-center z-50"
-  >
-    <div class="bg-[#1c1c1e] rounded-lg p-6 max-w-md w-full mx-4">
-      <h3 class="text-lg font-medium text-white mb-4">确认生成切片</h3>
-      <div class="space-y-4">
-        <div class="text-sm text-gray-300">
-          <p>切片时长: {(end - start).toFixed(2)} 秒</p>
+  <div class="fixed inset-0 z-[100] flex items-center justify-center">
+    <div
+      class="absolute inset-0 bg-black/60 backdrop-blur-md"
+      role="button"
+      tabindex="0"
+      aria-label="关闭对话框"
+      on:click={() => (show_clip_confirm = false)}
+      on:keydown={(e) => {
+        if (e.key === "Escape" || e.key === "Enter" || e.key === " ") {
+          e.preventDefault();
+          show_clip_confirm = false;
+        }
+      }}
+    />
+
+    <div
+      role="dialog"
+      aria-modal="true"
+      class="relative mx-4 w-full max-w-md rounded-2xl bg-[#1c1c1e] border border-white/10 shadow-2xl ring-1 ring-black/5"
+    >
+      <div class="p-5">
+        <h3 class="text-[17px] font-semibold text-white">确认生成切片</h3>
+        <p class="mt-1 text-[13px] text-white/70">请确认以下设置后继续</p>
+
+        <div class="mt-3 space-y-3">
+          <div class="text-[13px] text-white/80">> 切片时长</div>
+          <div
+            class="mt-0.5 text-[22px] font-semibold tracking-tight text-white"
+          >
+            {format_duration_seconds(end - start)}
+          </div>
         </div>
-        <div class="flex items-center space-x-2">
+
+        <div class="mt-3 space-y-3">
+          <div class="mt-1 text-[13px] text-white/80">> 切片备注(可选)</div>
           <input
-            type="checkbox"
-            id="confirm-danmu-checkbox"
-            bind:checked={danmu_enabled}
-            class="w-4 h-4 text-[#0A84FF] bg-[#2c2c2e] border-gray-800 rounded focus:ring-[#0A84FF] focus:ring-offset-[#1c1c1e]"
+            type="text"
+            id="confirm-clip-note-input"
+            bind:value={clip_note}
+            class="w-full px-3 py-2 bg-[#2c2c2e] text-white rounded-lg
+              border border-gray-800/50 focus:border-[#0A84FF]
+              transition duration-200 outline-none
+              placeholder-gray-500"
           />
-          <label for="confirm-danmu-checkbox" class="text-sm text-gray-300"
-            >压制弹幕</label
-          >
         </div>
-        <div class="flex justify-end space-x-3">
-          <button
-            on:click={() => (show_clip_confirm = false)}
-            class="px-4 py-2 text-gray-300 hover:text-white transition-colors duration-200"
-          >
-            取消
-          </button>
-          <button
-            on:click={confirm_generate_clip}
-            class="px-4 py-2 bg-[#0A84FF] text-white rounded-lg hover:bg-[#0A84FF]/90 transition-colors duration-200"
-          >
-            确认生成
-          </button>
+
+        <div class="mt-3 space-y-3">
+          <label class="flex items-center gap-2.5">
+            <input
+              type="checkbox"
+              id="confirm-danmu-checkbox"
+              bind:checked={danmu_enabled}
+              class="h-4 w-4 rounded border-white/30 bg-[#2c2c2e] text-[#0A84FF] accent-[#0A84FF] focus:outline-none focus:ring-2 focus:ring-[#0A84FF]/40"
+            />
+            <span class="text-[13px] text-white/80">压制弹幕</span>
+          </label>
+
+          <label class="flex items-center gap-2.5">
+            <input
+              type="checkbox"
+              id="confirm-fix-encoding-checkbox"
+              bind:checked={fix_encoding}
+              class="h-4 w-4 rounded border-white/30 bg-[#2c2c2e] text-[#0A84FF] accent-[#0A84FF] focus:outline-none focus:ring-2 focus:ring-[#0A84FF]/40"
+            />
+            <span class="text-[13px] text-white/80">修复编码</span>
+          </label>
+        </div>
       </div>
+
+      <div
+        class="flex items-center justify-end gap-2 rounded-b-2xl border-t border-white/10 bg-[#111113] px-5 py-3"
+      >
+        <button
+          on:click={() => (show_clip_confirm = false)}
+          class="px-3.5 py-2 text-[13px] rounded-lg border border-white/20 text-white/90 hover:bg-white/10 transition-colors"
+        >
+          取消
+        </button>
+        <button
+          on:click={confirm_generate_clip}
+          class="px-3.5 py-2 text-[13px] rounded-lg bg-[#0A84FF] text-white shadow-[inset_0_1px_0_rgba(255,255,255,.15)] hover:bg-[#0A84FF]/90 transition-colors"
+        >
+          确认生成
+        </button>
+      </div>
     </div>
   </div>
 {/if}
@@ -1,26 +0,0 @@
-<script lang="ts">
-  import { get, log } from "./invoker";
-  export let src = "";
-  export let iclass = "";
-  let b = "";
-  async function getImage(url: string) {
-    if (!url) {
-      return "/imgs/douyin.png";
-    }
-    if (url.startsWith("data")) {
-      return url;
-    }
-    const response = await get(url);
-    return URL.createObjectURL(await response.blob());
-  }
-  async function init() {
-    try {
-      b = await getImage(src);
-    } catch (e) {
-      log.error("Failed to get image:", e);
-    }
-  }
-  init();
-</script>
-
-<img src={b} class={iclass} alt="" />
@@ -1,7 +1,12 @@
 import { tool } from "@langchain/core/tools";
 import { z } from "zod";
 import { invoke } from "../invoker";
-import { default_profile, generateEventId, type ClipRangeParams, type Profile } from "../interface";
+import {
+  default_profile,
+  generateEventId,
+  type ClipRangeParams,
+  type Profile,
+} from "../interface";

 const platform_list = ["bilibili", "douyin"];
@@ -23,7 +28,7 @@ const get_accounts = tool(
     name: "get_accounts",
     description: "Get all available accounts",
     schema: z.object({}),
-  }
+  },
 );

 // @ts-ignore
@@ -42,19 +47,28 @@ const remove_account = tool(
       platform: z
         .string()
         .describe(
-          `The platform of the account. Can be ${platform_list.join(", ")}`
+          `The platform of the account. Can be ${platform_list.join(", ")}`,
         ),
       uid: z.number().describe("The uid of the account"),
     }),
-  }
+  },
 );

 // @ts-ignore
 const add_recorder = tool(
-  async ({ platform, room_id }: { platform: string; room_id: number }) => {
+  async ({
+    platform,
+    room_id,
+    extra,
+  }: {
+    platform: string;
+    room_id: number;
+    extra: string;
+  }) => {
     const result = await invoke("add_recorder", {
       platform,
       roomId: room_id,
+      extra,
     });
     return result;
   },
@@ -65,11 +79,16 @@ const add_recorder = tool(
       platform: z
         .string()
         .describe(
-          `The platform of the recorder. Can be ${platform_list.join(", ")}`
+          `The platform of the recorder. Can be ${platform_list.join(", ")}`,
         ),
       room_id: z.number().describe("The room id of the recorder"),
+      extra: z
+        .string()
+        .describe(
+          "The extra of the recorder, should be empty for bilibili, and the sec_user_id for douyin",
+        ),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -88,11 +107,11 @@ const remove_recorder = tool(
       platform: z
         .string()
         .describe(
-          `The platform of the recorder. Can be ${platform_list.join(", ")}`
+          `The platform of the recorder. Can be ${platform_list.join(", ")}`,
         ),
       room_id: z.number().describe("The room id of the recorder"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -105,7 +124,7 @@ const get_recorder_list = tool(
     name: "get_recorder_list",
     description: "Get the list of all available recorders",
     schema: z.object({}),
-  }
+  },
 );

 // @ts-ignore
@@ -121,14 +140,24 @@ const get_recorder_info = tool(
       platform: z.string().describe("The platform of the room"),
       room_id: z.number().describe("The room id of the room"),
     }),
-  }
+  },
 );

 // @ts-ignore
 const get_archives = tool(
-  async ({ room_id }: { room_id: number }) => {
+  async ({
+    room_id,
+    offset,
+    limit,
+  }: {
+    room_id: number;
+    offset: number;
+    limit: number;
+  }) => {
     const archives = (await invoke("get_archives", {
       roomId: room_id,
+      offset,
+      limit,
     })) as any[];
     // hide cover in result
     return {
@@ -146,8 +175,10 @@
     description: "Get the list of all archives of a recorder",
     schema: z.object({
       room_id: z.number().describe("The room id of the recorder"),
+      offset: z.number().describe("The offset of the archives"),
+      limit: z.number().describe("The limit of the archives"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -171,7 +202,7 @@ const get_archive = tool(
       room_id: z.number().describe("The room id of the recorder"),
       live_id: z.string().describe("The live id of the archive"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -199,12 +230,45 @@ const delete_archive = tool(
       platform: z
         .string()
         .describe(
-          `The platform of the recorder. Can be ${platform_list.join(", ")}`
+          `The platform of the recorder. Can be ${platform_list.join(", ")}`,
         ),
       room_id: z.number().describe("The room id of the recorder"),
       live_id: z.string().describe("The live id of the archive"),
     }),
-  }
+  },
 );

 // @ts-ignore
+const delete_archives = tool(
+  async ({
+    platform,
+    room_id,
+    live_ids,
+  }: {
+    platform: string;
+    room_id: number;
+    live_ids: string[];
+  }) => {
+    const result = await invoke("delete_archives", {
+      platform,
+      roomId: room_id,
+      liveIds: live_ids,
+    });
+    return result;
+  },
+  {
+    name: "delete_archives",
+    description: "Delete multiple archives",
+    schema: z.object({
+      platform: z
+        .string()
+        .describe(
+          `The platform of the recorder. Can be ${platform_list.join(", ")}`,
+        ),
+      room_id: z.number().describe("The room id of the recorder"),
+      live_ids: z.array(z.string()).describe("The live ids of the archives"),
+    }),
+  },
+);
+
+// @ts-ignore
@@ -224,7 +288,7 @@ const get_background_tasks = tool(
     name: "get_background_tasks",
     description: "Get the list of all background tasks",
     schema: z.object({}),
-  }
+  },
 );

 // @ts-ignore
@@ -239,7 +303,7 @@ const delete_background_task = tool(
     schema: z.object({
      id: z.string().describe("The id of the task"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -261,7 +325,7 @@ const get_videos = tool(
     schema: z.object({
       room_id: z.number().describe("The room id of the room"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -281,7 +345,7 @@ const get_all_videos = tool(
     name: "get_all_videos",
     description: "Get the list of all videos from all rooms",
     schema: z.object({}),
-  }
+  },
 );

 // @ts-ignore
@@ -301,7 +365,7 @@ const get_video = tool(
     schema: z.object({
       id: z.number().describe("The id of the video"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -318,7 +382,7 @@ const get_video_cover = tool(
     schema: z.object({
       id: z.number().describe("The id of the video"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -333,7 +397,7 @@ const delete_video = tool(
     schema: z.object({
       id: z.number().describe("The id of the video"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -347,7 +411,7 @@ const get_video_typelist = tool(
     description:
       "Get the list of all video types(视频分区) that can be selected on bilibili platform",
     schema: z.object({}),
-  }
+  },
 );

 // @ts-ignore
@@ -358,11 +422,12 @@ const get_video_subtitle = tool(
   },
   {
     name: "get_video_subtitle",
-    description: "Get the subtitle of a video, if empty, you can use generate_video_subtitle to generate the subtitle",
+    description:
+      "Get the subtitle of a video, if empty, you can use generate_video_subtitle to generate the subtitle",
     schema: z.object({
       id: z.number().describe("The id of the video"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -377,7 +442,7 @@ const generate_video_subtitle = tool(
     schema: z.object({
       id: z.number().describe("The id of the video"),
     }),
-  }
+  },
 );

 // @ts-ignore
@@ -397,15 +462,31 @@ const encode_video_subtitle = tool(
       srt_style: z
         .string()
        .describe(
-          "The style of the subtitle, it is used for ffmpeg -vf force_style, it must be a valid srt style"
+          "The style of the subtitle, it is used for ffmpeg -vf force_style, it must be a valid srt style",
        ),
     }),
-  }
+  },
 );

 // @ts-ignore
 const post_video_to_bilibili = tool(
-  async ({ uid, room_id, video_id, title, desc, tag, tid }: { uid: number; room_id: number; video_id: number; title: string; desc: string; tag: string; tid: number }) => {
+  async ({
+    uid,
+    room_id,
+    video_id,
+    title,
+    desc,
+    tag,
+    tid,
+  }: {
+    uid: number;
+    room_id: number;
+    video_id: number;
+    title: string;
+    desc: string;
+    tag: string;
+    tid: number;
+  }) => {
     // invoke("upload_procedure", {
     //   uid: uid_selected,
     //   eventId: event_id,
@@ -421,28 +502,59 @@ const post_video_to_bilibili = tool(
     profile.desc = desc;
     profile.tag = tag;
     profile.tid = tid;
-    const result = await invoke("upload_procedure", { uid, eventId: event_id, roomId: room_id, videoId: video_id, cover, profile });
+    const result = await invoke("upload_procedure", {
+      uid,
+      eventId: event_id,
+      roomId: room_id,
+      videoId: video_id,
+      cover,
+      profile,
+    });
     return result;
   },
   {
     name: "post_video_to_bilibili",
     description: "Post a video to bilibili",
     schema: z.object({
-      uid: z.number().describe("The uid of the user, it should be one of the uid in the bilibili accounts"),
+      uid: z
+        .number()
+        .describe(
+          "The uid of the user, it should be one of the uid in the bilibili accounts",
+        ),
       room_id: z.number().describe("The room id of the room"),
      video_id: z.number().describe("The id of the video"),
      title: z.string().describe("The title of the video"),
      desc: z.string().describe("The description of the video"),
      tag: z.string().describe("The tag of the video, multiple tags should be separated by comma"),
      tid: z.number().describe("The tid of the video, it is the id of the video type, you can use get_video_typelist to get the list of all video types"),
|
||||
tag: z
|
||||
.string()
|
||||
.describe(
|
||||
"The tag of the video, multiple tags should be separated by comma",
|
||||
),
|
||||
tid: z
|
||||
.number()
|
||||
.describe(
|
||||
"The tid of the video, it is the id of the video type, you can use get_video_typelist to get the list of all video types",
|
||||
),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
const get_danmu_record = tool(
|
||||
async ({ platform, room_id, live_id }: { platform: string; room_id: number; live_id: string }) => {
|
||||
const result = (await invoke("get_danmu_record", { platform, roomId: room_id, liveId: live_id })) as any[];
|
||||
async ({
|
||||
platform,
|
||||
room_id,
|
||||
live_id,
|
||||
}: {
|
||||
platform: string;
|
||||
room_id: number;
|
||||
live_id: string;
|
||||
}) => {
|
||||
const result = (await invoke("get_danmu_record", {
|
||||
platform,
|
||||
roomId: room_id,
|
||||
liveId: live_id,
|
||||
})) as any[];
|
||||
// remove ts from result
|
||||
return {
|
||||
danmu_record: result.map((r: any) => {
|
||||
@@ -455,46 +567,84 @@ const get_danmu_record = tool(
|
||||
},
|
||||
{
|
||||
name: "get_danmu_record",
|
||||
description: "Get the danmu record of a live, entry ts is relative to the live start time in seconds",
|
||||
description:
|
||||
"Get the danmu record of a live, entry ts is relative to the live start time in seconds",
|
||||
schema: z.object({
|
||||
platform: z.string().describe("The platform of the room"),
|
||||
room_id: z.number().describe("The room id of the room"),
|
||||
live_id: z.string().describe("The live id of the live"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
const clip_range = tool(
|
||||
async ({ reason, clip_range_params }: { reason: string; clip_range_params: ClipRangeParams }) => {
|
||||
async ({
|
||||
reason,
|
||||
clip_range_params,
|
||||
}: {
|
||||
reason: string;
|
||||
clip_range_params: ClipRangeParams;
|
||||
}) => {
|
||||
const event_id = generateEventId();
|
||||
const result = await invoke("clip_range", { eventId: event_id, params: clip_range_params });
|
||||
const result = await invoke("clip_range", {
|
||||
eventId: event_id,
|
||||
params: clip_range_params,
|
||||
});
|
||||
return result;
|
||||
},
|
||||
{
|
||||
name: "clip_range",
|
||||
description: "Clip a range of a live, it will be used to generate a video. You must provide a reason for your decision on params",
|
||||
description:
|
||||
"Clip a range of a live, it will be used to generate a video. You must provide a reason for your decision on params",
|
||||
schema: z.object({
|
||||
reason: z.string().describe("The reason for the clip range, it will be shown to the user. You must offer a summary of the clip range content and why you choose this clip range."),
|
||||
reason: z
|
||||
.string()
|
||||
.describe(
|
||||
"The reason for the clip range, it will be shown to the user. You must offer a summary of the clip range content and why you choose this clip range.",
|
||||
),
|
||||
clip_range_params: z.object({
|
||||
room_id: z.number().describe("The room id of the room"),
|
||||
live_id: z.string().describe("The live id of the live"),
|
||||
x: z.number().describe("The start time in SECONDS of the clip, relative to the live start time, must be less than y"),
|
||||
y: z.number().describe("The end time in SECONDS of the clip, relative to the live start time, must be greater than x"),
|
||||
danmu: z.boolean().describe("Whether to encode danmu, encode danmu will take a lot of time, so it is recommended to set it to false"),
|
||||
offset: z.number().describe("Must be 0"),
|
||||
local_offset: z.number().describe("The offset for danmu timestamp, it is used to correct the timestamp of danmu"),
|
||||
range: z.object({
|
||||
start: z.number().describe("The start time in SECONDS of the clip"),
|
||||
end: z.number().describe("The end time in SECONDS of the clip"),
|
||||
}),
|
||||
danmu: z
|
||||
.boolean()
|
||||
.describe(
|
||||
"Whether to encode danmu, encode danmu will take a lot of time, so it is recommended to set it to false",
|
||||
),
|
||||
local_offset: z
|
||||
.number()
|
||||
.describe(
|
||||
"The offset for danmu timestamp, it is used to correct the timestamp of danmu",
|
||||
),
|
||||
title: z.string().describe("The title of the clip"),
|
||||
note: z.string().describe("The note of the clip"),
|
||||
cover: z.string().describe("Must be empty"),
|
||||
platform: z.string().describe("The platform of the clip"),
|
||||
fix_encoding: z
|
||||
.boolean()
|
||||
.describe(
|
||||
"Whether to fix the encoding of the clip, it will take a lot of time, so it is recommended to set it to false",
|
||||
),
|
||||
}),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
const get_recent_record = tool(
|
||||
async ({ room_id, offset, limit }: { room_id: number; offset: number; limit: number }) => {
|
||||
async ({
|
||||
room_id,
|
||||
offset,
|
||||
limit,
|
||||
}: {
|
||||
room_id: number;
|
||||
offset: number;
|
||||
limit: number;
|
||||
}) => {
|
||||
const records = (await invoke("get_recent_record", {
|
||||
roomId: room_id,
|
||||
offset,
|
||||
@@ -502,7 +652,11 @@ const get_recent_record = tool(
|
||||
})) as any[];
|
||||
return {
|
||||
records: records.map((r: any) => {
|
||||
return { ...r, cover: null, created_at: new Date(r.created_at).toLocaleString() };
|
||||
return {
|
||||
...r,
|
||||
cover: null,
|
||||
created_at: new Date(r.created_at).toLocaleString(),
|
||||
};
|
||||
}),
|
||||
};
|
||||
},
|
||||
@@ -514,18 +668,12 @@ const get_recent_record = tool(
|
||||
offset: z.number().describe("The offset of the records"),
|
||||
limit: z.number().describe("The limit of the records"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
const get_recent_record_all = tool(
|
||||
async ({
|
||||
offset,
|
||||
limit,
|
||||
}: {
|
||||
offset: number;
|
||||
limit: number;
|
||||
}) => {
|
||||
async ({ offset, limit }: { offset: number; limit: number }) => {
|
||||
const records = (await invoke("get_recent_record", {
|
||||
roomId: 0,
|
||||
offset,
|
||||
@@ -548,7 +696,7 @@ const get_recent_record_all = tool(
|
||||
offset: z.number().describe("The offset of the records"),
|
||||
limit: z.number().describe("The limit of the records"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
@@ -563,7 +711,7 @@ const generic_ffmpeg_command = tool(
|
||||
schema: z.object({
|
||||
args: z.array(z.string()).describe("The arguments of the ffmpeg command"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
@@ -578,7 +726,7 @@ const open_clip = tool(
|
||||
schema: z.object({
|
||||
video_id: z.number().describe("The id of the video"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
@@ -593,41 +741,67 @@ const list_folder = tool(
|
||||
schema: z.object({
|
||||
path: z.string().describe("The path of the folder"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
const get_archive_subtitle = tool(
|
||||
async ({ platform, room_id, live_id }: { platform: string; room_id: number; live_id: string }) => {
|
||||
const result = await invoke("get_archive_subtitle", { platform, roomId: room_id, liveId: live_id });
|
||||
async ({
|
||||
platform,
|
||||
room_id,
|
||||
live_id,
|
||||
}: {
|
||||
platform: string;
|
||||
room_id: number;
|
||||
live_id: string;
|
||||
}) => {
|
||||
const result = await invoke("get_archive_subtitle", {
|
||||
platform,
|
||||
roomId: room_id,
|
||||
liveId: live_id,
|
||||
});
|
||||
return result;
|
||||
},
|
||||
{
|
||||
name: "get_archive_subtitle",
|
||||
description: "Get the subtitle of a archive, it may not be generated yet, you can use generate_archive_subtitle to generate the subtitle",
|
||||
description:
|
||||
"Get the subtitle of a archive, it may not be generated yet, you can use generate_archive_subtitle to generate the subtitle",
|
||||
schema: z.object({
|
||||
platform: z.string().describe("The platform of the archive"),
|
||||
room_id: z.number().describe("The room id of the archive"),
|
||||
live_id: z.string().describe("The live id of the archive"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// @ts-ignore
|
||||
const generate_archive_subtitle = tool(
|
||||
async ({ platform, room_id, live_id }: { platform: string; room_id: number; live_id: string }) => {
|
||||
const result = await invoke("generate_archive_subtitle", { platform, roomId: room_id, liveId: live_id });
|
||||
async ({
|
||||
platform,
|
||||
room_id,
|
||||
live_id,
|
||||
}: {
|
||||
platform: string;
|
||||
room_id: number;
|
||||
live_id: string;
|
||||
}) => {
|
||||
const result = await invoke("generate_archive_subtitle", {
|
||||
platform,
|
||||
roomId: room_id,
|
||||
liveId: live_id,
|
||||
});
|
||||
return result;
|
||||
},
|
||||
{
|
||||
name: "generate_archive_subtitle",
|
||||
description: "Generate the subtitle of a archive, it may take a long time, you should not call this tool unless user ask you to generate the subtitle. It can be used to overwrite the subtitle of a archive",
|
||||
description:
|
||||
"Generate the subtitle of a archive, it may take a long time, you should not call this tool unless user ask you to generate the subtitle. It can be used to overwrite the subtitle of a archive",
|
||||
schema: z.object({
|
||||
platform: z.string().describe("The platform of the archive"),
|
||||
room_id: z.number().describe("The room id of the archive"),
|
||||
live_id: z.string().describe("The live id of the archive"),
|
||||
}),
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
const tools = [
|
||||
|
||||
@@ -9,7 +9,7 @@
    Video,
    Brain,
  } from "lucide-svelte";
  import { hasNewVersion } from "./stores/version";
  import { hasNewVersion } from "../stores/version";
  import SidebarItem from "./SidebarItem.svelte";
  import { createEventDispatcher } from "svelte";

@@ -1,6 +1,6 @@
<script lang="ts">
  import { Play, X, Type, Palette, Move, Plus, Trash2 } from "lucide-svelte";
  import { invoke, log } from "../lib/invoker";
  import { invoke, log } from "../invoker";
  import { onMount, createEventDispatcher } from "svelte";

  const dispatch = createEventDispatcher();
@@ -7,7 +7,7 @@

  // Get the message timestamp; fall back to the current time if absent
  $: messageTime = message.additional_kwargs?.timestamp
    ? new Date(message.additional_kwargs.timestamp)
    ? new Date(message.additional_kwargs.timestamp as string | number)
    : new Date();
</script>

src/lib/components/ImportVideoDialog.svelte (new file, 722 lines)
@@ -0,0 +1,722 @@
<script lang="ts">
|
||||
import {
|
||||
invoke,
|
||||
TAURI_ENV,
|
||||
ENDPOINT,
|
||||
listen,
|
||||
onConnectionRestore,
|
||||
} from "../invoker";
|
||||
import { Upload, X, CheckCircle } from "lucide-svelte";
|
||||
import { createEventDispatcher, onDestroy } from "svelte";
|
||||
import { open } from "@tauri-apps/plugin-dialog";
|
||||
import type { ProgressUpdate, ProgressFinished } from "../interface";
|
||||
|
||||
export let showDialog = false;
|
||||
export let roomId: number | null = null;
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
let selectedFilePath: string | null = null;
|
||||
let selectedFileName: string = "";
|
||||
let selectedFileSize: number = 0;
|
||||
let videoTitle = "";
|
||||
let importing = false;
|
||||
let uploading = false;
|
||||
let uploadProgress = 0;
|
||||
let dragOver = false;
|
||||
let fileInput: HTMLInputElement;
|
||||
let importProgress = "";
|
||||
let currentImportEventId: string | null = null;
|
||||
|
||||
// 批量导入状态
|
||||
let selectedFiles: string[] = [];
|
||||
let batchImporting = false;
|
||||
let currentFileIndex = 0;
|
||||
let totalFiles = 0;
|
||||
|
||||
// 获取当前正在处理的文件名(从文件路径中提取文件名)
|
||||
$: currentFileName =
|
||||
currentFileIndex > 0 && selectedFiles.length > 0
|
||||
? selectedFiles[currentFileIndex - 1]?.split(/[/\\]/).pop() || "未知文件"
|
||||
: "";
|
||||
|
||||
// 格式化文件大小
|
||||
function formatFileSize(sizeInBytes: number): string {
|
||||
if (sizeInBytes === 0) return "0 B";
|
||||
|
||||
const units = ["B", "KB", "MB", "GB", "TB"];
|
||||
const k = 1024;
|
||||
let unitIndex = 0;
|
||||
let size = sizeInBytes;
|
||||
|
||||
// 找到合适的单位
|
||||
while (size >= k && unitIndex < units.length - 1) {
|
||||
size /= k;
|
||||
unitIndex++;
|
||||
}
|
||||
|
||||
// 对于GB以上,显示2位小数;MB显示2位小数;KB及以下显示1位小数
|
||||
const decimals = unitIndex >= 3 ? 2 : unitIndex >= 2 ? 2 : 1;
|
||||
|
||||
return size.toFixed(decimals) + " " + units[unitIndex];
|
||||
}
|
||||
|
||||
// 进度监听器
|
||||
const progressUpdateListener = listen<ProgressUpdate>(
|
||||
"progress-update",
|
||||
(e) => {
|
||||
if (e.payload.id === currentImportEventId) {
|
||||
importProgress = e.payload.content;
|
||||
|
||||
// 从进度文本中提取当前文件索引
|
||||
const match = importProgress.match(/正在导入第(\d+)个/);
|
||||
if (match) {
|
||||
currentFileIndex = parseInt(match[1]);
|
||||
}
|
||||
}
|
||||
}
|
||||
);
|
||||
|
||||
const progressFinishedListener = listen<ProgressFinished>(
|
||||
"progress-finished",
|
||||
(e) => {
|
||||
if (e.payload.id === currentImportEventId) {
|
||||
if (e.payload.success) {
|
||||
// 导入成功,关闭对话框并刷新列表
|
||||
showDialog = false;
|
||||
selectedFilePath = null;
|
||||
selectedFileName = "";
|
||||
selectedFileSize = 0;
|
||||
videoTitle = "";
|
||||
resetBatchImportState();
|
||||
dispatch("imported");
|
||||
} else {
|
||||
alert("导入失败: " + e.payload.message);
|
||||
resetBatchImportState();
|
||||
}
|
||||
// 无论成功失败都要重置状态
|
||||
importing = false;
|
||||
currentImportEventId = null;
|
||||
importProgress = "";
|
||||
}
|
||||
}
|
||||
);
|
||||
|
||||
// 连接恢复时检查任务状态
|
||||
async function checkTaskStatus() {
|
||||
if (!currentImportEventId || !importing) return;
|
||||
|
||||
try {
|
||||
const progress = await invoke("get_import_progress");
|
||||
if (!progress) {
|
||||
importing = false;
|
||||
currentImportEventId = null;
|
||||
importProgress = "";
|
||||
resetBatchImportState();
|
||||
dispatch("imported");
|
||||
}
|
||||
} catch (error) {
|
||||
console.error(`[ImportDialog] Failed to check task status:`, error);
|
||||
}
|
||||
}
|
||||
|
||||
// 注册连接恢复回调
|
||||
if (!TAURI_ENV) {
|
||||
onConnectionRestore(checkTaskStatus);
|
||||
}
|
||||
|
||||
onDestroy(() => {
|
||||
progressUpdateListener?.then((fn) => fn());
|
||||
progressFinishedListener?.then((fn) => fn());
|
||||
});
|
||||
|
||||
async function handleFileSelect() {
|
||||
if (TAURI_ENV) {
|
||||
// Tauri模式:使用文件对话框,支持多选
|
||||
try {
|
||||
const selected = await open({
|
||||
multiple: true,
|
||||
filters: [
|
||||
{
|
||||
name: "视频文件",
|
||||
extensions: [
|
||||
"mp4",
|
||||
"mkv",
|
||||
"avi",
|
||||
"mov",
|
||||
"wmv",
|
||||
"flv",
|
||||
"m4v",
|
||||
"webm",
|
||||
],
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
// 检查用户是否取消了选择
|
||||
if (!selected) return;
|
||||
|
||||
if (Array.isArray(selected) && selected.length > 1) {
|
||||
// 批量导入:多个文件
|
||||
selectedFiles = selected;
|
||||
await startBatchImport();
|
||||
} else if (Array.isArray(selected) && selected.length === 1) {
|
||||
// 单文件导入:数组中的单个文件
|
||||
await setSelectedFile(selected[0]);
|
||||
} else if (typeof selected === "string") {
|
||||
// 单文件导入:直接返回字符串路径
|
||||
await setSelectedFile(selected);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("文件选择失败:", error);
|
||||
alert("文件选择失败: " + error);
|
||||
}
|
||||
} else {
|
||||
// Web模式:触发文件输入
|
||||
fileInput?.click();
|
||||
}
|
||||
}
|
||||
|
||||
async function handleFileInputChange(event: Event) {
|
||||
const target = event.target as HTMLInputElement;
|
||||
const files = target.files;
|
||||
if (files && files.length > 0) {
|
||||
if (files.length > 1) {
|
||||
// 批量上传模式
|
||||
await uploadAndImportMultipleFiles(Array.from(files));
|
||||
} else {
|
||||
// 单文件上传模式(保持现有逻辑)
|
||||
const file = files[0];
|
||||
// 提前设置文件信息,提升用户体验
|
||||
selectedFileName = file.name;
|
||||
videoTitle = file.name.replace(/\.[^/.]+$/, ""); // 去掉扩展名
|
||||
selectedFileSize = file.size;
|
||||
|
||||
await uploadFile(file);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
async function handleDrop(event: DragEvent) {
|
||||
event.preventDefault();
|
||||
dragOver = false;
|
||||
|
||||
if (TAURI_ENV) return; // Tauri模式不支持拖拽
|
||||
|
||||
const files = event.dataTransfer?.files;
|
||||
if (files && files.length > 0) {
|
||||
const file = files[0];
|
||||
// 检查文件类型
|
||||
const allowedTypes = [
|
||||
"video/mp4",
|
||||
"video/x-msvideo",
|
||||
"video/quicktime",
|
||||
"video/x-ms-wmv",
|
||||
"video/x-flv",
|
||||
"video/x-m4v",
|
||||
"video/webm",
|
||||
"video/x-matroska",
|
||||
];
|
||||
if (
|
||||
allowedTypes.includes(file.type) ||
|
||||
file.name.match(/\.(mp4|mkv|avi|mov|wmv|flv|m4v|webm)$/i)
|
||||
) {
|
||||
// 提前设置文件信息,提升用户体验
|
||||
selectedFileName = file.name;
|
||||
videoTitle = file.name.replace(/\.[^/.]+$/, ""); // 去掉扩展名
|
||||
selectedFileSize = file.size;
|
||||
|
||||
await uploadFile(file);
|
||||
} else {
|
||||
alert(
|
||||
"请选择支持的视频文件格式 (MP4, MKV, AVI, MOV, WMV, FLV, M4V, WebM)"
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
async function uploadFile(file: File) {
|
||||
uploading = true;
|
||||
uploadProgress = 0;
|
||||
|
||||
try {
|
||||
const formData = new FormData();
|
||||
formData.append("file", file);
|
||||
formData.append("roomId", String(roomId || 0));
|
||||
|
||||
const xhr = new XMLHttpRequest();
|
||||
|
||||
// 监听上传进度
|
||||
xhr.upload.addEventListener("progress", (e) => {
|
||||
if (e.lengthComputable) {
|
||||
uploadProgress = Math.round((e.loaded / e.total) * 100);
|
||||
}
|
||||
});
|
||||
|
||||
// 处理上传完成
|
||||
xhr.addEventListener("load", async () => {
|
||||
if (xhr.status === 200) {
|
||||
const response = JSON.parse(xhr.responseText);
|
||||
|
||||
if (response.code === 0 && response.data) {
|
||||
// 使用本地文件信息,更快更准确
|
||||
await setSelectedFile(response.data.filePath, file.size);
|
||||
} else {
|
||||
throw new Error(response.message || "上传失败");
|
||||
}
|
||||
} else {
|
||||
throw new Error(`上传失败: HTTP ${xhr.status}`);
|
||||
}
|
||||
uploading = false;
|
||||
});
|
||||
|
||||
xhr.addEventListener("error", () => {
|
||||
alert("上传失败:网络错误");
|
||||
uploading = false;
|
||||
});
|
||||
|
||||
xhr.open("POST", `${ENDPOINT}/api/upload_file`);
|
||||
xhr.send(formData);
|
||||
} catch (error) {
|
||||
console.error("上传失败:", error);
|
||||
alert("上传失败: " + error);
|
||||
uploading = false;
|
||||
}
|
||||
}
|
||||
|
||||
async function uploadAndImportMultipleFiles(files: File[]) {
|
||||
batchImporting = true;
|
||||
importing = true;
|
||||
totalFiles = files.length;
|
||||
currentFileIndex = 0;
|
||||
importProgress = `准备批量上传和导入 ${totalFiles} 个文件...`;
|
||||
|
||||
// 设置当前处理的文件名列表
|
||||
const fileNames = files.map((file) => file.name);
|
||||
|
||||
try {
|
||||
// 验证所有文件格式
|
||||
const allowedTypes = [
|
||||
"video/mp4",
|
||||
"video/x-msvideo",
|
||||
"video/quicktime",
|
||||
"video/x-ms-wmv",
|
||||
"video/x-flv",
|
||||
"video/x-m4v",
|
||||
"video/webm",
|
||||
"video/x-matroska",
|
||||
];
|
||||
for (const file of files) {
|
||||
if (
|
||||
!allowedTypes.includes(file.type) &&
|
||||
!file.name.match(/\.(mp4|mkv|avi|mov|wmv|flv|m4v|webm)$/i)
|
||||
) {
|
||||
throw new Error(`不支持的文件格式: ${file.name}`);
|
||||
}
|
||||
}
|
||||
|
||||
const formData = new FormData();
|
||||
formData.append("room_id", String(roomId || 0));
|
||||
|
||||
files.forEach((file) => {
|
||||
formData.append("files", file);
|
||||
});
|
||||
|
||||
const xhr = new XMLHttpRequest();
|
||||
|
||||
// 监听上传进度
|
||||
xhr.upload.addEventListener("progress", (e) => {
|
||||
if (e.lengthComputable) {
|
||||
const progress = Math.round((e.loaded / e.total) * 100);
|
||||
importProgress = `批量上传进度: ${progress}%`;
|
||||
|
||||
// 根据进度估算当前正在上传的文件
|
||||
const estimatedCurrentIndex = Math.min(
|
||||
Math.floor((progress / 100) * totalFiles),
|
||||
totalFiles - 1
|
||||
);
|
||||
currentFileName = fileNames[estimatedCurrentIndex] || fileNames[0];
|
||||
}
|
||||
});
|
||||
|
||||
// 处理上传完成
|
||||
xhr.addEventListener("load", () => {
|
||||
if (xhr.status === 200) {
|
||||
const response = JSON.parse(xhr.responseText);
|
||||
if (response.code === 0) {
|
||||
// 批量上传和导入成功,关闭对话框并刷新列表
|
||||
showDialog = false;
|
||||
selectedFilePath = null;
|
||||
selectedFileName = "";
|
||||
selectedFileSize = 0;
|
||||
videoTitle = "";
|
||||
resetBatchImportState();
|
||||
dispatch("imported");
|
||||
} else {
|
||||
throw new Error(response.message || "批量导入失败");
|
||||
}
|
||||
} else {
|
||||
throw new Error(`批量上传失败: HTTP ${xhr.status}`);
|
||||
}
|
||||
});
|
||||
|
||||
xhr.addEventListener("error", () => {
|
||||
alert("批量上传失败:网络错误");
|
||||
resetBatchImportState();
|
||||
});
|
||||
|
||||
xhr.open("POST", `${ENDPOINT}/api/upload_and_import_files`);
|
||||
xhr.send(formData);
|
||||
} catch (error) {
|
||||
console.error("批量上传失败:", error);
|
||||
alert("批量上传失败: " + error);
|
||||
resetBatchImportState();
|
||||
}
|
||||
}
|
||||
|
||||
async function setSelectedFile(filePath: string, fileSize?: number) {
|
||||
selectedFilePath = filePath;
|
||||
selectedFileName = filePath.split(/[/\\]/).pop() || "";
|
||||
videoTitle = selectedFileName.replace(/\.[^/.]+$/, ""); // 去掉扩展名
|
||||
|
||||
if (fileSize !== undefined) {
|
||||
selectedFileSize = fileSize;
|
||||
} else {
|
||||
// 获取文件大小 (Tauri模式)
|
||||
try {
|
||||
selectedFileSize = await invoke("get_file_size", { filePath });
|
||||
} catch (e) {
|
||||
selectedFileSize = 0;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* 开始批量导入视频文件
|
||||
*/
|
||||
async function startBatchImport() {
|
||||
if (selectedFiles.length === 0) return;
|
||||
|
||||
batchImporting = true;
|
||||
importing = true;
|
||||
totalFiles = selectedFiles.length;
|
||||
currentFileIndex = 0;
|
||||
importProgress = `准备批量导入 ${totalFiles} 个文件...`;
|
||||
|
||||
try {
|
||||
const eventId = "batch_import_" + Date.now();
|
||||
currentImportEventId = eventId;
|
||||
|
||||
await invoke("batch_import_external_videos", {
|
||||
eventId: eventId,
|
||||
filePaths: selectedFiles,
|
||||
roomId: roomId || 0,
|
||||
});
|
||||
|
||||
// 注意:成功处理在 progressFinishedListener 中进行
|
||||
} catch (error) {
|
||||
console.error("批量导入失败:", error);
|
||||
alert("批量导入失败: " + error);
|
||||
resetBatchImportState();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* 重置批量导入状态
|
||||
*/
|
||||
function resetBatchImportState() {
|
||||
batchImporting = false;
|
||||
importing = false;
|
||||
currentImportEventId = null;
|
||||
importProgress = "";
|
||||
selectedFiles = [];
|
||||
totalFiles = 0;
|
||||
currentFileIndex = 0;
|
||||
}
|
||||
|
||||
async function startImport() {
|
||||
if (!selectedFilePath) return;
|
||||
|
||||
importing = true;
|
||||
importProgress = "准备导入...";
|
||||
|
||||
try {
|
||||
const eventId = "import_" + Date.now();
|
||||
currentImportEventId = eventId;
|
||||
|
||||
await invoke("import_external_video", {
|
||||
eventId: eventId,
|
||||
filePath: selectedFilePath,
|
||||
title: videoTitle,
|
||||
roomId: roomId || 0,
|
||||
});
|
||||
|
||||
// 注意:成功处理移到了progressFinishedListener中
|
||||
} catch (error) {
|
||||
console.error("导入失败:", error);
|
||||
alert("导入失败: " + error);
|
||||
importing = false;
|
||||
currentImportEventId = null;
|
||||
importProgress = "";
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* 关闭对话框并重置所有状态
|
||||
*/
|
||||
function closeDialog() {
|
||||
showDialog = false;
|
||||
// 重置单文件导入状态
|
||||
selectedFilePath = null;
|
||||
selectedFileName = "";
|
||||
selectedFileSize = 0;
|
||||
videoTitle = "";
|
||||
uploading = false;
|
||||
uploadProgress = 0;
|
||||
importing = false;
|
||||
currentImportEventId = null;
|
||||
importProgress = "";
|
||||
// 重置批量导入状态
|
||||
resetBatchImportState();
|
||||
}
|
||||
|
||||
function handleDragOver(event: DragEvent) {
|
||||
event.preventDefault();
|
||||
if (!TAURI_ENV) {
|
||||
dragOver = true;
|
||||
}
|
||||
}
|
||||
|
||||
function handleDragLeave() {
|
||||
dragOver = false;
|
||||
}
|
||||
</script>
|
||||
|
||||
<!-- 隐藏的文件输入 -->
|
||||
{#if !TAURI_ENV}
|
||||
<input
|
||||
bind:this={fileInput}
|
||||
type="file"
|
||||
accept=".mp4,.mkv,.avi,.mov,.wmv,.flv,.m4v,.webm,video/*"
|
||||
multiple
|
||||
style="display: none"
|
||||
on:change={handleFileInputChange}
|
||||
/>
|
||||
{/if}
|
||||
|
||||
{#if showDialog}
|
||||
<div
|
||||
class="fixed inset-0 bg-black/20 dark:bg-black/40 backdrop-blur-sm z-50 flex items-center justify-center p-4"
|
||||
>
|
||||
<div
|
||||
class="bg-white dark:bg-[#323234] rounded-xl shadow-xl w-full max-w-[600px] max-h-[90vh] overflow-hidden flex flex-col"
|
||||
>
|
||||
<div class="flex-1 overflow-y-auto">
|
||||
<div class="p-6 space-y-4">
|
||||
<div class="flex justify-between items-center">
|
||||
<h3 class="text-lg font-medium text-gray-900 dark:text-white">
|
||||
导入外部视频
|
||||
</h3>
|
||||
<button
|
||||
on:click={closeDialog}
|
||||
class="text-gray-400 hover:text-gray-600"
|
||||
>
|
||||
<X class="w-5 h-5" />
|
||||
</button>
|
||||
</div>
|
||||
|
||||
<!-- 文件选择区域 -->
|
||||
<div
|
||||
class="border-2 border-dashed rounded-lg p-8 text-center transition-colors {dragOver
|
||||
? 'border-blue-400 bg-blue-50 dark:bg-blue-900/20'
|
||||
: 'border-gray-300 dark:border-gray-600'}"
|
||||
on:dragover={handleDragOver}
|
||||
on:dragleave={handleDragLeave}
|
||||
on:drop={handleDrop}
|
||||
>
|
||||
{#if uploading}
|
||||
<!-- 上传进度 -->
|
||||
<div class="space-y-4">
|
||||
<Upload
|
||||
class="w-12 h-12 text-blue-500 mx-auto animate-bounce"
|
||||
/>
|
||||
<p class="text-sm text-gray-900 dark:text-white font-medium">
|
||||
上传中...
|
||||
</p>
|
||||
<div
|
||||
class="w-full bg-gray-200 dark:bg-gray-700 rounded-full h-2"
|
||||
>
|
||||
<div
|
||||
class="bg-blue-500 h-2 rounded-full transition-all"
|
||||
style="width: {uploadProgress}%"
|
||||
></div>
|
||||
</div>
|
||||
<p class="text-xs text-gray-500">{uploadProgress}%</p>
|
||||
</div>
|
||||
{:else if batchImporting}
|
||||
<!-- 批量导入中 -->
|
||||
<div class="space-y-4">
|
||||
<div class="flex items-center justify-center">
|
||||
<svg
|
||||
class="animate-spin h-12 w-12 text-blue-500"
|
||||
xmlns="http://www.w3.org/2000/svg"
|
||||
fill="none"
|
||||
viewBox="0 0 24 24"
|
||||
>
|
||||
<circle
|
||||
class="opacity-25"
|
||||
cx="12"
|
||||
cy="12"
|
||||
r="10"
|
||||
stroke="currentColor"
|
||||
stroke-width="4"
|
||||
></circle>
|
||||
<path
|
||||
class="opacity-75"
|
||||
fill="currentColor"
|
||||
d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"
|
||||
></path>
|
||||
</svg>
|
||||
</div>
|
||||
<p class="text-sm text-gray-900 dark:text-white font-medium">
|
||||
批量导入进行中...
|
||||
</p>
|
||||
<div class="text-xs text-gray-500">{importProgress}</div>
|
||||
{#if currentFileName}
|
||||
<div class="text-xs text-gray-400 break-all">
|
||||
当前文件:{currentFileName}
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
{:else if selectedFilePath}
|
||||
<!-- 已选择文件 -->
|
||||
<div class="space-y-4">
|
||||
<div class="flex items-center justify-center">
|
||||
<CheckCircle class="w-12 h-12 text-green-500 mx-auto" />
|
||||
</div>
|
||||
<p class="text-sm text-gray-900 dark:text-white font-medium">
|
||||
{selectedFileName}
|
||||
</p>
|
||||
<p class="text-xs text-gray-500">
|
||||
大小: {formatFileSize(selectedFileSize)}
|
||||
</p>
|
||||
<p
|
||||
class="text-xs text-gray-400 break-all"
|
||||
title={selectedFilePath}
|
||||
>
|
||||
{selectedFilePath}
|
||||
</p>
|
||||
<button
|
||||
on:click={() => {
|
||||
selectedFilePath = null;
|
||||
selectedFileName = "";
|
||||
selectedFileSize = 0;
|
||||
videoTitle = "";
|
||||
}}
|
||||
class="text-sm text-red-500 hover:text-red-700"
|
||||
>
|
||||
重新选择
|
||||
</button>
|
Excerpt from the video-import dialog template (Svelte):

```svelte
  </div>
{:else}
  <!-- File selection prompt -->
  <div class="space-y-4">
    <Upload class="w-12 h-12 text-gray-400 mx-auto" />
    {#if TAURI_ENV}
      <p class="text-sm text-gray-600 dark:text-gray-400">
        点击按钮选择视频文件(支持多选)
      </p>
    {:else}
      <p class="text-sm text-gray-600 dark:text-gray-400">
        拖拽视频文件到此处,或点击按钮选择文件(支持多选)
      </p>
    {/if}
    <p class="text-xs text-gray-500 dark:text-gray-500">
      支持 MP4, MKV, AVI, MOV, WMV, FLV, M4V, WebM 格式
    </p>
  </div>
{/if}

{#if !uploading && !selectedFilePath && !batchImporting}
  <button
    on:click={handleFileSelect}
    class="mt-4 px-4 py-2 bg-blue-500 text-white rounded-lg hover:bg-blue-600 transition-colors"
  >
    {TAURI_ENV ? "选择文件" : "选择或拖拽文件"}
  </button>
{/if}
</div>

<!-- Video info editing -->
{#if selectedFilePath}
  <div class="space-y-4">
    <div>
      <label
        for="video-title-input"
        class="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-2"
      >
        视频标题
      </label>
      <input
        id="video-title-input"
        type="text"
        bind:value={videoTitle}
        class="w-full px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-2 focus:ring-blue-500 dark:focus:ring-blue-400"
        placeholder="输入视频标题"
      />
    </div>
  </div>
{/if}
</div>
</div>

<!-- Action buttons, pinned to the bottom -->
<div
  class="border-t border-gray-200 dark:border-gray-700 p-4 bg-gray-50 dark:bg-[#2a2a2c]"
>
  <div class="flex justify-end space-x-3">
    <button
      on:click={closeDialog}
      class="px-4 py-2 text-gray-700 dark:text-gray-300 hover:bg-gray-100 dark:hover:bg-gray-600 rounded-lg transition-colors"
    >
      取消
    </button>
    <button
      on:click={startImport}
      disabled={!selectedFilePath ||
        importing ||
        !videoTitle.trim() ||
        uploading ||
        batchImporting}
      class="px-4 py-2 bg-green-500 text-white rounded-lg hover:bg-green-600 disabled:opacity-50 disabled:cursor-not-allowed transition-colors flex items-center space-x-2"
    >
      {#if importing}
        <svg
          class="animate-spin h-4 w-4"
          xmlns="http://www.w3.org/2000/svg"
          fill="none"
          viewBox="0 0 24 24"
        >
          <circle
            class="opacity-25"
            cx="12"
            cy="12"
            r="10"
            stroke="currentColor"
            stroke-width="4"
          ></circle>
          <path
            class="opacity-75"
            fill="currentColor"
            d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"
          ></path>
        </svg>
      {/if}
      <span>{importing ? importProgress || "导入中..." : "开始导入"}</span>
    </button>
  </div>
</div>
</div>
</div>
{/if}
```
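The import button's `disabled` expression above encodes when an import may start: a file must be selected, the title must be non-blank, and no import, upload, or batch operation may already be running. A standalone sketch of that predicate (the state shape and function name are illustrative, not the component's actual API):

```typescript
// Hypothetical predicate mirroring the button's disabled expression.
// The component disables the button when this returns false.
function canStartImport(state: {
  selectedFilePath: string | null;
  videoTitle: string;
  importing: boolean;
  uploading: boolean;
  batchImporting: boolean;
}): boolean {
  return (
    !!state.selectedFilePath &&          // a file has been chosen
    state.videoTitle.trim().length > 0 && // title is not blank/whitespace
    !state.importing &&                   // no import already in flight
    !state.uploading &&
    !state.batchImporting
  );
}

console.log(
  canStartImport({
    selectedFilePath: "/tmp/a.mp4",
    videoTitle: "demo",
    importing: false,
    uploading: false,
    batchImporting: false,
  })
); // true
```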
```diff
@@ -5,12 +5,12 @@
   ForwardOutline,
   ClockOutline,
 } from "flowbite-svelte-icons";
-import type { Marker } from "./interface";
+import type { Marker } from "../interface";
 import { createEventDispatcher } from "svelte";
 import { Tooltip } from "flowbite-svelte";
-import { invoke, TAURI_ENV } from "../lib/invoker";
+import { invoke, TAURI_ENV } from "../invoker";
 import { save } from "@tauri-apps/plugin-dialog";
-import type { RecordItem } from "./db";
+import type { RecordItem } from "../db";
 const dispatch = createEventDispatcher();
 export let archive: RecordItem;
 export let markers: Marker[] = [];
```
```diff
@@ -3,9 +3,9 @@
 </script>

 <script lang="ts">
-  import { invoke, TAURI_ENV, ENDPOINT, listen, log } from "../lib/invoker";
-  import type { AccountInfo } from "./db";
-  import type { Marker, RecorderList, RecorderInfo } from "./interface";
+  import { invoke, TAURI_ENV, ENDPOINT, listen, log } from "../invoker";
+  import type { AccountInfo } from "../db";
+  import type { Marker, RecorderList, RecorderInfo } from "../interface";

   import { createEventDispatcher } from "svelte";
   import {
```
```diff
@@ -376,6 +376,11 @@
       return;
     }

+    if (event.payload.ts < global_offset * 1000) {
+      log.error("invalid danmu ts:", event.payload.ts, global_offset);
+      return;
+    }
+
     let danmu_record = {
       ...event.payload,
       ts: event.payload.ts - global_offset * 1000,
```
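The hunk above adds a sanity check before re-basing danmaku timestamps onto the recording's clock: an entry stamped earlier than the stream's global offset cannot belong to this recording and is rejected. A standalone sketch of that guard (the types and function name are illustrative, not the project's actual API):

```typescript
// Hypothetical re-basing helper mirroring the guard in the diff.
// globalOffsetSec is in seconds; entry.ts is in milliseconds.
interface DanmuEntry {
  ts: number;
  content: string;
}

function rebase(entry: DanmuEntry, globalOffsetSec: number): DanmuEntry | null {
  const offsetMs = globalOffsetSec * 1000;
  if (entry.ts < offsetMs) {
    // "invalid danmu ts": the message predates the recording start
    return null;
  }
  // Shift to a recording-relative timestamp
  return { ...entry, ts: entry.ts - offsetMs };
}

console.log(rebase({ ts: 5000, content: "hi" }, 2)); // → ts becomes 3000
console.log(rebase({ ts: 1000, content: "early" }, 2)); // → null (rejected)
```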
```diff
@@ -670,37 +675,22 @@
     }
     switch (e.key) {
       case "[":
-        e.preventDefault();
-        start = parseFloat(video.currentTime.toFixed(2));
-        if (end < start) {
-          end = get_total();
-        }
-
-        saveStartEnd();
-        console.log(start, end);
-        break;
       case "【":
         e.preventDefault();
         start = parseFloat(video.currentTime.toFixed(2));
-        if (end < start) {
+        // If there is no selection (end is 0) or end < start, snap the end point to the video's end
+        if (end === 0 || end < start) {
           end = get_total();
         }
         saveStartEnd();
         console.log(start, end);
         break;
       case "]":
-        e.preventDefault();
-        end = parseFloat(video.currentTime.toFixed(2));
-        if (start > end) {
-          start = 0;
-        }
-        saveStartEnd();
-        console.log(start, end);
-        break;
       case "】":
         e.preventDefault();
         end = parseFloat(video.currentTime.toFixed(2));
-        if (start > end) {
+        // If there is no selection (start is 0) or start > end, snap the start point to the video's beginning
+        if (start === 0 || start > end) {
           start = 0;
         }
         saveStartEnd();
```
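The rewritten key handling merges the half-width and full-width bracket cases into fall-throughs and clamps the selection when only one endpoint is set. The clamping rule can be sketched standalone (function names and signatures are illustrative, not the component's real API):

```typescript
// Hypothetical helpers mirroring the "[" / "]" clamping rule from the diff.
// "[" sets the start to the current time (2-decimal precision); if that
// leaves no valid selection (end unset at 0, or end before start), the end
// snaps to the video's total duration. "]" mirrors this for the end point.
function setSelectionStart(
  current: number,
  end: number,
  total: number
): { start: number; end: number } {
  const start = parseFloat(current.toFixed(2));
  if (end === 0 || end < start) {
    end = total; // no selection yet, or end is before the new start
  }
  return { start, end };
}

function setSelectionEnd(
  current: number,
  start: number
): { start: number; end: number } {
  const end = parseFloat(current.toFixed(2));
  if (start === 0 || start > end) {
    start = 0; // no selection yet, or start is after the new end
  }
  return { start, end };
}

console.log(setSelectionStart(12.5, 0, 60)); // { start: 12.5, end: 60 }
console.log(setSelectionEnd(5.5, 30)); // { start: 0, end: 5.5 }
```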
```diff
@@ -822,7 +812,9 @@
   const minValue = 0;
   let maxValue = 0;
   if (preprocessed.length > 0) {
-    const counts = preprocessed.map((v) => v.count).filter(c => isFinite(c));
+    const counts = preprocessed
+      .map((v) => v.count)
+      .filter((c) => isFinite(c));
     if (counts.length > 0) {
       // Use reduce instead of spread operator to avoid stack overflow
       maxValue = counts.reduce((max, current) => Math.max(max, current), 0);
```
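The `reduce` in this hunk is deliberate: spreading a large array into `Math.max(...counts)` passes one call argument per element, and past the engine's argument limit that throws `RangeError: Maximum call stack size exceeded`. A minimal sketch of the safe pattern, under the same assumption that non-finite counts are dropped first:

```typescript
// Finding a maximum without spreading the array into call arguments.
// reduce walks the array one element at a time and uses constant stack
// space, so it works regardless of array length.
function safeMax(counts: number[]): number {
  return counts
    .filter((c) => Number.isFinite(c)) // drop NaN/Infinity, as the diff does
    .reduce((max, current) => Math.max(max, current), 0);
}

console.log(safeMax([3, 1, 4, 1, 5])); // 5
console.log(safeMax([])); // 0 (the reduce initial value)
```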
```diff
@@ -832,13 +824,13 @@
   canvas.clearRect(0, 0, canvasWidth, canvasHeight);
   if (preprocessed.length > 0) {
     canvas.beginPath();
-    const x = ((preprocessed[0].ts) / duration) * canvasWidth;
+    const x = (preprocessed[0].ts / duration) * canvasWidth;
     const y =
       (1 - (preprocessed[0].count - minValue) / (maxValue - minValue)) *
       canvasHeight;
     canvas.moveTo(x, y);
     for (let i = 0; i < preprocessed.length; i++) {
-      const x = ((preprocessed[i].ts) / duration) * canvasWidth;
+      const x = (preprocessed[i].ts / duration) * canvasWidth;
       const y =
         (1 - (preprocessed[i].count - minValue) / (maxValue - minValue)) *
         canvasHeight;
```
```diff
@@ -1,6 +1,6 @@
 <script lang="ts">
   import { X } from "lucide-svelte";
-  import { parseSubtitleStyle, type SubtitleStyle } from "./interface";
+  import { parseSubtitleStyle, type SubtitleStyle } from "../interface";

   export let show = false;
   export let onClose: () => void;
```
```diff
@@ -183,8 +183,11 @@
       <h3 class="text-sm font-medium text-gray-300">对齐和边距</h3>
       <div class="grid grid-cols-2 gap-4">
         <div class="space-y-2">
-          <label class="block text-sm text-gray-400">对齐方式</label>
+          <label for="alignment-select" class="block text-sm text-gray-400"
+            >对齐方式</label
+          >
           <select
+            id="alignment-select"
             bind:value={style.alignment}
             class="w-full px-3 py-2 bg-[#2c2c2e] text-white rounded-lg
             border border-gray-800/50 focus:border-[#0A84FF]
```
```diff
@@ -198,8 +201,11 @@
           </select>
         </div>
         <div class="space-y-2">
-          <label class="block text-sm text-gray-400">垂直边距</label>
+          <label for="margin-v-input" class="block text-sm text-gray-400"
+            >垂直边距</label
+          >
           <input
+            id="margin-v-input"
             type="number"
             bind:value={style.marginV}
             class="w-full px-3 py-2 bg-[#2c2c2e] text-white rounded-lg
```
```diff
@@ -210,8 +216,11 @@
         </div>
         <div class="grid grid-cols-2 gap-4">
           <div class="space-y-2">
-            <label class="block text-sm text-gray-400">左边距</label>
+            <label for="margin-l-input" class="block text-sm text-gray-400"
+              >左边距</label
+            >
             <input
+              id="margin-l-input"
               type="number"
               bind:value={style.marginL}
               class="w-full px-3 py-2 bg-[#2c2c2e] text-white rounded-lg
```
```diff
@@ -220,8 +229,11 @@
           />
         </div>
         <div class="space-y-2">
-          <label class="block text-sm text-gray-400">右边距</label>
+          <label for="margin-r-input" class="block text-sm text-gray-400"
+            >右边距</label
+          >
           <input
+            id="margin-r-input"
             type="number"
             bind:value={style.marginR}
             class="w-full px-3 py-2 bg-[#2c2c2e] text-white rounded-lg
```
Some files were not shown because too many files have changed in this diff.