Compare commits

...

120 Commits

Author SHA1 Message Date
Xinrea
a5a7a8afaf fix: douyin danmu 2025-06-20 00:50:37 +08:00
Xinrea
583ac13a37 chore: update dependencies 2025-06-19 23:35:11 +08:00
Xinrea
3e58972072 fix: recorder re-added when removing 2025-06-19 00:31:11 +08:00
Xinrea
f15aa27727 fix: release build with cuda cache error 2025-06-19 00:29:22 +08:00
Xinrea
2581014dbd fix: dockerfile 2025-06-19 00:18:16 +08:00
Xinrea
baaaa1b57e feat: randomly use multiple accounts (close #123) 2025-06-19 00:11:01 +08:00
Xinrea
160fbb3590 feat: collect frontend log to backend (close #122) 2025-06-18 22:49:07 +08:00
Xinrea
6f3253678c feat: readonly mode (#121)
* feat: readonly mode (close #112)

* feat: check free-space when clipping
2025-06-18 01:09:58 +08:00
Xinrea
563ad66243 feat: danmu local offset settings (close #115) (#120) 2025-06-16 23:38:24 +08:00
Xinrea
a8d002cc53 fix: deadlock removing douyin-recorder 2025-06-13 00:29:29 +08:00
Xinrea
0615410fa4 fix: deadlock removing bili-recorder 2025-06-13 00:15:59 +08:00
Xinrea
fc98e065f8 fix: status-check interval not work for bilibili recorder 2025-06-12 23:35:24 +08:00
Xinrea
66f671ffa0 feat: douyin danmu (#119)
* feat: migrate to uniformed interface for danmu stream

* feat: douyin danmu support (close #113)

* chore: fix typo

* fix: loop-decompress body
2025-06-12 01:00:33 +08:00
Xinrea
69a35af456 fix: recorder adding back when removing (close #114)
Add a to_remove-set to filter recorders that are still in removing-stage,
so that monitor-thread wouldn't add them back.
2025-06-08 10:37:04 +08:00
Xinrea
e462bd0b4c feat: simplified room control mechanism (close #111) 2025-06-08 10:22:20 +08:00
Xinrea
ae6483427f bump version to 2.6.0 2025-06-03 21:53:14 +08:00
Xinrea
ad97677104 feat: configuration for status check interval 2025-05-30 00:32:30 +08:00
Xinrea
996d15ef25 chore: adjust ffmpeg log level & clean up code 2025-05-29 01:18:27 +08:00
Xinrea
06de32ffe7 bump version to 2.5.9 2025-05-27 15:13:45 +08:00
Xinrea
dd43074e46 fix: danmu api missing param 2025-05-27 15:13:22 +08:00
Xinrea
93495e13db bump version to 2.5.8 2025-05-27 01:49:02 +08:00
Xinrea
16950edae4 fix: danmu api wbi_sign required 2025-05-27 01:48:38 +08:00
Xinrea
4af1203360 fix: get logfile size on unix and windows 2025-05-24 18:04:51 +08:00
Xinrea
55b5bd1fd2 fix: metadata on windows 2025-05-24 17:00:09 +08:00
Xinrea
f0a7cf4ed0 bump version to 2.5.7 2025-05-24 15:55:02 +08:00
Xinrea
62e7412abf feat: new stream recording control 2025-05-24 15:54:06 +08:00
Xinrea
275bf647d2 feat: add fail count to avoid connection reject 2025-05-24 15:15:28 +08:00
Xinrea
00af723be9 refactor: stream update 2025-05-24 14:05:23 +08:00
Xinrea
19da577836 chore: add logs for recorder control handler 2025-05-24 12:57:24 +08:00
Xinrea
bf3a2b469b feat: implement log rotation on startup 2025-05-20 23:15:37 +08:00
Xinrea
bf31bfd099 feat: add links on release list 2025-05-20 22:26:47 +08:00
Xinrea
d02fea99f2 refactor: sidebar items 2025-05-20 22:17:49 +08:00
Xinrea
2404bacb4e fix: force switching to new stream when error (close #106) 2025-05-15 14:54:01 +08:00
Xinrea
b6c274c181 fix: adjust date-time-adding rule in manifest 2025-05-15 14:43:47 +08:00
Xinrea
f9b472aee7 fix: wrong ts comparison when clip (close #105) 2025-05-15 14:43:31 +08:00
Xinrea
45f277741b fix: entry timestamp to date str 2025-05-15 01:28:00 +08:00
Xinrea
94179f59cd bump version to 2.5.6 2025-05-15 01:07:37 +08:00
Xinrea
c7b550a3e3 fix: stuck when clipping douyin live 2025-05-15 01:07:36 +08:00
Xinrea
fd51fd2387 chore: adjust ffmpeg log level 2025-05-14 16:39:29 +08:00
Xinrea
23d1798ab6 fix: panic on recorder monitor thread 2025-05-14 16:37:47 +08:00
Xinrea
90e81d0d4d bump verstion to 2.5.5 2025-05-13 15:41:09 +08:00
Xinrea
6a7a19547d fix: invoke error message 2025-05-13 14:24:02 +08:00
Xinrea
1550849ee2 fix: ffmpeg path on windows 2025-05-13 13:37:03 +08:00
Xinrea
15116e2197 chore: add debug log for all ffmpeg task 2025-05-13 12:05:04 +08:00
Xinrea
63eda5179b bump version to 2.5.4 2025-05-08 21:06:34 +08:00
Xinrea
d7b1277363 feat: add douyin cookie documentation 2025-05-08 21:05:20 +08:00
Xinrea
337c933b92 chore: add logs for add recorder 2025-05-08 20:45:17 +08:00
Xinrea
b01b2cc9c0 fix: only provide date-time when discontinuity for douyin stream 2025-05-08 20:42:17 +08:00
Xinrea
30069b2f33 bump version to 2.5.3 2025-05-08 01:07:56 +08:00
Xinrea
c5bd57468c fix: failed to encode subtitle after manual-add 2025-05-08 01:06:52 +08:00
Xinrea
c050c65675 fix: danmu encode offset 2025-05-08 00:50:16 +08:00
Xinrea
e1bd7e7563 fix: task progress not updated 2025-05-08 00:32:21 +08:00
Xinrea
cc129f6384 fix: configuration not saved 2025-05-07 23:17:00 +08:00
Xinrea
e7ea0c0ff0 fix: preview on range select 2025-05-07 22:17:07 +08:00
Xinrea
9630d51c4c fix: provide date time on every segment 2025-05-07 22:17:07 +08:00
Xinrea
ceb140a4c2 Update src-tauri/src/recorder/entry.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-05-07 22:17:07 +08:00
Xinrea
fe8410ab98 Update src-tauri/src/recorder/bilibili.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-05-07 22:17:07 +08:00
Xinrea
00731cda93 refactor: manifest handled by entry store 2025-05-07 22:17:07 +08:00
Xinrea
c05979cb11 fix: panic when config path not exists 2025-05-06 20:03:04 +08:00
Xinrea
6e1a10e45c bump version to 2.5.1 2025-05-04 09:56:36 +08:00
Xinrea
bd74dfdb26 fix: preview stall when replacing expired stream (close #100) 2025-05-04 09:56:02 +08:00
Xinrea
b7c2fd3387 fix(web): cannot preview stream in default setting (close #101) 2025-05-04 09:29:36 +08:00
Xinrea
b65e41ca23 fix(web): delete archive 2025-05-04 09:07:00 +08:00
Xinrea
ec70eded14 docs: update 2025-05-03 23:41:51 +08:00
Xinrea
dcf9047d82 ci/cd: build package on tags push 2025-05-03 22:12:15 +08:00
Xinrea
cd85e9f65a feat: update felgens 2025-05-03 19:19:41 +08:00
Xinrea
066fd4fb77 bump version to 2.5.0 2025-05-03 18:33:41 +08:00
Xinrea
9a6bb30e73 fix(bilibili): not fetching real room for danmu 2025-05-03 18:25:28 +08:00
Xinrea
99d9f27618 fix: stream preview in tauri mode 2025-05-03 17:56:59 +08:00
Xinrea
02ddac6b17 docs: update 2025-05-02 17:30:10 +08:00
Xinrea
017438ee50 fix(http_server): using event_id from task creator 2025-05-02 11:16:55 +08:00
Xinrea
d938982107 feat: add clip download button in web mode 2025-05-02 10:44:06 +08:00
Xinrea
bdde1969f7 docs: fix header image 2025-05-02 10:43:48 +08:00
Xinrea
c8eb038190 fix(bilibili): force update for invalid index 2025-05-01 20:29:44 +08:00
Xinrea
2d90b79f73 fix(http_server): body size limit 2025-05-01 20:29:22 +08:00
Xinrea
f39d3baff5 fix(ffmpeg): add font in runtime for danmu encoding 2025-05-01 20:27:17 +08:00
Xinrea
84664ee272 fix(ffmpeg): auto create output folder 2025-05-01 16:49:17 +08:00
Xinrea
d603216baf fix(ci/cd): runtime env PATH 2025-05-01 16:13:10 +08:00
Xinrea
522873c7fb fix(utils): disk info on linux 2025-05-01 15:57:50 +08:00
Xinrea
a6548f9941 fix(ci/cd): runtime certificates 2025-05-01 15:48:49 +08:00
Xinrea
3843dd88b2 fix(utils): disk info from relative cache path 2025-05-01 15:34:04 +08:00
Xinrea
baddb4e9d4 feat: minimal runtime 2025-05-01 03:02:45 +08:00
Xinrea
4aa51b51bd feat: avoid tauri in headless 2025-05-01 01:51:00 +08:00
Xinrea
725494db7d feat: adjust dependencies 2025-05-01 01:24:07 +08:00
Xinrea
292caa4158 fix(utils): cache folder info on linux 2025-05-01 00:39:39 +08:00
Xinrea
29e9656919 fix(http_server): sse message contains invalid characters 2025-04-30 22:03:06 +08:00
Xinrea
78f4682efb fix(bilibili): invalid stream that needs redirect 2025-04-30 22:00:17 +08:00
Xinrea
fa090b0b66 fix: danmu offset 2025-04-30 21:46:29 +08:00
Xinrea
32b7e9c3c2 fix: file export in web 2025-04-30 21:33:09 +08:00
Xinrea
4d3e069a81 ci/cd: static ffmpeg build in runtime 2025-04-30 18:29:33 +08:00
Xinrea
3ed658a31c ci/cd: remove check 2025-04-30 17:23:20 +08:00
Xinrea
efb24798c8 ci/cd: minimal runtime 2025-04-30 17:22:53 +08:00
Xinrea
e72e9027ef feat: download ffmpeg instead installing in runtime 2025-04-30 16:21:14 +08:00
Xinrea
17c93fb716 fix(ci/cd): pages build 2025-04-30 11:26:45 +08:00
Xinrea
a826666ad6 docs: add pages 2025-04-30 11:24:44 +08:00
Xinrea
c8282cb66f fix(ci/cd): add dependencies 2025-04-30 10:33:13 +08:00
Xinrea
592fd3940e ci/cd: add auto clippy check 2025-04-30 10:27:04 +08:00
Xinrea
7e9980b098 refactor: clean up code 2025-04-30 10:22:11 +08:00
Xinrea
283ee06034 fix: url open in web 2025-04-30 09:35:06 +08:00
Xinrea
9a00693bb3 ci/cd: ffmpeg in runtime 2025-04-30 03:42:19 +08:00
Xinrea
16906a46cd doc: update README 2025-04-30 03:36:55 +08:00
Xinrea
bdf017024a fix: runtime dependencies 2025-04-30 03:02:08 +08:00
Xinrea
58ae1ef426 fix: dockerfile 2025-04-30 02:51:07 +08:00
Xinrea
98e6544c25 fix: ci/cd workflow 2025-04-30 02:33:43 +08:00
Xinrea
1b57beeea6 ci/cd: docker package build 2025-04-30 02:30:52 +08:00
Xinrea
1625a5f889 fix: using static shaka-player lib 2025-04-30 02:30:52 +08:00
Xinrea
ae20e7fad7 fix: provide codecs master manifest 2025-04-30 02:30:52 +08:00
Xinrea
fc594b12e0 Revert "feat: switch bilibili stream to TS for compatibility"
This reverts commit 7a22637a7a.
2025-04-30 02:30:52 +08:00
Xinrea
0d25f32101 feat: event listen 2025-04-30 02:30:52 +08:00
Xinrea
cfd4522036 fix: panic when cancel none-existed event 2025-04-30 02:30:52 +08:00
Xinrea
f638d4aee0 fix: hls from endpoint 2025-04-30 02:30:52 +08:00
Xinrea
b237b78300 fix: danmu offset in web 2025-04-30 02:30:52 +08:00
Xinrea
ed2983c073 feat: settings i web env 2025-04-30 02:30:52 +08:00
Xinrea
730227ac45 fix: hls in tauri 2025-04-30 02:30:52 +08:00
Xinrea
7fb4f41f01 feat: switch bilibili stream to TS for compatibility 2025-04-30 02:30:52 +08:00
Xinrea
d92e013413 fix: ranged hls content 2025-04-30 02:30:52 +08:00
Xinrea
980fd145d0 feat: hls server 2025-04-30 02:30:52 +08:00
Xinrea
693734e12a fix: backend fetch 2025-04-30 02:30:52 +08:00
Xinrea
cbeae9b40d fix: handlers in tauri mode 2025-04-30 02:30:52 +08:00
Xinrea
4d0cc2c3b6 refactor: add extra layer for invoke 2025-04-27 23:02:03 +08:00
107 changed files with 23894 additions and 3267 deletions
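Commit 69a35af456 above ("fix: recorder adding back when removing") describes the approach: removal first registers the recorder in a `to_remove` set so the monitor thread skips it until teardown finishes. A minimal sketch of that idea, with hypothetical names rather than the project's actual recorder manager:

```rust
use std::collections::HashSet;
use std::sync::Arc;
use tokio::sync::RwLock;

#[derive(Default, Clone)]
struct RecorderManager {
    // room ids whose recorders are currently being torn down
    to_remove: Arc<RwLock<HashSet<u64>>>,
}

impl RecorderManager {
    async fn remove_recorder(&self, room_id: u64) {
        // Mark the room first, so a concurrent monitor tick cannot re-add it
        // while the recorder is being stopped and deleted.
        self.to_remove.write().await.insert(room_id);
        // ... stop the recording task and delete the recorder here ...
        self.to_remove.write().await.remove(&room_id);
    }

    async fn monitor_tick(&self, known_rooms: &[u64]) {
        for &room_id in known_rooms {
            if self.to_remove.read().await.contains(&room_id) {
                continue; // still in the removing stage; do not add it back
            }
            // ... ensure a recorder exists for room_id ...
        }
    }
}
```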

39
.dockerignore Normal file
View File

@@ -0,0 +1,39 @@
# Dependencies
node_modules
.pnpm-store
.npm
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
# Build outputs
dist
build
target
*.log
# Version control
.git
.gitignore
# IDE and editor files
.idea
.vscode
*.swp
*.swo
.DS_Store
# Environment files
.env
.env.local
.env.*.local
# Debug files
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Tauri specific
src-tauri/target
src-tauri/dist

View File

@@ -55,13 +55,9 @@ jobs:
# Those targets are only used on macos runners so it's in an `if` to slightly speed up windows and linux builds.
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- uses: Swatinem/rust-cache@v2
with:
workspaces: "./src-tauri -> target"
- name: Install CUDA toolkit (Windows CUDA only)
if: matrix.platform == 'windows-latest' && matrix.features == 'cuda'
uses: Jimver/cuda-toolkit@master
uses: Jimver/cuda-toolkit@v0.2.24
- name: Rust cache
uses: swatinem/rust-cache@v2

51
.github/workflows/package.yml vendored Normal file
View File

@@ -0,0 +1,51 @@
name: Docker Build and Push
on:
workflow_dispatch:
push:
tags:
- "v*"
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}
jobs:
build-and-push:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Log in to the Container registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
type=sha,format=long
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=raw,value=latest,enable={{is_default_branch}}
- name: Build and push Docker image
uses: docker/build-push-action@v5
with:
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}

66
.github/workflows/pages.yml vendored Normal file
View File

@@ -0,0 +1,66 @@
name: Deploy VitePress site to Pages
on:
# Runs on pushes targeting the `main` branch. Change this to `master` if you're
# using the `master` branch as the default branch.
push:
branches: [main]
paths:
- docs/**
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
contents: read
pages: write
id-token: write
# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
group: pages
cancel-in-progress: false
jobs:
# Build job
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0 # Not needed if lastUpdated is not enabled
# - uses: pnpm/action-setup@v3 # Uncomment this block if you're using pnpm
# with:
# version: 9 # Not needed if you've set "packageManager" in package.json
# - uses: oven-sh/setup-bun@v1 # Uncomment this if you're using Bun
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: 22
cache: npm # or pnpm / yarn
- name: Setup Pages
uses: actions/configure-pages@v4
- name: Install dependencies
run: yarn install # or pnpm install / yarn install / bun install
- name: Build with VitePress
run: yarn run docs:build # or pnpm docs:build / yarn docs:build / bun run docs:build
- name: Upload artifact
uses: actions/upload-pages-artifact@v3
with:
path: docs/.vitepress/dist
# Deployment job
deploy:
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
needs: build
runs-on: ubuntu-latest
name: Deploy
steps:
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v4

10
.gitignore vendored
View File

@@ -27,4 +27,12 @@ dist-ssr
src-tauri/*.exe
# test files
src-tauri/tests/audio/*.srt
src-tauri/tests/audio/*.srt
.env
docs/.vitepress/cache
docs/.vitepress/dist
*.debug.js
*.debug.map

View File

@@ -1,6 +1,7 @@
[[language]]
name = "rust"
auto-format = true
rulers = []
[[language]]
name = "svelte"

86
Dockerfile Normal file
View File

@@ -0,0 +1,86 @@
# Build frontend
FROM node:20-bullseye AS frontend-builder
WORKDIR /app
# Install system dependencies
RUN apt-get update && apt-get install -y \
python3 \
make \
g++ \
&& rm -rf /var/lib/apt/lists/*
# Copy package files
COPY package.json yarn.lock ./
# Install dependencies with specific flags
RUN yarn install --frozen-lockfile
# Copy source files
COPY . .
# Build frontend
RUN yarn build
# Build Rust backend
FROM rust:1.86-slim AS rust-builder
WORKDIR /app
# Install required system dependencies
RUN apt-get update && apt-get install -y \
cmake \
pkg-config \
libssl-dev \
glib-2.0-dev \
libclang-dev \
g++ \
wget \
xz-utils \
&& rm -rf /var/lib/apt/lists/*
# Copy Rust project files
COPY src-tauri/Cargo.toml src-tauri/Cargo.lock ./src-tauri/
COPY src-tauri/src ./src-tauri/src
COPY src-tauri/crates ./src-tauri/crates
# Build Rust backend
WORKDIR /app/src-tauri
RUN rustup component add rustfmt
RUN cargo build --no-default-features --features headless --release
# Download and install FFmpeg static build
RUN wget https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz \
&& tar xf ffmpeg-release-amd64-static.tar.xz \
&& mv ffmpeg-*-static/ffmpeg ./ \
&& mv ffmpeg-*-static/ffprobe ./ \
&& rm -rf ffmpeg-*-static ffmpeg-release-amd64-static.tar.xz
# Final stage
FROM debian:bookworm-slim AS final
WORKDIR /app
# Install runtime dependencies, SSL certificates and Chinese fonts
RUN apt-get update && apt-get install -y \
libssl3 \
ca-certificates \
fonts-wqy-microhei \
&& update-ca-certificates \
&& rm -rf /var/lib/apt/lists/*
# Add /app to PATH
ENV PATH="/app:${PATH}"
# Copy built frontend
COPY --from=frontend-builder /app/dist ./dist
# Copy built Rust binary
COPY --from=rust-builder /app/src-tauri/target/release/bili-shadowreplay .
COPY --from=rust-builder /app/src-tauri/ffmpeg ./ffmpeg
COPY --from=rust-builder /app/src-tauri/ffprobe ./ffprobe
# Expose port
EXPOSE 3000
# Run the application
CMD ["./bili-shadowreplay"]

View File

@@ -1,72 +1,27 @@
# BiliBili ShadowReplay
![icon](docs/header.png)
![icon](docs/public/images/header.png)
![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/xinrea/bili-shadowreplay/main.yml?label=Application%20Build)
![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/Xinrea/bili-shadowreplay/package.yml?label=Docker%20Build)
![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/xinrea/bili-shadowreplay/main.yml)
![GitHub Release](https://img.shields.io/github/v/release/xinrea/bili-shadowreplay)
![GitHub Downloads (all assets, all releases)](https://img.shields.io/github/downloads/xinrea/bili-shadowreplay/total)
> [!WARNING]
> v2.0.0 版本为重大更新,将不兼容 v1.x 版本的数据。
BiliBili ShadowReplay 是一个缓存直播并进行实时编辑投稿的工具。通过划定时间区间,并编辑简单的必需信息,即可完成直播切片以及投稿,将整个流程压缩到分钟级。同时,也支持对缓存的历史直播进行回放,以及相同的切片编辑投稿处理流程。
目前仅支持 B 站和抖音平台的直播。
![rooms](docs/summary.png)
![rooms](docs/public/images/summary.png)
## 安装和使用
前往网站查看说明:[BiliBili ShadowReplay](https://bsr.xinrea.cn/)
## 参与开发
[Contributing](.github/CONTRIBUTING.md)
## 总览
## 赞助
![rooms](docs/summary.png)
## 直播间管理
![clip](docs/rooms.png)
显示当前缓存的直播间列表,在添加前需要在账号页面添加至少一个账号(主账号)用于直播流以及用户信息的获取。
操作菜单包含打开直播流、查看历史记录以及删除等操作。其中历史记录以列表形式展示,可以进行回放以及删除。
![archives](docs/archives.png)
无论是正在进行的直播还是历史录播,都可在预览窗口进行回放,同时也可以进行切片编辑以及投稿。关于预览窗口的相关说明请见 [预览窗口](#预览窗口)。
## 账号管理
![accounts](docs/accounts.png)
程序需要至少一个账号用于直播流以及用户信息的获取,可以在此页面添加账号。
你可以添加多个账号,但只有一个账号会被标记为主账号,主账号用于直播流的获取。所有账号都可在切片投稿或是观看直播流发送弹幕时自由选择,详情见 [预览窗口](#预览窗口)。
抖音账号目前仅支持手动 Cookie 添加,且账号仅用于获取直播信息和直播流。
## 预览窗口
![livewindow](docs/livewindow.png)
预览窗口是一个多功能的窗口,可以用于观看直播流、回放历史录播、编辑切片、记录时间点以及投稿等操作。如果当前播放的是直播流,那么会有实时弹幕观看以及发送弹幕相关的选项。
通过预览窗口的快捷键操作,可以快速选择时间区间,进行切片生成以及投稿。
无论是弹幕发送还是投稿,均可自由选择账号,只要在账号管理中添加了该账号。
进度条上方会显示弹幕频率图,可以直观地看到弹幕的分布情况;右侧的弹幕统计过滤器可以用于过滤弹幕,只显示含有指定文字的弹幕的统计情况。
## 封面编辑
![cover](docs/coveredit.png)
在预览窗口中,生成切片后可以进行封面编辑,包括关键帧的选择、文字的添加和拖动等。
## 设置
![settings](docs/settings.png)
在设置页面可以进行一些基本的设置,包括缓存和切片的保存路径,以及相关事件是否显示通知等。
> [!WARNING]
> 缓存目录进行切换时,会有文件复制等操作,如果缓存量较大,可能会耗费较长时间;且在此期间预览功能会暂时失效,需要等待操作完成。
![donate](docs/public/images/donate.png)

43
docs/.vitepress/config.ts Normal file
View File

@@ -0,0 +1,43 @@
import { defineConfig } from "vitepress";
// https://vitepress.dev/reference/site-config
export default defineConfig({
title: "BiliBili ShadowReplay",
description: "直播录制/实时回放/剪辑/投稿工具",
themeConfig: {
// https://vitepress.dev/reference/default-theme-config
nav: [
{ text: "Home", link: "/" },
{
text: "Releases",
link: "https://github.com/Xinrea/bili-shadowreplay/releases",
},
],
sidebar: [
{
text: "开始使用",
items: [
{ text: "安装准备", link: "/getting-started/installation" },
{ text: "配置使用", link: "/getting-started/configuration" },
{ text: "FFmpeg 配置", link: "/getting-started/ffmpeg" },
],
},
{
text: "说明文档",
items: [
{ text: "功能说明", link: "/usage/features" },
{ text: "常见问题", link: "/usage/faq" },
],
},
{
text: "开发文档",
items: [{ text: "架构设计", link: "/develop/architecture" }],
},
],
socialLinks: [
{ icon: "github", link: "https://github.com/Xinrea/bili-shadowreplay" },
],
},
});

View File

@@ -0,0 +1 @@
# 架构设计

View File

@@ -0,0 +1,27 @@
# 配置使用
## 账号配置
要添加直播间,至少需要配置一个同平台的账号。在账号页面,你可以通过添加账号按钮添加一个账号。
- B 站账号:目前支持扫码登录和 Cookie 手动配置两种方式,推荐使用扫码登录
- 抖音账号:目前仅支持 Cookie 手动配置登录
### 抖音账号配置
首先确保已经登录抖音,然后打开[个人主页](https://www.douyin.com/user/self),右键单击网页,在菜单中选择 `检查(Inspect)`,打开开发者工具,切换到 `网络(Network)` 选项卡,然后刷新网页,此时能在列表中找到 `self` 请求(一般是列表中第一个),单击该请求,查看 `请求标头`,在 `请求标头` 中找到 `Cookie`,复制该字段的值,粘贴到配置页面的 `Cookie` 输入框中,要注意复制完全。
![DouyinCookie](/images/douyin_cookie.png)
## FFmpeg 配置
如果想要使用切片生成和压制功能,请确保 FFmpeg 已正确配置;除了 Windows 平台打包自带 FFmpeg 以外,其他平台需要手动安装 FFmpeg,请参考 [FFmpeg 配置](/getting-started/ffmpeg)。
## Whisper 模型配置
要使用 AI 字幕识别功能,需要在设置页面配置 Whisper 模型路径,模型文件可以从网络上下载,例如:
- [Whisper.cpp(国内镜像,内容较旧)](https://www.modelscope.cn/models/cjc1887415157/whisper.cpp/files)
- [Whisper.cpp](https://huggingface.co/ggerganov/whisper.cpp/tree/main)
可以根据自己的需求选择不同的模型,要注意带有 `en` 的模型是英文模型,其他模型为多语言模型。

View File

@@ -0,0 +1,47 @@
# FFmpeg 配置
FFmpeg 是一个开源的音视频处理工具,支持多种格式的音视频编解码、转码、剪辑、合并等操作。
在本项目中,FFmpeg 用于切片生成以及字幕和弹幕的硬编码处理,因此需要确保安装了 FFmpeg。
## MacOS
在 MacOS 上安装 FFmpeg 非常简单,可以使用 Homebrew 来安装:
```bash
brew install ffmpeg
```
如果没有安装 Homebrew,可以参考 [Homebrew 官网](https://brew.sh/) 进行安装。
## Linux
在 Linux 上安装 FFmpeg 可以使用系统自带的包管理器进行安装,例如:
- Ubuntu/Debian 系统:
```bash
sudo apt install ffmpeg
```
- Fedora 系统:
```bash
sudo dnf install ffmpeg
```
- Arch Linux 系统:
```bash
sudo pacman -S ffmpeg
```
- CentOS 系统:
```bash
sudo yum install epel-release
sudo yum install ffmpeg
```
## Windows
Windows 版本安装后,FFmpeg 已经放置在了程序目录下,因此不需要额外安装。

View File

@@ -0,0 +1,66 @@
# 安装准备
## 桌面端安装
桌面端目前提供了 Windows、Linux 和 MacOS 三个平台的安装包。
安装包分为两个版本:普通版和 debug 版;普通版适合大部分用户使用,debug 版包含了更多的调试信息,适合开发者使用;由于程序会对账号等敏感信息进行管理,请从信任的来源进行下载;所有版本均可在 [GitHub Releases](https://github.com/Xinrea/bili-shadowreplay/releases) 页面下载安装。
### Windows
由于程序内置 Whisper 字幕识别模型支持,Windows 版本分为两种:
- **普通版本**:内置了 Whisper GPU 加速,字幕识别较快,体积较大,只支持 Nvidia 显卡
- **CPU 版本**:使用 CPU 进行字幕识别推理,速度较慢
请根据自己的显卡情况选择合适的版本进行下载。
### Linux
Linux 版本目前仅支持使用 CPU 推理,且测试较少,可能存在一些问题,遇到问题请及时反馈。
### MacOS
MacOS 版本内置 Metal GPU 加速;安装后首次运行,会提示无法打开从网络下载的软件,请在设置-隐私与安全性下,选择仍然打开以允许程序运行。
## Docker 部署
BiliBili ShadowReplay 提供了服务端部署的能力,提供 Web 控制界面,可以用于在服务器等无图形界面环境下部署使用。
### 镜像获取
```bash
# 拉取最新版本
docker pull ghcr.io/xinrea/bili-shadowreplay:latest
# 拉取指定版本
docker pull ghcr.io/xinrea/bili-shadowreplay:2.5.0
# 速度太慢?从镜像源拉取
docker pull ghcr.nju.edu.cn/xinrea/bili-shadowreplay:latest
```
### 镜像使用
使用方法:
```bash
sudo docker run -it -d\
-p 3000:3000 \
-v $DATA_DIR:/app/data \
-v $CACHE_DIR:/app/cache \
-v $OUTPUT_DIR:/app/output \
-v $WHISPER_MODEL:/app/whisper_model.bin \
--name bili-shadowreplay \
ghcr.io/xinrea/bili-shadowreplay:latest
```
其中:
- `$DATA_DIR`:为数据目录,对应于桌面版的数据目录,
Windows 下位于 `C:\Users\{用户名}\AppData\Roaming\cn.vjoi.bilishadowreplay`;
MacOS 下位于 `/Users/{user}/Library/Application Support/cn.vjoi.bilishadowreplay`
- `$CACHE_DIR`:为缓存目录,对应于桌面版的缓存目录;
- `$OUTPUT_DIR`:为输出目录,对应于桌面版的输出目录;
- `$WHISPER_MODEL`:为 Whisper 模型文件路径,对应于桌面版的 Whisper 模型文件路径。

70
docs/index.md Normal file
View File

@@ -0,0 +1,70 @@
---
# https://vitepress.dev/reference/default-theme-home-page
layout: home
hero:
name: "BiliBili ShadowReplay"
tagline: "直播录制/实时回放/剪辑/投稿工具"
image:
src: /images/icon.png
alt: BiliBili ShadowReplay
actions:
- theme: brand
text: 开始使用
link: /getting-started/installation
- theme: alt
text: 说明文档
link: /usage/features
features:
- icon: 📹
title: 直播录制
details: 缓存直播流,直播结束自动生成整场录播
- icon: 📺
title: 实时回放
details: 实时回放当前直播,不错过任何内容
- icon: ✂️
title: 剪辑投稿
details: 剪辑切片,封面编辑,一键投稿
- icon: 📝
title: 字幕生成
details: 支持 Whisper 模型生成字幕,编辑与压制
- icon: 📄
title: 弹幕支持
details: 直播间弹幕压制到切片,并支持直播弹幕发送和导出
- icon: 🌐
title: 多直播平台支持
details: 目前支持 B 站和抖音直播
- icon: 🔍
title: 云端部署
details: 支持 Docker 部署,提供 Web 控制界面
- icon: 📦
title: 多平台支持
details: 桌面端支持 Windows/Linux/macOS
---
## 总览
![rooms](/images/summary.png)
## 直播间管理
![clip](/images/rooms.png)
![archives](/images/archives.png)
## 账号管理
![accounts](/images/accounts.png)
## 预览窗口
![livewindow](/images/livewindow.png)
## 封面编辑
![cover](/images/coveredit.png)
## 设置
![settings](/images/settings.png)

(Binary image diffs, filenames not listed in this view: nine existing documentation images moved/renamed with sizes unchanged — 555 KiB, 1.2 MiB, 2.9 MiB, 114 KiB, 18 KiB, 2.8 MiB, 1.9 MiB, 622 KiB, 721 KiB — and two new images added at 474 KiB and 548 KiB.)

0
docs/usage/faq.md Normal file
View File

0
docs/usage/features.md Normal file
View File

View File

@@ -1,14 +1,13 @@
<!DOCTYPE html>
<html lang="zh-cn">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
</head>
<body>
<div id="app"></div>
<script type="module" src="/src/main.ts"></script>
</body>
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>BiliBili ShadowReplay</title>
</head>
<body>
<div id="app"></div>
<script type="module" src="/src/main.ts"></script>
</body>
</html>

View File

@@ -1,53 +1,65 @@
<!DOCTYPE html>
<html lang="zh-cn" class="dark">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<link rel="stylesheet" href="shaka-player/controls.min.css" />
<link rel="stylesheet" href="shaka-player/youtube-theme.css" />
<script src="shaka-player/shaka-player.ui.js"></script>
</head>
<body>
<div id="app"></div>
<script type="module" src="src/live_main.ts"></script>
<style>
input[type="range"]::-webkit-slider-thumb {
width: 12px; /* 设置滑块按钮宽度 */
height: 12px; /* 设置滑块按钮高度 */
border-radius: 50%; /* 设置为圆形 */
}
html {
scrollbar-face-color: #646464;
scrollbar-base-color: #646464;
scrollbar-3dlight-color: #646464;
scrollbar-highlight-color: #646464;
scrollbar-track-color: #000;
scrollbar-arrow-color: #000;
scrollbar-shadow-color: #646464;
}
::-webkit-scrollbar {
width: 8px;
height: 3px;
}
::-webkit-scrollbar-button {
background-color: #666;
}
::-webkit-scrollbar-track {
background-color: #646464;
}
::-webkit-scrollbar-track-piece {
background-color: #000;
}
::-webkit-scrollbar-thumb {
height: 50px;
background-color: #666;
border-radius: 3px;
}
::-webkit-scrollbar-corner {
background-color: #646464;
}
</style>
</body>
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<link rel="stylesheet" href="shaka-player/controls.min.css" />
<link rel="stylesheet" href="shaka-player/youtube-theme.css" />
<script src="shaka-player/shaka-player.ui.js"></script>
</head>
<body>
<div id="app"></div>
<script type="module" src="src/live_main.ts"></script>
<style>
input[type="range"]::-webkit-slider-thumb {
width: 12px;
/* 设置滑块按钮宽度 */
height: 12px;
/* 设置滑块按钮高度 */
border-radius: 50%;
/* 设置为圆形 */
}
html {
scrollbar-face-color: #646464;
scrollbar-base-color: #646464;
scrollbar-3dlight-color: #646464;
scrollbar-highlight-color: #646464;
scrollbar-track-color: #000;
scrollbar-arrow-color: #000;
scrollbar-shadow-color: #646464;
}
::-webkit-scrollbar {
width: 8px;
height: 3px;
}
::-webkit-scrollbar-button {
background-color: #666;
}
::-webkit-scrollbar-track {
background-color: #646464;
}
::-webkit-scrollbar-track-piece {
background-color: #000;
}
::-webkit-scrollbar-thumb {
height: 50px;
background-color: #666;
border-radius: 3px;
}
::-webkit-scrollbar-corner {
background-color: #646464;
}
</style>
</body>
</html>

View File

@@ -1,14 +1,17 @@
{
"name": "bili-shadowreplay",
"private": true,
"version": "2.4.3",
"version": "2.6.1",
"type": "module",
"scripts": {
"dev": "vite",
"build": "vite build",
"preview": "vite preview",
"check": "svelte-check --tsconfig ./tsconfig.json",
"tauri": "tauri"
"tauri": "tauri",
"docs:dev": "vitepress dev docs",
"docs:build": "vitepress build docs",
"docs:preview": "vitepress preview docs"
},
"dependencies": {
"@tauri-apps/api": "^2.4.1",
@@ -19,7 +22,6 @@
"@tauri-apps/plugin-os": "~2",
"@tauri-apps/plugin-shell": "~2",
"@tauri-apps/plugin-sql": "~2",
"html2canvas": "^1.4.1",
"lucide-svelte": "^0.479.0",
"qrcode": "^1.5.4"
},
@@ -41,6 +43,7 @@
"ts-node": "^10.9.1",
"tslib": "^2.4.1",
"typescript": "^4.6.4",
"vite": "^4.0.0"
"vite": "^4.0.0",
"vitepress": "^1.6.3"
}
}

View File

@@ -1,5 +1,9 @@
# Generated by Cargo
# will have compiled files and executables
/target/
cache
output
tmps
clips
data
config.toml

2400
src-tauri/Cargo.lock generated

File diff suppressed because it is too large.

View File

@@ -1,3 +1,7 @@
[workspace]
members = ["crates/danmu_stream"]
resolver = "2"
[package]
name = "bili-shadowreplay"
version = "1.0.0"
@@ -9,11 +13,8 @@ edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[build-dependencies]
tauri-build = { version = "2", features = [] }
[dependencies]
tauri = { version = "2", features = ["protocol-asset", "tray-icon"] }
danmu_stream = { path = "crates/danmu_stream" }
serde_json = "1.0"
reqwest = { version = "0.11", features = ["blocking", "json"] }
serde_derive = "1.0.158"
@@ -25,7 +26,6 @@ async-ffmpeg-sidecar = "0.0.1"
chrono = { version = "0.4.24", features = ["serde"] }
toml = "0.7.3"
custom_error = "1.9.2"
felgens = { git = "https://github.com/Xinrea/felgens.git", tag = "v0.4.2" }
regex = "1.7.3"
tokio = { version = "1.27.0", features = ["process"] }
platform-dirs = "0.3.0"
@@ -37,31 +37,93 @@ urlencoding = "2.1.3"
log = "0.4.22"
simplelog = "0.12.2"
sqlx = { version = "0.8", features = ["runtime-tokio", "sqlite"] }
tauri-plugin-dialog = "2"
tauri-plugin-shell = "2"
tauri-plugin-fs = "2"
tauri-plugin-http = "2"
tauri-utils = "2"
tauri-plugin-sql = { version = "2", features = ["sqlite"] }
tauri-plugin-os = "2"
tauri-plugin-notification = "2"
rand = "0.8.5"
base64 = "0.21"
mime_guess = "2.0"
async-trait = "0.1.87"
whisper-rs = "0.14.2"
hound = "3.5.1"
fix-path-env = { git = "https://github.com/tauri-apps/fix-path-env-rs" }
uuid = { version = "1.4", features = ["v4"] }
axum = { version = "0.7", features = ["macros"] }
tower-http = { version = "0.5", features = ["cors", "fs"] }
futures-core = "0.3"
futures = "0.3"
tokio-util = { version = "0.7", features = ["io"] }
clap = { version = "4.5.37", features = ["derive"] }
url = "2.5.4"
[features]
# this feature is used for production builds or when `devPath` points to the filesystem
# DO NOT REMOVE!!
custom-protocol = ["tauri/custom-protocol"]
cuda = ["whisper-rs/cuda"]
headless = []
default = ["gui"]
gui = [
"tauri",
"tauri-plugin-single-instance",
"tauri-plugin-dialog",
"tauri-plugin-shell",
"tauri-plugin-fs",
"tauri-plugin-http",
"tauri-plugin-sql",
"tauri-utils",
"tauri-plugin-os",
"tauri-plugin-notification",
"fix-path-env",
"tauri-build",
]
[target.'cfg(not(any(target_os = "android", target_os = "ios")))'.dependencies]
tauri-plugin-single-instance = "2"
[dependencies.tauri]
version = "2"
features = ["protocol-asset", "tray-icon"]
optional = true
[dependencies.tauri-plugin-single-instance]
version = "2"
optional = true
[dependencies.tauri-plugin-dialog]
version = "2"
optional = true
[dependencies.tauri-plugin-shell]
version = "2"
optional = true
[dependencies.tauri-plugin-fs]
version = "2"
optional = true
[dependencies.tauri-plugin-http]
version = "2"
optional = true
[dependencies.tauri-plugin-sql]
version = "2"
optional = true
features = ["sqlite"]
[dependencies.tauri-utils]
version = "2"
optional = true
[dependencies.tauri-plugin-os]
version = "2"
optional = true
[dependencies.tauri-plugin-notification]
version = "2"
optional = true
[dependencies.fix-path-env]
git = "https://github.com/tauri-apps/fix-path-env-rs"
optional = true
[build-dependencies.tauri-build]
version = "2"
features = []
optional = true
[target.'cfg(windows)'.dependencies]
whisper-rs = { version = "0.14.2", default-features = false }
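With `tauri` and its plugins made optional and grouped under the `gui` feature (and a separate `headless` feature for the Docker build), GUI-only code can be compiled out entirely. A minimal, hypothetical sketch of how an entry point can be gated this way — not the project's actual `main.rs`:

```rust
#[cfg(feature = "gui")]
fn run() {
    // Default build: hand control to the tauri application
    // (the real builder/setup code would live here).
    println!("starting GUI (tauri) build");
}

#[cfg(not(feature = "gui"))]
fn run() {
    // `cargo build --no-default-features --features headless`:
    // start the HTTP server and serve the web UI instead of a window.
    println!("starting headless build");
}

fn main() {
    run();
}
```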

View File

@@ -1,3 +1,4 @@
fn main() {
#[cfg(feature = "gui")]
tauri_build::build()
}

View File

@@ -0,0 +1,14 @@
cache = "./cache"
output = "./output"
live_start_notify = true
live_end_notify = true
clip_notify = true
post_notify = true
auto_subtitle = false
whisper_model = "./whisper_model.bin"
whisper_prompt = "这是一段中文 你们好"
clip_name_format = "[{room_id}][{live_id}][{title}][{created_at}].mp4"
[auto_generate]
enabled = false
encode_danmu = false
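The configuration file shown above maps one-to-one onto a serde struct. A sketch of how it could be loaded with the `toml` crate (field names assumed to mirror the keys above; the project's real config type may differ):

```rust
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct AutoGenerate {
    enabled: bool,
    encode_danmu: bool,
}

#[derive(Debug, Deserialize)]
struct Config {
    cache: String,
    output: String,
    live_start_notify: bool,
    live_end_notify: bool,
    clip_notify: bool,
    post_notify: bool,
    auto_subtitle: bool,
    whisper_model: String,
    whisper_prompt: String,
    clip_name_format: String,
    auto_generate: AutoGenerate,
}

fn load_config(path: &str) -> Result<Config, Box<dyn std::error::Error>> {
    // Read the TOML file and deserialize it into the Config struct.
    let text = std::fs::read_to_string(path)?;
    Ok(toml::from_str(&text)?)
}
```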

View File

@@ -0,0 +1,43 @@
[package]
name = "danmu_stream"
version = "0.1.0"
edition = "2021"
[lib]
name = "danmu_stream"
path = "src/lib.rs"
[[example]]
name = "douyin"
path = "examples/douyin.rs"
[dependencies]
tokio = { version = "1.0", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
futures-util = "0.3"
prost = "0.12"
chrono = "0.4"
log = "0.4"
env_logger = "0.10"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
reqwest = { version = "0.11", features = ["json"] }
url = "2.4"
md5 = "0.7"
regex = "1.9"
deno_core = "0.242.0"
pct-str = "2.0.0"
custom_error = "1.9.2"
flate2 = "1.0"
scroll = "0.13.0"
scroll_derive = "0.13.0"
brotli = "8.0.1"
http = "1.0"
rand = "0.9.1"
urlencoding = "2.1.3"
gzip = "0.1.2"
hex = "0.4.3"
async-trait = "0.1.88"
[build-dependencies]
tonic-build = "0.10"

View File

View File

@@ -0,0 +1,40 @@
use std::{sync::Arc, time::Duration};
use danmu_stream::{danmu_stream::DanmuStream, provider::ProviderType, DanmuMessageType};
use tokio::time::sleep;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// Initialize logging
env_logger::init();
// Replace these with actual values
let room_id = 7514298567821937427; // Replace with actual Douyin room_id. When live starts, the room_id will be generated, so it's more like a live_id.
let cookie = "your_cookie";
let stream = Arc::new(DanmuStream::new(ProviderType::Douyin, cookie, room_id).await?);
log::info!("Start to receive danmu messages");
let _ = stream.start().await;
let stream_clone = stream.clone();
tokio::spawn(async move {
loop {
if let Ok(Some(msg)) = stream_clone.recv().await {
match msg {
DanmuMessageType::DanmuMessage(danmu) => {
log::info!("Received danmu message: {:?}", danmu.message);
}
}
} else {
log::info!("Channel closed");
break;
}
}
});
sleep(Duration::from_secs(10)).await;
stream.stop().await?;
Ok(())
}

View File

@@ -0,0 +1,51 @@
use std::sync::Arc;
use crate::{
provider::{new, DanmuProvider, ProviderType},
DanmuMessageType, DanmuStreamError,
};
use tokio::sync::{mpsc, RwLock};
#[derive(Clone)]
pub struct DanmuStream {
pub provider_type: ProviderType,
pub identifier: String,
pub room_id: u64,
pub provider: Arc<RwLock<Box<dyn DanmuProvider>>>,
tx: mpsc::UnboundedSender<DanmuMessageType>,
rx: Arc<RwLock<mpsc::UnboundedReceiver<DanmuMessageType>>>,
}
impl DanmuStream {
pub async fn new(
provider_type: ProviderType,
identifier: &str,
room_id: u64,
) -> Result<Self, DanmuStreamError> {
let (tx, rx) = mpsc::unbounded_channel();
let provider = new(provider_type, identifier, room_id).await?;
Ok(Self {
provider_type,
identifier: identifier.to_string(),
room_id,
provider: Arc::new(RwLock::new(provider)),
tx,
rx: Arc::new(RwLock::new(rx)),
})
}
pub async fn start(&self) -> Result<(), DanmuStreamError> {
self.provider.write().await.start(self.tx.clone()).await
}
pub async fn stop(&self) -> Result<(), DanmuStreamError> {
self.provider.write().await.stop().await?;
// close channel
self.rx.write().await.close();
Ok(())
}
pub async fn recv(&self) -> Result<Option<DanmuMessageType>, DanmuStreamError> {
Ok(self.rx.write().await.recv().await)
}
}

View File

@@ -0,0 +1,51 @@
use std::time::Duration;
use crate::DanmuStreamError;
use reqwest::header::HeaderMap;
impl From<reqwest::Error> for DanmuStreamError {
fn from(value: reqwest::Error) -> Self {
Self::HttpError { err: value }
}
}
impl From<url::ParseError> for DanmuStreamError {
fn from(value: url::ParseError) -> Self {
Self::ParseError { err: value }
}
}
pub struct ApiClient {
client: reqwest::Client,
header: HeaderMap,
}
impl ApiClient {
pub fn new(cookies: &str) -> Self {
let mut header = HeaderMap::new();
header.insert("cookie", cookies.parse().unwrap());
Self {
client: reqwest::Client::new(),
header,
}
}
pub async fn get(
&self,
url: &str,
query: Option<&[(&str, &str)]>,
) -> Result<reqwest::Response, DanmuStreamError> {
let resp = self
.client
.get(url)
.query(query.unwrap_or_default())
.headers(self.header.clone())
.timeout(Duration::from_secs(10))
.send()
.await?
.error_for_status()?;
Ok(resp)
}
}
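A short usage sketch of the client above (endpoint and query values are placeholders for illustration; error handling relies on the `From` conversions defined at the top of the file):

```rust
// Assumes ApiClient from above is in scope.
async fn fetch_room_init(
    cookie: &str,
    room_id: &str,
) -> Result<serde_json::Value, DanmuStreamError> {
    let client = ApiClient::new(cookie);
    let resp = client
        .get(
            "https://api.live.bilibili.com/room/v1/Room/room_init",
            Some(&[("id", room_id)]),
        )
        .await?;
    // Both the request and the JSON decode errors convert into DanmuStreamError.
    Ok(resp.json::<serde_json::Value>().await?)
}
```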

View File

@@ -0,0 +1,30 @@
pub mod danmu_stream;
mod http_client;
pub mod provider;
use custom_error::custom_error;
custom_error! {pub DanmuStreamError
HttpError {err: reqwest::Error} = "HttpError {err}",
ParseError {err: url::ParseError} = "ParseError {err}",
WebsocketError {err: String } = "WebsocketError {err}",
PackError {err: String} = "PackError {err}",
UnsupportProto {proto: u16} = "UnsupportProto {proto}",
MessageParseError {err: String} = "MessageParseError {err}",
InvalidIdentifier {err: String} = "InvalidIdentifier {err}"
}
pub enum DanmuMessageType {
DanmuMessage(DanmuMessage),
}
#[derive(Debug, Clone)]
pub struct DanmuMessage {
pub room_id: u64,
pub user_id: u64,
pub user_name: String,
pub message: String,
pub color: u32,
/// timestamp in milliseconds
pub timestamp: i64,
}
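A small consumer-side sketch (hypothetical code, not part of the crate) of how the public types above are used: callers match on `DanmuMessageType`, and errors can be printed directly because `custom_error!` generates a `Display` impl:

```rust
use danmu_stream::{DanmuMessage, DanmuMessageType, DanmuStreamError};

fn describe(msg: DanmuMessageType) -> String {
    match msg {
        DanmuMessageType::DanmuMessage(DanmuMessage {
            user_name,
            message,
            timestamp,
            ..
        }) => format!("[{timestamp}] {user_name}: {message}"),
    }
}

fn report(err: DanmuStreamError) {
    // Variants format as declared in the custom_error! block above.
    eprintln!("danmu stream failed: {err}");
}
```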

View File

@@ -0,0 +1,72 @@
mod bilibili;
mod douyin;
use async_trait::async_trait;
use tokio::sync::mpsc;
use crate::{
provider::bilibili::BiliDanmu, provider::douyin::DouyinDanmu, DanmuMessageType,
DanmuStreamError,
};
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum ProviderType {
BiliBili,
Douyin,
}
#[async_trait]
pub trait DanmuProvider: Send + Sync {
async fn new(identifier: &str, room_id: u64) -> Result<Self, DanmuStreamError>
where
Self: Sized;
async fn start(
&self,
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError>;
async fn stop(&self) -> Result<(), DanmuStreamError>;
}
/// Creates a new danmu stream provider for the specified platform.
///
/// This function constructs a danmu provider for the chosen platform. Call
/// `start` on the returned provider with an unbounded sender channel to begin
/// receiving danmu messages; `start` only returns after disconnect.
///
/// # Arguments
///
/// * `provider_type` - The type of platform to fetch danmu from (BiliBili or Douyin)
/// * `identifier` - User validation information (e.g., cookies) required by the platform
/// * `room_id` - The unique identifier of the room/channel to fetch danmu from. Note that a douyin room_id is more like a live_id: it changes every time the live starts.
///
/// # Returns
///
/// Returns `Result<Box<dyn DanmuProvider>, DanmuStreamError>` where:
/// * `Ok(provider)` is an initialized provider, ready to be started
/// * `Err(DanmuStreamError)` indicates an error occurred during initialization
///
/// # Examples
///
/// ```rust
/// use tokio::sync::mpsc;
/// let (tx, mut rx) = mpsc::unbounded_channel();
/// let provider = new(ProviderType::BiliBili, "your_cookie", 123456).await?;
/// provider.start(tx).await?;
/// ```
pub async fn new(
provider_type: ProviderType,
identifier: &str,
room_id: u64,
) -> Result<Box<dyn DanmuProvider>, DanmuStreamError> {
match provider_type {
ProviderType::BiliBili => {
let bili = BiliDanmu::new(identifier, room_id).await?;
Ok(Box::new(bili))
}
ProviderType::Douyin => {
let douyin = DouyinDanmu::new(identifier, room_id).await?;
Ok(Box::new(douyin))
}
}
}
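The factory above is the only place that names concrete provider types, so supporting a new platform means implementing `DanmuProvider` and extending the `match`. A hedged sketch of a minimal provider — a hypothetical example, not part of the crate:

```rust
// Hypothetical provider used only to illustrate the trait surface.
struct DummyDanmu {
    room_id: u64,
}

#[async_trait]
impl DanmuProvider for DummyDanmu {
    async fn new(_identifier: &str, room_id: u64) -> Result<Self, DanmuStreamError> {
        Ok(Self { room_id })
    }

    async fn start(
        &self,
        tx: mpsc::UnboundedSender<DanmuMessageType>,
    ) -> Result<(), DanmuStreamError> {
        // A real provider connects to the platform and keeps forwarding
        // messages until disconnect; this one emits a single fake message.
        let _ = tx.send(DanmuMessageType::DanmuMessage(crate::DanmuMessage {
            room_id: self.room_id,
            user_id: 0,
            user_name: "dummy".to_string(),
            message: "hello".to_string(),
            color: 0,
            timestamp: 0,
        }));
        Ok(())
    }

    async fn stop(&self) -> Result<(), DanmuStreamError> {
        Ok(())
    }
}
```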

View File

@@ -0,0 +1,436 @@
mod dannmu_msg;
mod interact_word;
mod pack;
mod send_gift;
mod stream;
mod super_chat;
use std::{sync::Arc, time::SystemTime};
use async_trait::async_trait;
use futures_util::{SinkExt, StreamExt, TryStreamExt};
use log::{error, info};
use pct_str::{PctString, URIReserved};
use regex::Regex;
use serde::{Deserialize, Serialize};
use tokio::{
sync::{mpsc, RwLock},
time::{sleep, Duration},
};
use tokio_tungstenite::{connect_async, tungstenite::Message};
use crate::{
http_client::ApiClient,
provider::{DanmuMessageType, DanmuProvider},
DanmuStreamError,
};
type WsReadType = futures_util::stream::SplitStream<
tokio_tungstenite::WebSocketStream<tokio_tungstenite::MaybeTlsStream<tokio::net::TcpStream>>,
>;
type WsWriteType = futures_util::stream::SplitSink<
tokio_tungstenite::WebSocketStream<tokio_tungstenite::MaybeTlsStream<tokio::net::TcpStream>>,
Message,
>;
pub struct BiliDanmu {
client: ApiClient,
room_id: u64,
user_id: u64,
stop: Arc<RwLock<bool>>,
write: Arc<RwLock<Option<WsWriteType>>>,
}
#[async_trait]
impl DanmuProvider for BiliDanmu {
async fn new(cookie: &str, room_id: u64) -> Result<Self, DanmuStreamError> {
// find DedeUserID=<user_id> in cookie str
let user_id = BiliDanmu::parse_user_id(cookie)?;
let client = ApiClient::new(cookie);
Ok(Self {
client,
user_id,
room_id,
stop: Arc::new(RwLock::new(false)),
write: Arc::new(RwLock::new(None)),
})
}
async fn start(
&self,
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError> {
let mut retry_count = 0;
const MAX_RETRIES: u32 = 5;
const RETRY_DELAY: Duration = Duration::from_secs(5);
info!(
"Bilibili WebSocket connection started, room_id: {}",
self.room_id
);
loop {
if *self.stop.read().await {
break;
}
match self.connect_and_handle(tx.clone()).await {
Ok(_) => {
info!("Bilibili WebSocket connection closed normally");
break;
}
Err(e) => {
error!("Bilibili WebSocket connection error: {}", e);
retry_count += 1;
if retry_count >= MAX_RETRIES {
return Err(DanmuStreamError::WebsocketError {
err: format!("Failed to connect after {} retries", MAX_RETRIES),
});
}
info!(
"Retrying connection in {} seconds... (Attempt {}/{})",
RETRY_DELAY.as_secs(),
retry_count,
MAX_RETRIES
);
tokio::time::sleep(RETRY_DELAY).await;
}
}
}
Ok(())
}
async fn stop(&self) -> Result<(), DanmuStreamError> {
*self.stop.write().await = true;
if let Some(mut write) = self.write.write().await.take() {
if let Err(e) = write.close().await {
error!("Failed to close WebSocket connection: {}", e);
}
}
Ok(())
}
}
impl BiliDanmu {
async fn connect_and_handle(
&self,
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError> {
let wbi_key = self.get_wbi_key().await?;
let danmu_info = self.get_danmu_info(&wbi_key, self.room_id).await?;
let ws_hosts = danmu_info.data.host_list.clone();
let mut conn = None;
// try each ws host in turn; keep the first connection that succeeds
for i in ws_hosts {
let host = format!("wss://{}/sub", i.host);
match connect_async(&host).await {
Ok((c, _)) => {
conn = Some(c);
break;
}
Err(e) => {
eprintln!(
"Connect ws host: {} has error, trying next host ...\n{:?}\n{:?}",
host, i, e
);
}
}
}
let conn = conn.ok_or(DanmuStreamError::WebsocketError {
err: "Failed to connect to ws host".into(),
})?;
let (write, read) = conn.split();
*self.write.write().await = Some(write);
let json = serde_json::to_string(&WsSend {
roomid: self.room_id,
key: danmu_info.data.token,
uid: self.user_id,
protover: 3,
platform: "web".to_string(),
t: 2,
})
.map_err(|e| DanmuStreamError::WebsocketError { err: e.to_string() })?;
let json = pack::encode(&json, 7);
if let Some(write) = self.write.write().await.as_mut() {
write
.send(Message::binary(json))
.await
.map_err(|e| DanmuStreamError::WebsocketError { err: e.to_string() })?;
}
tokio::select! {
v = BiliDanmu::send_heartbeat_packets(Arc::clone(&self.write)) => v,
v = BiliDanmu::recv(read, tx, Arc::clone(&self.stop)) => v
}?;
Ok(())
}
async fn send_heartbeat_packets(
write: Arc<RwLock<Option<WsWriteType>>>,
) -> Result<(), DanmuStreamError> {
loop {
if let Some(write) = write.write().await.as_mut() {
write
.send(Message::binary(pack::encode("", 2)))
.await
.map_err(|e| DanmuStreamError::WebsocketError { err: e.to_string() })?;
}
sleep(Duration::from_secs(30)).await;
}
}
async fn recv(
mut read: WsReadType,
tx: mpsc::UnboundedSender<DanmuMessageType>,
stop: Arc<RwLock<bool>>,
) -> Result<(), DanmuStreamError> {
while let Ok(Some(msg)) = read.try_next().await {
if *stop.read().await {
log::info!("Stopping bilibili danmu stream");
break;
}
let data = msg.into_data();
if !data.is_empty() {
let s = pack::build_pack(&data);
if let Ok(msgs) = s {
for i in msgs {
let ws = stream::WsStreamCtx::new(&i);
if let Ok(ws) = ws {
match ws.match_msg() {
Ok(v) => {
tx.send(v).map_err(|e| DanmuStreamError::WebsocketError {
err: e.to_string(),
})?;
}
Err(e) => {
log::trace!(
"This message parsing is not yet supported:\nMessage: {i}\nErr: {e:#?}"
);
}
}
} else {
log::error!("{}", ws.unwrap_err());
}
}
}
}
}
Ok(())
}
async fn get_danmu_info(
&self,
wbi_key: &str,
room_id: u64,
) -> Result<DanmuInfo, DanmuStreamError> {
let room_id = self.get_real_room(wbi_key, room_id).await?;
let params = self
.get_sign(
wbi_key,
serde_json::json!({
"id": room_id,
"type": 0,
}),
)
.await?;
let resp = self
.client
.get(
&format!(
"https://api.live.bilibili.com/xlive/web-room/v1/index/getDanmuInfo?{}",
params
),
None,
)
.await?
.json::<DanmuInfo>()
.await?;
Ok(resp)
}
async fn get_real_room(&self, wbi_key: &str, room_id: u64) -> Result<u64, DanmuStreamError> {
let params = self
.get_sign(
wbi_key,
serde_json::json!({
"id": room_id,
"from": "room",
}),
)
.await?;
let resp = self
.client
.get(
&format!(
"https://api.live.bilibili.com/room/v1/Room/room_init?{}",
params
),
None,
)
.await?
.json::<RoomInit>()
.await?
.data
.room_id;
Ok(resp)
}
fn parse_user_id(cookie: &str) -> Result<u64, DanmuStreamError> {
let mut user_id = None;
// find DedeUserID=<user_id> in cookie str
let re = Regex::new(r"DedeUserID=(\d+)").unwrap();
if let Some(captures) = re.captures(cookie) {
if let Some(user) = captures.get(1) {
user_id = Some(user.as_str().parse::<u64>().unwrap());
}
}
if let Some(user_id) = user_id {
Ok(user_id)
} else {
Err(DanmuStreamError::InvalidIdentifier {
err: format!("Failed to find user_id in cookie: {cookie}"),
})
}
}
async fn get_wbi_key(&self) -> Result<String, DanmuStreamError> {
let nav_info: serde_json::Value = self
.client
.get("https://api.bilibili.com/x/web-interface/nav", None)
.await?
.json()
.await?;
let re = Regex::new(r"wbi/(.*).png").unwrap();
let img = re
.captures(nav_info["data"]["wbi_img"]["img_url"].as_str().unwrap())
.unwrap()
.get(1)
.unwrap()
.as_str();
let sub = re
.captures(nav_info["data"]["wbi_img"]["sub_url"].as_str().unwrap())
.unwrap()
.get(1)
.unwrap()
.as_str();
let raw_string = format!("{}{}", img, sub);
Ok(raw_string)
}
pub async fn get_sign(
&self,
wbi_key: &str,
mut parameters: serde_json::Value,
) -> Result<String, DanmuStreamError> {
let table = vec![
46, 47, 18, 2, 53, 8, 23, 32, 15, 50, 10, 31, 58, 3, 45, 35, 27, 43, 5, 49, 33, 9, 42,
19, 29, 28, 14, 39, 12, 38, 41, 13, 37, 48, 7, 16, 24, 55, 40, 61, 26, 17, 0, 1, 60,
51, 30, 4, 22, 25, 54, 21, 56, 59, 6, 63, 57, 62, 11, 36, 20, 34, 44, 52,
];
let raw_string = wbi_key;
let mut encoded = Vec::new();
table.into_iter().for_each(|x| {
if x < raw_string.len() {
encoded.push(raw_string.as_bytes()[x]);
}
});
// only keep 32 bytes of encoded
encoded = encoded[0..32].to_vec();
let encoded = String::from_utf8(encoded).unwrap();
// Timestamp in seconds
let wts = SystemTime::now()
.duration_since(SystemTime::UNIX_EPOCH)
.unwrap()
.as_secs();
parameters
.as_object_mut()
.unwrap()
.insert("wts".to_owned(), serde_json::Value::String(wts.to_string()));
// Get all keys from parameters into vec
let mut keys = parameters
.as_object()
.unwrap()
.keys()
.map(|x| x.to_owned())
.collect::<Vec<String>>();
// sort keys
keys.sort();
let mut params = String::new();
keys.iter().for_each(|x| {
params.push_str(x);
params.push('=');
// Convert value to string based on its type
let value = match parameters.get(x).unwrap() {
serde_json::Value::String(s) => s.clone(),
serde_json::Value::Number(n) => n.to_string(),
serde_json::Value::Bool(b) => b.to_string(),
_ => "".to_string(),
};
// Value filters !'()* characters
let value = value.replace(['!', '\'', '(', ')', '*'], "");
let value = PctString::encode(value.chars(), URIReserved);
params.push_str(value.as_str());
// add & if not last
if x != keys.last().unwrap() {
params.push('&');
}
});
// md5 params+encoded
let w_rid = md5::compute(params.to_string() + encoded.as_str());
let params = params + format!("&w_rid={:x}", w_rid).as_str();
Ok(params)
}
}
#[derive(Serialize)]
struct WsSend {
uid: u64,
roomid: u64,
key: String,
protover: u32,
platform: String,
#[serde(rename = "type")]
t: u32,
}
#[derive(Debug, Deserialize, Clone)]
pub struct DanmuInfo {
pub data: DanmuInfoData,
}
#[derive(Debug, Deserialize, Clone)]
pub struct DanmuInfoData {
pub token: String,
pub host_list: Vec<WsHost>,
}
#[derive(Debug, Deserialize, Clone)]
pub struct WsHost {
pub host: String,
}
#[derive(Debug, Deserialize, Clone)]
pub struct RoomInit {
data: RoomInitData,
}
#[derive(Debug, Deserialize, Clone)]
pub struct RoomInitData {
room_id: u64,
}
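For orientation, a hedged sketch of the final step in `get_sign` above, with a placeholder mixin key and timestamp (the real key is derived from the nav API response via the permutation table): the sorted, percent-encoded parameters plus `wts` are hashed together with the 32-character key, and the digest is appended as `w_rid`.

```rust
// Illustration only; the values are placeholders, not a real signature.
fn signed_query_example() -> String {
    let mixin_key = "0123456789abcdef0123456789abcdef"; // placeholder 32-char key
    let query = "id=123456&type=0&wts=1700000000"; // keys already sorted alphabetically
    let w_rid = md5::compute(format!("{query}{mixin_key}"));
    format!("{query}&w_rid={:x}", w_rid)
}
```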

View File

@@ -0,0 +1,88 @@
use serde::Deserialize;
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
pub struct BiliDanmuMessage {
pub uid: u64,
pub username: String,
pub msg: String,
pub fan: Option<String>,
pub fan_level: Option<u64>,
pub timestamp: i64,
}
impl BiliDanmuMessage {
pub fn new_from_ctx(ctx: &WsStreamCtx) -> Result<Self, DanmuStreamError> {
let info = ctx
.info
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "info is None".to_string(),
})?;
let array_2 = info
.get(2)
.and_then(|x| x.as_array())
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "array_2 is None".to_string(),
})?
.to_owned();
let uid = array_2.first().and_then(|x| x.as_u64()).ok_or_else(|| {
DanmuStreamError::MessageParseError {
err: "uid is None".to_string(),
}
})?;
let username = array_2
.get(1)
.and_then(|x| x.as_str())
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "username is None".to_string(),
})?
.to_string();
let msg = info
.get(1)
.and_then(|x| x.as_str())
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "msg is None".to_string(),
})?
.to_string();
let array_3 = info
.get(3)
.and_then(|x| x.as_array())
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "array_3 is None".to_string(),
})?
.to_owned();
let fan = array_3
.get(1)
.and_then(|x| x.as_str())
.map(|x| x.to_owned());
let fan_level = array_3.first().and_then(|x| x.as_u64());
let timestamp = info
.first()
.and_then(|x| x.as_array())
.and_then(|x| x.get(4))
.and_then(|x| x.as_i64())
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "timestamp is None".to_string(),
})?;
Ok(Self {
uid,
username,
msg,
fan,
fan_level,
timestamp,
})
}
}
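For reference, a hedged sketch of the positions the parser above reads, written as a test with a synthetic payload (only the fields this parser touches are filled in; `WsStreamCtx` comes from `stream.rs` further down):

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn parses_minimal_danmu_msg() {
        // info[0][4] = timestamp (ms), info[1] = text,
        // info[2] = [uid, username, ...], info[3] = [fan_level, fan_badge, ...]
        let raw = r#"{
            "cmd": "DANMU_MSG",
            "info": [
                [0, 1, 25, 16777215, 1700000000000],
                "hello",
                [10000, "viewer"],
                [21, "badge"]
            ]
        }"#;
        let ctx = WsStreamCtx::new(raw).unwrap();
        let msg = BiliDanmuMessage::new_from_ctx(&ctx).unwrap();
        assert_eq!(msg.msg, "hello");
        assert_eq!(msg.uid, 10000);
        assert_eq!(msg.username, "viewer");
        assert_eq!(msg.timestamp, 1_700_000_000_000);
    }
}
```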

View File

@@ -0,0 +1,67 @@
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
#[derive(Debug)]
#[allow(dead_code)]
pub struct InteractWord {
pub uid: u64,
pub uname: String,
pub fan: Option<String>,
pub fan_level: Option<u32>,
}
#[allow(dead_code)]
impl InteractWord {
pub fn new_from_ctx(ctx: &WsStreamCtx) -> Result<Self, DanmuStreamError> {
let data = ctx
.data
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "data is None".to_string(),
})?;
let uname = data
.uname
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "uname is None".to_string(),
})?
.to_string();
let uid = data
.uid
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "uid is None".to_string(),
})?
.as_u64()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "uid is None".to_string(),
})?;
let fan = data
.fans_medal
.as_ref()
.and_then(|x| x.medal_name.to_owned());
let fan = if fan == Some("".to_string()) {
None
} else {
fan
};
let fan_level = data.fans_medal.as_ref().and_then(|x| x.medal_level);
let fan_level = if fan_level == Some(0) {
None
} else {
fan_level
};
Ok(Self {
uid,
uname,
fan,
fan_level,
})
}
}

View File

@@ -0,0 +1,161 @@
// This file is copied from https://github.com/eatradish/felgens/blob/master/src/pack.rs
use std::io::Read;
use flate2::read::ZlibDecoder;
use scroll::Pread;
use scroll_derive::Pread;
use crate::DanmuStreamError;
#[derive(Debug, Pread, Clone)]
struct BilibiliPackHeader {
pack_len: u32,
_header_len: u16,
ver: u16,
_op: u32,
_seq: u32,
}
#[derive(Debug, Pread)]
struct PackHotCount {
count: u32,
}
type BilibiliPackCtx<'a> = (BilibiliPackHeader, &'a [u8]);
fn pack(buffer: &[u8]) -> Result<BilibiliPackCtx, DanmuStreamError> {
let data = buffer
.pread_with(0, scroll::BE)
.map_err(|e: scroll::Error| DanmuStreamError::PackError { err: e.to_string() })?;
let buf = &buffer[16..];
Ok((data, buf))
}
fn write_int(buffer: &[u8], start: usize, val: u32) -> Vec<u8> {
let val_bytes = val.to_be_bytes();
let mut buf = buffer.to_vec();
for (i, c) in val_bytes.iter().enumerate() {
buf[start + i] = *c;
}
buf
}
pub fn encode(s: &str, op: u8) -> Vec<u8> {
let data = s.as_bytes();
let packet_len = 16 + data.len();
let header = vec![0, 0, 0, 0, 0, 16, 0, 1, 0, 0, 0, op, 0, 0, 0, 1];
let header = write_int(&header, 0, packet_len as u32);
[&header, data].concat()
}
pub fn build_pack(buf: &[u8]) -> Result<Vec<String>, DanmuStreamError> {
let ctx = pack(buf)?;
let msgs = decode(ctx)?;
Ok(msgs)
}
fn get_hot_count(body: &[u8]) -> Result<u32, DanmuStreamError> {
let count = body
.pread_with::<PackHotCount>(0, scroll::BE)
.map_err(|e| DanmuStreamError::PackError { err: e.to_string() })?
.count;
Ok(count)
}
fn zlib_decode(body: &[u8]) -> Result<(BilibiliPackHeader, Vec<u8>), DanmuStreamError> {
let mut buf = vec![];
let mut z = ZlibDecoder::new(body);
z.read_to_end(&mut buf)
.map_err(|e| DanmuStreamError::PackError { err: e.to_string() })?;
let ctx = pack(&buf)?;
let header = ctx.0;
let buf = ctx.1.to_vec();
Ok((header, buf))
}
fn decode(ctx: BilibiliPackCtx) -> Result<Vec<String>, DanmuStreamError> {
let (mut header, body) = ctx;
let mut buf = body.to_vec();
loop {
(header, buf) = match header.ver {
2 => zlib_decode(&buf)?,
3 => brotli_decode(&buf)?,
0 | 1 => break,
_ => break,
}
}
let msgs = match header.ver {
0 => split_msgs(buf, header)?,
1 => vec![format!("{{\"count\": {}}}", get_hot_count(&buf)?)],
x => return Err(DanmuStreamError::UnsupportProto { proto: x }),
};
Ok(msgs)
}
fn split_msgs(buf: Vec<u8>, header: BilibiliPackHeader) -> Result<Vec<String>, DanmuStreamError> {
let mut buf = buf;
let mut header = header;
let mut msgs = vec![];
let mut offset = 0;
let buf_len = buf.len();
msgs.push(
std::str::from_utf8(&buf[..(header.pack_len - 16) as usize])
.map_err(|e| DanmuStreamError::PackError { err: e.to_string() })?
.to_string(),
);
buf = buf[(header.pack_len - 16) as usize..].to_vec();
offset += header.pack_len - 16;
while offset != buf_len as u32 {
let ctx = pack(&buf).map_err(|e| DanmuStreamError::PackError { err: e.to_string() })?;
header = ctx.0;
buf = ctx.1.to_vec();
msgs.push(
std::str::from_utf8(&buf[..(header.pack_len - 16) as usize])
.map_err(|e| DanmuStreamError::PackError { err: e.to_string() })?
.to_string(),
);
buf = buf[(header.pack_len - 16) as usize..].to_vec();
offset += header.pack_len;
}
Ok(msgs)
}
fn brotli_decode(body: &[u8]) -> Result<(BilibiliPackHeader, Vec<u8>), DanmuStreamError> {
let mut reader = brotli::Decompressor::new(body, 4096);
let mut buf = Vec::new();
reader
.read_to_end(&mut buf)
.map_err(|e| DanmuStreamError::PackError { err: e.to_string() })?;
let ctx = pack(&buf).map_err(|e| DanmuStreamError::PackError { err: e.to_string() })?;
let header = ctx.0;
let buf = ctx.1.to_vec();
Ok((header, buf))
}
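A hedged sketch of the packet layout produced by `encode` above, written as a test (opcode values follow the usage elsewhere in this module: 7 for the join/auth packet, 2 for the heartbeat):

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn heartbeat_and_join_packet_layout() {
        // op = 2 is the heartbeat; with an empty body the packet is just the
        // 16-byte header: total length (big-endian) in bytes 0..4, opcode in bytes 8..12.
        let heartbeat = encode("", 2);
        assert_eq!(heartbeat.len(), 16);
        assert_eq!(&heartbeat[..4], &16u32.to_be_bytes());
        assert_eq!(heartbeat[11], 2);

        // op = 7 is the join/auth packet sent right after connecting.
        let join = encode(r#"{"roomid":1}"#, 7);
        assert_eq!(&join[..4], &(16 + 12u32).to_be_bytes());
        assert_eq!(join[11], 7);
    }
}
```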

View File

@@ -0,0 +1,115 @@
use serde::Deserialize;
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
pub struct SendGift {
pub action: String,
pub gift_name: String,
pub num: u64,
pub uname: String,
pub uid: u64,
pub medal_name: Option<String>,
pub medal_level: Option<u32>,
pub price: u32,
}
#[allow(dead_code)]
impl SendGift {
pub fn new_from_ctx(ctx: &WsStreamCtx) -> Result<Self, DanmuStreamError> {
let data = ctx
.data
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "data is None".to_string(),
})?;
let action = data
.action
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "action is None".to_string(),
})?
.to_owned();
let combo_send = data.combo_send.clone();
let gift_name = if let Some(gift) = data.gift_name.as_ref() {
gift.to_owned()
} else if let Some(gift) = combo_send.clone().and_then(|x| x.gift_name) {
gift
} else {
return Err(DanmuStreamError::MessageParseError {
err: "gift_name is None".to_string(),
});
};
let num = if let Some(num) = combo_send.clone().and_then(|x| x.combo_num) {
num
} else if let Some(num) = data.num {
num
} else if let Some(num) = combo_send.and_then(|x| x.gift_num) {
num
} else {
return Err(DanmuStreamError::MessageParseError {
err: "num is None".to_string(),
});
};
let uname = data
.uname
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "uname is None".to_string(),
})?
.to_owned();
let uid = data
.uid
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "uid is None".to_string(),
})?
.as_u64()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "uid is None".to_string(),
})?;
let medal_name = data
.medal_info
.as_ref()
.and_then(|x| x.medal_name.to_owned());
let medal_level = data.medal_info.as_ref().and_then(|x| x.medal_level);
let medal_name = if medal_name == Some("".to_string()) {
None
} else {
medal_name
};
let medal_level = if medal_level == Some(0) {
None
} else {
medal_level
};
let price = data
.price
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "price is None".to_string(),
})?;
Ok(Self {
action,
gift_name,
num,
uname,
uid,
medal_name,
medal_level,
price,
})
}
}

View File

@@ -0,0 +1,97 @@
use serde::Deserialize;
use serde_json::Value;
use crate::{
provider::{bilibili::dannmu_msg::BiliDanmuMessage, DanmuMessageType},
DanmuStreamError, DanmuMessage,
};
#[derive(Debug, Deserialize, Clone)]
pub struct WsStreamCtx {
pub cmd: Option<String>,
pub info: Option<Vec<Value>>,
pub data: Option<WsStreamCtxData>,
#[serde(flatten)]
_v: Value,
}
#[derive(Debug, Deserialize, Clone)]
#[allow(dead_code)]
pub struct WsStreamCtxData {
pub message: Option<String>,
pub price: Option<u32>,
pub start_time: Option<u64>,
pub time: Option<u32>,
pub uid: Option<Value>,
pub user_info: Option<WsStreamCtxDataUser>,
pub medal_info: Option<WsStreamCtxDataMedalInfo>,
pub uname: Option<String>,
pub fans_medal: Option<WsStreamCtxDataMedalInfo>,
pub action: Option<String>,
#[serde(rename = "giftName")]
pub gift_name: Option<String>,
pub num: Option<u64>,
pub combo_num: Option<u64>,
pub gift_num: Option<u64>,
pub combo_send: Box<Option<WsStreamCtxData>>,
}
#[derive(Debug, Deserialize, Clone)]
pub struct WsStreamCtxDataMedalInfo {
pub medal_name: Option<String>,
pub medal_level: Option<u32>,
}
#[derive(Debug, Deserialize, Clone)]
#[allow(dead_code)]
pub struct WsStreamCtxDataUser {
pub face: String,
pub uname: String,
}
impl WsStreamCtx {
pub fn new(s: &str) -> Result<Self, DanmuStreamError> {
serde_json::from_str(s).map_err(|_| DanmuStreamError::MessageParseError {
err: "Failed to parse message".to_string(),
})
}
pub fn match_msg(&self) -> Result<DanmuMessageType, DanmuStreamError> {
let cmd = self.handle_cmd();
let danmu_msg = match cmd {
Some(c) if c.contains("DANMU_MSG") => Some(BiliDanmuMessage::new_from_ctx(self)?),
_ => None,
};
if let Some(danmu_msg) = danmu_msg {
Ok(DanmuMessageType::DanmuMessage(DanmuMessage {
room_id: 0,
user_id: danmu_msg.uid,
user_name: danmu_msg.username,
message: danmu_msg.msg,
color: 0,
timestamp: danmu_msg.timestamp,
}))
} else {
Err(DanmuStreamError::MessageParseError {
err: "Unknown message".to_string(),
})
}
}
fn handle_cmd(&self) -> Option<&str> {
// handle DANMU_MSG:4:0:2:2:2:0
let cmd = if let Some(c) = self.cmd.as_deref() {
if c.starts_with("DANMU_MSG") {
Some("DANMU_MSG")
} else {
Some(c)
}
} else {
None
};
cmd
}
}
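One design note on the struct above: the `#[serde(flatten)] _v: Value` field acts as a catch-all, so commands carrying fields that are not modelled explicitly still deserialize instead of failing. A small self-contained sketch of that pattern (hypothetical `Envelope` type, not part of this crate):

use serde::Deserialize;
use serde_json::Value;

#[derive(Debug, Deserialize)]
struct Envelope {
    cmd: Option<String>,
    // every key not named above ends up here instead of causing a parse error
    #[serde(flatten)]
    rest: Value,
}

fn main() {
    let raw = r#"{"cmd":"DANMU_MSG:4:0:2:2:2:0","info":[1,2,3],"extra":true}"#;
    let env: Envelope = serde_json::from_str(raw).unwrap();
    println!("cmd = {:?}, unmodelled fields = {}", env.cmd, env.rest);
}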

View File

@@ -0,0 +1,93 @@
use serde::Deserialize;
use crate::{provider::bilibili::stream::WsStreamCtx, DanmuStreamError};
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
pub struct SuperChatMessage {
pub uname: String,
pub uid: u64,
pub face: String,
pub price: u32,
pub start_time: u64,
pub time: u32,
pub msg: String,
pub medal_name: Option<String>,
pub medal_level: Option<u32>,
}
#[allow(dead_code)]
impl SuperChatMessage {
pub fn new_from_ctx(ctx: &WsStreamCtx) -> Result<Self, DanmuStreamError> {
let data = ctx
.data
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "data is None".to_string(),
})?;
let user_info =
data.user_info
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "user_info is None".to_string(),
})?;
let uname = user_info.uname.to_owned();
let uid = data.uid.as_ref().and_then(|x| x.as_u64()).ok_or_else(|| {
DanmuStreamError::MessageParseError {
err: "uid is None".to_string(),
}
})?;
let face = user_info.face.to_owned();
let price = data
.price
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "price is None".to_string(),
})?;
let start_time = data
.start_time
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "start_time is None".to_string(),
})?;
let time = data
.time
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "time is None".to_string(),
})?;
let msg = data
.message
.as_ref()
.ok_or_else(|| DanmuStreamError::MessageParseError {
err: "message is None".to_string(),
})?
.to_owned();
let medal = data
.medal_info
.as_ref()
.map(|x| (x.medal_name.to_owned(), x.medal_level.to_owned()));
let medal_name = medal.as_ref().and_then(|(name, _)| name.to_owned());
let medal_level = medal.and_then(|(_, level)| level);
Ok(Self {
uname,
uid,
face,
price,
start_time,
time,
msg,
medal_name,
medal_level,
})
}
}

View File

@@ -0,0 +1,463 @@
use crate::{provider::DanmuProvider, DanmuMessage, DanmuMessageType, DanmuStreamError};
use async_trait::async_trait;
use chrono;
use deno_core::v8;
use deno_core::JsRuntime;
use deno_core::RuntimeOptions;
use flate2::read::GzDecoder;
use futures_util::{SinkExt, StreamExt, TryStreamExt};
use log::debug;
use log::{error, info};
use prost::bytes::Bytes;
use prost::Message;
use std::io::Read;
use std::sync::Arc;
use std::time::{Duration, SystemTime};
use tokio::net::TcpStream;
use tokio::sync::mpsc;
use tokio::sync::RwLock;
use tokio_tungstenite::{
connect_async, tungstenite::Message as WsMessage, MaybeTlsStream, WebSocketStream,
};
mod messages;
use messages::*;
const USER_AGENT: &str = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36";
const HEARTBEAT_INTERVAL: Duration = Duration::from_secs(10);
type WsReadType = futures_util::stream::SplitStream<WebSocketStream<MaybeTlsStream<TcpStream>>>;
type WsWriteType =
futures_util::stream::SplitSink<WebSocketStream<MaybeTlsStream<TcpStream>>, WsMessage>;
pub struct DouyinDanmu {
room_id: u64,
cookie: String,
stop: Arc<RwLock<bool>>,
write: Arc<RwLock<Option<WsWriteType>>>,
}
impl DouyinDanmu {
async fn connect_and_handle(
&self,
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError> {
let url = self.get_wss_url().await?;
let request = tokio_tungstenite::tungstenite::http::Request::builder()
.uri(url)
.header(
tokio_tungstenite::tungstenite::http::header::COOKIE,
self.cookie.as_str(),
)
.header(
tokio_tungstenite::tungstenite::http::header::REFERER,
"https://live.douyin.com/",
)
.header(
tokio_tungstenite::tungstenite::http::header::USER_AGENT,
USER_AGENT,
)
.header(
tokio_tungstenite::tungstenite::http::header::HOST,
"webcast5-ws-web-hl.douyin.com",
)
.header(
tokio_tungstenite::tungstenite::http::header::UPGRADE,
"websocket",
)
.header(
tokio_tungstenite::tungstenite::http::header::CONNECTION,
"Upgrade",
)
.header(
tokio_tungstenite::tungstenite::http::header::SEC_WEBSOCKET_VERSION,
"13",
)
.header(
tokio_tungstenite::tungstenite::http::header::SEC_WEBSOCKET_EXTENSIONS,
"permessage-deflate; client_max_window_bits",
)
.header(
tokio_tungstenite::tungstenite::http::header::SEC_WEBSOCKET_KEY,
"V1Yza5x1zcfkembl6u/0Pg==",
)
.body(())
.unwrap();
let (ws_stream, response) =
connect_async(request)
.await
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to connect to douyin websocket: {}", e),
})?;
// Log the response status for debugging
info!("WebSocket connection response: {:?}", response.status());
let (write, read) = ws_stream.split();
*self.write.write().await = Some(write);
self.handle_connection(read, tx).await
}
async fn get_wss_url(&self) -> Result<String, DanmuStreamError> {
// Create a new V8 runtime
let mut runtime = JsRuntime::new(RuntimeOptions::default());
// Add global CryptoJS object
let crypto_js = include_str!("douyin/crypto-js.min.js");
runtime
.execute_script(
"<crypto-js.min.js>",
deno_core::FastString::Static(crypto_js),
)
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to execute crypto-js: {}", e),
})?;
// Load and execute the sign.js file
let js_code = include_str!("douyin/webmssdk.js");
runtime
.execute_script("<sign.js>", deno_core::FastString::Static(js_code))
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to execute JavaScript: {}", e),
})?;
// Call the get_wss_url function
let sign_call = format!("get_wss_url(\"{}\")", self.room_id);
let result = runtime
.execute_script(
"<sign_call>",
deno_core::FastString::Owned(sign_call.into_boxed_str()),
)
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to execute JavaScript: {}", e),
})?;
// Get the result from the V8 runtime
let scope = &mut runtime.handle_scope();
let local = v8::Local::new(scope, result);
let url = local.to_string(scope).unwrap().to_rust_string_lossy(scope);
debug!("Douyin wss url: {}", url);
Ok(url)
}
async fn handle_connection(
&self,
mut read: WsReadType,
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError> {
// Start heartbeat task with error handling
let (tx_write, mut _rx_write) = mpsc::channel(32);
let tx_write_clone = tx_write.clone();
let stop = Arc::clone(&self.stop);
let heartbeat_handle = tokio::spawn(async move {
let mut last_heartbeat = SystemTime::now();
let mut consecutive_failures = 0;
const MAX_FAILURES: u32 = 3;
loop {
if *stop.read().await {
log::info!("Stopping douyin danmu stream");
break;
}
tokio::time::sleep(HEARTBEAT_INTERVAL).await;
match Self::send_heartbeat(&tx_write_clone).await {
Ok(_) => {
last_heartbeat = SystemTime::now();
consecutive_failures = 0;
}
Err(e) => {
error!("Failed to send heartbeat: {}", e);
consecutive_failures += 1;
if consecutive_failures >= MAX_FAILURES {
error!("Too many consecutive heartbeat failures, closing connection");
break;
}
// Check if we've exceeded the maximum time without a successful heartbeat
if let Ok(duration) = last_heartbeat.elapsed() {
if duration > HEARTBEAT_INTERVAL * 2 {
error!("No successful heartbeat for too long, closing connection");
break;
}
}
}
}
}
});
// Main message handling loop
let room_id = self.room_id;
let stop = Arc::clone(&self.stop);
let write = Arc::clone(&self.write);
let message_handle = tokio::spawn(async move {
while let Some(msg) =
read.try_next()
.await
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to read message: {}", e),
})?
{
if *stop.read().await {
log::info!("Stopping douyin danmu stream");
break;
}
match msg {
WsMessage::Binary(data) => {
if let Ok(Some(ack)) = handle_binary_message(&data, &tx, room_id).await {
if let Some(write) = write.write().await.as_mut() {
if let Err(e) =
write.send(WsMessage::Binary(ack.encode_to_vec())).await
{
error!("Failed to send ack: {}", e);
}
}
}
}
WsMessage::Close(_) => {
info!("WebSocket connection closed");
break;
}
WsMessage::Ping(data) => {
// Respond to ping with pong
if let Err(e) = tx_write.send(WsMessage::Pong(data)).await {
error!("Failed to send pong: {}", e);
break;
}
}
_ => {}
}
}
Ok::<(), DanmuStreamError>(())
});
// Wait for either the heartbeat or message handling to complete
tokio::select! {
result = heartbeat_handle => {
if let Err(e) = result {
error!("Heartbeat task failed: {}", e);
}
}
result = message_handle => {
if let Err(e) = result {
error!("Message handling task failed: {}", e);
}
}
}
Ok(())
}
async fn send_heartbeat(tx: &mpsc::Sender<WsMessage>) -> Result<(), DanmuStreamError> {
// heartbeat message: 3A 02 68 62
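// (presumably a protobuf-encoded PushFrame: 0x3A = field 7 "payloadType", length 2, bytes "hb")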
tx.send(WsMessage::Binary(vec![0x3A, 0x02, 0x68, 0x62]))
.await
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to send heartbeat message: {}", e),
})?;
Ok(())
}
}
async fn handle_binary_message(
data: &[u8],
tx: &mpsc::UnboundedSender<DanmuMessageType>,
room_id: u64,
) -> Result<Option<PushFrame>, DanmuStreamError> {
// First decode the PushFrame
let push_frame = PushFrame::decode(Bytes::from(data.to_vec())).map_err(|e| {
DanmuStreamError::WebsocketError {
err: format!("Failed to decode PushFrame: {}", e),
}
})?;
// Decompress the payload
let mut decoder = GzDecoder::new(push_frame.payload.as_slice());
let mut decompressed = Vec::new();
decoder
.read_to_end(&mut decompressed)
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to decompress payload: {}", e),
})?;
// Decode the Response from decompressed payload
let response = Response::decode(Bytes::from(decompressed)).map_err(|e| {
DanmuStreamError::WebsocketError {
err: format!("Failed to decode Response: {}", e),
}
})?;
// if payload_package.needAck:
// obj = PushFrame()
// obj.payloadType = 'ack'
// obj.logId = log_id
// obj.payloadType = payload_package.internalExt
// ack = obj.SerializeToString()
let mut ack = None;
if response.need_ack {
let ack_msg = PushFrame {
payload_type: "ack".to_string(),
log_id: push_frame.log_id,
payload_encoding: "".to_string(),
payload: vec![],
seq_id: 0,
service: 0,
method: 0,
headers_list: vec![],
};
debug!("Need to respond ack: {:?}", ack_msg);
ack = Some(ack_msg);
}
for message in response.messages_list {
match message.method.as_str() {
"WebcastChatMessage" => {
let chat_msg =
DouyinChatMessage::decode(message.payload.as_slice()).map_err(|e| {
DanmuStreamError::WebsocketError {
err: format!("Failed to decode chat message: {}", e),
}
})?;
if let Some(user) = chat_msg.user {
let danmu_msg = DanmuMessage {
room_id,
user_id: user.id,
user_name: user.nick_name,
message: chat_msg.content,
color: 0xffffff,
timestamp: chat_msg.event_time as i64 * 1000,
};
debug!("Received danmu message: {:?}", danmu_msg);
tx.send(DanmuMessageType::DanmuMessage(danmu_msg))
.map_err(|e| DanmuStreamError::WebsocketError {
err: format!("Failed to send message to channel: {}", e),
})?;
}
}
"WebcastGiftMessage" => {
let gift_msg = GiftMessage::decode(message.payload.as_slice()).map_err(|e| {
DanmuStreamError::WebsocketError {
err: format!("Failed to decode gift message: {}", e),
}
})?;
if let Some(user) = gift_msg.user {
if let Some(gift) = gift_msg.gift {
log::debug!("Received gift: {} from user: {}", gift.name, user.nick_name);
}
}
}
"WebcastLikeMessage" => {
let like_msg = LikeMessage::decode(message.payload.as_slice()).map_err(|e| {
DanmuStreamError::WebsocketError {
err: format!("Failed to decode like message: {}", e),
}
})?;
if let Some(user) = like_msg.user {
log::debug!(
"Received {} likes from user: {}",
like_msg.count,
user.nick_name
);
}
}
"WebcastMemberMessage" => {
let member_msg =
MemberMessage::decode(message.payload.as_slice()).map_err(|e| {
DanmuStreamError::WebsocketError {
err: format!("Failed to decode member message: {}", e),
}
})?;
if let Some(user) = member_msg.user {
log::debug!(
"Member joined: {} (Action: {})",
user.nick_name,
member_msg.action_description
);
}
}
_ => {
debug!("Unknown message: {:?}", message);
}
}
}
Ok(ack)
}
#[async_trait]
impl DanmuProvider for DouyinDanmu {
async fn new(identifier: &str, room_id: u64) -> Result<Self, DanmuStreamError> {
Ok(Self {
room_id,
cookie: identifier.to_string(),
stop: Arc::new(RwLock::new(false)),
write: Arc::new(RwLock::new(None)),
})
}
async fn start(
&self,
tx: mpsc::UnboundedSender<DanmuMessageType>,
) -> Result<(), DanmuStreamError> {
let mut retry_count = 0;
const MAX_RETRIES: u32 = 5;
const RETRY_DELAY: Duration = Duration::from_secs(5);
info!(
"Douyin WebSocket connection started, room_id: {}",
self.room_id
);
loop {
if *self.stop.read().await {
break;
}
match self.connect_and_handle(tx.clone()).await {
Ok(_) => {
info!("Douyin WebSocket connection closed normally");
break;
}
Err(e) => {
error!("Douyin WebSocket connection error: {}", e);
retry_count += 1;
if retry_count >= MAX_RETRIES {
return Err(DanmuStreamError::WebsocketError {
err: format!("Failed to connect after {} retries", MAX_RETRIES),
});
}
info!(
"Retrying connection in {} seconds... (Attempt {}/{})",
RETRY_DELAY.as_secs(),
retry_count,
MAX_RETRIES
);
tokio::time::sleep(RETRY_DELAY).await;
}
}
}
Ok(())
}
async fn stop(&self) -> Result<(), DanmuStreamError> {
*self.stop.write().await = true;
if let Some(mut write) = self.write.write().await.take() {
if let Err(e) = write.close().await {
error!("Failed to close WebSocket connection: {}", e);
}
}
Ok(())
}
}
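A minimal sketch of how this provider could be driven, under the assumption that the surrounding crate's `DanmuProvider`, `DanmuMessageType` and `DanmuStreamError` are in scope and a valid Douyin cookie string is available (hypothetical `run_douyin` helper, not part of this file):

async fn run_douyin(room_id: u64, cookie: &str) -> Result<(), DanmuStreamError> {
    // `new` only stores the cookie and room id; the websocket work happens in `start`.
    let provider = DouyinDanmu::new(cookie, room_id).await?;
    let (tx, mut rx) = tokio::sync::mpsc::unbounded_channel();

    // Consume parsed messages while the connection task runs.
    tokio::spawn(async move {
        while let Some(msg) = rx.recv().await {
            if let DanmuMessageType::DanmuMessage(d) = msg {
                log::info!("[{}] {}: {}", d.room_id, d.user_name, d.message);
            }
        }
    });

    provider.start(tx).await
}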

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,861 @@
use prost::Message;
use std::collections::HashMap;
// message Response {
// repeated Message messagesList = 1;
// string cursor = 2;
// uint64 fetchInterval = 3;
// uint64 now = 4;
// string internalExt = 5;
// uint32 fetchType = 6;
// map<string, string> routeParams = 7;
// uint64 heartbeatDuration = 8;
// bool needAck = 9;
// string pushServer = 10;
// string liveCursor = 11;
// bool historyNoMore = 12;
// }
#[derive(Message)]
pub struct Response {
#[prost(message, repeated, tag = "1")]
pub messages_list: Vec<CommonMessage>,
#[prost(string, tag = "2")]
pub cursor: String,
#[prost(uint64, tag = "3")]
pub fetch_interval: u64,
#[prost(uint64, tag = "4")]
pub now: u64,
#[prost(string, tag = "5")]
pub internal_ext: String,
#[prost(uint32, tag = "6")]
pub fetch_type: u32,
#[prost(map = "string, string", tag = "7")]
pub route_params: HashMap<String, String>,
#[prost(uint64, tag = "8")]
pub heartbeat_duration: u64,
#[prost(bool, tag = "9")]
pub need_ack: bool,
#[prost(string, tag = "10")]
pub push_server: String,
#[prost(string, tag = "11")]
pub live_cursor: String,
#[prost(bool, tag = "12")]
pub history_no_more: bool,
}
#[derive(Message)]
pub struct CommonMessage {
#[prost(string, tag = "1")]
pub method: String,
#[prost(bytes, tag = "2")]
pub payload: Vec<u8>,
#[prost(int64, tag = "3")]
pub msg_id: i64,
#[prost(int32, tag = "4")]
pub msg_type: i32,
#[prost(int64, tag = "5")]
pub offset: i64,
#[prost(bool, tag = "6")]
pub need_wrds_store: bool,
#[prost(int64, tag = "7")]
pub wrds_version: i64,
#[prost(string, tag = "8")]
pub wrds_sub_key: String,
}
#[derive(Message)]
pub struct Common {
#[prost(string, tag = "1")]
pub method: String,
#[prost(uint64, tag = "2")]
pub msg_id: u64,
#[prost(uint64, tag = "3")]
pub room_id: u64,
#[prost(uint64, tag = "4")]
pub create_time: u64,
#[prost(uint32, tag = "5")]
pub monitor: u32,
#[prost(bool, tag = "6")]
pub is_show_msg: bool,
#[prost(string, tag = "7")]
pub describe: String,
#[prost(uint64, tag = "9")]
pub fold_type: u64,
#[prost(uint64, tag = "10")]
pub anchor_fold_type: u64,
#[prost(uint64, tag = "11")]
pub priority_score: u64,
#[prost(string, tag = "12")]
pub log_id: String,
#[prost(string, tag = "13")]
pub msg_process_filter_k: String,
#[prost(string, tag = "14")]
pub msg_process_filter_v: String,
#[prost(message, optional, tag = "15")]
pub user: Option<User>,
}
#[derive(Message)]
pub struct User {
#[prost(uint64, tag = "1")]
pub id: u64,
#[prost(uint64, tag = "2")]
pub short_id: u64,
#[prost(string, tag = "3")]
pub nick_name: String,
#[prost(uint32, tag = "4")]
pub gender: u32,
#[prost(string, tag = "5")]
pub signature: String,
#[prost(uint32, tag = "6")]
pub level: u32,
#[prost(uint64, tag = "7")]
pub birthday: u64,
#[prost(string, tag = "8")]
pub telephone: String,
#[prost(message, optional, tag = "9")]
pub avatar_thumb: Option<Image>,
#[prost(message, optional, tag = "10")]
pub avatar_medium: Option<Image>,
#[prost(message, optional, tag = "11")]
pub avatar_large: Option<Image>,
#[prost(bool, tag = "12")]
pub verified: bool,
#[prost(uint32, tag = "13")]
pub experience: u32,
#[prost(string, tag = "14")]
pub city: String,
#[prost(int32, tag = "15")]
pub status: i32,
#[prost(uint64, tag = "16")]
pub create_time: u64,
#[prost(uint64, tag = "17")]
pub modify_time: u64,
#[prost(uint32, tag = "18")]
pub secret: u32,
#[prost(string, tag = "19")]
pub share_qrcode_uri: String,
#[prost(uint32, tag = "20")]
pub income_share_percent: u32,
#[prost(message, repeated, tag = "21")]
pub badge_image_list: Vec<Image>,
#[prost(message, optional, tag = "22")]
pub follow_info: Option<FollowInfo>,
#[prost(message, optional, tag = "23")]
pub pay_grade: Option<PayGrade>,
#[prost(message, optional, tag = "24")]
pub fans_club: Option<FansClub>,
#[prost(string, tag = "26")]
pub special_id: String,
#[prost(message, optional, tag = "27")]
pub avatar_border: Option<Image>,
#[prost(message, optional, tag = "28")]
pub medal: Option<Image>,
#[prost(message, repeated, tag = "29")]
pub real_time_icons_list: Vec<Image>,
#[prost(string, tag = "38")]
pub display_id: String,
#[prost(string, tag = "46")]
pub sec_uid: String,
#[prost(uint64, tag = "1022")]
pub fan_ticket_count: u64,
#[prost(string, tag = "1028")]
pub id_str: String,
#[prost(uint32, tag = "1045")]
pub age_range: u32,
}
#[derive(Message, PartialEq)]
pub struct Image {
#[prost(string, repeated, tag = "1")]
pub url_list_list: Vec<String>,
#[prost(string, tag = "2")]
pub uri: String,
#[prost(uint64, tag = "3")]
pub height: u64,
#[prost(uint64, tag = "4")]
pub width: u64,
#[prost(string, tag = "5")]
pub avg_color: String,
#[prost(uint32, tag = "6")]
pub image_type: u32,
#[prost(string, tag = "7")]
pub open_web_url: String,
#[prost(message, optional, tag = "8")]
pub content: Option<ImageContent>,
#[prost(bool, tag = "9")]
pub is_animated: bool,
#[prost(message, optional, tag = "10")]
pub flex_setting_list: Option<NinePatchSetting>,
#[prost(message, optional, tag = "11")]
pub text_setting_list: Option<NinePatchSetting>,
}
#[derive(Message, PartialEq)]
pub struct ImageContent {
#[prost(string, tag = "1")]
pub name: String,
#[prost(string, tag = "2")]
pub font_color: String,
#[prost(uint64, tag = "3")]
pub level: u64,
#[prost(string, tag = "4")]
pub alternative_text: String,
}
#[derive(Message, PartialEq)]
pub struct NinePatchSetting {
#[prost(string, repeated, tag = "1")]
pub setting_list_list: Vec<String>,
}
#[derive(Message)]
pub struct FollowInfo {
#[prost(uint64, tag = "1")]
pub following_count: u64,
#[prost(uint64, tag = "2")]
pub follower_count: u64,
#[prost(uint64, tag = "3")]
pub follow_status: u64,
#[prost(uint64, tag = "4")]
pub push_status: u64,
#[prost(string, tag = "5")]
pub remark_name: String,
#[prost(string, tag = "6")]
pub follower_count_str: String,
#[prost(string, tag = "7")]
pub following_count_str: String,
}
#[derive(Message)]
pub struct PayGrade {
#[prost(int64, tag = "1")]
pub total_diamond_count: i64,
#[prost(message, optional, tag = "2")]
pub diamond_icon: Option<Image>,
#[prost(string, tag = "3")]
pub name: String,
#[prost(message, optional, tag = "4")]
pub icon: Option<Image>,
#[prost(string, tag = "5")]
pub next_name: String,
#[prost(int64, tag = "6")]
pub level: i64,
#[prost(message, optional, tag = "7")]
pub next_icon: Option<Image>,
#[prost(int64, tag = "8")]
pub next_diamond: i64,
#[prost(int64, tag = "9")]
pub now_diamond: i64,
#[prost(int64, tag = "10")]
pub this_grade_min_diamond: i64,
#[prost(int64, tag = "11")]
pub this_grade_max_diamond: i64,
#[prost(int64, tag = "12")]
pub pay_diamond_bak: i64,
#[prost(string, tag = "13")]
pub grade_describe: String,
#[prost(message, repeated, tag = "14")]
pub grade_icon_list: Vec<GradeIcon>,
#[prost(int64, tag = "15")]
pub screen_chat_type: i64,
#[prost(message, optional, tag = "16")]
pub im_icon: Option<Image>,
#[prost(message, optional, tag = "17")]
pub im_icon_with_level: Option<Image>,
#[prost(message, optional, tag = "18")]
pub live_icon: Option<Image>,
#[prost(message, optional, tag = "19")]
pub new_im_icon_with_level: Option<Image>,
#[prost(message, optional, tag = "20")]
pub new_live_icon: Option<Image>,
#[prost(int64, tag = "21")]
pub upgrade_need_consume: i64,
#[prost(string, tag = "22")]
pub next_privileges: String,
#[prost(message, optional, tag = "23")]
pub background: Option<Image>,
#[prost(message, optional, tag = "24")]
pub background_back: Option<Image>,
#[prost(int64, tag = "25")]
pub score: i64,
#[prost(message, optional, tag = "26")]
pub buff_info: Option<GradeBuffInfo>,
}
#[derive(Message)]
pub struct GradeIcon {
#[prost(message, optional, tag = "1")]
pub icon: Option<Image>,
#[prost(int64, tag = "2")]
pub icon_diamond: i64,
#[prost(int64, tag = "3")]
pub level: i64,
#[prost(string, tag = "4")]
pub level_str: String,
}
#[derive(Message)]
pub struct GradeBuffInfo {}
#[derive(Message)]
pub struct FansClub {
#[prost(message, optional, tag = "1")]
pub data: Option<FansClubData>,
#[prost(map = "int32, message", tag = "2")]
pub prefer_data: HashMap<i32, FansClubData>,
}
#[derive(Message, PartialEq)]
pub struct FansClubData {
#[prost(string, tag = "1")]
pub club_name: String,
#[prost(int32, tag = "2")]
pub level: i32,
#[prost(int32, tag = "3")]
pub user_fans_club_status: i32,
#[prost(message, optional, tag = "4")]
pub badge: Option<UserBadge>,
#[prost(int64, repeated, tag = "5")]
pub available_gift_ids: Vec<i64>,
#[prost(int64, tag = "6")]
pub anchor_id: i64,
}
#[derive(Message, PartialEq)]
pub struct UserBadge {
#[prost(map = "int32, message", tag = "1")]
pub icons: HashMap<i32, Image>,
#[prost(string, tag = "2")]
pub title: String,
}
#[derive(Message)]
pub struct PublicAreaCommon {
#[prost(message, optional, tag = "1")]
pub user_label: Option<Image>,
#[prost(uint64, tag = "2")]
pub user_consume_in_room: u64,
#[prost(uint64, tag = "3")]
pub user_send_gift_cnt_in_room: u64,
}
#[derive(Message)]
pub struct LandscapeAreaCommon {
#[prost(bool, tag = "1")]
pub show_head: bool,
#[prost(bool, tag = "2")]
pub show_nickname: bool,
#[prost(bool, tag = "3")]
pub show_font_color: bool,
#[prost(string, repeated, tag = "4")]
pub color_value_list: Vec<String>,
#[prost(enumeration = "CommentTypeTag", repeated, tag = "5")]
pub comment_type_tags_list: Vec<i32>,
}
#[derive(Message)]
pub struct Text {
#[prost(string, tag = "1")]
pub key: String,
#[prost(string, tag = "2")]
pub default_patter: String,
#[prost(message, optional, tag = "3")]
pub default_format: Option<TextFormat>,
#[prost(message, repeated, tag = "4")]
pub pieces_list: Vec<TextPiece>,
}
#[derive(Message)]
pub struct TextFormat {
#[prost(string, tag = "1")]
pub color: String,
#[prost(bool, tag = "2")]
pub bold: bool,
#[prost(bool, tag = "3")]
pub italic: bool,
#[prost(uint32, tag = "4")]
pub weight: u32,
#[prost(uint32, tag = "5")]
pub italic_angle: u32,
#[prost(uint32, tag = "6")]
pub font_size: u32,
#[prost(bool, tag = "7")]
pub use_heigh_light_color: bool,
#[prost(bool, tag = "8")]
pub use_remote_clor: bool,
}
#[derive(Message)]
pub struct TextPiece {
#[prost(bool, tag = "1")]
pub r#type: bool,
#[prost(message, optional, tag = "2")]
pub format: Option<TextFormat>,
#[prost(string, tag = "3")]
pub string_value: String,
#[prost(message, optional, tag = "4")]
pub user_value: Option<TextPieceUser>,
#[prost(message, optional, tag = "5")]
pub gift_value: Option<TextPieceGift>,
#[prost(message, optional, tag = "6")]
pub heart_value: Option<TextPieceHeart>,
#[prost(message, optional, tag = "7")]
pub pattern_ref_value: Option<TextPiecePatternRef>,
#[prost(message, optional, tag = "8")]
pub image_value: Option<TextPieceImage>,
}
#[derive(Message)]
pub struct TextPieceUser {
#[prost(message, optional, tag = "1")]
pub user: Option<User>,
#[prost(bool, tag = "2")]
pub with_colon: bool,
}
#[derive(Message)]
pub struct TextPieceGift {
#[prost(uint64, tag = "1")]
pub gift_id: u64,
#[prost(message, optional, tag = "2")]
pub name_ref: Option<PatternRef>,
}
#[derive(Message)]
pub struct PatternRef {
#[prost(string, tag = "1")]
pub key: String,
#[prost(string, tag = "2")]
pub default_pattern: String,
}
#[derive(Message)]
pub struct TextPieceHeart {
#[prost(string, tag = "1")]
pub color: String,
}
#[derive(Message)]
pub struct TextPiecePatternRef {
#[prost(string, tag = "1")]
pub key: String,
#[prost(string, tag = "2")]
pub default_pattern: String,
}
#[derive(Message)]
pub struct TextPieceImage {
#[prost(message, optional, tag = "1")]
pub image: Option<Image>,
#[prost(float, tag = "2")]
pub scaling_rate: f32,
}
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash, PartialOrd, Ord, ::prost::Enumeration)]
#[repr(i32)]
pub enum CommentTypeTag {
CommentTypeTagUnknown = 0,
CommentTypeTagStar = 1,
}
#[derive(Message)]
pub struct DouyinChatMessage {
#[prost(message, optional, tag = "1")]
pub common: Option<Common>,
#[prost(message, optional, tag = "2")]
pub user: Option<User>,
#[prost(string, tag = "3")]
pub content: String,
#[prost(bool, tag = "4")]
pub visible_to_sender: bool,
#[prost(message, optional, tag = "5")]
pub background_image: Option<Image>,
#[prost(string, tag = "6")]
pub full_screen_text_color: String,
#[prost(message, optional, tag = "7")]
pub background_image_v2: Option<Image>,
#[prost(message, optional, tag = "9")]
pub public_area_common: Option<PublicAreaCommon>,
#[prost(message, optional, tag = "10")]
pub gift_image: Option<Image>,
#[prost(uint64, tag = "11")]
pub agree_msg_id: u64,
#[prost(uint32, tag = "12")]
pub priority_level: u32,
#[prost(message, optional, tag = "13")]
pub landscape_area_common: Option<LandscapeAreaCommon>,
#[prost(uint64, tag = "15")]
pub event_time: u64,
#[prost(bool, tag = "16")]
pub send_review: bool,
#[prost(bool, tag = "17")]
pub from_intercom: bool,
#[prost(bool, tag = "18")]
pub intercom_hide_user_card: bool,
#[prost(string, tag = "20")]
pub chat_by: String,
#[prost(uint32, tag = "21")]
pub individual_chat_priority: u32,
#[prost(message, optional, tag = "22")]
pub rtf_content: Option<Text>,
}
#[derive(Message)]
pub struct GiftMessage {
#[prost(message, optional, tag = "1")]
pub common: Option<Common>,
#[prost(uint64, tag = "2")]
pub gift_id: u64,
#[prost(uint64, tag = "3")]
pub fan_ticket_count: u64,
#[prost(uint64, tag = "4")]
pub group_count: u64,
#[prost(uint64, tag = "5")]
pub repeat_count: u64,
#[prost(uint64, tag = "6")]
pub combo_count: u64,
#[prost(message, optional, tag = "7")]
pub user: Option<User>,
#[prost(message, optional, tag = "8")]
pub to_user: Option<User>,
#[prost(uint32, tag = "9")]
pub repeat_end: u32,
#[prost(message, optional, tag = "10")]
pub text_effect: Option<TextEffect>,
#[prost(uint64, tag = "11")]
pub group_id: u64,
#[prost(uint64, tag = "12")]
pub income_taskgifts: u64,
#[prost(uint64, tag = "13")]
pub room_fan_ticket_count: u64,
#[prost(message, optional, tag = "14")]
pub priority: Option<GiftIMPriority>,
#[prost(message, optional, tag = "15")]
pub gift: Option<GiftStruct>,
#[prost(string, tag = "16")]
pub log_id: String,
#[prost(uint64, tag = "17")]
pub send_type: u64,
#[prost(message, optional, tag = "18")]
pub public_area_common: Option<PublicAreaCommon>,
#[prost(message, optional, tag = "19")]
pub tray_display_text: Option<Text>,
#[prost(uint64, tag = "20")]
pub banned_display_effects: u64,
#[prost(bool, tag = "25")]
pub display_for_self: bool,
#[prost(string, tag = "26")]
pub interact_gift_info: String,
#[prost(string, tag = "27")]
pub diy_item_info: String,
#[prost(uint64, repeated, tag = "28")]
pub min_asset_set_list: Vec<u64>,
#[prost(uint64, tag = "29")]
pub total_count: u64,
#[prost(uint32, tag = "30")]
pub client_gift_source: u32,
#[prost(uint64, repeated, tag = "32")]
pub to_user_ids_list: Vec<u64>,
#[prost(uint64, tag = "33")]
pub send_time: u64,
#[prost(uint64, tag = "34")]
pub force_display_effects: u64,
#[prost(string, tag = "35")]
pub trace_id: String,
#[prost(uint64, tag = "36")]
pub effect_display_ts: u64,
}
#[derive(Message)]
pub struct GiftStruct {
#[prost(message, optional, tag = "1")]
pub image: Option<Image>,
#[prost(string, tag = "2")]
pub describe: String,
#[prost(bool, tag = "3")]
pub notify: bool,
#[prost(uint64, tag = "4")]
pub duration: u64,
#[prost(uint64, tag = "5")]
pub id: u64,
#[prost(bool, tag = "7")]
pub for_linkmic: bool,
#[prost(bool, tag = "8")]
pub doodle: bool,
#[prost(bool, tag = "9")]
pub for_fansclub: bool,
#[prost(bool, tag = "10")]
pub combo: bool,
#[prost(uint32, tag = "11")]
pub r#type: u32,
#[prost(uint32, tag = "12")]
pub diamond_count: u32,
#[prost(bool, tag = "13")]
pub is_displayed_on_panel: bool,
#[prost(uint64, tag = "14")]
pub primary_effect_id: u64,
#[prost(message, optional, tag = "15")]
pub gift_label_icon: Option<Image>,
#[prost(string, tag = "16")]
pub name: String,
#[prost(string, tag = "17")]
pub region: String,
#[prost(string, tag = "18")]
pub manual: String,
#[prost(bool, tag = "19")]
pub for_custom: bool,
#[prost(message, optional, tag = "21")]
pub icon: Option<Image>,
#[prost(uint32, tag = "22")]
pub action_type: u32,
}
#[derive(Message)]
pub struct GiftIMPriority {
#[prost(uint64, repeated, tag = "1")]
pub queue_sizes_list: Vec<u64>,
#[prost(uint64, tag = "2")]
pub self_queue_priority: u64,
#[prost(uint64, tag = "3")]
pub priority: u64,
}
#[derive(Message)]
pub struct TextEffect {
#[prost(message, optional, tag = "1")]
pub portrait: Option<TextEffectDetail>,
#[prost(message, optional, tag = "2")]
pub landscape: Option<TextEffectDetail>,
}
#[derive(Message)]
pub struct TextEffectDetail {
#[prost(message, optional, tag = "1")]
pub text: Option<Text>,
#[prost(uint32, tag = "2")]
pub text_font_size: u32,
#[prost(message, optional, tag = "3")]
pub background: Option<Image>,
#[prost(uint32, tag = "4")]
pub start: u32,
#[prost(uint32, tag = "5")]
pub duration: u32,
#[prost(uint32, tag = "6")]
pub x: u32,
#[prost(uint32, tag = "7")]
pub y: u32,
#[prost(uint32, tag = "8")]
pub width: u32,
#[prost(uint32, tag = "9")]
pub height: u32,
#[prost(uint32, tag = "10")]
pub shadow_dx: u32,
#[prost(uint32, tag = "11")]
pub shadow_dy: u32,
#[prost(uint32, tag = "12")]
pub shadow_radius: u32,
#[prost(string, tag = "13")]
pub shadow_color: String,
#[prost(string, tag = "14")]
pub stroke_color: String,
#[prost(uint32, tag = "15")]
pub stroke_width: u32,
}
#[derive(Message)]
pub struct LikeMessage {
#[prost(message, optional, tag = "1")]
pub common: Option<Common>,
#[prost(uint64, tag = "2")]
pub count: u64,
#[prost(uint64, tag = "3")]
pub total: u64,
#[prost(uint64, tag = "4")]
pub color: u64,
#[prost(message, optional, tag = "5")]
pub user: Option<User>,
#[prost(string, tag = "6")]
pub icon: String,
#[prost(message, optional, tag = "7")]
pub double_like_detail: Option<DoubleLikeDetail>,
#[prost(message, optional, tag = "8")]
pub display_control_info: Option<DisplayControlInfo>,
#[prost(uint64, tag = "9")]
pub linkmic_guest_uid: u64,
#[prost(string, tag = "10")]
pub scene: String,
#[prost(message, optional, tag = "11")]
pub pico_display_info: Option<PicoDisplayInfo>,
}
#[derive(Message)]
pub struct DoubleLikeDetail {
#[prost(bool, tag = "1")]
pub double_flag: bool,
#[prost(uint32, tag = "2")]
pub seq_id: u32,
#[prost(uint32, tag = "3")]
pub renewals_num: u32,
#[prost(uint32, tag = "4")]
pub triggers_num: u32,
}
#[derive(Message)]
pub struct DisplayControlInfo {
#[prost(bool, tag = "1")]
pub show_text: bool,
#[prost(bool, tag = "2")]
pub show_icons: bool,
}
#[derive(Message)]
pub struct PicoDisplayInfo {
#[prost(uint64, tag = "1")]
pub combo_sum_count: u64,
#[prost(string, tag = "2")]
pub emoji: String,
#[prost(message, optional, tag = "3")]
pub emoji_icon: Option<Image>,
#[prost(string, tag = "4")]
pub emoji_text: String,
}
#[derive(Message)]
pub struct MemberMessage {
#[prost(message, optional, tag = "1")]
pub common: Option<Common>,
#[prost(message, optional, tag = "2")]
pub user: Option<User>,
#[prost(uint64, tag = "3")]
pub member_count: u64,
#[prost(message, optional, tag = "4")]
pub operator: Option<User>,
#[prost(bool, tag = "5")]
pub is_set_to_admin: bool,
#[prost(bool, tag = "6")]
pub is_top_user: bool,
#[prost(uint64, tag = "7")]
pub rank_score: u64,
#[prost(uint64, tag = "8")]
pub top_user_no: u64,
#[prost(uint64, tag = "9")]
pub enter_type: u64,
#[prost(uint64, tag = "10")]
pub action: u64,
#[prost(string, tag = "11")]
pub action_description: String,
#[prost(uint64, tag = "12")]
pub user_id: u64,
#[prost(message, optional, tag = "13")]
pub effect_config: Option<EffectConfig>,
#[prost(string, tag = "14")]
pub pop_str: String,
#[prost(message, optional, tag = "15")]
pub enter_effect_config: Option<EffectConfig>,
#[prost(message, optional, tag = "16")]
pub background_image: Option<Image>,
#[prost(message, optional, tag = "17")]
pub background_image_v2: Option<Image>,
#[prost(message, optional, tag = "18")]
pub anchor_display_text: Option<Text>,
#[prost(message, optional, tag = "19")]
pub public_area_common: Option<PublicAreaCommon>,
#[prost(uint64, tag = "20")]
pub user_enter_tip_type: u64,
#[prost(uint64, tag = "21")]
pub anchor_enter_tip_type: u64,
}
#[derive(Message)]
pub struct EffectConfig {
#[prost(uint64, tag = "1")]
pub r#type: u64,
#[prost(message, optional, tag = "2")]
pub icon: Option<Image>,
#[prost(uint64, tag = "3")]
pub avatar_pos: u64,
#[prost(message, optional, tag = "4")]
pub text: Option<Text>,
#[prost(message, optional, tag = "5")]
pub text_icon: Option<Image>,
#[prost(uint32, tag = "6")]
pub stay_time: u32,
#[prost(uint64, tag = "7")]
pub anim_asset_id: u64,
#[prost(message, optional, tag = "8")]
pub badge: Option<Image>,
#[prost(uint64, repeated, tag = "9")]
pub flex_setting_array_list: Vec<u64>,
#[prost(message, optional, tag = "10")]
pub text_icon_overlay: Option<Image>,
#[prost(message, optional, tag = "11")]
pub animated_badge: Option<Image>,
#[prost(bool, tag = "12")]
pub has_sweep_light: bool,
#[prost(uint64, repeated, tag = "13")]
pub text_flex_setting_array_list: Vec<u64>,
#[prost(uint64, tag = "14")]
pub center_anim_asset_id: u64,
#[prost(message, optional, tag = "15")]
pub dynamic_image: Option<Image>,
#[prost(map = "string, string", tag = "16")]
pub extra_map: HashMap<String, String>,
#[prost(uint64, tag = "17")]
pub mp4_anim_asset_id: u64,
#[prost(uint64, tag = "18")]
pub priority: u64,
#[prost(uint64, tag = "19")]
pub max_wait_time: u64,
#[prost(string, tag = "20")]
pub dress_id: String,
#[prost(uint64, tag = "21")]
pub alignment: u64,
#[prost(uint64, tag = "22")]
pub alignment_offset: u64,
}
// message PushFrame {
// uint64 seqId = 1;
// uint64 logId = 2;
// uint64 service = 3;
// uint64 method = 4;
// repeated HeadersList headersList = 5;
// string payloadEncoding = 6;
// string payloadType = 7;
// bytes payload = 8;
// }
#[derive(Message)]
pub struct PushFrame {
#[prost(uint64, tag = "1")]
pub seq_id: u64,
#[prost(uint64, tag = "2")]
pub log_id: u64,
#[prost(uint64, tag = "3")]
pub service: u64,
#[prost(uint64, tag = "4")]
pub method: u64,
#[prost(message, repeated, tag = "5")]
pub headers_list: Vec<HeadersList>,
#[prost(string, tag = "6")]
pub payload_encoding: String,
#[prost(string, tag = "7")]
pub payload_type: String,
#[prost(bytes, tag = "8")]
pub payload: Vec<u8>,
}
// message HeadersList {
// string key = 1;
// string value = 2;
// }
#[derive(Message)]
pub struct HeadersList {
#[prost(string, tag = "1")]
pub key: String,
#[prost(string, tag = "2")]
pub value: String,
}
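For reference, a small sketch of how these prost-derived types round-trip, mirroring the ack frame built in the websocket handler above (assumes this module is in scope; the field values are arbitrary):

use prost::Message;

fn ack_frame_roundtrip() {
    // Build an ack frame the same way the handler above does, leaving the rest defaulted.
    let ack = PushFrame {
        payload_type: "ack".to_string(),
        log_id: 42,
        ..Default::default()
    };

    // prost encodes to protobuf wire bytes and decodes them back.
    let bytes = ack.encode_to_vec();
    let decoded = PushFrame::decode(bytes.as_slice()).expect("valid PushFrame");
    assert_eq!(decoded.payload_type, "ack");
    assert_eq!(decoded.log_id, 42);
}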

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -2372,6 +2372,12 @@
"const": "core:app:allow-set-app-theme",
"markdownDescription": "Enables the set_app_theme command without any pre-configured scope."
},
{
"description": "Enables the set_dock_visibility command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-set-dock-visibility",
"markdownDescription": "Enables the set_dock_visibility command without any pre-configured scope."
},
{
"description": "Enables the tauri_version command without any pre-configured scope.",
"type": "string",
@@ -2432,6 +2438,12 @@
"const": "core:app:deny-set-app-theme",
"markdownDescription": "Denies the set_app_theme command without any pre-configured scope."
},
{
"description": "Denies the set_dock_visibility command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-set-dock-visibility",
"markdownDescription": "Denies the set_dock_visibility command without any pre-configured scope."
},
{
"description": "Denies the tauri_version command without any pre-configured scope.",
"type": "string",

View File

@@ -2372,6 +2372,12 @@
"const": "core:app:allow-set-app-theme",
"markdownDescription": "Enables the set_app_theme command without any pre-configured scope."
},
{
"description": "Enables the set_dock_visibility command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-set-dock-visibility",
"markdownDescription": "Enables the set_dock_visibility command without any pre-configured scope."
},
{
"description": "Enables the tauri_version command without any pre-configured scope.",
"type": "string",
@@ -2432,6 +2438,12 @@
"const": "core:app:deny-set-app-theme",
"markdownDescription": "Denies the set_app_theme command without any pre-configured scope."
},
{
"description": "Denies the set_dock_visibility command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-set-dock-visibility",
"markdownDescription": "Denies the set_dock_visibility command without any pre-configured scope."
},
{
"description": "Denies the tauri_version command without any pre-configured scope.",
"type": "string",

View File

@@ -1,7 +1,6 @@
use std::path::{Path, PathBuf};
use chrono::Utc;
use platform_dirs::AppDirs;
use serde::{Deserialize, Serialize};
use crate::{recorder::PlatformType, recorder_manager::ClipRangeParams};
@@ -10,8 +9,6 @@ use crate::{recorder::PlatformType, recorder_manager::ClipRangeParams};
pub struct Config {
pub cache: String,
pub output: String,
pub webid: String,
pub webid_ts: i64,
pub live_start_notify: bool,
pub live_end_notify: bool,
pub clip_notify: bool,
@@ -26,6 +23,10 @@ pub struct Config {
pub clip_name_format: String,
#[serde(default = "default_auto_generate_config")]
pub auto_generate: AutoGenerateConfig,
#[serde(default = "default_status_check_interval")]
pub status_check_interval: u64,
#[serde(skip)]
pub config_path: String,
}
#[derive(Deserialize, Serialize, Clone)]
@@ -39,7 +40,7 @@ fn default_auto_subtitle() -> bool {
}
fn default_whisper_model() -> String {
"".to_string()
"whisper_model.bin".to_string()
}
fn default_whisper_prompt() -> String {
@@ -57,69 +58,69 @@ fn default_auto_generate_config() -> AutoGenerateConfig {
}
}
fn default_status_check_interval() -> u64 {
30
}
impl Config {
pub fn load() -> Self {
let app_dirs = AppDirs::new(Some("cn.vjoi.bili-shadowreplay"), false).unwrap();
let config_path = app_dirs.config_dir.join("Conf.toml");
pub fn load(
config_path: &PathBuf,
default_cache: &Path,
default_output: &Path,
) -> Result<Self, String> {
if let Ok(content) = std::fs::read_to_string(config_path) {
if let Ok(config) = toml::from_str(&content) {
return config;
if let Ok(mut config) = toml::from_str::<Config>(&content) {
config.config_path = config_path.to_str().unwrap().into();
return Ok(config);
}
}
if let Some(dir_path) = PathBuf::from(config_path).parent() {
if let Err(e) = std::fs::create_dir_all(dir_path) {
return Err(format!("Failed to create config dir: {e}"));
}
}
let config = Config {
webid: "".to_string(),
webid_ts: 0,
cache: app_dirs
.cache_dir
.join("cache")
.to_str()
.unwrap()
.to_string(),
output: app_dirs
.data_dir
.join("output")
.to_str()
.unwrap()
.to_string(),
cache: default_cache.to_str().unwrap().into(),
output: default_output.to_str().unwrap().into(),
live_start_notify: true,
live_end_notify: true,
clip_notify: true,
post_notify: true,
auto_subtitle: false,
whisper_model: "".to_string(),
whisper_prompt: "这是一段中文 你们好".to_string(),
clip_name_format: "[{room_id}][{live_id}][{title}][{created_at}].mp4".to_string(),
whisper_model: default_whisper_model(),
whisper_prompt: default_whisper_prompt(),
clip_name_format: default_clip_name_format(),
auto_generate: default_auto_generate_config(),
status_check_interval: default_status_check_interval(),
config_path: config_path.to_str().unwrap().into(),
};
config.save();
config
Ok(config)
}
pub fn save(&self) {
let content = toml::to_string(&self).unwrap();
let app_dirs = AppDirs::new(Some("cn.vjoi.bili-shadowreplay"), false).unwrap();
// Create app dirs if not exists
std::fs::create_dir_all(&app_dirs.config_dir).unwrap();
let config_path = app_dirs.config_dir.join("Conf.toml");
std::fs::write(config_path, content).unwrap();
if let Err(e) = std::fs::write(self.config_path.clone(), content) {
log::error!("Failed to save config: {} {}", e, self.config_path);
}
}
#[allow(dead_code)]
pub fn set_cache_path(&mut self, path: &str) {
self.cache = path.to_string();
self.save();
}
#[allow(dead_code)]
pub fn set_output_path(&mut self, path: &str) {
self.output = path.into();
self.save();
}
pub fn webid_expired(&self) -> bool {
let now = chrono::Utc::now().timestamp();
// expire in 20 hours
now - self.webid_ts > 72000
}
pub fn generate_clip_name(&self, params: &ClipRangeParams) -> PathBuf {
let platform = PlatformType::from_str(&params.platform).unwrap();
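A hedged sketch of how the reworked `Config::load` above might be called at startup; the paths below are placeholders rather than the app's real directories, and `Config` is assumed to be in scope:

use std::path::PathBuf;

fn load_config_example() -> Result<Config, String> {
    // Hypothetical locations; the real app derives these from its platform dirs.
    let config_path = PathBuf::from("/tmp/bsr/Conf.toml");
    let default_cache = PathBuf::from("/tmp/bsr/cache");
    let default_output = PathBuf::from("/tmp/bsr/output");
    Config::load(&config_path, &default_cache, &default_output)
}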

View File

@@ -3,6 +3,7 @@ use crate::recorder::PlatformType;
use super::Database;
use super::DatabaseError;
use chrono::Utc;
use rand::seq::SliceRandom;
use rand::Rng;
#[derive(Debug, Clone, serde::Serialize, sqlx::FromRow)]
@@ -19,7 +20,11 @@ pub struct AccountRow {
// accounts
impl Database {
// CREATE TABLE accounts (uid INTEGER PRIMARY KEY, name TEXT, avatar TEXT, csrf TEXT, cookies TEXT, created_at TEXT);
pub async fn add_account(&self, platform: &str, cookies: &str) -> Result<AccountRow, DatabaseError> {
pub async fn add_account(
&self,
platform: &str,
cookies: &str,
) -> Result<AccountRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let platform = PlatformType::from_str(platform).unwrap();
@@ -100,13 +105,15 @@ impl Database {
avatar: &str,
) -> Result<(), DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
let sql = sqlx::query("UPDATE accounts SET name = $1, avatar = $2 WHERE uid = $3 and platform = $4")
.bind(name)
.bind(avatar)
.bind(uid as i64)
.bind(platform)
.execute(&lock)
.await?;
let sql = sqlx::query(
"UPDATE accounts SET name = $1, avatar = $2 WHERE uid = $3 and platform = $4",
)
.bind(name)
.bind(avatar)
.bind(uid as i64)
.bind(platform)
.execute(&lock)
.await?;
if sql.rows_affected() != 1 {
return Err(DatabaseError::NotFoundError);
}
@@ -122,20 +129,30 @@ impl Database {
pub async fn get_account(&self, platform: &str, uid: u64) -> Result<AccountRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
Ok(
sqlx::query_as::<_, AccountRow>("SELECT * FROM accounts WHERE uid = $1 and platform = $2")
.bind(uid as i64)
.bind(platform)
.fetch_one(&lock)
.await?,
Ok(sqlx::query_as::<_, AccountRow>(
"SELECT * FROM accounts WHERE uid = $1 and platform = $2",
)
.bind(uid as i64)
.bind(platform)
.fetch_one(&lock)
.await?)
}
pub async fn get_account_by_platform(&self, platform: &str) -> Result<AccountRow, DatabaseError> {
pub async fn get_account_by_platform(
&self,
platform: &str,
) -> Result<AccountRow, DatabaseError> {
let lock = self.db.read().await.clone().unwrap();
Ok(sqlx::query_as::<_, AccountRow>("SELECT * FROM accounts WHERE platform = $1")
.bind(platform)
.fetch_one(&lock)
.await?)
let accounts =
sqlx::query_as::<_, AccountRow>("SELECT * FROM accounts WHERE platform = $1")
.bind(platform)
.fetch_all(&lock)
.await?;
if accounts.is_empty() {
return Err(DatabaseError::NotFoundError);
}
// randomly select one account
let account = accounts.choose(&mut rand::thread_rng()).unwrap();
Ok(account.clone())
}
}
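The random pick above relies on rand's `SliceRandom::choose`; a tiny self-contained sketch of just that call, detached from the database types:

use rand::seq::SliceRandom;

fn pick_one(accounts: &[String]) -> Option<&String> {
    // `choose` returns None only for an empty slice, which the caller above
    // has already turned into NotFoundError.
    accounts.choose(&mut rand::thread_rng())
}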

View File

@@ -1,7 +1,7 @@
use std::path::{Path, PathBuf};
use std::process::Stdio;
use crate::progress_event::ProgressReporterTrait;
use crate::progress_reporter::ProgressReporterTrait;
use async_ffmpeg_sidecar::event::FfmpegEvent;
use async_ffmpeg_sidecar::log_parser::FfmpegLogParser;
use tokio::io::BufReader;
@@ -11,7 +11,17 @@ pub async fn clip_from_m3u8(
m3u8_index: &Path,
output_path: &Path,
) -> Result<(), String> {
let child = tokio::process::Command::new("ffmpeg")
// first check output folder exists
let output_folder = output_path.parent().unwrap();
if !output_folder.exists() {
log::warn!(
"Output folder does not exist, creating: {}",
output_folder.display()
);
std::fs::create_dir_all(output_folder).unwrap();
}
let child = tokio::process::Command::new(ffmpeg_path())
.args(["-i", &format!("{}", m3u8_index.display())])
.args(["-c", "copy"])
.args(["-y", output_path.to_str().unwrap()])
@@ -40,6 +50,9 @@ pub async fn clip_from_m3u8(
.update(format!("编码中:{}", p.time).as_str())
}
FfmpegEvent::LogEOF => break,
FfmpegEvent::Log(_level, content) => {
log::debug!("{}", content);
}
FfmpegEvent::Error(e) => {
log::error!("Clip error: {}", e);
clip_error = Some(e.to_string());
@@ -68,7 +81,7 @@ pub async fn extract_audio(file: &Path) -> Result<(), String> {
let output_path = file.with_extension("wav");
let mut extract_error = None;
let child = tokio::process::Command::new("ffmpeg")
let child = tokio::process::Command::new(ffmpeg_path())
.args(["-i", file.to_str().unwrap()])
.args(["-ar", "16000"])
.args([output_path.to_str().unwrap()])
@@ -93,6 +106,9 @@ pub async fn extract_audio(file: &Path) -> Result<(), String> {
extract_error = Some(e.to_string());
}
FfmpegEvent::LogEOF => break,
FfmpegEvent::Log(_level, content) => {
log::debug!("{}", content);
}
_ => {}
}
}
@@ -147,7 +163,7 @@ pub async fn encode_video_subtitle(
let vf = format!("subtitles={}:force_style='{}'", subtitle, srt_style);
log::info!("vf: {}", vf);
let child = tokio::process::Command::new("ffmpeg")
let child = tokio::process::Command::new(ffmpeg_path())
.args(["-i", file.to_str().unwrap()])
.args(["-vf", vf.as_str()])
.args(["-c:v", "libx264"])
@@ -178,6 +194,9 @@ pub async fn encode_video_subtitle(
reporter.update(format!("压制中:{}", p.time).as_str());
}
FfmpegEvent::LogEOF => break,
FfmpegEvent::Log(_level, content) => {
log::debug!("{}", content);
}
_ => {}
}
}
@@ -227,7 +246,7 @@ pub async fn encode_video_danmu(
format!("'{}'", subtitle.display())
};
let child = tokio::process::Command::new("ffmpeg")
let child = tokio::process::Command::new(ffmpeg_path())
.args(["-i", file.to_str().unwrap()])
.args(["-vf", &format!("ass={}", subtitle)])
.args(["-c:v", "libx264"])
@@ -262,6 +281,9 @@ pub async fn encode_video_danmu(
.unwrap()
.update(format!("压制中:{}", p.time).as_str());
}
FfmpegEvent::Log(_level, content) => {
log::debug!("{}", content);
}
FfmpegEvent::LogEOF => break,
_ => {}
}
@@ -280,3 +302,51 @@ pub async fn encode_video_danmu(
Ok(output_path)
}
}
/// Try to run `ffmpeg -version` and parse the version string from its output.
pub async fn check_ffmpeg() -> Result<String, String> {
let child = tokio::process::Command::new(ffmpeg_path())
.arg("-version")
.stdout(Stdio::piped())
.spawn();
if let Err(e) = child {
log::error!("Faild to spwan ffmpeg process: {e}");
return Err(e.to_string());
}
let mut child = child.unwrap();
let stdout = child.stdout.take();
if stdout.is_none() {
log::error!("Failed to take ffmpeg output");
return Err("Failed to take ffmpeg output".into());
}
let stdout = stdout.unwrap();
let reader = BufReader::new(stdout);
let mut parser = FfmpegLogParser::new(reader);
let mut version = None;
while let Ok(event) = parser.parse_next_event().await {
match event {
FfmpegEvent::ParsedVersion(v) => version = Some(v.version),
FfmpegEvent::LogEOF => break,
_ => {}
}
}
if let Some(version) = version {
Ok(version)
} else {
Err("Failed to parse version from output".into())
}
}
fn ffmpeg_path() -> PathBuf {
let mut path = Path::new("ffmpeg").to_path_buf();
if cfg!(windows) {
path.set_extension("exe");
}
path
}
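A minimal sketch of how the version probe above might be wired into startup (hypothetical call site, assuming a tokio runtime and this module in scope):

async fn ensure_ffmpeg() {
    match check_ffmpeg().await {
        Ok(version) => log::info!("ffmpeg available, version {}", version),
        Err(e) => log::warn!("ffmpeg not found or not runnable: {}", e),
    }
}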

View File

@@ -1,30 +1,28 @@
use crate::database::account::AccountRow;
use crate::recorder::bilibili::client::{QrInfo, QrStatus};
use crate::state::State;
use crate::state_type;
#[cfg(feature = "gui")]
use tauri::State as TauriState;
#[tauri::command]
pub async fn get_accounts(state: TauriState<'_, State>) -> Result<super::AccountInfo, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_accounts(state: state_type!()) -> Result<super::AccountInfo, String> {
let account_info = super::AccountInfo {
accounts: state.db.get_accounts().await?,
};
Ok(account_info)
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn add_account(
state: TauriState<'_, State>,
state: state_type!(),
platform: String,
cookies: &str,
) -> Result<AccountRow, String> {
let account = state.db.add_account(&platform, cookies).await?;
if platform == "bilibili" {
state.config.write().await.webid = state.client.fetch_webid(&account).await?;
state.config.write().await.webid_ts = chrono::Utc::now().timestamp();
let account_info = state
.client
.get_user_info(&state.config.read().await.webid, &account, account.uid)
.await?;
let account_info = state.client.get_user_info(&account, account.uid).await?;
state
.db
.update_account(
@@ -38,9 +36,9 @@ pub async fn add_account(
Ok(account)
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn remove_account(
state: TauriState<'_, State>,
state: state_type!(),
platform: String,
uid: u64,
) -> Result<(), String> {
@@ -51,24 +49,21 @@ pub async fn remove_account(
Ok(state.db.remove_account(&platform, uid).await?)
}
#[tauri::command]
pub async fn get_account_count(state: TauriState<'_, State>) -> Result<u64, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_account_count(state: state_type!()) -> Result<u64, String> {
Ok(state.db.get_accounts().await?.len() as u64)
}
#[tauri::command]
pub async fn get_qr_status(
state: tauri::State<'_, State>,
qrcode_key: &str,
) -> Result<QrStatus, ()> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_qr_status(state: state_type!(), qrcode_key: &str) -> Result<QrStatus, ()> {
match state.client.get_qr_status(qrcode_key).await {
Ok(qr_status) => Ok(qr_status),
Err(_e) => Err(()),
}
}
#[tauri::command]
pub async fn get_qr(state: tauri::State<'_, State>) -> Result<QrInfo, ()> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_qr(state: state_type!()) -> Result<QrInfo, ()> {
match state.client.get_qr().await {
Ok(qr_info) => Ok(qr_info),
Err(_e) => Err(()),

View File

@@ -1,17 +1,18 @@
use crate::config::Config;
use crate::state::State;
use crate::state_type;
#[cfg(feature = "gui")]
use tauri::State as TauriState;
#[tauri::command]
pub async fn get_config(state: TauriState<'_, State>) -> Result<Config, ()> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_config(state: state_type!()) -> Result<Config, ()> {
Ok(state.config.read().await.clone())
}
#[tauri::command]
pub async fn set_cache_path(
state: TauriState<'_, State>,
cache_path: String,
) -> Result<(), String> {
#[cfg_attr(feature = "gui", tauri::command)]
#[allow(dead_code)]
pub async fn set_cache_path(state: state_type!(), cache_path: String) -> Result<(), String> {
let old_cache_path = state.config.read().await.cache.clone();
if old_cache_path == cache_path {
return Ok(());
@@ -76,8 +77,9 @@ pub async fn set_cache_path(
Ok(())
}
#[tauri::command]
pub async fn set_output_path(state: TauriState<'_, State>, output_path: String) -> Result<(), ()> {
#[cfg_attr(feature = "gui", tauri::command)]
#[allow(dead_code)]
pub async fn set_output_path(state: state_type!(), output_path: String) -> Result<(), ()> {
let mut config = state.config.write().await;
let old_output_path = config.output.clone();
if old_output_path == output_path {
@@ -123,9 +125,9 @@ pub async fn set_output_path(state: TauriState<'_, State>, output_path: String)
Ok(())
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_notify(
state: TauriState<'_, State>,
state: state_type!(),
live_start_notify: bool,
live_end_notify: bool,
clip_notify: bool,
@@ -139,29 +141,23 @@ pub async fn update_notify(
Ok(())
}
#[tauri::command]
pub async fn update_whisper_model(
state: TauriState<'_, State>,
whisper_model: String,
) -> Result<(), ()> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_whisper_model(state: state_type!(), whisper_model: String) -> Result<(), ()> {
state.config.write().await.whisper_model = whisper_model;
state.config.write().await.save();
Ok(())
}
#[tauri::command]
pub async fn update_subtitle_setting(
state: TauriState<'_, State>,
auto_subtitle: bool,
) -> Result<(), ()> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_subtitle_setting(state: state_type!(), auto_subtitle: bool) -> Result<(), ()> {
state.config.write().await.auto_subtitle = auto_subtitle;
state.config.write().await.save();
Ok(())
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_clip_name_format(
state: TauriState<'_, State>,
state: state_type!(),
clip_name_format: String,
) -> Result<(), ()> {
state.config.write().await.clip_name_format = clip_name_format;
@@ -169,19 +165,16 @@ pub async fn update_clip_name_format(
Ok(())
}
#[tauri::command]
pub async fn update_whisper_prompt(
state: TauriState<'_, State>,
whisper_prompt: String,
) -> Result<(), ()> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_whisper_prompt(state: state_type!(), whisper_prompt: String) -> Result<(), ()> {
state.config.write().await.whisper_prompt = whisper_prompt;
state.config.write().await.save();
Ok(())
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_auto_generate(
state: tauri::State<'_, State>,
state: state_type!(),
enabled: bool,
encode_danmu: bool,
) -> Result<(), String> {
@@ -191,3 +184,17 @@ pub async fn update_auto_generate(
config.save();
Ok(())
}
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_status_check_interval(
state: state_type!(),
mut interval: u64,
) -> Result<(), ()> {
if interval < 10 {
interval = 10; // Minimum interval of 10 seconds
}
log::info!("Updating status check interval to {} seconds", interval);
state.config.write().await.status_check_interval = interval;
state.config.write().await.save();
Ok(())
}

View File

@@ -0,0 +1,15 @@
#[cfg(feature = "gui")]
#[macro_export]
macro_rules! state_type {
() => {
TauriState<'_, State>
};
}
#[cfg(feature = "headless")]
#[macro_export]
macro_rules! state_type {
() => {
State
};
}
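// Illustrative sketch (not part of the diff): how a handler written against
// state_type!() resolves under each feature. The handler name and body are
// hypothetical and assume the usual handler-module imports (State, and
// TauriState under the gui feature).
//
// gui:      state: TauriState<'_, State>
// headless: state: State
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn example_handler(state: state_type!()) -> Result<u64, ()> {
    // shared config is read the same way in both builds
    Ok(state.config.read().await.status_check_interval)
}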

View File

@@ -1,18 +1,21 @@
use crate::database::message::MessageRow;
use crate::state::State;
use crate::state_type;
#[cfg(feature = "gui")]
use tauri::State as TauriState;
#[tauri::command]
pub async fn get_messages(state: TauriState<'_, State>) -> Result<Vec<MessageRow>, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_messages(state: state_type!()) -> Result<Vec<MessageRow>, String> {
Ok(state.db.get_messages().await?)
}
#[tauri::command]
pub async fn read_message(state: TauriState<'_, State>, id: i64) -> Result<(), String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn read_message(state: state_type!(), id: i64) -> Result<(), String> {
Ok(state.db.read_message(id).await?)
}
#[tauri::command]
pub async fn delete_message(state: TauriState<'_, State>, id: i64) -> Result<(), String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn delete_message(state: state_type!(), id: i64) -> Result<(), String> {
Ok(state.db.delete_message(id).await?)
}

View File

@@ -1,5 +1,6 @@
pub mod account;
pub mod config;
pub mod macros;
pub mod message;
pub mod recorder;
pub mod utils;

View File

@@ -6,16 +6,22 @@ use crate::recorder::PlatformType;
use crate::recorder::RecorderInfo;
use crate::recorder_manager::RecorderList;
use crate::state::State;
use crate::state_type;
#[cfg(feature = "gui")]
use tauri::State as TauriState;
#[tauri::command]
pub async fn get_recorder_list(state: TauriState<'_, State>) -> Result<RecorderList, ()> {
use serde::Deserialize;
use serde::Serialize;
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_recorder_list(state: state_type!()) -> Result<RecorderList, ()> {
Ok(state.recorder_manager.get_recorder_list().await)
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn add_recorder(
state: TauriState<'_, State>,
state: state_type!(),
platform: String,
room_id: u64,
) -> Result<RecorderRow, String> {
@@ -24,13 +30,9 @@ pub async fn add_recorder(
let account = match platform {
PlatformType::BiliBili => {
if let Ok(account) = state.db.get_account_by_platform("bilibili").await {
if state.config.read().await.webid_expired() {
log::info!("Webid expired, refetching");
state.config.write().await.webid = state.client.fetch_webid(&account).await?;
state.config.write().await.webid_ts = chrono::Utc::now().timestamp();
}
Ok(account)
} else {
log::error!("No available bilibili account found");
Err("没有可用账号,请先添加账号".to_string())
}
}
@@ -38,6 +40,7 @@ pub async fn add_recorder(
if let Ok(account) = state.db.get_account_by_platform("douyin").await {
Ok(account)
} else {
log::error!("No available douyin account found");
Err("没有可用账号,请先添加账号".to_string())
}
}
@@ -47,13 +50,7 @@ pub async fn add_recorder(
match account {
Ok(account) => match state
.recorder_manager
.add_recorder(
state.config.read().await.webid.as_str(),
&account,
platform,
room_id,
true,
)
.add_recorder(&account, platform, room_id, true)
.await
{
Ok(()) => {
@@ -64,18 +61,25 @@ pub async fn add_recorder(
.await?;
Ok(room)
}
Err(e) => Err(format!("添加失败: {}", e)),
Err(e) => {
log::error!("Failed to add recorder: {}", e);
Err(format!("添加失败: {}", e))
}
},
Err(e) => Err(format!("添加失败: {}", e)),
Err(e) => {
log::error!("Failed to add recorder: {}", e);
Err(format!("添加失败: {}", e))
}
}
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn remove_recorder(
state: TauriState<'_, State>,
state: state_type!(),
platform: String,
room_id: u64,
) -> Result<(), String> {
log::info!("Remove recorder: {} {}", platform, room_id);
let platform = PlatformType::from_str(&platform).unwrap();
match state
.recorder_manager
@@ -87,15 +91,19 @@ pub async fn remove_recorder(
.db
.new_message("移除直播间", &format!("移除了直播间 {}", room_id))
.await?;
Ok(state.db.remove_recorder(room_id).await?)
log::info!("Removed recorder: {} {}", platform.as_str(), room_id);
Ok(())
}
Err(e) => {
log::error!("Failed to remove recorder: {}", e);
Err(e.to_string())
}
Err(e) => Err(e.to_string()),
}
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_room_info(
state: TauriState<'_, State>,
state: state_type!(),
platform: String,
room_id: u64,
) -> Result<RecorderInfo, String> {
@@ -111,17 +119,14 @@ pub async fn get_room_info(
}
}
#[tauri::command]
pub async fn get_archives(
state: TauriState<'_, State>,
room_id: u64,
) -> Result<Vec<RecordRow>, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_archives(state: state_type!(), room_id: u64) -> Result<Vec<RecordRow>, String> {
Ok(state.recorder_manager.get_archives(room_id).await?)
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_archive(
state: TauriState<'_, State>,
state: state_type!(),
room_id: u64,
live_id: String,
) -> Result<RecordRow, String> {
@@ -131,9 +136,9 @@ pub async fn get_archive(
.await?)
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn delete_archive(
state: TauriState<'_, State>,
state: state_type!(),
platform: String,
room_id: u64,
live_id: String,
@@ -153,9 +158,9 @@ pub async fn delete_archive(
Ok(())
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_danmu_record(
state: TauriState<'_, State>,
state: state_type!(),
platform: String,
room_id: u64,
live_id: String,
@@ -167,33 +172,38 @@ pub async fn get_danmu_record(
.await?)
}
#[tauri::command]
pub async fn export_danmu(
state: TauriState<'_, State>,
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ExportDanmuOptions {
platform: String,
room_id: u64,
live_id: String,
x: i64,
y: i64,
offset: i64,
ass: bool,
}
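// Illustrative sketch (not part of the diff): with `rename_all = "camelCase"`,
// callers send keys like `roomId` and `liveId`. Values are made up; assumes
// serde_json is available and that this sits in the same module as
// ExportDanmuOptions (the fields are private).
#[allow(dead_code)]
fn export_danmu_options_example() {
    let json = r#"{"platform":"bilibili","roomId":123456,"liveId":"1700000000000","x":0,"y":60,"offset":0,"ass":true}"#;
    let opts: ExportDanmuOptions = serde_json::from_str(json).expect("valid payload");
    assert!(opts.ass && opts.y - opts.x == 60);
}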
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn export_danmu(
state: state_type!(),
options: ExportDanmuOptions,
) -> Result<String, String> {
let platform = PlatformType::from_str(&platform).unwrap();
let platform = PlatformType::from_str(&options.platform).unwrap();
let mut danmus = state
.recorder_manager
.get_danmu(platform, room_id, &live_id)
.get_danmu(platform, options.room_id, &options.live_id)
.await?;
log::debug!("First danmu entry: {:?}", danmus.first());
// update entry ts to offset
for d in &mut danmus {
d.ts -= (x + offset) * 1000;
d.ts -= (options.x + options.y) * 1000;
}
if x != 0 || y != 0 {
danmus.retain(|e| e.ts >= 0 && e.ts <= (y - x) * 1000);
if options.x != 0 || options.y != 0 {
danmus.retain(|e| e.ts >= 0 && e.ts <= (options.y - options.x) * 1000);
}
if ass {
if options.ass {
Ok(danmu2ass::danmu_to_ass(danmus))
} else {
// map and join entries
@@ -205,9 +215,9 @@ pub async fn export_danmu(
}
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn send_danmaku(
state: TauriState<'_, State>,
state: state_type!(),
uid: u64,
room_id: u64,
message: String,
@@ -220,25 +230,25 @@ pub async fn send_danmaku(
Ok(())
}
#[tauri::command]
pub async fn get_total_length(state: TauriState<'_, State>) -> Result<i64, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_total_length(state: state_type!()) -> Result<i64, String> {
match state.db.get_total_length().await {
Ok(total_length) => Ok(total_length),
Err(e) => Err(format!("Failed to get total length: {}", e)),
}
}
#[tauri::command]
pub async fn get_today_record_count(state: TauriState<'_, State>) -> Result<i64, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_today_record_count(state: state_type!()) -> Result<i64, String> {
match state.db.get_today_record_count().await {
Ok(count) => Ok(count),
Err(e) => Err(format!("Failed to get today record count: {}", e)),
}
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_recent_record(
state: TauriState<'_, State>,
state: state_type!(),
offset: u64,
limit: u64,
) -> Result<Vec<RecordRow>, String> {
@@ -248,45 +258,30 @@ pub async fn get_recent_record(
}
}
#[tauri::command]
pub async fn set_auto_start(
state: TauriState<'_, State>,
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn set_enable(
state: state_type!(),
platform: String,
room_id: u64,
auto_start: bool,
enabled: bool,
) -> Result<(), String> {
log::info!("Set enable for recorder {platform} {room_id} {enabled}");
let platform = PlatformType::from_str(&platform).unwrap();
state
.recorder_manager
.set_auto_start(platform, room_id, auto_start)
.set_enable(platform, room_id, enabled)
.await;
Ok(())
}
#[tauri::command]
pub async fn force_start(
state: TauriState<'_, State>,
platform: String,
room_id: u64,
) -> Result<(), String> {
let platform = PlatformType::from_str(&platform).unwrap();
state.recorder_manager.force_start(platform, room_id).await;
Ok(())
}
#[tauri::command]
pub async fn force_stop(
state: TauriState<'_, State>,
platform: String,
room_id: u64,
) -> Result<(), String> {
let platform = PlatformType::from_str(&platform).unwrap();
state.recorder_manager.force_stop(platform, room_id).await;
Ok(())
}
#[tauri::command]
pub async fn fetch_hls(state: TauriState<'_, State>, uri: String) -> Result<Vec<u8>, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn fetch_hls(state: state_type!(), uri: String) -> Result<Vec<u8>, String> {
// Handle wildcard pattern in the URI
let uri = if uri.contains("/hls/") {
uri.split("/hls/").last().unwrap_or(&uri).to_string()
} else {
uri
};
state
.recorder_manager
.handle_hls_request(&uri)

View File

@@ -1,13 +1,20 @@
use std::process::Command;
use std::path::PathBuf;
use tauri::{Manager, Theme};
use tauri_utils::config::WindowEffectsConfig;
use tokio::fs::OpenOptions;
use tokio::io::AsyncWriteExt;
use crate::recorder::PlatformType;
use crate::state::State;
use crate::state_type;
#[cfg(feature = "gui")]
use {
crate::recorder::PlatformType,
std::process::Command,
tauri::State as TauriState,
tauri::{Manager, Theme},
tauri_utils::config::WindowEffectsConfig,
tokio::fs::OpenOptions,
tokio::io::AsyncWriteExt,
};
#[allow(dead_code)]
pub fn copy_dir_all(
src: impl AsRef<std::path::Path>,
dst: impl AsRef<std::path::Path>,
@@ -25,7 +32,8 @@ pub fn copy_dir_all(
Ok(())
}
#[tauri::command]
#[cfg(feature = "gui")]
#[cfg_attr(feature = "gui", tauri::command)]
pub fn show_in_folder(path: String) {
#[cfg(target_os = "windows")]
{
@@ -81,39 +89,90 @@ pub fn show_in_folder(path: String) {
pub struct DiskInfo {
disk: String,
total: u64,
free: u64,
pub free: u64,
}
#[tauri::command]
pub async fn get_disk_info(state: tauri::State<'_, State>) -> Result<DiskInfo, ()> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_disk_info(state: state_type!()) -> Result<DiskInfo, ()> {
let cache = state.config.read().await.cache.clone();
// check system disk info
let disks = sysinfo::Disks::new_with_refreshed_list();
// get cache disk info
let mut disk_info = DiskInfo {
disk: "".into(),
total: 0,
free: 0,
};
// Find the disk with the longest matching mount point
let mut longest_match = 0;
for disk in disks.list() {
let mount_point = disk.mount_point().to_str().unwrap();
if cache.starts_with(mount_point) && mount_point.split("/").count() > longest_match {
disk_info.disk = mount_point.into();
disk_info.total = disk.total_space();
disk_info.free = disk.available_space();
longest_match = mount_point.split("/").count();
}
// if cache is relative path, convert it to absolute path
let mut cache = PathBuf::from(&cache);
if cache.is_relative() {
// get current working directory
let cwd = std::env::current_dir().unwrap();
cache = cwd.join(cache);
}
Ok(disk_info)
get_disk_info_inner(cache).await
}
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn console_log(_state: state_type!(), level: &str, message: &str) -> Result<(), ()> {
match level {
"error" => log::error!("[frontend] {}", message),
"warn" => log::warn!("[frontend] {}", message),
"info" => log::info!("[frontend] {}", message),
_ => log::debug!("[frontend] {}", message),
}
Ok(())
}
pub async fn get_disk_info_inner(target: PathBuf) -> Result<DiskInfo, ()> {
#[cfg(target_os = "linux")]
{
// get disk info from df command
let output = tokio::process::Command::new("df")
.arg(target)
.output()
.await
.unwrap();
let output_str = String::from_utf8(output.stdout).unwrap();
// Filesystem 1K-blocks Used Available Use% Mounted on
// /dev/nvme0n1p2 959218776 43826092 866593352 5% /app/cache
let lines = output_str.lines().collect::<Vec<&str>>();
if lines.len() < 2 {
log::error!("df command output is too short: {}", output_str);
return Err(());
}
let parts = lines[1].split_whitespace().collect::<Vec<&str>>();
let disk = parts[0].to_string();
let total = parts[1].parse::<u64>().unwrap() * 1024;
let free = parts[3].parse::<u64>().unwrap() * 1024;
return Ok(DiskInfo { disk, total, free });
}
#[cfg(any(target_os = "windows", target_os = "macos"))]
{
// check system disk info
let disks = sysinfo::Disks::new_with_refreshed_list();
// get target disk info
let mut disk_info = DiskInfo {
disk: "".into(),
total: 0,
free: 0,
};
// Find the disk with the longest matching mount point
let mut longest_match = 0;
for disk in disks.list() {
let mount_point = disk.mount_point().to_str().unwrap();
if target.starts_with(mount_point) && mount_point.split("/").count() > longest_match {
disk_info.disk = mount_point.into();
disk_info.total = disk.total_space();
disk_info.free = disk.available_space();
longest_match = mount_point.split("/").count();
}
}
Ok(disk_info)
}
}
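// Usage sketch (not part of the diff): the same helper backs the free-space
// check done before clipping; the 1 GiB threshold mirrors that check, the
// function name is hypothetical.
#[allow(dead_code)]
async fn has_one_gib_free(dir: PathBuf) -> bool {
    match get_disk_info_inner(dir).await {
        Ok(info) => info.free >= 1024 * 1024 * 1024,
        // fail open: if disk info cannot be determined, do not block the caller
        Err(()) => true,
    }
}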
#[cfg(feature = "gui")]
#[tauri::command]
pub async fn export_to_file(
_state: tauri::State<'_, State>,
_state: state_type!(),
file_name: &str,
content: &str,
) -> Result<(), String> {
@@ -136,59 +195,66 @@ pub async fn export_to_file(
Ok(())
}
#[cfg(feature = "gui")]
#[tauri::command]
pub async fn open_log_folder(state: tauri::State<'_, State>) -> Result<(), String> {
let log_dir = state.app_handle.path().app_log_dir().unwrap();
show_in_folder(log_dir.to_str().unwrap().to_string());
pub async fn open_log_folder(state: state_type!()) -> Result<(), String> {
#[cfg(feature = "gui")]
{
let log_dir = state.app_handle.path().app_log_dir().unwrap();
show_in_folder(log_dir.to_str().unwrap().to_string());
}
Ok(())
}
#[cfg(feature = "gui")]
#[tauri::command]
pub async fn open_live(
state: tauri::State<'_, State>,
state: state_type!(),
platform: String,
room_id: u64,
live_id: String,
) -> Result<(), String> {
log::info!("Open player window: {} {}", room_id, live_id);
let platform = PlatformType::from_str(&platform).unwrap();
let recorder_info = state
.recorder_manager
.get_recorder_info(platform, room_id)
.await
.unwrap();
let handle = state.app_handle.clone();
let builder = tauri::WebviewWindowBuilder::new(
&handle,
format!("Live:{}:{}", room_id, live_id),
tauri::WebviewUrl::App(
format!(
"live_index.html?platform={}&room_id={}&live_id={}",
platform.as_str(),
room_id,
live_id
)
.into(),
),
)
.title(format!(
"Live[{}] {}",
room_id, recorder_info.room_info.room_title
))
.theme(Some(Theme::Light))
.inner_size(1200.0, 800.0)
.effects(WindowEffectsConfig {
effects: vec![
tauri_utils::WindowEffect::Tabbed,
tauri_utils::WindowEffect::Mica,
],
state: None,
radius: None,
color: None,
});
#[cfg(feature = "gui")]
{
let platform = PlatformType::from_str(&platform).unwrap();
let recorder_info = state
.recorder_manager
.get_recorder_info(platform, room_id)
.await
.unwrap();
let builder = tauri::WebviewWindowBuilder::new(
&state.app_handle,
format!("Live:{}:{}", room_id, live_id),
tauri::WebviewUrl::App(
format!(
"live_index.html?platform={}&room_id={}&live_id={}",
platform.as_str(),
room_id,
live_id
)
.into(),
),
)
.title(format!(
"Live[{}] {}",
room_id, recorder_info.room_info.room_title
))
.theme(Some(Theme::Light))
.inner_size(1200.0, 800.0)
.effects(WindowEffectsConfig {
effects: vec![
tauri_utils::WindowEffect::Tabbed,
tauri_utils::WindowEffect::Mica,
],
state: None,
radius: None,
color: None,
});
if let Err(e) = builder.decorations(true).build() {
log::error!("live window build failed: {}", e);
if let Err(e) = builder.decorations(true).build() {
log::error!("live window build failed: {}", e);
}
}
Ok(())

View File

@@ -1,23 +1,47 @@
use crate::database::video::VideoRow;
use crate::ffmpeg;
use crate::progress_event::{cancel_progress, ProgressReporter, ProgressReporterTrait};
use crate::handlers::utils::get_disk_info_inner;
use crate::progress_reporter::{
cancel_progress, EventEmitter, ProgressReporter, ProgressReporterTrait,
};
use crate::recorder::bilibili::profile::Profile;
use crate::recorder_manager::ClipRangeParams;
use crate::state::State;
use crate::subtitle_generator::whisper::{self};
use crate::subtitle_generator::SubtitleGenerator;
use chrono::Utc;
use std::path::Path;
use tauri::State as TauriState;
use tauri_plugin_notification::NotificationExt;
use std::path::{Path, PathBuf};
#[tauri::command]
use crate::state::State;
use crate::state_type;
#[cfg(feature = "gui")]
use {tauri::State as TauriState, tauri_plugin_notification::NotificationExt};
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn clip_range(
state: TauriState<'_, State>,
state: state_type!(),
event_id: String,
params: ClipRangeParams,
) -> Result<VideoRow, String> {
let reporter = ProgressReporter::new(&state.app_handle, &event_id).await?;
// check storage space, preserve 1GB for other usage
let output = state.config.read().await.output.clone();
let mut output = PathBuf::from(&output);
if output.is_relative() {
// get current working directory
let cwd = std::env::current_dir().unwrap();
output = cwd.join(output);
}
if let Ok(disk_info) = get_disk_info_inner(output).await {
// if free space is less than 1GB, return error
if disk_info.free < 1024 * 1024 * 1024 {
return Err("Storage space is not enough, clip canceled".to_string());
}
}
#[cfg(feature = "gui")]
let emitter = EventEmitter::new(state.app_handle.clone());
#[cfg(feature = "headless")]
let emitter = EventEmitter::new(state.progress_manager.get_event_sender());
let reporter = ProgressReporter::new(&emitter, &event_id).await?;
match clip_range_inner(state, &reporter, params).await {
Ok(video) => {
reporter.finish(true, "切片完成").await;
@@ -31,12 +55,13 @@ pub async fn clip_range(
}
async fn clip_range_inner(
state: TauriState<'_, State>,
state: state_type!(),
reporter: &ProgressReporter,
params: ClipRangeParams,
) -> Result<VideoRow, String> {
log::info!(
"Clip room_id: {}, ts: {}, start: {}, end: {}",
"[{}]Clip room_id: {}, ts: {}, start: {}, end: {}",
reporter.event_id,
params.room_id,
params.live_id,
params.x,
@@ -112,6 +137,7 @@ async fn clip_range_inner(
)
.await?;
if state.config.read().await.clip_notify {
#[cfg(feature = "gui")]
state
.app_handle
.notification()
@@ -130,9 +156,9 @@ async fn clip_range_inner(
Ok(video)
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn upload_procedure(
state: TauriState<'_, State>,
state: state_type!(),
event_id: String,
uid: u64,
room_id: u64,
@@ -140,7 +166,11 @@ pub async fn upload_procedure(
cover: String,
profile: Profile,
) -> Result<String, String> {
let reporter = ProgressReporter::new(&state.app_handle, &event_id).await?;
#[cfg(feature = "gui")]
let emitter = EventEmitter::new(state.app_handle.clone());
#[cfg(feature = "headless")]
let emitter = EventEmitter::new(state.progress_manager.get_event_sender());
let reporter = ProgressReporter::new(&emitter, &event_id).await?;
match upload_procedure_inner(state, &reporter, uid, room_id, video_id, cover, profile).await {
Ok(bvid) => {
reporter.finish(true, "投稿成功").await;
@@ -154,7 +184,7 @@ pub async fn upload_procedure(
}
async fn upload_procedure_inner(
state: TauriState<'_, State>,
state: state_type!(),
reporter: &ProgressReporter,
uid: u64,
room_id: u64,
@@ -193,6 +223,7 @@ async fn upload_procedure_inner(
)
.await?;
if state.config.read().await.post_notify {
#[cfg(feature = "gui")]
state
.app_handle
.notification()
@@ -218,27 +249,24 @@ async fn upload_procedure_inner(
}
}
#[tauri::command]
pub async fn cancel(_state: TauriState<'_, State>, event_id: String) -> Result<(), String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn cancel(_state: state_type!(), event_id: String) -> Result<(), String> {
cancel_progress(&event_id).await;
Ok(())
}
#[tauri::command]
pub async fn get_video(state: TauriState<'_, State>, id: i64) -> Result<VideoRow, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_video(state: state_type!(), id: i64) -> Result<VideoRow, String> {
Ok(state.db.get_video(id).await?)
}
#[tauri::command]
pub async fn get_videos(
state: TauriState<'_, State>,
room_id: u64,
) -> Result<Vec<VideoRow>, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_videos(state: state_type!(), room_id: u64) -> Result<Vec<VideoRow>, String> {
Ok(state.db.get_videos(room_id).await?)
}
#[tauri::command]
pub async fn delete_video(state: TauriState<'_, State>, id: i64) -> Result<(), String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn delete_video(state: state_type!(), id: i64) -> Result<(), String> {
// get video info from db
let video = state.db.get_video(id).await?;
// delete video from db
@@ -262,25 +290,25 @@ pub async fn delete_video(state: TauriState<'_, State>, id: i64) -> Result<(), S
Ok(())
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_video_typelist(
state: TauriState<'_, State>,
state: state_type!(),
) -> Result<Vec<crate::recorder::bilibili::response::Typelist>, String> {
let account = state.db.get_account_by_platform("bilibili").await?;
Ok(state.client.get_video_typelist(&account).await?)
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_video_cover(
state: TauriState<'_, State>,
state: state_type!(),
id: i64,
cover: String,
) -> Result<(), String> {
Ok(state.db.update_video_cover(id, cover).await?)
}
#[tauri::command]
pub async fn get_video_subtitle(state: TauriState<'_, State>, id: i64) -> Result<String, String> {
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn get_video_subtitle(state: state_type!(), id: i64) -> Result<String, String> {
let video = state.db.get_video(id).await?;
let filepath = Path::new(state.config.read().await.output.as_str()).join(&video.file);
let file = Path::new(&filepath);
@@ -292,13 +320,17 @@ pub async fn get_video_subtitle(state: TauriState<'_, State>, id: i64) -> Result
}
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn generate_video_subtitle(
state: TauriState<'_, State>,
state: state_type!(),
event_id: String,
id: i64,
) -> Result<String, String> {
let reporter = ProgressReporter::new(&state.app_handle, &event_id).await?;
#[cfg(feature = "gui")]
let emitter = EventEmitter::new(state.app_handle.clone());
#[cfg(feature = "headless")]
let emitter = EventEmitter::new(state.progress_manager.get_event_sender());
let reporter = ProgressReporter::new(&emitter, &event_id).await?;
match generate_video_subtitle_inner(state, &reporter, id).await {
Ok(subtitle) => {
reporter.finish(true, "字幕生成完成").await;
@@ -314,7 +346,7 @@ pub async fn generate_video_subtitle(
}
async fn generate_video_subtitle_inner(
state: TauriState<'_, State>,
state: state_type!(),
reporter: &ProgressReporter,
id: i64,
) -> Result<String, String> {
@@ -339,9 +371,9 @@ async fn generate_video_subtitle_inner(
}
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn update_video_subtitle(
state: TauriState<'_, State>,
state: state_type!(),
id: i64,
subtitle: String,
) -> Result<(), String> {
@@ -355,14 +387,18 @@ pub async fn update_video_subtitle(
Ok(())
}
#[tauri::command]
#[cfg_attr(feature = "gui", tauri::command)]
pub async fn encode_video_subtitle(
state: TauriState<'_, State>,
state: state_type!(),
event_id: String,
id: i64,
srt_style: String,
) -> Result<VideoRow, String> {
let reporter = ProgressReporter::new(&state.app_handle, &event_id).await?;
#[cfg(feature = "gui")]
let emitter = EventEmitter::new(state.app_handle.clone());
#[cfg(feature = "headless")]
let emitter = EventEmitter::new(state.progress_manager.get_event_sender());
let reporter = ProgressReporter::new(&emitter, &event_id).await?;
match encode_video_subtitle_inner(state, &reporter, id, srt_style).await {
Ok(video) => {
reporter.finish(true, "字幕编码完成").await;
@@ -378,7 +414,7 @@ pub async fn encode_video_subtitle(
}
async fn encode_video_subtitle_inner(
state: TauriState<'_, State>,
state: state_type!(),
reporter: &ProgressReporter,
id: i64,
srt_style: String,

src-tauri/src/http_server.rs (new file, 1170 lines): diff suppressed because it is too large.

View File

@@ -7,37 +7,90 @@ mod danmu2ass;
mod database;
mod ffmpeg;
mod handlers;
mod progress_event;
#[cfg(feature = "headless")]
mod http_server;
#[cfg(feature = "headless")]
mod migration;
mod progress_manager;
mod progress_reporter;
mod recorder;
mod recorder_manager;
mod state;
mod subtitle_generator;
#[cfg(feature = "gui")]
mod tray;
use archive_migration::try_rebuild_archives;
use async_std::fs;
use chrono::Utc;
use config::Config;
use database::Database;
use recorder::{bilibili::client::BiliClient, PlatformType};
use recorder::bilibili::client::BiliClient;
use recorder_manager::RecorderManager;
use simplelog::ConfigBuilder;
use state::State;
use std::fs::File;
use std::path::Path;
use std::sync::Arc;
use tauri::{Manager, WindowEvent};
use tauri_plugin_sql::{Migration, MigrationKind};
use tokio::sync::RwLock;
#[cfg(not(target_os = "windows"))]
use std::os::unix::fs::MetadataExt;
#[cfg(target_os = "windows")]
use std::os::windows::fs::MetadataExt;
#[cfg(feature = "gui")]
use {
recorder::PlatformType,
tauri::{Manager, WindowEvent},
tauri_plugin_sql::{Migration, MigrationKind},
};
#[cfg(feature = "headless")]
use {
clap::{arg, command, Parser},
futures_core::future::BoxFuture,
migration::{Migration, MigrationKind},
sqlx::error::BoxDynError,
sqlx::migrate::Migration as SqlxMigration,
sqlx::migrate::MigrationSource,
sqlx::{
migrate::{MigrateDatabase, Migrator},
Pool, Sqlite,
},
};
/// Open the log file; if its size exceeds 1MB, back up the existing file and create a new one.
async fn open_log_file(log_dir: &Path) -> Result<File, Box<dyn std::error::Error>> {
let log_filename = log_dir.join("bsr.log");
if let Ok(meta) = fs::metadata(&log_filename).await {
#[cfg(target_os = "windows")]
let file_size = meta.file_size();
#[cfg(not(target_os = "windows"))]
let file_size = meta.size();
if file_size > 1024 * 1024 {
// move original file to backup
let date_str = Utc::now().format("%Y-%m-%d_%H-%M-%S").to_string();
let backup_filename = log_dir.join(format!("bsr-{date_str}.log"));
fs::rename(&log_filename, backup_filename).await?;
}
}
Ok(File::options()
.create(true)
.append(true)
.open(&log_filename)?)
}
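// Example (not part of the diff): with the format string above, a rotated file
// ends up named like `bsr-2025-06-20_00-50-37.log` (UTC timestamp), while new
// writes continue in `bsr.log`.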
async fn setup_logging(log_dir: &Path) -> Result<(), Box<dyn std::error::Error>> {
// mkdir if not exists
if !log_dir.exists() {
std::fs::create_dir_all(log_dir)?;
}
let log_file = log_dir.join("bsr.log");
// open file with append mode
let file = File::options().create(true).append(true).open(&log_file)?;
let file = open_log_file(log_dir).await?;
let config = ConfigBuilder::new()
.set_target_level(simplelog::LevelFilter::Debug)
@@ -47,6 +100,7 @@ async fn setup_logging(log_dir: &Path) -> Result<(), Box<dyn std::error::Error>>
.add_filter_ignore_str("sqlx")
.add_filter_ignore_str("reqwest")
.add_filter_ignore_str("h2")
.add_filter_ignore_str("danmu_stream")
.build();
simplelog::CombinedLogger::init(vec![
@@ -90,25 +144,122 @@ fn get_migrations() -> Vec<Migration> {
]
}
async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::Error>> {
println!("Setting up app state...");
#[cfg(feature = "headless")]
#[derive(Debug)]
struct MigrationList(Vec<Migration>);
#[cfg(feature = "headless")]
impl MigrationSource<'static> for MigrationList {
fn resolve(self) -> BoxFuture<'static, std::result::Result<Vec<SqlxMigration>, BoxDynError>> {
Box::pin(async move {
let mut migrations = Vec::new();
for migration in self.0 {
if matches!(migration.kind, MigrationKind::Up) {
migrations.push(SqlxMigration::new(
migration.version,
migration.description.into(),
migration.kind.into(),
migration.sql.into(),
false,
));
}
}
Ok(migrations)
})
}
}
#[cfg(feature = "headless")]
async fn setup_server_state(args: Args) -> Result<State, Box<dyn std::error::Error>> {
use std::path::PathBuf;
use progress_manager::ProgressManager;
use progress_reporter::EventEmitter;
setup_logging(Path::new("./")).await?;
log::info!("Setting up server state...");
let config_path = PathBuf::from(&args.config);
let cache_path = PathBuf::from("./cache");
let output_path = PathBuf::from("./output");
let config = match Config::load(&config_path, &cache_path, &output_path) {
Ok(config) => config,
Err(e) => {
log::error!("Failed to load config: {e}");
return Err(e.into());
}
};
let client = Arc::new(BiliClient::new()?);
let config = Arc::new(RwLock::new(Config::load()));
let config = Arc::new(RwLock::new(config));
let db = Arc::new(Database::new());
// connect to sqlite database
let conn_url = format!("sqlite:{}/data_v2.db", args.db);
// create db folder if not exists
if !Path::new(&args.db).exists() {
std::fs::create_dir_all(&args.db)?;
}
if !Sqlite::database_exists(&conn_url).await.unwrap_or(false) {
Sqlite::create_database(&conn_url).await?;
}
let db_pool: Pool<Sqlite> = Pool::connect(&conn_url).await?;
let migrations = get_migrations();
let migrator = Migrator::new(MigrationList(migrations))
.await
.expect("Failed to create migrator");
migrator
.run(&db_pool)
.await
.expect("Failed to run migrations");
db.set(db_pool).await;
let progress_manager = Arc::new(ProgressManager::new());
let emitter = EventEmitter::new(progress_manager.get_event_sender());
let recorder_manager = Arc::new(RecorderManager::new(emitter, db.clone(), config.clone()));
let _ = try_rebuild_archives(&db, config.read().await.cache.clone().into()).await;
Ok(State {
db,
client,
config,
recorder_manager,
progress_manager,
readonly: args.readonly,
})
}
#[cfg(feature = "gui")]
async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::Error>> {
use platform_dirs::AppDirs;
use progress_reporter::EventEmitter;
let log_dir = app.path().app_log_dir()?;
setup_logging(&log_dir).await?;
log::info!("Setting up app state...");
let app_dirs = AppDirs::new(Some("cn.vjoi.bili-shadowreplay"), false).unwrap();
let config_path = app_dirs.config_dir.join("Conf.toml");
let cache_path = app_dirs.cache_dir.join("cache");
let output_path = app_dirs.data_dir.join("output");
log::info!("Loading config from {:?}", config_path);
let config = match Config::load(&config_path, &cache_path, &output_path) {
Ok(config) => config,
Err(e) => {
log::error!("Failed to load config, exiting: {e}");
return Err(e.into());
}
};
let client = Arc::new(BiliClient::new()?);
let config = Arc::new(RwLock::new(config));
let config_clone = config.clone();
let dbs = app.state::<tauri_plugin_sql::DbInstances>().inner();
let db = Arc::new(Database::new());
let db_clone = db.clone();
let client_clone = client.clone();
let log_dir = app.path().app_log_dir()?;
setup_logging(&log_dir).await?;
let recorder_manager = Arc::new(RecorderManager::new(
app.handle().clone(),
db.clone(),
config.clone(),
));
let recorder_manager_clone = recorder_manager.clone();
let emitter = EventEmitter::new(app.handle().clone());
let binding = dbs.0.read().await;
let dbpool = binding.get("sqlite:data_v2.db").unwrap();
let sqlite_pool = match dbpool {
@@ -116,6 +267,13 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
};
db_clone.set(sqlite_pool.unwrap().clone()).await;
let recorder_manager = Arc::new(RecorderManager::new(
app.app_handle().clone(),
emitter,
db.clone(),
config.clone(),
));
let accounts = db_clone.get_accounts().await?;
if accounts.is_empty() {
log::warn!("No account found");
@@ -128,45 +286,30 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
});
}
let bili_account = db_clone.get_account_by_platform("bilibili").await;
if let Ok(bili_account) = bili_account {
let mut webid = client_clone.fetch_webid(&bili_account).await;
if webid.is_err() {
log::error!("Failed to fetch webid: {}", webid.err().unwrap());
webid = Ok("".to_string());
// update account infos
for account in accounts {
// only update bilibili account
let platform = PlatformType::from_str(&account.platform).unwrap();
if platform != PlatformType::BiliBili {
continue;
}
let webid = webid.unwrap();
// update account infos
for account in accounts {
// only update bilibili account
let platform = PlatformType::from_str(&account.platform).unwrap();
if platform != PlatformType::BiliBili {
continue;
match client_clone.get_user_info(&account, account.uid).await {
Ok(account_info) => {
if let Err(e) = db_clone
.update_account(
&account.platform,
account_info.user_id,
&account_info.user_name,
&account_info.user_avatar_url,
)
.await
{
log::error!("Error when updating account info {}", e);
}
}
match client_clone
.get_user_info(&webid, &account, account.uid)
.await
{
Ok(account_info) => {
if let Err(e) = db_clone
.update_account(
&account.platform,
account_info.user_id,
&account_info.user_name,
&account_info.user_avatar_url,
)
.await
{
log::error!("Error when updating account info {}", e);
}
}
Err(e) => {
log::error!("Get user info failed {}", e);
}
Err(e) => {
log::error!("Get user info failed {}", e);
}
}
}
@@ -186,6 +329,7 @@ async fn setup_app_state(app: &tauri::App) -> Result<State, Box<dyn std::error::
})
}
#[cfg(feature = "gui")]
fn setup_plugins(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<tauri::Wry> {
let migrations = get_migrations();
let builder = builder
@@ -214,6 +358,7 @@ fn setup_plugins(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<tauri::W
builder
}
#[cfg(feature = "gui")]
fn setup_event_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<tauri::Wry> {
builder.on_window_event(|window, event| {
if let WindowEvent::CloseRequested { api, .. } = event {
@@ -225,6 +370,7 @@ fn setup_event_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<t
})
}
#[cfg(feature = "gui")]
fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<tauri::Wry> {
builder.invoke_handler(tauri::generate_handler![
crate::handlers::account::get_accounts,
@@ -242,6 +388,7 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
crate::handlers::config::update_clip_name_format,
crate::handlers::config::update_whisper_prompt,
crate::handlers::config::update_auto_generate,
crate::handlers::config::update_status_check_interval,
crate::handlers::message::get_messages,
crate::handlers::message::read_message,
crate::handlers::message::delete_message,
@@ -258,9 +405,7 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
crate::handlers::recorder::get_total_length,
crate::handlers::recorder::get_today_record_count,
crate::handlers::recorder::get_recent_record,
crate::handlers::recorder::set_auto_start,
crate::handlers::recorder::force_start,
crate::handlers::recorder::force_stop,
crate::handlers::recorder::set_enable,
crate::handlers::recorder::fetch_hls,
crate::handlers::video::clip_range,
crate::handlers::video::upload_procedure,
@@ -279,9 +424,11 @@ fn setup_invoke_handlers(builder: tauri::Builder<tauri::Wry>) -> tauri::Builder<
crate::handlers::utils::get_disk_info,
crate::handlers::utils::open_live,
crate::handlers::utils::open_log_folder,
crate::handlers::utils::console_log,
])
}
#[cfg(feature = "gui")]
fn main() -> Result<(), Box<dyn std::error::Error>> {
let _ = fix_path_env::fix();
@@ -296,18 +443,12 @@ fn main() -> Result<(), Box<dyn std::error::Error>> {
let state = setup_app_state(app).await?;
let _ = tray::create_tray(app.handle());
// only auto download ffmpeg if it's linux
if cfg!(target_os = "linux") {
if let Err(e) = async_ffmpeg_sidecar::download::auto_download().await {
log::error!("Error when auto downloading ffmpeg: {}", e);
}
// check ffmpeg status
match ffmpeg::check_ffmpeg().await {
Err(e) => log::error!("Failed to check ffmpeg version: {e}"),
Ok(v) => log::info!("Checked ffmpeg version: {v}"),
}
log::info!(
"FFMPEG version: {:?}",
async_ffmpeg_sidecar::version::ffmpeg_version().await
);
app.manage(state);
Ok(())
})
@@ -316,3 +457,39 @@ fn main() -> Result<(), Box<dyn std::error::Error>> {
Ok(())
}
#[cfg(feature = "headless")]
#[derive(Parser, Debug)]
#[command(version, about, long_about = None)]
struct Args {
/// Path to the config file
#[arg(short, long, default_value_t = String::from("config.toml"))]
config: String,
/// Path to the database folder
#[arg(short, long, default_value_t = String::from("./data"))]
db: String,
/// ReadOnly mode
#[arg(short, long, default_value_t = false)]
readonly: bool,
}
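// Illustrative sketch (not part of the diff): how these clap-derived flags map
// to a command line. Binary name and values are made up; whether the headless
// build also needs --no-default-features depends on the crate's feature setup.
//
//   bsr-headless --config ./config.toml --db ./data --readonly
#[cfg(feature = "headless")]
#[allow(dead_code)]
fn args_example() {
    let args = Args::parse_from(["bsr", "--config", "./config.toml", "--db", "./data", "--readonly"]);
    assert!(args.readonly && args.db == "./data");
}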
#[cfg(feature = "headless")]
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// get params from command line
let args = Args::parse();
let state = setup_server_state(args)
.await
.expect("Failed to setup server state");
// check ffmpeg status
match ffmpeg::check_ffmpeg().await {
Err(e) => log::error!("Failed to check ffmpeg version: {e}"),
Ok(v) => log::info!("Checked ffmpeg version: {v}"),
}
http_server::start_api_server(state).await;
Ok(())
}

View File

@@ -0,0 +1,24 @@
use sqlx::migrate::MigrationType;
#[derive(Debug)]
pub enum MigrationKind {
Up,
Down,
}
#[derive(Debug)]
pub struct Migration {
pub version: i64,
pub description: &'static str,
pub sql: &'static str,
pub kind: MigrationKind,
}
impl From<MigrationKind> for MigrationType {
fn from(kind: MigrationKind) -> Self {
match kind {
MigrationKind::Up => Self::ReversibleUp,
MigrationKind::Down => Self::ReversibleDown,
}
}
}
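// Illustrative sketch (not part of the diff): a migration entry in the shape
// this struct expects; version and SQL here are made up.
#[allow(dead_code)]
const EXAMPLE_MIGRATION: Migration = Migration {
    version: 9999,
    description: "create_example_table",
    sql: "CREATE TABLE IF NOT EXISTS example (id INTEGER PRIMARY KEY);",
    kind: MigrationKind::Up,
};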

View File

@@ -1,108 +0,0 @@
use async_trait::async_trait;
use serde::Serialize;
use std::sync::atomic::AtomicBool;
use std::sync::Arc;
use std::sync::LazyLock;
use tauri::{AppHandle, Emitter};
use tokio::sync::RwLock;
type CancelFlagMap = std::collections::HashMap<String, Arc<AtomicBool>>;
static CANCEL_FLAG_MAP: LazyLock<Arc<RwLock<CancelFlagMap>>> =
LazyLock::new(|| Arc::new(RwLock::new(CancelFlagMap::new())));
#[derive(Clone, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct ProgressUpdate<'a> {
pub id: &'a str,
pub content: &'a str,
}
#[derive(Clone, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct ProgressFinished<'a> {
pub id: &'a str,
pub success: bool,
pub message: &'a str,
}
#[derive(Clone)]
pub struct ProgressReporter {
app_handle: AppHandle,
event_id: String,
pub cancel: Arc<AtomicBool>,
}
#[async_trait]
pub trait ProgressReporterTrait: Send + Sync + Clone {
fn update(&self, content: &str);
async fn finish(&self, success: bool, message: &str);
}
impl ProgressReporter {
pub async fn new(app_handle: &AppHandle, event_id: &str) -> Result<Self, String> {
// if already exists, return
if CANCEL_FLAG_MAP.read().await.get(event_id).is_some() {
log::error!("Task already exists: {}", event_id);
let _ = app_handle.emit(
"progress-finished",
ProgressFinished {
id: event_id,
success: false,
message: "任务已经存在",
},
);
return Err("任务已经存在".to_string());
}
let cancel = Arc::new(AtomicBool::new(false));
CANCEL_FLAG_MAP
.write()
.await
.insert(event_id.to_string(), cancel.clone());
Ok(Self {
app_handle: app_handle.clone(),
event_id: event_id.to_string(),
cancel,
})
}
}
#[async_trait]
impl ProgressReporterTrait for ProgressReporter {
fn update(&self, content: &str) {
if let Err(e) = self.app_handle.emit(
"progress-update",
ProgressUpdate {
id: &self.event_id,
content,
},
) {
log::error!("Failed to emit progress update: {}", e);
}
}
async fn finish(&self, success: bool, message: &str) {
if let Err(e) = self.app_handle.emit(
"progress-finished",
ProgressFinished {
id: &self.event_id,
success,
message,
},
) {
log::error!("Failed to emit progress finished: {}", e);
}
CANCEL_FLAG_MAP.write().await.remove(&self.event_id);
}
}
pub async fn cancel_progress(event_id: &str) {
CANCEL_FLAG_MAP
.write()
.await
.get_mut(event_id)
.unwrap()
.store(true, std::sync::atomic::Ordering::Relaxed);
}

View File

@@ -0,0 +1,47 @@
use serde::{Deserialize, Serialize};
#[cfg(feature = "headless")]
use tokio::sync::broadcast;
#[derive(Clone, Serialize, Deserialize)]
pub enum Event {
ProgressUpdate {
id: String,
content: String,
},
ProgressFinished {
id: String,
success: bool,
message: String,
},
DanmuReceived {
room: u64,
ts: i64,
content: String,
},
}
#[cfg(feature = "headless")]
pub struct ProgressManager {
pub progress_sender: broadcast::Sender<Event>,
pub progress_receiver: broadcast::Receiver<Event>,
}
#[cfg(feature = "headless")]
impl ProgressManager {
pub fn new() -> Self {
let (progress_sender, progress_receiver) = broadcast::channel(16);
Self {
progress_sender,
progress_receiver,
}
}
pub fn get_event_sender(&self) -> broadcast::Sender<Event> {
self.progress_sender.clone()
}
pub fn subscribe(&self) -> broadcast::Receiver<Event> {
self.progress_receiver.resubscribe()
}
}
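// Usage sketch (not part of the diff), headless build only: draining broadcast
// events into the log; an HTTP layer (e.g. SSE) would forward them instead.
// The function name is hypothetical.
#[cfg(feature = "headless")]
#[allow(dead_code)]
async fn forward_events(manager: &ProgressManager) {
    let mut rx = manager.subscribe();
    while let Ok(event) = rx.recv().await {
        match event {
            Event::ProgressUpdate { id, content } => log::debug!("[{}] {}", id, content),
            Event::ProgressFinished { id, success, .. } => log::debug!("[{}] finished, success={}", id, success),
            Event::DanmuReceived { room, .. } => log::trace!("danmu from room {}", room),
        }
    }
}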

View File

@@ -0,0 +1,167 @@
use async_trait::async_trait;
use std::sync::atomic::AtomicBool;
use std::sync::Arc;
use std::sync::LazyLock;
use tokio::sync::RwLock;
use crate::progress_manager::Event;
#[cfg(feature = "gui")]
use {
crate::recorder::danmu::DanmuEntry,
serde::Serialize,
tauri::{AppHandle, Emitter},
};
#[cfg(feature = "headless")]
use tokio::sync::broadcast;
type CancelFlagMap = std::collections::HashMap<String, Arc<AtomicBool>>;
static CANCEL_FLAG_MAP: LazyLock<Arc<RwLock<CancelFlagMap>>> =
LazyLock::new(|| Arc::new(RwLock::new(CancelFlagMap::new())));
#[derive(Clone)]
pub struct ProgressReporter {
emitter: EventEmitter,
pub event_id: String,
pub cancel: Arc<AtomicBool>,
}
#[async_trait]
pub trait ProgressReporterTrait: Send + Sync + Clone {
fn update(&self, content: &str);
async fn finish(&self, success: bool, message: &str);
}
#[derive(Clone)]
pub struct EventEmitter {
#[cfg(feature = "gui")]
app_handle: AppHandle,
#[cfg(feature = "headless")]
sender: broadcast::Sender<Event>,
}
#[cfg(feature = "gui")]
#[derive(Clone, Serialize)]
struct UpdateEvent<'a> {
id: &'a str,
content: &'a str,
}
#[cfg(feature = "gui")]
#[derive(Clone, Serialize)]
struct FinishEvent<'a> {
id: &'a str,
success: bool,
message: &'a str,
}
impl EventEmitter {
pub fn new(
#[cfg(feature = "gui")] app_handle: AppHandle,
#[cfg(feature = "headless")] sender: broadcast::Sender<Event>,
) -> Self {
Self {
#[cfg(feature = "gui")]
app_handle,
#[cfg(feature = "headless")]
sender,
}
}
pub fn emit(&self, event: &Event) {
#[cfg(feature = "gui")]
{
match event {
Event::ProgressUpdate { id, content } => {
self.app_handle
.emit("progress-update", UpdateEvent { id, content })
.unwrap();
}
Event::ProgressFinished {
id,
success,
message,
} => {
self.app_handle
.emit(
"progress-finished",
FinishEvent {
id,
success: *success,
message,
},
)
.unwrap();
}
Event::DanmuReceived { room, ts, content } => {
self.app_handle
.emit(
&format!("danmu:{}", room),
DanmuEntry {
ts: *ts,
content: content.clone(),
},
)
.unwrap();
}
}
}
#[cfg(feature = "headless")]
let _ = self.sender.send(event.clone());
}
}
impl ProgressReporter {
pub async fn new(emitter: &EventEmitter, event_id: &str) -> Result<Self, String> {
// if already exists, return
if CANCEL_FLAG_MAP.read().await.get(event_id).is_some() {
log::error!("Task already exists: {}", event_id);
emitter.emit(&Event::ProgressFinished {
id: event_id.to_string(),
success: false,
message: "任务已经存在".to_string(),
});
return Err("任务已经存在".to_string());
}
let cancel = Arc::new(AtomicBool::new(false));
CANCEL_FLAG_MAP
.write()
.await
.insert(event_id.to_string(), cancel.clone());
Ok(Self {
emitter: emitter.clone(),
event_id: event_id.to_string(),
cancel,
})
}
}
#[async_trait]
impl ProgressReporterTrait for ProgressReporter {
fn update(&self, content: &str) {
self.emitter.emit(&Event::ProgressUpdate {
id: self.event_id.clone(),
content: content.to_string(),
});
}
async fn finish(&self, success: bool, message: &str) {
self.emitter.emit(&Event::ProgressFinished {
id: self.event_id.clone(),
success,
message: message.to_string(),
});
CANCEL_FLAG_MAP.write().await.remove(&self.event_id);
}
}
pub async fn cancel_progress(event_id: &str) {
let mut cancel_flag_map = CANCEL_FLAG_MAP.write().await;
if let Some(cancel_flag) = cancel_flag_map.get_mut(event_id) {
cancel_flag.store(true, std::sync::atomic::Ordering::Relaxed);
}
}
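// Usage sketch (not part of the diff): the reporter lifecycle the handlers
// above follow; the task body here is hypothetical.
#[allow(dead_code)]
async fn reporter_lifecycle(emitter: &EventEmitter) -> Result<(), String> {
    let reporter = ProgressReporter::new(emitter, "demo-task").await?;
    reporter.update("working...");
    if reporter.cancel.load(std::sync::atomic::Ordering::Relaxed) {
        reporter.finish(false, "cancelled").await;
        return Ok(());
    }
    reporter.finish(true, "done").await;
    Ok(())
}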

View File

@@ -77,10 +77,10 @@ pub trait Recorder: Send + Sync + 'static {
async fn stop(&self);
async fn first_segment_ts(&self, live_id: &str) -> i64;
async fn m3u8_content(&self, live_id: &str, start: i64, end: i64) -> String;
async fn master_m3u8(&self, live_id: &str, start: i64, end: i64) -> String;
async fn info(&self) -> RecorderInfo;
async fn comments(&self, live_id: &str) -> Result<Vec<DanmuEntry>, errors::RecorderError>;
async fn is_recording(&self, live_id: &str) -> bool;
async fn force_start(&self);
async fn force_stop(&self);
async fn set_auto_start(&self, auto_start: bool);
async fn enable(&self);
async fn disable(&self);
}
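// Illustrative sketch (not part of the diff): enable/disable replace the old
// force_start/force_stop/set_auto_start trio with a single flag; a recorder
// can back them with an RwLock<bool>, as BiliRecorder does below. This struct
// is hypothetical and not a full Recorder impl.
struct EnableFlagSketch {
    enabled: std::sync::Arc<tokio::sync::RwLock<bool>>,
}
impl EnableFlagSketch {
    async fn enable(&self) {
        *self.enabled.write().await = true;
    }
    async fn disable(&self) {
        *self.enabled.write().await = false;
    }
    async fn should_record(&self) -> bool {
        *self.enabled.read().await
    }
}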

View File

@@ -2,35 +2,39 @@ pub mod client;
pub mod errors;
pub mod profile;
pub mod response;
use super::entry::EntryStore;
use super::entry::{EntryStore, Range};
use super::errors::RecorderError;
use super::PlatformType;
use crate::database::account::AccountRow;
use crate::progress_manager::Event;
use crate::progress_reporter::EventEmitter;
use crate::recorder_manager::RecorderEvent;
use super::danmu::{DanmuEntry, DanmuStorage};
use super::entry::TsEntry;
use chrono::{TimeZone, Utc};
use chrono::Utc;
use client::{BiliClient, BiliStream, RoomInfo, StreamType, UserInfo};
use dashmap::DashMap;
use danmu_stream::danmu_stream::DanmuStream;
use danmu_stream::provider::ProviderType;
use danmu_stream::DanmuMessageType;
use errors::BiliClientError;
use felgens::{ws_socket_object, FelgensError, WsStreamMessageType};
use m3u8_rs::Playlist;
use rand::Rng;
use m3u8_rs::{Playlist, QuotedOrUnquoted, VariantStream};
use regex::Regex;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::thread;
use std::time::Duration;
use tauri::{AppHandle, Emitter, Url};
use tauri_plugin_notification::NotificationExt;
use tokio::sync::mpsc::{self, UnboundedReceiver};
use tokio::sync::{broadcast, Mutex, RwLock};
use tokio::task::JoinHandle;
use url::Url;
use crate::config::Config;
use crate::database::{Database, DatabaseError};
use async_trait::async_trait;
#[cfg(feature = "gui")]
use {tauri::AppHandle, tauri_plugin_notification::NotificationExt};
/// A recorder for BiliBili live streams
///
/// This recorder fetches, caches and serves TS entries, currently supporting only StreamType::FMP4.
@@ -38,28 +42,31 @@ use async_trait::async_trait;
// TODO implement StreamType::TS
#[derive(Clone)]
pub struct BiliRecorder {
#[cfg(feature = "gui")]
app_handle: AppHandle,
emitter: EventEmitter,
client: Arc<RwLock<BiliClient>>,
db: Arc<Database>,
account: AccountRow,
config: Arc<RwLock<Config>>,
pub room_id: u64,
pub room_info: Arc<RwLock<RoomInfo>>,
pub user_info: Arc<RwLock<UserInfo>>,
pub live_status: Arc<RwLock<bool>>,
pub live_id: Arc<RwLock<String>>,
pub cover: Arc<RwLock<Option<String>>>,
pub entry_store: Arc<RwLock<Option<EntryStore>>>,
pub is_recording: Arc<RwLock<bool>>,
pub auto_start: Arc<RwLock<bool>>,
pub current_record: Arc<RwLock<bool>>,
room_id: u64,
room_info: Arc<RwLock<RoomInfo>>,
user_info: Arc<RwLock<UserInfo>>,
live_status: Arc<RwLock<bool>>,
live_id: Arc<RwLock<String>>,
cover: Arc<RwLock<Option<String>>>,
entry_store: Arc<RwLock<Option<EntryStore>>>,
is_recording: Arc<RwLock<bool>>,
force_update: Arc<AtomicBool>,
last_update: Arc<RwLock<i64>>,
quit: Arc<Mutex<bool>>,
pub live_stream: Arc<RwLock<Option<BiliStream>>>,
live_stream: Arc<RwLock<Option<BiliStream>>>,
danmu_storage: Arc<RwLock<Option<DanmuStorage>>>,
m3u8_cache: DashMap<String, String>,
live_end_channel: broadcast::Sender<RecorderEvent>,
enabled: Arc<RwLock<bool>>,
danmu_task: Arc<Mutex<Option<JoinHandle<()>>>>,
record_task: Arc<Mutex<Option<JoinHandle<()>>>>,
}
impl From<DatabaseError> for super::errors::RecorderError {
@@ -74,21 +81,26 @@ impl From<BiliClientError> for super::errors::RecorderError {
}
}
pub struct BiliRecorderOptions {
#[cfg(feature = "gui")]
pub app_handle: AppHandle,
pub emitter: EventEmitter,
pub db: Arc<Database>,
pub room_id: u64,
pub account: AccountRow,
pub config: Arc<RwLock<Config>>,
pub auto_start: bool,
pub channel: broadcast::Sender<RecorderEvent>,
}
impl BiliRecorder {
pub async fn new(
app_handle: AppHandle,
webid: &str,
db: &Arc<Database>,
room_id: u64,
account: &AccountRow,
config: Arc<RwLock<Config>>,
auto_start: bool,
channel: broadcast::Sender<RecorderEvent>,
) -> Result<Self, super::errors::RecorderError> {
pub async fn new(options: BiliRecorderOptions) -> Result<Self, super::errors::RecorderError> {
let client = BiliClient::new()?;
let room_info = client.get_room_info(account, room_id).await?;
let room_info = client
.get_room_info(&options.account, options.room_id)
.await?;
let user_info = client
.get_user_info(webid, account, room_info.user_id)
.get_user_info(&options.account, room_info.user_id)
.await?;
let mut live_status = false;
let mut cover = None;
@@ -102,19 +114,19 @@ impl BiliRecorder {
}
let recorder = Self {
app_handle,
#[cfg(feature = "gui")]
app_handle: options.app_handle,
emitter: options.emitter,
client: Arc::new(RwLock::new(client)),
db: db.clone(),
account: account.clone(),
config,
room_id,
db: options.db.clone(),
account: options.account.clone(),
config: options.config.clone(),
room_id: options.room_id,
room_info: Arc::new(RwLock::new(room_info)),
user_info: Arc::new(RwLock::new(user_info)),
live_status: Arc::new(RwLock::new(live_status)),
entry_store: Arc::new(RwLock::new(None)),
is_recording: Arc::new(RwLock::new(false)),
auto_start: Arc::new(RwLock::new(auto_start)),
current_record: Arc::new(RwLock::new(false)),
live_id: Arc::new(RwLock::new(String::new())),
cover: Arc::new(RwLock::new(cover)),
last_update: Arc::new(RwLock::new(Utc::now().timestamp())),
@@ -122,10 +134,13 @@ impl BiliRecorder {
quit: Arc::new(Mutex::new(false)),
live_stream: Arc::new(RwLock::new(None)),
danmu_storage: Arc::new(RwLock::new(None)),
m3u8_cache: DashMap::new(),
live_end_channel: channel,
live_end_channel: options.channel,
enabled: Arc::new(RwLock::new(options.auto_start)),
danmu_task: Arc::new(Mutex::new(None)),
record_task: Arc::new(Mutex::new(None)),
};
log::info!("Recorder for room {} created.", room_id);
log::info!("Recorder for room {} created.", options.room_id);
Ok(recorder)
}
@@ -142,7 +157,7 @@ impl BiliRecorder {
return false;
}
*self.current_record.read().await
*self.enabled.read().await
}
async fn check_status(&self) -> bool {
@@ -160,15 +175,15 @@ impl BiliRecorder {
// handle live notification
if *self.live_status.read().await != live_status {
log::info!(
"[{}]Live status changed to {}, current_record: {}, auto_start: {}",
"[{}]Live status changed to {}, enabled: {}",
self.room_id,
live_status,
*self.current_record.read().await,
*self.auto_start.read().await
*self.enabled.read().await
);
if live_status {
if self.config.read().await.live_start_notify {
#[cfg(feature = "gui")]
self.app_handle
.notification()
.builder()
@@ -193,6 +208,7 @@ impl BiliRecorder {
*self.cover.write().await = Some(cover_base64);
}
} else if self.config.read().await.live_end_notify {
#[cfg(feature = "gui")]
self.app_handle
.notification()
.builder()
@@ -218,40 +234,85 @@ impl BiliRecorder {
if !live_status {
self.reset().await;
*self.current_record.write().await = false;
return false;
}
// no need to check stream if current_record is false and auto_start is false
if !*self.current_record.read().await && !*self.auto_start.read().await {
// no need to check stream if should not record
if !self.should_record().await {
return true;
}
// current_record => update stream
// auto_start+is_new_stream => update stream and current_record=true
let new_stream = match self
.client
.read()
.await
.get_play_url(&self.account, self.room_id)
.await
{
Ok(stream) => Some(stream),
Err(e) => {
log::error!("[{}]Fetch stream failed: {}", self.room_id, e);
let master_manifest = self.client.read().await.get_index_content(&self.account, &format!("https://api.live.bilibili.com/xlive/play-gateway/master/url?cid={}&pt=h5&p2p_type=-1&net=0&free_type=0&build=0&feature=2&qn=10000", self.room_id)).await;
if master_manifest.is_err() {
log::error!(
"[{}]Fetch master manifest failed: {}",
self.room_id,
master_manifest.err().unwrap()
);
return true;
}
let master_manifest =
m3u8_rs::parse_playlist_res(master_manifest.as_ref().unwrap().as_bytes())
.map_err(|_| super::errors::RecorderError::M3u8ParseFailed {
content: master_manifest.as_ref().unwrap().clone(),
});
if master_manifest.is_err() {
log::error!(
"[{}]Parse master manifest failed: {}",
self.room_id,
master_manifest.err().unwrap()
);
return true;
}
let master_manifest = master_manifest.unwrap();
let variant = match master_manifest {
Playlist::MasterPlaylist(playlist) => {
let variants = playlist.variants.clone();
variants.into_iter().find(|variant| {
if let Some(other_attributes) = &variant.other_attributes {
if let Some(QuotedOrUnquoted::Quoted(bili_display)) =
other_attributes.get("BILI-DISPLAY")
{
bili_display == "原画"
} else {
false
}
} else {
false
}
})
}
_ => {
log::error!("[{}]Master manifest is not a media playlist", self.room_id);
None
}
};
if new_stream.is_none() {
if variant.is_none() {
log::error!("[{}]No variant found", self.room_id);
return true;
}
let variant = variant.unwrap();
let new_stream = self.stream_from_variant(variant).await;
if new_stream.is_err() {
log::error!(
"[{}]Fetch stream failed: {}",
self.room_id,
new_stream.err().unwrap()
);
return true;
}
let stream = new_stream.unwrap();
// auto start must be true here, if what fetched is a new stream, set current_record=true to auto start recording
if self.live_stream.read().await.is_none()
let should_update_stream = self.live_stream.read().await.is_none()
|| !self
.live_stream
.read()
@@ -259,23 +320,31 @@ impl BiliRecorder {
.as_ref()
.unwrap()
.is_same(&stream)
|| self.force_update.load(Ordering::Relaxed)
{
|| self.force_update.load(Ordering::Relaxed);
if should_update_stream {
log::info!(
"[{}]Fetched a new stream: {:?} => {}",
"[{}]Update to a new stream: {:?} => {}",
self.room_id,
self.live_stream.read().await.clone(),
stream
);
*self.current_record.write().await = true;
self.force_update.store(false, Ordering::Relaxed);
}
if *self.current_record.read().await {
*self.live_stream.write().await = Some(stream);
let new_stream = self.fetch_real_stream(stream).await;
if new_stream.is_err() {
log::error!(
"[{}]Fetch real stream failed: {}",
self.room_id,
new_stream.err().unwrap()
);
return true;
}
let new_stream = new_stream.unwrap();
*self.live_stream.write().await = Some(new_stream);
*self.last_update.write().await = Utc::now().timestamp();
return true;
}
true
@@ -288,49 +357,67 @@ impl BiliRecorder {
}
}
async fn danmu(&self) {
let cookies = self.account.cookies.clone();
let uid: u64 = self.account.uid;
while !*self.quit.lock().await {
let (tx, rx) = mpsc::unbounded_channel();
let ws = ws_socket_object(tx, uid, self.room_id, cookies.as_str());
if let Err(e) = tokio::select! {v = ws => v, v = self.recv(self.room_id,rx) => v} {
log::error!("danmu error: {}", e);
}
// reconnect after 3s
log::warn!("danmu will reconnect after 3s");
tokio::time::sleep(Duration::from_secs(3)).await;
}
log::info!("danmu thread {} quit.", self.room_id);
async fn stream_from_variant(
&self,
variant: VariantStream,
) -> Result<BiliStream, super::errors::RecorderError> {
let url = variant.uri.clone();
// example url: https://cn-hnld-ct-01-47.bilivideo.com/live-bvc/931676/live_1789460279_3538985/index.m3u8?expires=1745927098&len=0&oi=3729149990&pt=h5&qn=10000&trid=10075ceab17d4c9498264eb76d572b6810ad&sigparams=cdn,expires,len,oi,pt,qn,trid&cdn=cn-gotcha01&sign=686434f3ad01d33e001c80bfb7e1713d&site=3124fc9e0fabc664ace3d1b33638f7f2&free_type=0&mid=0&sche=ban&bvchls=1&sid=cn-hnld-ct-01-47&chash=0&bmt=1&sg=lr&trace=25&isp=ct&rg=East&pv=Shanghai&sk=28cc07215ff940102a1d60dade11467e&codec=0&pp=rtmp&hdr_type=0&hot_cdn=57345&suffix=origin&flvsk=c9154f5b3c6b14808bc5569329cf7f94&origin_bitrate=1281767&score=1&source=puv3_master&p2p_type=-1&deploy_env=prod&sl=1&info_source=origin&vd=nc&zoneid_l=151355393&sid_l=stream_name_cold&src=puv3&order=1
// extract host: cn-hnld-ct-01-47.bilivideo.com
let host = url.split('/').nth(2).unwrap_or_default();
let extra = url.split('?').nth(1).unwrap_or_default();
// extract base url: live-bvc/931676/live_1789460279_3538985/
let base_url = url
.split('/')
.skip(3)
.take(3)
.collect::<Vec<&str>>()
.join("/")
+ "/";
let stream = BiliStream::new(StreamType::FMP4, base_url.as_str(), host, extra);
Ok(stream)
}
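// Illustrative example (not part of the diff), using a made-up URL:
//   https://host.example.com/live-bvc/931676/live_123/index.m3u8?expires=1&sign=abc
// yields
//   host     = "host.example.com"
//   base_url = "live-bvc/931676/live_123/"
//   extra    = "expires=1&sign=abc"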
async fn recv(
&self,
room: u64,
mut rx: UnboundedReceiver<WsStreamMessageType>,
) -> Result<(), FelgensError> {
while let Some(msg) = rx.recv().await {
if *self.quit.lock().await {
break;
}
if let WsStreamMessageType::DanmuMsg(msg) = msg {
let _ = self.app_handle.emit(
&format!("danmu:{}", room),
DanmuEntry {
ts: msg.timestamp as i64,
content: msg.msg.clone(),
},
);
if *self.live_status.read().await {
// save danmu
if let Some(storage) = self.danmu_storage.write().await.as_ref() {
storage.add_line(msg.timestamp as i64, &msg.msg).await;
async fn danmu(&self) -> Result<(), super::errors::RecorderError> {
let cookies = self.account.cookies.clone();
let room_id = self.room_id;
let danmu_stream = DanmuStream::new(ProviderType::BiliBili, &cookies, room_id).await;
if danmu_stream.is_err() {
let err = danmu_stream.err().unwrap();
log::error!("Failed to create danmu stream: {}", err);
return Err(super::errors::RecorderError::DanmuStreamError { err });
}
let danmu_stream = danmu_stream.unwrap();
// spawn a task to run the danmu stream; messages are consumed by the recv loop below
let danmu_stream_clone = danmu_stream.clone();
tokio::spawn(async move {
let _ = danmu_stream_clone.start().await;
});
loop {
if let Ok(Some(msg)) = danmu_stream.recv().await {
match msg {
DanmuMessageType::DanmuMessage(danmu) => {
self.emitter.emit(&Event::DanmuReceived {
room: self.room_id,
ts: danmu.timestamp,
content: danmu.message.clone(),
});
if let Some(storage) = self.danmu_storage.write().await.as_ref() {
storage.add_line(danmu.timestamp, &danmu.message).await;
}
}
}
} else {
log::error!("Failed to receive danmu message");
return Err(super::errors::RecorderError::DanmuStreamError {
err: danmu_stream::DanmuStreamError::WebsocketError {
err: "Failed to receive danmu message".to_string(),
},
});
}
}
Ok(())
}
async fn get_playlist(&self) -> Result<Playlist, super::errors::RecorderError> {
@@ -343,7 +430,7 @@ impl BiliRecorder {
.client
.read()
.await
.get_index_content(&stream.index())
.get_index_content(&self.account, &stream.index())
.await
{
Ok(index_content) => {
@@ -378,7 +465,45 @@ impl BiliRecorder {
.client
.read()
.await
.get_index_content(&stream.index())
.get_index_content(&self.account, &stream.index())
.await?;
if index_content.is_empty() {
return Err(super::errors::RecorderError::InvalidStream { stream });
}
if index_content.contains("Not Found") {
return Err(super::errors::RecorderError::IndexNotFound {
url: stream.index(),
});
}
let mut header_url = String::from("");
let re = Regex::new(r"h.*\.m4s").unwrap();
if let Some(captures) = re.captures(&index_content) {
header_url = captures.get(0).unwrap().as_str().to_string();
}
if header_url.is_empty() {
log::warn!("Parse header url failed: {}", index_content);
}
Ok(header_url)
}
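// Editor's note: an illustrative sketch (not part of the diff) of the header-extraction
// regex used above; the sample playlist line is hypothetical, real index content may differ.
fn header_regex_sketch() {
    let index_content = "#EXT-X-MAP:URI=\"h1745927098.m4s\"";
    let re = Regex::new(r"h.*\.m4s").unwrap();
    let header_url = re
        .captures(index_content)
        .and_then(|c| c.get(0))
        .map(|m| m.as_str().to_string())
        .unwrap_or_default();
    assert_eq!(header_url, "h1745927098.m4s");
}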
async fn fetch_real_stream(
&self,
stream: BiliStream,
) -> Result<BiliStream, super::errors::RecorderError> {
let index_content = self
.client
.read()
.await
.get_index_content(&self.account, &stream.index())
.await?;
if index_content.is_empty() {
return Err(super::errors::RecorderError::InvalidStream { stream });
}
let index_content = self
.client
.read()
.await
.get_index_content(&self.account, &stream.index())
.await?;
if index_content.is_empty() {
return Err(super::errors::RecorderError::InvalidStream { stream });
@@ -396,20 +521,10 @@ impl BiliRecorder {
let host = base_url.split('/').next().unwrap();
// extra is params after index.m3u8
let extra = new_url.split(base_url).last().unwrap();
let stream = BiliStream::new(StreamType::FMP4, base_url, host, extra);
log::info!("Update stream: {}", stream);
*self.live_stream.write().await = Some(stream);
return Box::pin(self.get_header_url()).await;
let new_stream = BiliStream::new(StreamType::FMP4, base_url, host, extra);
return Box::pin(self.fetch_real_stream(new_stream)).await;
}
let mut header_url = String::from("");
let re = Regex::new(r"h.*\.m4s").unwrap();
if let Some(captures) = re.captures(&index_content) {
header_url = captures.get(0).unwrap().as_str().to_string();
}
if header_url.is_empty() {
log::warn!("Parse header url failed: {}", index_content);
}
Ok(header_url)
Ok(stream)
}
async fn extract_liveid(&self, header_url: &str) -> i64 {
@@ -486,7 +601,7 @@ impl BiliRecorder {
let danmu_file_path = format!("{}{}", work_dir, "danmu.txt");
*self.danmu_storage.write().await = DanmuStorage::new(&danmu_file_path).await;
let full_header_url = current_stream.ts_url(&header_url);
let file_name = header_url.split('/').last().unwrap();
let file_name = header_url.split('/').next_back().unwrap();
let mut header = TsEntry {
url: file_name.to_string(),
sequence: 0,
@@ -556,7 +671,7 @@ impl BiliRecorder {
continue;
}
// encode segment offset into filename
let file_name = ts.uri.split('/').last().unwrap_or(&ts.uri);
let file_name = ts.uri.split('/').next_back().unwrap_or(&ts.uri);
let mut ts_length = pl.target_duration as f64;
let ts = timestamp * 1000 + seg_offset;
// calculate entry length using offset
@@ -650,6 +765,7 @@ impl BiliRecorder {
}
}
Err(e) => {
self.force_update.store(true, Ordering::Relaxed);
return Err(e);
}
}
@@ -657,13 +773,12 @@ impl BiliRecorder {
// check stream is nearly expired
// WHY: when the program starts, all streams are fetched at nearly the same time, so they expire together;
// this might hit the server rate limit. So we add a random offset to spread the refresh requests over time.
let mut rng = rand::thread_rng();
let pre_offset = rng.gen_range(5..=120);
// no need to update stream as it's not expired yet
let pre_offset = rand::random::<u64>() % 181 + 120; // Random number between 120 and 300
// no need to update stream as it's not expired yet
let current_stream = self.live_stream.read().await.clone();
if current_stream
.as_ref()
.is_some_and(|s| s.expire - Utc::now().timestamp() < pre_offset)
.is_some_and(|s| s.expire - Utc::now().timestamp() < pre_offset as i64)
{
log::info!("Stream is nearly expired, force update");
self.force_update.store(true, Ordering::Relaxed);
@@ -676,138 +791,36 @@ impl BiliRecorder {
}
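// Editor's note: a small illustrative sketch (not part of the diff) of the jittered
// pre-expiry check described in the WHY comment above: each stream is refreshed somewhere
// between 120 and 300 seconds before its `expire` timestamp, so refresh requests from
// multiple recorders are spread out instead of all firing at once.
fn is_nearly_expired_sketch(expire: i64, now: i64) -> bool {
    let pre_offset = rand::random::<u64>() % 181 + 120; // uniform in 120..=300
    expire - now < pre_offset as i64
}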
async fn generate_archive_m3u8(&self, live_id: &str, start: i64, end: i64) -> String {
let range_required = start != 0 || end != 0;
if range_required {
log::info!("Generate archive m3u8 for range [{}, {}]", start, end);
}
let cache_key = format!("{}:{}:{}", live_id, start, end);
if self.m3u8_cache.contains_key(&cache_key) {
return self.m3u8_cache.get(&cache_key).unwrap().clone();
}
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
m3u8_content += "#EXT-X-TARGETDURATION:1\n";
m3u8_content += "#EXT-X-PLAYLIST-TYPE:VOD\n";
// add header, FMP4 needs this
// TODO handle StreamType::TS
let header_url = format!("h{}.m4s", live_id);
m3u8_content += &format!("#EXT-X-MAP:URI=\"{}\"\n", header_url);
// add entries from read_dir
let work_dir = self.get_work_dir(live_id).await;
let entries = EntryStore::new(&work_dir).await.get_entries().clone();
if entries.is_empty() {
return m3u8_content;
let entry_store = EntryStore::new(&work_dir).await;
let mut range = None;
if start != 0 || end != 0 {
range = Some(Range {
x: start as f32,
y: end as f32,
})
}
let mut last_sequence = entries.first().unwrap().sequence;
let live_ts = live_id.parse::<i64>().unwrap();
m3u8_content += &format!(
"#EXT-X-OFFSET:{}\n",
(entries.first().unwrap().ts - live_ts * 1000) / 1000
);
let mut first_entry_ts = None;
for e in entries {
// ignore header, because it's already in EXT-X-MAP
if e.is_header {
continue;
}
if first_entry_ts.is_none() {
first_entry_ts = Some(e.ts / 1000);
}
let entry_offset = e.ts / 1000 - first_entry_ts.unwrap();
if range_required && (entry_offset < start || entry_offset > end) {
continue;
}
let current_seq = e.sequence;
if current_seq - last_sequence > 1 {
m3u8_content += "#EXT-X-DISCONTINUITY\n"
}
// add #EXT-X-PROGRAM-DATE-TIME with ISO 8601 date
let ts = e.ts / 1000;
let date_str = Utc.timestamp_opt(ts, 0).unwrap().to_rfc3339();
m3u8_content += &format!("#EXT-X-PROGRAM-DATE-TIME:{}\n", date_str);
m3u8_content += &format!("#EXTINF:{:.2},\n", e.length);
m3u8_content += &format!("{}\n", e.url);
last_sequence = current_seq;
}
m3u8_content += "#EXT-X-ENDLIST";
// cache this
self.m3u8_cache.insert(cache_key, m3u8_content.clone());
m3u8_content
entry_store.manifest(true, true, range)
}
/// When fetching the live/last stream m3u8, all entries are cached in memory, so it is much faster than read_dir.
async fn generate_live_m3u8(&self, start: i64, end: i64) -> String {
let range_required = start != 0 || end != 0;
if range_required {
log::info!("Generate live m3u8 for range [{}, {}]", start, end);
}
let live_status = *self.live_status.read().await;
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
m3u8_content += "#EXT-X-TARGETDURATION:1\n";
m3u8_content += "#EXT-X-SERVER-CONTROL:HOLD-BACK:3\n";
// if stream is closed, switch to VOD
if live_status && !range_required {
m3u8_content += "#EXT-X-PLAYLIST-TYPE:EVENT\n";
let range = if start != 0 || end != 0 {
Some(Range {
x: start as f32,
y: end as f32,
})
} else {
m3u8_content += "#EXT-X-PLAYLIST-TYPE:VOD\n";
}
let live_id = self.live_id.read().await.clone();
// initial segment for fmp4, info from self.header
if let Some(header) = self.entry_store.read().await.as_ref().unwrap().get_header() {
let file_name = header.url.split('/').last().unwrap();
m3u8_content += &format!("#EXT-X-MAP:URI=\"{}\"\n", file_name);
}
let entries = self
.entry_store
.read()
.await
.as_ref()
.unwrap()
.get_entries()
.clone();
if entries.is_empty() {
m3u8_content += "#EXT-X-OFFSET:0\n";
return m3u8_content;
}
None
};
let mut last_sequence = entries.first().unwrap().sequence;
// this does nothing but provide the first entry ts for the player
let live_ts = live_id.parse::<i64>().unwrap();
m3u8_content += &format!(
"#EXT-X-OFFSET:{}\n",
(entries.first().unwrap().ts - live_ts * 1000) / 1000
);
let first_entry_ts = entries.first().unwrap().ts / 1000;
for entry in entries.iter() {
let entry_offset = entry.ts / 1000 - first_entry_ts;
if range_required && (entry_offset < start || entry_offset > end) {
continue;
}
if entry.sequence - last_sequence > 1 {
// discontinuity happens
m3u8_content += "#EXT-X-DISCONTINUITY\n"
}
// add #EXT-X-PROGRAM-DATE-TIME with ISO 8601 date
let ts = entry.ts / 1000;
let date_str = Utc.timestamp_opt(ts, 0).unwrap().to_rfc3339();
m3u8_content += &format!("#EXT-X-PROGRAM-DATE-TIME:{}\n", date_str);
m3u8_content += &format!("#EXTINF:{:.2},\n", entry.length);
last_sequence = entry.sequence;
let file_name = entry.url.split('/').last().unwrap();
m3u8_content += &format!("{}\n", file_name);
}
// let player know stream is closed
if !live_status || range_required {
m3u8_content += "#EXT-X-ENDLIST";
}
m3u8_content
self.entry_store.read().await.as_ref().unwrap().manifest(
!live_status || range.is_some(),
true,
range,
)
}
}
@@ -815,75 +828,97 @@ impl BiliRecorder {
impl super::Recorder for BiliRecorder {
async fn run(&self) {
let self_clone = self.clone();
thread::spawn(move || {
let runtime = tokio::runtime::Runtime::new().unwrap();
runtime.block_on(async move {
while !*self_clone.quit.lock().await {
if self_clone.check_status().await {
// Live status is ok, start recording.
while self_clone.should_record().await {
match self_clone.update_entries().await {
Ok(ms) => {
if ms < 1000 {
thread::sleep(std::time::Duration::from_millis(
(1000 - ms) as u64,
));
}
if ms >= 3000 {
log::warn!(
"[{}]Update entries cost too long: {}ms",
self_clone.room_id,
ms
);
}
*self_clone.is_recording.write().await = true;
*self.danmu_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start fetching danmu for room {}", self_clone.room_id);
let _ = self_clone.danmu().await;
}));
let self_clone = self.clone();
*self.record_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start running recorder for room {}", self_clone.room_id);
while !*self_clone.quit.lock().await {
let mut connection_fail_count = 0;
if self_clone.check_status().await {
// Live status is ok, start recording.
while self_clone.should_record().await {
match self_clone.update_entries().await {
Ok(ms) => {
if ms < 1000 {
tokio::time::sleep(Duration::from_millis((1000 - ms) as u64))
.await;
}
Err(e) => {
log::error!(
"[{}]Update entries error: {}",
if ms >= 3000 {
log::warn!(
"[{}]Update entries cost too long: {}ms",
self_clone.room_id,
e
ms
);
break;
}
*self_clone.is_recording.write().await = true;
connection_fail_count = 0;
}
Err(e) => {
log::error!("[{}]Update entries error: {}", self_clone.room_id, e);
if let RecorderError::BiliClientError { err: _ } = e {
connection_fail_count =
std::cmp::min(5, connection_fail_count + 1);
}
break;
}
}
*self_clone.is_recording.write().await = false;
// go check status again after random 2-5 secs
let mut rng = rand::thread_rng();
let secs = rng.gen_range(2..=5);
thread::sleep(std::time::Duration::from_secs(secs));
continue;
}
// Every 10s check live status.
thread::sleep(std::time::Duration::from_secs(10));
*self_clone.is_recording.write().await = false;
// go check status again after random 2-5 secs
let secs = rand::random::<u64>() % 4 + 2;
tokio::time::sleep(Duration::from_secs(
secs + 2_u64.pow(connection_fail_count),
))
.await;
continue;
}
log::info!("recording thread {} quit.", self_clone.room_id);
});
});
// Thread for danmaku
let self_clone = self.clone();
thread::spawn(move || {
let runtime = tokio::runtime::Runtime::new().unwrap();
runtime.block_on(async move {
self_clone.danmu().await;
});
});
tokio::time::sleep(Duration::from_secs(
self_clone.config.read().await.status_check_interval,
))
.await;
}
}));
}
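// Editor's note (illustrative, not part of the diff): the retry delay above is
// `secs + 2^connection_fail_count`, where `secs = rand::random::<u64>() % 4 + 2` is
// uniform in 2..=5 and the fail count is capped at 5, so consecutive client errors
// back off roughly 3 s (2 + 2^0) up to 37 s (5 + 2^5) before the status is re-checked.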
async fn stop(&self) {
log::debug!("Stop recorder for room {}", self.room_id);
*self.quit.lock().await = true;
if let Some(danmu_task) = self.danmu_task.lock().await.as_mut() {
let _ = danmu_task.abort();
}
if let Some(record_task) = self.record_task.lock().await.as_mut() {
let _ = record_task.abort();
}
log::info!("Recorder for room {} quit.", self.room_id);
}
/// timestamp is the id of live stream
async fn m3u8_content(&self, live_id: &str, start: i64, end: i64) -> String {
if *self.live_id.read().await == live_id && *self.current_record.read().await {
if *self.live_id.read().await == live_id && self.should_record().await {
self.generate_live_m3u8(start, end).await
} else {
self.generate_archive_m3u8(live_id, start, end).await
}
}
async fn master_m3u8(&self, live_id: &str, start: i64, end: i64) -> String {
log::info!("Master manifest for {live_id} {start}-{end}");
let offset = self.first_segment_ts(live_id).await / 1000;
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
m3u8_content += &format!(
"#EXT-X-STREAM-INF:BANDWIDTH=1280000,RESOLUTION=1920x1080,CODECS={},DANMU={}\n",
"\"avc1.64001F,mp4a.40.2\"", offset
);
m3u8_content += &format!("playlist.m3u8?start={}&end={}\n", start, end);
m3u8_content
}
async fn first_segment_ts(&self, live_id: &str) -> i64 {
if *self.live_id.read().await == live_id {
let entry_store = self.entry_store.read().await;
@@ -921,7 +956,7 @@ impl super::Recorder for BiliRecorder {
current_live_id: self.live_id.read().await.clone(),
live_status: *self.live_status.read().await,
is_recording: *self.is_recording.read().await,
auto_start: *self.auto_start.read().await,
auto_start: *self.enabled.read().await,
platform: PlatformType::BiliBili.as_str().to_string(),
}
}
@@ -959,15 +994,11 @@ impl super::Recorder for BiliRecorder {
*self.live_id.read().await == live_id && *self.live_status.read().await
}
async fn force_start(&self) {
*self.current_record.write().await = true;
async fn enable(&self) {
*self.enabled.write().await = true;
}
async fn force_stop(&self) {
*self.current_record.write().await = false;
}
async fn set_auto_start(&self, auto_start: bool) {
*self.auto_start.write().await = auto_start;
async fn disable(&self) {
*self.enabled.write().await = false;
}
}

View File

@@ -2,14 +2,13 @@ use super::errors::BiliClientError;
use super::profile;
use super::profile::Profile;
use super::response;
use super::response::Format;
use super::response::GeneralResponse;
use super::response::PostVideoMetaResponse;
use super::response::PreuploadResponse;
use super::response::VideoSubmitData;
use crate::database::account::AccountRow;
use crate::progress_event::ProgressReporter;
use crate::progress_event::ProgressReporterTrait;
use crate::progress_reporter::ProgressReporter;
use crate::progress_reporter::ProgressReporterTrait;
use base64::Engine;
use pct_str::PctString;
use pct_str::URIReserved;
@@ -110,11 +109,17 @@ impl BiliStream {
}
pub fn index(&self) -> String {
format!("{}{}{}?{}", self.host, self.path, "index.m3u8", self.extra)
format!(
"https://{}/{}/{}?{}",
self.host, self.path, "index.m3u8", self.extra
)
}
pub fn ts_url(&self, seg_name: &str) -> String {
format!("{}{}{}?{}", self.host, self.path, seg_name, self.extra)
format!(
"https://{}/{}/{}?{}",
self.host, self.path, seg_name, self.extra
)
}
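// Editor's note: an illustrative sketch (not part of the diff) of the URL shape produced
// by `index()` above; host/path/extra are hypothetical values, and whether `path` carries
// a trailing slash depends on `get_path`.
fn index_url_sketch() -> String {
    let (host, path, extra) = (
        "cn-example.bilivideo.com",
        "live-bvc/931676/live_1789460279_3538985",
        "expires=1745927098&len=0",
    );
    // mirrors the format string in `index()`:
    // => "https://cn-example.bilivideo.com/live-bvc/931676/live_1789460279_3538985/index.m3u8?expires=1745927098&len=0"
    format!("https://{}/{}/{}?{}", host, path, "index.m3u8", extra)
}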
pub fn get_path(base_url: &str) -> String {
@@ -160,29 +165,6 @@ impl BiliClient {
}
}
pub async fn fetch_webid(&self, account: &AccountRow) -> Result<String, BiliClientError> {
// get webid from html content
// webid is in script tag <script id="__RENDER_DATA__" type="application/json">
// https://space.bilibili.com/{user_id}
// let url = format!("https://space.bilibili.com/{}", account.uid);
// let res = self.client.get(&url).send().await?;
// let content = res.text().await?;
// let re =
// Regex::new(r#"<script id="__RENDER_DATA__" type="application/json">(.+?)</script>"#)
// .unwrap();
// let cap = re.captures(&content).ok_or(BiliClientError::InvalidValue)?;
// let str = cap.get(1).ok_or(BiliClientError::InvalidValue)?.as_str();
// // str need url decode
// let json_str = urlencoding::decode(str).map_err(|_| BiliClientError::InvalidValue)?; // url decode
// let json: serde_json::Value = serde_json::from_str(&json_str).unwrap();
// let webid = json["access_id"]
// .as_str()
// .ok_or(BiliClientError::InvalidValue)?;
// log::info!("webid: {}", webid);
// Ok(webid.into())
Ok("".into())
}
pub async fn get_qr(&self) -> Result<QrInfo, BiliClientError> {
let res: serde_json::Value = self
.client
@@ -223,7 +205,7 @@ impl BiliClient {
.as_str()
.ok_or(BiliClientError::InvalidValue)?
.to_string();
let query_str = url.split('?').last().unwrap();
let query_str = url.split('?').next_back().unwrap();
cookies = query_str.replace('&', ";");
}
Ok(QrStatus { code, cookies })
@@ -247,7 +229,6 @@ impl BiliClient {
pub async fn get_user_info(
&self,
webid: &str,
account: &AccountRow,
user_id: u64,
) -> Result<UserInfo, BiliClientError> {
@@ -256,12 +237,12 @@ impl BiliClient {
"platform": "web",
"web_location": "1550101",
"token": "",
"w_webid": webid,
"w_webid": "",
});
let params = self.get_sign(params).await?;
let mut headers = self.headers.clone();
headers.insert("cookie", account.cookies.parse().unwrap());
let res: serde_json::Value = self
let resp = self
.client
.get(format!(
"https://api.bilibili.com/x/space/wbi/acc/info?{}",
@@ -269,15 +250,24 @@ impl BiliClient {
))
.headers(headers)
.send()
.await?
.json()
.await?;
if res["code"].as_i64().unwrap_or(-1) != 0 {
log::error!(
"Get user info failed {}",
res["code"].as_i64().unwrap_or(-1)
);
return Err(BiliClientError::InvalidCode);
if !resp.status().is_success() {
if resp.status() == reqwest::StatusCode::PRECONDITION_FAILED {
return Err(BiliClientError::SecurityControlError);
}
return Err(BiliClientError::InvalidResponseStatus {
status: resp.status(),
});
}
let res: serde_json::Value = resp.json().await?;
let code = res["code"]
.as_u64()
.ok_or(BiliClientError::InvalidResponseJson { resp: res.clone() })?;
if code != 0 {
log::error!("Get user info failed {}", code);
return Err(BiliClientError::InvalidMessageCode { code });
}
Ok(UserInfo {
user_id,
@@ -294,7 +284,7 @@ impl BiliClient {
) -> Result<RoomInfo, BiliClientError> {
let mut headers = self.headers.clone();
headers.insert("cookie", account.cookies.parse().unwrap());
let res: serde_json::Value = self
let response = self
.client
.get(format!(
"https://api.live.bilibili.com/room/v1/Room/get_info?room_id={}",
@@ -302,12 +292,23 @@ impl BiliClient {
))
.headers(headers)
.send()
.await?
.json()
.await?;
let code = res["code"].as_u64().ok_or(BiliClientError::InvalidValue)?;
if !response.status().is_success() {
if response.status() == reqwest::StatusCode::PRECONDITION_FAILED {
return Err(BiliClientError::SecurityControlError);
}
return Err(BiliClientError::InvalidResponseStatus {
status: response.status(),
});
}
let res: serde_json::Value = response.json().await?;
let code = res["code"]
.as_u64()
.ok_or(BiliClientError::InvalidResponseJson { resp: res.clone() })?;
if code != 0 {
return Err(BiliClientError::InvalidCode);
return Err(BiliClientError::InvalidMessageCode { code });
}
let room_id = res["data"]["room_id"]
@@ -341,8 +342,8 @@ impl BiliClient {
})
}
/// Get and encode response data into base64
pub async fn get_cover_base64(&self, url: &str) -> Result<String, BiliClientError> {
log::info!("get_cover_base64: {}", url);
let response = self.client.get(url).send().await?;
let bytes = response.bytes().await?;
let base64 = base64::engine::general_purpose::STANDARD.encode(bytes);
@@ -352,72 +353,26 @@ impl BiliClient {
Ok(format!("data:{};base64,{}", mime_type, base64))
}
pub async fn get_play_url(
pub async fn get_index_content(
&self,
account: &AccountRow,
room_id: u64,
) -> Result<BiliStream, BiliClientError> {
url: &String,
) -> Result<String, BiliClientError> {
let mut headers = self.headers.clone();
headers.insert("cookie", account.cookies.parse().unwrap());
let res: GeneralResponse = self
.client
.get(format!(
"https://api.live.bilibili.com/xlive/web-room/v2/index/getRoomPlayInfo?room_id={}&protocol=1&format=0,1,2&codec=0&qn=10000&platform=h5",
room_id
))
.headers(headers)
.send().await?
.json().await?;
if res.code == 0 {
if let response::Data::RoomPlayInfo(data) = res.data {
if let Some(stream) = data.playurl_info.playurl.stream.first() {
// Get fmp4 format
if let Some(f) = stream.format.iter().find(|f| f.format_name == "fmp4") {
self.get_stream(f).await
} else {
log::error!("No fmp4 stream found: {:?}", data);
Err(BiliClientError::InvalidResponse)
}
} else {
log::error!("No stream provided: {:#?}", data);
Err(BiliClientError::InvalidResponse)
}
} else {
log::error!("Invalid response: {:#?}", res);
Err(BiliClientError::InvalidResponse)
}
} else {
log::error!("Invalid response: {:#?}", res);
Err(BiliClientError::InvalidResponse)
}
}
async fn get_stream(&self, format: &Format) -> Result<BiliStream, BiliClientError> {
if let Some(codec) = format.codec.first() {
if let Some(url_info) = codec.url_info.first() {
Ok(BiliStream::new(
StreamType::FMP4,
&codec.base_url,
&url_info.host,
&url_info.extra,
))
} else {
Err(BiliClientError::InvalidFormat)
}
} else {
Err(BiliClientError::InvalidFormat)
}
}
pub async fn get_index_content(&self, url: &String) -> Result<String, BiliClientError> {
Ok(self
let response = self
.client
.get(url.to_owned())
.headers(self.headers.clone())
.headers(headers)
.send()
.await?
.text()
.await?)
.await?;
if response.status().is_success() {
Ok(response.text().await?)
} else {
log::error!("get_index_content failed: {}", response.status());
Err(BiliClientError::InvalidStream)
}
}
pub async fn download_ts(&self, url: &str, file_path: &str) -> Result<u64, BiliClientError> {

View File

@@ -3,15 +3,19 @@ use custom_error::custom_error;
custom_error! {pub BiliClientError
InvalidResponse = "Invalid response",
InitClientError = "Client init error",
InvalidCode = "Invalid Code",
InvalidResponseStatus{ status: reqwest::StatusCode } = "Invalid response status: {status}",
InvalidResponseJson{ resp: serde_json::Value } = "Invalid response json: {resp}",
InvalidMessageCode{ code: u64 } = "Invalid message code: {code}",
InvalidValue = "Invalid value",
InvalidUrl = "Invalid url",
InvalidFormat = "Invalid stream format",
InvalidStream = "Invalid stream",
UploadError{err: String} = "Upload error: {err}",
UploadCancelled = "Upload was cancelled by user",
EmptyCache = "Empty cache",
ClientError{err: reqwest::Error} = "Client error: {err}",
IOError{err: std::io::Error} = "IO error: {err}",
SecurityControlError = "Security control error",
}
impl From<reqwest::Error> for BiliClientError {

View File

@@ -12,6 +12,7 @@ pub struct GeneralResponse {
#[derive(Serialize, Deserialize, Debug)]
#[serde(untagged)]
#[allow(clippy::large_enum_variant)]
pub enum Data {
VideoSubmit(VideoSubmitData),
Cover(CoverData),

View File

@@ -1,24 +1,32 @@
pub mod client;
mod response;
mod stream_info;
use super::entry::{EntryStore, TsEntry};
use super::entry::{EntryStore, Range, TsEntry};
use super::{
danmu::DanmuEntry, errors::RecorderError, PlatformType, Recorder, RecorderInfo, RoomInfo,
UserInfo,
};
use crate::database::Database;
use crate::progress_manager::Event;
use crate::progress_reporter::EventEmitter;
use crate::recorder_manager::RecorderEvent;
use crate::{config::Config, database::account::AccountRow};
use async_trait::async_trait;
use chrono::{TimeZone, Utc};
use chrono::Utc;
use client::DouyinClientError;
use dashmap::DashMap;
use std::collections::HashMap;
use danmu_stream::danmu_stream::DanmuStream;
use danmu_stream::provider::ProviderType;
use danmu_stream::DanmuMessageType;
use rand::random;
use std::sync::Arc;
use std::time::Duration;
use tauri::AppHandle;
use tauri_plugin_notification::NotificationExt;
use tokio::sync::{broadcast, RwLock};
use tokio::sync::{broadcast, Mutex, RwLock};
use tokio::task::JoinHandle;
use super::danmu::DanmuStorage;
#[cfg(not(feature = "headless"))]
use {tauri::AppHandle, tauri_plugin_notification::NotificationExt};
#[derive(Clone, Copy, PartialEq, Debug)]
pub enum LiveStatus {
@@ -40,36 +48,44 @@ impl From<DouyinClientError> for RecorderError {
#[derive(Clone)]
pub struct DouyinRecorder {
#[cfg(not(feature = "headless"))]
app_handle: AppHandle,
emitter: EventEmitter,
client: client::DouyinClient,
db: Arc<Database>,
pub room_id: u64,
pub room_info: Arc<RwLock<Option<response::DouyinRoomInfoResponse>>>,
pub stream_url: Arc<RwLock<Option<String>>>,
pub entry_store: Arc<RwLock<Option<EntryStore>>>,
pub live_id: Arc<RwLock<String>>,
pub live_status: Arc<RwLock<LiveStatus>>,
account: AccountRow,
room_id: u64,
room_info: Arc<RwLock<Option<response::DouyinRoomInfoResponse>>>,
stream_url: Arc<RwLock<Option<String>>>,
entry_store: Arc<RwLock<Option<EntryStore>>>,
danmu_store: Arc<RwLock<Option<DanmuStorage>>>,
live_id: Arc<RwLock<String>>,
live_status: Arc<RwLock<LiveStatus>>,
is_recording: Arc<RwLock<bool>>,
auto_start: Arc<RwLock<bool>>,
current_record: Arc<RwLock<bool>>,
running: Arc<RwLock<bool>>,
last_update: Arc<RwLock<i64>>,
m3u8_cache: DashMap<String, String>,
config: Arc<RwLock<Config>>,
live_end_channel: broadcast::Sender<RecorderEvent>,
enabled: Arc<RwLock<bool>>,
danmu_stream_task: Arc<Mutex<Option<JoinHandle<()>>>>,
danmu_task: Arc<Mutex<Option<JoinHandle<()>>>>,
record_task: Arc<Mutex<Option<JoinHandle<()>>>>,
}
impl DouyinRecorder {
#[allow(clippy::too_many_arguments)]
pub async fn new(
app_handle: AppHandle,
#[cfg(not(feature = "headless"))] app_handle: AppHandle,
emitter: EventEmitter,
room_id: u64,
config: Arc<RwLock<Config>>,
douyin_account: &AccountRow,
account: &AccountRow,
db: &Arc<Database>,
auto_start: bool,
enabled: bool,
channel: broadcast::Sender<RecorderEvent>,
) -> Result<Self, super::errors::RecorderError> {
let client = client::DouyinClient::new(douyin_account);
let client = client::DouyinClient::new(account);
let room_info = client.get_room_info(room_id).await?;
let mut live_status = LiveStatus::Offline;
if room_info.data.room_status == 0 {
@@ -77,23 +93,29 @@ impl DouyinRecorder {
}
Ok(Self {
#[cfg(not(feature = "headless"))]
app_handle,
emitter,
db: db.clone(),
account: account.clone(),
room_id,
live_id: Arc::new(RwLock::new(String::new())),
entry_store: Arc::new(RwLock::new(None)),
danmu_store: Arc::new(RwLock::new(None)),
client,
room_info: Arc::new(RwLock::new(Some(room_info))),
stream_url: Arc::new(RwLock::new(None)),
live_status: Arc::new(RwLock::new(live_status)),
running: Arc::new(RwLock::new(false)),
is_recording: Arc::new(RwLock::new(false)),
auto_start: Arc::new(RwLock::new(auto_start)),
current_record: Arc::new(RwLock::new(false)),
enabled: Arc::new(RwLock::new(enabled)),
last_update: Arc::new(RwLock::new(Utc::now().timestamp())),
m3u8_cache: DashMap::new(),
config,
live_end_channel: channel,
danmu_stream_task: Arc::new(Mutex::new(None)),
danmu_task: Arc::new(Mutex::new(None)),
record_task: Arc::new(Mutex::new(None)),
})
}
@@ -102,30 +124,27 @@ impl DouyinRecorder {
return false;
}
*self.current_record.read().await
*self.enabled.read().await
}
async fn check_status(&self) -> bool {
match self.client.get_room_info(self.room_id).await {
Ok(info) => {
let live_status = info.data.room_status == 0; // room_status == 0 means the room is currently live
let previous_liveid = self.live_id.read().await.clone();
*self.room_info.write().await = Some(info.clone());
if (*self.live_status.read().await == LiveStatus::Live) != live_status {
// live status changed, reset current record flag
*self.current_record.write().await = false;
log::info!(
"[{}]Live status changed to {}, current_record: {}, auto_start: {}",
"[{}]Live status changed to {}, auto_start: {}",
self.room_id,
live_status,
*self.current_record.read().await,
*self.auto_start.read().await
*self.enabled.read().await
);
if live_status {
#[cfg(not(feature = "headless"))]
self.app_handle
.notification()
.builder()
@@ -137,6 +156,7 @@ impl DouyinRecorder {
.show()
.unwrap();
} else {
#[cfg(not(feature = "headless"))]
self.app_handle
.notification()
.builder()
@@ -147,7 +167,6 @@ impl DouyinRecorder {
))
.show()
.unwrap();
let _ = self.live_end_channel.send(RecorderEvent::LiveEnd {
platform: PlatformType::Douyin,
room_id: self.room_id,
@@ -165,65 +184,75 @@ impl DouyinRecorder {
}
if !live_status {
*self.current_record.write().await = false;
self.reset().await;
return false;
}
if !*self.current_record.read().await && !*self.auto_start.read().await {
let should_record = self.should_record().await;
if !should_record {
return true;
}
if *self.auto_start.read().await
&& previous_liveid != info.data.data[0].id_str.clone()
// Get stream URL when live starts
if !info.data.data[0]
.stream_url
.as_ref()
.unwrap()
.hls_pull_url
.is_empty()
{
*self.current_record.write().await = true;
}
if *self.current_record.read().await {
// Get stream URL when live starts
if !info.data.data[0]
.stream_url
*self.live_id.write().await = info.data.data[0].id_str.clone();
// create a new record
let cover_url = info.data.data[0]
.cover
.as_ref()
.unwrap()
.hls_pull_url
.is_empty()
.map(|cover| cover.url_list[0].clone());
let cover = if let Some(url) = cover_url {
Some(self.client.get_cover_base64(&url).await.unwrap())
} else {
None
};
if let Err(e) = self
.db
.add_record(
PlatformType::Douyin,
self.live_id.read().await.as_str(),
self.room_id,
&info.data.data[0].title,
cover,
None,
)
.await
{
*self.live_id.write().await = info.data.data[0].id_str.clone();
// create a new record
let cover_url = info.data.data[0]
.cover
.as_ref()
.map(|cover| cover.url_list[0].clone());
let cover = if let Some(url) = cover_url {
Some(self.client.get_cover_base64(&url).await.unwrap())
} else {
None
};
if let Err(e) = self
.db
.add_record(
PlatformType::Douyin,
self.live_id.read().await.as_str(),
self.room_id,
&info.data.data[0].title,
cover,
None,
)
.await
{
log::error!("Failed to add record: {}", e);
}
// setup entry store
let work_dir = self.get_work_dir(self.live_id.read().await.as_str()).await;
let entry_store = EntryStore::new(&work_dir).await;
*self.entry_store.write().await = Some(entry_store);
log::error!("Failed to add record: {}", e);
}
return true;
// setup entry store
let work_dir = self.get_work_dir(self.live_id.read().await.as_str()).await;
let entry_store = EntryStore::new(&work_dir).await;
*self.entry_store.write().await = Some(entry_store);
// setup danmu store
let danmu_file_path = format!("{}{}", work_dir, "danmu.txt");
let danmu_store = DanmuStorage::new(&danmu_file_path).await;
*self.danmu_store.write().await = danmu_store;
// start danmu task
if let Some(danmu_task) = self.danmu_task.lock().await.as_mut() {
danmu_task.abort();
}
if let Some(danmu_stream_task) = self.danmu_stream_task.lock().await.as_mut() {
danmu_stream_task.abort();
}
let live_id = self.live_id.read().await.clone();
let self_clone = self.clone();
*self.danmu_task.lock().await = Some(tokio::spawn(async move {
log::info!("Start fetching danmu for live {}", live_id);
let _ = self_clone.danmu().await;
}));
}
true
@@ -235,6 +264,53 @@ impl DouyinRecorder {
}
}
async fn danmu(&self) -> Result<(), super::errors::RecorderError> {
let cookies = self.account.cookies.clone();
let live_id = self
.live_id
.read()
.await
.clone()
.parse::<u64>()
.unwrap_or(0);
let danmu_stream = DanmuStream::new(ProviderType::Douyin, &cookies, live_id).await;
if danmu_stream.is_err() {
let err = danmu_stream.err().unwrap();
log::error!("Failed to create danmu stream: {}", err);
return Err(super::errors::RecorderError::DanmuStreamError { err });
}
let danmu_stream = danmu_stream.unwrap();
let danmu_stream_clone = danmu_stream.clone();
*self.danmu_stream_task.lock().await = Some(tokio::spawn(async move {
let _ = danmu_stream_clone.start().await;
}));
loop {
if let Ok(Some(msg)) = danmu_stream.recv().await {
match msg {
DanmuMessageType::DanmuMessage(danmu) => {
self.emitter.emit(&Event::DanmuReceived {
room: self.room_id,
ts: danmu.timestamp,
content: danmu.message.clone(),
});
if let Some(storage) = self.danmu_store.read().await.as_ref() {
storage.add_line(danmu.timestamp, &danmu.message).await;
}
}
}
} else {
log::error!("Failed to receive danmu message");
return Err(super::errors::RecorderError::DanmuStreamError {
err: danmu_stream::DanmuStreamError::WebsocketError {
err: "Failed to receive danmu message".to_string(),
},
});
}
}
}
async fn reset(&self) {
*self.entry_store.write().await = None;
*self.live_id.write().await = String::new();
@@ -360,7 +436,7 @@ impl DouyinRecorder {
sequence,
length: segment.duration as f64,
size,
ts: Utc::now().timestamp(),
ts: Utc::now().timestamp_millis(),
is_header: false,
};
@@ -374,6 +450,8 @@ impl DouyinRecorder {
}
Err(e) => {
log::error!("Failed to download segment: {}", e);
*self.stream_url.write().await = None;
return Err(e.into());
}
}
}
@@ -406,79 +484,30 @@ impl DouyinRecorder {
}
async fn generate_m3u8(&self, live_id: &str, start: i64, end: i64) -> String {
let mut m3u8_content = "#EXTM3U\n".to_string();
let range_required = start != 0 || end != 0;
m3u8_content += "#EXT-X-VERSION:3\n";
log::debug!("Generate m3u8 for {live_id}:{start}:{end}");
let range = if start != 0 || end != 0 {
Some(Range {
x: start as f32,
y: end as f32,
})
} else {
None
};
// if a range is required, we need to filter entries and only use those in the range, so the m3u8 type is VOD.
let entries = if !range_required && live_id == *self.live_id.read().await {
m3u8_content += "#EXT-X-PLAYLIST-TYPE:EVENT\n";
if live_id == *self.live_id.read().await {
self.entry_store
.read()
.await
.as_ref()
.unwrap()
.get_entries()
.clone()
.manifest(range.is_some(), false, range)
} else {
m3u8_content += "#EXT-X-PLAYLIST-TYPE:VOD\n";
let work_dir = self.get_work_dir(live_id).await;
let entry_store = EntryStore::new(&work_dir).await;
entry_store.get_entries().clone()
};
m3u8_content += "#EXT-X-OFFSET:0\n";
if entries.is_empty() {
return m3u8_content;
EntryStore::new(&work_dir)
.await
.manifest(true, false, range)
}
m3u8_content += "#EXT-X-TARGETDURATION:6\n";
let first_sequence = entries.first().as_ref().unwrap().sequence;
let first_entry_ts = entries.first().unwrap().ts;
let mut previous_seq = first_sequence;
let mut discontinue_entries = HashMap::<u64, bool>::new();
for entry in &entries {
if range_required
&& (entry.ts - first_entry_ts < start || entry.ts - first_entry_ts > end)
{
continue;
}
if entry.sequence - previous_seq > 1 {
discontinue_entries.insert(entry.sequence, true);
discontinue_entries.insert(previous_seq, true);
}
previous_seq = entry.sequence;
}
// reset previous seq
previous_seq = first_sequence;
for entry in entries {
if range_required
&& (entry.ts - first_entry_ts < start || entry.ts - first_entry_ts > end)
{
continue;
}
if entry.sequence - previous_seq > 1 {
m3u8_content += "#EXT-X-DISCONTINUITY\n";
}
previous_seq = entry.sequence;
if *discontinue_entries.get(&entry.sequence).unwrap_or(&false) {
let date_str = Utc.timestamp_opt(entry.ts, 0).unwrap().to_rfc3339();
m3u8_content += &format!("#EXT-X-PROGRAM-DATE-TIME:{}\n", date_str);
}
m3u8_content += &format!("#EXTINF:{:.2},\n", entry.length);
m3u8_content += &format!("{}\n", entry.url);
}
if *self.live_status.read().await != LiveStatus::Live || range_required {
m3u8_content += "#EXT-X-ENDLIST\n";
}
m3u8_content
}
}
@@ -488,8 +517,9 @@ impl Recorder for DouyinRecorder {
*self.running.write().await = true;
let self_clone = self.clone();
tokio::spawn(async move {
*self.record_task.lock().await = Some(tokio::spawn(async move {
while *self_clone.running.read().await {
let mut connection_fail_count = 0;
if self_clone.check_status().await {
// Live status is ok, start recording
while self_clone.should_record().await {
@@ -507,41 +537,64 @@ impl Recorder for DouyinRecorder {
);
}
*self_clone.is_recording.write().await = true;
connection_fail_count = 0;
}
Err(e) => {
log::error!("[{}]Update entries error: {}", self_clone.room_id, e);
if let RecorderError::DouyinClientError { err: _e } = e {
connection_fail_count =
std::cmp::min(5, connection_fail_count + 1);
}
break;
}
}
}
*self_clone.is_recording.write().await = false;
// Check status again after 2-5 seconds
tokio::time::sleep(Duration::from_secs(2)).await;
// Check status again after some seconds
let secs = random::<u64>() % 5;
tokio::time::sleep(Duration::from_secs(
secs + 2_u64.pow(connection_fail_count),
))
.await;
continue;
}
// Check live status every 10s
tokio::time::sleep(Duration::from_secs(10)).await;
tokio::time::sleep(Duration::from_secs(
self_clone.config.read().await.status_check_interval,
))
.await;
}
log::info!("recording thread {} quit.", self_clone.room_id);
});
}));
}
async fn stop(&self) {
*self.running.write().await = false;
// stop 3 tasks
if let Some(danmu_task) = self.danmu_task.lock().await.as_mut() {
let _ = danmu_task.abort();
}
if let Some(danmu_stream_task) = self.danmu_stream_task.lock().await.as_mut() {
let _ = danmu_stream_task.abort();
}
if let Some(record_task) = self.record_task.lock().await.as_mut() {
let _ = record_task.abort();
}
log::info!("Recorder for room {} quit.", self.room_id);
}
async fn m3u8_content(&self, live_id: &str, start: i64, end: i64) -> String {
let cache_key = format!("{}:{}:{}", live_id, start, end);
let range_required = start != 0 || end != 0;
if !range_required {
return self.generate_m3u8(live_id, start, end).await;
}
self.generate_m3u8(live_id, start, end).await
}
if let Some(cached) = self.m3u8_cache.get(&cache_key) {
return cached.clone();
}
let m3u8_content = self.generate_m3u8(live_id, start, end).await;
self.m3u8_cache.insert(cache_key, m3u8_content.clone());
async fn master_m3u8(&self, live_id: &str, start: i64, end: i64) -> String {
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
m3u8_content += format!(
"#EXT-X-STREAM-INF:BANDWIDTH=1280000,RESOLUTION=1920x1080,CODECS=\"avc1.64001F,mp4a.40.2\",DANMU={}\n",
self.first_segment_ts(live_id).await / 1000
)
.as_str();
m3u8_content += &format!("playlist.m3u8?start={}&end={}\n", start, end);
m3u8_content
}
@@ -604,28 +657,46 @@ impl Recorder for DouyinRecorder {
current_live_id: self.live_id.read().await.clone(),
live_status: *self.live_status.read().await == LiveStatus::Live,
is_recording: *self.is_recording.read().await,
auto_start: *self.auto_start.read().await,
auto_start: *self.enabled.read().await,
platform: PlatformType::Douyin.as_str().to_string(),
}
}
async fn comments(&self, _live_id: &str) -> Result<Vec<DanmuEntry>, RecorderError> {
Ok(vec![])
async fn comments(&self, live_id: &str) -> Result<Vec<DanmuEntry>, RecorderError> {
Ok(if live_id == *self.live_id.read().await {
// just return current cache content
match self.danmu_store.read().await.as_ref() {
Some(storage) => storage.get_entries().await,
None => Vec::new(),
}
} else {
// load disk cache
let cache_file_path = format!(
"{}/douyin/{}/{}/{}",
self.config.read().await.cache,
self.room_id,
live_id,
"danmu.txt"
);
log::debug!("loading danmu cache from {}", cache_file_path);
let storage = DanmuStorage::new(&cache_file_path).await;
if storage.is_none() {
return Ok(Vec::new());
}
let storage = storage.unwrap();
storage.get_entries().await
})
}
async fn is_recording(&self, live_id: &str) -> bool {
*self.live_id.read().await == live_id && *self.live_status.read().await == LiveStatus::Live
}
async fn force_start(&self) {
*self.current_record.write().await = true;
async fn enable(&self) {
*self.enabled.write().await = true;
}
async fn force_stop(&self) {
*self.current_record.write().await = false;
}
async fn set_auto_start(&self, auto_start: bool) {
*self.auto_start.write().await = auto_start;
async fn disable(&self) {
*self.enabled.write().await = false;
}
}

View File

@@ -1,9 +1,9 @@
use crate::database::account::AccountRow;
use base64::Engine;
use m3u8_rs::{MediaPlaylist, Playlist};
use reqwest::{Client, Error as ReqwestError};
use m3u8_rs::{Playlist, MediaPlaylist};
use tokio::fs::File;
use tokio::io::AsyncWriteExt;
use crate::database::account::AccountRow;
use super::response::DouyinRoomInfoResponse;
use std::fmt;
@@ -47,20 +47,25 @@ pub struct DouyinClient {
impl DouyinClient {
pub fn new(account: &AccountRow) -> Self {
let client = Client::builder()
.user_agent(USER_AGENT)
.build()
.unwrap();
Self { client, cookies: account.cookies.clone() }
let client = Client::builder().user_agent(USER_AGENT).build().unwrap();
Self {
client,
cookies: account.cookies.clone(),
}
}
pub async fn get_room_info(&self, room_id: u64) -> Result<DouyinRoomInfoResponse, DouyinClientError> {
pub async fn get_room_info(
&self,
room_id: u64,
) -> Result<DouyinRoomInfoResponse, DouyinClientError> {
let url = format!(
"https://live.douyin.com/webcast/room/web/enter/?aid=6383&app_name=douyin_web&live_id=1&device_platform=web&language=zh-CN&enter_from=web_live&cookie_enabled=true&screen_width=1920&screen_height=1080&browser_language=zh-CN&browser_platform=MacIntel&browser_name=Chrome&browser_version=122.0.0.0&web_rid={}",
room_id
);
let resp = self.client.get(&url)
let resp = self
.client
.get(&url)
.header("Referer", "https://live.douyin.com/")
.header("User-Agent", USER_AGENT)
.header("Cookie", self.cookies.clone())
@@ -77,16 +82,17 @@ impl DouyinClient {
let response = self.client.get(url).send().await?;
let bytes = response.bytes().await?;
let base64 = base64::engine::general_purpose::STANDARD.encode(bytes);
let mime_type = mime_guess::from_path(url).first_or_octet_stream().to_string();
let mime_type = mime_guess::from_path(url)
.first_or_octet_stream()
.to_string();
Ok(format!("data:{};base64,{}", mime_type, base64))
}
pub async fn get_m3u8_content(&self, url: &str) -> Result<(MediaPlaylist, String), DouyinClientError> {
let content = self.client.get(url)
.send()
.await?
.text()
.await?;
pub async fn get_m3u8_content(
&self,
url: &str,
) -> Result<(MediaPlaylist, String), DouyinClientError> {
let content = self.client.get(url).send().await?.text().await?;
// m3u8 content: #EXTM3U
// #EXT-X-VERSION:3
// #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2560000
@@ -97,27 +103,27 @@ impl DouyinClient {
}
match m3u8_rs::parse_playlist_res(content.as_bytes()) {
Ok(Playlist::MasterPlaylist(_)) => {
Err(DouyinClientError::Playlist("Unexpected master playlist".to_string()))
}
Ok(Playlist::MasterPlaylist(_)) => Err(DouyinClientError::Playlist(
"Unexpected master playlist".to_string(),
)),
Ok(Playlist::MediaPlaylist(pl)) => Ok((pl, url.to_string())),
Err(e) => Err(DouyinClientError::Playlist(e.to_string())),
}
}
pub async fn download_ts(&self, url: &str, path: &str) -> Result<u64, DouyinClientError> {
let response = self.client.get(url)
.send()
.await?;
let response = self.client.get(url).send().await?;
if response.status() != reqwest::StatusCode::OK {
return Err(DouyinClientError::Network(response.error_for_status().unwrap_err()));
return Err(DouyinClientError::Network(
response.error_for_status().unwrap_err(),
));
}
let content = response.bytes().await?;
let mut file = File::create(path).await?;
file.write_all(&content).await?;
Ok(content.len() as u64)
}
}

View File

@@ -1,9 +1,13 @@
use core::fmt;
use std::fmt::Display;
use async_std::{
fs::{File, OpenOptions},
io::{prelude::BufReadExt, BufReader, WriteExt},
path::Path,
stream::StreamExt,
};
use chrono::{TimeZone, Utc};
const ENTRY_FILE_NAME: &str = "entries.log";
@@ -17,6 +21,78 @@ pub struct TsEntry {
pub is_header: bool,
}
impl TsEntry {
pub fn from(line: &str) -> Result<Self, String> {
let parts: Vec<&str> = line.split('|').collect();
if parts.len() != 6 {
return Err("Invalid input format: expected 6 fields separated by '|'".to_string());
}
Ok(TsEntry {
url: parts[0].to_string(),
sequence: parts[1]
.parse()
.map_err(|e| format!("Failed to parse sequence: {}", e))?,
length: parts[2]
.parse()
.map_err(|e| format!("Failed to parse length: {}", e))?,
size: parts[3]
.parse()
.map_err(|e| format!("Failed to parse size: {}", e))?,
ts: parts[4]
.parse()
.map_err(|e| format!("Failed to parse timestamp: {}", e))?,
is_header: parts[5]
.parse()
.map_err(|e| format!("Failed to parse is_header: {}", e))?,
})
}
/// Get timestamp in seconds
pub fn ts_seconds(&self) -> i64 {
// Due to a legacy problem, douyin entries store ts in seconds while bilibili entries store ts in milliseconds.
// This should be fixed after 2.5.6, but we still need to support entry.log files generated by previous versions.
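// Editor's note (not part of the diff): 1_619_884_800_000 read as milliseconds is 2021-05-02 00:00 (UTC+8);
// a seconds-based timestamp from any real recording can never exceed that value, so anything above the
// threshold is safely treated as milliseconds and divided by 1000.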
if self.ts > 1619884800000 {
self.ts / 1000
} else {
self.ts
}
}
pub fn date_time(&self) -> String {
let date_str = Utc
.timestamp_opt(self.ts_seconds(), 0)
.unwrap()
.to_rfc3339();
format!("#EXT-X-PROGRAM-DATE-TIME:{}\n", date_str)
}
/// Convert entry into a segment in HLS manifest.
pub fn to_segment(&self) -> String {
if self.is_header {
return "".into();
}
let mut content = String::new();
content += &format!("#EXTINF:{:.2},\n", self.length);
content += &format!("{}\n", self.url);
content
}
}
impl Display for TsEntry {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
writeln!(
f,
"{}|{}|{}|{}|{}|{}",
self.url, self.sequence, self.length, self.size, self.ts, self.is_header
)
}
}
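// Editor's note: an illustrative sketch (not part of the diff) of one entries.log line and how
// `TsEntry::from` reads it back; the values are made up for illustration.
fn entry_line_sketch() {
    // url | sequence | length | size | ts | is_header
    let line = "123456.m4s|42|1|524288|1718000000000|false";
    let entry = TsEntry::from(line).expect("well-formed line");
    assert_eq!(entry.sequence, 42);
    assert!(!entry.is_header);
    // Display writes the same pipe-separated layout back, with a trailing newline.
    assert_eq!(entry.to_string(), format!("{}\n", line));
}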
/// EntryStore is used to manage stream segments; it is basically a simple version of an HLS manifest,
/// and of course, it provides methods to generate an HLS manifest for the frontend player.
pub struct EntryStore {
// append only log file
log_file: File,
@@ -66,15 +142,13 @@ impl EntryStore {
.unwrap();
let mut lines = BufReader::new(file).lines();
while let Some(Ok(line)) = lines.next().await {
let parts: Vec<&str> = line.split('|').collect();
let entry = TsEntry {
url: parts[0].to_string(),
sequence: parts[1].parse().unwrap(),
length: parts[2].parse().unwrap(),
size: parts[3].parse().unwrap(),
ts: parts[4].parse().unwrap(),
is_header: parts[5].parse().unwrap(),
};
let entry = TsEntry::from(&line);
if let Err(e) = entry {
log::error!("Failed to parse entry: {} {}", e, line);
continue;
}
let entry = entry.unwrap();
if entry.sequence > self.last_sequence {
self.last_sequence = entry.sequence;
@@ -100,19 +174,10 @@ impl EntryStore {
self.entries.push(entry.clone());
}
if let Err(e) = self
.log_file
.write_all(
format!(
"{}|{}|{}|{}|{}|{}\n",
entry.url, entry.sequence, entry.length, entry.size, entry.ts, entry.is_header
)
.as_bytes(),
)
.await
{
if let Err(e) = self.log_file.write_all(entry.to_string().as_bytes()).await {
log::error!("Failed to write entry to log file: {}", e);
}
self.log_file.flush().await.unwrap();
if self.last_sequence < entry.sequence {
@@ -127,10 +192,6 @@ impl EntryStore {
self.header.as_ref()
}
pub fn get_entries(&self) -> &Vec<TsEntry> {
&self.entries
}
pub fn total_duration(&self) -> f64 {
self.total_duration
}
@@ -150,4 +211,83 @@ impl EntryStore {
pub fn first_ts(&self) -> Option<i64> {
self.entries.first().map(|e| e.ts)
}
/// Generate an HLS manifest for the selected range.
/// `vod` indicates whether the manifest is for a finished recording (VOD) rather than a live stream.
/// `force_time` adds an EXT-X-PROGRAM-DATE-TIME tag for each entry.
pub fn manifest(&self, vod: bool, force_time: bool, range: Option<Range>) -> String {
let mut m3u8_content = "#EXTM3U\n".to_string();
m3u8_content += "#EXT-X-VERSION:6\n";
m3u8_content += if vod {
"#EXT-X-PLAYLIST-TYPE:VOD\n"
} else {
"#EXT-X-PLAYLIST-TYPE:EVENT\n"
};
let end_content = if vod { "#EXT-X-ENDLIST" } else { "" };
if self.entries.is_empty() {
m3u8_content += end_content;
return m3u8_content;
}
m3u8_content += &format!(
"#EXT-X-TARGETDURATION:{}\n",
(0.5 + self.entries.first().unwrap().length).floor()
);
// add header, FMP4 needs this
if let Some(header) = &self.header {
m3u8_content += &format!("#EXT-X-MAP:URI=\"{}\"\n", header.url);
}
// Collect entries in range
let first_entry = self.entries.first().unwrap();
let first_entry_ts = first_entry.ts_seconds();
let mut entries_in_range = vec![];
for e in &self.entries {
// ignore header, because it's already in EXT-X-MAP
if e.is_header {
continue;
}
let entry_offset = (e.ts_seconds() - first_entry_ts) as f32;
if range.is_none_or(|r| r.is_in(entry_offset)) {
entries_in_range.push(e);
}
}
if entries_in_range.is_empty() {
m3u8_content += end_content;
return m3u8_content;
}
let mut previous_seq = entries_in_range.first().unwrap().sequence;
for (i, e) in entries_in_range.iter().enumerate() {
let discontinuous = e.sequence < previous_seq || e.sequence - previous_seq > 1;
if discontinuous {
m3u8_content += "#EXT-X-DISCONTINUITY\n";
}
// Add date time under these situations.
if i == 0 || i == entries_in_range.len() - 1 || force_time || discontinuous {
m3u8_content += &e.date_time();
}
m3u8_content += &e.to_segment();
previous_seq = e.sequence;
}
m3u8_content += end_content;
m3u8_content
}
}
#[derive(Debug, Clone, Copy)]
pub struct Range {
pub x: f32,
pub y: f32,
}
impl Range {
pub fn is_in(&self, v: f32) -> bool {
v >= self.x && v <= self.y
}
}
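// Editor's note: an illustrative sketch (not part of the diff) of how the recorders above call
// `manifest`: a VOD playlist restricted to the first two minutes of a recording. The
// `EntryStore::new(&str)` signature is assumed from its usage earlier in this diff.
async fn manifest_usage_sketch(work_dir: &str) -> String {
    let store = EntryStore::new(work_dir).await;
    // vod = true       => playlist ends with #EXT-X-ENDLIST
    // force_time = false => PROGRAM-DATE-TIME only at the first/last entries and discontinuities
    // range            => keep entries whose offset from the first entry is within [0, 120] seconds
    store.manifest(true, false, Some(Range { x: 0.0, y: 120.0 }))
}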

View File

@@ -19,4 +19,5 @@ custom_error! {pub RecorderError
BiliClientError {err: super::bilibili::errors::BiliClientError} = "BiliClient error: {err}",
DouyinClientError {err: DouyinClientError} = "DouyinClient error: {err}",
IoError {err: std::io::Error} = "IO error: {err}",
DanmuStreamError {err: danmu_stream::DanmuStreamError} = "Danmu stream error: {err}",
}

View File

@@ -0,0 +1,22 @@
use actix_web::Response;
fn handle_hls_request(ts_path: Option<&str>) -> Response {
if let Some(ts_path) = ts_path {
if let Ok(content) = std::fs::read(ts_path) {
return Response::builder()
.status(200)
.header("Content-Type", "video/mp2t")
.header("Cache-Control", "no-cache")
.header("Access-Control-Allow-Origin", "*")
.body(content)
.unwrap();
}
}
Response::builder()
.status(404)
.header("Content-Type", "text/plain")
.header("Cache-Control", "no-cache")
.header("Access-Control-Allow-Origin", "*")
.body(b"Not Found".to_vec())
.unwrap()
}

View File

@@ -1,11 +1,11 @@
use crate::config::Config;
use crate::danmu2ass;
use crate::database::video::VideoRow;
use crate::database::DatabaseError;
use crate::database::{account::AccountRow, record::RecordRow, Database};
use crate::database::{account::AccountRow, record::RecordRow};
use crate::database::{Database, DatabaseError};
use crate::ffmpeg::{clip_from_m3u8, encode_video_danmu};
use crate::progress_event::ProgressReporter;
use crate::recorder::bilibili::BiliRecorder;
use crate::progress_reporter::{EventEmitter, ProgressReporter};
use crate::recorder::bilibili::{BiliRecorder, BiliRecorderOptions};
use crate::recorder::danmu::DanmuEntry;
use crate::recorder::douyin::DouyinRecorder;
use crate::recorder::errors::RecorderError;
@@ -14,25 +14,25 @@ use crate::recorder::Recorder;
use crate::recorder::RecorderInfo;
use chrono::Utc;
use custom_error::custom_error;
use hyper::Uri;
use serde::Deserialize;
use std::collections::HashMap;
use serde::{Deserialize, Serialize};
use std::collections::{HashMap, HashSet};
use std::path::{Path, PathBuf};
use std::str::FromStr;
use std::sync::atomic::AtomicBool;
use std::sync::Arc;
use tauri::AppHandle;
use tokio::fs::{remove_file, write};
use tokio::sync::broadcast;
use tokio::sync::RwLock;
#[cfg(not(feature = "headless"))]
use tauri::AppHandle;
#[derive(serde::Deserialize, serde::Serialize, Clone, Debug)]
pub struct RecorderList {
pub count: usize,
pub recorders: Vec<RecorderInfo>,
}
#[derive(Debug, Deserialize)]
#[derive(Debug, Deserialize, Serialize)]
pub struct ClipRangeParams {
pub title: String,
pub cover: String,
@@ -47,6 +47,7 @@ pub struct ClipRangeParams {
pub offset: i64,
/// Encode danmu after clip
pub danmu: bool,
pub local_offset: i64,
}
#[derive(Debug, Clone)]
@@ -59,10 +60,13 @@ pub enum RecorderEvent {
}
pub struct RecorderManager {
#[cfg(not(feature = "headless"))]
app_handle: AppHandle,
emitter: EventEmitter,
db: Arc<Database>,
config: Arc<RwLock<Config>>,
recorders: Arc<RwLock<HashMap<String, Box<dyn Recorder>>>>,
to_remove: Arc<RwLock<HashSet<String>>>,
event_tx: broadcast::Sender<RecorderEvent>,
is_migrating: Arc<AtomicBool>,
}
@@ -105,16 +109,20 @@ impl From<RecorderManagerError> for String {
impl RecorderManager {
pub fn new(
app_handle: AppHandle,
#[cfg(not(feature = "headless"))] app_handle: AppHandle,
emitter: EventEmitter,
db: Arc<Database>,
config: Arc<RwLock<Config>>,
) -> RecorderManager {
let (event_tx, _) = broadcast::channel(100);
let manager = RecorderManager {
#[cfg(not(feature = "headless"))]
app_handle,
emitter,
db,
config,
recorders: Arc::new(RwLock::new(HashMap::new())),
to_remove: Arc::new(RwLock::new(HashSet::new())),
event_tx,
is_migrating: Arc::new(AtomicBool::new(false)),
};
@@ -135,10 +143,13 @@ impl RecorderManager {
pub fn clone(&self) -> Self {
RecorderManager {
#[cfg(not(feature = "headless"))]
app_handle: self.app_handle.clone(),
emitter: self.emitter.clone(),
db: self.db.clone(),
config: self.config.clone(),
recorders: self.recorders.clone(),
to_remove: self.to_remove.clone(),
event_tx: self.event_tx.clone(),
is_migrating: self.is_migrating.clone(),
}
@@ -198,6 +209,7 @@ impl RecorderManager {
y: 0,
offset: recorder.first_segment_ts(live_id).await,
danmu: encode_danmu,
local_offset: 0,
};
let clip_filename = self.config.read().await.generate_clip_name(&clip_config);
@@ -284,7 +296,9 @@ impl RecorderManager {
let mut recorders_to_add = Vec::new();
for (platform, room_id) in recorder_map.keys() {
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
if !self.recorders.read().await.contains_key(&recorder_id) {
if !self.recorders.read().await.contains_key(&recorder_id)
&& !self.to_remove.read().await.contains(&recorder_id)
{
recorders_to_add.push((*platform, *room_id));
}
}
@@ -302,8 +316,9 @@ impl RecorderManager {
continue;
}
let account = account.unwrap();
if let Err(e) = self
.add_recorder("", &account, platform, room_id, *auto_start)
.add_recorder(&account, platform, room_id, *auto_start)
.await
{
log::error!("Failed to add recorder: {}", e);
@@ -315,7 +330,6 @@ impl RecorderManager {
pub async fn add_recorder(
&self,
webid: &str,
account: &AccountRow,
platform: PlatformType,
room_id: u64,
@@ -329,21 +343,24 @@ impl RecorderManager {
let event_tx = self.get_event_sender();
let recorder: Box<dyn Recorder + 'static> = match platform {
PlatformType::BiliBili => Box::new(
BiliRecorder::new(
self.app_handle.clone(),
webid,
&self.db,
BiliRecorder::new(BiliRecorderOptions {
#[cfg(feature = "gui")]
app_handle: self.app_handle.clone(),
emitter: self.emitter.clone(),
db: self.db.clone(),
room_id,
account,
self.config.clone(),
account: account.clone(),
config: self.config.clone(),
auto_start,
event_tx,
)
channel: event_tx,
})
.await?,
),
PlatformType::Douyin => Box::new(
DouyinRecorder::new(
#[cfg(feature = "gui")]
self.app_handle.clone(),
self.emitter.clone(),
room_id,
self.config.clone(),
account,
@@ -378,6 +395,10 @@ impl RecorderManager {
self.recorders.write().await.clear();
}
/// Remove a recorder from the manager
///
/// This will stop the recorder and remove it from the manager
/// and remove the related cache folder
pub async fn remove_recorder(
&self,
platform: PlatformType,
@@ -389,14 +410,27 @@ impl RecorderManager {
return Err(RecorderManagerError::NotFound { room_id });
}
// remove from db
self.db.remove_recorder(room_id).await?;
// add to to_remove
log::debug!("Add to to_remove: {}", recorder_id);
self.to_remove.write().await.insert(recorder_id.clone());
// stop recorder
log::debug!("Stop recorder: {}", recorder_id);
if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
recorder_ref.stop().await;
}
// remove recorder
log::debug!("Remove recorder from manager: {}", recorder_id);
self.recorders.write().await.remove(&recorder_id);
// remove from to_remove
log::debug!("Remove from to_remove: {}", recorder_id);
self.to_remove.write().await.remove(&recorder_id);
// remove related cache folder
let cache_folder = format!(
"{}/{}/{}",
@@ -404,6 +438,7 @@ impl RecorderManager {
platform.as_str(),
room_id
);
log::debug!("Remove cache folder: {}", cache_folder);
let _ = tokio::fs::remove_dir_all(cache_folder).await;
log::info!("Recorder {} cache folder removed", room_id);
@@ -439,7 +474,7 @@ impl RecorderManager {
params: &ClipRangeParams,
) -> Result<PathBuf, RecorderManagerError> {
let range_m3u8 = format!(
"http://127.0.0.1/{}/{}/{}/playlist.m3u8?start={}&end={}",
"{}/{}/{}/playlist.m3u8?start={}&end={}",
params.platform, params.room_id, params.live_id, params.x, params.y
);
@@ -480,16 +515,17 @@ impl RecorderManager {
}
log::info!(
"Filter danmus in range [{}, {}] with offset {}",
"Filter danmus in range [{}, {}] with global offset {} and local offset {}",
params.x,
params.y,
params.offset
params.offset,
params.local_offset
);
let mut danmus = danmus.unwrap();
log::debug!("First danmu entry: {:?}", danmus.first());
// update entry ts to offset
for d in &mut danmus {
d.ts -= (params.x + params.offset) * 1000;
d.ts -= (params.x + params.offset + params.local_offset) * 1000;
}
if params.x != 0 || params.y != 0 {
danmus.retain(|x| x.ts >= 0 && x.ts <= (params.y - params.x) * 1000);
@@ -604,25 +640,58 @@ impl RecorderManager {
pub async fn handle_hls_request(&self, uri: &str) -> Result<Vec<u8>, RecorderManagerError> {
let cache_path = self.config.read().await.cache.clone();
let uri = Uri::from_str(uri)
.map_err(|e| RecorderManagerError::HLSError { err: e.to_string() })?;
let path = uri.path();
let path = uri.split('?').next().unwrap_or(uri);
let params = uri.split('?').nth(1).unwrap_or("");
let path_segs: Vec<&str> = path.split('/').collect();
if path_segs.len() != 5 {
if path_segs.len() != 4 {
log::warn!("Invalid request path: {}", path);
return Err(RecorderManagerError::HLSError {
err: "Invalid hls path".into(),
});
}
// parse recorder type
let platform = path_segs[1];
let platform = path_segs[0];
// parse room id
let room_id = path_segs[2].parse::<u64>().unwrap();
let room_id = path_segs[1].parse::<u64>().unwrap();
// parse live id
let live_id = path_segs[3];
let live_id = path_segs[2];
if path_segs[4] == "playlist.m3u8" {
let params = Some(params);
// parse params, example: start=10&end=20
// start and end are optional
// split params by &, and then split each param by =
let params = if let Some(params) = params {
let params = params
.split('&')
.map(|param| param.split('=').collect::<Vec<&str>>())
.collect::<Vec<Vec<&str>>>();
Some(params)
} else {
None
};
let start = if let Some(params) = &params {
params
.iter()
.find(|param| param[0] == "start")
.map(|param| param[1].parse::<i64>().unwrap())
.unwrap_or(0)
} else {
0
};
let end = if let Some(params) = &params {
params
.iter()
.find(|param| param[0] == "end")
.map(|param| param[1].parse::<i64>().unwrap())
.unwrap_or(0)
} else {
0
};
if path_segs[3] == "playlist.m3u8" {
// get recorder
let recorder_key = format!("{}:{}", platform, room_id);
let recorders = self.recorders.read().await;
@@ -633,42 +702,23 @@ impl RecorderManager {
});
}
let recorder = recorder.unwrap();
let params = uri.query();
// parse params, example: start=10&end=20
// start and end are optional
// split params by &, and then split each param by =
let params = if let Some(params) = params {
let params = params
.split('&')
.map(|param| param.split('=').collect::<Vec<&str>>())
.collect::<Vec<Vec<&str>>>();
Some(params)
} else {
None
};
let start = if let Some(params) = &params {
params
.iter()
.find(|param| param[0] == "start")
.map(|param| param[1].parse::<i64>().unwrap())
.unwrap_or(0)
} else {
0
};
let end = if let Some(params) = &params {
params
.iter()
.find(|param| param[0] == "end")
.map(|param| param[1].parse::<i64>().unwrap())
.unwrap_or(0)
} else {
0
};
// respond with the recorder-generated m3u8, which contains the ts entries cached locally
let m3u8_content = recorder.m3u8_content(live_id, start, end).await;
Ok(m3u8_content.into())
} else if path_segs[3] == "master.m3u8" {
// get recorder
let recorder_key = format!("{}:{}", platform, room_id);
let recorders = self.recorders.read().await;
let recorder = recorders.get(&recorder_key);
if recorder.is_none() {
return Err(RecorderManagerError::HLSError {
err: "Recorder not found".into(),
});
}
let recorder = recorder.unwrap();
let m3u8_content = recorder.master_m3u8(live_id, start, end).await;
Ok(m3u8_content.into())
} else {
// try to find requested ts file in recorder's cache
@@ -695,29 +745,19 @@ impl RecorderManager {
}
}
pub async fn set_auto_start(&self, platform: PlatformType, room_id: u64, auto_start: bool) {
pub async fn set_enable(&self, platform: PlatformType, room_id: u64, enabled: bool) {
// update RecordRow auto_start field
if let Err(e) = self.db.update_recorder(platform, room_id, auto_start).await {
if let Err(e) = self.db.update_recorder(platform, room_id, enabled).await {
log::error!("Failed to update recorder auto_start: {}", e);
}
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
recorder_ref.set_auto_start(auto_start).await;
}
}
pub async fn force_start(&self, platform: PlatformType, room_id: u64) {
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
recorder_ref.force_start().await;
}
}
pub async fn force_stop(&self, platform: PlatformType, room_id: u64) {
let recorder_id = format!("{}:{}", platform.as_str(), room_id);
if let Some(recorder_ref) = self.recorders.read().await.get(&recorder_id) {
recorder_ref.force_stop().await;
if enabled {
recorder_ref.enable().await;
} else {
recorder_ref.disable().await;
}
}
}
}
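After this change `handle_hls_request` no longer expects an absolute `http://127.0.0.1/...` URI: it takes a relative path of four segments (`platform/room_id/live_id/<file>`) plus an optional `start`/`end` query string, and serves both `playlist.m3u8` and the new `master.m3u8` from it. A minimal TypeScript sketch of the URL shape a client is expected to build; the `/hls/` prefix and `ENDPOINT` base are taken from the `Player.svelte` change further down, the helper itself is hypothetical, and where the host and prefix get stripped before reaching `handle_hls_request` is not shown in this diff.

// Hypothetical sketch of the playlist URL shape implied by the diff above and
// by the Player.svelte change further down; the "/hls/" prefix handling is an assumption.
function buildPlaylistUrl(
  endpoint: string,   // "" in the Tauri app, the server base URL in web mode
  platform: string,   // e.g. "bilibili" or "douyin"
  roomId: number,
  liveId: string,
  start = 0,
  end = 0,
): string {
  // handle_hls_request itself sees four path segments:
  //   platform / room_id / live_id / (playlist.m3u8 | master.m3u8 | <segment>)
  return `${endpoint}/hls/${platform}/${roomId}/${liveId}/playlist.m3u8?start=${start}&end=${end}`;
}

Leaving `start` and `end` at 0 matches the backend behaviour above, which falls back to 0 when either parameter is missing.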

View File

@@ -7,6 +7,9 @@ use crate::database::Database;
use crate::recorder::bilibili::client::BiliClient;
use crate::recorder_manager::RecorderManager;
#[cfg(feature = "headless")]
use crate::progress_manager::ProgressManager;
custom_error! {
StateError
RecorderAlreadyExists = "Recorder already exists",
@@ -19,5 +22,10 @@ pub struct State {
pub client: Arc<BiliClient>,
pub config: Arc<RwLock<Config>>,
pub recorder_manager: Arc<RecorderManager>,
#[cfg(not(feature = "headless"))]
pub app_handle: tauri::AppHandle,
#[cfg(feature = "headless")]
pub progress_manager: Arc<ProgressManager>,
#[cfg(feature = "headless")]
pub readonly: bool,
}

View File

@@ -1,22 +1,24 @@
use async_trait::async_trait;
use std::path::Path;
use crate::progress_event::ProgressReporterTrait;
use crate::progress_reporter::ProgressReporterTrait;
pub mod whisper;
// subtitle_generator types
#[allow(dead_code)]
pub enum SubtitleGeneratorType {
Whisper,
}
impl SubtitleGeneratorType {
#[allow(dead_code)]
pub fn as_str(&self) -> &'static str {
match self {
SubtitleGeneratorType::Whisper => "whisper",
}
}
#[allow(dead_code)]
pub fn from_str(s: &str) -> Option<Self> {
match s {
"whisper" => Some(SubtitleGeneratorType::Whisper),

View File

@@ -1,6 +1,6 @@
use async_trait::async_trait;
use crate::progress_event::ProgressReporterTrait;
use crate::progress_reporter::ProgressReporterTrait;
use async_std::sync::{Arc, RwLock};
use std::path::Path;
use tokio::io::AsyncWriteExt;
@@ -140,6 +140,7 @@ mod tests {
#[derive(Clone)]
struct MockReporter {}
impl MockReporter {
#[allow(dead_code)]
fn update(&self, _message: &str) {
// mock implementation
}

View File

@@ -5,29 +5,36 @@
import Setting from "./page/Setting.svelte";
import Account from "./page/Account.svelte";
import About from "./page/About.svelte";
let active = "#总览";
import { log } from "./lib/invoker";
let active = "总览";
log.info("App loaded");
</script>
<main>
<div class="wrap">
<div class="sidebar">
<BSidebar bind:activeUrl={active} />
<BSidebar
bind:activeUrl={active}
on:activeChange={(e) => {
active = e.detail;
}}
/>
</div>
<div class="content bg-white dark:bg-[#2c2c2e]">
<!-- switch component by active -->
<div class="page" class:visible={active == "#总览"}>
<div class="page" class:visible={active == "总览"}>
<Summary />
</div>
<div class="page" class:visible={active == "#直播间"}>
<div class="page" class:visible={active == "直播间"}>
<Room />
</div>
<div class="page" class:visible={active == "#账号"}>
<div class="page" class:visible={active == "账号"}>
<Account />
</div>
<div class="page" class:visible={active == "#设置"}>
<div class="page" class:visible={active == "设置"}>
<Setting />
</div>
<div class="page" class:visible={active == "#关于"}>
<div class="page" class:visible={active == "关于"}>
<About />
</div>
</div>

View File

@@ -1,7 +1,13 @@
<script lang="ts">
import { convertFileSrc, invoke } from "@tauri-apps/api/core";
import {
invoke,
set_title,
TAURI_ENV,
convertFileSrc,
listen,
log,
} from "./lib/invoker";
import Player from "./lib/Player.svelte";
import { getCurrentWebviewWindow } from "@tauri-apps/api/webviewWindow";
import type { AccountInfo, RecordItem } from "./lib/db";
import { ChevronRight, ChevronLeft, Play, Pen } from "lucide-svelte";
import {
@@ -18,10 +24,8 @@
import MarkerPanel from "./lib/MarkerPanel.svelte";
import CoverEditor from "./lib/CoverEditor.svelte";
import VideoPreview from "./lib/VideoPreview.svelte";
import { listen } from "@tauri-apps/api/event";
import { onDestroy, onMount } from "svelte";
const appWindow = getCurrentWebviewWindow();
const urlParams = new URLSearchParams(window.location.search);
const room_id = parseInt(urlParams.get("room_id"));
const platform = urlParams.get("platform");
@@ -29,6 +33,8 @@
const focus_start = parseInt(urlParams.get("start") || "0");
const focus_end = parseInt(urlParams.get("end") || "0");
log.info("AppLive loaded", room_id, platform, live_id);
// get profile in local storage with a default value
let profile: Profile = get_profile();
let config: Config = null;
@@ -84,28 +90,28 @@
let current_post_event_id = null;
let danmu_enabled = false;
let progress_update_listener = listen<ProgressUpdate>(
`progress-update`,
(e) => {
let event_id = e.payload.id;
if (event_id == current_clip_event_id) {
update_clip_prompt(e.payload.content);
} else if (event_id == current_post_event_id) {
update_post_prompt(e.payload.content);
}
const update_listener = listen<ProgressUpdate>(`progress-update`, (e) => {
console.log("progress-update event", e.payload.id);
let event_id = e.payload.id;
if (event_id === current_clip_event_id) {
update_clip_prompt(e.payload.content);
} else if (event_id === current_post_event_id) {
update_post_prompt(e.payload.content);
}
);
let progress_finished_listener = listen<ProgressFinished>(
});
const finished_listener = listen<ProgressFinished>(
`progress-finished`,
(e) => {
console.log("progress-finished event", e.payload.id);
let event_id = e.payload.id;
if (event_id == current_clip_event_id) {
if (event_id === current_clip_event_id) {
console.log("clip event finished", event_id);
update_clip_prompt(`生成切片`);
if (!e.payload.success) {
alert("请检查 ffmpeg 是否配置正确:" + e.payload.message);
}
current_clip_event_id = null;
} else if (event_id == current_post_event_id) {
} else if (event_id === current_post_event_id) {
update_post_prompt(`投稿`);
if (!e.payload.success) {
alert(e.payload.message);
@@ -115,10 +121,9 @@
}
);
// remove listeners when component is destroyed
onDestroy(() => {
progress_update_listener.then((fn) => fn());
progress_finished_listener.then((fn) => fn());
update_listener?.then((fn) => fn());
finished_listener?.then((fn) => fn());
});
let archive: RecordItem = null;
@@ -194,7 +199,7 @@
(a: RecordItem) => {
console.log(a);
archive = a;
appWindow.setTitle(`[${room_id}]${archive.title}`);
set_title(`[${room_id}]${archive.title}`);
}
);
@@ -267,7 +272,10 @@
x: Math.floor(focus_start + start),
y: Math.floor(focus_start + end),
danmu: danmu_enabled,
offset: global_offset + parseInt(live_id),
offset: global_offset,
local_offset:
parseInt(localStorage.getItem(`local_offset:${live_id}`) || "0", 10) ||
0,
});
console.log("video file generatd:", new_video);
await get_video_list();
@@ -275,7 +283,9 @@
selected_video = videos.find((v) => {
return v.value == new_video.id;
});
selected_video.cover = new_video.cover;
if (selected_video) {
selected_video.cover = new_video.cover;
}
}
async function do_post() {
@@ -340,6 +350,19 @@
JSON.stringify(markers)
);
}
async function save_video() {
if (!selected_video) {
return;
}
// download video
const video_url = selected_video.file;
const video_name = selected_video.name;
const a = document.createElement("a");
a.href = video_url;
a.download = video_name;
a.click();
}
</script>
<main>
@@ -409,7 +432,6 @@
}}
onClose={() => {
preview = false;
selected_video = null;
}}
onVideoListUpdate={get_video_list}
/>
@@ -493,21 +515,32 @@
</div>
</div>
<select
bind:value={video_selected}
on:change={find_video}
class="w-full px-3 py-2 bg-[#2c2c2e] text-white rounded-lg
<div class="flex flex-row items-center justify-between">
<select
bind:value={video_selected}
on:change={find_video}
class="w-full px-3 py-2 bg-[#2c2c2e] text-white rounded-lg
border border-gray-800/50 focus:border-[#0A84FF]
transition duration-200 outline-none appearance-none
hover:border-gray-700/50"
>
<option value={0}>选择切片</option>
{#each videos as video}
<option value={video.value}>{video.name}</option>
{/each}
</select>
>
<option value={0}>选择切片</option>
{#each videos as video}
<option value={video.value}>{video.name}</option>
{/each}
</select>
{#if !TAURI_ENV && selected_video}
<button
on:click={save_video}
class="w-24 ml-2 px-3 py-2 bg-[#0A84FF] text-white rounded-lg
transition-all duration-200 hover:bg-[#0A84FF]/90
disabled:opacity-50 disabled:cursor-not-allowed"
>
下载
</button>
{/if}
</div>
</section>
<!-- Cover preview -->
{#if selected_video && selected_video.id != -1}
<section>
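The clip request above now carries two offsets: the global `DANMU` offset parsed from the playlist and a per-recording `local_offset` kept in `localStorage`. A short sketch of that convention as far as it can be read from this diff (value in seconds, key `local_offset:${live_id}`); the helper names below are made up for illustration only.

// Illustrative helpers only; the diff reads and writes localStorage inline.
function loadLocalOffset(liveId: string): number {
  // stored in seconds; falls back to 0 when unset or unparsable
  return parseInt(localStorage.getItem(`local_offset:${liveId}`) || "0", 10) || 0;
}

function saveLocalOffset(liveId: string, seconds: number): void {
  localStorage.setItem(`local_offset:${liveId}`, seconds.toString());
}

// The clip call above then sends both offsets to the backend, which subtracts
// (x + offset + local_offset) * 1000 from every danmu timestamp before clipping.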

src/env.d.ts (vendored, new file, 2 lines)
View File

@@ -0,0 +1,2 @@
declare const __APP_VERSION__: string;
declare const __API_BASE_URL__: string;

View File

@@ -1,12 +1,18 @@
<script>
import { Info, LayoutDashboard, Settings, Users, Video } from "lucide-svelte";
import { hasNewVersion } from "./stores/version";
import SidebarItem from "./SidebarItem.svelte";
import { createEventDispatcher } from "svelte";
// activeUrl is shared across the project
export let activeUrl = "#总览";
const dispatch = createEventDispatcher();
export let activeUrl = "总览";
/**
* @param {{ detail: String; }} route
*/
function navigate(route) {
activeUrl = route;
dispatch("activeChange", route.detail);
}
</script>
@@ -14,81 +20,36 @@
class="w-48 bg-[#f0f0f3]/50 dark:bg-[#2c2c2e]/50 backdrop-blur-xl border-r border-gray-200 dark:border-gray-700"
>
<nav class="p-3 space-y-1">
<button
on:click={() => navigate("#总览")}
class="flex w-full items-center space-x-2 px-3 py-2 rounded-lg {activeUrl ===
'#总览'
? 'bg-blue-500/10 text-[#0A84FF]'
: 'text-gray-700'} dark:text-[#0A84FF] hover:bg-[#e5e5e5] dark:hover:bg-[#3a3a3c]"
<SidebarItem label="总览" {activeUrl} on:activeChange={navigate}>
<div slot="icon">
<LayoutDashboard class="w-5 h-5" />
</div>
</SidebarItem>
<SidebarItem label="直播间" {activeUrl} on:activeChange={navigate}>
<div slot="icon">
<Video class="w-5 h-5" />
</div>
</SidebarItem>
<SidebarItem label="账号" {activeUrl} on:activeChange={navigate}>
<div slot="icon">
<Users class="w-5 h-5" />
</div>
</SidebarItem>
<SidebarItem label="设置" {activeUrl} on:activeChange={navigate}>
<div slot="icon">
<Settings class="w-5 h-5" />
</div>
</SidebarItem>
<SidebarItem
label="关于"
{activeUrl}
on:activeChange={navigate}
dot={$hasNewVersion}
>
<LayoutDashboard
class="w-5 h-5 {activeUrl === '#总览'
? 'text-[#0A84FF]'
: 'text-gray-700 dark:text-[#0A84FF]'}"
/>
<span>总览</span>
</button>
<button
on:click={() => navigate("#直播间")}
class="flex w-full items-center space-x-2 px-3 py-2 rounded-lg {activeUrl ===
'#直播间'
? 'bg-blue-500/10 text-[#0A84FF]'
: 'text-gray-700'} dark:text-[#0A84FF] hover:bg-[#e5e5e5] dark:hover:bg-[#3a3a3c]"
>
<Video
class="w-5 h-5 {activeUrl === '#直播间'
? 'text-[#0A84FF]'
: 'text-gray-700 dark:text-[#0A84FF]'}"
/>
<span>直播间</span>
</button>
<button
on:click={() => navigate("#账号")}
class="flex w-full items-center space-x-2 px-3 py-2 rounded-lg {activeUrl ===
'#账号'
? 'bg-blue-500/10 text-[#0A84FF]'
: 'text-gray-700'} dark:text-[#0A84FF] hover:bg-[#e5e5e5] dark:hover:bg-[#3a3a3c]"
>
<Users
class="w-5 h-5 {activeUrl === '#账号'
? 'text-[#0A84FF]'
: 'text-gray-700 dark:text-[#0A84FF]'}"
/>
<span>账号</span>
</button>
<button
on:click={() => navigate("#设置")}
class="flex w-full items-center space-x-2 px-3 py-2 rounded-lg {activeUrl ===
'#设置'
? 'bg-blue-500/10 text-[#0A84FF]'
: 'text-gray-700'} dark:text-[#0A84FF] hover:bg-[#e5e5e5] dark:hover:bg-[#3a3a3c]"
>
<Settings
class="w-5 h-5 {activeUrl === '#设置'
? 'text-[#0A84FF]'
: 'text-gray-700 dark:text-[#0A84FF]'}"
/>
<span>设置</span>
</button>
<button
on:click={() => navigate("#关于")}
class="flex w-full items-center space-x-2 px-3 py-2 rounded-lg {activeUrl ===
'#关于'
? 'bg-blue-500/10 text-[#0A84FF]'
: 'text-gray-700'} dark:text-[#0A84FF] hover:bg-[#e5e5e5] dark:hover:bg-[#3a3a3c] relative"
>
<Info
class="w-5 h-5 {activeUrl === '#关于'
? 'text-[#0A84FF]'
: 'text-gray-700 dark:text-[#0A84FF]'}"
/>
<span>关于</span>
{#if $hasNewVersion}
<div
class="absolute right-3 top-1/2 -translate-y-1/2 w-2 h-2 bg-red-500 rounded-full"
></div>
{/if}
</button>
<div slot="icon">
<Info class="w-5 h-5" />
</div>
</SidebarItem>
</nav>
</div>

View File

@@ -1,6 +1,6 @@
<script lang="ts">
import { Play, X, Type, Palette, Move, Plus, Trash2 } from "lucide-svelte";
import { invoke } from "@tauri-apps/api/core";
import { invoke, log } from "../lib/invoker";
import { onMount, createEventDispatcher } from "svelte";
const dispatch = createEventDispatcher();
@@ -83,7 +83,7 @@
scheduleRedraw();
};
backgroundImage.onerror = (e) => {
console.error("Failed to load image:", e);
log.error("Failed to load image:", e);
};
backgroundImage.src = videoFrame;
}

View File

@@ -1,5 +1,5 @@
<script lang="ts">
import { fetch } from "@tauri-apps/plugin-http";
import { get, log } from "./invoker";
export let src = "";
export let iclass = "";
let b = "";
@@ -7,16 +7,17 @@
if (!url) {
return "/imgs/douyin.png";
}
const response = await fetch(url, {
method: "GET",
});
if (url.startsWith("data")) {
return url;
}
const response = await get(url);
return URL.createObjectURL(await response.blob());
}
async function init() {
try {
b = await getImage(src);
} catch (e) {
console.error(e);
log.error("Failed to get image:", e);
}
}
init();

View File

@@ -8,7 +8,7 @@
import type { Marker } from "./interface";
import { createEventDispatcher } from "svelte";
import { Tooltip } from "flowbite-svelte";
import { invoke } from "@tauri-apps/api/core";
import { invoke, TAURI_ENV } from "../lib/invoker";
import { save } from "@tauri-apps/plugin-dialog";
import type { RecordItem } from "./db";
const dispatch = createEventDispatcher();
@@ -47,12 +47,19 @@
.split(" ")[0]
.replaceAll("/", "-")}]${archive.title}.txt`;
console.log("export to file", file_name);
const path = await save({
title: "导出标记列表",
defaultPath: file_name,
});
if (!path) return;
await invoke("export_to_file", { fileName: path, content: r });
if (TAURI_ENV) {
const path = await save({
title: "导出标记列表",
defaultPath: file_name,
});
if (!path) return;
await invoke("export_to_file", { fileName: path, content: r });
} else {
const a = document.createElement("a");
a.href = "data:text/plain;charset=utf-8," + encodeURIComponent(r);
a.download = file_name;
a.click();
}
}
</script>
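The `TAURI_ENV` branch above (native save dialog versus a data-URI download) is the same pattern used later by `exportDanmu` and the clip download buttons. Purely as an illustration of that pattern, a hedged sketch of what a shared helper could look like; no such helper exists in the diff.

// Illustrative only: this helper is not part of the diff, it just names the
// pattern repeated in MarkerPanel.svelte, exportDanmu and save_video.
import { invoke, TAURI_ENV } from "./invoker";
import { save } from "@tauri-apps/plugin-dialog";

async function exportText(fileName: string, content: string): Promise<void> {
  if (TAURI_ENV) {
    // desktop: pick a path with the native dialog, then let the backend write it
    const path = await save({ title: fileName, defaultPath: fileName });
    if (!path) return;
    await invoke("export_to_file", { fileName: path, content });
  } else {
    // browser/headless: trigger a plain download through a data URI
    const a = document.createElement("a");
    a.href = "data:text/plain;charset=utf-8," + encodeURIComponent(content);
    a.download = fileName;
    a.click();
  }
}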

View File

@@ -3,8 +3,7 @@
</script>
<script lang="ts">
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import { invoke, TAURI_ENV, ENDPOINT, listen, log } from "../lib/invoker";
import type { AccountInfo } from "./db";
import type { Marker, RecorderList, RecorderInfo } from "./interface";
@@ -12,7 +11,6 @@
import {
GridOutline,
SortHorizontalOutline,
DownloadOutline,
FileExportOutline,
} from "flowbite-svelte-icons";
import { save } from "@tauri-apps/plugin-dialog";
@@ -41,6 +39,10 @@
let show_export = false;
let recorders: RecorderInfo[] = [];
// local setting of danmu offset
let local_offset: number =
parseInt(localStorage.getItem(`local_offset:${live_id}`) || "0", 10) || 0;
// save start and end to localStorage
function saveStartEnd() {
localStorage.setItem(`${live_id}_start`, (start + focus_start).toString());
@@ -48,6 +50,20 @@
console.log("Saved start and end", start + focus_start, end + focus_start);
}
async function loadGlobalOffset(url: string) {
const response = await fetch(url);
const text = await response.text();
const offsetRegex = /DANMU=(\d+)/;
const match = text.match(offsetRegex);
if (match && match[1]) {
global_offset = parseInt(match[1], 10);
console.log("DANMU OFFSET found", global_offset);
} else {
console.warn("No DANMU OFFSET found");
console.log(text);
}
}
function tauriNetworkPlugin(uri, requestType, progressUpdated) {
const controller = new AbortController();
const abortStatus = {
@@ -56,10 +72,7 @@
};
const pendingRequest = new Promise((resolve, reject) => {
if (requestType == 0) {
console.log("fetch uri: ", uri);
}
invoke("fetch_hls", { uri })
invoke("fetch_hls", { uri: uri })
.then((data: number[]) => {
if (abortStatus.canceled) {
reject(new Error("Request was aborted"));
@@ -70,22 +83,27 @@
const uint8Array = new Uint8Array(data);
const arrayBuffer = uint8Array.buffer;
if (requestType == 0) {
let m3u8Content = data.map((v) => String.fromCharCode(v)).join();
const offsetRegex = /#EXT-X-OFFSET:(\d+)/;
const match = m3u8Content.match(offsetRegex);
const is_m3u8 = uri.split("?")[0].endsWith(".m3u8");
if (match && match[1]) {
global_offset = parseInt(match[1], 10);
} else {
console.warn("No #EXT-X-OFFSET found");
if (is_m3u8) {
let m3u8Content = new TextDecoder().decode(uint8Array);
if (global_offset == 0) {
const offsetRegex = /DANMU=(\d+)/;
const match = m3u8Content.match(offsetRegex);
if (match && match[1]) {
global_offset = parseInt(match[1], 10);
console.log("DANMU OFFSET found", global_offset);
} else {
console.warn("No DANMU OFFSET found");
}
}
}
// Set content-type based on URI extension
let content_type =
requestType == 1
? "application/octet-stream"
: "application/vnd.apple.mpegurl";
let content_type = is_m3u8
? "application/vnd.apple.mpegurl"
: "application/octet-stream";
// Create response object with byteLength for segment data
const response = {
@@ -103,7 +121,7 @@
resolve(response);
})
.catch((error) => {
console.error("Network error:", error);
log.error("Network error:", error);
reject(
new shaka.util.Error(
shaka.util.Error.Severity.CRITICAL,
@@ -122,8 +140,10 @@
});
}
shaka.net.NetworkingEngine.registerScheme("http", tauriNetworkPlugin);
shaka.net.NetworkingEngine.registerScheme("https", tauriNetworkPlugin);
if (TAURI_ENV) {
shaka.net.NetworkingEngine.registerScheme("http", tauriNetworkPlugin);
shaka.net.NetworkingEngine.registerScheme("https", tauriNetworkPlugin);
}
async function update_stream_list() {
recorders = (
@@ -190,13 +210,15 @@
});
try {
await player.load(
`http://127.0.0.1/${platform}/${room_id}/${live_id}/playlist.m3u8?start=${focus_start}&end=${focus_end}`
);
const url = `${ENDPOINT ? ENDPOINT : window.location.origin}/hls/${platform}/${room_id}/${live_id}/master.m3u8?start=${focus_start}&end=${focus_end}`;
if (!TAURI_ENV) {
await loadGlobalOffset(url);
}
await player.load(url);
// This runs if the asynchronous load is successful.
console.log("The video has now been loaded!");
} catch (error) {
console.error("Error code", error.code, "object", error);
log.error("Error code", error.code, "object", error);
if (error.code == 3000) {
// reload
setTimeout(() => {
@@ -211,6 +233,7 @@
"Error message: " +
error.message
);
log.error("Error code", error.code, "object", error);
}
}
@@ -223,7 +246,6 @@
video.addEventListener("volumechange", (event) => {
localStorage.setItem(`volume:${room_id}`, video.volume.toString());
console.log("Update volume to", video.volume);
});
document.getElementsByClassName("shaka-overflow-menu-button")[0].remove();
@@ -261,36 +283,36 @@
let ts = parseInt(live_id);
if (platform == "bilibili") {
let danmu_displayed = {};
// history danmaku sender
setInterval(() => {
if (video.paused || !danmu_enabled || danmu_records.length == 0) {
let danmu_displayed = {};
// history danmaku sender
setInterval(() => {
if (video.paused || !danmu_enabled || danmu_records.length == 0) {
return;
}
// using live source
if (isLive() && get_total() - video.currentTime <= 5) {
return;
}
const cur = Math.floor(
(video.currentTime + global_offset + focus_start + local_offset) * 1000
);
let danmus = danmu_records.filter((v) => {
return v.ts >= cur - 1000 && v.ts < cur;
});
danmus.forEach((v) => {
if (danmu_displayed[v.ts]) {
delete danmu_displayed[v.ts];
return;
}
danmu_handler(v.content);
});
}, 1000);
// using live source
if (isLive() && get_total() - video.currentTime <= 5) {
return;
}
const cur = Math.floor(
(video.currentTime + global_offset + ts + focus_start) * 1000
);
let danmus = danmu_records.filter((v) => {
return v.ts >= cur - 1000 && v.ts < cur;
});
danmus.forEach((v) => {
if (danmu_displayed[v.ts]) {
delete danmu_displayed[v.ts];
return;
}
danmu_handler(v.content);
});
}, 1000);
if (isLive()) {
if (isLive()) {
if (platform == "bilibili") {
// add a account select
const accountSelect = document.createElement("select");
accountSelect.style.height = "30px";
@@ -301,7 +323,6 @@
accountSelect.style.padding = "0 10px";
accountSelect.style.boxSizing = "border-box";
accountSelect.style.fontSize = "1em";
// get accounts from tauri
const account_info = (await invoke("get_accounts")) as AccountInfo;
account_info.accounts.forEach((account) => {
@@ -344,190 +365,279 @@
shakaSpacer.appendChild(accountSelect);
shakaSpacer.appendChild(danmakuInput);
// listen to danmaku event
const unlisten = await listen(
"danmu:" + room_id,
(event: { payload: DanmuEntry }) => {
// if not enabled, or playback is not keeping up with the live stream, ignore the danmaku
if (!danmu_enabled || get_total() - video.currentTime > 5) {
danmu_records.push(event.payload);
return;
}
if (Object.keys(danmu_displayed).length > 1000) {
danmu_displayed = {};
}
danmu_displayed[event.payload.ts] = true;
danmu_records.push(event.payload);
danmu_handler(event.payload.content);
}
);
window.onbeforeunload = () => {
unlisten();
};
}
// create a danmaku toggle button
const danmakuToggle = document.createElement("button");
danmakuToggle.innerText = "弹幕已开启";
danmakuToggle.style.height = "30px";
danmakuToggle.style.backgroundColor = "rgba(0, 128, 255, 0.5)";
danmakuToggle.style.color = "white";
danmakuToggle.style.border = "1px solid gray";
danmakuToggle.style.padding = "0 10px";
danmakuToggle.style.boxSizing = "border-box";
danmakuToggle.style.fontSize = "1em";
danmakuToggle.addEventListener("click", async () => {
danmu_enabled = !danmu_enabled;
danmakuToggle.innerText = danmu_enabled ? "弹幕已开启" : "弹幕已关闭";
// clear background color
danmakuToggle.style.backgroundColor = danmu_enabled
? "rgba(0, 128, 255, 0.5)"
: "rgba(255, 0, 0, 0.5)";
});
// create an area overlaying the top part of the video, where danmakus float from right to left
const overlay = document.createElement("div");
overlay.style.width = "100%";
overlay.style.height = "100%";
overlay.style.position = "absolute";
overlay.style.top = "0";
overlay.style.left = "0";
overlay.style.pointerEvents = "none";
overlay.style.zIndex = "30";
overlay.style.display = "flex";
overlay.style.alignItems = "center";
overlay.style.flexDirection = "column";
overlay.style.paddingTop = "10%";
// place overlay to the top of the video
video.parentElement.appendChild(overlay);
// Store the positions of the last few danmakus to avoid overlap
const danmakuPositions = [];
function danmu_handler(content: string) {
const danmaku = document.createElement("p");
danmaku.style.position = "absolute";
// Calculate a random position for the danmaku
let topPosition = 0;
let attempts = 0;
do {
topPosition = Math.random() * 30;
attempts++;
} while (
danmakuPositions.some((pos) => Math.abs(pos - topPosition) < 5) &&
attempts < 10
);
// Record the position
danmakuPositions.push(topPosition);
if (danmakuPositions.length > 10) {
danmakuPositions.shift(); // Keep the last 10 positions
// listen to danmaku event
await listen("danmu:" + room_id, (event: { payload: DanmuEntry }) => {
// if not enabled, or playback is not keeping up with the live stream, ignore the danmaku
if (!danmu_enabled || get_total() - video.currentTime > 5) {
danmu_records.push(event.payload);
return;
}
danmaku.style.top = `${topPosition}%`;
danmaku.style.right = "0";
danmaku.style.color = "white";
danmaku.style.fontSize = "1.2em";
danmaku.style.whiteSpace = "nowrap";
danmaku.style.transform = "translateX(100%)";
danmaku.style.transition = "transform 10s linear";
danmaku.style.pointerEvents = "none";
danmaku.style.margin = "0";
danmaku.style.padding = "0";
danmaku.style.zIndex = "500";
danmaku.style.textShadow = "1px 1px 2px rgba(0, 0, 0, 0.6)";
danmaku.innerText = content;
overlay.appendChild(danmaku);
requestAnimationFrame(() => {
danmaku.style.transform = `translateX(-${overlay.clientWidth + danmaku.clientWidth}px)`;
});
danmaku.addEventListener("transitionend", () => {
overlay.removeChild(danmaku);
});
}
shakaSpacer.appendChild(danmakuToggle);
if (Object.keys(danmu_displayed).length > 1000) {
danmu_displayed = {};
}
danmu_displayed[event.payload.ts] = true;
danmu_records.push(event.payload);
danmu_handler(event.payload.content);
});
}
// create a playback rate select in the shaka-spacer
const playbackRateSelect = document.createElement("select");
playbackRateSelect.style.height = "30px";
playbackRateSelect.style.minWidth = "60px";
playbackRateSelect.style.backgroundColor = "rgba(0, 0, 0, 0.5)";
playbackRateSelect.style.color = "white";
playbackRateSelect.style.border = "1px solid gray";
playbackRateSelect.style.padding = "0 10px";
playbackRateSelect.style.boxSizing = "border-box";
playbackRateSelect.style.fontSize = "1em";
playbackRateSelect.style.right = "10px";
playbackRateSelect.style.position = "absolute";
playbackRateSelect.innerHTML = `
<option value="0.5">0.5x</option>
<option value="1">1x</option>
<option value="1.5">1.5x</option>
<option value="2">2x</option>
<option value="5">5x</option>
`;
// default playback rate is 1
playbackRateSelect.value = "1";
playbackRateSelect.addEventListener("change", () => {
const rate = parseFloat(playbackRateSelect.value);
video.playbackRate = rate;
// create a danmaku toggle button
const danmakuToggle = document.createElement("button");
danmakuToggle.innerText = "弹幕已开启";
danmakuToggle.style.height = "30px";
danmakuToggle.style.backgroundColor = "rgba(0, 128, 255, 0.5)";
danmakuToggle.style.color = "white";
danmakuToggle.style.border = "1px solid gray";
danmakuToggle.style.padding = "0 10px";
danmakuToggle.style.boxSizing = "border-box";
danmakuToggle.style.fontSize = "1em";
danmakuToggle.addEventListener("click", async () => {
danmu_enabled = !danmu_enabled;
danmakuToggle.innerText = danmu_enabled ? "弹幕已开启" : "弹幕已关闭";
// clear background color
danmakuToggle.style.backgroundColor = danmu_enabled
? "rgba(0, 128, 255, 0.5)"
: "rgba(255, 0, 0, 0.5)";
});
shakaSpacer.appendChild(playbackRateSelect);
// create an area overlaying the top part of the video, where danmakus float from right to left
const overlay = document.createElement("div");
overlay.style.width = "100%";
overlay.style.height = "100%";
overlay.style.position = "absolute";
overlay.style.top = "0";
overlay.style.left = "0";
overlay.style.pointerEvents = "none";
overlay.style.zIndex = "30";
overlay.style.display = "flex";
overlay.style.alignItems = "center";
overlay.style.flexDirection = "column";
overlay.style.paddingTop = "10%";
// place overlay to the top of the video
video.parentElement.appendChild(overlay);
// Store the positions of the last few danmakus to avoid overlap
const danmakuPositions = [];
function danmu_handler(content: string) {
const danmaku = document.createElement("p");
danmaku.style.position = "absolute";
// Calculate a random position for the danmaku
let topPosition = 0;
let attempts = 0;
do {
topPosition = Math.random() * 30;
attempts++;
} while (
danmakuPositions.some((pos) => Math.abs(pos - topPosition) < 5) &&
attempts < 10
);
// Record the position
danmakuPositions.push(topPosition);
if (danmakuPositions.length > 10) {
danmakuPositions.shift(); // Keep the last 10 positions
}
danmaku.style.top = `${topPosition}%`;
danmaku.style.right = "0";
danmaku.style.color = "white";
danmaku.style.fontSize = "1.2em";
danmaku.style.whiteSpace = "nowrap";
danmaku.style.transform = "translateX(100%)";
danmaku.style.transition = "transform 10s linear";
danmaku.style.pointerEvents = "none";
danmaku.style.margin = "0";
danmaku.style.padding = "0";
danmaku.style.zIndex = "500";
danmaku.style.textShadow = "1px 1px 2px rgba(0, 0, 0, 0.6)";
danmaku.innerText = content;
overlay.appendChild(danmaku);
requestAnimationFrame(() => {
danmaku.style.transform = `translateX(-${overlay.clientWidth + danmaku.clientWidth}px)`;
});
danmaku.addEventListener("transitionend", () => {
overlay.removeChild(danmaku);
});
}
shakaSpacer.appendChild(danmakuToggle);
// create a playback rate button and menu
const playbackRateButton = document.createElement("button");
playbackRateButton.style.height = "30px";
playbackRateButton.style.minWidth = "30px";
playbackRateButton.style.backgroundColor = "rgba(0, 0, 0, 0.5)";
playbackRateButton.style.color = "white";
playbackRateButton.style.border = "1px solid gray";
playbackRateButton.style.padding = "0 10px";
playbackRateButton.style.boxSizing = "border-box";
playbackRateButton.style.fontSize = "1em";
playbackRateButton.style.right = "10px";
playbackRateButton.style.position = "absolute";
playbackRateButton.innerText = "⚙️";
const SettingMenu = document.createElement("div");
SettingMenu.style.position = "absolute";
SettingMenu.style.bottom = "40px";
SettingMenu.style.right = "10px";
SettingMenu.style.backgroundColor = "rgba(0, 0, 0, 0.8)";
SettingMenu.style.border = "1px solid gray";
SettingMenu.style.padding = "8px";
SettingMenu.style.display = "none";
SettingMenu.style.zIndex = "1000";
// Add danmaku offset input
const offsetContainer = document.createElement("div");
offsetContainer.style.marginBottom = "8px";
const offsetLabel = document.createElement("label");
offsetLabel.innerText = "弹幕偏移(秒):";
offsetLabel.style.color = "white";
offsetLabel.style.marginRight = "8px";
const offsetInput = document.createElement("input");
offsetInput.type = "number";
offsetInput.value = "0";
offsetInput.style.width = "60px";
offsetInput.style.backgroundColor = "rgba(0, 0, 0, 0.5)";
offsetInput.style.color = "white";
offsetInput.style.border = "1px solid gray";
offsetInput.style.padding = "2px 4px";
offsetInput.style.boxSizing = "border-box";
offsetContainer.appendChild(offsetLabel);
offsetContainer.appendChild(offsetInput);
SettingMenu.appendChild(offsetContainer);
// Add divider
const divider = document.createElement("hr");
divider.style.border = "none";
divider.style.borderTop = "1px solid gray";
divider.style.margin = "8px 0";
SettingMenu.appendChild(divider);
// Add playback rate options
const rates = [0.5, 1, 1.5, 2, 5];
rates.forEach((rate) => {
const rateButton = document.createElement("button");
rateButton.innerText = `${rate}x`;
rateButton.style.display = "block";
rateButton.style.width = "100%";
rateButton.style.padding = "4px 8px";
rateButton.style.margin = "2px 0";
rateButton.style.backgroundColor = "rgba(0, 0, 0, 0.5)";
rateButton.style.color = "white";
rateButton.style.border = "1px solid gray";
rateButton.style.cursor = "pointer";
rateButton.style.textAlign = "left";
if (rate === 1) {
rateButton.style.backgroundColor = "rgba(0, 128, 255, 0.5)";
}
rateButton.addEventListener("click", () => {
video.playbackRate = rate;
// Update active state
rates.forEach((r) => {
const btn = SettingMenu.querySelector(
`button[data-rate="${r}"]`
) as HTMLButtonElement;
if (btn) {
btn.style.backgroundColor =
r === rate ? "rgba(0, 128, 255, 0.5)" : "rgba(0, 0, 0, 0.5)";
}
});
});
rateButton.setAttribute("data-rate", rate.toString());
SettingMenu.appendChild(rateButton);
});
// Handle offset input changes
offsetInput.addEventListener("change", () => {
const offset = parseFloat(offsetInput.value);
if (!isNaN(offset)) {
local_offset = offset;
localStorage.setItem(`local_offset:${live_id}`, offset.toString());
}
});
// Toggle menu visibility
playbackRateButton.addEventListener("click", () => {
SettingMenu.style.display =
SettingMenu.style.display === "none" ? "block" : "none";
// while the menu is visible, highlight the button
if (SettingMenu.style.display === "block") {
playbackRateButton.style.backgroundColor = "rgba(0, 128, 255, 0.5)";
} else {
playbackRateButton.style.backgroundColor = "rgba(0, 0, 0, 0.5)";
}
});
// Close menu when clicking outside
document.addEventListener("click", (e) => {
if (
!playbackRateButton.contains(e.target as Node) &&
!SettingMenu.contains(e.target as Node)
) {
SettingMenu.style.display = "none";
}
});
shakaSpacer.appendChild(playbackRateButton);
shakaSpacer.appendChild(SettingMenu);
let danmu_statistics: { ts: number; count: number }[] = [];
if (platform == "bilibili") {
// create a danmu statistics filter input in the shaka-spacer
let statisticKey = "";
const statisticKeyInput = document.createElement("input");
statisticKeyInput.style.height = "30px";
statisticKeyInput.style.width = "100px";
statisticKeyInput.style.backgroundColor = "rgba(0, 0, 0, 0.5)";
statisticKeyInput.style.color = "white";
statisticKeyInput.style.border = "1px solid gray";
statisticKeyInput.style.padding = "0 10px";
statisticKeyInput.style.boxSizing = "border-box";
statisticKeyInput.style.fontSize = "1em";
statisticKeyInput.style.right = "75px";
statisticKeyInput.placeholder = "弹幕统计过滤";
statisticKeyInput.style.position = "absolute";
// create a danmu statistics filter input in the shaka-spacer
let statisticKey = "";
const statisticKeyInput = document.createElement("input");
statisticKeyInput.style.height = "30px";
statisticKeyInput.style.width = "100px";
statisticKeyInput.style.backgroundColor = "rgba(0, 0, 0, 0.5)";
statisticKeyInput.style.color = "white";
statisticKeyInput.style.border = "1px solid gray";
statisticKeyInput.style.padding = "0 10px";
statisticKeyInput.style.boxSizing = "border-box";
statisticKeyInput.style.fontSize = "1em";
statisticKeyInput.style.right = "55px";
statisticKeyInput.placeholder = "弹幕统计过滤";
statisticKeyInput.style.position = "absolute";
function update_statistics() {
let counts = {};
danmu_records.forEach((e) => {
if (statisticKey != "" && !e.content.includes(statisticKey)) {
return;
}
const timeSlot = Math.floor(e.ts / 10000) * 10000; // round the timestamp down to the nearest 10 seconds
counts[timeSlot] = (counts[timeSlot] || 0) + 1;
});
danmu_statistics = [];
for (let ts in counts) {
danmu_statistics.push({ ts: parseInt(ts), count: counts[ts] });
function update_statistics() {
let counts = {};
danmu_records.forEach((e) => {
if (statisticKey != "" && !e.content.includes(statisticKey)) {
return;
}
}
update_statistics();
if (isLive()) {
setInterval(async () => {
update_statistics();
}, 10 * 1000);
}
statisticKeyInput.addEventListener("change", () => {
statisticKey = statisticKeyInput.value;
update_statistics();
const timeSlot =
Math.floor((e.ts + local_offset * 1000) / 10000) * 10000; // round the timestamp down to the nearest 10 seconds
counts[timeSlot] = (counts[timeSlot] || 0) + 1;
});
shakaSpacer.appendChild(statisticKeyInput);
danmu_statistics = [];
for (let ts in counts) {
danmu_statistics.push({ ts: parseInt(ts), count: counts[ts] });
}
}
update_statistics();
if (isLive()) {
setInterval(async () => {
update_statistics();
}, 10 * 1000);
}
statisticKeyInput.addEventListener("change", () => {
statisticKey = statisticKeyInput.value;
update_statistics();
});
shakaSpacer.appendChild(statisticKeyInput);
// shaka-spacer should be flex-direction: column
shakaSpacer.style.flexDirection = "column";
@@ -604,7 +714,7 @@
// dispatch event
dispatch("markerAdd", {
offset: video.currentTime,
realtime: ts + video.currentTime,
realtime: global_offset + video.currentTime,
});
break;
case "ArrowLeft":
@@ -797,22 +907,32 @@
async function exportDanmu(ass: boolean) {
console.log("Export danmus");
const assContent = (await invoke("export_danmu", {
platform: platform,
roomId: room_id,
liveId: live_id,
x: Math.floor(focus_start + start),
y: Math.floor(focus_start + end),
offset: global_offset + parseInt(live_id),
ass: ass,
options: {
platform: platform,
roomId: room_id,
liveId: live_id,
x: Math.floor(focus_start + start),
y: Math.floor(focus_start + end),
offset: global_offset,
ass: ass,
},
})) as string;
let file_name = `danmu_${room_id}_${live_id}.${ass ? "ass" : "txt"}`;
const path = await save({
title: "导出弹幕",
defaultPath: file_name,
});
if (!path) return;
await invoke("export_to_file", { fileName: path, content: assContent });
if (TAURI_ENV) {
const path = await save({
title: "导出弹幕",
defaultPath: file_name,
});
if (!path) return;
await invoke("export_to_file", { fileName: path, content: assContent });
} else {
const a = document.createElement("a");
a.href =
"data:text/plain;charset=utf-8," + encodeURIComponent(assContent);
a.download = file_name;
a.click();
}
}
</script>
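For readability, the offset arithmetic of the history danmaku sender above can be summarized as: shift the current playback position by the global `DANMU` offset, the clip focus start and the per-live local offset, convert to milliseconds, and match against the recorded danmu timestamps. A small sketch using the same names as the code above; the function itself is just an illustration.

// Sketch of the timeline mapping used by the history danmaku sender above.
// Inputs are in seconds; danmu record timestamps (ts) are in milliseconds.
interface DanmuRecord { ts: number; content: string; }

function danmusForLastSecond(
  records: DanmuRecord[],
  currentTime: number,   // video.currentTime
  globalOffset: number,  // DANMU= value parsed from the playlist
  focusStart: number,    // ?start= passed to the player
  localOffset: number,   // per-live offset from localStorage
): DanmuRecord[] {
  const cur = Math.floor((currentTime + globalOffset + focusStart + localOffset) * 1000);
  // the sender runs once per second, so it picks up the last second of the mapped timeline
  return records.filter((v) => v.ts >= cur - 1000 && v.ts < cur);
}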

View File

@@ -0,0 +1,30 @@
<script>
import { Info, LayoutDashboard, Settings, Users, Video } from "lucide-svelte";
import { createEventDispatcher } from "svelte";
const dispatch = createEventDispatcher();
// activeUrl is shared across the project
export let activeUrl = "总览";
export let label = "";
export let dot = false;
</script>
<button
on:click={() => dispatch("activeChange", label)}
class="flex w-full items-center space-x-2 px-3 py-2 rounded-lg {activeUrl ===
label
? 'bg-blue-500/10 text-[#0A84FF]'
: 'text-gray-700'} dark:text-[#0A84FF] hover:bg-[#e5e5e5] dark:hover:bg-[#3a3a3c]"
>
<slot
name="icon"
class={activeUrl === label
? "text-[#0A84FF]"
: "text-gray-700 dark:text-[#0A84FF]"}
></slot>
<span>{label}</span>
{#if dot}
<div class="absolute right-6 w-2 h-2 bg-red-500 rounded-full"></div>
{/if}
</button>

View File

@@ -1,5 +1,5 @@
<script lang="ts">
import { invoke } from "@tauri-apps/api/core";
import { invoke } from "../lib/invoker";
import { Dropdown, DropdownItem, Select } from "flowbite-svelte";
import { ChevronDownOutline } from "flowbite-svelte-icons";
import type { Children, VideoType } from "./interface";

View File

@@ -10,6 +10,7 @@
Trash2,
BrainCircuit,
Eraser,
Download,
} from "lucide-svelte";
import {
generateEventId,
@@ -20,9 +21,8 @@
type VideoItem,
} from "./interface";
import SubtitleStyleEditor from "./SubtitleStyleEditor.svelte";
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import { onDestroy } from "svelte/internal";
import { invoke, TAURI_ENV, listen } from "../lib/invoker";
import { onDestroy } from "svelte";
export let show = false;
export let video: VideoItem;
@@ -77,38 +77,37 @@
let current_encode_event_id = null;
let current_generate_event_id = null;
let progress_update_listener = listen<ProgressUpdate>(
`progress-update`,
(e) => {
let event_id = e.payload.id;
console.log(e.payload);
if (event_id == current_encode_event_id) {
update_encode_prompt(e.payload.content);
} else if (event_id == current_generate_event_id) {
update_generate_prompt(e.payload.content);
}
const update_listener = listen<ProgressUpdate>(`progress-update`, (e) => {
let event_id = e.payload.id;
console.log(e.payload);
if (event_id == current_encode_event_id) {
update_encode_prompt(e.payload.content);
} else if (event_id == current_generate_event_id) {
update_generate_prompt(e.payload.content);
}
);
});
let progress_finished_listener = listen<ProgressFinished>(
`progress-finished`,
(e) => {
let event_id = e.payload.id;
if (event_id == current_encode_event_id) {
update_encode_prompt(`压制字幕`);
if (!e.payload.success) {
alert("压制失败: " + e.payload.message);
}
current_encode_event_id = null;
} else if (event_id == current_generate_event_id) {
update_generate_prompt(`AI 生成字幕`);
if (!e.payload.success) {
alert("生成字幕失败: " + e.payload.message);
}
current_generate_event_id = null;
const finish_listener = listen<ProgressFinished>(`progress-finished`, (e) => {
let event_id = e.payload.id;
if (event_id == current_encode_event_id) {
update_encode_prompt(`压制字幕`);
if (!e.payload.success) {
alert("压制失败: " + e.payload.message);
}
current_encode_event_id = null;
} else if (event_id == current_generate_event_id) {
update_generate_prompt(`AI 生成字幕`);
if (!e.payload.success) {
alert("生成字幕失败: " + e.payload.message);
}
current_generate_event_id = null;
}
);
});
onDestroy(() => {
update_listener?.then((fn) => fn());
finish_listener?.then((fn) => fn());
});
function update_encode_prompt(content: string) {
const encode_prompt = document.getElementById("encode-prompt");
@@ -123,11 +122,6 @@
generate_prompt.textContent = content;
}
}
// remove listeners when component is destroyed
onDestroy(() => {
progress_update_listener.then((fn) => fn());
progress_finished_listener.then((fn) => fn());
});
// watch for changes to the current subtitle index
$: if (currentSubtitleIndex >= 0 && subtitleElements[currentSubtitleIndex]) {
@@ -195,6 +189,7 @@
async function saveSubtitles() {
if (video?.file) {
try {
console.log("update video subtitle");
await invoke("update_video_subtitle", {
id: video.id,
subtitle: subtitlesToSrt(subtitles),
@@ -295,7 +290,7 @@
timeMarkers = Array.from(
{ length: Math.min(Math.ceil(duration / interval) + 1, maxMarkers) },
(_, i) => Math.min(i * interval, duration)
(_, i) => Math.min(i * interval, duration),
);
}
@@ -372,7 +367,7 @@
subtitles = subtitles.map((s, i) =>
i === index
? { ...s, startTime: newStartTimeFinal, endTime: newEndTime }
: s
: s,
);
subtitles = subtitles.sort((a, b) => a.startTime - b.startTime);
}
@@ -397,7 +392,7 @@
const newTime = Math.max(0, sub.startTime + delta);
if (newTime < sub.endTime - 0.1) {
subtitles = subtitles.map((s, i) =>
i === index ? { ...s, startTime: newTime } : s
i === index ? { ...s, startTime: newTime } : s,
);
subtitles = subtitles.sort((a, b) => a.startTime - b.startTime);
}
@@ -405,7 +400,7 @@
const newTime = Math.min(videoElement.duration, sub.endTime + delta);
if (newTime > sub.startTime + 0.1) {
subtitles = subtitles.map((s, i) =>
i === index ? { ...s, endTime: newTime } : s
i === index ? { ...s, endTime: newTime } : s,
);
subtitles = subtitles.sort((a, b) => a.startTime - b.startTime);
}
@@ -415,7 +410,7 @@
function handleTimelineMouseDown(
e: MouseEvent,
index: number,
isStart: boolean
isStart: boolean,
) {
draggingSubtitle = { index, isStart };
document.addEventListener("mousemove", handleTimelineMouseMove);
@@ -519,7 +514,7 @@
function getCurrentSubtitleIndex(): number {
return subtitles.findIndex(
(sub) => currentTime >= sub.startTime && currentTime < sub.endTime
(sub) => currentTime >= sub.startTime && currentTime < sub.endTime,
);
}
@@ -539,6 +534,7 @@
}
async function encodeVideoSubtitle() {
await saveSubtitles();
const event_id = generateEventId();
current_encode_event_id = event_id;
const result = await invoke("encode_video_subtitle", {
@@ -553,7 +549,7 @@
function handleVideoSelect(e: Event) {
const selectedVideo = videos.find(
(v) => v.id === Number((e.target as HTMLSelectElement).value)
(v) => v.id === Number((e.target as HTMLSelectElement).value),
);
if (selectedVideo) {
// clear the subtitle list
@@ -571,6 +567,16 @@
onVideoChange?.(selectedVideo);
}
}
async function saveVideo() {
if (!video) return;
const video_url = video.file;
const video_name = video.file;
const a = document.createElement("a");
a.href = video_url;
a.download = video_name;
a.click();
}
</script>
{#if show}
@@ -602,6 +608,15 @@
<option value={v.id}>{v.name}</option>
{/each}
</select>
<!-- Save button -->
{#if !TAURI_ENV}
<button
class="text-blue-500 hover:text-blue-400 transition-colors duration-200 px-2 py-1.5 rounded-md hover:bg-blue-500/10"
on:click={saveVideo}
>
<Download class="w-4 h-4" />
</button>
{/if}
<!-- Delete button -->
<button
class="text-red-500 hover:text-red-400 transition-colors duration-200 px-2 py-1.5 rounded-md hover:bg-red-500/10"

View File

@@ -1,4 +1,4 @@
import { invoke } from "@tauri-apps/api/core";
import { invoke } from "../lib/invoker";
export interface RoomInfo {
live_status: number;
@@ -100,6 +100,7 @@ export interface Config {
whisper_prompt: string;
clip_name_format: string;
auto_generate: AutoGenerateConfig;
status_check_interval: number;
}
export interface AutoGenerateConfig {
@@ -208,6 +209,7 @@ export interface ClipRangeParams {
y: number;
danmu: boolean;
offset: number;
local_offset: number;
}
export function generateEventId() {

src/lib/invoker.ts (new file, 170 lines)
View File

@@ -0,0 +1,170 @@
import { invoke as tauri_invoke } from "@tauri-apps/api/core";
import { getCurrentWebviewWindow } from "@tauri-apps/api/webviewWindow";
import { fetch as tauri_fetch } from "@tauri-apps/plugin-http";
import { convertFileSrc as tauri_convert } from "@tauri-apps/api/core";
import { listen as tauri_listen } from "@tauri-apps/api/event";
import { open as tauri_open } from "@tauri-apps/plugin-shell";
declare global {
interface Window {
__TAURI_INTERNALS__?: any;
}
}
const ENDPOINT = localStorage.getItem("endpoint") || "";
const TAURI_ENV = typeof window.__TAURI_INTERNALS__ !== "undefined";
const log = {
error: (...args: any[]) => {
const message = args.map((arg) => JSON.stringify(arg)).join(" ");
invoke("console_log", { level: "error", message });
console.error(message);
},
warn: (...args: any[]) => {
const message = args.map((arg) => JSON.stringify(arg)).join(" ");
invoke("console_log", { level: "warn", message });
console.warn(message);
},
info: (...args: any[]) => {
const message = args.map((arg) => JSON.stringify(arg)).join(" ");
invoke("console_log", { level: "info", message });
console.info(message);
},
debug: (...args: any[]) => {
const message = args.map((arg) => JSON.stringify(arg)).join(" ");
invoke("console_log", { level: "debug", message });
console.debug(message);
},
};
async function invoke<T>(
command: string,
args?: Record<string, any>
): Promise<T> {
try {
if (TAURI_ENV) {
// using tauri invoke
return await tauri_invoke<T>(command, args);
}
if (command === "open_live") {
console.log(args);
// open new page to live_index.html
window.open(
`live_index.html?platform=${args.platform}&room_id=${args.roomId}&live_id=${args.liveId}`,
"_blank"
);
return;
}
const response = await fetch(`${ENDPOINT}/api/${command}`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify(args || {}),
});
// if status is 405, it means the command is not allowed
if (response.status === 405) {
throw new Error(
`Command ${command} is not allowed, maybe bili-shadowreplay is running in readonly mode`
);
}
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || `HTTP error: ${response.status}`);
}
const resp = await response.json();
if (resp.code !== 0) {
throw new Error(resp.message);
}
return resp.data as T;
} catch (error) {
// convert HTTP errors into Tauri-style errors
throw new Error(`Failed to invoke ${command}:\n${error}`);
}
}
async function get(url: string) {
if (TAURI_ENV) {
return await tauri_fetch(url);
}
const response = await fetch(`${ENDPOINT}/api/fetch`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
url,
method: "GET",
headers: {},
body: null,
}),
});
return response;
}
async function set_title(title: string) {
if (TAURI_ENV) {
return await getCurrentWebviewWindow().setTitle(title);
}
document.title = title;
}
function convertFileSrc(filePath: string) {
if (TAURI_ENV) {
return tauri_convert(filePath);
}
return `${ENDPOINT}/output/${filePath.split("/").pop()}`;
}
let event_source: EventSource | null = null;
if (!TAURI_ENV) {
event_source = new EventSource(`${ENDPOINT}/api/sse`);
event_source.onopen = () => {
console.log("EventSource connection opened");
};
event_source.onerror = (error) => {
console.error("EventSource error:", error);
};
}
async function listen<T>(event: string, callback: (data: any) => void) {
if (TAURI_ENV) {
return await tauri_listen(event, callback);
}
event_source.addEventListener(event, (event_data) => {
const data = JSON.parse(event_data.data);
console.log("Parsed EventSource data:", data);
callback({
type: event,
payload: data,
});
});
}
async function open(url: string) {
if (TAURI_ENV) {
return await tauri_open(url);
}
window.open(url, "_blank");
}
export {
invoke,
get,
set_title,
TAURI_ENV,
convertFileSrc,
ENDPOINT,
listen,
open,
log,
};
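The new `src/lib/invoker.ts` is what lets the same components run inside the Tauri app and against the headless web server: commands go through `tauri_invoke` natively and through `POST ${ENDPOINT}/api/<command>` otherwise, while events come either from Tauri or from the `/api/sse` `EventSource`. A minimal usage sketch; `some_command` and `some-event` are placeholders, not names from this repository.

// Minimal sketch of consuming the invoker abstraction from a component.
// "some_command" and "some-event" are placeholders, not commands from this repo.
import { invoke, listen, log, TAURI_ENV } from "./lib/invoker";

async function setup() {
  log.info("running in", TAURI_ENV ? "Tauri" : "web", "mode");
  try {
    const result = await invoke<string>("some_command", { value: 42 });
    log.info("some_command returned", result);
  } catch (e) {
    // in readonly web mode the backend may answer 405 for mutating commands
    log.error("some_command failed", e);
  }
  const unlisten = await listen("some-event", (e) => log.debug("event payload", e.payload));
  // in Tauri mode `unlisten` is a function; in web mode it is undefined
  return unlisten;
}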

Some files were not shown because too many files have changed in this diff.