Compare commits
26 Commits
f2732b36f2...develop
| SHA1 |
|---|
| dac2b0e8dc |
| 0541b0b776 |
| 6dd2025d3d |
| 7176cc8f4b |
| 83c9bcf12e |
| ed2e660d34 |
| efdba35b77 |
| ab3a86041a |
| 91e55fa37c |
| bbb767cd20 |
| a69b4c0bcb |
| 4bd22a2009 |
| fd08aaffdf |
| c954bf25d4 |
| 7b88022b66 |
| 98a6ba88fc |
| 916cc7764a |
| 694f335408 |
| abb7cafaed |
| d45fe0fbde |
| c4020615d2 |
| 34c3f0dc89 |
| f9b9ce0994 |
| 9c7f04d197 |
| ab95d124bc |
| bdd3e30f14 |
170
.ai/tauris-agent.md
Normal file
@@ -0,0 +1,170 @@
# ROLE: Senior Desktop Audio Engineer & Tauri Architect

You are an expert in:

- Tauri (Rust backend + system WebView frontend)
- Native audio streaming (FFmpeg, GStreamer, CPAL, Rodio)
- Desktop media players
- Chromecast / casting architectures
- Incremental refactors of production apps

You are working on an existing project named **Taurus RadioPlayer**.

---

## PROJECT CONTEXT (IMPORTANT)

This is a **Tauri desktop application**, NOT Electron.

### Current architecture

- Frontend: Vanilla HTML / CSS / JS served in the WebView
- Backend: Rust (Tauri commands)
- Audio: **Native player (FFmpeg decode + CPAL output)** via Tauri commands (`player_play/stop/set_volume/get_state`)
- Casting: Google Cast via a Node.js sidecar (`castv2-client`)
- Stations: JSON file + user-defined stations in `localStorage`
- Platforms: Windows, Linux, macOS

### Critical limitation

Browser/HTML5 audio is insufficient for:

- stable radio streaming
- buffering control
- reconnection
- unified local + cast playback

---

## PRIMARY GOAL

Upgrade the application by:

1. **Removing HTML5 Audio completely**
2. **Implementing a native audio streaming engine**
3. **Keeping the existing HTML/CSS UI unchanged**
4. **Preserving the current station model and UX**
5. **Maintaining cross-platform compatibility**
6. **Avoiding unnecessary rewrites**

This is an **incremental upgrade**, not a rewrite.

---

## TARGET ARCHITECTURE

- UI remains WebView-based (HTML/CSS/JS)
- JS communicates only via Tauri `invoke()`
- Audio decoding and playback are handled natively
- Local playback: FFmpeg decodes to PCM and CPAL outputs to speakers
- Casting (preferred): the backend starts a **cast tap** that reuses the already-decoded PCM stream and re-encodes it to an MP3 HTTP stream (`-listen 1`) on the LAN; the sidecar casts that local URL
- Casting (fallback): the backend can still run a standalone URL→MP3 proxy when the tap cannot be started
- Casting logic may remain in the sidecar for now

Note: “reuse decoded audio” here means one FFmpeg decode → PCM → fan-out to CPAL (local) and an FFmpeg encode/listen process (cast).

---

## TECHNICAL DIRECTIVES (MANDATORY)

### 1. Frontend rules

- DO NOT redesign HTML or CSS
- DO NOT introduce frameworks (React, Vue, etc.)
- DO NOT use `new Audio()`; all playback must go through backend commands
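For illustration, a minimal sketch of what rule-compliant frontend code looks like. This is an assumption, not the project's actual `main.js`: the `createPlayerControls` helper is hypothetical, and the `invoke` handle (e.g. `window.__TAURI__.core.invoke`) is injected so the wrapper can be exercised outside the WebView.

```js
// Hypothetical sketch: invoke-only playback control, no `new Audio()`.
// `invoke` is injected (e.g. window.__TAURI__.core.invoke) so this
// wrapper can be unit-tested outside the Tauri WebView.
function createPlayerControls(invoke) {
  return {
    play: (url) => invoke('player_play', { url }),
    stop: () => invoke('player_stop'),
    setVolume: (volume) => invoke('player_set_volume', { volume }),
    getState: () => invoke('player_get_state'),
  };
}
```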
### 2. Backend rules

- Prefer **Rust-native solutions**
- Acceptable audio stacks:
  - FFmpeg + CPAL / Rodio
  - GStreamer (if justified)
- Implement commands such as:
  - `player_play(url)`
  - `player_stop()`
  - `player_set_volume(volume)`
  - `player_get_state()`
- Handle:
  - buffering
  - reconnect on stream drop
  - clean shutdown
  - thread safety
### 3. Casting rules

- Do not break existing Chromecast support
- Prefer reusing backend-controlled audio where possible (e.g., cast via the local proxy instead of sending the station URL directly)
- Do not introduce browser-based casting
- Sidecar removal is OPTIONAL, not required now

---

## MIGRATION STRATEGY (VERY IMPORTANT)

You must:

- Work in **small, safe steps**
- Clearly explain which files change and why
- Never delete working functionality without a replacement
- Prefer additive refactors over destructive ones

Each response should:

1. Explain intent
2. Show concrete code
3. State which file is modified
4. Preserve compatibility

---

## WHAT YOU SHOULD PRODUCE

You may generate:

- Rust code (Tauri commands, audio engine)
- JS changes (invoke-based playback)
- Architecture explanations
- Migration steps
- TODO lists
- Warnings about pitfalls

You MUST NOT:

- Suggest Electron or Flutter
- Suggest full rewrites
- Ignore the existing sidecar or station model
- Break the current UX

---

## ENGINEERING PHILOSOPHY

This app should evolve into:

> “A native audio engine with a web UI shell”

The WebView is a **control surface**, not a media engine.

---

## COMMUNICATION STYLE

- Be precise
- Be pragmatic
- Be production-oriented
- Prefer correctness over novelty
- Assume this is a real app with users

---

## FIRST TASK WHEN STARTING

Begin by:

1. Identifying all HTML5 Audio usage
2. Proposing the native audio engine design
3. Defining the minimal command interface
4. Planning the replacement step-by-step

Do NOT write all code at once.
47
.github/FFMPEG_GUIDE.md
vendored
Normal file
@@ -0,0 +1,47 @@
# FFmpeg CI Guide

This file describes how to provide a vetted FFmpeg build to the CI workflow and the archive layouts the workflow expects.

## Secrets (recommended)

- `FFMPEG_URL` — primary URL the workflow will download. Use a stable URL to a signed/hosted FFmpeg build.
- `FFMPEG_URL_LINUX` — optional override for Linux runners.
- `FFMPEG_URL_WINDOWS` — optional override for Windows runners.
- `FFMPEG_URL_MACOS` — optional override for macOS runners.

If per-OS secrets are present, they take precedence over `FFMPEG_URL`.

## Recommended FFmpeg sources

- Use official static builds from a trusted provider, for example:
  - Windows (`ffmpeg.exe`): https://www.gyan.dev/ffmpeg/builds/
  - Linux (static): https://johnvansickle.com/ffmpeg/
  - macOS (static): https://evermeet.cx/ffmpeg/

Prefer hosting a copy in your own artifact store (S3, GitHub Releases) so you control the binary used in CI.

## Expected archive layouts

The workflow will attempt to extract common archive formats. Recommended layouts:

- Zip containing `ffmpeg.exe` at the archive root
  - Example: `ffmpeg-2025-01-01.zip` -> `ffmpeg.exe` (root)

- Tar.gz or tar.xz containing an `ffmpeg` binary at the archive root or inside a single top-level folder
  - Example: `ffmpeg-2025/ffmpeg` or `ffmpeg`

- Raw binary: a direct link to the `ffmpeg` executable is also supported (the workflow will make it executable).

If your archive nests the binary deep inside several folders, consider publishing a trimmed archive that places `ffmpeg` at the root for easier CI extraction.

## Verifying locally

To test the workflow steps locally, download your chosen archive and check that running the binary prints version information:

```bash
# on Linux/macOS
./ffmpeg -version

# on Windows (PowerShell)
.\ffmpeg.exe -version
```

## Notes for maintainers

- If the workflow needs to handle a custom archive layout, update the extraction step in `.github/workflows/ffmpeg-preflight.yml` to locate the binary inside the archive and move it to `src-tauri/resources/ffmpeg(.exe)`.
- After adding secrets, open a PR to trigger the workflow and verify the `FFmpeg preflight OK` message in the CI logs.
129
.github/workflows/ffmpeg-preflight.yml
vendored
Normal file
@@ -0,0 +1,129 @@
name: FFmpeg Preflight and Build

on:
  push:
    branches: [ main, master ]
  pull_request:
    branches: [ main, master ]

jobs:
  preflight:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest, macos-latest]

    env:
      # Provide a fallback URL via repository secret `FFMPEG_URL_{OS}` or `FFMPEG_URL`.
      FFMPEG_URL: ${{ secrets.FFMPEG_URL }}

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'

      - name: Set up Rust
        uses: dtolnay/rust-toolchain@stable

      - name: Determine OS-specific ffmpeg URL
        id: ffmpeg-url
        shell: bash
        run: |
          echo "RUNNER_OS=${RUNNER_OS}"
          if [[ "${RUNNER_OS}" == "Windows" ]]; then
            echo "url=${{ secrets.FFMPEG_URL_WINDOWS || secrets.FFMPEG_URL }}" >> $GITHUB_OUTPUT
          elif [[ "${RUNNER_OS}" == "macOS" ]]; then
            echo "url=${{ secrets.FFMPEG_URL_MACOS || secrets.FFMPEG_URL }}" >> $GITHUB_OUTPUT
          else
            echo "url=${{ secrets.FFMPEG_URL_LINUX || secrets.FFMPEG_URL }}" >> $GITHUB_OUTPUT
          fi

      - name: Create resources dir
        shell: bash
        run: mkdir -p src-tauri/resources

      - name: Download and install FFmpeg into resources
        if: steps.ffmpeg-url.outputs.url != ''
        shell: bash
        run: |
          set -euo pipefail
          URL="${{ steps.ffmpeg-url.outputs.url }}"
          echo "Downloading ffmpeg from: $URL"
          FNAME="${RUNNER_TEMP}/ffmpeg_bundle"
          if [[ "${RUNNER_OS}" == "Windows" ]]; then
            powershell -Command "(New-Object Net.WebClient).DownloadFile('$URL', '$FNAME.zip')"
            powershell -Command "Expand-Archive -Path '$FNAME.zip' -DestinationPath '${{ github.workspace }}\\src-tauri\\resources'"
          else
            curl -sL "$URL" -o "$FNAME"
            # Attempt to extract common archive formats
            if file "$FNAME" | grep -q 'Zip archive'; then
              unzip -q "$FNAME" -d src-tauri/resources
            elif file "$FNAME" | grep -q 'gzip compressed data'; then
              tar -xzf "$FNAME" -C src-tauri/resources
            elif file "$FNAME" | grep -q 'XZ compressed'; then
              tar -xJf "$FNAME" -C src-tauri/resources
            else
              # Assume raw binary
              mv "$FNAME" src-tauri/resources/ffmpeg
              chmod +x src-tauri/resources/ffmpeg
            fi
          fi

      - name: List resources
        shell: bash
        run: ls -la src-tauri/resources || true

      - name: Locate ffmpeg binary (Linux/macOS)
        if: runner.os != 'Windows'
        shell: bash
        run: |
          set -euo pipefail
          # Try to find an ffmpeg executable anywhere under resources
          BINPATH=$(find src-tauri/resources -type f -iname ffmpeg -print -quit || true)
          if [ -z "$BINPATH" ]; then
            BINPATH=$(find src-tauri/resources -type f -iname 'ffmpeg*' -print -quit || true)
          fi
          if [ -n "$BINPATH" ]; then
            echo "Found ffmpeg at $BINPATH"
            if [ "$BINPATH" != "src-tauri/resources/ffmpeg" ]; then
              cp "$BINPATH" src-tauri/resources/ffmpeg
            fi
            chmod +x src-tauri/resources/ffmpeg
          else
            echo "ffmpeg binary not found in resources"
            ls -R src-tauri/resources || true
            exit 1
          fi

      - name: Locate ffmpeg binary (Windows)
        if: runner.os == 'Windows'
        shell: pwsh
        run: |
          $found = Get-ChildItem -Path src-tauri/resources -Recurse -Filter ffmpeg.exe -ErrorAction SilentlyContinue | Select-Object -First 1
          if (-not $found) {
            $found = Get-ChildItem -Path src-tauri/resources -Recurse -Filter '*ffmpeg*' -ErrorAction SilentlyContinue | Select-Object -First 1
          }
          if ($found) {
            Write-Host "Found ffmpeg at $($found.FullName)"
            Copy-Item $found.FullName -Destination 'src-tauri\resources\ffmpeg.exe' -Force
          } else {
            Write-Host "ffmpeg not found in src-tauri/resources"
            Get-ChildItem src-tauri\resources -Recurse | Format-List
            exit 1
          }

      - name: Install npm deps
        run: npm ci

      - name: Copy project FFmpeg helpers
        shell: bash
        run: node tools/copy-ffmpeg.js || true

      - name: Build Rust and run ffmpeg preflight check
        working-directory: src-tauri
        shell: bash
        run: |
          set -e
          cargo build --release
          cargo run --release --bin check_ffmpeg

      - name: Optional frontend build
        shell: bash
        run: npm run build --if-present || true
34
.gitignore
vendored
@@ -1,3 +1,37 @@
# Node
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
package-lock.json

# Tauri / Rust
/target/
/src-tauri/binaries/
/src-tauri/target/

# Local build artifacts
/dist/
/build/

# FFmpeg / downloaded binaries
/ffmpeg/bin/

# Editor / OS files
.vscode/
.DS_Store
Thumbs.db

# Logs and temp
logs
*.log
*.tmp

# Generated by tools
/tools/*.cache

# Misc
*.tgz
.env
54
README.md
@@ -36,7 +36,13 @@ Before you begin, ensure you have the following installed on your machine:
To start the application in development mode (with hot-reloading for frontend changes):

```bash
-npm run tauri dev
+npm run dev
```

+If you want FFmpeg to be bundled into `src-tauri/resources/` for local/native playback during dev, use:
+
+```bash
+npm run dev:native
+```

This command will:

@@ -50,7 +56,7 @@ To create an optimized, standalone executable for your operating system:

1. **Run the build command**:

```bash
-npm run tauri build
+npm run build
```

2. **Locate the artifacts**:

@@ -67,7 +73,9 @@ To create an optimized, standalone executable for your operating system:

* `styles.css`: Application styling.
* `stations.json`: Configuration file for available radio streams.
* **`src-tauri/`**: Rust backend code.
-  * `src/main.rs`: The entry point for the Rust process. Handles Google Cast discovery and playback logic.
+  * `src/lib.rs`: Tauri command layer (native player commands, Cast commands, utility HTTP helpers).
+  * `src/player.rs`: Native audio engine (FFmpeg decode → PCM ring buffer → CPAL output).
+  * `src/main.rs`: Rust entry point (wires the Tauri app; most command logic lives in `lib.rs`).
* `tauri.conf.json`: Configuration for the Tauri app (window size, permissions, package info).

## Customization

@@ -103,18 +111,46 @@ To change the default window size, edit `src-tauri/tauri.conf.json`:

* **WebView2 Error (Windows)**: If the app doesn't start on Windows, ensure the [Microsoft Edge WebView2 Runtime](https://developer.microsoft.com/en-us/microsoft-edge/webview2/) is installed.
* **Build Failures**: Try running `cargo update` inside the `src-tauri` folder to update Rust dependencies.

+## FFmpeg (Optional) for Native Playback
+
+Local/native playback uses an external **FFmpeg** binary to decode radio streams.
+
+### How the app finds FFmpeg
+
+At runtime it searches in this order:
+
+1. `RADIOPLAYER_FFMPEG` environment variable (absolute or relative path)
+2. Next to the application executable (Windows: `ffmpeg.exe`, macOS/Linux: `ffmpeg`)
+3. Common bundle resource folders relative to the executable:
+   - `resources/ffmpeg(.exe)`
+   - `Resources/ffmpeg(.exe)`
+   - `../resources/ffmpeg(.exe)`
+   - `../Resources/ffmpeg(.exe)`
+4. Your system `PATH`
+
+### Optional: download FFmpeg automatically (Windows)
+
+This is **opt-in** (it is not run automatically during build/run). It downloads a prebuilt FFmpeg zip and extracts `ffmpeg.exe` into `tools/ffmpeg/bin/ffmpeg.exe`.
+
+```bash
+npm run ffmpeg:download
+```
+
+Then run `npm run dev:native` (or `npm run build`) to copy FFmpeg into `src-tauri/resources/` for bundling.

## License

[Add License Information Here]

-## Release v0.1
+## Release v0.2

-Initial public preview (v0.1) — a minimal, working RadioPlayer experience:
+Public beta (v0.2) — updates since v0.1:

-- Custom CAF Receiver UI (HTML/CSS/JS) in `receiver/` with branded artwork and playback status.
-- Plays LIVE stream: `https://live.radio1.si/Radio1MB` (contentType: `audio/mpeg`, streamType: `LIVE`).
-- Desktop sidecar (`sidecar/index.js`) launches the Default Media Receiver and sends LOAD commands; launch flow now retries if the device reports `NOT_ALLOWED` by stopping existing sessions first.
+- **Android build support:** Project includes Android build scripts and Gradle wrappers. See [scripts/build-android.sh](scripts/build-android.sh) and [build-android.ps1](build-android.ps1). Prebuilt native helper binaries are available in `src-tauri/binaries/` for convenience.
+- **Web receiver & webapp:** The `receiver/` folder contains a Custom CAF Receiver UI (HTML/CSS/JS) and the `webapp/` folder provides a standalone web distribution for hosting the app in browsers or PWAs.
+- **Sidecar improvements:** `sidecar/index.js` now retries launches when devices return `NOT_ALLOWED` by attempting to stop existing sessions before retrying. Check sidecar logs for `Launch NOT_ALLOWED` messages and retry attempts.
+- **LIVE stream:** The app continues to support the LIVE stream `https://live.radio1.si/Radio1MB` (contentType: `audio/mpeg`, streamType: `LIVE`).

Included receiver files:

@@ -140,6 +176,6 @@ npx http-server receiver -p 8443 -S -C localhost.pem -K localhost-key.pem

Sidecar / troubleshoot

-- If a Cast launch fails with `NOT_ALLOWED`, the sidecar will now attempt to stop any existing sessions on the device and retry the launch (best-effort). Check sidecar logs for `Launch NOT_ALLOWED` and subsequent retry attempts.
+- If a Cast launch fails with `NOT_ALLOWED`, the sidecar will attempt to stop any existing sessions on the device and retry the launch (best-effort). Check sidecar logs for `Launch NOT_ALLOWED` and subsequent retry attempts.
- Note: the sidecar uses `castv2-client` (not the official Google sender SDK). Group/stereo behavior may vary across device types — for full sender capabilities consider adding an official sender implementation.
343
TECHNICAL_DOCUMENTATION.md
Normal file
@@ -0,0 +1,343 @@
# RadioPlayer — Technical Documentation (Tauri + Desktop)

This document describes the desktop (Tauri) application architecture, the build pipeline, the backend commands, and how the UI maps to that backend.

## High-level architecture

- **Frontend (WebView)**: Vanilla HTML/CSS/JS in [src/index.html](src/index.html), [src/main.js](src/main.js), [src/styles.css](src/styles.css)
- **Tauri host (Rust)**: Command layer + device discovery in [src-tauri/src/lib.rs](src-tauri/src/lib.rs)
- **Native audio engine (Rust)**: FFmpeg decode + CPAL output in [src-tauri/src/player.rs](src-tauri/src/player.rs)
- **Cast sidecar (Node executable)**: Google Cast control via `castv2-client` in [sidecar/index.js](sidecar/index.js)
- **Packaging utilities**:
  - Sidecar binary copy/rename step: [tools/copy-binaries.js](tools/copy-binaries.js)
  - Windows EXE icon patch: [tools/post-build-rcedit.js](tools/post-build-rcedit.js)
  - Optional FFmpeg bundling helper: [tools/copy-ffmpeg.js](tools/copy-ffmpeg.js) (see [tools/ffmpeg/README.md](tools/ffmpeg/README.md))

Data flow:

1. UI actions call JS functions in `main.js`.
2. JS calls Tauri commands via `window.__TAURI__.core.invoke()` (for both local playback and casting).
3. In **Local mode**, Rust spawns FFmpeg and plays decoded PCM via CPAL.
4. In **Cast mode**, the Rust backend discovers Cast devices via mDNS and stores `{ deviceName -> ip }`.
5. On `cast_play/stop/volume`, Rust spawns (or reuses) a **sidecar process**, then sends newline-delimited JSON commands to the sidecar's stdin.
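The mode split in the data flow can be sketched as a single dispatch helper. This is an assumed shape for illustration, not the literal `main.js` code: `playbackCommand` is a hypothetical name, and the camelCase argument keys follow Tauri's usual JS-side convention.

```js
// Sketch: map (mode, action, context) to the invoke() call the UI makes.
// Command names match this document; the helper and its argument shapes
// are hypothetical. Returns [command, args] so it is easy to test.
function playbackCommand(mode, action, ctx) {
  if (mode === 'cast') {
    if (action === 'play') return ['cast_play', { deviceName: ctx.device, url: ctx.url }];
    if (action === 'stop') return ['cast_stop', { deviceName: ctx.device }];
    return ['cast_set_volume', { deviceName: ctx.device, volume: ctx.volume }];
  }
  if (action === 'play') return ['player_play', { url: ctx.url }];
  if (action === 'stop') return ['player_stop', {}];
  return ['player_set_volume', { volume: ctx.volume }];
}
```

The UI would then call `invoke(...playbackCommand(currentMode, 'play', { device: currentCastDevice, url: station.url }))`.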
## Running and building

### Prerequisites

- Node.js (the project uses ESM at the root; see [package.json](package.json))
- Rust toolchain (via rustup)
- Platform build tools (Windows: Visual Studio C++ Build Tools)
- Tauri prerequisites (WebView2 runtime on Windows)

### Dev

From the repo root:

- `npm install`
- `npm run dev`

This runs `tauri dev` (see [package.json](package.json)).

### Production build (Windows MSI/NSIS, etc.)

From the repo root:

- `npm run build`

What it does (see [package.json](package.json)):

1. `node tools/copy-binaries.js` — ensures the expected bundled binary name exists.
2. `tauri build` — builds the Rust host and generates platform bundles.
3. `node tools/post-build-rcedit.js` — patches the Windows EXE icon using the locally installed `rcedit` binary.

Artifacts typically land under:

- `src-tauri/target/release/bundle/`

### Building the sidecar

The sidecar is built separately using `pkg` (see [sidecar/package.json](sidecar/package.json)):

- `cd sidecar`
- `npm install`
- `npm run build`

This outputs:

- `src-tauri/binaries/radiocast-sidecar-x86_64-pc-windows-msvc.exe`

## Tauri configuration

### App config

Defined in [src-tauri/tauri.conf.json](src-tauri/tauri.conf.json):

- **build.frontendDist**: `../src`
  - The desktop app serves the static files in `src/`.
- **window**:
  - `width: 360`, `height: 720`, `resizable: false`
  - `decorations: false`, `transparent: true` (frameless / custom UI)
- **security.csp**: `null` (CSP disabled)
- **bundle.targets**: `"all"`
- **bundle.externalBin**: includes external binaries shipped with the bundle.

### Capabilities and permissions

Defined in [src-tauri/capabilities/default.json](src-tauri/capabilities/default.json):

- `core:default`
- `core:window:allow-close` (allows JS to close the window)
- `opener:default`
- `shell:default` (required for spawning the sidecar)

## Rust backend (Tauri commands)

All commands are in [src-tauri/src/lib.rs](src-tauri/src/lib.rs) and registered via `invoke_handler`.

### Shared state

- `AppState.known_devices: HashMap<String, String>`
  - maps **device name** → **IP string**
- `SidecarState.child: Option<CommandChild>`
  - stores a single long-lived sidecar child process

### mDNS discovery

In `.setup()` the backend spawns a thread that browses:

- `_googlecast._tcp.local.`

When a device is resolved:

- The name is taken from the `fn` TXT record if present, otherwise from `fullname`.
- The first IPv4 address is preferred.
- New devices are inserted into `known_devices` and logged.

### Native player commands (local playback)

Local playback is handled by the Rust engine in [src-tauri/src/player.rs](src-tauri/src/player.rs). The UI controls it using these commands:

#### `player_play(url: String) -> Result<(), String>`

- Starts native playback of the provided stream URL.
- Internally spawns FFmpeg to decode into `s16le` PCM and feeds a ring buffer consumed by a CPAL output stream.
- Reports `buffering` → `playing` based on buffer fill/underrun.

#### `player_stop() -> Result<(), String>`

- Stops the native pipeline and updates state.

#### `player_set_volume(volume: f32) -> Result<(), String>`

- Sets the volume in the range `[0, 1]`.

#### `player_get_state() -> Result<PlayerState, String>`

- Returns `{ status, url, volume, error }`.
- Used by the UI to keep the status text and play/stop button in sync.

### Cast commands

#### `list_cast_devices() -> Result<Vec<String>, String>`

- Returns the sorted list of discovered Cast device names.
- Used by the UI when opening the Cast picker overlay.

#### `cast_play(device_name: String, url: String) -> Result<(), String>`

- Resolves `device_name` → `ip` from `known_devices`.
- Spawns the sidecar if it doesn't exist yet:
  - `app.shell().sidecar("radiocast-sidecar")`
  - Sidecar stdout/stderr are forwarded to the Rust process logs.
- Writes a JSON line to the sidecar stdin:

```json
{ "command": "play", "args": { "ip": "<ip>", "url": "<streamUrl>" } }
```

#### `cast_stop(device_name: String) -> Result<(), String>`

- If the sidecar process exists, writes:

```json
{ "command": "stop", "args": {} }
```

#### `cast_set_volume(device_name: String, volume: f32) -> Result<(), String>`

- If the sidecar process exists, writes:

```json
{ "command": "volume", "args": { "level": 0.0 } }
```

Notes:

- `volume` is passed from the UI in the range `[0, 1]`.
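Since both the native player and the Cast path expect volume in `[0, 1]`, the UI can normalize slider input before invoking either command. A small sketch (the helper name is hypothetical):

```js
// Sketch: clamp UI volume input to the [0, 1] range both backends expect.
// Non-numeric input falls back to 0 (muted) rather than throwing.
function clampVolume(v) {
  const n = Number(v);
  return Math.min(1, Math.max(0, Number.isFinite(n) ? n : 0));
}
```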
#### `fetch_url(url: String) -> Result<String, String>`

- Performs a server-side HTTP GET using `reqwest`.
- Returns the response body as text.
- Used by the UI to bypass browser CORS limitations when calling third-party endpoints.

## Sidecar protocol and behavior

Implementation: [sidecar/index.js](sidecar/index.js)

### Input protocol (stdin)

The sidecar reads **newline-delimited JSON objects**:

- `{"command":"play","args":{"ip":"...","url":"..."}}`
- `{"command":"stop","args":{}}`
- `{"command":"volume","args":{"level":0.5}}`
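The framing for this protocol is simple enough to sketch directly. These helper names are hypothetical (the actual sidecar reads stdin line by line), but the command shapes match the list above:

```js
// Sketch: encode/decode helpers for the newline-delimited JSON protocol.
// One JSON object per line; blank lines are ignored on decode.
function encodeCommand(command, args = {}) {
  return JSON.stringify({ command, args }) + '\n';
}

function decodeCommands(buffer) {
  return buffer
    .split('\n')
    .filter((line) => line.trim() !== '')
    .map((line) => JSON.parse(line));
}
```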
### Output protocol (stdout/stderr)

Logs are JSON objects:

- `{"type":"log","message":"..."}` to stdout
- `{"type":"error","message":"..."}` to stderr

### Cast launch logic

- Connects to the device IP.
- Reads existing sessions via `getSessions()`.
- If a Default Media Receiver session (`appId === "CC1AD845"`) exists, tries to join it.
- If other sessions exist, attempts to stop them to avoid `NOT_ALLOWED`.
- On a `NOT_ALLOWED` launch, retries once after stopping sessions (best-effort).

## Frontend behavior

### Station data model

Stations are loaded from [src/stations.json](src/stations.json) and normalized in [src/main.js](src/main.js) into:

```js
{ id, name, url, logo, enabled, raw }
```

Normalization rules (important for `stations.json` format compatibility):

- `name`: `title || id || name || "Unknown"`
- `url`: `liveAudio || liveVideo || liveStream || url || ""`
- `logo`: `logo || poster || ""`
- Stations with `enabled === false` or without a URL are filtered out.
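The normalization rules above can be sketched as follows. This is an illustration of the rules, not the literal `main.js` implementation (the function names are hypothetical):

```js
// Sketch of the station normalization rules documented above.
function normalizeStation(raw) {
  return {
    id: raw.id,
    name: raw.title || raw.id || raw.name || 'Unknown',
    url: raw.liveAudio || raw.liveVideo || raw.liveStream || raw.url || '',
    logo: raw.logo || raw.poster || '',
    enabled: raw.enabled !== false,
    raw,
  };
}

// Drop stations that are explicitly disabled or have no playable URL.
function filterStations(list) {
  return list.map(normalizeStation).filter((s) => s.enabled && s.url);
}
```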
User-defined stations are stored in `localStorage` under `userStations` and appended after the file-based stations.

The last selected station is stored under `localStorage.lastStationId`.

### Playback modes

State is tracked in JS:

- `currentMode`: `"local"` or `"cast"`
- `currentCastDevice`: string or `null`
- `isPlaying`: boolean

#### Local mode

- Uses the backend invokes `player_play`, `player_stop`, `player_set_volume`.
- The UI polls `player_get_state` to reflect `buffering/playing/stopped/error`.

#### Cast mode

- Uses the backend invokes `cast_play`, `cast_stop`, `cast_set_volume`.

### Current song (“Now Playing”) polling

- For the currently selected station only, the app polls a station endpoint every 10 s.
- It prefers `raw.currentSong`, otherwise uses `raw.lastSongs`.
- Remote URLs are fetched via the Tauri backend `fetch_url` to bypass CORS.
- If the provider returns timing fields (`playTimeStart*`, `playTimeLength*`), the UI schedules a single refresh near the end of the song.
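The "single refresh near song end" can be sketched as a delay calculation. This is an assumption about the scheduling logic, not the actual `main.js` code: the exact provider field names and units behind `playTimeStart*`/`playTimeLength*` vary, so the sketch takes start and length already converted to epoch milliseconds.

```js
// Sketch: when to schedule the one-shot "Now Playing" refresh.
// Inputs are assumed to be epoch milliseconds; the 2 s grace period and
// 1 s minimum delay are illustrative values, not the app's actual ones.
function refreshDelayMs(songStartMs, songLengthMs, nowMs) {
  const songEndMs = songStartMs + songLengthMs;
  // Refresh shortly after the reported end, but never sooner than 1 s out.
  return Math.max(songEndMs + 2000 - nowMs, 1000);
}
```

The UI would pass the result to `setTimeout` instead of waiting for the next 10 s poll tick.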
### Overlays
|
||||
|
||||
The element [src/index.html](src/index.html) `#cast-overlay` is reused for two different overlays:
|
||||
|
||||
- Cast device picker (`openCastOverlay()`)
|
||||
- Station grid chooser (`openStationsOverlay()`)
|
||||
|
||||
The content is switched by:
|
||||
|
||||
- Toggling the `stations-grid` class on `#device-list`
|
||||
- Replacing `#device-list` contents dynamically
|
||||
|
||||
## UI controls (button-by-button)
|
||||
|
||||
All UI IDs below are in [src/index.html](src/index.html) and are wired in [src/main.js](src/main.js).
|
||||
|
||||
### Window / header
|
||||
|
||||
- `#close-btn`
|
||||
- Calls `getCurrentWindow().close()` (requires `core:window:allow-close`).
|
||||
- `#cast-toggle-btn`
|
||||
- Opens the Cast overlay and lists discovered devices (`invoke('list_cast_devices')`).
|
||||
- `#edit-stations-btn`
|
||||
- Opens the Stations Editor overlay (user stations stored in `localStorage.userStations`).
|
||||
|
||||
Note:
|
||||
|
||||
- `#cast-toggle-btn` and `#edit-stations-btn` appear twice in the HTML header. Duplicate IDs are invalid HTML and only the first element returned by `getElementById()` will be wired.
|
||||
|
||||
### Coverflow (station carousel inside artwork)
|
||||
|
||||
- `#artwork-prev`
|
||||
- Selects previous station via `setStationByIndex()`.
|
||||
- `#artwork-next`
|
||||
- Selects next station via `setStationByIndex()`.
|
||||
- `#artwork-coverflow` (drag/wheel area)
|
||||
- Pointer drag changes station when movement exceeds a threshold.
|
||||
- Wheel scroll changes station with a short debounce.
|
||||
- Coverflow card click
|
||||
- Selects that station.
|
||||
- Coverflow card double-click (on the selected station)
|
||||
- Opens the station grid overlay.
|
||||
|
||||
### Transport controls
|
||||
|
||||
- `#play-btn`
|
||||
- Toggles play/stop (`togglePlay()`):
|
||||
- Local mode: `invoke('player_play')` / `invoke('player_stop')`.
|
||||
- Cast mode: `invoke('cast_play')` / `invoke('cast_stop')`.
|
||||
- `#prev-btn`
|
||||
- Previous station (`playPrev()` → `setStationByIndex()`).
|
||||
- `#next-btn`
|
||||
- Next station (`playNext()` → `setStationByIndex()`).
|
||||
|
||||
### Volume

- `#volume-slider`
  - Local: `invoke('player_set_volume')`.
  - Cast: `invoke('cast_set_volume')`.
  - Persists `localStorage.volume`.
- `#mute-btn`
  - Present in the UI but currently not wired to a handler in `main.js`.
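The slider-to-backend conversion is a 0–100 integer → 0.0–1.0 float mapping. This clamped helper (hypothetical name) sketches what both volume commands consume:

```javascript
// Clamp the slider's 0–100 value and scale it to the 0.0–1.0 float
// expected by player_set_volume / cast_set_volume.
function sliderToVolume(value) {
  const n = Math.min(100, Math.max(0, Number(value) || 0));
  return n / 100;
}

console.log(sliderToVolume(50));    // 0.5
console.log(sliderToVolume('150')); // 1
console.log(sliderToVolume(-3));    // 0
```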
### Cast overlay

- `#close-overlay`
  - Closes the overlay (`closeCastOverlay()`).
### Stations editor overlay

- `#editor-close-btn`
  - Closes the editor overlay.
- `#add-station-form` submit
  - Adds or updates a station in `localStorage.userStations`.
  - Triggers a full station reload (`loadStations()`).
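The add/update path behind the form submit can be sketched as an upsert on the stored JSON array. Here `storage` is a plain object standing in for `localStorage`, and matching stations on `name` is an assumption of this sketch:

```javascript
// Upsert a station into the JSON-serialized array the editor keeps in
// localStorage.userStations. `storage` is a stub; keying on `name` is
// an assumption, not confirmed app behavior.
function upsertStation(storage, station) {
  const list = JSON.parse(storage.userStations || '[]');
  const i = list.findIndex((s) => s.name === station.name);
  if (i >= 0) list[i] = station;
  else list.push(station);
  storage.userStations = JSON.stringify(list);
  return list;
}

const store = {};
upsertStation(store, { name: 'Radio1', url: 'https://example.com/a' });
upsertStation(store, { name: 'Radio1', url: 'https://example.com/b' }); // updates in place

console.log(JSON.parse(store.userStations).length); // 1
```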
## Service worker / PWA pieces

- Service worker file: [src/sw.js](src/sw.js)
  - Caches core app assets for offline-ish behavior.
- Web manifest: [src/manifest.json](src/manifest.json)
  - Name/icons/theme for an installable PWA (primarily relevant for the web build; harmless in Tauri).
## Known sharp edges / notes

- **Duplicate IDs in the HTML header**: only one of the duplicates will receive JS event listeners.
- **Sidecar bundling name**: the build pipeline copies `radiocast-sidecar-...` to `RadioPlayer-...` (see [tools/copy-binaries.js](tools/copy-binaries.js)); ensure the bundled binary name matches what `shell.sidecar("radiocast-sidecar")` expects for your target.
@@ -1,11 +0,0 @@
This folder is not a full Android Studio project.

The buildable Android Studio/Gradle project is generated by Tauri at:

- src-tauri/gen/android

If you haven't generated it yet, run from the repo root:

- .\node_modules\.bin\tauri.cmd android init --ci

Then open `src-tauri/gen/android` in Android Studio and build the APK/AAB.
Before Width: | Height: | Size: 682 KiB |
Before Width: | Height: | Size: 55 KiB |
Before Width: | Height: | Size: 290 KiB |
Before Width: | Height: | Size: 290 KiB |
Before Width: | Height: | Size: 49 KiB |
Before Width: | Height: | Size: 859 B |
Before Width: | Height: | Size: 2.6 KiB |
Before Width: | Height: | Size: 15 KiB |
@@ -1 +0,0 @@
{"name":"","short_name":"","icons":[{"src":"/android-chrome-192x192.png","sizes":"192x192","type":"image/png"},{"src":"/android-chrome-512x512.png","sizes":"512x512","type":"image/png"}],"theme_color":"#ffffff","background_color":"#ffffff","display":"standalone"}
@@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="32" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 256"><path fill="#F7DF1E" d="M0 0h256v256H0V0Z"></path><path d="m67.312 213.932l19.59-11.856c3.78 6.701 7.218 12.371 15.465 12.371c7.905 0 12.89-3.092 12.89-15.12v-81.798h24.057v82.138c0 24.917-14.606 36.259-35.916 36.259c-19.245 0-30.416-9.967-36.087-21.996m85.07-2.576l19.588-11.341c5.157 8.421 11.859 14.607 23.715 14.607c9.969 0 16.325-4.984 16.325-11.858c0-8.248-6.53-11.17-17.528-15.98l-6.013-2.58c-17.357-7.387-28.87-16.667-28.87-36.257c0-18.044 13.747-31.792 35.228-31.792c15.294 0 26.292 5.328 34.196 19.247l-18.732 12.03c-4.125-7.389-8.591-10.31-15.465-10.31c-7.046 0-11.514 4.468-11.514 10.31c0 7.217 4.468 10.14 14.778 14.608l6.014 2.577c20.45 8.765 31.963 17.7 31.963 37.804c0 21.654-17.012 33.51-39.867 33.51c-22.339 0-36.774-10.654-43.819-24.574"></path></svg>

Before Width: | Height: | Size: 995 B |
@@ -1,4 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="206" height="231" viewBox="0 0 206 231">
  <!-- Wrapper SVG that embeds the PNG app icon so existing references to tauri.svg render the PNG -->
  <image href="appIcon.png" width="206" height="231" preserveAspectRatio="xMidYMid slice" />
</svg>

Before Width: | Height: | Size: 289 B |
@@ -1,158 +0,0 @@
<!DOCTYPE html>
<html lang="en">

<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Radio Player</title>
  <link rel="stylesheet" href="styles.css">
  <script src="main.js" defer type="module"></script>
</head>

<body>
  <div class="app-container">
    <div class="bg-shape shape-1"></div>
    <div class="bg-shape shape-2"></div>

    <main class="glass-card">
      <header data-tauri-drag-region>
        <button id="menu-btn" class="icon-btn" aria-label="Menu">
          <svg width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2"
            stroke-linecap="round" stroke-linejoin="round">
            <line x1="3" y1="12" x2="21" y2="12"></line>
            <line x1="3" y1="6" x2="21" y2="6"></line>
            <line x1="3" y1="18" x2="21" y2="18"></line>
          </svg>
        </button>
        <div class="header-info" data-tauri-drag-region>
          <span class="app-title">Radio1 Player</span>
          <span class="status-indicator" id="status-indicator">
            <span class="status-dot"></span> <span id="status-text">Ready</span>
          </span>
        </div>
        <div class="header-buttons">
          <button id="cast-toggle-btn" class="icon-btn" aria-label="Cast">
            <svg width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2"
              stroke-linecap="round" stroke-linejoin="round">
              <path d="M2 16.1A5 5 0 0 1 5.9 20M2 12.05A9 9 0 0 1 9.95 20M2 8V6a14 14 0 0 1 14 14h-2" />
            </svg>
          </button>
          <button id="close-btn" class="icon-btn close-btn" aria-label="Close">
            <svg width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2"
              stroke-linecap="round" stroke-linejoin="round">
              <line x1="18" y1="6" x2="6" y2="18"></line>
              <line x1="6" y1="6" x2="18" y2="18"></line>
            </svg>
          </button>
        </div>
      </header>

      <section class="artwork-section">
        <div class="artwork-container">
          <div class="artwork-placeholder">
            <!-- Gooey SVG filter for fluid blob blending -->
            <svg width="0" height="0" style="position:absolute">
              <defs>
                <filter id="goo">
                  <!-- increased blur for smoother, more transparent blending -->
                  <feGaussianBlur in="SourceGraphic" stdDeviation="18" result="blur" />
                  <feColorMatrix in="blur" mode="matrix" values="1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 18 -7" result="goo" />
                  <feBlend in="SourceGraphic" in2="goo" />
                </filter>
              </defs>
            </svg>

            <div class="logo-blobs" aria-hidden="true">
              <span class="blob b1"></span>
              <span class="blob b2"></span>
              <span class="blob b3"></span>
              <span class="blob b4"></span>
              <span class="blob b5"></span>
              <span class="blob b6"></span>
              <span class="blob b7"></span>
              <span class="blob b8"></span>
              <span class="blob b9"></span>
              <span class="blob b10"></span>
            </div>

            <img id="station-logo-img" class="station-logo-img hidden" alt="station logo">
            <span class="station-logo-text">1</span>
          </div>
        </div>
      </section>
      <section class="track-info">
        <h2 id="station-name"></h2>
        <p id="station-subtitle"></p>
      </section>

      <!-- Visual Progress Bar (Live) -->
      <div class="progress-container">
        <div class="progress-bar">
          <div class="progress-fill"></div>
          <div class="progress-handle"></div>
        </div>
      </div>

      <section class="controls-section">
        <button id="prev-btn" class="control-btn secondary" aria-label="Previous Station">
          <svg width="24" height="24" viewBox="0 0 24 24" fill="currentColor">
            <path d="M6 6h2v12H6zm3.5 6l8.5 6V6z" />
          </svg>
        </button>

        <button id="play-btn" class="control-btn primary" aria-label="Play">
          <div class="icon-container">
            <!-- Play Icon -->
            <svg id="icon-play" width="32" height="32" viewBox="0 0 24 24" fill="currentColor">
              <path d="M8 5v14l11-7z" />
            </svg>
            <!-- Stop/Pause Icon (Hidden by default) -->
            <svg id="icon-stop" class="hidden" width="32" height="32" viewBox="0 0 24 24" fill="currentColor">
              <path d="M6 6h12v12H6z" />
            </svg>
          </div>
        </button>

        <button id="next-btn" class="control-btn secondary" aria-label="Next Station">
          <svg width="24" height="24" viewBox="0 0 24 24" fill="currentColor">
            <path d="M6 18l8.5-6L6 6v12zM16 6v12h2V6h-2z" />
          </svg>
        </button>
      </section>

      <section class="volume-section">
        <button id="mute-btn" class="icon-btn small">
          <svg width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
            <polygon points="11 5 6 9 2 9 2 15 6 15 11 19 11 5"></polygon>
            <path d="M19.07 4.93a10 10 0 0 1 0 14.14M15.54 8.46a5 5 0 0 1 0 7.07"></path>
          </svg>
        </button>
        <div class="slider-container">
          <input type="range" id="volume-slider" min="0" max="100" value="50">
        </div>
        <span id="volume-value">50%</span>
      </section>

      <!-- Hidden Cast Overlay (Beautified) -->
      <div id="cast-overlay" class="overlay hidden" aria-hidden="true" data-tauri-drag-region>
        <div class="modal" role="dialog" aria-modal="true" aria-labelledby="deviceTitle">
          <h2 id="deviceTitle">Choose</h2>

          <ul id="device-list" class="device-list">
            <!-- Render device items here -->
            <li class="device">
              <div class="device-main">Scanning...</div>
              <div class="device-sub">Searching for speakers</div>
            </li>
          </ul>

          <button id="close-overlay" class="btn cancel" type="button">Cancel</button>
        </div>
      </div>

    </main>
  </div>
</body>

</html>
@@ -1,355 +0,0 @@
const { invoke } = window.__TAURI__.core;
const { getCurrentWindow } = window.__TAURI__.window;

// State
let stations = [];
let currentIndex = 0;
let isPlaying = false;
let currentMode = 'local'; // 'local' | 'cast'
let currentCastDevice = null;
const audio = new Audio();

// UI Elements
const stationNameEl = document.getElementById('station-name');
const stationSubtitleEl = document.getElementById('station-subtitle');
const statusTextEl = document.getElementById('status-text');
const statusDotEl = document.querySelector('.status-dot');
const playBtn = document.getElementById('play-btn');
const iconPlay = document.getElementById('icon-play');
const iconStop = document.getElementById('icon-stop');
const prevBtn = document.getElementById('prev-btn');
const nextBtn = document.getElementById('next-btn');
const volumeSlider = document.getElementById('volume-slider');
const volumeValue = document.getElementById('volume-value');
const castBtn = document.getElementById('cast-toggle-btn');
const castOverlay = document.getElementById('cast-overlay');
const closeOverlayBtn = document.getElementById('close-overlay');
const deviceListEl = document.getElementById('device-list');
const logoTextEl = document.querySelector('.station-logo-text');
const logoImgEl = document.getElementById('station-logo-img');

// Init
async function init() {
  await loadStations();
  setupEventListeners();
  updateUI();
}

async function loadStations() {
  try {
    const resp = await fetch('stations.json');
    const raw = await resp.json();

    // Normalize station objects so the rest of the app can rely on `name` and `url`.
    stations = raw
      .map((s) => {
        // If already in the old format, keep as-is
        if (s.name && s.url) return s;

        const name = s.title || s.id || s.name || 'Unknown';
        // Prefer liveAudio, fall back to liveVideo or any common fields
        const url = s.liveAudio || s.liveVideo || s.liveStream || s.url || '';

        return {
          id: s.id || name,
          name,
          url,
          logo: s.logo || s.poster || '',
          enabled: typeof s.enabled === 'boolean' ? s.enabled : true,
          raw: s,
        };
      })
      // Filter out disabled stations and those without a stream URL
      .filter((s) => s.enabled !== false && s.url && s.url.length > 0);

    if (stations.length > 0) {
      currentIndex = 0;
      loadStation(currentIndex);
    }
  } catch (e) {
    console.error('Failed to load stations', e);
    statusTextEl.textContent = 'Error loading stations';
  }
}
function setupEventListeners() {
  playBtn.addEventListener('click', togglePlay);
  prevBtn.addEventListener('click', playPrev);
  nextBtn.addEventListener('click', playNext);

  volumeSlider.addEventListener('input', handleVolumeInput);

  castBtn.addEventListener('click', openCastOverlay);
  closeOverlayBtn.addEventListener('click', closeCastOverlay);

  // Close overlay on background click
  castOverlay.addEventListener('click', (e) => {
    if (e.target === castOverlay) closeCastOverlay();
  });

  // Close button
  document.getElementById('close-btn').addEventListener('click', async () => {
    const appWindow = getCurrentWindow();
    await appWindow.close();
  });

  // Menu button - explicit functionality or placeholder?
  // For now just log or maybe show about
  document.getElementById('menu-btn').addEventListener('click', () => {
    openStationsOverlay();
  });

  // Hotkeys?
}

function loadStation(index) {
  if (index < 0 || index >= stations.length) return;
  const station = stations[index];

  stationNameEl.textContent = station.name;
  stationSubtitleEl.textContent = currentMode === 'cast' ? `Casting to ${currentCastDevice}` : 'Live Stream';

  // Update Logo Text (First letter or number)
  // Simple heuristic: if name has a number, use it, else first letter
  // If station has a logo URL, show the image; otherwise show the text fallback
  if (station.logo && station.logo.length > 0) {
    logoImgEl.src = station.logo;
    logoImgEl.classList.remove('hidden');
    logoTextEl.classList.add('hidden');
  } else {
    // Fallback to single-letter/logo text
    logoImgEl.src = '';
    logoImgEl.classList.add('hidden');
    const numberMatch = station.name.match(/\d+/);
    if (numberMatch) {
      logoTextEl.textContent = numberMatch[0];
    } else {
      logoTextEl.textContent = station.name.charAt(0).toUpperCase();
    }
    logoTextEl.classList.remove('hidden');
  }
}

async function togglePlay() {
  if (isPlaying) {
    await stop();
  } else {
    await play();
  }
}

async function play() {
  const station = stations[currentIndex];
  if (!station) return;

  statusTextEl.textContent = 'Buffering...';
  statusDotEl.style.backgroundColor = 'var(--text-muted)'; // Grey/Yellow while loading

  if (currentMode === 'local') {
    audio.src = station.url;
    audio.volume = volumeSlider.value / 100;
    try {
      await audio.play();
      isPlaying = true;
      updateUI();
    } catch (e) {
      console.error('Playback failed', e);
      statusTextEl.textContent = 'Error';
    }
  } else if (currentMode === 'cast' && currentCastDevice) {
    // Cast logic
    try {
      await invoke('cast_play', { deviceName: currentCastDevice, url: station.url });
      isPlaying = true;
      // Sync volume
      const vol = volumeSlider.value / 100;
      invoke('cast_set_volume', { deviceName: currentCastDevice, volume: vol });
      updateUI();
    } catch (e) {
      console.error('Cast failed', e);
      statusTextEl.textContent = 'Cast Error';
      currentMode = 'local'; // Fallback
      updateUI();
    }
  }
}
async function stop() {
  if (currentMode === 'local') {
    audio.pause();
    audio.src = '';
  } else if (currentMode === 'cast' && currentCastDevice) {
    try {
      await invoke('cast_stop', { deviceName: currentCastDevice });
    } catch (e) {
      console.error(e);
    }
  }

  isPlaying = false;
  updateUI();
}

async function playNext() {
  if (stations.length === 0) return;

  // If playing, stop first? Or seamless?
  // For radio, seamless switch requires stop then play new URL
  const wasPlaying = isPlaying;

  if (wasPlaying) await stop();

  currentIndex = (currentIndex + 1) % stations.length;
  loadStation(currentIndex);

  if (wasPlaying) await play();
}

async function playPrev() {
  if (stations.length === 0) return;

  const wasPlaying = isPlaying;

  if (wasPlaying) await stop();

  currentIndex = (currentIndex - 1 + stations.length) % stations.length;
  loadStation(currentIndex);

  if (wasPlaying) await play();
}

function updateUI() {
  // Play/Stop Button
  if (isPlaying) {
    iconPlay.classList.add('hidden');
    iconStop.classList.remove('hidden');
    playBtn.classList.add('playing'); // Add pulsing ring animation
    statusTextEl.textContent = 'Playing';
    statusDotEl.style.backgroundColor = 'var(--success)';
    stationSubtitleEl.textContent = currentMode === 'cast' ? `Casting to ${currentCastDevice}` : 'Live Stream';
  } else {
    iconPlay.classList.remove('hidden');
    iconStop.classList.add('hidden');
    playBtn.classList.remove('playing'); // Remove pulsing ring
    statusTextEl.textContent = 'Ready';
    statusDotEl.style.backgroundColor = 'var(--text-muted)';
    stationSubtitleEl.textContent = currentMode === 'cast' ? `Connected to ${currentCastDevice}` : 'Live Stream';
  }
}

function handleVolumeInput() {
  const val = volumeSlider.value;
  volumeValue.textContent = `${val}%`;
  const decimals = val / 100;

  if (currentMode === 'local') {
    audio.volume = decimals;
  } else if (currentMode === 'cast' && currentCastDevice) {
    invoke('cast_set_volume', { deviceName: currentCastDevice, volume: decimals });
  }
}

// Cast Logic
async function openCastOverlay() {
  castOverlay.classList.remove('hidden');
  castOverlay.setAttribute('aria-hidden', 'false');
  deviceListEl.innerHTML = '<li class="device"><div class="device-main">Scanning...</div><div class="device-sub">Searching for speakers</div></li>';

  try {
    const devices = await invoke('list_cast_devices');
    deviceListEl.innerHTML = '';

    // Add "This Computer" option
    const localLi = document.createElement('li');
    localLi.className = 'device' + (currentMode === 'local' ? ' selected' : '');
    localLi.innerHTML = '<div class="device-main">This Computer</div><div class="device-sub">Local Playback</div>';
    localLi.onclick = () => selectCastDevice(null);
    deviceListEl.appendChild(localLi);

    if (devices.length > 0) {
      devices.forEach(d => {
        const li = document.createElement('li');
        li.className = 'device' + (currentMode === 'cast' && currentCastDevice === d ? ' selected' : '');
        li.innerHTML = `<div class="device-main">${d}</div><div class="device-sub">Google Cast Speaker</div>`;
        li.onclick = () => selectCastDevice(d);
        deviceListEl.appendChild(li);
      });
    }
  } catch (e) {
    deviceListEl.innerHTML = `<li class="device"><div class="device-main">Error</div><div class="device-sub">${e}</div></li>`;
  }
}

function closeCastOverlay() {
  castOverlay.classList.add('hidden');
  castOverlay.setAttribute('aria-hidden', 'true');
}

async function selectCastDevice(deviceName) {
  closeCastOverlay();

  // If checking same device, do nothing
  if (deviceName === currentCastDevice) return;

  // If switching mode, stop current playback
  if (isPlaying) {
    await stop();
  }

  if (deviceName) {
    currentMode = 'cast';
    currentCastDevice = deviceName;
    castBtn.style.color = 'var(--success)';
  } else {
    currentMode = 'local';
    currentCastDevice = null;
    castBtn.style.color = 'var(--text-main)';
  }

  updateUI();

  // Auto-play if we were playing? Let's stay stopped to be safe/explicit
  // Or auto-play for better UX?
  // Let's prompt user to play.
}

window.addEventListener('DOMContentLoaded', init);

// Open overlay and show list of stations (used by menu/hamburger)
function openStationsOverlay() {
  castOverlay.classList.remove('hidden');
  castOverlay.setAttribute('aria-hidden', 'false');
  deviceListEl.innerHTML = '<li class="device"><div class="device-main">Loading...</div><div class="device-sub">Preparing stations</div></li>';

  // If stations not loaded yet, show message
  if (!stations || stations.length === 0) {
    deviceListEl.innerHTML = '<li class="device"><div class="device-main">No stations found</div><div class="device-sub">Check your stations.json</div></li>';
    return;
  }

  deviceListEl.innerHTML = '';

  stations.forEach((s, idx) => {
    const li = document.createElement('li');
    li.className = 'device' + (currentIndex === idx ? ' selected' : '');
    const subtitle = (s.raw && s.raw.www) ? s.raw.www : (s.id || '');
    li.innerHTML = `<div class="device-main">${s.name}</div><div class="device-sub">${subtitle}</div>`;
    li.onclick = async () => {
      // Always switch to local playback when selecting from stations menu
      currentMode = 'local';
      currentCastDevice = null;
      castBtn.style.color = 'var(--text-main)';

      // Select and play
      currentIndex = idx;
      loadStation(currentIndex);
      closeCastOverlay();
      try {
        await play();
      } catch (e) {
        console.error('Failed to play station from menu', e);
      }
    };
    deviceListEl.appendChild(li);
  });
}
@@ -1,629 +0,0 @@
:root {
  --bg-gradient: linear-gradient(135deg, #7b7fd8, #b57cf2);
  --glass-bg: rgba(255, 255, 255, 0.1);
  --glass-border: rgba(255, 255, 255, 0.2);
  --accent: #dfa6ff;
  --accent-glow: rgba(223, 166, 255, 0.5);
  --text-main: #ffffff;
  --text-muted: rgba(255, 255, 255, 0.7);
  --danger: #cf6679;
  --success: #7dffb3;
  --card-radius: 10px;
}

* {
  box-sizing: border-box;
  user-select: none;
  -webkit-user-drag: none;
  cursor: default;
}

/* Show pointer cursor for interactive / clickable elements (override global default) */
a, a[href], button, input[type="button"], input[type="submit"],
[role="button"], [onclick], .clickable, .icon-btn, .control-btn, label[for],
.station-item, [tabindex]:not([tabindex="-1"]) {
  cursor: pointer !important;
}

/* Hide Scrollbars */
::-webkit-scrollbar {
  display: none;
}

body {
  margin: 0;
  padding: 0;
  height: 100vh;
  width: 100vw;
  background: linear-gradient(-45deg, #7b7fd8, #b57cf2, #8b5cf6, #6930c3, #7b7fd8);
  background-size: 400% 400%;
  animation: gradientShift 12s ease-in-out infinite;
  font-family: 'Segoe UI', system-ui, sans-serif;
  color: var(--text-main);
  overflow: hidden;
  display: flex;
  justify-content: center;
  align-items: center;
}

@keyframes gradientShift {
  0% {
    background-position: 0% 50%;
  }
  25% {
    background-position: 100% 50%;
  }
  50% {
    background-position: 50% 100%;
  }
  75% {
    background-position: 100% 50%;
  }
  100% {
    background-position: 0% 50%;
  }
}

/* Background Blobs */
.bg-shape {
  position: absolute;
  border-radius: 50%;
  filter: blur(60px);
  z-index: 0;
  opacity: 0.6;
  animation: float 10s infinite alternate;
}

.shape-1 {
  width: 300px;
  height: 300px;
  background: #5e60ce;
  top: -50px;
  left: -50px;
}

.shape-2 {
  width: 250px;
  height: 250px;
  background: #ff6bf0;
  bottom: -50px;
  right: -50px;
  animation-delay: -5s;
}

@keyframes float {
  0% { transform: translate(0, 0); }
  100% { transform: translate(30px, 30px); }
}

.app-container {
  width: 100%;
  height: 100%;
  position: relative;
  padding: 10px; /* Slight padding from window edges if desired, or 0 */
}

.glass-card {
  position: relative;
  z-index: 1;
  width: 100%;
  height: 100%;
  background: var(--glass-bg);
  border: 1px solid var(--glass-border);
  backdrop-filter: blur(24px);
  border-radius: var(--card-radius);
  display: flex;
  flex-direction: column;
  padding: 24px;
  box-shadow: 0 16px 40px rgba(0, 0, 0, 0.2);
}
/* Header */
header {
  display: flex;
  justify-content: space-between;
  align-items: center;
  margin-bottom: 20px;
  -webkit-app-region: drag; /* Draggable area */
}

.header-info {
  text-align: center;
  flex: 1;
  display: flex;
  flex-direction: column;
  align-items: center;
}

.app-title {
  font-weight: 600;
  font-size: 1rem;
  color: var(--text-main);
}

.status-indicator {
  font-size: 0.8rem;
  color: var(--success);
  margin-top: 4px;
  display: flex;
  align-items: center;
  gap: 6px;
}

.status-dot {
  width: 6px;
  height: 6px;
  background-color: var(--success);
  border-radius: 50%;
  box-shadow: 0 0 8px var(--success);
}

.icon-btn {
  background: none;
  border: none;
  color: var(--text-main);
  padding: 8px;
  cursor: pointer;
  border-radius: 50%;
  display: flex;
  align-items: center;
  justify-content: center;
  transition: background 0.2s;
  -webkit-app-region: no-drag; /* Buttons clickable */
}

.icon-btn:hover {
  background: rgba(255, 255, 255, 0.1);
}

.header-buttons {
  display: flex;
  gap: 4px;
  align-items: center;
  -webkit-app-region: no-drag;
}

.close-btn:hover {
  background: rgba(207, 102, 121, 0.3) !important;
  color: var(--danger);
}

/* Artwork */
.artwork-section {
  flex: 1;
  display: flex;
  justify-content: center;
  align-items: center;
  margin-bottom: 20px;
}

.artwork-container {
  width: 220px;
  height: 220px;
  border-radius: 24px;
  padding: 6px; /* spacing for ring */
  background: linear-gradient(135deg, rgba(255,255,255,0.1), rgba(255,255,255,0));
  box-shadow: 5px 5px 15px rgba(0,0,0,0.1), inset 1px 1px 2px rgba(255,255,255,0.3);
}

.artwork-placeholder {
  width: 100%;
  height: 100%;
  background: linear-gradient(135deg, #4ea8de, #6930c3);
  border-radius: 20px;
  display: flex;
  justify-content: center;
  align-items: center;
  position: relative;
  overflow: hidden;
  box-shadow: inset 0 0 20px rgba(0,0,0,0.2);
}

.artwork-placeholder {
  width: 100%;
  height: 100%;
  background: linear-gradient(135deg, #4ea8de, #6930c3);
  border-radius: 20px;
  display: flex;
  justify-content: center;
  align-items: center;
  position: relative;
  overflow: hidden;
  box-shadow: inset 0 0 20px rgba(0,0,0,0.2);
}
.station-logo-text {
|
||||
font-size: 5rem;
|
||||
font-weight: 800;
|
||||
font-style: italic;
|
||||
color: rgba(255,255,255,0.9);
|
||||
text-shadow: 0 4px 10px rgba(0,0,0,0.3);
|
||||
position: relative;
|
||||
z-index: 3;
|
||||
}
|
||||
|
||||
.station-logo-img {
|
||||
/* Fill the artwork placeholder while keeping aspect ratio and inner padding */
|
||||
width: 100%;
|
||||
height: 100%;
|
||||
object-fit: contain;
|
||||
display: block;
|
||||
padding: 12px; /* inner spacing from rounded edges */
|
||||
box-sizing: border-box;
|
||||
border-radius: 12px;
|
||||
box-shadow: 0 8px 20px rgba(0,0,0,0.35);
|
||||
position: relative;
|
||||
z-index: 3;
|
||||
}
|
||||
|
||||
/* Logo blobs container sits behind logo but inside artwork placeholder */
|
||||
.logo-blobs {
|
||||
position: absolute;
|
||||
inset: 0;
|
||||
filter: url(#goo);
|
||||
z-index: 1;
|
||||
pointer-events: none;
|
||||
}
|
||||
|
||||
/* Make artwork/logo clickable: show pointer cursor */
|
||||
.artwork-placeholder,
|
||||
.artwork-placeholder:hover,
|
||||
.station-logo-img,
|
||||
.station-logo-text {
|
||||
cursor: pointer !important;
|
||||
pointer-events: auto;
|
||||
}
|
||||
|
||||
/* Subtle hover affordance to make clickability clearer */
|
||||
.artwork-placeholder:hover .station-logo-img,
|
||||
.artwork-placeholder:hover .station-logo-text {
|
||||
transform: scale(1.03);
|
||||
transition: transform 160ms ease;
|
||||
}
|
||||
|
||||
.blob {
|
  position: absolute;
  border-radius: 50%;
  /* more transparent overall */
  opacity: 0.18;
  /* slightly smaller blur for subtle definition */
  filter: blur(6px);
}

.b1 { width: 110px; height: 110px; left: 8%; top: 20%; background: radial-gradient(circle at 30% 30%, #c77dff, #8b5cf6); animation: float1 6s ease-in-out infinite; }
.b2 { width: 85px; height: 85px; right: 6%; top: 10%; background: radial-gradient(circle at 30% 30%, #7bffd1, #7dffb3); animation: float2 5.5s ease-in-out infinite; }
.b3 { width: 95px; height: 95px; left: 20%; bottom: 12%; background: radial-gradient(circle at 20% 20%, #ffd07a, #ff6bf0); animation: float3 7s ease-in-out infinite; }
.b4 { width: 70px; height: 70px; right: 24%; bottom: 18%; background: radial-gradient(circle at 30% 30%, #6bd3ff, #4ea8de); animation: float4 6.5s ease-in-out infinite; }
.b5 { width: 50px; height: 50px; left: 46%; top: 36%; background: radial-gradient(circle at 40% 40%, #ffa6d6, #c77dff); animation: float5 8s ease-in-out infinite; }

/* Additional blobs */
.b6 { width: 75px; height: 75px; left: 12%; top: 48%; background: radial-gradient(circle at 30% 30%, #bde7ff, #6bd3ff); animation: float6 6.8s ease-in-out infinite; }
.b7 { width: 42px; height: 42px; right: 10%; top: 42%; background: radial-gradient(circle at 40% 40%, #ffd9b3, #ffd07a); animation: float7 7.2s ease-in-out infinite; }
.b8 { width: 70px; height: 70px; left: 34%; bottom: 8%; background: radial-gradient(circle at 30% 30%, #e3b6ff, #c77dff); animation: float8 6.4s ease-in-out infinite; }
.b9 { width: 36px; height: 36px; right: 34%; bottom: 6%; background: radial-gradient(circle at 30% 30%, #9ef7d3, #7bffd1); animation: float9 8.4s ease-in-out infinite; }
.b10 { width: 30px; height: 30px; left: 52%; bottom: 28%; background: radial-gradient(circle at 30% 30%, #ffd0f0, #ffa6d6); animation: float10 5.8s ease-in-out infinite; }
@keyframes float1 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(12px) translateX(8px) scale(1.06); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float2 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(-10px) translateX(-6px) scale(1.04); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float3 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(8px) translateX(-10px) scale(1.05); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float4 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(-6px) translateX(10px) scale(1.03); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float5 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(-12px) translateX(4px) scale(1.07); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float6 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(-8px) translateX(6px) scale(1.05); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float7 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(10px) translateX(-6px) scale(1.04); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float8 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(-6px) translateX(10px) scale(1.03); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float9 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(12px) translateX(-4px) scale(1.06); } 100% { transform: translateY(0) translateX(0) scale(1); } }
@keyframes float10 { 0% { transform: translateY(0) translateX(0) scale(1); } 50% { transform: translateY(-10px) translateX(2px) scale(1.04); } 100% { transform: translateY(0) translateX(0) scale(1); } }

/* Slightly darken backdrop gradient so blobs read better */
.artwork-placeholder::before {
  content: '';
  position: absolute;
  inset: 0;
  background: linear-gradient(180deg, rgba(0,0,0,0.06), rgba(0,0,0,0.12));
  z-index: 0;
}

/* Track Info */
.track-info {
  text-align: center;
  margin-bottom: 20px;
}

.track-info h2 {
  margin: 0;
  font-size: 1.5rem;
  font-weight: 600;
  text-shadow: 0 2px 4px rgba(0,0,0,0.2);
}

.track-info p {
  margin: 6px 0 0;
  color: var(--text-muted);
  font-size: 0.95rem;
}

/* Progress Bar (Visual) */
.progress-container {
  width: 100%;
  height: 4px;
  background: rgba(255,255,255,0.1);
  border-radius: 2px;
  margin-bottom: 30px;
  position: relative;
}

.progress-fill {
  width: 100%; /* Live always full or pulsing */
  height: 100%;
  background: linear-gradient(90deg, var(--accent), #fff);
  border-radius: 2px;
  opacity: 0.8;
  box-shadow: 0 0 10px var(--accent-glow);
}

.progress-handle {
  position: absolute;
  right: 0;
  top: 50%;
  transform: translate(50%, -50%);
  width: 12px;
  height: 12px;
  background: #fff;
  border-radius: 50%;
  box-shadow: 0 0 10px rgba(255,255,255,0.8);
}

/* Controls */
.controls-section {
  display: flex;
  justify-content: center;
  align-items: center;
  gap: 30px;
  margin-bottom: 30px;
}

.control-btn {
  background: none;
  border: none;
  color: var(--text-main);
  cursor: pointer;
  transition: transform 0.1s, opacity 0.2s;
  display: flex;
  align-items: center;
  justify-content: center;
}

.control-btn:active {
  transform: scale(0.9);
}

.control-btn.secondary {
  width: 48px;
  height: 48px;
  border-radius: 50%;
  background: rgba(255,255,255,0.05);
  border: 1px solid rgba(255,255,255,0.1);
  box-shadow: 0 4px 10px rgba(0,0,0,0.1);
}

.control-btn.primary {
  width: 72px;
  height: 72px;
  border-radius: 50%;
  background: linear-gradient(135deg, rgba(255,255,255,0.2), rgba(255,255,255,0.05));
  border: 1px solid rgba(255,255,255,0.3);
  box-shadow: 0 8px 20px rgba(0,0,0,0.2), inset 0 0 10px rgba(255,255,255,0.1);
  color: #fff;
}

.control-btn.primary svg {
  filter: drop-shadow(0 0 5px var(--accent-glow));
}

/* Playing state - pulsing glow ring */
.control-btn.primary.playing {
  animation: pulse-ring 2s ease-in-out infinite;
}

@keyframes pulse-ring {
  0%, 100% {
    box-shadow: 0 8px 20px rgba(0,0,0,0.2),
                inset 0 0 10px rgba(255,255,255,0.1),
                0 0 0 0 rgba(223, 166, 255, 0.7);
  }
  50% {
    box-shadow: 0 8px 20px rgba(0,0,0,0.2),
                inset 0 0 10px rgba(255,255,255,0.1),
                0 0 0 8px rgba(223, 166, 255, 0);
  }
}

/* Icon container prevents layout jump */
.icon-container {
  position: relative;
  width: 32px;
  height: 32px;
  display: flex;
  align-items: center;
  justify-content: center;
}

.icon-container svg {
  position: absolute;
  top: 50%;
  left: 50%;
  transform: translate(-50%, -50%);
}

.hidden {
  display: none !important;
}

/* Volume */
.volume-section {
  display: flex;
  align-items: center;
  gap: 12px;
  margin-top: auto;
  padding: 0 10px;
}

.slider-container {
  flex: 1;
}

input[type=range] {
  width: 100%;
  background: transparent;
  -webkit-appearance: none;
  appearance: none;
}

input[type=range]::-webkit-slider-runnable-track {
  width: 100%;
  height: 4px;
  cursor: pointer;
  background: rgba(255,255,255,0.2);
  border-radius: 2px;
}

input[type=range]::-webkit-slider-thumb {
  height: 16px;
  width: 16px;
  border-radius: 50%;
  background: #ffffff;
  cursor: pointer;
  -webkit-appearance: none;
  margin-top: -6px; /* align with track */
  box-shadow: 0 0 10px rgba(0,0,0,0.2);
}

#volume-value {
  font-size: 0.8rem;
  font-weight: 500;
  width: 30px;
  text-align: right;
}

.icon-btn.small {
  padding: 0;
  width: 24px;
  height: 24px;
}

/* Cast Overlay (Beautified as per layout2_plan.md) */
.overlay {
  position: fixed;
  inset: 0;
  background: rgba(20, 10, 35, 0.45);
  backdrop-filter: blur(14px);
  display: flex;
  align-items: center;
  justify-content: center;
  z-index: 1000;
  opacity: 0;
  pointer-events: none;
  transition: opacity 0.3s;
}

.overlay:not(.hidden) {
  opacity: 1;
  pointer-events: auto;
}

/* Modal */
.modal {
  width: min(420px, calc(100vw - 48px));
  padding: 22px;
  border-radius: 22px;
  background: rgba(30, 30, 40, 0.82);
  border: 1px solid rgba(255,255,255,0.12);
  box-shadow: 0 30px 80px rgba(0,0,0,0.6);
  color: #fff;
  animation: pop 0.22s ease;
  -webkit-app-region: no-drag;
}

@keyframes pop {
  from { transform: scale(0.94); opacity: 0; }
  to { transform: scale(1); opacity: 1; }
}

.modal h2 {
  margin: 0 0 14px;
  text-align: center;
  font-size: 20px;
}

/* Device list */
.device-list {
  list-style: none;
  padding: 10px 5px;
  margin: 0 0 18px;
  max-height: 360px;
  overflow-y: auto;
}

/* Device row */
.device {
  padding: 12px 14px;
  border-radius: 14px;
  margin-bottom: 8px;
  cursor: pointer;
  background: rgba(255,255,255,0.05);
  transition: transform 0.15s ease, background 0.15s ease, box-shadow 0.15s ease;
  text-align: left;
}

.device:hover {
  background: rgba(255,255,255,0.10);
  transform: translateY(-1px);
}

.device .device-main {
  font-size: 15px;
  font-weight: 600;
  color: var(--text-main);
}

.device .device-sub {
  margin-top: 3px;
  font-size: 12px;
  opacity: 0.7;
  color: var(--text-muted);
}

/* Selected device */
.device.selected {
  background: linear-gradient(135deg, #c77dff, #8b5cf6);
  box-shadow: 0 0 18px rgba(199,125,255,0.65);
  color: #111;
}

.device.selected .device-main,
.device.selected .device-sub {
  color: #111;
}

.device.selected .device-sub {
  opacity: 0.85;
}

/* Cancel button */
.btn.cancel {
  width: 100%;
  padding: 12px;
  border-radius: 999px;
  border: none;
  background: #d16b7d;
  color: #fff;
  font-size: 15px;
  cursor: pointer;
  transition: transform 0.15s ease, background 0.2s;
  font-weight: 600;
}

.btn.cancel:hover {
  transform: scale(1.02);
  background: #e17c8d;
}
cast-receiver/README.md (new file, 18 lines)
@@ -0,0 +1,18 @@
# Radio Player - Custom Cast Receiver

This folder contains a minimal Google Cast Web Receiver that displays a purple gradient background, station artwork, a title, and a subtitle. It accepts `customData` hints sent from the sender (your app) for `backgroundImage`, `backgroundGradient`, and `appName`.

Hosting requirements
- The receiver must be served over HTTPS and be publicly accessible.
- Recommended: host under GitHub Pages (`gh-pages` branch or `/docs` folder) or any static host (Netlify, Vercel, S3 + CloudFront).

Registering with Google Cast Console
1. Go to the Cast SDK Developer Console and create a new Application.
2. Choose "Custom Receiver" and provide the public HTTPS URL to `index.html` (e.g. `https://example.com/cast-receiver/index.html`).
3. Note the generated Application ID.

Sender changes
- After obtaining the Application ID, update your sender (sidecar) to launch that app ID instead of the DefaultMediaReceiver. The sidecar already supports passing `metadata.appId` when launching.

Testing locally
- You can serve this folder locally during development, but Chromecast devices require a public HTTPS endpoint and a registered app.
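The sender-side contract the README describes can be sketched as follows. This is a hypothetical helper, not part of the sidecar; `APP_ID`, `buildCastPayload`, and the exact message shape are illustrative assumptions, with the placeholder ID standing in for the one issued by the Cast Console:

```javascript
// Hypothetical sketch of the payload a sender could hand to the sidecar.
// APP_ID is a placeholder for the Application ID from the Cast Console.
const APP_ID = 'ABCD1234';

function buildCastPayload(station) {
  return {
    media: {
      contentId: station.streamUrl,
      contentType: 'audio/mpeg',
      streamType: 'LIVE',
      // The sidecar reads metadata.appId to launch the custom receiver
      // instead of the DefaultMediaReceiver.
      metadata: { appId: APP_ID, title: station.name, images: [{ url: station.logoUrl }] },
      // customData keys the receiver understands (see receiver.js):
      customData: {
        appName: 'Taurus RadioPlayer',
        backgroundGradient: 'linear-gradient(135deg, #6a0dad, #b36cf3)',
      },
    },
  };
}

const payload = buildCastPayload({
  name: 'Radio 1',
  streamUrl: 'https://example.com/stream',
  logoUrl: 'https://example.com/logo.png',
});
console.log(payload.media.metadata.appId); // ABCD1234
```

Whatever shape the real sidecar message takes, the key point is that `customData` travels inside `media`, which is where the receiver's LOAD interceptor looks for it.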
cast-receiver/index.html (new file, 23 lines)
@@ -0,0 +1,23 @@
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <meta name="viewport" content="width=device-width,initial-scale=1" />
  <title>Radio Player Receiver</title>
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <div id="bg" class="bg"></div>
  <div id="app" class="app">
    <div class="artwork"><img id="art" alt="Artwork"></div>
    <div class="meta">
      <div id="appName" class="app-name">Radio Player</div>
      <h1 id="title">Radio Player</h1>
      <h2 id="subtitle"></h2>
    </div>
  </div>

  <script src="https://www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
  <script src="receiver.js"></script>
</body>
</html>
cast-receiver/receiver.js (new file, 50 lines)
@@ -0,0 +1,50 @@
// Minimal CAF receiver that applies customData theming and shows media metadata.
const context = cast.framework.CastReceiverContext.getInstance();
const playerManager = context.getPlayerManager();

function applyBranding(customData, metadata) {
  try {
    const bgEl = document.getElementById('bg');
    const art = document.getElementById('art');
    const title = document.getElementById('title');
    const subtitle = document.getElementById('subtitle');
    const appName = document.getElementById('appName');

    if (customData) {
      if (customData.backgroundImage) {
        bgEl.style.backgroundImage = `url(${customData.backgroundImage})`;
        bgEl.style.backgroundSize = 'cover';
        bgEl.style.backgroundPosition = 'center';
      } else if (customData.backgroundGradient) {
        bgEl.style.background = customData.backgroundGradient;
      }
      if (customData.appName) appName.textContent = customData.appName;
    }

    if (metadata) {
      if (metadata.title) title.textContent = metadata.title;
      const sub = metadata.subtitle || metadata.artist || '';
      subtitle.textContent = sub;
      if (metadata.images && metadata.images.length) {
        art.src = metadata.images[0].url || '';
      }
    }
  } catch (e) {
    // swallow UI errors
    console.warn('Branding apply failed', e);
  }
}

playerManager.setMessageInterceptor(cast.framework.messages.MessageType.LOAD, (request) => {
  const media = request.media || {};
  const customData = media.customData || {};
  applyBranding(customData, media.metadata || {});
  return request;
});

playerManager.addEventListener(cast.framework.events.EventType.MEDIA_STATUS, () => {
  const media = playerManager.getMediaInformation();
  if (media) applyBranding(media.customData || {}, media.metadata || {});
});

context.start();
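The branding precedence in `applyBranding` above (an explicit `backgroundImage` wins over `backgroundGradient`, and neither falls back to the stylesheet default) can be isolated as a small pure function. `pickBackground` is a hypothetical helper for illustration and testing, not part of the receiver file:

```javascript
// Hypothetical pure helper mirroring applyBranding's background precedence,
// so the rule can be unit-tested without a DOM.
function pickBackground(customData = {}) {
  if (customData.backgroundImage) {
    // An image always wins over a gradient.
    return { backgroundImage: `url(${customData.backgroundImage})` };
  }
  if (customData.backgroundGradient) {
    return { background: customData.backgroundGradient };
  }
  // Nothing supplied: keep the stylesheet's default purple gradient.
  return {};
}

console.log(pickBackground({ backgroundImage: 'https://example.com/a.png' }).backgroundImage);
// url(https://example.com/a.png)
```

Factoring the rule out like this would also make it easy to reuse on the sender side when previewing the cast screen locally.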
cast-receiver/styles.css (new file, 11 lines)
@@ -0,0 +1,11 @@
:root{--primary:#6a0dad;--accent:#b36cf3}
html,body{height:100%;margin:0;font-family:Inter,system-ui,Arial,Helvetica,sans-serif}
body{background:linear-gradient(135deg,var(--primary),var(--accent));color:#fff}
.bg{position:fixed;inset:0;background-size:cover;background-position:center;filter:blur(10px) saturate(120%);opacity:0.9}
.app{position:relative;z-index:2;display:flex;align-items:center;gap:24px;padding:48px}
.artwork{width:320px;height:320px;flex:0 0 320px;background:rgba(255,255,255,0.06);display:flex;align-items:center;justify-content:center;border-radius:8px;overflow:hidden}
.artwork img{width:100%;height:100%;object-fit:cover}
.meta{display:flex;flex-direction:column}
.app-name{font-weight:600;opacity:0.9}
h1{margin:6px 0 0 0;font-size:28px}
h2{margin:6px 0 0 0;font-size:18px;opacity:0.9}
package-lock.json (generated, 1548 lines changed; diff collapsed)
package.json (15 lines changed)
@@ -1,15 +1,22 @@
 {
   "name": "radio-tauri",
   "private": true,
-  "version": "0.1.0",
+  "version": "0.1.1",
   "type": "module",
   "scripts": {
-    "dev": "tauri dev",
-    "build": "node tools/copy-binaries.js && tauri build && node tools/post-build-rcedit.js",
-    "tauri": "node tools/copy-binaries.js && tauri"
+    "build:sidecar": "npm --prefix sidecar install && npm --prefix sidecar run build",
+    "dev": "npm run build:sidecar && node tools/copy-binaries.js && node tools/copy-ffmpeg.js && tauri dev",
+    "dev:native": "node tools/copy-binaries.js && node tools/copy-ffmpeg.js && tauri dev",
+    "ffmpeg:download": "powershell -NoProfile -ExecutionPolicy Bypass -File scripts/download-ffmpeg.ps1",
+    "version:sync": "node tools/sync-version.js",
+    "build": "node tools/sync-version.js && node tools/copy-binaries.js && node tools/copy-ffmpeg.js && node tools/write-build-flag.js set && tauri build && node tools/post-build-rcedit.js && node tools/write-build-flag.js clear",
+    "build:devlike": "node tools/sync-version.js && node tools/copy-binaries.js && node tools/copy-ffmpeg.js && node tools/write-build-flag.js set --debug && cross-env RADIO_DEBUG_DEVTOOLS=1 tauri build && node tools/post-build-rcedit.js && node tools/write-build-flag.js clear",
+    "tauri": "node tools/copy-binaries.js && node tools/copy-ffmpeg.js && tauri"
   },
   "devDependencies": {
     "@tauri-apps/cli": "^2",
+    "cross-env": "^7.0.3",
+    "npx": "^3.0.0",
     "rcedit": "^1.1.2"
   }
 }
@@ -1,15 +0,0 @@
|
||||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 200 200">
|
||||
<defs>
|
||||
<linearGradient id="g" x1="0" x2="1" y1="0" y2="1">
|
||||
<stop offset="0" stop-color="#7b7fd8"/>
|
||||
<stop offset="1" stop-color="#b57cf2"/>
|
||||
</linearGradient>
|
||||
</defs>
|
||||
<rect width="100%" height="100%" rx="24" fill="url(#g)" />
|
||||
<g fill="white" transform="translate(32,32)">
|
||||
<circle cx="48" cy="48" r="28" fill="rgba(255,255,255,0.15)" />
|
||||
<path d="M24 48c6-10 16-16 24-16v8c-6 0-14 4-18 12s-2 12 0 12 6-2 10-6c4-4 10-6 14-6v8c-6 0-14 4-18 12s-2 12 0 12" stroke="white" stroke-width="3" fill="none" stroke-linecap="round" stroke-linejoin="round" opacity="0.95" />
|
||||
<text x="96" y="98" font-family="sans-serif" font-size="18" fill="white" opacity="0.95">Radio</text>
|
||||
</g>
|
||||
</svg>
|
||||
|
Before Width: | Height: | Size: 815 B |
@@ -1,27 +0,0 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8" />
|
||||
<meta name="viewport" content="width=device-width,initial-scale=1" />
|
||||
<title>Radio Player</title>
|
||||
|
||||
<!-- Google Cast Receiver SDK -->
|
||||
<script src="https://www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
|
||||
|
||||
<link rel="stylesheet" href="styles.css" />
|
||||
</head>
|
||||
<body>
|
||||
<div id="app">
|
||||
<h1>Radio Player</h1>
|
||||
<p id="status">Ready</p>
|
||||
|
||||
<div id="artwork">
|
||||
<img src="assets/logo.svg" alt="Radio Player" />
|
||||
</div>
|
||||
|
||||
<p id="station">Radio – Live Stream</p>
|
||||
</div>
|
||||
|
||||
<script src="receiver.js"></script>
|
||||
</body>
|
||||
</html>
|
||||
@@ -1,73 +0,0 @@
|
||||
/* Receiver for "Radio Player" using CAF Receiver SDK */
|
||||
(function () {
|
||||
const STREAM_URL = 'https://live.radio1.si/Radio1MB';
|
||||
|
||||
function $(id) { return document.getElementById(id); }
|
||||
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
const context = cast.framework.CastReceiverContext.getInstance();
|
||||
const playerManager = context.getPlayerManager();
|
||||
const statusEl = $('status');
|
||||
const stationEl = $('station');
|
||||
|
||||
// Intercept LOAD to enforce correct metadata for LIVE audio
|
||||
playerManager.setMessageInterceptor(
|
||||
cast.framework.messages.MessageType.LOAD,
|
||||
(request) => {
|
||||
if (!request || !request.media) return request;
|
||||
|
||||
request.media.contentId = request.media.contentId || STREAM_URL;
|
||||
request.media.contentType = 'audio/mpeg';
|
||||
request.media.streamType = cast.framework.messages.StreamType.LIVE;
|
||||
|
||||
request.media.metadata = request.media.metadata || {};
|
||||
request.media.metadata.title = request.media.metadata.title || 'Radio 1';
|
||||
request.media.metadata.images = request.media.metadata.images || [{ url: 'assets/logo.svg' }];
|
||||
|
||||
return request;
|
||||
}
|
||||
);
|
||||
|
||||
// Update UI on player state changes
|
||||
playerManager.addEventListener(
|
||||
cast.framework.events.EventType.PLAYER_STATE_CHANGED,
|
||||
() => {
|
||||
const state = playerManager.getPlayerState();
|
||||
switch (state) {
|
||||
case cast.framework.messages.PlayerState.PLAYING:
|
||||
statusEl.textContent = 'Playing';
|
||||
break;
|
||||
case cast.framework.messages.PlayerState.PAUSED:
|
||||
statusEl.textContent = 'Paused';
|
||||
break;
|
||||
case cast.framework.messages.PlayerState.IDLE:
|
||||
statusEl.textContent = 'Stopped';
|
||||
break;
|
||||
default:
|
||||
statusEl.textContent = state;
|
||||
}
|
||||
}
|
||||
);
|
||||
|
||||
// When a new media is loaded, reflect metadata (station name, artwork)
|
||||
playerManager.addEventListener(cast.framework.events.EventType.LOAD, (event) => {
|
||||
const media = event && event.data && event.data.media;
|
||||
if (media && media.metadata) {
|
||||
if (media.metadata.title) stationEl.textContent = media.metadata.title;
|
||||
if (media.metadata.images && media.metadata.images[0] && media.metadata.images[0].url) {
|
||||
const img = document.querySelector('#artwork img');
|
||||
img.src = media.metadata.images[0].url;
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Optional: reflect volume in title attribute
|
||||
playerManager.addEventListener(cast.framework.events.EventType.VOLUME_CHANGED, (evt) => {
|
||||
const level = evt && evt.data && typeof evt.data.level === 'number' ? evt.data.level : null;
|
||||
if (level !== null) statusEl.title = `Volume: ${Math.round(level * 100)}%`;
|
||||
});
|
||||
|
||||
// Start the cast receiver context
|
||||
context.start({ statusText: 'Radio Player Ready' });
|
||||
});
|
||||
})();
|
||||
@@ -1,58 +0,0 @@
|
||||
html, body {
|
||||
margin: 0;
|
||||
width: 100%;
|
||||
height: 100%;
|
||||
background: linear-gradient(135deg, #7b7fd8, #b57cf2);
|
||||
font-family: system-ui, -apple-system, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
|
||||
color: white;
|
||||
}
|
||||
|
||||
#app {
|
||||
height: 100%;
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
text-align: center;
|
||||
padding: 24px;
|
||||
box-sizing: border-box;
|
||||
}
|
||||
|
||||
#artwork {
|
||||
width: 240px;
|
||||
height: 240px;
|
||||
margin: 20px 0;
|
||||
border-radius: 24px;
|
||||
overflow: hidden;
|
||||
background: rgba(0,0,0,0.1);
|
||||
box-shadow: 0 8px 24px rgba(0,0,0,0.2);
|
||||
}
|
||||
|
||||
#artwork img {
|
||||
width: 100%;
|
||||
height: 100%;
|
||||
object-fit: cover;
|
||||
display: block;
|
||||
}
|
||||
|
||||
#status {
|
||||
font-size: 18px;
|
||||
opacity: 0.95;
|
||||
margin: 6px 0 0 0;
|
||||
}
|
||||
|
||||
#station {
|
||||
font-size: 16px;
|
||||
opacity: 0.85;
|
||||
margin: 6px 0 0 0;
|
||||
}
|
||||
|
||||
h1 {
|
||||
font-size: 20px;
|
||||
margin: 0 0 6px 0;
|
||||
}
|
||||
|
||||
@media (max-width: 480px) {
|
||||
#artwork { width: 160px; height: 160px; }
|
||||
h1 { font-size: 18px; }
|
||||
}
|
||||
@@ -1,206 +0,0 @@
|
||||
<#
|
||||
Build helper for Android (Windows PowerShell)
|
||||
|
||||
What it does:
|
||||
- Checks for required commands (`npm`, `rustup`, `cargo`, `cargo-ndk`)
|
||||
- Builds frontend (runs `npm run build` if `dist`/`build` not present)
|
||||
- Copies frontend files from `dist` or `src` into `android/app/src/main/assets`
|
||||
- Builds Rust native libs using `cargo-ndk` (if available) for `aarch64` and `armv7`
|
||||
- Copies produced `.so` files into `android/app/src/main/jniLibs/*`
|
||||
|
||||
Note: This script prepares the Android project. To produce the APK, open `android/` in Android Studio and run Build -> Assemble, or run `gradlew assembleDebug` locally.
|
||||
#>
|
||||
|
||||
Set-StrictMode -Version Latest
|
||||
|
||||
function Check-Command($name) {
|
||||
$which = Get-Command $name -ErrorAction SilentlyContinue
|
||||
return $which -ne $null
|
||||
}
|
||||
|
||||
Write-Output "Starting Android prep script..."
|
||||
|
||||
if (-not (Check-Command npm)) { Write-Warning "npm not found in PATH. Install Node.js to build frontend." }
|
||||
if (-not (Check-Command rustup)) { Write-Warning "rustup not found in PATH. Install Rust toolchain." }
|
||||
if (-not (Check-Command cargo)) { Write-Warning "cargo not found in PATH." }
|
||||
|
||||
$cargoNdkAvailable = Check-Command cargo-ndk
|
||||
if (-not $cargoNdkAvailable) { Write-Warning "cargo-ndk not found. Native libs will not be built. Install via 'cargo install cargo-ndk'" }
|
||||
|
||||
# Determine repository root (parent of the scripts folder)
|
||||
$scriptDir = Split-Path -Parent $MyInvocation.MyCommand.Definition
|
||||
$root = Split-Path -Parent $scriptDir
|
||||
Push-Location $root
|
||||
|
||||
# Prefer Tauri-generated Android Studio project (tauri android init)
|
||||
$androidRoot = Join-Path $root 'src-tauri\gen\android'
|
||||
if (-not (Test-Path $androidRoot)) {
|
||||
# Legacy fallback (non-Tauri project)
|
||||
$androidRoot = Join-Path $root 'android'
|
||||
}
|
||||
|
||||
function Escape-LocalPropertiesPath([string]$p) {
|
||||
# local.properties expects ':' escaped and backslashes doubled on Windows.
|
||||
# Use plain string replacements to avoid regex escaping pitfalls.
|
||||
return ($p.Replace('\', '\\').Replace(':', '\:'))
|
||||
}
|
||||
|
||||
# Ensure Android SDK/NDK locations are set for Gradle (local.properties)
|
||||
$sdkRoot = $env:ANDROID_SDK_ROOT
|
||||
if (-not $sdkRoot) { $sdkRoot = $env:ANDROID_HOME }
|
||||
if (-not $sdkRoot) { $sdkRoot = Join-Path $env:LOCALAPPDATA 'Android\Sdk' }
|
||||
|
||||
$ndkRoot = $env:ANDROID_NDK_ROOT
|
||||
if (-not $ndkRoot) { $ndkRoot = $env:ANDROID_NDK_HOME }
|
||||
if (-not $ndkRoot -and (Test-Path (Join-Path $sdkRoot 'ndk'))) {
|
||||
$ndkVersions = Get-ChildItem -Path (Join-Path $sdkRoot 'ndk') -Directory -ErrorAction SilentlyContinue | Sort-Object Name -Descending
|
||||
if ($ndkVersions -and (@($ndkVersions)).Count -gt 0) { $ndkRoot = @($ndkVersions)[0].FullName }
|
||||
}
|
||||
|
||||
if (Test-Path $androidRoot) {
|
||||
$localPropsPath = Join-Path $androidRoot 'local.properties'
|
||||
$lines = @()
|
||||
if ($sdkRoot) { $lines += "sdk.dir=$(Escape-LocalPropertiesPath $sdkRoot)" }
|
||||
if ($ndkRoot) { $lines += "ndk.dir=$(Escape-LocalPropertiesPath $ndkRoot)" }
|
||||
if ($lines.Count -gt 0) {
|
||||
Set-Content -Path $localPropsPath -Value ($lines -join "`n") -Encoding ASCII
|
||||
Write-Output "Wrote Android SDK/NDK config to: $localPropsPath"
|
||||
}
|
||||
}
|
||||
|
||||
# Build frontend (optional)
|
||||
Write-Output "Preparing frontend files..."
|
||||
$distDirs = @('dist','build')
|
||||
$foundDist = $null
|
||||
foreach ($d in $distDirs) {
|
||||
if (Test-Path (Join-Path $root $d)) { $foundDist = $d; break }
|
||||
}
|
||||
|
||||
if (-not $foundDist) {
|
||||
# IMPORTANT: `npm run build` in this repo runs `tauri build`, which is a desktop bundling step.
|
||||
# For Android prep we only need web assets, so we fall back to copying `src/` as assets.
|
||||
Write-Warning "No dist/build output found — copying `src/` as assets (skipping `npm run build` to avoid desktop bundling)."
|
||||
}
|
||||
|
||||
$assetsDst = Join-Path $androidRoot 'app\src\main\assets'
|
||||
if (-not (Test-Path $assetsDst)) { New-Item -ItemType Directory -Path $assetsDst -Force | Out-Null }
|
||||
|
||||
if ($foundDist) {
|
||||
Write-Output "Copying frontend from '$foundDist' to Android assets..."
|
||||
robocopy (Join-Path $root $foundDist) $assetsDst /MIR | Out-Null
|
||||
} else {
|
||||
Write-Output "Copying raw 'src' to Android assets..."
|
||||
robocopy (Join-Path $root 'src') $assetsDst /MIR | Out-Null
|
||||
}
|
||||
|
||||
# Build native libs if cargo-ndk available
|
||||
if ($cargoNdkAvailable) {
|
||||
Write-Output "Building Rust native libs via cargo-ndk from project root: $root"
|
||||
try {
|
||||
# Build from the Rust crate directory `src-tauri`
|
||||
$crateDir = Join-Path $root 'src-tauri'
|
||||
if (-not (Test-Path (Join-Path $crateDir 'Cargo.toml'))) {
|
||||
Write-Warning "Cargo.toml not found in src-tauri; skipping native build."
|
||||
} else {
|
||||
# Prefer Ninja generator for CMake if available (avoids Visual Studio generator issues)
|
||||
# Restore env vars at the end so we don't pollute the current PowerShell session.
|
||||
$oldCmakeGenerator = $env:CMAKE_GENERATOR
|
||||
$oldCmakeMakeProgram = $env:CMAKE_MAKE_PROGRAM
|
||||
$ninjaCmd = Get-Command ninja -ErrorAction SilentlyContinue
|
||||
if ($ninjaCmd) {
|
||||
Write-Output "Ninja detected at $($ninjaCmd.Source); setting CMake generator to Ninja."
|
||||
$env:CMAKE_GENERATOR = 'Ninja'
|
||||
$env:CMAKE_MAKE_PROGRAM = $ninjaCmd.Source
|
||||
} else {
|
||||
Write-Warning "Ninja not found in PATH. Installing Ninja or adding it to PATH is strongly recommended to avoid Visual Studio CMake generator on Windows."
|
||||
}
|
||||
|
||||
# Attempt to locate Android NDK if environment variables are not set
|
||||
if (-not $env:ANDROID_NDK_ROOT -and -not $env:ANDROID_NDK_HOME) {
|
||||
$candidates = @()
|
||||
if ($env:ANDROID_SDK_ROOT) { $candidates += Join-Path $env:ANDROID_SDK_ROOT 'ndk' }
|
||||
if ($env:ANDROID_HOME) { $candidates += Join-Path $env:ANDROID_HOME 'ndk' }
|
||||
$candidates += Join-Path $env:LOCALAPPDATA 'Android\sdk\ndk'
|
||||
$candidates += Join-Path $env:USERPROFILE 'AppData\Local\Android\sdk\ndk'
|
||||
$candidates += 'C:\Program Files (x86)\Android\AndroidNDK'
|
||||
|
||||
foreach ($cand in $candidates) {
|
||||
if (Test-Path $cand) {
|
||||
$versions = Get-ChildItem -Path $cand -Directory -ErrorAction SilentlyContinue | Sort-Object Name -Descending
|
||||
if ($versions -and (@($versions)).Count -gt 0) {
|
||||
$ndkPath = @($versions)[0].FullName
|
||||
Write-Output "Detected Android NDK at: $ndkPath"
|
||||
$env:ANDROID_NDK_ROOT = $ndkPath
|
||||
$env:ANDROID_NDK = $ndkPath
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
if (-not $env:ANDROID_NDK_ROOT) { Write-Warning "ANDROID_NDK_ROOT/ANDROID_NDK not set and no NDK found in common locations. Set ANDROID_NDK_ROOT to your NDK path." }
|
||||
} else {
|
||||
Write-Output "Using existing ANDROID_NDK_ROOT: $($env:ANDROID_NDK_ROOT)"
|
||||
if (-not $env:ANDROID_NDK) { $env:ANDROID_NDK = $env:ANDROID_NDK_ROOT }
|
||||
}
|
||||
|
||||
# Ensure expected external binary placeholders exist so Tauri bundling doesn't fail
|
||||
$binariesDir = Join-Path $crateDir 'binaries'
|
||||
if (-not (Test-Path $binariesDir)) { New-Item -ItemType Directory -Path $binariesDir -Force | Out-Null }
|
||||
$placeholder1 = Join-Path $binariesDir 'RadioPlayer-aarch64-linux-android'
|
||||
$placeholder2 = Join-Path $binariesDir 'RadioPlayer-armv7-linux-androideabi'
|
||||
if (-not (Test-Path $placeholder1)) { New-Item -ItemType File -Path $placeholder1 -Force | Out-Null; Write-Output "Created placeholder: $placeholder1" }
|
||||
if (-not (Test-Path $placeholder2)) { New-Item -ItemType File -Path $placeholder2 -Force | Out-Null; Write-Output "Created placeholder: $placeholder2" }
|
||||
|
||||
# If a previous build used a different CMake generator (e.g., Visual Studio), aws-lc-sys can fail with
|
||||
# "Does not match the generator used previously". Clean only the aws-lc-sys CMake build dirs.
|
||||
$awsLcBuildDirs = Get-ChildItem -Path (Join-Path $crateDir 'target') -Recurse -Directory -ErrorAction SilentlyContinue |
|
||||
Where-Object { $_.Name -like 'aws-lc-sys-*' }
|
||||
foreach ($d in @($awsLcBuildDirs)) {
|
||||
$cmakeBuildDir = Join-Path $d.FullName 'out\build'
|
||||
$cmakeCache = Join-Path $cmakeBuildDir 'CMakeCache.txt'
|
||||
if (Test-Path $cmakeCache) {
|
||||
Write-Output "Cleaning stale CMake cache for aws-lc-sys: $cmakeBuildDir"
|
||||
Remove-Item -Path $cmakeBuildDir -Recurse -Force -ErrorAction SilentlyContinue
|
||||
}
|
||||
}
|
||||
|
||||
Push-Location $crateDir
|
||||
try {
|
||||
# Use API 24 to ensure libc symbols like getifaddrs/freeifaddrs are available.
|
||||
# Build only the library to avoid linking the desktop binary for Android.
|
||||
Write-Output "Running: cargo ndk -t arm64-v8a -t armeabi-v7a -P 24 build --release --lib (in $crateDir)"
|
||||
cargo ndk -t arm64-v8a -t armeabi-v7a -P 24 build --release --lib
|
||||
} finally {
|
||||
Pop-Location
|
||||
if ($null -eq $oldCmakeGenerator) { Remove-Item Env:\CMAKE_GENERATOR -ErrorAction SilentlyContinue } else { $env:CMAKE_GENERATOR = $oldCmakeGenerator }
|
||||
if ($null -eq $oldCmakeMakeProgram) { Remove-Item Env:\CMAKE_MAKE_PROGRAM -ErrorAction SilentlyContinue } else { $env:CMAKE_MAKE_PROGRAM = $oldCmakeMakeProgram }
|
||||
}
|
||||
|
||||
# Search for produced .so files under src-tauri/target
|
||||
$soFiles = Get-ChildItem -Path (Join-Path $crateDir 'target') -Recurse -Filter "*.so" -ErrorAction SilentlyContinue
|
||||
if (-not $soFiles) {
|
||||
Write-Warning "No .so files found after build. Check cargo-ndk output above for errors."
|
||||
} else {
|
||||
foreach ($f in @($soFiles)) {
|
||||
$full = $f.FullName
|
||||
if ($full -match 'aarch64|aarch64-linux-android|arm64-v8a') { $abi = 'arm64-v8a' }
|
||||
elseif ($full -match 'armv7|armv7-linux-androideabi|armeabi-v7a') { $abi = 'armeabi-v7a' }
|
||||
else { continue }
|
||||
|
||||
$dst = Join-Path $androidRoot "app\src\main\jniLibs\$abi"
|
||||
if (-not (Test-Path $dst)) { New-Item -ItemType Directory -Path $dst -Force | Out-Null }
|
||||
Copy-Item $full -Destination $dst -Force
|
||||
Write-Output "Copied $($f.Name) -> $dst"
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch {
|
||||
Write-Warning "cargo-ndk build failed. Exception: $($_.Exception.Message)"
|
||||
if ($_.ScriptStackTrace) { Write-Output $_.ScriptStackTrace }
|
||||
}
|
||||
} else {
|
||||
Write-Warning "Skipping native lib build (cargo-ndk missing)."
|
||||
}
|
||||
|
||||
Write-Output "Android prep complete. Open '$androidRoot' in Android Studio and build the APK (or run './gradlew assembleDebug' in that folder)."
|
||||
|
||||
Pop-Location
|
||||
@@ -1,45 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail

# Cross-platform helper for Unix-like shells
ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
cd "$ROOT"

echo "Preparing Android assets and native libs..."

if command -v npm >/dev/null 2>&1; then
  echo "Running npm install & build"
  npm install
  npm run build || true
fi

DIST_DIR="dist"
if [ ! -d "$DIST_DIR" ]; then DIST_DIR="build"; fi
if [ -d "$DIST_DIR" ]; then
  echo "Copying $DIST_DIR -> android/app/src/main/assets"
  mkdir -p android/app/src/main/assets
  rsync -a --delete "$DIST_DIR/" android/app/src/main/assets/
else
  echo "No dist/build found, copying src/ -> android assets"
  mkdir -p android/app/src/main/assets
  rsync -a --delete src/ android/app/src/main/assets/
fi

if command -v cargo-ndk >/dev/null 2>&1; then
  echo "Building native libs with cargo-ndk"
  cargo-ndk -t aarch64 -t armv7 build --release || true
  # copy so files
  find target -type f -name "*.so" | while read -r f; do
    if [[ "$f" =~ aarch64|aarch64-linux-android ]]; then abi=arm64-v8a; fi
    if [[ "$f" =~ armv7|armv7-linux-androideabi ]]; then abi=armeabi-v7a; fi
    if [ -n "${abi-}" ]; then
      mkdir -p android/app/src/main/jniLibs/$abi
      cp "$f" android/app/src/main/jniLibs/$abi/
      echo "Copied $f -> android/app/src/main/jniLibs/$abi/"
    fi
  done
else
  echo "cargo-ndk not found; skipping native lib build"
fi

echo "Prepared Android project. Open android/ in Android Studio to build the APK (or run ./gradlew assembleDebug)."
71
scripts/download-ffmpeg.ps1
Normal file
@@ -0,0 +1,71 @@
param(
    [string]$Url = "https://www.gyan.dev/ffmpeg/builds/ffmpeg-release-essentials.zip",
    [string]$OutDir = "tools/ffmpeg/bin",
    [switch]$DryRun
)

$ErrorActionPreference = "Stop"

$isWindows = $env:OS -eq 'Windows_NT'
if (-not $isWindows) {
    Write-Host "This script is intended for Windows (ffmpeg.exe)." -ForegroundColor Yellow
    exit 1
}

$repoRoot = (Resolve-Path (Join-Path $PSScriptRoot "..")).Path
$outDirAbs = (Resolve-Path (Join-Path $repoRoot $OutDir) -ErrorAction SilentlyContinue)
if (-not $outDirAbs) {
    $outDirAbs = Join-Path $repoRoot $OutDir
    New-Item -ItemType Directory -Force -Path $outDirAbs | Out-Null
} else {
    $outDirAbs = $outDirAbs.Path
}

$ffmpegDest = Join-Path $outDirAbs "ffmpeg.exe"

# If already present, do nothing.
if (Test-Path $ffmpegDest) {
    Write-Host "FFmpeg already present: $ffmpegDest"
    exit 0
}

if ($DryRun) {
    Write-Host "Dry run:" -ForegroundColor Cyan
    Write-Host " Would download: $Url"
    Write-Host " Would install to: $ffmpegDest"
    exit 0
}

Write-Host "About to download a prebuilt FFmpeg package:" -ForegroundColor Cyan
Write-Host " $Url"
Write-Host "You are responsible for reviewing the FFmpeg license/compliance for your use case." -ForegroundColor Yellow

$tempRoot = Join-Path $env:TEMP ("radioplayer-ffmpeg-" + [guid]::NewGuid().ToString("N"))
$zipPath = Join-Path $tempRoot "ffmpeg.zip"
$extractDir = Join-Path $tempRoot "extract"

New-Item -ItemType Directory -Force -Path $tempRoot | Out-Null
New-Item -ItemType Directory -Force -Path $extractDir | Out-Null

try {
    Write-Host "Downloading..." -ForegroundColor Cyan
    Invoke-WebRequest -Uri $Url -OutFile $zipPath -UseBasicParsing

    Write-Host "Extracting..." -ForegroundColor Cyan
    Expand-Archive -Path $zipPath -DestinationPath $extractDir -Force

    $candidate = Get-ChildItem -Path $extractDir -Recurse -Filter "ffmpeg.exe" | Where-Object {
        $_.FullName -match "\\bin\\ffmpeg\.exe$"
    } | Select-Object -First 1

    if (-not $candidate) {
        throw "Could not find ffmpeg.exe under extracted content. The archive layout may have changed."
    }

    Copy-Item -Force -Path $candidate.FullName -Destination $ffmpegDest

    Write-Host "Installed FFmpeg to: $ffmpegDest" -ForegroundColor Green
    Write-Host "Next: run 'node tools/copy-ffmpeg.js' (or 'npm run dev:native' / 'npm run build') to bundle it into src-tauri/resources/." -ForegroundColor Green
} finally {
    try { Remove-Item -Recurse -Force -Path $tempRoot -ErrorAction SilentlyContinue } catch {}
}
@@ -23,15 +23,21 @@ function stopSessions(client, sessions, cb) {
const session = remaining.shift();
if (!session) return cb();

client.stop(session, (err) => {
if (err) {
log(`Stop session failed (${session.appId || 'unknown app'}): ${err.message || String(err)}`);
} else {
log(`Stopped session (${session.appId || 'unknown app'})`);
}
// Continue regardless; best-effort.
try {
client.stop(session, (err) => {
if (err) {
log(`Stop session failed (${session.appId || 'unknown app'}): ${err.message || String(err)}`);
} else {
log(`Stopped session (${session.appId || 'unknown app'})`);
}
// Continue regardless; best-effort.
stopNext();
});
} catch (err) {
// Some devices/library versions may throw synchronously; just log and continue.
log(`Stop session threw (${session.appId || 'unknown app'}): ${err.message || String(err)}`);
stopNext();
});
}
};

stopNext();
@@ -52,7 +58,7 @@ rl.on('line', (line) => {

switch (command) {
case 'play':
play(args.ip, args.url);
play(args.ip, args.url, args.metadata);
break;
case 'stop':
stop();
@@ -68,12 +74,16 @@ rl.on('line', (line) => {
}
});

function play(ip, url) {
function play(ip, url, metadata) {
if (activeClient) {
try { activeClient.removeAllListeners(); } catch (e) { }
try { activeClient.close(); } catch (e) { }
}

activeClient = new Client();
// Increase max listeners for this client instance to avoid Node warnings
try { if (typeof activeClient.setMaxListeners === 'function') activeClient.setMaxListeners(50); } catch (e) {}
activeClient._playMetadata = metadata || {};

activeClient.connect(ip, () => {
log(`Connected to ${ip}`);
@@ -100,20 +110,21 @@ function play(ip, url) {
log('Join failed, attempting launch...');
log(`Join error: ${err && err.message ? err.message : String(err)}`);
// Join can fail if the session is stale; stop it and retry launch.
stopSessions(activeClient, [session], () => launchPlayer(url, /*didStopFirst*/ true));
stopSessions(activeClient, [session], () => launchPlayer(url, activeClient._playMetadata, /*didStopFirst*/ true));
} else {
activePlayer = player;
loadMedia(url);
}
// Clean up previous player listeners before replacing
try { if (activePlayer && typeof activePlayer.removeAllListeners === 'function') activePlayer.removeAllListeners(); } catch (e) {}
activePlayer = player;
try { if (typeof activePlayer.setMaxListeners === 'function') activePlayer.setMaxListeners(50); } catch (e) {}
loadMedia(url, activeClient._playMetadata);
}
});
} else {
// If another app is running, stop it first to avoid NOT_ALLOWED.
// Backdrop or other non-media session present: skip stopping to avoid platform sender crash, just launch.
if (sessions.length > 0) {
log('Non-media session detected, stopping before launch...');
stopSessions(activeClient, sessions, () => launchPlayer(url, /*didStopFirst*/ true));
} else {
launchPlayer(url, /*didStopFirst*/ false);
log('Non-media session detected; skipping stop and launching DefaultMediaReceiver...');
}
launchPlayer(url, activeClient._playMetadata, /*didStopFirst*/ false);
}
});
});
@@ -126,10 +137,11 @@ function play(ip, url) {
});
}

function launchPlayer(url, didStopFirst) {
function launchPlayer(url, metadata, didStopFirst) {
if (!activeClient) return;

activeClient.launch(DefaultMediaReceiver, (err, player) => {
const launchApp = (metadata && metadata.appId) ? metadata.appId : DefaultMediaReceiver;
activeClient.launch(launchApp, (err, player) => {
if (err) {
const details = `Launch error: ${err && err.message ? err.message : String(err)}${err && err.code ? ` (code: ${err.code})` : ''}`;
// If launch fails with NOT_ALLOWED, the device may be busy with another app/session.
@@ -149,8 +161,10 @@ function launchPlayer(url, didStopFirst) {
try { error(`Launch retry error full: ${JSON.stringify(retryErr)}`); } catch (e) { /* ignore */ }
return;
}
try { if (activePlayer && typeof activePlayer.removeAllListeners === 'function') activePlayer.removeAllListeners(); } catch (e) {}
activePlayer = retryPlayer;
loadMedia(url);
try { if (typeof activePlayer.setMaxListeners === 'function') activePlayer.setMaxListeners(50); } catch (e) {}
loadMedia(url, metadata);
});
});
});
@@ -161,24 +175,52 @@ function launchPlayer(url, didStopFirst) {
try { error(`Launch error full: ${JSON.stringify(err)}`); } catch (e) { /* ignore */ }
return;
}
try { if (activePlayer && typeof activePlayer.removeAllListeners === 'function') activePlayer.removeAllListeners(); } catch (e) {}
activePlayer = player;
loadMedia(url);
try { if (typeof activePlayer.setMaxListeners === 'function') activePlayer.setMaxListeners(50); } catch (e) {}
loadMedia(url, metadata);
});
}

function loadMedia(url) {
function loadMedia(url, metadata) {
if (!activePlayer) return;

const meta = metadata || {};
// Build a richer metadata payload. Many receivers only honor specific
// fields; we set both Music metadata and generic hints via `customData`.
const media = {
contentId: url,
contentType: 'audio/mpeg',
streamType: 'LIVE',
metadata: {
metadataType: 0,
title: 'RadioPlayer'
// Use MusicTrack metadata (common on audio receivers) but include
// a subtitle field in case receivers surface it.
metadataType: 3, // MusicTrackMediaMetadata
title: meta.title || 'Radio Station',
albumName: 'Radio Player',
artist: meta.artist || meta.subtitle || meta.station || '',
subtitle: meta.subtitle || '',
images: (meta.image ? [
{ url: meta.image },
// also include a large hint for receivers that prefer big artwork
{ url: meta.image, width: 1920, height: 1080 }
] : [])
},
// Many receivers ignore `customData`, but some Styled receivers will
// use it. Include background and theming hints here.
customData: {
appName: meta.appName || 'Radio Player',
backgroundImage: meta.backgroundImage || meta.image || undefined,
backgroundGradient: meta.bgGradient || '#6a0dad',
themeHint: {
primary: '#6a0dad',
accent: '#b36cf3'
}
}
};
// Ensure we don't accumulate 'status' listeners across loads
try { if (activePlayer && typeof activePlayer.removeAllListeners === 'function') activePlayer.removeAllListeners('status'); } catch (e) {}
activePlayer.load(media, { autoplay: true }, (err, status) => {
if (err) return error(`Load error: ${err.message}`);
log('Media loaded, playing...');
@@ -192,9 +234,11 @@ function loadMedia(url) {
function stop() {
if (activePlayer) {
try { activePlayer.stop(); } catch (e) { }
try { if (typeof activePlayer.removeAllListeners === 'function') activePlayer.removeAllListeners(); } catch (e) {}
log('Stopped playback');
}
if (activeClient) {
try { if (typeof activeClient.removeAllListeners === 'function') activeClient.removeAllListeners(); } catch (e) {}
try { activeClient.close(); } catch (e) { }
activeClient = null;
activePlayer = null;
3
sidecar/package-lock.json
generated
@@ -9,6 +9,9 @@
"version": "1.0.0",
"dependencies": {
"castv2-client": "^1.2.0"
},
"bin": {
"radiocast-sidecar": "index.js"
}
},
"node_modules/@protobufjs/aspromise": {
1007
src-tauri/Cargo.lock
generated
src-tauri/Cargo.toml
@@ -1,9 +1,10 @@
[package]
name = "radio-tauri"
version = "0.1.0"
version = "0.1.1"
description = "A Tauri App"
authors = ["you"]
edition = "2021"
default-run = "radio-tauri"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

@@ -24,7 +25,17 @@ serde = { version = "1", features = ["derive"] }
serde_json = "1"
rust_cast = "0.19.0"
mdns-sd = "0.17.1"
agnostic-mdns = { version = "0.4", features = ["tokio"], optional = true }
async-channel = "2.5.0"
tokio = { version = "1.48.0", features = ["full"] }
tauri-plugin-shell = "2.3.3"
reqwest = { version = "0.11", features = ["json", "rustls-tls"] }
base64 = "0.22"
cpal = "0.15"
ringbuf = "0.3"
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["fmt", "env-filter"] }

[features]
use_agnostic_mdns = ["agnostic-mdns"]
13
src-tauri/src/bin/check_ffmpeg.rs
Normal file
@@ -0,0 +1,13 @@
fn main() {
    // Call the library's FFmpeg preflight check used by the application.
    match radio_tauri_lib::player::preflight_ffmpeg_only() {
        Ok(()) => {
            println!("FFmpeg preflight OK");
            std::process::exit(0);
        }
        Err(e) => {
            eprintln!("FFmpeg preflight failed: {}", e);
            std::process::exit(2);
        }
    }
}
@@ -1,25 +1,367 @@
use std::collections::HashMap;
use std::sync::Mutex;
use std::thread;
use std::io::{BufRead, BufReader};
use std::net::{IpAddr, SocketAddr, TcpListener, TcpStream, UdpSocket};
use std::process::{Child, Command, Stdio};
use std::sync::{Mutex, Arc};
// thread usage replaced by async tasks; remove direct std::thread import
use std::time::Duration;
use tokio::sync::{RwLock as TokioRwLock, mpsc};

#[cfg(not(feature = "use_agnostic_mdns"))]
use mdns_sd::{ServiceDaemon, ServiceEvent};
use serde_json::json;
use tauri::{AppHandle, Manager, State};
use tauri::Emitter;
use tracing::{info, warn, error};
use tracing_subscriber;
use tauri_plugin_shell::process::{CommandChild, CommandEvent};
use tauri_plugin_shell::ShellExt;
use reqwest;
use base64::{engine::general_purpose, Engine as _};

pub mod player;
use player::{PlayerCommand, PlayerController, PlayerShared, PlayerState};

struct SidecarState {
    child: Mutex<Option<CommandChild>>,
}

struct AppState {
    known_devices: Mutex<HashMap<String, String>>,
    known_devices: Arc<TokioRwLock<HashMap<String, DeviceInfo>>>,
}

struct DeviceInfo {
    ip: String,
    last_seen: std::time::Instant,
}

struct CastProxy {
    child: Child,
}

struct CastProxyState {
    inner: Mutex<Option<CastProxy>>,
}

#[derive(serde::Serialize)]
struct CastProxyStartResult {
    url: String,
    // "tap" | "proxy"
    mode: String,
}

// Native (non-WebView) audio player state.
// Step 1: state machine + command interface only (no decoding/output yet).
struct PlayerRuntime {
    shared: Arc<PlayerShared>,
    controller: PlayerController,
}

fn clamp01(v: f32) -> f32 {
    if v.is_nan() {
        0.0
    } else if v < 0.0 {
        0.0
    } else if v > 1.0 {
        1.0
    } else {
        v
    }
}
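One detail worth noting about `clamp01`: unlike `v.clamp(0.0, 1.0)` from the standard library, which returns NaN for a NaN input, it maps a NaN volume to 0.0 (muted) so garbage never reaches the audio callback. A standalone sketch of that behavior:

```rust
// Mirror of the clamp01 helper from the diff: NaN becomes 0.0 instead of
// propagating, while ordinary values are clamped into [0.0, 1.0].
fn clamp01(v: f32) -> f32 {
    if v.is_nan() { 0.0 } else { v.max(0.0).min(1.0) }
}

fn main() {
    assert_eq!(clamp01(1.5), 1.0);
    assert_eq!(clamp01(-0.2), 0.0);
    assert_eq!(clamp01(0.4), 0.4);
    // std's f32::clamp would return NaN here; clamp01 mutes instead.
    assert_eq!(clamp01(f32::NAN), 0.0);
    println!("clamp01 ok");
}
```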
fn format_http_host(ip: IpAddr) -> String {
    match ip {
        IpAddr::V4(v4) => v4.to_string(),
        IpAddr::V6(v6) => format!("[{v6}]"),
    }
}

fn local_ip_for_peer(peer_ip: IpAddr) -> Result<IpAddr, String> {
    // Trick: connect a UDP socket to the peer and read the chosen local address.
    // Port number is irrelevant; no packets are sent for UDP connect().
    let peer = SocketAddr::new(peer_ip, 9);
    let bind_addr = match peer_ip {
        IpAddr::V4(_) => "0.0.0.0:0",
        IpAddr::V6(_) => "[::]:0",
    };
    let sock = UdpSocket::bind(bind_addr).map_err(|e| e.to_string())?;
    sock.connect(peer).map_err(|e| e.to_string())?;
    Ok(sock.local_addr().map_err(|e| e.to_string())?.ip())
}
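The comment in `local_ip_for_peer` describes a standard trick, and it is easy to exercise standalone: `connect()` on a UDP socket only records a default destination and asks the routing table which source address would be used, so nothing is transmitted. A self-contained sketch (probing the loopback peer, so the result is deterministic):

```rust
use std::net::{IpAddr, SocketAddr, UdpSocket};

// Which local IP would the OS route through to reach `peer_ip`?
// UDP connect() picks a source address via the routing table; no packet
// leaves the machine until something is actually sent.
fn local_ip_for_peer(peer_ip: IpAddr) -> std::io::Result<IpAddr> {
    let sock = UdpSocket::bind("0.0.0.0:0")?;
    sock.connect(SocketAddr::new(peer_ip, 9))?; // port 9 (discard) is arbitrary
    Ok(sock.local_addr()?.ip())
}

fn main() -> std::io::Result<()> {
    // Routing toward loopback always selects the loopback interface.
    let ip = local_ip_for_peer("127.0.0.1".parse().unwrap())?;
    assert_eq!(ip, IpAddr::from([127u8, 0, 0, 1]));
    println!("local IP toward 127.0.0.1: {ip}");
    Ok(())
}
```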
fn wait_for_listen(ip: IpAddr, port: u16) -> bool {
    // Best-effort: give ffmpeg a moment to bind before we tell the Chromecast.
    // Returns true if a listener accepted a connection during the wait window.
    let addr = SocketAddr::new(ip, port);
    for _ in 0..50 {
        if TcpStream::connect_timeout(&addr, Duration::from_millis(30)).is_ok() {
            return true;
        }
        std::thread::sleep(Duration::from_millis(20));
    }
    false
}
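`wait_for_listen` is a plain connect-probe; paired with the ephemeral-port dance that `cast_proxy_start` performs below (bind port 0, read back the assigned port, drop the listener), it can be demonstrated end to end with only stdlib types. A small sketch:

```rust
use std::net::{SocketAddr, TcpListener, TcpStream};
use std::time::Duration;

// Probe until something accepts on `addr`, mirroring wait_for_listen above.
fn wait_for_listen(addr: SocketAddr) -> bool {
    for _ in 0..50 {
        if TcpStream::connect_timeout(&addr, Duration::from_millis(30)).is_ok() {
            return true;
        }
        std::thread::sleep(Duration::from_millis(20));
    }
    false
}

fn main() {
    // Bind port 0 so the OS picks a free ephemeral port, then read it back.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();
    // While the listener exists, the probe succeeds immediately.
    assert!(wait_for_listen(addr));
    drop(listener);
    // Note: cast_proxy_start drops its listener *before* handing the port to
    // ffmpeg, accepting a small race in exchange for a known-free port.
    println!("probe ok on {addr}");
}
```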
fn stop_cast_proxy_locked(lock: &mut Option<CastProxy>) {
    if let Some(mut proxy) = lock.take() {
        let _ = proxy.child.kill();
        let _ = proxy.child.wait();
        info!("Cast proxy stopped");
    }
}

fn spawn_standalone_cast_proxy(url: String, port: u16) -> Result<Child, String> {
    // Standalone path (fallback): FFmpeg pulls the station URL and serves MP3 over HTTP.
    // Try libmp3lame first, then fall back to the built-in "mp3" encoder if needed.
    let ffmpeg = player::ffmpeg_command();
    let ffmpeg_disp = ffmpeg.to_string_lossy();

    let spawn = |codec: &str| -> Result<Child, String> {
        Command::new(&ffmpeg)
            .arg("-nostdin")
            .arg("-hide_banner")
            .arg("-loglevel")
            .arg("warning")
            .arg("-reconnect")
            .arg("1")
            .arg("-reconnect_streamed")
            .arg("1")
            .arg("-reconnect_delay_max")
            .arg("5")
            .arg("-i")
            .arg(&url)
            .arg("-vn")
            .arg("-c:a")
            .arg(codec)
            .arg("-b:a")
            .arg("128k")
            .arg("-f")
            .arg("mp3")
            .arg("-content_type")
            .arg("audio/mpeg")
            .arg("-listen")
            .arg("1")
            .arg(format!("http://0.0.0.0:{port}/stream.mp3"))
            .stdout(Stdio::null())
            .stderr(Stdio::piped())
            .spawn()
            .map_err(|e| {
                format!(
                    "Failed to start ffmpeg cast proxy ({ffmpeg_disp}): {e}. Set RADIOPLAYER_FFMPEG, bundle ffmpeg next to the app, or install ffmpeg on PATH."
                )
            })
    };

    let mut child = spawn("libmp3lame")?;
    std::thread::sleep(Duration::from_millis(150));
    if let Ok(Some(status)) = child.try_wait() {
        if !status.success() {
            warn!("Standalone cast proxy exited early; retrying with -c:a mp3");
            child = spawn("mp3")?;
        }
    }

    Ok(child)
}
#[tauri::command]
async fn cast_proxy_start(
    state: State<'_, AppState>,
    proxy_state: State<'_, CastProxyState>,
    player: State<'_, PlayerRuntime>,
    device_name: String,
    url: String,
) -> Result<CastProxyStartResult, String> {
    // Make sure ffmpeg exists before we try to cast.
    player::preflight_ffmpeg_only()?;

    let device_ip_str = {
        let devices = state.known_devices.read().await;
        devices
            .get(&device_name)
            .map(|d| d.ip.clone())
            .ok_or("Device not found")?
    };
    let device_ip: IpAddr = device_ip_str
        .parse()
        .map_err(|_| format!("Invalid device IP: {device_ip_str}"))?;
    let local_ip = local_ip_for_peer(device_ip)?;

    // Stop any existing standalone proxy first.
    {
        let mut lock = proxy_state.inner.lock().unwrap();
        stop_cast_proxy_locked(&mut lock);
    }

    // Prefer reusing the native decoder PCM when possible.
    // If the currently playing URL differs (or nothing is playing), start a headless decoder.
    let snapshot = player.shared.snapshot();
    let is_same_url = snapshot.url.as_deref() == Some(url.as_str());
    let is_decoding = matches!(snapshot.status, player::PlayerStatus::Playing | player::PlayerStatus::Buffering);
    if !(is_same_url && is_decoding) {
        player
            .controller
            .tx
            .send(PlayerCommand::PlayCast { url: url.clone() })
            .map_err(|e| e.to_string())?;
    }

    // Try starting the TAP on several ephemeral ports before falling back.
    let host = format_http_host(local_ip);
    let max_attempts = 5usize;
    for attempt in 0..max_attempts {
        // Pick an ephemeral port.
        let listener = TcpListener::bind("0.0.0.0:0").map_err(|e| e.to_string())?;
        let port = listener.local_addr().map_err(|e| e.to_string())?.port();
        drop(listener);

        let proxy_url = format!("http://{host}:{port}/stream.mp3");

        let (reply_tx, reply_rx) = std::sync::mpsc::channel();
        let _ = player
            .controller
            .tx
            .send(PlayerCommand::CastTapStart {
                port,
                bind_host: host.clone(),
                reply: reply_tx,
            })
            .map_err(|e| e.to_string())?;

        match reply_rx.recv_timeout(Duration::from_secs(2)) {
            Ok(Ok(())) => {
                if wait_for_listen(local_ip, port) {
                    info!("Cast proxy started in TAP mode: {}", proxy_url);
                    return Ok(CastProxyStartResult {
                        url: proxy_url,
                        mode: "tap".to_string(),
                    });
                } else {
                    warn!("Cast tap did not start listening on port {port}; attempt {}/{}", attempt+1, max_attempts);
                    let _ = player.controller.tx.send(PlayerCommand::CastTapStop);
                    std::thread::sleep(Duration::from_millis(100));
                    continue;
                }
            }
            Ok(Err(e)) => {
                warn!("Cast tap start failed on attempt {}/{}: {e}", attempt+1, max_attempts);
                let _ = player.controller.tx.send(PlayerCommand::CastTapStop);
                std::thread::sleep(Duration::from_millis(100));
                continue;
            }
            Err(_) => {
                warn!("Cast tap start timed out on attempt {}/{}", attempt+1, max_attempts);
                let _ = player.controller.tx.send(PlayerCommand::CastTapStop);
                std::thread::sleep(Duration::from_millis(100));
                continue;
            }
        }
    }

    // All TAP attempts failed; fall back to standalone proxy on a fresh ephemeral port.
    warn!("All TAP attempts failed; falling back to standalone proxy");
    let listener = TcpListener::bind("0.0.0.0:0").map_err(|e| e.to_string())?;
    let port = listener.local_addr().map_err(|e| e.to_string())?.port();
    drop(listener);
    let proxy_url = format!("http://{host}:{port}/stream.mp3");
    let mut child = spawn_standalone_cast_proxy(url, port)?;
    if let Some(stderr) = child.stderr.take() {
        std::thread::spawn(move || {
            let reader = BufReader::new(stderr);
            for line in reader.lines().flatten() {
                warn!("[cast-proxy ffmpeg] {line}");
            }
        });
    }
    // best-effort wait for standalone proxy
    let _ = wait_for_listen(local_ip, port);
    let mut lock = proxy_state.inner.lock().unwrap();
    *lock = Some(CastProxy { child });
    info!("Cast proxy started in STANDALONE mode (after TAP attempts): {}", proxy_url);
    Ok(CastProxyStartResult {
        url: proxy_url,
        mode: "proxy".to_string(),
    })
}
#[tauri::command]
async fn cast_proxy_stop(proxy_state: State<'_, CastProxyState>, player: State<'_, PlayerRuntime>) -> Result<(), String> {
    let _ = player.controller.tx.send(PlayerCommand::CastTapStop);
    let mut lock = proxy_state.inner.lock().unwrap();
    stop_cast_proxy_locked(&mut lock);
    Ok(())
}

#[tauri::command]
async fn player_get_state(player: State<'_, PlayerRuntime>) -> Result<PlayerState, String> {
    Ok(player.shared.snapshot())
}

#[tauri::command]
async fn player_set_volume(
    player: State<'_, PlayerRuntime>,
    volume: f32,
) -> Result<(), String> {
    let volume = clamp01(volume);
    {
        let mut s = player.shared.state.lock().unwrap();
        s.volume = volume;
    }
    player
        .controller
        .tx
        .send(PlayerCommand::SetVolume { volume })
        .map_err(|e| e.to_string())?;
    Ok(())
}

#[tauri::command]
async fn player_play(player: State<'_, PlayerRuntime>, url: String) -> Result<(), String> {
    // Fail fast if audio output or ffmpeg is not available.
    // This keeps UX predictable: JS can show an error without flipping to "playing".
    if let Err(e) = player::preflight_check() {
        {
            let mut s = player.shared.state.lock().unwrap();
            s.status = player::PlayerStatus::Error;
            s.error = Some(e.clone());
        }
        return Err(e);
    }

    {
        let mut s = player.shared.state.lock().unwrap();
        s.error = None;
        s.url = Some(url.clone());
        // Step 1: report buffering immediately; the engine thread will progress.
        s.status = player::PlayerStatus::Buffering;
    }

    player
        .controller
        .tx
        .send(PlayerCommand::Play { url })
        .map_err(|e| e.to_string())?;
    Ok(())
}

#[tauri::command]
async fn player_stop(player: State<'_, PlayerRuntime>) -> Result<(), String> {
    {
        let mut s = player.shared.state.lock().unwrap();
        s.error = None;
        s.status = player::PlayerStatus::Stopped;
    }
    player
        .controller
        .tx
        .send(PlayerCommand::Stop)
        .map_err(|e| e.to_string())?;
    Ok(())
}

#[tauri::command]
async fn list_cast_devices(state: State<'_, AppState>) -> Result<Vec<String>, String> {
    let devices = state.known_devices.lock().unwrap();
    let devices = state.known_devices.read().await;
    let mut list: Vec<String> = devices.keys().cloned().collect();
    list.sort();
    Ok(list)
@@ -32,13 +374,22 @@ async fn cast_play(
sidecar_state: State<'_, SidecarState>,
device_name: String,
url: String,
title: Option<String>,
artist: Option<String>,
image: Option<String>,
) -> Result<(), String> {
// Resolve device name -> ip with diagnostics on failure
let ip = {
let devices = state.known_devices.lock().unwrap();
devices
.get(&device_name)
.cloned()
.ok_or("Device not found")?
let devices = state.known_devices.read().await;
if let Some(d) = devices.get(&device_name) {
info!("cast_play: resolved device '{}' -> {}", device_name, d.ip);
d.ip.clone()
} else {
// Log known device keys for debugging
let keys: Vec<String> = devices.keys().cloned().collect();
warn!("cast_play: device '{}' not found; known: {:?}", device_name, keys);
return Err(format!("Device not found: {} (known: {:?})", device_name, keys));
}
};

let mut lock = sidecar_state.child.lock().unwrap();
@@ -47,21 +398,35 @@ async fn cast_play(
let child = if let Some(ref mut child) = *lock {
child
} else {
println!("Spawning new sidecar...");
info!("Spawning new sidecar...");
// Use the packaged sidecar binary (radiocast-sidecar-<target>.exe)
let sidecar_command = app
.shell()
.sidecar("radiocast-sidecar")
.map_err(|e| e.to_string())?;
let (mut rx, child) = sidecar_command.spawn().map_err(|e| e.to_string())?;
.map_err(|e| {
error!("Sidecar command creation failed: {}", e);
e.to_string()
})?;
let spawn_result = sidecar_command.spawn();
let (mut rx, child) = match spawn_result {
Ok(res) => {
info!("Sidecar spawned successfully");
res
}
Err(e) => {
error!("Sidecar spawn failed: {}", e);
return Err(e.to_string());
}
};

tauri::async_runtime::spawn(async move {
while let Some(event) = rx.recv().await {
match event {
CommandEvent::Stdout(line) => {
println!("Sidecar: {}", String::from_utf8_lossy(&line))
info!("Sidecar: {}", String::from_utf8_lossy(&line))
}
CommandEvent::Stderr(line) => {
eprintln!("Sidecar Error: {}", String::from_utf8_lossy(&line))
error!("Sidecar Error: {}", String::from_utf8_lossy(&line))
}
_ => {}
}
@@ -74,12 +439,25 @@ async fn cast_play(

let play_cmd = json!({
"command": "play",
"args": { "ip": ip, "url": url }
"args": {
"ip": ip,
"url": url,
"metadata": {
"title": title,
"artist": artist,
"image": image
}
}
});

child
.write(format!("{}\n", play_cmd.to_string()).as_bytes())
.map_err(|e| e.to_string())?;
let play_payload = format!("{}\n", play_cmd.to_string());
info!("Sending cast URL to device '{}': {}", device_name, url);
match child.write(play_payload.as_bytes()) {
Ok(()) => info!("Sidecar write OK"),
Err(e) => {
error!("Sidecar write failed: {}", e);
return Err(e.to_string());
}
}
Ok(())
}
@@ -87,8 +465,18 @@ async fn cast_play(
async fn cast_stop(
    _app: AppHandle,
    sidecar_state: State<'_, SidecarState>,
    proxy_state: State<'_, CastProxyState>,
    player: State<'_, PlayerRuntime>,
    _device_name: String,
) -> Result<(), String> {
    {
        let mut lock = proxy_state.inner.lock().unwrap();
        stop_cast_proxy_locked(&mut lock);
    }

    // Safety net: stop any active tap too.
    let _ = player.controller.tx.send(PlayerCommand::CastTapStop);

    let mut lock = sidecar_state.child.lock().unwrap();
    if let Some(ref mut child) = *lock {
        let stop_cmd = json!({ "command": "stop", "args": {} });
@@ -134,50 +522,222 @@ async fn fetch_url(_app: AppHandle, url: String) -> Result<String, String> {
        }
    }
}

#[tauri::command]
async fn fetch_image_data_url(url: String) -> Result<String, String> {
    // Fetch remote images via backend and return a data: URL.
    // This helps when WebView blocks http images (mixed-content) or some hosts block hotlinking.
    let parsed = reqwest::Url::parse(&url).map_err(|e| e.to_string())?;
    match parsed.scheme() {
        "http" | "https" => {}
        _ => return Err("Only http/https URLs are allowed".to_string()),
    }

    let resp = reqwest::Client::new()
        .get(parsed)
        .header(reqwest::header::USER_AGENT, "RadioPlayer/1.0")
        .send()
        .await
        .map_err(|e| e.to_string())?;

    let status = resp.status();
    if !status.is_success() {
        return Err(format!("HTTP {} while fetching image", status));
    }

    let content_type = resp
        .headers()
        .get(reqwest::header::CONTENT_TYPE)
        .and_then(|v| v.to_str().ok())
        .map(|s| s.split(';').next().unwrap_or(s).trim().to_string())
        .unwrap_or_else(|| "application/octet-stream".to_string());

    let bytes = resp.bytes().await.map_err(|e| e.to_string())?;
    const MAX_BYTES: usize = 2 * 1024 * 1024;
    if bytes.len() > MAX_BYTES {
        return Err("Image too large".to_string());
    }

    // Be conservative: prefer image/* content types, but allow svg even if mislabelled.
    let looks_like_image = content_type.starts_with("image/")
        || content_type == "application/svg+xml"
        || url.to_lowercase().ends_with(".svg");
    if !looks_like_image {
        return Err(format!("Not an image content-type: {}", content_type));
    }

    let b64 = general_purpose::STANDARD.encode(bytes);
    Ok(format!("data:{};base64,{}", content_type, b64))
}

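The content-type handling above keeps only the bare media type (dropping parameters such as `; charset=...`) before the image check. The same logic isolated as standalone helpers; `main_media_type` and `is_probably_image` are hypothetical names for this sketch, not functions from the codebase:

```rust
// Extract the bare media type from a Content-Type header value,
// dropping parameters such as "; charset=utf-8".
fn main_media_type(header: &str) -> String {
    header.split(';').next().unwrap_or(header).trim().to_string()
}

// Mirror of the conservative image check: image/* types, the mislabelled
// SVG variant, or a .svg URL suffix.
fn is_probably_image(media_type: &str, url: &str) -> bool {
    media_type.starts_with("image/")
        || media_type == "application/svg+xml"
        || url.to_lowercase().ends_with(".svg")
}

fn main() {
    assert_eq!(main_media_type("image/png; charset=binary"), "image/png");
    assert!(is_probably_image("image/jpeg", "http://example.com/a.jpg"));
    assert!(!is_probably_image("text/html", "http://example.com/a.html"));
    println!("ok");
}
```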
#[cfg_attr(mobile, tauri::mobile_entry_point)]
pub fn run() {
    tauri::Builder::default()
        .plugin(tauri_plugin_shell::init())
        .plugin(tauri_plugin_opener::init())
        .on_window_event(|window, event| {
            // Ensure native audio shuts down on app close.
            // We do not prevent the close; this is best-effort cleanup.
            if matches!(event, tauri::WindowEvent::CloseRequested { .. }) {
                let player = window.app_handle().state::<PlayerRuntime>();
                let _ = player.controller.tx.send(PlayerCommand::Shutdown);

                // Also stop any active cast tap/proxy so we don't leave processes behind.
                let _ = player.controller.tx.send(PlayerCommand::CastTapStop);
                let proxy_state = window.app_handle().state::<CastProxyState>();
                let mut lock = proxy_state.inner.lock().unwrap();
                stop_cast_proxy_locked(&mut lock);
            }
        })
        .setup(|app| {
            app.manage(AppState {
                known_devices: Arc::new(TokioRwLock::new(HashMap::new())),
            });
            app.manage(SidecarState {
                child: Mutex::new(None),
            });
            app.manage(CastProxyState {
                inner: Mutex::new(None),
            });

            // Initialize tracing subscriber for structured logging. Honor RUST_LOG if set.
            tracing_subscriber::fmt::init();

            // Player scaffolding: create shared state behind an Arc and spawn the
            // player thread with a cloned Arc (avoids leaking memory).
            let shared = Arc::new(PlayerShared {
                state: Mutex::new(PlayerState::default()),
            });
            let controller = player::spawn_player_thread(Arc::clone(&shared));
            app.manage(PlayerRuntime { shared, controller });

            let handle = app.handle().clone();

            // Bridge blocking mdns-sd into async device handling via an unbounded channel.
            let mdns_handle = handle.clone();
            let (mdns_tx, mut mdns_rx) = mpsc::unbounded_channel::<(String, String)>();

            // Task: consume events from the channel and update `known_devices` asynchronously.
            let consumer_handle = mdns_handle.clone();
            tauri::async_runtime::spawn(async move {
                while let Some((name, ip_str)) = mdns_rx.recv().await {
                    let state = consumer_handle.state::<AppState>();
                    let mut devices = state.known_devices.write().await;
                    let now = std::time::Instant::now();
                    if !devices.contains_key(&name) {
                        let info = DeviceInfo { ip: ip_str.clone(), last_seen: now };
                        devices.insert(name.clone(), info);
                        let _ = consumer_handle.emit("cast-device-discovered", json!({"name": name, "ip": ip_str}));
                    } else if let Some(d) = devices.get_mut(&name) {
                        d.last_seen = now;
                        d.ip = ip_str;
                    }
                }
            });

            // Probe implementation:
            // - If the feature `use_agnostic_mdns` is enabled, use the async `agnostic-mdns` API.
            // - Otherwise keep the existing blocking `mdns-sd` browse running in a blocking task.
            let probe_tx = mdns_tx.clone();

            #[cfg(feature = "use_agnostic_mdns")]
            {
                // Use agnostic-mdns async API (tokio) to query for Google Cast services
                tauri::async_runtime::spawn(async move {
                    // Create the async channel expected by agnostic-mdns query
                    let (tx, rx) = agnostic_mdns::tokio::channel::unbounded::<agnostic_mdns::worksteal::ServiceEntry>();

                    // Build query params for _googlecast._tcp in the local domain.
                    let params = agnostic_mdns::QueryParam::new("_googlecast._tcp".into())
                        .with_domain("local.".into());

                    // Spawn the query task which will send ServiceEntry values into `tx`.
                    let _ = tokio::spawn(async move {
                        let _ = agnostic_mdns::tokio::query(params, tx).await;
                    });

                    // Consume ServiceEntry results and forward (name, ip) into the probe channel.
                    while let Ok(entry) = rx.recv().await {
                        // Try TXT records for friendly name: entries like "fn=Living Room".
                        let mut friendly: Option<String> = None;
                        for s in entry.txt() {
                            let s_str = s.to_string();
                            if let Some(rest) = s_str.strip_prefix("fn=") {
                                friendly = Some(rest.to_string());
                                break;
                            }
                        }

                        // Fallback: use debug-formatted entry name if TXT 'fn' not present.
                        // This avoids depending on the concrete return type of `name()`.
                        let name = friendly.unwrap_or_else(|| format!("{:?}", entry.name()));

                        // Prefer IPv4, then IPv6.
                        let ip_opt = entry
                            .ipv4_addr()
                            .map(|a| a.to_string())
                            .or_else(|| entry.ipv6_addr().map(|a| a.to_string()));

                        if let Some(ip_str) = ip_opt {
                            let _ = probe_tx.send((name, ip_str));
                        }
                    }
                });
            }

            #[cfg(not(feature = "use_agnostic_mdns"))]
            {
                // Offload blocking mdns-sd browse loop to a blocking thread and forward events over the channel.
                tauri::async_runtime::spawn(async move {
                    let _ = tokio::task::spawn_blocking(move || {
                        let mdns = ServiceDaemon::new().expect("Failed to create daemon");
                        let receiver = mdns
                            .browse("_googlecast._tcp.local.")
                            .expect("Failed to browse");
                        while let Ok(event) = receiver.recv() {
                            if let ServiceEvent::ServiceResolved(info) = event {
                                let name = info
                                    .get_property_val_str("fn")
                                    .or_else(|| Some(info.get_fullname()))
                                    .unwrap()
                                    .to_string();
                                let addresses = info.get_addresses();
                                let ip = addresses
                                    .iter()
                                    .find(|ip| ip.is_ipv4())
                                    .or_else(|| addresses.iter().next());
                                if let Some(ip) = ip {
                                    let ip_str = ip.to_string();
                                    // Best-effort send into the async channel; ignore if receiver dropped.
                                    let _ = probe_tx.send((name, ip_str));
                                }
                            }
                        }
                    }).await;
                });
            }

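The mDNS bridge above follows a standard pattern: a blocking producer loop forwards events into a channel, and a single consumer owns the device map, deduplicating by name. A minimal std-only sketch of the same shape (no tokio; `bridge` is a hypothetical name for illustration):

```rust
use std::collections::HashMap;
use std::sync::mpsc;
use std::thread;

// Producer thread stands in for the blocking mdns-sd browse loop;
// the consumer drains the channel and owns the name -> ip map,
// keeping the first ip seen for each name (like contains_key above).
fn bridge(events: Vec<(String, String)>) -> HashMap<String, String> {
    let (tx, rx) = mpsc::channel::<(String, String)>();
    let producer = thread::spawn(move || {
        for ev in events {
            // Best-effort send; ignore if the receiver is gone.
            let _ = tx.send(ev);
        }
        // tx dropped here, which ends the consumer's recv loop.
    });

    let mut devices = HashMap::new();
    while let Ok((name, ip)) = rx.recv() {
        devices.entry(name).or_insert(ip);
    }
    producer.join().unwrap();
    devices
}

fn main() {
    let devices = bridge(vec![
        ("Living Room".to_string(), "192.168.1.10".to_string()),
        ("Living Room".to_string(), "192.168.1.99".to_string()),
        ("Kitchen".to_string(), "192.168.1.11".to_string()),
    ]);
    assert_eq!(devices.len(), 2);
    assert_eq!(devices["Living Room"], "192.168.1.10");
    println!("ok");
}
```

The key property is that only the consumer touches the map, so the producer never holds a lock while doing (potentially slow) network work.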
            // Spawn an async GC task to drop stale devices and notify frontend
            let gc_handle = handle.clone();
            tauri::async_runtime::spawn(async move {
                let stale_after = Duration::from_secs(30);
                let mut interval = tokio::time::interval(Duration::from_secs(10));
                loop {
                    interval.tick().await;
                    let state = gc_handle.state::<AppState>();
                    let mut devices = state.known_devices.write().await;
                    let now = std::time::Instant::now();
                    let mut removed: Vec<String> = Vec::new();
                    devices.retain(|name, info| {
                        if now.duration_since(info.last_seen) > stale_after {
                            removed.push(name.clone());
                            false
                        } else {
                            true
                        }
                    });
                    for name in removed {
                        let _ = gc_handle.emit("cast-device-removed", json!({"name": name}));
                    }
                }
            });
@@ -188,8 +748,17 @@ pub fn run() {
            cast_play,
            cast_stop,
            cast_set_volume,
            cast_proxy_start,
            cast_proxy_stop,
            // allow frontend to request arbitrary URLs via backend (bypass CORS)
            fetch_url,
            // fetch remote images via backend (data: URL), helps with mixed-content
            fetch_image_data_url,
            // native player commands (step 1 scaffold)
            player_play,
            player_stop,
            player_set_volume,
            player_get_state
        ])
        .run(tauri::generate_context!())
        .expect("error while running tauri application");

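The GC task above uses `retain` both to drop entries whose `last_seen` exceeds the staleness window and to collect the dropped names for the `cast-device-removed` event. The same pass isolated with synthetic timestamps (`gc` is a hypothetical helper name):

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

struct DeviceInfo {
    #[allow(dead_code)]
    ip: String,
    last_seen: Instant,
}

// Drop devices not seen within `stale_after` and return their names,
// mirroring the GC loop's `removed` list.
fn gc(devices: &mut HashMap<String, DeviceInfo>, now: Instant, stale_after: Duration) -> Vec<String> {
    let mut removed = Vec::new();
    devices.retain(|name, info| {
        if now.duration_since(info.last_seen) > stale_after {
            removed.push(name.clone());
            false
        } else {
            true
        }
    });
    removed
}

fn main() {
    let start = Instant::now();
    let mut devices = HashMap::new();
    devices.insert("old".to_string(), DeviceInfo { ip: "10.0.0.1".into(), last_seen: start });
    // Pretend 40s have passed; a 30s window drops the entry.
    let now = start + Duration::from_secs(40);
    let removed = gc(&mut devices, now, Duration::from_secs(30));
    assert_eq!(removed, vec!["old".to_string()]);
    assert!(devices.is_empty());
    println!("ok");
}
```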
src-tauri/src/player.rs — new file (936 lines)
@@ -0,0 +1,936 @@
use serde::Serialize;
use std::io::Read;
use std::process::{Command, Stdio};
use std::ffi::OsString;
use std::sync::{
    atomic::{AtomicBool, AtomicU32, Ordering},
    mpsc, Arc, Mutex,
};
use std::time::Duration;

use cpal::traits::{DeviceTrait, HostTrait, StreamTrait};
use ringbuf::HeapRb;

#[cfg(windows)]
use std::os::windows::process::CommandExt;

#[cfg(windows)]
const CREATE_NO_WINDOW: u32 = 0x08000000;

fn command_hidden(program: &OsString) -> Command {
    let mut cmd = Command::new(program);
    #[cfg(windows)]
    {
        cmd.creation_flags(CREATE_NO_WINDOW);
    }
    cmd
}

#[derive(Debug, Clone, Serialize, PartialEq, Eq)]
#[serde(rename_all = "lowercase")]
pub enum PlayerStatus {
    Idle,
    Buffering,
    Playing,
    Stopped,
    Error,
}

#[derive(Debug, Clone, Serialize)]
pub struct PlayerState {
    pub status: PlayerStatus,
    pub url: Option<String>,
    pub volume: f32,
    pub error: Option<String>,
}

impl Default for PlayerState {
    fn default() -> Self {
        Self {
            status: PlayerStatus::Idle,
            url: None,
            volume: 0.5,
            error: None,
        }
    }
}

pub struct PlayerShared {
    pub state: Mutex<PlayerState>,
}

impl PlayerShared {
    pub fn snapshot(&self) -> PlayerState {
        self.state.lock().unwrap().clone()
    }
}

#[derive(Debug)]
pub enum PlayerCommand {
    Play { url: String },
    // Cast-only playback: decode to PCM and keep it available for cast taps,
    // but do not open a CPAL output stream.
    PlayCast { url: String },
    Stop,
    SetVolume { volume: f32 },
    CastTapStart {
        port: u16,
        bind_host: String,
        reply: mpsc::Sender<Result<(), String>>,
    },
    CastTapStop,
    Shutdown,
}

#[derive(Clone)]
pub struct PlayerController {
    pub tx: mpsc::Sender<PlayerCommand>,
}

pub fn spawn_player_thread(shared: std::sync::Arc<PlayerShared>) -> PlayerController {
    let (tx, rx) = mpsc::channel::<PlayerCommand>();

    let shared_for_thread = std::sync::Arc::clone(&shared);
    std::thread::spawn(move || player_thread(shared_for_thread, rx));
    PlayerController { tx }
}

fn clamp01(v: f32) -> f32 {
    if v.is_nan() {
        0.0
    } else if v < 0.0 {
        0.0
    } else if v > 1.0 {
        1.0
    } else {
        v
    }
}

fn volume_to_bits(v: f32) -> u32 {
    clamp01(v).to_bits()
}

fn volume_from_bits(bits: u32) -> f32 {
    f32::from_bits(bits)
}

fn set_status(shared: &std::sync::Arc<PlayerShared>, status: PlayerStatus) {
    let mut s = shared.state.lock().unwrap();
    if s.status != status {
        s.status = status;
    }
}

fn set_error(shared: &std::sync::Arc<PlayerShared>, message: String) {
    let mut s = shared.state.lock().unwrap();
    s.status = PlayerStatus::Error;
    s.error = Some(message);
}

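The bit-casting helpers above exist because Rust has no `AtomicF32`: the volume is clamped to [0, 1], transmuted to its `u32` bit pattern, and stored in an `AtomicU32` so the real-time audio callback can read it without locking. A minimal std-only sketch of that pattern (`AtomicVolume` is a hypothetical wrapper, not a type from the codebase):

```rust
use std::sync::atomic::{AtomicU32, Ordering};

// Mirror of the clamp01 helper: NaN and out-of-range values collapse into [0, 1].
fn clamp01(v: f32) -> f32 {
    if v.is_nan() { 0.0 } else { v.clamp(0.0, 1.0) }
}

// Store an f32 inside an AtomicU32 via its IEEE-754 bit pattern.
struct AtomicVolume(AtomicU32);

impl AtomicVolume {
    fn new(v: f32) -> Self {
        Self(AtomicU32::new(clamp01(v).to_bits()))
    }
    fn set(&self, v: f32) {
        self.0.store(clamp01(v).to_bits(), Ordering::Relaxed);
    }
    fn get(&self) -> f32 {
        f32::from_bits(self.0.load(Ordering::Relaxed))
    }
}

fn main() {
    let vol = AtomicVolume::new(0.25);
    assert_eq!(vol.get(), 0.25);
    vol.set(1.7); // clamped to 1.0
    assert_eq!(vol.get(), 1.0);
    vol.set(f32::NAN); // NaN collapses to 0.0
    assert_eq!(vol.get(), 0.0);
    println!("ok");
}
```

`Relaxed` ordering is enough here because the volume is an independent scalar with no other memory it needs to synchronize with.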
pub(crate) fn ffmpeg_command() -> OsString {
    // Step 2: external ffmpeg binary.
    // Lookup order:
    // 1) RADIOPLAYER_FFMPEG (absolute or relative)
    // 2) ffmpeg next to the application executable
    // 3) PATH lookup (ffmpeg / ffmpeg.exe)
    if let Ok(p) = std::env::var("RADIOPLAYER_FFMPEG") {
        if !p.trim().is_empty() {
            return OsString::from(p);
        }
    }

    let local_name = if cfg!(windows) { "ffmpeg.exe" } else { "ffmpeg" };
    if let Ok(exe) = std::env::current_exe() {
        if let Some(dir) = exe.parent() {
            // Common locations depending on bundler/platform.
            let candidates = [
                dir.join(local_name),
                // Some packagers place resources in a sibling folder.
                dir.join("resources").join(local_name),
                dir.join("Resources").join(local_name),
                // Or one level above.
                dir.join("..").join("resources").join(local_name),
                dir.join("..").join("Resources").join(local_name),
            ];
            for candidate in candidates {
                if candidate.exists() {
                    return candidate.into_os_string();
                }
            }
        }
    }

    OsString::from(local_name)
}

pub fn preflight_ffmpeg_only() -> Result<(), String> {
    let ffmpeg = ffmpeg_command();
    let status = command_hidden(&ffmpeg)
        .arg("-version")
        .stdout(Stdio::null())
        .stderr(Stdio::null())
        .status()
        .map_err(|e| {
            let ffmpeg_disp = ffmpeg.to_string_lossy();
            format!(
                "FFmpeg not available ({ffmpeg_disp}): {e}. Set RADIOPLAYER_FFMPEG, bundle ffmpeg next to the app, or install ffmpeg on PATH."
            )
        })?;
    if !status.success() {
        return Err("FFmpeg exists but returned non-zero for -version".to_string());
    }
    Ok(())
}

pub fn preflight_check() -> Result<(), String> {
    // Ensure we have an output device up-front so UI gets a synchronous error.
    let host = cpal::default_host();
    let device = host
        .default_output_device()
        .ok_or_else(|| "No default audio output device".to_string())?;
    let _ = device
        .default_output_config()
        .map_err(|e| format!("Failed to get output config: {e}"))?;

    preflight_ffmpeg_only()?;

    Ok(())
}

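The lookup order in `ffmpeg_command` (environment override, then candidate directories next to the executable, then a bare name for PATH lookup) is easy to test in isolation. A hedged sketch of the same resolution policy, where `resolve_tool` is a hypothetical stand-in that takes the override and candidate list as parameters instead of reading the process environment:

```rust
use std::ffi::OsString;
use std::path::Path;

// Resolve a tool binary: 1) non-empty env override wins, 2) first existing
// candidate directory containing the binary, 3) bare name for PATH lookup.
fn resolve_tool(env_override: Option<&str>, candidate_dirs: &[&Path], bare: &str) -> OsString {
    if let Some(p) = env_override {
        if !p.trim().is_empty() {
            return OsString::from(p);
        }
    }
    for dir in candidate_dirs {
        let candidate = dir.join(bare);
        if candidate.exists() {
            return candidate.into_os_string();
        }
    }
    OsString::from(bare)
}

fn main() {
    // Env override wins even when candidates are present.
    assert_eq!(
        resolve_tool(Some("/opt/ffmpeg"), &[Path::new("/usr/bin")], "ffmpeg"),
        OsString::from("/opt/ffmpeg")
    );
    // An empty override is ignored, and with no matching candidate we
    // fall back to the bare name so the OS does the PATH lookup.
    assert_eq!(
        resolve_tool(Some("  "), &[Path::new("/nonexistent-dir")], "ffmpeg"),
        OsString::from("ffmpeg")
    );
    println!("ok");
}
```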
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum PipelineMode {
    WithOutput,
    Headless,
}

struct CastTapProc {
    child: std::process::Child,
    writer_join: Option<std::thread::JoinHandle<()>>,
    server_join: Option<std::thread::JoinHandle<()>>,
    stop_flag: Arc<AtomicBool>,
}

struct Pipeline {
    stop_flag: Arc<AtomicBool>,
    volume_bits: Arc<AtomicU32>,
    _stream: Option<cpal::Stream>,
    decoder_join: Option<std::thread::JoinHandle<()>>,
    cast_tx: Arc<Mutex<Option<mpsc::SyncSender<Vec<u8>>>>>,
    cast_proc: Option<CastTapProc>,
    sample_rate: u32,
    channels: u16,
}

impl Pipeline {
    fn start(shared: std::sync::Arc<PlayerShared>, url: String, mode: PipelineMode) -> Result<Self, String> {
        let (device, sample_format, cfg, sample_rate, channels) = match mode {
            PipelineMode::WithOutput => {
                let host = cpal::default_host();
                let device = host
                    .default_output_device()
                    .ok_or_else(|| "No default audio output device".to_string())?;
                let default_cfg = device
                    .default_output_config()
                    .map_err(|e| format!("Failed to get output config: {e}"))?;
                let sample_format = default_cfg.sample_format();
                let cfg = default_cfg.config();
                let sample_rate = cfg.sample_rate.0;
                let channels = cfg.channels as u16;
                (Some(device), Some(sample_format), Some(cfg), sample_rate, channels)
            }
            PipelineMode::Headless => {
                // For cast-only, pick a sane, widely-supported PCM format.
                // This does not depend on an audio device.
                (None, None, None, 48_000u32, 2u16)
            }
        };

        // 5 seconds of PCM buffering (i16 samples)
        let (mut prod_opt, mut cons_opt) = if mode == PipelineMode::WithOutput {
            let cfg = cfg.as_ref().expect("cfg must exist for WithOutput");
            let capacity_samples = (sample_rate as usize)
                .saturating_mul(cfg.channels as usize)
                .saturating_mul(5);
            let rb = HeapRb::<i16>::new(capacity_samples);
            let (prod, cons) = rb.split();
            (Some(prod), Some(cons))
        } else {
            (None, None)
        };

        let stop_flag = Arc::new(AtomicBool::new(false));
        let volume_bits = Arc::new(AtomicU32::new({
            let s = shared.state.lock().unwrap();
            volume_to_bits(s.volume)
        }));

        let cast_tx: Arc<Mutex<Option<mpsc::SyncSender<Vec<u8>>>>> = Arc::new(Mutex::new(None));

        // Decoder thread: spawns ffmpeg, reads PCM, writes into ring buffer.
        let stop_for_decoder = Arc::clone(&stop_flag);
        let shared_for_decoder = std::sync::Arc::clone(&shared);
        let decoder_url = url.clone();
        let cast_tx_for_decoder = Arc::clone(&cast_tx);
        let decoder_join = std::thread::spawn(move || {
            let mut backoff_ms: u64 = 250;
            let mut pushed_since_start: usize = 0;
            let playing_threshold_samples = (sample_rate as usize)
                .saturating_mul(channels as usize)
                .saturating_div(4); // ~250ms

            'outer: loop {
                if stop_for_decoder.load(Ordering::SeqCst) {
                    break;
                }

                set_status(&shared_for_decoder, PlayerStatus::Buffering);

                let ffmpeg = ffmpeg_command();
                let ffmpeg_disp = ffmpeg.to_string_lossy();
                let mut child = match command_hidden(&ffmpeg)
                    .arg("-nostdin")
                    .arg("-hide_banner")
                    .arg("-loglevel")
                    .arg("warning")
                    // basic reconnect flags (best-effort; not all protocols honor these)
                    .arg("-reconnect")
                    .arg("1")
                    .arg("-reconnect_streamed")
                    .arg("1")
                    .arg("-reconnect_delay_max")
                    .arg("5")
                    .arg("-i")
                    .arg(&decoder_url)
                    .arg("-vn")
                    .arg("-ac")
                    .arg(channels.to_string())
                    .arg("-ar")
                    .arg(sample_rate.to_string())
                    .arg("-f")
                    .arg("s16le")
                    .arg("pipe:1")
                    .stdout(Stdio::piped())
                    .stderr(Stdio::null())
                    .spawn()
                {
                    Ok(c) => c,
                    Err(e) => {
                        // If ffmpeg isn't available, this is a hard failure.
                        set_error(
                            &shared_for_decoder,
                            format!(
                                "Failed to start ffmpeg ({ffmpeg_disp}): {e}. Set RADIOPLAYER_FFMPEG, bundle ffmpeg next to the app, or install ffmpeg on PATH."
                            ),
                        );
                        break;
                    }
                };

                let mut stdout = match child.stdout.take() {
                    Some(s) => s,
                    None => {
                        set_error(&shared_for_decoder, "ffmpeg stdout not available".to_string());
                        let _ = child.kill();
                        break;
                    }
                };

                let mut buf = [0u8; 8192];
                let mut leftover: Option<u8> = None;

                loop {
                    if stop_for_decoder.load(Ordering::SeqCst) {
                        let _ = child.kill();
                        let _ = child.wait();
                        break 'outer;
                    }

                    let n = match stdout.read(&mut buf) {
                        Ok(0) => 0,
                        Ok(n) => n,
                        Err(_) => 0,
                    };

                    if n == 0 {
                        // EOF / disconnect. Try to reconnect after backoff.
                        let _ = child.kill();
                        let _ = child.wait();
                        if stop_for_decoder.load(Ordering::SeqCst) {
                            break 'outer;
                        }
                        set_status(&shared_for_decoder, PlayerStatus::Buffering);
                        std::thread::sleep(Duration::from_millis(backoff_ms));
                        backoff_ms = (backoff_ms * 2).min(5000);
                        continue 'outer;
                    }

                    backoff_ms = 250;

                    // Forward raw PCM bytes to cast tap (if enabled).
                    if let Some(tx) = cast_tx_for_decoder.lock().unwrap().as_ref() {
                        // Best-effort: never block local playback.
                        let _ = tx.try_send(buf[..n].to_vec());
                    }

                    // Convert bytes to i16 LE samples
                    let mut i = 0usize;
                    if let Some(b0) = leftover.take() {
                        if n >= 1 {
                            let b1 = buf[0];
                            let sample = i16::from_le_bytes([b0, b1]);
                            if let Some(prod) = prod_opt.as_mut() {
                                let _ = prod.push(sample);
                            }
                            pushed_since_start += 1;
                            i = 1;
                        } else {
                            leftover = Some(b0);
                        }
                    }

                    while i + 1 < n {
                        let sample = i16::from_le_bytes([buf[i], buf[i + 1]]);
                        if let Some(prod) = prod_opt.as_mut() {
                            let _ = prod.push(sample);
                        }
                        pushed_since_start += 1;
                        i += 2;
                    }

                    if i < n {
                        leftover = Some(buf[i]);
                    }

                    // Move to Playing once we've decoded a small buffer.
                    if pushed_since_start >= playing_threshold_samples {
                        set_status(&shared_for_decoder, PlayerStatus::Playing);
                    }
                }
            }
        });

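The byte-to-sample conversion in the decoder thread handles reads that end on an odd byte by carrying the trailing byte over into the next chunk, so no sample is ever split or dropped. The same logic as a standalone function (`decode_s16le` is a hypothetical name for this sketch):

```rust
// Convert a chunk of little-endian s16 PCM bytes into samples, carrying an
// odd trailing byte into the next call via `leftover`.
fn decode_s16le(chunk: &[u8], leftover: &mut Option<u8>) -> Vec<i16> {
    let mut samples = Vec::new();
    let mut i = 0usize;
    // Complete the sample split across the previous chunk boundary, if any.
    if let Some(b0) = leftover.take() {
        if !chunk.is_empty() {
            samples.push(i16::from_le_bytes([b0, chunk[0]]));
            i = 1;
        } else {
            *leftover = Some(b0);
        }
    }
    while i + 1 < chunk.len() {
        samples.push(i16::from_le_bytes([chunk[i], chunk[i + 1]]));
        i += 2;
    }
    // An odd trailing byte becomes the carry for the next chunk.
    if i < chunk.len() {
        *leftover = Some(chunk[i]);
    }
    samples
}

fn main() {
    let mut leftover = None;
    // 0x0001 little-endian is [0x01, 0x00]; split it across two chunks.
    let first = decode_s16le(&[0x01], &mut leftover);
    assert!(first.is_empty());
    assert_eq!(leftover, Some(0x01));
    let second = decode_s16le(&[0x00, 0xFF, 0x7F], &mut leftover);
    assert_eq!(second, vec![1, i16::MAX]);
    assert_eq!(leftover, None);
    println!("ok");
}
```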
        let stream = if mode == PipelineMode::WithOutput {
            let device = device.expect("device must exist for WithOutput");
            let sample_format = sample_format.expect("sample_format must exist for WithOutput");
            let cfg = cfg.expect("cfg must exist for WithOutput");
            let mut cons = cons_opt.take().expect("cons must exist for WithOutput");

            // Audio callback: drain ring buffer and write to output.
            let shared_for_cb = std::sync::Arc::clone(&shared);
            let shared_for_cb_err = std::sync::Arc::clone(&shared_for_cb);
            let stop_for_cb = Arc::clone(&stop_flag);
            let volume_for_cb = Arc::clone(&volume_bits);

            let mut last_was_underrun = false;

            let err_fn = move |err| {
                let msg = format!("Audio output error: {err}");
                set_error(&shared_for_cb_err, msg);
            };

            let built = match sample_format {
                cpal::SampleFormat::F32 => device.build_output_stream(
                    &cfg,
                    move |data: &mut [f32], _| {
                        if stop_for_cb.load(Ordering::Relaxed) {
                            for s in data.iter_mut() {
                                *s = 0.0;
                            }
                            return;
                        }

                        let vol = volume_from_bits(volume_for_cb.load(Ordering::Relaxed));
                        let mut underrun = false;
                        for s in data.iter_mut() {
                            if let Some(v) = cons.pop() {
                                *s = (v as f32 / 32768.0) * vol;
                            } else {
                                *s = 0.0;
                                underrun = true;
                            }
                        }
                        if underrun != last_was_underrun {
                            last_was_underrun = underrun;
                            set_status(
                                &shared_for_cb,
                                if underrun {
                                    PlayerStatus::Buffering
                                } else {
                                    PlayerStatus::Playing
                                },
                            );
                        }
                    },
                    err_fn,
                    None,
                ),
                cpal::SampleFormat::I16 => device.build_output_stream(
                    &cfg,
                    move |data: &mut [i16], _| {
                        if stop_for_cb.load(Ordering::Relaxed) {
                            for s in data.iter_mut() {
                                *s = 0;
                            }
                            return;
                        }

                        let vol = volume_from_bits(volume_for_cb.load(Ordering::Relaxed));
                        let mut underrun = false;
                        for s in data.iter_mut() {
                            if let Some(v) = cons.pop() {
                                let scaled =
                                    (v as f32 * vol).clamp(i16::MIN as f32, i16::MAX as f32);
                                *s = scaled as i16;
                            } else {
                                *s = 0;
                                underrun = true;
                            }
                        }
                        if underrun != last_was_underrun {
                            last_was_underrun = underrun;
                            set_status(
                                &shared_for_cb,
                                if underrun {
                                    PlayerStatus::Buffering
                                } else {
                                    PlayerStatus::Playing
                                },
                            );
                        }
                    },
                    err_fn,
                    None,
                ),
                cpal::SampleFormat::U16 => device.build_output_stream(
                    &cfg,
                    move |data: &mut [u16], _| {
                        if stop_for_cb.load(Ordering::Relaxed) {
                            for s in data.iter_mut() {
                                *s = 0;
                            }
                            return;
                        }

                        let vol = volume_from_bits(volume_for_cb.load(Ordering::Relaxed));
                        let mut underrun = false;
                        for s in data.iter_mut() {
                            if let Some(v) = cons.pop() {
                                // Convert signed i16 to unsigned with bias.
                                let f = (v as f32 / 32768.0) * vol;
                                let scaled = (f * 32767.0 + 32768.0).clamp(0.0, 65535.0);
                                *s = scaled as u16;
                            } else {
                                *s = 0;
                                underrun = true;
                            }
                        }
                        if underrun != last_was_underrun {
                            last_was_underrun = underrun;
                            set_status(
                                &shared_for_cb,
                                if underrun {
                                    PlayerStatus::Buffering
                                } else {
                                    PlayerStatus::Playing
                                },
                            );
                        }
                    },
                    err_fn,
                    None,
                ),
                _ => return Err("Unsupported output sample format".to_string()),
            }
            .map_err(|e| format!("Failed to create output stream: {e}"))?;

            built
                .play()
                .map_err(|e| format!("Failed to start output stream: {e}"))?;
            Some(built)
        } else {
            None
        };

        Ok(Self {
            stop_flag,
            volume_bits,
            _stream: stream,
            decoder_join: Some(decoder_join),
            cast_tx,
            cast_proc: None,
            sample_rate,
            channels,
        })
    }

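Each CPAL callback above drains the ring buffer, substitutes silence when it runs dry, and flips the shared status only on underrun transitions. The core of that fill policy in isolation, with a std `VecDeque` standing in for the lock-free ring buffer (in the real callback no allocation or locking is allowed):

```rust
use std::collections::VecDeque;

// Fill an output slice from a sample queue; return whether this callback underran.
fn fill_output(queue: &mut VecDeque<i16>, out: &mut [f32], volume: f32) -> bool {
    let mut underrun = false;
    for s in out.iter_mut() {
        if let Some(v) = queue.pop_front() {
            // Same i16 -> f32 scaling as the F32 arm above.
            *s = (v as f32 / 32768.0) * volume;
        } else {
            *s = 0.0; // silence on underrun instead of glitching
            underrun = true;
        }
    }
    underrun
}

fn main() {
    let mut queue: VecDeque<i16> = VecDeque::from(vec![16384, -16384]);
    let mut out = [0.0f32; 4];
    // Only two samples queued for four output slots: the tail is silence.
    let underran = fill_output(&mut queue, &mut out, 1.0);
    assert!(underran);
    assert_eq!(out[0], 0.5);
    assert_eq!(out[1], -0.5);
    assert_eq!(out[2], 0.0);
    assert_eq!(out[3], 0.0);
    println!("ok");
}
```

Reporting only the underrun *transition* (as `last_was_underrun` does above) keeps the UI from being spammed with a status change on every callback.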
    fn start_cast_tap(&mut self, port: u16, bind_host: &str, sample_rate: u32, channels: u16) -> Result<(), String> {
        // Stop existing tap first.
        self.stop_cast_tap();

        let ffmpeg = ffmpeg_command();
        let ffmpeg_disp = ffmpeg.to_string_lossy();
        let bind_host = bind_host.to_owned();

        let spawn = |codec: &str| -> Result<std::process::Child, String> {
            command_hidden(&ffmpeg)
                .arg("-nostdin")
                .arg("-hide_banner")
                .arg("-loglevel")
                .arg("warning")
                .arg("-f")
                .arg("s16le")
                .arg("-ac")
                .arg(channels.to_string())
                .arg("-ar")
                .arg(sample_rate.to_string())
                .arg("-i")
                .arg("pipe:0")
                .arg("-vn")
                .arg("-c:a")
                .arg(codec)
                .arg("-b:a")
                .arg("128k")
                .arg("-f")
                .arg("mp3")
                .arg("-")
                .stdin(Stdio::piped())
                .stdout(Stdio::piped())
                .stderr(Stdio::piped())
                .spawn()
                .map_err(|e| {
                    format!(
                        "Failed to start ffmpeg cast tap ({ffmpeg_disp}): {e}. Set RADIOPLAYER_FFMPEG, bundle ffmpeg next to the app, or install ffmpeg on PATH."
                    )
                })
        };

        let mut child = spawn("libmp3lame")?;
        std::thread::sleep(Duration::from_millis(150));
        if let Ok(Some(status)) = child.try_wait() {
            if !status.success() {
                // Some builds lack libmp3lame; fall back to built-in encoder.
                child = spawn("mp3")?;
            }
        }

        let stdin = child
            .stdin
            .take()
            .ok_or_else(|| "ffmpeg cast tap stdin not available".to_string())?;

        let stdout = child
            .stdout
            .take()
            .ok_or_else(|| "ffmpeg cast tap stdout not available".to_string())?;

        // Log stderr for debugging tap failures
        if let Some(stderr) = child.stderr.take() {
            std::thread::spawn(move || {
                use std::io::BufRead;
                let reader = std::io::BufReader::new(stderr);
                for line in reader.lines().flatten() {
                    eprintln!("[cast-tap ffmpeg] {}", line);
                }
            });
        }

        let (tx, rx) = mpsc::sync_channel::<Vec<u8>>(1024);
        *self.cast_tx.lock().unwrap() = Some(tx);

        let writer_join = std::thread::spawn(move || {
            use std::io::Write;
            let mut stdin = stdin;
            while let Ok(chunk) = rx.recv() {
                if chunk.is_empty() {
                    continue;
                }
                if stdin.write_all(&chunk).is_err() {
                    break;
                }
            }
            let _ = stdin.flush();
        });

        // Spawn simple HTTP server to serve ffmpeg stdout
        let server_stop = Arc::new(AtomicBool::new(false));
        let server_stop_clone = Arc::clone(&server_stop);

        // Use Arc<Mutex<Vec<mpsc::SyncSender>>> for broadcasting to multiple clients
        let clients: Arc<Mutex<Vec<mpsc::SyncSender<Vec<u8>>>>> = Arc::new(Mutex::new(Vec::new()));
        let clients_reader = Arc::clone(&clients);

        // Reader thread: reads from ffmpeg stdout and broadcasts to all subscribers
        let reader_stop = Arc::clone(&server_stop);
        std::thread::spawn(move || {
            use std::io::Read;
            let mut ffmpeg_out = stdout;
            let mut buffer = vec![0u8; 16384];

            loop {
                if reader_stop.load(Ordering::SeqCst) {
                    break;
                }

                match ffmpeg_out.read(&mut buffer) {
                    Ok(0) => break,
                    Ok(n) => {
                        let chunk = buffer[..n].to_vec();
                        let mut clients_lock = clients_reader.lock().unwrap();
                        // Drop clients whose channel is full or disconnected.
                        clients_lock.retain(|tx| tx.try_send(chunk.clone()).is_ok());
                    }
                    Err(_) => break,
                }
            }
        });

        let clients_server = Arc::clone(&clients);
        let server_join = std::thread::spawn(move || {
            use std::io::{BufRead, BufReader, Write};
            use std::net::TcpListener;

            let listener = match TcpListener::bind(format!("{bind_host}:{port}")) {
                Ok(l) => l,
                Err(e) => {
                    eprintln!("[cast-tap server] Failed to bind: {e}");
                    return;
                }
            };
            if let Err(e) = listener.set_nonblocking(true) {
                eprintln!("[cast-tap server] Failed to set nonblocking: {e}");
                return;
            }

            loop {
                if server_stop_clone.load(Ordering::SeqCst) {
                    break;
                }

                // Accept client connections
                let stream = match listener.accept() {
                    Ok((s, addr)) => {
                        eprintln!("[cast-tap server] Client connected: {addr}");
                        s
                    }
                    Err(ref e) if e.kind() == std::io::ErrorKind::WouldBlock => {
                        std::thread::sleep(Duration::from_millis(50));
                        continue;
                    }
                    Err(e) => {
                        eprintln!("[cast-tap server] Accept error: {e}");
                        break;
                    }
                };

                // Spawn handler for each client
                let stop_flag = Arc::clone(&server_stop_clone);
                let (client_tx, client_rx) = mpsc::sync_channel::<Vec<u8>>(1024);

                // Subscribe this client
                clients_server.lock().unwrap().push(client_tx);

                std::thread::spawn(move || {
                    // Read and discard HTTP request headers; Ok(0) means EOF.
                    let mut reader = BufReader::new(stream.try_clone().unwrap());
                    let mut line = String::new();
                    loop {
                        line.clear();
                        if reader.read_line(&mut line).map_or(true, |n| n == 0)
                            || line == "\r\n"
                            || line == "\n"
                        {
                            break;
                        }
                    }

                    // Send HTTP response headers
                    let mut writer = stream;
                    let headers = b"HTTP/1.1 200 OK\r\nContent-Type: audio/mpeg\r\nConnection: close\r\nCache-Control: no-cache\r\nAccept-Ranges: none\r\nicy-br: 128\r\n\r\n";
                    if writer.write_all(headers).is_err() {
                        return;
                    }

                    // Pre-buffer before streaming to prevent initial stuttering
                    let mut prebuffer = Vec::with_capacity(65536);
                    let prebuffer_start = std::time::Instant::now();
                    while prebuffer.len() < 32768 && prebuffer_start.elapsed() < Duration::from_millis(500) {
match client_rx.recv_timeout(Duration::from_millis(50)) {
|
||||
Ok(chunk) => prebuffer.extend_from_slice(&chunk),
|
||||
_ => break,
|
||||
}
|
||||
}
|
||||
|
||||
// Send prebuffered data
|
||||
if !prebuffer.is_empty() {
|
||||
if writer.write_all(&prebuffer).is_err() {
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Stream chunks to client
|
||||
loop {
|
||||
if stop_flag.load(Ordering::SeqCst) {
|
||||
break;
|
||||
}
|
||||
|
||||
match client_rx.recv_timeout(Duration::from_millis(100)) {
|
||||
Ok(chunk) => {
|
||||
if writer.write_all(&chunk).is_err() {
|
||||
break;
|
||||
}
|
||||
}
|
||||
Err(mpsc::RecvTimeoutError::Timeout) => continue,
|
||||
Err(mpsc::RecvTimeoutError::Disconnected) => break,
|
||||
}
|
||||
}
|
||||
eprintln!("[cast-tap server] Client disconnected");
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
self.cast_proc = Some(CastTapProc {
|
||||
child,
|
||||
writer_join: Some(writer_join),
|
||||
server_join: Some(server_join),
|
||||
stop_flag: server_stop,
|
||||
});
|
||||
|
||||
Ok(())
|
||||
}
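The per-client pre-buffer step above (gather up to 32 KiB or 500 ms of audio before the first write) can be isolated into a standalone helper. This is a minimal sketch, not code from the diff; the name `prebuffer` and its parameters are hypothetical:

```rust
use std::sync::mpsc;
use std::time::{Duration, Instant};

// Hypothetical standalone version of the client pre-buffer step:
// gather bytes until either a size target or a time budget is hit.
fn prebuffer(rx: &mpsc::Receiver<Vec<u8>>, target: usize, budget: Duration) -> Vec<u8> {
    let mut buf = Vec::with_capacity(target * 2);
    let start = Instant::now();
    while buf.len() < target && start.elapsed() < budget {
        match rx.recv_timeout(Duration::from_millis(50)) {
            Ok(chunk) => buf.extend_from_slice(&chunk),
            // Producer gone or no data within the slice: stop early,
            // matching the `_ => break` arm in the handler above.
            _ => break,
        }
    }
    buf
}
```

Bounding the wait by both size and elapsed time keeps first-byte latency low on slow streams while still smoothing out the initial burst for fast ones.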

    fn stop_cast_tap(&mut self) {
        *self.cast_tx.lock().unwrap() = None;
        if let Some(mut proc) = self.cast_proc.take() {
            proc.stop_flag.store(true, Ordering::SeqCst);
            let _ = proc.child.kill();
            let _ = proc.child.wait();
            if let Some(j) = proc.writer_join.take() {
                let _ = j.join();
            }
            if let Some(j) = proc.server_join.take() {
                let _ = j.join();
            }
        }
    }

    fn stop(mut self, shared: &std::sync::Arc<PlayerShared>) {
        self.stop_flag.store(true, Ordering::SeqCst);
        self.stop_cast_tap();
        // dropping stream stops audio
        if let Some(j) = self.decoder_join.take() {
            let _ = j.join();
        }
        set_status(&shared, PlayerStatus::Stopped);
    }

    fn set_volume(&self, volume: f32) {
        self.volume_bits.store(volume_to_bits(volume), Ordering::Relaxed);
    }
}

fn player_thread(shared: std::sync::Arc<PlayerShared>, rx: mpsc::Receiver<PlayerCommand>) {
    // Step 2: FFmpeg decode + CPAL playback.
    let mut pipeline: Option<Pipeline> = None;
    let mut pipeline_cast_owned = false;
    while let Ok(cmd) = rx.recv() {
        match cmd {
            PlayerCommand::Play { url } => {
                if let Some(p) = pipeline.take() {
                    p.stop(&shared);
                }

                pipeline_cast_owned = false;

                {
                    let mut s = shared.state.lock().unwrap();
                    s.error = None;
                    s.url = Some(url.clone());
                    s.status = PlayerStatus::Buffering;
                }

                match Pipeline::start(std::sync::Arc::clone(&shared), url, PipelineMode::WithOutput) {
                    Ok(p) => {
                        // Apply current volume to pipeline atomics.
                        let vol = { shared.state.lock().unwrap().volume };
                        p.set_volume(vol);
                        pipeline = Some(p);
                    }
                    Err(e) => {
                        set_error(&shared, e);
                        pipeline = None;
                    }
                }
            }
            PlayerCommand::PlayCast { url } => {
                if let Some(p) = pipeline.take() {
                    p.stop(&shared);
                }

                pipeline_cast_owned = true;

                {
                    let mut s = shared.state.lock().unwrap();
                    s.error = None;
                    s.url = Some(url.clone());
                    s.status = PlayerStatus::Buffering;
                }

                match Pipeline::start(std::sync::Arc::clone(&shared), url, PipelineMode::Headless) {
                    Ok(p) => {
                        let vol = { shared.state.lock().unwrap().volume };
                        p.set_volume(vol);
                        pipeline = Some(p);
                    }
                    Err(e) => {
                        set_error(&shared, e);
                        pipeline = None;
                    }
                }
            }
            PlayerCommand::Stop => {
                if let Some(p) = pipeline.take() {
                    p.stop(&shared);
                } else {
                    let mut s = shared.state.lock().unwrap();
                    s.status = PlayerStatus::Stopped;
                    s.error = None;
                }
                pipeline_cast_owned = false;
            }
            PlayerCommand::SetVolume { volume } => {
                let v = clamp01(volume);
                {
                    let mut s = shared.state.lock().unwrap();
                    s.volume = v;
                }
                if let Some(p) = pipeline.as_ref() {
                    p.set_volume(v);
                }
            }
            PlayerCommand::CastTapStart { port, bind_host, reply } => {
                if let Some(p) = pipeline.as_mut() {
                    // Current pipeline sample format is always s16le.
                    let res = p.start_cast_tap(port, &bind_host, p.sample_rate, p.channels);
                    let _ = reply.send(res);
                } else {
                    let _ = reply.send(Err("No active decoder pipeline".to_string()));
                }
            }
            PlayerCommand::CastTapStop => {
                if let Some(p) = pipeline.as_mut() {
                    p.stop_cast_tap();
                }
                if pipeline_cast_owned {
                    if let Some(p) = pipeline.take() {
                        p.stop(&shared);
                    }
                    pipeline_cast_owned = false;
                }
            }
            PlayerCommand::Shutdown => break,
        }
    }

    if let Some(p) = pipeline.take() {
        p.stop(&shared);
    } else {
        set_status(&shared, PlayerStatus::Stopped);
    }
}
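The cast-tap server's fan-out can be reduced to a small pattern: one producer pushes chunks through `try_send`, and `Vec::retain` silently drops any subscriber whose bounded channel is full or closed, so a stalled Chromecast client can never back-pressure the decoder. A minimal self-contained sketch of that pattern (the function name `broadcast_demo` is hypothetical, not from the diff):

```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

// Hypothetical minimal reproduction of the fan-out used by the cast-tap
// server: one producer, N subscribers; slow or dead subscribers are evicted.
fn broadcast_demo() -> Vec<Vec<u8>> {
    let clients: Arc<Mutex<Vec<mpsc::SyncSender<Vec<u8>>>>> =
        Arc::new(Mutex::new(Vec::new()));

    // Subscribe one client with a bounded channel.
    let (tx, rx) = mpsc::sync_channel::<Vec<u8>>(8);
    clients.lock().unwrap().push(tx);

    // Producer: try_send never blocks; a full or disconnected channel
    // makes try_send fail, and `retain` removes that subscriber.
    let producer_clients = Arc::clone(&clients);
    let producer = thread::spawn(move || {
        for i in 0u8..3 {
            let chunk = vec![i; 4];
            let mut lock = producer_clients.lock().unwrap();
            lock.retain(|tx| tx.try_send(chunk.clone()).is_ok());
        }
    });
    producer.join().unwrap();

    // Drain what the subscriber received.
    rx.try_iter().collect()
}
```

The trade-off is that a slow reader loses audio (its channel fills and it is evicted) rather than pausing every other client, which is the right call for live radio.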

@@ -1,7 +1,7 @@
{
  "$schema": "https://schema.tauri.app/config/2",
  "productName": "RadioPlayer",
  "version": "0.1.0",
  "version": "0.1.1",
  "identifier": "si.klevze.radioPlayer",
  "build": {
    "frontendDist": "../src"
@@ -26,7 +26,10 @@
    "active": true,
    "targets": "all",
    "externalBin": [
      "binaries/RadioPlayer"
      "binaries/radiocast-sidecar"
    ],
    "resources": [
      "resources/*"
    ],
    "icon": [
      "icons/32x32.png",

@@ -6,6 +6,9 @@
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>RadioPlayer</title>
    <link rel="stylesheet" href="styles.css">
    <link rel="manifest" href="manifest.json">
    <meta name="theme-color" content="#1f1f2e">
    <link rel="apple-touch-icon" href="assets/favicon_io/apple-touch-icon.png">
    <script src="main.js" defer type="module"></script>
</head>

@@ -66,8 +69,9 @@
      </header>

      <section class="artwork-section">
        <div class="artwork-container">
          <div class="artwork-placeholder">
        <div class="artwork-stack">
          <div class="artwork-container">
            <div class="artwork-placeholder">
              <!-- Gooey SVG filter for fluid blob blending -->
              <svg width="0" height="0" style="position:absolute">
                <defs>
@@ -95,6 +99,15 @@

              <img id="station-logo-img" class="station-logo-img hidden" alt="station logo">
              <span class="station-logo-text">1</span>

            </div>
          </div>

          <!-- Coverflow-style station carousel under the artwork (drag or use arrows) -->
          <div id="artwork-coverflow" class="artwork-coverflow" aria-label="Stations">
            <button id="artwork-prev" class="coverflow-arrow left" aria-label="Previous station">‹</button>
            <div id="artwork-coverflow-stage" class="artwork-coverflow-stage" role="list" aria-label="Station icons"></div>
            <button id="artwork-next" class="coverflow-arrow right" aria-label="Next station">›</button>
          </div>
        </div>
      </section>
@@ -109,6 +122,7 @@
        <div id="status-indicator" class="status-indicator-wrap" aria-hidden="true">
          <span class="status-dot"></span>
          <span id="status-text"></span>
          <span id="engine-badge" class="engine-badge" title="Playback engine">FFMPEG</span>
        </div>
      </section>

src/main.js
@@ -1,13 +1,21 @@
const { invoke } = window.__TAURI__.core;
const { getCurrentWindow } = window.__TAURI__.window;

// In Tauri, the WebView may block insecure (http) images as mixed-content.
// We can optionally fetch such images via backend and render as data: URLs.
const runningInTauri = !!(window.__TAURI__ && window.__TAURI__.core);

// State
let stations = [];
let currentIndex = 0;
let isPlaying = false;
let currentMode = 'local'; // 'local' | 'cast'
let currentCastDevice = null;
const audio = new Audio();
let currentCastTransport = null; // 'tap' | 'proxy' | 'direct' | null

// Local playback is handled natively by the Tauri backend (player_* commands).
// The WebView is a control surface only.
let localPlayerPollId = null;

// UI Elements
const stationNameEl = document.getElementById('station-name');
@@ -17,6 +25,7 @@ const nowArtistEl = document.getElementById('now-artist');
const nowTitleEl = document.getElementById('now-title');
const statusTextEl = document.getElementById('status-text');
const statusDotEl = document.querySelector('.status-dot');
const engineBadgeEl = document.getElementById('engine-badge');
const playBtn = document.getElementById('play-btn');
const iconPlay = document.getElementById('icon-play');
const iconStop = document.getElementById('icon-stop');
@@ -28,9 +37,99 @@ const castBtn = document.getElementById('cast-toggle-btn');
const castOverlay = document.getElementById('cast-overlay');
const closeOverlayBtn = document.getElementById('close-overlay');
const deviceListEl = document.getElementById('device-list');
const coverflowStageEl = document.getElementById('artwork-coverflow-stage');
const coverflowPrevBtn = document.getElementById('artwork-prev');
const coverflowNextBtn = document.getElementById('artwork-next');
const artworkPlaceholder = document.querySelector('.artwork-placeholder');
const logoTextEl = document.querySelector('.station-logo-text');
const logoImgEl = document.getElementById('station-logo-img');
const artworkPlaceholder = document.querySelector('.artwork-placeholder');

function toHttpsIfHttp(url) {
  if (!url || typeof url !== 'string') return '';
  return url.startsWith('http://') ? ('https://' + url.slice('http://'.length)) : url;
}

function uniqueNonEmpty(urls) {
  const out = [];
  const seen = new Set();
  for (const u of urls) {
    if (!u || typeof u !== 'string') continue;
    const trimmed = u.trim();
    if (!trimmed) continue;
    if (seen.has(trimmed)) continue;
    seen.add(trimmed);
    out.push(trimmed);
  }
  return out;
}

function setImgWithFallback(imgEl, urls, onFinalError) {
  let dataFallbackUrls = [];
  // Backward compatible signature; allow passing { dataFallbackUrls } as 4th param.
  // (Implemented below via arguments inspection.)

  if (arguments.length >= 4 && arguments[3] && typeof arguments[3] === 'object') {
    const opt = arguments[3];
    if (Array.isArray(opt.dataFallbackUrls)) dataFallbackUrls = opt.dataFallbackUrls;
  }

  const candidates = uniqueNonEmpty(urls);
  let i = 0;
  let dataIdx = 0;
  let triedData = false;

  if (!imgEl || candidates.length === 0) {
    if (imgEl) {
      imgEl.onload = null;
      imgEl.onerror = null;
      imgEl.src = '';
    }
    if (onFinalError) onFinalError();
    return;
  }

  const tryNext = () => {
    if (i >= candidates.length) {
      // If direct loads failed and we're in Tauri, try fetching via backend and set as data URL.
      if (runningInTauri && !triedData && dataFallbackUrls && dataFallbackUrls.length > 0) {
        triedData = true;
        const dataCandidates = uniqueNonEmpty(dataFallbackUrls);
        const tryData = () => {
          if (dataIdx >= dataCandidates.length) {
            if (onFinalError) onFinalError();
            return;
          }
          const u = dataCandidates[dataIdx++];
          invoke('fetch_image_data_url', { url: u })
            .then((dataUrl) => {
              // Once we have a data URL, we can stop the fallback chain.
              imgEl.src = dataUrl;
            })
            .catch(() => tryData());
        };
        tryData();
        return;
      }

      if (onFinalError) onFinalError();
      return;
    }
    const nextUrl = candidates[i++];
    imgEl.src = nextUrl;
  };

  imgEl.onload = () => {
    // keep last successful src
  };
  imgEl.onerror = () => {
    tryNext();
  };

  // Some CDNs block referrers; this can improve logo load reliability.
  try { imgEl.referrerPolicy = 'no-referrer'; } catch (e) {}

  tryNext();
}
// Global error handlers to avoid silent white screen and show errors in UI
window.addEventListener('error', (ev) => {
  try {
@@ -60,17 +159,99 @@ const usIndex = document.getElementById('us_index');
// Init
async function init() {
  try {
    // Helpful debug information for release builds so we can compare parity with dev.
    console.group && console.group('RadioCast init');
    console.log('runningInTauri:', runningInTauri);
    try { console.log('location:', location.href); } catch (_) {}
    try { console.log('userAgent:', navigator.userAgent); } catch (_) {}
    try { console.log('platform:', navigator.platform); } catch (_) {}
    try { console.log('RADIO_DEBUG_DEVTOOLS flag:', localStorage.getItem('RADIO_DEBUG_DEVTOOLS')); } catch (_) {}

    // Always try to read build stamp if present (bundled by build scripts).
    try {
      const resp = await fetch('/build-info.json', { cache: 'no-store' });
      if (resp && resp.ok) {
        const bi = await resp.json();
        console.log('build-info:', bi);
      } else {
        console.log('build-info: not present');
      }
    } catch (e) {
      console.log('build-info: failed to read');
    }

    restoreSavedVolume();
    await loadStations();
    try { console.log('stations loaded:', Array.isArray(stations) ? stations.length : typeof stations); } catch (_) {}
    setupEventListeners();
    ensureArtworkPointerFallback();
    updateUI();
    updateEngineBadge();

    // Optionally open devtools in release builds for debugging parity with `tauri dev`.
    // Enable by setting `localStorage.setItem('RADIO_DEBUG_DEVTOOLS', '1')` or by creating
    // `src/build-info.json` with { debug: true } at build time (the `build:devlike` script does this).
    try {
      let shouldOpen = false;
      try { if (localStorage && localStorage.getItem && localStorage.getItem('RADIO_DEBUG_DEVTOOLS') === '1') shouldOpen = true; } catch (_) {}

      // Build-time flag file (created by tools/write-build-flag.js when running `build`/`build:devlike`).
      try {
        const resp = await fetch('/build-info.json', { cache: 'no-store' });
        if (resp && resp.ok) {
          const bi = await resp.json();
          if (bi && bi.debug) shouldOpen = true;
        }
      } catch (_) {}

      if (shouldOpen) {
        try {
          const w = getCurrentWindow();
          if (w && typeof w.openDevTools === 'function') {
            w.openDevTools();
            console.log('Opened devtools via build-info/localStorage flag');
          }
        } catch (e) { console.warn('Failed to open devtools:', e); }
      }
    } catch (e) { /* ignore */ }

    console.groupEnd && console.groupEnd();
  } catch (e) {
    console.error('Error during init', e);
    if (statusTextEl) statusTextEl.textContent = 'Init error: ' + (e && e.message ? e.message : String(e));
  }
}

function updateEngineBadge() {
  if (!engineBadgeEl) return;

  // In this app:
  // - Local playback uses the native backend (FFmpeg decode + CPAL output).
  // - Cast mode plays via Chromecast.
  const kind = currentMode === 'cast' ? 'cast' : 'ffmpeg';
  const label = kind === 'cast' ? 'CAST' : 'FFMPEG';
  const title = kind === 'cast' ? 'Google Cast playback' : 'Native playback (FFmpeg)';

  const iconSvg = kind === 'cast'
    ? `<svg viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" aria-hidden="true">
        <path d="M2 16.1A5 5 0 0 1 5.9 20" />
        <path d="M2 12.05A9 9 0 0 1 9.95 20" />
        <path d="M2 8V6a14 14 0 0 1 14 14h-2" />
      </svg>`
    : `<svg viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" aria-hidden="true">
        <path d="M4 15V9" />
        <path d="M8 19V5" />
        <path d="M12 16V8" />
        <path d="M16 18V6" />
        <path d="M20 15V9" />
      </svg>`;

  engineBadgeEl.innerHTML = `${iconSvg}<span>${label}</span>`;
  engineBadgeEl.title = title;
  engineBadgeEl.classList.remove('engine-ffmpeg', 'engine-cast', 'engine-html');
  engineBadgeEl.classList.add(`engine-${kind}`);
}

// Volume persistence
function saveVolumeToStorage(val) {
  try {
@@ -94,7 +275,9 @@ function restoreSavedVolume() {
    volumeSlider.value = String(saved);
    volumeValue.textContent = `${saved}%`;
    const decimals = saved / 100;
    audio.volume = decimals;

    // Keep backend player volume in sync (best-effort).
    invoke('player_set_volume', { volume: decimals }).catch(() => {});
    // If currently in cast mode and a device is selected, propagate volume
    if (currentMode === 'cast' && currentCastDevice) {
      invoke('cast_set_volume', { deviceName: currentCastDevice, volume: decimals }).catch(()=>{});
@@ -102,11 +285,58 @@
  }
}

function stopLocalPlayerStatePolling() {
  if (localPlayerPollId) {
    try { clearInterval(localPlayerPollId); } catch (e) {}
    localPlayerPollId = null;
  }
}

function startLocalPlayerStatePolling() {
  stopLocalPlayerStatePolling();
  // Polling keeps the existing UI in sync with native buffering/reconnect.
  localPlayerPollId = setInterval(async () => {
    try {
      if (!isPlaying || currentMode !== 'local') return;
      const st = await invoke('player_get_state');
      if (!st || !statusTextEl || !statusDotEl) return;

      const status = String(st.status || '').toLowerCase();
      if (status === 'buffering') {
        statusTextEl.textContent = 'Buffering...';
        statusDotEl.style.backgroundColor = 'var(--text-muted)';
      } else if (status === 'playing') {
        statusTextEl.textContent = 'Playing';
        statusDotEl.style.backgroundColor = 'var(--success)';
      } else if (status === 'error') {
        statusTextEl.textContent = st.error ? `Error: ${st.error}` : 'Error';
        statusDotEl.style.backgroundColor = 'var(--danger)';

        // Backend is no longer playing; reflect that in UX.
        isPlaying = false;
        stopLocalPlayerStatePolling();
        updateUI();
      } else if (status === 'stopped' || status === 'idle') {
        isPlaying = false;
        stopLocalPlayerStatePolling();
        updateUI();
      } else {
        // idle/stopped: keep UI consistent with our isPlaying flag
      }
    } catch (e) {
      // Don't spam; just surface a minimal indicator.
      try {
        if (statusTextEl) statusTextEl.textContent = 'Error';
      } catch (_) {}
    }
  }, 600);
}

async function loadStations() {
  try {
    // stop any existing pollers before reloading stations
    stopCurrentSongPollers();
    const resp = await fetch('stations.json');
    const resp = await fetch('/stations.json');
    const raw = await resp.json();

    // Normalize station objects so the rest of the app can rely on `name` and `url`.
@@ -152,6 +382,11 @@
    // Append user stations after file stations
    stations = stations.concat(userNormalized);

    // Debug: report how many stations we have after loading
    try {
      console.debug('loadStations: loaded stations count:', stations.length);
    } catch (e) {}

    if (stations.length > 0) {
      // Try to restore last selected station by id
      const lastId = getLastStationId();
@@ -163,7 +398,9 @@
        currentIndex = 0;
      }

      console.debug('loadStations: loading station index', currentIndex);
      loadStation(currentIndex);
      renderCoverflow();
      // start polling for currentSong endpoints (if any)
      startCurrentSongPollers();
    }
@@ -172,6 +409,216 @@
    statusTextEl.textContent = 'Error loading stations';
  }
}

// --- Coverflow UI (3D-ish station cards like your reference image) ---
let coverflowPointerId = null;
let coverflowStartX = 0;
let coverflowLastX = 0;
let coverflowAccum = 0;
let coverflowMoved = false;
let coverflowWheelLock = false;

function renderCoverflow() {
  try {
    if (!coverflowStageEl) return;
    coverflowStageEl.innerHTML = '';

    stations.forEach((s, idx) => {
      const item = document.createElement('div');
      item.className = 'coverflow-item';
      item.dataset.idx = String(idx);

      const rawLogoUrl = s.logo || (s.raw && (s.raw.logo || s.raw.poster)) || '';
      const fallbackLabel = (s && s.name ? String(s.name) : '?').trim();
      item.title = fallbackLabel;

      if (rawLogoUrl) {
        const img = document.createElement('img');
        img.alt = `${s.name} logo`;

        // Try https first (avoids mixed-content blocks), then fall back to original.
        const candidates = [
          toHttpsIfHttp(rawLogoUrl),
          rawLogoUrl,
        ];

        setImgWithFallback(img, candidates, () => {
          item.innerHTML = '';
          item.classList.add('fallback');
          item.textContent = fallbackLabel;
        }, { dataFallbackUrls: [rawLogoUrl] });

        item.appendChild(img);
      } else {
        item.classList.add('fallback');
        item.textContent = fallbackLabel;
      }

      // Click a card: if it's not selected, select it.
      // Double-click the selected card to open the full station grid.
      item.addEventListener('click', async (ev) => {
        if (coverflowMoved) return;
        const idxClicked = Number(item.dataset.idx);
        if (idxClicked !== currentIndex) {
          await setStationByIndex(idxClicked);
        }
      });
      item.addEventListener('dblclick', (ev) => {
        const idxClicked = Number(item.dataset.idx);
        if (idxClicked === currentIndex) openStationsOverlay();
      });

      coverflowStageEl.appendChild(item);
    });

    wireCoverflowInteractions();
    updateCoverflowTransforms();
  } catch (e) {
    console.debug('renderCoverflow failed', e);
  }
}

function wireCoverflowInteractions() {
  try {
    const host = document.getElementById('artwork-coverflow');
    if (!host) return;

    // Buttons
    // IMPORTANT: prevent the coverflow drag handler (pointer capture) from swallowing button clicks.
    if (coverflowPrevBtn) {
      coverflowPrevBtn.onpointerdown = (ev) => { try { ev.stopPropagation(); } catch (e) {} };
      coverflowPrevBtn.onclick = (ev) => {
        try { ev.stopPropagation(); ev.preventDefault(); } catch (e) {}
        setStationByIndex((currentIndex - 1 + stations.length) % stations.length);
      };
    }
    if (coverflowNextBtn) {
      coverflowNextBtn.onpointerdown = (ev) => { try { ev.stopPropagation(); } catch (e) {} };
      coverflowNextBtn.onclick = (ev) => {
        try { ev.stopPropagation(); ev.preventDefault(); } catch (e) {}
        setStationByIndex((currentIndex + 1) % stations.length);
      };
    }

    // Pointer drag (mouse/touch)
    host.onpointerdown = (ev) => {
      if (!stations || stations.length <= 1) return;

      // If the user clicked the arrow buttons, let the button handler run.
      // Otherwise pointer capture can prevent the click from reaching the button.
      try {
        if (ev.target && ev.target.closest && ev.target.closest('.coverflow-arrow')) return;
      } catch (e) {}

      coverflowPointerId = ev.pointerId;
      coverflowStartX = ev.clientX;
      coverflowLastX = ev.clientX;
      coverflowAccum = 0;
      coverflowMoved = false;
      try { host.setPointerCapture(ev.pointerId); } catch (e) {}
    };
    host.onpointermove = (ev) => {
      if (coverflowPointerId === null || ev.pointerId !== coverflowPointerId) return;
      const dx = ev.clientX - coverflowLastX;
      coverflowLastX = ev.clientX;
      if (Math.abs(ev.clientX - coverflowStartX) > 6) coverflowMoved = true;

      // Accumulate movement; change station when threshold passed.
      coverflowAccum += dx;
      const threshold = 36;
      if (coverflowAccum >= threshold) {
        coverflowAccum = 0;
        setStationByIndex((currentIndex - 1 + stations.length) % stations.length);
      } else if (coverflowAccum <= -threshold) {
        coverflowAccum = 0;
        setStationByIndex((currentIndex + 1) % stations.length);
      }
    };
    host.onpointerup = (ev) => {
      if (coverflowPointerId === null || ev.pointerId !== coverflowPointerId) return;
      coverflowPointerId = null;
      // reset moved flag after click would have fired
      setTimeout(() => { coverflowMoved = false; }, 0);
      try { host.releasePointerCapture(ev.pointerId); } catch (e) {}
    };
    host.onpointercancel = (ev) => {
      coverflowPointerId = null;
      coverflowMoved = false;
    };

    // Wheel: next/prev with debounce
    host.onwheel = (ev) => {
      if (!stations || stations.length <= 1) return;
      if (coverflowWheelLock) return;
      const delta = Math.abs(ev.deltaX) > Math.abs(ev.deltaY) ? ev.deltaX : ev.deltaY;
      if (Math.abs(delta) < 6) return;
      ev.preventDefault();
      coverflowWheelLock = true;
      if (delta > 0) setStationByIndex((currentIndex + 1) % stations.length);
      else setStationByIndex((currentIndex - 1 + stations.length) % stations.length);
      setTimeout(() => { coverflowWheelLock = false; }, 160);
    };
  } catch (e) {
    console.debug('wireCoverflowInteractions failed', e);
  }
}

function updateCoverflowTransforms() {
  try {
    if (!coverflowStageEl) return;
    const items = coverflowStageEl.querySelectorAll('.coverflow-item');
    const n = stations ? stations.length : 0;
    if (n <= 0) return;
    const maxVisible = 3;
    items.forEach((el) => {
      const idx = Number(el.dataset.idx);
      // Treat the station list as circular so the coverflow loops infinitely.
      // This makes the "previous" of index 0 be the last station, etc.
      let offset = idx - currentIndex;
      const half = Math.floor(n / 2);
      if (offset > half) offset -= n;
      if (offset < -half) offset += n;

      if (Math.abs(offset) > maxVisible) {
        el.style.opacity = '0';
        el.style.pointerEvents = 'none';
        el.style.transform = 'translate(-50%, -50%) scale(0.6)';
        return;
      }

      const abs = Math.abs(offset);
      const dir = offset === 0 ? 0 : (offset > 0 ? 1 : -1);

      const translateX = dir * (abs * 78);
      const translateZ = -abs * 70;
      const rotateY = dir * (-28 * abs);
      const scale = 1 - abs * 0.12;
      const opacity = 1 - abs * 0.18;
      const zIndex = 100 - abs;

      el.style.opacity = String(opacity);
      el.style.zIndex = String(zIndex);
      el.style.pointerEvents = 'auto';
      el.style.transform = `translate(-50%, -50%) translateX(${translateX}px) translateZ(${translateZ}px) rotateY(${rotateY}deg) scale(${scale})`;

      if (offset === 0) el.classList.add('selected');
      else el.classList.remove('selected');
    });
  } catch (e) {
    console.debug('updateCoverflowTransforms failed', e);
  }
}
||||
|
||||
async function setStationByIndex(idx) {
|
||||
if (idx < 0 || idx >= stations.length) return;
|
||||
const wasPlaying = isPlaying;
|
||||
if (wasPlaying) await stop();
|
||||
currentIndex = idx;
|
||||
saveLastStationId(stations[currentIndex].id);
|
||||
loadStation(currentIndex);
|
||||
updateCoverflowTransforms();
|
||||
if (wasPlaying) await play();
|
||||
}
|
||||
|
||||
|
||||
// --- Current Song Polling ---
|
||||
@@ -414,10 +861,9 @@ function updateNowPlayingUI() {
if (!station) return;

if (nowPlayingEl && nowArtistEl && nowTitleEl) {
// Show now-playing if we have either an artist or a title (some stations only provide title)
if (station.currentSongInfo && (station.currentSongInfo.artist || station.currentSongInfo.title)) {
nowArtistEl.textContent = station.currentSongInfo.artist || '';
nowTitleEl.textContent = station.currentSongInfo.title || '';
if (station.currentSongInfo && station.currentSongInfo.artist && station.currentSongInfo.title) {
nowArtistEl.textContent = station.currentSongInfo.artist;
nowTitleEl.textContent = station.currentSongInfo.title;
nowPlayingEl.classList.remove('hidden');
} else {
nowArtistEl.textContent = '';
@@ -604,6 +1050,24 @@ function setupEventListeners() {
await appWindow.close();
});

// Listen for cast device discovery events from backend
if (runningInTauri && window.__TAURI__ && window.__TAURI__.event) {
window.__TAURI__.event.listen('cast-device-discovered', (event) => {
console.log('Cast device discovered:', event.payload);
// If cast overlay is currently open, refresh the device list
if (!castOverlay.classList.contains('hidden')) {
refreshCastDeviceList();
}
});
// Notify UI when a device is removed so the list can update
window.__TAURI__.event.listen('cast-device-removed', (event) => {
console.log('Cast device removed:', event.payload);
if (!castOverlay.classList.contains('hidden')) {
refreshCastDeviceList();
}
});
}

// Menu button - explicit functionality or placeholder?
// Menu removed — header click opens stations via artwork placeholder

@@ -623,8 +1087,6 @@ function ensureArtworkPointerFallback() {

// Quick inline style fallback (helps when CSS is overridden)
try { ap.style.cursor = 'pointer'; } catch (e) {}
try { if (logoImgEl) logoImgEl.style.cursor = 'pointer'; } catch (e) {}
try { if (logoTextEl) logoTextEl.style.cursor = 'pointer'; } catch (e) {}

let active = false;
const onMove = (ev) => {
@@ -663,33 +1125,45 @@ function loadStation(index) {
if (nowArtistEl) nowArtistEl.textContent = '';
if (nowTitleEl) nowTitleEl.textContent = '';

// Update Logo Text (First letter or number)
// Simple heuristic: if name has a number, use it, else first letter
// If station has a logo URL, show the image; otherwise show the text fallback
if (station.logo && station.logo.length > 0) {
// Verify the logo exists before showing it
checkImageExists(station.logo).then((exists) => {
if (exists) {
logoImgEl.src = station.logo;
logoImgEl.classList.remove('hidden');
logoTextEl.classList.add('hidden');
} else {
logoImgEl.src = '';
logoImgEl.classList.add('hidden');
logoTextEl.classList.remove('hidden');
}
});
} else {
// Fallback: show the full station name when no logo is provided
logoImgEl.src = '';
logoImgEl.classList.add('hidden');
try {
logoTextEl.textContent = (station.name || '').trim();
} catch (e) {
logoTextEl.textContent = '';
// Update main artwork logo (best-effort). Many station logo URLs are http; try https first.
try {
if (logoTextEl && station && station.name) {
logoTextEl.textContent = String(station.name).trim();
logoTextEl.classList.add('logo-name');
}
logoTextEl.classList.remove('hidden');

const rawLogo = (station && (station.logo || (station.raw && (station.raw.logo || '')))) || '';
const rawPoster = (station && ((station.raw && station.raw.poster) || station.poster || '')) || '';

if (logoImgEl) {
// Show fallback until load completes.
logoImgEl.classList.add('hidden');
if (logoTextEl) logoTextEl.classList.remove('hidden');

const candidates = uniqueNonEmpty([
toHttpsIfHttp(rawLogo),
rawLogo,
toHttpsIfHttp(rawPoster),
rawPoster,
]);

setImgWithFallback(logoImgEl, candidates, () => {
logoImgEl.classList.add('hidden');
if (logoTextEl) logoTextEl.classList.remove('hidden');
}, { dataFallbackUrls: [rawLogo, rawPoster] });

// If something loads successfully, show it.
logoImgEl.onload = () => {
logoImgEl.classList.remove('hidden');
if (logoTextEl) logoTextEl.classList.add('hidden');
};
}
} catch (e) {
// non-fatal
}

// Sync coverflow transforms (if present)
try { updateCoverflowTransforms(); } catch (e) {}
// When loading a station, ensure only this station's poller runs
try { startCurrentSongPollers(); } catch (e) { console.debug('startCurrentSongPollers failed in loadStation', e); }
}
@@ -743,12 +1217,13 @@ async function play() {
statusDotEl.style.backgroundColor = 'var(--text-muted)'; // Grey/Yellow while loading

if (currentMode === 'local') {
audio.src = station.url;
audio.volume = volumeSlider.value / 100;
try {
await audio.play();
const vol = volumeSlider.value / 100;
await invoke('player_set_volume', { volume: vol }).catch(() => {});
await invoke('player_play', { url: station.url });
isPlaying = true;
updateUI();
startLocalPlayerStatePolling();
} catch (e) {
console.error('Playback failed', e);
statusTextEl.textContent = 'Error';
@@ -756,7 +1231,49 @@
} else if (currentMode === 'cast' && currentCastDevice) {
// Cast logic
try {
await invoke('cast_play', { deviceName: currentCastDevice, url: station.url });
// UX guard: if native playback is currently decoding a different station,
// stop it explicitly before starting the cast pipeline (which would otherwise
// replace the decoder behind the scenes).
try {
const st = await invoke('player_get_state');
const nativeActive = st && (st.status === 'playing' || st.status === 'buffering') && st.url;
if (nativeActive && st.url !== station.url) {
stopLocalPlayerStatePolling();
await invoke('player_stop').catch(() => {});
}
} catch (_) {
// Ignore: best-effort guard only.
}

let castUrl = station.url;
currentCastTransport = null;
try {
const res = await invoke('cast_proxy_start', { deviceName: currentCastDevice, url: station.url });
if (res && typeof res === 'object') {
castUrl = res.url || station.url;
currentCastTransport = res.mode || 'proxy';
} else {
// Backward-compat (older backend returned string)
castUrl = res || station.url;
currentCastTransport = 'proxy';
}
} catch (e) {
// If proxy cannot start (ffmpeg missing, firewall, etc), fall back to direct station URL.
console.warn('Cast proxy start failed; falling back to direct URL', e);
currentCastTransport = 'direct';
}

await invoke('cast_play', {
deviceName: currentCastDevice,
url: castUrl,
title: station.title || 'Radio',
artist: station.slogan || undefined,
image: station.logo || undefined,
// Additional metadata hints for receivers
subtitle: station.slogan || station.name,
backgroundImage: station.background || station.logo || undefined,
bgGradient: station.bgGradient || 'linear-gradient(135deg,#5b2d91,#b36cf3)'
});
isPlaying = true;
// Sync volume
const vol = volumeSlider.value / 100;
@@ -764,8 +1281,10 @@ async function play() {
updateUI();
} catch (e) {
console.error('Cast failed', e);
statusTextEl.textContent = 'Cast Error';
statusTextEl.textContent = 'Cast Error (check LAN/firewall)';
await invoke('cast_proxy_stop').catch(() => {});
currentMode = 'local'; // Fallback
currentCastTransport = null;
updateUI();
}
}
@@ -773,10 +1292,15 @@ async function play() {

async function stop() {
if (currentMode === 'local') {
audio.pause();
audio.src = '';
stopLocalPlayerStatePolling();
try {
await invoke('player_stop');
} catch (e) {
console.error(e);
}
} else if (currentMode === 'cast' && currentCastDevice) {
try {
await invoke('cast_proxy_stop').catch(() => {});
await invoke('cast_stop', { deviceName: currentCastDevice });
} catch (e) {
console.error(e);
@@ -784,6 +1308,9 @@ async function stop() {
}

isPlaying = false;
if (currentMode !== 'cast') {
currentCastTransport = null;
}
updateUI();
}

@@ -792,33 +1319,15 @@ async function playNext() {

// If playing, stop first? Or seamless?
// For radio, seamless switch requires stop then play new URL
const wasPlaying = isPlaying;

if (wasPlaying) await stop();

currentIndex = (currentIndex + 1) % stations.length;
loadStation(currentIndex);

// persist selection
saveLastStationId(stations[currentIndex].id);

if (wasPlaying) await play();
const nextIndex = (currentIndex + 1) % stations.length;
await setStationByIndex(nextIndex);
}

async function playPrev() {
if (stations.length === 0) return;

const wasPlaying = isPlaying;

if (wasPlaying) await stop();

currentIndex = (currentIndex - 1 + stations.length) % stations.length;
loadStation(currentIndex);

// persist selection
saveLastStationId(stations[currentIndex].id);

if (wasPlaying) await play();
const prevIndex = (currentIndex - 1 + stations.length) % stations.length;
await setStationByIndex(prevIndex);
}

function updateUI() {
@@ -829,15 +1338,27 @@ function updateUI() {
playBtn.classList.add('playing'); // Add pulsing ring animation
statusTextEl.textContent = 'Playing';
statusDotEl.style.backgroundColor = 'var(--success)';
stationSubtitleEl.textContent = currentMode === 'cast' ? `Casting to ${currentCastDevice}` : 'Live Stream';
if (currentMode === 'cast') {
const t = currentCastTransport ? ` (${currentCastTransport})` : '';
stationSubtitleEl.textContent = `Casting${t} to ${currentCastDevice}`;
} else {
stationSubtitleEl.textContent = 'Live Stream';
}
} else {
iconPlay.classList.remove('hidden');
iconStop.classList.add('hidden');
playBtn.classList.remove('playing'); // Remove pulsing ring
statusTextEl.textContent = 'Ready';
statusDotEl.style.backgroundColor = 'var(--text-muted)';
stationSubtitleEl.textContent = currentMode === 'cast' ? `Connected to ${currentCastDevice}` : 'Live Stream';
if (currentMode === 'cast') {
const t = currentCastTransport ? ` (${currentCastTransport})` : '';
stationSubtitleEl.textContent = `Connected${t} to ${currentCastDevice}`;
} else {
stationSubtitleEl.textContent = 'Live Stream';
}
}

updateEngineBadge();
}

function handleVolumeInput() {
@@ -846,7 +1367,7 @@ function handleVolumeInput() {
const decimals = val / 100;

if (currentMode === 'local') {
audio.volume = decimals;
invoke('player_set_volume', { volume: decimals }).catch(() => {});
} else if (currentMode === 'cast' && currentCastDevice) {
invoke('cast_set_volume', { deviceName: currentCastDevice, volume: decimals });
}
@@ -860,6 +1381,10 @@ async function openCastOverlay() {
castOverlay.setAttribute('aria-hidden', 'false');
// ensure cast overlay shows linear list style
deviceListEl.classList.remove('stations-grid');
await refreshCastDeviceList();
}

async function refreshCastDeviceList() {
deviceListEl.innerHTML = '<li class="device"><div class="device-main">Scanning...</div><div class="device-sub">Searching for speakers</div></li>';

try {
@@ -903,14 +1428,20 @@ async function selectCastDevice(deviceName) {
await stop();
}

// Best-effort cleanup: stop any lingering cast transport when changing device/mode.
await invoke('cast_proxy_stop').catch(() => {});

if (deviceName) {
currentMode = 'cast';
currentCastDevice = deviceName;
castBtn.style.color = 'var(--success)';
// Transport mode gets set on play.
currentCastTransport = currentCastTransport || null;
} else {
currentMode = 'local';
currentCastDevice = null;
castBtn.style.color = 'var(--text-main)';
currentCastTransport = null;
}

updateUI();
@@ -920,8 +1451,61 @@ async function selectCastDevice(deviceName) {
// Let's prompt user to play.
}

// Best-effort: stop any cast transport when leaving the window.
window.addEventListener('beforeunload', () => {
try { invoke('cast_proxy_stop'); } catch (_) {}
});

window.addEventListener('DOMContentLoaded', init);

// Service worker is useful for the PWA, but it can cause confusing caching during
// Tauri development because it may serve an older cached `index.html`.
if ('serviceWorker' in navigator) {
if (runningInTauri) {
// Best-effort cleanup so the desktop app doesn't get stuck on an old cached UI.
// If we clear anything, do a one-time reload to ensure the new bundled assets are used.
(async () => {
let changed = false;

try {
const regs = await navigator.serviceWorker.getRegistrations();
if (regs && regs.length) {
await Promise.all(regs.map((r) => r.unregister().catch(() => false)));
changed = true;
}
} catch (_) {}

if ('caches' in window) {
try {
const keys = await caches.keys();
if (keys && keys.length) {
await Promise.all(keys.map((k) => caches.delete(k).catch(() => false)));
changed = true;
}
} catch (_) {}
}

try {
if (changed) {
const k = '__radiocast_sw_cleared_once';
const already = sessionStorage.getItem(k);
if (!already) {
sessionStorage.setItem(k, '1');
location.reload();
}
}
} catch (_) {}
})();
} else {
// Register Service Worker for PWA installation (non-disruptive)
window.addEventListener('load', () => {
navigator.serviceWorker.register('sw.js')
.then((reg) => console.log('ServiceWorker registered:', reg.scope))
.catch((err) => console.debug('ServiceWorker registration failed:', err));
});
}
}

// Open overlay and show list of stations (used by menu/hamburger)
async function openStationsOverlay() {
castOverlay.classList.remove('hidden');
@@ -983,11 +1567,10 @@ async function openStationsOverlay() {
li.onclick = async () => {
currentMode = 'local';
currentCastDevice = null;
currentCastTransport = null;
castBtn.style.color = 'var(--text-main)';
currentIndex = idx;
// Remember this selection
saveLastStationId(stations[idx].id);
loadStation(currentIndex);
try { await invoke('cast_proxy_stop'); } catch (_) {}
await setStationByIndex(idx);
closeCastOverlay();
try { await play(); } catch (e) { console.error('Failed to play station from grid', e); }
};

22
src/manifest.json
Normal file
@@ -0,0 +1,22 @@
{
"name": "RadioPlayer",
"short_name": "Radio",
"description": "RadioPlayer — stream radio stations from the web",
"start_url": ".",
"scope": ".",
"display": "standalone",
"background_color": "#1f1f2e",
"theme_color": "#1f1f2e",
"icons": [
{
"src": "assets/favicon_io/android-chrome-192x192.png",
"sizes": "192x192",
"type": "image/png"
},
{
"src": "assets/favicon_io/android-chrome-512x512.png",
"sizes": "512x512",
"type": "image/png"
}
]
}
153
src/styles.css
@@ -101,7 +101,7 @@ body {
width: 100%;
height: 100%;
position: relative;
padding: 10px; /* Slight padding from window edges if desired, or 0 */
padding: 8px; /* Slight padding from window edges if desired, or 0 */
}

.glass-card {
@@ -115,7 +115,7 @@ body {
border-radius: var(--card-radius);
display: flex;
flex-direction: column;
padding: 24px;
padding: 11px 24px 24px;
box-shadow: 0 16px 40px rgba(0, 0, 0, 0.2);
}

@@ -131,7 +131,7 @@ body {
align-items: center;
margin-bottom: 20px;
-webkit-app-region: drag; /* Draggable area */
padding: 10px 14px 8px 14px;
padding: 1px 14px 8px 14px;
border-radius: 14px;
background: linear-gradient(135deg, rgba(60,84,255,0.14), rgba(123,127,216,0.10));
border: 1px solid rgba(120,130,255,0.12);
@@ -211,6 +211,31 @@ body {
gap: 8px;
}

.engine-badge {
display: inline-flex;
align-items: center;
gap: 6px;
font-size: 0.72rem;
letter-spacing: 0.6px;
text-transform: uppercase;
padding: 2px 8px;
border-radius: 999px;
border: 1px solid rgba(255,255,255,0.12);
background: rgba(255,255,255,0.06);
color: var(--text-main);
opacity: 0.9;
}

.engine-badge svg {
width: 12px;
height: 12px;
display: block;
}

.engine-ffmpeg { border-color: rgba(125,255,179,0.30); box-shadow: 0 0 10px rgba(125,255,179,0.12); }
.engine-cast { border-color: rgba(223,166,255,0.35); box-shadow: 0 0 10px rgba(223,166,255,0.12); }
.engine-html { border-color: rgba(255,255,255,0.22); }

.status-dot {
width: 6px;
height: 6px;
@@ -260,9 +285,16 @@ body {
margin-bottom: 20px;
}

.artwork-stack {
display: flex;
flex-direction: column;
align-items: center;
gap: 12px;
}

.artwork-container {
width: 220px;
height: 220px;
width: 190px;
height: 190px;
border-radius: 24px;
padding: 6px; /* spacing for ring */
background: linear-gradient(135deg, rgba(255,255,255,0.03), rgba(255,255,255,0.00));
@@ -329,6 +361,103 @@ body {
z-index: 3;
}

/* When we don't have an icon, show the station name nicely */
.station-logo-text.logo-name {
font-size: clamp(1.1rem, 5.5vw, 2.2rem);
font-weight: 800;
font-style: normal;
max-width: 88%;
text-align: center;
line-height: 1.12;
padding: 0 12px;
overflow: hidden;
display: -webkit-box;
line-clamp: 2;
-webkit-line-clamp: 2;
-webkit-box-orient: vertical;
}

/* Artwork coverflow (station carousel inside artwork) */
.artwork-coverflow {
position: relative;
width: min(320px, 92vw);
height: 108px;
-webkit-app-region: no-drag;
}

.artwork-coverflow-stage {
position: absolute;
inset: 0;
z-index: 1;
perspective: 900px;
transform-style: preserve-3d;
-webkit-app-region: no-drag;
}

.coverflow-item {
position: absolute;
left: 50%;
top: 50%;
width: 66px;
height: 66px;
border-radius: 16px;
background: rgba(255,255,255,0.08);
border: 1px solid rgba(255,255,255,0.10);
box-shadow: 0 10px 26px rgba(0,0,0,0.25);
display: flex;
align-items: center;
justify-content: center;
overflow: hidden;
backdrop-filter: blur(10px);
transform-style: preserve-3d;
z-index: 1;
-webkit-app-region: no-drag;
}

.coverflow-item.selected {
background: rgba(255,255,255,0.12);
border-color: rgba(255,255,255,0.18);
}

.coverflow-item img {
width: 100%;
height: 100%;
object-fit: contain;
padding: 10px;
}

.coverflow-item.fallback {
color: rgba(255,255,255,0.92);
text-shadow: 0 2px 10px rgba(0,0,0,0.35);
font-weight: 800;
font-size: 0.72rem;
letter-spacing: 0.2px;
text-align: center;
padding: 10px;
line-height: 1.08;
}

.coverflow-arrow {
position: absolute;
top: 50%;
transform: translateY(-50%);
width: 34px;
height: 34px;
border-radius: 999px;
border: 1px solid rgba(255,255,255,0.12);
background: rgba(30, 30, 40, 0.35);
color: rgba(255,255,255,0.9);
display: flex;
align-items: center;
justify-content: center;
cursor: pointer;
z-index: 3;
-webkit-app-region: no-drag;
}

.coverflow-arrow.left { left: 10px; }
.coverflow-arrow.right { right: 10px; }

.station-logo-img {
/* Fill the artwork placeholder while keeping aspect ratio and inner padding */
width: 100%;
@@ -338,9 +467,10 @@ body {
padding: 12px; /* inner spacing from rounded edges */
box-sizing: border-box;
border-radius: 12px;
box-shadow: 0 8px 20px rgba(0,0,0,0.35);
/*box-shadow: 0 8px 20px rgba(0,0,0,0.35);*/
position: relative;
z-index: 3;
margin-left:1rem;
}

/* Logo blobs container sits behind logo but inside artwork placeholder */
@@ -399,15 +529,7 @@ body {
.artwork-placeholder:hover,
.station-logo-img,
.station-logo-text {
cursor: pointer !important;
pointer-events: auto;
}

/* Subtle hover affordance to make clickability clearer */
.artwork-placeholder:hover .station-logo-img,
.artwork-placeholder:hover .station-logo-text {
transform: scale(1.03);
transition: transform 160ms ease;
cursor: pointer;
}

/* Track Info */
@@ -466,6 +588,7 @@ body {
height: 4px;
background: rgba(255,255,255,0.1);
border-radius: 2px;
margin-top: 12px;
margin-bottom: 30px;
position: relative;
}

94
src/sw.js
Normal file
@@ -0,0 +1,94 @@
// NOTE: This service worker is for the web/PWA build.
// For the Tauri desktop app we aggressively unregister SWs in `src/main.js`.
//
// Bump this value whenever caching logic changes to guarantee clients don't
// keep an old UI after updates.
const CACHE_NAME = 'radiocast-core-v3';

const CORE_ASSETS = [
'.',
'index.html',
'main.js',
'styles.css',
'stations.json',
'assets/favicon_io/android-chrome-192x192.png',
'assets/favicon_io/android-chrome-512x512.png',
'assets/favicon_io/apple-touch-icon.png',
// Optional build stamp (only present for some builds).
'build-info.json',
];

const CORE_PATHS = new Set(CORE_ASSETS.map((p) => (p === '.' ? '/' : '/' + p.replace(/^\//, ''))));

self.addEventListener('install', (event) => {
// Activate updated SW as soon as it's installed.
self.skipWaiting();
event.waitUntil(
caches.open(CACHE_NAME).then((cache) => {
const reqs = CORE_ASSETS.map((p) => {
const url = p === '.' ? './' : p;
// Force a fresh fetch for core assets to avoid carrying forward stale UI.
return new Request(url, { cache: 'reload' });
});
return cache.addAll(reqs);
})
);
});

self.addEventListener('activate', (event) => {
event.waitUntil(
Promise.all([
self.clients.claim(),
caches.keys().then((keys) => Promise.all(
keys.map((k) => { if (k !== CACHE_NAME) return caches.delete(k); return null; })
)),
])
);
});

self.addEventListener('fetch', (event) => {
// Only handle GET requests
if (event.request.method !== 'GET') return;

const url = new URL(event.request.url);

// Don't cache cross-origin requests (station logos, APIs, etc.).
if (url.origin !== self.location.origin) {
return;
}

const isCore = CORE_PATHS.has(url.pathname) || url.pathname === '/';
const isHtmlNavigation = event.request.mode === 'navigate' || (event.request.headers.get('accept') || '').includes('text/html');

// Network-first for navigations and core assets to prevent "old UI" issues.
if (isHtmlNavigation || isCore) {
event.respondWith(
fetch(event.request)
.then((networkResp) => {
const respClone = networkResp.clone();
caches.open(CACHE_NAME).then((cache) => cache.put(event.request, respClone)).catch(() => {});
return networkResp;
})
.catch(() => caches.match(event.request).then((cached) => cached || caches.match('index.html')))
);
return;
}

event.respondWith(
caches.match(event.request).then((cached) => {
if (cached) return cached;
return fetch(event.request).then((networkResp) => {
// Optionally cache new resources (best-effort)
try {
const respClone = networkResp.clone();
caches.open(CACHE_NAME).then((cache) => cache.put(event.request, respClone)).catch(()=>{});
} catch (e) {}
return networkResp;
}).catch(() => {
// If offline and HTML navigation, return cached index.html
if (event.request.mode === 'navigate') return caches.match('index.html');
return new Response('', { status: 503, statusText: 'Service Unavailable' });
});
})
);
});
@@ -1,18 +1,37 @@
#!/usr/bin/env node
import fs from 'fs';
import path from 'path';
import { execSync } from 'child_process';

const repoRoot = process.cwd();
const binariesDir = path.join(repoRoot, 'src-tauri', 'binaries');

// Existing filename and expected name (Windows x86_64 triple)
// No rename needed; ensure the sidecar exists.
const existing = 'radiocast-sidecar-x86_64-pc-windows-msvc.exe';
const expected = 'RadioPlayer-x86_64-pc-windows-msvc.exe';
const expected = existing;

const src = path.join(binariesDir, existing);
const dst = path.join(binariesDir, expected);

// On Windows the running sidecar process can lock the binary and prevent rebuilds.
// Try to kill any leftover sidecar processes before proceeding. This is best-effort
// and will silently continue if no process is found or the kill fails.
function tryKillSidecar() {
if (process.platform !== 'win32') return;
const candidates = ['radiocast-sidecar.exe', 'radiocast-sidecar-x86_64-pc-windows-msvc.exe', 'radiocast-sidecar'];
for (const name of candidates) {
try {
execSync(`taskkill /IM ${name} /F`, { stdio: 'ignore' });
console.log(`Killed leftover sidecar process: ${name}`);
} catch (e) {
// ignore errors; likely means the process wasn't running
}
}
}

try {
tryKillSidecar();

if (!fs.existsSync(binariesDir)) {
console.warn('binaries directory not found, skipping copy');
process.exit(0);
@@ -23,14 +42,8 @@ try {
process.exit(0);
}

if (fs.existsSync(dst)) {
console.log(`Expected binary already present: ${dst}`);
process.exit(0);
}

fs.copyFileSync(src, dst);
console.log(`Copied ${existing} -> ${expected}`);
console.log(`Sidecar binary present: ${dst}`);
} catch (e) {
console.error('Failed to copy binary:', e);
console.error('Failed to prepare binary:', e);
process.exit(1);
}

91
tools/copy-ffmpeg.js
Normal file
@@ -0,0 +1,91 @@
#!/usr/bin/env node
import fs from 'fs';
import path from 'path';

const repoRoot = process.cwd();
const tauriDir = path.join(repoRoot, 'src-tauri');
const resourcesDir = path.join(tauriDir, 'resources');

function platformBinName() {
return process.platform === 'win32' ? 'ffmpeg.exe' : 'ffmpeg';
}

function exists(p) {
try { return fs.existsSync(p); } catch { return false; }
}

function ensureDir(p) {
if (!exists(p)) fs.mkdirSync(p, { recursive: true });
}

// Source lookup order:
// 1) RADIOPLAYER_FFMPEG (absolute or relative)
// 2) tools/ffmpeg/ffmpeg(.exe)
// 3) tools/ffmpeg/bin/ffmpeg(.exe)
function resolveSource() {
const env = process.env.RADIOPLAYER_FFMPEG;
if (env && String(env).trim().length > 0) {
const p = path.isAbsolute(env) ? env : path.join(repoRoot, env);
if (exists(p)) return p;
console.warn(`RADIOPLAYER_FFMPEG set but not found: ${p}`);
}

const name = platformBinName();
const candidates = [
path.join(repoRoot, 'tools', 'ffmpeg', name),
path.join(repoRoot, 'tools', 'ffmpeg', 'bin', name),
];

return candidates.find(exists) || null;
}

function main() {
const name = platformBinName();
// If CI or prior steps already placed ffmpeg into resources, prefer that and skip copying.
const existingInResources = path.join(resourcesDir, name);
if (exists(existingInResources)) {
console.log(`FFmpeg already present in resources: ${existingInResources} — skipping copy.`);
process.exit(0);
}
// Also search recursively in resources for any ffmpeg-like file (robustness for nested archives)
if (exists(resourcesDir)) {
const files = fs.readdirSync(resourcesDir, { withFileTypes: true });
const found = (function findRec(dir) {
for (const f of fs.readdirSync(dir, { withFileTypes: true })) {
const p = path.join(dir, f.name);
if (f.isFile() && f.name.toLowerCase().startsWith('ffmpeg')) return p;
if (f.isDirectory()) {
const r = findRec(p);
if (r) return r;
}
}
return null;
})(resourcesDir);
if (found) {
console.log(`Found ffmpeg in resources at ${found} — skipping copy.`);
process.exit(0);
}
}
const src = resolveSource();
if (!src) {
console.log('FFmpeg not provided; skipping copy (set RADIOPLAYER_FFMPEG or place it under tools/ffmpeg/).');
process.exit(0);
}

ensureDir(resourcesDir);
const dst = path.join(resourcesDir, name);

try {
fs.copyFileSync(src, dst);
// Best-effort: ensure executable bit on unix-like platforms.
if (process.platform !== 'win32') {
try { fs.chmodSync(dst, 0o755); } catch {}
}
console.log(`Copied FFmpeg into bundle resources: ${src} -> ${dst}`);
} catch (e) {
console.error('Failed to copy FFmpeg:', e);
process.exit(1);
}
}

main();
45
tools/ffmpeg/README.md
Normal file
@@ -0,0 +1,45 @@
|
||||
# FFmpeg (Optional) for Native Playback
|
||||
|
||||
The native player uses an external **FFmpeg** binary to decode radio streams.
|
||||
|
||||
## Why this exists
|
||||
|
||||
- The app intentionally does **not** download or embed FFmpeg automatically.
|
||||
- You provide FFmpeg yourself (license/compliance-friendly).
|
||||
|
||||
## How the app finds FFmpeg
|
||||
|
||||
At runtime it searches in this order:
|
||||
|
||||
1. `RADIOPLAYER_FFMPEG` environment variable (absolute or relative path)
|
||||
2. Next to the application executable (Windows: `ffmpeg.exe`, macOS/Linux: `ffmpeg`)
|
||||
3. Common bundle resource folders relative to the executable:
|
||||
- `resources/ffmpeg(.exe)`
|
||||
- `Resources/ffmpeg(.exe)`
|
||||
- `../resources/ffmpeg(.exe)`
|
||||
- `../Resources/ffmpeg(.exe)`
|
||||
4. Your system `PATH`
|
||||
|
||||
## Recommended setup (Windows dev)
|
||||
|
||||
- Put `ffmpeg.exe` somewhere stable, then set:
|
||||
|
||||
`RADIOPLAYER_FFMPEG=C:\\path\\to\\ffmpeg.exe`
|
||||
|
||||
Or copy `ffmpeg.exe` next to the built app binary:
|
||||
|
||||
- `src-tauri/target/debug/ffmpeg.exe` (dev)
|
||||
- `src-tauri/target/release/ffmpeg.exe` (release)
|
||||
|
||||
## Optional: download helper (Windows)
|
||||
|
||||
You can also run:
|
||||
|
||||
`npm run ffmpeg:download`
|
||||
|
||||
This downloads a prebuilt FFmpeg zip and extracts `ffmpeg.exe` into `tools/ffmpeg/bin/ffmpeg.exe`.
|
||||
|
||||
## Notes
|
||||
|
||||
- The player will fail fast with a clear error if FFmpeg is missing.
|
||||
- The project already includes a copy step (`tools/copy-ffmpeg.js`) that runs before `tauri`/`build` and places FFmpeg into `src-tauri/resources/` for bundling.
|
@@ -19,11 +19,16 @@ if (!fs.existsSync(iconPath)) {
 
 console.log('Patching EXE icon with rcedit...');
 
-// Prefer local installed binary (node_modules/.bin) to avoid relying on npx.
-// On Windows, npm typically creates a .cmd shim, which Node can execute.
+// Prefer local installed binary to avoid relying on npx.
+// Note: the `rcedit` npm package places the binary at node_modules/rcedit/bin/rcedit.exe
+// and does not always create a node_modules/.bin shim.
 const binDir = path.join(repoRoot, 'node_modules', '.bin');
+const packageBinDir = path.join(repoRoot, 'node_modules', 'rcedit', 'bin');
 const localCandidates = process.platform === 'win32'
   ? [
+      // Preferred: direct binary shipped by the package
+      path.join(packageBinDir, 'rcedit.exe'),
+      // Fallbacks: npm/yarn shim locations (if present)
       path.join(binDir, 'rcedit.cmd'),
       path.join(binDir, 'rcedit.exe'),
       path.join(binDir, 'rcedit'),
@@ -37,9 +42,8 @@ if (localBin) {
   cmd = localBin;
   args = [exePath, '--set-icon', iconPath];
 } else {
-  // Fallback to npx. Note: Node can't execute PowerShell shims (npx.ps1), so this may fail
-  // in environments that only provide .ps1 launchers.
-  cmd = 'npx';
+  // Last resort fallback to npx.
+  cmd = process.platform === 'win32' ? 'npx.cmd' : 'npx';
   args = ['rcedit', exePath, '--set-icon', iconPath];
 }
 
60
tools/sync-version.js
Normal file
@@ -0,0 +1,60 @@

#!/usr/bin/env node
import fs from 'fs';
import path from 'path';

const repoRoot = process.cwd();

function readJson(p) {
  return JSON.parse(fs.readFileSync(p, 'utf8'));
}

function writeJson(p, obj) {
  fs.writeFileSync(p, JSON.stringify(obj, null, 2) + '\n', 'utf8');
}

function updateCargoTomlVersion(cargoTomlPath, version) {
  const input = fs.readFileSync(cargoTomlPath, 'utf8');

  // Replace only the [package] version line.
  const packageBlockStart = input.indexOf('[package]');
  if (packageBlockStart === -1) {
    throw new Error('Could not find [package] in Cargo.toml');
  }

  const packageBlockEnd = input.indexOf('\n[', packageBlockStart + 1);
  const blockEnd = packageBlockEnd === -1 ? input.length : packageBlockEnd;
  const pkgBlock = input.slice(packageBlockStart, blockEnd);

  const versionRe = /^version\s*=\s*"([^"]*)"/m;
  const m = pkgBlock.match(versionRe);
  if (!m) {
    throw new Error('Could not find version line in Cargo.toml [package] block');
  }

  const replaced = pkgBlock.replace(versionRe, `version = "${version}"`);

  const output = input.slice(0, packageBlockStart) + replaced + input.slice(blockEnd);
  fs.writeFileSync(cargoTomlPath, output, 'utf8');
}

try {
  const rootPkgPath = path.join(repoRoot, 'package.json');
  const tauriConfPath = path.join(repoRoot, 'src-tauri', 'tauri.conf.json');
  const cargoTomlPath = path.join(repoRoot, 'src-tauri', 'Cargo.toml');

  const rootPkg = readJson(rootPkgPath);
  if (!rootPkg.version) throw new Error('Root package.json has no version');

  const version = String(rootPkg.version);

  const tauriConf = readJson(tauriConfPath);
  tauriConf.version = version;
  writeJson(tauriConfPath, tauriConf);

  updateCargoTomlVersion(cargoTomlPath, version);

  console.log(`Synced Tauri version to ${version}`);
} catch (e) {
  console.error('sync-version failed:', e?.message || e);
  process.exit(1);
}
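The point of scoping the replacement to the `[package]` block is that `version = "…"` lines also appear under `[dependencies]`, and those must not be touched. A minimal in-memory demo of the same slice-and-replace approach (with made-up names, operating on a sample string instead of a file):

```javascript
// Sketch of block-scoped version replacement: only the version line
// inside [package] changes; dependency version specs are left alone.
function setPackageVersion(toml, version) {
  const start = toml.indexOf('[package]');
  if (start === -1) throw new Error('no [package] block');
  const next = toml.indexOf('\n[', start + 1);
  const end = next === -1 ? toml.length : next;
  const block = toml
    .slice(start, end)
    .replace(/^version\s*=\s*"[^"]*"/m, `version = "${version}"`);
  return toml.slice(0, start) + block + toml.slice(end);
}

const sample = [
  '[package]',
  'name = "radioplayer"',
  'version = "0.1.0"',
  '',
  '[dependencies]',
  'serde = { version = "1" }',
  '',
].join('\n');

const out = setPackageVersion(sample, '0.2.0');
// out has version = "0.2.0" under [package]; serde's spec is unchanged
```

A plain global regex replace would also rewrite `serde = { version = "1" }`, which is why the script finds the block boundaries first.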
54
tools/write-build-flag.js
Normal file
@@ -0,0 +1,54 @@

#!/usr/bin/env node
import fs from 'fs';
import path from 'path';

const cmd = process.argv[2] || 'set';
const repoRoot = process.cwd();
const dst = path.join(repoRoot, 'src', 'build-info.json');

function getPackageVersion() {
  try {
    const pkgPath = path.join(repoRoot, 'package.json');
    const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf8'));
    return pkg && pkg.version ? String(pkg.version) : null;
  } catch (_) {
    return null;
  }
}

function computeDebugFlag() {
  const envVal = process.env.RADIO_DEBUG_DEVTOOLS;
  if (envVal === '1' || envVal === 'true') return true;
  const arg = (process.argv[3] || '').toLowerCase();
  return arg === 'debug' || arg === '--debug';
}

if (cmd === 'set') {
  try {
    const version = getPackageVersion();
    const debug = computeDebugFlag();
    const payload = {
      version,
      debug,
      builtAt: new Date().toISOString(),
    };
    fs.writeFileSync(dst, JSON.stringify(payload, null, 2) + '\n', 'utf8');
    console.log(`Wrote build-info.json (debug=${debug}${version ? `, version=${version}` : ''})`);
    process.exit(0);
  } catch (e) {
    console.error('Failed to write build-info.json', e);
    process.exit(1);
  }
} else if (cmd === 'clear') {
  try {
    if (fs.existsSync(dst)) fs.unlinkSync(dst);
    console.log('Removed build-info.json');
    process.exit(0);
  } catch (e) {
    console.error('Failed to remove build-info.json', e);
    process.exit(1);
  }
} else {
  console.error('Unknown command:', cmd);
  process.exit(2);
}
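On the consuming side, frontend code that reads `src/build-info.json` should tolerate the file being absent (after `clear`) or malformed. A hypothetical helper — the field names match the payload the script writes, but `parseBuildInfo` itself is illustrative and not part of the project:

```javascript
// Hypothetical frontend-side helper: defensively parse the payload that
// tools/write-build-flag.js writes to src/build-info.json.
function parseBuildInfo(json) {
  try {
    const obj = JSON.parse(json);
    return {
      version: typeof obj.version === 'string' ? obj.version : null,
      debug: obj.debug === true,
      builtAt: typeof obj.builtAt === 'string' ? obj.builtAt : null,
    };
  } catch {
    // Missing or invalid file: fall back to release defaults.
    return { version: null, debug: false, builtAt: null };
  }
}
```

Defaulting `debug` to `false` on any parse failure keeps a broken or deleted flag file from accidentally enabling devtools in a release build.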