The host side is now aligned with a shared, software-first API. [control.rs](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/crates/infinity_host/src/control.rs>) now holds the stable shared model for snapshots, commands, the pattern catalog, presets, groups, parameters, preview, and transitions. On top of it sit the new scene/pattern layer in [scene.rs](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/crates/infinity_host/src/scene.rs>) and the simulation-backed host service in [simulation.rs](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/crates/infinity_host/src/simulation.rs>). On the software side, the new core already supports:

- a pattern catalog with `solid_color`, `gradient`, `chase`, `pulse`, `noise`, `walking_pixel`
- preset recall, group targeting, parameter changes, and transitions
- simulated preview data for all 18 outputs
- the same API access for the CLI, the engineering GUI, and later the web UI / grandMA adapter

In addition, the host CLI now offers `snapshot`, a direct JSON view of the shared host state via [main.rs](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/crates/infinity_host/src/main.rs>).

**Surfaces**

The technical local GUI stays in place and now sits on the new shared API. In [app.rs](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/crates/infinity_host_ui/src/app.rs>) it continues to show mapping/status/test patterns, extended with engine/scene/transition status. It deliberately remains engineering-oriented and has not been inflated into the main creative surface. The example configuration in [project.example.toml](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/config/project.example.toml>) is now also more usable as a software playground: more groups, more creative presets, and a better basis for look development without real node activation.
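The pattern catalog above can be pictured as a small Rust enum with stable string ids as they would appear in the JSON snapshot. This is a hypothetical sketch; the type and method names are illustrative, not the actual identifiers in `control.rs`:

```rust
// Hypothetical sketch of the shared pattern catalog; the real model in
// control.rs may use different names and carry per-pattern parameters.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Pattern {
    SolidColor,
    Gradient,
    Chase,
    Pulse,
    Noise,
    WalkingPixel,
}

impl Pattern {
    /// Stable string id as it might appear in the JSON snapshot.
    fn id(self) -> &'static str {
        match self {
            Pattern::SolidColor => "solid_color",
            Pattern::Gradient => "gradient",
            Pattern::Chase => "chase",
            Pattern::Pulse => "pulse",
            Pattern::Noise => "noise",
            Pattern::WalkingPixel => "walking_pixel",
        }
    }

    /// The full catalog exposed to every surface (CLI, GUI, web UI).
    fn catalog() -> [Pattern; 6] {
        [
            Pattern::SolidColor,
            Pattern::Gradient,
            Pattern::Chase,
            Pattern::Pulse,
            Pattern::Noise,
            Pattern::WalkingPixel,
        ]
    }
}

fn main() {
    for p in Pattern::catalog() {
        println!("{}", p.id());
    }
}
```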
The new API direction is documented in [host_api.md](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/docs/host_api.md>) and [architecture.md](</c:/Users/janni/Documents/RFP/Infinity_Vis _Rust/docs/architecture.md>).

**Verification**

`cargo check` and `cargo test -q` pass. In addition, `cargo run -p infinity_host -- snapshot --config config/project.example.toml` runs and returns the shared host snapshot with catalog, active scene, preview, node status, and panel status. The next sensible step is a real API adapter for the upcoming web UI, i.e. HTTP/WebSocket on exactly this host core instead of a frontend-specific parallel architecture.
# Architecture

## Goal

Build a live-capable LED control platform that keeps realtime output deterministic while letting operators change scenes, brightness, tests, and presets without UI jitter leaking into the hot path.

## Current Priority

The current delivery order is intentionally software-first:

1. host-core and shared API
2. scene, preset, group, parameter, transition, and simulation model
3. web UI as the primary creative surface
4. engineering GUI as the technical surface
5. external show-control adapters such as grandMA
6. hardware validation and real node activation later

## Layer Split

1. Control layer
   - Shared host API first
   - Creative web UI later
   - Engineering GUI already implemented in `crates/infinity_host_ui`
   - Monitoring, mapping, diagnostics, and admin
   - Never the timing master for LED output
2. Realtime engine
   - Owns the monotonic clock
   - Computes scene state, transitions, and dirty regions
   - Produces transport-ready commands or pixel frames
3. Transport and node layer
   - Discovery, heartbeat, config sync, sequencing, and recovery
   - Control protocol and realtime protocol stay separate
   - Latest realtime state wins, stale frames may be dropped
4. ESP32 firmware
   - Receives commands
   - Maintains local buffers
   - Drives three independent outputs per node
   - Handles watchdog and reconnect logic locally
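The "never the timing master" rule in the layer split can be illustrated with a channel between the control layer and the engine: the control side only queues commands, while the engine owns the clock and drains them at its own tick. A minimal std-only sketch, not the actual engine code (the `Command` type and `apply` helper are hypothetical):

```rust
use std::sync::mpsc;
use std::thread;
use std::time::{Duration, Instant};

// Hypothetical command type; the real shared command model lives in control.rs.
enum Command {
    SetBrightness(u8),
}

// Applies one queued command to engine state.
fn apply(cmd: Command, brightness: &mut u8) {
    match cmd {
        Command::SetBrightness(b) => *brightness = b,
    }
}

fn main() {
    let (tx, rx) = mpsc::channel::<Command>();

    // Control layer: posts a command and returns immediately. The command is
    // queued, never executed inline, so UI jitter cannot reach the hot path.
    tx.send(Command::SetBrightness(128)).unwrap();

    // Realtime engine: owns the clock, ticks on its own schedule, and drains
    // pending commands without ever blocking on the control layer.
    let engine = thread::spawn(move || {
        let mut brightness = 255u8;
        let tick = Duration::from_millis(8); // ~120 Hz logic tick
        for _ in 0..5 {
            let deadline = Instant::now() + tick;
            while let Ok(cmd) = rx.try_recv() {
                apply(cmd, &mut brightness);
            }
            // ... compute scene state and emit frames here ...
            if let Some(rest) = deadline.checked_duration_since(Instant::now()) {
                thread::sleep(rest);
            }
        }
        brightness
    });

    println!("final brightness: {}", engine.join().unwrap());
}
```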
## Runtime Model

- Logic tick target: 120 Hz
- Frame synthesis target: 60 Hz
- Network send target: 40-60 Hz, profile dependent
- Preview target: 10-15 Hz

Preview and telemetry are explicitly degradable. Realtime output is not.
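The per-stage intervals follow directly from these rate targets; a minimal sketch (the `interval` helper is illustrative, not taken from the engine):

```rust
use std::time::Duration;

// Illustrative only: derive the nominal tick interval from a target rate in Hz.
fn interval(hz: f64) -> Duration {
    Duration::from_secs_f64(1.0 / hz)
}

fn main() {
    println!("logic tick:      {:?}", interval(120.0)); // ~8.33 ms
    println!("frame synthesis: {:?}", interval(60.0));  // ~16.67 ms
    println!("network send:    {:?}", interval(40.0));  // 25 ms at the low end
    println!("preview:         {:?}", interval(10.0));  // 100 ms, degradable
}
```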
## Shared Surface Model

Every surface must talk to the same host API:

- engineering GUI
- future creative web UI
- CLI inspection
- future grandMA adapter

The current software-first implementation uses a simulation-backed host API so looks, presets, parameters, and grouping can be developed before real node activation.
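The one-API-for-every-surface rule can be pictured as a single trait that the simulation backend implements today and real transports implement later. The trait, struct, and command syntax below are hypothetical, not the actual items in `infinity_host`:

```rust
// Hypothetical sketch: every surface (GUI, web UI, CLI, grandMA adapter)
// programs against this one trait, never against a backend directly.
trait HostApi {
    /// Full JSON-serializable view of the shared host state.
    fn snapshot(&self) -> String;
    /// Commands are the only way any surface mutates state.
    fn apply(&mut self, command: &str) -> Result<(), String>;
}

/// Simulation-backed implementation used during the software-first phase.
struct SimulationHost {
    active_scene: String,
}

impl HostApi for SimulationHost {
    fn snapshot(&self) -> String {
        format!("{{\"active_scene\":\"{}\"}}", self.active_scene)
    }

    fn apply(&mut self, command: &str) -> Result<(), String> {
        match command.strip_prefix("recall ") {
            Some(scene) => {
                self.active_scene = scene.to_string();
                Ok(())
            }
            None => Err(format!("unknown command: {command}")),
        }
    }
}

fn main() {
    let mut host = SimulationHost { active_scene: "default".into() };
    host.apply("recall warm_sunset").unwrap();
    println!("{}", host.snapshot());
}
```

A later HTTP/WebSocket adapter would wrap the same trait rather than introduce a parallel model.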
## Modes

### Distributed Scene Mode

- Default operating mode
- Host sends scene parameters, time basis, seed, palette, and transitions
- Nodes render locally for low bandwidth and better resilience

### Frame Streaming Mode

- Used for mapping tests, debugging, and effects that cannot run node-local
- Host sends explicit output frames
- Kept logically separate so it does not contaminate the primary scene pipeline
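The two modes map naturally onto two realtime payload kinds, which also shows why Distributed Scene Mode is the low-bandwidth default. The enum and field names here are assumed for illustration; the real protocol types are not shown:

```rust
// Hypothetical sketch of the two operating modes as realtime payloads.
enum RealtimePayload {
    // Distributed Scene Mode: small parametric update, nodes render locally.
    SceneUpdate {
        scene_id: u16,
        time_basis_us: u64, // shared time basis so nodes stay in phase
        seed: u32,
        palette: [[u8; 3]; 4],
    },
    // Frame Streaming Mode: explicit RGB frame for one output, host renders.
    Frame {
        output_id: u16,
        pixels: Vec<[u8; 3]>,
    },
}

// Rough payload size on the wire, ignoring framing overhead.
fn approx_size(p: &RealtimePayload) -> usize {
    match p {
        RealtimePayload::SceneUpdate { .. } => 2 + 8 + 4 + 12,
        RealtimePayload::Frame { pixels, .. } => 2 + pixels.len() * 3,
    }
}

fn main() {
    let scene = RealtimePayload::SceneUpdate {
        scene_id: 1,
        time_basis_us: 0,
        seed: 42,
        palette: [[255, 0, 0]; 4],
    };
    // One 106-LED output streamed as a full frame.
    let frame = RealtimePayload::Frame { output_id: 3, pixels: vec![[0, 0, 0]; 106] };
    println!("scene update: ~{} bytes", approx_size(&scene));
    println!("full frame:   ~{} bytes", approx_size(&frame));
}
```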
## Mapping Model

The project configuration separates mapping into three layers:

1. Hardware mapping
   - Node ID
   - Top, middle, bottom output
   - Physical output label
   - Driver channel reference
   - LED count, direction, color order, enable flag
2. Layout mapping
   - Optional row and column placement
   - Optional preview transforms only
3. Group mapping
   - Explicit groups for artistic control and fast operator access

The current example config intentionally keeps layout mapping empty because the old XML is only a spatial reference and the final node-to-room placement must still be confirmed on real hardware.
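The three layers can be sketched as separate Rust types so hardware truth, spatial layout, and artistic grouping never mix. Field names are illustrative, not the actual config schema from the project TOML:

```rust
// Illustrative only: the actual schema lives in the project TOML config.
enum OutputPosition { Top, Middle, Bottom }

/// Layer 1: hardware mapping, the ground truth per physical output.
struct HardwareOutput {
    node_id: u8,
    position: OutputPosition, // top, middle, or bottom on the node
    label: String,            // physical output label
    driver_channel: String,   // driver channel reference, unique per node
    led_count: u16,           // expected to be 106 (see validation gates)
    reversed: bool,
    color_order: String,      // e.g. "GRB"
    enabled: bool,
}

/// Layer 2: layout mapping, optional and preview-only; never affects output.
struct LayoutPlacement {
    row: u8,
    column: u8,
}

/// Layer 3: group mapping, explicit named groups for fast operator access.
struct Group {
    name: String,
    members: Vec<(u8, OutputPosition)>, // (node_id, output position)
}

fn main() {
    let out = HardwareOutput {
        node_id: 1,
        position: OutputPosition::Top,
        label: "N1-T".into(),
        driver_channel: "UART 6".into(),
        led_count: 106,
        reversed: false,
        color_order: "GRB".into(),
        enabled: false, // stays disabled until validation gates pass
    };
    let g = Group {
        name: "front_wall".into(),
        members: vec![(out.node_id, OutputPosition::Top)],
    };
    println!("group {} has {} members", g.name, g.members.len());
}
```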
## Validation Gates

The codebase deliberately blocks activation when these remain unresolved:

- `UART 6`, `UART 5`, `UART 4` still marked as `pending_validation`
- output validation state is not `validated`
- LED count deviates from 106
- node outputs are missing top, middle, or bottom
- driver references are ambiguous or duplicated per node
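The gate list amounts to a pure predicate over the mapping. A hedged sketch of such a check; the real gates live in the host crate and may differ in detail:

```rust
// Illustrative validation-gate check; mirrors the documented rules only.
#[derive(PartialEq)]
enum ValidationState { PendingValidation, Validated }

struct Output {
    position: &'static str, // "top" | "middle" | "bottom"
    driver_ref: &'static str,
    led_count: u16,
    state: ValidationState,
}

/// Returns the reasons a node must stay blocked from activation, if any.
fn activation_blockers(outputs: &[Output]) -> Vec<String> {
    let mut blockers = Vec::new();
    for pos in ["top", "middle", "bottom"] {
        if !outputs.iter().any(|o| o.position == pos) {
            blockers.push(format!("missing {pos} output"));
        }
    }
    for o in outputs {
        if o.state != ValidationState::Validated {
            blockers.push(format!("{} not validated", o.position));
        }
        if o.led_count != 106 {
            blockers.push(format!("{} LED count {} != 106", o.position, o.led_count));
        }
    }
    for (i, a) in outputs.iter().enumerate() {
        if outputs[i + 1..].iter().any(|b| b.driver_ref == a.driver_ref) {
            blockers.push(format!("duplicate driver ref {}", a.driver_ref));
        }
    }
    blockers
}

fn main() {
    let outputs = [
        Output { position: "top", driver_ref: "UART 6", led_count: 106, state: ValidationState::PendingValidation },
        Output { position: "middle", driver_ref: "UART 5", led_count: 106, state: ValidationState::Validated },
        Output { position: "bottom", driver_ref: "UART 4", led_count: 90, state: ValidationState::Validated },
    ];
    for b in activation_blockers(&outputs) {
        println!("blocked: {b}");
    }
}
```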
## Planned Next Steps

1. Add a network-facing adapter for the shared host API and start the web UI
2. Expand scene authoring and preset editing on top of the existing simulation core
3. Implement transport adapters without coupling them to any single frontend
4. Keep hardware activation behind explicit later validation gates