Hello HPVS community,
In the context of real-time visualization and high-performance simulation, we often discuss the overhead introduced by the browser stack, in particular how the event loop interacts with the GPU compositor and with input polling rates.
I have been using a high-precision, physics-based browser game as a practical probe for observing frame-pacing consistency and perceived latency in modern browsers. The tool is Slice Master.
While Slice Master is a minimalist interactive experience, its core mechanics depend on millisecond-level precision in angular momentum and collision detection. I've found it to be an excellent "real-world" test for the kind of micro-jitter that synthetic benchmarks like MotionMark tend to smooth over. Specifically, under varying system loads, the delta between a mouse click and the first frame in which the resulting physics impulse can appear gives an intuitive metric for input-to-display lag (a rough measurement sketch follows below).
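For anyone who wants to reproduce that measurement without instrumenting the game itself, here is a minimal sketch of the probe I have in mind. It assumes the page applies input in the pointerdown handler and renders on the following requestAnimationFrame tick; the sample interval and percentiles are arbitrary choices of mine, not anything Slice Master exposes.

```ts
// Minimal sketch: estimate input-to-frame delay by comparing the pointer
// event's timestamp with the timestamp of the next animation frame that
// could present the resulting state. Both values are DOMHighResTimeStamps
// on the same clock in modern browsers.
const samples: number[] = [];

window.addEventListener("pointerdown", (ev: PointerEvent) => {
  const inputTime = ev.timeStamp;

  requestAnimationFrame((frameTime: number) => {
    // frameTime marks the start of the rendering frame that follows the
    // click; the visible result appears no earlier than this.
    samples.push(frameTime - inputTime);

    if (samples.length % 50 === 0) {
      const sorted = [...samples].sort((a, b) => a - b);
      const p50 = sorted[Math.floor(sorted.length * 0.5)];
      const p99 = sorted[Math.floor(sorted.length * 0.99)];
      console.log(`input→frame latency: p50=${p50.toFixed(2)}ms p99=${p99.toFixed(2)}ms`);
    }
  });
});
```

Note that this only measures up to the start of the next rendering frame; actual photon latency adds compositor and display scanout time on top, which is exactly where differences between compositors would show up.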
From a simulation perspective, it's interesting to see how a requestAnimationFrame-driven loop keeps the rigid-body dynamics of the blade's rotation in sync through heavy garbage-collection (GC) pauses. It serves as a great tactile demonstration of how deterministic (or non-deterministic) web-based physics engines can feel to a human observer.
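I don't know how Slice Master structures its internal loop, but the standard way to keep the dynamics deterministic under GC stalls is a fixed-timestep accumulator. The sketch below is my own generic version (stepPhysics, the 120 Hz step size, and the long-frame threshold are placeholders of mine, not the game's code); the console warnings make GC-sized frame gaps easy to spot while you play.

```ts
// Fixed-timestep loop driven by requestAnimationFrame. The simulation
// advances in fixed 1/120 s increments regardless of how late a frame
// arrives, so GC pauses show up as long frame deltas and catch-up steps
// rather than as non-deterministic integration.
const FIXED_DT = 1000 / 120;  // ms per physics step
const MAX_CATCHUP_STEPS = 8;  // avoid a spiral of death after a long pause

let accumulator = 0;
let lastFrame: number | undefined;

function stepPhysics(dtMs: number): void {
  // placeholder for the rigid-body update (angular momentum, collisions)
}

function frame(now: number): void {
  if (lastFrame !== undefined) {
    const delta = now - lastFrame;
    if (delta > 3 * FIXED_DT) {
      // A frame this late usually means a GC pause or a compositor stall.
      console.warn(`long frame: ${delta.toFixed(2)} ms`);
    }
    accumulator += delta;
  }
  lastFrame = now;

  let steps = 0;
  while (accumulator >= FIXED_DT && steps < MAX_CATCHUP_STEPS) {
    stepPhysics(FIXED_DT);
    accumulator -= FIXED_DT;
    steps++;
  }

  requestAnimationFrame(frame);
}

requestAnimationFrame(frame);
```

Capping the catch-up steps keeps a single multi-hundred-millisecond pause from freezing the tab in a catch-up spiral, at the cost of the simulation visibly dropping time, which is precisely the kind of artifact that's easy to feel but hard to see in a synthetic benchmark.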
Have any of you experimented with similar high-frequency interactive sims like Slice Master (https://slicemaster.net/) to verify the "smoothness" of a rendering pipeline or to test the responsiveness of different Wayland compositors? I’d be interested in hearing your thoughts on using physics-driven web entities for qualitative latency analysis.
Best regards,