Tech Stack
Overview
This page describes the underlying technology for the curious. You do not need to know any of this to use Fantasynth.
Rendering
- WebGL 2.0 for all GPU rendering
- Three.js as the scene graph and rendering abstraction
- postprocessing (pmndrs) for the effect chain: bloom, SSAO, SSR, tone mapping, color grading, and more
- A custom MRT (multiple render targets) pass shared across post-processing effects to render normals, roughness, and velocity in a single pass
Shaders
Fantasynth uses a template-based shader system. Visual shaders are written in GLSL with a small JSON frontmatter that declares parameters and metadata. The template handles all the boilerplate: world-space mapping, palette management, fog, instancing, beat-reactive uniforms, lifecycles, and more.
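As a rough illustration, a shader's frontmatter could look something like this. The field names are hypothetical, not the actual schema:

```json
{
  "name": "Neon Grid",
  "params": {
    "glow": { "default": 0.5, "min": 0, "max": 1 }
  },
  "features": ["fog", "beatReactive"]
}
```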
Shaders are processed at load time. A shader processor handles #include directives, feature injection, and template assembly. The result is a single fragment and vertex shader pair that runs on the GPU.
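The include step can be sketched as plain string substitution. This is a minimal, hypothetical resolver, not Fantasynth's actual processor, which also handles feature injection and template assembly; the chunk names are illustrative:

```javascript
// Named GLSL chunks available to #include (contents are illustrative).
const chunks = {
  "palette.glsl": "vec3 palette(float t) { return vec3(t); }",
};

// Recursively expand #include "name" lines, guarding against cycles.
function resolveIncludes(source, seen = new Set()) {
  return source.replace(/^#include\s+"(.+)"$/gm, (_, name) => {
    if (seen.has(name)) return ""; // circular include: drop the repeat
    seen.add(name);
    const chunk = chunks[name];
    if (chunk === undefined) throw new Error(`Unknown include: ${name}`);
    return resolveIncludes(chunk, seen);
  });
}

const fragment = resolveIncludes(
  '#include "palette.glsl"\nvoid main() { gl_FragColor = vec4(palette(0.5), 1.0); }'
);
```

After resolution, the assembled source is compiled once and cached for the session.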
Scenes
Scenes are written in JavaScript and exported as ES modules. Each scene is a factory function that receives engine state and returns a set of Three.js objects plus an update callback. Scenes can declare parameters that the user can tune live.
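The factory shape can be sketched like this. The engine state and return shape here are assumptions, and plain objects stand in for Three.js meshes; in a real scene file the function would be the module's default export:

```javascript
// Hypothetical scene factory sketch.
function createScene(engine) {
  const cube = { rotation: 0 }; // stand-in for a THREE.Mesh

  return {
    objects: [cube], // added to the Three.js scene graph
    // Parameters the performer can tune live (names are illustrative).
    params: { speed: { value: 1, min: 0, max: 4 } },
    update(dt) {
      // Called once per frame with the frame delta in seconds.
      cube.rotation += dt * engine.speed;
    },
  };
}
```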
Compositions
Compositions, content definitions, music tracks, and patches are stored as plain JSON. The format is human-readable, easy to inspect, and designed to be hand-edited if needed. Loading a composition triggers a content preloader that fetches and parses every shader, scene, and texture referenced.
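To give a feel for the format, a composition file might look roughly like this. The field names and paths are illustrative, not the actual schema:

```json
{
  "name": "Night Drive",
  "bpm": 124,
  "track": "audio/night-drive.mp3",
  "clips": [
    { "scene": "scenes/tunnel.js", "shader": "shaders/neon.glsl" }
  ]
}
```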
Audio
Music playback uses the standard HTML5 audio element. Beat sync is driven by a clock set manually to the current BPM. The clock advances each frame and exposes uniforms like uTime, uBeat, and uPhase to the shader system. Beat-reactive features in the visual template lock to this clock. The performer aligns it to the music either by syncing to an external source (DJ-style: BPM plus nudge) or by uploading a file and mapping its BPM and a grid marker.
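The clock math itself is simple. A sketch of how those uniforms could be derived from elapsed time and BPM (the uniform names match the text; the formula and the nudge parameter are assumptions):

```javascript
// Derive beat-sync uniforms from elapsed seconds and BPM.
// `nudge` shifts the grid in seconds for manual alignment.
function beatClock(elapsedSeconds, bpm, nudge = 0) {
  const beats = (elapsedSeconds + nudge) * (bpm / 60);
  return {
    uTime: elapsedSeconds,
    uBeat: Math.floor(beats),           // whole beats since the grid start
    uPhase: beats - Math.floor(beats),  // 0..1 position within the beat
  };
}
```

Because everything downstream reads these uniforms, nudging the clock shifts every beat-reactive effect at once.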
MIDI
Controller input uses the Web MIDI API. No drivers, no installation. Controllers are mapped to clips, patches, and parameters via JSON configuration files.
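Resolving an incoming message against such a mapping can be sketched as a lookup. The mapping schema here is hypothetical; only the MIDI byte layout (status, data1, data2) is standard:

```javascript
// Hypothetical mapping file contents: note numbers trigger clips,
// control-change numbers drive parameters.
const mapping = {
  notes: { "36": { action: "triggerClip", clip: 0 } },
  ccs: { "7": { action: "setParam", param: "brightness" } },
};

function resolveMidi([status, data1, data2], map) {
  const kind = status & 0xf0; // strip the channel nibble
  if (kind === 0x90 && data2 > 0) {
    // Note-on (velocity > 0): look up a clip trigger.
    return map.notes[String(data1)] ?? null;
  }
  if (kind === 0xb0) {
    // Control change: look up a parameter, normalize 0..127 to 0..1.
    const m = map.ccs[String(data1)];
    return m ? { ...m, value: data2 / 127 } : null;
  }
  return null;
}
```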
Remote control
A second device can connect to a running Fantasynth instance and act as a remote. The connection uses WebRTC for low-latency communication. The remote mirrors the clip grid and fader controls, letting you perform from a phone or tablet while the visuals run on a separate display.
Site
This documentation site is built with Astro. It is a separate static site from the instrument itself, deployed independently. The pages you are reading are written in Markdown and rendered to HTML at build time.
AI collaboration
Fantasynth was built in a long collaboration with Claude (Anthropic): shader features, UI panels, the composition engine, the embed running on the landing page, the SVG diagrams in these docs, and this site itself. Claude wrote most of the code. The author made the choices.
The tone and direction are human. A lot of the typing is not.