

Bellweather Studios Engineering

The engineering behind a multi-surface software platform - native C++20 plugins, web services, desktop tooling, validation infrastructure, and architecture visualization - built and maintained solo.

Platform at a Glance

A monorepo, built as one system.

Native plugins, web services, a desktop client, shared C++ modules, and a layered validation stack - all inside a single engineering system.

VST3, AU, AAX, CLAP, and Standalone plugin formats

One C++ codebase, every host on the market — VST3, AU, AAX, CLAP, Standalone.

25+ shared C++ modules

DSP, UI, RT-safety, testing, licensing, telemetry — built once, consumed by every plugin.

7 web & desktop apps

Next.js services and an Electron client in one workspace, one build graph.

90+ plugin test files, 110+ web/E2E specs

Catch2 for DSP internals, DawDreamer for plugin-as-host, Vitest for web, Playwright for journeys.

20+ CI workflows

Plugin validation, security baselines, drift detection, supply-chain audit, release gates.

65+ pre-push gates in ~20s

Type strictness, secret scan, architecture drift, reproducibility — caught before CI ever sees it.

System Architecture

How the pieces connect.

The platform is organized as a modular monorepo. Products, apps, cloud services, shared modules, and validation systems evolve inside one engineering system. For the product-facing side, see the company context and the plugin catalog.

Native Products

Pressure

Dynamics processor

Barometer

Metering & analysis

Stratus

Convolution reverb

Ceiling

Limiter

Flux

Earned saturation clipper

Equilibrium

Spatial processor

Web & Desktop

Website

Next.js 16 - product site + auth gate

Weather Station

Electron 38 - desktop app

Cupola

Three.js architecture viz

Observatory

Hosted VST3/AU release-gate

Bellweather Cloud

backend-api

Node service on AWS ECS Fargate (revenue runtime)

Ground Control

Licensing & update API

Mission Control

Admin dashboard & dispatch

Neon PostgreSQL

Serverless database

CloudFront CDN

cdn.bellweatherstudios.com

Upstash Redis

API rate limiting

Native Shared Kernel

Domain layer

Zero-dependency canonical types - compile-time boundary enforced

DSP layer

Native biquads, FFT, smoothing - no framework lock-in

RT-safety layer

Audio thread guards, allocation tracking, scope enforcement

Adapter boundary

Framework confined to one module - everything above speaks stdlib

Plugin infrastructure

Shared UI, licensing, state, presets, testing across all plugins

Validation & Observability

BPTS

Plugin testing suite (Catch2)

DawDreamer

Python integration tests

BOL Probe

Black-box binary validator

Format-native validators

pluginval / auval / clap-validator / DigiShell

Playwright

E2E journey tests

Observatory

Hosted release-gate + proof artifacts

Through-lines

Shared schemas, native modules, validation tooling, release workflows, and system documentation connect these surfaces. The design token pipeline, RT-safety enforcement, and contract boundaries keep them coherent as one system.

How It's Built

Systems that shape everything else.

Each one cuts across the entire platform and defines how products, services, and tooling stay coherent.

01

One token source, three target languages

Python generators emit C++ constexpr headers, TypeScript modules, and CSS custom properties from one JSON. Chosen over per-surface theming because color and spacing drift between the plugin UI and the marketing site is the canonical "looks fine in isolation, looks broken side-by-side" failure. Trade: any token change recompiles every consumer — undetected drift costs more than recompilation.
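The generator itself isn't shown on this page; a minimal sketch of the single-source idea, with hypothetical token names and emitter functions, might look like this:

```python
import json

# Hypothetical single-source token file (in the real pipeline, one JSON on disk).
TOKENS_JSON = '{"color_accent": "#E8B04B", "spacing_unit_px": 8}'

def emit_cpp(tokens: dict) -> str:
    """Emit a C++ header of constexpr values from the token map."""
    lines = ["#pragma once", "namespace tokens {"]
    for name, value in tokens.items():
        if isinstance(value, str):
            lines.append(f'inline constexpr const char* {name} = "{value}";')
        else:
            lines.append(f"inline constexpr int {name} = {value};")
    lines.append("}  // namespace tokens")
    return "\n".join(lines)

def emit_ts(tokens: dict) -> str:
    """Emit a TypeScript module exporting the same tokens."""
    return "\n".join(f"export const {k} = {json.dumps(v)};" for k, v in tokens.items())

def emit_css(tokens: dict) -> str:
    """Emit CSS custom properties (kebab-case) from the same map."""
    body = "\n".join(f"  --{k.replace('_', '-')}: {v};" for k, v in tokens.items())
    return ":root {\n" + body + "\n}"

tokens = json.loads(TOKENS_JSON)
cpp, ts, css = emit_cpp(tokens), emit_ts(tokens), emit_css(tokens)
```

All three targets are derived in one pass from one map, which is what makes side-by-side drift structurally impossible rather than merely detected.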

02

Compile-time RT-safety, not runtime profiling

RtGuard makes audio-thread allocations and locks a build error, not a regression caught after a customer report. Sanitizer-backed validation lanes (ASan/UBSan/TSan) cover the runtime side. Chosen over "test for it after it ships" because audio dropouts under host pressure are invisible in normal play and unrecoverable at the customer site.

03

Layer-specific test canon, not one runner everywhere

Catch2 for plugin internals, DawDreamer for plugin-as-DAW-host integration, Vitest for web units, Playwright for cross-surface journeys (90+ plugin test files, 110+ web/E2E specs). Chosen over a single monorepo runner because each layer's failure mode is distinct — a DSP regression and a checkout-flow break are not the same question.

04

Pre-push enforcement, not post-merge inspection

65+ gates run locally in ~20s before push (type strictness, secret scan, architecture drift, reproducibility, public-API exposure) so CI catches escapes, not first-line violations. Chosen over branch-protection-only because a failed merge is a 10-minute round-trip; a failed pre-push is 30 seconds. Trade: every contributor's machine runs the full suite — acceptable solo, re-evaluated at first hire.
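One way to structure such a hook is a flat gate registry run with fail-fast reporting — a sketch under hypothetical gate names; the real gates shell out to type checkers, secret scanners, and drift detectors:

```python
import sys
import time

# Hypothetical registry: each gate is a name plus a zero-argument check
# returning True on pass. Placeholders stand in for the real tooling.
def no_secrets() -> bool:
    return True  # placeholder: scan the staged diff for credential patterns

def types_strict() -> bool:
    return True  # placeholder: run the strict type check

GATES = [("secret-scan", no_secrets), ("type-strictness", types_strict)]

def run_gates() -> int:
    """Run every gate, report failures, return a shell-style exit code."""
    start = time.monotonic()
    failed = [name for name, check in GATES if not check()]
    elapsed = time.monotonic() - start
    for name in failed:
        print(f"GATE FAILED: {name}", file=sys.stderr)
    print(f"{len(GATES) - len(failed)}/{len(GATES)} gates passed in {elapsed:.1f}s")
    return 1 if failed else 0

exit_code = run_gates()
```

A pre-push hook that exits non-zero blocks the push, which is exactly the 30-second round-trip the section describes.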

05

Boundaries enforced at the seam, not described in docs

Shared kernel admits only code identical across consumers, stable, and free of app-specific knowledge — a three-question rubric, not "best practices." API route safety, error-format consistency, and forward-compatibility are predeploy gates, not conventions. Chosen over architectural decision records alone because docs decay; gates fail loudly. Gate scope itself is now declared by manifests co-located with the source they govern, after a path-pinned check silently rotted.
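Declaring gate scope next to the code it governs could look like the following — the manifest shape and gate name here are hypothetical, illustrating scope resolution from a co-located file instead of paths pinned inside the gate script:

```python
import fnmatch
import json

# Hypothetical manifest co-located with a package, declaring which files
# its gate governs.
MANIFEST = json.loads("""
{
  "gate": "public-api-exposure",
  "scope": ["src/api/**/*.ts", "src/api/**/*.tsx"]
}
""")

def in_scope(path: str, manifest: dict) -> bool:
    """True if a changed file falls under the manifest's declared scope."""
    return any(fnmatch.fnmatch(path, pattern) for pattern in manifest["scope"])

hit = in_scope("src/api/routes/checkout.ts", MANIFEST)
miss = in_scope("docs/readme.md", MANIFEST)
```

Because the manifest moves with the source it describes, a directory rename updates the gate's scope in the same commit — the failure mode that rotted the path-pinned check.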

06

Runtime split via proxy-via-website, not big-bang cutover

Revenue-side runtime is moving off Vercel onto AWS ECS Fargate one route at a time. The website stays the auth gate and frontend shell; routes opt into the new runtime per-flag, and the website signs internal envelopes for the AWS-side service. Chosen over a big-bang Vercel-to-AWS cutover because rollback at any moment is one env flip + redeploy — no data migration, no DNS dance, no customer-visible change. Trade: shadow-comparison and dual-secret rotation per route, paid as a recurring discipline cost rather than a single high-risk event.
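Signed internal envelopes between the website and the AWS-side service can be sketched with an HMAC — field names and the secret here are hypothetical; the real scheme (and its per-route dual-secret rotation) isn't shown:

```python
import hashlib
import hmac
import json
import time

SECRET = b"shared-internal-secret"  # hypothetical; rotated per route in practice

def sign_envelope(payload: dict, secret: bytes) -> dict:
    """Website side: wrap a payload with a timestamp and an HMAC-SHA256 signature."""
    body = json.dumps({"payload": payload, "ts": int(time.time())}, sort_keys=True)
    sig = hmac.new(secret, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def verify_envelope(envelope: dict, secret: bytes, max_age_s: int = 60) -> bool:
    """Service side: constant-time signature compare plus a freshness window."""
    expected = hmac.new(secret, envelope["body"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, envelope["sig"]):
        return False
    return int(time.time()) - json.loads(envelope["body"])["ts"] <= max_age_s

env = sign_envelope({"route": "checkout", "user": "abc"}, SECRET)
ok = verify_envelope(env, SECRET)
```

The signature makes the website the only valid caller of the new runtime, so flipping a route's flag back to Vercel never leaves an open internal endpoint behind.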

Enforcement Signals

What is actually enforced.

Tests, package boundaries, validation flows, and reproducibility rules guard the architecture at its seams. Public-facing versions of that validation story live in the Testing Suite and Research Notes.

Boundary tests

Orchestration routes are guarded by seam tests that prevent transport code from absorbing command logic. Contract schemas are shared across surfaces via workspace packages.

Security gates

Binary hardening tests verify compiler flags and symbol stripping. A security audit suite covers malicious state injection, buffer boundary attacks, and RSA credential verification.

Validation lanes

Format-aware plugin release-baseline orchestrator wraps native validator lanes per format: pluginval for VST3, auval for AU, clap-validator for CLAP, DigiShell / AAX Validator for AAX. Hosted qualification via the JUCE-based Observatory app covers VST3 and AU. Daily contract smoke tests, weekly security and functional baselines, release-candidate gates, and production smoke tests run as named CI workflows on top of that.

Reproducibility gates

Environment pins, doctor checks, deterministic JUCE provisioning, and SHA-256 verified distribution artifacts reduce drift between dev, validation, and release.
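SHA-256 artifact verification reduces to streaming the file through a hash and comparing against the published digest; a minimal sketch (helper names are illustrative):

```python
import hashlib
import tempfile

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Stream the file through SHA-256 so large artifacts never load whole."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_artifact(path: str, expected_hex: str) -> bool:
    """True only when the computed digest matches the published one."""
    return sha256_of(path) == expected_hex

# Demonstration against a throwaway file standing in for a plugin artifact.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"plugin artifact bytes")
    artifact_path = tmp.name

expected = sha256_of(artifact_path)
ok = verify_artifact(artifact_path, expected)
```

The same check can run at build, at distribution-warehouse upload, and at install, which is what ties dev, validation, and release to one artifact identity.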

Validation Pipeline

Five stages, progressive trust.

Each stage must pass before the next begins. Early stages are cheap and fast; later stages are exhaustive. The full pipeline is documented in the Observatory.

1

Characterize

Auto-discover modes, channels, parameter ranges across all plugin formats

2

Topology

Behavioral identity: latency, tail, silence, bypass coherence

3

Spec Validate

Hardware-derived assertions: THD, IMD, noise floor, frequency response across every engine type

4

Gauntlet Fuzz

Assertions covering NaN, Inf, DC offset, denormals, and rapid parameter automation

5

Mutation

Inject faults into DSP path, verify detection, confirm kill rate before release

All pass → Ship

Any fail → Reject
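The stage ordering above can be sketched as a short-circuiting runner — stage internals here are placeholders; only the progressive-trust control flow is the point:

```python
# Hypothetical stage checks: each returns True on pass. The real stages run
# characterization, topology, spec, fuzz, and mutation tooling in that order.
STAGES = [
    ("characterize", lambda: True),
    ("topology", lambda: True),
    ("spec-validate", lambda: True),
    ("gauntlet-fuzz", lambda: True),
    ("mutation", lambda: True),
]

def run_pipeline() -> str:
    """Run stages in order; the first failure rejects, a clean sweep ships."""
    for name, check in STAGES:
        if not check():
            return f"REJECT at {name}"
    return "SHIP"

verdict = run_pipeline()
```

Cheap early stages mean an obviously broken build never pays for the exhaustive late ones.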

Stack

What it runs on.

Native: C++20, JUCE, CMake, Catch2, LibFuzzer - VST3/AU/AAX/CLAP/Standalone, universal binary (ARM + Intel)
Desktop: Electron 38, Vite, electron-builder
Web: Next.js 16, React 19, TypeScript, Tailwind 4, Zod
Data: PostgreSQL (Neon), Prisma 6
Visualization: Three.js, React Three Fiber, React Spring
Infrastructure: AWS ECS Fargate, ECR, Docker, ALB, S3 / CloudFront, Vercel, GitHub Actions, pre-push quality-gate system
Shipping: Apple Developer ID signing, notarization, Gatekeeper validation, S3/CloudFront warehouse, SHA-256 verification
Quality: Sanitizer-backed validation lanes (ASan/UBSan/TSan), DawDreamer integration tests, format-native validators (pluginval/auval/clap-validator/AAX Validator), Playwright, BPTS pipeline
Codegen: Python - cross-language token pipeline (JSON → C++ constexpr, TypeScript, CSS)
CI / Tooling: Bash (CI, hooks), GitHub Actions

Public Surfaces

Live systems you can inspect.

These surfaces expose parts of the platform directly. The same thread continues through the blog and the Bellweather philosophy.

Ground Control

Public licensing and API surface. Useful for seeing the operational side of the platform.

Open Ground Control

Cupola

Architecture visualization surface. Useful for seeing how the platform is modeled and presented publicly.

Open Cupola

Bellweather Studios

The main product site. Useful for seeing how the engineering system is expressed at the product layer.

Open main site

GitHub profile

Public profile-level context and links into the broader Bellweather engineering story.

Open GitHub

Architecture In Motion

Serious systems are rarely static.

Boundaries, validation flows, and shell-versus-domain seams are explicit and test-enforced. Research notes, bug postmortems, and architecture documents track every structural decision as the system evolves.

The architecture keeps moving because the products and delivery surfaces keep moving. Legibility is the constraint - every decision gets documented, tested, or both.

Numbers on this page are floored from a build-time scan of the repo; a predeploy gate fails the build if they drift.