MayaPortal
Real-Time Rendering Engine for Digital Twins · 2025–present
The Observing Eye
A digital twin is not its visualization. The digital twin is the territory — a computational reconstruction of neurons, valleys, or cortical circuits. MayaPortal provides the map: an interactive window into that territory, where the scientist can move freely through the reconstructed world, examining it from any angle, at any scale.
Rather than producing static images or pre-rendered animations, MayaPortal maintains a live, interactive viewport into the digital twin. The observer — represented as a point in space with orientation — can orbit, pan, and zoom through neural circuits or mountain landscapes in real time. Like any scientific instrument, MayaPortal has characteristics that shape what can be observed: resolution, frame rate, fidelity, and interaction latency. Building it from first principles ensures the scientist understands these transfer functions rather than accepting black-box defaults.
Functional Architecture
MayaPortal's architecture separates pure cores (testable, deterministic logic) from effectful shells (OS and GPU interaction). This isn't accidental — it's the central design decision that makes a complex rendering engine comprehensible:
Pure Core (no side effects)
- Simulation state updates: update(state, dt) → state'
- Camera orbit mathematics
- Mesh generation from signed distance fields
- Rendering command list generation
Effectful Shell (I/O boundaries)
- SDL3 event polling and window creation
- WebGPU device/surface initialization
- GPU resource uploads and command submission
- File I/O for assets and shaders
The data flow per frame follows an explicit pipeline: State → Input Processing → State Update (Pure) → Command Generation (Pure) → Command Execution (Effectful) → Metrics Collection.
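A minimal sketch of one such frame, with the caveat that the type and function names below (WorldState, InputEvents, CommandList, poll_events, submit, record_metrics) are illustrative stand-ins rather than MayaPortal's actual API:

struct WorldState  { /* camera, scene entities, simulation clock */ };
struct InputEvents { /* this frame's keyboard and mouse events */ };
struct CommandList { /* CPU-side description of GPU work */ };
struct GPUContext  { /* device, pipelines, layouts */ };

// Pure core: deterministic, unit-testable without a window or a GPU.
WorldState  update(const WorldState& state, const InputEvents& input, float dt);
CommandList generate_commands(const WorldState& state, const GPUContext& gpu);

// Effectful shell: the only functions that touch SDL3 or WebGPU.
InputEvents poll_events();                                    // SDL3 event queue
void submit(const GPUContext& gpu, const CommandList& cmds);  // WebGPU submission
void record_metrics(const WorldState& state, float dt);       // metrics collection

void run_frame(const GPUContext& gpu, WorldState& state, float dt) {
    const InputEvents input = poll_events();       // effect: input processing
    state = update(state, input, dt);              // pure: state update
    submit(gpu, generate_commands(state, gpu));    // pure generation, effectful execution
    record_metrics(state, dt);                     // effect: metrics collection
}

Only run_frame needs the real shell; the two pure functions can be exercised by tests with no window or device in sight.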
Monadic Patterns in C++23
The codebase names its recurring patterns using monadic vocabulary from functional programming, applied to GPU rendering:
| Pattern | Purpose | C++23 Mechanism |
|---|---|---|
| Reader | Immutable GPU context (device, pipelines, layouts) | const GPUContext& threaded to rendering functions |
| State | Simulation world evolution | Pure functions: update(state, input, dt) → state |
| Expected | Fallible operations without exceptions | std::expected<T, Error> with monadic chaining via and_then |
| Writer | Frame metrics accumulation | WithMetrics<T> carrying value + performance data |
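The Reader row, for example, is nothing more exotic than an immutable context threaded by const reference. A minimal sketch, assuming the field layout below (the real GPUContext will differ):

#include <webgpu/webgpu.h>   // wgpu-native's C header; include path varies by setup

struct GPUContext {
    WGPUDevice          device;
    WGPUQueue           queue;
    WGPURenderPipeline  pipeline;
    WGPUBindGroupLayout layout;
};

struct Mesh { /* vertex and index buffers */ };

// Rendering functions receive the environment read-only; nothing downstream
// can mutate or rebind it, which is exactly the Reader discipline.
void draw_mesh(const GPUContext& ctx, const Mesh& mesh);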
These aren't theoretical exercises. std::expected replaces exception-based error handling throughout: shader compilation, asset loading, and pipeline creation all return Expected values that chain cleanly:
return load_shader("basic.wgsl")
    .and_then([&](const std::string& wgsl) {
        return create_pipeline(device, wgsl);
    })
    .transform([&](Pipeline p) {
        return Renderer{ .pipeline = std::move(p) };
    });
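The Writer row is similarly concrete. A sketch of what a WithMetrics<T> carrier could look like (the field names and the and_then merge below are assumptions, not the project's actual definition):

#include <utility>

// Carries a value together with the performance data accumulated while
// producing it; chaining merges metrics so nothing is lost between stages.
struct FrameMetrics {
    double cpu_ms     = 0.0;
    double gpu_ms     = 0.0;
    int    draw_calls = 0;
};

template <typename T>
struct WithMetrics {
    T value;
    FrameMetrics metrics;

    template <typename F>
    auto and_then(F&& f) const {
        auto next = std::forward<F>(f)(value);   // produces WithMetrics<U>
        next.metrics.cpu_ms     += metrics.cpu_ms;
        next.metrics.gpu_ms     += metrics.gpu_ms;
        next.metrics.draw_calls += metrics.draw_calls;
        return next;
    }
};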
Technology Choices
| Technology | Rationale |
|---|---|
| WebGPU (wgpu-native) | Modern, explicit GPU API. Portable across platforms. Avoids Dawn's Chromium-scale build infrastructure. |
| SDL3 | Windowing and input only. Mature, cross-platform. Emscripten-ready for web deployment. |
| WGSL | WebGPU's native shader language. No transpilation pipeline needed. |
| C++23 | std::expected for monadic error handling, ranges, concepts for type constraints. |
| CMake + FetchContent | Deterministic dependency management. All deps pinned to specific git tags. |
| Catch2 v3 | Automated test discovery. Agents run tests non-interactively via ctest. |
| pybind11 | Python bridge for interactive scientific exploration. |
| Emscripten | Future web deployment: share visualizations via browser. |
Literate Programming as Architecture
MayaPortal's source of truth is not .cpp files — it's .org files. Code is extracted ("tangled") from literate Org-mode documents where prose explains intent and code realizes it. The build system includes a tangle target:
cmake --build build --target tangle # Extract code from Org files
This forces explanations to stay synchronized with implementation. Students and future collaborators read the Org files first, then explore the generated C++.
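A hypothetical fragment of such an Org file, showing the shape of the workflow (the heading text and the src/camera.cpp path are invented for illustration):

* Orbiting the target
The camera orbits a fixed target point; the pure math lives here.

#+begin_src C++ :tangle src/camera.cpp
// Written into src/camera.cpp when the tangle target runs.
Camera orbit(const Camera& cam, float yaw, float pitch);
#+end_src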
Lesson-Based Development
The codebase is organized as a curriculum. Each lesson teaches one concept and leaves the repository in a buildable, testable state:
- Lesson 00: Prelude (complete) — Development process, literate programming, std::expected, C++23 toolchain validation
- Phase 1: Foundation — SDL3 window, WebGPU surface, camera controller, debug overlay
- Phase 2: Primitives — Point clouds, lines, meshes, lighting
- Phase 3: Phenomenon Modules — Terrain heightfield, neuron morphology, particle systems, volumes
- Phase 4: Integration — Python bindings, serialization, annotation system
Immutable git tags (lesson/NN-slug) mark each waypoint. Anyone can fork from any lesson and extend.
Three-Face Workflow
Each development task produces three artifacts:
- Discuss (dialogue face) — Questions, alternatives, decisions and why. The deliberation.
- Plan (human face) — Written for learners. Explains why before what. Philosophical voice, architectural reasoning.
- Spec (machine face) — Written for executors. Minimal prose, exact paths, concrete signatures. Mechanically verifiable done-when criteria.
This structure enables human-AI collaboration where the human contributes judgment and embodied intuition, while the machine contributes rapid traversal of conceptual space and tireless attention to detail.
Types as Scientific Claims
MayaPortal uses a three-language approach where type signatures encode scientific propositions:
- Haskell — Specification: algebraic structure and composition laws
- C++23 — Implementation: executable realization on hardware
- Python — Exploration: interactive REPL for scientists
A type signature like HeightField → Mesh isn't just an interface — it's a geometric proposition stating that terrain data can be triangulated. The implementation is the constructive proof. Invalid compositions become unwritable, catching scientific errors at compile time.
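A minimal C++ rendering of that claim, with illustrative type layouts (the real HeightField and Mesh will carry more structure):

#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

// The signature is the proposition: every well-formed HeightField can be
// triangulated into a Mesh. The implementation is its constructive proof.
struct HeightField {
    std::size_t width  = 0;
    std::size_t height = 0;
    std::vector<float> samples;   // row-major elevation grid
};

struct Mesh {
    std::vector<std::array<float, 3>> vertices;
    std::vector<std::uint32_t>        indices;
};

Mesh triangulate(const HeightField& field);

// A composition that type-checks is a composition that makes geometric sense;
// handing a Mesh to code expecting a HeightField simply does not compile.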
Current State
MayaPortal is in early development. Lesson 00 (Prelude) is complete, establishing the development process, literate programming workflow, and C++23 toolchain. The foundational architecture — pure cores, effectful shells, monadic composition — is designed and documented. The SDL3/WebGPU rendering loop is the immediate next milestone.
The project demonstrates how a scientist with deep domain expertise (but no graphics engineering background) can build a domain-specific visualization engine from first principles through principled collaboration with AI thinking partners.