NCL NightCity Labs

Project · 2025 · In progress

The Blue is Sky: Sensory Alignment & Perceptual Drift

Research into how agents and humans co-adapt to shifted spectral inputs, informing computer vision architectures and learned perception.

Research · Simulation · Perception · Neuroscience · Computer Vision

Motivation

The Blue is Sky program extends our manifesto into hardware and field experiments. We engineer complementary multiband filters—one per eye—to create six effective cone fundamentals so participants can learn genuinely new chromatic categories. Rather than treat colour as fixed physics, we treat it as a learnable associative space and push on it until new qualia appear.

Spectral engineering

Using the Spectral Filter Statistical Study (v1 plan), we simulate hundreds of per-eye coatings against natural image statistics, daylight/LED illuminants, and physiological comfort constraints. Each candidate pair is scored for:

  1. Metamer splitting and mutual information gain relative to LMS vision.
  2. Comfort—binocular luminance balance, rivalry risk, migraine triggers.
  3. Stability across illuminants, manufacturing tolerances, and viewing angles.
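The mutual-information criterion in (1) can be sketched with a simple plug-in estimator: quantize the simulated responses of two channels (e.g., a filtered cone channel versus its unfiltered LMS counterpart, both hypothetical names here) and compute MI in bits from the joint histogram. This is an illustrative sketch, not the program's actual scoring pipeline.

```python
import math
from collections import Counter

def mutual_information(xs, ys, bins=8):
    """Plug-in MI estimate (bits) between two quantized response channels."""
    def quantize(vs):
        lo, hi = min(vs), max(vs)
        span = (hi - lo) or 1.0  # guard against a constant channel
        return [min(int((v - lo) / span * bins), bins - 1) for v in vs]
    qx, qy = quantize(xs), quantize(ys)
    n = len(xs)
    pxy = Counter(zip(qx, qy))  # joint counts
    px, py = Counter(qx), Counter(qy)  # marginal counts
    mi = 0.0
    for (a, b), c in pxy.items():
        p = c / n
        # log2( p(x,y) / (p(x) p(y)) ) expressed in raw counts
        mi += p * math.log2(c * n / (px[a] * py[b]))
    return mi
```

A candidate pair whose filtered channels carry information absent from LMS responses would show positive MI gain; a channel that is constant (or redundant) contributes nothing.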

We keep Pareto-optimal designs, send specs to vendors, and validate the coatings with spectrophotometer traces before they enter human trials.
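The Pareto step above amounts to discarding any candidate pair that another candidate beats on every axis. A minimal sketch, assuming each design is scored on the three criteria (higher is better on each; the `Candidate` fields are illustrative names, not the study's schema):

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    info_gain: float   # metamer splitting / MI gain vs. LMS
    comfort: float     # binocular balance, rivalry risk inverted
    stability: float   # robustness across illuminants and tolerances

def dominates(a, b):
    """True if a is at least as good as b on all axes and better on one."""
    ge = (a.info_gain >= b.info_gain and a.comfort >= b.comfort
          and a.stability >= b.stability)
    gt = (a.info_gain > b.info_gain or a.comfort > b.comfort
          or a.stability > b.stability)
    return ge and gt

def pareto_front(candidates):
    """Keep only non-dominated designs."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]
```

Designs surviving this filter trade the three criteria off against each other; no single "best" coating exists, which is why several specs go out to vendors.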

Adaptation protocols

Participants wear the filters in graded sessions (lab + VR) while we measure:

  • Unique-hue drift and asymmetric matching tasks.
  • Object/affect confusability matrices (does “blue” still mean distant, calm, sky?).
  • Physiological comfort (stereo acuity, SSQ scores) and neural correlates when available.
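The confusability matrices above can be built directly from naming trials: count how often each stimulus category elicits each reported label, then row-normalize. A minimal sketch (the trial format is an assumption, not the lab's actual logging schema):

```python
from collections import Counter, defaultdict

def confusability_matrix(trials):
    """trials: iterable of (stimulus_category, reported_label) pairs.
    Returns {stimulus: {label: probability}}, rows summing to 1."""
    counts = defaultdict(Counter)
    for stim, label in trials:
        counts[stim][label] += 1
    matrix = {}
    for stim, row in counts.items():
        total = sum(row.values())
        matrix[stim] = {label: n / total for label, n in row.items()}
    return matrix
```

Tracking how off-diagonal mass shifts across sessions is one way to quantify whether a new chromatic category is stabilizing or merely aliasing an old one.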

The hypothesis is that new ecological clusters (e.g., distinct foliage species, or painted blues versus natural ones) acquire stable language and feeling. We document the emergence of new colour categories through essays, instrumentation, and simulations.

Interfaces & outputs

The hardware and protocols bleed into public work: projection rooms where each eye sees different spectra, VR “qualia drift” experiences, and documentation that ties empirical findings back to the claim that colour = learned relation. The program feeds experiments, agent prompts, and computer vision models that learn alongside human perception.

Related projects and technical details can be linked here later.