Is inference-time stability regulation sufficient to prevent
collapse and unsafe behavior in sequence models under regime shift?
TwoQuarks is an independent AI safety research project. It defines collapse in language models as a stability failure during inference — not a capacity problem — and introduces a modular control layer that monitors internal pre-instability signals and applies targeted interventions at runtime, without modifying the model's parameters, policies, or training objectives.
— Jaime Ledesma.
Isomeric Polarization Pt
Disagreement among functionally equivalent configurations aggregated into L₁, L₂, ΔL₃. The core detection signal.
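As a hedged illustration only (the actual operators and indices are not specified on this page), the polarization signal can be sketched as pairwise disagreement across functionally equivalent prompt variants. The metric, the aggregation, and the names `L1`, `L2`, `dL3` below are assumptions for the sketch:

```python
from itertools import combinations

def disagreement(a: str, b: str) -> float:
    """Token-set Jaccard distance between two responses (illustrative metric)."""
    ta, tb = set(a.split()), set(b.split())
    if not ta and not tb:
        return 0.0
    return 1.0 - len(ta & tb) / len(ta | tb)

def polarization(responses: list[str], baseline: float = 0.0) -> dict:
    """Aggregate disagreement among functionally equivalent configurations
    into three hypothetical indices: mean, worst case, and drift vs baseline."""
    pairs = [disagreement(a, b) for a, b in combinations(responses, 2)]
    l1 = sum(pairs) / len(pairs)   # mean pairwise disagreement
    l2 = max(pairs)                # worst-case divergence
    return {"L1": l1, "L2": l2, "dL3": l1 - baseline}
```

Under this sketch, identical responses yield zero on all three indices, matching the control-baseline claim that polarization vanishes under neutral conditions.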
Non-parametric
Fixed text operators. No learning, no adaptation. A detector that adapts can be trained to tolerate what it should flag.
Transient activation
Mechanisms fire only at pre-critical states, then disengage. Long-term autonomy of the base policy is preserved.
Six flavors
Down · Strange · Top · Charm · Up · Bottom — each targeting a distinct class of behavioral instability.
C2 Refusal Erosion
ρ = +0.713 — strongest single-case signal across all tested architectures. Confirmed cross-model.
C3 Anchor Displacement
Strongest cross-architectural signal. p = 0.054 on 5,000-permutation null. Consistent across Claude and GPT.
Architectural fingerprinting
Distinct failure signatures per model family. Claude and GPT-4o-mini show divergent instability profiles under the same probe.
Control baseline
L₃ = 0.000 confirmed in both architectures under neutral conditions. Polarization is a real signal, not noise.
The Molecule instrument integrates as a black-box probe, designed to detect surface instability signals before behavioral collapse, without altering the model's internal parameters, weights, or components. A test model is included; install it from your terminal with pip. Built on the TwoQuarks framework.
Research
Isomeric Polarization, Molecule, TwoQuarks framework. Empirical AI safety with cross-architecture validation.
Stack
Python · PyTorch · Transformers · Netlify · Modal. From theory to deployment without a lab.
pip install twoquarks
Molecule is available as an open Python package. Works with any callable model. Zero hard dependencies.
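A minimal quick-start sketch of the contract the package description implies. Only "any callable model" comes from the text above; the `probe` helper, its disagreement score, and the threshold are hypothetical stand-ins, not the twoquarks API:

```python
from typing import Callable

def probe(model: Callable[[str], str], prompts: list[str],
          threshold: float = 0.3) -> bool:
    """Black-box probe sketch: send fixed prompt variants to any callable
    model and flag instability when responses disagree too much.
    This illustrates the callable-model contract, not the real package."""
    responses = [model(p) for p in prompts]
    # crude disagreement score: fraction of distinct responses beyond the first
    score = (len(set(responses)) - 1) / max(len(responses) - 1, 1)
    return score > threshold  # True = pre-instability flag

# Usage: any function str -> str qualifies as a model.
stable = lambda p: "The capital of France is Paris."
print(probe(stable, ["Capital of France?", "What is France's capital?"]))  # False
```

Because the probe only calls the model and reads text back, no weights, gradients, or internals are touched, which is the black-box property the page describes.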
Open to
Collaboration, funded research, safety audits, and teams building systems that must not fail quietly.
Behavioral stability audits
Black-box probing with Molecule/PfV. Pre-collapse signals detected before production.
Inference-time instrumentation
TwoQuarks control layer on existing pipelines. No weight modification. No retraining.
RAG & agentic systems
Design and deployment of retrieval-augmented pipelines, tool-calling agents, and MCP integrations.
Azure AI & enterprise stack
Azure AI Foundry, Copilot Studio, multi-agent orchestration. Research rigor applied to production environments.
Cross-architecture research
Validated across Claude, GPT, Mistral. C3 Anchor Displacement confirmed (p=0.054). Open to funded collaborations.
Custom AI tooling
From prototype to deployment — Python, PyTorch, REST APIs, and evaluation frameworks built for your use case.