WCNegentropy

AI Models

HuggingFace

huggingface.co/WCNegentropy

Experimental AI architectures and novel algorithms — published openly for research and academic use. Each model explores a distinct structural hypothesis about how intelligence and memory might be better organized.

LICENSE AGPLv3
USE Free for research & academic use
Research · AGPLv3 · 793K–771M params

BitTransformerLM

Binary transformer architecture

A novel binary transformer for low-resource settings using 9-bit encoding (8 data + 1 parity). Features a reversible architecture enabling exact gradient computation, and built-in safety telemetry for alignment research.

Encoding 9-bit (8 data + 1 parity)
Architecture Reversible transformer
Parameters 793K – 771M (configurable)
Safety Built-in telemetry
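The 9-bit encoding above (8 data bits plus 1 parity bit per byte) can be illustrated with a minimal sketch. This is not BitTransformerLM's actual codec — just a hypothetical example of how an even-parity bit detects single-bit corruption in each 9-bit group:

```python
# Hypothetical sketch of an 8-data + 1-parity encoding (not the model's real codec).
def encode_byte(b: int) -> list[int]:
    """Expand one byte into 9 bits: 8 data bits (MSB first) + 1 even-parity bit."""
    bits = [(b >> i) & 1 for i in range(7, -1, -1)]
    parity = sum(bits) % 2  # chosen so the total count of ones is even
    return bits + [parity]

def check_byte(bits: list[int]) -> bool:
    """Return True if the 9-bit group passes the even-parity check."""
    return len(bits) == 9 and sum(bits) % 2 == 0

encoded = encode_byte(0b10110010)
assert check_byte(encoded)

# Flipping any single bit breaks parity, so the corruption is detected.
corrupted = encoded.copy()
corrupted[3] ^= 1
assert not check_byte(corrupted)
```

The parity bit adds 12.5% overhead but gives the model a built-in integrity signal on every byte of its bitstream.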
Research · AGPLv3 · 150+ dB PSNR

WrinkleBrane

Wave-interference memory system

A wave-interference memory system using 4D tensor operations as the primary computational primitive. Achieves high-precision associative memory with parallel retrieval — demonstrated at 150+ dB PSNR recall accuracy.

Computation 4D tensor operations
Memory Wave-interference associative
Retrieval Parallel, 150+ dB PSNR
License AGPLv3
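The general idea behind interference-based associative memory — superposing many key/value patterns into one tensor and retrieving them in parallel by correlation — can be sketched as follows. This is an illustrative toy with mutually orthogonal keys, not WrinkleBrane's actual 4D algorithm; the PSNR computation shows how recall fidelity of this kind is measured:

```python
import numpy as np

rng = np.random.default_rng(0)
n_slots, dim = 8, 64

# Orthonormal keys act like non-interfering "wavefronts": columns of a
# random orthogonal matrix (via QR), transposed to shape (n_slots, dim).
keys = np.linalg.qr(rng.standard_normal((dim, n_slots)))[0].T
values = rng.standard_normal((n_slots, dim))

# Write: superpose the key-value outer products into one memory matrix.
memory = np.einsum('nk,nv->kv', keys, values)  # (dim, dim)

# Read: correlating every key with the memory retrieves all slots at once;
# orthogonality cancels the cross-interference terms.
recalled = keys @ memory  # (n_slots, dim)

# Recall fidelity as PSNR, the metric quoted for WrinkleBrane.
mse = np.mean((recalled - values) ** 2)
peak = np.max(np.abs(values))
psnr = 10 * np.log10(peak**2 / mse)
```

With exactly orthogonal keys the only recall error is floating-point round-off, so the PSNR here lands far above 150 dB — the regime the model card reports.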

Open research posture. All models are published under AGPLv3 and freely available for academic research, experimentation, and non-commercial use. For commercial licensing inquiries, get in touch.

In Development

Further experimental architectures are in active development as part of the broader WCNegentropy research agenda — including work related to post-quantum cryptography (PQC) privacy layers and procedural generation systems. Watch the HuggingFace profile for updates.