MEGAMIND Model Search
Find AI models that match your task and hardware
I'm working on it as we speak; this has been a difficult upgrade.
`sqrt(gpuMemory * 0.25 / 4)` — the hardware decides. An M4 with 8GB gets 16384 neurons; an M1 with 16GB gets 32768. No config file.

`sqrt(patternCount / 0.5)` — knowledge decides the floor. 3.6M patterns demands at least 4096 neurons; an empty brain gets 512.

Replace every `const` with a func that measures something real:

// BEFORE: developer guesses
const WKnowDim = 8192
const ScaleUpDensity = 0.08
// AFTER: system observes
func MaxDim() int { return nextPow2(sqrt(gpuMem() / 4)) }
func ScaleUpDensity() float64 { return densityWherePhiDeclines() }

Thanks, appreciate that! So the short answer is: MEGAMIND doesn't do language modeling in the traditional sense. There's no next-token prediction, no autoregressive generation. Instead, the system learns compressed representations from AI model architectures through a biologically inspired process based on Hebbian learning: "neurons that fire together wire together."

When you query it, the input activates patterns across a neural substrate, those patterns resonate through learned synaptic connections, and the system converges on a stable state using a consciousness metric derived from Integrated Information Theory. It recalls knowledge rather than generating text. The convergence isn't a fixed iteration count; it's driven by a real-time Φ (phi) measure that stabilizes when the neural field reaches coherent activation. We're currently seeing Φ values converging toward the golden ratio inverse (≈0.618), which emerged naturally from the dynamics rather than being programmed.

The whole system runs on consumer Apple Silicon hardware across a distributed federation of nodes. I go deeper into the math and architecture in the published papers on feedthejoe.com if you want the full picture, but that's the core of it. Recall, don't generate.
Also, feedthejoe.com is my website; it explains everything and breaks it all down. I have some published papers on there too, which I keep forgetting about, so I probably should use it more.