A minimal example of an Emergent Model: a cellular automaton pre-trained to perform a computation, in this case doubling a number.
Pre-print: https://doi.org/10.55277/ResearchHub.70e8enig
Support our project: https://new.researchhub.com/fund/4130/emergent-models-a-general-modeling-framework-as-an-alternative-to-neural-networks
EM-43 is a one-dimensional cellular automaton with a neighborhood of 3 and 4 possible cell states: `0` (blank), `1` (P – program), `2` (R – marker), and `3` (B – separator/halt).

The initial state of the automaton (the tape) is constructed as follows:
`[program] BB 0^(n+1) R 0`
The program is a sequence of fixed length (in this model, 32 cells) placed at the beginning of the tape. During training, this program is searched/optimized; together with the rule table, it defines the behaviour of the model.
The separator `BB` acts as a clear boundary between program and input. The encoded input is placed after the separator: it consists of n+1 zeroes, followed by a red marker `R`, and a trailing `0`.
This structure is critical: the automaton's computation unfolds starting from this initialized state, processing the interaction between the program, the input beacon, and the evolving cell dynamics.
To provide an input number n, the tape is initialized as `[program] BB 0^(n+1) R 0`. This creates a beacon, where the number of `0`s before the `R` encodes the value n.
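The encoding described above can be sketched in Python as follows. The cell values (0 blank, 1 program, 2 for `R`, 3 for `B`) follow the text; the function name and the 32-cell program length constant are illustrative, not taken from the repository.

```python
PROGRAM_LEN = 32  # fixed program length used in this model


def encode_tape(program, n):
    """Build the initial tape [program] BB 0^(n+1) R 0 for input n."""
    assert len(program) == PROGRAM_LEN
    # program cells, then separator BB, then n+1 blanks, the R marker,
    # and a trailing blank
    return list(program) + [3, 3] + [0] * (n + 1) + [2, 0]


tape = encode_tape([1] * PROGRAM_LEN, 5)  # beacon encodes n = 5
```

The `R` marker thus sits n+1 cells after the `BB` separator, which is exactly the distance the decoder measures.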
The automaton halts when blue (`B`) cells occupy ≥50% of the non-blank tape.
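This halting condition can be written as a small predicate; a minimal sketch, assuming state 3 is `B` and state 0 is blank as described above:

```python
def should_halt(tape):
    """Halt when B (state 3) cells make up at least 50% of the non-blank cells."""
    non_blank = [c for c in tape if c != 0]
    return len(non_blank) > 0 and 2 * non_blank.count(3) >= len(non_blank)
```

In a simulation loop, this predicate would be checked after every update step.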
The rightmost `R` is located, and the output is decoded as:

`output = position(R) − position(last B) − 2`

This mirrors the encoding procedure.
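A sketch of the decoding rule, using the same state values as above (the function name is illustrative):

```python
def decode_output(tape):
    """output = position(rightmost R) - position(last B) - 2."""
    r_pos = max(i for i, c in enumerate(tape) if c == 2)
    b_pos = max(i for i, c in enumerate(tape) if c == 3)
    return r_pos - b_pos - 2
```

Applied to a freshly encoded tape `[program] BB 0^(n+1) R 0`, this recovers n, which is what "mirrors the encoding procedure" means.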
The system was trained using genetic algorithms to solve the task:
output = 2 × input
Only the rule table and the program portion of the initial state were evolved.
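With 4 states and a neighborhood of 3, the rule table maps each of the 4³ = 64 possible neighborhoods to a next state. The update step can be sketched as below; the placeholder identity rule is for illustration only, since the actual table is what the genetic search evolves.

```python
import itertools


def step(tape, rule):
    """One synchronous update; cells beyond the tape edges are treated as blank (0)."""
    padded = [0] + list(tape) + [0]
    return [rule[(padded[i - 1], padded[i], padded[i + 1])]
            for i in range(1, len(padded) - 1)]


# Placeholder rule: each cell keeps its own state. A trained EM-43 rule
# table would instead be one of the 4^64 tables found by evolution.
identity_rule = {nbhd: nbhd[1] for nbhd in itertools.product(range(4), repeat=3)}
```

Running `step` repeatedly until the halting condition holds, then decoding the tape, gives the model's output for a given input.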
Through this process, the automaton learned to perform emergent computation — solving the task without explicit programming.
The model was trained exclusively on inputs from 1 to 30, but it discovered a general algorithm for doubling — not just memorizing examples.
It generalizes perfectly to all natural numbers (excluding n = 0), despite never having seen them during training.