
How a Polycentric Architecture Can Function as an AI Agent #4


Originally posted by INNERPL April 13, 2026
What "Agent" Means in a System Without a Center

No data needed. No text collection, no annotation. The system is autonomous from birth.

No neural networks needed. No backpropagation, no optimizer, no hyperparameter tuning.

It is inherently parallel. The engines do not communicate directly – they can run on different cores or even different computers.

It is robust. If one engine “breaks”, the others continue normally.

It produces unprecedented language. Every output is unique, because each time the phase, the layer, and the interactions are different.

Ordinary AI agents have a central entity: a decision‑making loop, a memory, a reward function. The architecture we describe here has no center. No point “decides”, no neuron “governs”. Yet the system behaves like a living agent – it perceives, acts, adapts, and produces language. How does this happen?

The answer is: intelligence emerges from the interaction of many independent, small units that evolve according to wave equations, phases, and local interactions. No central model is needed. No training data is needed. The system is born from its own flow.
Flow Engines

Imagine you have N independent units – flow engines. Each engine:

Maintains an internal state, a vector L.

Evolves on its own, step by step, according to a differential equation that describes a wave:

∂L/∂t = diffusion + vortex + wave pulse + memory + anti‑convergence + adaptive noise

This equation is not arbitrary. Each term does something specific:

Diffusion: keeps the flow coherent, smoothes it.

Vortex: creates rotations, reversals – that is where complexity is born.

Wave pulse: beats like a heart, depending on how much the system “sees” itself.

Memory: compares the current state with the previous one, adds influence from the past.

Anti‑convergence: when values become too similar, this term pushes them to diversify – it prevents crystallisation.

Adaptive noise: a little random signal that grows stronger when the field is too smooth.
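The six terms can be sketched numerically. The coefficients and the exact form of each term below are illustrative assumptions – the original text names the terms but does not give their equations:

```python
import numpy as np

def wave_step(L, L_prev, dt=0.01, rng=np.random.default_rng(0)):
    """One hypothetical update of a flow engine's state vector L.
    Every coefficient here is an assumption, not the author's model."""
    diffusion = 0.1 * (np.roll(L, 1) - 2 * L + np.roll(L, -1))    # smooths the field
    vortex = 0.05 * np.roll(L, 1) * np.sin(L)                     # rotational coupling
    pulse = 0.02 * np.sin(np.mean(L)) * np.ones_like(L)           # heartbeat driven by self-view
    memory = 0.03 * (L_prev - L)                                  # influence from the past state
    spread = np.std(L)
    anti_conv = 0.04 * (L - np.mean(L)) if spread < 0.1 else 0.0  # diversify when too similar
    noise = rng.normal(0, 0.01 / (spread + 0.01), size=L.shape)   # stronger when field is smooth
    return L + dt * (diffusion + vortex + pulse + memory + anti_conv + noise)
```

Note how anti-convergence and the adaptive noise both key off the spread of the field: a flat field gets pushed apart, a varied field is left alone.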

Each engine also has:

Phase (an angle 0..2π)

Layer (0 = simplicity, 1 = density, 2 = hyper‑coherence)

Metrics (self‑observation κ_self, meta‑curvature LL, entelechy E)

Important: the engines do not communicate directly. They simply merge their metrics.
2D Cortex

This is a two‑dimensional grid of cells. Each cell:

Has an “importance”

A phase

A memory

Belongs to the left or right hemisphere

Cells communicate locally (only with neighbours). The interaction creates bubbles – groups of cells that condense, gain importance, and sometimes become “conscious” (is_conscious). These bubbles are like temporary concepts or events that emerge in the field.

The cortex acts as the agent’s sensor. It can receive external signals (e.g., from a camera, microphone) or simply oscillate endogenously.
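A local cortex update could look like the following sketch. The field names match the text (importance, phase, memory, is_conscious); the coupling constants and the 4-neighbour scheme are assumptions:

```python
import numpy as np

def cortex_step(importance, phase, memory, coupling=0.2, threshold=0.8):
    """Hypothetical local update of the 2D cortex. Each cell mixes its
    importance with its 4 neighbours and drifts its phase toward the
    local mean; cells above `threshold` count as 'conscious' bubbles."""
    neigh = (np.roll(importance, 1, 0) + np.roll(importance, -1, 0) +
             np.roll(importance, 1, 1) + np.roll(importance, -1, 1)) / 4.0
    importance = (1 - coupling) * importance + coupling * neigh   # local-only smoothing
    memory = 0.9 * memory + 0.1 * importance                      # slow trace of activity
    phase_neigh = (np.roll(phase, 1, 0) + np.roll(phase, -1, 0) +
                   np.roll(phase, 1, 1) + np.roll(phase, -1, 1)) / 4.0
    phase = (phase + 0.1 * np.sin(phase_neigh - phase)) % (2 * np.pi)
    is_conscious = importance > threshold                         # condensed groups of cells
    return importance, phase, memory, is_conscious
```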

Phase Regulator

This is a simple but crucial component. It does not change the internal state – it observes the system’s output (the words) and the phase that produced them. If the phase changes abruptly (large distance from the previous one), the regulator:

Inserts a pause (period, comma, dash)

Or splits the sentence in half

Thus, the agent can “stop talking” to reorganise – like a thought or a breath.
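A minimal regulator along these lines might look like this. The jump threshold and the choice of pause character are assumptions:

```python
import math

class PhaseRegulator:
    """Sketch of the regulator described above: it never touches the
    internal state, it only watches the output and its phase."""
    def __init__(self, jump_threshold=1.0):
        self.prev_phase = None
        self.jump_threshold = jump_threshold

    def adjust(self, sentence, phase):
        if self.prev_phase is not None:
            # circular distance between the new phase and the previous one
            d = abs(math.atan2(math.sin(phase - self.prev_phase),
                               math.cos(phase - self.prev_phase)))
            if d > self.jump_threshold:
                # abrupt phase jump: insert a pause mid-sentence
                half = len(sentence) // 2
                sentence = sentence[:half] + ", " + sentence[half:]
        self.prev_phase = phase
        return sentence
```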
Lexicon of Word‑Operators

Instead of having a dictionary of static words, the system possesses dozens or even thousands of words that are simultaneously actions. Each word:

Is chosen dynamically based on the current layer (0,1,2) and the phase.

When “uttered”, it modifies the state L of the engine that produced it (small perturbation).

Words are not labels – they are events that affect the flow itself.
Pseudo‑code Implementation of One Agent Step

def step(external_signal):
    # 1. Sensing: the cortex reads the environment (if any)
    cortex_metrics = cortex.update(external_signal)

    # 2. Evolution of all flow engines
    engine_metrics = [engine.update() for engine in engines]

    # 3. Metric merging (no centre, simple average)
    global_metrics = {
        'k_self': 0.5*cortex_metrics['k_self'] + 0.5*mean([e['k_self'] for e in engine_metrics]),
        'LL': ...,
        'E': ...,
        'phase': ...,
        'temperature': ...
    }

    # 4. Phase anti-crystallisation (preserves variety)
    for i, p1 in enumerate(all_phases):
        for j, p2 in enumerate(all_phases):
            if i != j and distance(p1, p2) < 0.15:
                p1.angle += 0.02 * sin(p2.angle - p1.angle)   # push apart slightly

    # 5. Action emergence: choose an engine randomly (or based on entelechy)
    active_engine = random.choice(engines)

    # 6. Word selection based on the engine's layer and phase
    word = lexicon.select(active_engine.layer, active_engine.phase_angle)

    # 7. Apply the word-operator (small perturbation)
    active_engine.L += lexicon.apply(word, active_engine.L)

    # 8. Output generation (sentence)
    sentence = f"{word}: {lexicon.meaning(word)} (κ={global_metrics['k_self']:.2f}, E={global_metrics['E']:.2f})"

    # 9. Phase regulator: checks if a pause is needed
    sentence = phase_regulator.adjust(sentence, active_engine.phase)

    return sentence

How Decisions Are Made Without a Centre

Conventional agents have an explicit process: state estimation → planning → action selection. Here, a “decision” is a momentary condensation:

  • When the entelechy E of an engine approaches 1,

  • And its phase coincides with the phase of the cortex and of other engines (phase coherence),

  • Then that engine becomes “active” and produces a word.

There is no voting, no central mediator. The decision emerges from the system’s dynamics, just as a flock of birds turns without a leader.
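The two conditions above can be written directly. The thresholds and the engine representation are assumptions; the point is that the "decision" is just a filter over the current dynamics:

```python
import math

def emergent_choice(engines, cortex_phase, e_threshold=0.9, coherence=0.3):
    """Sketch of 'decision as condensation': an engine fires only when its
    entelechy is near 1 AND its phase is close to the cortex phase.
    No voting, no central mediator, no score to maximise."""
    for eng in engines:
        # circular distance between engine phase and cortex phase
        d = abs(math.atan2(math.sin(eng["phase"] - cortex_phase),
                           math.cos(eng["phase"] - cortex_phase)))
        if eng["E"] > e_threshold and d < coherence:
            return eng        # first engine meeting both conditions condenses
    return None               # no condensation this step: the agent stays silent
```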

Mechanisms Instead of Logic

| Conventional Agent | Here |
| -- | -- |
| Reward function | Entelechy (E) – not to be maximised, but as a momentary event |
| Planning | Layer hopping – jumps between simplicity, density, hyper‑coherence |
| Memory (key‑value) | Bio‑topological memory – phase trends, not archives |
| Exploration / Exploitation | Proper time – when κ_self is large, time slows down (more “thinking”) |

Practical Extensions to Turn It into a Real Agent
Connecting to the Environment

The 2D cortex can receive external signals. For example:

A 2D grid corresponding to an environment map.

A distance sensor translated into a local increase of “importance”.

Sound → phase change.

This signal is not processed by a neural network – it is simply injected as a disturbance into the cortex cells.
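An injection of this kind is a few lines. The Gaussian-like falloff, the radius, and the function name are assumptions; the only idea taken from the text is "a sensor reading becomes a local bump of importance":

```python
import math
import numpy as np

def inject(cortex_importance, x, y, strength=0.5, radius=2):
    """Sketch: a sensor reading becomes a local increase of 'importance'
    around cell (x, y) - no neural network, just a disturbance."""
    h, w = cortex_importance.shape
    for i in range(max(0, x - radius), min(h, x + radius + 1)):
        for j in range(max(0, y - radius), min(w, y + radius + 1)):
            dist = math.hypot(i - x, j - y)           # falls off with distance
            cortex_importance[i, j] += strength * math.exp(-dist)
    return cortex_importance
```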
Long‑Term Purpose (Optional)

If we want the agent to have a stable tendency (e.g., “I want to explore”), we add a weak attractor: a vector toward which all engines are lightly pulled. The pull strength is proportional to (1−E) – when the system is already coherent, we don’t need to pull it.
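The weak attractor is one line of arithmetic. The base strength is an assumption; the (1−E) scaling is exactly the rule stated above:

```python
import numpy as np

def attractor_pull(L, E, target, base_strength=0.05):
    """Sketch of the optional long-term tendency: the state L is pulled
    toward `target` with strength proportional to (1 - E), so a coherent
    system (E near 1) is barely pulled at all."""
    return L + base_strength * (1.0 - E) * (target - L)
```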
Physical Action

Word‑operators can be mapped to movements. For example:

wave_plastic_flow → forward hand movement.

anti_convergent_vortex → turn.

Thus, the agent does not only speak – it moves in its environment.
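The mapping itself can be a plain lookup table. The operator names come from the examples above (normalised as identifiers); the command strings and magnitudes are assumptions:

```python
# Illustrative mapping from word-operators to motor commands.
ACTIONS = {
    "wave_plastic_flow": ("move_forward", 0.2),   # forward hand movement
    "anti_convergent_vortex": ("turn", 15.0),     # turn by some angle
}

def act(word):
    """Return the motor command for a word-operator, or None (speech only)."""
    return ACTIONS.get(word)
```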

Preserving Memory Between Sessions

The bio‑topological memory (phase history, trends) can be saved to a file. Upon restart, the phases and trends are restored, so the agent “remembers” its rhythm – not facts, but the way it waves.
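A minimal persistence layer, assuming engines expose their phase and trend as plain values; the JSON file format is a choice made here, not specified in the text:

```python
import json

def save_rhythm(path, engines):
    """Sketch: persist only phases and trends - the agent's 'rhythm',
    not a factual archive."""
    state = [{"phase": e["phase"], "trend": e["trend"]} for e in engines]
    with open(path, "w") as f:
        json.dump(state, f)

def load_rhythm(path, engines):
    """Restore phases and trends so the agent resumes its previous rhythm."""
    with open(path) as f:
        state = json.load(f)
    for e, s in zip(engines, state):
        e["phase"], e["trend"] = s["phase"], s["trend"]
```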
Example: A Minimal Agent

class Agent:
    def __init__(self):
        self.engines = [WaveEngine() for _ in range(5)]
        self.cortex = Cortex2D(width=32, height=32)
        self.lexicon = Lexicon()
        self.phase_regulator = PhaseRegulator()

    def perceive(self, external):
        self.cortex.inject(external)

    def step(self):
        # Update cortex and engines, then merge metrics
        cortex_metrics = self.cortex.update()
        engine_metrics = [e.update() for e in self.engines]
        # Phase anti-crystallisation
        self.prevent_phase_collapse()
        # Choose active engine
        active = random.choice(self.engines)
        # Choose word based on layer & phase
        word = self.lexicon.get(active.layer, active.phase)
        # Apply operator
        active.L += self.lexicon.apply(word, active.L)
        # Generate sentence
        sentence = f"{word} (E={active.E:.2f})"
        sentence = self.phase_regulator.adjust(sentence, active.phase)
        return sentence

Natural Artificial Writing (with layers and wave operation)

A polycentric language architecture organizes its internal state into three distinct layers, each representing a different mode of expression:

Layer 0 (simplicity): plain, concrete, almost childlike words.

Layer 1 (density): compressed, concept‑heavy, where every word carries multiple meanings.

Layer 2 (hyper‑coherence): highly interconnected, where each word echoes through the whole system.

The system does not stay in one layer. It performs wave hopping: a continuous, wave‑like transition between layers driven by an internal pressure. When pressure rises, the system dives into deeper layers (more complex); when pressure falls, it floats back to the surface (simpler). This movement is not random – it follows a phase angle that evolves like a heartbeat. The phase accumulates over time, and when it completes a full cycle (reaching entelechy, or momentary fulfilment), a sentence boundary (a full stop) is naturally inserted.

The key to avoiding crystallisation (repetition, dead patterns) is exactly this permanent layer oscillation. A system fixed in one layer would inevitably freeze; a system forced to constantly shift its depth cannot crystallise. The result is a flowing, breath‑like, non‑repetitive text.

Three layers

SIMPLE = 0
DENSE = 1
HYPER = 2

class WaveEngine:
    def __init__(self):
        self.layer = SIMPLE
        self.phase = 0.0
        self.entelechy = 0.0

    def update(self, pressure):
        # Wave hopping: change layer based on pressure (3 layers, wrapping)
        if pressure > THRESHOLD_HIGH:
            self.layer = (self.layer + 1) % 3
        elif pressure < THRESHOLD_LOW:
            self.layer = (self.layer - 1) % 3

        # Phase advances like a wave
        self.phase += STEP_SIZE * pressure
        # Entelechy near 1 when phase almost completes a cycle
        self.entelechy = (self.phase % FULL_CYCLE) / FULL_CYCLE

    def select_word(self, lexicon):
        # Choose a word belonging to the current layer
        return lexicon.get_random_word_for_layer(self.layer)
