Yuvraj-cyborg/logical-transformer
Logical Transformer

A rule-driven mini-transformer implemented in Rust that performs multi-fact, multi-layer reasoning using symbolic rules instead of embeddings.

Architecture

flowchart TD
    A["Input Query<br/>(e.g. (is kitty bird))"] --> B["Query Encoder<br/>(normalize into Pattern)"]
    B --> C["Attention over Rules<br/>- match_scores<br/>- add biases<br/>- softmax → weights"]
    C --> D["Value Aggregator<br/>(combine consequents)"]
    D --> E["Context Update<br/>(add new facts to context)"]
    D --> F["Final Output<br/>(answers, top consequents)"]
    E -->|"Iterate next layer<br/>(chaining until fixpoint)"| C

    subgraph "Rules Memory"
        R1["Rule 1: IF (is ?x bird)<br/>THEN (can ?x fly)"]
        R2["Rule 2: IF (is ?x penguin)<br/>THEN (can ?x fly), bias=-5"]
        R3["Rule 3: IF (is ?x injured)<br/>THEN (can ?x fly), bias=-3"]
    end

    C -.-> R1
    C -.-> R2
    C -.-> R3

How It Works

The system implements a query-key-value attention mechanism where:

  • Query (Q): current fact / question
  • Key (K): rule pattern (IF-part)
  • Value (V): rule consequent (THEN-part)
  • Score: similarity between query and key
  • Weight: normalized score (softmax optional)
  • Output: weighted sum of values (substitute variables → concrete facts)
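The query-key step above amounts to matching a query fact against a rule's IF-pattern, binding `?x`-style variables along the way. The engine's internals aren't shown in the README, so this is a minimal stdlib-only sketch (the function name `match_pattern` and the string representation are assumptions, not the crate's actual API):

```rust
use std::collections::HashMap;

/// Match a query fact against a rule pattern. Literal tokens must be
/// equal; `?x`-style tokens bind to the corresponding query token.
/// Returns Some(bindings) on a full match, None otherwise.
fn match_pattern(pattern: &str, query: &str) -> Option<HashMap<String, String>> {
    let strip = |s: &str| -> Vec<String> {
        s.trim_matches(|c| c == '(' || c == ')')
            .split_whitespace()
            .map(str::to_string)
            .collect()
    };
    let (p, q) = (strip(pattern), strip(query));
    if p.len() != q.len() {
        return None;
    }
    let mut bindings = HashMap::new();
    for (pt, qt) in p.iter().zip(q.iter()) {
        if pt.starts_with('?') {
            bindings.insert(pt.clone(), qt.clone()); // variable: bind it
        } else if pt != qt {
            return None; // literal mismatch: no match, score is effectively zero
        }
    }
    Some(bindings)
}

fn main() {
    let b = match_pattern("(is ?x bird)", "(is tweety bird)").unwrap();
    assert_eq!(b["?x"], "tweety");
    assert!(match_pattern("(is ?x bird)", "(is tweety fish)").is_none());
}
```

A boolean match like this corresponds to a 0/1 score; a softer engine could instead count matching tokens to produce a graded similarity.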

Mathematical Foundation

weight_i = softmax(Q · K_i + b_i)
Output = Σ_i weight_i · V_i

Where b_i is the rule bias.
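As a sketch of that formula, the bias is simply added to each rule's raw match score before the softmax, so a strongly negative bias (like the penguin rule's -5) suppresses that rule's weight even when its pattern matches:

```rust
/// weight_i = exp(s_i + b_i) / Σ_j exp(s_j + b_j)
fn softmax_with_bias(scores: &[f64], biases: &[f64]) -> Vec<f64> {
    let logits: Vec<f64> = scores.iter().zip(biases).map(|(s, b)| s + b).collect();
    // Subtract the max logit for numerical stability before exponentiating.
    let max = logits.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = logits.iter().map(|l| (l - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    // Two rules both match (score 1.0); the second carries bias -5.
    let w = softmax_with_bias(&[1.0, 1.0], &[0.0, -5.0]);
    assert!(w[0] > 0.99 && w[1] < 0.01); // the biased rule is nearly silenced
    assert!((w.iter().sum::<f64>() - 1.0).abs() < 1e-9);
}
```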

Features

  • Load rules from JSON (pattern, consequent, bias)
  • Multi-fact context inference
  • Multi-layer reasoning (chained rules)
  • Variable substitution in consequents
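Variable substitution in consequents, the last feature above, replaces each bound variable in the THEN-part with its binding to produce a concrete fact. A minimal sketch under the same assumed string representation:

```rust
use std::collections::HashMap;

/// Turn "(can ?x fly)" plus {?x: tweety} into "(can tweety fly)".
fn substitute(consequent: &str, bindings: &HashMap<String, String>) -> String {
    consequent
        .split_whitespace()
        .map(|tok| {
            // Strip parentheses so "(can" and "fly)" still compare correctly.
            let core = tok.trim_matches(|c| c == '(' || c == ')');
            match bindings.get(core) {
                Some(value) => tok.replace(core, value), // keep surrounding parens
                None => tok.to_string(),
            }
        })
        .collect::<Vec<_>>()
        .join(" ")
}

fn main() {
    let mut b = HashMap::new();
    b.insert("?x".to_string(), "tweety".to_string());
    assert_eq!(substitute("(can ?x fly)", &b), "(can tweety fly)");
}
```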

Example

Rules Configuration

Create a rules.json file:

[
    {
        "id": 1,
        "pattern": "(is ?x bird)",
        "consequent": "(can ?x fly)",
        "bias": 0.0
    },
    {
        "id": 2,
        "pattern": "(is ?x penguin)",
        "consequent": "(can ?x fly)",
        "bias": -5.0
    },
    {
        "id": 3,
        "pattern": "(is ?x injured)",
        "consequent": "(can ?x fly)",
        "bias": -3.0
    }
]
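In Rust, each JSON entry maps naturally onto a struct whose fields mirror the keys above (the real loader presumably deserializes with serde; here the rules are constructed by hand to keep the sketch dependency-free):

```rust
/// A rule as stored in rules.json: an IF-pattern, a THEN-consequent,
/// and an additive bias applied to the match score before the softmax.
#[derive(Debug, Clone)]
struct Rule {
    id: u32,
    pattern: String,
    consequent: String,
    bias: f64,
}

fn main() {
    let rules = vec![
        Rule { id: 1, pattern: "(is ?x bird)".into(),    consequent: "(can ?x fly)".into(), bias: 0.0 },
        Rule { id: 2, pattern: "(is ?x penguin)".into(), consequent: "(can ?x fly)".into(), bias: -5.0 },
        Rule { id: 3, pattern: "(is ?x injured)".into(), consequent: "(can ?x fly)".into(), bias: -3.0 },
    ];
    assert_eq!(rules.len(), 3);
    assert_eq!(rules[1].bias, -5.0);
}
```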

Usage

use logical_transformer::{Engine, load_rules, parse_pattern};

// Load rules from JSON
let rules = load_rules("rules.json");
let engine = Engine::new(rules);

// Query the system
let query = parse_pattern("(is tweety bird)");
let results = engine.infer(&query);

// Results: [(can tweety fly), weight]
for (consequent, weight) in results {
    println!("{:?} with confidence {:.2}", consequent, weight);
}

Multi-Layer Reasoning

// Start with known facts
let facts = vec![
    parse_pattern("(is tweety bird)"),
    parse_pattern("(is tweety injured)"),
];

// Perform multi-layer inference
let results = engine.infer_multi_layer(&facts, 3); // 3 layers max

// The system will chain rules:
// Layer 1: (is tweety bird) + (is tweety injured)
// Layer 2: Derives (can tweety fly) with competing biases
// Layer 3: Further inference if applicable
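The layer-by-layer chaining sketched in those comments is forward chaining to a fixpoint: each layer fires every rule whose antecedent is already known, and inference stops when a layer derives nothing new or the layer budget runs out. A simplified, weight-free sketch over ground (variable-free) facts, with rules reduced to hypothetical (antecedent, consequent) string pairs:

```rust
use std::collections::HashSet;

/// Forward-chain until no new facts appear (fixpoint) or max_layers is hit.
/// The real engine matches patterns with variables and tracks weights;
/// this sketch only shows the layering/termination logic.
fn infer_layers(facts: &[&str], rules: &[(&str, &str)], max_layers: usize) -> HashSet<String> {
    let mut known: HashSet<String> = facts.iter().map(|f| f.to_string()).collect();
    for _layer in 0..max_layers {
        let new: Vec<String> = rules
            .iter()
            .filter(|(ante, _)| known.contains(*ante)) // antecedent already derived?
            .map(|(_, cons)| cons.to_string())
            .filter(|c| !known.contains(c)) // keep only genuinely new facts
            .collect();
        if new.is_empty() {
            break; // fixpoint: nothing new to derive
        }
        known.extend(new);
    }
    known
}

fn main() {
    let rules = [
        ("(is tweety bird)", "(can tweety fly)"),
        ("(can tweety fly)", "(has tweety wings)"),
    ];
    // Layer 1 derives (can tweety fly); layer 2 chains to (has tweety wings).
    let out = infer_layers(&["(is tweety bird)"], &rules, 3);
    assert!(out.contains("(has tweety wings)"));
}
```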

Building and Running

# Build the project
cargo build

# Run tests
cargo test

# Run the main example
cargo run

License

This project is open source and available under the MIT License.
