
Commit 3e4a8b5

Enhance documentation and planner
1 parent: 23b44ab

File tree: 8 files changed (+190, -18 lines)

README.md

Lines changed: 7 additions & 0 deletions
```diff
@@ -19,6 +19,7 @@
 
 </div>
 
+
 <div align="center">
 
 ⭐ Help us reach more AI/ML engineers and grow the Synalinks community. Star this repo ⭐
@@ -34,6 +35,12 @@
 
 </div>
 
+<div align="center">
+
+Too busy to read the documentation? Give the [llms.txt](https://synalinks.github.io/synalinks/llms.txt) or [llms-full.txt](https://synalinks.github.io/synalinks/llms-full.txt) to your favorite LMs or AI coding tools.
+
+</div>
+
 ## What is Synalinks?
 
 Synalinks is an open-source framework that makes it easy to create, evaluate, train, and deploy industry-standard Language Models (LMs) applications like **graph RAGs, autonomous agents, multi-agent systems or self-evolving systems**. Synalinks follows the principle of *progressive disclosure of complexity*: meaning that simple workflows should be quick and easy, while arbitrarily advanced ones should be possible via a clear path that builds upon what you've already learned.
```

docs/Differences with Keras.md

Lines changed: 129 additions & 0 deletions
# Differences with Keras

## A Complete Guide

This document provides a comprehensive guide for translating Keras concepts into Synalinks. While Keras is designed for building traditional neural networks with tensor operations, Synalinks is a framework for creating **neuro-symbolic programs** that combine language models with structured reasoning.

---

## Fundamental Paradigm Shift

### Keras: Neural Network Framework

- **Purpose**: Build and train deep neural networks using tensor operations
- **Core abstraction**: Mathematical tensors flowing through differentiable layers
- **Training**: Gradient-based optimization (backpropagation)
- **Computation**: Matrix multiplications and activation functions
- **Use cases**: Computer vision, time series, traditional ML tasks

### Synalinks: Neuro-Symbolic LM Framework

- **Purpose**: Build intelligent applications combining LMs with symbolic reasoning
- **Core abstraction**: Structured JSON data flowing through modular programs
- **Training**: Reinforcement learning with LM-based optimizers
- **Computation**: Language model inference + symbolic operations
- **Use cases**: AI agents, reasoning systems, structured generation, API orchestration

---

## Quick Concept Mapping

| **Keras Concept** | **Synalinks Equivalent** | **Key Difference** |
|-------------------|--------------------------|--------------------|
| `Layer` | `Module` | Processes JSON instead of tensors |
| `Model` | `Program` | DAG of modules with conditional logic |
| `Tensor` | `Data Model` | JSON object with schema validation |
| Tensor shape | JSON schema | Explicit structure definition |
| Weights/biases | `Trainable Variable` | JSON objects, not floating-point arrays |
| Loss function | Reward function | Maximize reward vs. minimize loss |
| Backpropagation | LM-based optimization | No gradients; uses language model reasoning |
| `model.compile()` | `program.compile()` | Sets up LM optimizer instead of SGD/Adam |
| `model.fit()` | `program.fit()` | Reinforcement learning loop |
---

## Core Concepts Explained

### Module (Layer Equivalent)

A **Module** is a self-contained computational unit that:

- **Receives**: JSON Data Models conforming to input schemas
- **Processes**: Via LM calls, symbolic operations, or hybrid approaches
- **Outputs**: JSON Data Models conforming to output schemas

#### Key Module Properties

- **Keras Layer**: Receives tensors, performs matrix operations, outputs tensors
- **Synalinks Module**: Receives JSON, performs LM/symbolic operations, outputs JSON with schema validation
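This contract can be illustrated with a toy example in plain Python. The class, schemas, and `validate` helper below are invented for illustration; they are not the real Synalinks `Module` API:

```python
def validate(data: dict, schema: dict) -> None:
    """Minimal schema check: every field must exist and have the declared type."""
    for field, expected_type in schema.items():
        if field not in data:
            raise ValueError(f"missing field: {field}")
        if not isinstance(data[field], expected_type):
            raise TypeError(f"{field} must be {expected_type.__name__}")

class UppercaseModule:
    """Toy module: validated JSON in, validated JSON out (no tensors involved)."""
    input_schema = {"text": str}
    output_schema = {"text": str, "length": int}

    def __call__(self, inputs: dict) -> dict:
        validate(inputs, self.input_schema)    # inputs must conform to the schema
        outputs = {"text": inputs["text"].upper(), "length": len(inputs["text"])}
        validate(outputs, self.output_schema)  # so must the outputs
        return outputs

result = UppercaseModule()({"text": "hello"})
```

The point of the sketch is the contract, not the computation: a malformed input fails loudly at the module boundary instead of propagating silently, which is the JSON analogue of a shape mismatch in Keras.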
### Program (Model Equivalent)

A **Program** orchestrates multiple Modules into a directed acyclic graph (DAG).

#### Keras Model vs Synalinks Program

- **Keras**: Fixed computation graph of tensor operations
- **Synalinks**: Dynamic graph with conditional branching based on LM decisions

In Keras, you build sequential or functional models with fixed layer connections. In Synalinks, you create programs that can include conditional branches, where different modules execute based on LM-driven decisions or data conditions.
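Such a branch can be sketched in plain Python, with a stub decision function standing in for the LM-driven choice (all names here are hypothetical):

```python
def decide_branch(inputs: dict) -> str:
    # Stand-in for an LM decision: route questions containing digits to "math".
    return "math" if any(ch.isdigit() for ch in inputs["question"]) else "general"

def math_module(inputs: dict) -> dict:
    return {**inputs, "route": "math"}

def general_module(inputs: dict) -> dict:
    return {**inputs, "route": "general"}

def program(inputs: dict) -> dict:
    # Which module executes depends on the decision, unlike a fixed Keras graph.
    branch = decide_branch(inputs)
    module = math_module if branch == "math" else general_module
    return module(inputs)

routed = program({"question": "what is 2+2?"})
```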
### Data Models & JSON Schemas

Instead of implicit tensor shapes, Synalinks uses **explicit JSON schemas**:

- **Keras**: Input shape defined as dimensions (e.g., a 784-dimensional vector)
- **Synalinks**: Input structure defined as a JSON schema with explicit fields, types, and validation rules

**Why JSON?**

- **Interpretability**: Human-readable intermediate states
- **Validation**: Built-in schema validation
- **Interoperability**: Native compatibility with APIs and web services
- **Debugging**: Easy to inspect and modify
- **Structured generation**: Natural fit for LM constrained output
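To make the contrast concrete: a Keras input is declared by shape alone, while a JSON-schema input declares named, typed fields. The schema and the tiny `check` helper below are illustrative, not Synalinks APIs:

```python
# Keras-style: structure is implicit in the dimensions.
keras_input_shape = (784,)

# Synalinks-style: structure is an explicit, named, typed contract.
answer_schema = {
    "type": "object",
    "properties": {
        "answer": {"type": "string"},
        "confidence": {"type": "number"},
    },
    "required": ["answer", "confidence"],
}

def check(data: dict, schema: dict) -> bool:
    """Minimal required-fields and type check (a real validator does much more)."""
    types = {"string": str, "number": (int, float)}
    if any(field not in data for field in schema["required"]):
        return False
    return all(
        isinstance(data[field], types[spec["type"]])
        for field, spec in schema["properties"].items()
        if field in data
    )

valid = check({"answer": "42", "confidence": 0.9}, answer_schema)
```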
---

## Training Paradigm Differences

### Keras: Gradient Descent

- Defines a differentiable loss function (e.g., categorical cross-entropy)
- Computes gradients via automatic differentiation (backpropagation)
- Updates weights using gradient information through optimizers like Adam or SGD
- Requires continuous, differentiable operations throughout

### Synalinks: LM-Based Optimization

- Defines a reward function (higher values indicate better performance)
- No gradient computation; uses language models to reason about improvements
- The LM analyzes current performance and proposes better configurations
- Updates trainable variables (JSON objects) based on LM suggestions
- Can incorporate discrete decisions and non-differentiable operations
### Key Training Differences

| **Aspect** | **Keras** | **Synalinks** |
|------------|-----------|---------------|
| **Objective** | Minimize loss | Maximize reward |
| **Optimization** | Gradient descent | Reinforcement learning |
| **Update mechanism** | Mathematical derivatives | LM-generated improvements |
| **Trainable params** | Float tensors | Structured JSON objects |
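The loop can be caricatured in a few lines, with random edits standing in for LM-proposed improvements; the variable and reward below are invented toys, not the Synalinks optimizer API:

```python
import random

random.seed(0)  # deterministic toy run

def reward(variable: dict) -> float:
    # Toy reward: more instructions score higher (a real reward would measure
    # task performance, e.g. exact-match accuracy on a dataset).
    return float(len(variable["instructions"]))

def propose(variable: dict) -> dict:
    # Stand-in for an LM suggesting an improved configuration.
    return {"instructions": variable["instructions"] + [f"hint-{random.randint(0, 9)}"]}

variable = {"instructions": ["answer concisely"]}
best = reward(variable)
for _ in range(5):
    candidate = propose(variable)
    if reward(candidate) > best:  # keep the candidate only if reward improves
        variable, best = candidate, reward(candidate)
```

Note that nothing here needs to be differentiable: the trainable variable is a JSON object, and acceptance is a discrete comparison of reward values.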
---

## When to Use Each Framework

### Use Keras when:

- Working with numerical data (images, signals, time series)
- You need fast, vectorized computations
- You have large labeled datasets
- The problem has clear differentiable objectives
- Deploying to edge devices with limited resources

### Use Synalinks when:

- Building LM-powered applications
- You need symbolic reasoning and logic
- Working with structured or semi-structured text data
- You require interpretable intermediate steps
- Building API orchestration or agent systems
- You need adaptive, context-aware processing
- You want to combine multiple LMs and tools

---

## Summary

While Keras excels at building traditional neural networks with tensor operations and gradient-based training, Synalinks is designed for a fundamentally different purpose: creating **neuro-symbolic applications** that combine the reasoning capabilities of language models with structured, validated data processing.

The shift from tensors to JSON, from gradients to LM-based optimization, and from fixed architectures to adaptive programs represents not just a technical change, but a paradigm shift in how we think about building intelligent systems. Synalinks is ideal when you need interpretability, flexibility, and the ability to integrate language understanding with symbolic reasoning, making it perfect for next-generation AI applications.

docs/Introduction.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -10,6 +10,10 @@ Synalinks is an *adaptation of Keras 3* focused on neuro-symbolic systems and in
 
 ---
 
+!!! info
+    You can use the [`llms.txt`](/synalinks/llms.txt) or [`llms-full.txt`](/synalinks/llms-full.txt) to feed your favorite LMs with Synalinks documentation
+
+
 ## Who is Synalinks for?
 
 Synalinks is designed for a diverse range of users, from professionals and AI researchers to students, independent developers, and hobbyists. It is suitable for anyone who wants to learn about AI by building/composing blocks or build solid foundations for enterprise-grade products. While a background in Machine Learning and Deep Learning can be advantageous — as Synalinks leverages design patterns from Keras, one of the most user-friendly and popular Deep Learning frameworks — it is not a prerequisite. Synalinks is designed to be accessible to anyone with programming skills in Python, making it a versatile and inclusive platform for AI development.
```

docs/index.md

Lines changed: 3 additions & 0 deletions
````diff
@@ -1,5 +1,8 @@
 # Quickstart
 
+!!! info
+    You can use the [`llms.txt`](/synalinks/llms.txt) or [`llms-full.txt`](/synalinks/llms-full.txt) to feed your favorite LMs with Synalinks documentation
+
 ## Install
 
 ```shell
````

mkdocs.yml

Lines changed: 19 additions & 0 deletions
```diff
@@ -1,4 +1,5 @@
 site_name: Synalinks
+site_description: Keras based LM framework for neuro-symbolic applications
 site_url: https://synalinks.github.io/synalinks
 repo_url: https://github.com/SynaLinks/synalinks
 repo_name: SynaLinks/synalinks
@@ -19,6 +20,7 @@ extra_css:
   - stylesheets/extra.css
 
 markdown_extensions:
+  - admonition
   - pymdownx.highlight:
       anchor_linenums: true
   - pymdownx.superfences:
@@ -32,11 +34,24 @@ plugins:
   - mkdocstrings:
      default_handler: python
   - glightbox
+  - llmstxt:
+      full_output: llms-full.txt
+      markdown_description: Keras based LM framework for neuro-symbolic applications and In-Context learning
+      sections:
+        Usage documentation:
+          - Introduction.md
+          - Synalinks API/Programs API/Program training API.md
+          - Synalinks API/Programs API/The Program class.md
+          - Synalinks API/Programs API/The Sequential class.md
+          - Synalinks API/* module.md
+          - Synalinks API/Rewards/* reward.md
+          - Synalinks API/Rewards/* wrappers.md
 
 nav:
   - index.md
   - Introduction.md
   - FAQ.md
+  - Differences with Keras.md
   - Code Examples:
     - Basics:
       - Code Examples/Basics/First Steps.md
@@ -117,8 +132,12 @@ nav:
       - Synalinks API/Callbacks API/CSVLogger.md
       - Synalinks API/Callbacks API/BackUpAndRestore.md
       - Synalinks API/Callbacks API/EarlyStopping.md
+      - SynaLinks API/Callbacks API/Monitor.md
     - Hooks API:
       - Synalinks API/Hooks API/index.md
+      - SynaLinks API/Hooks API/Base Hook class.md
+      - Synalinks API/Hooks API/Logger.md
+      - SynaLinks API/Hooks API/Monitor.md
     - Ops API:
       - Synalinks API/Ops API/index.md
       - Synalinks API/Ops API/JSON Ops.md
```

shell/doc.sh

Lines changed: 1 addition & 0 deletions
```diff
@@ -5,5 +5,6 @@ uv pip install mkdocs
 uv pip install mkdocs-material
 uv pip install mkdocstrings[python]
 uv pip install mkdocs-glightbox
+uv pip install mkdocs-llmstxt
 
 uv run mkdocs serve
```

synalinks/src/modules/synthesis/sequential_plan_synthesis.py

Lines changed: 26 additions & 17 deletions
```diff
@@ -39,8 +39,7 @@ class SequentialPlanSynthesis(Module):
     each individual step. The most common runners are usually a `FunctionCallingAgent`,
     `ChainOfThought` or `Generator` module, but you can use any Module or Program.
 
-    This module start by defaut without any plan, so it is equivalent to a `ChainOfThought` module,
-    iteratively, the plan will be constructed and optimized to solve the task.
+    This module starts by default without any plan, so it is equivalent to a single runner call.
 
     This module works **ONLY** with advanced optimizers (**NOT** the `RandomFewShot` optimizer).
 
@@ -166,26 +165,36 @@ def __init__(
         )
 
     async def call(self, inputs, training=False):
+        if not inputs:
+            return None
         steps = self.state.get("steps")
         previous_steps = None
-        for i, step in enumerate(steps):
-            step_result = await self.runner(inputs, training=training)
-            if not previous_steps:
-                previous_steps = step_result
-            else:
-                previous_steps = await ops.concat(
-                    previous_steps,
-                    step_result,
-                    name=+f"step_{i}_with_inputs"+self.name,
+        if steps:
+            for i, step in enumerate(steps):
+                step_result = await self.runner(inputs, training=training)
+                if not previous_steps:
+                    previous_steps = step_result
+                else:
+                    previous_steps = await ops.concat(
+                        previous_steps,
+                        step_result,
+                        name=f"step_{i}_with_inputs_"+self.name,
+                    )
+                inputs = await ops.concat(
+                    inputs,
+                    await ops.concat(
+                        previous_steps,
+                        Step(step=step),
+                        name=f"step_{i}_"+self.name,
+                    ),
+                    name=f"step_{i}_with_inputs_"+self.name,
                 )
+        else:
+            result = await self.runner(inputs, training=training)
             inputs = await ops.concat(
                 inputs,
-                await ops.concat(
-                    previous_steps,
-                    Step(step=step),
-                    name=f"step_{i}_"+self.name,
-                ),
-                name=f"step_{i}_with_inputs_"+self.name,
+                result,
+                name="with_inputs_"+self.name,
             )
         return await self.final_generator(inputs, training=training)
```

(The removed lines carried a `name=+f"..."` expression, which applies a unary `+` to a string and raises a `TypeError` at runtime; the added lines above are shown with that expression corrected to match the module's other name patterns.)
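The control flow of the rewritten `call` above can be sketched with plain dicts; `runner` and `concat` here are simplified stand-ins for the module's runner and `ops.concat`, not the Synalinks API:

```python
import asyncio

async def runner(inputs: dict) -> dict:
    # Stand-in for an LM-backed runner module.
    return {"result": f"ran with {sorted(inputs)}"}

async def concat(a: dict, b: dict) -> dict:
    # Stand-in for ops.concat: merge two JSON-like objects.
    return {**a, **b}

async def call(inputs: dict, steps: list):
    # Mirrors the new logic: guard empty inputs, iterate over the plan when one
    # exists, otherwise fall back to a single runner call.
    if not inputs:
        return None
    previous_steps = None
    if steps:
        for step in steps:
            step_result = await runner(inputs)
            previous_steps = (
                step_result
                if previous_steps is None
                else await concat(previous_steps, step_result)
            )
            inputs = await concat(inputs, await concat(previous_steps, {"step": step}))
    else:
        result = await runner(inputs)
        inputs = await concat(inputs, result)
    return inputs

no_plan = asyncio.run(call({"query": "q"}, steps=[]))
with_plan = asyncio.run(call({"query": "q"}, steps=["draft", "review"]))
```

Note the fallback branch: with an empty plan the module reduces to a single runner call whose output is merged back into the inputs, matching the docstring change in the first hunk.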
synalinks/src/version.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -3,7 +3,7 @@
 from synalinks.src.api_export import synalinks_export
 
 # Unique source of truth for the version number.
-__version__ = "0.5.300"
+__version__ = "0.5.301"
 
 
 @synalinks_export("synalinks.version")
```
