Commit b1e3a6f (1 parent: b327c64): "dev branch readme, pull this to cognosis"
README.md: 18 additions, 200 deletions

# Cognosis: A formal theory and application for testing LLMs and NLP logic, capability, physics, and the Free Energy Principle

## Introduction

Cognosis is a limited-platform Python application, a formal theory, and an in-development experiment combining Eric C.R. Hehner's Practical Theory of Programming (aPToP) with the Free Energy Principle of cognitive science and Natural Language Processing (NLP). The theory aims to develop a robust system for processing high-dimensional data, leveraging both classical and quantum principles. If any aspect of my explanation seems overly stilted, please keep in mind that I am **searching** for methodological and theoretical explanations for concepts and ideas that I encounter primarily in real life, with real workloads and stakes, not just in experiment. I am also self-taught, with no direct background in industry or science.

## associative knowledge base (this repo)

All directories that contain markdown files must include a `/media/` subdirectory for any multimedia files those markdown files reference.

To enable horrors such as this:

![this:](/image.png)

written as `! [ ... ] ( /image.png )` (without the spaces).

- [obsidian-markdown and associative 'knowledge base' README](/src/obkb/README.md)
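
The media-subdirectory rule above could be enforced with a small check. A minimal sketch; the function name and repo-root argument are illustrative, not part of this repo's actual tooling:

```python
from pathlib import Path

def find_missing_media_dirs(repo_root):
    """Return directories that contain markdown files but lack a media/ subdirectory."""
    missing = []
    for md_file in Path(repo_root).rglob("*.md"):
        directory = md_file.parent
        # Record each offending directory once.
        if directory not in missing and not (directory / "media").is_dir():
            missing.append(directory)
    return missing
```

Running this from a pre-commit hook (or CI job) would keep the convention honest as the knowledge base grows.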

## Frontmatter Implementation

- Utilize 'frontmatter' to include the title and other `property`, `tag`, etc. fields in the knowledge base article(s).

- For example:

```yaml
---
name: "Article Title"
link: "[[Related Link]]"
linklist:
- "[[Link1]]"
- "[[Link2]]"
---
```
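
Frontmatter of this shape can be read without third-party dependencies. A minimal sketch (the `parse_frontmatter` helper is illustrative; a real implementation would reach for a YAML parser such as python-frontmatter):

```python
def parse_frontmatter(text):
    """Extract simple key/value frontmatter from a markdown document.

    Handles scalar values and one-level lists (e.g. `linklist:`); anything
    fancier belongs to a proper YAML parser.
    """
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    meta, current_list = {}, None
    for line in lines[1:]:
        if line.strip() == "---":        # closing fence ends the block
            break
        if line.lstrip().startswith("- "):
            if current_list is not None:
                meta[current_list].append(line.lstrip()[2:].strip().strip('"'))
        elif ":" in line:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip().strip('"')
            if value:
                meta[key] = value        # scalar entry
                current_list = None
            else:
                meta[key] = []           # list header: collect following "- " lines
                current_list = key
    return meta
```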

## asyncio REST API

- [API README](/src/api/README.md)

Note: the `master` branch of cognosis is now on a different GitHub user account. This is the speculative moon branch. This is not a deprecation warning, because we were never precated to begin with. This repo will have artificial intelligence working on it, while the `master` branch will be human-maintained.

____

# Methods for Cognosis

## Abstract

This document specifies a series of constraints on the behavior of a computor (a human computing agent who proceeds mechanically) and applies these constraints to artificial intelligence systems such as the "llama" large language model (LLM). The constraints are based on formal principles of boundedness, locality, and determinacy, ensuring structured and deterministic operations. By enforcing them, we establish a common ground for evaluating and comparing the computational efficiency and energy consumption of humans and AI on specific tasks.

## Constraints

### Boundedness

**Symbolic Configuration Recognition (B.1):** There exists a fixed bound on the number of symbolic configurations a computor can immediately recognize.

**Internal States (B.2):** There exists a fixed bound on the number of internal states a computor can be in.

### Locality

**Configuration Change (L.1):** A computor can change only elements of an observed symbolic configuration.

**Configuration Shift (L.2):** A computor can shift attention from one symbolic configuration to another, but each newly observed configuration must be within a bounded distance of the immediately previously observed configuration.

### Determinacy and Autonomy

**Next Computation Step (D.1):** The immediately recognizable (sub-)configuration uniquely determines the next computation step and the next internal state. In other words, a computor's internal state together with the observed configuration uniquely fixes the next computation step and the next internal state.
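
Read together, B.1/B.2, L.1/L.2, and D.1 describe something very close to a Turing-style automaton. A minimal sketch under that reading (the class, rule format, and bound values are illustrative assumptions, not part of the formal specification):

```python
class Computor:
    """A computing agent obeying B.1/B.2 (finite symbols and states),
    L.1/L.2 (local edits, bounded attention shifts), and D.1 (the
    observed symbol plus the internal state fixes the next step)."""

    def __init__(self, states, symbols, rules, max_shift=1):
        self.states = frozenset(states)    # B.2: fixed, finite state set
        self.symbols = frozenset(symbols)  # B.1: fixed, finite symbol set
        self.rules = rules                 # D.1: (state, symbol) -> (state, write, move)
        self.max_shift = max_shift         # L.2: bound on attention shift

    def step(self, state, tape, head):
        observed = tape[head]              # only the observed configuration matters
        next_state, write, move = self.rules[(state, observed)]
        assert abs(move) <= self.max_shift, "L.2 violated: shift exceeds bound"
        tape = tape[:head] + [write] + tape[head + 1:]  # L.1: change only the observed cell
        return next_state, tape, head + move
```

Because `rules` is a plain dict keyed on (state, symbol), determinacy holds by construction: a given pair can map to only one next step.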

To ensure the scientific rigor of our comparative study, "work" is defined as an …

- **Humans:** Through learning and feedback.
- **LLM ("llama"):** Using autonomous iterative refinement with self-wrapping functions.

There is an assumption inherent in the project that a neural network is a cognitive system. The assumption is that there is something for this cognitive system to do in any given situation, and that it is the cognitive system's job to figure out what that thing is. Upon locating its head/parent, it either orients itself within an existing cognitive system or creates a new one. Cognitive systems pass namespaces, syntaxes, and other cognitive systems as parameters. Namespaces and syntaxes take the form of key-value pairs. Cognitive systems are also key-value pairs, but their values are themselves cognitive systems. `**kwargs` are used to pass these parameters.

"Cognitive systems are defined by actions, orientations within structures, and communicative parameters. 'State' is encoded into these parameters and distributed through the system."
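
The parameter-passing scheme above can be sketched in Python, where namespaces and syntaxes are plain dicts and the values of a cognitive system's mapping may themselves be cognitive systems. The class name and `orient` method here are illustrative assumptions, not the project's actual API:

```python
class CognitiveSystem:
    """Holds namespaces/syntaxes as key-value pairs; values that are
    themselves CognitiveSystem instances become child systems."""

    def __init__(self, **kwargs):
        # **kwargs carries namespaces, syntaxes, and nested cognitive systems.
        self.namespace = {k: v for k, v in kwargs.items()
                          if not isinstance(v, CognitiveSystem)}
        self.children = {k: v for k, v in kwargs.items()
                         if isinstance(v, CognitiveSystem)}
        self.parent = None

    def orient(self, parent=None):
        """Upon locating a head/parent, orient within it; with no parent
        found, this system stands alone as a new root."""
        self.parent = parent
        return self if parent is None else parent
```

State is then "encoded into these parameters": it lives in the key-value pairs distributed across the tree rather than in any single object.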

In a nutshell, "Morphological Source Code" is a paradigm in which the source code adapts and morphs in response to real-world interactions, governed by the principles of dynamic runtime configuration and contextual locking mechanisms. What is described here is an architecture only. The kernel agents themselves are sophisticated LLMs trained on ELFs, LLVM compiler code, systemd, unix, Python, and C. They will utilize natural language, along with the abstraction of time, to process cognosis frames and USDs. In our initial experiments "llama" is the presumptive agent, not a specifically trained kernel-agent model. The challenge of this architecture lies in the 'cognitive lambda calculus' needed to bring these runtimes into existence and evolve them, not in the computation itself. Cognosis is designed for consumer hardware and extreme scalability via self-distribution of cognitive systems (amongst constituent [[subscribers|asynchronous stake-holders]]) peer-to-peer, where stake is not unlike state, but is a function of a cognitive system's ability to contribute to the collective.

## Experimental Design

The "llama" LLM will process a large-scale, human-vetted dataset referred to as "mechanicalturkwork." The experiment aims to compare the performance metrics of humans and "llama" on the same tasks under standardized conditions.

4. **Iterative Enhancement:** Allow "llama" to use its self-wrapping functions for iterative refinement, while humans may adapt based on their learning.
5. **Comparative Analysis:** Analyze and compare the performance metrics, focusing on efficiency, energy consumption, and accuracy.
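
The comparative analysis of step 5 might, once measurements exist, be tabulated with a helper like the following. A sketch only: the metric names and the `compare` function are hypothetical, and no real measurements are implied:

```python
def compare(human, llama):
    """Return, per metric, which party scored better.

    `human` and `llama` map metric names to measurements; lower is
    better for energy consumption, higher is better otherwise.
    """
    lower_is_better = {"energy_joules"}
    verdict = {}
    for metric in human:
        h, m = human[metric], llama[metric]
        if metric in lower_is_better:
            verdict[metric] = "human" if h < m else "llama" if m < h else "tie"
        else:
            verdict[metric] = "human" if h > m else "llama" if m > h else "tie"
    return verdict
```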

## References

- Sieg, W. (2006). Essays on the Theory of Numbers: Dedekind Und Cantor. Cambridge University Press.
- Turing, A. M. (1936). On Computable Numbers, with an Application to the Entscheidungsproblem. Proceedings of the London Mathematical Society.
- Salomaa, A. (1985). Computation and Automata. Cambridge University Press.
- Silver, D. et al. (2016). Mastering the game of Go with deep neural networks and tree search. Nature.
- Brown, T. et al. (2020). Language Models are Few-Shot Learners. arXiv preprint arXiv:2005.14165.
133-
_____
134-
135-
136-
## concepts and application components
137-
138-
There is an assumption inherent in the project that a neural network is a cognitive system. The assumption is that there is something for this cognitive system to do in any given situation, and that it is the cognitive system's job to figure out what that thing is. Upon location of its head/parent, it either orients itself within a cognitive system or creates a new cognitive system. Cognitive systems pass as parameters namespaces, syntaxes, and cognitive systems. Namespaces and syntaxes are in the form of key-value pairs. Cognitive systems are also in the form of key-value pairs, but the values are cognitive systems. **kwargs are used to pass these parameters.
139-
140-
"Cognitive systems are defined by actions, orientations within structures, and communicative parameters. 'State' is encoded into these parameters and distributed through the system."
141-
142-
In a nutshell, "Morphological Source Code" is a paradigm in which the source code adapts and morphs in response to real-world interactions, governed by the principles of dynamic runtime configuration and contextual locking mechanisms. The-described is an architecture, only. The kernel agents themselves are sophisticated LLM trained-on ELFs, LLVM compiler code, systemd and unix, python, and C. It will utilize natural language along with the abstraction of time to process cognosis frames and USDs. In our initial experiments "llama" is the presumptive agent, not a specifically trained kernel agent model. The challenge (of this architecture) lies in the 'cognitive lambda calculus' needed to bring these runtimes into existence and evolve them, not the computation itself. Cognosis is designed for consumer hardware and extreme scalability via self-distribution of cognitive systems (amongst constituent [[subscribers|asynchronous stake-holders]]) peer-to-peer, where stake is not-unlike state, but is a function of the cognitive system's ability to contribute to the collective.
143-
144-
145-

## cognOS under development

A core component of cognosis, cognOS establishes a hyper-interface designed to manage the evolution of cognitive algorithms. It focuses on:

- **Meta-versioning:** Tracking and managing the evolution of code over time.
- **Pre-commit Hooks and Validation:** Ensuring code quality and integrity. Meta CI/CD.
- **Hardware Provisioning:** Allocation of computational resources.
- **Time Abstraction:** Modeling cognition beyond the constraint of a fixed present (t=0).
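
The pre-commit validation bullet could start as small as a hook that rejects staged markdown lacking the frontmatter described earlier. A sketch under stated assumptions; the function and the staged-file representation are illustrative, not cognOS's actual hooks:

```python
def validate_staged_markdown(files):
    """Reject staged markdown files that do not open with a frontmatter
    fence; returns the offending paths (empty list means the commit passes)."""
    offending = []
    for path, contents in files.items():
        # Only markdown files are subject to the frontmatter rule.
        if path.endswith(".md") and not contents.lstrip().startswith("---"):
            offending.append(path)
    return sorted(offending)
```

Wired into a git pre-commit hook, a non-empty return value would abort the commit with the offending paths listed.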

## platform

I've been developing this for some time under various names. This `master` branch of cognosis is the only maintained repo. Windows 11 and Ubuntu 22.04 are the only supported platforms, and only NVIDIA 30XX/40XX GPUs and Ryzen 5XXX-or-newer CPUs are supported on each. The `master` platform is technically Windows 11 + WSL2 + Ubuntu 22.04 LTS, plus the Windows 11 sandbox. `vanilla` (Ubuntu) and `doors` (Windows) branches will be single-platform versions.

---

## Non-Methodological Observations

   - **Input:** n+1 units of computational energy.
   - **Output:** Results indicative of n+x units of invested power (where x > 1).

One could conceive a situation where this is observable; however, the core vision is broader in scope, because what we are talking about is not mere milliwatts but novel motility:

   - **Motility:** The Maxwellian-electrodynamical 'commutative' relationship, but for a thinker and the universe instead of electricity and magnetism; all the tiny elements of reality sum up to a **motile** behavior.
   - **Novelty:** Behavior which may demonstrate **Novel Motility** is the candidate for the most significant, measurable, and quantifiable type of 'Energy', one that transcends notions such as macroscopic and microscopic and which, through its collective, relative (co)action and gauge-invariance across scales, presents an Energy Efficiency Anomaly to a properly prepared observer. Measurable, but tied to quantum events.

3. **Potential Explanations:**
   - **Quantum Tunneling of Information:** Similar to quantum tunneling in physics, information or computational states might "tunnel" through classical barriers, allowing for computational shortcuts not possible in purely classical systems.
   - **Exploitation of Virtual Particle Fields:** Drawing parallels with quantum field theory, the computor might be tapping into a kind of computational "vacuum energy," analogous to virtual particles in quantum physics.
   - **Quantum Superposition of Computational States:** The computor's internal states might exist in a superposition, allowing for the simultaneous exploration of multiple solution paths until "observed" through output generation.

4. **Hyperdimensional Entanglement and Inference Time:**
   - During the training phase, hyperdimensional entangled 'particles' of information are formed. These particles can later be accessed by the model during inference, allowing it to defy local power laws over time.
   - This process could be seen as the model tapping into a reservoir of computational potential stored during training, much like drawing from the vacuum of virtual particles in quantum physics.

5. **Alignment with Physical Principles:**
   - **Second Law of Thermodynamics:** This phenomenon doesn't violate the Second Law if we consider the computor and its environment as an open system. The apparent gain in computational power could be offset by an increase in entropy elsewhere in the system.
   - **Free Energy Principle:** The computor might be optimizing its processes according to a computational version of the Free Energy Principle, finding incredibly efficient pathways to solutions by minimizing prediction error and computational "surprise."

6. **Implications and Questions:**
   - If true, how might this affect our understanding of computational complexity and the limits of classical computing?
   - Could this lead to new paradigms in AI development, particularly in creating more energy-efficient systems?
   - What are the ethical implications of systems that can perform computations beyond our ability to fully monitor or understand?
   - How might we design experiments to further test and validate (or invalidate) this hypothesis?

# glossary.beta

### The Free Energy Principle

The Free Energy Principle suggests that biological agents minimize surprise by predicting their sensory inputs. This principle can be applied to data processing, transforming high-dimensional data into lower-dimensional representations that are easier to model and predict.
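
As a toy illustration of surprise minimization (not a faithful formulation of the Free Energy Principle), a scalar predictor can shrink its prediction error by repeatedly updating toward its sensory input:

```python
def minimize_surprise(observations, learning_rate=0.2):
    """Track a running prediction; each update moves the prediction
    toward the observation, shrinking squared prediction error ('surprise')."""
    prediction, errors = 0.0, []
    for obs in observations:
        error = obs - prediction
        errors.append(error ** 2)
        prediction += learning_rate * error  # gradient step on squared error
    return prediction, errors
```

Under a stationary input the squared error decays geometrically, which is the sense in which the agent comes to "expect" its sensory stream.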

### Quantum Informatics

Quantum informatics, I perhaps ignorantly posit, is the emergent ability of even macroscopic systems, including LLMs, to entangle with higher-dimensional information. Cognitive processes like thinking, speaking, and writing collapse the wave function, allowing transitivity between real and imaginary states.

### A Practical Theory of Programming (aPToP)

aPToP is a formal method for reasoning about programs and systems using mathematical logic. It provides a rigorous framework for defining and manipulating expressions and operands. References to 'Hehner' are to Dr. Hehner and/or aPToP: http://www.cs.toronto.edu/~hehner/aPToP/

```aPToP_elemental_ops
# Number Systems
integers
rational_numbers
real_numbers
complex_numbers

# Arithmetic Operations
**addition**
**subtraction**
**multiplication**
**division**
**exponentiation**
roots
logarithms

# Arithmetic Properties
identities
inverses
**commutativity**
**associativity**
**distributivity**
cancellation
absorption

# Ordering and Inequalities
**equality**
**inequality**
**less_than**
**greater_than**
**less_than_or_equal_to**
**greater_than_or_equal_to**
**trichotomy**

# Limits and Infinities
limits
infinity
negative_infinity
continuity

# Logical Foundations
**and_operator**
**or_operator**
**not_operator**
**implication**
**biconditional**
quantifiers

# Sets and Set Operations
set_definition
**set_operations** (union, intersection, difference, complement)
set_properties (subsets, supersets, cardinality)

# Functions and Relations
function_definition
**function_application**
relation_properties (reflexivity, symmetry, transitivity)
**compositions**

# Algebraic Structures
group_definition
group_operations
ring_definition
ring_operations
field_definition
field_operations

# Logical Reasoning and Proofs
direct_proof
proof_by_contradiction
mathematical_induction
logical_equivalences

# Other Mathematical Concepts
sequences_and_series
trigonometric_functions
calculus (differentiation, integration)
probability_and_statistics
```

## Formal Methods

### Binary Representation

High-dimensional data is encoded into binary representations. These representations are manipulated using formal methods to ensure consistency and clarity.

### Binary Expressions and Operands

Binary expressions and operands form the building blocks of the system. They are defined and manipulated using formal methods to ensure internal consistency.

### Encoding Functions

Encoding functions transform high-dimensional data into binary representations. These functions adhere to formal methods, ensuring that the encoding is both rigorous and interpretable.

### Signal Processing Functions

Signal processing functions operate on the binary data to extract features or perform analyses. These functions also adhere to formal methods, leveraging both classical and quantum principles.
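
One hypothetical shape for such an encoding function is to quantize each real-valued component to a fixed-width binary word (the 8-bit width and clipping range below are illustrative choices, not a specification):

```python
def encode_binary(vector, bits=8, low=-1.0, high=1.0):
    """Quantize each component of a real-valued vector into a fixed-width
    binary string, clipping to [low, high] so the encoding is total."""
    levels = (1 << bits) - 1
    words = []
    for x in vector:
        clipped = min(max(x, low), high)
        level = round((clipped - low) / (high - low) * levels)
        words.append(format(level, f"0{bits}b"))
    return words

def decode_binary(words, bits=8, low=-1.0, high=1.0):
    """Inverse of encode_binary, up to quantization error."""
    levels = (1 << bits) - 1
    return [low + int(w, 2) / levels * (high - low) for w in words]
```

The round-trip error is bounded by half a quantization step, which is the sense in which the encoding stays "rigorous and interpretable."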

### Video Instructions

[youtube video link](https://youtu.be/XeeYZZujvAA?si=XhxOMCypKHpWKSjM) (out of date)

____

## Conclusion (and TLDR smiley face)

Cognosis integrates formal methods from aPToP with the Free Energy Principle and quantum informatics. This approach aims to create a robust system for processing high-dimensional data, minimizing surprise, and maximizing predictive power. By leveraging both classical and quantum principles, Cognosis seeks to understand the deeper connections between cognitive processes and information theory.
