jeremy-costello/factored-cognition

factored-cognition

Factored Cognition with LLMs.

This is a reimagining of Ought's Factored Cognition Primer.

Requirements

The only core requirement (besides Python) is vLLM.
Paper extraction additionally requires pdfminer.six for reading PDFs.
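As a rough illustration of where pdfminer.six fits in (the helper names below are hypothetical, not the repo's actual API):

```python
def read_pdf(path):
    # pdfminer.six is imported lazily so the rest of the module
    # works without it (hypothetical helper, not the repo's API).
    from pdfminer.high_level import extract_text
    return extract_text(path)

def first_nonempty_line(text):
    # Crude title heuristic: the first non-blank line of the extracted text.
    for line in text.splitlines():
        if line.strip():
            return line.strip()
    return ""
```

Typical usage would be something like `title = first_nonempty_line(read_pdf("paper.pdf"))`.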

Models

Supports any model in vLLM, including quantized models.

Adding a new model class in models.py:

  • The new model should be a subclass of the Model class.
    • Include vocab size, context length, and prompt templates.
  • Subclass this new model class for specific instantiations of the model (e.g. sizes or quantizations).

See LLaMa2 and LLaMa2_7B_Chat_AWQ in models.py.
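A minimal sketch of that layout (attribute and method names here are illustrative assumptions, not necessarily the repo's actual interface; the vocab size, context length, and [INST] template match Llama 2 chat):

```python
class Model:
    # Base class: each subclass declares its vocab size, context length,
    # and prompt template (names here are illustrative assumptions).
    vocab_size: int
    context_length: int
    prompt_template: str

    def format_prompt(self, question: str) -> str:
        return self.prompt_template.format(question=question)


class LLaMa2(Model):
    # Model-family subclass holding family-wide constants.
    vocab_size = 32000
    context_length = 4096
    prompt_template = "[INST] {question} [/INST]"


class LLaMa2_7B_Chat_AWQ(LLaMa2):
    # Specific instantiation: a size and quantization of the family.
    # Hypothetical model id for the AWQ-quantized 7B chat variant.
    model_name = "TheBloke/Llama-2-7B-Chat-AWQ"
```

Subclassing the family class means a new quantization or size only has to override what differs.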

What I've Implemented

  • Question answering (with and without context)
  • Debate (including a judge)
  • Extracting the title, authors, abstract, and sections from PDFs
  • Answering questions based on a PDF
  • Recursive amplification
  • Verifiers
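Of the items above, recursive amplification is the least self-explanatory. A toy sketch of the idea, where the `ask` callable stands in for a vLLM generation call and the prompt wording and decomposition format are assumptions:

```python
def amplify(question, ask, depth=1):
    # Recursive amplification: decompose the question into subquestions,
    # answer each recursively, then answer the root question with the
    # sub-answers supplied as context. `ask` is any str -> str model call.
    if depth == 0:
        return ask(f"Answer concisely: {question}")
    subquestions = [
        line.strip()
        for line in ask(f"List subquestions, one per line: {question}").splitlines()
        if line.strip()
    ]
    context = "\n".join(
        f"Q: {sq}\nA: {amplify(sq, ask, depth - 1)}" for sq in subquestions
    )
    return ask(f"Context:\n{context}\n\nUsing the context, answer: {question}")
```

The recursion depth bounds how far the question tree is expanded before leaf questions are answered directly.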

To Do

  • Verification chain
  • Tool use
  • Deduction
  • Action selection

Exercises

Implemented exercises from the Primer:

  • Long texts: 3
  • Amplification: 1, 2, 3
  • Verifiers: 1, 2
