
LocalDocs

3Simplex edited this page Aug 12, 2024 · 5 revisions

Make sure every file type you need the LLM to read is listed here.
Screenshot 2024-07-28 075448
The list is comma-separated; add each extension in that format, with no comma at the end. (no,comma,at,the,end)

Make sure you use CUDA here if you can (the default is CPU, which is slower 😦).
Screenshot 2024-07-28 075458

Advanced: I like having many snippets, so I set the snippet size lower. (Be careful here.)

  • With more characters per snippet, you give the LLM more relevant information with each snippet. (This line has 134 characters.)
  • With more snippets, you are capable of retrieving more areas with relevant information. (This line is part of one snippet.)
  • These three bullet points contain 398 characters total and would be split into two snippets. (With the settings shown below.)
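The arithmetic above can be sketched with a naive character-based splitter. This is illustrative only: GPT4All's actual chunking logic is not documented here and may split on word or sentence boundaries, and the function name and 250-character size are assumptions for the example.

```python
def split_into_snippets(text, snippet_size):
    # Naive fixed-width splitter: cut the text every snippet_size
    # characters. Illustrative only; not GPT4All's exact algorithm.
    return [text[i:i + snippet_size] for i in range(0, len(text), snippet_size)]

# 398 characters with an assumed 250-character snippet size
doc = "x" * 398
snippets = split_into_snippets(doc, 250)
# two snippets: one of 250 characters, one of 148
```

This shows why the three bullets' 398 characters land in two snippets: the first snippet fills to the size limit and the remainder spills into a second one.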

Screenshot 2024-07-28 075505

This is what happens when your model is not configured to handle your LocalDocs settings.
Screenshot 2024-07-28 080442

You filled the entire context window the LLM had, so it lost information (it was forced to drop old context), and that's why it's an "Advanced" setting.

Using a stronger model with a high context is the best way to use LocalDocs to its full potential.

  • Llama 3.1 8b 128k supports up to 128k context. You can set it as high as your system's memory will hold.
    • Setting the model context too high may cause a crash. Set it lower and try again.
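A rough budget check makes the overflow failure above concrete. This sketch assumes the common rule of thumb of roughly 4 characters per token for English text; the function name, the ratio, and the reserved-token figure are all assumptions, not GPT4All internals.

```python
def fits_in_context(n_snippets, snippet_chars, context_tokens,
                    chars_per_token=4, reserve_tokens=512):
    # Estimate tokens consumed by the retrieved snippets and check
    # they fit in the context window, leaving reserve_tokens of
    # headroom for the prompt and the model's reply.
    snippet_tokens = (n_snippets * snippet_chars) // chars_per_token
    return snippet_tokens <= context_tokens - reserve_tokens

# 10 snippets of 512 characters is roughly 1280 tokens:
# comfortable in a 4096-token window.
fits_in_context(10, 512, 4096)
# 30 snippets of 1024 characters is roughly 7680 tokens:
# that overflows the same window, and old context gets dropped.
fits_in_context(30, 1024, 4096)
```

Cranking up snippet count and size without raising the model's context window is exactly the misconfiguration shown in the screenshot above.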

More Advanced: for those who are curious about what is in your DB.
Portable sqlitebrowser is a good tool for any OS to help you see what is going on inside the database.
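If you prefer a script to a GUI, Python's standard `sqlite3` module can poke at the same file. The database path and table names vary by GPT4All version, so this sketch just lists whatever tables exist rather than assuming a schema; `list_tables` is a hypothetical helper name.

```python
import sqlite3

def list_tables(db_path):
    # Open the SQLite file read-style and list its tables.
    # Works on any SQLite database, including the LocalDocs index
    # (whose path and schema are version-dependent assumptions here).
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
        return [name for (name,) in rows]
    finally:
        con.close()
```

Once you know the table names, a `SELECT * FROM <table> LIMIT 5` shows the same rows sqlitebrowser's "Browse Data" tab does.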

Screenshot 2024-07-28 075920
You can see all the good stuff you embedded.

Screenshot 2024-07-28 080035
As you can see above, I am in "Browse Data", which shows the snippets.
