LocalDocs
Make sure every file extension you need the LLM to read is listed here.

The list is comma-separated; add each extension in that format. (no,comma,at,the,end)
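As an illustration, the field might look like this (these extensions are just examples, not a required list — include whichever types your documents actually use):

```
txt,md,rst,pdf,docx
```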
Make sure you use CUDA here if you can (the default is CPU, which is slower 😦).

Advanced: I like having many snippets, so I set each one to a smaller size. (Be careful here.)

This is what happens when your model is not configured to handle your LocalDocs settings.

You filled all the context window the LLM had, so it lost stuff... (it was forced to drop old context)... and that's why it's an "Advanced" setting.
Using a stronger model with a high context is the best way to use LocalDocs to its full potential.
- Llama 3.1 8b 128k supports up to 128k context. You can set it as high as your system's memory will hold.
- Setting the model context too high may crash the app. Set it lower and try again.
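The overflow above is just arithmetic: snippets plus your prompt must fit inside the model's context window. A minimal sketch (not GPT4All's actual token accounting — the function name, the token counts, and the reply reserve are all assumptions for illustration):

```python
# Rough budget check: do the retrieved snippets plus the prompt
# leave room for the model's reply inside the context window?
def fits_in_context(context_tokens, n_snippets, snippet_tokens,
                    prompt_tokens, reply_reserve=512):
    """Return True if snippets + prompt + a reserved reply fit the window."""
    used = n_snippets * snippet_tokens + prompt_tokens
    return used + reply_reserve <= context_tokens

# Example: a 4k-context model with 10 snippets of 512 tokens overflows,
# while the same 10 snippets at 128 tokens each fit comfortably.
print(fits_in_context(4096, 10, 512, 200))  # False: 5320 + reserve > 4096
print(fits_in_context(4096, 10, 128, 200))  # True: 1480 + reserve <= 4096
```

This is why "many snippets" and "large snippet size" together are the dangerous combination: the product is what eats the window.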
More Advanced: for those who are curious about what is in your DB.
Portable sqlitebrowser is a good tool on any OS for seeing what is going on inside the db.

You can see all the good stuff you embedded.

As you can see above, I am in "Browse Data", which shows the snippets.