When Toggle Artifacts UI is ON then RAG doesn't seem to work #5524
-
What happened?

I believe I've identified an issue with the artifacts UI toggle. When the toggle is enabled, RAG doesn't work: the LLM responds that the content wasn't provided. I've tested this with multiple language models running on Ollama. I've attached screenshots for reference.

Steps to Reproduce
NOTE: Do it in this order to ensure context and content are not cached in LibreChat or the Ollama KV cache.

What browsers are you seeing the problem on?

Safari

Relevant log output

chat-meilisearch | 2025-01-28T08:44:18.675708Z WARN HTTP request{method=GET host="meilisearch:7700" route=/indexes/convos/documents/cdceea98-e38c-4613-8285-2857e8216337 query_parameters= user_agent=node status_code=404 error=Document `cdceea98-e38c-4613-8285-2857e8216337` not found.}: tracing_actix_web::middleware: Error encountered while processing the incoming HTTP request: ResponseError { code: 404, message: "Document `cdceea98-e38c-4613-8285-2857e8216337` not found.", error_code: "document_not_found", error_type: "invalid_request", error_link: "https://docs.meilisearch.com/errors#document_not_found" }

Screenshots attached.
Replies: 3 comments 2 replies
-
It's working, but especially with low-parameter open-source models, performance seems to degrade with prompt complexity and longer system messages. With file context from the RAG API plus artifacts, the system message can grow to 5,000 tokens or more.

OpenAI

Ollama (deepseek-r1:32b)

The solution

Use "Custom Prompt Mode" with a very simplified version of the artifacts prompt, or use the two features separately. I will make this more plug-and-play very soon via agents.
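A rough back-of-the-envelope sketch of the token growth described above (this is not LibreChat's actual code; the prompt sizes and the ~4 characters/token heuristic are illustrative assumptions):

```python
# Illustrative only: why artifacts prompt + RAG file context inflates the
# system message for small models. The ~4 chars/token ratio is a common
# rough heuristic, not a real tokenizer.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return len(text) // 4

# Hypothetical sizes, loosely matching the figures mentioned above.
artifacts_prompt = "x" * 8000   # a detailed artifacts prompt, ~2000 tokens
rag_file_context = "y" * 12000  # injected file chunks, ~3000 tokens

system_message = artifacts_prompt + "\n\n" + rag_file_context
print(estimate_tokens(system_message))  # ~5000 tokens before any user turn
```

A 5,000-token system message leaves little attention budget for the retrieved content on low-parameter models, which is consistent with the degraded RAG behavior reported here.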
-
Thanks @danny-avila, are you referring to customizing the prompt in Model Specs, or something else? In my tests, even with very small text files, RAG did not work when the artifacts UI is on.
-
Hi @danny-avila, I have now enabled "Custom Prompt Mode" in the settings menu. I think this is what you alluded to, but RAG still does not work; it only works if the artifacts UI is disabled. Would you like my librechat.yml config to take a look?
It's an LLM issue. I double-checked, and the context is provided from RAG to Ollama just as it would be for any other provider.