
Streamlit cache leading to empty index #48

@tinamil

Description


I deployed with Docker and uploaded a local file. When I tried to chat, I got a blank response and the following error:
local-rag | 2024-05-02 12:28:54,674 - ollama - ERROR - Ollama chat stream error: 'HuggingFaceEmbedding' object has no attribute '_model'

However, it works normally when I upload a website instead.

I traced the problem to line 114 of utils/llama_index.py: the @st.cache_data(show_spinner=False) decorator on the create_index(_documents) function. One workaround is to comment out that line, after which local files work again. I believe create_index is being called while the documents are being uploaded, but before they have been saved to disk and read into memory, so the index comes back empty and Streamlit then caches that empty result instead of regenerating the document index when the query comes through.
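The failure mode described above can be sketched without Streamlit at all. In this minimal reproduction, functools.lru_cache stands in for @st.cache_data, and the function names and the documents_on_disk list are hypothetical simplifications for illustration, not the project's actual code:

```python
from functools import lru_cache

# Simulates the upload directory on disk (hypothetical stand-in).
documents_on_disk = []

# lru_cache plays the role of @st.cache_data here: the first result
# is memoized and returned for every later call.
@lru_cache(maxsize=None)
def create_index_cached():
    # Snapshot of the documents at call time, cached forever.
    return tuple(documents_on_disk)

def create_index_uncached():
    # Rebuilds from whatever is on disk right now.
    return tuple(documents_on_disk)

# The index is built BEFORE the uploaded file is written to disk...
first = create_index_cached()            # empty: ()

documents_on_disk.append("report.pdf")   # the upload finishes saving

# ...so later queries keep hitting the stale, empty cached result,
# while an uncached rebuild would see the file.
stale = create_index_cached()
fresh = create_index_uncached()
```

Here stale is still the empty tuple while fresh contains "report.pdf", which matches the observed behavior: the cached empty index is served on every chat query even though the uploaded file is on disk by then.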

Labels

bug (Something isn't working)
