jimenezfer/chatwithdata-llamaindex

Core Purpose:

A Retrieval-Augmented Generation (RAG) system that answers questions about your documents using local AI models, with its database persisted across runs.

This application can ingest data from multiple sources (a rough ingestion sketch follows the list):

  • a text file (facts.txt)
  • a DuckDB database (ducks.duckdb)
  • a SQLite database (places.sqlite)
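
The project name suggests LlamaIndex handles this wiring; the exact code lives in the app, but a minimal sketch of how these sources could be ingested looks like the following. The persist directory, loader choices, and sample query are assumptions for illustration, not necessarily what the app does.

# Sketch only: assumes LlamaIndex with local models already configured
# (e.g. via llama_index.core.Settings); ./storage and the query text
# are illustrative.
from sqlalchemy import create_engine
from llama_index.core import (
    SimpleDirectoryReader,
    SQLDatabase,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)
from llama_index.core.query_engine import NLSQLTableQueryEngine

# facts.txt: embed the text and persist the vector index to ./storage so it
# can be reloaded instead of rebuilt on the next run.
try:
    index = load_index_from_storage(StorageContext.from_defaults(persist_dir="./storage"))
except FileNotFoundError:
    docs = SimpleDirectoryReader(input_files=["facts.txt"]).load_data()
    index = VectorStoreIndex.from_documents(docs)
    index.storage_context.persist(persist_dir="./storage")
facts_engine = index.as_query_engine()

# ducks.duckdb / places.sqlite: answer questions by generating SQL against
# the databases rather than embedding their rows.
duck_engine = NLSQLTableQueryEngine(
    sql_database=SQLDatabase(create_engine("duckdb:///ducks.duckdb"))  # requires duckdb-engine
)
places_engine = NLSQLTableQueryEngine(
    sql_database=SQLDatabase(create_engine("sqlite:///places.sqlite"))
)

print(facts_engine.query("What do the documents say?"))

In this sketch the text file is served from a persisted vector index, while the two databases are queried through text-to-SQL engines, which mirrors the split between unstructured and structured sources in the list above.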

Running with Docker

Build the Docker Image

docker build -t rag-app .

Run the Application

This will start a web server on port 8501.

docker run -p 8501:8501 rag-app
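
Anything the app writes inside the container, including a persisted index, is lost when the container is removed. To keep that data across runs, you can mount a host directory over the path the app persists to; /app/storage below is an assumed path, so adjust it to the application's actual persistence location.

docker run -p 8501:8501 -v "$(pwd)/storage:/app/storage" rag-app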

Open your browser and navigate to http://localhost:8501. Use the sidebar to select the data source you want to chat with.
