Commit 448f107

add: custom GPT
1 parent 97f9f26 commit 448f107

File tree

4 files changed

+112
-0
lines changed


Custom GPT/README.md

Lines changed: 66 additions & 0 deletions
@@ -0,0 +1,66 @@
# Conversational Retrieval with LangChain and OpenAI

This directory contains a Python script that implements a conversational retrieval system using LangChain and OpenAI's API. The script lets users query a collection of documents and receive responses based on the retrieved information.

## Features

- Load documents from a specified directory.
- Create and persist a vector store index for efficient querying.
- Engage in conversational interactions, maintaining chat history.
- Easily exit the program.

## Requirements

- Python 3.7+
- Required packages:
  - `openai`
  - `langchain`
  - `chromadb`

You can install the required packages using pip:

```bash
pip install openai langchain chromadb
```

## Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/king04aman/custom-gpt.git
   cd custom-gpt
   ```

2. Set the OpenAI API key:

   Replace `your_api_key_here` in the script with your actual OpenAI API key, or set the environment variable directly in your terminal:

   ```bash
   export OPENAI_API_KEY="your_api_key_here"
   ```

3. Prepare your data:

   Place your documents in a folder named `data/`. The script will load all documents from this directory.
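As a quick sanity check of step 2, the snippet below reads the key from the environment and fails loudly if it is missing or still the placeholder. This is an illustrative sketch; `require_api_key` is a hypothetical helper, not part of `main.py`:

```python
import os

def require_api_key() -> str:
    """Return the OpenAI API key from the environment, or raise a clear error.

    NOTE: hypothetical helper for illustration; main.py does not define it.
    """
    key = os.environ.get("OPENAI_API_KEY")
    if not key or key == "your_api_key_here":
        raise RuntimeError("Set OPENAI_API_KEY before running main.py")
    return key
```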
## Usage

Run the script from the command line:

```bash
python main.py
```

### Command Line Arguments

You can provide an initial query as a command line argument:

```bash
python main.py "Your initial query here"
```

### Interactive Mode
If no initial query is provided, the script prompts you to enter queries interactively. Type your question and press Enter to get a response. Type `quit`, `q`, or `exit` to exit the program.
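The exit check can be sketched as a one-line predicate (a hypothetical helper shown for illustration; `main.py` performs the same membership test inline, without the `strip()`):

```python
def is_exit_command(user_input: str) -> bool:
    # Case-insensitive match against the exit keywords; strip() tolerates stray spaces.
    return user_input.strip().lower() in {"quit", "q", "exit"}
```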
### Persistence

- Set the `PERSIST` variable to `True` in the script to enable saving the vector store index to disk for reuse in future sessions.
- The index will be saved in a directory named `persist/`.

## Example

```bash
Prompt (type 'quit' to exit): What is the significance of data persistence?
Response: [Your response here based on the documents]
```
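Under the hood, each turn passes the accumulated `(question, answer)` pairs back to the chain as chat history. A minimal sketch of that loop, with the retrieval chain stubbed out (`fake_chain` and `run_turn` are stand-ins for illustration, not the real `ConversationalRetrievalChain`):

```python
from typing import Callable, Dict, List, Tuple

def run_turn(
    chain: Callable[[Dict], Dict],
    query: str,
    chat_history: List[Tuple[str, str]],
) -> str:
    """Ask one question, record the (question, answer) pair, return the answer."""
    result = chain({"question": query, "chat_history": chat_history})
    chat_history.append((query, result["answer"]))
    return result["answer"]

def fake_chain(inputs: Dict) -> Dict:
    # Stand-in for the real chain: echoes the question and the history length.
    return {"answer": f"echo:{inputs['question']} (turns so far: {len(inputs['chat_history'])})"}
```

Each call sees the history built up by earlier turns, which is what lets follow-up questions refer back to previous answers.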
## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Contributing

Feel free to submit issues or pull requests. Contributions are welcome!

Custom GPT/data/data.txt

Whitespace-only changes.

Custom GPT/example-env

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
OPENAI_API_KEY="your_api_key"

Custom GPT/main.py

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
import os
import sys

from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import DirectoryLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.indexes import VectorstoreIndexCreator
from langchain.indexes.vectorstore import VectorStoreIndexWrapper
from langchain.vectorstores import Chroma

# Prefer an OPENAI_API_KEY already exported in the environment;
# the placeholder is only a fallback to be replaced before running.
os.environ.setdefault("OPENAI_API_KEY", "your_api_key_here")

# Set to True to save the vector store index to disk ("persist/") for reuse.
PERSIST = False

# Optional initial query from the command line.
query = sys.argv[1] if len(sys.argv) > 1 else None

if PERSIST and os.path.exists("persist"):
    # A previous run already persisted the index; load it instead of rebuilding.
    print("Reusing index...\n")
    vectorstore = Chroma(persist_directory="persist", embedding_function=OpenAIEmbeddings())
    index = VectorStoreIndexWrapper(vectorstore=vectorstore)
else:
    # Build the index from every document under data/.
    loader = DirectoryLoader("data/")
    if PERSIST:
        index = VectorstoreIndexCreator(vectorstore_kwargs={"persist_directory": "persist"}).from_loaders([loader])
    else:
        index = VectorstoreIndexCreator().from_loaders([loader])

chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(model="gpt-3.5-turbo"),
    retriever=index.vectorstore.as_retriever(search_kwargs={"k": 1}),
)

chat_history = []

while True:
    if not query:
        query = input("Prompt (type 'quit' to exit): ")
    if query.lower() in ['quit', 'q', 'exit']:
        print("Exiting the program...")
        sys.exit()

    result = chain({"question": query, "chat_history": chat_history})
    print("Response:", result['answer'])

    # Carry the (question, answer) pair forward so follow-ups have context.
    chat_history.append((query, result['answer']))
    query = None
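The reuse-or-rebuild branch at the top of the script reduces to a simple decision, sketched here with a hypothetical helper (`should_reuse_index` is for illustration and is not defined in `main.py`):

```python
import os

def should_reuse_index(persist: bool, persist_dir: str = "persist") -> bool:
    # Reuse a saved index only when persistence is enabled AND a previous
    # run actually wrote the persist directory to disk.
    return persist and os.path.exists(persist_dir)
```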
