Hello, I have a question (one that I am not sure how even to ask). I have a database with roughly 2-3K unpublished science papers (unpublished for various reasons).
I want to ask questions related to that database, but not restricted to it (so I still want to retain the general knowledge of the GPT model). I already have an API key and have transformed the database into a vector DB using Python/LangChain.
From my point of view, I have two ways of doing it:

1. Create a custom GPT model (like the one they create when they fine-tune) by adding the vector DB to GPT (not sure if that's possible), and use that for prompts.
2. Use an existing LLM (GPT-3.5) with the openai library in Python, always sending the relevant content from the vector DB to GPT via the API and asking the question based on it.

Both options are limited by the knowledge I have, so I am not sure whether they are even possible. If they are, I am not sure which one would be better; if they are not, can you suggest how to tackle the problem? I just want to use GPT's power to ask questions about my DB.
For now I have managed to make GPT-3.5 (using RetrievalQA) answer questions about the DB, but it answers based only on that (it can't tell me what the capital of France is).
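For reference, my current setup looks roughly like the sketch below (simplified; the vector store, paths, and query strings are illustrative, not my exact code):

```python
# Minimal RetrievalQA sketch, assuming the papers were already embedded into a
# persisted Chroma store with OpenAIEmbeddings (store name/path are illustrative).
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.chains import RetrievalQA

embeddings = OpenAIEmbeddings()
vectordb = Chroma(persist_directory="papers_db", embedding_function=embeddings)

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",  # stuff the retrieved chunks into the prompt
    retriever=vectordb.as_retriever(search_kwargs={"k": 4}),
)

# Questions about the papers work fine:
print(qa.run("What methods do the papers use for sample preparation?"))

# But a general-knowledge question gets an "I don't know"-style refusal:
print(qa.run("What is the capital of France?"))
```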