How can I tell if a search found something or not? #631
Replies: 8 comments
-
hi @DrEight, if you are using the Search API, it will return an instance of 'SearchResult', and you can check whether its list of results is empty.
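A minimal sketch of that check using the .NET web client, assuming the 'SearchAsync' method and the 'SearchResult.Results' list of the Kernel Memory client; the endpoint URL and index name are placeholders:

```csharp
using Microsoft.KernelMemory;

// Sketch: check whether a Kernel Memory search returned anything.
// The endpoint URL and index name below are placeholders.
var memory = new MemoryWebClient("http://127.0.0.1:9001/");

SearchResult result = await memory.SearchAsync("best itinerary to visit Rome", index: "docs");

if (result.Results.Count == 0)
{
    Console.WriteLine("Nothing found in this index.");
}
else
{
    Console.WriteLine($"Found {result.Results.Count} matching sources.");
}
```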
-
@dluc thank you for the answer. Let me explain my use case better, because I'm confused by the result I get and by your answer.

The answer is: "Based on the information provided, I don't have enough information to provide the best itinerary to visit Rome. The facts provided are related to UPS shipping services and do not pertain to travel or tourism."

As relevant sources I get 3 of the 4 documents I uploaded, which obviously make no sense in the context of the answer. My expectation is to have a way to tell that no information was found and, in that case, for the relevant source list to be empty rather than filled with random documents.
-
I see, yes I've seen that behavior from GPT 3.5. Could you give it a try with GPT-4? You are right, the service should reply with "INFO NOT FOUND" (an exact string that you can check in your code). However, the service will still include the sources it considered; in other words, the service is saying "I looked at these sources, which seemed relevant, but I couldn't find an answer." In some instances GPT 3.5 doesn't follow the prompt instructions and generates equivalent text instead. One quick solution is using GPT-4; another is tweaking the prompt. I'll see if I can find a more reliable prompt.
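For illustration, a hedged sketch of how that exact-string check could look with the Ask API, assuming 'MemoryAnswer.Result' holds the generated text; the endpoint and question are placeholders, and recent versions may also expose a 'NoResult' flag you could check instead:

```csharp
using Microsoft.KernelMemory;

// Sketch: detect the "no answer" case from the Ask API by checking for the
// exact "INFO NOT FOUND" marker the default prompt asks the model to emit.
var memory = new MemoryWebClient("http://127.0.0.1:9001/");

MemoryAnswer answer = await memory.AskAsync("What's the best itinerary to visit Rome?");

if (answer.Result.Contains("INFO NOT FOUND", StringComparison.OrdinalIgnoreCase))
{
    // No grounded answer: fall back to another data source, another index, etc.
    Console.WriteLine("The model could not answer from the indexed documents.");
}
else
{
    Console.WriteLine(answer.Result);
}
```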
-
I tried with GPT-4, but instead of INFO NOT FOUND it suggests searching for a travel agency...

_Answer: I'm sorry, but the information provided does not include any details about planning trips, tourist attractions, accommodations, or travel itineraries in Rome or any other location. The documents seem to be related to UPS shipping guidelines, packaging instructions, and shipping API details, which are not relevant to trip planning. If you're looking to plan a trip to Rome, you might want to consider researching travel guides, contacting a travel agency, or using online travel planning tools that can help you with booking flights, hotels, and creating an itinerary that includes visiting popular tourist sites such as the Colosseum, Vatican City, and the Pantheon. Unfortunately, I cannot assist you with trip planning based on the UPS-related information provided._

Can you please tell me where exactly the prompt to tweak is? I found the file 'answer-with-facts.txt'. Is this the place?
-
wow...
yes that's the one. I think we could try using function calling for a more precise response; I need to think about it. E.g. ask the model to call a specific function when no answer is available. The main challenge is how to fall back to a standard behavior with Llama and Gemini, i.e. when using other models...
-
I'm very sorry, my fault. What misled me is that the 'RelevantSources' array contains 3 references even when the model returns INFO NOT FOUND. (I tried removing the custom prompt provider.)
-
I see, that makes sense. Calling them "relevant" is misleading if they are not. The challenge is that the decision about whether sources are relevant is made internally by the LLM while generating the answer. If you want the list to match exactly, we would need a different strategy, e.g. asking the model to report which chunks/sources it actually used.
-
@dluc @DrEight: I am filtering on the score and only taking a reference if it scores above 0.75.
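A rough sketch of that score filter, assuming the 'MemoryAnswer.RelevantSources' citations expose 'Partitions' with a 'Relevance' score in the 0..1 range; the Search API may also accept a minRelevance parameter that applies a similar cut server-side:

```csharp
using System.Linq;
using Microsoft.KernelMemory;

// Sketch: keep a citation only if at least one of its partitions scores
// above a relevance threshold. Property names assume the .NET client's
// Citation/Partition types; the 0.75 cut-off is the value mentioned above.
const float MinRelevance = 0.75f;

var memory = new MemoryWebClient("http://127.0.0.1:9001/");
MemoryAnswer answer = await memory.AskAsync("What's the best itinerary to visit Rome?");

var strongSources = answer.RelevantSources
    .Where(c => c.Partitions.Any(p => p.Relevance > MinRelevance))
    .ToList();

Console.WriteLine($"{strongSources.Count} of {answer.RelevantSources.Count} sources pass the threshold.");
```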
-
I would like to understand whether a search found any data in the vector DB.
If the search doesn't return anything, I would like to route the search to a different data source, or maybe to another index.
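A possible sketch of that routing, reusing the empty-results check shown earlier; the index names are made up, and the fallback branch could just as well point at a different service or database:

```csharp
using Microsoft.KernelMemory;

// Sketch: route the query to a second index when the first one returns
// nothing. "travel-docs" and "general-docs" are placeholder index names.
var memory = new MemoryWebClient("http://127.0.0.1:9001/");
const string question = "best itinerary to visit Rome";

SearchResult primary = await memory.SearchAsync(question, index: "travel-docs");

if (primary.Results.Count == 0)
{
    SearchResult fallback = await memory.SearchAsync(question, index: "general-docs");
    Console.WriteLine($"Primary index empty, fallback returned {fallback.Results.Count} results.");
}
else
{
    Console.WriteLine($"Primary index returned {primary.Results.Count} results.");
}
```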