Replies: 2 comments
-
I noticed the same thing using the file-based storage. The documents used as "ground facts" are the ones "closest" to your query, where "closest" is defined by cosine similarity. I found that this kind of similarity often retrieves documents that have nothing to do with the subject at hand: for example, a query like "How do I do X?" will retrieve documents where X (or anything related to X) does not even occur. I am still investigating why this happens. Of course, it never happens with keyword search, since documents containing X would be retrieved by definition. This is all very strange; the investigation continues. You can find out which facts are being selected by placing a breakpoint at the call to SearchClient.GenerateAnswerAsync.
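To make the mismatch concrete, here is a minimal Python sketch. The 3-dimensional "embeddings" are invented purely for illustration (real embeddings come from a model and have hundreds of dimensions), but they show how cosine similarity can rank a document that never mentions the query term above one that does, while keyword matching cannot:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_hits(query, text):
    """Count how many query terms literally occur in the document text."""
    return len(set(query.lower().split()) & set(text.lower().split()))

# Toy corpus; the vectors below are hypothetical, hand-made to show the failure mode.
docs = {
    "doc_pricing": "our pricing tiers and billing cycle explained",
    "doc_export":  "how to export a report to pdf",
}
embeddings = {
    "query":       [0.90, 0.10, 0.20],
    "doc_pricing": [0.85, 0.15, 0.25],  # near the query in vector space
    "doc_export":  [0.10, 0.90, 0.30],  # far away, despite sharing the keyword
}

query = "How do I export a report?"

best_by_vector = max(docs, key=lambda d: cosine(embeddings["query"], embeddings[d]))
best_by_keyword = max(docs, key=lambda d: keyword_hits(query, docs[d]))

# Vector search can pick the document with no mention of "export";
# keyword search is guaranteed to pick one that contains the term.
print(best_by_vector, best_by_keyword)
```

In other words, vector retrieval optimizes for proximity in embedding space, which only approximates topical relevance, whereas keyword search enforces literal term presence.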
-
Actually, in my case the facts are correct, but OpenAI ignores them completely.
-
When I try to search for information, I get a response that is completely unrelated to the provided facts, or a response saying the information couldn't be found. When I use Azure OpenAI on Your Data, I get perfect answers, so I wonder if there is a way to tune this to get similar results. I'm using GPT-3.5 for text generation and Azure Cognitive Search. One thing I noticed, though: with Azure OpenAI on Your Data I use keyword search, not vector search.