Description
I experimented with your knowledge graph generator and it works fine!
But I did not use an OpenAI LLM — I used a Groq LLM (which is very fast!).
Groq's LLMs are largely compatible with OpenAI's client libraries, so the change in the code was easy.
GROQ MODELS
See:
https://console.groq.com/docs/overview
https://console.groq.com/docs/quickstart
https://console.groq.com/docs/models
I used the Groq model llama-3.3-70b-versatile.
That works fine (and incredibly fast!).
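For reference, here is a minimal sketch of what the switch looks like, using only the Python standard library. The endpoint URL and model name are from the Groq docs linked above; the `build_request` helper and the prompt wording are my own illustration, not code from this repo. (If you use the OpenAI Python SDK instead, pointing its `base_url` at Groq's OpenAI-compatible endpoint achieves the same thing.)

```python
# Sketch: calling Groq's OpenAI-compatible chat endpoint with the
# standard library only. GROQ_API_KEY is assumed to be set in the env.
import json
import os
import urllib.request

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(text: str) -> urllib.request.Request:
    """Build a chat-completion request asking for knowledge-graph triples.

    The prompt here is a placeholder; the real generator would use its
    own extraction prompt.
    """
    payload = {
        "model": "llama-3.3-70b-versatile",
        "messages": [{
            "role": "user",
            "content": "Extract knowledge-graph triples from:\n" + text,
        }],
    }
    return urllib.request.Request(
        GROQ_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": "Bearer " + os.environ.get("GROQ_API_KEY", ""),
            "Content-Type": "application/json",
        },
    )

# To actually send it:
# with urllib.request.urlopen(build_request("Albert Einstein was a physicist.")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```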
But it did not work as well with longer texts.
What can we do when the input text is too long for the context window of the LLM used?
Perhaps this could be a solution to that problem:
Can we split the text into chunks that fit into the Groq LLM's context window, and then combine the resulting sub-graphs into the main knowledge graph? But how would that work exactly, and would the merged graph be any good?
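The idea above can be sketched roughly as follows. This is only an illustration of the chunk-and-merge approach, not code from this repo: `extract_triples` is a hypothetical stand-in for the real LLM extraction call, the chunk sizes are arbitrary, and merging is done as a plain set union of (subject, predicate, object) triples, which assumes entity names are normalized consistently across chunks.

```python
# Sketch: split long input into overlapping chunks, extract a sub-graph
# (a set of triples) per chunk, then union the triples into one graph.
from typing import Callable, List, Set, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)

def chunk_text(text: str, max_chars: int = 4000, overlap: int = 200) -> List[str]:
    """Split text into overlapping chunks so that entities mentioned
    near a boundary still appear with context in at least one chunk."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        start += max_chars - overlap
    return chunks

def build_graph(text: str,
                extract_triples: Callable[[str], Set[Triple]]) -> Set[Triple]:
    """Run extraction per chunk and merge by set union.

    `extract_triples` is a placeholder for the real LLM call that turns
    one chunk of text into a set of triples.
    """
    graph: Set[Triple] = set()
    for chunk in chunk_text(text):
        graph |= extract_triples(chunk)  # duplicates collapse automatically
    return graph
```

Whether the merged graph is good depends on entity resolution: if one chunk yields "Einstein" and another "Albert Einstein", a naive union produces two nodes, so a normalization or deduplication pass over the triples would likely be needed.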
Also, I found an interesting glitch in the knowledge graph generated from the text about Albert Einstein.
In that graph there was this strange relation:
Albert Einstein ---spouse ---> Franklin D. Roosevelt
This is the Knowledge Graph
Albert Einstein KNOWLEDGE GRAPH.PNG
