Hello,
I was looking into the requirements for modifying GraphRAG to support LLM provider plugins, as well as the option to use different providers for indexing and search. I recently learnt that Google's Jule AI coding platform had entered its public beta phase, so I thought: why not give this task to Jule for a test drive!
Here is the prompt I used:
> GraphRAG is currently capable of connecting to OpenAI services either on Microsoft Azure servers or OpenAI servers, for the purpose of accessing an LLM.
>
> I need to achieve three objectives:
>
> 1. Modify GraphRAG so that the code responsible for configuring, connecting and using an OpenAI service, such as the Azure or OpenAI services, is encapsulated into a plug-in that serves GraphRAG's corresponding needs. In other words, GraphRAG should be modified so that it uses plugins in order to access an LLM provider service, and the existing LLM provider services (Azure and OpenAI services using the OpenAI API) should still be available as plugins. The structure of the configuration file should reflect these modifications, which means the user should be able to configure different plugins through GraphRAG's configuration file. Also, the user should have the option to configure GraphRAG to use a different plugin for indexing and a different plugin for searching.
> 2. Develop a plugin that makes use of the capabilities added to GraphRAG above, allowing GraphRAG to connect and utilise the LLM services provided by an Ollama server.
> 3. Provide a configuration file that configures the updated version of GraphRAG to use LLM "llama3.3:70b-instruct-q8_0-32K" from an Ollama server that is available on the localhost and default port number.
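For illustration, a configuration fragment satisfying the third objective could look like the sketch below. The field names (`llm_plugins`, `plugin`, `base_url`) are hypothetical and do not reflect GraphRAG's actual schema or Jule's output; Ollama's default port is 11434.

```yaml
# Hypothetical config layout only -- illustrative field names, not GraphRAG's real schema.
llm_plugins:
  indexing:
    plugin: ollama
    model: "llama3.3:70b-instruct-q8_0-32K"
    base_url: "http://localhost:11434"   # Ollama's default host and port
  search:
    # A different plugin may be selected for search than for indexing.
    plugin: openai
    model: gpt-4o
    api_key: ${OPENAI_API_KEY}
```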
Jule came back with a reasonable plan for the changes to be made, and modified or created the following files:
- `defaults.py`
- `graph_rag_config.py`
- `language_model_config.py`
- `extract_graph.py`
- `generate_text_embeddings.py`
- `factory.py`
- `__init__.py`
- `azure_openai_plugin.py`
- `ollama_plugin.py`
- `openai_plugin.py`
- `models.py`
- `utils.py`
- `factory.py`
- `pyproject.toml`
I spot-checked some of the changes and they made sense, but they still need to be thoroughly reviewed and tested.
01/06/2025 Update: The code changes proposed by Google Jule don't work out of the box; I would have been very surprised if they did. Regardless, I will review them to understand the plugin architecture and its implementation. If there is also feedback from the authors of GraphRAG, I will take that into account.
The new branch can be found here.
Please share your feedback or thoughts and do get in touch if you can help with this.
NT