Warframe LLM

An AI assistant for the video game Warframe, using an LLM (Gemini) and the data from the official wiki.

Summary

An LLM must be given domain data before it can answer questions about that domain. There are several ways to achieve this, such as fine-tuning the model, but the Warframe wiki's content is small enough (around 64 MB at the time of writing) that RAG or similar in-context approaches can be used instead, without creating a new version of the model.
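To illustrate the RAG idea in miniature (this is a sketch, not code from this repo - the document names and scoring are made up): instead of baking the wiki into the model, retrieve only the pages relevant to a question and feed those into the prompt.

```python
# Sketch of the RAG retrieval step: score documents against a question by
# keyword overlap and keep only the best matches for the prompt context.
# A real setup would use embeddings; the docs below are illustrative.
import re

def tokens(text: str) -> set[str]:
    """Lowercased word set, punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(question: str, docs: dict[str, str], k: int = 3) -> list[str]:
    """Return the names of the k docs with the largest keyword overlap."""
    q = tokens(question)
    return sorted(docs, key=lambda name: -len(q & tokens(docs[name])))[:k]

docs = {
    "Ordis": "Ordis is the Cephalon that maintains the Orbiter.",
    "Excalibur": "Excalibur is a sword-themed Warframe.",
    "Credits": "Credits are the common currency.",
}
best = retrieve("who maintains the orbiter?", docs, k=1)  # → ["Ordis"]
```

Only the retrieved pages then go into the LLM's context, which is what keeps the approach workable without fine-tuning.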

Gemini CLI was chosen for the LLM, both because it makes feeding file data to the model easy, and because of its generous free limits.

The system prompt makes the LLM impersonate Ordis, for fun.

Data acquisition

Use the wiki's Special:Export page to export its pages. If any resulting file exceeds 20 MB (Gemini's per-file limit), use the split_dump script to split it into smaller ones.

LLM setup

The repo is set up for Gemini Code Assist (the free tier with 1,000 requests per day - Google's naming is confusing) via the Gemini CLI, which has a 20 MB file limit. As a result, the dump needs to be split into smaller files; the split_dump script is provided for this. A system prompt for Gemini is also provided. Disclaimer: it may well be suboptimal, as I don't know much about prompt engineering 😅

With the split files in place, simply start the CLI in the project's directory and start prompting - the system prompt from GEMINI.md should take effect. Alternatively, since the content is small enough, it can be read fully into the LLM's context with a relevant prompt, e.g.

Read the files into your context: @/split/

This seems unreliable with Gemini, but should work eventually: Gemini finally eating the files

At that point, you may save the conversation with /chat save <name>, so the context does not need to be re-created in future runs - just run /chat load <name> instead.
