Replies: 2 comments 1 reply
-
Hi @jmp75, thanks for sharing your use-case! I enjoyed reading your blog posts on this matter 🙂 So, if I understand correctly, the goal is to be able to obtain the Markdown docs for a specific object given its path, so that you can add these docs as context to an AI assistant in order to help it generate better code. The approach you took in your blog post is to first generate the Markdown docs for the whole package, then split that into chunks for each class/function/etc. Have you thought about or tried going the other way, and using Griffe and griffe2md to directly generate the Markdown docs for specific objects?

```python
import griffe
from griffe2md.main import render_object_docs

swift2 = griffe.load("swift2")
print(render_object_docs(swift2["CompositeParameteriser"], {"members": False}))
```

About mkdocs-llmstxt: ultimately yeah, it would replace griffe2md for this use-case (well, given that you use MkDocs), but for now, IIUC, AI tools still do not understand the llms.txt standard well, so they still have to rely on the llms-full.txt files, which contain everything at once, which once again is too big for current models to fit into their context window.
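Going one step further, here is a rough, untested sketch of that per-object approach applied to a whole package, one Markdown chunk per top-level member so each chunk stays small enough for a model's context window. It only relies on the `griffe.load` and `render_object_docs` calls above; the output directory name is arbitrary.

```python
from pathlib import Path

import griffe
from griffe2md.main import render_object_docs

# Load the package once, then write one Markdown file per top-level member,
# so each chunk can be fed to an LLM individually.
swift2 = griffe.load("swift2")
out_dir = Path("api_chunks")
out_dir.mkdir(exist_ok=True)
for name, member in swift2.members.items():
    if member.is_alias:  # skip imported/re-exported names
        continue
    (out_dir / f"{name}.md").write_text(render_object_docs(member, {"members": True}))
```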
-
Hi @pawamoy, thank you for having a look. I had not thought of generating Markdown on the fly, chunk by chunk as needed, rather than mimicking the approach I got from the initial lesson(s) I saw. Have you got thoughts about how that could work? Something with tools (or resources, I am not super clear on MCP concepts) like the sketch below:
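Purely illustrative and untested, with made-up names; it just wraps the griffe/griffe2md calls from your reply into a function that an MCP server could register as a tool, or use to pre-render per-object resources.

```python
import griffe
from griffe2md.main import render_object_docs

def get_object_docs(package: str, object_path: str) -> str:
    """Return the Markdown docs for one object, identified by its dotted path.

    Hypothetical helper: an MCP server could expose this as a tool,
    or pre-render the chunks and serve them as resources.
    """
    pkg = griffe.load(package)
    return render_object_docs(pkg[object_path], {"members": False})

# e.g. get_object_docs("swift2", "CompositeParameteriser")
```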
-
I am not sure this discussion is an idea for `griffe2md` as such, but I wanted first to thank the author(s), and second to share a use case for which `griffe2md` comes in handy.
Background
On the back of a course by Answer.ai, I had an idea for an AI assistant laid out in this initial post in my public work-related blog.
I've seen compelling examples of using an AI to develop with brand new but "niche" tools that need crafted LLM context, for instance MonsterUI (note the links to LLM context files on the getting started page). However, I have had much less success so far replicating this for my even more niche use case, hydrologic modelling. Admittedly, LLMs may have "learned" web dev concepts much better than hydrology, fair enough.
griffe2md
As I usually use mkdocs for my Python packages, I was happy to find `griffe2md` to produce a Markdown rendering of my Python package's API. I've made inroads, but I still seem to max out the context length of the LLMs I use.

Note that today I discovered that pawamoy (Timothée) also worked recently on mkdocs-llmstxt, which may now complement or supersede `griffe2md`.
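For a sense of scale, a minimal, untested sketch of rendering the whole package API in one go with griffe and griffe2md (assuming `render_object_docs` accepts the loaded top-level module, not just an individual class or function); it is this single large document that overflows the context window:

```python
import griffe
from griffe2md.main import render_object_docs

# Render the entire package API as one Markdown document.
swift2 = griffe.load("swift2")
full_api_md = render_object_docs(swift2, {"members": True})
print(f"{len(full_api_md)} characters of Markdown")  # easily too large for one context window
```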
I'd be interested to hear of similar use cases, and whether there are folks with related ideas for exploiting `mkdocs` tools and outputs as context material for LLMs.