feat(public-docsite-v9): add llms docs #34838
Conversation
📊 Bundle size report: ✅ No changes found

Pull request demo site: URL
This is awesome! Can we apply the same pattern to composed stories as well? It would be really cool to get this done for charts and contrib as well.
https://fluentuipr.z22.web.core.windows.net/pull/34838/public-docsite-v9/storybook/llms.txt - v9 llms.txt

Contrib isn't ready yet since it's not in the monorepo, and I'm still figuring out the optimal way to distribute the script if we decide to go with it.
distribution: I don't see how this could possibly work as an SB addon or bundler plugin because of how it works under the hood. It's very similar to what StoryWright does to obtain screenshots, which is actually desired behaviour as it makes the tool atomic and re-usable. While the implementation is tightly coupled to our full-source addon, the addon shouldn't be a hard pre-requirement; behaviour should be graceful: if full source exists we process that code, otherwise the standard SB code.
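The graceful fallback described above could be sketched roughly like this (a sketch under assumptions: `fullSource` is a hypothetical parameter name for the full-source addon's output, not the actual addon API; `docs.source.originalSource` is the field Storybook's addon-docs populates with story source):

```typescript
// Sketch only: prefer the full-source addon's output when present,
// otherwise fall back to Storybook's standard source snippet.
// `fullSource` is a hypothetical name; `docs.source.originalSource`
// is the standard addon-docs source parameter.
interface StoryParameters {
  fullSource?: string; // hypothetical full-source addon output
  docs?: { source?: { originalSource?: string } };
}

function pickStorySource(params: StoryParameters): string | undefined {
  return params.fullSource ?? params.docs?.source?.originalSource;
}
```

This keeps the addon optional: stories without the full-source parameter still produce usable output from the standard SB code.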
storybook composition: this approach won't scale outside repo-linked SBs, so it should be the responsibility of the linked (composed) SB to generate the markdown assets as part of its production build
Thanks for the feedback @Hotell!
That makes sense.
Agree, do you think it should live in the core monorepo or as a standalone repo?
That's exactly how it works atm, we use the
1b87b82 to e9051fa
let's stick with the core repo for now for logistics and distribution simplicity; in future it might make sense to create a new fluent-storybook-addons repo or something alike
Hotell
left a comment
looking great!
- added some comments/actionables (mainly the SB API simplification / encapsulation)
A thing for thought:
- with this approach it's a black box, and what the deployed output will be might come as a surprise. Maybe we should consider actually storing the .txt generation in git and forcing a re-generation if content changes (similarly to what we have for JSXIntrinsicElement in react-utilities)
That's a valid point about controlling the output, but it would mean core devs need to build a full docsite locally with every component story update PR, right?
exactly, but that is actually desirable: it's the same approach as api.md and test snapshots, both for review and to guarantee we don't ship unexpected outputs. This is not a blocker, as I mentioned; just a thing to consider
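The snapshot-style check discussed here could look roughly like this (a sketch, not what the PR implements; the function name is made up): regenerate the output in CI and fail if it differs from the committed copy, exactly like api.md and test snapshot checks.

```typescript
// Sketch of a CI guard for committed generated output: fail the build when
// the committed .txt no longer matches what the generator would produce.
import * as fs from 'node:fs';

function verifyCommittedOutput(committedPath: string, regenerated: string): void {
  // Missing file counts as stale, same as a changed one.
  const committed = fs.existsSync(committedPath)
    ? fs.readFileSync(committedPath, 'utf8')
    : '';
  if (committed !== regenerated) {
    throw new Error(
      `${committedPath} is out of date; re-run the generator and commit the result.`,
    );
  }
}
```

The cost, as noted above, is that contributors must regenerate and commit the output with every story-affecting PR.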
Hotell
left a comment
left some additional comments, but nothing blocking.
let's go!
…torybook documentation extraction
af5b0a6 to 7e31f20
Previous Behavior
New Behavior
This PR introduces a new CLI tool that extracts documentation from Storybook builds and converts it to LLM-friendly formats following the llmstxt.org specification. The tool processes Storybook production builds to generate comprehensive documentation in plain text format optimized for Large Language Models.
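The llmstxt.org layout the tool targets can be sketched as follows (names and the section grouping here are assumptions, not the PR's actual code): an H1 title, a blockquote summary, then H2 sections containing markdown link lists.

```typescript
// Sketch: assemble an llms.txt document from extracted story metadata,
// following the layout described at llmstxt.org.
interface DocEntry {
  title: string; // e.g. a component or story-group name
  url: string; // link to the generated documentation asset
  description?: string;
}

function buildLlmsTxt(
  siteTitle: string,
  summary: string,
  sections: Record<string, DocEntry[]>,
): string {
  const lines: string[] = [`# ${siteTitle}`, '', `> ${summary}`, ''];
  for (const [heading, entries] of Object.entries(sections)) {
    lines.push(`## ${heading}`, '');
    for (const e of entries) {
      lines.push(`- [${e.title}](${e.url})${e.description ? `: ${e.description}` : ''}`);
    }
    lines.push('');
  }
  return lines.join('\n');
}
```

The resulting plain-text file is what an LLM (or a crawler looking for llms.txt) consumes instead of the rendered Storybook site.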
Key Features
Technical Implementation
- Uses Playwright's page.route() to serve Storybook files without needing a web server
- Reads Storybook's preview global (__STORYBOOK_PREVIEW__) for metadata
- Supports both Storybook 7 (storyStore) and Storybook 8+ (storyStoreValue)

Output Structure
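The interception-based extraction described under Technical Implementation can be sketched like this (a sketch under assumptions: Playwright as the browser driver, iframe.html as the entry point, and a storyIndex.entries shape on the story store; the real tool's code differs):

```typescript
// Sketch of serving a static Storybook build through page.route() and
// reading story metadata from the preview global.
import * as path from 'node:path';

// Pure helper: map an intercepted request path onto a file in the build dir.
function resolveBuildFile(pathname: string): string {
  return pathname === '/' ? 'iframe.html' : pathname.replace(/^\//, '');
}

async function extractStoryIds(buildDir: string): Promise<string[]> {
  // Dynamic import keeps the sketch loadable without Playwright installed.
  const { chromium } = (await import('playwright' + '')) as any;
  const browser = await chromium.launch();
  const page = await browser.newPage();

  // Fulfill every request from the build on disk: no web server needed.
  await page.route('**/*', (route: any) => {
    const { pathname } = new URL(route.request().url());
    return route.fulfill({ path: path.join(buildDir, resolveBuildFile(pathname)) });
  });

  await page.goto('https://storybook.invalid/iframe.html');

  const ids: string[] = await page.evaluate(() => {
    const preview = (globalThis as any).__STORYBOOK_PREVIEW__;
    // Storybook 8+ exposes storyStoreValue; Storybook 7 used storyStore.
    const store = preview.storyStoreValue ?? preview.storyStore;
    return Object.keys(store?.storyIndex?.entries ?? {});
  });

  await browser.close();
  return ids;
}
```

Because every request is fulfilled from disk, the same script works against any unzipped production build, which is what makes the tool atomic and re-usable across storybooks.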
Usage Examples
Basic Usage:
With Configuration File:
Files Added
- tools/storybook-llms-extractor/src/cli.ts - CLI entry point and argument processing
- tools/storybook-llms-extractor/src/utils.ts - Core extraction and conversion logic
- tools/storybook-llms-extractor/src/types.ts - TypeScript type definitions
- tools/storybook-llms-extractor/src/index.ts - Package exports
- tools/storybook-llms-extractor/src/utils.spec.ts - Unit tests
- tools/storybook-llms-extractor/src/__fixtures__/ - Test fixtures
- tools/storybook-llms-extractor/README.md - Comprehensive documentation