We have various approaches for making our content visible to AI as well as making sure it's easily consumed in a plain-text format.
We have implemented `llms.txt` and `llms-full.txt`, and also created per-page Markdown versions of our content.
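These plain-text surfaces can be pulled straight over HTTP and handed to a model as context. As a minimal sketch in TypeScript, assuming the documentation origin below and that `llms.txt` and `llms-full.txt` are served at the site root (the base URL and paths are illustrative, not a definitive API):

```ts
// Illustrative sketch: fetch the plain-text surfaces so they can be passed to an LLM.
// The base URL is an assumption for this example.
const BASE = "https://developers.cloudflare.com";

async function fetchPlainText(path: string): Promise<string> {
  const res = await fetch(`${BASE}${path}`);
  if (!res.ok) throw new Error(`Failed to fetch ${path}: ${res.status}`);
  return res.text();
}

// Site-wide index intended for LLMs, and the full-content variant.
const llmsTxt = await fetchPlainText("/llms.txt");
const llmsFullTxt = await fetchPlainText("/llms-full.txt");
```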
In the top right of this page, you will see a `Page options` button where you can copy the current page as Markdown, which can be given to your LLM of choice.
While HTML is easily parsed - after all, the browser has to parse it to decide how to render the page you're reading now - it tends not to be very _portable_.
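Markdown, on the other hand, travels well. A common way to get there from existing HTML is a converter such as `turndown` (one of the approaches measured in the comparison below); this is a rough sketch, assuming a Node.js runtime with `fetch` available, not our actual build pipeline:

```ts
// Rough sketch: convert a fetched HTML page to Markdown with turndown.
// The options and URL are illustrative only.
import TurndownService from "turndown";

const turndown = new TurndownService({
  headingStyle: "atx",
  codeBlockStyle: "fenced",
});

const html = await (await fetch("https://example.com/some-docs-page/")).text();
const markdown = turndown.turndown(html);

console.log(markdown); // Far fewer characters (and tokens) than the original HTML.
```

Rather than asking every consumer to run a conversion like this, we also publish the Markdown directly.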
For example, take a look at our Markdown test fixture (or any page, by appending `index.md` to its URL).
Most AI pricing is based around input and output tokens, and our approach greatly reduces the number of input tokens required.
For example, let's take a look at the number of tokens required for the [Workers Get Started](/workers/get-started/guide/) guide using [OpenAI's tokenizer](https://platform.openai.com/tokenizer):
- HTML: 15,229 tokens
- turndown: 3,401 tokens (4.48x less than HTML)
- index.md: 2,110 tokens (7.22x less than HTML)
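These counts can be reproduced approximately with a tokenizer library. A minimal sketch, assuming the `js-tiktoken` package and the `cl100k_base` encoding (which may differ slightly from the web tokenizer linked above), and that the page's Markdown version is served at `index.md`:

```ts
// Rough sketch: compare token counts for the HTML and Markdown versions of a page.
// Package choice, encoding, and URLs are assumptions for this example.
import { getEncoding } from "js-tiktoken";

const enc = getEncoding("cl100k_base");
const countTokens = (text: string): number => enc.encode(text).length;

const pageUrl = "https://developers.cloudflare.com/workers/get-started/guide/";
const html = await (await fetch(pageUrl)).text();
const markdown = await (await fetch(`${pageUrl}index.md`)).text();

console.log(`HTML:     ${countTokens(html)} tokens`);
console.log(`Markdown: ${countTokens(markdown)} tokens`);
```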
When providing our content to AI, we see a real-world ~7x saving in input token cost.
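To put that in cost terms, here is a minimal sketch using the counts above and a hypothetical input price (the price per million tokens is a placeholder, not a real quote):

```ts
// Minimal sketch: what a ~7x reduction in input tokens means for cost.
// The price below is a hypothetical placeholder, not an actual model price.
const PRICE_PER_MILLION_INPUT_TOKENS_USD = 3; // hypothetical

const costUSD = (tokens: number): number =>
  (tokens / 1_000_000) * PRICE_PER_MILLION_INPUT_TOKENS_USD;

const htmlTokens = 15_229;    // HTML version of the page
const markdownTokens = 2_110; // index.md version of the page

console.log(`HTML input cost:     $${costUSD(htmlTokens).toFixed(4)}`);
console.log(`Markdown input cost: $${costUSD(markdownTokens).toFixed(4)}`);
console.log(`Saving: ~${(htmlTokens / markdownTokens).toFixed(1)}x`);
```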