This PR was merged into the main branch.
Discussion
----------
[Platform] Introduce `CachedPlatform`
| Q | A
| ------------- | ---
| Bug fix? | no
| New feature? | yes
| Docs? | yes
| Issues | Somehow related to #337
| License | MIT
Hi 👋🏻
This PR aims to introduce a caching layer for the `Ollama` platform (like the OpenAI, Anthropic and other platforms already have).
Commits
-------
c66e733 feat(platform): Ollama prompt cache
`src/ai-bundle/config/options.php`: 15 additions & 0 deletions
```diff
@@ -65,6 +65,21 @@
             ->end()
         ->end()
     ->end()
+    ->arrayNode('cache')
+        ->useAttributeAsKey('name')
+        ->arrayPrototype()
+            ->children()
+                ->stringNode('platform')->isRequired()->end()
+                ->stringNode('service')
+                    ->isRequired()
+                    ->info('The cache service id as defined under the "cache" configuration key')
+                ->end()
+                ->stringNode('cache_key')
+                    ->info('Key used to store platform results, if not set, the current platform name will be used, the "prompt_cache_key" can be set during platform call to override this value')
```
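To illustrate, the new `cache` node could be configured roughly like this. This is a hypothetical sketch: only the `platform`, `service` and `cache_key` option names come from the config tree in this PR; the surrounding `ai`/`platform` key layout, the service id `ai.platform.ollama` and the pool name `cache.app` are assumptions for illustration.

```yaml
# config/packages/ai.yaml (hypothetical layout)
ai:
    platform:
        cache:
            ollama:
                platform: 'ai.platform.ollama'  # illustrative platform service id
                service: 'cache.app'            # cache pool defined under the "cache" configuration key
                cache_key: 'ollama'             # optional; defaults to the platform name
```

Per the option description, a `prompt_cache_key` can also be set during the platform call to override the configured `cache_key` for a single request.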