- How can I create my own niche LLM or AI tool?
- Can we switch AI providers on the CLI while keeping the same context? (see https://github.com/foreveryh/claude-code-switch)
- Can we extend a "core" LLM? How?
- Unify all the MCP servers for all the local tools (JSON config, symlinks, converters/transformers)
- Model arbitrage: strategically use different AI models for specific tasks to optimize cost and performance. Instead of using an expensive model (e.g. Claude Sonnet via Claude Code) for everything, match each model's strengths to the task at hand.
- Can we create a whole Drupal site with an AI CLI (e.g. Claude Code)?
- How can we reduce token usage on local CLI runs?
- Can we run UI tests on GitHub using simple AI commands?
- Should we use an AI CLI on the server (live site)?
- Is there any LLM trained on darknet, hidden, protected, or pirated data, etc.?
- Can I expose my local files as AI data for LLMs (like torrents; a startup idea)?
- Can I restrict an MCP tool, e.g. to return results only from certain domains?
- Should I create an MCP server that acts as an AI persona?
- Can AI produce a cost estimate before starting a task? How can we do this on our own?
- Drupal LLM: Can we have each Drupal part as a mini, embedded LLM or an MCP server?
- Drupal LLM: Can we have versions of each mini LLM matching the version of the tool whose data we parse? (LLM variants)
- Drupal LLM: How much time and RAM (→ dollars) do we need?
- Drupal LLM: Estimate LLM vs. agent vs. realtime RAG (cost, needs, maintenance, extensibility, UX, etc.)
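The model-arbitrage idea above could be sketched as a simple router that sends each task to the cheapest model deemed capable enough. All model names, capability tiers, and per-token prices below are illustrative assumptions, not real pricing:

```python
# Minimal model-arbitrage sketch: route each task to the cheapest
# model whose capability tier covers the task.
# Model names, tiers, and prices are hypothetical.

MODELS = [
    # (name, capability tier, USD per 1K tokens) -- assumed values
    ("small-local-model", 1, 0.0),
    ("mid-hosted-model", 2, 0.001),
    ("frontier-model", 3, 0.015),
]

TASK_TIERS = {
    # How demanding we judge each task type to be (assumption)
    "summarize": 1,
    "refactor": 2,
    "architecture-review": 3,
}

def pick_model(task_type: str) -> str:
    """Return the cheapest model whose tier covers the task."""
    needed = TASK_TIERS.get(task_type, 3)  # unknown tasks go to the top tier
    candidates = [m for m in MODELS if m[1] >= needed]
    return min(candidates, key=lambda m: m[2])[0]

print(pick_model("summarize"))            # small-local-model
print(pick_model("architecture-review"))  # frontier-model
```

In practice the routing table would be tuned from observed quality per task type, but the shape of the decision stays the same: a capability floor plus a price minimization.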
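On restricting an MCP tool to certain domains: as far as I know MCP has no standard per-tool domain allowlist, so one hedged approach is a post-filter in your own server that drops results from outside an allowlist before they reach the model. The domains and result shape below are examples:

```python
# Sketch: keep only tool results whose URL is on an allowlist.
# The allowlist and the {"url": ...} result shape are assumptions.
from urllib.parse import urlparse

ALLOWED_DOMAINS = {"drupal.org", "api.drupal.org"}  # example allowlist

def allowed(url: str) -> bool:
    """True if the URL's host is an allowed domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return host in ALLOWED_DOMAINS or any(
        host.endswith("." + d) for d in ALLOWED_DOMAINS
    )

def filter_results(results: list[dict]) -> list[dict]:
    """Drop search results whose URL falls outside the allowlist."""
    return [r for r in results if allowed(r.get("url", ""))]

hits = [
    {"url": "https://www.drupal.org/docs"},
    {"url": "https://example.com/spam"},
]
print(filter_results(hits))  # only the drupal.org entry remains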
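For the upfront cost estimate question, a back-of-the-envelope sketch is to derive a token count from prompt length and multiply by per-token prices. The ~4 chars/token ratio is a common rule of thumb for English text, and both prices here are hypothetical; real pricing varies by model and provider:

```python
# Rough pre-flight cost estimate for an LLM task.
# CHARS_PER_TOKEN and both prices are assumptions, not real pricing.

CHARS_PER_TOKEN = 4
PRICE_PER_1K_INPUT = 0.003   # hypothetical USD per 1K input tokens
PRICE_PER_1K_OUTPUT = 0.015  # hypothetical USD per 1K output tokens

def estimate_cost(prompt: str, expected_output_tokens: int) -> float:
    """Estimate USD cost from prompt length and an expected output size."""
    input_tokens = len(prompt) / CHARS_PER_TOKEN
    cost = (input_tokens / 1000) * PRICE_PER_1K_INPUT
    cost += (expected_output_tokens / 1000) * PRICE_PER_1K_OUTPUT
    return round(cost, 6)

# 8,000-char prompt, ~500 expected output tokens
print(estimate_cost("x" * 8000, expected_output_tokens=500))  # 0.0135
```

A wrapper around the CLI could run this before each task and refuse (or downgrade the model) when the estimate crosses a budget threshold.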