feat: Support ollama instead of llamafile#231

Open
mrorigo wants to merge 1 commit into souzatharsis:main from mrorigo:abstract_llm_backend
Conversation

@mrorigo mrorigo commented Jan 23, 2025

  • Removed llamafile dependency from requirements.txt
  • Use Ollama instead of llamafile
  • Introduces LLMBackendType enum

New command line flag:
--llm-type - Defaults to 'litellm'; also accepts 'google' or 'ollama'

Removed the is_local flag

Example:
podcastfy/client.py --llm-type ollama --llm-model-name=llama3.1:8b-instruct-q8_0 --transcript ./data/transcripts/transcript_a846e44acfe143579b1fa570feb73328.txt
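As a rough sketch of the LLMBackendType enum this PR introduces (the member and method names here are assumptions, not the PR's actual code), the --llm-type flag value could map to a backend like this:

```python
from enum import Enum


class LLMBackendType(Enum):
    """Backends selectable via the --llm-type flag (hypothetical sketch)."""
    LITELLM = "litellm"  # default
    GOOGLE = "google"
    OLLAMA = "ollama"

    @classmethod
    def from_flag(cls, value: str) -> "LLMBackendType":
        # Convert the raw flag string into an enum member,
        # failing with a helpful message on unknown values.
        try:
            return cls(value)
        except ValueError:
            valid = ", ".join(m.value for m in cls)
            raise ValueError(
                f"Unknown --llm-type {value!r}; expected one of: {valid}"
            ) from None


# Example: the command line above would resolve to the Ollama backend.
backend = LLMBackendType.from_flag("ollama")
```

This keeps backend selection in one place instead of scattering string comparisons (or the removed is_local boolean) through the client code.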

@mrorigo mrorigo force-pushed the abstract_llm_backend branch 2 times, most recently from 6326335 to b7518fa Compare January 23, 2025 18:40
- Removed llamafile dependency from requirements.txt
- Use Ollama instead of llamafile
- Introduces LLMBackendType enum