feat: add support for multiple LLM providers (OpenAI, Anthropic, HuggingFace/Local, etc.) #11
Closes: #4
Description
This PR enhances the flexibility of the semantic chunking service by introducing support for multiple LLM providers. The system is no longer limited to OpenAI and can now leverage open-source, self-hosted, and cloud-based models, including Anthropic, HuggingFace-hosted, and local models (such as Llama, Mistral, and others). This update makes it easy to select and configure the desired backend for semantic chunking.
Goals
- Allow the semantic chunking backend to be selected from multiple providers (openai, anthropic, huggingface, etc.)

Changes
- Added modelProvider and modelConfig options to select and configure the desired backend.
- Added optional dependencies on anthropic, transformers, and torch.
- Updated README.md and setup.py to clarify the new provider options and correct any misleading information about OCR.

Usage Examples
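The usage examples themselves did not survive the page scrape. As a hedged illustration only, here is a minimal sketch of how the new modelProvider/modelConfig options might route to a backend; resolve_embedder and the default model names below are hypothetical, not the repository's actual API — only the provider names (openai, anthropic, huggingface) come from this PR:

```python
# Illustrative sketch of provider dispatch behind modelProvider/modelConfig.
# resolve_embedder and the default model names are hypothetical stand-ins.

def resolve_embedder(model_provider: str, model_config: dict):
    """Return a callable that records which backend would embed the text."""
    defaults = {
        "openai": "text-embedding-3-small",
        "anthropic": "claude-3-haiku-20240307",
        "huggingface": "sentence-transformers/all-MiniLM-L6-v2",
    }
    if model_provider not in defaults:
        raise ValueError(f"Unknown modelProvider: {model_provider!r}")
    model = model_config.get("model", defaults[model_provider])
    # A real implementation would construct the provider's client here and
    # return a function that calls it; this stub just records the routing.
    return lambda text: (model_provider, model, text)

# Select the HuggingFace/local backend with an explicit model override.
embed = resolve_embedder("huggingface", {"model": "mistralai/Mistral-7B-v0.1"})
provider, model, text = embed("chunk this passage")
```

The design intent, as described above, is that omitting modelConfig falls back to a sensible per-provider default, while an unknown modelProvider fails fast with a clear error.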
Testing
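No test details survived the scrape. As an illustration of the kind of check this change calls for, a provider-validation test might look like the following; SUPPORTED_PROVIDERS and validate_provider are illustrative names, not the PR's actual code:

```python
# Hypothetical smoke test for provider selection; all names are stand-ins.

SUPPORTED_PROVIDERS = {"openai", "anthropic", "huggingface"}

def validate_provider(model_provider: str) -> str:
    """Reject providers the chunker does not know how to configure."""
    if model_provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"Unknown modelProvider: {model_provider!r}")
    return model_provider

def test_known_provider_accepted():
    assert validate_provider("anthropic") == "anthropic"

def test_unknown_provider_rejected():
    try:
        validate_provider("definitely-not-a-provider")
    except ValueError:
        return
    raise AssertionError("expected ValueError for unknown provider")

test_known_provider_accepted()
test_unknown_provider_rejected()
```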
Dependencies
- anthropic (for Claude models)
- transformers and torch (for HuggingFace/local models)

Documentation
- Updated README.md with new provider options and usage instructions.

/claim #4