Skip to content

Make HuggingFace models optional with graceful ImportError handling#99

Open
PositiveMike33 wants to merge 2 commits intokyegomez:mainfrom
PositiveMike33:claude/fix-pytorch-dependency-1A3O7
Open

Make HuggingFace models optional with graceful ImportError handling#99
PositiveMike33 wants to merge 2 commits intokyegomez:mainfrom
PositiveMike33:claude/fix-pytorch-dependency-1A3O7

Conversation

@PositiveMike33
Copy link

Summary

This PR makes HuggingFace model dependencies optional by adding conditional imports and graceful error handling. Users can now install the package without PyTorch/transformers and only install these dependencies when needed via an optional extras group.

Key Changes

  • Conditional imports in huggingface_model.py: Wrapped transformers imports in a try-except block with a HUGGINGFACE_AVAILABLE flag to detect if dependencies are installed
  • Runtime validation: Added ImportError checks in both HuggingLanguageModel and HFPipelineModel constructors that provide helpful installation instructions when dependencies are missing
  • Optional package exports in __init__.py: Made HuggingFace model imports conditional, setting them to None if transformers is unavailable, allowing the package to load successfully without these dependencies
  • Setup.py extras group: Added extras_require with a huggingface group that specifies torch>=2.0.0 as an optional dependency
  • Updated requirements.txt: Added torch as a dependency (note: this may need adjustment depending on whether torch should be in base requirements or only in extras)

Implementation Details

  • The HUGGINGFACE_AVAILABLE flag is set at module import time, allowing early detection of missing dependencies
  • Error messages guide users to install with pip install tree-of-thoughts[huggingface] for a better experience
  • The conditional import pattern in __init__.py prevents the entire package from failing to import if transformers is unavailable
  • Users who don't need HuggingFace models can now use other model providers (e.g., OpenAI) without installing PyTorch

https://claude.ai/code/session_01VYXewQoaCJ3yZaQtQWDVfq

claude added 2 commits March 4, 2026 06:29
- Add PyTorch as optional dependency in setup.py with extras_require['huggingface']
- Add torch>=2.0.0 to requirements.txt
- Wrap transformers imports in try/except block in huggingface_model.py
- Add HUGGINGFACE_AVAILABLE flag to gracefully handle missing PyTorch
- Add clear error messages when attempting to use HuggingFace models without PyTorch
- Make HuggingFace model imports conditional in __init__.py to prevent import errors

Users can now install with: pip install tree-of-thoughts[huggingface]
to get PyTorch support for HuggingFace models.

This fixes the issue where PyTorch was not declared as a dependency even though
it was needed for HuggingFace model support, causing warnings from transformers.
…roq, DeepSeek, Ollama, LM Studio)

- Implement AnthropicLanguageModel with claude-4.5-haiku as default model
- Add GroqLanguageModel for Grok/Mixtral models via Groq API
- Add DeepSeekLanguageModel for DeepSeek API integration
- Add OllamaLanguageModel for local Ollama instances
- Add LMStudioLanguageModel for LM Studio local deployment
- All models implement AbstractLanguageModel interface
- Support both 'value' and 'vote' evaluation strategies
- Environment variable configuration for API keys and endpoints
- Complete generate_thoughts() and evaluate_states() implementations
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

2 participants