Update dependencies in requirements.txt #182
Walkthrough

This pull request updates multiple dependency package versions in requirements.txt. The changes include upgrades to aiohttp, json-repair, pydantic, litellm, streamlit, json5, requests, pypdf, torch, lxml, and httpx with adjusted version constraints and pinning strategies.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Pre-merge checks and finishing touches: ✅ Passed checks (3 passed)
Codecov Report: ✅ All modified and coverable lines are covered by tests.
Actionable comments posted: 1
🧹 Nitpick comments (1)
requirements.txt (1)
17-17: Reconsider the loosely-pinned pypdf minimum version.

Line 17: `pypdf>=6.4.0` is permissive and could pull in future incompatible versions. Consider using `pypdf~=6.4.0` (compatible release) or `pypdf>=6.4.0,<7.0.0` to prevent unexpected breakage from future major releases.
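To make the trade-off concrete, here is how the candidate specifiers expand under PEP 440 (an illustrative sketch of the semantics, not a change proposed by this PR):

```text
pypdf~=6.4.0         # equivalent to: >=6.4.0, ==6.4.*  (patch updates only)
pypdf~=6.4           # equivalent to: >=6.4, ==6.*      (minor updates, below 7.0)
pypdf>=6.4.0,<7.0.0  # explicit upper bound on the next major version
```

Note that `~=6.4.0` (three components) is stricter than the `<7.0.0` bound: it only admits 6.4.x patch releases.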
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
requirements.txt (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: test (3.10)
🔇 Additional comments (1)
requirements.txt (1)
9-9: Clarify intent of the pydantic constraint change.

Line 9: pydantic changed from an exact pin (`==2.9.1`) to a compatible-release clause (`~=2.12.5`). This semantic change allows automatic patch updates (any 2.12.x at or above 2.12.5). Confirm this is intentional for your use case, as it differs from the locked dependency strategy implied by the original exact pin.
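To see exactly what `~=2.12.5` admits, here is a minimal stdlib-only sketch of the compatible-release rule (the helper name and the release-segment-only parsing are illustrative; real resolvers also handle pre/post/dev segments):

```python
def compatible_release(installed: str, spec: str) -> bool:
    """Check a PEP 440 compatible-release clause (~=X.Y.Z) against a version.

    ~=2.12.5 is equivalent to >=2.12.5, ==2.12.*  -- minimal sketch that
    handles plain release segments only (no pre/post/dev releases).
    """
    spec_parts = [int(p) for p in spec.split(".")]
    ver_parts = [int(p) for p in installed.split(".")]
    # Pad both to equal length for the lower-bound comparison.
    n = max(len(spec_parts), len(ver_parts))
    sp = spec_parts + [0] * (n - len(spec_parts))
    vp = ver_parts + [0] * (n - len(ver_parts))
    if vp < sp:
        return False  # below the >= lower bound
    # All but the last spec component must match exactly (the ==X.Y.* part).
    return ver_parts[: len(spec_parts) - 1] == spec_parts[: len(spec_parts) - 1]

assert compatible_release("2.12.7", "2.12.5")      # patch bump: allowed
assert not compatible_release("2.13.0", "2.12.5")  # minor bump: excluded
assert not compatible_release("2.12.4", "2.12.5")  # below lower bound
```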
@@ -1,25 +1,25 @@
- aiohttp>=3.10.11
+ aiohttp>=3.13.2
🧩 Analysis chain
🏁 Script executed:
# First, let's examine the actual requirements.txt file to see the full context
head -34 requirements.txt

Repository: barun-saha/slide-deck-ai
Length of output: 540
🌐 Web query:
lxml 4.9 to 6.0 breaking changes migration
💡 Result:
Key breaking / migration items when moving from lxml 4.9 → 6.0
- Minimum Python now 3.8 — support for Python < 3.8 removed. [1][3]
- zlib/lzma “direct parsing from compressed data” is no longer guaranteed (treated as optional); code that relied on automatic compressed-data parsing must check etree.LIBXML_FEATURES and handle absence. [1][3]
- New parser option decompress=False (default) controls automatic input decompression when built against libxml2 ≥2.15; code that relied on automatic decompression should explicitly enable decompress=True to preserve old behavior. [1][2][3]
- Schematron class deprecated (will become non‑functional in a future lxml/libxml2 change) — migrate away from lxml.Schematron usage. [1][3]
- Removal of libxml2 debug memory methods: MemDebug.dump() and MemDebug.show() removed. [1][3]
- Binary-wheel libxml2/libxslt versions changed (affects features like direct HTTP/FTP URL parsing and available libxml2 features) — tests relying on those built-in URL handlers or specific libxml2 feature flags should verify etree.LIBXML_FEATURES. [1][2][3]
- Some private/internal symbols were changed/removed in recent 5.x→6.x work (e.g., historically removed private result classes); avoid relying on underscore-prefixed internals. [2][4]
Upgrade checklist
- Ensure your runtime is Python ≥3.8. [1][3]
- Run your test-suite with the target lxml 6.x wheel (and the platform libxml2 you expect) and inspect etree.LIBXML_FEATURES for missing features you rely on (e.g., "zlib", "http"). [1][2]
- If you parse potentially untrusted compressed input, explicitly decide whether to enable parser(decompress=True) or pre-decompress safely. [1][3]
- Replace any use of deprecated Schematron APIs and avoid private underscore APIs. [1][2]
Sources: lxml 6.0 changelog and release notes (official docs and GitHub/PyPI). [1] [2] [3]
References
[1] lxml changelog (6.0.0).
[2] lxml GitHub releases (6.0.x).
[3] lxml on PyPI (6.0.0).
🌐 Web query:
torch 2.6 to 2.9 compatibility breaking changes
💡 Result:
Here are the major compatibility‑breaking changes and important deprecations introduced between PyTorch 2.6 and 2.9 (high‑impact items to watch when upgrading):
- torch.load default security change (2.6): the default for torch.load(weights_only) was flipped from False → True (can break code that relied on loading arbitrary pickled objects). See the 2.6 release note and discussion. [1][2]
- Linux wheel ABI / build platform change (2.6→2.7): Linux wheels moved to Manylinux 2.28 and CXX11_ABI=1, so you may need to rebuild C++/CUDA extensions with the new ABI. [1]
- CUDA toolkit / GPU-arch support changes (2.8→2.9): PyTorch changed which GPU architectures are included in various CUDA builds; Maxwell/Pascal/Volta support was removed/deprecated for CUDA 12.8/12.9+ binaries (affects older GPUs and which CUDA wheel you must pick). Check the release/packaging notices and related GitHub issue for details and recommended CUDA wheel choices. [3][4]
- API deprecations / migrations (2.8): some APIs were deprecated (example: migration notes around quantization APIs / torch.ao.quantization → new locations or replacements are documented in the 2.8/2.9 ecosystem notes). Plan to update uses before final removal in later releases. [6]
- libtorch / ABI and Python minimum (2.9): changes to libtorch ABI stability, and the minimum supported Python version was raised (PyTorch 2.9 requires Python ≥ 3.10); check the 2.9 release notes for any other C++/ABI compatibility notes. [5][1]
References
- PyTorch 2.6 release notes (torch.load/ABI/Manylinux note). [1]
- Dev discussion / announcement about flipping torch.load(weights_only) default. [2]
- PyTorch 2.8 release / packaging notes (CUDA wheel, wheel variants). [3]
- PyTorch developer thread & GitHub issue on CUDA architecture removals (Maxwell/Pascal/Volta). [4]
- PyTorch 2.9 release notes / GitHub releases (Python min version, libtorch ABI). [5]
- 2.8 migration notes / summaries mentioning quantization API migration. [6]
Would you like a short upgrade checklist (exact code changes to search/replace, repro steps, and recommended wheel/CUDA combos for common GPUs)?
🌐 Web query:
json5 0.9.14 to 0.12.1 breaking changes
💡 Result:
Here are the notable changes between json5 0.9.14 and 0.12.1 that can affect compatibility or behavior (sources cited):
- json5 v0.9.17–v0.9.28: packaging/tooling changes (move to pyproject.toml, add typing, version metadata) — not functional but may affect builds/distributions. [1]
- v0.9.7 (earlier, but relevant): dumps()/dump()'s default() behavior was tightened to require the function to return a string (matching stdlib json); this was marked a potentially breaking change. If you used a default() that returned non-string values, code may break. [1]
- v0.10.0 (2024-11-25): added JSON5Encoder class (new encoder API) and new quote_style options for dump()/dumps(); code that subclasses or depends on the previous encoding internals may need changes. The release is stated as intended to be backwards-compatible, but the new encoder API can change customization paths. [1]
- v0.11.0 (2025-04-01): added a new parse() iterator-style method and two new parameters — consume_trailing and start — for load()/loads()/parse(); these add features but do not remove existing behavior. If you rely on implicit “consume trailing” behavior you now can change it; not a breaking removal but an API surface increase. [1]
- v0.12.0 (2025-04-03) / v0.12.1 (2025-08-12): primarily packaging/build/devenv changes and a small bugfix (0.12.1 fixes indentation of objects returned from custom encoders). No user-facing breaking changes documented. [1][2]
Summary advice: the only potentially breaking behavioral change between 0.9.14 and 0.12.1 you should watch for is the stricter requirement for encoder.default() to return a string (see v0.9.7 note). Other changes are mainly new/extended APIs (JSON5Encoder, quote_style, parse/consume_trailing/start) and packaging/tooling updates. [1][2]
Sources:
- pyjson5 (json5) repository release notes / version history. [1]
- json5 on PyPI (release files for 0.12.1). [2]
(If you want, I can grep your codebase for uses of json5.default()/custom encoders or show exact code changes to adapt.)
🏁 Script executed:
# Search for lxml usage patterns
rg "lxml|etree\." --type py -B 2 -A 2 | head -50

Repository: barun-saha/slide-deck-ai
Length of output: 50
🏁 Script executed:
# Search for torch.load usage
rg "torch\.load" --type py -B 2 -A 2

Repository: barun-saha/slide-deck-ai
Length of output: 50
🏁 Script executed:
# Search for json5 encoder/default usage
rg "json5\." --type py -B 2 -A 2

Repository: barun-saha/slide-deck-ai
Length of output: 2219
🏁 Script executed:
# Check Python version requirements
rg "python_requires|Programming Language :: Python" --type py --type toml --type txt | head -20

Repository: barun-saha/slide-deck-ai
Length of output: 125
🏁 Script executed:
# Check for Schematron usage (deprecated in lxml 6.0)
rg "Schematron|schematron" --type py -B 2 -A 2

Repository: barun-saha/slide-deck-ai
Length of output: 50
🏁 Script executed:
# Check if transformers or other packages might use lxml indirectly
rg "from transformers|import transformers" --type py -B 1 -A 1 | head -20

Repository: barun-saha/slide-deck-ai
Length of output: 305
🏁 Script executed:
# Search for python_requires in setup files
rg "python_requires" --type py --type toml

Repository: barun-saha/slide-deck-ai
Length of output: 50
🏁 Script executed:
# Check pyproject.toml fully for Python version specs
cat pyproject.toml | head -50

Repository: barun-saha/slide-deck-ai
Length of output: 1171
🏁 Script executed:
# Check if there's a setup.py
ls -la setup.py 2>/dev/null || echo "No setup.py found"

Repository: barun-saha/slide-deck-ai
Length of output: 85
Version updates are generally compatible with the codebase; minor verification recommended.
This PR updates 11 packages. Most changes are compatible with the codebase's current Python ≥3.10 requirement:
- lxml `~=6.0.2`: This upgrade is safe. The codebase does not directly use lxml, so the documented breaking changes (compressed-data parsing behavior, Schematron deprecation) do not apply. If a transitive dependency (e.g., transformers) introduces incompatibilities, they would surface in testing.
- torch `~=2.9.1`: The constraint change from `>=2.6.0` to `~=2.9.1` is justified by PyTorch's breaking changes (ABI modifications for Linux wheels, CUDA GPU architecture support changes, Python 3.10+ requirement). Since the codebase targets Python ≥3.10, this is compatible. No direct `torch.load()` usage found in the codebase.
- json5 `~=0.12.1`: Safe upgrade. The codebase uses only `json5.loads()` for parsing and does not use custom encoders or deprecated features. The upgrade from 0.9.14 introduces new APIs (JSON5Encoder, parse iterator) but remains backward compatible.
- Version constraint strategy inconsistencies (mixed use of `~=`, `==`, `>=`): noted but not a functional issue; consider standardizing for maintainability.
Run integration tests with the updated dependencies to confirm compatibility, particularly for transformers and its transitive dependencies.
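One way to act on the standardization suggestion is a quick stdlib-only audit of which constraint operators a requirements file mixes. The helper below is a hypothetical sketch (its name and the one-requirement-per-line assumption are not from the PR):

```python
import re
from collections import Counter

def constraint_styles(requirements_text: str) -> Counter:
    """Tally the version-constraint operators used in requirements-style text.

    Assumes one requirement per line; comments and blank lines are skipped.
    """
    ops = Counter()
    for line in requirements_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line:
            continue
        # Longer operators must come first so ">=" is not matched as ">".
        m = re.search(r"(~=|==|>=|<=|!=|<|>)", line)
        ops[m.group(1) if m else "unpinned"] += 1
    return ops

sample = """\
aiohttp>=3.13.2
pydantic~=2.12.5
json5~=0.12.1
torch~=2.9.1
"""
print(constraint_styles(sample))
```

A nonzero count for more than one operator flags the kind of mixed strategy the review calls out.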
🤖 Prompt for AI Agents
requirements.txt around lines 1-1: the PR updates multiple dependencies
(including aiohttp) and the reviewer flagged mixed version constraint styles and
recommended verification; run the full test suite and integration tests against
these updated packages (focus on transformers and any transitive deps like
lxml/torch/json5) to catch compatibility issues, and then normalize the version
constraint format across requirements.txt (pick a single convention such as ~=
for minor-compatible updates or >= for minimums) and update the file accordingly
before merging.