
chore(deps-dev): bump crawl4ai from 0.7.4 to 0.8.0#98

Open
dependabot[bot] wants to merge 1 commit into main from dependabot/uv/crawl4ai-0.8.0

Conversation


dependabot[bot] commented on behalf of GitHub on Jan 18, 2026

Bumps crawl4ai from 0.7.4 to 0.8.0.

Release notes

Sourced from crawl4ai's releases.

Release v0.8.0

🎉 Crawl4AI v0.8.0 Released!

📦 Installation

PyPI:

pip install crawl4ai==0.8.0

Docker:

docker pull unclecode/crawl4ai:0.8.0
docker pull unclecode/crawl4ai:latest

Note: Docker images are being built and will be available shortly. Check the Docker Release workflow for build status.

📝 What's Changed

See CHANGELOG.md for details.

Release v0.7.8

🎉 Crawl4AI v0.7.8 Released!

📦 Installation

PyPI:

pip install crawl4ai==0.7.8

Docker:

docker pull unclecode/crawl4ai:0.7.8
docker pull unclecode/crawl4ai:latest

Note: Docker images are being built and will be available shortly. Check the Docker Release workflow for build status.

📝 What's Changed

See CHANGELOG.md for details.

Release v0.7.7

🎉 Crawl4AI v0.7.7 Released!

This release introduces a complete self-hosting platform with enterprise-grade real-time monitoring, transforming Crawl4AI Docker from a simple containerized crawler into a production-ready platform with full operational transparency and control.

🚀 What's New

... (truncated)

Changelog

Sourced from crawl4ai's changelog.

[0.8.0] - 2026-01-12

Security

  • 🔒 CRITICAL: Remote Code Execution Fix: Removed __import__ from hook allowed builtins
    • Prevents arbitrary module imports in user-provided hook code
    • Hooks now disabled by default via CRAWL4AI_HOOKS_ENABLED environment variable
    • Credit: Neo by ProjectDiscovery
  • 🔒 HIGH: Local File Inclusion Fix: Added URL scheme validation to Docker API endpoints
    • Blocks file://, javascript:, data: URLs on /execute_js, /screenshot, /pdf, /html
    • Only allows http://, https://, and raw: URLs
    • Credit: Neo by ProjectDiscovery

Breaking Changes

  • Docker API: Hooks disabled by default: Set CRAWL4AI_HOOKS_ENABLED=true to enable
  • Docker API: file:// URLs blocked: Use the Python library directly for local file processing (see the sketch below)
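
As a workaround for the blocked file:// endpoints, local HTML can still be processed by calling the Python library directly. A minimal sketch, assuming the documented file:// URL prefix for AsyncWebCrawler.arun and the default result.markdown accessor; the local path is a placeholder:

import asyncio
from crawl4ai import AsyncWebCrawler

async def main():
    # file:// URLs are blocked on the Docker API in 0.8.0, but the Python
    # library can still read local HTML directly (placeholder path below).
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(url="file:///path/to/page.html")
        print(result.markdown)

asyncio.run(main())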

Added

  • 🚀 init_scripts for BrowserConfig: Pre-page-load JavaScript injection for stealth evasions (sketched after this list)
  • 🔄 CDP Connection Improvements: WebSocket URL support, proper cleanup, browser reuse
  • 💾 Crash Recovery for Deep Crawl: resume_state and on_state_change for BFS/DFS/Best-First strategies
  • 📄 PDF/MHTML for raw:/file:// URLs: Generate PDFs and MHTML from cached HTML content
  • 📸 Screenshots for raw:/file:// URLs: Render cached HTML and capture screenshots
  • 🔗 base_url Parameter: Proper URL resolution for raw: HTML processing
  • ⚡ Prefetch Mode: Two-phase deep crawling with fast link extraction
  • 🔀 Enhanced Proxy Support: Improved proxy rotation and sticky sessions
  • 🌐 HTTP Strategy Proxy Support: Non-browser crawler now supports proxies
  • 🖥️ Browser Pipeline for raw:/file://: New process_in_browser parameter
  • 📋 Smart TTL Cache for Sitemap Seeder: cache_ttl_hours and validate_sitemap_lastmod parameters
  • 📚 Security Documentation: Added SECURITY.md with vulnerability reporting guidelines
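
The init_scripts feature is described above only by name; the sketch below assumes it accepts a list of JavaScript source strings that run before each page load (similar to Playwright's add_init_script), which may differ from the final 0.8.0 API:

import asyncio
from crawl4ai import AsyncWebCrawler, BrowserConfig

# Assumption: init_scripts takes JavaScript strings injected before each page load.
STEALTH_JS = "Object.defineProperty(navigator, 'webdriver', { get: () => undefined });"

async def main():
    browser_cfg = BrowserConfig(headless=True, init_scripts=[STEALTH_JS])
    async with AsyncWebCrawler(config=browser_cfg) as crawler:
        result = await crawler.arun(url="https://example.com")
        print(result.success)

asyncio.run(main())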

Fixed

  • raw: URL Parsing: Fixed truncation at # character (CSS color codes like #eee)
  • Caching System: Various improvements to cache validation and persistence

Documentation

  • Multi-sample schema generation section
  • URL seeder smart TTL cache parameters
  • v0.8.0 migration guide
  • Security policy and disclosure process

[Unreleased]

Added

  • 🔒 HTTPS Preservation for Internal Links: New preserve_https_for_internal_links configuration flag (sketched below)
    • Maintains HTTPS scheme for internal links even when servers redirect to HTTP
    • Prevents security downgrades during deep crawling
    • Useful for security-conscious crawling and sites supporting both protocols
    • Fully backward compatible with opt-in flag (default: False)
    • Fixes issue #1410 where HTTPS URLs were being downgraded to HTTP
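
Since preserve_https_for_internal_links is still unreleased, the sketch below is speculative: it assumes the flag is exposed on CrawlerRunConfig and that internal links surface under result.links["internal"]; both should be checked against the docs once the flag ships:

import asyncio
from crawl4ai import AsyncWebCrawler, CrawlerRunConfig

async def main():
    # Assumption: the opt-in flag lives on CrawlerRunConfig (default False).
    run_cfg = CrawlerRunConfig(preserve_https_for_internal_links=True)
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(url="https://example.com", config=run_cfg)
        # Internal links should keep their https:// scheme even when the
        # server redirects to http (issue #1410).
        print(result.links.get("internal", []))

asyncio.run(main())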

... (truncated)

Commits
  • a5354f2 Merge branch 'develop' into release/v0.8.0
  • 6090629 Fix: Enable litellm.drop_params for O-series/GPT-5 model compatibility
  • a00da65 Add async agenerate_schema method for schema generation
  • 177e298 Update security researcher acknowledgment with a hyperlink for Neo by Project...
  • f09146c Release v0.8.0: The v0.8.0 Update
  • 315eae9 Add examples for deep crawl crash recovery and prefetch mode in documentation
  • 530cde3 Add release notes for v0.8.0, detailing breaking changes, security fixes, new...
  • 122b4fe Add release notes for v0.7.9, detailing breaking changes, security fixes, new...
  • acfab80 Enhance authentication flow by implementing JWT token retrieval and adding au...
  • f24396c Fix critical RCE and LFI vulnerabilities in Docker API deployment
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

You can disable automated security fix PRs for this repo from the Security Alerts page.

Bumps [crawl4ai](https://github.com/unclecode/crawl4ai) from 0.7.4 to 0.8.0.
- [Release notes](https://github.com/unclecode/crawl4ai/releases)
- [Changelog](https://github.com/unclecode/crawl4ai/blob/main/CHANGELOG.md)
- [Commits](unclecode/crawl4ai@v0.7.4...v0.8.0)

---
updated-dependencies:
- dependency-name: crawl4ai
  dependency-version: 0.8.0
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python:uv (Pull requests that update python:uv code) labels on Jan 18, 2026