This repository was archived by the owner on Nov 10, 2025. It is now read-only.

Fix firecrawl tool (Too many positional arguments)#275

Merged
lucasgomide merged 6 commits intocrewAIInc:mainfrom
benzakritesteur:fix-firecrawl-tool
Apr 28, 2025

Conversation

@benzakritesteur
Contributor

Corrected an error raised when calling all three Firecrawl tools:

  • Firecrawl Crawl
  • Firecrawl Scrape
  • Firecrawl Search

Passing the optional variables as a single dict, instead of as **kwargs, to the firecrawl-py Python package was not working.

Indeed, firecrawl defines its functions along these lines:

def func(url, var_1=None, ..., **kwargs):
    params = {}
    if var_1:
        params['var_1'] = var_1
    params.update(kwargs)

So calling:

options = {
    'var_1': var_1,
    ...
}
firecrawl.func(url, options)

raises a "Too many positional arguments" error, because the whole dict is bound to a single positional slot instead of being unpacked into keyword arguments.
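The failure mode can be reproduced with a stand-in function that mirrors the keyword-only signature pattern described above (`func` and `var_1` are illustrative names, not the real firecrawl-py API):

```python
# Stand-in mirroring the firecrawl-py pattern: one positional url,
# everything else keyword-only, collected into a params dict.
def func(url, *, var_1=None, **kwargs):
    params = {}
    if var_1:
        params["var_1"] = var_1
    params.update(kwargs)
    return params

options = {"var_1": "a", "limit": 100}

# Passing the dict positionally fails: only url is positional, so the
# call raises TypeError ("too many positional arguments").
try:
    func("https://example.com", options)
except TypeError:
    pass  # this is the error the PR fixes

# Unpacking the dict as keyword arguments routes each entry correctly.
result = func("https://example.com", **options)
assert result == {"var_1": "a", "limit": 100}
```

The fix applied in this PR is exactly that second form: unpack the options with `**` rather than handing the dict over positionally.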

Was leading to a "too many arguments" error when calling the crawl_url() function
Corrected to avoid the "too many arguments" error when calling the firecrawl scrape_url() function
Corrected to avoid the "too many arguments" error when calling the firecrawl search() function
Contributor

@lucasgomide lucasgomide left a comment


@benzakritesteur do you mind syncing with main?

Currently we pre-define the available parameters for calling Firecrawl; this commit adds support for receiving any parameter and propagating it
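The "accept any parameter and propagate it" approach can be sketched as a tool that stores a free-form config dict and forwards it as keyword arguments on each call (`CrawlTool` and `FakeClient` are assumed names for illustration, not the actual crewAI implementation):

```python
# Sketch: a tool that holds arbitrary default config and forwards it,
# merged with per-call overrides, as **kwargs to the client.
class CrawlTool:
    def __init__(self, client, config=None):
        self.client = client
        self.config = config or {}

    def run(self, url, **overrides):
        # Per-call overrides take precedence over stored defaults.
        params = {**self.config, **overrides}
        return self.client.crawl_url(url, **params)


class FakeClient:
    """Test double standing in for the firecrawl client."""

    def crawl_url(self, url, **kwargs):
        return {"url": url, **kwargs}


tool = CrawlTool(FakeClient(), config={"limit": 100, "max_depth": 2})
out = tool.run("https://example.com", limit=10)
assert out == {"url": "https://example.com", "limit": 10, "max_depth": 2}
```

Because the dict is unpacked with `**`, any new parameter the Firecrawl API adds passes through without the tool needing to enumerate it.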
Contributor

@lorenzejay lorenzejay left a comment


let's just add the moved arguments as docstrings on the tool class itself 🙏🏼 and we should be gold

Comment on lines +24 to +37
config: Optional[dict[str, Any]] = Field(
    default_factory=lambda: {
        "max_depth": 2,
        "ignore_sitemap": True,
        "limit": 100,
        "allow_backward_links": False,
        "allow_external_links": False,
        "scrape_options": ScrapeOptions(
            formats=["markdown", "screenshot", "links"],
            only_main_content=True,
            timeout=30000,
        ),
    }
)
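The reason for `default_factory` (rather than a plain default dict) is that each instance should get its own fresh config. A self-contained stdlib sketch of the same pattern, using dataclasses in place of Pydantic and a plain dict in place of `ScrapeOptions`:

```python
# Stdlib sketch of the default_factory pattern above (dataclasses
# instead of Pydantic so no third-party dependency is needed).
from dataclasses import dataclass, field
from typing import Any, Optional


@dataclass
class FirecrawlCrawlConfig:
    config: Optional[dict[str, Any]] = field(
        default_factory=lambda: {
            "max_depth": 2,
            "ignore_sitemap": True,
            "limit": 100,
        }
    )


a, b = FirecrawlCrawlConfig(), FirecrawlCrawlConfig()
a.config["limit"] = 5
# default_factory builds a fresh dict per instance, so mutating one
# instance's config does not leak into another's.
assert b.config["limit"] == 100
```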
Contributor


love this

        crawling_options["scrapeOptions"]["timeout"] = timeout

        return self._firecrawl.crawl_url(url, crawling_options)

    def _run(self, url: str):
Contributor


awesome moving to config

@lucasgomide lucasgomide merged commit a819296 into crewAIInc:main Apr 28, 2025
1 check passed
@lucasgomide
Contributor

Closes crewAIInc/crewAI#2697, #226

mplachta pushed a commit to mplachta/crewAI-tools that referenced this pull request Aug 27, 2025
* Corrected to adapt to firecrawl package use

Was leading to a "too many arguments" error when calling the crawl_url() function

* Corrected to adapt to firecrawl package use

Corrected to avoid the "too many arguments" error when calling the firecrawl scrape_url() function

* Corrected to adapt to firecrawl package use

Corrected to avoid the "too many arguments" error when calling the firecrawl search() function

* fix: fix firecrawl integration

* feat: support define Firecrawl using any config

Currently we pre-define the available parameters for calling Firecrawl; this commit adds support for receiving any parameter and propagating it

* docs: added doc string to Firecrawls classes

---------

Co-authored-by: Lucas Gomide <lucaslg200@gmail.com>
