
Commit 7d928fd

docs: switch from pipx to uvx (#1319)

Parent: c245a29
5 files changed: +14 additions, −12 deletions


README.md

Lines changed: 6 additions & 3 deletions
````diff
@@ -66,16 +66,18 @@ For detailed installation instructions see the [Setting up](https://crawlee.dev/
 
 ### With Crawlee CLI
 
-The quickest way to get started with Crawlee is by using the Crawlee CLI and selecting one of the prepared templates. First, ensure you have [Pipx](https://pipx.pypa.io/) installed:
+The quickest way to get started with Crawlee is by using the Crawlee CLI and selecting one of the prepared templates. First, ensure you have [uv](https://pypi.org/project/uv/) installed:
 
 ```sh
-pipx --help
+uv --help
 ```
 
+If [uv](https://pypi.org/project/uv/) is not installed, follow the official [installation guide](https://docs.astral.sh/uv/getting-started/installation/).
+
 Then, run the CLI and choose from the available templates:
 
 ```sh
-pipx run 'crawlee[cli]' create my-crawler
+uvx 'crawlee[cli]' create my-crawler
 ```
 
 If you already have `crawlee` installed, you can spin it up by running:
@@ -124,6 +126,7 @@ async def main() -> None:
     # Run the crawler with the initial list of URLs.
     await crawler.run(['https://crawlee.dev'])
 
+
 if __name__ == '__main__':
     asyncio.run(main())
 ```
````
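The commands in the hunk above differ only in the runner: `uvx` is uv's shorthand for `uv tool run`, which, like `pipx run`, executes a PyPI package's CLI in a throwaway environment without installing it into your project. A minimal sketch of the before/after mapping (the variable names here are illustrative, not part of the commit):

```shell
# Old invocation: pipx downloads crawlee[cli] into a temporary venv and runs it.
old_cmd="pipx run 'crawlee[cli]' create my-crawler"

# New invocation: uvx (alias for `uv tool run`) does the same via uv's cache.
new_cmd="uvx 'crawlee[cli]' create my-crawler"

# Only the runner changes; the package extra and subcommand are identical.
echo "before: $old_cmd"
echo "after:  $new_cmd"
```

The quotes around `crawlee[cli]` are kept in both forms so the shell does not treat `[cli]` as a glob pattern.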

docs/introduction/01_setting_up.mdx

Lines changed: 6 additions & 6 deletions
````diff
@@ -103,20 +103,20 @@ python -m pip install 'crawlee[beautifulsoup,curl-impersonate]'
 
 The quickest way to get started with Crawlee is by using the Crawlee CLI and selecting one of the prepared templates. The CLI helps you set up a new project in seconds.
 
-### Using Crawlee CLI with Pipx
+### Using Crawlee CLI with uv
 
-First, ensure you have Pipx installed. You can check if Pipx is installed by running:
+First, ensure you have [uv](https://pypi.org/project/uv/) installed. You can check if it is installed by running:
 
 ```sh
-pipx --version
+uv --version
 ```
 
-If Pipx is not installed, follow the official [installation guide](https://pipx.pypa.io/stable/installation/).
+If [uv](https://pypi.org/project/uv/) is not installed, follow the official [installation guide](https://docs.astral.sh/uv/getting-started/installation/).
 
-Then, run the Crawlee CLI using Pipx and choose from the available templates:
+Then, run the Crawlee CLI using `uvx` and choose from the available templates:
 
 ```sh
-pipx run 'crawlee[cli]' create my-crawler
+uvx 'crawlee[cli]' create my-crawler
 ```
 
 ### Using Crawlee CLI directly
````

docs/introduction/09_running_in_cloud.mdx

Lines changed: 1 addition & 1 deletion
````diff
@@ -26,7 +26,7 @@ We started this guide by using the Crawlee CLI to bootstrap the project - it off
 
 Before we get started, you'll need to install two new dependencies:
 
-- [**Apify SDK**](https://pypi.org/project/apify/), a toolkit for working with the Apify platform. This will allow us to wire the storages (e.g. [`RequestQueue`](https://docs.apify.com/sdk/python/reference/class/RequestQueue) and [`Dataset`](https://docs.apify.com/sdk/python/reference/class/Dataset)) to the Apify cloud products. The Apify SDK, like Crawlee itself, is available as a PyPI package and can be installed with any Python package manager. To install it using [pip](https://pipx.pypa.io/), run:
+- [**Apify SDK**](https://pypi.org/project/apify/), a toolkit for working with the Apify platform. This will allow us to wire the storages (e.g. [`RequestQueue`](https://docs.apify.com/sdk/python/reference/class/RequestQueue) and [`Dataset`](https://docs.apify.com/sdk/python/reference/class/Dataset)) to the Apify cloud products. The Apify SDK, like Crawlee itself, is available as a PyPI package and can be installed with any Python package manager. To install it using [pip](https://pip.pypa.io/), run:
 
 ```sh
 pip install apify
````

docs/upgrading/upgrading_to_v1.md

Lines changed: 0 additions & 1 deletion
````diff
@@ -31,7 +31,6 @@ HeaderGeneratorOptions(browsers=['chrome'])
 HeaderGeneratorOptions(browsers=['safari'])
 ```
 
-
 ## Storage clients
 
 In v1.0, we are introducing a new storage clients system. We have completely reworked their interface,
````

website/src/components/Homepage/HomepageCliExample.jsx

Lines changed: 1 addition & 1 deletion
````diff
@@ -3,7 +3,7 @@ import React from 'react';
 import CopyButton from '../CopyButton';
 import styles from './HomepageCliExample.module.css';
 
-const cliCommand = `pipx run 'crawlee[cli]' create my-crawler`;
+const cliCommand = `uvx 'crawlee[cli]' create my-crawler`;
 
 export default function CliExample() {
     return (
````
