
Commit d25d5b0

chore: describe
1 parent 03d80b4 commit d25d5b0

File tree: 2 files changed (+72, -2093 lines)

README.md

Lines changed: 16 additions & 7 deletions
@@ -1,5 +1,7 @@
 # x-crawl · [![npm](https://img.shields.io/npm/v/x-crawl.svg)](https://www.npmjs.com/package/x-crawl) [![NPM Downloads](https://img.shields.io/npm/dt/x-crawl)](https://www.npmjs.com/package/x-crawl) [![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/coder-hxl/x-crawl/blob/main/LICENSE)
 
+[English](https://coder-hxl.github.io/x-crawl) | [简体中文](https://coder-hxl.github.io/x-crawl/cn)
+
 x-crawl is a flexible Node.js AI-assisted crawler library. Flexible usage and powerful AI assistance functions make crawler work more efficient, intelligent and convenient.
 
 It consists of two parts:
@@ -24,13 +26,6 @@ AI: Currently based on the large AI model provided by OpenAI, AI simplifies many
 - **🧾 Crawl information** - Controllable crawl information, which will output colored string information in the terminal.
 - **🦾 TypeScript** - Own types and implement complete types through generics.
 
-## Document
-
-V9:
-
-- English document: https://github.com/coder-hxl/x-crawl/blob/v9.0.0/README.md
-- Chinese document: https://github.com/coder-hxl/x-crawl/blob/v9.0.0/docs/cn.md
-
 ## Example
 
 The combination of crawler and AI allows the crawler and AI to obtain pictures of high-rated vacation rentals according to our instructions:
@@ -79,4 +74,18 @@ Pictures of highly rated vacation rentals climbed to:
 
 ![](https://raw.githubusercontent.com/coder-hxl/x-crawl/main/assets/example.png)
 
+**Want to know more?**
+
+https://coder-hxl.github.io/x-crawl/guide/#example
+
 **warning**: x-crawl is for legal use only. Any illegal activity using this tool is prohibited. Please be sure to comply with the robots.txt file regulations of the target website. This example is only used to demonstrate the use of x-crawl and is not targeted at a specific website.
+
+## Document
+
+x-crawl latest version documentation:
+
+[English](https://coder-hxl.github.io/x-crawl) | [简体中文](https://coder-hxl.github.io/x-crawl/cn)
+
+x-crawl v9 documentation:
+
+[English](https://github.com/coder-hxl/x-crawl/blob/v9.0.0/README.md) | [简体中文](https://github.com/coder-hxl/x-crawl/blob/v9.0.0/docs/cn.md)
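
For context on what the linked example does, here is a minimal sketch of the crawler-plus-AI pattern the README describes. It assumes the v10-style API (`createCrawl`, `createCrawlOpenAI`, `crawlPage`, `parseElements`, `crawlFile`); the target URL, the prompt, the `storeDirs` option, and the shape of the AI result are assumptions to verify against the linked documentation, not the project's exact sample.

```ts
import { createCrawl, createCrawlOpenAI } from 'x-crawl'

// Crawler instance: retry failed requests and space them 1–2 s apart
const crawlApp = createCrawl({ maxRetry: 3, intervalTime: { max: 2000, min: 1000 } })

// AI instance backed by OpenAI (key and model name are placeholders for this sketch)
const crawlOpenAIApp = createCrawlOpenAI({
  clientOptions: { apiKey: process.env.OPENAI_API_KEY! },
  defaultModel: { chatModel: 'gpt-4-turbo-preview' }
})

// Crawl a listings page (hypothetical URL), let the AI pick out the image links,
// then download the files it found.
crawlApp.crawlPage('https://example.com/vacation-rentals').then(async (res) => {
  const { page, browser } = res.data
  const html = await page.content()
  await browser.close()

  const result = await crawlOpenAIApp.parseElements(
    html,
    'Get the image links of the highly rated vacation rentals'
  )

  // Assumed result shape: { elements: [{ src: string, ... }] }
  await crawlApp.crawlFile({
    targets: result.elements.map((item) => item.src),
    storeDirs: './upload'
  })
})
```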

0 commit comments
