
Commit 180e0d9

committed
update readme
1 parent 0d1b435 commit 180e0d9

File tree

18 files changed

+36
-2
lines changed


docs/AutoGen Core/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: AutoGen Core
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 AutoGen Core helps you build applications with multiple **_Agents_** that can work together.
 Think of it like creating a team of specialized workers (*Agents*) who can communicate and use tools to solve problems.
 The **_AgentRuntime_** acts as the manager, handling messages and agent lifecycles.

docs/Browser Use/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: Browser Use
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 **Browser Use** is a project that allows an *AI agent* to control a web browser and perform tasks automatically.
 Think of it like an AI assistant that can browse websites, fill forms, click buttons, and extract information based on your instructions. It uses a Large Language Model (LLM) as its "brain" to decide what actions to take on a webpage to complete a given *task*. The project manages the browser session, understands the page structure (DOM), and communicates back and forth with the LLM.

docs/Celery/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: Celery
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 Celery is a system for running **distributed tasks** *asynchronously*. You define *units of work* (Tasks) in your Python code. When you want a task to run, you send a message using a **message broker** (like RabbitMQ or Redis). One or more **Worker** processes are running in the background, listening for these messages. When a worker receives a message, it executes the corresponding task. Optionally, the task's result (or any error) can be stored in a **Result Backend** (like Redis or a database) so you can check its status or retrieve the output later. Celery helps manage this whole process, making it easier to handle background jobs, scheduled tasks, and complex workflows.

docs/Click/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: Click
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 Click is a Python library that makes creating **command-line interfaces (CLIs)** *easy and fun*.
 It uses simple Python **decorators** (`@click.command`, `@click.option`, etc.) to turn your functions into CLI commands with options and arguments.
 Click handles parsing user input, generating help messages, validating data types, and managing the flow between commands, letting you focus on your application's logic.
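The decorator pattern the Click description mentions can be shown in a few lines. The `hello` command and its `--name` option are invented for illustration; Click's own `CliRunner` lets us invoke the command in-process without a shell.

```python
import click
from click.testing import CliRunner

@click.command()
@click.option("--name", default="World", help="Who to greet.")
def hello(name):
    """Greet NAME."""
    # Click has already parsed and type-checked the option by the time we run.
    click.echo(f"Hello, {name}!")

runner = CliRunner()
result = runner.invoke(hello, ["--name", "Ada"])
print(result.output)  # Hello, Ada!
```

Invoking with `--help` instead would print the auto-generated usage text, including the `help=` string above.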

docs/Crawl4AI/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: Crawl4AI
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 `Crawl4AI` is a flexible Python library for *asynchronously crawling websites* and *extracting structured content*, specifically designed for **AI use cases**.
 You primarily interact with the `AsyncWebCrawler`, which acts as the main coordinator. You provide it with URLs and a `CrawlerRunConfig` detailing *how* to crawl (e.g., using specific strategies for fetching, scraping, filtering, and extraction).
 It can handle single pages or multiple URLs concurrently using a `BaseDispatcher`, optionally crawl deeper by following links via `DeepCrawlStrategy`, manage `CacheMode`, and apply `RelevantContentFilter` before finally returning a `CrawlResult` containing all the gathered data.

docs/CrewAI/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: CrewAI
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 **CrewAI** is a framework for orchestrating *autonomous AI agents*.
 Think of it like building a specialized team (a **Crew**) where each member (**Agent**) has a role, goal, and tools.
 You assign **Tasks** to Agents, defining what needs to be done. The **Crew** manages how these Agents collaborate, following a specific **Process** (like sequential steps).

docs/DSPy/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: DSPy
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 DSPy helps you build and optimize *programs* that use **Language Models (LMs)** and **Retrieval Models (RMs)**.
 Think of it like composing Lego bricks (**Modules**) where each brick performs a specific task (like generating text or retrieving information).
 **Signatures** define what each Module does (its inputs and outputs), and **Teleprompters** automatically tune these modules (like optimizing prompts or examples) to get the best performance on your data.

docs/FastAPI/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: FastAPI
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 FastAPI is a modern, *high-performance* web framework for building APIs with Python.
 It's designed to be **easy to use**, fast to code, and ready for production.
 Key features include **automatic data validation** (using Pydantic), **dependency injection**, and **automatic interactive API documentation** (OpenAPI and Swagger UI).

docs/Flask/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: Flask
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 Flask is a lightweight **web framework** for Python.
 It helps you build web applications by handling incoming *web requests* and sending back *responses*.
 Flask provides tools for **routing** URLs to your Python functions, managing *request data*, creating *responses*, and using *templates* to generate HTML.
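The URL-to-function routing described above fits in a few lines. The `/hello/<name>` route is invented for illustration; Flask's built-in test client drives the request/response cycle without running a server.

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/hello/<name>")
def hello(name):
    # Routing maps the URL to this function; the <name> segment arrives
    # as a plain keyword argument.
    return jsonify(greeting=f"Hello, {name}!")

client = app.test_client()
resp = client.get("/hello/Ada")
print(resp.get_json())  # {'greeting': 'Hello, Ada!'}
```

A real app would typically render a template here instead of returning JSON, but the routing mechanics are identical.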

docs/LangGraph/index.md

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,8 @@ has_children: true
 
 # Tutorial: LangGraph
 
+> This tutorial is AI-generated! To learn more: https://github.com/The-Pocket/Tutorial-Codebase-Knowledge
+
 LangGraph helps you build complex **stateful applications**, like chatbots or agents, using a *graph-based approach*.
 You define your application's logic as a series of steps (**Nodes**) connected by transitions (**Edges**) in a **Graph**.
 The system manages the application's *shared state* using **Channels** and executes the graph step-by-step with its **Pregel engine**, handling things like branching, interruptions, and saving progress (**Checkpointing**).
