diff --git a/README.md b/README.md
index 676bd9aa..3cadfd55 100644
--- a/README.md
+++ b/README.md
@@ -14,14 +14,23 @@ Clarifai Python SDK
 PyPI - Downloads
-PyPI - Versions
+PyPI - Python Versions

-This is the official Python client for interacting with our powerful [API](https://docs.clarifai.com). The Clarifai Python SDK offers a comprehensive set of tools to integrate Clarifai's AI platform to leverage computer vision capabilities like classification , detection ,segementation and natural language capabilities like classification , summarisation , generation , Q&A ,etc into your applications. With just a few lines of code, you can leverage cutting-edge artificial intelligence to unlock valuable insights from visual and textual content.
+This is the official Python client for interacting with our powerful [API](https://docs.clarifai.com). The Clarifai Python SDK offers a comprehensive set of tools for integrating Clarifai's AI platform into your applications, covering computer vision capabilities such as classification, detection, and segmentation, and natural language capabilities such as classification, summarization, generation, and Q&A. With just a few lines of code, you can leverage cutting-edge artificial intelligence to unlock valuable insights from visual and textual content.
+
+**Key Features:**
+- 🚀 **Model Development & Deployment**: Complete model lifecycle management with local testing, upload, and deployment
+- 🔧 **CLI Tool**: Powerful command-line interface for model operations, compute orchestration, and pipeline management
+- 🌐 **Compute Orchestration**: Streamlined infrastructure management for training, deploying, and scaling ML models
+- 🧠 **Multi-Modal AI**: Support for text, image, video, and audio processing with state-of-the-art models
+- 🔍 **Smart Search**: Vector-based search capabilities for visual and semantic similarity
+- 📊 **RAG (Retrieval Augmented Generation)**: Built-in support for document-based AI applications
+- 🔌 **Local Development**: Local model runners with support for vLLM, Hugging Face, LMStudio, and Ollama
 
 [Website](https://www.clarifai.com/) | [Schedule Demo](https://www.clarifai.com/company/schedule-demo) | [Signup for a Free Account](https://clarifai.com/signup) | [API Docs](https://docs.clarifai.com/) | [Clarifai Community](https://clarifai.com/explore) | [Python SDK Docs](https://docs.clarifai.com/resources/api-references/python) | [Examples](https://github.com/Clarifai/examples) | [Colab Notebooks](https://github.com/Clarifai/colab-notebooks) | [Discord](https://discord.gg/XAPE3Vtg)
@@ -34,10 +43,15 @@ Give the repo a star ⭐
 * **[Installation](#rocket-installation)**
 * **[Getting Started](#memo-getting-started)**
+* **[CLI Tool](#hammer_and_wrench-cli-tool)**
+  * [Model Operations](#model-operations)
+  * [Pipeline Operations](#pipeline-operations)
+  * [Local Development](#local-development)
 * **[Compute Orchestration](#rocket-compute-orchestration)**
   * [Cluster Operations](#cluster-operations)
   * [Nodepool Operations](#nodepool-operations)
-  * [Depolyment Operations](#deployment-operations)
+  * [Deployment Operations](#deployment-operations)
+* **[Model Upload & Development](#open_file_folder-model-upload--development)**
 * **[Interacting with Datasets](#floppy_disk-interacting-with-datasets)**
 * **[Interacting with Inputs](#floppy_disk-interacting-with-inputs)**
   * [Input Upload](#input-upload)
@@ -67,14 +81,15 @@ Give the repo a star ⭐
 ## :rocket: Installation
 
+**Requirements:** Python 3.9+ (supports 3.9, 3.10, 3.11, 3.12)
 
-Install from PyPi:
+### Install from PyPI (Recommended)
 
 ```bash
 pip install -U clarifai
 ```
 
-Install from Source:
+### Install from Source
 
 ```bash
 git clone https://github.com/Clarifai/clarifai-python.git
@@ -84,21 +99,33 @@
 source .venv/bin/activate
 pip install -e .
 ```
 
-#### Linting
+### Development Setup
 
-For developers, use the precommit hook `.pre-commit-config.yaml` to automate linting.
+For developers contributing to the project:
 
 ```bash
+# Install development dependencies
 pip install -r requirements-dev.txt
+
+# Install pre-commit hooks for automatic linting
 pre-commit install
-```
-Now every time you run `git commit` your code will be automatically linted and won't commit if it fails.
+
+# Manual linting (uses ruff)
+ruff check . --fix
+ruff format .
+
+# Run pre-commit on all files
+pre-commit run --all-files
+```
 
-You can also manually trigger linting using:
+### Verify Installation
 
 ```bash
-pre-commit run --all-files
+# Check installation
+python -c "import clarifai; print(f'Version: {clarifai.__version__}')"
+
+# Test CLI tool
+clarifai --help
 ```
@@ -136,6 +163,93 @@ from clarifai.client.user import User
 client = User(user_id="user_id", pat="your personal access token")
 ```
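+
+As a quick sanity check, the same client can list the apps you already own. A minimal sketch; it only assumes the `User` client shown above and its `list_apps()` generator:
+
+```python
+from clarifai.client.user import User
+
+client = User(user_id="user_id", pat="your personal access token")
+
+# list_apps() is a generator, so wrap it in list() to count the results.
+apps = list(client.list_apps())
+print(f"Found {len(apps)} apps for this user")
+```
+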
+## :hammer_and_wrench: CLI Tool
+
+The Clarifai CLI provides powerful commands for model development, compute orchestration, and pipeline management. All CLI operations work seamlessly with both local development and cloud deployment.
+
+### Model Operations
+
+#### Initialize a New Model
+```bash
+# Create a new model project structure
+clarifai model init my-model
+# Creates model.py, config.yaml, requirements.txt, and Dockerfile
+```
+
+#### Local Model Development & Testing
+```bash
+# Run model locally for development and debugging
+clarifai model local-runner
+
+# Run model with local gRPC server
+clarifai model local-grpc
+
+# Execute model unit tests
+clarifai model local-test
+
+# Generate model method signatures
+clarifai model signatures
+```
+
+#### Model Upload & Deployment
+```bash
+# Upload a trained model to Clarifai
+clarifai model upload
+
+# Download model checkpoints
+clarifai model download-checkpoints
+
+# List available models
+clarifai model list
+```
+
+### Pipeline Operations
+
+#### Initialize Pipeline Project
+```bash
+# Create new pipeline with interactive prompts
+clarifai pipeline init my-pipeline
+# Creates config.yaml, step directories, and documentation
+```
+
+#### Pipeline Management
+```bash
+# Upload pipeline to Clarifai
+clarifai pipeline upload
+
+# List all pipelines
+clarifai pipeline list
+
+# Run pipeline and monitor progress
+clarifai pipeline run
+
+# Validate pipeline configuration
+clarifai pipeline validate-lock
+```
+
+### Local Development
+
+The CLI supports multiple AI toolkits for local model development:
+
+- **vLLM**: High-performance LLM inference
+- **Hugging Face**: Extensive model library integration
+- **LMStudio**: Local language model management
+- **Ollama**: Local LLM deployment
+
+### Compute Orchestration CLI
+```bash
+# Manage compute clusters
+clarifai computecluster create --config cluster-config.yaml
+clarifai computecluster list
+
+# Manage nodepools
+clarifai nodepool create --config nodepool-config.yaml
+clarifai nodepool list
+
+# Manage deployments
+clarifai deployment create --config deployment-config.yaml
+clarifai deployment list
+```
 
 ## :rocket: Compute Orchestration
 
@@ -190,7 +304,7 @@ nodepool = Nodepool(user_id="user_id",nodepool_id="demo-nodepool-id")
 deployment = nodepool.create_deployment(deployment_id="demo-deployment-id",config_filepath="deployment_config.yaml")
 
 #Get a deployment
-deployment = nodepool.deployment(nodepool_id="demo-deployment-id")
+deployment = nodepool.deployment(deployment_id="demo-deployment-id")
 print(deployment)
 
 # List deployments
@@ -200,9 +314,54 @@
 print(all_deployments)
 ```
 
 ##### [Example Deployment config](https://github.com/Clarifai/examples/blob/main/ComputeOrchestration/configs/deployment_config.yaml)
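+
+Deployments can also be inspected and cleaned up from Python. A minimal sketch; it assumes the `list_deployments()` and `delete_deployments()` helpers on the same `Nodepool` client, so check the SDK reference for the exact signatures:
+
+```python
+from clarifai.client.nodepool import Nodepool
+
+nodepool = Nodepool(user_id="user_id", nodepool_id="demo-nodepool-id")
+
+# Inspect what is currently deployed in this nodepool.
+for deployment in nodepool.list_deployments():
+    print(deployment)
+
+# Tear a deployment down once it is no longer needed (assumed helper).
+nodepool.delete_deployments(deployment_ids=["demo-deployment-id"])
+```
+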
-#### Compute Orchestration CLI Operations
-Refer Here: https://github.com/Clarifai/clarifai-python/tree/master/clarifai/cli
+#### Recent Enhancements
+- **Health Probes**: Models can define custom liveness/readiness checks for better reliability
+- **Secrets Management**: CRUD operations for secrets with environment variable fallback
+- **Git Integration**: Automatic metadata capture during model uploads with change detection
+- **Enhanced Monitoring**: Improved logging and diagnostics across all compute operations
+
+## :open_file_folder: Model Upload & Development
+
+### Quick Start with CLI
+```bash
+# Initialize a new model project
+clarifai model init my-custom-model
+
+# This creates:
+# ├── config.yaml        # Model configuration
+# ├── requirements.txt   # Dependencies
+# ├── 1/
+# │   └── model.py       # Model implementation
+# └── Dockerfile         # Container configuration
+```
+
+### Local Model Development
+```bash
+# Test your model locally
+clarifai model local-test
+
+# Run interactive development server
+clarifai model local-runner
+
+# Upload when ready
+clarifai model upload
+```
+
+### Supported Toolkits
+- **vLLM**: For high-performance LLM inference
+- **Hugging Face**: Extensive model library integration
+- **LMStudio**: Local language model management
+- **Ollama**: Local LLM deployment
+
+### Advanced Features
+- **Health Probes**: Define custom liveness/readiness checks
+- **Secrets Management**: Secure handling of API keys and credentials
+- **Git Integration**: Automatic versioning with change detection
+- **Multi-Modal Support**: Text, image, video, and audio processing
+
+For detailed examples and tutorials, visit:
+- **[Model Examples Repository](https://github.com/Clarifai/runners-examples)**
+- **[Official Documentation](https://docs.clarifai.com/compute/models/upload)**
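+
+Once the upload finishes, the model can be called from Python like any other Clarifai model. A minimal sketch, assuming a text-in/text-out model, a placeholder model URL, and `CLARIFAI_PAT` set in your environment:
+
+```python
+from clarifai.client.model import Model
+
+# Placeholder URL; point it at the model you just uploaded.
+model = Model(url="https://clarifai.com/user_id/app_id/models/my-custom-model")
+
+prediction = model.predict_by_bytes(b"Write a haiku about Clarifai.", input_type="text")
+print(prediction.outputs[0].data.text.raw)
+```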
 
 ## :floppy_disk: Interacting with Datasets
 
@@ -554,30 +713,42 @@ for data in results:
 
 ## Retrieval Augmented Generation (RAG)
 
-You can setup and start your RAG pipeline in 4 lines of code. The setup method automatically creates a new app and the necessary components under the hood. By default it uses the [mistral-7B-Instruct](https://clarifai.com/mistralai/completion/models/mistral-7B-Instruct) model.
+Build powerful document-based AI applications with just a few lines of code. The RAG functionality automatically creates the necessary infrastructure and uses state-of-the-art language models.
+
+### Quick Setup
 ```python
 from clarifai.rag import RAG
+
+# Setup RAG pipeline (creates app and components automatically)
 rag_agent = RAG.setup(user_id="USER_ID")
+
+# Upload your documents
 rag_agent.upload(folder_path="~/docs")
-rag_agent.chat(messages=[{"role":"human", "content":"What is Clarifai"}])
-```
-If you have previously run the setup method, you can instantiate the RAG class with the prompter workflow URL:
+
+# Start chatting with your documents
+response = rag_agent.chat(messages=[{"role":"human", "content":"What is Clarifai?"}])
+print(response)
+```
 
+### Using Existing RAG Setup
 ```python
 from clarifai.rag import RAG
+
+# Connect to previously created RAG workflow
 rag_agent = RAG(workflow_url="WORKFLOW_URL")
+
+# Continue chatting
+rag_agent.chat(messages=[{"role":"human", "content":"Summarize the main features"}])
 ```
 
+### Key Features
+- **Automatic Setup**: Creates optimized workflows and infrastructure
+- **Multi-Format Support**: PDFs, text files, web pages, and more
+- **Advanced Models**: Uses the latest language models like Mistral-7B-Instruct by default
+- **Custom Workflows**: Support for existing apps and custom workflow configurations
+- **Flexible Prompting**: Customizable prompt templates and parameters
+
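+These defaults can be overridden at setup time. A minimal sketch; the `llm_url` argument is assumed here to select the prompter's LLM, so check the SDK reference for the full parameter list:
+
+```python
+from clarifai.rag import RAG
+
+# Assumed parameter: llm_url picks the LLM backing the prompter workflow.
+rag_agent = RAG.setup(
+    user_id="USER_ID",
+    llm_url="https://clarifai.com/mistralai/completion/models/mistral-7B-Instruct",
+)
+rag_agent.upload(folder_path="~/docs")
+print(rag_agent.chat(messages=[{"role": "human", "content": "Give me a short summary."}]))
+```
+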
 ## :pushpin: More Examples
 
 See many more code examples in this [repo](https://github.com/Clarifai/examples).
 Also see the official [Python SDK docs](https://clarifai-python.readthedocs.io/en/latest/index.html)
-
-## :open_file_folder: Model Upload
-
-Examples for uploading models and runners have been moved to this [repo](https://github.com/Clarifai/runners-examples).
-Find our official documentation at [docs.clarifai.com/compute/models/upload](https://docs.clarifai.com/compute/models/upload).
diff --git a/clarifai/cli/README.md b/clarifai/cli/README.md
index bcf33208..04e790a2 100644
--- a/clarifai/cli/README.md
+++ b/clarifai/cli/README.md
@@ -2,176 +2,312 @@
 
 ## Overview
 
-Clarifai offers a user-friendly interface for deploying your local model into production with Clarifai, featuring:
+The Clarifai CLI provides powerful commands for the complete machine learning lifecycle, from model development to production deployment. Key features include:
 
-* A convenient command-line interface (CLI)
-* Easy implementation and testing in Python
-* No need for MLops expertise.
+* **Model Development & Testing**: Initialize, develop, and test models locally
+* **Pipeline Management**: Create and manage complex ML workflows
+* **Compute Orchestration**: Deploy and scale models on cloud infrastructure
+* **Local Development**: Support for multiple AI toolkits (vLLM, Hugging Face, LMStudio, Ollama)
+* **Context Management**: Manage multiple authentication profiles and environments
+* **No MLOps Expertise Required**: Streamlined workflows for rapid development and deployment
 
-## Context Management
+## Quick Start
 
-Manage CLI contexts for authentication and environment configuration:
-### List all contexts
+### Installation & Setup
 
 ```bash
-clarifai config get-contexts
+# Install from PyPI
+pip install -U clarifai
+
+# Verify installation
+clarifai --version
+
+# Login and configure
+clarifai login
 ```
 
-### Switch context
+## Model Operations
+
+### Initialize New Model Project
 
 ```bash
-clarifai config use-context production
+# Create a new model project structure
+clarifai model init my-model
+
+# This creates:
+# ├── config.yaml        # Model configuration
+# ├── requirements.txt   # Dependencies
+# ├── 1/
+# │   └── model.py       # Model implementation
+# └── Dockerfile         # Container configuration
 ```
 
-### Show current context
+### Local Development & Testing
 
 ```bash
-clarifai config current-context
+# Run model locally for development and debugging
+clarifai model local-runner
+
+# Run model with local gRPC server
+clarifai model local-grpc
+
+# Execute model unit tests
+clarifai model local-test
+
+# Generate model method signatures
+clarifai model signatures
 ```
 
-### Create new context
+### Model Upload & Deployment
 
 ```bash
-clarifai config create-context staging --user-id myuser --pat 678***
+# Upload a trained model to Clarifai
+clarifai model upload
+
+# Download model checkpoints
+clarifai model download-checkpoints
+
+# List available models
+clarifai model list
+
+# Make predictions
+clarifai model predict --model-url <model_url> --input <input>
 ```
 
-### View entire configuration
+## Pipeline Operations
+
+### Initialize Pipeline Project
 
 ```bash
-clarifai config view
+# Create new pipeline with interactive prompts
+clarifai pipeline init my-pipeline
+
+# This creates a complete pipeline structure:
+# ├── config.yaml              # Pipeline configuration
+# ├── stepA/                   # First pipeline step
+# │   ├── config.yaml          # Step A configuration
+# │   ├── requirements.txt     # Step A dependencies
+# │   └── 1/
+# │       └── pipeline_step.py # Step A implementation
+# ├── stepB/                   # Second pipeline step
+# └── README.md                # Documentation
 ```
 
-### Delete a context
+### Pipeline Management
 
 ```bash
-clarifai config delete-context old-context
+# Upload pipeline to Clarifai
+clarifai pipeline upload
+
+# List all pipelines
+clarifai pipeline list
+
+# Run pipeline and monitor progress
+clarifai pipeline run
+
+# Validate pipeline configuration
+clarifai pipeline validate-lock
```
 
-### Edit configuration file
+## Context Management
+
+Manage CLI contexts for authentication and environment configuration:
+
+### Basic Context Operations
 
 ```bash
-clarifai config edit
+# List all contexts
+clarifai config get-contexts
+
+# Switch context
+clarifai config use-context production
+
+# Show current context
+clarifai config current-context
+
+# Create new context
+clarifai config create-context staging --user-id myuser --pat 678***
 ```
 
-### Print environment variables for the active context
+### Configuration Management
 
 ```bash
-clarifai context env
+# View entire configuration
+clarifai config view
+
+# Delete a context
+clarifai config delete-context old-context
+
+# Edit configuration file
+clarifai config edit
+
+# Print environment variables for the active context
+clarifai config env
 ```
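+
+The variables printed by `clarifai config env` are the same ones the Python SDK reads, so a CLI context can be reused from scripts. A minimal sketch; the variable names below (`CLARIFAI_USER_ID`, `CLARIFAI_PAT`) are the conventional ones, so check the command's output for the exact set:
+
+```python
+import os
+
+from clarifai.client.user import User
+
+# Pick up credentials exported from the active CLI context.
+client = User(user_id=os.environ["CLARIFAI_USER_ID"], pat=os.environ["CLARIFAI_PAT"])
+```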
 
 ## Compute Orchestration
 
-Quick example for deploying a `visual-classifier` model
+Streamlined infrastructure management for training, deploying, and scaling ML models with automatic scaling and cross-provider support.
 
-### Login
+### Quick Deployment Example
 
-First, login to cli using clarifai account details in a config file as shown below:
+Complete workflow for deploying a model:
 
 ```bash
-$ clarifai login --config <config-filepath>
-```
+# 1. Login to Clarifai
+clarifai login
 
-### Setup
+# 2. Create compute cluster
+clarifai computecluster create --config cluster-config.yaml
 
-To prepare for deployment step, we have to setup a Compute Cluster with Nodepool of required server config to deploy the model.
+# 3. Create nodepool
+clarifai nodepool create --config nodepool-config.yaml
 
-So, First, create a new Compute Cluster
 ```bash
-$ clarifai computecluster create --config <config-path>
+# 4. Deploy model
+clarifai deployment create --config deployment-config.yaml
 ```
 
-Then, create a new Nodepool in the created Compute Cluster
+### Compute Cluster Management
 ```bash
-$ clarifai nodepool create --config <config-path>
-```
+# Create cluster
+clarifai computecluster create --config <config-path>
 
-### Deployment
+# List clusters
+clarifai computecluster list
 
-After setup, we can deploy the `visual-classifier` model using a deployment config file as shown below:
+# Delete cluster
+clarifai computecluster delete --compute_cluster_id <compute_cluster_id>
 ```
 
+### Nodepool Management
 ```bash
-$ clarifai deployment create --config <config-path>
-```
+# Create nodepool
+clarifai nodepool create --config <config-path>
 
-### List Resources
+# List nodepools
+clarifai nodepool list --compute_cluster_id <compute_cluster_id>
 
-List out existing Compute Clusters:
+# Delete nodepool
+clarifai nodepool delete --compute_cluster_id <compute_cluster_id> --nodepool_id <nodepool_id>
+```
 
+### Deployment Management
 ```bash
-$ clarifai computecluster list
-```
+# Create deployment
+clarifai deployment create --config <config-path>
 
-List out existing Nodepools:
+# List deployments
+clarifai deployment list --nodepool_id <nodepool_id>
 
-```bash
-$ clarifai nodepool list --compute_cluster_id <compute_cluster_id>
+# Delete deployment
+clarifai deployment delete --nodepool_id <nodepool_id> --deployment_id <deployment_id>
 ```
 
-List out existing Deployments:
+### Advanced Features
+- **Health Probes**: Automatic liveness/readiness checks for deployed models
+- **Secrets Management**: Secure handling of API keys and credentials
+- **Git Integration**: Automatic metadata capture during deployments
+- **Enhanced Monitoring**: Improved logging and diagnostics
+
+## Pipeline Steps Management
 
+### List Pipeline Steps
 ```bash
-$ clarifai deployment list --nodepool_id <nodepool_id>
-```
+# List all pipeline steps
+clarifai pipeline-step list
 
-### Delete Resources
+# List steps in specific app
+clarifai pipeline-step list --app_id <app_id>
 
-Delete existing Deployment:
+# List steps for specific pipeline
+clarifai pipeline-step list --app_id <app_id> --pipeline_id <pipeline_id>
 
-```bash
-$ clarifai deployment delete --nodepool_id <nodepool_id> --deployment_id <deployment_id>
+# Using alias
+clarifai ps ls
 ```
 
-Delete existing Nodepool:
-
+### Pipeline Step Operations
 ```bash
-$ clarifai nodepool delete --compute_cluster_id <compute_cluster_id> --nodepool_id <nodepool_id>
-```
+# Upload pipeline step
+clarifai pipeline-step upload
 
-Delete existing Compute Clusters:
+# Test pipeline step
+clarifai pipeline-step test
 
-```bash
-$ clarifai computecluster delete --compute_cluster_id <compute_cluster_id>
+# List with pagination
+clarifai pipeline-step list --page_no 1 --per_page 10
 ```
 
-## Pipelines
+## Local Development Toolkits
 
-### List Pipelines
+The CLI supports multiple AI toolkits for local model development:
 
-List all pipelines for the user across all apps:
+### Supported Toolkits
+- **vLLM**: High-performance LLM inference
+- **Hugging Face**: Extensive model library integration
+- **LMStudio**: Local language model management
+- **Ollama**: Local LLM deployment
 
+### Local Development Workflow
 ```bash
-$ clarifai pipeline list
-```
+# Initialize with specific toolkit
+clarifai model init my-llm-model
 
-List pipelines within a specific app:
+# Develop and test locally
+clarifai model local-runner
 
-```bash
-$ clarifai pipeline list --app_id <app_id>
+# Upload when ready
+clarifai model upload
 ```
 
-List with pagination:
+## Additional Commands
 
+### Shell Completion
 ```bash
-$ clarifai pipeline list --page_no 1 --per_page 10
-```
+# Generate shell completion script
+clarifai shell-completion
 
-### List Pipeline Steps
+# For bash
+eval "$(clarifai shell-completion bash)"
 
-List all pipeline steps for the user across all apps:
+# For zsh
+eval "$(clarifai shell-completion zsh)"
+```
 
+### Context Execution
 ```bash
-$ clarifai pipelinestep list
+# Execute script with current context environment
+clarifai run my-script.py
 ```
 
-List pipeline steps within a specific app:
-
+### Help & Information
 ```bash
-$ clarifai pipelinestep list --app_id <app_id>
+# Get help for any command
+clarifai --help
+clarifai model --help
+clarifai pipeline --help
+
+# Check version
+clarifai --version
 ```
 
-List pipeline steps for a specific pipeline:
+## Examples & Resources
 
-```bash
-$ clarifai pipelinestep list --app_id <app_id> --pipeline_id <pipeline_id>
+### Configuration Examples
+- **[Compute Orchestration Configs](https://github.com/Clarifai/examples/tree/main/ComputeOrchestration/configs)**
+- **[Model Examples Repository](https://github.com/Clarifai/runners-examples)**
 
-### Aliases
+### Documentation
+- **[Official CLI Documentation](https://docs.clarifai.com/cli/)**
+- **[Python SDK Documentation](https://docs.clarifai.com/resources/api-references/python)**
+- **[Model Upload Guide](https://docs.clarifai.com/compute/models/upload)**
+- **[Compute Orchestration Guide](https://docs.clarifai.com/compute/compute-orchestration/)**
 
-Both commands support the `ls` alias for convenience:
+### Community & Support
+- **[Clarifai Community](https://clarifai.com/explore)**
+- **[Discord Community](https://discord.gg/XAPE3Vtg)**
+- **[GitHub Issues](https://github.com/Clarifai/clarifai-python/issues)**
 
-```bash
-$ clarifai pipeline ls
-$ clarifai pipelinestep ls
-```
+## Getting Help
 
-## Learn More
+For command-specific help, use the `--help` flag:
 
-* [Example Configs](https://github.com/Clarifai/examples/tree/main/ComputeOrchestration/configs)
+```bash
+clarifai --help                 # General help
+clarifai model --help           # Model commands
+clarifai pipeline --help        # Pipeline commands
+clarifai computecluster --help  # Compute cluster commands
+clarifai config --help          # Configuration commands
+```