diff --git a/.cursor/rules/ai.mdc b/.cursor/rules/ai.mdc
new file mode 100644
index 00000000..7db78dde
--- /dev/null
+++ b/.cursor/rules/ai.mdc
@@ -0,0 +1,134 @@
+---
+description:
+
+alwaysApply: true
+---
+# AI-powered Component Creation
+
+Use Claude or Cursor AI agents to generate WAVS components with minimal prompts. Components created by AI require thorough review and testing before production use.
+
+## Setup
+
+1. Clone the WAVS Foundry Template and complete system setup:
+
+```sh
+git clone https://github.com/Lay3rLabs/wavs-foundry-template.git
+cd wavs-foundry-template
+# Follow README system setup instructions
+```
+
+2. Install and configure Claude Code ([Claude docs](mdc:https:/docs.anthropic.com/en/docs/claude-code/getting-started)) or download Cursor ([Cursor downloads](mdc:https:/www.cursor.com/downloads)).
+
+3. Open Claude or Cursor in the template root:
+
+```sh
+claude
+# or
+cursor .
+```
+
+4. For Cursor, always attach the `component-rules.mdc` file to the chat prompt:
+
+```sh Chat
+@component-rules.mdc
+```
+
+## Prompting AI Agents
+
+- Use short, clear instructions.
+- Provide relevant docs or `.md` files.
+- Include API endpoints and response structures if needed.
+- Be specific about the component functionality.
+- Examples:
+
+API component:
+
+```
+Let's make a component that takes the input of a zip code, queries the openbrewerydb, and returns the breweries in the area. @https://api.openbrewerydb.org/v1/breweries?by_postal=92101&per_page=3
+```
+
+Contract balance component:
+
+```
+I want to build a new component that takes the input of a wallet address, queries the usdt contract, and returns the balance of that address.
+```
+
+Verifiable AI component (requires OpenAI API key in `.env`):
+
+```
+Please make a component that takes a prompt as input, sends an api request to OpenAI, and returns the response.
+
+Use this api structure:
+{
+  "seed": $SEED,
+  "model": "gpt-4o",
+  "messages": [
+    {"role": "system", "content": "You are a helpful assistant."},
+    {"role": "user", "content": ""}
+  ]
+}
+My api key is WAVS_ENV_OPENAI_KEY in my .env file.
+```
+
+Set your API key in `.env`:
+
+```sh
+cp .env.example .env
+# Add your key prefixed with WAVS_ENV_
+WAVS_ENV_OPENAI_KEY=your_api_key
+```
+
+## Component Creation Workflow
+
+1. Submit prompt to AI agent.
+
+2. Review the agent's plan in `plan.md`.
+
+3. Agent creates component files.
+
+4. Validate the component:
+
+```sh
+make validate-component COMPONENT=your-component
+```
+
+5. Build the component:
+
+```sh
+WASI_BUILD_DIR=components/my-component make wasi-build
+```
+
+6. Test component logic (replace placeholders):
+
+```sh
+export COMPONENT_FILENAME=openai_response.wasm
+export INPUT_DATA="Only respond with yes or no: Is AI beneficial to the world?"
+make wasi-exec
+```
+
+- Ask the agent to provide the `make wasi-exec` command; it cannot run commands itself.
+
+7. Troubleshoot errors by sharing logs with the agent.
+
+## Tips & Best Practices
+
+- AI agents may be unpredictable; update rulefiles if needed.
+- For complex components, build simple versions first.
+- Ignore minor warnings and errors in `bindings.rs` (auto-generated).
+- Avoid letting the agent edit `bindings.rs`.
+- If stuck, clear history and start fresh with adjusted prompts.
+- Be patient; agents may over-engineer fixes or make mistakes.
+
+## Troubleshooting
+
+- Provide full error context to the agent.
+- Avoid letting the agent run commands; request commands instead.
+- Reformat long commands to avoid line-break issues.
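The OpenAI prompt above embeds a JSON request body containing a `$SEED` placeholder. Before handing a structure like that to an agent, it can help to confirm that a concrete version of it parses as valid JSON — a minimal sketch, where the seed and message values are placeholders of my choosing:

```shell
# Write a concrete version of the request body (placeholders like $SEED filled in)
cat > /tmp/openai_body.json <<'EOF'
{
  "seed": 42,
  "model": "gpt-4o",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Only respond with yes or no: Is AI beneficial to the world?"}
  ]
}
EOF
# Exits non-zero if the body is not valid JSON
python3 -m json.tool /tmp/openai_body.json > /dev/null && echo "valid JSON"
```

The same check applies to any API structure pasted into a prompt: unfilled placeholders such as `$SEED` are not valid JSON and can confuse the agent.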
+
+For support, join the WAVS DEVS Telegram: https://t.me/layer_xyz/818
+
+For more information:
+- [Claude Code Getting Started](mdc:https:/docs.anthropic.com/en/docs/claude-code/getting-started)
+- [Cursor Downloads](mdc:https:/www.cursor.com/downloads)
+- [WAVS Foundry Template GitHub](mdc:https:/github.com/Lay3rLabs/wavs-foundry-template)
+- [OpenAI Platform](mdc:https:/platform.openai.com/login)
diff --git a/.cursor/rules/blockchain-interactions.mdc b/.cursor/rules/blockchain-interactions.mdc
new file mode 100644
index 00000000..8af2b40b
--- /dev/null
+++ b/.cursor/rules/blockchain-interactions.mdc
@@ -0,0 +1,158 @@
+---
+description: Guide for interacting with Ethereum and EVM-compatible blockchains from WAVS components using Rust crates and configuration.
+
+alwaysApply: true
+---
+# Blockchain Interactions in WAVS Components
+
+Use the `wavs-wasi-utils` crate and Alloy ecosystem crates to interact with Ethereum and other EVM chains from WAVS components. Define chain configs in `wavs.toml` and generate Rust types from Solidity using the `sol!` macro.
+
+1. **Setup Dependencies**
+
+Add these to your `Cargo.toml`:
+
+```toml
+[dependencies]
+wit-bindgen-rt = { workspace = true, features = ["bitflags"] }
+wavs-wasi-utils = "0.4.0-beta.4"
+wstd = "0.5.3"
+
+alloy-sol-macro = { version = "1.1.0", features = ["json"] }
+alloy-sol-types = "1.1.0"
+alloy-network = "0.15.10"
+alloy-provider = { version = "0.15.10", default-features = false, features = ["rpc-api"] }
+alloy-rpc-types = "0.15.10"
+alloy-contract = "0.15.10"
+
+anyhow = "1.0.98"
+serde = { version = "1.0.219", features = ["derive"] }
+serde_json = "1.0.140"
+```
+
+2. **Configure Chains**
+
+Define RPC endpoints and chain IDs in `wavs.toml`:
+
+```toml wavs.toml
+[default.chains.evm.local]
+chain_id = "31337"
+ws_endpoint = "ws://localhost:8545"
+http_endpoint = "http://localhost:8545"
+poll_interval_ms = 7000
+
+[default.chains.evm.ethereum]
+chain_id = "1"
+ws_endpoint = "wss://eth.drpc.org"
+http_endpoint = "https://eth.drpc.org"
+```
+
+3. **Generate Rust Types from Solidity**
+
+Use the `sol!` macro to parse Solidity interfaces and generate Rust types:
+
+```rust
+mod solidity {
+    use alloy_sol_macro::sol;
+
+    // From file
+    sol!("../../src/interfaces/ITypes.sol");
+
+    // Inline definitions
+    sol! {
+        struct TriggerInfo {
+            uint64 triggerId;
+            bytes data;
+        }
+
+        event NewTrigger(TriggerInfo _triggerInfo);
+    }
+}
+```
+
+Example in `trigger.rs`:
+
+```rust trigger.rs
+pub mod solidity {
+    use alloy_sol_macro::sol;
+    pub use ITypes::*;
+
+    sol!("../../src/interfaces/ITypes.sol");
+
+    sol! {
+        function addTrigger(string data) external;
+    }
+}
+```
+
+4. **Access Chain Config and Create Provider**
+
+Use WAVS host bindings and `new_evm_provider` to create an RPC provider:
+
+```rust lib.rs
+use crate::bindings::host::get_evm_chain_config;
+use alloy_network::Ethereum;
+use alloy_provider::RootProvider;
+use wavs_wasi_utils::evm::new_evm_provider;
+
+let chain_config = get_evm_chain_config("local").unwrap();
+
+let provider: RootProvider<Ethereum> =
+    new_evm_provider::<Ethereum>(chain_config.http_endpoint.unwrap());
+```
+
+5. **Example: Query ERC721 NFT Balance**
+
+```rust lib.rs
+use crate::bindings::host::get_evm_chain_config;
+use alloy_network::Ethereum;
+use alloy_provider::{Provider, RootProvider};
+use alloy_sol_types::{sol, SolCall};
+use wavs_wasi_utils::evm::{
+    alloy_primitives::{Address, TxKind, U256},
+    new_evm_provider,
+};
+use alloy_rpc_types::TransactionInput;
+use wstd::runtime::block_on;
+
+sol! {
+    interface IERC721 {
+        function balanceOf(address owner) external view returns (uint256);
+    }
+}
+
+pub fn query_nft_ownership(address: Address, nft_contract: Address) -> Result<bool, String> {
+    block_on(async move {
+        let chain_config = get_evm_chain_config("local").unwrap();
+        let provider: RootProvider<Ethereum> =
+            new_evm_provider::<Ethereum>(chain_config.http_endpoint.unwrap());
+
+        let balance_call = IERC721::balanceOfCall { owner: address };
+
+        let tx = alloy_rpc_types::eth::TransactionRequest {
+            to: Some(TxKind::Call(nft_contract)),
+            input: TransactionInput { input: Some(balance_call.abi_encode().into()), data: None },
+            ..Default::default()
+        };
+
+        let result = provider.call(tx).await.map_err(|e| e.to_string())?;
+
+        let balance: U256 = U256::from_be_slice(&result);
+        Ok(balance > U256::ZERO)
+    })
+}
+```
+
+6. **Additional Notes**
+
+- Use the `alloy-contract` crate for higher-level contract interactions.
+- The `decode_event_log_data` macro decodes Ethereum event logs from triggers into Rust types implementing `SolEvent`.
+- Re-run `cargo build` after updating Solidity files used with `sol!`.
+
+For more information:
+- [wavs-wasi-utils crate](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/)
+- [Alloy crate ecosystem](https://docs.rs/alloy/latest/alloy/)
+- [sol! macro documentation](https://docs.rs/alloy-sol-macro/latest/alloy_sol_macro/macro.sol.html)
+- [alloy-contract crate](https://crates.io/crates/alloy-contract)
+- [Example NFT query](https://github.com/Lay3rLabs/wavs-art/blob/main/components/autonomous-artist/src/evm.rs)
diff --git a/.cursor/rules/commands.mdc b/.cursor/rules/commands.mdc
new file mode 100644
index 00000000..14cbbde6
--- /dev/null
+++ b/.cursor/rules/commands.mdc
@@ -0,0 +1,45 @@
+---
+description: Overview of Makefile commands for WAVS development CLI
+
+alwaysApply: true
+---
+# Makefile Commands for WAVS Development
+
+Use `make help` to list all available commands for building, testing, deploying, and managing WAVS projects.
+
+1. Run `make help` to see all commands:
+```bash
+make help
+```
+
+2. Common commands and their purposes:
+```bash
+build                    building the project
+wasi-build               building WAVS wasi components | WASI_BUILD_DIR
+wasi-exec                executing the WAVS wasi component(s) with ABI function | COMPONENT_FILENAME, INPUT_DATA
+wasi-exec-fixed          same as wasi-exec but uses fixed byte input (for Go & TS components) | COMPONENT_FILENAME, INPUT_DATA
+clean                    cleaning the project files
+clean-docker             remove unused docker containers
+validate-component       validate a WAVS component against best practices
+fmt                      format Solidity and Rust code
+test                     run tests
+setup                    install initial dependencies
+start-all-local          start anvil and core services (e.g., IPFS)
+get-trigger-from-deploy  get trigger address from deployment script
+get-submit-from-deploy   get submit address from deployment script
+wavs-cli                 run wavs-cli in docker
+upload-component         upload WAVS component | COMPONENT_FILENAME, WAVS_ENDPOINT
+deploy-service           deploy WAVS component service JSON | SERVICE_URL, CREDENTIAL, WAVS_ENDPOINT
+get-trigger              get trigger id | SERVICE_TRIGGER_ADDR, RPC_URL
+show-result              show result | SERVICE_SUBMISSION_ADDR, TRIGGER_ID, RPC_URL
+upload-to-ipfs           upload service config to IPFS | SERVICE_FILE, [PINATA_API_KEY]
+update-submodules        update git submodules
+check-requirements       verify system requirements are installed
+```
+
+3. Use the commands with appropriate environment variables or arguments as indicated.
+
+4. Best practice: run `validate-component` before deployment to ensure compliance with WAVS standards.
+
+For more information:
+- [WAVS tutorial](https://docs.wavs.dev/tutorial/1-overview)
diff --git a/.cursor/rules/component-rules.mdc b/.cursor/rules/component-rules.mdc
new file mode 100644
index 00000000..4a668517
--- /dev/null
+++ b/.cursor/rules/component-rules.mdc
@@ -0,0 +1,909 @@
+---
+description:
+
+alwaysApply: true
+---
+# WAVS Component Creation Guide
+
+You specialize in creating WAVS (WASI AVS) components.
Your task is to guide the creation of a new WAVS component based on the provided information and user input. Follow these steps carefully to ensure a well-structured, error-free component that passes all validation checks with zero fixes. + +## Component Structure + +A WAVS component needs: +1. `Cargo.toml` - Dependencies configuration +2. `src/lib.rs` - Component implementation logic goes here +3. `src/trigger.rs` - trigger handling logic +4. `src/bindings.rs` - Auto-generated, never edit +5. `Makefile` - Do not edit +6. `config.json` - Only edit the name + +## Creating a Component + +### 1. Cargo.toml Template + +```toml +[package] +name = "your-component-name" +edition.workspace = true +version.workspace = true +authors.workspace = true +rust-version.workspace = true +repository.workspace = true + +[dependencies] +# Core dependencies (always needed) +wit-bindgen-rt ={ workspace = true} +wavs-wasi-utils = { workspace = true } +serde = { workspace = true } +serde_json = { workspace = true } +alloy-sol-macro = { workspace = true } +wstd = { workspace = true } +alloy-sol-types = { workspace = true } +anyhow = { workspace = true } + +# Add for blockchain interactions +alloy-primitives = { workspace = true } +alloy-provider = { workspace = true } +alloy-rpc-types = { workspace = true } +alloy-network = { workspace = true } +alloy-contract = { workspace = true } + +[lib] +crate-type = ["cdylib"] + +[profile.release] +codegen-units = 1 +opt-level = "s" +debug = false +strip = true +lto = true + +[package.metadata.component] +package = "component:your-component-name" +target = "wavs:worker/layer-trigger-world@0.4.0-beta.4" +``` + +CRITICAL: Never use direct version numbers - always use `{ workspace = true }`. +IMPORTANT! Always add your component to workspace members in the root Cargo.toml + +### 2. 
Component Implementation (lib.rs) + +#### Basic Structure + +```rust +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use wavs_wasi_utils::{ + evm::alloy_primitives::hex, + http::{fetch_json, http_request_get}, +}; +pub mod bindings; // Never edit bindings.rs! +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; +use alloy_sol_types::SolValue; +use serde::{Deserialize, Serialize}; +use wstd::{http::HeaderValue, runtime::block_on}; +use anyhow::Result; + +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result, String> { + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let request_input = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + ::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))? 
+ }; + println!("Decoded string input: {}", request_input); + + // Process the decoded data here + let result = process_data(request_input)?; + + let output = match dest { + Destination::Ethereum => Some(encode_trigger_output(trigger_id, &result)), + Destination::CliOutput => Some(WasmResponse { payload: result.into(), ordering: None }), + }; + Ok(output) + } +} + +// Example processing function - replace with your actual logic +fn process_data(input: String) -> Result, String> { + // Your processing logic here + Ok(input.as_bytes().to_vec()) +} +``` + +#### Trigger Event Handling (trigger.rs) + +```rust +use crate::bindings::wavs::worker::layer_types::{ + TriggerData, TriggerDataEvmContractEvent, WasmResponse, +}; +use alloy_sol_types::SolValue; +use anyhow::Result; +use wavs_wasi_utils::decode_event_log_data; + +pub enum Destination { + Ethereum, + CliOutput, +} + +pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec, Destination)> { + match trigger_data { + TriggerData::EvmContractEvent(TriggerDataEvmContractEvent { log, .. }) => { + let event: solidity::NewTrigger = decode_event_log_data!(log)?; + let trigger_info = ::abi_decode(&event._triggerInfo)?; + Ok((trigger_info.triggerId, trigger_info.data.to_vec(), Destination::Ethereum)) + } + TriggerData::Raw(data) => Ok((0, data.clone(), Destination::CliOutput)), + _ => Err(anyhow::anyhow!("Unsupported trigger data type")), + } +} + +pub fn encode_trigger_output(trigger_id: u64, output: impl AsRef<[u8]>) -> WasmResponse { + WasmResponse { + payload: solidity::DataWithId { + triggerId: trigger_id, + data: output.as_ref().to_vec().into(), + } + .abi_encode(), + ordering: None, + } +} + +pub mod solidity { + use alloy_sol_macro::sol; + pub use ITypes::*; + sol!("../../src/interfaces/ITypes.sol"); + + // trigger contract function that encodes string input + sol! { + function addTrigger(string data) external; + } +} +``` + +## Critical Components + +### 1. 
ABI Handling + +NEVER use `String::from_utf8` on ABI-encoded data. This will ALWAYS fail with "invalid utf-8 sequence". + +```rust +// WRONG - Will fail on ABI-encoded data +let input_string = String::from_utf8(abi_encoded_data)?; + +// CORRECT - Use proper ABI decoding with hex string support +let request_input = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + ::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))? +}; + +// For numeric parameters, parse from the string +// Example: When you need a number but input is a string: +let number = request_input + .trim() + .parse::() + .map_err(|_| format!("Invalid number: {}", request_input))?; + +// SAFE - Only use String::from_utf8 on data that has already been decoded as a string +// Example: When handling Raw trigger data that was already decoded as a string +let input = std::str::from_utf8(&req).map_err(|e| e.to_string())?; +``` + +### 2. Data Structure Ownership + +ALWAYS derive `Clone` for API response data structures. 
If fields may be missing, also use `Option`, `#[serde(default)]`, and `Default`: + +```rust +#[derive(Debug, Serialize, Deserialize, Clone, Default)] +#[serde(default)] +pub struct ResponseData { + field1: Option, + field2: Option, + // other fields +} +``` + +ALWAYS clone data before use to avoid ownership issues: + +```rust +// WRONG – creates a temporary that is dropped immediately +let result = process_data(&data.clone()); + +// CORRECT – clone into a named variable +let data_clone = data.clone(); +let result = process_data(&data_clone); +``` + + +### 3. Network Requests + +```rust +use wstd::runtime::block_on; +use wstd::http::HeaderValue; +use wavs_wasi_utils::http::{fetch_json, http_request_get, http_request_post_json}; +use serde::{Deserialize, Serialize}; + +#[derive(Debug, Serialize, Deserialize, Clone, Default)] +pub struct ApiResponse { + #[serde(default)] + field1: Option, + #[serde(default)] + field2: Option, +} + +async fn make_request() -> Result { + let url = format!("https://api.example.com/endpoint?param={}", param); + + let mut req = http_request_get(&url).map_err(|e| e.to_string())?; + req.headers_mut().insert("Accept", HeaderValue::from_static("application/json")); + req.headers_mut().insert("Content-Type", HeaderValue::from_static("application/json")); + req.headers_mut().insert("User-Agent", HeaderValue::from_static("Mozilla/5.0")); + + let response: ApiResponse = fetch_json(req).await.map_err(|e| e.to_string())?; + Ok(response) +} + +fn process_data() -> Result { + block_on(async move { make_request().await }) +} + +// For POST requests with JSON data, use http_request_post_json(url, &data) instead of http_request_get +``` + +### 4. 
Option/Result Handling + +```rust +// WRONG - Option types don't have map_err +let config = get_evm_chain_config("ethereum").map_err(|e| e.to_string())?; + +// CORRECT - For Option types, use ok_or_else() +let config = get_evm_chain_config("ethereum") + .ok_or_else(|| "Failed to get chain config".to_string())?; + +// CORRECT - For Result types, use map_err() +let balance = fetch_balance(address).await + .map_err(|e| format!("Balance fetch failed: {}", e))?; +``` + +### 5. Blockchain Interactions + +```rust +use alloy_network::Ethereum; +use alloy_primitives::{Address, TxKind, U256}; +use alloy_provider::{Provider, RootProvider}; +use alloy_rpc_types::TransactionInput; +use std::str::FromStr; // Required for parsing addresses +use crate::bindings::host::get_evm_chain_config; +use wavs_wasi_utils::evm::new_evm_provider; + +async fn query_blockchain(address_str: &str) -> Result { + // Parse address + let address = Address::from_str(address_str) + .map_err(|e| format!("Invalid address format: {}", e))?; + + // Get chain configuration from environment + let chain_config = get_evm_chain_config("ethereum") + .ok_or_else(|| "Failed to get chain config".to_string())?; + + // Create provider + let provider: RootProvider = + new_evm_provider::(chain_config.http_endpoint.unwrap()); + + // Create contract call + let contract_call = IERC20::balanceOfCall { owner: address }; + let tx = alloy_rpc_types::eth::TransactionRequest { + to: Some(TxKind::Call(contract_address)), + input: TransactionInput { + input: Some(contract_call.abi_encode().into()), + data: None + }, + ..Default::default() + }; + + // Execute call + let result = provider.call(tx).await.map_err(|e| e.to_string())?; + let balance: U256 = U256::from_be_slice(&result); + + Ok(ResponseData { /* your data here */ }) +} +``` + +### 6. 
Numeric Type Handling + +```rust +// WRONG - Using .into() for numeric conversions between types +let temp_uint: U256 = temperature.into(); // DON'T DO THIS + +// CORRECT - String parsing method works reliably for all numeric types +let temperature: u128 = 29300; +let temperature_uint256 = temperature.to_string().parse::().unwrap(); + +// CORRECT - Always use explicit casts between numeric types +let decimals: u8 = decimals_u32 as u8; + +// CORRECT - Handling token decimals correctly +let mut divisor = U256::from(1); +for _ in 0..decimals { + divisor = divisor * U256::from(10); +} +let formatted_amount = amount / divisor; +``` + +## Component Examples by Task + +Here are templates for common WAVS component tasks: + +### 1. Token Balance Checker + +```rust +// IMPORTS +use alloy_network::Ethereum; +use alloy_primitives::{Address, TxKind, U256}; +use alloy_provider::{Provider, RootProvider}; +use alloy_rpc_types::TransactionInput; +use alloy_sol_types::{sol, SolCall, SolValue}; +use anyhow::Result; +use serde::{Deserialize, Serialize}; +use std::str::FromStr; +use wavs_wasi_utils::{ + evm::{alloy_primitives::hex, new_evm_provider}, +}; +use wstd::runtime::block_on; + +pub mod bindings; +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use crate::bindings::host::get_evm_chain_config; +use crate::bindings::wavs::worker::layer_types::{TriggerData, TriggerDataEvmContractEvent}; +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; + +// TOKEN INTERFACE +sol! 
{ + interface IERC20 { + function balanceOf(address owner) external view returns (uint256); + function decimals() external view returns (uint8); + } +} + +// FIXED CONTRACT ADDRESS +const TOKEN_CONTRACT_ADDRESS: &str = "0x..."; // Your token contract address + +// RESPONSE STRUCTURE - MUST DERIVE CLONE +#[derive(Debug, Serialize, Deserialize, Clone)] +pub struct TokenBalanceData { + wallet: String, + balance_raw: String, + balance_formatted: String, + token_contract: String, +} + +// COMPONENT IMPLEMENTATION +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result, String> { + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let wallet_address_str = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + ::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))? 
+ }; + + // Check token balance + let res = block_on(async move { + let balance_data = get_token_balance(&wallet_address_str).await?; + serde_json::to_vec(&balance_data).map_err(|e| e.to_string()) + })?; + + // Return result based on destination + let output = match dest { + Destination::Ethereum => Some(encode_trigger_output(trigger_id, &res)), + Destination::CliOutput => Some(WasmResponse { payload: res.into(), ordering: None }), + }; + Ok(output) + } +} + +// BALANCE CHECKER IMPLEMENTATION +async fn get_token_balance(wallet_address_str: &str) -> Result { + // Parse wallet address + let wallet_address = Address::from_str(wallet_address_str) + .map_err(|e| format!("Invalid wallet address: {}", e))?; + + // Parse token contract address + let token_address = Address::from_str(TOKEN_CONTRACT_ADDRESS) + .map_err(|e| format!("Invalid token address: {}", e))?; + + // Get Ethereum provider + let chain_config = get_evm_chain_config("ethereum") + .ok_or_else(|| "Failed to get Ethereum chain config".to_string())?; + + let provider: RootProvider = + new_evm_provider::(chain_config.http_endpoint.unwrap()); + + // Get token balance + let balance_call = IERC20::balanceOfCall { owner: wallet_address }; + let tx = alloy_rpc_types::eth::TransactionRequest { + to: Some(TxKind::Call(token_address)), + input: TransactionInput { input: Some(balance_call.abi_encode().into()), data: None }, + ..Default::default() + }; + + let result = provider.call(tx).await.map_err(|e| e.to_string())?; + let balance_raw: U256 = U256::from_be_slice(&result); + + // Get token decimals + let decimals_call = IERC20::decimalsCall {}; + let tx_decimals = alloy_rpc_types::eth::TransactionRequest { + to: Some(TxKind::Call(token_address)), + input: TransactionInput { input: Some(decimals_call.abi_encode().into()), data: None }, + ..Default::default() + }; + + let result_decimals = provider.call(tx_decimals).await.map_err(|e| e.to_string())?; + let decimals: u8 = result_decimals[31]; // Last byte for uint8 + + 
// Format balance + let formatted_balance = format_token_amount(balance_raw, decimals); + + // Return data + Ok(TokenBalanceData { + wallet: wallet_address_str.to_string(), + balance_raw: balance_raw.to_string(), + balance_formatted: formatted_balance, + token_contract: TOKEN_CONTRACT_ADDRESS.to_string(), + }) +} +``` + +### 2. API Data Fetcher + +Important: Always verify API endpoints using curl to examine their response structure before creating any code that relies on them. + +```rust +// IMPORTS +use alloy_sol_types::{sol, SolCall, SolValue}; +use anyhow::Result; +use serde::{Deserialize, Serialize}; +use wavs_wasi_utils::{ + evm::alloy_primitives::hex, + http::{fetch_json, http_request_get}, +}; +use wstd::{http::HeaderValue, runtime::block_on}; + +pub mod bindings; +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use crate::bindings::wavs::worker::layer_types::{TriggerData, TriggerDataEvmContractEvent}; +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; + +// RESPONSE STRUCTURE - MUST DERIVE CLONE +// IMPORTANT: Always Use #[serde(default)] and Option for fields from external APIs. 
They might be missing or inconsistent +#[derive(Debug, Serialize, Deserialize, Clone, Default)] +pub struct ApiResponse { + // Use Option for fields that might be missing in some responses + #[serde(default)] + field1: Option, + #[serde(default)] + field2: Option, + // other fields +} + +// RESULT DATA STRUCTURE - MUST DERIVE CLONE +#[derive(Debug, Serialize, Deserialize, Clone)] +pub struct ResultData { + input_param: String, + result: String, +} + +// COMPONENT IMPLEMENTATION +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result, String> { + // Decode trigger data + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let param = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + ::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))? 
+ }; + + // Make API request + let res = block_on(async move { + let api_data = fetch_api_data(¶m).await?; + serde_json::to_vec(&api_data).map_err(|e| e.to_string()) + })?; + + // Return result based on destination + let output = match dest { + Destination::Ethereum => Some(encode_trigger_output(trigger_id, &res)), + Destination::CliOutput => Some(WasmResponse { payload: res.into(), ordering: None }), + }; + Ok(output) + } +} + +// API FETCHER IMPLEMENTATION +async fn fetch_api_data(param: &str) -> Result { + // Get API key from environment (IMPORTANT! you must add this variable to your .env file. All private variables must be prefixed with WAVS_ENV) + let api_key = std::env::var("WAVS_ENV_API_KEY") + .map_err(|_| "Failed to get API_KEY from environment variables".to_string())?; + + // Create API URL + let url = format!( + "https://api.example.com/endpoint?param={}&apikey={}", + param, api_key + ); + + // Create request with headers + let mut req = http_request_get(&url) + .map_err(|e| format!("Failed to create request: {}", e))?; + + req.headers_mut().insert("Accept", HeaderValue::from_static("application/json")); + req.headers_mut().insert("Content-Type", HeaderValue::from_static("application/json")); + req.headers_mut().insert("User-Agent", HeaderValue::from_static("Mozilla/5.0")); + + // Make API request + let api_response: ApiResponse = fetch_json(req).await + .map_err(|e| format!("Failed to fetch data: {}", e))?; + + // Process and return data - handle Option fields safely + let field1 = api_response.field1.unwrap_or_else(|| "unknown".to_string()); + let field2 = api_response.field2.unwrap_or(0); + + Ok(ResultData { + input_param: param.to_string(), + result: format!("{}: {}", field1, field2), + }) +} +``` + +### 3. 
NFT Ownership Checker + +```rust +// IMPORTS +use alloy_network::Ethereum; +use alloy_primitives::{Address, TxKind, U256}; +use alloy_provider::{Provider, RootProvider}; +use alloy_rpc_types::TransactionInput; +use alloy_sol_types::{sol, SolCall, SolValue}; +use anyhow::Result; +use serde::{Deserialize, Serialize}; +use std::str::FromStr; +use wavs_wasi_utils::{ + evm::{alloy_primitives::hex, new_evm_provider}, +}; +use wstd::runtime::block_on; + +pub mod bindings; +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use crate::bindings::host::get_evm_chain_config; +use crate::bindings::wavs::worker::layer_types::{TriggerData, TriggerDataEvmContractEvent}; +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; + +// NFT INTERFACE +sol! { + interface IERC721 { + function balanceOf(address owner) external view returns (uint256); + function ownerOf(uint256 tokenId) external view returns (address); + } +} + +// FIXED CONTRACT ADDRESS +const NFT_CONTRACT_ADDRESS: &str = "0xbd3531da5cf5857e7cfaa92426877b022e612cf8"; // Bored Ape contract + +// RESPONSE STRUCTURE - MUST DERIVE CLONE +#[derive(Debug, Serialize, Deserialize, Clone)] +pub struct NftOwnershipData { + wallet: String, + owns_nft: bool, + balance: String, + nft_contract: String, + contract_name: String, +} + +// COMPONENT IMPLEMENTATION +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result, String> { + // Decode trigger data + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let wallet_address_str = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if 
input_str.starts_with("0x") {
+                // Decode the hex string to bytes
+                hex::decode(&input_str[2..])
+                    .map_err(|e| format!("Failed to decode hex string: {}", e))?
+            } else {
+                // If it's not a hex string, assume the input is already binary data
+                req.clone()
+            };
+
+            // Now ABI decode the binary data as a string parameter
+            <String as SolValue>::abi_decode(&hex_data)
+                .map_err(|e| format!("Failed to decode input as ABI string: {}", e))?
+        };
+
+        // Check NFT ownership
+        let res = block_on(async move {
+            let ownership_data = check_nft_ownership(&wallet_address_str).await?;
+            serde_json::to_vec(&ownership_data).map_err(|e| e.to_string())
+        })?;
+
+        // Return result based on destination
+        let output = match dest {
+            Destination::Ethereum => Some(encode_trigger_output(trigger_id, &res)),
+            Destination::CliOutput => Some(WasmResponse { payload: res.into(), ordering: None }),
+        };
+        Ok(output)
+    }
+}
+
+// NFT OWNERSHIP CHECKER IMPLEMENTATION
+async fn check_nft_ownership(wallet_address_str: &str) -> Result<NftOwnershipData, String> {
+    // Parse wallet address
+    let wallet_address = Address::from_str(wallet_address_str)
+        .map_err(|e| format!("Invalid wallet address: {}", e))?;
+
+    // Parse NFT contract address
+    let nft_address = Address::from_str(NFT_CONTRACT_ADDRESS)
+        .map_err(|e| format!("Invalid NFT contract address: {}", e))?;
+
+    // Get Ethereum provider
+    let chain_config = get_evm_chain_config("ethereum")
+        .ok_or_else(|| "Failed to get Ethereum chain config".to_string())?;
+
+    let provider: RootProvider<Ethereum> =
+        new_evm_provider::<Ethereum>(chain_config.http_endpoint.unwrap());
+
+    // Check NFT balance
+    let balance_call = IERC721::balanceOfCall { owner: wallet_address };
+    let tx = alloy_rpc_types::eth::TransactionRequest {
+        to: Some(TxKind::Call(nft_address)),
+        input: TransactionInput { input: Some(balance_call.abi_encode().into()), data: None },
+        ..Default::default()
+    };
+
+    let result = provider.call(tx).await.map_err(|e| e.to_string())?;
+    let balance: U256 = U256::from_be_slice(&result);
+
+    // Determine if 
wallet owns at least one NFT
+    let owns_nft = balance > U256::ZERO;
+
+    // Return data
+    Ok(NftOwnershipData {
+        wallet: wallet_address_str.to_string(),
+        owns_nft,
+        balance: balance.to_string(),
+        nft_contract: NFT_CONTRACT_ADDRESS.to_string(),
+        contract_name: "Pudgy Penguins".to_string(),
+    })
+}
+```
+
+
+## Component Creation Process
+
+### Phase 1: Planning
+
+When you ask me to create a WAVS component, I'll follow this systematic process to ensure it works perfectly on the first try:
+
+1. **Research Phase**: I'll review the files in /components/evm-price-oracle to see common patterns.
+2. I will read all documentation links given to me and research any APIs or services needed.
+3. I'll read `/test_utils/validate_component.sh` to see what validation checks I need to pass.
+4. I'll verify API response structures by using curl before implementing code that depends on them: `curl -s "my-endpoint"`.
+5. I'll create a file called plan.md with an overview of the component I will make, before creating the lib.rs file. I'll write each item in the [checklist](#validation-checklist) and check them off as I plan my code, making sure my code complies with the checklist and /test_utils/validate_component.sh. Each item must be checked and verified. I will list all imports I will need, and I will include a basic flow chart or visual of how the component will work. I will put plan.md in a new folder with the name of the component (`your-component-name`) in the `/components` directory.
+
+
+### Phase 2: Implementation
+
+After completing all planning steps and being 100% certain that the component will build without errors, I will:
+
+1. Check for errors before coding.
+
+2. 
Copy the bindings, makefile (update the filename in the makefile), and config.json using the following command (bindings will be overwritten during the build):
+
+    ```bash
+    mkdir -p components/your-component-name/src && \
+    cp components/evm-price-oracle/src/bindings.rs components/your-component-name/src/ && \
+    cp components/evm-price-oracle/config.json components/your-component-name/ && \
+    cp components/evm-price-oracle/Makefile components/your-component-name/
+    ```
+
+3. Then, I will create trigger.rs and lib.rs files with proper implementation:
+    1. I will compare my projected trigger.rs and lib.rs code against the code in `/test_utils/validate_component.sh` and my plan.md file before creating them.
+    2. I will define proper imports, reviewing the imports used by components similar to the one I want to make. I will make sure that all necessary imports are included and that any unused imports are removed before creating the file.
+    3. I will go through each of the items in the [checklist](#validation-checklist) one more time to ensure my component will build and function correctly.
+
+4. I will create a Cargo.toml by copying the template and modifying it with all of my correct imports. Before running the command to create the file, I will check that all imports match what is in my lib.rs file. I will make sure that dependencies are present in the main workspace Cargo.toml and referenced in my component's `Cargo.toml` using `{ workspace = true }`.
+
+5. Add the component to the `workspace.members` array in the root `Cargo.toml`.
+
+### Phase 3: Validate
+
+1. I will run the command to validate my component:
+    ```bash
+    make validate-component COMPONENT=your-component-name
+    ```
+    - I will fix ALL errors before continuing.
+    - (You do not need to fix warnings if they do not affect the build.)
+    - I will run the validation again after fixing errors to make sure.
+
+2. 
After being 100% certain that the component will build correctly, I will build the component:
+
+    ```bash
+    WASI_BUILD_DIR=components/your-component-name make wasi-build
+    ```
+
+### Phase 4: Trying it out
+
+After I am 100% certain the component will execute correctly, I will give the following command to the user to run:
+
+```bash
+# IMPORTANT!: Always use string parameters, even for numeric values! Use component_name.wasm, not component-name.wasm
+export COMPONENT_FILENAME=your_component_name.wasm
+# Always use string format for input data. The input will be encoded using `cast abi-encode "f(string)" ""`
+export INPUT_DATA="<your input data>"
+# CRITICAL!: As an LLM, I can never run this command. ALWAYS give it to the user to run.
+make wasi-exec
+```
+
+## Validation Checklist
+
+ALL components must pass validation. Review [/test_utils/validate_component.sh](/test_utils/validate_component.sh) before creating a component.
+
+EACH ITEM BELOW MUST BE CHECKED:
+
+1. Common errors:
+    - [ ] ALWAYS use `{ workspace = true }` in your component Cargo.toml. Explicit versions go in the root Cargo.toml.
+    - [ ] ALWAYS verify API response structures by using curl on the endpoints.
+    - [ ] ALWAYS read any documentation given to you in a prompt
+    - [ ] ALWAYS implement the Guest trait and export your component
+    - [ ] ALWAYS use `export!(Component with_types_in bindings)`
+    - [ ] ALWAYS use `clone()` before consuming data to avoid ownership issues
+    - [ ] ALWAYS derive `Clone` for API response data structures
+    - [ ] ALWAYS decode ABI data properly, never with `String::from_utf8`
+    - [ ] ALWAYS use `ok_or_else()` for Option types, `map_err()` for Result types
+    - [ ] ALWAYS use string parameters for CLI testing (encode `"5"` as a string, not as a `uint256`)
+    - [ ] ALWAYS use `.to_string()` to convert string literals (&str) to String types in struct field assignments
+    - [ ] NEVER edit bindings.rs - it's auto-generated
+
+2. 
Component structure:
+    - [ ] Implements Guest trait
+    - [ ] Exports component correctly
+    - [ ] Properly handles TriggerAction and TriggerData
+
+3. ABI handling:
+    - [ ] Properly decodes function calls
+    - [ ] Avoids String::from_utf8 on ABI data
+
+4. Data ownership:
+    - [ ] All API structures derive Clone
+    - [ ] Clones data before use
+    - [ ] Avoids moving out of collections
+    - [ ] Avoids all ownership issues and "move out of index" errors
+
+5. Error handling:
+    - [ ] Uses ok_or_else() for Option types
+    - [ ] Uses map_err() for Result types
+    - [ ] Provides descriptive error messages
+
+6. Imports:
+    - [ ] Includes all required traits and types
+    - [ ] Uses correct import paths
+    - [ ] Properly imports SolCall for encoding
+    - [ ] Every method and type that is used has the proper import
+    - [ ] Both structs and their traits are imported
+    - [ ] All dependencies are in Cargo.toml with `{ workspace = true }`
+    - [ ] Any unused imports are removed
+
+7. Solidity definitions:
+    - [ ] Uses the sol! macro with correct syntax
+    - [ ] Correctly defines Solidity types in the solidity module
+    - [ ] Implements required functions
+
+8. Security:
+    - [ ] No hardcoded API keys or secrets
+    - [ ] Uses environment variables for sensitive data
+
+9. Dependencies:
+    - [ ] Uses workspace dependencies correctly
+    - [ ] Includes all required dependencies
+
+10. Solidity types:
+    - [ ] Properly imports the sol macro
+    - [ ] Uses the solidity module correctly
+    - [ ] Handles numeric conversions safely
+    - [ ] Uses .to_string() for all string literals in struct initialization
+
+11. Network requests:
+    - [ ] Uses block_on for async functions
+    - [ ] Uses fetch_json with correct headers
+    - [ ] ALL API endpoints have been tested with curl and responses are handled correctly in my component.
+    - [ ] IMPORTANT! Always use `#[serde(default)]` and `Option<T>` for fields from external APIs. 
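The error-handling items above (5) follow one pattern: convert every `Option` and `Result` into `Result<_, String>` with a descriptive message before `?`. Here is a minimal, std-only sketch of that pattern (`lookup_price` and the feed map are hypothetical names; no WAVS APIs are involved):

```rust
use std::collections::HashMap;

/// Hypothetical helper illustrating the checklist's error-handling rules:
/// Option -> Result via ok_or_else(), Result -> Result<_, String> via map_err().
fn lookup_price(feeds: &HashMap<String, u64>, symbol: &str) -> Result<u64, String> {
    // Option: a missing key becomes a descriptive String error
    let raw = feeds
        .get(symbol)
        .copied()
        .ok_or_else(|| format!("no feed configured for symbol '{}'", symbol))?;

    // Result: a failed conversion becomes a descriptive String error
    let scaled = u64::try_from(u128::from(raw) * 100)
        .map_err(|e| format!("price overflow for '{}': {}", symbol, e))?;

    Ok(scaled)
}

fn main() {
    let mut feeds = HashMap::new();
    feeds.insert("ETH".to_string(), 3000u64);

    assert_eq!(lookup_price(&feeds, "ETH"), Ok(300_000));
    assert!(lookup_price(&feeds, "DOGE").is_err());
}
```

Both failure paths surface as `Err(String)` from `run`, which is what the validator expects instead of `unwrap()` or panics.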
+
+With this guide, you should be able to create any WAVS component that passes validation, builds without errors, and executes correctly.
diff --git a/.cursor/rules/component.mdc b/.cursor/rules/component.mdc
new file mode 100644
index 00000000..8f9cf87d
--- /dev/null
+++ b/.cursor/rules/component.mdc
@@ -0,0 +1,169 @@
+---
+description: Overview of WAVS service components, their structure, and usage in Rust and other languages
+
+alwaysApply: true
+---
+# WAVS Service Components Overview
+
+WAVS components contain the main business logic of a service, written in languages compiled to WASM (mainly Rust, also Go and TypeScript/JS). Components process trigger data, execute logic, and return encoded results.
+
+## Component Structure
+
+A basic component consists of:
+
+1. Decoding incoming [trigger data](../triggers#trigger-lifecycle).
+2. Processing the data (custom business logic).
+3. Encoding and returning results for submission.
+
+### Trigger Inputs
+
+- **On-chain events:** Triggered by EVM events, data arrives as `TriggerData::EvmContractEvent`.
+- **Local testing:** Using `make wasi-exec`, data arrives as `TriggerData::Raw` (raw bytes, no ABI decoding).
+
+Example decoding in `trigger.rs`:
+
+```rust
+pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec<u8>, Destination)> {
+    match trigger_data {
+        TriggerData::EvmContractEvent(TriggerDataEvmContractEvent { log, .. }) => {
+            let event: solidity::NewTrigger = decode_event_log_data!(log)?;
+            let trigger_info =
+                <solidity::TriggerInfo as SolValue>::abi_decode(&event._triggerInfo)?;
+            Ok((trigger_info.triggerId, trigger_info.data.to_vec(), Destination::Ethereum))
+        }
+        TriggerData::Raw(data) => Ok((0, data.clone(), Destination::CliOutput)),
+        _ => Err(anyhow::anyhow!("Unsupported trigger data type")),
+    }
+}
+
+pub mod solidity {
+    use alloy_sol_macro::sol;
+    pub use ITypes::*;
+    sol!("../../src/interfaces/ITypes.sol");
+
+    sol! 
{
+        function addTrigger(string data) external;
+    }
+}
+```
+
+- Use the `decode_event_log_data!` macro from [`wavs-wasi-utils`](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/macro.decode_event_log_data.html) for decoding.
+- Use the `sol!` macro from `alloy-sol-macro` to generate Rust types from Solidity interfaces ([Blockchain interactions](./blockchain-interactions#sol-macro)).
+
+### Component Logic
+
+Implement the `Guest` trait with the `run` function as the entry point:
+
+```rust
+impl Guest for Component {
+    fn run(action: TriggerAction) -> Result<Option<WasmResponse>, String> {
+        let (trigger_id, req, dest) =
+            decode_trigger_event(action.data).map_err(|e| e.to_string())?;
+        let res = block_on(async move {
+            let resp_data = get_price_feed(&req).await?;
+            serde_json::to_vec(&resp_data).map_err(|e| e.to_string())
+        })?;
+
+        let output = match dest {
+            Destination::Ethereum => Some(encode_trigger_output(trigger_id, &res)),
+            Destination::CliOutput => Some(WasmResponse {
+                payload: res.into(),
+                ordering: None
+            }),
+        };
+        Ok(output)
+    }
+}
+```
+
+Components can include blockchain interactions, network requests, off-chain computations, etc. See [design considerations](../../design) for suitable use cases.
+
+#### Logging
+
+- **Development:** Use `println!()` for stdout/stderr output visible in `make wasi-exec`.
+
+```rust
+println!("Debug message: {:?}", data);
+```
+
+- **Production:** Use `host::log()` with `LogLevel` for structured logging with context.
+
+```rust
+use bindings::host::{self, LogLevel};
+host::log(LogLevel::Info, "Production logging message");
+```
+
+### Component Output Encoding
+
+Encode output for Ethereum submission with `encode_trigger_output`:
+
+```rust
+pub fn encode_trigger_output(trigger_id: u64, output: impl AsRef<[u8]>) -> WasmResponse {
+    WasmResponse {
+        payload: solidity::DataWithId {
+            triggerId: trigger_id,
+            data: output.as_ref().to_vec().into(),
+        }
+        .abi_encode(),
+        ordering: None,
+    }
+}
+```
+
+- Output is a `WasmResponse` containing the encoded payload and optional ordering. 
+- WAVS routes the response per workflow submission logic. + +## Component Definition in service.json + +Defined under the workflow's `component` object: + +```json +"component": { + "source": { + "Registry": { + "registry": { + "digest": "882b992af8f78e0aaceaf9609c7ba2ce80a22c521789c94ae1960c43a98295f5", + "domain": "localhost:8090", + "version": "0.1.0", + "package": "example:evmrustoracle" + } + } + }, + "permissions": { + "allowed_http_hosts": "all", + "file_system": true + }, + "fuel_limit": null, + "time_limit_seconds": 1800, + "config": { + "variable_1": "0xb5d4D4a87Cb07f33b5FAd6736D8F1EE7D255d9E9", + "variable_2": "0x34045B4b0cdfADf87B840bCF544161168c8ab85A" + }, + "env_keys": [ + "WAVS_ENV_API_KEY" + ] +} +``` + +- Configure source registry, permissions, limits, config variables, and secret env keys. +- See [variables](./variables) for details on configuration. + +## Registry Usage + +- WAVS stores WASM components in a registry (e.g., [wa.dev](https://wa.dev)) for production. +- Local development uses a docker-compose emulated registry. +- Workflow to update registry source: + +```bash +wavs-cli workflow component --id ${WORKFLOW_ID} set-source-registry --domain ${REGISTRY} --package ${PKG_NAMESPACE}:${PKG_NAME} --version ${PKG_VERSION} +``` + +--- + +For more information: + +- [WAVS Triggers](../triggers#trigger-lifecycle) +- [Blockchain interactions - sol! macro](./blockchain-interactions#sol-macro) +- [Component variables](./variables) +- [Design considerations](../../design) +- [wavs-wasi-utils decode_event_log_data! 
macro](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/macro.decode_event_log_data.html)
+- [wa.dev Registry](https://wa.dev)
diff --git a/.cursor/rules/cursor-rules.mdc b/.cursor/rules/cursor-rules.mdc
new file mode 100644
index 00000000..71f8cfbf
--- /dev/null
+++ b/.cursor/rules/cursor-rules.mdc
@@ -0,0 +1,70 @@
+---
+description: How to add or edit Cursor rules in our project
+globs:
+alwaysApply: false
+---
+# Cursor Rules Location
+
+How to add new cursor rules to the project
+
+1. Always place rule files in PROJECT_ROOT/.cursor/rules/:
+    ```
+    .cursor/rules/
+    ├── your-rule-name.mdc
+    ├── another-rule.mdc
+    └── ...
+    ```
+
+2. Follow the naming convention:
+    - Use kebab-case for filenames
+    - Always use the .mdc extension
+    - Make names descriptive of the rule's purpose
+
+3. Directory structure:
+    ```
+    PROJECT_ROOT/
+    ├── .cursor/
+    │   └── rules/
+    │       ├── your-rule-name.mdc
+    │       └── ...
+    └── ...
+    ```
+
+4. Never place rule files:
+    - In the project root
+    - In subdirectories outside .cursor/rules
+    - In any other location
+
+5. Cursor rules have the following structure:
+
+```
+---
+description: Short description of the rule's purpose
+globs: optional/path/pattern/**/*
+alwaysApply: false
+---
+# Rule Title
+
+Main content explaining the rule with markdown formatting.
+
+1. Step-by-step instructions
+2. Code examples
+3. 
Guidelines
+```
+
+Example:
+```typescript
+// Good example
+function goodExample() {
+    // Implementation following guidelines
+}
+
+// Bad example
+function badExample() {
+    // Implementation not following guidelines
+}
+```
diff --git a/.cursor/rules/network-requests.mdc b/.cursor/rules/network-requests.mdc
new file mode 100644
index 00000000..95d602fc
--- /dev/null
+++ b/.cursor/rules/network-requests.mdc
@@ -0,0 +1,105 @@
+---
+description: How to make HTTP requests from WAVS components using wavs-wasi-utils
+
+alwaysApply: true
+---
+# Network Requests in WAVS Components
+
+Use the [`wavs-wasi-utils`](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/index.html) crate to make HTTP requests from WAVS components. Since WASI runs synchronously but network calls are async, use `block_on` from `wstd` to run async code synchronously.
+
+### 1. Add dependencies to Cargo.toml
+
+```toml
+[dependencies]
+wavs-wasi-utils = "0.4.0-beta.4" # HTTP utilities
+wstd = "0.5.3" # Runtime utilities (includes block_on)
+serde = { version = "1.0.219", features = ["derive"] } # Serialization
+serde_json = "1.0.140" # JSON handling
+```
+
+### 2. HTTP request functions
+
+```rust
+// Request creators
+http_request_get(url)             // GET request
+http_request_post_json(url, data) // POST with JSON body
+http_request_post_form(url, data) // POST with form data
+
+// Response handlers
+fetch_json(request)   // Parse JSON response
+fetch_string(request) // Get response as string
+fetch_bytes(request)  // Get raw response bytes
+```
+
+### 3. 
Example: GET request with custom headers
+
+```rust
+use wstd::runtime::block_on;
+use wstd::http::HeaderValue;
+use wavs_wasi_utils::http::{fetch_json, http_request_get};
+use serde::{Deserialize, Serialize};
+
+#[derive(Debug, Serialize, Deserialize)]
+struct ApiResponse {
+    // response fields
+}
+
+async fn make_request() -> Result<ApiResponse, String> {
+    let url = "https://api.example.com/endpoint";
+    let mut req = http_request_get(url).map_err(|e| e.to_string())?;
+    req.headers_mut().insert("Accept", HeaderValue::from_static("application/json"));
+    req.headers_mut().insert("Content-Type", HeaderValue::from_static("application/json"));
+    req.headers_mut().insert("User-Agent", HeaderValue::from_static("Mozilla/5.0"));
+    let json: ApiResponse = fetch_json(req).await.map_err(|e| e.to_string())?;
+    Ok(json)
+}
+
+fn process_data() -> Result<ApiResponse, String> {
+    block_on(async move { make_request().await })
+}
+```
+
+### 4. Example: POST request with JSON data
+
+```rust
+use wstd::runtime::block_on;
+use wavs_wasi_utils::http::{fetch_json, http_request_post_json};
+use serde::{Deserialize, Serialize};
+
+#[derive(Debug, Serialize, Deserialize)]
+struct PostData {
+    key1: String,
+    key2: i32,
+}
+
+#[derive(Debug, Serialize, Deserialize)]
+struct PostResponse {
+    // response fields
+}
+
+async fn make_post_request() -> Result<PostResponse, String> {
+    let url = "https://api.example.com/endpoint";
+    let post_data = PostData { key1: "value1".to_string(), key2: 42 };
+    let request = http_request_post_json(url, &post_data).map_err(|e| e.to_string())?;
+    let response: PostResponse = fetch_json(request).await.map_err(|e| e.to_string())?;
+    Ok(response)
+}
+
+fn process_data() -> Result<PostResponse, String> {
+    block_on(async move { make_post_request().await })
+}
+```
+
+### Guidelines and best practices
+
+- Use `block_on` to run async HTTP calls synchronously in WASI.
+- Use `http_request_post_json` for POST requests with JSON; it sets headers automatically.
+- Use serde derives to serialize/deserialize request and response data.
+- Set appropriate headers for GET requests manually. 
+- Handle errors by converting them to strings for simplicity. + +For more information: +- [wavs-wasi-utils crate](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/index.html) +- [HTTP module docs](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/http/index.html) +- [Variables page](./variables) diff --git a/.cursor/rules/overview.mdc b/.cursor/rules/overview.mdc new file mode 100644 index 00000000..23417ff0 --- /dev/null +++ b/.cursor/rules/overview.mdc @@ -0,0 +1,47 @@ +--- +description: Overview of the WAVS handbook structure and key sections for building WAVS AVS services + +alwaysApply: true +--- +# WAVS Handbook Overview + +This handbook outlines the structure and contents of the WAVS AVS documentation to guide you in building WAVS services. + +1. Follow the [Oracle component tutorial](/tutorial/1-overview) first to learn WAVS basics. + +2. Explore **Core Concepts**: + - [How it works](../how-it-works): Components of a WAVS AVS. + - [Design](../design): Design considerations. + +3. Understand **Services**: + - [Service](./service): Service structure and manifest definition. + - [Workflows](./workflows): Execution paths, triggers, components, submissions. + - [Triggers](./triggers): Types of triggers (EVM, Cosmos, cron, block intervals). + - [Submission and Aggregator](./submission): Blockchain result submission via aggregator and contracts. + +4. Learn about **Components**: + - [Component overview](./components/component): Lifecycle, triggers, data processing. + - [Variables](./components/variables): Public and private component variables. + - [Blockchain interactions](./components/blockchain-interactions): Interacting with blockchains and smart contracts. + - [Network requests](./components/network-requests): Making HTTP requests to external APIs. + +5. Use **Development** resources: + - [Template](./template): WAVS template structure and customization. + - [Makefile commands](./commands): Commands to build, deploy, and manage services. 
+ +Start with the Service section for foundational knowledge, then explore other sections as needed. + +For more information: +- [Oracle component tutorial](/tutorial/1-overview) - Start here to learn the basics of building a WAVS service. +- [How it works](../how-it-works) - Learn about the different parts that make up a WAVS AVS. +- [Design](../design) - Design considerations for WAVS AVS. +- [Service](./service) - WAVS service structure and manifest. +- [Workflows](./workflows) - Defining execution paths. +- [Triggers](./triggers) - Trigger types. +- [Submission and Aggregator](./submission) - Blockchain submission process. +- [Component overview](./components/component) - Component lifecycle and data handling. +- [Variables](./components/variables) - Configuring component variables. +- [Blockchain interactions](./components/blockchain-interactions) - Blockchain and smart contract interactions. +- [Network requests](./components/network-requests) - HTTP requests from components. +- [Template](./template) - WAVS template and customization. +- [Makefile commands](./commands) - Build and deploy commands. diff --git a/.cursor/rules/service.mdc b/.cursor/rules/service.mdc new file mode 100644 index 00000000..b31fc3e1 --- /dev/null +++ b/.cursor/rules/service.mdc @@ -0,0 +1,115 @@ +--- +description: Defines the WAVS service manifest structure and usage with the service manager contract. + +alwaysApply: true +--- +# Service Manifest and Manager + +A WAVS service is composed of smart contracts, operators, and offchain components defined in a `service.json` manifest. This manifest configures workflows, components, submission, and the service manager contract. It is hosted on IPFS or HTTP(S) and referenced by the service manager contract. + +## Creating the Manifest + +1. Use `wavs-cli service` or the provided [build_service.sh](https://github.com/Lay3rLabs/wavs-foundry-template/blob/main/script/build_service.sh) script to generate a single-component service manifest. +2. 
Define service info, workflows, components, submission, and manager details in `service.json`. +3. Upload the manifest to IPFS or a publicly accessible server (e.g., Pinata). + +## Example Manifest + +```json service.json +{ + "id": "example-service-123", + "name": "Example WAVS Service", + "workflows": { + "default": { + "trigger": { + "evm_contract_event": { + "chain_name": "ethereum", + "address": "0x1234567890123456789012345678901234567890", + "event_hash": "0xabcdef1234567890abcdef1234567890abcdef1234567890abcdef1234567890" + } + }, + "component": { + "source": { + "Registry": { + "registry": { + "digest": "882b992af8f78e0aaceaf9609c7ba2ce80a22c521789c94ae1960c43a98295f5", + "domain": "localhost:8090", + "version": "0.1.0", + "package": "example:evmrustoracle" + } + } + }, + "permissions": { + "allowed_http_hosts": "all", + "file_system": true + }, + "fuel_limit": 1000000, + "time_limit_seconds": 30, + "config": { + "endpoint": "https://api.example.com", + "timeout": "30s" + }, + "env_keys": [ + "WAVS_ENV_API_KEY", + "WAVS_ENV_SECRET" + ] + }, + "submit": { + "aggregator": { + "url": "http://127.0.0.1:8001" + } + }, + "aggregators": [ + { + "evm": { + "chain_name": "ethereum", + "address": "0xfedcba9876543210fedcba9876543210fedcba98", + "max_gas": 1000000 + } + } + ] + } + }, + "status": "active", + "manager": { + "evm": { + "chain_name": "ethereum", + "address": "0xabcdef1234567890abcdef1234567890abcdef12" + } + } +} +``` + +## Uploading the Manifest + +```bash +# Upload manifest to IPFS (local or remote) +SERVICE_FILE=${SERVICE_FILE} make upload-to-ipfs + +# Retrieve IPFS gateway URL +export IPFS_GATEWAY=$(sh script/get-ipfs-gateway.sh) + +# Fetch uploaded content +curl "${IPFS_GATEWAY}${ipfs_cid}" + +# Set service URI on service manager contract +cast send ${WAVS_SERVICE_MANAGER_ADDRESS} 'setServiceURI(string)' "${SERVICE_URI}" -r ${RPC_URL} --private-key ${DEPLOYER_PK} +``` + +## Service Manager Contract + +- Registers operators with assigned weights; only 
registered operators can sign submissions. +- Maintains the service URI linking to the manifest. +- Validates operator signatures and ensures threshold weights are met before processing data. +- Ensures operators are sorted correctly for submission validation. + +For more information on manifest parts, see: + +- [Workflows](./workflows) +- [Triggers](./triggers) +- [Components](./components/component) +- [Submission and aggregator](./submission) + +For more information: +- [WAVS Foundry Template build_service.sh](https://github.com/Lay3rLabs/wavs-foundry-template/blob/main/script/build_service.sh) +- [Pinata IPFS Service](https://app.pinata.cloud/developers/api-keys) diff --git a/.cursor/rules/submission.mdc b/.cursor/rules/submission.mdc new file mode 100644 index 00000000..c07445c4 --- /dev/null +++ b/.cursor/rules/submission.mdc @@ -0,0 +1,95 @@ +--- +description: Rules for configuring submission contracts and aggregators in WAVS services + +alwaysApply: true +--- +# Submission and Aggregator Configuration in WAVS + +This rule explains how to configure submission contracts and aggregators to submit workflow results to an EVM chain in WAVS. + +## 1. Configure Submission in `service.json` + +- Use the `submit` field to define submission logic. +- For aggregator submission, specify: +```json +"submit": { + "aggregator": { + "url": "http://127.0.0.1:8001" + } +}, +"aggregators": [ + { + "evm": { + "chain_name": "local", + "address": "0xd6f8ff0036d8b2088107902102f9415330868109", + "max_gas": 5000000 + } + } +] +``` +- Set `"submit": "none"` if no submission is needed (component runs but results not submitted). + +## 2. Submission Contract Requirements + +- Must implement `handleSignedEnvelope()` from the `IWavsServiceHandler` interface. +- Use `IWavsServiceManager` to validate data and operator signatures. +- The contract processes validated data matching the component's output format. 
+- Example simplified contract: +```solidity +import {IWavsServiceManager} from "@wavs/interfaces/IWavsServiceManager.sol"; +import {IWavsServiceHandler} from "@wavs/interfaces/IWavsServiceHandler.sol"; +import {ITypes} from "interfaces/ITypes.sol"; + +contract SimpleSubmit is ITypes, IWavsServiceHandler { + IWavsServiceManager private _serviceManager; + + constructor(IWavsServiceManager serviceManager) { + _serviceManager = serviceManager; + } + + function handleSignedEnvelope(Envelope calldata envelope, SignatureData calldata signatureData) external { + _serviceManager.validate(envelope, signatureData); + DataWithId memory dataWithId = abi.decode(envelope.payload, (DataWithId)); + // Custom logic to process validated data + } +} +``` + +## 3. Aggregator Role and Flow + +- Collects signed responses from multiple operators. +- Validates each operator's signature. +- Aggregates signatures when threshold is met (exact match aggregation). +- Submits aggregated data to the submission contract. +- Uses ECDSA signatures currently; BLS support planned. + +### Aggregator Submission Flow: + +1. Operator runs component → returns `WasmResponse` with `payload` and optional `ordering`. +2. Operator creates signed Envelope. +3. Packet with envelope, signature, route info sent to aggregator `/packet` endpoint. +4. Aggregator validates signature and queues packets by event and service ID. +5. When threshold reached: + - Combine signatures into one `SignatureData`. + - Validate combined signatures on-chain. +6. On success, aggregator calls `handleSignedEnvelope()` on submit contract with aggregated data. +7. Submit contract validates data and signatures via service manager. + +## 4. Workflow Chaining + +- Workflows can be chained by triggering one workflow on the submission event of another. +- See the [Workflows page](./workflows) for details. + +## Best Practices + +- Ensure `DataWithId` struct matches component output format. 
+- Validate all signatures on-chain via service manager. +- Use aggregator to ensure consensus before submission. +- Set appropriate gas limits in aggregator config. +- Use local aggregator endpoint during development. + +For more information: +- [WAVS Solidity Interfaces @wavs](https://www.npmjs.com/package/@wavs/solidity?activeTab=code) +- [Template Submission Contract](https://github.com/Lay3rLabs/wavs-foundry-template/blob/main/src/contracts/WavsSubmit.sol) +- [WAVS Design Considerations](/design) +- [Workflows Documentation](./workflows) diff --git a/.cursor/rules/template.mdc b/.cursor/rules/template.mdc new file mode 100644 index 00000000..bea1ec13 --- /dev/null +++ b/.cursor/rules/template.mdc @@ -0,0 +1,103 @@ +--- +description: Overview and customization guide for the WAVS Foundry template structure and configuration + +alwaysApply: true +--- +# WAVS Foundry Template Overview + +This guide explains the structure and configuration of the WAVS Foundry template to help customize and build your own WAVS service. + +1. **Template Structure** + +The main files and directories in the WAVS template: + +```bash +wavs-foundry-template/ +├── README.md # Tutorial commands +├── makefile # Build and deploy commands, variables, configs +├── components/ # WASI components +│ └── evm-price-oracle/ +│ ├── src/ +│ │ ├── lib.rs # Main component logic +│ │ ├── trigger.rs # Trigger handling +│ │ └── bindings.rs # Auto-generated bindings (`make build`) +│ └── Cargo.toml # Component dependencies +├── compiled/ # Compiled WASM files (`make build`) +├── src/ +│ ├── contracts/ # Trigger and submission Solidity contracts +│ └── interfaces/ # Solidity interfaces +├── script/ # Deployment and interaction scripts +├── wavs.toml # WAVS service configuration +├── docs/ # Documentation +└── .env # Private environment variables +``` + +- Use `make wasi-build` to generate bindings and compile components. +- Copy `.env` from `.env.example` to set private environment variables. + +2. 
**TOML Configuration Files** + +- `wavs.toml`: Configures the WAVS service (chains, environments, etc.). +- Root `Cargo.toml`: Workspace configuration, dependencies, build settings, metadata. +- `components/*/Cargo.toml`: Component-specific Rust configuration; can inherit from root via `workspace = true`. + +Example component `Cargo.toml`: + +```toml +[package] +name = "evm-price-oracle" +edition.workspace = true +version.workspace = true +authors.workspace = true +rust-version.workspace = true +repository.workspace = true + +[dependencies] +wit-bindgen-rt = { workspace = true } +wavs-wasi-utils = { workspace = true } +serde = { workspace = true } +serde_json = { workspace = true } +alloy-sol-macro = { workspace = true } +wstd = { workspace = true } +alloy-sol-types = { workspace = true } +anyhow = { workspace = true } + +[lib] +crate-type = ["cdylib"] + +[profile.release] +codegen-units = 1 +opt-level = "s" +debug = false +strip = true +lto = true + +[package.metadata.component] +package = "component:evm-price-oracle" +target = "wavs:worker/layer-trigger-world@0.4.0-beta.4" +``` + +3. **wavs.toml Configuration** + +The `wavs.toml` file configures: + +- Default general settings (shared by all processes) +- WAVS server-specific settings +- CLI-specific settings +- Aggregator-specific settings + +4. 
**Environment Variable Overrides** + +Override config values using environment variables: + +- WAVS server settings: `WAVS_` +- CLI settings: `WAVS_CLI_` +- Aggregator settings: `WAVS_AGGREGATOR_` + +--- + +For more information: +- [WAVS Foundry Template GitHub](https://github.com/Lay3rLabs/wavs-foundry-template) +- [Oracle Component Tutorial](https://docs.wavs.xyz/tutorial/1-overview) +- [WAVS Design Considerations](https://docs.wavs.xyz/design) +- [wavs.toml Configuration](https://github.com/Lay3rLabs/wavs-foundry-template/blob/main/wavs.toml) diff --git a/.cursor/rules/triggers.mdc b/.cursor/rules/triggers.mdc new file mode 100644 index 00000000..b29ade18 --- /dev/null +++ b/.cursor/rules/triggers.mdc @@ -0,0 +1,181 @@ +--- +description: Setup and manage WAVS service triggers for onchain events and scheduled executions + +alwaysApply: true +--- +# WAVS Service Triggers + +Triggers prompt WAVS services to run by listening for onchain events or schedules. Operators verify triggers and execute components off-chain. + +## Trigger Lifecycle + +1. Deploy a service with `service.json` manifest containing service info, workflow, components, triggers, and submission logic. +2. Operators maintain lookup maps for triggers by chain, contract, and event identifiers. +3. On trigger detection, operators verify and create a `TriggerAction` with config and event data. +4. 
`TriggerAction` structure: +```rust +pub struct TriggerAction { + pub config: TriggerConfig, // service_id, workflow_id, trigger type + pub data: TriggerData, // trigger-specific data +} + +pub struct TriggerConfig { + pub service_id: ServiceID, + pub workflow_id: WorkflowID, + pub trigger: Trigger, +} + +pub enum TriggerData { + CosmosContractEvent { + contract_address: layer_climb_address::Address, + chain_name: ChainName, + event: cosmwasm_std::Event, + block_height: u64, + }, + EvmContractEvent { + contract_address: alloy_primitives::Address, + chain_name: ChainName, + log: LogData, + block_height: u64, + }, + BlockInterval { + chain_name: ChainName, + block_height: u64, + }, + Cron { + trigger_time: Timestamp, + } +} +``` +5. `TriggerAction` is converted to WASI format and passed to components, decoded using `decode_event_log_data!` macro from [`wavs-wasi-utils`](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/macro.decode_event_log_data.html). + +## Trigger Configuration + +Set triggers in the `trigger` field of `service.json`. Each workflow requires one trigger. + +### EVM Event Trigger + +Listens for specific contract events on EVM chains. Passes raw log data to the component. + +Example: +```json +"trigger": { + "evm_contract_event": { + "address": "0x00000000219ab540356cbb839cbe05303d7705fa", + "chain_name": "ethereum", + "event_hash": "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef" + } +} +``` + +Configure chain in `wavs.toml`: +```toml +[default.chains.evm.ethereum] +chain_id = "1" +ws_endpoint = "wss://eth.drpc.org" +http_endpoint = "https://eth.drpc.org" +``` + +Set EVM credential in `.env`: +```env +WAVS_CLI_EVM_CREDENTIAL="0x5ze146f435835b1762ed602088740d201b68fd94bf808f97fd04588f1a63c9ab" +``` + +### Cosmos Event Trigger + +Monitors Cosmos smart contract events by type and address. Passes emitted contract data to component. 
+ +Example: +```json +"trigger": { + "cosmos_contract_event": { + "address": { + "Cosmos": { + "bech32_addr": "neutron1qlaq54uh9f52d3p66q77s6kh9k9ee3vasy8gkdkk3yvgezcs6zts0mkcv4", + "prefix_len": 7 + } + }, + "chain_name": "neutron", + "event_type": "send_nft" + } +} +``` + +Configure chain in `wavs.toml`: +```toml +[default.chains.cosmos.neutron] +chain_id = "pion-1" +bech32_prefix = "neutron" +rpc_endpoint = "https://rpc-falcron.pion-1.ntrn.tech" +grpc_endpoint = "http://grpc-falcron.pion-1.ntrn.tech:80" +gas_price = 0.0053 +gas_denom = "untrn" +``` + +Set Cosmos mnemonic in `.env`: +```env +WAVS_CLI_COSMOS_MNEMONIC="large slab plate twenty laundry illegal vacuum phone drum example topic reason" +``` + +### Cron Trigger + +Executes component on a schedule defined by a cron expression with optional start/end times. Passes trigger timestamp. + +Example: +```json +"trigger": { + "cron": { + "schedule": "0 */5 * * * *", + "start_time": 1704067200000000000, + "end_time": 1735689599000000000 + } +} +``` + +Cron format (seconds to day of week): + +``` +* * * * * * +│ │ │ │ │ └─ Day of week (0-6, Sunday=0) +│ │ │ │ └── Month (1-12) +│ │ │ └─── Day of month (1-31) +│ │ └──── Hour (0-23) +│ └───── Minute (0-59) +└────── Second (0-59) +``` + +Common examples: + +- `0 */5 * * * *` - Every 5 minutes at 0 seconds +- `0 0 */6 * * *` - Every 6 hours +- `0 0 0 * * *` - Daily at midnight + +**Note:** Cron triggers may have latency due to network and clock drift. Use block triggers for precise timing. + +### Block Trigger + +Runs component at regular block intervals on EVM or Cosmos chains. Passes block height and chain name. + +Example: +```json +"trigger": { + "block_interval": { + "chain_name": "ethereum-mainnet", + "n_blocks": 10, + "start_block": null, + "end_block": null + } +} +``` + +## Best Practices + +- Always configure chain info in `wavs.toml` and credentials in `.env`. +- Use `decode_event_log_data!` macro in components to decode trigger data. 
+- Use cron triggers for non-time-critical tasks; use block triggers for precise scheduling. +- Maintain accurate lookup maps for trigger verification. + +For more information: +- [WAVS WASI Utils decode_event_log_data!](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/macro.decode_event_log_data.html) +- [Crontab Guru - Cron Expression Tool](https://crontab.guru/) +- [WAVS Service JSON and Workflow Handbook](https://handbook.layerzero.network/handbook/service) diff --git a/.cursor/rules/variables.mdc b/.cursor/rules/variables.mdc new file mode 100644 index 00000000..46ef70b1 --- /dev/null +++ b/.cursor/rules/variables.mdc @@ -0,0 +1,81 @@ +--- +description: Managing public and private configuration variables in WAVS components + +alwaysApply: true +--- +# Variables in WAVS Components + +WAVS components use two types of variables for configuration: public variables for non-sensitive data and environment keys for sensitive data. + +## Public Variables + +- Stored as strings in the `config` field of the service manifest. +- Accessible in components via `host::config_var()`. + +### Steps to use public variables: + +1. Add variables to `config` in the manifest: + +```json +"component": { + "config": { + "api_endpoint": "https://api.example.com", + "max_retries": "3" + } +} +``` + +2. Access in Rust component: + +```rust +let value = host::config_var("api_endpoint"); +``` + +## Environment Keys (Private Variables) + +- Used for sensitive data like API keys. +- Must be prefixed with `WAVS_ENV_`. +- Set by operators in their environment; not visible publicly. +- WAVS validates presence before service runs. + +### Steps to use environment keys: + +1. Create or copy `.env` file: + +```bash +cp .env.example .env +``` + +2. Set environment variable in `.env` or shell config: + +```bash +WAVS_ENV_MY_API_KEY=your_secret_key_here +``` + +3. Access in Rust component: + +```rust +let api_key = std::env::var("WAVS_ENV_MY_API_KEY")?; +``` + +4. 
Declare in manifest under `env_keys`: + +```json +"component": { + "env_keys": [ + "WAVS_ENV_API_KEY" + ] +} +``` + +## Local Execution + +Use the `--config` flag with comma-separated `KEY=VALUE` pairs to set config variables locally: + +```bash +wavs-cli exec --component <component-path> --input <input-data> --config api_endpoint=https://api.example.com +``` + +For more information: +- [WAVS Variables Documentation](https://docs.wavs.example.com/variables) +- [WAVS CLI Reference](https://docs.wavs.example.com/cli) diff --git a/.cursor/rules/workflows.mdc b/.cursor/rules/workflows.mdc new file mode 100644 index 00000000..4411672c --- /dev/null +++ b/.cursor/rules/workflows.mdc @@ -0,0 +1,146 @@ +--- +description: Define and manage WAVS service workflows specifying triggers, components, and submission logic. + +alwaysApply: true +--- +# WAVS Service Workflows + +A WAVS service consists of one or more workflows defining execution paths. Each workflow includes: + +- **Trigger**: Event that starts the workflow +- **Component**: WASM component processing the event +- **Submit**: Destination for results + +## Workflow Structure + +Workflows are defined in the service manifest JSON under the `workflows` key, each identified by a unique UUID.
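The ids are plain strings in the manifest; the sample id used in the example below happens to be a version-7 (time-ordered) UUID, so ids sort by creation time. A stdlib-only sketch of that layout, for illustration only (a real service should generate ids with a proper UUID library):

```rust
// Assemble a UUIDv7-style string: 48-bit millisecond timestamp, a `7`
// version nibble, `10` variant bits, and pseudo-random tail bits.
// `rand_a`/`rand_b` would come from a real RNG in practice.
fn uuid_v7_like(unix_ms: u64, rand_a: u16, rand_b: u64) -> String {
    let t = unix_ms & 0xFFFF_FFFF_FFFF; // keep 48 timestamp bits
    format!(
        "{:08x}-{:04x}-7{:03x}-{:04x}-{:012x}",
        (t >> 16) as u32,                          // high 32 timestamp bits
        (t & 0xFFFF) as u16,                       // low 16 timestamp bits
        rand_a & 0x0FFF,                           // 12 random bits after the version nibble
        0x8000 | ((rand_b >> 48) as u16 & 0x3FFF), // variant `10` + 14 random bits
        rand_b & 0xFFFF_FFFF_FFFF,                 // final 48 random bits
    )
}

fn main() {
    // These inputs reproduce the sample workflow id from the example manifest.
    let id = uuid_v7_like(0x0196_C34D_003D, 0x412, 0xA3F3_70F8_EC66_4E12);
    assert_eq!(id, "0196c34d-003d-7412-a3f3-70f8ec664e12");
    assert_eq!(id.chars().nth(14), Some('7')); // version nibble
    println!("{id}");
}
```

Because v7 ids are time-ordered, workflows created later sort after earlier ones, which is convenient when scanning manifests or logs.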
+ +Example workflow with a cron trigger and aggregator submission: + +```json service.json +"workflows": { + "0196c34d-003d-7412-a3f3-70f8ec664e12": { + "trigger": { + "cron": { + "schedule": "0 * * * * *", + "start_time": null, + "end_time": null + } + }, + "component": { + "source": { + "Digest": "65747b4b1a7fa98cab6abd9a81a6102068de77b1040b94de904112272b226f51" + }, + "permissions": { + "allowed_http_hosts": "all", + "file_system": true + }, + "fuel_limit": null, + "time_limit_seconds": 1800, + "config": { + "nft": "0xb5d4D4a87Cb07f33b5FAd6736D8F1EE7D255d9E9", + "reward_token": "0x34045B4b0cdfADf87B840bCF544161168c8ab85A" + }, + "env_keys": [ + "WAVS_ENV_API_KEY" + ] + }, + "submit": { + "aggregator": { + "url": "http://127.0.0.1:8001" + } + }, + "aggregators": [ + { + "evm": { + "chain_name": "local", + "address": "0xd6f8ff0036d8b2088107902102f9415330868109", + "max_gas": 5000000 + } + } + ] + } +} +``` + +## Multi-workflow Services + +- Multiple workflows can coexist in one service manifest. +- Each workflow has independent trigger, component, and submission logic. +- All workflows share the same service manager and operator set. + +Example: + +```json +{ + "workflows": { + "workflow-uuid-1": { + "trigger": { ... }, + "component": { ... }, + "submit": { ... } + }, + "workflow-uuid-2": { + "trigger": { ... }, + "component": { ... }, + "submit": { ... } + } + } +} +``` + +## Workflow Isolation + +- Each workflow runs in a separate WebAssembly environment. +- Memory and state are isolated per execution. +- Components cannot access each other's memory or state directly. + +## Sharing State Between Workflows + +- WAVS services focus on data processing, not storage. +- Data sharing is done via external systems (e.g., onchain smart contracts). +- Workflow A submits data externally; Workflow B reads from the same source. 
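Since workflows share no memory, the handoff always goes through that external store. As a minimal sketch, with a plain in-memory map standing in for the onchain contract and a hypothetical key name (a real component would read and write through an EVM provider and submission contract):

```rust
use std::collections::HashMap;

// Stand-in for the onchain storage written by workflow A's submission.
type Chain = HashMap<String, String>;

// Workflow A: triggered, computes a result, "submits" it onchain.
fn workflow_a_submit(chain: &mut Chain, price: &str) {
    chain.insert("latest_price".to_string(), price.to_string());
}

// Workflow B: triggered later, reads A's submitted value as its input.
fn workflow_b_read(chain: &Chain) -> Option<String> {
    chain.get("latest_price").cloned()
}

fn main() {
    let mut chain = Chain::new();
    workflow_a_submit(&mut chain, "42.00");
    assert_eq!(workflow_b_read(&chain), Some("42.00".to_string()));
    println!("workflow B read: {:?}", workflow_b_read(&chain));
}
```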
+ +Example flow: + +``` +A: Trigger -> component -> onchain submission storage +B: Trigger -> component (reads from A's storage) -> onchain submission storage +``` + +## Chaining Workflows + +- Chain workflows by setting the second workflow’s trigger to the onchain submission event of the first. +- This can be done within a service or across different services. + +Example: + +```json +{ + "workflows": { + "workflow-uuid-1": { + "trigger": { ... }, + "component": { ... }, + "submit": { ... } + }, + "workflow-uuid-2": { + "trigger": { /* onchain submission event of workflow-uuid-1 */ }, + "component": { ... }, + "submit": { ... } + } + } +} +``` + +## Multichain Services + +- WAVS supports contract event or block height triggers on Cosmos and EVM chains. +- Enables cross-chain services monitoring events on one chain and submitting results to Ethereum. +- More chain triggers coming soon. + +For detailed trigger options, see the [Trigger page](./triggers). + +For more information: +- [WAVS Design Considerations](../design) +- [Trigger Documentation](./triggers) +- [Component Documentation](./components/component) +- [Submission Documentation](./submission) diff --git a/Cargo.toml b/Cargo.toml index 1fa9282a..158b821a 100644 --- a/Cargo.toml +++ b/Cargo.toml @@ -1,7 +1,5 @@ [workspace] -members = [ - "components/evm-price-oracle", -] +members = ["components/evm-price-oracle"] resolver = "2" [workspace.package] @@ -14,7 +12,7 @@ rust-version = "1.80.0" [workspace.dependencies] # WASI -wit-bindgen-rt ={ version = "0.42.1", features = ["bitflags"]} +wit-bindgen-rt = { version = "0.42.1", features = ["bitflags"] } wit-bindgen = "0.42.1" wstd = "0.5.3" wasi = "0.14.1" @@ -26,5 +24,12 @@ serde_json = "1.0.140" anyhow = "1.0.98" ## Alloy -alloy-sol-macro = { version = "1.0.0", features = ["json"]} +alloy-sol-macro = { version = "1.0.0", features = ["json"] } alloy-sol-types = "1.0.0" +alloy-network = "0.15.10" +alloy-primitives = "1.0.0" +alloy-provider = { version = 
"0.15.10", default-features = false, features = [ + "rpc-api", +] } +alloy-rpc-types = "0.15.10" +alloy-contract = "0.15.10" diff --git a/Makefile b/Makefile index c0ee1052..824f7edb 100644 --- a/Makefile +++ b/Makefile @@ -5,7 +5,7 @@ SUDO := $(shell if groups | grep -q docker; then echo ''; else echo 'sudo'; fi) # Define common variables CARGO=cargo -COIN_MARKET_CAP_ID?=1 +INPUT_DATA?=1 COMPONENT_FILENAME?=evm_price_oracle.wasm CREDENTIAL?="" DOCKER_IMAGE?=ghcr.io/lay3rlabs/wavs:35c96a4 @@ -33,11 +33,17 @@ wasi-build: @./script/build_components.sh $(WASI_BUILD_DIR) @echo "✅ WASI build complete" -## wasi-exec: executing the WAVS wasi component(s) | COMPONENT_FILENAME, COIN_MARKET_CAP_ID +## wasi-exec: executing the WAVS wasi component(s) with ABI function | COMPONENT_FILENAME, INPUT_DATA wasi-exec: pull-image @$(WAVS_CMD) exec --log-level=info --data /data/.docker --home /data \ --component "/data/compiled/$(COMPONENT_FILENAME)" \ - --input `cast format-bytes32-string $(COIN_MARKET_CAP_ID)` + --input $(shell cast abi-encode "f(string)" "${INPUT_DATA}") \ + +## wasi-exec-fixed: the same as wasi-exec, except uses a fixed input as bytes (used in Go & TS components) | COMPONENT_FILENAME, INPUT_DATA +wasi-exec-fixed: pull-image + @$(WAVS_CMD) exec --log-level=info --data /data/.docker --home /data \ + --component "/data/compiled/$(COMPONENT_FILENAME)" \ + --input `cast format-bytes32-string $(INPUT_DATA)` ## clean: cleaning the project files clean: clean-docker @@ -51,6 +57,25 @@ clean: clean-docker clean-docker: @$(SUDO) docker rm -v $(shell $(SUDO) docker ps -a --filter status=exited -q) > /dev/null 2>&1 || true + +## validate-component: validate a WAVS component against best practices +validate-component: + @if [ -z "$(COMPONENT)" ]; then \ + echo "Usage: make validate-component COMPONENT=your-component-name"; \ + echo "Example: make validate-component COMPONENT=eth-price-oracle"; \ + exit 1; \ + fi + @if [ ! 
-d "./components/$(COMPONENT)" ]; then \ + echo "Error: Component directory ./components/$(COMPONENT) not found"; \ + exit 1; \ + fi + @if [ ! -d "./test_utils" ]; then \ + echo "Error: Test utilities not found. Please ensure test_utils exists."; \ + exit 1; \ + fi + @cd test_utils && ./validate_component.sh $(COMPONENT) + + ## fmt: formatting solidity and rust code fmt: @forge fmt --check diff --git a/README.md b/README.md index 076a2065..902f5422 100644 --- a/README.md +++ b/README.md @@ -177,7 +177,12 @@ WASI_BUILD_DIR=components/evm-price-oracle make wasi-build How to test the component locally for business logic validation before on-chain deployment. An ID of 1 for the oracle component is Bitcoin. ```bash -COIN_MARKET_CAP_ID=1 make wasi-exec +# Rust & Typescript components +INPUT_DATA="1" COMPONENT_FILENAME=evm_price_oracle.wasm make wasi-exec +INPUT_DATA="1" COMPONENT_FILENAME=js_evm_price_oracle.wasm make wasi-exec + +# Golang +INPUT_DATA="1" COMPONENT_FILENAME=golang_evm_price_oracle.wasm make wasi-exec-fixed ``` Expected output: @@ -211,7 +216,7 @@ Result (utf8): ## Start Environment -Start an ethereum node (anvil), the WAVS service, and deploy [eigenlayer](https://www.eigenlayer.xyz/) contracts to the local network. +Start an Ethereum node (anvil), the WAVS service, and deploy [EigenLayer](https://www.eigenlayer.xyz/) contracts to the local network. ### Enable Telemetry (optional) @@ -273,8 +278,12 @@ bash ./script/deploy-script.sh Anyone can now call the [trigger contract](./src/contracts/WavsTrigger.sol) which emits the trigger event WAVS is watching for from the previous step. WAVS then calls the service and saves the result on-chain. 
```bash -# Request BTC from CMC -export COIN_MARKET_CAP_ID=1 +# Rust & Typescript - request BTC from CMC +export INPUT_DATA=`cast abi-encode "addTrigger(string)" "1"` + +# Golang uses the raw value +# export INPUT_DATA="1" + # Get the trigger address from previous Deploy forge script export SERVICE_TRIGGER_ADDR=`make get-trigger-from-deploy` # Execute on the trigger contract, WAVS will pick this up and submit the result @@ -282,8 +291,9 @@ export SERVICE_TRIGGER_ADDR=`make get-trigger-from-deploy` # uses FUNDED_KEY as the executor (local: anvil account) source .env +export RPC_URL=`sh ./script/get-rpc.sh` -forge script ./script/Trigger.s.sol ${SERVICE_TRIGGER_ADDR} ${COIN_MARKET_CAP_ID} --sig 'run(string,string)' --rpc-url ${RPC_URL} --broadcast +forge script ./script/Trigger.s.sol ${SERVICE_TRIGGER_ADDR} ${INPUT_DATA} --sig 'run(string,string)' --rpc-url ${RPC_URL} --broadcast ``` ## Show the result @@ -298,7 +308,11 @@ RPC_URL=${RPC_URL} make get-trigger TRIGGER_ID=1 RPC_URL=${RPC_URL} make show-result ``` -# Claude Code +## AI Coding Agents + +This template contains rulefiles for building components with Claude Code and Cursor. Read the [AI-powered component creation guide](./docs/handbook/ai.mdx) for usage instructions. + +### Claude Code To spin up a sandboxed instance of [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview) in a Docker container that only has access to this project's files, run the following command: @@ -307,4 +321,3 @@ npm run claude-code # or with no restrictions (--dangerously-skip-permissions) npm run claude-code:unrestricted ``` - diff --git a/claude.md b/claude.md new file mode 100644 index 00000000..7f9b88ac --- /dev/null +++ b/claude.md @@ -0,0 +1,904 @@ +# WAVS Component Creation Guide + +You specialize in creating WAVS (WASI AVS) components. Your task is to guide the creation of a new WAVS component based on the provided information and user input. 
Follow these steps carefully to ensure a well-structured, error-free component that passes all validation checks with zero fixes. + +## Component Structure + +A WAVS component needs: +1. `Cargo.toml` - Dependencies configuration +2. `src/lib.rs` - Component implementation logic goes here +3. `src/trigger.rs` - trigger handling logic +4. `src/bindings.rs` - Auto-generated, never edit +5. `Makefile` - Do not edit +6. `config.json` - Only edit the name + +## Creating a Component + +### 1. Cargo.toml Template + +```toml +[package] +name = "your-component-name" +edition.workspace = true +version.workspace = true +authors.workspace = true +rust-version.workspace = true +repository.workspace = true + +[dependencies] +# Core dependencies (always needed) +wit-bindgen-rt ={ workspace = true} +wavs-wasi-utils = { workspace = true } +serde = { workspace = true } +serde_json = { workspace = true } +alloy-sol-macro = { workspace = true } +wstd = { workspace = true } +alloy-sol-types = { workspace = true } +anyhow = { workspace = true } + +# Add for blockchain interactions +alloy-primitives = { workspace = true } +alloy-provider = { workspace = true } +alloy-rpc-types = { workspace = true } +alloy-network = { workspace = true } +alloy-contract = { workspace = true } + +[lib] +crate-type = ["cdylib"] + +[profile.release] +codegen-units = 1 +opt-level = "s" +debug = false +strip = true +lto = true + +[package.metadata.component] +package = "component:your-component-name" +target = "wavs:worker/layer-trigger-world@0.4.0-beta.4" +``` + +CRITICAL: Never use direct version numbers - always use `{ workspace = true }`. +IMPORTANT! Always add your component to workspace members in the root Cargo.toml + +### 2. 
Component Implementation (lib.rs) + +#### Basic Structure + +```rust +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use wavs_wasi_utils::{ + evm::alloy_primitives::hex, + http::{fetch_json, http_request_get}, +}; +pub mod bindings; // Never edit bindings.rs! +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; +use alloy_sol_types::SolValue; +use serde::{Deserialize, Serialize}; +use wstd::{http::HeaderValue, runtime::block_on}; +use anyhow::Result; + +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result<Option<WasmResponse>, String> { + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let request_input = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + <String as SolValue>::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))?
+ }; + println!("Decoded string input: {}", request_input); + + // Process the decoded data here + let result = process_data(request_input)?; + + let output = match dest { + Destination::Ethereum => Some(encode_trigger_output(trigger_id, &result)), + Destination::CliOutput => Some(WasmResponse { payload: result.into(), ordering: None }), + }; + Ok(output) + } +} + +// Example processing function - replace with your actual logic +fn process_data(input: String) -> Result<Vec<u8>, String> { + // Your processing logic here + Ok(input.as_bytes().to_vec()) +} +``` + +#### Trigger Event Handling (trigger.rs) + +```rust +use crate::bindings::wavs::worker::layer_types::{ + TriggerData, TriggerDataEvmContractEvent, WasmResponse, +}; +use alloy_sol_types::SolValue; +use anyhow::Result; +use wavs_wasi_utils::decode_event_log_data; + +pub enum Destination { + Ethereum, + CliOutput, +} + +pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec<u8>, Destination)> { + match trigger_data { + TriggerData::EvmContractEvent(TriggerDataEvmContractEvent { log, .. }) => { + let event: solidity::NewTrigger = decode_event_log_data!(log)?; + let trigger_info = <solidity::TriggerInfo as SolValue>::abi_decode(&event._triggerInfo)?; + Ok((trigger_info.triggerId, trigger_info.data.to_vec(), Destination::Ethereum)) + } + TriggerData::Raw(data) => Ok((0, data.clone(), Destination::CliOutput)), + _ => Err(anyhow::anyhow!("Unsupported trigger data type")), + } +} + +pub fn encode_trigger_output(trigger_id: u64, output: impl AsRef<[u8]>) -> WasmResponse { + WasmResponse { + payload: solidity::DataWithId { + triggerId: trigger_id, + data: output.as_ref().to_vec().into(), + } + .abi_encode(), + ordering: None, + } +} + +pub mod solidity { + use alloy_sol_macro::sol; + pub use ITypes::*; + sol!("../../src/interfaces/ITypes.sol"); + + // trigger contract function that encodes string input + sol! { + function addTrigger(string data) external; + } +} +``` + +## Critical Components + +### 1.
ABI Handling + +NEVER use `String::from_utf8` on ABI-encoded data. This will ALWAYS fail with "invalid utf-8 sequence". + +```rust +// WRONG - Will fail on ABI-encoded data +let input_string = String::from_utf8(abi_encoded_data)?; + +// CORRECT - Use proper ABI decoding with hex string support +let request_input = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + <String as SolValue>::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))? +}; + +// For numeric parameters, parse from the string +// Example: When you need a number but input is a string: +let number = request_input + .trim() + .parse::<u64>() + .map_err(|_| format!("Invalid number: {}", request_input))?; + +// SAFE - Only use String::from_utf8 on data that has already been decoded as a string +// Example: When handling Raw trigger data that was already decoded as a string +let input = std::str::from_utf8(&req).map_err(|e| e.to_string())?; +``` + +### 2. Data Structure Ownership + +ALWAYS derive `Clone` for API response data structures.
If fields may be missing, also use `Option`, `#[serde(default)]`, and `Default`: + +```rust +#[derive(Debug, Serialize, Deserialize, Clone, Default)] +#[serde(default)] +pub struct ResponseData { + field1: Option<String>, + field2: Option<u64>, + // other fields +} +``` + +ALWAYS clone data before use to avoid ownership issues: + +```rust +// AVOID - the clone is an unnamed temporary that cannot be reused +let result = process_data(&data.clone()); + +// CORRECT - clone into a named variable +let data_clone = data.clone(); +let result = process_data(&data_clone); +``` + + +### 3. Network Requests + +```rust +use wstd::runtime::block_on; +use wstd::http::HeaderValue; +use wavs_wasi_utils::http::{fetch_json, http_request_get, http_request_post_json}; +use serde::{Deserialize, Serialize}; + +#[derive(Debug, Serialize, Deserialize, Clone, Default)] +pub struct ApiResponse { + #[serde(default)] + field1: Option<String>, + #[serde(default)] + field2: Option<u64>, +} + +async fn make_request(param: &str) -> Result<ApiResponse, String> { + let url = format!("https://api.example.com/endpoint?param={}", param); + + let mut req = http_request_get(&url).map_err(|e| e.to_string())?; + req.headers_mut().insert("Accept", HeaderValue::from_static("application/json")); + req.headers_mut().insert("Content-Type", HeaderValue::from_static("application/json")); + req.headers_mut().insert("User-Agent", HeaderValue::from_static("Mozilla/5.0")); + + let response: ApiResponse = fetch_json(req).await.map_err(|e| e.to_string())?; + Ok(response) +} + +fn process_data(param: &str) -> Result<ApiResponse, String> { + block_on(async move { make_request(param).await }) +} + +// For POST requests with JSON data, use http_request_post_json(url, &data) instead of http_request_get +``` + +### 4.
Option/Result Handling + +```rust +// WRONG - Option types don't have map_err +let config = get_evm_chain_config("ethereum").map_err(|e| e.to_string())?; + +// CORRECT - For Option types, use ok_or_else() +let config = get_evm_chain_config("ethereum") + .ok_or_else(|| "Failed to get chain config".to_string())?; + +// CORRECT - For Result types, use map_err() +let balance = fetch_balance(address).await + .map_err(|e| format!("Balance fetch failed: {}", e))?; +``` + +### 5. Blockchain Interactions + +```rust +use alloy_network::Ethereum; +use alloy_primitives::{Address, TxKind, U256}; +use alloy_provider::{Provider, RootProvider}; +use alloy_rpc_types::TransactionInput; +use std::str::FromStr; // Required for parsing addresses +use crate::bindings::host::get_evm_chain_config; +use wavs_wasi_utils::evm::new_evm_provider; + +async fn query_blockchain(address_str: &str) -> Result<ResponseData, String> { + // Parse address + let address = Address::from_str(address_str) + .map_err(|e| format!("Invalid address format: {}", e))?; + + // Get chain configuration from environment + let chain_config = get_evm_chain_config("ethereum") + .ok_or_else(|| "Failed to get chain config".to_string())?; + + // Create provider + let provider: RootProvider<Ethereum> = + new_evm_provider::<Ethereum>(chain_config.http_endpoint.unwrap()); + + // Create contract call + let contract_call = IERC20::balanceOfCall { owner: address }; + let tx = alloy_rpc_types::eth::TransactionRequest { + to: Some(TxKind::Call(contract_address)), + input: TransactionInput { + input: Some(contract_call.abi_encode().into()), + data: None + }, + ..Default::default() + }; + + // Execute call + let result = provider.call(tx).await.map_err(|e| e.to_string())?; + let balance: U256 = U256::from_be_slice(&result); + + Ok(ResponseData { /* your data here */ }) +} +``` + +### 6.
Numeric Type Handling + +```rust +// WRONG - Using .into() for numeric conversions between types +let temp_uint: U256 = temperature.into(); // DON'T DO THIS + +// CORRECT - String parsing method works reliably for all numeric types +let temperature: u128 = 29300; +let temperature_uint256 = temperature.to_string().parse::<U256>().unwrap(); + +// CORRECT - Always use explicit casts between numeric types +let decimals: u8 = decimals_u32 as u8; + +// CORRECT - Handling token decimals correctly +let mut divisor = U256::from(1); +for _ in 0..decimals { + divisor = divisor * U256::from(10); +} +let formatted_amount = amount / divisor; +``` + +## Component Examples by Task + +Here are templates for common WAVS component tasks: + +### 1. Token Balance Checker + +```rust +// IMPORTS +use alloy_network::Ethereum; +use alloy_primitives::{Address, TxKind, U256}; +use alloy_provider::{Provider, RootProvider}; +use alloy_rpc_types::TransactionInput; +use alloy_sol_types::{sol, SolCall, SolValue}; +use anyhow::Result; +use serde::{Deserialize, Serialize}; +use std::str::FromStr; +use wavs_wasi_utils::{ + evm::{alloy_primitives::hex, new_evm_provider}, +}; +use wstd::runtime::block_on; + +pub mod bindings; +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use crate::bindings::host::get_evm_chain_config; +use crate::bindings::wavs::worker::layer_types::{TriggerData, TriggerDataEvmContractEvent}; +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; + +// TOKEN INTERFACE +sol!
{ + interface IERC20 { + function balanceOf(address owner) external view returns (uint256); + function decimals() external view returns (uint8); + } +} + +// FIXED CONTRACT ADDRESS +const TOKEN_CONTRACT_ADDRESS: &str = "0x..."; // Your token contract address + +// RESPONSE STRUCTURE - MUST DERIVE CLONE +#[derive(Debug, Serialize, Deserialize, Clone)] +pub struct TokenBalanceData { + wallet: String, + balance_raw: String, + balance_formatted: String, + token_contract: String, +} + +// COMPONENT IMPLEMENTATION +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result<Option<WasmResponse>, String> { + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let wallet_address_str = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + <String as SolValue>::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))?
+ }; + + // Check token balance + let res = block_on(async move { + let balance_data = get_token_balance(&wallet_address_str).await?; + serde_json::to_vec(&balance_data).map_err(|e| e.to_string()) + })?; + + // Return result based on destination + let output = match dest { + Destination::Ethereum => Some(encode_trigger_output(trigger_id, &res)), + Destination::CliOutput => Some(WasmResponse { payload: res.into(), ordering: None }), + }; + Ok(output) + } +} + +// BALANCE CHECKER IMPLEMENTATION +async fn get_token_balance(wallet_address_str: &str) -> Result<TokenBalanceData, String> { + // Parse wallet address + let wallet_address = Address::from_str(wallet_address_str) + .map_err(|e| format!("Invalid wallet address: {}", e))?; + + // Parse token contract address + let token_address = Address::from_str(TOKEN_CONTRACT_ADDRESS) + .map_err(|e| format!("Invalid token address: {}", e))?; + + // Get Ethereum provider + let chain_config = get_evm_chain_config("ethereum") + .ok_or_else(|| "Failed to get Ethereum chain config".to_string())?; + + let provider: RootProvider<Ethereum> = + new_evm_provider::<Ethereum>(chain_config.http_endpoint.unwrap()); + + // Get token balance + let balance_call = IERC20::balanceOfCall { owner: wallet_address }; + let tx = alloy_rpc_types::eth::TransactionRequest { + to: Some(TxKind::Call(token_address)), + input: TransactionInput { input: Some(balance_call.abi_encode().into()), data: None }, + ..Default::default() + }; + + let result = provider.call(tx).await.map_err(|e| e.to_string())?; + let balance_raw: U256 = U256::from_be_slice(&result); + + // Get token decimals + let decimals_call = IERC20::decimalsCall {}; + let tx_decimals = alloy_rpc_types::eth::TransactionRequest { + to: Some(TxKind::Call(token_address)), + input: TransactionInput { input: Some(decimals_call.abi_encode().into()), data: None }, + ..Default::default() + }; + + let result_decimals = provider.call(tx_decimals).await.map_err(|e| e.to_string())?; + let decimals: u8 = result_decimals[31]; // Last byte for uint8 + +
// Format balance + let formatted_balance = format_token_amount(balance_raw, decimals); + + // Return data + Ok(TokenBalanceData { + wallet: wallet_address_str.to_string(), + balance_raw: balance_raw.to_string(), + balance_formatted: formatted_balance, + token_contract: TOKEN_CONTRACT_ADDRESS.to_string(), + }) +} +``` + +### 2. API Data Fetcher + +Important: Always verify API endpoints using curl to examine their response structure before creating any code that relies on them. + +```rust +// IMPORTS +use alloy_sol_types::{sol, SolCall, SolValue}; +use anyhow::Result; +use serde::{Deserialize, Serialize}; +use wavs_wasi_utils::{ + evm::alloy_primitives::hex, + http::{fetch_json, http_request_get}, +}; +use wstd::{http::HeaderValue, runtime::block_on}; + +pub mod bindings; +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use crate::bindings::wavs::worker::layer_types::{TriggerData, TriggerDataEvmContractEvent}; +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; + +// RESPONSE STRUCTURE - MUST DERIVE CLONE +// IMPORTANT: Always Use #[serde(default)] and Option for fields from external APIs. 
They might be missing or inconsistent +#[derive(Debug, Serialize, Deserialize, Clone, Default)] +pub struct ApiResponse { + // Use Option for fields that might be missing in some responses + #[serde(default)] + field1: Option<String>, + #[serde(default)] + field2: Option<u64>, + // other fields +} + +// RESULT DATA STRUCTURE - MUST DERIVE CLONE +#[derive(Debug, Serialize, Deserialize, Clone)] +pub struct ResultData { + input_param: String, + result: String, +} + +// COMPONENT IMPLEMENTATION +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result<Option<WasmResponse>, String> { + // Decode trigger data + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let param = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + <String as SolValue>::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))?
+ }; + + // Make API request + let res = block_on(async move { + let api_data = fetch_api_data(&param).await?; + serde_json::to_vec(&api_data).map_err(|e| e.to_string()) + })?; + + // Return result based on destination + let output = match dest { + Destination::Ethereum => Some(encode_trigger_output(trigger_id, &res)), + Destination::CliOutput => Some(WasmResponse { payload: res.into(), ordering: None }), + }; + Ok(output) + } +} + +// API FETCHER IMPLEMENTATION +async fn fetch_api_data(param: &str) -> Result<ResultData, String> { + // Get API key from environment (IMPORTANT! you must add this variable to your .env file. All private variables must be prefixed with WAVS_ENV_) + let api_key = std::env::var("WAVS_ENV_API_KEY") + .map_err(|_| "Failed to get API_KEY from environment variables".to_string())?; + + // Create API URL + let url = format!( + "https://api.example.com/endpoint?param={}&apikey={}", + param, api_key + ); + + // Create request with headers + let mut req = http_request_get(&url) + .map_err(|e| format!("Failed to create request: {}", e))?; + + req.headers_mut().insert("Accept", HeaderValue::from_static("application/json")); + req.headers_mut().insert("Content-Type", HeaderValue::from_static("application/json")); + req.headers_mut().insert("User-Agent", HeaderValue::from_static("Mozilla/5.0")); + + // Make API request + let api_response: ApiResponse = fetch_json(req).await + .map_err(|e| format!("Failed to fetch data: {}", e))?; + + // Process and return data - handle Option fields safely + let field1 = api_response.field1.unwrap_or_else(|| "unknown".to_string()); + let field2 = api_response.field2.unwrap_or(0); + + Ok(ResultData { + input_param: param.to_string(), + result: format!("{}: {}", field1, field2), + }) +} +``` + +### 3.
NFT Ownership Checker + +```rust +// IMPORTS +use alloy_network::Ethereum; +use alloy_primitives::{Address, TxKind, U256}; +use alloy_provider::{Provider, RootProvider}; +use alloy_rpc_types::TransactionInput; +use alloy_sol_types::{sol, SolCall, SolValue}; +use anyhow::Result; +use serde::{Deserialize, Serialize}; +use std::str::FromStr; +use wavs_wasi_utils::{ + evm::{alloy_primitives::hex, new_evm_provider}, +}; +use wstd::runtime::block_on; + +pub mod bindings; +mod trigger; +use trigger::{decode_trigger_event, encode_trigger_output, Destination}; +use crate::bindings::host::get_evm_chain_config; +use crate::bindings::wavs::worker::layer_types::{TriggerData, TriggerDataEvmContractEvent}; +use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; + +// NFT INTERFACE +sol! { + interface IERC721 { + function balanceOf(address owner) external view returns (uint256); + function ownerOf(uint256 tokenId) external view returns (address); + } +} + +// FIXED CONTRACT ADDRESS +const NFT_CONTRACT_ADDRESS: &str = "0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D"; // Bored Ape Yacht Club (BAYC) contract + +// RESPONSE STRUCTURE - MUST DERIVE CLONE +#[derive(Debug, Serialize, Deserialize, Clone)] +pub struct NftOwnershipData { + wallet: String, + owns_nft: bool, + balance: String, + nft_contract: String, + contract_name: String, +} + +// COMPONENT IMPLEMENTATION +struct Component; +export!(Component with_types_in bindings); + +impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result<Option<WasmResponse>, String> { + // Decode trigger data + let (trigger_id, req, dest) = + decode_trigger_event(action.data).map_err(|e| e.to_string())?; + + // Decode trigger data inline - handles hex string input + let wallet_address_str = { + // First, convert the input bytes to a string to check if it's a hex string + let input_str = String::from_utf8(req.clone()) + .map_err(|e| format!("Input is not valid UTF-8: {}", e))?; + + // Check if it's a hex string (starts with "0x") + let hex_data = if 
input_str.starts_with("0x") { + // Decode the hex string to bytes + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } else { + // If it's not a hex string, assume the input is already binary data + req.clone() + }; + + // Now ABI decode the binary data as a string parameter + <String as SolValue>::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode input as ABI string: {}", e))? + }; + + // Check NFT ownership + let res = block_on(async move { + let ownership_data = check_nft_ownership(&wallet_address_str).await?; + serde_json::to_vec(&ownership_data).map_err(|e| e.to_string()) + })?; + + // Return result based on destination + let output = match dest { + Destination::Ethereum => Some(encode_trigger_output(trigger_id, &res)), + Destination::CliOutput => Some(WasmResponse { payload: res.into(), ordering: None }), + }; + Ok(output) + } +} + +// NFT OWNERSHIP CHECKER IMPLEMENTATION +async fn check_nft_ownership(wallet_address_str: &str) -> Result<NftOwnershipData, String> { + // Parse wallet address + let wallet_address = Address::from_str(wallet_address_str) + .map_err(|e| format!("Invalid wallet address: {}", e))?; + + // Parse NFT contract address + let nft_address = Address::from_str(NFT_CONTRACT_ADDRESS) + .map_err(|e| format!("Invalid NFT contract address: {}", e))?; + + // Get Ethereum provider + let chain_config = get_evm_chain_config("ethereum") + .ok_or_else(|| "Failed to get Ethereum chain config".to_string())?; + + let provider: RootProvider<Ethereum> = + new_evm_provider::<Ethereum>(chain_config.http_endpoint.unwrap()); + + // Check NFT balance + let balance_call = IERC721::balanceOfCall { owner: wallet_address }; + let tx = alloy_rpc_types::eth::TransactionRequest { + to: Some(TxKind::Call(nft_address)), + input: TransactionInput { input: Some(balance_call.abi_encode().into()), data: None }, + ..Default::default() + }; + + let result = provider.call(tx).await.map_err(|e| e.to_string())?; + let balance: U256 = U256::from_be_slice(&result); + + // Determine if 
wallet owns at least one NFT + let owns_nft = balance > U256::ZERO; + + // Return data + Ok(NftOwnershipData { + wallet: wallet_address_str.to_string(), + owns_nft, + balance: balance.to_string(), + nft_contract: NFT_CONTRACT_ADDRESS.to_string(), + contract_name: "BAYC".to_string(), + }) +} +``` + + +## Component Creation Process + +### Phase 1: Planning + +When you ask me to create a WAVS component, I'll follow this systematic process to ensure it works perfectly on the first try: + +1. **Research Phase**: I'll review the files in /components/evm-price-oracle to see common forms. +2. I will read any and all documentation links given to me and research any APIs or services needed. +3. I'll read `/test_utils/validate_component.sh` to see what validation checks I need to pass. +4. I'll verify API response structures by using curl before implementing code that depends on them: `curl -s "my-endpoint"`. +5. I'll create a file called plan.md with an overview of the component I will make. I'll do this before actually creating the lib.rs file. I'll write each item in the [checklist](#validation-checklist) and check them off as I plan my code, making sure my code complies with the checklist and /test_utils/validate_component.sh. Each item must be checked and verified. I will list out all imports I will need. I will include a basic flow chart or visual of how the component will work. I will put plan.md in a new folder with the name of the component (`your-component-name`) in the `/components` directory. + + +### Phase 2: Implementation + +After being 100% certain that my component will build without any errors and after completing all planning steps, I will: + +1. Check for errors before coding. + +2. 
Copy the bindings, makefile (update filename in makefile), and config.json using the following command (bindings will be written over during the build): + + ```bash + mkdir -p components/your-component-name/src && \ + cp components/evm-price-oracle/src/bindings.rs components/your-component-name/src/ && \ + cp components/evm-price-oracle/config.json components/your-component-name/ && \ + cp components/evm-price-oracle/Makefile components/your-component-name/ + ``` + +3. Then, I will create trigger.rs and lib.rs files with proper implementation: + 1. I will compare my projected trigger.rs and lib.rs code against the code in `/test_utils/validate_component.sh` and my plan.md file before creating. + 2. I will define proper imports. I will review the imports on the component that I want to make. I will make sure that all necessary imports will be included and that I will remove any unused imports before creating the file. + 3. I will go through each of the items in the [checklist](#validation-checklist) one more time to ensure my component will build and function correctly. + +4. I will create a Cargo.toml by copying the template and modifying it with all of my correct imports. Before running the command to create the file, I will check that all imports are imported correctly and match what is in my lib.rs file. I will define imports correctly. I will make sure that imports are present in the main workspace Cargo.toml and then in my component's `Cargo.toml` using `{ workspace = true }`. + +5. Add component to the `workspace.members` array in the root `Cargo.toml`. + +### Phase 3: Validate + +1. I will run the command to validate my component: + ```bash + make validate-component COMPONENT=your-component-name + ``` + - I will fix ALL errors before continuing + - (You do not need to fix warnings if they do not affect the build.) + - I will run again after fixing errors to make sure. + +2. 
After being 100% certain that the component will build correctly, I will build the component: + + ```bash + WASI_BUILD_DIR=components/your-component-name make wasi-build + ``` + +### Phase 4: Trying it out + +After I am 100% certain the component will execute correctly, I will give the following command to the user to run: + +```bash +# IMPORTANT!: Always use string parameters, even for numeric values! Use component_name.wasm, not component-name.wasm +export COMPONENT_FILENAME=your_component_name.wasm +# Always use string format for input data. The input will be encoded using `cast abi-encode "f(string)" ""` +export INPUT_DATA= +# CRITICAL!: as an LLM, I can't ever run this command. ALWAYS give it to the user to run. +make wasi-exec +``` + +## Validation Checklist + +ALL components must pass validation. Review [/test_utils/validate_component.sh](/test_utils/validate_component.sh) before creating a component. + +EACH ITEM BELOW MUST BE CHECKED: + +1. Common errors: + - [ ] ALWAYS use `{ workspace = true }` in your component Cargo.toml. Explicit versions go in the root Cargo.toml. + - [ ] ALWAYS verify API response structures by using curl on the endpoints. + - [ ] ALWAYS read any documentation given to you in a prompt + - [ ] ALWAYS implement the Guest trait and export your component + - [ ] ALWAYS use `export!(Component with_types_in bindings)` + - [ ] ALWAYS use `clone()` before consuming data to avoid ownership issues + - [ ] ALWAYS derive `Clone` for API response data structures + - [ ] ALWAYS decode ABI data properly, never with `String::from_utf8` + - [ ] ALWAYS use `ok_or_else()` for Option types, `map_err()` for Result types + - [ ] ALWAYS use string parameters for CLI testing (`5` instead of `f(uint256)`) + - [ ] ALWAYS use `.to_string()` to convert string literals (&str) to String types in struct field assignments + - [ ] NEVER edit bindings.rs - it's auto-generated + +2. 
Component structure: + - [ ] Implements Guest trait + - [ ] Exports component correctly + - [ ] Properly handles TriggerAction and TriggerData + +3. ABI handling: + - [ ] Properly decodes function calls + - [ ] Avoids String::from_utf8 on ABI data + +4. Data ownership: + - [ ] All API structures derive Clone + - [ ] Clones data before use + - [ ] Avoids moving out of collections + - [ ] Avoids all ownership issues and "Move out of index" errors + +5. Error handling: + - [ ] Uses ok_or_else() for Option types + - [ ] Uses map_err() for Result types + - [ ] Provides descriptive error messages + +6. Imports: + - [ ] Includes all required traits and types + - [ ] Uses correct import paths + - [ ] Properly imports SolCall for encoding + - [ ] Each and every method and type is used properly and has the proper import + - [ ] Both structs and their traits are imported + - [ ] Verify all required imports are imported properly + - [ ] All dependencies are in Cargo.toml with `{workspace = true}` + - [ ] Any unused imports are removed + +7. Component structure: + - [ ] Uses proper sol! macro with correct syntax + - [ ] Correctly defines Solidity types in solidity module + - [ ] Implements required functions + +8. Security: + - [ ] No hardcoded API keys or secrets + - [ ] Uses environment variables for sensitive data + +9. Dependencies: + - [ ] Uses workspace dependencies correctly + - [ ] Includes all required dependencies + +10. Solidity types: + - [ ] Properly imports sol macro + - [ ] Uses solidity module correctly + - [ ] Handles numeric conversions safely + - [ ] Uses .to_string() for all string literals in struct initialization + +11. Network requests: + - [ ] Uses block_on for async functions + - [ ] Uses fetch_json with correct headers + - [ ] ALL API endpoints have been tested with curl and responses are handled correctly in my component. + - [ ] IMPORTANT! Always use #[serde(default)] and Option for fields from external APIs. 
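
The error-handling items above (checklist 5) can be sketched as a minimal, self-contained example. The `lookup` helper and its inputs are hypothetical, for illustration only, and are not part of the template:

```rust
// Hypothetical helper illustrating the error-handling checklist:
// ok_or_else() turns an Option into a Result, map_err() converts a
// Result's error type, and both attach descriptive messages.
fn lookup(endpoint: Option<&str>, raw_id: &str) -> Result<String, String> {
    // Option -> Result with a descriptive error message
    let endpoint = endpoint.ok_or_else(|| "missing endpoint".to_string())?;

    // Result<_, ParseIntError> -> Result<_, String>
    let id: u64 = raw_id
        .trim()
        .parse::<u64>()
        .map_err(|e| format!("invalid id {:?}: {}", raw_id, e))?;

    Ok(format!("{}/{}", endpoint, id))
}
```

The same pattern appears throughout the component examples above: every `Option` is unwrapped with `ok_or_else()` and every fallible call is mapped to a `String` error with `map_err()` before `?` is applied.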
+ +With this guide, you should be able to create any WAVS component that passes validation, builds without errors, and executes correctly. diff --git a/components/evm-price-oracle/src/lib.rs b/components/evm-price-oracle/src/lib.rs index 94ee4966..ebd278fc 100644 --- a/components/evm-price-oracle/src/lib.rs +++ b/components/evm-price-oracle/src/lib.rs @@ -1,8 +1,12 @@ mod trigger; use trigger::{decode_trigger_event, encode_trigger_output, Destination}; -use wavs_wasi_utils::http::{fetch_json, http_request_get}; +use wavs_wasi_utils::{ + evm::alloy_primitives::hex, + http::{fetch_json, http_request_get}, +}; pub mod bindings; use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; +use alloy_sol_types::SolValue; use serde::{Deserialize, Serialize}; use wstd::{http::HeaderValue, runtime::block_on}; @@ -30,12 +34,25 @@ impl Guest for Component { let (trigger_id, req, dest) = decode_trigger_event(action.data).map_err(|e| e.to_string())?; - // Convert bytes to string and parse first char as u64 - let input = std::str::from_utf8(&req).map_err(|e| e.to_string())?; - println!("input id: {}", input); + let hex_data = match String::from_utf8(req.clone()) { + Ok(input_str) if input_str.starts_with("0x") => { + // Local testing: hex string input + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } + _ => { + // Production: direct binary ABI input + req.clone() + } + }; + + let decoded = <String as SolValue>::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode ABI string: {}", e))?; + + let id = + decoded.trim().parse::<u64>().map_err(|_| format!("Invalid number: {}", decoded))?; - let id = input.chars().next().ok_or("Empty input")?; - let id = id.to_digit(16).ok_or("Invalid hex digit")? 
as u64; + println!("Decoded crypto ID: {}", id); let res = block_on(async move { let resp_data = get_price_feed(id).await?; diff --git a/components/evm-price-oracle/src/trigger.rs b/components/evm-price-oracle/src/trigger.rs index dcf025f6..db7cc8b1 100644 --- a/components/evm-price-oracle/src/trigger.rs +++ b/components/evm-price-oracle/src/trigger.rs @@ -35,7 +35,8 @@ pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec<u8>, Destination)> { match trigger_data { TriggerData::EvmContractEvent(TriggerDataEvmContractEvent { log, .. }) => { let event: solidity::NewTrigger = decode_event_log_data!(log)?; - let trigger_info = solidity::TriggerInfo::abi_decode(&event._triggerInfo)?; + let trigger_info = + <solidity::TriggerInfo as SolValue>::abi_decode(&event._triggerInfo)?; Ok((trigger_info.triggerId, trigger_info.data.to_vec(), Destination::Ethereum)) } TriggerData::Raw(data) => Ok((0, data.clone(), Destination::CliOutput)), @@ -75,12 +76,16 @@ pub fn encode_trigger_output(trigger_id: u64, output: impl AsRef<[u8]>) -> WasmR /// Documentation: /// - /// (You can also just sol! arbitrary solidity types like `event` or `struct` too) -mod solidity { +pub mod solidity { use alloy_sol_macro::sol; pub use ITypes::*; // The objects here will be generated automatically into Rust types. // If you update the .sol file, you must re-run `cargo build` to see the changes. - // or restart your editor / language server. sol!("../../src/interfaces/ITypes.sol"); + + // Define a simple struct representing the function that encodes string input + sol! 
{ + function addTrigger(string data) external; + } } diff --git a/components/golang-evm-price-oracle/README.md b/components/golang-evm-price-oracle/README.md index 499e7f0a..b7122b0b 100644 --- a/components/golang-evm-price-oracle/README.md +++ b/components/golang-evm-price-oracle/README.md @@ -70,7 +70,7 @@ WASI_BUILD_DIR=golang-evm-price-oracle make wasi-build Run the component with the `wasi-exec` command in the root of the repo ```bash docci-output-contains="LTC" -COMPONENT_FILENAME=golang_evm_price_oracle.wasm COIN_MARKET_CAP_ID=2 make wasi-exec +COMPONENT_FILENAME=golang_evm_price_oracle.wasm INPUT_DATA=2 make wasi-exec-fixed ``` --- diff --git a/components/js-evm-price-oracle/README.md b/components/js-evm-price-oracle/README.md index c746ce60..d0e0dfa4 100644 --- a/components/js-evm-price-oracle/README.md +++ b/components/js-evm-price-oracle/README.md @@ -39,7 +39,7 @@ WASI_BUILD_DIR=js-evm-price-oracle make wasi-build Run the component with the `wasi-exec` command in the root of the repo ```bash docci-output-contains="LTC" -COMPONENT_FILENAME=js_evm_price_oracle.wasm COIN_MARKET_CAP_ID=2 make wasi-exec +COMPONENT_FILENAME=js_evm_price_oracle.wasm INPUT_DATA=2 make wasi-exec ``` --- diff --git a/components/js-evm-price-oracle/index.ts b/components/js-evm-price-oracle/index.ts index e594f55e..605b01e9 100644 --- a/components/js-evm-price-oracle/index.ts +++ b/components/js-evm-price-oracle/index.ts @@ -1,13 +1,14 @@ -import { TriggerAction, WasmResponse } from "./out/wavs:worker@0.4.0"; +import { TriggerAction, WasmResponse, } from "./out/wavs:worker@0.4.0"; +import { TriggerSource, TriggerSourceManual } from "./out/interfaces/wavs-worker-layer-types"; import { decodeTriggerEvent, encodeOutput, Destination } from "./trigger"; +import { AbiCoder } from "ethers"; async function run(triggerAction: TriggerAction): Promise<WasmResponse> { let event = decodeTriggerEvent(triggerAction.data); let triggerId = event[0].triggerId; - let result = await compute(event[0].data); - - + 
let num = processInput(event[0].data, triggerAction.config.triggerSource); + let result = await compute(num); switch (event[1]) { case Destination.Cli: @@ -29,15 +30,56 @@ async function run(triggerAction: TriggerAction): Promise<WasmResponse> { ); } -async function compute(input: Uint8Array): Promise<Uint8Array> { - const num = new TextDecoder().decode(input); - - const priceFeed = await fetchCryptoPrice(parseInt(num)); +async function compute(num: number): Promise<Uint8Array> { + const priceFeed = await fetchCryptoPrice(num); const priceJson = priceFeedToJson(priceFeed); return new TextEncoder().encode(priceJson); } +function processInput(input: Uint8Array, triggerSource: { tag: string }): number { + // Prepare the input data based on trigger type + const processedInput = prepareInputData(input, triggerSource.tag); + + // Single ABI decoding step + const abiCoder = new AbiCoder(); + const res = abiCoder.decode(["string"], processedInput); + const decodedString = res[0] as string; + + console.log("Decoded input:", decodedString, "triggerSource.tag:", triggerSource.tag); + + // Validate the decoded string is a valid number + const num = decodedString.trim(); + if (isNaN(parseInt(num))) { + throw new Error(`Input is not a valid number: ${num}`); + } + + return parseInt(num); // Return the validated number +} + + +function prepareInputData(input: Uint8Array, triggerTag: string): Uint8Array { + if (triggerTag === "manual") { + return input; // Use input directly for manual triggers + } + + // For evm-contract-event: handle potential hex string conversion + try { + const inputStr = new TextDecoder().decode(input); + if (!inputStr.startsWith("0x")) { + throw new Error("Input is not a valid hex string: " + inputStr); + } + + // Convert hex string to bytes + const hexString = inputStr.slice(2); // Remove "0x" prefix + return new Uint8Array( + hexString.match(/.{1,2}/g)!.map(byte => parseInt(byte, 16)) + ); + } catch { + return input; // If UTF-8 decode fails, assume it's already binary + } +} + +// 
======================== CMC ======================== // Define the types for the CMC API response diff --git a/docs/handbook/ai.mdx b/docs/handbook/ai.mdx new file mode 100644 index 00000000..89b0c698 --- /dev/null +++ b/docs/handbook/ai.mdx @@ -0,0 +1,186 @@ +--- +title: AI-powered component creation +description: Use Claude or Cursor to create one-shot components with minimal prompting +--- + + + +The WAVS Foundry Template contains built-in AI rulefiles for creating "one-shot" components with minimal prompting in Cursor or Claude Code. + +These rulefiles are an experimental feature and may not work as expected every time. Components created with AI should not be used in production without thorough review and testing. + + + For more information on AI tools and AI-accessible documentation, visit the [LLM resources page](/resources/llms). + + +## Claude Code + +- Follow the [Claude Code installation instructions](https://docs.anthropic.com/en/docs/claude-code/getting-started) to install Claude Code and link your account. +- The Claude rulefile is `claude.md` and contains instructions for Claude on how to create a component. +- Learn more about Claude rulefiles: https://docs.anthropic.com/en/docs/claude-code/memory + +## Cursor + +- Download Cursor: https://www.cursor.com/downloads +- The Cursor rulefiles are located in the `.cursor/rules` directory. +- When using Cursor, always attach the `component-rules.mdc` file to the chat with your prompt. +- Learn more about Cursor rulefiles: https://docs.cursor.com/context/rules + +## Using AI to create components + +1. Clone the [WAVS Foundry Template](https://github.com/Lay3rLabs/wavs-foundry-template) and follow the system setup requirements in the README. + +```sh +git clone https://github.com/Lay3rLabs/wavs-foundry-template.git +cd wavs-foundry-template +git checkout main +# Follow the system setup requirements in the README. +``` + +2. Open Claude Code or Cursor in the root of the template. 
+ +```sh +claude +# or +cursor . +``` + + + +You can run a sandboxed instance of [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview) in a Docker container that only has access to this project's files by running the following command from the root of the project: + +```bash docci-ignore +npm run claude-code +# or with no restrictions (--dangerously-skip-permissions) +npm run claude-code:unrestricted +``` + + + +3. Enter your prompt in the agent chat. You can use the following examples as a starting point, or you can create your own prompt. + + +If you are using Cursor, always attach the `component-rules.mdc` file to the chat with your prompt. + +``` +@component-rules.mdc +``` + + + +### Prompt examples + +These simple examples are provided to get you started. + +#### API component + +You can make a very simple prompt to create a component that can bring API responses verifiably onchain by including the API endpoint: + +``` +Let's make a component that takes the input of a zip code, queries the openbrewerydb, +and returns the breweries in the area. +@https://api.openbrewerydb.org/v1/breweries?by_postal=92101&per_page=3 +``` + +#### Contract balance component + +You can also make components that interact with the blockchain: + +``` +I want to build a new component that takes the input of a wallet address, +queries the usdt contract, and returns the balance of that address. +``` + +#### Verifiable AI component + +``` +Please make a component that takes a prompt as input, sends an api request to OpenAI, +and returns the response. + + Use this api structure: + { + "seed": $SEED, + "model": "gpt-4o", + "messages": [ + {"role": "system", "content": "You are a helpful assistant."}, + {"role": "user", "content": ""} + ] + } + +My api key is WAVS_ENV_OPENAI_KEY in my .env file. +``` + +You'll need an [OpenAI API account and key](https://platform.openai.com/login) to use this prompt. 
The agent will include your API key in the component as a [private variable](./components/variables). + +Make sure to include your API key in a `.env` file: + +```sh +# copy the .env.example file +cp .env.example .env +# place your key in .env (must be prefixed with WAVS_ENV_) +WAVS_ENV_OPENAI_KEY=your_api_key +``` + +This example utilizes the OpenAI API with a [seed](https://platform.openai.com/docs/advanced-usage#reproducible-outputs) to make the response more deterministic. Please note that OpenAI models are not guaranteed to be 100% deterministic. This example is for demonstration purposes and should not be used in production. + +## Component creation process + +4. After receiving the prompt, the agent will start creating your component. Review the agent's work and double-check what the agent is doing before accepting changes. + +5. The agent will start by planning its component and will create a `plan.md` file. The agent will then make a new component and files according to this plan. + +6. The agent will test its component for errors by running validation tests using `make validate-component COMPONENT=your-component`. + +7. The agent may need to make changes after running the validation tests. After making changes, the agent will build the component using `WASI_BUILD_DIR=components/my-component make wasi-build`. + +8. After successfully building your component, it's time to test it. The following command can be used to test your component logic without deploying WAVS. Make sure to replace the placeholders with the correct inputs. + +```sh +# Run this command to build the component: +WASI_BUILD_DIR=components/openai-response make wasi-build + +# Once built, test it with: +export COMPONENT_FILENAME=openai_response.wasm +export INPUT_DATA="Only respond with yes or no: Is AI beneficial to the world?" +make wasi-exec +``` + +The agent may try to run the `make wasi-exec` command itself. 
You should prompt the agent to give you the command instead, as it can't run the command without permissions. + + +9. Your component should execute and return a response. If there are any errors, share them with the agent for troubleshooting. + +If you have any questions, join the WAVS DEVS Telegram channel: https://t.me/layer_xyz/818 + +## Tips for working with AI agents + +- While this repo contains rulefiles with enough context for creating simple components, coding agents are unpredictable and will inevitably run into problems. +- Feel free to update the rulefiles for your specific purposes or if you run into regular errors. +- Coding agents can sometimes try to over-engineer their fixes for errors. If you feel it is not being productive, it may be beneficial to start fresh. You may need to adjust your prompt. +- If you are building a complex component, it may be helpful to have the agent build a simple component first and then expand upon it. +- The agent may try to fix warnings unnecessarily. You can tell the agent to ignore minor warnings and any errors found in `bindings.rs` (it is auto-generated). + +### Prompting + +This repo is designed to be used with short prompts for simple components. However, coding agents often do better with more context. + +When creating a prompt, consider the following: + +- Agents work best with short, clear instructions. +- Provide relevant documentation (preferably as an `.md` file or other AI-digestible content). +- Provide endpoints. +- You may need to provide API response structure if the agent is not understanding responses. +- Be specific about what you want the agent to build. +- Agents work systematically to build components. For best results, the agent should make a plan before it starts building. +- Be patient. Coding agents are not perfect. They may make mistakes. + +## Troubleshooting + +- You can ask the agent to fix errors it may not be able to catch when executing components. 
Make sure to give the agent full context of the error. +- LLMs can be unpredictable. Minimal prompts provide a lot of room for creativity/error. If the agent is not able to fix an error after trying, sometimes deleting the component, clearing the history, and starting fresh can help. +- The agent may try to edit the bindings.rs file to "fix" it. The agent never needs to do this, and you should tell the agent to not do this. +- The agent is supposed to provide you with the `make wasi-exec` command. Sometimes it will try to run this itself and it will fail. Instead, ask it to give you the command. +- When copying and pasting the full `make wasi-exec` command, be careful with line breaks. You may need to reformat long lines to avoid breaking the command. diff --git a/docs/handbook/commands.mdx b/docs/handbook/commands.mdx index 67f63758..78f623a3 100644 --- a/docs/handbook/commands.mdx +++ b/docs/handbook/commands.mdx @@ -15,28 +15,30 @@ Use `make help` to see all the commands: make help ``` -Here are the available makefile commands and their descriptions: +Here are the available `make` commands and their descriptions: ```bash -make build building the project -make wasi-build building WAVS wasi components | WASI_BUILD_DIR -make wasi-exec executing the WAVS wasi component(s) | COMPONENT_FILENAME, COIN_MARKET_CAP_ID -make clean cleaning the project files -make clean-docker remove unused docker containers -make fmt formatting solidity and rust code -make test running tests -make setup install initial dependencies -make start-all-local starting anvil and core services (like IPFS for example) -make get-trigger-from-deploy getting the trigger address from the script deploy -make get-submit-from-deploy getting the submit address from the script deploy -make wavs-cli running wavs-cli in docker -make upload-component uploading the WAVS component | COMPONENT_FILENAME, WAVS_ENDPOINT -make deploy-service deploying the WAVS component service json | SERVICE_URL, CREDENTIAL, 
WAVS_ENDPOINT -make get-trigger get the trigger id | SERVICE_TRIGGER_ADDR, RPC_URL -make show-result showing the result | SERVICE_SUBMISSION_ADDR, TRIGGER_ID, RPC_URL -make upload-to-ipfs uploading the a service config to IPFS | SERVICE_FILE, [PINATA_API_KEY] -make update-submodules update the git submodules -make check-requirements verify system requirements are installed +build building the project +wasi-build building WAVS wasi components | WASI_BUILD_DIR +wasi-exec executing the WAVS wasi component(s) with ABI function | COMPONENT_FILENAME, INPUT_DATA +wasi-exec-fixed the same as wasi-exec, except uses a fixed input as bytes (used in Go & TS components) | COMPONENT_FILENAME, INPUT_DATA +clean cleaning the project files +clean-docker remove unused docker containers +validate-component validate a WAVS component against best practices +fmt formatting solidity and rust code +test running tests +setup install initial dependencies +start-all-local starting anvil and core services (like IPFS for example) +get-trigger-from-deploy getting the trigger address from the script deploy +get-submit-from-deploy getting the submit address from the script deploy +wavs-cli running wavs-cli in docker +upload-component uploading the WAVS component | COMPONENT_FILENAME, WAVS_ENDPOINT +deploy-service deploying the WAVS component service json | SERVICE_URL, CREDENTIAL, WAVS_ENDPOINT +get-trigger get the trigger id | SERVICE_TRIGGER_ADDR, RPC_URL +show-result showing the result | SERVICE_SUBMISSION_ADDR, TRIGGER_ID, RPC_URL +upload-to-ipfs uploading a service config to IPFS | SERVICE_FILE, [PINATA_API_KEY] +update-submodules update the git submodules +check-requirements verify system requirements are installed ``` For more information on commands when using the template, visit the [WAVS tutorial](/tutorial/1-overview).
diff --git a/docs/handbook/components/blockchain-interactions.mdx b/docs/handbook/components/blockchain-interactions.mdx index 3c5a68f2..c7118f5f 100644 --- a/docs/handbook/components/blockchain-interactions.mdx +++ b/docs/handbook/components/blockchain-interactions.mdx @@ -103,13 +103,18 @@ mod solidity { In the template, the `sol!` macro is used in the `trigger.rs` component file to generate Rust types from the `ITypes.sol` file. ```rust trigger.rs -mod solidity { +pub mod solidity { use alloy_sol_macro::sol; pub use ITypes::*; // The objects here will be generated automatically into Rust types. // If you update the .sol file, you must re-run `cargo build` to see the changes. sol!("../../src/interfaces/ITypes.sol"); + + // Define a simple struct representing the function that encodes string input + sol! { + function addTrigger(string data) external; + } } ``` @@ -165,14 +170,12 @@ sol! { function balanceOf(address owner) external view returns (uint256); } } - // Function to check if an address owns any NFTs from a specific contract pub fn query_nft_ownership(address: Address, nft_contract: Address) -> Result<bool, String> { // block_on allows us to run async code in a synchronous function block_on(async move { // Get the chain configuration for the local network let chain_config = get_evm_chain_config("local").unwrap(); - // Create a provider that will handle RPC communication let provider: RootProvider<Ethereum> = new_evm_provider::<Ethereum>( chain_config.http_endpoint.unwrap() diff --git a/docs/handbook/components/component.mdx b/docs/handbook/components/component.mdx index 615b5dc3..9c3eb944 100644 --- a/docs/handbook/components/component.mdx +++ b/docs/handbook/components/component.mdx @@ -32,7 +32,7 @@ When building WASI components, keep in mind that components can receive the [tri 1. **On-chain events**: When triggered by an EVM event, the data comes through the `TriggerAction` with `TriggerData::EvmContractEvent`. -2.
**Local testing**: When using `make wasi-exec` command in the template to test a component, the data comes through `TriggerData::Raw`. No abi decoding is required, and the output is returned as raw bytes. +2. **Local testing**: When using the `make wasi-exec` command in the template to test a component, the data comes through `TriggerData::Raw`. Here's how the example component handles both cases in `trigger.rs`: @@ -47,7 +47,8 @@ pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec<u8>, Destination)> { TriggerData::EvmContractEvent(TriggerDataEvmContractEvent { log, .. }) => { // Decode Ethereum event logs using the `decode_event_log_data!` macro let event: solidity::NewTrigger = decode_event_log_data!(log)?; - let trigger_info = solidity::TriggerInfo::abi_decode(&event._triggerInfo)?; + let trigger_info = + <solidity::TriggerInfo as SolValue>::abi_decode(&event._triggerInfo)?; Ok((trigger_info.triggerId, trigger_info.data.to_vec(), Destination::Ethereum)) } // Local Testing (wasi-exec) @@ -58,19 +59,25 @@ pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec<u8>, Destination)> { } } -mod solidity { // Define the Solidity types for the incoming trigger event using the `sol!` macro + +pub mod solidity { // Define the Solidity types for the incoming trigger event using the `sol!` macro use alloy_sol_macro::sol; pub use ITypes::*; // The objects here will be generated automatically into Rust types. // the interface shown here is used in the example trigger contract in the template. sol!("../../src/interfaces/ITypes.sol"); + + // The addTrigger function from the trigger contract + sol! { + function addTrigger(string data) external; + } } ``` The component decodes the incoming event trigger data using the `decode_event_log_data!` macro from the [`wavs-wasi-utils` crate](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/macro.decode_event_log_data.html). -The `sol!` macro from `alloy-sol-macro` is usedto define Solidity types in Rust.
This macro reads a Solidity interface file and generates corresponding Rust types and encoding/decoding functions. For more information, visit the [Blockchain interactions page](./blockchain-interactions#sol-macro). +The `sol!` macro from `alloy-sol-macro` is used to define Solidity types in Rust. This macro reads a Solidity interface file and generates corresponding Rust types and encoding/decoding functions. For more information, visit the [Blockchain interactions page](./blockchain-interactions#sol-macro). ### Component logic @@ -85,7 +92,6 @@ impl Guest for Component { fn run(action: TriggerAction) -> Result<Option<WasmResponse>, String> { // 1. Decode the trigger data using the decode_trigger_event function from trigger.rs let (trigger_id, req, dest) = decode_trigger_event(action.data)?; - // 2. Process the data (your business logic) let res = block_on(async move { let resp_data = get_price_feed(id).await?; @@ -107,7 +113,7 @@ impl Guest for Component { } ``` -Components can contain any compatible logic, including [blockchain interactions](./blockchain-interactions), [network requests](./network-requests) , off-chain computations, and more. To learn about the types of components that WAVS is best suited for, visit the [design considerations](../../design) page. +Components can contain any compatible logic, including [blockchain interactions](./blockchain-interactions), [network requests](./network-requests), off-chain computations, and more. To learn about the types of components that WAVS is best suited for, visit the [design considerations](../../design) page.
#### Logging in a component diff --git a/docs/handbook/components/network-requests.mdx b/docs/handbook/components/network-requests.mdx index 8176182a..555b69db 100644 --- a/docs/handbook/components/network-requests.mdx +++ b/docs/handbook/components/network-requests.mdx @@ -60,8 +60,7 @@ struct ApiResponse { async fn make_request() -> Result<ApiResponse, String> { let url = "https://api.example.com/endpoint"; let mut req = http_request_get(&url).map_err(|e| e.to_string())?; - - // Set required headers for API requests + // Set headers for API requests req.headers_mut().insert( "Accept", HeaderValue::from_static("application/json") ); @@ -74,12 +73,10 @@ async fn make_request() -> Result<ApiResponse, String> { "User-Agent", HeaderValue::from_static("Mozilla/5.0") ); - // Use fetch_json to automatically parse the response let json: ApiResponse = fetch_json(req) .await .map_err(|e| e.to_string())?; - Ok(json) } @@ -118,12 +115,10 @@ async fn make_post_request() -> Result<PostResponse, String> { key1: "value1".to_string(), key2: 42, }; - // http_request_post_json automatically sets JSON headers let response: PostResponse = fetch_json( http_request_post_json(&url, &post_data)?
).await.map_err(|e| e.to_string())?; - Ok(response) } diff --git a/docs/handbook/components/variables.mdx b/docs/handbook/components/variables.mdx index 547e3811..c7a46bc1 100644 --- a/docs/handbook/components/variables.mdx +++ b/docs/handbook/components/variables.mdx @@ -20,7 +20,7 @@ To add public variables: ```json "component": { "config": { - "api_endpoint": "https://api.example.com", // accessible via host::config_var() + "api_endpoint": "https://api.example.com", // Access using host::config_var() "max_retries": "3" // Config values are always strings } } diff --git a/docs/handbook/overview.mdx b/docs/handbook/overview.mdx index ac3bee80..11226be9 100644 --- a/docs/handbook/overview.mdx +++ b/docs/handbook/overview.mdx @@ -38,9 +38,9 @@ This handbook provides an overview of the different parts that make up a WAVS AV - [Blockchain interactions](./components/blockchain-interactions) - Discover how to interact with blockchains and smart contracts from your components. - [Network requests](./components/network-requests) - Learn how to make HTTP requests to external APIs from your components. - ## Development - [Template](./template) - Get started with the WAVS template, including its structure, configuration files, and how to customize it for your service. - [Makefile commands](./commands) - Reference for the available makefile commands to build, deploy, and manage your service. +- [AI-powered component creation](./ai) - Learn how to use AI coding agents to create components. Each section provides detailed information and examples to help you understand and build your WAVS service. Start with the Service section to understand the basic concepts, then explore the other sections based on your needs. 
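The `config` snippet in the variables diff above is read at runtime through `host::config_var`, which returns `Option<String>`. Below is a minimal sketch of that lookup and parsing logic; the `host` module here is a hand-written stand-in for the generated bindings (an assumption for illustration, not the real WAVS API), and the hard-coded values mirror the example manifest:

```rust
// Stand-in for the generated bindings: in a real component,
// `host::config_var` is provided by WAVS and returns Option<String>.
mod host {
    pub fn config_var(key: &str) -> Option<String> {
        match key {
            "api_endpoint" => Some("https://api.example.com".to_string()),
            "max_retries" => Some("3".to_string()), // config values are always strings
            _ => None,
        }
    }
}

/// Read and parse the public config variables from the service manifest.
fn read_config() -> Result<(String, u32), String> {
    let endpoint = host::config_var("api_endpoint")
        .ok_or_else(|| "missing api_endpoint config".to_string())?;
    let max_retries = host::config_var("max_retries")
        .ok_or_else(|| "missing max_retries config".to_string())?
        .parse::<u32>()
        .map_err(|_| "max_retries must be a number".to_string())?;
    Ok((endpoint, max_retries))
}

fn main() {
    let (endpoint, retries) = read_config().expect("config should be present");
    println!("{endpoint} with {retries} retries");
}
```

Because config values are always strings, numeric settings like `max_retries` must be parsed explicitly, which is why the sketch converts with `parse::<u32>()` and surfaces a readable error on failure.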
diff --git a/docs/handbook/service.mdx b/docs/handbook/service.mdx index 11299701..5aa88864 100644 --- a/docs/handbook/service.mdx +++ b/docs/handbook/service.mdx @@ -13,7 +13,7 @@ The service manifest defines the configuration and different parts of a WAVS ser ## Generate Manifest -You can create the service.json file using the `wavs-cli service` command. The template provides a script to generate a single-component service with ease, [build_service.sh](https://github.com/Lay3rLabs/wavs-foundry-template/blob/main/script/build_service.sh). +You can create the service.json file using the `wavs-cli service` command. The template provides a script to generate a single-component service with ease, [build_service.sh](https://github.com/Lay3rLabs/wavs-foundry-template/blob/main/script/build-service.sh). ## Example Manifest @@ -22,7 +22,6 @@ You can create the service.json file using the `wavs-cli service` command. The t // Basic service information "id": "example-service-123", "name": "Example WAVS Service", - // Workflows define the different execution paths in your service "workflows": { // Each workflow has a unique ID @@ -145,4 +144,4 @@ The service manager contract defines the set of registered operators for a servi The service manager also maintains a service URI that points to the service manifest, connecting the operators to the service. -Signatures are created by operators using their private keys to sign an envelope containing the data, and these signatures are collected by the aggregator which then submits them to the service manager contract for validation. The service manager contract validates that the signatures are from registered operators, checks that their total weight meets the threshold, and ensures the operators are properly sorted before allowing the data to be processed by the service handler contract. 
+Signatures are created by operators using their private keys to sign an envelope containing the data, and these signatures are collected by the aggregator which then submits them to the service manager contract for validation. The service manager contract validates that the signatures are from registered operators, checks that their total weight meets the threshold, and ensures the operators are properly sorted before allowing the data to be processed by the [service handler](/handbook/submission) contract. diff --git a/docs/handbook/submission.mdx b/docs/handbook/submission.mdx index 0c87d2b6..a77fd17b 100644 --- a/docs/handbook/submission.mdx +++ b/docs/handbook/submission.mdx @@ -29,7 +29,6 @@ The `submit` field in a service.json file specifies the submission logic for a s } ] ``` - Submit can also be set to `none` if the service does not need to submit results to a contract. The component will still run, but the results will not be submitted. ## Submission contract @@ -80,7 +79,6 @@ contract SimpleSubmit is ITypes, IWavsServiceHandler { // Decode the payload into your expected data structure // The payload format must match what your component outputs DataWithId memory dataWithId = abi.decode(envelope.payload, (DataWithId)); - // At this point, you can safely process the validated data // Add your custom logic here to handle the data } diff --git a/docs/handbook/template.mdx b/docs/handbook/template.mdx index fa7c53b3..8d9dfb34 100644 --- a/docs/handbook/template.mdx +++ b/docs/handbook/template.mdx @@ -37,6 +37,8 @@ wavs-foundry-template/ ├── script/ # Deployment & interaction scripts ├── wavs.toml # WAVS service configuration ├── docs/ # Documentation +├── .cursor/rules/ # Cursor AI rulefiles +├── claude.md # Claude AI rulefile └── .env # Private environment variables ``` @@ -46,7 +48,7 @@ wavs-foundry-template/ - The `src` directory contains the Solidity contracts and interfaces for trigger and submission contracts. 
- The `script` directory contains the scripts used in the makefile commands to deploy, trigger, and test the service. - The `.env` file contains private environment variables and keys. Use `cp .env.example .env` to copy the example `.env` file. - +- The `.cursor/rules` directory and `claude.md` file contain rulefiles for [building components with Cursor AI and Claude AI agents](/handbook/ai). ## Toml files diff --git a/docs/resources/llms.mdx b/docs/resources/llms.mdx new file mode 100644 index 00000000..3881bf6b --- /dev/null +++ b/docs/resources/llms.mdx @@ -0,0 +1,44 @@ +--- +title: LLM docs +description: Access WAVS documentation in formats optimized for AI tools and integration. +--- + + + +The LLM text format presents documentation in a clean, plain text format optimized for large language models (LLMs) like Claude, ChatGPT, and others. + +## llms.txt + +The `llms.txt` format is a structured index of documentation pages organized by sections, including page titles, URLs, and descriptions. This format is ideal for AI assistants to understand the documentation structure without processing the full content. + +[https://docs.wavs.xyz/llms.txt](https://docs.wavs.xyz/llms.txt) + +``` +curl https://docs.wavs.xyz/llms.txt +``` + +## llms-full.txt + +The `llms-full.txt` format returns all documentation pages as a single text document. + +[https://docs.wavs.xyz/llms-full.txt](https://docs.wavs.xyz/llms-full.txt) + +``` +curl https://docs.wavs.xyz/llms-full.txt +``` + + +## Markdown Format + +Get any page as standard Markdown by appending `.md` to its URL.
+ +``` +curl https://docs.wavs.xyz/path/to/page.md +``` + +Examples: +- `/overview.md` - Overview page as Markdown +- `/tutorial/1-overview.md` - Tutorial introduction as Markdown +- `/handbook/service.md` - Service handbook as Markdown diff --git a/docs/tutorial/1-overview.mdx b/docs/tutorial/1-overview.mdx index 7ced1e9c..491079b3 100644 --- a/docs/tutorial/1-overview.mdx +++ b/docs/tutorial/1-overview.mdx @@ -16,7 +16,7 @@ In this guide, you will build a simple oracle service that fetches Bitcoin price The price oracle service example has three basic parts: -1. [A trigger](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/src/contracts/WavsTrigger.sol): A trigger can be any on-chain event emitted from a contract. This event **triggers** a service to run. In the WAVS Foundry Template, there is a simple trigger contract that stores trigger requests, assigns them unique IDs, and emits an event when a new trigger is added. In this example, the trigger event will pass data pertaining to the ID of an asset for the CoinMarketCap price feed. +1. [A trigger contract](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/src/contracts/WavsTrigger.sol): A trigger can be any on-chain event emitted from a contract. This event **triggers** a service to run. In the WAVS Foundry Template, there is a simple trigger contract that stores trigger requests, assigns them unique IDs, and emits an event when a new trigger is added. In this example, the trigger event `addTrigger` will pass data pertaining to the ID of an asset for the CoinMarketCap price feed. 2. [A service component](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/components/evm-price-oracle/src/lib.rs): The service component contains the business logic of a service. It is written in Rust, compiled to WASM, and run by operators in the WAVS runtime. 
In this example, operators will listen for a new trigger event to be emitted and then run the service component off-chain, using the asset ID data from the trigger event as input. The component contains logic to fetch the price of the asset from the CoinMarketCap price feed API, which is then processed and encoded before being sent back on-chain. diff --git a/docs/tutorial/4-component.mdx b/docs/tutorial/4-component.mdx index 4dadb934..c7ddfa75 100644 --- a/docs/tutorial/4-component.mdx +++ b/docs/tutorial/4-component.mdx @@ -12,7 +12,7 @@ The core logic of the price oracle in this example is located in the [`/evm-pric ## trigger.rs -The [trigger.rs](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/components/evm-price-oracle/src/trigger.rs) file handles the decoding of incoming trigger data from the trigger event emitted by the trigger contract. The component uses `decode_event_log_data!()` from the [wavs-wasi-utils crate](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/) to decode the event log data and prepares it for processing within the WAVS component. For more information on different trigger types, visit the [Triggers page](../handbook/triggers). To learn more about trigger input handling, visit the [Component page](../handbook/components/component#trigger-inputs). +The [trigger.rs](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/components/evm-price-oracle/src/trigger.rs) file handles the decoding of incoming trigger data from the trigger event emitted by the trigger contract. The component uses `decode_event_log_data!()` from the [wavs-wasi-utils crate](https://docs.rs/wavs-wasi-utils/latest/wavs_wasi_utils/) to decode the event log data and prepares it for processing within the WAVS component. The trigger.rs file handles both ABI-encoded trigger and submission data, as well as raw data for local testing. For more information on different trigger types, visit the [Triggers page](../handbook/triggers).
To learn more about trigger input handling, visit the [Component page](../handbook/components/component#trigger-inputs). ```rust trigger.rs use crate::bindings::wavs::worker::layer_types::{ @@ -22,16 +22,22 @@ use alloy_sol_types::SolValue; use anyhow::Result; use wavs_wasi_utils::decode_event_log_data; +/// Represents the destination where the trigger output should be sent pub enum Destination { Ethereum, CliOutput, } +/// Decodes incoming trigger event data into its components +/// Handles two types of triggers: +/// 1. EvmContractEvent - Decodes Ethereum event logs using the NewTrigger ABI +/// 2. Raw - Used for direct CLI testing with no encoding pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec<u8>, Destination)> { match trigger_data { TriggerData::EvmContractEvent(TriggerDataEvmContractEvent { log, .. }) => { let event: solidity::NewTrigger = decode_event_log_data!(log)?; - let trigger_info = solidity::TriggerInfo::abi_decode(&event._triggerInfo)?; + let trigger_info = + <solidity::TriggerInfo as SolValue>::abi_decode(&event._triggerInfo)?; Ok((trigger_info.triggerId, trigger_info.data.to_vec(), Destination::Ethereum)) } TriggerData::Raw(data) => Ok((0, data.clone(), Destination::CliOutput)), @@ -39,6 +45,7 @@ pub fn decode_trigger_event(trigger_data: TriggerData) -> Result<(u64, Vec<u8>, Destination)> { } } +/// Encodes the output data for submission back to Ethereum pub fn encode_trigger_output(trigger_id: u64, output: impl AsRef<[u8]>) -> WasmResponse { WasmResponse { payload: solidity::DataWithId { @@ -50,14 +57,24 @@ pub fn encode_trigger_output(trigger_id: u64, output: impl AsRef<[u8]>) -> WasmR } } -mod solidity { +/// The `sol!` macro from alloy_sol_macro reads a Solidity interface file +/// and generates corresponding Rust types and encoding/decoding functions. +pub mod solidity { use alloy_sol_macro::sol; pub use ITypes::*; + // The objects here will be generated automatically into Rust types.
sol!("../../src/interfaces/ITypes.sol"); + + // Encode string input from the trigger contract function + sol! { + function addTrigger(string data) external; + } } ``` +Visit the [Blockchain interactions page](../handbook/components/blockchain-interactions) for more information on the `sol!` macro and how to use it to generate Rust types from Solidity interfaces. + ## Oracle component logic The [`lib.rs`](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/components/evm-price-oracle/src/lib.rs) file contains the main component logic for the oracle. The first section of the code imports the required modules for requests, serialization, and bindings, defines the component struct, and exports the component for execution within the WAVS runtime. @@ -65,9 +82,13 @@ The [`lib.rs`](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/comp ```rust lib.rs mod trigger; use trigger::{decode_trigger_event, encode_trigger_output, Destination}; -use wavs_wasi_utils::http::{fetch_json, http_request_get}; +use wavs_wasi_utils::{ + evm::alloy_primitives::hex, + http::{fetch_json, http_request_get}, +}; pub mod bindings; use crate::bindings::{export, Guest, TriggerAction, WasmResponse}; +use alloy_sol_types::SolValue; use serde::{Deserialize, Serialize}; use wstd::{http::HeaderValue, runtime::block_on}; ``` @@ -76,7 +97,7 @@ The `run` function is the main entry point for the price oracle component. WAVS This is why the `Destination::Ethereum` requires the encoded trigger output, it must be ABI encoded for the solidity contract. -After the data is properly set by the operator through WAVS, any user can query the price data from the blockchain in the solidity contract. You can also return `None` as the output if nothing needs to be saved to the blockchain. (great for performing some off chain action) +After the data is submitted to the blockchain, any user can query the price data from the blockchain in the solidity contract. 
You can also return `None` as the output if nothing needs to be saved to the blockchain (useful for performing some off-chain action). The `run` function: @@ -86,20 +107,33 @@ The `run` function: 4. Returns the encoded response based on the destination ```rust lib.rs + struct Component; export!(Component with_types_in bindings); impl Guest for Component { + fn run(action: TriggerAction) -> std::result::Result<Option<WasmResponse>, String> { let (trigger_id, req, dest) = decode_trigger_event(action.data).map_err(|e| e.to_string())?; - // Convert bytes to string and parse first char as u64 - let input = std::str::from_utf8(&req).map_err(|e| e.to_string())?; - println!("input id: {}", input); + let hex_data = match String::from_utf8(req.clone()) { + Ok(input_str) if input_str.starts_with("0x") => { + // Local testing: hex string input + hex::decode(&input_str[2..]) + .map_err(|e| format!("Failed to decode hex string: {}", e))? + } + _ => { + // Production: direct binary ABI input + req.clone() + } + }; + + let decoded = <String as SolValue>::abi_decode(&hex_data) + .map_err(|e| format!("Failed to decode ABI string: {}", e))?; - let id = input.chars().next().ok_or("Empty input")?; - let id = id.to_digit(16).ok_or("Invalid hex digit")? as u64; + let id = + decoded.trim().parse::<u64>().map_err(|_| format!("Invalid number: {}", decoded))?; let res = block_on(async move { let resp_data = get_price_feed(id).await?; diff --git a/docs/tutorial/6-run-service.mdx b/docs/tutorial/6-run-service.mdx index 22bb8444..75e40c4f 100644 --- a/docs/tutorial/6-run-service.mdx +++ b/docs/tutorial/6-run-service.mdx @@ -148,7 +148,7 @@ COMMAND="list_operators" PAST_BLOCKS=500 make wavs-middleware ## Trigger the service -Next, use your deployed trigger contract to trigger the oracle to be run. In the following command, you'll specify the `COIN_MARKET_CAP_ID` as `1`, which corresponds to the ID of Bitcoin. +Next, use your deployed trigger contract to trigger the oracle to be run.
In the following command, you'll specify the `INPUT_DATA` as the ABI-encoded string `"1"`, which corresponds to the ID of Bitcoin. Running this command will execute [`/script/Trigger.s.sol`](https://github.com/Lay3rLabs/wavs-foundry-template/tree/main/script/Trigger.s.sol) and pass the ID to the trigger contract, starting the following chain of events: @@ -160,14 +160,14 @@ Running this command will execute [`/script/Trigger.s.sol`](https://github.com/L ```bash docci-delay-per-cmd=2 # Request BTC from CMC -export COIN_MARKET_CAP_ID=1 +export INPUT_DATA=`cast abi-encode "addTrigger(string)" "1"` # Get the trigger address from previous Deploy forge script export SERVICE_TRIGGER_ADDR=`make get-trigger-from-deploy` # uses FUNDED_KEY as the executor (local: anvil account) source .env -forge script ./script/Trigger.s.sol ${SERVICE_TRIGGER_ADDR} ${COIN_MARKET_CAP_ID} --sig 'run(string,string)' --rpc-url ${RPC_URL} --broadcast +forge script ./script/Trigger.s.sol ${SERVICE_TRIGGER_ADDR} ${INPUT_DATA} --sig 'run(string,string)' --rpc-url ${RPC_URL} --broadcast ``` ## Show the result diff --git a/script/Trigger.s.sol b/script/Trigger.s.sol index 61f08400..7a592a97 100644 --- a/script/Trigger.s.sol +++ b/script/Trigger.s.sol @@ -12,7 +12,7 @@ contract Trigger is Common { vm.startBroadcast(_privateKey); SimpleTrigger trigger = SimpleTrigger(vm.parseAddress(serviceTriggerAddr)); - trigger.addTrigger(abi.encodePacked(coinMarketCapID)); + trigger.addTrigger(coinMarketCapID); ITypes.TriggerId triggerId = trigger.nextTriggerId(); console.log("TriggerId", ITypes.TriggerId.unwrap(triggerId)); vm.stopBroadcast(); diff --git a/script/build-service.sh b/script/build-service.sh index 09a6bf0e..f000d93b 100644 --- a/script/build-service.sh +++ b/script/build-service.sh @@ -73,6 +73,7 @@ eval "$BASE_CMD workflow component --id ${WORKFLOW_ID} permissions --http-hosts eval "$BASE_CMD workflow component --id ${WORKFLOW_ID} time-limit --seconds 30" > /dev/null eval "$BASE_CMD workflow component --id
${WORKFLOW_ID} env --values WAVS_ENV_SOME_SECRET" > /dev/null eval "$BASE_CMD workflow component --id ${WORKFLOW_ID} config --values 'key=value,key2=value2'" > /dev/null +eval "$BASE_CMD workflow component --id ${WORKFLOW_ID} fuel-limit --fuel ${FUEL_LIMIT}" > /dev/null eval "$BASE_CMD manager set-evm --chain-name ${SUBMIT_CHAIN} --address `cast --to-checksum ${WAVS_SERVICE_MANAGER_ADDRESS}`" > /dev/null eval "$BASE_CMD validate" > /dev/null diff --git a/script/deploy-script.sh b/script/deploy-script.sh index 87395b3b..08c334a9 100644 --- a/script/deploy-script.sh +++ b/script/deploy-script.sh @@ -12,6 +12,13 @@ if git status --porcelain | grep -q "^.* components/"; then fi ### === Deploy Eigenlayer === +# if RPC_URL is not set, use default by calling command +if [ -z "$RPC_URL" ]; then + export RPC_URL=$(bash ./script/get-rpc-url.sh) +fi +if [ -z "$AGGREGATOR_URL" ]; then + export AGGREGATOR_URL=http://127.0.0.1:8001 +fi # local: create deployer & auto fund. testnet: create & iterate check balance bash ./script/create-deployer.sh diff --git a/src/contracts/WavsTrigger.sol b/src/contracts/WavsTrigger.sol index fbfe95d6..58cd0c83 100644 --- a/src/contracts/WavsTrigger.sol +++ b/src/contracts/WavsTrigger.sol @@ -10,35 +10,50 @@ contract SimpleTrigger is ISimpleTrigger { /// @inheritdoc ISimpleTrigger mapping(TriggerId _triggerId => Trigger _trigger) public triggersById; /// @notice See ISimpleTrigger.triggerIdsByCreator - mapping(address _creator => TriggerId[] _triggerIds) internal _triggerIdsByCreator; + mapping(address _creator => TriggerId[] _triggerIds) + internal _triggerIdsByCreator; /// @inheritdoc ISimpleTrigger - function addTrigger(bytes memory _data) external { + function addTrigger(string memory _data) external { // Get the next trigger id nextTriggerId = TriggerId.wrap(TriggerId.unwrap(nextTriggerId) + 1); TriggerId _triggerId = nextTriggerId; // Create the trigger - Trigger memory _trigger = Trigger({creator: msg.sender, data: _data}); + Trigger memory 
_trigger = Trigger({ + creator: msg.sender, + data: bytes(_data) + }); // Update storages triggersById[_triggerId] = _trigger; _triggerIdsByCreator[msg.sender].push(_triggerId); - TriggerInfo memory _triggerInfo = - TriggerInfo({triggerId: _triggerId, creator: _trigger.creator, data: _trigger.data}); + TriggerInfo memory _triggerInfo = TriggerInfo({ + triggerId: _triggerId, + creator: _trigger.creator, + data: _trigger.data + }); emit NewTrigger(abi.encode(_triggerInfo)); } /// @inheritdoc ISimpleTrigger - function getTrigger(TriggerId triggerId) external view override returns (TriggerInfo memory _triggerInfo) { + function getTrigger( + TriggerId triggerId + ) external view override returns (TriggerInfo memory _triggerInfo) { Trigger storage _trigger = triggersById[triggerId]; - _triggerInfo = TriggerInfo({triggerId: triggerId, creator: _trigger.creator, data: _trigger.data}); + _triggerInfo = TriggerInfo({ + triggerId: triggerId, + creator: _trigger.creator, + data: _trigger.data + }); } /// @inheritdoc ISimpleTrigger - function triggerIdsByCreator(address _creator) external view returns (TriggerId[] memory _triggerIds) { + function triggerIdsByCreator( + address _creator + ) external view returns (TriggerId[] memory _triggerIds) { _triggerIds = _triggerIdsByCreator[_creator]; } } diff --git a/src/interfaces/IWavsTrigger.sol b/src/interfaces/IWavsTrigger.sol index fadaf440..1bd9e36d 100644 --- a/src/interfaces/IWavsTrigger.sol +++ b/src/interfaces/IWavsTrigger.sol @@ -21,7 +21,7 @@ interface ISimpleTrigger is ITypes { * @notice Add a new trigger * @param _data The request data (bytes) */ - function addTrigger(bytes memory _data) external; + function addTrigger(string memory _data) external; /** * @notice Get a single trigger by triggerId diff --git a/test_utils/README.md b/test_utils/README.md new file mode 100644 index 00000000..a2ee7604 --- /dev/null +++ b/test_utils/README.md @@ -0,0 +1,66 @@ +# WAVS Component Test Utilities + +This library provides essential 
validation tools for WAVS components. All components **MUST** pass these tests before running the `make wasi-build` command. + +## Overview + +The test_utils component is a collection of utilities and validation scripts to ensure WAVS components meet the required standards and follow best practices. It's designed to catch common errors before they cause build failures or runtime issues. + +## What It Does + +- Validates component structure and implementation +- Checks for common anti-patterns and implementation mistakes +- Provides a standardized way to verify components +- Ensures consistent error handling, data management, and API usage + +## Key Features + +- Automated code analysis +- Comprehensive validation of ABI encoding/decoding +- Data ownership and cloning validation +- Error handling pattern verification +- Network request and API security validation + +## Using the Validation Script + +The main validation script can be used to verify any component: + +```bash +# Validate a component using the Makefile command +make validate-component COMPONENT=your-component-name + +# Or run the script directly +cd test_utils +./validate_component.sh your-component-name +``` + + +## Test Modules + +The test utilities are organized into focused modules: + +| Module | Description | +|--------|-------------| +| `abi_encoding` | Proper handling of ABI-encoded data, avoiding common String::from_utf8 errors | +| `code_quality` | Code quality checks, including detecting unused imports and other best practices | +| `data_handling` | Correct data ownership, cloning, and avoiding moved value errors | +| `error_handling` | Proper Option/Result handling, avoiding map_err on Option errors | +| `network_requests` | HTTP request setup, error handling, and API key management | +| `solidity_types` | Working with Solidity types, numeric conversions, and struct handling | +| `input_validation` | Input data validation, safe decoding, and defensive programming | + +## Common Errors 
Prevented
+
+These tests help you avoid the following common errors:
+
+1. Using `String::from_utf8` directly on ABI-encoded data
+2. Missing Clone derivation on API response structs
+3. Using `map_err()` on Option types instead of `ok_or_else()`
+4. Improper Rust-Solidity type conversions
+5. Ownership issues with collection elements
+6. Using `&data.clone()` pattern creating temporary values
+7. Missing trait imports causing "no method" errors
+8. Ambiguous method calls requiring fully qualified syntax
+9. Unused imports cluttering the code
+10. Direct version specifications instead of workspace dependencies
+
diff --git a/test_utils/validate_component.sh b/test_utils/validate_component.sh
new file mode 100755
index 00000000..776b66fb
--- /dev/null
+++ b/test_utils/validate_component.sh
@@ -0,0 +1,574 @@
+#!/bin/bash
+# Component validation script - IMPROVED VERSION
+# Runs comprehensive test utilities to validate a component before building
+# Catches all common errors that would prevent successful builds or execution
+
+# Don't exit on error; we want to collect all errors
+set +e
+
+# Create an array to hold all errors
+errors=()
+warnings=()
+
+# Function to add an error
+add_error() {
+    errors+=("$1")
+    echo "❌ Error: $1"
+}
+
+# Function to add a warning
+add_warning() {
+    warnings+=("$1")
+    echo "⚠️ Warning: $1"
+}
+
+if [ -z "$1" ]; then
+    echo "Usage: $0 <component-name>"
+    echo "Example: $0 eth-price-oracle"
+    exit 1
+fi
+
+COMPONENT_NAME=$1
+COMPONENT_DIR="../components/$COMPONENT_NAME"
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+
+# Check if component directory exists
+if [ !
-d "$COMPONENT_DIR" ]; then
+    echo "❌ Error: Component directory $COMPONENT_DIR not found"
+    exit 1
+fi
+
+echo "🔍 Validating component: $COMPONENT_NAME"
+
+# Print a section header for better organization
+print_section() {
+    echo
+    echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+    echo "🔍 $1"
+    echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+}
+
+#=====================================================================================
+# ABI ENCODING CHECKS
+#=====================================================================================
+print_section "ABI ENCODING CHECKS"
+
+# 1. Check for String::from_utf8 usage on ABI data in non-generated files
+echo "📝 Checking for common String::from_utf8 misuse..."
+grep_result=$(grep -r "String::from_utf8" "$COMPONENT_DIR/src" --include="*.rs" | grep -v "bindings.rs" | grep -v "test" | grep -v "# CORRECT" || true)
+if [ ! -z "$grep_result" ]; then
+    if grep -r "String::from_utf8.*data" "$COMPONENT_DIR"/src/*.rs | grep -v "bindings.rs" > /dev/null; then
+        error_detail=$(grep -r "String::from_utf8.*data" "$COMPONENT_DIR"/src/*.rs | grep -v "bindings.rs")
+        add_error "Found String::from_utf8 used directly on ABI-encoded data.
+    This will ALWAYS fail with 'invalid utf-8 sequence' because ABI-encoded data is binary.
+    Use proper ABI decoding methods instead:
+    1. For function calls with string params: functionCall::abi_decode()
+    2. For string params: String::abi_decode()
+    $error_detail"
+    else
+        add_warning "Found String::from_utf8 usage. Ensure it's not being used on ABI-encoded data.
+    This can cause runtime errors if used with encoded data. You can ignore this warning if you are using it correctly.
+    $grep_result"
+    fi
+fi
+
+# 1b. Check for proper ABI decoding methods
+echo "📝 Checking for proper ABI decoding methods..."
+if grep -r "TriggerData::Raw" "$COMPONENT_DIR"/src/*.rs > /dev/null || + grep -r "cast abi-encode" "$COMPONENT_DIR" > /dev/null; then + + # Component deals with ABI-encoded input data + if ! grep -r "abi_decode" "$COMPONENT_DIR"/src/*.rs > /dev/null && + ! grep -r "::abi_decode" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + add_error "Component appears to handle ABI-encoded input but doesn't use abi_decode methods. + This will cause runtime errors when processing ABI-encoded data. + For ABI-encoded input, use proper decoding methods: + 1. ::abi_decode(&hex_data) + 2. ::abi_decode(&data) + 3. functionCall::abi_decode(&data)" + fi + + # Check for Solidity function definitions when receiving function calls + if grep -r "cast abi-encode \"f(string)" "$COMPONENT_DIR" > /dev/null && + ! grep -r "function.*external" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + add_error "Component receives ABI-encoded function calls but doesn't define Solidity functions. + This will cause runtime errors when trying to decode function calls. + Define appropriate Solidity functions to decode inputs, for example: + sol! { + function checkBalance(string address) external; + }" + fi +fi + +#===================================================================================== +# DATA HANDLING CHECKS +#===================================================================================== +print_section "DATA HANDLING CHECKS" + +# 2a. Check for proper Clone derivation on API structs used with network requests +echo "📝 Checking for Clone derivation on structs..." 
+# Look for structs used in HTTP responses
+HTTP_USAGE=$(if grep -r "fetch_json\|http_request_get" "$COMPONENT_DIR"/src/*.rs > /dev/null 2>&1; then echo "1"; else echo "0"; fi)
+
+# Find structs with Deserialize but missing Clone
+STRUCTS_WITH_DERIVE=$(grep -r -B 2 "struct" "$COMPONENT_DIR/src" | grep "derive" || true)
+STRUCTS_WITH_DESERIALIZE=$(echo "$STRUCTS_WITH_DERIVE" | grep "Deserialize" || true)
+STRUCTS_WITHOUT_CLONE=$(echo "$STRUCTS_WITH_DESERIALIZE" | grep -v "Clone" || true)
+
+if [ ! -z "$STRUCTS_WITHOUT_CLONE" ]; then
+    # Check if any struct without Clone is used more than once
+    STRUCT_USAGE_ERROR=false
+
+    # Extract struct names from the output
+    while read -r line; do
+        # grep -B output lines look like "path/file.rs-#[derive(...)]"; the
+        # struct definition is on the line after the derive attribute, so look
+        # the attribute up again with one line of trailing context
+        DERIVE_ATTR=$(echo "$line" | sed -E 's/^.*\.rs[-:][[:space:]]*//')
+        if [ ! -z "$DERIVE_ATTR" ]; then
+            STRUCT_NAME=$(grep -h -A 1 -F "$DERIVE_ATTR" "$COMPONENT_DIR"/src/*.rs | grep "struct" | head -n 1 | sed -E 's/.*struct[[:space:]]+([A-Za-z0-9_]+).*/\1/')
+
+            if [ ! -z "$STRUCT_NAME" ]; then
+                # Count usages of this struct (excluding declaration and imports)
+                USAGE_COUNT=$(grep -r "$STRUCT_NAME" "$COMPONENT_DIR"/src/*.rs | grep -v "struct $STRUCT_NAME" | grep -v "use.*$STRUCT_NAME" | wc -l)
+
+                # If used multiple times or in JSON handling, it should have Clone
+                if [ "$USAGE_COUNT" -gt 2 ] || grep -q "serde_json.*$STRUCT_NAME" "$COMPONENT_DIR"/src/*.rs; then
+                    STRUCT_USAGE_ERROR=true
+                    break
+                fi
+            fi
+        fi
+    done <<< "$STRUCTS_WITHOUT_CLONE"
+
+    # If HTTP request component or multiple usages detected, make it an error
+    if [ "$HTTP_USAGE" != "0" ] && [ "$STRUCT_USAGE_ERROR" = true ]; then
+        add_error "Found structs with Deserialize but missing Clone derivation that are used multiple times:
+    $STRUCTS_WITHOUT_CLONE
+
+    Structs used multiple times with API responses MUST derive Clone to prevent ownership errors.
+ Fix: Add Clone to the derive list like this: + #[derive(Serialize, Deserialize, Debug, Clone)]" + else + add_warning "Found structs with Deserialize but missing Clone derivation: + $STRUCTS_WITHOUT_CLONE + + Consider adding Clone for consistency: + #[derive(Serialize, Deserialize, Debug, Clone)]" + fi +fi + +# 2b. Check for temporary clone pattern (&data.clone()) +echo "📝 Checking for incorrect &data.clone() pattern..." +TEMP_CLONE_PATTERN=$(grep -r "&.*\.clone()" "$COMPONENT_DIR"/src/*.rs || true) +if [ ! -z "$TEMP_CLONE_PATTERN" ]; then + add_error "Found dangerous &data.clone() pattern which creates temporary values that are immediately dropped. + This pattern causes ownership issues because the cloned data is immediately dropped. + Fix: Create a named variable to hold the cloned data instead: + WRONG: let result = std::str::from_utf8(&data.clone()); + RIGHT: let data_clone = data.clone(); + let result = std::str::from_utf8(&data_clone); + $TEMP_CLONE_PATTERN" +fi + +# 2c. Check for potential "move out of index" errors +echo "📝 Checking for potential 'move out of index' errors..." +MOVE_OUT_INDEX=$(grep -r "\[.*\]\..*" "$COMPONENT_DIR"/src/*.rs | grep -v "\.clone()" | grep -v "\.as_ref()" | grep -v "&" | grep -v "bindings.rs" || true) +if [ ! -z "$MOVE_OUT_INDEX" ]; then + add_error "Found potential 'move out of index' errors - accessing collection elements without cloning. + When accessing fields from elements in a collection, you should clone the field to avoid + moving out of the collection, which would make the collection unusable afterward. 
+ WRONG: let field = collection[0].field; // This moves the field out of the collection + RIGHT: let field = collection[0].field.clone(); // This clones the field + $MOVE_OUT_INDEX" +fi + +#===================================================================================== +# ERROR HANDLING CHECKS +#===================================================================================== +print_section "ERROR HANDLING CHECKS" + +# 3a. Check for map_err on Option types - focus only on get_evm_chain_config specifically +echo "📝 Checking for map_err on Option types..." +MAP_ERR_CHAIN_CONFIG=$(grep -r "get_evm_chain_config" "$COMPONENT_DIR"/src/*.rs | grep "map_err" | grep -v "ok_or_else" 2>/dev/null || true) + +if [ ! -z "$MAP_ERR_CHAIN_CONFIG" ]; then + add_error "Found map_err used directly on get_evm_chain_config which returns Option, not Result. + Option types don't have map_err method - it's only available on Result types. + WRONG: get_evm_chain_config(\"ethereum\").map_err(|e| e.to_string())? + RIGHT: get_evm_chain_config(\"ethereum\").ok_or_else(|| \"Failed to get config\".to_string())? + $MAP_ERR_CHAIN_CONFIG" +fi + +#===================================================================================== +# IMPORT CHECKS +#===================================================================================== +print_section "IMPORT CHECKS" + +# 4a. Check for proper import of essential traits and types +echo "📝 Checking for essential imports..." +if grep -r "FromStr" "$COMPONENT_DIR"/src/*.rs > /dev/null && ! grep -r "use std::str::FromStr" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + FROMSTR_USAGE=$(grep -r "FromStr" "$COMPONENT_DIR"/src/*.rs | grep -v "use std::str::FromStr" || true) + add_error "Found FromStr usage but std::str::FromStr is not imported. + This will cause a compile error when using methods like from_str or parse(). + Fix: Add 'use std::str::FromStr;' to your imports. + $FROMSTR_USAGE" +fi + +# 4b. 
Check for min function usage without import +if grep -r "min(" "$COMPONENT_DIR"/src/*.rs > /dev/null && ! grep -r "use std::cmp::min" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + MIN_USAGE=$(grep -r "min(" "$COMPONENT_DIR"/src/*.rs | grep -v "use std::cmp::min" || true) + add_error "Found min function usage but std::cmp::min is not imported. + This will cause a compile error when using min(). + Fix: Add 'use std::cmp::min;' to your imports. + $MIN_USAGE" +fi + +# 4c. Check for TxKind import issues +if grep -r "alloy_rpc_types::eth::TxKind" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + TXKIND_USAGE=$(grep -r "alloy_rpc_types::eth::TxKind" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Found incorrect TxKind import path. Use alloy_primitives::TxKind instead of alloy_rpc_types::eth::TxKind. + This is a critical error that will prevent component compilation. + Fix: 1. Add 'use alloy_primitives::{Address, TxKind, U256};' (or add TxKind to existing import) + 2. Replace 'alloy_rpc_types::eth::TxKind::Call' with 'TxKind::Call' + $TXKIND_USAGE" +fi + +# 4d. Check for TxKind usage without import +if grep -r "::Call" "$COMPONENT_DIR"/src/*.rs > /dev/null && ! grep -r "use.*TxKind" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + CALL_USAGE=$(grep -r "::Call" "$COMPONENT_DIR"/src/*.rs | grep -v "use.*TxKind" || true) + add_error "Found TxKind usage but TxKind is not properly imported. + Fix: Add 'use alloy_primitives::TxKind;' to your imports. + $CALL_USAGE" +fi + +# 4e. Check for block_on usage without the correct import - improved to handle grouped imports +echo "📝 Checking for block_on import..." 
+if grep -r "block_on" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + # Check both formats: direct import and grouped import + DIRECT_IMPORT=$(grep -r "use wstd::runtime::block_on" "$COMPONENT_DIR"/src/*.rs || true) + GROUPED_IMPORT=$(grep -r "use wstd::{.*runtime::block_on" "$COMPONENT_DIR"/src/*.rs || true) + RUNTIME_IMPORT=$(grep -r "use wstd::.*runtime" "$COMPONENT_DIR"/src/*.rs || true) + + if [ -z "$DIRECT_IMPORT" ] && [ -z "$GROUPED_IMPORT" ] && [ -z "$RUNTIME_IMPORT" ]; then + BLOCK_ON_USAGE=$(grep -r "block_on" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Found block_on usage but wstd::runtime::block_on is not imported. + This will cause a compile error when using async functions. + Fix: Add 'use wstd::runtime::block_on;' to your imports. + $BLOCK_ON_USAGE" + fi +fi + +# 4f. Check for HTTP function imports +if grep -r "http_request_" "$COMPONENT_DIR"/src/*.rs > /dev/null || grep -r "fetch_json" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + # Check for both direct import and grouped import patterns + DIRECT_HTTP_IMPORT=$(grep -r "use wavs_wasi_utils::http::" "$COMPONENT_DIR"/src/*.rs || true) + GROUPED_HTTP_IMPORT=$(grep -r "use wavs_wasi_utils::{.*http::{.*fetch_json\|.*http_request_" "$COMPONENT_DIR"/src/*.rs || true) + + if [ -z "$DIRECT_HTTP_IMPORT" ] && [ -z "$GROUPED_HTTP_IMPORT" ]; then + HTTP_USAGE=$(grep -r "http_request_\|fetch_json" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Found HTTP function usage but wavs_wasi_utils::http is not imported. + Fix: Add 'use wavs_wasi_utils::http::{fetch_json, http_request_get};' to your imports. + $HTTP_USAGE" + fi +fi + +# 4g. Check for SolCall trait missing when using abi_encode +if grep -r "abi_encode" "$COMPONENT_DIR"/src/*.rs > /dev/null && ! 
grep -r "use.*SolCall" "$COMPONENT_DIR"/src/*.rs > /dev/null; then
+    if grep -r "Call.*abi_encode" "$COMPONENT_DIR"/src/*.rs > /dev/null; then
+        CALL_ABI_USAGE=$(grep -r "Call.*abi_encode" "$COMPONENT_DIR"/src/*.rs || true)
+        add_error "Found Call::abi_encode usage but SolCall trait is not imported.
+    Function calls require the SolCall trait for encoding.
+    Fix: Add 'use alloy_sol_types::{SolCall, SolValue};' to your imports.
+    $CALL_ABI_USAGE"
+    fi
+fi
+
+# Check that every imported crate is declared in Cargo.toml
+echo "📝 Checking for missing dependencies..."
+# Get all local module names (mod foo;) from src/*.rs
+LOCAL_MODS=$(grep -h -E '^mod ' "$COMPONENT_DIR"/src/*.rs | sed -E 's/^mod ([a-zA-Z0-9_]+);/\1/' | sort | uniq)
+# Add known local modules
+LOCAL_MODS="$LOCAL_MODS trigger bindings"
+# Get all imports from the code, extract just the crate names
+IMPORTS=$(grep -h -r "^use" "$COMPONENT_DIR"/src/*.rs | \
+    sed -E 's/^use[[:space:]]+//' | \
+    sed -E 's/ as [^;]+//' | \
+    sed -E 's/[{].*//' | \
+    sed -E 's/;.*//' | \
+    cut -d: -f1 | \
+    awk -F'::' '{print $1}' | \
+    grep -vE '^(crate|self|super|std|core|wavs_wasi_utils|wstd)$' | \
+    sort | uniq)
+
+# Check each import against Cargo.toml dependencies
+for import in $IMPORTS; do
+    # Skip empty lines
+    if [[ -z "$import" ]]; then
+        continue
+    fi
+    # Skip local modules
+    if echo "$LOCAL_MODS" | grep -wq "$import"; then
+        continue
+    fi
+    # Convert import name to Cargo.toml format (replace underscores with hyphens)
+    cargo_name=$(echo "$import" | tr '_' '-')
+    # Check if the import is in Cargo.toml (either directly or as a workspace dependency)
+    if ! grep -q "$cargo_name.*=.*{.*workspace.*=.*true" "$COMPONENT_DIR/Cargo.toml" && ! grep -q "$cargo_name.*=.*\"" "$COMPONENT_DIR/Cargo.toml"; then
+        add_error "Import '$import' is used but not found in Cargo.toml dependencies.
+    Add it to your [dependencies] section in Cargo.toml and to [workspace.dependencies] in the root Cargo.toml."
+ fi +done + +#===================================================================================== +# COMPONENT STRUCTURE CHECKS +#===================================================================================== +print_section "COMPONENT STRUCTURE CHECKS" + +# 5a. Check for proper export! macro usage and syntax +echo "📝 Checking for proper component export..." +if ! grep -r "export!" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + add_error "export! macro not found. Components must use export! macro. + Fix: Add 'export!(YourComponent with_types_in bindings);' to your component." +fi + +# 5b. Check for correct export! macro syntax with with_types_in +if grep -r "export!" "$COMPONENT_DIR"/src/*.rs > /dev/null && ! grep -r "export!.*with_types_in bindings" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + EXPORT_USAGE=$(grep -r "export!" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Incorrect export! macro syntax. Use 'export!(YourComponent with_types_in bindings)' instead of just 'export!(YourComponent)'. + Fix: Update to 'export!(YourComponent with_types_in bindings);' + $EXPORT_USAGE" +fi + +# 5c. Check for TriggerAction structure usage issues +echo "📝 Checking for TriggerAction structure usage..." +if grep -r "trigger.trigger_data" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + TRIGGER_DATA_USAGE=$(grep -r "trigger.trigger_data" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Component accesses non-existent 'trigger_data' field on TriggerAction. Use 'trigger.data' instead. + $TRIGGER_DATA_USAGE" +fi + +# 5d. 
Check for incorrect match pattern on trigger.data (treating it as Option) +if grep -r -A 5 -B 2 "match trigger.data" "$COMPONENT_DIR"/src/*.rs 2>/dev/null | grep -q "Some(" && + grep -r -A 8 -B 2 "match trigger.data" "$COMPONENT_DIR"/src/*.rs 2>/dev/null | grep -q "None =>"; then + TRIGGER_MATCH=$(grep -r -A 5 -B 2 "match trigger.data" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Component incorrectly treats 'trigger.data' as an Option, but it's a TriggerData. + The field is not optional - don't match against Some/None patterns. + $TRIGGER_MATCH" +fi + +# 5e. Check for Guest trait implementation +if ! grep -r "impl Guest for" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + add_error "Guest trait implementation not found. Components must implement the Guest trait. + Fix: Add 'impl Guest for YourComponent { fn run(trigger: TriggerAction) -> Result, String> { ... } }'" +fi + +# 5f. Check for run function with correct signature - improved to accept variations in naming/qualification +if ! grep -r "fn run(.*TriggerAction.*) -> .*Result, String>" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + add_error "run function with correct result signature not found. + The run function must return std::result::Result, String>" +fi + +#===================================================================================== +# SECURITY CHECKS +#===================================================================================== +print_section "SECURITY CHECKS" + +# 6a. Check for hardcoded API keys +echo "📝 Checking for hardcoded API keys..." +API_KEYS=$(grep -r "key=.*[0-9a-zA-Z]\{8,\}" "$COMPONENT_DIR" --include="*.rs" || true) +if [ ! -z "$API_KEYS" ]; then + add_error "Found possible hardcoded API key. Use environment variables instead. + Fix: Use std::env::var(\"WAVS_ENV_YOUR_API_KEY\") to get API keys from environment variables. + $API_KEYS" +fi + +# 6b. 
Check for other potential hardcoded secrets +OTHER_SECRETS=$(grep -r "token=\|secret=\|password=" "$COMPONENT_DIR" --include="*.rs" | grep "[0-9a-zA-Z]\{8,\}" || true) +if [ ! -z "$OTHER_SECRETS" ]; then + add_error "Found possible hardcoded secret. Use environment variables instead. + Fix: Use std::env::var(\"WAVS_ENV_YOUR_SECRET\") to get secrets from environment variables. + $OTHER_SECRETS" +fi + +#===================================================================================== +# DEPENDENCIES CHECKS +#===================================================================================== +print_section "DEPENDENCIES CHECKS" + +# 7. Check for proper workspace dependency usage +echo "📝 Checking for proper workspace dependency usage..." +VERSION_NUMBERS=$(grep -r "version = \"[0-9]" "$COMPONENT_DIR/Cargo.toml" || true) +if [ ! -z "$VERSION_NUMBERS" ]; then + add_error "Found direct version numbers in Cargo.toml. Use workspace = true instead. + Fix: Replace version numbers with { workspace = true } for all dependencies. + WRONG: some-crate = \"0.1.0\" + RIGHT: some-crate = { workspace = true } + $VERSION_NUMBERS" +fi + +#===================================================================================== +# CODE QUALITY CHECKS +#===================================================================================== +print_section "CODE QUALITY CHECKS" + +# 8. Check for unused imports with cargo check +echo "📝 Checking for unused imports and code issues..." +cd "$SCRIPT_DIR/.." +COMPONENT_NAME_SIMPLE=$(basename "$COMPONENT_DIR") + +# Run cargo check and capture any errors (not just warnings) +CARGO_OUTPUT=$(cargo check -p "$COMPONENT_NAME_SIMPLE" 2>&1) +CARGO_ERRORS=$(echo "$CARGO_OUTPUT" | grep -i "error:" | grep -v "generated file bindings.rs" || true) + +if [ ! 
-z "$CARGO_ERRORS" ]; then + add_error "cargo check found compilation errors: + $CARGO_ERRORS" +fi + +# Show warnings but don't fail on them +CARGO_WARNINGS=$(echo "$CARGO_OUTPUT" | grep -i "warning:" | grep -v "profiles for the non root package" || true) +if [ ! -z "$CARGO_WARNINGS" ]; then + add_warning "cargo check found warnings that might indicate issues: + $CARGO_WARNINGS" +fi + +cd "$SCRIPT_DIR" + +#===================================================================================== +# SOLIDITY TYPES CHECKS +#===================================================================================== +print_section "SOLIDITY TYPES CHECKS" + +# 9a. Check for sol! macro usage without proper import +echo "📝 Checking for sol! macro imports..." +if grep -r "sol!" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + if ! grep -r "use alloy_sol_types::sol" "$COMPONENT_DIR"/src/*.rs > /dev/null && ! grep -r "use alloy_sol_macro::sol" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + SOL_USAGE=$(grep -r "sol!" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Found sol! macro usage but neither alloy_sol_types::sol nor alloy_sol_macro::sol is imported. + Fix: Add 'use alloy_sol_types::sol;' to your imports. + $SOL_USAGE" + fi +fi + +# 9b. Check for solidity module structure +echo "📝 Checking for proper solidity module structure..." +if grep -r "sol::" "$COMPONENT_DIR"/src/*.rs > /dev/null && ! grep -r "mod solidity" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + SOL_NAMESPACE=$(grep -r "sol::" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Found 'sol::' namespace usage without defining a 'solidity' module. + Fix: Create a proper module structure like: + mod solidity { + use alloy_sol_types::sol; + sol! { /* your solidity types */ } + } + $SOL_NAMESPACE" +fi + +# 9c. Check for string literals assigned to String type fields in structs +echo "📝 Checking for string literal to String conversions..." 
+# Look for patterns like 'field: "string literal",' in struct initializations +# Only check lib.rs to avoid auto-generated bindings.rs +if [ -f "$COMPONENT_DIR/src/lib.rs" ]; then + STRING_FIELDS=$(grep -A 20 "pub struct" "$COMPONENT_DIR/src/lib.rs" | grep -E "^\s*pub\s+[a-zA-Z0-9_]+:\s+String," | sed -E 's/^\s*pub\s+([a-zA-Z0-9_]+):\s+String,.*/\1/' || true) + + if [ ! -z "$STRING_FIELDS" ]; then + # For each string field, check for literals without to_string() + for FIELD in $STRING_FIELDS; do + # Skip if field name is empty or contains special characters + if [[ "$FIELD" =~ ^[a-zA-Z0-9_]+$ ]]; then + # Look for patterns like 'field: "literal",' without to_string() + STRING_LITERAL_USAGE=$(grep -r "$FIELD: \"" "$COMPONENT_DIR"/src/lib.rs | grep -v "\.to_string()" || true) + + if [ ! -z "$STRING_LITERAL_USAGE" ]; then + add_error "Found string literals assigned directly to String type fields without .to_string() conversion: + $STRING_LITERAL_USAGE + + This will cause a type mismatch error because &str cannot be assigned to String. + Fix: Always convert string literals to String type using .to_string(): + WRONG: field: \"literal string\", + RIGHT: field: \"literal string\".to_string()," + break + fi + fi + done + fi +fi + +#===================================================================================== +# STRING SAFETY CHECKS +#===================================================================================== +print_section "STRING SAFETY CHECKS" + +# 10a. Check for unbounded string.repeat operations +echo "📝 Checking for string capacity overflow risks..." + +# First, collect all .repeat() calls - simpler approach to catch all possible cases +REPEAT_CALLS=$(grep -r "\.repeat(" "$COMPONENT_DIR"/src/*.rs || true) + +if [ ! 
-z "$REPEAT_CALLS" ]; then + # Look for any .repeat() calls with potentially unsafe variables + RISKY_REPEAT_PATTERNS="decimals\|padding\|len\|size\|count\|width\|height\|indent\|offset\|spaces\|zeros\|chars\|digits" + + # Check for specific safety patterns + SAFETY_PATTERNS="std::cmp::min\|::min(\|min(\|// SAFE:" + + # Check if any .repeat call doesn't use a safety bound + UNSAFE_REPEATS=$(echo "$REPEAT_CALLS" | grep -i "$RISKY_REPEAT_PATTERNS" | grep -v "$SAFETY_PATTERNS" || true) + + if [ ! -z "$UNSAFE_REPEATS" ]; then + add_error "Found potentially unbounded string.repeat operations: +$UNSAFE_REPEATS + +This can cause capacity overflow errors. Options to fix: + 1. Add a direct safety check: \".repeat(std::cmp::min(variable, 100))\" + 2. Use a bounded variable: \"let safe_value = std::cmp::min(variable, MAX_SIZE); .repeat(safe_value)\" + 3. Add a safety comment if manually verified: \"// SAFE: bounded by check above\"" + fi +fi + +#===================================================================================== +# NETWORK REQUEST CHECKS +#===================================================================================== +print_section "NETWORK REQUEST CHECKS" + +# 11a. Check for proper block_on usage with async functions +echo "📝 Checking for proper async handling..." +if grep -r "async fn" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + if ! grep -r "block_on" "$COMPONENT_DIR"/src/*.rs > /dev/null; then + ASYNC_USAGE=$(grep -r "async fn" "$COMPONENT_DIR"/src/*.rs || true) + add_error "Found async functions but no block_on usage. 
+    Async functions must be called with block_on in component run functions:
+    block_on(async { make_request().await })
+    $ASYNC_USAGE"
+    fi
+fi
+
+#=====================================================================================
+# FINAL SUCCESS MESSAGE
+#=====================================================================================
+print_section "VALIDATION SUMMARY"
+
+# Check if there are any errors or warnings
+ERROR_COUNT=${#errors[@]}
+WARNING_COUNT=${#warnings[@]}
+
+if [ $ERROR_COUNT -gt 0 ]; then
+    echo "❌ Component validation failed with $ERROR_COUNT errors and $WARNING_COUNT warnings."
+    echo
+    echo "⚠️ YOU MUST FIX ALL ERRORS BEFORE RUNNING 'make wasi-build'."
+    echo "   Failure to fix these issues will result in build or runtime errors."
+    exit 1
+else
+    if [ $WARNING_COUNT -gt 0 ]; then
+        echo "⚠️ Component validation passed with $WARNING_COUNT warnings."
+        echo "   Consider fixing these warnings to improve your component's reliability."
+    else
+        echo "✅ Component validation checks complete! No errors or warnings found."
+    fi
+
+    echo "🚀 Component is ready for building. Run the following command to build:"
+    echo "   cd ../.. && make wasi-build"
+fi
+
+# Final compilation test against the WASI target
+echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+echo "🔍 CARGO CHECK (compilation test)"
+echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+cargo check --manifest-path "$(pwd)/../components/$COMPONENT_NAME/Cargo.toml" --target wasm32-wasip1
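
The script ends with a bare `cargo check`, so its failure only reaches the caller as the script's exit status and is never folded into the summary printed above. One possible refinement, sketched here as a standalone helper (the function name and messages are illustrative, not part of the template), wraps the final compilation test so its result is reported in the script's own style and its status is propagated explicitly:

```shell
#!/bin/bash
# Sketch: wrap the final compilation test so a failure is both reported in the
# script's own message style and returned to the caller (CI, make targets).
# The helper name and message wording are assumptions, not from the template.
final_cargo_check() {
    local component="$1"
    if ! cargo check --manifest-path "../components/$component/Cargo.toml" --target wasm32-wasip1; then
        echo "❌ cargo check failed for $component"
        return 1
    fi
    echo "✅ cargo check passed for $component"
    return 0
}
```

The last line of the script could then read `final_cargo_check "$COMPONENT_NAME"`, keeping the compilation result consistent with the validation verdict.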