Wasm Agents Blueprint

This Blueprint bridges the gap between powerful Python AI frameworks and browser-based applications by running the OpenAI Agents Python SDK in WebAssembly (Wasm) through Pyodide. While you'll still use external (even local!) LLMs for inference, you can experience the power of Python-based AI agents without any server-side setup. No server configuration, Docker containers, or complex deployments: just open an HTML file.
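Conceptually, each demo HTML file boots a Python runtime directly in the page. A minimal sketch of that pattern is shown below; the Pyodide version and package name here are assumptions, so check the actual demo files for the exact values:

```html
<!-- Sketch only: load Pyodide, install the agents SDK via micropip,
     then run Python in the browser. Version and package name are
     assumptions; see the demo HTML files for the real ones. -->
<script src="https://cdn.jsdelivr.net/pyodide/v0.27.0/full/pyodide.js"></script>
<script type="module">
  const pyodide = await loadPyodide();
  await pyodide.loadPackage("micropip");
  await pyodide.runPythonAsync(`
    import micropip
    await micropip.install("openai-agents")
  `);
</script>
```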

Pre-requisites

  • System requirements:

    • OS: Windows, macOS, or Linux
    • Modern web browser with WebAssembly support (Chrome 57+, Firefox 52+, Safari 11+, Edge 16+)
  • Model Access:

    • a local LLM served via Ollama or LM Studio (for the local_model example)

    OR

    • an API key for OpenAI models

Quick-start

The agents available in this repository use the OpenAI API to communicate with any compatible LLM server:

  • the local_model.html example relies on open-weights models (e.g. Qwen3-8b, Devstral Small 2507) served locally;
  • the other examples make use of the default model configured in the library (currently OpenAI's gpt-4o).
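Because every target server speaks the OpenAI API, the only thing that changes between providers is the base URL and model name. A hedged sketch of the request shape (`buildChatRequest` is a hypothetical helper for illustration, not part of the blueprint code):

```javascript
// Any OpenAI-compatible server exposes POST {base}/chat/completions.
// This helper just assembles that request; it is illustrative only.
function buildChatRequest(baseUrl, model, messages) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages }),
  };
}

// Usage (requires a running server, e.g. Ollama on port 11434):
// const req = buildChatRequest("http://localhost:11434/v1", "qwen3:8b",
//   [{ role: "user", content: "Hello!" }]);
// fetch(req.url, { method: "POST", headers: req.headers, body: req.body });
```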

Here's how you can run these agents in your own browser:

  1. Clone the repository:

    git clone https://github.com/mozilla-ai/wasm-agents-blueprint.git
    cd wasm-agents-blueprint/demos
  2. Configure your API key (required for gpt models only):

    • Get your OpenAI API key
    • Copy it into config.js:
      window.APP_CONFIG = {
          OPENAI_API_KEY: 'your-api-key-here'
      };
  3. Start serving a local model (required for the local_model example only):

    OLLAMA_CONTEXT_LENGTH=40000 OLLAMA_ORIGINS="*" ollama serve
    • for LM Studio, customize context length in the Edit model default parameters section and make sure Enable CORS is active in the Developer / model serving section.
  4. Open one of the following HTML files directly in your browser:

    • hello_agent.html - Basic agent example
    • handoff_demo.html - Multi-agent handoff system
    • tool_calling.html - Tool calling agent with web scraping and character counting capabilities
    • local_model.html - Tool calling agent with local model support

Available Demos

Basic Agent (hello_agent.html)

Simple conversational agent with customizable instructions. Useful for understanding the basics of Wasm-based agents.

Agent Handoff (handoff_demo.html)

Multi-agent system that routes requests to specialized agents based on the prompt's characteristics.
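The actual demo uses the SDK's handoff mechanism; the triage idea behind it can be sketched as a plain routing function (the agent names and keyword rules below are invented for illustration):

```javascript
// Illustrative only: a triage step that picks a specialist "agent" based
// on the prompt's characteristics. The real demo delegates this decision
// to the SDK's handoff mechanism rather than hard-coded keywords.
const specialists = {
  math: (prompt) => `math agent handling: ${prompt}`,
  code: (prompt) => `coding agent handling: ${prompt}`,
};

function routePrompt(prompt) {
  if (/\d|equation|calculate/i.test(prompt)) return specialists.math(prompt);
  if (/code|function|bug/i.test(prompt)) return specialists.code(prompt);
  return `general agent handling: ${prompt}`;
}
```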

Tool-Calling Agent (tool_calling.html)

Advanced agent with built-in tools for practical tasks:

  • count_character_occurrences: addresses the famous "How many Rs in strawberry?" problem :-)
  • visit_webpage: downloads web content and converts it to markdown
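The counting tool's logic is simple enough to sketch; note that the real tool is a Python function registered with the Agents SDK, so the version below only mirrors the behavior described above:

```javascript
// Sketch of the character-counting tool's behavior: count case-insensitive
// occurrences of one character in a string. The real tool is Python code
// exposed to the agent as a callable tool.
function countCharacterOccurrences(text, character) {
  let count = 0;
  for (const ch of text.toLowerCase()) {
    if (ch === character.toLowerCase()) count++;
  }
  return count;
}

// countCharacterOccurrences("strawberry", "r") answers the famous question.
```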

Local Model Agent (local_model.html)

Run agents with local models via Ollama or LM Studio, ensuring higher privacy and offline capability.
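Pointing the agent at a local server only requires swapping the base URL and model name. The values below are assumptions for default Ollama and LM Studio setups (ports 11434 and 1234 respectively); the demo's actual config fields may differ:

```javascript
// Illustrative defaults for common local servers; no real API key is
// needed, but OpenAI-style clients usually require a non-empty string.
function localModelConfig(provider) {
  const endpoints = {
    ollama: { baseUrl: "http://localhost:11434/v1", model: "qwen3:8b" },
    lmstudio: { baseUrl: "http://localhost:1234/v1", model: "devstral-small" },
  };
  return { ...endpoints[provider], apiKey: "not-needed" };
}
```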

Troubleshooting

Common Issues:

  1. CORS Errors: the agent replies it cannot access some Web resources (examples in the pictures below).

[Screenshots: Firefox showing the agent responses "I am unable to access the GitHub page for..." and "It seems there was an issue connecting to the website...", with the Console and Network tabs open showing CORS errors.]

This happens when you ask the agent to do something with data retrieved via the visit_webpage tool. The fix depends on the browser: the CORS Everywhere extension for Firefox, or Cross Domain - CORS for Chrome. For Safari, open Settings, choose Advanced -> Show features for Web developers, then open the Developer tab and check Disable cross-origin restrictions.

  2. Pyodide Loading Issues: Ensure a stable internet connection for the initial package downloads (this is required even if you are planning to hit a local LLM)

  3. API Key Problems: Verify your OpenAI API key is correctly set in config.js

  4. Issues With Local Models: For Ollama, make sure you enable CORS and set a context length larger than the default one:

    OLLAMA_CONTEXT_LENGTH=40000 OLLAMA_ORIGINS="*" ollama serve

    For LM Studio, make sure Enable CORS is active in the Developer / model serving section.

License

This project is licensed under the Apache 2.0 License. See the LICENSE file for details.

Contributing

Contributions are welcome! To get started, you can check out the CONTRIBUTING.md file.

Acknowledgments

This Blueprint is built on top of Pyodide and the OpenAI Agents Python SDK.
