OnChainAI's purpose is to provide a fully decentralized way for smart contracts to interact with AI onchain.
A running demo of OnChainAI using this extension (along with the Scaffold-ETH Fleek extension) is available on IPFS here:

- OnChainAI extension is a Scaffold-eth-2 extension, allowing you to develop Dapps using OpenAI GPT
- OnChainAI protocol is an onchain solution for any smart contract to make AI calls
- OnChainAI uses OpenAI GPT-4o-mini with Chainlink Functions. Each OpenAI request launched by OnChainAI is run by multiple Chainlink servers that have to reach consensus to return a unique answer. The Chainlink answer can be retrieved only after a few blocks, and may take more than one minute, depending on the network.
- OnChainAI is not free (on mainnet), as Chainlink requires some LINK tokens and OpenAI requires some dollars. The default pricing is a fixed 0.0002 ETH per request, but this will be changed in the future to a more dynamic pricing model.
- You can use the OnChainAI protocol as is, with the contracts already deployed, or you can deploy your own, in which case you can set your own configuration and decide on the price of AI requests.
- The OnChainAI extension comes with a Hardhat setup and 3 specific AI tasks to help you start with the OnChainAI protocol.
Install via this command:

$ npx create-eth@latest -e kredeum/onchain-ai-extension

Then run the following commands to initialize the new repo:

$ cd <your new repo>
$ ./init.sh

Finally, run the classic Scaffold-eth-2 commands in 3 different terminals:

$ yarn chain
$ yarn deploy
$ yarn start

In all these commands, use the hardhat option --network <NETWORK> to specify the network you want to use.
Note that OnChainAI will not work on the hardhat network (no Chainlink there...), so rather use a testnet like baseSepolia or optimismSepolia for your tests (avoid Sepolia, which is slower).
You can send your prompt to OnChainAI in different ways:

- using the debug page of Scaffold-eth-2 (out of the box)
- using the OnChainAI UI included in this extension, via the menu link in Scaffold-eth-2
- using the hardhat ai request task
- via your smart contracts using the OnChainAI protocol
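The last option — calling OnChainAI from your own smart contract — can be sketched as below. This is only an illustrative sketch: the interface name `IOnChainAI` and the function name `sendRequest` are assumptions for illustration, not the actual OnChainAI interface; check the deployed OnChainAI contract in `packages/hardhat` for the real one.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical interface — the real OnChainAI contract may differ.
interface IOnChainAI {
    // Sends a prompt; payable to cover the per-request price (e.g. 0.0002 ETH).
    function sendRequest(string calldata prompt) external payable;
}

contract MyAIConsumer {
    IOnChainAI public immutable onChainAI;

    constructor(address onChainAIAddress) {
        onChainAI = IOnChainAI(onChainAIAddress);
    }

    // Ask a simple question; keep prompts short so the Chainlink DON can reach consensus.
    function ask(string calldata prompt) external payable {
        onChainAI.sendRequest{value: msg.value}(prompt);
    }
}
```

Remember that the Chainlink answer arrives only after a few blocks, so the response has to be read (or delivered via a callback) in a later transaction.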
You can run a hardhat AI task with yarn hardhat --network <NETWORK> ai <TASK>.
3 tasks are available: 1 for users (request) and 2 for the OnChainAI admin (secrets and config).
AVAILABLE TASKS:
config Display [and update] OnChainAI config
request Read last OnChainAI response [and send OnChainAI request]
secrets Upload OnChainAI secrets to Chainlink
ai: OnChainAI with Chainlink and OpenAI

Main task, to be used to send your prompt:
Ex: yarn hardhat --network baseSepolia ai request --prompt "13 time 5 equal ?"
Usage: hardhat [GLOBAL OPTIONS] ai request [--prompt <STRING>]
OPTIONS:
--prompt OpenAI prompt request for Chainlink
request: Read last OnChainAI response [and send OnChainAI request]

Admin task, to be used to upload your secrets to Chainlink:
Ex: yarn hardhat --network baseSepolia ai secrets --expiration 10
Usage: hardhat [GLOBAL OPTIONS] ai secrets [--expiration <INT>]
OPTIONS:
--expiration Expiration time in minutes of uploaded secrets (default: 60)
secrets: Upload OnChainAI secrets to Chainlink

Admin task, to manage OnChainAI configuration:
Ex: yarn hardhat --network baseSepolia ai config --price 0.0002
Usage: hardhat [GLOBAL OPTIONS] ai config [--chainname <STRING>] [--donid <INT>] [--explorer <STRING>] [--router <STRING>] [--rpc <STRING>] [--subid <INT>]
OPTIONS:
--chainname Chain name
--donid Chainlink DON Id
--explorer Chain explorer url
--router Chainlink router address
--rpc Base Rpc url
--subid Chainlink Subscription Id
config: Display [and update] OnChainAI config

Any updated value will be written to the config file, and stored onchain for donid and subid.
The router address must be set before deploying a new version of the OnChainAI contract.
The config file can be found at packages/hardhat/chainlink/config.json
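The config file might look like the following sketch. The field names mirror the options of the config task above; the values are placeholders for illustration, not real deployment values.

```json
{
  "chainname": "baseSepolia",
  "donid": 10,
  "explorer": "https://sepolia.basescan.org",
  "router": "0x0000000000000000000000000000000000000000",
  "rpc": "https://sepolia.base.org",
  "subid": 1
}
```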
You can define a shortcut in your package.json like this:

"scripts": {
  "ai": "hardhat --network baseSepolia ai"
}

then call it with yarn ai <TASK> <OPTIONS>
A specific system prompt is used for each OpenAI request; you can view it inside the JavaScript code run by the Chainlink DON: packages/hardhat/chainlink/source/onChainAI.js
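Such a Chainlink Functions source is roughly of the following shape. This is an illustrative sketch using the Chainlink Functions JavaScript runtime, not the actual onChainAI.js: the secret name `openaiKey`, the system prompt, and the exact request body are assumptions.

```javascript
// Runs inside the Chainlink Functions runtime (not plain Node.js):
// `args`, `secrets` and `Functions` are provided by the DON.
const prompt = args[0];

const response = await Functions.makeHttpRequest({
  url: "https://api.openai.com/v1/chat/completions",
  method: "POST",
  headers: { Authorization: `Bearer ${secrets.openaiKey}` },
  data: {
    model: "gpt-4o-mini",
    messages: [
      // Hypothetical system prompt — the real one lives in onChainAI.js.
      { role: "system", content: "Answer in as few words as possible." },
      { role: "user", content: prompt },
    ],
  },
});

if (response.error) throw Error("OpenAI request failed");

// Return a short string so the DON nodes can reach consensus on it.
const answer = response.data.choices[0].message.content;
return Functions.encodeString(answer);
```

Each DON node runs this source independently, which is why short, deterministic answers are needed for consensus.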
In order to never store your secrets and private keys in plain text on your hard disk ("hi @PatrickAlphaC"), this extension uses the Chainlink env-enc module to encrypt your secrets before storing them.
To set up env-enc, in the hardhat directory first define a password with yarn env-enc set-pw, then input your secrets with yarn env-enc set.
If you want to keep the original insecure dotenv setup, just comment the 2 env-enc lines and uncomment the 2 dotenv lines at the beginning of hardhat.config.ts.
The same ENV values are needed for both dotenv and env-enc:

- DEPLOYER_PRIVATE_KEY: private key of the deployer
- ALCHEMY_API_KEY: Alchemy API key
- ETHERSCAN_API_KEY: Etherscan API key
- OPENAI_API_KEY: OpenAI API key

OPENAI_API_KEY will be uploaded in a secure way to the Chainlink DON (don't use the centralized S3 solution also proposed by Chainlink).
- Chainlink Functions is currently in beta, and so is OnChainAI.
- The OpenAI prompt must be kept simple, as Chainlink Functions has a limited memory capacity.
- The OpenAI answer must be very short, in order for Chainlink Functions to be able to reach a consensus on an answer; i.e. you can ask "13 time 5 equal ?" but not "Tell me a story". You can also add requirements to your prompt, such as: answer with one word, YES or NO, or true or false...
- deploy on Mainnet: requires some tuning of the requested price, using a Chainlink Oracle Price Feed
- implement other AI models: Mistral, Claude, Llama 3 and other OpenAI models
- deploy OnChainAI on all networks supported by Chainlink Functions (currently, as of August 2024: Ethereum, Arbitrum, Base, Optimism, Polygon, Avalanche)
- deploy with the same address on all networks
- set up a Foundry extension too
- propose a choice between multiple system prompts
