Deploy llama3.2 on Oyster using the ollama framework and interact with it in a verifiable manner.
Set up the development environment
- Clone the repo
git clone https://github.com/marlinprotocol/ollama_oyster_setup.git
cd ollama_oyster_setup

- Update the following docker images according to your system's architecture in the docker-compose.yml
# llama proxy service
llama_proxy:
  image: kalpita888/ollama_arm64:0.0.1 # For an arm64 system use kalpita888/ollama_arm64:0.0.1 and for an amd64 system use kalpita888/ollama_amd64:0.0.1
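
If you are unsure which architecture applies, keep in mind that the image tag must match the architecture you deploy on (amd64 for c6a instances, arm64 for c6g instances). As a small convenience sketch, not part of the repo, you can check a machine's architecture with:

```bash
# Print the machine architecture:
#   x86_64            -> use kalpita888/ollama_amd64:0.0.1
#   aarch64 / arm64   -> use kalpita888/ollama_arm64:0.0.1
uname -m
```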

- Set up a wallet from which you can export the private key. Deposit 0.001 ETH and 1 USDC to the wallet on the Arbitrum One network.
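
Before deploying, it can help to confirm the wallet is actually funded. A minimal sketch, assuming you have Foundry's cast installed and using Arbitrum One's public RPC endpoint (the USDC balance is easiest to confirm on a block explorer):

```bash
# Check the wallet's ETH balance on Arbitrum One.
# Replace <wallet-address> with the address of the wallet you funded.
cast balance <wallet-address> --rpc-url https://arb1.arbitrum.io/rpc --ether
```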

- Deploy the enclave image
# for amd64
# replace <key> with the private key of the wallet
oyster-cvm deploy --wallet-private-key <key> --duration-in-minutes 20 --docker-compose docker-compose.yml --instance-type c6a.2xlarge --arch amd64
# for arm64
# replace <key> with the private key of the wallet
oyster-cvm deploy --wallet-private-key <key> --duration-in-minutes 20 --docker-compose docker-compose.yml --instance-type c6g.2xlarge

Make a note of the IP from the output and wait ~4 minutes for the model pull to finish.
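
Instead of waiting a fixed four minutes, you can poll the enclave until the model has been pulled. A minimal sketch, assuming the proxy on port 5000 forwards Ollama's standard /api/tags route, which lists the models that are available:

```bash
# Poll the enclave until llama3.2 shows up in the list of pulled models.
# Replace <ip> with the IP printed by the deploy command.
until curl -sf http://<ip>:5000/api/tags | grep -q "llama3.2"; do
  echo "Model not ready yet, retrying in 15s..."
  sleep 15
done
echo "llama3.2 is ready."
```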

- Test using curl from the host machine

curl http://{{instance-ip}}:5000/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'

- Running the curl command above with the -v option shows two important headers:
x-oyster-timestamp: 1741620242
x-oyster-signature: 8781e472b0f8e3693c1c6cec60b1ae0f5fed4c574d24e3bfcc6cc23f02a918a8785709ceb8a464a7d1dbbb8809ba73047acaa3ff5f1918ba565d82d177e123801b
The above signature can be used to verify that the response was received from an enclave.
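
If you want to keep the signature material around for offline verification, curl can write the headers and body to files. This is only a convenience sketch; the exact verification procedure (what is signed and how the signer is tied to the enclave) is covered in the Oyster docs, and the filenames here are illustrative:

```bash
# Save the response body and headers (including x-oyster-timestamp and
# x-oyster-signature) for later verification.
# Replace <ip> with the enclave IP.
curl -s -D headers.txt -o response.json http://<ip>:5000/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'

# Show the two Oyster headers that were captured.
grep -i "^x-oyster-" headers.txt
```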
- Verify a remote attestation (recommended)
# Replace <ip> with the IP you obtained above
oyster-cvm verify --enclave-ip <ip>

You should see Verification successful along with some attestation fields printed out.
Head over to the Oyster Confidential VM tutorials for more details.