3 changes: 3 additions & 0 deletions algorithms/benches/snark/varuna.rs
@@ -107,6 +107,7 @@ fn snark_batch_prove(c: &mut Criterion) {

let mut pks = Vec::with_capacity(circuit_batch_size);
let mut all_circuits = Vec::with_capacity(circuit_batch_size);
#[allow(clippy::mutable_key_type)]
ljedrz (Collaborator, author) commented on Oct 16, 2025:

note: this is needed because the `CircuitVerifyingKey` now has interior mutability; however, in its case this is perfectly fine, as the new member has no impact on the `Ord` impl (which only considers the `id`), and the fact that the key implements `Ord` is the reason the warning is raised; see the corresponding clippy lint.
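The situation the lint warns about can be illustrated with a toy sketch (hypothetical `Key` type standing in for `CircuitVerifyingKey`, not the actual snarkVM definitions): a key with interior mutability is sound in a `BTreeMap` as long as its `Ord` impl never consults the mutable field.

```rust
use std::cmp::Ordering;
use std::collections::BTreeMap;
use std::sync::OnceLock;

// Toy stand-in for CircuitVerifyingKey: `cached` has interior
// mutability, but ordering only ever consults `id`.
#[derive(Debug)]
struct Key {
    id: u64,
    cached: OnceLock<u64>,
}

impl PartialEq for Key {
    fn eq(&self, other: &Self) -> bool {
        self.id == other.id
    }
}
impl Eq for Key {}
impl PartialOrd for Key {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.cmp(other))
    }
}
impl Ord for Key {
    fn cmp(&self, other: &Self) -> Ordering {
        self.id.cmp(&other.id)
    }
}

fn main() {
    #[allow(clippy::mutable_key_type)] // sound: Ord ignores `cached`
    let mut map = BTreeMap::new();
    map.insert(Key { id: 1, cached: OnceLock::new() }, "constraints");
    // Initializing the cache does not disturb the map's ordering,
    // because the comparison result never changes.
    map.keys().next().unwrap().cached.get_or_init(|| 42);
    assert_eq!(map.get(&Key { id: 1, cached: OnceLock::new() }), Some(&"constraints"));
}
```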

let mut keys_to_constraints = BTreeMap::new();

for i in 0..circuit_batch_size {
@@ -191,7 +192,9 @@ fn snark_batch_verify(c: &mut Criterion) {
let mut vks = Vec::with_capacity(circuit_batch_size);
let mut all_circuits = Vec::with_capacity(circuit_batch_size);
let mut all_inputs = Vec::with_capacity(circuit_batch_size);
#[allow(clippy::mutable_key_type)]
let mut keys_to_constraints = BTreeMap::new();
#[allow(clippy::mutable_key_type)]
let mut keys_to_inputs = BTreeMap::new();
for i in 0..circuit_batch_size {
let num_constraints = num_constraints_base + i;
@@ -13,7 +13,7 @@
// See the License for the specific language governing permissions and
// limitations under the License.

use crate::{polycommit::sonic_pc, snark::varuna::ahp::indexer::*};
use crate::{AlgebraicSponge, polycommit::sonic_pc, snark::varuna::ahp::indexer::*};
use snarkvm_curves::PairingEngine;
use snarkvm_utilities::{FromBytes, FromBytesDeserializer, ToBytes, ToBytesSerializer, into_io_error, serialize::*};

@@ -25,17 +25,33 @@ use std::{
io::{self, Read, Write},
str::FromStr,
string::String,
sync::OnceLock,
};

/// Verification key for a specific index (i.e., R1CS matrices).
#[derive(Debug, Clone, PartialEq, Eq, CanonicalSerialize, CanonicalDeserialize)]
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct CircuitVerifyingKey<E: PairingEngine> {
/// Stores information about the size of the circuit, as well as its defined
/// field.
pub circuit_info: CircuitInfo,
/// Commitments to the indexed polynomials.
pub circuit_commitments: Vec<sonic_pc::Commitment<E>>,
pub id: CircuitId,
pub circuit_commitments_hash: OnceLock<E::Fq>,
Collaborator commented:

I think this is expensive enough that we should store it to disk - and it would be great if we can get rid of the OnceLock (which obfuscates when initialization happens, making performance analysis harder). O:)

You correctly observed that we don't have to transmit it over the wire though.

ljedrz (Collaborator, author) commented on Oct 16, 2025:

The issue was that at the moment of creation, the fs_params is not available, and at later stages the VKs are immutable - hence the OnceLock.

Storing to disk would probably work around this, but retrieving it would be expensive, perhaps to the point of offsetting any performance gains from caching it, unless the hashing is really computationally expensive.

Collaborator commented:

> Storing to disk would probably work around this, but retrieving it would be expensive, perhaps to the point of offsetting any performance gains from caching it, unless the hashing is really computationally expensive.

To be clear, we would retrieve it from disk only when we retrieve the VK from disk. And the hashes are very expensive.

However, a big downside I do see is the sheer amount of work to adjust the database logic. So for the first version you can also compute it during construction.

> The issue was that at the moment of creation, the fs_params is not available

Looks to me it's always available in N::varuna_fs_parameters()? Or is there a scoping issue? Have fun with that :")

ljedrz (Collaborator, author) commented:

> Looks to me it's always available in N::varuna_fs_parameters()?

That's correct; however, there is no notion of the Network - or even snarkvm-console - in algorithms. Would it be acceptable to alter SNARK::circuit_setup to require FSParameters, like the other VarunaSNARK functions do?

Collaborator commented:

> However, a big downside I do see is the sheer amount of work to adjust the database logic. So for the first version you can also compute it during construction.

For my future self reading this: we should just store to disk, but we can write out that logic when we have a definite timeline for landing this feature.

}

impl<E: PairingEngine> CircuitVerifyingKey<E> {
pub fn get_or_calculate_circuit_commitments_hash<FS: AlgebraicSponge<E::Fq, 2>>(
&self,
fs_parameters: &FS::Parameters,
) -> &E::Fq {
self.circuit_commitments_hash.get_or_init(|| {
let mut sponge = FS::new_with_parameters(fs_parameters);
sponge.absorb_native_field_elements(&self.circuit_commitments);

sponge.squeeze_native_field_elements(1)[0]
})
}
}
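The `OnceLock` caching pattern used above can be sketched in isolation (toy `VerifyingKey` type and a hypothetical cheap hash standing in for the algebraic sponge): the expensive value is computed at most once, on first request, and every later call returns the cached result.

```rust
use std::sync::OnceLock;

// Stand-in for CircuitVerifyingKey: the hash is computed lazily
// and cached for the lifetime of the (otherwise immutable) key.
struct VerifyingKey {
    commitments: Vec<u64>,
    commitments_hash: OnceLock<u64>,
}

impl VerifyingKey {
    fn get_or_calculate_hash(&self) -> &u64 {
        // `get_or_init` runs the closure only if the cell is empty;
        // note it takes &self, i.e. interior mutability.
        self.commitments_hash.get_or_init(|| {
            // Hypothetical "hash"; the real code squeezes an
            // algebraic sponge over the commitments instead.
            self.commitments
                .iter()
                .fold(0u64, |acc, c| acc.wrapping_mul(31).wrapping_add(*c))
        })
    }
}

fn main() {
    let vk = VerifyingKey { commitments: vec![1, 2, 3], commitments_hash: OnceLock::new() };
    let first = *vk.get_or_calculate_hash();
    // Subsequent calls return the cached value without recomputation.
    assert_eq!(*vk.get_or_calculate_hash(), first);
}
```

This matches ljedrz's rationale: the key is immutable by the time `fs_parameters` becomes available, so only an interior-mutability cell can hold the late-computed hash.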

impl<E: PairingEngine> FromBytes for CircuitVerifyingKey<E> {
@@ -99,6 +115,58 @@ impl<'de, E: PairingEngine> Deserialize<'de> for CircuitVerifyingKey<E> {
}
}

impl<E: PairingEngine> CanonicalSerialize for CircuitVerifyingKey<E> {
fn serialize_with_mode<W: Write>(&self, mut writer: W, compress: Compress) -> Result<(), SerializationError> {
self.circuit_info.serialize_with_mode(&mut writer, compress)?;
self.circuit_commitments.serialize_with_mode(&mut writer, compress)?;
self.id.serialize_with_mode(&mut writer, compress)?;
// The hash is omitted.
Ok(())
}

fn serialized_size(&self, compress: Compress) -> usize {
self.circuit_info.serialized_size(compress)
+ self.circuit_commitments.serialized_size(compress)
+ self.id.serialized_size(compress)
// The hash is omitted.
}
}

impl<E: PairingEngine> CanonicalDeserialize for CircuitVerifyingKey<E> {
fn deserialize_with_mode<R: Read>(
mut reader: R,
compress: Compress,
validate: Validate,
) -> Result<Self, SerializationError> {
let circuit_info = CanonicalDeserialize::deserialize_with_mode(&mut reader, compress, validate)?;
let circuit_commitments = CanonicalDeserialize::deserialize_with_mode(&mut reader, compress, validate)?;
let id = CanonicalDeserialize::deserialize_with_mode(&mut reader, compress, validate)?;
Ok(Self { circuit_info, circuit_commitments, id, circuit_commitments_hash: Default::default() })
}
}

impl<E: PairingEngine> Valid for CircuitVerifyingKey<E> {
fn check(&self) -> Result<(), SerializationError> {
Valid::check(&self.circuit_info)?;
Valid::check(&self.circuit_commitments)?;
Valid::check(&self.id)?;
// The hash is omitted.
Ok(())
}

fn batch_check<'a>(batch: impl Iterator<Item = &'a Self> + Send) -> Result<(), SerializationError>
where
Self: 'a,
{
let batch: Vec<_> = batch.collect();
Valid::batch_check(batch.iter().map(|v| &v.circuit_info))?;
Valid::batch_check(batch.iter().map(|v| &v.circuit_commitments))?;
Valid::batch_check(batch.iter().map(|v| &v.id))?;
// The hash is omitted.
Ok(())
}
}

impl<E: PairingEngine> Ord for CircuitVerifyingKey<E> {
fn cmp(&self, other: &Self) -> Ordering {
self.id.cmp(&other.id)
3 changes: 3 additions & 0 deletions algorithms/src/snark/varuna/tests.rs
@@ -115,7 +115,9 @@ mod varuna {
$snark_inst::batch_circuit_setup(&universal_srs, unique_instances.as_slice()).unwrap();
println!("Called circuit setup");

#[allow(clippy::mutable_key_type)]
let mut pks_to_constraints = BTreeMap::new();
#[allow(clippy::mutable_key_type)]
let mut vks_to_inputs = BTreeMap::new();

for (index_pk, index_vk) in index_keys.iter() {
@@ -147,6 +149,7 @@
}
fake_instance_inputs.push(fake_instance_input);
}
#[allow(clippy::mutable_key_type)]
let mut vks_to_fake_inputs = BTreeMap::new();
for (i, vk) in vks_to_inputs.keys().enumerate() {
vks_to_fake_inputs.insert(*vk, fake_instance_inputs[i].as_slice());
54 changes: 37 additions & 17 deletions algorithms/src/snark/varuna/varuna.rs
@@ -46,12 +46,13 @@ use crate::{
use rand::RngCore;
use snarkvm_curves::PairingEngine;
use snarkvm_fields::{One, PrimeField, ToConstraintField, Zero};
use snarkvm_utilities::{ToBytes, dev_eprintln, dev_println, to_bytes_le};
use snarkvm_utilities::{CanonicalSerialize, ToBytes, dev_eprintln, dev_println, to_bytes_le};

use anyhow::{Result, anyhow, bail, ensure};
use core::marker::PhantomData;
use itertools::Itertools;
use rand::{CryptoRng, Rng};
use sha2::Digest;
use std::{borrow::Borrow, collections::BTreeMap, ops::Deref, sync::Arc};

use crate::srs::UniversalProver;
@@ -67,6 +68,20 @@ impl<E: PairingEngine, FS: AlgebraicSponge<E::Fq, 2>, SM: SNARKMode> VarunaSNARK
/// Used to personalize the Fiat-Shamir RNG.
pub const PROTOCOL_NAME: &'static [u8] = b"VARUNA-2023";

/// Hash batches in advance for the purposes of `Self::init_sponge`.
fn hash_batches(inputs_and_batch_sizes: &BTreeMap<CircuitId, (usize, &[Vec<E::Fr>])>) -> Result<[u8; 32]> {
let mut hash = blake2::Blake2s256::new();

for (batch_size, inputs) in inputs_and_batch_sizes.values() {
(*batch_size as u64).serialize_uncompressed(&mut hash)?;
for input in *inputs {
input.serialize_uncompressed(&mut hash)?;
}
}

Ok(hash.finalize().into())
}

// TODO: implement optimizations resulting from batching
// (e.g. computing a common set of Lagrange powers, FFT precomputations,
// etc)
@@ -120,6 +135,7 @@ impl<E: PairingEngine, FS: AlgebraicSponge<E::Fq, 2>, SM: SNARKMode> VarunaSNARK
let circuit_verifying_key = CircuitVerifyingKey {
circuit_info: indexed_circuit.index_info,
circuit_commitments,
circuit_commitments_hash: Default::default(),
id: indexed_circuit.id,
};
let circuit_proving_key = CircuitProvingKey {
@@ -134,22 +150,15 @@
Ok(circuit_keys)
}

fn init_sponge<'a>(
fn init_sponge(
fs_parameters: &FS::Parameters,
inputs_and_batch_sizes: &BTreeMap<CircuitId, (usize, &[Vec<E::Fr>])>,
circuit_commitments: impl Iterator<Item = &'a [crate::polycommit::sonic_pc::Commitment<E>]>,
hashed_batches: [u8; 32],
circuit_commitments_hashes: Vec<E::Fq>,
Collaborator commented:

Can you make a 3rd VarunaVersion, and guard the changes by it to preserve backwards compatibility? You can peek at where we use VarunaVersion::V2 for inspiration.
) -> FS {
let mut sponge = FS::new_with_parameters(fs_parameters);
sponge.absorb_bytes(Self::PROTOCOL_NAME);
for (batch_size, inputs) in inputs_and_batch_sizes.values() {
sponge.absorb_bytes(&(*batch_size as u64).to_le_bytes());
for input in inputs.iter() {
sponge.absorb_nonnative_field_elements(input.iter().copied());
}
}
for circuit_specific_commitments in circuit_commitments {
sponge.absorb_native_field_elements(circuit_specific_commitments);
}
sponge.absorb_bytes(&hashed_batches);
sponge.absorb_native_field_elements(&circuit_commitments_hashes);
sponge
}
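The reviewer's backwards-compatibility suggestion could be sketched roughly as follows (hypothetical `VarunaVersion` enum and toy sponge; the real version type and absorb methods live elsewhere in the crate): gate the new transcript layout behind a third version so that verifiers of old proofs keep reproducing the old challenges.

```rust
// Hypothetical sketch of gating the new transcript layout behind a
// version flag, as the reviewer suggests; all names are illustrative.
#[derive(Clone, Copy, PartialEq, Eq)]
enum VarunaVersion {
    V1,
    V2,
    V3,
}

struct Sponge {
    absorbed: Vec<u8>,
}

impl Sponge {
    fn absorb_bytes(&mut self, bytes: &[u8]) {
        self.absorbed.extend_from_slice(bytes);
    }
}

fn init_sponge(version: VarunaVersion, raw_inputs: &[u8], hashed_batches: [u8; 32]) -> Sponge {
    let mut sponge = Sponge { absorbed: Vec::new() };
    sponge.absorb_bytes(b"VARUNA-2023");
    match version {
        // Old layout: absorb the inputs directly.
        VarunaVersion::V1 | VarunaVersion::V2 => sponge.absorb_bytes(raw_inputs),
        // New layout: absorb the pre-hashed batch digest instead.
        VarunaVersion::V3 => sponge.absorb_bytes(&hashed_batches),
    }
    sponge
}

fn main() {
    let old = init_sponge(VarunaVersion::V2, &[1, 2, 3], [0u8; 32]);
    let new = init_sponge(VarunaVersion::V3, &[1, 2, 3], [0u8; 32]);
    // The two layouts produce different transcripts, so prover and
    // verifier must agree on the version to derive the same challenges.
    assert_ne!(old.absorbed, new.absorbed);
}
```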

@@ -393,10 +402,14 @@

let committer_key = CommitterUnionKey::union(keys_to_constraints.keys().map(|pk| pk.committer_key.deref()));

let circuit_commitments =
keys_to_constraints.keys().map(|pk| pk.circuit_verifying_key.circuit_commitments.as_slice());
let circuit_commitments_hashes = keys_to_constraints
.keys()
.map(|pk| pk.circuit_verifying_key.get_or_calculate_circuit_commitments_hash::<FS>(fs_parameters))
.copied()
.collect();

let mut sponge = Self::init_sponge(fs_parameters, &inputs_and_batch_sizes, circuit_commitments.clone());
let hashed_batches = Self::hash_batches(&inputs_and_batch_sizes)?;
let mut sponge = Self::init_sponge(fs_parameters, hashed_batches, circuit_commitments_hashes);

// --------------------------------------------------------------------
// First round
@@ -860,7 +873,14 @@
let fifth_commitments = [LabeledCommitment::new_with_info(&fifth_round_info["h_2"], comms.h_2)];

let circuit_commitments = keys_to_inputs.keys().map(|vk| vk.circuit_commitments.as_slice());
let mut sponge = Self::init_sponge(fs_parameters, &inputs_and_batch_sizes, circuit_commitments.clone());
let circuit_commitments_hashes = keys_to_inputs
.keys()
.map(|vk| vk.get_or_calculate_circuit_commitments_hash::<FS>(fs_parameters))
.copied()
.collect();

let hashed_batches = Self::hash_batches(&inputs_and_batch_sizes)?;
let mut sponge = Self::init_sponge(fs_parameters, hashed_batches, circuit_commitments_hashes);

// --------------------------------------------------------------------
// First round
1 change: 1 addition & 0 deletions synthesizer/snark/src/proving_key/mod.rs
@@ -74,6 +74,7 @@ impl<N: Network> ProvingKey<N> {

// Prepare the instances.
let num_expected_instances = assignments.len();
#[allow(clippy::mutable_key_type)]
let instances: BTreeMap<_, _> = assignments
.iter()
.map(|(proving_key, assignments)| (proving_key.deref(), assignments.as_slice()))
1 change: 1 addition & 0 deletions synthesizer/snark/src/verifying_key/mod.rs
@@ -87,6 +87,7 @@ impl<N: Network> VerifyingKey<N> {

// Convert the instances.
let num_expected_keys = inputs.len();
#[allow(clippy::mutable_key_type)]
let keys_to_inputs: BTreeMap<_, _> =
inputs.iter().map(|(verifying_key, inputs)| (verifying_key.deref(), inputs.as_slice())).collect();
ensure!(keys_to_inputs.len() == num_expected_keys, "Incorrect number of verifying keys for batch proof");