
Conversation

@JoeyT1994
Collaborator

This PR makes the optimaltree function from TensorOperations.jl the default contraction sequence backend for the tn::AbstractITensorNetwork type.

This is achieved by extracting the tensors from the tn and passing them to the overloaded function ITensors.optimal_contraction_sequence(tensors::Vector{<:ITensor}), which strips the tensors down to their indices and passes those on to optimaltree, which happily recognizes the ITensor Index type.
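A minimal sketch of that conversion (not the exact code in this PR; the helper name optimal_sequence_sketch is illustrative, and optimaltree's exact signature should be checked against the installed TensorOperations.jl version):

using ITensors: ITensor, inds, dim
using TensorOperations: optimaltree

# Illustrative helper: represent each tensor by its Index labels and hand the
# label network plus an Index => dimension map to optimaltree for cost estimates.
function optimal_sequence_sketch(tensors::Vector{<:ITensor})
  network = [collect(inds(t)) for t in tensors]
  optdata = Dict(i => dim(i) for t in tensors for i in inds(t))
  tree, cost = optimaltree(network, optdata)  # assumed to return (tree, cost)
  return tree
end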

This sequence finder is noticeably faster (well over an order of magnitude) than the current optimal_contraction_sequence function in ITensors.jl for a range of different networks tested, especially when a large number of tensors is involved.

TensorOperations is therefore now a dependency for ITensorNetworks.

A similar PR will be made to ITensors.jl in order to change the default there.

@mtfishman
Member

mtfishman commented Mar 20, 2025

Thanks for taking a look at this @JoeyT1994. Some suggestions:

  1. I would prefer to turn this into an actual package extension. That would mean requiring loading TensorOperations.jl to use optimal contraction sequence optimization, but I think that is ok. Even though it is slightly less convenient, it is also more explicit for users, potentially improves load times if someone doesn't need contraction sequence optimization, etc.
  2. Let's try to not use ContractionSequenceOptimization at all (including the function optimal_contraction_sequence), we should probably just get rid of that module entirely if we are moving to using TensorOperations.optimaltree here and in ITensors.jl. I think as an alternative we can overload ITensorNetworks.contraction_sequence(::Algorithm"optimal", tn::Vector{ITensor}) to convert and call out to TensorOperations.optimaltree in the package extension. That would also mean defining deepmap in ITensorNetworks.jl, which I think is reasonable anyway. (See the rough sketch after this list.)

1. is technically breaking so we should bump to v0.12.
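Roughly, what 2. could look like as a package extension; the extension module name, the import lines, and the source of the Algorithm string macro are assumptions here rather than final code, and the index conversion mirrors the sketch above:

module ITensorNetworksTensorOperationsExt

using ITensors: ITensor, inds, dim, Algorithm, @Algorithm_str  # assumed import location for Algorithm
using ITensorNetworks: ITensorNetworks
using TensorOperations: optimaltree

# Overload the "optimal" backend: reduce the tensors to their Index labels and
# call out to TensorOperations.optimaltree to find the cheapest contraction tree.
function ITensorNetworks.contraction_sequence(::Algorithm"optimal", tn::Vector{ITensor})
  network = [collect(inds(t)) for t in tn]
  optdata = Dict(i => dim(i) for t in tn for i in inds(t))
  tree, _ = optimaltree(network, optdata)
  return tree
end

end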

@JoeyT1994
Collaborator Author

Thanks Matt. I followed your suggestions and made TensorOperations a weak dependency. Presumably this means that if contraction_sequence() is called and TensorOperations is not loaded, the code will error? Optimal sequence finding is used in a lot of places in the code, so is there a way to make the error tell the user they should load TensorOperations?

@mtfishman
Member

mtfishman commented Mar 20, 2025

Yes, that's a downside of this. We could make a backup definition:

function contraction_sequence(alg::Algorithm, tn::Vector{ITensor})
  # Fallback when no sequence-optimization backend extension is loaded.
  return throw(ArgumentError(
    "Contraction sequence algorithm `$alg` isn't defined. Try loading a backend such as TensorOperations.jl or OMEinsumContractionOrders.jl.",
  ))
end
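Roughly how that would behave in practice (hypothetical session; the exact contraction_sequence call signature may differ from what is shown in the comments):

using ITensorNetworks
# contraction_sequence(tn; alg="optimal")  # hits the fallback above and throws the ArgumentError
using TensorOperations                     # loading the weak dependency activates the extension
# contraction_sequence(tn; alg="optimal")  # now dispatches to the optimaltree-based method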

@mtfishman
Member

@JoeyT1994 looks like you need to add TensorOperations as a test dependency by adding it to test/Project.toml.
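One hedged way to do that from the Julia REPL (assuming the test environment lives in test/ relative to the package root):

using Pkg
Pkg.activate("test")         # activate the test environment backed by test/Project.toml
Pkg.add("TensorOperations")  # records TensorOperations under [deps] in test/Project.toml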

@JoeyT1994
Collaborator Author

@mtfishman Thanks, I forgot about that and was wondering what was happening.

@mtfishman
Member

Looks good, thanks! One step closer to getting rid of more code from ITensors.jl, which is always a good thing.

mtfishman merged commit f1cce56 into ITensor:main Mar 21, 2025
5 of 6 checks passed