Reducing the coordination number below Z=2.0 #2964
Asked by LeonardoPEQ in Q&A. Status: unanswered. Replies: 1 comment.

Reply:
The reason we use the spanning tree is so that the network stays fully connected and hence viable for simulations. If you can find a different way to identify throats for deletion, then that is an option. What I mean is that there is nothing special about the MST. For instance, you could select pores at random, then delete all of their throats but 1 or 2. Repeat X times... THEN do the MST coordination number trick to bring the total coordination down. Perhaps there is already an algorithm for finding spanning trees other than the minimum one? A rough sketch of this pruning-then-MST idea is given below.
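A minimal sketch of that idea, assuming OpenPNM 3.x; the 10x10x10 shape, the random seed, the round count n_rounds, and the 1.7 target are illustrative values, not from the thread. Note that this random pruning will generally disconnect the network, which is precisely what allows the mean to fall below 2, but it also means the isolated clusters need special handling (or trimming) before a permeability simulation will run cleanly.

import numpy as np
import openpnm as op

rng = np.random.default_rng(0)
pn = op.network.Cubic(shape=[10, 10, 10], spacing=1e-4, connectivity=6)

# Step 1: pick pores at random and keep only one of each picked pore's throats.
# This tends to split the network into disconnected pieces; the spanning forest
# of a disconnected network has fewer than Np - 1 throats, which is what lets
# the mean coordination number drop below 2.
n_rounds = 200  # the "X times" above; tune until the target Z becomes reachable
for _ in range(n_rounds):
    p = rng.integers(pn.Np)
    Ts = pn.find_neighbor_throats(pores=[p])
    if len(Ts) > 1:
        keep = rng.choice(Ts)
        op.topotools.trim(network=pn, throats=np.setdiff1d(Ts, keep))

# Step 2: apply the spanning-tree trick (as in the question below) to pull the
# mean coordination down the rest of the way
drop = op.topotools.reduce_coordination(pn, 1.7)
op.topotools.trim(network=pn, throats=drop)

print(np.mean(pn.num_neighbors(pn.Ps, mode='or', flatten=False)))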
Original question:
Recently, I’ve been working on a problem where I need to perform absolute permeability simulations on simple cubic networks ranging from 10x10x10 to 70x70x70. In these simulations, I need to compute the permeability coefficient for a variety of mean coordination numbers, including Z = 6, 5, 4, 3, 2, 1.75, 1.65, and 1.5.
However, I’ve run into an issue: the method suggested in the documentation doesn’t seem to work when the target mean coordination number drops below 2. I suspect this is due to how the reduced network is constructed: since the procedure keeps all pores connected by preserving a minimum spanning tree, the network can’t reach a mean coordination number lower than 2 (see the check below).
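For reference, a back-of-the-envelope check of that floor (my own arithmetic, not from the docs): the mean coordination number of any network is 2*Nt/Np, a spanning tree over Np pores has exactly Np - 1 throats, and any fully connected network has at least that many, so the lowest value a connected network can reach is 2*(Np - 1)/Np:

Np = 10**3               # pores in a 10x10x10 cubic network
Nt_tree = Np - 1         # throats in any spanning tree over Np pores
print(2 * Nt_tree / Np)  # 1.998, the same floor the script below reports

In other words, getting the mean Z below 2 necessarily means accepting a disconnected network (or isolated pores), which is exactly what the spanning-tree step is there to prevent.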
I’m wondering if you have any suggestions for how I might approach this differently. Alternatively, are there any plans to support networks with mean coordination numbers below 2 in future releases?
Here’s what I’ve tried so far in an attempt to reduce the coordination number. When the target Z is below 2 (specifically 1.75, 1.65, and 1.5), the method stops at Z ≈ 2 and won’t go any lower:
import openpnm as op
import numpy as np

N, spacing, connectivity = 10, 1e-4, 6
pn = op.network.Cubic(shape=[N, N, N], spacing=spacing, connectivity=connectivity)
pn.add_model_collection(op.models.collections.geometry.spheres_and_cylinders)
pn.regenerate_models()

# reduce_coordination returns the throats to trim for a target mean Z of 1.7
drop = op.topotools.reduce_coordination(pn, 1.7)
op.topotools.trim(network=pn, throats=drop)
pn.regenerate_models()

# mean coordination number over all pores
N_mean_all = np.mean(pn.num_neighbors(pn.Ps, mode='or', flatten=False))
print(N_mean_all)
In this case, the output is 1.998.
Any insights or advice would be greatly appreciated!