
Commit 51a8612

Merge remote-tracking branch 'origin/main'
2 parents 67cbfe3 + ff0f126 commit 51a8612

29 files changed: +858 -416 lines

.gitignore

Lines changed: 6 additions & 0 deletions
@@ -157,3 +157,9 @@ examples/integrations/*.out
 # Approval tests
 *.received.txt
 *.Identifier
+/.obsidian/app.json
+/.obsidian/appearance.json
+/.obsidian/core-plugins.json
+/.obsidian/core-plugins-migration.json
+/.obsidian/hotkeys.json
+/.obsidian/workspace.json

DevelopersGuide.md

Lines changed: 0 additions & 9 deletions
@@ -27,15 +27,6 @@ New
 twine upload dist/*
 ```
 
-Old
-```
-# First create the dist
-python3 setup.py sdist bdist_wheel
-
-# Second upload the distributions
-twine upload dist/*
-
-
 ### Type of commits:
 
 - ENH: Enhancement, new functionality
File renamed without changes.
Lines changed: 93 additions & 0 deletions
@@ -0,0 +1,93 @@
## Optimizing solver

### Debugging Vector model 1

#### Questions

- [x] How many times does the system of equations get solved?
  - Without caching it is 1 per scalar field x 2 (centers and corners) x n_octree levels

- [x] How many iterations do we need to reach the solution?
  - Model 1 takes 5000 iterations (it actually starts to diverge after 4k)

- [ ] Has Pykeops become slower with the latest additions to the gradient?

- [x] Is the Jacobi pre-conditioner bad for all cases? (See the sketch after this list.)
  - Model 1:
    - Numpy with Jacobi:
      - converges in 1602 iterations and reaches 5000
      - time 150 sec
      - More or less good results: mesh closed
    - Numpy without Jacobi:
      - converges in 1018 iterations and reaches 5000
      - time 150 sec too
      - Results similar? I would say maybe a bit worse
    - Pykeops without Jacobi:
      - converges in 92 iterations and reaches 5000
      - time 324 sec (all of it spent running the solver)
- [ ] Could it be that pykeops is faster than inverting the matrix but still slower than any other CG?
  - Since the matrix is already constructed, it makes sense
- [ ] Then in which cases is it worth using pykeops? The number of iterations would need to be rather small.
  Would it make sense for edits, reusing the previous weights? That is going to need a hell of an orchestrator.
- [ ] How much memory do numpy direct, numpy CG and pykeops CG take?
  - numpy direct: 1.5 or so
  - numpy cg: 1.25
  - pykeops cg: 1.1
  - **Note**: the exporting now takes quite a lot of memory
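The Jacobi question can be poked at outside of gempy entirely. Below is a minimal sketch, with a made-up SPD matrix standing in for the covariance matrix (none of the names are gempy API), comparing CG iteration counts with and without a diagonal (Jacobi) preconditioner in SciPy:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n = 2000
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)          # SPD stand-in for the covariance matrix
b = rng.standard_normal(n)

counts = {"plain": 0, "jacobi": 0}

def make_callback(key):
    def callback(xk):                # called once per CG iteration
        counts[key] += 1
    return callback

x_plain, _ = cg(A, b, maxiter=5000, callback=make_callback("plain"))

# Jacobi preconditioner: apply M^-1 v = v / diag(A)
inv_diag = 1.0 / np.diag(A)
M = LinearOperator((n, n), matvec=lambda v: inv_diag * v, dtype=A.dtype)
x_jacobi, _ = cg(A, b, maxiter=5000, M=M, callback=make_callback("jacobi"))

print(counts)   # whether Jacobi helps depends on how dominant the diagonal is
```

If the covariance matrix has a nearly constant diagonal (variance plus nugget), the Jacobi preconditioner is close to a scaled identity, which would explain why it brings little or no improvement here.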
### Experiments conclusions

- Using pykeops CG to solve an ill-posed problem is not realistic
  - If the model fits in memory, probably the best thing to do is make the first run in torch and use the weights as the initial point
  - If the model does not fit in memory, we should try to construct the sparse version of it and run the incomplete Cholesky decomposition

- I didn't find any improvement using the Jacobi pre-conditioner. Mostly the opposite
- With ichol we get different results with numpy and pykeops
  - Pykeops
    - ichol does not give super good results either, at least before convergence happens
    - Model 1 converges **without** ichol at 1290 and 8518
    - Model 1 converges **with** ichol at 496 (506 in pykeops) and 5080 (4995 in pykeops)
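The "weights as initial point" conclusion maps directly onto CG's `x0` argument. A minimal sketch, reusing `A`, `b`, `n` and `rng` from the Jacobi snippet above (illustrative only, not gempy API):

```python
# Warm-started CG: after a small edit the previous weights are usually close to
# the new solution, so passing them as x0 should cut the iteration count a lot.
from scipy.sparse.linalg import cg

w_previous, _ = cg(A, b, maxiter=5000)                     # first (cold) solve
b_edited = b + 1e-3 * rng.standard_normal(n)               # small change in the data

w_cold, _ = cg(A, b_edited, maxiter=5000)                  # start from zero again
w_warm, _ = cg(A, b_edited, x0=w_previous, maxiter=5000)   # start from the old weights
```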
### TODO:

- [x] Clean the solver interface
- [ ] Add a citation for pymatting

### Things to try:
- [x] Pykeops as a linear operator to use an exact solver (the LinearOperator wrapping itself is sketched below, with CG)
  - Does not look good. It is probably not even possible mathematically
- [x] Falkon
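For reference, the "keops + CG" combination these notes keep coming back to reduces to exposing a symbolic KeOps mat-vec to SciPy as a LinearOperator. A minimal sketch with a made-up Gaussian kernel; the sizes, `sigma` and the kernel itself are illustrative, not gempy's covariance function:

```python
import numpy as np
from pykeops.numpy import LazyTensor
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, sigma = 10_000, 0.2
points = rng.random((n, 3))
b = rng.random(n)

x_i = LazyTensor(points[:, None, :])            # (n, 1, 3)
x_j = LazyTensor(points[None, :, :])            # (1, n, 3)
K_ij = (-((x_i - x_j) ** 2).sum(-1) / (2 * sigma**2)).exp()   # symbolic kernel, never stored

def matvec(v):
    v_j = LazyTensor(v.reshape(1, -1, 1))       # (1, n, 1)
    return (K_ij * v_j).sum(axis=1).ravel()     # K @ v, reduced by KeOps

K_op = LinearOperator((n, n), matvec=matvec, dtype=points.dtype)
weights, info = cg(K_op, b, maxiter=200)        # only pays off if few iterations are needed
```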
### Optimal solver

- Tools we have:
  - keops
  - CG
  - Incomplete Cholesky decomposition
  - Passing the weights as the initial point
- Variables that decide what to use:
  - Size of the covariance matrix
  - Number of CG iterations we are going to need. This depends on:
    - Condition number
    - Whether we have weights

- **Strategy** (see the dispatch sketch after this list):
  - If the number of CG iterations is a handful -> **use keops**
  - If the number of CG iterations is large:
    - Use a **direct solver** if the matrix fits in memory
      - Then for changes we can hopefully use the weights as the initial point and use **keops**
    - If the matrix does not fit in memory:
      - Construct the sparse version of the matrix
      - Use the **incomplete Cholesky decomposition**
      - Once we have the decomposition it is better to use **keops** to solve the system (except if the number of iterations has to be very large)
  - Optimizing nugget effects still seems the best way to do this
  - If memory is not an issue, a **direct solver with torch, plus keops for exporting, seems to be the best option**
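Put together as code, the strategy above is essentially a dispatch on matrix size and expected iteration count. An illustrative-only sketch; the thresholds and returned labels are placeholders, not gempy API:

```python
def choose_solver(fits_in_memory: bool,
                  expected_cg_iters: int | None,
                  previous_weights=None) -> str:
    """Pick a solver following the strategy above (illustrative placeholders only)."""
    if expected_cg_iters is not None and expected_cg_iters <= 50:
        return "keops_cg"                # a handful of CG iterations: matrix-free KeOps CG
    if previous_weights is not None:
        return "keops_cg_warm_start"     # edits: reuse the old weights as x0, stay matrix-free
    if fits_in_memory:
        return "direct_dense_torch"      # first run: dense direct solve in torch, export with keops
    return "sparse_ichol_cg"             # too big: sparse assembly + incomplete Cholesky + CG
```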
#### Notes

- [ ] I need to make a similar default env file for gempy itself and use that default when we call the engine or whatever
Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
# Refactoring grid to add transform

## TODO:
- [ ] Rotation of the pyvista grid (see the sketch below)
- [ ] Rotation of pyvista meshes

## Questions:
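A possible starting point for the rotation TODO above, using plain pyvista (illustrative only; `ImageData` is the newer name for `UniformGrid`, and none of this touches gempy's RegularGrid):

```python
import numpy as np
import pyvista as pv

# Rotating a mesh is a one-liner in pyvista.
mesh = pv.Cube()                       # stand-in for a gempy surface mesh
rotated_mesh = mesh.rotate_z(30, point=(0.0, 0.0, 0.0), inplace=False)

# A regular grid is axis-aligned by construction, so rotating it means casting
# to a structured grid and applying a 4x4 transform to its points.
grid = pv.ImageData(dimensions=(20, 20, 20))
angle = np.deg2rad(30)
T = np.array([
    [np.cos(angle), -np.sin(angle), 0.0, 0.0],
    [np.sin(angle),  np.cos(angle), 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])
rotated_grid = grid.cast_to_structured_grid().transform(T, inplace=False)
```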

examples/tutorials/ch1_fundamentals/ch1_3a_grids.py

Lines changed: 7 additions & 2 deletions
@@ -8,6 +8,7 @@
 
 import gempy as gp
 from gempy.core.data import Grid
+from gempy.core.data.grid_modules import RegularGrid
 
 np.random.seed(55500)
 
@@ -103,11 +104,15 @@
 # activate them.
 #
 
+
 # %%
-help(grid.create_regular_grid)
+help(RegularGrid)
 
 # %%
-grid.create_regular_grid(extent=[0, 100, 0, 100, -100, 0], resolution=[20, 20, 20])
+extent = np.array([0, 100, 0, 100, -100, 0])
+resolution = np.array([20, 20, 20])
+grid.regular_grid = RegularGrid(extent, resolution)
+grid.regular_grid
 
 # %%
 # Now the regular grid object composed on ``Grid`` has been filled:

examples/tutorials/ch1_fundamentals/ch1_4_onlap_relations.py

Lines changed: 1 addition & 1 deletion
@@ -47,7 +47,7 @@
 geo_model.structural_frame
 
 # %%
-geo_model.transform.apply_anisotropy(gp.data.GlobalAnisotropy.NONE)
+geo_model.input_transform.apply_anisotropy(gp.data.GlobalAnisotropy.NONE)
 gp.add_structural_group(
     model=geo_model,
     group_index=0,

examples/tutorials/ch1_fundamentals/ch1_5_fault_relations.py

Lines changed: 1 addition & 1 deletion
@@ -63,7 +63,7 @@
 
 # %%
 
-geo_model.transform.apply_anisotropy(gp.data.GlobalAnisotropy.NONE)
+geo_model.input_transform.apply_anisotropy(gp.data.GlobalAnisotropy.NONE)
 if False:
     gp.compute_model(geo_model)
 # %%

examples/tutorials/ch3-Interpolations/ch3_1_kriging_interpolation_and_simulation.py

Lines changed: 1 addition & 1 deletion
@@ -93,7 +93,7 @@
 # creating a domain object from the gempy solution, a defined domain conditioning data
 domain = kriging.Domain(
     model_solutions=sol,
-    transform=geo_data.transform,
+    transform=geo_data.input_transform,
     domain=[2],
     data=cond_data
 )

gempy/API/grid_API.py

Lines changed: 11 additions & 11 deletions
@@ -2,17 +2,17 @@
 
 import numpy as np
 
-from gempy.core.data import Grid
-from gempy.core.data.grid import GridTypes
-from gempy.core.data.grid_modules import CustomGrid
-from gempy.core.data.grid_modules.topography import Topography
-from gempy.modules.grids.create_topography import create_random_topography
-from gempy.optional_dependencies import require_subsurface
+from ..core.data import Grid
+from ..core.data.grid_modules import CustomGrid, Sections
+from ..core.data.grid_modules.topography import Topography
+from ..modules.grids.create_topography import create_random_topography
+from ..optional_dependencies import require_subsurface
 
 
 def set_section_grid(grid: Grid, section_dict: dict):
     if grid.sections is None:
-        grid.create_section_grid(section_dict=section_dict)
+        grid.sections = Sections(regular_grid=grid.regular_grid, section_dict=section_dict)
+        grid.sections
     else:
         grid.sections.set_sections(section_dict,
                                    regular_grid=grid.regular_grid)
@@ -91,12 +91,12 @@ def set_topography_from_array():
     raise NotImplementedError("This is not implemented yet")
 
 
-def set_active_grid(grid: Grid, grid_type: list[GridTypes], reset: bool = False):
+def set_active_grid(grid: Grid, grid_type: list[Grid.GridTypes], reset: bool = False):
     if reset is True:
-        grid.deactivate_all_grids()
+        grid.active_grids = GridTypes.NONE
     for grid_type in grid_type:
-        grid.active_grids_bool[grid_type.value] = True
+        grid.active_grids |= grid_type
 
-    print(f'Active grids: {grid.grid_types[grid.active_grids_bool]}')
+    print(f'Active grids: {grid.active_grids}')
 
     return grid