Merged
Commits
30 commits
62484fd
Some updates to the serial run of the supersonic panel case for the
kejacobson Dec 11, 2024
59f3929
tweaks for: calling stop_server in parallel, remote DVs with dots in …
Asthelen Mar 19, 2024
d2ee8e8
add way to restart job if it is no longer running
Asthelen Mar 21, 2024
f6c95d6
keep track of down time between model evaluations
Asthelen Mar 21, 2024
d8482a9
stop_server in example changed from eval command to explicit stop_ser…
Asthelen Mar 21, 2024
373038b
replace dummy port forwarding process with dummy zeromq socket, to av…
Asthelen Mar 21, 2024
a46a536
support running on more than 1 rank
Asthelen Mar 22, 2024
e840edf
send write_n2 option to server
Asthelen Mar 28, 2024
27c86c3
fix final reading of sql file for newer openmdao versions, which save…
Asthelen Oct 22, 2024
9f357b2
fix server for case with additional remote input only on first rank a…
Asthelen Nov 15, 2024
b0a99e4
add additional_remote_constants option, for non-DV inputs, which won'…
Asthelen Nov 16, 2024
f6d2cc8
fix hang with additional remote constant that only exists on rank 0
Asthelen Nov 17, 2024
079cbd3
fix broken connection in run.py; clean up other things a bit and veri…
Asthelen Dec 11, 2024
12bedfc
adjust supersonic panel readme a bit
Asthelen Dec 12, 2024
c0debfe
Merge branch 'main' into network_remote_dvs
Asthelen Jan 24, 2025
e459560
change bool to type(True)
Asthelen Jan 24, 2025
2545a73
flake fixes
Asthelen Jan 24, 2025
e2a714c
sort imports
Asthelen Jan 24, 2025
d6be740
run black on supersonic_panel codes again
Asthelen Jan 24, 2025
fc8a744
black corrections on server.py
Asthelen Jan 24, 2025
c9577de
change forward_reverse to fwd_rev in tests
Asthelen Feb 10, 2025
b6b759f
try to fix tests for latest OM change; fix broken connection in as_op…
Asthelen Feb 10, 2025
923764e
formatting; import ordering
Asthelen Feb 10, 2025
1120576
more formatting
Asthelen Feb 10, 2025
f05cc79
change another integration test tolerance from 1e-12 to 1e-11 due to …
Asthelen Feb 10, 2025
74547d6
fixing broken integration tests
timryanb Feb 26, 2025
be4c77d
fixing broken integration tests
timryanb Feb 26, 2025
4efa8a9
black
timryanb Feb 26, 2025
7ed3f42
Merge branch 'main' into network_remote_dvs
Asthelen Mar 12, 2025
83bcd2d
Merge branch 'OpenMDAO:main' into network_remote_dvs
Asthelen Mar 20, 2025
1 change: 1 addition & 0 deletions docs/basics/remote_components.rst
Original file line number Diff line number Diff line change
@@ -68,6 +68,7 @@ Troubleshooting
---------------
The :code:`dump_json` option for :code:`RemoteZeroMQComp` will make the component write input and output JSON files, which contain all data sent to and received from the server.
An exception is the :code:`wall_time` entry (given in seconds) in the output JSON file, which is added on the client-side after the server has completed the design evaluation.
Similarly, the :code:`down_time` entry keeps track of the elapsed time between the end of the previous design evaluation and the beginning of the current one.
Another entry that is only provided for informational purposes is :code:`design_counter`, which keeps track of how many different designs have been evaluated on the current server.
If :code:`dump_separate_json` is set to True, then separate files will be written for each design evaluation.
On the server side, an n2 file titled :code:`n2_inner_analysis_<component name>.html` will be written after each evaluation.
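As a sketch of how a client might inspect the entries described in the documentation above, the snippet below writes and then reads a stand-in output JSON file. The file name and schema here are assumptions for illustration, not the exact format :code:`RemoteZeroMQComp` produces.

```python
import json

# Stand-in for an output JSON dumped via the dump_json option; the schema is
# an assumption for illustration. The timing entries are the ones described
# in the documentation: wall_time and down_time (seconds), plus design_counter.
record = {
    "outputs": {"mass": [8.73e-05]},
    "wall_time": 42.7,      # client-side time for the design evaluation
    "down_time": 3.1,       # idle time since the previous evaluation finished
    "design_counter": 5,    # designs evaluated on the current server so far
}

with open("remote_output.json", "w") as f:
    json.dump(record, f)

# Client-side inspection of the dumped file
with open("remote_output.json") as f:
    data = json.load(f)

total = data["wall_time"] + data["down_time"]
print(f"evaluation {data['design_counter']}: {total:.1f} s (wall + down)")
```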
37 changes: 37 additions & 0 deletions examples/aerostructural/supersonic_panel/README
@@ -0,0 +1,37 @@
Summary of top-level codes:

run.py: most basic code that demonstrates a single scenario, doing a derivative check at the starting design
run_parallel.py: code that demonstrates two scenarios (differing by Mach and dynamic pressure), evaluated in parallel with MultipointParallel, doing a derivative check at the starting design
as_opt_parallel.py: runs an optimization of the previous code's two scenarios (mass minimization subject to lift and stress constraints at the two flight conditions)
as_opt_remote_serial.py: runs the same optimization using one remote component that evaluates the MultipointParallel group in as_opt_parallel.py
as_opt_remote_parallel.py: runs the same optimization using two parallel remote components, each of which evaluates the Multipoint analysis from run.py

The optimizations should complete with the following metrics (C_L being the lift coefficient and func_struct an aggregated von Mises stress).
Note that objective and constraint names can vary slightly depending on the optimization script.

Design Vars
{'aoa': array([10.52590682, 18.2314054 ]),
'dv_struct': array([0.0001 , 0.0001 , 0.0001 , 0.0001 , 0.0001 ,
0.0001 , 0.0001 , 0.0001 , 0.0001 , 0.00010421,
0.00010883, 0.00011221, 0.00011371, 0.00011452, 0.0001133 ,
0.00010892, 0.00010359, 0.0001 , 0.0001 , 0.0001 ]),
'geometry_morph_param': array([0.1])}

Nonlinear constraints
{'multipoint.aerostructural1.C_L': array([0.15]),
'multipoint.aerostructural1.func_struct': array([1.00000023]),
'multipoint.aerostructural2.C_L': array([0.45]),
'multipoint.aerostructural2.func_struct': array([1.00000051])}

Objectives
{'multipoint.aerostructural1.mass': array([8.73298752e-05])}

Optimization terminated successfully (Exit mode 0)
Current function value: 0.008732987524877025
Iterations: 22
Function evaluations: 24
Gradient evaluations: 22

Note that the remote scripts, which both use mphys_server.py to launch the HPC job used for the analyses, are set up to use the K4 queue of NASA Langley's K cluster.
To run these scripts on an HPC not supported by pbs4py, you will likely have to write your own pbs4py Launcher constructor.
Further details may be found in the remote components documentation: https://openmdao.github.io/mphys/basics/remote_components.html
Original file line number Diff line number Diff line change
@@ -12,7 +12,7 @@ def initialize(self):
self.options.declare("x_aero0")

def setup(self):
self.x_aero0_name = MPhysVariables.Aerodynamics.Surface.COORDINATES_INITIAL
self.x_aero0_name = MPhysVariables.Aerodynamics.Surface.Mesh.COORDINATES
self.add_output(
self.x_aero0_name,
val=self.options["x_aero0"],
8 changes: 4 additions & 4 deletions examples/aerostructural/supersonic_panel/as_opt_parallel.py
@@ -11,9 +11,8 @@
from mphys import Multipoint, MultipointParallel
from mphys.scenarios.aerostructural import ScenarioAeroStructural

check_totals = (
False # True=check objective/constraint derivatives, False=run optimization
)
# True=check objective/constraint derivatives, False=run optimization
check_totals = False

# panel geometry
panel_chord = 0.3
@@ -23,6 +22,7 @@
N_el_struct = 20
N_el_aero = 7


# Mphys parallel multipoint scenarios
class AerostructParallel(MultipointParallel):
def initialize(self):
@@ -224,7 +224,7 @@ def get_model(scenario_names):
prob.cleanup()

if prob.model.comm.rank == 0: # write out data
cr = om.CaseReader("optimization_history.sql")
cr = om.CaseReader(f"{prob.get_outputs_dir()}/optimization_history.sql")
driver_cases = cr.list_cases("driver")

case = cr.get_case(0)
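The CaseReader change above reflects newer OpenMDAO versions writing recorder files into a per-problem outputs directory, so the sql filename must be prefixed with :code:`prob.get_outputs_dir()`. A minimal stdlib sketch of the same path-resolution pattern, with a temporary directory standing in for that call:

```python
import tempfile
from pathlib import Path

# A temporary directory stands in for prob.get_outputs_dir(); the empty file
# below mimics the recorder output that om.CaseReader would open.
outputs_dir = Path(tempfile.mkdtemp())
(outputs_dir / "optimization_history.sql").write_bytes(b"")

# Build the path the same way the fixed script does:
# om.CaseReader(f"{prob.get_outputs_dir()}/optimization_history.sql")
sql_path = f"{outputs_dir}/optimization_history.sql"
print(Path(sql_path).exists())
```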
Original file line number Diff line number Diff line change
@@ -4,9 +4,9 @@

from mphys.network.zmq_pbs import RemoteZeroMQComp

check_totals = (
False # True=check objective/constraint derivatives, False=run optimization
)
# True=check objective/constraint derivatives, False=run optimization
check_totals = False


# for running scenarios on different servers in parallel
class ParallelRemoteGroup(om.ParallelGroup):
@@ -58,9 +58,6 @@ def setup(self):

class TopLevelGroup(om.Group):
def setup(self):
if self.comm.size != 2:
raise SystemError("Please launch with 2 processors")

# IVCs that feed into both parallel groups
self.add_subsystem("ivc", om.IndepVarComp(), promotes=["*"])

@@ -140,7 +137,9 @@ def setup(self):

# write out data
if prob.model.comm.rank == 0:
cr = om.CaseReader("optimization_history_parallel.sql")
cr = om.CaseReader(
f"{prob.get_outputs_dir()}/optimization_history_parallel.sql"
)
driver_cases = cr.list_cases("driver")

case = cr.get_case(0)
@@ -193,5 +192,6 @@ def setup(self):
)
f.write(" " + "\n")

# shutdown each rank's server
eval(f"prob.model.multipoint.remote_scenario{prob.model.comm.rank}.stop_server()")
# shutdown the servers
prob.model.multipoint.remote_scenario0.stop_server()
prob.model.multipoint.remote_scenario1.stop_server()
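The change above replaces an eval() call with explicit stop_server() calls on each remote scenario. If dynamic lookup by rank were still needed, getattr would be the idiomatic route; the toy classes below are stand-ins (an assumption) for the real remote components, shown only to illustrate the pattern:

```python
# Sketch of why the eval() call was replaced: dynamic attribute access is
# better done with getattr(), which avoids executing arbitrary strings.
class Scenario:
    def __init__(self, name):
        self.name = name
        self.stopped = False

    def stop_server(self):
        self.stopped = True

class Multipoint:
    pass

multipoint = Multipoint()
multipoint.remote_scenario0 = Scenario("scenario0")
multipoint.remote_scenario1 = Scenario("scenario1")

# eval(f"multipoint.remote_scenario{rank}.stop_server()")  # fragile: string execution
for rank in range(2):
    getattr(multipoint, f"remote_scenario{rank}").stop_server()  # explicit and safe

print(multipoint.remote_scenario0.stopped, multipoint.remote_scenario1.stopped)
```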
116 changes: 61 additions & 55 deletions examples/aerostructural/supersonic_panel/as_opt_remote_serial.py
@@ -3,9 +3,8 @@

from mphys.network.zmq_pbs import RemoteZeroMQComp

check_totals = (
False # True=check objective/constraint derivatives, False=run optimization
)
# True=check objective/constraint derivatives, False=run optimization
check_totals = False

# initialize pbs4py
pbs = PBS.k4(time=1)
@@ -60,59 +59,66 @@
prob.cleanup()

# write out data
cr = om.CaseReader("optimization_history.sql")
driver_cases = cr.list_cases("driver")

case = cr.get_case(0)
cons = case.get_constraints()
dvs = case.get_design_vars()
objs = case.get_objectives()

with open("optimization_history.dat", "w+") as f:

for i, k in enumerate(objs.keys()):
f.write("objective: " + k + "\n")
for j, case_id in enumerate(driver_cases):
f.write(
str(j)
+ " "
+ str(cr.get_case(case_id).get_objectives(scaled=False)[k][0])
+ "\n"
)
f.write(" " + "\n")

for i, k in enumerate(cons.keys()):
f.write("constraint: " + k + "\n")
for j, case_id in enumerate(driver_cases):
f.write(
str(j)
+ " "
+ " ".join(
map(str, cr.get_case(case_id).get_constraints(scaled=False)[k])
if prob.model.comm.rank == 0:
cr = om.CaseReader(f"{prob.get_outputs_dir()}/optimization_history.sql")
driver_cases = cr.list_cases("driver")

case = cr.get_case(0)
cons = case.get_constraints()
dvs = case.get_design_vars()
objs = case.get_objectives()

with open("optimization_history.dat", "w+") as f:

for i, k in enumerate(objs.keys()):
f.write("objective: " + k + "\n")
for j, case_id in enumerate(driver_cases):
f.write(
str(j)
+ " "
+ str(cr.get_case(case_id).get_objectives(scaled=False)[k][0])
+ "\n"
)
+ "\n"
)
f.write(" " + "\n")

for i, k in enumerate(dvs.keys()):
f.write("DV: " + k + "\n")
for j, case_id in enumerate(driver_cases):
f.write(
str(j)
+ " "
+ " ".join(
map(str, cr.get_case(case_id).get_design_vars(scaled=False)[k])
f.write(" " + "\n")

for i, k in enumerate(cons.keys()):
f.write("constraint: " + k + "\n")
for j, case_id in enumerate(driver_cases):
f.write(
str(j)
+ " "
+ " ".join(
map(
str,
cr.get_case(case_id).get_constraints(scaled=False)[k],
)
)
+ "\n"
)
+ "\n"
)
f.write(" " + "\n")
f.write(" " + "\n")

for i, k in enumerate(dvs.keys()):
f.write("DV: " + k + "\n")
for j, case_id in enumerate(driver_cases):
f.write(
str(j)
+ " "
+ " ".join(
map(
str,
cr.get_case(case_id).get_design_vars(scaled=False)[k],
)
)
+ "\n"
)
f.write(" " + "\n")

f.write("run times, function\n")
for i in range(len(prob.model.remote.times_function)):
f.write(f"{prob.model.remote.times_function[i]}\n")
f.write(" " + "\n")
f.write("run times, function\n")
for i in range(len(prob.model.remote.times_function)):
f.write(f"{prob.model.remote.times_function[i]}\n")
f.write(" " + "\n")

f.write("run times, gradient\n")
for i in range(len(prob.model.remote.times_gradient)):
f.write(f"{prob.model.remote.times_gradient[i]}\n")
f.write(" " + "\n")
f.write("run times, gradient\n")
for i in range(len(prob.model.remote.times_gradient)):
f.write(f"{prob.model.remote.times_gradient[i]}\n")
f.write(" " + "\n")
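The timing dump at the end of the script above can be sketched generically; the values and file name below are made up for illustration, with the lists standing in for the component's times_function and times_gradient attributes:

```python
# Minimal sketch of the timing-history dump: write each recorded function
# and gradient evaluation time on its own line, with a near-blank separator
# line between sections, as optimization_history.dat does in the script.
times_function = [12.3, 11.8, 12.1]   # made-up wall times, seconds
times_gradient = [30.2, 29.7]

with open("run_times.dat", "w") as f:
    for label, times in [("function", times_function), ("gradient", times_gradient)]:
        f.write(f"run times, {label}\n")
        for t in times:
            f.write(f"{t}\n")
        f.write(" \n")

with open("run_times.dat") as f:
    lines = f.read().splitlines()
print(lines[0])
```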
40 changes: 30 additions & 10 deletions examples/aerostructural/supersonic_panel/geometry_morph.py
@@ -2,7 +2,7 @@
import openmdao.api as om
from mpi4py import MPI

from mphys import Builder
from mphys import Builder, MPhysVariables


# EC which morphs the geometry
@@ -14,34 +14,54 @@ def initialize(self):
def setup(self):
self.add_input("geometry_morph_param")

self.x_names = {}
for name, n_nodes in zip(self.options["names"], self.options["n_nodes"]):
self.add_input(f"x_{name}_in", distributed=True, shape_by_conn=True)
if name == "aero":
self.x_names[name] = {
"input": MPhysVariables.Aerodynamics.Surface.Geometry.COORDINATES_INPUT,
"output": MPhysVariables.Aerodynamics.Surface.Geometry.COORDINATES_OUTPUT,
}
elif name == "struct":
self.x_names[name] = {
"input": MPhysVariables.Structures.Geometry.COORDINATES_INPUT,
"output": MPhysVariables.Structures.Geometry.COORDINATES_OUTPUT,
}
self.add_input(
self.x_names[name]["input"],
distributed=True,
shape_by_conn=True,
tags=["mphys_coordinates"],
)
self.add_output(
f"x_{name}0",
self.x_names[name]["output"],
shape=n_nodes * 3,
distributed=True,
tags=["mphys_coordinates"],
)

def compute(self, inputs, outputs):
for name in self.options["names"]:
outputs[f"x_{name}0"] = (
inputs["geometry_morph_param"] * inputs[f"x_{name}_in"]
outputs[self.x_names[name]["output"]] = (
inputs["geometry_morph_param"] * inputs[self.x_names[name]["input"]]
)

def compute_jacvec_product(self, inputs, d_inputs, d_outputs, mode):
if mode == "rev":
for name in self.options["names"]:
if f"x_{name}0" in d_outputs:
if self.x_names[name]["output"] in d_outputs:
if "geometry_morph_param" in d_inputs:
d_inputs["geometry_morph_param"] += self.comm.allreduce(
np.sum(d_outputs[f"x_{name}0"] * inputs[f"x_{name}_in"]),
np.sum(
d_outputs[self.x_names[name]["output"]]
* inputs[self.x_names[name]["input"]]
),
op=MPI.SUM,
)

if f"x_{name}_in" in d_inputs:
d_inputs[f"x_{name}_in"] += (
d_outputs[f"x_{name}0"] * inputs["geometry_morph_param"]
if self.x_names[name]["input"] in d_inputs:
d_inputs[self.x_names[name]["input"]] += (
d_outputs[self.x_names[name]["output"]]
* inputs["geometry_morph_param"]
)


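The reverse-mode derivatives in the compute_jacvec_product changes above follow from the morph being x_out = geometry_morph_param * x_in, giving d_param += sum(d_out * x_in) (allreduce'd across ranks in the real code) and d_x_in += d_out * param. A single-process numpy sketch with made-up data can check this via the standard adjoint (dot-product) consistency test:

```python
import numpy as np

# Stand-in data for one rank; sizes and values are arbitrary.
rng = np.random.default_rng(0)
x_in = rng.random(12)     # input coordinate vector
param = 1.7               # geometry_morph_param
d_out = rng.random(12)    # reverse-mode seed on the morphed coordinates

# Reverse-mode results as implemented in the component
d_param = np.sum(d_out * x_in)
d_x_in = d_out * param

# Adjoint consistency: <d_out, J @ seeds> must equal <reverse results, seeds>
seed_param, seed_x = 0.3, rng.random(12)
forward = seed_param * x_in + param * seed_x   # forward-mode J @ seeds
assert np.isclose(np.dot(d_out, forward),
                  d_param * seed_param + np.dot(d_x_in, seed_x))
print("adjoint consistency check passed")
```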
26 changes: 20 additions & 6 deletions examples/aerostructural/supersonic_panel/run.py
@@ -8,7 +8,7 @@
from structures_mphys import StructBuilder
from xfer_mphys import XferBuilder

from mphys import Multipoint
from mphys import MPhysVariables, Multipoint
from mphys.scenarios.aerostructural import ScenarioAeroStructural

comm = MPI.COMM_WORLD
Expand All @@ -22,6 +22,7 @@
N_el_struct = 20
N_el_aero = 7


# Mphys
class Model(Multipoint):
def initialize(self):
@@ -82,9 +83,6 @@ def setup(self):
"geometry", geometry_builder.get_mesh_coordinate_subsystem(), promotes=["*"]
)

self.connect("struct_mesh.x_struct0", "x_struct_in")
self.connect("aero_mesh.x_aero0", "x_aero_in")

# create the run directory
if self.comm.rank == 0:
if not os.path.isdir(self.scenario_name):
@@ -118,11 +116,27 @@ def setup(self):
"qdyn",
"aoa",
"dv_struct",
"x_struct0",
"x_aero0",
]:
self.connect(var, self.scenario_name + "." + var)

self.connect(
f"aero_mesh.{MPhysVariables.Aerodynamics.Surface.Mesh.COORDINATES}",
MPhysVariables.Aerodynamics.Surface.Geometry.COORDINATES_INPUT,
)
self.connect(
MPhysVariables.Aerodynamics.Surface.Geometry.COORDINATES_OUTPUT,
f"{self.scenario_name}.{MPhysVariables.Aerodynamics.Surface.COORDINATES_INITIAL}",
)

self.connect(
f"struct_mesh.{MPhysVariables.Structures.Mesh.COORDINATES}",
MPhysVariables.Structures.Geometry.COORDINATES_INPUT,
)
self.connect(
MPhysVariables.Structures.Geometry.COORDINATES_OUTPUT,
f"{self.scenario_name}.{MPhysVariables.Structures.COORDINATES}",
)

# add design variables, to simplify remote setup
self.add_design_var("geometry_morph_param", lower=0.1, upper=10.0)
self.add_design_var("dv_struct", lower=1.0e-4, upper=1.0e-2, ref=1.0e-3)
1 change: 1 addition & 0 deletions examples/aerostructural/supersonic_panel/run_parallel.py
@@ -20,6 +20,7 @@
N_el_struct = 20
N_el_aero = 7


# Mphys parallel multipoint scenarios
class AerostructParallel(MultipointParallel):
# class AerostructParallel(Multipoint):