MPI run #2868
-
Hi, in my code I am using MPI and printing a value at a node in a 2D domain. If I run with `mpirun -n 3`, the same value is printed 3 times.
I need to assign the results of the current iteration (1 s) to the initial values of the next iteration (2 s), so the order needs to be kept. But sometimes this order changes.
Does the changing print order mean that the next time step starts before the code has executed over the whole domain in the current time step? How can I ensure the next time step only starts once the current time step finishes (so that the values are printed as in the first example)? I have tried `mpiexec -n` as well, but the same issue occurred. I appreciate your comments!
-
Please can you share the code that you are running? MPI ranks run effectively in isolation from each other, so unless you use collective MPI operations (e.g. a barrier), the ordering between ranks is non-deterministic.
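To illustrate the barrier idea concretely, here is a minimal sketch (not from the original thread) in which ranks take turns printing, one per barrier round. It assumes an mpi4py-style communicator exposing `Get_rank()`, `Get_size()`, and `Barrier()`:

```python
# Sketch: impose a (near-)deterministic print order across ranks by taking
# turns, separated by barriers. Works with any communicator exposing
# Get_rank()/Get_size()/Barrier(), e.g. mpi4py's MPI.COMM_WORLD.
def ordered_print(comm, message):
    """Print `message` from every rank, in rank order."""
    for r in range(comm.Get_size()):
        if comm.Get_rank() == r:
            print(message, flush=True)
        comm.Barrier()  # everyone waits until rank r has printed
```

Run with e.g. `mpirun -n 3 python script.py`. Note this usually, but not strictly, preserves order: `mpirun` forwards each rank's stdout asynchronously, so interleaving can still occur, and the extra barriers add synchronisation cost, so treat this as a debugging aid rather than production output.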
There is no imposed ordering between the different processes when you print, so data can come out in any order. In particular, when running in parallel there is often some buffering, so some processes may appear to print after others.
You can sometimes avoid this with `print(..., flush=True)`, but even then the output can be mixed up. The parallel printing demo shows one way to print using PETSc facilities: the prints are collective on a communicator and only rank 0 on that communicator prints.
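The rank-0-only pattern can be sketched without any PETSc dependency (`root_print` is a hypothetical helper for illustration, not a PETSc API; the `Get_rank` method follows mpi4py naming):

```python
def root_print(comm, *args):
    """Print only on rank 0, mimicking output that is collective on a
    communicator: every rank may call this, but only one copy is emitted."""
    if comm.Get_rank() == 0:  # hypothetical helper, not a PETSc API
        print(*args, flush=True)
```

Because every rank takes the same code path, this is safe to call unconditionally inside your time-stepping loop.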
If you really want every process to print and you want deterministic ordering, then use `PETSc.Sys.syncPrint`. This buffers up (in rank order over a communicator) the prints from each process.
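The idea behind `syncPrint` is a two-phase, buffer-then-flush pattern: each rank buffers its lines, and a separate collective flush (in petsc4py, `PETSc.Sys.syncFlush`) emits them in rank order. A rough pure-Python model of that pattern, assuming an mpi4py-style `comm.gather`, might look like:

```python
class SyncPrinter:
    """Model of the buffer-then-flush pattern behind PETSc.Sys.syncPrint:
    every rank buffers its lines locally, and on flush they are gathered
    to rank 0 and printed in rank order. Illustrative sketch only."""

    def __init__(self, comm):
        self.comm = comm
        self._lines = []

    def sync_print(self, line):
        self._lines.append(line)  # no output yet, just buffer locally

    def sync_flush(self):
        # Collective: rank 0 receives every rank's buffer, in rank order.
        buffers = self.comm.gather(self._lines, root=0)
        if self.comm.Get_rank() == 0:
            for rank_lines in buffers:
                for line in rank_lines:
                    print(line, flush=True)
        self._lines.clear()
```

Because the flush is collective, every rank must reach it; calling it once per time step also gives you the per-step grouping the question asks about.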