
Conversation

@brownbaerchen
Contributor

Python's distutils has been deprecated since version 3.10 and was removed in version 3.12. It still worked for me in 3.12, but after switching to 3.13 the plot helper was broken.
Apparently, distutils has been replaced by shutil for this purpose. shutil.which returns the same result you get from running which in the shell Python was started from, or None if no executable of that name is found. It should therefore provide the same functionality as distutils.spawn.find_executable as used in the plot helper.
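
For illustration, a minimal sketch of the replacement described above (the 'latex' executable name is just an example, not necessarily what the plot helper actually looks for):

import shutil

# Previously (removed from the standard library in Python 3.12):
#   from distutils.spawn import find_executable
#   have_latex = find_executable('latex') is not None

# shutil.which returns the full path to the executable on the current PATH,
# or None if it cannot be found, mirroring the old behaviour.
have_latex = shutil.which('latex') is not None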

@pancetta
Member

pancetta commented May 4, 2025

Any updates here?

@brownbaerchen
Contributor Author

This action also fails on the master branch of my fork. Not sure why it doesn't fail here. I am pretty sure that it is unrelated to the changes in this PR, but I'll try changing the way the test is launched later and see if it helps.

benchmark(wrapper)


@pytest.mark.base
Contributor Author


@tlunet, do you agree with these markers? I don't think we need to run these tests in all environments. They are throwing some numpy deprecation warnings at the moment, which we will need to deal with at some point.
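
For context, a minimal sketch of how such a marker is meant to restrict where a test runs (the -m filters shown in the comments are an assumption about the CI setup, not taken from this PR):

import numpy as np
import pytest


@pytest.mark.base  # only collected when a CI job selects the "base" marker
def test_example():
    # illustrative check, not the actual test touched in this PR
    assert np.isclose(np.sum([0.5, 0.5]), 1.0)


# The base-environment job would then run, e.g.:
#   pytest -m base
# while the other environments deselect these tests with -m "not base"
# or by selecting their own marker.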

Member

@tlunet tlunet May 5, 2025


They are necessary. You just need to replace the == between the arrays in the first assertion with np.allclose (as the warning suggests?)

EDIT: nvm, misread ...

Member


That's a polynomial deprecation warning?

Contributor Author


I didn't pay much attention to the warning, tbh. For now, I just want to run these tests only in the base environment and not in every environment.

Member


Oh, then that's fine indeed ... sorry, I didn't realize that was an addition 😅

. venv-pySDC/bin/activate
pip install -e /repositories/pySDC
pip install qmat
pip install pytest-isolate-mpi
Member


Should be added as a test dependency, directly in the pyproject.toml file, I believe (see https://github.com/Parallel-in-Time/qmat/blob/4919f197bac230a27f8ccea4188645cb53fefd0a/pyproject.toml#L34) ... also, qmat should already be installed when installing pySDC

Contributor Author


No, I disagree. We still want environments without MPI after all.

Member

@tlunet tlunet May 5, 2025


Hence the idea of optional dependencies defined in the pyproject.toml file. You could create a gusto one or a gusto-test one, etc ... and simply install using:

pip install -e ./repositories/pySDC[gusto]

which would install the base dependencies plus the optional ones (cf. here). It makes the pipeline way cleaner and centralizes the dependencies in pyproject.toml, which allows better control for local or CI tests ...
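
A sketch of what such an optional dependency group could look like in pyproject.toml (the group name gusto and the packages listed are only an example following the comment above, not the actual pySDC configuration):

[project.optional-dependencies]
gusto = [
    "mpi4py",              # MPI bindings needed by the parallel tests
    "pytest-isolate-mpi",  # runs each MPI test in an isolated process
]

pip install -e ./repositories/pySDC[gusto] would then pull in these extras on top of the base dependencies.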

Contributor Author


We can think of adding MPI dependencies there, but I don't really see the point, tbh. I prefer this way, where the setup of the container and what goes into it is kept together. Outside of the testing pipeline, people are not going to use the container and will not need the testing dependencies.

Also, it turns out the test is somehow flaky and adding pytest-isolate-mpi did not fix it at all. This is pretty annoying :D

Member

@tlunet tlunet May 5, 2025


Then, my guess is that you can remove this and the pip install qmat 😛 (stop the unnecessary code spamming ...)

@tlunet
Member

tlunet commented May 5, 2025

Is the firedrake test the problematic one, or the Allen_Cahn ones?

@brownbaerchen
Contributor Author

Is the firedrake test the problematic one, or the Allen_Cahn ones?

The Allen_Cahn ones are just a one-off problem, I suppose. This one Firedrake test keeps failing.

@tlunet
Member

tlunet commented May 5, 2025

Does firedrake already import mpi4py or some MPI binding on its side? I had a problem with FEniCS once, where importing mpi4py before dolfin produced an error in tests ...

@brownbaerchen
Contributor Author

Does firedrake already import mpi4py or some MPI binding on its side? I had a problem with FEniCS once, where importing mpi4py before dolfin produced an error in tests ...

There are other tests using Firedrake and MPI that run just fine. The failing one uses Firedrake ensemble communicators for space-time parallelism.

@pancetta
Member

Updates here?

@brownbaerchen
Contributor Author

Updates here?

I have not managed to fix the firedrake tests yet. The plotting stuff here is unrelated, but I will try to fix the firedrake issue, maybe this week, hopefully by the end of next week. No need to merge this before then.

@brownbaerchen brownbaerchen merged commit 1b5b2dc into Parallel-in-Time:master May 27, 2025
87 checks passed
@brownbaerchen brownbaerchen deleted the python3.13 branch May 27, 2025 15:35