Consistent test data structure for processImage, newCurv #30

Merged
ctrueden merged 81 commits into main from 22-consistent-test-struct
Feb 21, 2026

Conversation

@dlee489
Member

@dlee489 dlee489 commented Feb 11, 2026

This PR combines #5, #6, #22.

Translated surface-level implementations of
getTifBoundary.m and getSegmentPixels.m; will focus on identifying
possible test cases and start debugging against the correct input/output
expectations.
Added gitignore for IDEA.

Ref: #5
Fixes a discrepancy between get_tif_boundary.py and
getTifBoundary.m; however, this now causes the pytests for
get_relative_angle.py to fail.
Made changes that allow both get_relative_angles.py and
get_tif_boundary.py to call find_outline_slope.py without any
miscalculations.

See #5
Added round_mlab.py helper function to easily round values MATLAB's
way rather than Python/NumPy's.
Direct comparison between the resulting matrices of the MATLAB and Python
implementations.
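The round_mlab helper mentioned above exists because MATLAB's round() rounds halves away from zero, while NumPy's np.round rounds halves to even. A minimal sketch of such a helper (the repo's actual implementation may differ):

```python
import numpy as np

def round_mlab(x):
    """Round half away from zero, as MATLAB's round() does,
    rather than NumPy's round-half-to-even."""
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.floor(np.abs(x) + 0.5)

# MATLAB: round(0.5) == 1, round(2.5) == 3, round(-0.5) == -1
# NumPy:  np.round([0.5, 2.5, -0.5]) -> [0., 2., -0.]
print(round_mlab([0.5, 2.5, -0.5]))
```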

The main error results from the 1-indexing conversion: if all the centers
are decremented by 1, nearest_region_distance is recorded properly
on Python's end, but nearest_boundary_distance is wrong.

If all the centers aren't decremented by 1, then nearest_region_distance
is always wrong.
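The off-by-one issue described above is the classic MATLAB-to-Python indexing pitfall: MATLAB coordinates are 1-based, NumPy arrays are 0-based. A minimal illustration (array and variable names hypothetical):

```python
import numpy as np

# Coordinates exported from MATLAB are 1-based; NumPy is 0-based.
centers_matlab = np.array([[1, 1], [3, 5]])
centers_py = centers_matlab - 1  # decrement once before indexing

img = np.zeros((10, 10))
row, col = centers_py[0]
img[row, col] = 1  # hits the intended first pixel at (0, 0)
```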

See #5
Changed KNN algorithm to match MATLAB, edited tester methods for clarity.
Modified get_tif_boundary_output.csv because the local output from MATLAB
was different from what was originally provided.

See #5
So pip doesn't skip proper dependency resolution
@elevans
Member

elevans commented Feb 19, 2026

I got the test_get_ct.py test to work and it also fails for me. The assert failures look like small floating point error. I'll look at the test more closely, but we might need to implement result tolerance if the values are coming from C++ code that we don't control (I haven't looked, so please tell me if you know).

@dlee489
Member Author

dlee489 commented Feb 19, 2026

I got the test_get_ct.py test to work and it also fails for me. The assert failures look like small floating point error. I'll look at the test more closely, but we might need to implement result tolerance if the values are coming from C++ code that we don't control (I haven't looked, so please tell me if you know).

Yes, I talked about modifying the result tolerance with @yuminguw earlier and forgot to change it -- it looks like the values don't pass on Linux but they do for me locally; maybe an OS issue? Anyhow, I will relax the tolerance to make it consistent with the other tests.

All rtols have been changed to 0.05, atol to 15
@dlee489
Member Author

dlee489 commented Feb 19, 2026

@yuminguw @dlee489 The test_get_ct.py test does not run for me on Linux. I get some deprecation warnings from curvelops but otherwise it skips the test. This is strange because the skip condition is if I don't have curvelops but I do...and I can import it.

Just to be in the loop for user experience, was this due to not declaring TMEQ_RUN_CURVELETS=1 when running the tests?

@dlee489
Member Author

dlee489 commented Feb 19, 2026

By the way, when running the example_process_image.py, the boundary association is off as shown below. Could you troubleshoot that? [image]

Thanks for pointing that out. Looking into it, I believe it should be fixed now with this commit, as the new changes affected the coordinates when mapping. Could you double-check that it works on your end?

rtol = 1e-3 # 0.1% relative tolerance
atol = 1e-3 # 0.1% absolute tolerance for angles
rtol = 0.05 # 5% relative tolerance
atol = 15 # absolute tolerance for angles
Member

@dlee489 Can you tell me the origin of the original tolerance values and how you decided these new ones? Your commit doesn't indicate why these values changed. Also atol going from 1e-3 to 15 is a huge change, no?

Member Author

@elevans The tolerance changes are honestly pretty arbitrary values -- I switched the atol to 15 for consistency with test_process_image, which was decided after talking with @yuminguw.

In hindsight, we should probably decide on more legitimate values, although I'm not sure how to decide on or calculate those. Both the original and the new tolerance values are arbitrary.

Member

@elevans elevans Feb 19, 2026

Thanks for the context! My concern here is that setting arbitrary tolerance values without good reason may defeat the point of the test if the tolerance is too large. Did you ever measure the value differences between the expected and failing values? Is the difference an epsilon value? To be clear, I do think the right answer here is to set tolerance values; we just need a good reason for the values we choose.

Member Author

I'm not entirely following what you mean by an epsilon value, but yes, I do measure the value differences between expected and actual in the tests, with both the max atol and rtol.

From my memory, I set the absolute tolerance to 15 because a test case failed with one value differing by more than 10; it was the only error, with every other value within the threshold. I can investigate this further later, however.
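For reference, NumPy's closeness check used by such tests passes elementwise when |actual - expected| <= atol + rtol * |expected|. A small sketch with the tolerances discussed in this thread (the sample angle data is hypothetical):

```python
import numpy as np

expected = np.array([10.0, 90.0, 170.0])  # hypothetical expected angles
actual = np.array([10.2, 95.0, 168.0])    # hypothetical measured angles

# Passes elementwise iff |actual - expected| <= atol + rtol * |expected|.
# With rtol=0.05 and atol=15, a difference of 5 at a value of 90 passes.
np.testing.assert_allclose(actual, expected, rtol=0.05, atol=15)
```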

@elevans
Member

elevans commented Feb 19, 2026

@yuminguw @dlee489 The test_get_ct.py test does not run for me on Linux. I get some deprecation warnings from curvelops but otherwise it skips the test. This is strange because the skip condition is if I don't have curvelops but I do...and I can import it.

Just to be in the loop for user experience, was this due to not declaring TMEQ_RUN_CURVELETS=1 when running the tests?

Great question. What I did, naively, was go to the tests/ directory of tme-quant (this branch) and run pytest. That runs this set of tests:

platform linux -- Python 3.11.14, pytest-9.0.2, pluggy-1.6.0
rootdir: /home/edward/Documents/repos/loci/tme-quant
configfile: pyproject.toml
plugins: npe2-0.8.1, zarr-3.1.5, napari-plugin-engine-0.2.1, napari-0.6.6
collected 30 items / 3 skipped                                                                                         

napari_curvealign_test.py .                                                                                      [  3%]
test_get_alignment_to_roi.py ...                                                                                 [ 13%]
test_get_tif_boundary.py ...                                                                                     [ 23%]
test_relative_angles.py .......................                                                                  [100%]

I missed your special test command with my first pass at this, but after I found it I got the test to run like everyone else.

Can we get all the tests to run with pytest at the project root?

Edit: I just needed to run PYTHONPATH=src TMEQ_RUN_CURVELETS=1 pytest tests/ to run all tests.
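The TMEQ_RUN_CURVELETS gate discussed above is typically implemented as an opt-in pytest skip; a sketch under that assumption (the repo's actual skip condition and test body may differ):

```python
import os

import pytest

# Curvelet tests only run when explicitly opted in via the
# TMEQ_RUN_CURVELETS environment variable (name from this thread).
requires_curvelets = pytest.mark.skipif(
    os.environ.get("TMEQ_RUN_CURVELETS") != "1",
    reason="set TMEQ_RUN_CURVELETS=1 to run curvelet tests",
)

@requires_curvelets
def test_get_ct():
    ...  # placeholder body; the real test lives in test_get_ct.py
```

Running `TMEQ_RUN_CURVELETS=1 pytest tests/` then enables these tests instead of skipping them.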

@elevans
Member

elevans commented Feb 19, 2026

By the way, when running the example_process_image.py, the boundary association is off as shown below. Could you troubleshoot that? [image]

Thanks for pointing that out. Looking into it, I believe it should be fixed now with this commit, as the new changes affected the coordinates when mapping. Could you double-check that it works on your end?

We need a test for this. Can you write one @dlee489 to check that our coords are as expected so we catch this going forward? We wouldn't have seen this without yuming looking at some outputs, but we can easily test coordinates.

pyproject.toml Outdated
"openpyxl",
"matplotlib==3.10.1",
"numpy==2.2.6",
"opencv-python-headless==4.10.0.84",
Member

@elevans elevans Feb 19, 2026

Do we have plans for any other opencv stuff for TME-quant? We're only using it here to read in TIFF files. Using opencv like this is totally okay...but consider the size of the dependency: opencv-python-headless is 62.3 MB when installed via pip, whereas tifffile, another very popular TIFF file reader, is 2.1 MB!
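For the plain file-reading use case above, tifffile round-trips TIFFs as NumPy arrays directly; a minimal sketch (the file path and data are illustrative only):

```python
import os
import tempfile

import numpy as np
import tifffile

# Round-trip a small array: tifffile.imwrite/imread handle plain
# TIFF I/O, covering what cv2.imread was used for here.
data = np.arange(12, dtype=np.uint8).reshape(3, 4)
path = os.path.join(tempfile.mkdtemp(), "example.tif")
tifffile.imwrite(path, data)
img = tifffile.imread(path)
```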

Member Author

No, I do not think so. Changing to tifffile is probably the better idea here, but one thing to note is that pycurvelets3D.py is from a year ago and is an unfinished attempt at doing curvelets in 3D -- so that file currently has no useful functionality, and it's fine to switch to tifffile for the env.

@yuminguw
Member

I got the test_get_ct.py test to work and it also fails for me. The assert failures look like small floating point error. I'll look at the test more closely, but we might need to implement result tolerance if the values are coming from C++ code that we don't control (I haven't looked, so please tell me if you know).

Yes, I talked about modifying the result tolerance with @yuminguw earlier and forgot to change it -- it looks like the values don't pass on Linux but they do for me locally; maybe an OS issue? Anyhow, I will relax the tolerance to make it consistent with the other tests.

@dlee489 the test passed on my WSL as shown below: [image]

@yuminguw
Member

By the way, when running the example_process_image.py, the boundary association is off as shown below. Could you troubleshoot that? [image]

Thanks for pointing that out. Looking into it, I believe it should be fixed now with this commit, as the new changes affected the coordinates when mapping. Could you double-check that it works on your end?

@dlee489 Yes, it works well now. Thanks!

@yuminguw
Member

By the way, when running the example_process_image.py, the boundary association is off as shown below. Could you troubleshoot that? [image]

Thanks for pointing that out. Looking into it, I believe it should be fixed now with this commit, as the new changes affected the coordinates when mapping. Could you double-check that it works on your end?

We need a test for this. Can you write one @dlee489 to check that our coords are as expected so we catch this going forward? We wouldn't have seen this without yuming looking at some outputs, but we can easily test coordinates.

Good suggestion. @dlee489, in test_process_image.py, could you add a coordinates check for the fiber and boundary association, just as you do in test_get_tif_boundary.py? Though I saw the association was off previously, I was pretty sure the measurements were correct, as the other tests passed. Essentially, it would be nice to add a test for the association visualization.

@ctrueden
Member

The merge of #26 led to some merge conflicts for this PR. I have now merged the latest main back to this branch to fix them.

@dlee489 Heads up that after discussion with @elevans and @Phil-Dua, we have also merged the 22-consistent-test-struct branch onto napari-curvealign, so that @Phil-Dua can begin migrating the napari-curvealign plugin over to your manually ported functions here. There is no action you need to take to support this—just please do not rebase 22-consistent-test-struct after this point, because it would make the to-be-merged git history much messier. Instead, just keep pushing more commits to finish the work here. Thanks!

@dlee489
Member Author

dlee489 commented Feb 19, 2026

The merge of #26 led to some merge conflicts for this PR. I have now merged the latest main back to this branch to fix them.

@dlee489 Heads up that after discussion with @elevans and @Phil-Dua, we have also merged the 22-consistent-test-struct branch onto napari-curvealign, so that @Phil-Dua can begin migrating the napari-curvealign plugin over to your manually ported functions here. There is no action you need to take to support this—just please do not rebase 22-consistent-test-struct after this point, because it would make the to-be-merged git history much messier. Instead, just keep pushing more commits to finish the work here. Thanks!

Got it, so just a git pull and commit/push as normal should be fine, correct? Thanks for the heads up!

@ctrueden
Member

so just a git pull and commit/push as normal should be fine, correct?

Yep! git pull, or if you have local-only commits ahead of the remote you could git pull --rebase if you prefer to staple them back on top of the history. As long as you don't git rebase -i away any commits already on GitHub, we are good!

WIP: found that boundary_point row/col are mislabeled, which causes them to flip
(i.e., center_row -> boundary_point_col and center_col -> boundary_point_row leads to
correct plotting)
Swapped row/col for the boundary point in get_tif_boundary and process_image
due to inconsistencies: all MATLAB outputs were [col, row] instead of [row, col]
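The row/col flip described in these commits is the usual image-coordinate mismatch: NumPy indexes arrays as [row, col], while plotting functions take (x, y) = (col, row). A minimal illustration (variable names hypothetical):

```python
import numpy as np

img = np.zeros((100, 200))   # 100 rows (y), 200 columns (x)
boundary_point = (30, 150)   # stored as (row, col)
row, col = boundary_point
img[row, col] = 1            # NumPy indexing: array[row, col]

# Plotting must swap the order: e.g. matplotlib's scatter takes
# (x, y), i.e. (col, row); passing (row, col) flips the overlay.
x, y = col, row
```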
@dlee489
Member Author

dlee489 commented Feb 20, 2026

By the way, when running the example_process_image.py, the boundary association is off as shown below. Could you troubleshoot that? [image]

Thanks for pointing that out. Looking into it, I believe it should be fixed now with this commit, as the new changes affected the coordinates when mapping. Could you double-check that it works on your end?

We need a test for this. Can you write one @dlee489 to check that our coords are as expected so we catch this going forward? We wouldn't have seen this without yuming looking at some outputs, but we can easily test coordinates.

Good suggestion. @dlee489, in test_process_image.py, could you add a coordinates check for the fiber and boundary association, just as you do in test_get_tif_boundary.py? Though I saw the association was off previously, I was pretty sure the measurements were correct, as the other tests passed. Essentially, it would be nice to add a test for the association visualization.

This has been resolved through this commit.

More generally: test-only deps should go into the dev dependencies, not the main ones. Also, due to how Python dependency resolution works when composing components, the main deps should not be pinned to specific versions (==x.y.z), or else the component becomes largely unusable in larger/composed environments.

Should I modify all deps to be lower bound instead of pinning?

If by lower bound you mean something like this scyjava jpype dependency then yes I'd try and unpin as many dependencies as you can. This will require testing of course.

This seems to have been done in #26?

I believe all relevant changes have gone through, if I could get another go-through that would be great. Thank you!
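The lower-bound style discussed above would look roughly like this in pyproject.toml (version floors illustrative only, not the repo's actual values; only numpy and matplotlib appear pinned in this thread's excerpt):

```toml
[project]
dependencies = [
    "numpy>=2.2",        # lower bound instead of ==2.2.6
    "matplotlib>=3.10",  # lower bound instead of ==3.10.1
    "tifffile",
]

# Test-only dependencies live with dev tooling, not the main deps.
[dependency-groups]
dev = ["pytest"]
```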

@yuminguw
Member

By the way, when running the example_process_image.py, the boundary association is off as shown below. Could you troubleshoot that? [image]

Thanks for pointing that out. Looking into it, I believe it should be fixed now with this commit, as the new changes affected the coordinates when mapping. Could you double-check that it works on your end?

We need a test for this. Can you write one @dlee489 to check that our coords are as expected so we catch this going forward? We wouldn't have seen this without yuming looking at some outputs, but we can easily test coordinates.

Good suggestion. @dlee489, in test_process_image.py, could you add a coordinates check for the fiber and boundary association, just as you do in test_get_tif_boundary.py? Though I saw the association was off previously, I was pretty sure the measurements were correct, as the other tests passed. Essentially, it would be nice to add a test for the association visualization.

This has been resolved through this commit.

More generally: test-only deps should go into the dev dependencies, not the main ones. Also, due to how Python dependency resolution works when composing components, the main deps should not be pinned to specific versions (==x.y.z), or else the component becomes largely unusable in larger/composed environments.

Should I modify all deps to be lower bound instead of pinning?

If by lower bound you mean something like this scyjava jpype dependency then yes I'd try and unpin as many dependencies as you can. This will require testing of course.

This seems to have been done in #26?

I believe all relevant changes have gone through, if I could get another go-through that would be great. Thank you!

Yes, the added test can catch the previous error. All tests passed on my WSL system. I think this branch is ready to merge into main.

Member

@yuminguw yuminguw left a comment

All the tests passed and all the comments have been addressed. I think it is ready to merge.

@ctrueden ctrueden merged commit 80ee91c into main Feb 21, 2026
6 checks passed
@ctrueden ctrueden deleted the 22-consistent-test-struct branch February 24, 2026 20:01