Consistent test data structure for processImage, newCurv #30
See #16 for reference
Translated a surface-level implementation of getTifBoundary.m and getSegmentPixels.m; will focus on identifying possible test cases and start debugging against the correct input/output expectations
Added gitignore for IDEA. Ref: #5
Fixes discrepancy that occurs between get_tif_boundary.py and getTifBoundary.m; however, this now causes the pytests for get_relative_angle.py to fail
Make changes that allow both get_relative_angles.py and get_tif_boundary.py to call find_outline_slope.py without any miscalculations. See #5
Added round_mlab.py helper function to easily round values MATLAB's way rather than Python/NumPy's.
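The source of round_mlab.py isn't shown here, but a minimal sketch of such a helper might look like the following, assuming the difference being worked around is that MATLAB rounds halves away from zero while NumPy rounds halves to even (the implementation below is an illustration, not the repository's actual code):

```python
import numpy as np

def round_mlab(x):
    """Round half away from zero, as MATLAB's round() does.

    NumPy's np.round uses banker's rounding (half to even), so
    np.round(0.5) gives 0.0 while MATLAB's round(0.5) gives 1.0.
    """
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.floor(np.abs(x) + 0.5)
```

For example, `round_mlab(2.5)` yields 3.0 where `np.round(2.5)` yields 2.0.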
Direct comparison between the resulting matrix of MATLAB and Python implementation. Main error results from 1-indexing conversion, where if all the centers are decremented by 1, then nearest_region_distance is recorded properly on Python's end, but the nearest_boundary_distance is wrong. If all the centers aren't decremented by 1, then nearest_region_distance is always wrong. See #5
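The off-by-one issue described above can be illustrated with hypothetical values: a MATLAB subscript (r, c) is 1-based, so the same pixel in a 0-based NumPy array sits at (r - 1, c - 1), and distances computed against un-decremented centers are systematically wrong.

```python
import numpy as np

# Hypothetical centers in MATLAB's 1-based (row, col) convention
matlab_centers = np.array([[10, 20], [5, 7]])
python_centers = matlab_centers - 1            # 0-based equivalents

boundary_pixel = np.array([9, 19])             # 0-based
d_converted = np.linalg.norm(python_centers - boundary_pixel, axis=1)
d_raw = np.linalg.norm(matlab_centers - boundary_pixel, axis=1)
```

After conversion the first center coincides with the boundary pixel (distance 0), while the raw 1-based coordinates report a spurious distance of sqrt(2).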
Changed KNN algorithm to match MATLAB, edited tester methods for clarity. Modified the get_tif_boundary_output.csv because the local output from MATLAB was different than what was originally provided. See #5
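The MATLAB-matching KNN code itself isn't shown, but the nearest-boundary-distance idea can be sketched with a brute-force nearest neighbour (the function name and array shapes are hypothetical, not the project's actual implementation):

```python
import numpy as np

def nearest_boundary_distance(centers, boundary):
    """Distance from each center to its nearest boundary point.

    Brute-force O(n * m) nearest-neighbour search: a stand-in for
    the KNN step, fine for small point sets.
    """
    centers = np.asarray(centers, dtype=float)
    boundary = np.asarray(boundary, dtype=float)
    # Pairwise differences, shape (n_centers, n_boundary, 2)
    diff = centers[:, None, :] - boundary[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)
```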
So pip doesn't skip proper dependency resolution
Yes, I talked with @yuminguw earlier about modifying the result tolerance and forgot to change it -- it looks like the values don't pass on Linux but do for me locally; maybe an OS issue? Anyhow, I will relax the tolerance to make it consistent with the other tests.
All rtol values have been changed to 0.05 and atol to 15
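For context on what these numbers mean: `np.allclose` treats a pair as equal when `|actual - expected| <= atol + rtol * |expected|`, so the two tolerances add. A quick sketch with an angle value (the 90-degree figure is just an illustration):

```python
import numpy as np

# np.allclose passes when |actual - expected| <= atol + rtol * |expected|.
# With rtol=0.05 and atol=15, an expected angle of 90 degrees tolerates a
# deviation of up to 15 + 0.05 * 90 = 19.5 degrees.
expected = np.array([90.0])
rtol, atol = 0.05, 15.0

within = np.allclose(expected + 19.0, expected, rtol=rtol, atol=atol)
outside = np.allclose(expected + 20.0, expected, rtol=rtol, atol=atol)
```

This is why atol=15 dominates for small expected values: any absolute difference under 15 passes regardless of rtol.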
Just to be in the loop for user experience, was this due to not declaring
Thanks for pointing that out. Looking into it, I believe it should be fixed now with this commit, as the new changes affected the coordinates when mapping. Could you double check that it works on your end?
```diff
-rtol = 1e-3  # 0.1% relative tolerance
-atol = 1e-3  # 0.1% absolute tolerance for angles
+rtol = 0.05  # 5% relative tolerance
+atol = 15    # absolute tolerance for angles
```
@dlee489 Can you tell me the origin of the original tolerance values and how you decided these new ones? Your commit doesn't indicate why these values changed. Also atol going from 1e-3 to 15 is a huge change, no?
@elevans The tolerance changes are honestly pretty arbitrary -- I switched the atol to 15 for consistency with test_process_image, which was decided after talking with @yuminguw.
In hindsight, we should probably settle on more principled values, although I'm not sure how to decide or calculate them. Both the original and the new tolerance values are arbitrary.
Thanks for the context! My concern here is that setting arbitrary tolerance values without good reason may defeat the point of the test if the tolerance is too large. Did you ever measure the value differences between the expected and actual results in the failing tests? Is the difference on the order of an epsilon value? To be clear, I do think the right answer here is to set tolerance values; we just need a good reason for the values we choose.
I'm not entirely following what you mean by an epsilon value, but yes, I do measure the value differences between expected and actual results in the tests, tracking both the maximum atol and rtol.
From memory, I set the absolute tolerance to 15 because one test case failed on a single value that was off by more than 10; it was the only error, and every other value was within the threshold. I can investigate this further later, however.
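One way to put numbers behind the tolerance choice is to compute, from a failing comparison, the smallest atol and rtol that would make it pass. A sketch of such a helper (the name and sample values are hypothetical):

```python
import numpy as np

def report_tolerances(actual, expected):
    """Return the smallest (atol, rtol) that would make a comparison pass.

    Hypothetical helper for deriving tolerance values from observed
    differences rather than guessing them.
    """
    actual = np.asarray(actual, dtype=float)
    expected = np.asarray(expected, dtype=float)
    abs_diff = np.abs(actual - expected)
    # Guard against division by zero where the expected value is 0
    with np.errstate(divide="ignore", invalid="ignore"):
        rel_diff = np.where(expected != 0,
                            abs_diff / np.abs(expected), np.inf)
    return abs_diff.max(), rel_diff.max()

max_atol, max_rtol = report_tolerances([10.0, 91.0], [10.5, 90.0])
```

Running the helper on the measured arrays would show directly whether 15 and 0.05 are generous or tight for the data at hand.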
Great question. So what I did, naively, was go to the repository and run the tests:

```
platform linux -- Python 3.11.14, pytest-9.0.2, pluggy-1.6.0
rootdir: /home/edward/Documents/repos/loci/tme-quant
configfile: pyproject.toml
plugins: npe2-0.8.1, zarr-3.1.5, napari-plugin-engine-0.2.1, napari-0.6.6
collected 30 items / 3 skipped

napari_curvealign_test.py .                                    [  3%]
test_get_alignment_to_roi.py ...                               [ 13%]
test_get_tif_boundary.py ...                                   [ 23%]
test_relative_angles.py .......................                [100%]
```

I missed your special test command with my first pass at this, but after I found it I got the tests to run like everyone else.

Edit: I just needed to do
We need a test for this. Can you write one, @dlee489, to check that our coords are as expected so we catch this going forward? We wouldn't have seen this without Yuming looking at some outputs, but we can easily test coordinates.
pyproject.toml (outdated):

```toml
"openpyxl",
"matplotlib==3.10.1",
"numpy==2.2.6",
"opencv-python-headless==4.10.0.84",
```
Do we have plans for any other OpenCV functionality in TME-quant? We're using it here just to read in TIFF files. Using OpenCV like this is totally okay... but consider the size of the dependency: opencv-python-headless is 62.3 MB when installed via pip, whereas tifffile, another very popular TIFF file reader, is 2.1 MB!
No, I do not think so. Switching to tifffile is probably the better idea here. One thing to note is that pycurvelets3D.py is from a year ago and is an unfinished attempt at doing curvelets in 3D -- the file currently has no useful functionality, so it's fine to change the environment over to tifffile.
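If the switch happens, the replacement is small: `tifffile.imread` returns the image as a NumPy array much like `cv2.imread` does, without pulling in the OpenCV wheel. A minimal round-trip sketch, assuming tifffile is installed (`pip install tifffile`):

```python
import tempfile
from pathlib import Path

import numpy as np
import tifffile

# Write a tiny test image and read it back to confirm the round trip
img = np.arange(12, dtype=np.uint8).reshape(3, 4)
with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "example.tif"
    tifffile.imwrite(path, img)
    loaded = tifffile.imread(path)
```

Unlike `cv2.imread`, tifffile preserves the on-disk dtype and channel order, so no BGR-to-RGB juggling is needed.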
@dlee489 the test passed on my WSL as shown below:
@dlee489 Yes, it works well now. Thanks!
Good suggestion. @dlee489, in test_process_image.py, could you add a coordinates check for the fiber and boundary association, just as you do in test_get_tif_boundary.py? Though I saw the association was off previously, I was fairly sure the measurements were correct since the other tests passed. Essentially, it would be nice to have a test for the association visualization.
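A sketch of what such a coordinate check could look like (the function name, array shapes, and values are hypothetical, not test_process_image.py's actual fixtures):

```python
import numpy as np

def check_boundary_coordinates(boundary_points, image_shape):
    """Check that every (row, col) pair lies inside the image.

    Because the example image is wider than it is tall, a swapped
    [col, row] pair would fall outside axis 0 and fail loudly.
    """
    rows, cols = boundary_points[:, 0], boundary_points[:, 1]
    assert np.all((rows >= 0) & (rows < image_shape[0]))
    assert np.all((cols >= 0) & (cols < image_shape[1]))

# (row, col) points inside a 100 x 200 image pass the check
check_boundary_coordinates(np.array([[10, 150], [99, 199]]), (100, 200))
```

Using a deliberately non-square test image is the key design choice: on a square image, swapped rows and columns can still land in bounds and the bug slips through.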
The merge of #26 led to some merge conflicts for this PR. I have now merged the latest main back into this branch to fix them. @dlee489 Heads up that after discussion with @elevans and @Phil-Dua, we have also merged the
Got it, so just a
Yep!
WIP: found that the boundary_point row/col labels are swapped, causing a flip (i.e., mapping center_row -> boundary_point_col and center_col -> boundary_point_row leads to correct plotting)
Swapped row/col for the boundary point in get_tif_boundary and process_image due to inconsistencies: all MATLAB outputs were [col, row] instead of [row, col]
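The fix described above amounts to reversing the coordinate columns; with hypothetical values:

```python
import numpy as np

# The MATLAB-side boundary output arrived as [col, row]; reversing the
# last axis yields the [row, col] order that NumPy indexing expects.
matlab_boundary = np.array([[20, 5], [21, 6]])  # [col, row]
boundary_rc = matlab_boundary[:, ::-1]          # [row, col]
```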
This has been resolved through this commit.
This seems to have been done in #26? I believe all relevant changes have gone through; if I could get another look-through, that would be great. Thank you!
Yes, the added test can catch the previous error. All tests passed when I ran them on my WSL system. I think this branch is ready to merge into main.
yuminguw left a comment
All the tests passed and all the comments have been addressed. I think it is ready to merge.


This PR combines #5, #6, and #22.